Ad-hoc reporting API query fails when joining standard and custom tables

Ad-hoc reporting API queries that join standard ERP tables with custom tables are failing with ambiguous column errors on Snowflake 7.0. Simple single-table queries work, but joins consistently fail.


POST /api/v1/reports/ad-hoc/execute
{
  "query": {
    "from": ["sales_orders", "custom_order_attributes"],
    "join": {"on": "order_id"},
    "select": ["order_id", "customer_name", "custom_attribute"]
  }
}
// Error: Ambiguous column reference: 'order_id' exists in multiple tables

Need guidance on column aliasing in joins and how the API query payload structure should handle table prefixes. Also, should we review custom table schema for naming conflicts?

Your join configuration needs fully qualified column references. Change the join structure to: "join": {"type": "inner", "on": "sales_orders.order_id = custom_order_attributes.order_id"}. The API expects SQL-style join conditions, not bare column names.

Solved! The fix required three things: proper column aliasing in joins, a revised API query payload structure with table aliases, and a review of our custom table schema against naming best practices:

1. Column Aliasing in Joins: Use fully qualified column references with table aliases:


POST /api/v1/reports/ad-hoc/execute
{
  "query": {
    "from": [
      {"table": "sales_orders", "alias": "so"},
      {"table": "custom_order_attributes", "alias": "coa"}
    ],
    "join": {
      "type": "inner",
      "on": "so.order_id = coa.order_id"
    },
    "select": [
      "so.order_id",
      "so.customer_name",
      "coa.custom_attribute"
    ]
  }
}
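If you build these payloads in client code, assembling them from a dict makes the aliasing rules easy to enforce before the request goes out. A minimal Python sketch; the client-side structure here is an assumption for illustration, not part of the documented API:

```python
import json

# Sketch: build the aliased single-join payload from the accepted answer.
# Posting it to /api/v1/reports/ad-hoc/execute (and any auth headers)
# is left to your HTTP client of choice.
payload = {
    "query": {
        "from": [
            {"table": "sales_orders", "alias": "so"},
            {"table": "custom_order_attributes", "alias": "coa"},
        ],
        "join": {"type": "inner", "on": "so.order_id = coa.order_id"},
        "select": ["so.order_id", "so.customer_name", "coa.custom_attribute"],
    }
}

# Every selected column carries its table alias, so no reference is ambiguous.
body = json.dumps(payload)
```

With a library like `requests`, you would POST `body` to the execute endpoint with a JSON content type.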

2. API Query Payload Structure: For complex joins with multiple custom tables:


POST /api/v1/reports/ad-hoc/execute
{
  "query": {
    "from": [
      {"table": "sales_orders", "alias": "so"},
      {"table": "custom_order_attributes", "alias": "coa"},
      {"table": "custom_order_pricing", "alias": "cop"}
    ],
    "joins": [
      {
        "type": "inner",
        "table": "coa",
        "on": "so.order_id = coa.order_id"
      },
      {
        "type": "left",
        "table": "cop",
        "on": "so.order_id = cop.order_id"
      }
    ],
    "select": [
      "so.order_id AS order_number",
      "coa.custom_attribute",
      "cop.custom_price"
    ]
  }
}

Note the `joins` array (plural) for multiple joins; this is required in the 7.0 API.
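When every custom table joins to the base table on the same key, the `joins` array can be generated rather than hand-written. A hypothetical helper, assuming (as in this thread) that all custom tables key on order_id:

```python
def build_joins(base_alias, custom_tables):
    """Build a 7.0-style "joins" array (plural).

    custom_tables: list of (alias, join_type) tuples, e.g. [("coa", "inner")].
    Assumes every table joins to the base table on order_id.
    """
    return [
        {
            "type": join_type,
            "table": alias,
            "on": f"{base_alias}.order_id = {alias}.order_id",
        }
        for alias, join_type in custom_tables
    ]

# Reproduces the joins array from the payload above.
joins = build_joins("so", [("coa", "inner"), ("cop", "left")])
```

Generating the array keeps the qualification consistent, which is exactly what the 7.0 API's ambiguity check demands.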

3. Custom Table Schema Review: Documented naming conventions for future custom tables:

  • Prefix all custom table names with `custom_`
  • Use descriptive column names that include context (e.g., attribute_value instead of just value)
  • Avoid reusing standard ERP column names unless intentionally matching for joins
  • Document join keys in custom table metadata
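The conventions above can be turned into a simple lint check run before creating a custom table. A sketch only; the rule set below is our own convention, not an official schema validator:

```python
# Hypothetical lint check for the custom-table naming conventions above.
CONTEXT_FREE_NAMES = {"value", "type", "name", "data"}

def check_custom_table(name, columns):
    """Return a list of convention violations for a proposed custom table."""
    problems = []
    if not name.startswith("custom_"):
        problems.append(f"table '{name}' should be prefixed with custom_")
    for col in columns:
        if col in CONTEXT_FREE_NAMES:
            problems.append(
                f"column '{col}' needs context, e.g. attribute_{col}"
            )
    return problems
```

For example, `check_custom_table("order_flags", ["value"])` flags both the missing prefix and the context-free column name, while a compliant table returns an empty list.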

The key was understanding that the ad-hoc reporting API requires explicit table aliases in the from array (not just table names) and SQL-style join conditions with full qualification. The "select" array supports AS aliases for output column naming. Without proper aliasing, any column name appearing in multiple tables causes ambiguity errors. This structure works for any number of joined tables.
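To make the mapping concrete, here is a rough illustration of the SQL that an aliased multi-join payload corresponds to. This is a sketch of the conceptual translation, not the engine's actual SQL generation, and it handles only the `joins` (plural) form:

```python
def payload_to_sql(query):
    """Illustrative translation of an aliased query payload to SQL text."""
    # Map each alias back to its table name for the JOIN clauses.
    tables = {t["alias"]: t["table"] for t in query["from"]}
    base = query["from"][0]
    sql = "SELECT " + ", ".join(query["select"])
    sql += f" FROM {base['table']} {base['alias']}"
    for j in query["joins"]:
        sql += (
            f" {j['type'].upper()} JOIN {tables[j['table']]}"
            f" {j['table']} ON {j['on']}"
        )
    return sql

# The multi-join payload from step 2 above.
query = {
    "from": [
        {"table": "sales_orders", "alias": "so"},
        {"table": "custom_order_attributes", "alias": "coa"},
        {"table": "custom_order_pricing", "alias": "cop"},
    ],
    "joins": [
        {"type": "inner", "table": "coa", "on": "so.order_id = coa.order_id"},
        {"type": "left", "table": "cop", "on": "so.order_id = cop.order_id"},
    ],
    "select": [
        "so.order_id AS order_number",
        "coa.custom_attribute",
        "cop.custom_price",
    ],
}
sql = payload_to_sql(query)
```

Seen this way, the alias requirement is obvious: without `so.`/`coa.`/`cop.` prefixes, the generated SQL would contain the same bare `order_id` three times.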

Use table aliases in your API query payload structure. The API supports alias definitions: "from": [{"table": "sales_orders", "alias": "so"}, {"table": "custom_order_attributes", "alias": "coa"}]. Then reference aliases in joins and selects: "join": {"on": "so.order_id = coa.order_id"}, "select": ["so.order_id", "coa.custom_attribute"].

Added table prefixes to the select fields, but now the join condition is failing. The join "on" clause also has the ambiguous column issue. How do I specify table prefixes in the join configuration?