Compare commits

..

30 commits, all by Beto Dealmeida:

| SHA1 | Message | Date |
|------|---------|------|
| `0f08f016d2` | Address semantic layer review nits: Improve semantic layer schema refresh error handling and connections endpoint behavior to reduce noisy failures while keeping this feature branch focused. Also restore frontend typing consistency and add debounce coverage for dynamic schema refresh. | 2026-04-23 14:00:59 -04:00 |
| `65fb2ff834` | Fix rebase | 2026-04-23 13:33:47 -04:00 |
| `d659089c59` | feat: UI for semantic layers | 2026-04-23 13:33:47 -04:00 |
| `5e046a857c` | Update permissions | 2026-04-23 13:33:47 -04:00 |
| `36554237aa` | Address comments | 2026-04-23 13:33:47 -04:00 |
| `6f93e1cbb1` | feat: API for semantic layers | 2026-04-23 13:33:47 -04:00 |
| `913259299e` | Address comments | 2026-04-23 13:26:40 -04:00 |
| `2351e0ead7` | Move logic to commands | 2026-04-16 18:16:23 -04:00 |
| `8c6f211003` | Address comments | 2026-04-16 18:16:23 -04:00 |
| `0e3d78817f` | Fix imports | 2026-04-16 18:16:23 -04:00 |
| `f0c8304e24` | feat: UI for semantic views | 2026-04-16 18:16:23 -04:00 |
| `80233aed46` | Fix DAO | 2026-04-16 18:16:23 -04:00 |
| `6f350428df` | Check uniqueness | 2026-04-16 18:16:23 -04:00 |
| `548ccfde44` | feat: API for semantic views | 2026-04-16 18:16:23 -04:00 |
| `596008203c` | Fix Datasource type | 2026-04-16 18:16:23 -04:00 |
| `ff46c86df3` | feat: Explore integration | 2026-04-16 18:16:23 -04:00 |
| `4e30638024` | Address more comments | 2026-04-16 18:14:39 -04:00 |
| `efa9159cc8` | Address comments | 2026-04-16 11:22:44 -04:00 |
| `14668f37bd` | Improvements | 2026-03-10 15:32:07 -04:00 |
| `27a2466855` | feat: models and DAOs | 2026-03-10 14:15:07 -04:00 |
| `e35c6946ec` | Fix lint/tests | 2026-03-10 13:48:21 -04:00 |
| `12c5bfa0a5` | Improve types | 2026-03-10 12:56:09 -04:00 |
| `0303a234a3` | Fix tests | 2026-03-10 12:56:09 -04:00 |
| `09e9927652` | Simplify | 2026-03-10 12:56:09 -04:00 |
| `3f9ea361bb` | docs: add semantic layers to contribution types | 2026-03-10 12:56:09 -04:00 |
| `f1047140ee` | feat: add @semantic_layer decorator for extension discovery | 2026-03-10 12:56:09 -04:00 |
| `15e3ab4493` | Address comments | 2026-03-10 12:56:09 -04:00 |
| `755aa2e32f` | Address TODOs | 2026-03-10 12:56:09 -04:00 |
| `17d1ed7353` | chore: remove AdhocFilter | 2026-03-10 12:56:09 -04:00 |
| `9c1bcb70d0` | feat: semantic layer extension | 2026-03-10 12:56:09 -04:00 |
276 changed files with 15120 additions and 11001 deletions


@@ -68,7 +68,7 @@ jobs:
yarn install --check-cache
- name: Download database diagnostics (if triggered by integration tests)
if: github.event_name == 'workflow_run' && github.event.workflow_run.conclusion == 'success'
-      uses: dawidd6/action-download-artifact@v17
+      uses: dawidd6/action-download-artifact@v16
continue-on-error: true
with:
workflow: superset-python-integrationtest.yml
@@ -77,7 +77,7 @@ jobs:
path: docs/src/data/
- name: Try to download latest diagnostics (for push/dispatch triggers)
if: github.event_name != 'workflow_run'
-      uses: dawidd6/action-download-artifact@v17
+      uses: dawidd6/action-download-artifact@v16
continue-on-error: true
with:
workflow: superset-python-integrationtest.yml


@@ -111,13 +111,12 @@ jobs:
run: |
yarn install --check-cache
- name: Download database diagnostics from integration tests
-      uses: dawidd6/action-download-artifact@v17
+      uses: dawidd6/action-download-artifact@v16
with:
workflow: superset-python-integrationtest.yml
run_id: ${{ github.event.workflow_run.id }}
name: database-diagnostics
path: docs/src/data/
-        if_no_artifact_found: 'warning'
- name: Use fresh diagnostics
run: |
if [ -f "src/data/databases-diagnostics.json" ]; then


@@ -52,6 +52,7 @@ jobs:
SUPERSET_SECRET_KEY: not-a-secret
run: |
pytest --durations-min=0.5 --cov=superset/sql/ ./tests/unit_tests/sql/ --cache-clear --cov-fail-under=100
+          pytest --durations-min=0.5 --cov=superset/semantic_layers/ ./tests/unit_tests/semantic_layers/ --cache-clear --cov-fail-under=100
- name: Upload code coverage
uses: codecov/codecov-action@v5
with:


@@ -24,13 +24,13 @@ assists people when migrating to a new version.
## Next
-### Deck.gl MapBox viewport and opacity controls are functional
-The Deck.gl MapBox chart's **Opacity**, **Default longitude**, **Default latitude**, and **Zoom** controls were previously non-functional — changing them had no effect on the rendered map. These controls are now wired up correctly.
-**Behavior change for existing charts:** Previously, the viewport controls had hard-coded default values (`-122.405293`, `37.772123`, zoom `11` — San Francisco) that were stored in each chart's `form_data` but never applied. The map always used `fitBounds` to center on the data. With this fix, those stored values are now respected, which means existing MapBox charts may open centered on the old default coordinates instead of fitting to data bounds.
-**To restore fit-to-data behavior:** Open the chart in Explore, clear the **Default longitude**, **Default latitude**, and **Zoom** fields in the Viewport section, and re-save the chart.
+### Combined datasource list endpoint
+Added a new combined datasource list endpoint at `GET /api/v1/datasource/` to serve datasets and semantic views in one response.
+- The endpoint is available to users with at least one of `can_read` on `Dataset` or `SemanticView`.
+- Semantic views are included only when the `SEMANTIC_LAYERS` feature flag is enabled.
+- The endpoint enforces strict `order_column` validation and returns `400` for invalid sort columns.
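The bullets above describe the new endpoint's contract. Assuming it follows the same list conventions as Superset's other `/api/v1/` resources (a Rison-encoded `q` query parameter and bearer-token auth — an assumption, not confirmed by this diff), a request URL might be sketched like this; the host and parameter values are hypothetical:

```python
# Sketch: building a URL for the hypothetical combined datasource list endpoint.
# The Rison-style `q` parameter mirrors other Superset list endpoints; it is an
# assumption here, as the diff only documents the route and its permissions.
from urllib.parse import quote

BASE = "https://superset.example.com"  # hypothetical host

def datasource_list_url(order_column="changed_on", order_direction="desc",
                        page=0, page_size=25):
    """Build GET /api/v1/datasource/ with a Rison-style q parameter."""
    q = (f"(order_column:{order_column},order_direction:{order_direction},"
         f"page:{page},page_size:{page_size})")
    return f"{BASE}/api/v1/datasource/?q={quote(q)}"

url = datasource_list_url()
```

Per the bullet above, an `order_column` outside the allowed set would be rejected with a `400` rather than passed through to the query.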
### ClickHouse minimum driver version bump


@@ -47,10 +47,10 @@ curl -H "Authorization: Bearer YOUR_ACCESS_TOKEN" \
| Method | Endpoint | Description |
|--------|----------|-------------|
| `GET` | [Get the CSRF token](/developer-docs/api/get-the-csrf-token) | `/api/v1/security/csrf_token/` |
| `POST` | [Get a guest token](/developer-docs/api/get-a-guest-token) | `/api/v1/security/guest_token/` |
| `POST` | [Create security login](/developer-docs/api/create-security-login) | `/api/v1/security/login` |
| `POST` | [Create security refresh](/developer-docs/api/create-security-refresh) | `/api/v1/security/refresh` |
| `GET` | [Get the CSRF token](./api/get-the-csrf-token) | `/api/v1/security/csrf_token/` |
| `POST` | [Get a guest token](./api/get-a-guest-token) | `/api/v1/security/guest_token/` |
| `POST` | [Create security login](./api/create-security-login) | `/api/v1/security/login` |
| `POST` | [Create security refresh](./api/create-security-refresh) | `/api/v1/security/refresh` |
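The security endpoints above are typically used in sequence: log in to obtain an access token, then send it as a bearer header on subsequent requests. A minimal sketch of the request body and header involved, assuming the standard Flask-AppBuilder `db` auth provider (the provider name is an assumption about this deployment, not stated in the table):

```python
def login_payload(username: str, password: str) -> dict:
    # Body for POST /api/v1/security/login; the "db" provider and the
    # refresh-token request are Flask-AppBuilder defaults, assumed here.
    return {"username": username, "password": password,
            "provider": "db", "refresh": True}

def auth_header(access_token: str) -> dict:
    # Header shape used by the curl example earlier in this page.
    return {"Authorization": f"Bearer {access_token}"}

payload = login_payload("admin", "admin")       # hypothetical credentials
headers = auth_header("YOUR_ACCESS_TOKEN")
```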
---
@@ -63,32 +63,32 @@ curl -H "Authorization: Bearer YOUR_ACCESS_TOKEN" \
| Method | Endpoint | Description |
|--------|----------|-------------|
| `DELETE` | [Bulk delete dashboards](/developer-docs/api/bulk-delete-dashboards) | `/api/v1/dashboard/` |
| `GET` | [Get a list of dashboards](/developer-docs/api/get-a-list-of-dashboards) | `/api/v1/dashboard/` |
| `POST` | [Create a new dashboard](/developer-docs/api/create-a-new-dashboard) | `/api/v1/dashboard/` |
| `GET` | [Get metadata information about this API resource (dashboard--info)](/developer-docs/api/get-metadata-information-about-this-api-resource-dashboard-info) | `/api/v1/dashboard/_info` |
| `GET` | [Get a dashboard detail information](/developer-docs/api/get-a-dashboard-detail-information) | `/api/v1/dashboard/{id_or_slug}` |
| `GET` | [Get a dashboard's chart definitions.](/developer-docs/api/get-a-dashboard-s-chart-definitions) | `/api/v1/dashboard/{id_or_slug}/charts` |
| `POST` | [Create a copy of an existing dashboard](/developer-docs/api/create-a-copy-of-an-existing-dashboard) | `/api/v1/dashboard/{id_or_slug}/copy/` |
| `GET` | [Get dashboard's datasets](/developer-docs/api/get-dashboard-s-datasets) | `/api/v1/dashboard/{id_or_slug}/datasets` |
| `DELETE` | [Delete a dashboard's embedded configuration](/developer-docs/api/delete-a-dashboard-s-embedded-configuration) | `/api/v1/dashboard/{id_or_slug}/embedded` |
| `GET` | [Get the dashboard's embedded configuration](/developer-docs/api/get-the-dashboard-s-embedded-configuration) | `/api/v1/dashboard/{id_or_slug}/embedded` |
| `POST` | [Set a dashboard's embedded configuration](/developer-docs/api/set-a-dashboard-s-embedded-configuration) | `/api/v1/dashboard/{id_or_slug}/embedded` |
| `PUT` | [Update dashboard by id_or_slug embedded](/developer-docs/api/update-dashboard-by-id-or-slug-embedded) | `/api/v1/dashboard/{id_or_slug}/embedded` |
| `GET` | [Get dashboard's tabs](/developer-docs/api/get-dashboard-s-tabs) | `/api/v1/dashboard/{id_or_slug}/tabs` |
| `DELETE` | [Delete a dashboard](/developer-docs/api/delete-a-dashboard) | `/api/v1/dashboard/{pk}` |
| `PUT` | [Update a dashboard](/developer-docs/api/update-a-dashboard) | `/api/v1/dashboard/{pk}` |
| `POST` | [Compute and cache a screenshot (dashboard-pk-cache-dashboard-screenshot)](/developer-docs/api/compute-and-cache-a-screenshot-dashboard-pk-cache-dashboard-screenshot) | `/api/v1/dashboard/{pk}/cache_dashboard_screenshot/` |
| `PUT` | [Update colors configuration for a dashboard.](/developer-docs/api/update-colors-configuration-for-a-dashboard) | `/api/v1/dashboard/{pk}/colors` |
| `DELETE` | [Remove the dashboard from the user favorite list](/developer-docs/api/remove-the-dashboard-from-the-user-favorite-list) | `/api/v1/dashboard/{pk}/favorites/` |
| `POST` | [Mark the dashboard as favorite for the current user](/developer-docs/api/mark-the-dashboard-as-favorite-for-the-current-user) | `/api/v1/dashboard/{pk}/favorites/` |
| `PUT` | [Update native filters configuration for a dashboard.](/developer-docs/api/update-native-filters-configuration-for-a-dashboard) | `/api/v1/dashboard/{pk}/filters` |
| `GET` | [Get a computed screenshot from cache (dashboard-pk-screenshot-digest)](/developer-docs/api/get-a-computed-screenshot-from-cache-dashboard-pk-screenshot-digest) | `/api/v1/dashboard/{pk}/screenshot/{digest}/` |
| `GET` | [Get dashboard's thumbnail](/developer-docs/api/get-dashboard-s-thumbnail) | `/api/v1/dashboard/{pk}/thumbnail/{digest}/` |
| `GET` | [Download multiple dashboards as YAML files](/developer-docs/api/download-multiple-dashboards-as-yaml-files) | `/api/v1/dashboard/export/` |
| `GET` | [Check favorited dashboards for current user](/developer-docs/api/check-favorited-dashboards-for-current-user) | `/api/v1/dashboard/favorite_status/` |
| `POST` | [Import dashboard(s) with associated charts/datasets/databases](/developer-docs/api/import-dashboard-s-with-associated-charts-datasets-databases) | `/api/v1/dashboard/import/` |
| `GET` | [Get related fields data (dashboard-related-column-name)](/developer-docs/api/get-related-fields-data-dashboard-related-column-name) | `/api/v1/dashboard/related/{column_name}` |
| `DELETE` | [Bulk delete dashboards](./api/bulk-delete-dashboards) | `/api/v1/dashboard/` |
| `GET` | [Get a list of dashboards](./api/get-a-list-of-dashboards) | `/api/v1/dashboard/` |
| `POST` | [Create a new dashboard](./api/create-a-new-dashboard) | `/api/v1/dashboard/` |
| `GET` | [Get metadata information about this API resource (dashboard--info)](./api/get-metadata-information-about-this-api-resource-dashboard-info) | `/api/v1/dashboard/_info` |
| `GET` | [Get a dashboard detail information](./api/get-a-dashboard-detail-information) | `/api/v1/dashboard/{id_or_slug}` |
| `GET` | [Get a dashboard's chart definitions.](./api/get-a-dashboard-s-chart-definitions) | `/api/v1/dashboard/{id_or_slug}/charts` |
| `POST` | [Create a copy of an existing dashboard](./api/create-a-copy-of-an-existing-dashboard) | `/api/v1/dashboard/{id_or_slug}/copy/` |
| `GET` | [Get dashboard's datasets](./api/get-dashboard-s-datasets) | `/api/v1/dashboard/{id_or_slug}/datasets` |
| `DELETE` | [Delete a dashboard's embedded configuration](./api/delete-a-dashboard-s-embedded-configuration) | `/api/v1/dashboard/{id_or_slug}/embedded` |
| `GET` | [Get the dashboard's embedded configuration](./api/get-the-dashboard-s-embedded-configuration) | `/api/v1/dashboard/{id_or_slug}/embedded` |
| `POST` | [Set a dashboard's embedded configuration](./api/set-a-dashboard-s-embedded-configuration) | `/api/v1/dashboard/{id_or_slug}/embedded` |
| `PUT` | [Update dashboard by id_or_slug embedded](./api/update-dashboard-by-id-or-slug-embedded) | `/api/v1/dashboard/{id_or_slug}/embedded` |
| `GET` | [Get dashboard's tabs](./api/get-dashboard-s-tabs) | `/api/v1/dashboard/{id_or_slug}/tabs` |
| `DELETE` | [Delete a dashboard](./api/delete-a-dashboard) | `/api/v1/dashboard/{pk}` |
| `PUT` | [Update a dashboard](./api/update-a-dashboard) | `/api/v1/dashboard/{pk}` |
| `POST` | [Compute and cache a screenshot (dashboard-pk-cache-dashboard-screenshot)](./api/compute-and-cache-a-screenshot-dashboard-pk-cache-dashboard-screenshot) | `/api/v1/dashboard/{pk}/cache_dashboard_screenshot/` |
| `PUT` | [Update colors configuration for a dashboard.](./api/update-colors-configuration-for-a-dashboard) | `/api/v1/dashboard/{pk}/colors` |
| `DELETE` | [Remove the dashboard from the user favorite list](./api/remove-the-dashboard-from-the-user-favorite-list) | `/api/v1/dashboard/{pk}/favorites/` |
| `POST` | [Mark the dashboard as favorite for the current user](./api/mark-the-dashboard-as-favorite-for-the-current-user) | `/api/v1/dashboard/{pk}/favorites/` |
| `PUT` | [Update native filters configuration for a dashboard.](./api/update-native-filters-configuration-for-a-dashboard) | `/api/v1/dashboard/{pk}/filters` |
| `GET` | [Get a computed screenshot from cache (dashboard-pk-screenshot-digest)](./api/get-a-computed-screenshot-from-cache-dashboard-pk-screenshot-digest) | `/api/v1/dashboard/{pk}/screenshot/{digest}/` |
| `GET` | [Get dashboard's thumbnail](./api/get-dashboard-s-thumbnail) | `/api/v1/dashboard/{pk}/thumbnail/{digest}/` |
| `GET` | [Download multiple dashboards as YAML files](./api/download-multiple-dashboards-as-yaml-files) | `/api/v1/dashboard/export/` |
| `GET` | [Check favorited dashboards for current user](./api/check-favorited-dashboards-for-current-user) | `/api/v1/dashboard/favorite_status/` |
| `POST` | [Import dashboard(s) with associated charts/datasets/databases](./api/import-dashboard-s-with-associated-charts-datasets-databases) | `/api/v1/dashboard/import/` |
| `GET` | [Get related fields data (dashboard-related-column-name)](./api/get-related-fields-data-dashboard-related-column-name) | `/api/v1/dashboard/related/{column_name}` |
</details>
@@ -97,26 +97,26 @@ curl -H "Authorization: Bearer YOUR_ACCESS_TOKEN" \
| Method | Endpoint | Description |
|--------|----------|-------------|
| `DELETE` | [Bulk delete charts](/developer-docs/api/bulk-delete-charts) | `/api/v1/chart/` |
| `GET` | [Get a list of charts](/developer-docs/api/get-a-list-of-charts) | `/api/v1/chart/` |
| `POST` | [Create a new chart](/developer-docs/api/create-a-new-chart) | `/api/v1/chart/` |
| `GET` | [Get metadata information about this API resource (chart--info)](/developer-docs/api/get-metadata-information-about-this-api-resource-chart-info) | `/api/v1/chart/_info` |
| `DELETE` | [Delete a chart](/developer-docs/api/delete-a-chart) | `/api/v1/chart/{pk}` |
| `GET` | [Get a chart detail information](/developer-docs/api/get-a-chart-detail-information) | `/api/v1/chart/{pk}` |
| `PUT` | [Update a chart](/developer-docs/api/update-a-chart) | `/api/v1/chart/{pk}` |
| `GET` | [Compute and cache a screenshot (chart-pk-cache-screenshot)](/developer-docs/api/compute-and-cache-a-screenshot-chart-pk-cache-screenshot) | `/api/v1/chart/{pk}/cache_screenshot/` |
| `GET` | [Return payload data response for a chart](/developer-docs/api/return-payload-data-response-for-a-chart) | `/api/v1/chart/{pk}/data/` |
| `DELETE` | [Remove the chart from the user favorite list](/developer-docs/api/remove-the-chart-from-the-user-favorite-list) | `/api/v1/chart/{pk}/favorites/` |
| `POST` | [Mark the chart as favorite for the current user](/developer-docs/api/mark-the-chart-as-favorite-for-the-current-user) | `/api/v1/chart/{pk}/favorites/` |
| `GET` | [Get a computed screenshot from cache (chart-pk-screenshot-digest)](/developer-docs/api/get-a-computed-screenshot-from-cache-chart-pk-screenshot-digest) | `/api/v1/chart/{pk}/screenshot/{digest}/` |
| `GET` | [Get chart thumbnail](/developer-docs/api/get-chart-thumbnail) | `/api/v1/chart/{pk}/thumbnail/{digest}/` |
| `POST` | [Return payload data response for the given query (chart-data)](/developer-docs/api/return-payload-data-response-for-the-given-query-chart-data) | `/api/v1/chart/data` |
| `GET` | [Return payload data response for the given query (chart-data-cache-key)](/developer-docs/api/return-payload-data-response-for-the-given-query-chart-data-cache-key) | `/api/v1/chart/data/{cache_key}` |
| `GET` | [Download multiple charts as YAML files](/developer-docs/api/download-multiple-charts-as-yaml-files) | `/api/v1/chart/export/` |
| `GET` | [Check favorited charts for current user](/developer-docs/api/check-favorited-charts-for-current-user) | `/api/v1/chart/favorite_status/` |
| `POST` | [Import chart(s) with associated datasets and databases](/developer-docs/api/import-chart-s-with-associated-datasets-and-databases) | `/api/v1/chart/import/` |
| `GET` | [Get related fields data (chart-related-column-name)](/developer-docs/api/get-related-fields-data-chart-related-column-name) | `/api/v1/chart/related/{column_name}` |
| `PUT` | [Warm up the cache for the chart](/developer-docs/api/warm-up-the-cache-for-the-chart) | `/api/v1/chart/warm_up_cache` |
| `DELETE` | [Bulk delete charts](./api/bulk-delete-charts) | `/api/v1/chart/` |
| `GET` | [Get a list of charts](./api/get-a-list-of-charts) | `/api/v1/chart/` |
| `POST` | [Create a new chart](./api/create-a-new-chart) | `/api/v1/chart/` |
| `GET` | [Get metadata information about this API resource (chart--info)](./api/get-metadata-information-about-this-api-resource-chart-info) | `/api/v1/chart/_info` |
| `DELETE` | [Delete a chart](./api/delete-a-chart) | `/api/v1/chart/{pk}` |
| `GET` | [Get a chart detail information](./api/get-a-chart-detail-information) | `/api/v1/chart/{pk}` |
| `PUT` | [Update a chart](./api/update-a-chart) | `/api/v1/chart/{pk}` |
| `GET` | [Compute and cache a screenshot (chart-pk-cache-screenshot)](./api/compute-and-cache-a-screenshot-chart-pk-cache-screenshot) | `/api/v1/chart/{pk}/cache_screenshot/` |
| `GET` | [Return payload data response for a chart](./api/return-payload-data-response-for-a-chart) | `/api/v1/chart/{pk}/data/` |
| `DELETE` | [Remove the chart from the user favorite list](./api/remove-the-chart-from-the-user-favorite-list) | `/api/v1/chart/{pk}/favorites/` |
| `POST` | [Mark the chart as favorite for the current user](./api/mark-the-chart-as-favorite-for-the-current-user) | `/api/v1/chart/{pk}/favorites/` |
| `GET` | [Get a computed screenshot from cache (chart-pk-screenshot-digest)](./api/get-a-computed-screenshot-from-cache-chart-pk-screenshot-digest) | `/api/v1/chart/{pk}/screenshot/{digest}/` |
| `GET` | [Get chart thumbnail](./api/get-chart-thumbnail) | `/api/v1/chart/{pk}/thumbnail/{digest}/` |
| `POST` | [Return payload data response for the given query (chart-data)](./api/return-payload-data-response-for-the-given-query-chart-data) | `/api/v1/chart/data` |
| `GET` | [Return payload data response for the given query (chart-data-cache-key)](./api/return-payload-data-response-for-the-given-query-chart-data-cache-key) | `/api/v1/chart/data/{cache_key}` |
| `GET` | [Download multiple charts as YAML files](./api/download-multiple-charts-as-yaml-files) | `/api/v1/chart/export/` |
| `GET` | [Check favorited charts for current user](./api/check-favorited-charts-for-current-user) | `/api/v1/chart/favorite_status/` |
| `POST` | [Import chart(s) with associated datasets and databases](./api/import-chart-s-with-associated-datasets-and-databases) | `/api/v1/chart/import/` |
| `GET` | [Get related fields data (chart-related-column-name)](./api/get-related-fields-data-chart-related-column-name) | `/api/v1/chart/related/{column_name}` |
| `PUT` | [Warm up the cache for the chart](./api/warm-up-the-cache-for-the-chart) | `/api/v1/chart/warm_up_cache` |
</details>
@@ -125,24 +125,24 @@ curl -H "Authorization: Bearer YOUR_ACCESS_TOKEN" \
| Method | Endpoint | Description |
|--------|----------|-------------|
| `DELETE` | [Bulk delete datasets](/developer-docs/api/bulk-delete-datasets) | `/api/v1/dataset/` |
| `GET` | [Get a list of datasets](/developer-docs/api/get-a-list-of-datasets) | `/api/v1/dataset/` |
| `POST` | [Create a new dataset](/developer-docs/api/create-a-new-dataset) | `/api/v1/dataset/` |
| `GET` | [Get metadata information about this API resource (dataset--info)](/developer-docs/api/get-metadata-information-about-this-api-resource-dataset-info) | `/api/v1/dataset/_info` |
| `DELETE` | [Delete a dataset](/developer-docs/api/delete-a-dataset) | `/api/v1/dataset/{pk}` |
| `GET` | [Get a dataset](/developer-docs/api/get-a-dataset) | `/api/v1/dataset/{pk}` |
| `PUT` | [Update a dataset](/developer-docs/api/update-a-dataset) | `/api/v1/dataset/{pk}` |
| `DELETE` | [Delete a dataset column](/developer-docs/api/delete-a-dataset-column) | `/api/v1/dataset/{pk}/column/{column_id}` |
| `DELETE` | [Delete a dataset metric](/developer-docs/api/delete-a-dataset-metric) | `/api/v1/dataset/{pk}/metric/{metric_id}` |
| `PUT` | [Refresh and update columns of a dataset](/developer-docs/api/refresh-and-update-columns-of-a-dataset) | `/api/v1/dataset/{pk}/refresh` |
| `GET` | [Get charts and dashboards count associated to a dataset](/developer-docs/api/get-charts-and-dashboards-count-associated-to-a-dataset) | `/api/v1/dataset/{pk}/related_objects` |
| `GET` | [Get distinct values from field data (dataset-distinct-column-name)](/developer-docs/api/get-distinct-values-from-field-data-dataset-distinct-column-name) | `/api/v1/dataset/distinct/{column_name}` |
| `POST` | [Duplicate a dataset](/developer-docs/api/duplicate-a-dataset) | `/api/v1/dataset/duplicate` |
| `GET` | [Download multiple datasets as YAML files](/developer-docs/api/download-multiple-datasets-as-yaml-files) | `/api/v1/dataset/export/` |
| `POST` | [Retrieve a table by name, or create it if it does not exist](/developer-docs/api/retrieve-a-table-by-name-or-create-it-if-it-does-not-exist) | `/api/v1/dataset/get_or_create/` |
| `POST` | [Import dataset(s) with associated databases](/developer-docs/api/import-dataset-s-with-associated-databases) | `/api/v1/dataset/import/` |
| `GET` | [Get related fields data (dataset-related-column-name)](/developer-docs/api/get-related-fields-data-dataset-related-column-name) | `/api/v1/dataset/related/{column_name}` |
| `PUT` | [Warm up the cache for each chart powered by the given table](/developer-docs/api/warm-up-the-cache-for-each-chart-powered-by-the-given-table) | `/api/v1/dataset/warm_up_cache` |
| `DELETE` | [Bulk delete datasets](./api/bulk-delete-datasets) | `/api/v1/dataset/` |
| `GET` | [Get a list of datasets](./api/get-a-list-of-datasets) | `/api/v1/dataset/` |
| `POST` | [Create a new dataset](./api/create-a-new-dataset) | `/api/v1/dataset/` |
| `GET` | [Get metadata information about this API resource (dataset--info)](./api/get-metadata-information-about-this-api-resource-dataset-info) | `/api/v1/dataset/_info` |
| `DELETE` | [Delete a dataset](./api/delete-a-dataset) | `/api/v1/dataset/{pk}` |
| `GET` | [Get a dataset](./api/get-a-dataset) | `/api/v1/dataset/{pk}` |
| `PUT` | [Update a dataset](./api/update-a-dataset) | `/api/v1/dataset/{pk}` |
| `DELETE` | [Delete a dataset column](./api/delete-a-dataset-column) | `/api/v1/dataset/{pk}/column/{column_id}` |
| `DELETE` | [Delete a dataset metric](./api/delete-a-dataset-metric) | `/api/v1/dataset/{pk}/metric/{metric_id}` |
| `PUT` | [Refresh and update columns of a dataset](./api/refresh-and-update-columns-of-a-dataset) | `/api/v1/dataset/{pk}/refresh` |
| `GET` | [Get charts and dashboards count associated to a dataset](./api/get-charts-and-dashboards-count-associated-to-a-dataset) | `/api/v1/dataset/{pk}/related_objects` |
| `GET` | [Get distinct values from field data (dataset-distinct-column-name)](./api/get-distinct-values-from-field-data-dataset-distinct-column-name) | `/api/v1/dataset/distinct/{column_name}` |
| `POST` | [Duplicate a dataset](./api/duplicate-a-dataset) | `/api/v1/dataset/duplicate` |
| `GET` | [Download multiple datasets as YAML files](./api/download-multiple-datasets-as-yaml-files) | `/api/v1/dataset/export/` |
| `POST` | [Retrieve a table by name, or create it if it does not exist](./api/retrieve-a-table-by-name-or-create-it-if-it-does-not-exist) | `/api/v1/dataset/get_or_create/` |
| `POST` | [Import dataset(s) with associated databases](./api/import-dataset-s-with-associated-databases) | `/api/v1/dataset/import/` |
| `GET` | [Get related fields data (dataset-related-column-name)](./api/get-related-fields-data-dataset-related-column-name) | `/api/v1/dataset/related/{column_name}` |
| `PUT` | [Warm up the cache for each chart powered by the given table](./api/warm-up-the-cache-for-each-chart-powered-by-the-given-table) | `/api/v1/dataset/warm_up_cache` |
</details>
@@ -151,37 +151,37 @@ curl -H "Authorization: Bearer YOUR_ACCESS_TOKEN" \
| Method | Endpoint | Description |
|--------|----------|-------------|
| `GET` | [Get a list of databases](/developer-docs/api/get-a-list-of-databases) | `/api/v1/database/` |
| `POST` | [Create a new database](/developer-docs/api/create-a-new-database) | `/api/v1/database/` |
| `GET` | [Get metadata information about this API resource (database--info)](/developer-docs/api/get-metadata-information-about-this-api-resource-database-info) | `/api/v1/database/_info` |
| `DELETE` | [Delete a database](/developer-docs/api/delete-a-database) | `/api/v1/database/{pk}` |
| `GET` | [Get a database](/developer-docs/api/get-a-database) | `/api/v1/database/{pk}` |
| `PUT` | [Change a database](/developer-docs/api/change-a-database) | `/api/v1/database/{pk}` |
| `GET` | [Get all catalogs from a database](/developer-docs/api/get-all-catalogs-from-a-database) | `/api/v1/database/{pk}/catalogs/` |
| `GET` | [Get a database connection info](/developer-docs/api/get-a-database-connection-info) | `/api/v1/database/{pk}/connection` |
| `GET` | [Get function names supported by a database](/developer-docs/api/get-function-names-supported-by-a-database) | `/api/v1/database/{pk}/function_names/` |
| `GET` | [Get charts and dashboards count associated to a database](/developer-docs/api/get-charts-and-dashboards-count-associated-to-a-database) | `/api/v1/database/{pk}/related_objects/` |
| `GET` | [The list of the database schemas where to upload information](/developer-docs/api/the-list-of-the-database-schemas-where-to-upload-information) | `/api/v1/database/{pk}/schemas_access_for_file_upload/` |
| `GET` | [Get all schemas from a database](/developer-docs/api/get-all-schemas-from-a-database) | `/api/v1/database/{pk}/schemas/` |
| `GET` | [Get database select star for table (database-pk-select-star-table-name)](/developer-docs/api/get-database-select-star-for-table-database-pk-select-star-table-name) | `/api/v1/database/{pk}/select_star/{table_name}/` |
| `GET` | [Get database select star for table (database-pk-select-star-table-name-schema-name)](/developer-docs/api/get-database-select-star-for-table-database-pk-select-star-table-name-schema-name) | `/api/v1/database/{pk}/select_star/{table_name}/{schema_name}/` |
| `DELETE` | [Delete a SSH tunnel](/developer-docs/api/delete-a-ssh-tunnel) | `/api/v1/database/{pk}/ssh_tunnel/` |
| `POST` | [Re-sync all permissions for a database connection](/developer-docs/api/re-sync-all-permissions-for-a-database-connection) | `/api/v1/database/{pk}/sync_permissions/` |
| `GET` | [Get table extra metadata (database-pk-table-extra-table-name-schema-name)](/developer-docs/api/get-table-extra-metadata-database-pk-table-extra-table-name-schema-name) | `/api/v1/database/{pk}/table_extra/{table_name}/{schema_name}/` |
| `GET` | [Get table metadata](/developer-docs/api/get-table-metadata) | `/api/v1/database/{pk}/table_metadata/` |
| `GET` | [Get table extra metadata (database-pk-table-metadata-extra)](/developer-docs/api/get-table-extra-metadata-database-pk-table-metadata-extra) | `/api/v1/database/{pk}/table_metadata/extra/` |
| `GET` | [Get database table metadata](/developer-docs/api/get-database-table-metadata) | `/api/v1/database/{pk}/table/{table_name}/{schema_name}/` |
| `GET` | [Get a list of tables for given database](/developer-docs/api/get-a-list-of-tables-for-given-database) | `/api/v1/database/{pk}/tables/` |
| `POST` | [Upload a file to a database table](/developer-docs/api/upload-a-file-to-a-database-table) | `/api/v1/database/{pk}/upload/` |
| `POST` | [Validate arbitrary SQL](/developer-docs/api/validate-arbitrary-sql) | `/api/v1/database/{pk}/validate_sql/` |
| `GET` | [Get names of databases currently available](/developer-docs/api/get-names-of-databases-currently-available) | `/api/v1/database/available/` |
| `GET` | [Download database(s) and associated dataset(s) as a zip file](/developer-docs/api/download-database-s-and-associated-dataset-s-as-a-zip-file) | `/api/v1/database/export/` |
| `POST` | [Import database(s) with associated datasets](/developer-docs/api/import-database-s-with-associated-datasets) | `/api/v1/database/import/` |
| `GET` | [Receive personal access tokens from OAuth2](/developer-docs/api/receive-personal-access-tokens-from-oauth2) | `/api/v1/database/oauth2/` |
| `GET` | [Get related fields data (database-related-column-name)](/developer-docs/api/get-related-fields-data-database-related-column-name) | `/api/v1/database/related/{column_name}` |
| `POST` | [Test a database connection](/developer-docs/api/test-a-database-connection) | `/api/v1/database/test_connection/` |
| `POST` | [Upload a file and returns file metadata](/developer-docs/api/upload-a-file-and-returns-file-metadata) | `/api/v1/database/upload_metadata/` |
-| `POST` | [Validate database connection parameters](/developer-docs/api/validate-database-connection-parameters) | `/api/v1/database/validate_parameters/` |
+| `GET` | [Get a list of databases](./api/get-a-list-of-databases) | `/api/v1/database/` |
+| `POST` | [Create a new database](./api/create-a-new-database) | `/api/v1/database/` |
+| `GET` | [Get metadata information about this API resource (database--info)](./api/get-metadata-information-about-this-api-resource-database-info) | `/api/v1/database/_info` |
+| `DELETE` | [Delete a database](./api/delete-a-database) | `/api/v1/database/{pk}` |
+| `GET` | [Get a database](./api/get-a-database) | `/api/v1/database/{pk}` |
+| `PUT` | [Change a database](./api/change-a-database) | `/api/v1/database/{pk}` |
+| `GET` | [Get all catalogs from a database](./api/get-all-catalogs-from-a-database) | `/api/v1/database/{pk}/catalogs/` |
+| `GET` | [Get a database connection info](./api/get-a-database-connection-info) | `/api/v1/database/{pk}/connection` |
+| `GET` | [Get function names supported by a database](./api/get-function-names-supported-by-a-database) | `/api/v1/database/{pk}/function_names/` |
+| `GET` | [Get charts and dashboards count associated to a database](./api/get-charts-and-dashboards-count-associated-to-a-database) | `/api/v1/database/{pk}/related_objects/` |
+| `GET` | [The list of the database schemas where to upload information](./api/the-list-of-the-database-schemas-where-to-upload-information) | `/api/v1/database/{pk}/schemas_access_for_file_upload/` |
+| `GET` | [Get all schemas from a database](./api/get-all-schemas-from-a-database) | `/api/v1/database/{pk}/schemas/` |
+| `GET` | [Get database select star for table (database-pk-select-star-table-name)](./api/get-database-select-star-for-table-database-pk-select-star-table-name) | `/api/v1/database/{pk}/select_star/{table_name}/` |
+| `GET` | [Get database select star for table (database-pk-select-star-table-name-schema-name)](./api/get-database-select-star-for-table-database-pk-select-star-table-name-schema-name) | `/api/v1/database/{pk}/select_star/{table_name}/{schema_name}/` |
+| `DELETE` | [Delete a SSH tunnel](./api/delete-a-ssh-tunnel) | `/api/v1/database/{pk}/ssh_tunnel/` |
+| `POST` | [Re-sync all permissions for a database connection](./api/re-sync-all-permissions-for-a-database-connection) | `/api/v1/database/{pk}/sync_permissions/` |
+| `GET` | [Get table extra metadata (database-pk-table-extra-table-name-schema-name)](./api/get-table-extra-metadata-database-pk-table-extra-table-name-schema-name) | `/api/v1/database/{pk}/table_extra/{table_name}/{schema_name}/` |
+| `GET` | [Get table metadata](./api/get-table-metadata) | `/api/v1/database/{pk}/table_metadata/` |
+| `GET` | [Get table extra metadata (database-pk-table-metadata-extra)](./api/get-table-extra-metadata-database-pk-table-metadata-extra) | `/api/v1/database/{pk}/table_metadata/extra/` |
+| `GET` | [Get database table metadata](./api/get-database-table-metadata) | `/api/v1/database/{pk}/table/{table_name}/{schema_name}/` |
+| `GET` | [Get a list of tables for given database](./api/get-a-list-of-tables-for-given-database) | `/api/v1/database/{pk}/tables/` |
+| `POST` | [Upload a file to a database table](./api/upload-a-file-to-a-database-table) | `/api/v1/database/{pk}/upload/` |
+| `POST` | [Validate arbitrary SQL](./api/validate-arbitrary-sql) | `/api/v1/database/{pk}/validate_sql/` |
+| `GET` | [Get names of databases currently available](./api/get-names-of-databases-currently-available) | `/api/v1/database/available/` |
+| `GET` | [Download database(s) and associated dataset(s) as a zip file](./api/download-database-s-and-associated-dataset-s-as-a-zip-file) | `/api/v1/database/export/` |
+| `POST` | [Import database(s) with associated datasets](./api/import-database-s-with-associated-datasets) | `/api/v1/database/import/` |
+| `GET` | [Receive personal access tokens from OAuth2](./api/receive-personal-access-tokens-from-oauth2) | `/api/v1/database/oauth2/` |
+| `GET` | [Get related fields data (database-related-column-name)](./api/get-related-fields-data-database-related-column-name) | `/api/v1/database/related/{column_name}` |
+| `POST` | [Test a database connection](./api/test-a-database-connection) | `/api/v1/database/test_connection/` |
+| `POST` | [Upload a file and returns file metadata](./api/upload-a-file-and-returns-file-metadata) | `/api/v1/database/upload_metadata/` |
+| `POST` | [Validate database connection parameters](./api/validate-database-connection-parameters) | `/api/v1/database/validate_parameters/` |
 </details>
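Every database route above expects the same `Authorization: Bearer` header shown in the surrounding `curl` examples. A minimal Python sketch of building such a request with the standard library; the base URL and token are placeholders, not values from this page:

```python
import urllib.request

BASE = "http://localhost:8088"  # placeholder Superset host, adjust for your deployment


def api_request(path: str, token: str, base: str = BASE) -> urllib.request.Request:
    """Build an authenticated request for any /api/v1/... route."""
    return urllib.request.Request(
        f"{base}{path}",
        headers={"Authorization": f"Bearer {token}"},
    )


# List database connections; send with urllib.request.urlopen(req)
# once a live Superset instance is available.
req = api_request("/api/v1/database/", "YOUR_ACCESS_TOKEN")
print(req.full_url)
```

The same helper works for every table in this reference, since only the path changes between resources.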
@@ -192,7 +192,7 @@ curl -H "Authorization: Bearer YOUR_ACCESS_TOKEN" \
 | Method | Description | Endpoint |
 |--------|-------------|----------|
-| `GET` | [Assemble Explore related information in a single endpoint](/developer-docs/api/assemble-explore-related-information-in-a-single-endpoint) | `/api/v1/explore/` |
+| `GET` | [Assemble Explore related information in a single endpoint](./api/assemble-explore-related-information-in-a-single-endpoint) | `/api/v1/explore/` |
 </details>
@@ -201,12 +201,12 @@ curl -H "Authorization: Bearer YOUR_ACCESS_TOKEN" \
 | Method | Description | Endpoint |
 |--------|-------------|----------|
-| `GET` | [Get the bootstrap data for SqlLab page](/developer-docs/api/get-the-bootstrap-data-for-sqllab-page) | `/api/v1/sqllab/` |
-| `POST` | [Estimate the SQL query execution cost](/developer-docs/api/estimate-the-sql-query-execution-cost) | `/api/v1/sqllab/estimate/` |
-| `POST` | [Execute a SQL query](/developer-docs/api/execute-a-sql-query) | `/api/v1/sqllab/execute/` |
-| `GET` | [Export the SQL query results to a CSV](/developer-docs/api/export-the-sql-query-results-to-a-csv) | `/api/v1/sqllab/export/{client_id}/` |
-| `POST` | [Format SQL code](/developer-docs/api/format-sql-code) | `/api/v1/sqllab/format_sql/` |
-| `GET` | [Get the result of a SQL query execution](/developer-docs/api/get-the-result-of-a-sql-query-execution) | `/api/v1/sqllab/results/` |
+| `GET` | [Get the bootstrap data for SqlLab page](./api/get-the-bootstrap-data-for-sqllab-page) | `/api/v1/sqllab/` |
+| `POST` | [Estimate the SQL query execution cost](./api/estimate-the-sql-query-execution-cost) | `/api/v1/sqllab/estimate/` |
+| `POST` | [Execute a SQL query](./api/execute-a-sql-query) | `/api/v1/sqllab/execute/` |
+| `GET` | [Export the SQL query results to a CSV](./api/export-the-sql-query-results-to-a-csv) | `/api/v1/sqllab/export/{client_id}/` |
+| `POST` | [Format SQL code](./api/format-sql-code) | `/api/v1/sqllab/format_sql/` |
+| `GET` | [Get the result of a SQL query execution](./api/get-the-result-of-a-sql-query-execution) | `/api/v1/sqllab/results/` |
 </details>
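The `/api/v1/sqllab/execute/` route above takes a JSON body. A sketch of assembling one; the field names here (`database_id`, `sql`, `runAsync`) are assumptions about the common shape of this endpoint, so verify them against your Superset version's OpenAPI spec before relying on them:

```python
import json


def execute_payload(database_id: int, sql: str, run_async: bool = False) -> bytes:
    """Encode an assumed request body for POST /api/v1/sqllab/execute/."""
    body = {
        "database_id": database_id,  # pk from /api/v1/database/
        "sql": sql,
        "runAsync": run_async,       # async execution pairs with /api/v1/sqllab/results/
    }
    return json.dumps(body).encode("utf-8")


payload = execute_payload(1, "SELECT 1")
print(payload)
```

When `runAsync` is enabled, the results are fetched later through the `GET /api/v1/sqllab/results/` route listed in the same table.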
@@ -215,23 +215,23 @@ curl -H "Authorization: Bearer YOUR_ACCESS_TOKEN" \
 | Method | Description | Endpoint |
 |--------|-------------|----------|
-| `GET` | [Get a list of queries](/developer-docs/api/get-a-list-of-queries) | `/api/v1/query/` |
-| `GET` | [Get query detail information](/developer-docs/api/get-query-detail-information) | `/api/v1/query/{pk}` |
-| `GET` | [Get distinct values from field data (query-distinct-column-name)](/developer-docs/api/get-distinct-values-from-field-data-query-distinct-column-name) | `/api/v1/query/distinct/{column_name}` |
-| `GET` | [Get related fields data (query-related-column-name)](/developer-docs/api/get-related-fields-data-query-related-column-name) | `/api/v1/query/related/{column_name}` |
-| `POST` | [Manually stop a query with client_id](/developer-docs/api/manually-stop-a-query-with-client-id) | `/api/v1/query/stop` |
-| `GET` | [Get a list of queries that changed after last_updated_ms](/developer-docs/api/get-a-list-of-queries-that-changed-after-last-updated-ms) | `/api/v1/query/updated_since` |
-| `DELETE` | [Bulk delete saved queries](/developer-docs/api/bulk-delete-saved-queries) | `/api/v1/saved_query/` |
-| `GET` | [Get a list of saved queries](/developer-docs/api/get-a-list-of-saved-queries) | `/api/v1/saved_query/` |
-| `POST` | [Create a saved query](/developer-docs/api/create-a-saved-query) | `/api/v1/saved_query/` |
-| `GET` | [Get metadata information about this API resource (saved-query--info)](/developer-docs/api/get-metadata-information-about-this-api-resource-saved-query-info) | `/api/v1/saved_query/_info` |
-| `DELETE` | [Delete a saved query](/developer-docs/api/delete-a-saved-query) | `/api/v1/saved_query/{pk}` |
-| `GET` | [Get a saved query](/developer-docs/api/get-a-saved-query) | `/api/v1/saved_query/{pk}` |
-| `PUT` | [Update a saved query](/developer-docs/api/update-a-saved-query) | `/api/v1/saved_query/{pk}` |
-| `GET` | [Get distinct values from field data (saved-query-distinct-column-name)](/developer-docs/api/get-distinct-values-from-field-data-saved-query-distinct-column-name) | `/api/v1/saved_query/distinct/{column_name}` |
-| `GET` | [Download multiple saved queries as YAML files](/developer-docs/api/download-multiple-saved-queries-as-yaml-files) | `/api/v1/saved_query/export/` |
-| `POST` | [Import saved queries with associated databases](/developer-docs/api/import-saved-queries-with-associated-databases) | `/api/v1/saved_query/import/` |
-| `GET` | [Get related fields data (saved-query-related-column-name)](/developer-docs/api/get-related-fields-data-saved-query-related-column-name) | `/api/v1/saved_query/related/{column_name}` |
+| `GET` | [Get a list of queries](./api/get-a-list-of-queries) | `/api/v1/query/` |
+| `GET` | [Get query detail information](./api/get-query-detail-information) | `/api/v1/query/{pk}` |
+| `GET` | [Get distinct values from field data (query-distinct-column-name)](./api/get-distinct-values-from-field-data-query-distinct-column-name) | `/api/v1/query/distinct/{column_name}` |
+| `GET` | [Get related fields data (query-related-column-name)](./api/get-related-fields-data-query-related-column-name) | `/api/v1/query/related/{column_name}` |
+| `POST` | [Manually stop a query with client_id](./api/manually-stop-a-query-with-client-id) | `/api/v1/query/stop` |
+| `GET` | [Get a list of queries that changed after last_updated_ms](./api/get-a-list-of-queries-that-changed-after-last-updated-ms) | `/api/v1/query/updated_since` |
+| `DELETE` | [Bulk delete saved queries](./api/bulk-delete-saved-queries) | `/api/v1/saved_query/` |
+| `GET` | [Get a list of saved queries](./api/get-a-list-of-saved-queries) | `/api/v1/saved_query/` |
+| `POST` | [Create a saved query](./api/create-a-saved-query) | `/api/v1/saved_query/` |
+| `GET` | [Get metadata information about this API resource (saved-query--info)](./api/get-metadata-information-about-this-api-resource-saved-query-info) | `/api/v1/saved_query/_info` |
+| `DELETE` | [Delete a saved query](./api/delete-a-saved-query) | `/api/v1/saved_query/{pk}` |
+| `GET` | [Get a saved query](./api/get-a-saved-query) | `/api/v1/saved_query/{pk}` |
+| `PUT` | [Update a saved query](./api/update-a-saved-query) | `/api/v1/saved_query/{pk}` |
+| `GET` | [Get distinct values from field data (saved-query-distinct-column-name)](./api/get-distinct-values-from-field-data-saved-query-distinct-column-name) | `/api/v1/saved_query/distinct/{column_name}` |
+| `GET` | [Download multiple saved queries as YAML files](./api/download-multiple-saved-queries-as-yaml-files) | `/api/v1/saved_query/export/` |
+| `POST` | [Import saved queries with associated databases](./api/import-saved-queries-with-associated-databases) | `/api/v1/saved_query/import/` |
+| `GET` | [Get related fields data (saved-query-related-column-name)](./api/get-related-fields-data-saved-query-related-column-name) | `/api/v1/saved_query/related/{column_name}` |
 </details>
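The `/api/v1/query/updated_since` route above takes its cursor as a `last_updated_ms` query parameter (epoch milliseconds). A small sketch of building that URL; the base host is a placeholder:

```python
from urllib.parse import urlencode


def updated_since_url(last_updated_ms: int, base: str = "http://localhost:8088") -> str:
    """URL for polling queries that changed after the given epoch-millisecond timestamp."""
    query = urlencode({"last_updated_ms": last_updated_ms})
    return f"{base}/api/v1/query/updated_since?{query}"


# Poll for queries updated after a (hypothetical) previous high-water mark:
print(updated_since_url(1700000000000))
```

Feeding back the highest timestamp seen on each poll turns this route into an incremental change feed.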
@@ -240,7 +240,7 @@ curl -H "Authorization: Bearer YOUR_ACCESS_TOKEN" \
 | Method | Description | Endpoint |
 |--------|-------------|----------|
-| `GET` | [Get possible values for a datasource column](/developer-docs/api/get-possible-values-for-a-datasource-column) | `/api/v1/datasource/{datasource_type}/{datasource_id}/column/{column_name}/values/` |
+| `GET` | [Get possible values for a datasource column](./api/get-possible-values-for-a-datasource-column) | `/api/v1/datasource/{datasource_type}/{datasource_id}/column/{column_name}/values/` |
 </details>
@@ -249,8 +249,8 @@ curl -H "Authorization: Bearer YOUR_ACCESS_TOKEN" \
 | Method | Description | Endpoint |
 |--------|-------------|----------|
-| `GET` | [Return an AdvancedDataTypeResponse](/developer-docs/api/return-an-advanceddatatyperesponse) | `/api/v1/advanced_data_type/convert` |
-| `GET` | [Return a list of available advanced data types](/developer-docs/api/return-a-list-of-available-advanced-data-types) | `/api/v1/advanced_data_type/types` |
+| `GET` | [Return an AdvancedDataTypeResponse](./api/return-an-advanceddatatyperesponse) | `/api/v1/advanced_data_type/convert` |
+| `GET` | [Return a list of available advanced data types](./api/return-a-list-of-available-advanced-data-types) | `/api/v1/advanced_data_type/types` |
 </details>
@@ -261,21 +261,21 @@ curl -H "Authorization: Bearer YOUR_ACCESS_TOKEN" \
 | Method | Description | Endpoint |
 |--------|-------------|----------|
-| `DELETE` | [Bulk delete tags](/developer-docs/api/bulk-delete-tags) | `/api/v1/tag/` |
-| `GET` | [Get a list of tags](/developer-docs/api/get-a-list-of-tags) | `/api/v1/tag/` |
-| `POST` | [Create a tag](/developer-docs/api/create-a-tag) | `/api/v1/tag/` |
-| `GET` | [Get metadata information about tag API endpoints](/developer-docs/api/get-metadata-information-about-tag-api-endpoints) | `/api/v1/tag/_info` |
-| `POST` | [Add tags to an object](/developer-docs/api/add-tags-to-an-object) | `/api/v1/tag/{object_type}/{object_id}/` |
-| `DELETE` | [Delete a tagged object](/developer-docs/api/delete-a-tagged-object) | `/api/v1/tag/{object_type}/{object_id}/{tag}/` |
-| `DELETE` | [Delete a tag](/developer-docs/api/delete-a-tag) | `/api/v1/tag/{pk}` |
-| `GET` | [Get a tag detail information](/developer-docs/api/get-a-tag-detail-information) | `/api/v1/tag/{pk}` |
-| `PUT` | [Update a tag](/developer-docs/api/update-a-tag) | `/api/v1/tag/{pk}` |
-| `DELETE` | [Delete tag by pk favorites](/developer-docs/api/delete-tag-by-pk-favorites) | `/api/v1/tag/{pk}/favorites/` |
-| `POST` | [Create tag by pk favorites](/developer-docs/api/create-tag-by-pk-favorites) | `/api/v1/tag/{pk}/favorites/` |
-| `POST` | [Bulk create tags and tagged objects](/developer-docs/api/bulk-create-tags-and-tagged-objects) | `/api/v1/tag/bulk_create` |
-| `GET` | [Get tag favorite status](/developer-docs/api/get-tag-favorite-status) | `/api/v1/tag/favorite_status/` |
-| `GET` | [Get all objects associated with a tag](/developer-docs/api/get-all-objects-associated-with-a-tag) | `/api/v1/tag/get_objects/` |
-| `GET` | [Get related fields data (tag-related-column-name)](/developer-docs/api/get-related-fields-data-tag-related-column-name) | `/api/v1/tag/related/{column_name}` |
+| `DELETE` | [Bulk delete tags](./api/bulk-delete-tags) | `/api/v1/tag/` |
+| `GET` | [Get a list of tags](./api/get-a-list-of-tags) | `/api/v1/tag/` |
+| `POST` | [Create a tag](./api/create-a-tag) | `/api/v1/tag/` |
+| `GET` | [Get metadata information about tag API endpoints](./api/get-metadata-information-about-tag-api-endpoints) | `/api/v1/tag/_info` |
+| `POST` | [Add tags to an object](./api/add-tags-to-an-object) | `/api/v1/tag/{object_type}/{object_id}/` |
+| `DELETE` | [Delete a tagged object](./api/delete-a-tagged-object) | `/api/v1/tag/{object_type}/{object_id}/{tag}/` |
+| `DELETE` | [Delete a tag](./api/delete-a-tag) | `/api/v1/tag/{pk}` |
+| `GET` | [Get a tag detail information](./api/get-a-tag-detail-information) | `/api/v1/tag/{pk}` |
+| `PUT` | [Update a tag](./api/update-a-tag) | `/api/v1/tag/{pk}` |
+| `DELETE` | [Delete tag by pk favorites](./api/delete-tag-by-pk-favorites) | `/api/v1/tag/{pk}/favorites/` |
+| `POST` | [Create tag by pk favorites](./api/create-tag-by-pk-favorites) | `/api/v1/tag/{pk}/favorites/` |
+| `POST` | [Bulk create tags and tagged objects](./api/bulk-create-tags-and-tagged-objects) | `/api/v1/tag/bulk_create` |
+| `GET` | [Get tag favorite status](./api/get-tag-favorite-status) | `/api/v1/tag/favorite_status/` |
+| `GET` | [Get all objects associated with a tag](./api/get-all-objects-associated-with-a-tag) | `/api/v1/tag/get_objects/` |
+| `GET` | [Get related fields data (tag-related-column-name)](./api/get-related-fields-data-tag-related-column-name) | `/api/v1/tag/related/{column_name}` |
 </details>
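Several tag routes above template both an object type and an object id into the path. A sketch of filling that template; the `"chart"` object type in the usage line is an illustrative assumption, not a value taken from this page:

```python
def tag_object_path(object_type: str, object_id: int) -> str:
    """Path for POST /api/v1/tag/{object_type}/{object_id}/ (add tags to an object)."""
    return f"/api/v1/tag/{object_type}/{object_id}/"


def untag_object_path(object_type: str, object_id: int, tag: str) -> str:
    """Path for DELETE /api/v1/tag/{object_type}/{object_id}/{tag}/ (remove one tag)."""
    return f"/api/v1/tag/{object_type}/{object_id}/{tag}/"


# e.g. tagging a hypothetical chart with pk 42:
print(tag_object_path("chart", 42))
```

The delete variant takes the tag name as a third path segment rather than in a request body, which is why the two helpers differ.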
@@ -284,20 +284,20 @@ curl -H "Authorization: Bearer YOUR_ACCESS_TOKEN" \
 | Method | Description | Endpoint |
 |--------|-------------|----------|
-| `DELETE` | [Delete multiple annotation layers in a bulk operation](/developer-docs/api/delete-multiple-annotation-layers-in-a-bulk-operation) | `/api/v1/annotation_layer/` |
-| `GET` | [Get a list of annotation layers (annotation-layer)](/developer-docs/api/get-a-list-of-annotation-layers-annotation-layer) | `/api/v1/annotation_layer/` |
-| `POST` | [Create an annotation layer (annotation-layer)](/developer-docs/api/create-an-annotation-layer-annotation-layer) | `/api/v1/annotation_layer/` |
-| `GET` | [Get metadata information about this API resource (annotation-layer--info)](/developer-docs/api/get-metadata-information-about-this-api-resource-annotation-layer-info) | `/api/v1/annotation_layer/_info` |
-| `DELETE` | [Delete annotation layer (annotation-layer-pk)](/developer-docs/api/delete-annotation-layer-annotation-layer-pk) | `/api/v1/annotation_layer/{pk}` |
-| `GET` | [Get an annotation layer (annotation-layer-pk)](/developer-docs/api/get-an-annotation-layer-annotation-layer-pk) | `/api/v1/annotation_layer/{pk}` |
-| `PUT` | [Update an annotation layer (annotation-layer-pk)](/developer-docs/api/update-an-annotation-layer-annotation-layer-pk) | `/api/v1/annotation_layer/{pk}` |
-| `DELETE` | [Bulk delete annotation layers](/developer-docs/api/bulk-delete-annotation-layers) | `/api/v1/annotation_layer/{pk}/annotation/` |
-| `GET` | [Get a list of annotation layers (annotation-layer-pk-annotation)](/developer-docs/api/get-a-list-of-annotation-layers-annotation-layer-pk-annotation) | `/api/v1/annotation_layer/{pk}/annotation/` |
-| `POST` | [Create an annotation layer (annotation-layer-pk-annotation)](/developer-docs/api/create-an-annotation-layer-annotation-layer-pk-annotation) | `/api/v1/annotation_layer/{pk}/annotation/` |
-| `DELETE` | [Delete annotation layer (annotation-layer-pk-annotation-annotation-id)](/developer-docs/api/delete-annotation-layer-annotation-layer-pk-annotation-annotation-id) | `/api/v1/annotation_layer/{pk}/annotation/{annotation_id}` |
-| `GET` | [Get an annotation layer (annotation-layer-pk-annotation-annotation-id)](/developer-docs/api/get-an-annotation-layer-annotation-layer-pk-annotation-annotation-id) | `/api/v1/annotation_layer/{pk}/annotation/{annotation_id}` |
-| `PUT` | [Update an annotation layer (annotation-layer-pk-annotation-annotation-id)](/developer-docs/api/update-an-annotation-layer-annotation-layer-pk-annotation-annotation-id) | `/api/v1/annotation_layer/{pk}/annotation/{annotation_id}` |
-| `GET` | [Get related fields data (annotation-layer-related-column-name)](/developer-docs/api/get-related-fields-data-annotation-layer-related-column-name) | `/api/v1/annotation_layer/related/{column_name}` |
+| `DELETE` | [Delete multiple annotation layers in a bulk operation](./api/delete-multiple-annotation-layers-in-a-bulk-operation) | `/api/v1/annotation_layer/` |
+| `GET` | [Get a list of annotation layers (annotation-layer)](./api/get-a-list-of-annotation-layers-annotation-layer) | `/api/v1/annotation_layer/` |
+| `POST` | [Create an annotation layer (annotation-layer)](./api/create-an-annotation-layer-annotation-layer) | `/api/v1/annotation_layer/` |
+| `GET` | [Get metadata information about this API resource (annotation-layer--info)](./api/get-metadata-information-about-this-api-resource-annotation-layer-info) | `/api/v1/annotation_layer/_info` |
+| `DELETE` | [Delete annotation layer (annotation-layer-pk)](./api/delete-annotation-layer-annotation-layer-pk) | `/api/v1/annotation_layer/{pk}` |
+| `GET` | [Get an annotation layer (annotation-layer-pk)](./api/get-an-annotation-layer-annotation-layer-pk) | `/api/v1/annotation_layer/{pk}` |
+| `PUT` | [Update an annotation layer (annotation-layer-pk)](./api/update-an-annotation-layer-annotation-layer-pk) | `/api/v1/annotation_layer/{pk}` |
+| `DELETE` | [Bulk delete annotation layers](./api/bulk-delete-annotation-layers) | `/api/v1/annotation_layer/{pk}/annotation/` |
+| `GET` | [Get a list of annotation layers (annotation-layer-pk-annotation)](./api/get-a-list-of-annotation-layers-annotation-layer-pk-annotation) | `/api/v1/annotation_layer/{pk}/annotation/` |
+| `POST` | [Create an annotation layer (annotation-layer-pk-annotation)](./api/create-an-annotation-layer-annotation-layer-pk-annotation) | `/api/v1/annotation_layer/{pk}/annotation/` |
+| `DELETE` | [Delete annotation layer (annotation-layer-pk-annotation-annotation-id)](./api/delete-annotation-layer-annotation-layer-pk-annotation-annotation-id) | `/api/v1/annotation_layer/{pk}/annotation/{annotation_id}` |
+| `GET` | [Get an annotation layer (annotation-layer-pk-annotation-annotation-id)](./api/get-an-annotation-layer-annotation-layer-pk-annotation-annotation-id) | `/api/v1/annotation_layer/{pk}/annotation/{annotation_id}` |
+| `PUT` | [Update an annotation layer (annotation-layer-pk-annotation-annotation-id)](./api/update-an-annotation-layer-annotation-layer-pk-annotation-annotation-id) | `/api/v1/annotation_layer/{pk}/annotation/{annotation_id}` |
+| `GET` | [Get related fields data (annotation-layer-related-column-name)](./api/get-related-fields-data-annotation-layer-related-column-name) | `/api/v1/annotation_layer/related/{column_name}` |
 </details>
@@ -306,14 +306,14 @@ curl -H "Authorization: Bearer YOUR_ACCESS_TOKEN" \
 | Method | Description | Endpoint |
 |--------|-------------|----------|
-| `DELETE` | [Bulk delete CSS templates](/developer-docs/api/bulk-delete-css-templates) | `/api/v1/css_template/` |
-| `GET` | [Get a list of CSS templates](/developer-docs/api/get-a-list-of-css-templates) | `/api/v1/css_template/` |
-| `POST` | [Create a CSS template](/developer-docs/api/create-a-css-template) | `/api/v1/css_template/` |
-| `GET` | [Get metadata information about this API resource (css-template--info)](/developer-docs/api/get-metadata-information-about-this-api-resource-css-template-info) | `/api/v1/css_template/_info` |
-| `DELETE` | [Delete a CSS template](/developer-docs/api/delete-a-css-template) | `/api/v1/css_template/{pk}` |
-| `GET` | [Get a CSS template](/developer-docs/api/get-a-css-template) | `/api/v1/css_template/{pk}` |
-| `PUT` | [Update a CSS template](/developer-docs/api/update-a-css-template) | `/api/v1/css_template/{pk}` |
-| `GET` | [Get related fields data (css-template-related-column-name)](/developer-docs/api/get-related-fields-data-css-template-related-column-name) | `/api/v1/css_template/related/{column_name}` |
+| `DELETE` | [Bulk delete CSS templates](./api/bulk-delete-css-templates) | `/api/v1/css_template/` |
+| `GET` | [Get a list of CSS templates](./api/get-a-list-of-css-templates) | `/api/v1/css_template/` |
+| `POST` | [Create a CSS template](./api/create-a-css-template) | `/api/v1/css_template/` |
+| `GET` | [Get metadata information about this API resource (css-template--info)](./api/get-metadata-information-about-this-api-resource-css-template-info) | `/api/v1/css_template/_info` |
+| `DELETE` | [Delete a CSS template](./api/delete-a-css-template) | `/api/v1/css_template/{pk}` |
+| `GET` | [Get a CSS template](./api/get-a-css-template) | `/api/v1/css_template/{pk}` |
+| `PUT` | [Update a CSS template](./api/update-a-css-template) | `/api/v1/css_template/{pk}` |
+| `GET` | [Get related fields data (css-template-related-column-name)](./api/get-related-fields-data-css-template-related-column-name) | `/api/v1/css_template/related/{column_name}` |
 </details>
@@ -324,8 +324,8 @@ curl -H "Authorization: Bearer YOUR_ACCESS_TOKEN" \
 | Method | Description | Endpoint |
 |--------|-------------|----------|
-| `POST` | [Create a new dashboard's permanent link](/developer-docs/api/create-a-new-dashboard-s-permanent-link) | `/api/v1/dashboard/{pk}/permalink` |
-| `GET` | [Get dashboard's permanent link state](/developer-docs/api/get-dashboard-s-permanent-link-state) | `/api/v1/dashboard/permalink/{key}` |
+| `POST` | [Create a new dashboard's permanent link](./api/create-a-new-dashboard-s-permanent-link) | `/api/v1/dashboard/{pk}/permalink` |
+| `GET` | [Get dashboard's permanent link state](./api/get-dashboard-s-permanent-link-state) | `/api/v1/dashboard/permalink/{key}` |
 </details>
@@ -334,8 +334,8 @@ curl -H "Authorization: Bearer YOUR_ACCESS_TOKEN" \
 | Method | Description | Endpoint |
 |--------|-------------|----------|
-| `POST` | [Create a new permanent link (explore-permalink)](/developer-docs/api/create-a-new-permanent-link-explore-permalink) | `/api/v1/explore/permalink` |
-| `GET` | [Get chart's permanent link state](/developer-docs/api/get-chart-s-permanent-link-state) | `/api/v1/explore/permalink/{key}` |
+| `POST` | [Create a new permanent link (explore-permalink)](./api/create-a-new-permanent-link-explore-permalink) | `/api/v1/explore/permalink` |
+| `GET` | [Get chart's permanent link state](./api/get-chart-s-permanent-link-state) | `/api/v1/explore/permalink/{key}` |
 </details>
@@ -344,8 +344,8 @@ curl -H "Authorization: Bearer YOUR_ACCESS_TOKEN" \
 | Method | Description | Endpoint |
 |--------|-------------|----------|
-| `POST` | [Create a new permanent link (sqllab-permalink)](/developer-docs/api/create-a-new-permanent-link-sqllab-permalink) | `/api/v1/sqllab/permalink` |
-| `GET` | [Get permanent link state for SQLLab editor.](/developer-docs/api/get-permanent-link-state-for-sqllab-editor) | `/api/v1/sqllab/permalink/{key}` |
+| `POST` | [Create a new permanent link (sqllab-permalink)](./api/create-a-new-permanent-link-sqllab-permalink) | `/api/v1/sqllab/permalink` |
+| `GET` | [Get permanent link state for SQLLab editor.](./api/get-permanent-link-state-for-sqllab-editor) | `/api/v1/sqllab/permalink/{key}` |
 </details>
@@ -354,7 +354,7 @@ curl -H "Authorization: Bearer YOUR_ACCESS_TOKEN" \
 | Method | Description | Endpoint |
 |--------|-------------|----------|
-| `GET` | [Get a report schedule log (embedded-dashboard-uuid)](/developer-docs/api/get-a-report-schedule-log-embedded-dashboard-uuid) | `/api/v1/embedded_dashboard/{uuid}` |
+| `GET` | [Get a report schedule log (embedded-dashboard-uuid)](./api/get-a-report-schedule-log-embedded-dashboard-uuid) | `/api/v1/embedded_dashboard/{uuid}` |
 </details>
@@ -363,10 +363,10 @@ curl -H "Authorization: Bearer YOUR_ACCESS_TOKEN" \
 | Method | Description | Endpoint |
 |--------|-------------|----------|
-| `POST` | [Create a dashboard's filter state](/developer-docs/api/create-a-dashboard-s-filter-state) | `/api/v1/dashboard/{pk}/filter_state` |
-| `DELETE` | [Delete a dashboard's filter state value](/developer-docs/api/delete-a-dashboard-s-filter-state-value) | `/api/v1/dashboard/{pk}/filter_state/{key}` |
-| `GET` | [Get a dashboard's filter state value](/developer-docs/api/get-a-dashboard-s-filter-state-value) | `/api/v1/dashboard/{pk}/filter_state/{key}` |
-| `PUT` | [Update a dashboard's filter state value](/developer-docs/api/update-a-dashboard-s-filter-state-value) | `/api/v1/dashboard/{pk}/filter_state/{key}` |
+| `POST` | [Create a dashboard's filter state](./api/create-a-dashboard-s-filter-state) | `/api/v1/dashboard/{pk}/filter_state` |
+| `DELETE` | [Delete a dashboard's filter state value](./api/delete-a-dashboard-s-filter-state-value) | `/api/v1/dashboard/{pk}/filter_state/{key}` |
+| `GET` | [Get a dashboard's filter state value](./api/get-a-dashboard-s-filter-state-value) | `/api/v1/dashboard/{pk}/filter_state/{key}` |
+| `PUT` | [Update a dashboard's filter state value](./api/update-a-dashboard-s-filter-state-value) | `/api/v1/dashboard/{pk}/filter_state/{key}` |
 </details>
@@ -375,10 +375,10 @@ curl -H "Authorization: Bearer YOUR_ACCESS_TOKEN" \
 | Method | Description | Endpoint |
 |--------|-------------|----------|
-| `POST` | [Create a new form_data](/developer-docs/api/create-a-new-form-data) | `/api/v1/explore/form_data` |
-| `DELETE` | [Delete a form_data](/developer-docs/api/delete-a-form-data) | `/api/v1/explore/form_data/{key}` |
-| `GET` | [Get a form_data](/developer-docs/api/get-a-form-data) | `/api/v1/explore/form_data/{key}` |
-| `PUT` | [Update an existing form_data](/developer-docs/api/update-an-existing-form-data) | `/api/v1/explore/form_data/{key}` |
+| `POST` | [Create a new form_data](./api/create-a-new-form-data) | `/api/v1/explore/form_data` |
+| `DELETE` | [Delete a form_data](./api/delete-a-form-data) | `/api/v1/explore/form_data/{key}` |
+| `GET` | [Get a form_data](./api/get-a-form-data) | `/api/v1/explore/form_data/{key}` |
+| `PUT` | [Update an existing form_data](./api/update-an-existing-form-data) | `/api/v1/explore/form_data/{key}` |
 </details>
@@ -389,17 +389,17 @@ curl -H "Authorization: Bearer YOUR_ACCESS_TOKEN" \
 | Method | Description | Endpoint |
 |--------|-------------|----------|
-| `DELETE` | [Bulk delete report schedules](/developer-docs/api/bulk-delete-report-schedules) | `/api/v1/report/` |
-| `GET` | [Get a list of report schedules](/developer-docs/api/get-a-list-of-report-schedules) | `/api/v1/report/` |
-| `POST` | [Create a report schedule](/developer-docs/api/create-a-report-schedule) | `/api/v1/report/` |
-| `GET` | [Get metadata information about this API resource (report--info)](/developer-docs/api/get-metadata-information-about-this-api-resource-report-info) | `/api/v1/report/_info` |
-| `DELETE` | [Delete a report schedule](/developer-docs/api/delete-a-report-schedule) | `/api/v1/report/{pk}` |
-| `GET` | [Get a report schedule](/developer-docs/api/get-a-report-schedule) | `/api/v1/report/{pk}` |
-| `PUT` | [Update a report schedule](/developer-docs/api/update-a-report-schedule) | `/api/v1/report/{pk}` |
-| `GET` | [Get a list of report schedule logs](/developer-docs/api/get-a-list-of-report-schedule-logs) | `/api/v1/report/{pk}/log/` |
-| `GET` | [Get a report schedule log (report-pk-log-log-id)](/developer-docs/api/get-a-report-schedule-log-report-pk-log-log-id) | `/api/v1/report/{pk}/log/{log_id}` |
-| `GET` | [Get related fields data (report-related-column-name)](/developer-docs/api/get-related-fields-data-report-related-column-name) | `/api/v1/report/related/{column_name}` |
-| `GET` | [Get slack channels](/developer-docs/api/get-slack-channels) | `/api/v1/report/slack_channels/` |
+| `DELETE` | [Bulk delete report schedules](./api/bulk-delete-report-schedules) | `/api/v1/report/` |
+| `GET` | [Get a list of report schedules](./api/get-a-list-of-report-schedules) | `/api/v1/report/` |
+| `POST` | [Create a report schedule](./api/create-a-report-schedule) | `/api/v1/report/` |
+| `GET` | [Get metadata information about this API resource (report--info)](./api/get-metadata-information-about-this-api-resource-report-info) | `/api/v1/report/_info` |
+| `DELETE` | [Delete a report schedule](./api/delete-a-report-schedule) | `/api/v1/report/{pk}` |
+| `GET` | [Get a report schedule](./api/get-a-report-schedule) | `/api/v1/report/{pk}` |
+| `PUT` | [Update a report schedule](./api/update-a-report-schedule) | `/api/v1/report/{pk}` |
+| `GET` | [Get a list of report schedule logs](./api/get-a-list-of-report-schedule-logs) | `/api/v1/report/{pk}/log/` |
+| `GET` | [Get a report schedule log (report-pk-log-log-id)](./api/get-a-report-schedule-log-report-pk-log-log-id) | `/api/v1/report/{pk}/log/{log_id}` |
+| `GET` | [Get related fields data (report-related-column-name)](./api/get-related-fields-data-report-related-column-name) | `/api/v1/report/related/{column_name}` |
+| `GET` | [Get slack channels](./api/get-slack-channels) | `/api/v1/report/slack_channels/` |
 </details>
@@ -410,16 +410,16 @@ curl -H "Authorization: Bearer YOUR_ACCESS_TOKEN" \
 | Method | Description | Endpoint |
 |--------|-------------|----------|
-| `GET` | [Get security roles](/developer-docs/api/get-security-roles) | `/api/v1/security/roles/` |
-| `POST` | [Create security roles](/developer-docs/api/create-security-roles) | `/api/v1/security/roles/` |
-| `GET` | [Get security roles info](/developer-docs/api/get-security-roles-info) | `/api/v1/security/roles/_info` |
-| `DELETE` | [Delete security roles by pk](/developer-docs/api/delete-security-roles-by-pk) | `/api/v1/security/roles/{pk}` |
-| `GET` | [Get security roles by pk](/developer-docs/api/get-security-roles-by-pk) | `/api/v1/security/roles/{pk}` |
-| `PUT` | [Update security roles by pk](/developer-docs/api/update-security-roles-by-pk) | `/api/v1/security/roles/{pk}` |
-| `POST` | [Create security roles by role_id permissions](/developer-docs/api/create-security-roles-by-role-id-permissions) | `/api/v1/security/roles/{role_id}/permissions` |
-| `GET` | [Get security roles by role_id permissions](/developer-docs/api/get-security-roles-by-role-id-permissions) | `/api/v1/security/roles/{role_id}/permissions/` |
-| `PUT` | [Update security roles by role_id users](/developer-docs/api/update-security-roles-by-role-id-users) | `/api/v1/security/roles/{role_id}/users` |
-| `GET` | [List roles](/developer-docs/api/list-roles) | `/api/v1/security/roles/search/` |
+| `GET` | [Get security roles](./api/get-security-roles) | `/api/v1/security/roles/` |
+| `POST` | [Create security roles](./api/create-security-roles) | `/api/v1/security/roles/` |
+| `GET` | [Get security roles info](./api/get-security-roles-info) | `/api/v1/security/roles/_info` |
+| `DELETE` | [Delete security roles by pk](./api/delete-security-roles-by-pk) | `/api/v1/security/roles/{pk}` |
+| `GET` | [Get security roles by pk](./api/get-security-roles-by-pk) | `/api/v1/security/roles/{pk}` |
+| `PUT` | [Update security roles by pk](./api/update-security-roles-by-pk) | `/api/v1/security/roles/{pk}` |
+| `POST` | [Create security roles by role_id permissions](./api/create-security-roles-by-role-id-permissions) | `/api/v1/security/roles/{role_id}/permissions` |
+| `GET` | [Get security roles by role_id permissions](./api/get-security-roles-by-role-id-permissions) | `/api/v1/security/roles/{role_id}/permissions/` |
+| `PUT` | [Update security roles by role_id users](./api/update-security-roles-by-role-id-users) | `/api/v1/security/roles/{role_id}/users` |
+| `GET` | [List roles](./api/list-roles) | `/api/v1/security/roles/search/` |
 </details>
@@ -428,12 +428,12 @@ curl -H "Authorization: Bearer YOUR_ACCESS_TOKEN" \
| Method | Endpoint | Description |
|--------|----------|-------------|
| `GET` | [Get security users](/developer-docs/api/get-security-users) | `/api/v1/security/users/` |
| `POST` | [Create security users](/developer-docs/api/create-security-users) | `/api/v1/security/users/` |
| `GET` | [Get security users info](/developer-docs/api/get-security-users-info) | `/api/v1/security/users/_info` |
| `DELETE` | [Delete security users by pk](/developer-docs/api/delete-security-users-by-pk) | `/api/v1/security/users/{pk}` |
| `GET` | [Get security users by pk](/developer-docs/api/get-security-users-by-pk) | `/api/v1/security/users/{pk}` |
| `PUT` | [Update security users by pk](/developer-docs/api/update-security-users-by-pk) | `/api/v1/security/users/{pk}` |
| `GET` | [Get security users](./api/get-security-users) | `/api/v1/security/users/` |
| `POST` | [Create security users](./api/create-security-users) | `/api/v1/security/users/` |
| `GET` | [Get security users info](./api/get-security-users-info) | `/api/v1/security/users/_info` |
| `DELETE` | [Delete security users by pk](./api/delete-security-users-by-pk) | `/api/v1/security/users/{pk}` |
| `GET` | [Get security users by pk](./api/get-security-users-by-pk) | `/api/v1/security/users/{pk}` |
| `PUT` | [Update security users by pk](./api/update-security-users-by-pk) | `/api/v1/security/users/{pk}` |
</details>
@@ -442,9 +442,9 @@ curl -H "Authorization: Bearer YOUR_ACCESS_TOKEN" \
| Method | Endpoint | Description |
|--------|----------|-------------|
| `GET` | [Get security permissions](/developer-docs/api/get-security-permissions) | `/api/v1/security/permissions/` |
| `GET` | [Get security permissions info](/developer-docs/api/get-security-permissions-info) | `/api/v1/security/permissions/_info` |
| `GET` | [Get security permissions by pk](/developer-docs/api/get-security-permissions-by-pk) | `/api/v1/security/permissions/{pk}` |
| `GET` | [Get security permissions](./api/get-security-permissions) | `/api/v1/security/permissions/` |
| `GET` | [Get security permissions info](./api/get-security-permissions-info) | `/api/v1/security/permissions/_info` |
| `GET` | [Get security permissions by pk](./api/get-security-permissions-by-pk) | `/api/v1/security/permissions/{pk}` |
</details>
@@ -453,12 +453,12 @@ curl -H "Authorization: Bearer YOUR_ACCESS_TOKEN" \
| Method | Endpoint | Description |
|--------|----------|-------------|
| `GET` | [Get security resources](/developer-docs/api/get-security-resources) | `/api/v1/security/resources/` |
| `POST` | [Create security resources](/developer-docs/api/create-security-resources) | `/api/v1/security/resources/` |
| `GET` | [Get security resources info](/developer-docs/api/get-security-resources-info) | `/api/v1/security/resources/_info` |
| `DELETE` | [Delete security resources by pk](/developer-docs/api/delete-security-resources-by-pk) | `/api/v1/security/resources/{pk}` |
| `GET` | [Get security resources by pk](/developer-docs/api/get-security-resources-by-pk) | `/api/v1/security/resources/{pk}` |
| `PUT` | [Update security resources by pk](/developer-docs/api/update-security-resources-by-pk) | `/api/v1/security/resources/{pk}` |
| `GET` | [Get security resources](./api/get-security-resources) | `/api/v1/security/resources/` |
| `POST` | [Create security resources](./api/create-security-resources) | `/api/v1/security/resources/` |
| `GET` | [Get security resources info](./api/get-security-resources-info) | `/api/v1/security/resources/_info` |
| `DELETE` | [Delete security resources by pk](./api/delete-security-resources-by-pk) | `/api/v1/security/resources/{pk}` |
| `GET` | [Get security resources by pk](./api/get-security-resources-by-pk) | `/api/v1/security/resources/{pk}` |
| `PUT` | [Update security resources by pk](./api/update-security-resources-by-pk) | `/api/v1/security/resources/{pk}` |
</details>
@@ -467,12 +467,12 @@ curl -H "Authorization: Bearer YOUR_ACCESS_TOKEN" \
| Method | Endpoint | Description |
|--------|----------|-------------|
| `GET` | [Get security permissions resources](/developer-docs/api/get-security-permissions-resources) | `/api/v1/security/permissions-resources/` |
| `POST` | [Create security permissions resources](/developer-docs/api/create-security-permissions-resources) | `/api/v1/security/permissions-resources/` |
| `GET` | [Get security permissions resources info](/developer-docs/api/get-security-permissions-resources-info) | `/api/v1/security/permissions-resources/_info` |
| `DELETE` | [Delete security permissions resources by pk](/developer-docs/api/delete-security-permissions-resources-by-pk) | `/api/v1/security/permissions-resources/{pk}` |
| `GET` | [Get security permissions resources by pk](/developer-docs/api/get-security-permissions-resources-by-pk) | `/api/v1/security/permissions-resources/{pk}` |
| `PUT` | [Update security permissions resources by pk](/developer-docs/api/update-security-permissions-resources-by-pk) | `/api/v1/security/permissions-resources/{pk}` |
| `GET` | [Get security permissions resources](./api/get-security-permissions-resources) | `/api/v1/security/permissions-resources/` |
| `POST` | [Create security permissions resources](./api/create-security-permissions-resources) | `/api/v1/security/permissions-resources/` |
| `GET` | [Get security permissions resources info](./api/get-security-permissions-resources-info) | `/api/v1/security/permissions-resources/_info` |
| `DELETE` | [Delete security permissions resources by pk](./api/delete-security-permissions-resources-by-pk) | `/api/v1/security/permissions-resources/{pk}` |
| `GET` | [Get security permissions resources by pk](./api/get-security-permissions-resources-by-pk) | `/api/v1/security/permissions-resources/{pk}` |
| `PUT` | [Update security permissions resources by pk](./api/update-security-permissions-resources-by-pk) | `/api/v1/security/permissions-resources/{pk}` |
</details>
@@ -481,14 +481,14 @@ curl -H "Authorization: Bearer YOUR_ACCESS_TOKEN" \
| Method | Endpoint | Description |
|--------|----------|-------------|
| `DELETE` | [Bulk delete RLS rules](/developer-docs/api/bulk-delete-rls-rules) | `/api/v1/rowlevelsecurity/` |
| `GET` | [Get a list of RLS](/developer-docs/api/get-a-list-of-rls) | `/api/v1/rowlevelsecurity/` |
| `POST` | [Create a new RLS rule](/developer-docs/api/create-a-new-rls-rule) | `/api/v1/rowlevelsecurity/` |
| `GET` | [Get metadata information about this API resource (rowlevelsecurity--info)](/developer-docs/api/get-metadata-information-about-this-api-resource-rowlevelsecurity-info) | `/api/v1/rowlevelsecurity/_info` |
| `DELETE` | [Delete an RLS](/developer-docs/api/delete-an-rls) | `/api/v1/rowlevelsecurity/{pk}` |
| `GET` | [Get an RLS](/developer-docs/api/get-an-rls) | `/api/v1/rowlevelsecurity/{pk}` |
| `PUT` | [Update an RLS rule](/developer-docs/api/update-an-rls-rule) | `/api/v1/rowlevelsecurity/{pk}` |
| `GET` | [Get related fields data (rowlevelsecurity-related-column-name)](/developer-docs/api/get-related-fields-data-rowlevelsecurity-related-column-name) | `/api/v1/rowlevelsecurity/related/{column_name}` |
| `DELETE` | [Bulk delete RLS rules](./api/bulk-delete-rls-rules) | `/api/v1/rowlevelsecurity/` |
| `GET` | [Get a list of RLS](./api/get-a-list-of-rls) | `/api/v1/rowlevelsecurity/` |
| `POST` | [Create a new RLS rule](./api/create-a-new-rls-rule) | `/api/v1/rowlevelsecurity/` |
| `GET` | [Get metadata information about this API resource (rowlevelsecurity--info)](./api/get-metadata-information-about-this-api-resource-rowlevelsecurity-info) | `/api/v1/rowlevelsecurity/_info` |
| `DELETE` | [Delete an RLS](./api/delete-an-rls) | `/api/v1/rowlevelsecurity/{pk}` |
| `GET` | [Get an RLS](./api/get-an-rls) | `/api/v1/rowlevelsecurity/{pk}` |
| `PUT` | [Update an RLS rule](./api/update-an-rls-rule) | `/api/v1/rowlevelsecurity/{pk}` |
| `GET` | [Get related fields data (rowlevelsecurity-related-column-name)](./api/get-related-fields-data-rowlevelsecurity-related-column-name) | `/api/v1/rowlevelsecurity/related/{column_name}` |
</details>
@@ -499,8 +499,8 @@ curl -H "Authorization: Bearer YOUR_ACCESS_TOKEN" \
| Method | Endpoint | Description |
|--------|----------|-------------|
| `GET` | [Export all assets](/developer-docs/api/export-all-assets) | `/api/v1/assets/export/` |
| `POST` | [Import multiple assets](/developer-docs/api/import-multiple-assets) | `/api/v1/assets/import/` |
| `GET` | [Export all assets](./api/export-all-assets) | `/api/v1/assets/export/` |
| `POST` | [Import multiple assets](./api/import-multiple-assets) | `/api/v1/assets/import/` |
</details>
@@ -509,7 +509,7 @@ curl -H "Authorization: Bearer YOUR_ACCESS_TOKEN" \
| Method | Endpoint | Description |
|--------|----------|-------------|
| `POST` | [Invalidate cache records and remove the database records](/developer-docs/api/invalidate-cache-records-and-remove-the-database-records) | `/api/v1/cachekey/invalidate` |
| `POST` | [Invalidate cache records and remove the database records](./api/invalidate-cache-records-and-remove-the-database-records) | `/api/v1/cachekey/invalidate` |
</details>
@@ -518,10 +518,10 @@ curl -H "Authorization: Bearer YOUR_ACCESS_TOKEN" \
| Method | Endpoint | Description |
|--------|----------|-------------|
| `GET` | [Get a list of logs](/developer-docs/api/get-a-list-of-logs) | `/api/v1/log/` |
| `POST` | [Create log](/developer-docs/api/create-log) | `/api/v1/log/` |
| `GET` | [Get a log detail information](/developer-docs/api/get-a-log-detail-information) | `/api/v1/log/{pk}` |
| `GET` | [Get recent activity data for a user](/developer-docs/api/get-recent-activity-data-for-a-user) | `/api/v1/log/recent_activity/` |
| `GET` | [Get a list of logs](./api/get-a-list-of-logs) | `/api/v1/log/` |
| `POST` | [Create log](./api/create-log) | `/api/v1/log/` |
| `GET` | [Get a log detail information](./api/get-a-log-detail-information) | `/api/v1/log/{pk}` |
| `GET` | [Get recent activity data for a user](./api/get-recent-activity-data-for-a-user) | `/api/v1/log/recent_activity/` |
</details>
@@ -532,8 +532,8 @@ curl -H "Authorization: Bearer YOUR_ACCESS_TOKEN" \
| Method | Endpoint | Description |
|--------|----------|-------------|
| `GET` | [Get the user object](/developer-docs/api/get-the-user-object) | `/api/v1/me/` |
| `GET` | [Get the user roles](/developer-docs/api/get-the-user-roles) | `/api/v1/me/roles/` |
| `GET` | [Get the user object](./api/get-the-user-object) | `/api/v1/me/` |
| `GET` | [Get the user roles](./api/get-the-user-roles) | `/api/v1/me/roles/` |
</details>
@@ -542,7 +542,7 @@ curl -H "Authorization: Bearer YOUR_ACCESS_TOKEN" \
| Method | Endpoint | Description |
|--------|----------|-------------|
| `GET` | [Get the user avatar](/developer-docs/api/get-the-user-avatar) | `/api/v1/user/{user_id}/avatar.png` |
| `GET` | [Get the user avatar](./api/get-the-user-avatar) | `/api/v1/user/{user_id}/avatar.png` |
</details>
@@ -551,7 +551,7 @@ curl -H "Authorization: Bearer YOUR_ACCESS_TOKEN" \
| Method | Endpoint | Description |
|--------|----------|-------------|
| `GET` | [Get menu](/developer-docs/api/get-menu) | `/api/v1/menu/` |
| `GET` | [Get menu](./api/get-menu) | `/api/v1/menu/` |
</details>
@@ -560,7 +560,7 @@ curl -H "Authorization: Bearer YOUR_ACCESS_TOKEN" \
| Method | Endpoint | Description |
|--------|----------|-------------|
| `GET` | [Get all available domains](/developer-docs/api/get-all-available-domains) | `/api/v1/available_domains/` |
| `GET` | [Get all available domains](./api/get-all-available-domains) | `/api/v1/available_domains/` |
</details>
@@ -569,7 +569,7 @@ curl -H "Authorization: Bearer YOUR_ACCESS_TOKEN" \
| Method | Endpoint | Description |
|--------|----------|-------------|
| `GET` | [Read off of the Redis events stream](/developer-docs/api/read-off-of-the-redis-events-stream) | `/api/v1/async_event/` |
| `GET` | [Read off of the Redis events stream](./api/read-off-of-the-redis-events-stream) | `/api/v1/async_event/` |
</details>
@@ -578,7 +578,7 @@ curl -H "Authorization: Bearer YOUR_ACCESS_TOKEN" \
| Method | Endpoint | Description |
|--------|----------|-------------|
| `GET` | [Get api by version openapi](/developer-docs/api/get-api-by-version-openapi) | `/api/{version}/_openapi` |
| `GET` | [Get api by version openapi](./api/get-api-by-version-openapi) | `/api/{version}/_openapi` |
</details>

View File

@@ -224,3 +224,52 @@ async def analysis_guide(ctx: Context) -> str:
```
See [MCP Integration](./mcp) for implementation details.
### Semantic Layers
Extensions can register custom semantic layer implementations that allow Superset to connect to external data modeling frameworks. Each semantic layer defines how to authenticate, discover semantic views (tables/metrics/dimensions), and execute queries against the external system.
```python
from superset_core.semantic_layers.decorators import semantic_layer
from superset_core.semantic_layers.layer import SemanticLayer

from my_extension.config import MyConfig
from my_extension.view import MySemanticView


@semantic_layer(
    id="my_platform",
    name="My Data Platform",
    description="Connect to My Data Platform's semantic layer",
)
class MySemanticLayer(SemanticLayer[MyConfig, MySemanticView]):
    configuration_class = MyConfig

    @classmethod
    def from_configuration(cls, configuration: dict) -> "MySemanticLayer":
        config = MyConfig.model_validate(configuration)
        return cls(config)

    @classmethod
    def get_configuration_schema(cls, configuration=None) -> dict:
        return MyConfig.model_json_schema()

    @classmethod
    def get_runtime_schema(cls, configuration=None, runtime_data=None) -> dict:
        return {"type": "object", "properties": {}}

    def get_semantic_views(self, runtime_configuration: dict) -> set[MySemanticView]:
        # Return available views from the external platform
        ...

    def get_semantic_view(self, name: str, additional_configuration: dict) -> MySemanticView:
        # Return a specific view by name
        ...
```
**Note**: The `@semantic_layer` decorator automatically detects context and applies appropriate ID prefixing:
- **Extension context**: ID prefixed as `extensions.{publisher}.{name}.{id}`
- **Host context**: Original ID used as-is
The decorator registers the class in the semantic layers registry, making it available in the UI for users to create connections. The `configuration_class` should be a Pydantic model that defines the fields needed to connect (credentials, project, database, etc.). Superset uses the model's JSON schema to render the configuration form dynamically.
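To make the schema-driven form concrete, here is a hand-written sketch of the JSON schema a `configuration_class` might emit via `model_json_schema()`. The field names (`host`, `project`, `api_token`) are illustrative assumptions, not part of Superset's API; a real connection model would define whatever fields its platform needs:

```python
import json

# Sketch of the JSON schema a Pydantic configuration model would produce
# via MyConfig.model_json_schema(). Field names are illustrative only.
my_config_schema = {
    "title": "MyConfig",
    "type": "object",
    "properties": {
        "host": {"type": "string", "title": "Host"},
        "project": {"type": "string", "title": "Project"},
        "api_token": {"type": "string", "format": "password", "title": "API token"},
    },
    "required": ["host", "project", "api_token"],
}

# Superset renders the connection form from a schema like this: each
# property becomes a form field, and "required" drives validation.
print(json.dumps(sorted(my_config_schema["properties"])))
```

Because the form is generated from the schema, adding a field to the model is enough to surface it in the UI; no frontend changes are needed in the extension.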

View File

@@ -39,11 +39,7 @@ superset-extensions bundle: Packages the extension into a .supx file.
superset-extensions dev: Automatically rebuilds the extension as files change.
superset-extensions validate: Validates the extension structure and metadata consistency.
superset-extensions update: Updates derived and generated files in the extension project.
Use --version [<version>] to update the version (prompts if no value given).
Use --license [<license>] to update the license (prompts if no value given).
superset-extensions validate: Validates the extension structure and metadata.
```
When creating a new extension with `superset-extensions init`, the CLI generates a standardized folder structure:

View File

@@ -1,7 +1,7 @@
---
title: MCP Server Deployment & Authentication
hide_title: true
sidebar_position: 14
sidebar_position: 9
version: 1
---
@@ -30,10 +30,6 @@ Superset includes a built-in [Model Context Protocol (MCP)](https://modelcontext
This guide covers how to run, secure, and deploy the MCP server.
:::tip Looking for user docs?
See **[Using AI with Superset](/user-docs/using-superset/using-ai-with-superset)** for a guide on what AI can do with Superset and how to connect your AI client.
:::
```mermaid
flowchart LR
A["AI Client<br/>(Claude, ChatGPT, etc.)"] -- "MCP protocol<br/>(HTTP + JSON-RPC)" --> B["MCP Server<br/>(:5008/mcp)"]
@@ -672,13 +668,12 @@ MCP_CSRF_CONFIG = {
- **Secrets management** -- Store `MCP_JWT_SECRET`, database credentials, and API keys in environment variables or a secrets manager, never in config files committed to version control
- **Scoped tokens** -- Use `MCP_REQUIRED_SCOPES` to limit what operations a token can perform
- **Network isolation** -- In Kubernetes, restrict MCP pod network policies to only allow traffic from your AI client endpoints
- Review the **[Security documentation](/developer-docs/extensions/security)** for additional extension security guidance
- Review the **[Security documentation](./security)** for additional extension security guidance
---
## Next Steps
- **[Using AI with Superset](/user-docs/using-superset/using-ai-with-superset)** -- What AI can do with Superset and how to get started
- **[MCP Integration](/developer-docs/extensions/mcp)** -- Build custom MCP tools and prompts via Superset extensions
- **[Security](/developer-docs/extensions/security)** -- Security best practices for extensions
- **[Deployment](/developer-docs/extensions/deployment)** -- Package and deploy Superset extensions
- **[MCP Integration](./mcp)** -- Build custom MCP tools and prompts via Superset extensions
- **[Security](./security)** -- Security best practices for extensions
- **[Deployment](./deployment)** -- Package and deploy Superset extensions

View File

@@ -52,6 +52,7 @@ module.exports = {
'extensions/development',
'extensions/deployment',
'extensions/mcp',
'extensions/mcp-server',
'extensions/security',
'extensions/tasks',
'extensions/registry',

View File

@@ -1,245 +0,0 @@
---
title: Using AI with Superset
hide_title: true
sidebar_position: 5
version: 1
---
<!--
Licensed to the Apache Software Foundation (ASF) under one
or more contributor license agreements. See the NOTICE file
distributed with this work for additional information
regarding copyright ownership. The ASF licenses this file
to you under the Apache License, Version 2.0 (the
"License"); you may not use this file except in compliance
with the License. You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing,
software distributed under the License is distributed on an
"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
KIND, either express or implied. See the License for the
specific language governing permissions and limitations
under the License.
-->
# Using AI with Superset
Superset supports AI assistants through the [Model Context Protocol (MCP)](https://modelcontextprotocol.io/). Connect Claude, ChatGPT, or other MCP-compatible clients to explore your data, build charts, create dashboards, and run SQL -- all through natural language.
:::info
Requires Superset 5.0+. Your admin must enable and deploy the MCP server before you can connect.
See the **[MCP Server admin guide](/admin-docs/configuration/mcp-server)** for setup instructions.
:::
---
## What Can AI Do with Superset?
### Explore Your Data
Ask your AI assistant to browse what's available in your Superset instance:
- **List datasets** -- see all datasets you have access to, with filtering and search
- **Get dataset details** -- column names, types, available metrics, and filters
- **List charts and dashboards** -- find existing visualizations by name or keyword
- **Get chart and dashboard details** -- understand what a chart shows, its query, and configuration
**Example prompts:**
> "What datasets are available?"
> "Show me the columns in the sales_orders dataset"
> "Find dashboards related to revenue"
### Build Charts
Describe the visualization you want and AI creates it for you:
- **Create charts from natural language** -- describe what you want to see and AI picks the right chart type, metrics, and dimensions
- **Preview before saving** -- AI generates a preview so you can review before committing
- **Modify existing charts** -- update filters, change chart types, add metrics
- **Get Explore links** -- open any chart in Superset's Explore view for further refinement
**Example prompts:**
> "Create a bar chart showing monthly revenue by region from the sales dataset"
> "Update chart 42 to use a line chart instead"
> "Give me a link to explore this chart further"
### Create Dashboards
Build dashboards from a collection of charts:
- **Generate dashboards** -- create a new dashboard with a set of charts, automatically laid out
- **Add charts to existing dashboards** -- place a chart on an existing dashboard with automatic positioning
**Example prompts:**
> "Create a dashboard called 'Q4 Sales Overview' with charts 10, 15, and 22"
> "Add the revenue trend chart to the executive dashboard"
### Run SQL Queries
Execute SQL directly through your AI assistant:
- **Run queries** -- execute SQL with full Superset RBAC enforcement (you can only query data your roles allow)
- **Open SQL Lab** -- get a link to SQL Lab pre-populated with a query, ready to run and explore
**Example prompts:**
> "Run this query: SELECT region, SUM(revenue) FROM sales GROUP BY region"
> "Open SQL Lab with a query to show the top 10 customers by order count"
### Analyze Chart Data
Pull the raw data behind any chart:
- **Get chart data** -- retrieve the data a chart displays, with support for JSON, CSV, and Excel export formats
- **Inspect results** -- useful for verifying what a visualization shows or feeding data into other tools
**Example prompts:**
> "Get the data behind chart 42"
> "Export chart 15 data as CSV"
### Check Instance Status
- **Health check** -- verify your Superset instance is up and the MCP connection is working
- **Instance info** -- get high-level statistics about your Superset instance (number of datasets, charts, dashboards)
**Example prompts:**
> "Is Superset healthy?"
> "How many dashboards are in this instance?"
---
## Connecting Your AI Client
Once your admin has deployed the MCP server, connect your AI client using the instructions below.
### Claude Desktop
Edit your Claude Desktop config file:
- **macOS**: `~/Library/Application Support/Claude/claude_desktop_config.json`
- **Windows**: `%APPDATA%\Claude\claude_desktop_config.json`
- **Linux**: `~/.config/Claude/claude_desktop_config.json`
```json
{
"mcpServers": {
"superset": {
"url": "http://localhost:5008/mcp"
}
}
}
```
Restart Claude Desktop. The hammer icon in the chat bar confirms the connection.
If your admin has enabled JWT authentication, you may need to include a token:
```json
{
"mcpServers": {
"superset": {
"command": "npx",
"args": [
"-y",
"mcp-remote@latest",
"http://your-superset-host:5008/mcp",
"--header",
"Authorization: Bearer YOUR_TOKEN"
]
}
}
}
```
### Claude Code (CLI)
Add to your project's `.mcp.json`:
```json
{
"mcpServers": {
"superset": {
"type": "url",
"url": "http://localhost:5008/mcp"
}
}
}
```
### ChatGPT
1. Click your profile icon > **Settings** > **Apps and Connectors**
2. Enable **Developer Mode** in Advanced Settings
3. In the chat composer, press **+** > **Add sources** > **App** > **Connect more** > **Create app**
4. Enter a name and your MCP server URL
5. Click **I understand and continue**
:::info
ChatGPT MCP connectors require a Pro, Team, Enterprise, or Edu plan.
:::
Ask your admin for the MCP server URL and any authentication tokens you need.
---
## Tips for Best Results
- **Be specific** -- "Create a bar chart of monthly revenue by region from the sales dataset" works better than "Make me a chart"
- **Start with exploration** -- ask what datasets and charts exist before creating new ones
- **Review AI-generated content** -- always check chart configurations and SQL before saving or sharing
- **Use Explore for refinement** -- ask AI for an Explore link, then fine-tune interactively in the Superset UI
- **Check permissions if you get errors** -- AI respects Superset's RBAC, so you can only access data your roles allow
---
## Available Tools Reference
| Tool | Description |
|------|-------------|
| `health_check` | Verify the MCP server is running and connected |
| `get_instance_info` | Get instance statistics (dataset, chart, dashboard counts) |
| `get_schema` | Discover available charts, datasets, and dashboards with schema info |
| `list_datasets` | List datasets with filtering and search |
| `get_dataset_info` | Get dataset metadata (columns, metrics, filters) |
| `list_charts` | List charts with filtering and search |
| `get_chart_info` | Get chart metadata and configuration |
| `get_chart_data` | Retrieve chart data (JSON, CSV, or Excel) |
| `get_chart_preview` | Generate a chart preview (URL, ASCII, table, or Vega-Lite) |
| `generate_chart` | Create a new chart from a specification |
| `update_chart` | Modify an existing chart's configuration |
| `update_chart_preview` | Update a cached chart preview without saving |
| `list_dashboards` | List dashboards with filtering and search |
| `get_dashboard_info` | Get dashboard metadata and layout |
| `generate_dashboard` | Create a new dashboard with specified charts |
| `add_chart_to_existing_dashboard` | Add a chart to an existing dashboard |
| `execute_sql` | Run a SQL query with RBAC enforcement |
| `open_sql_lab_with_context` | Open SQL Lab with a pre-populated query |
| `generate_explore_link` | Generate an Explore URL for interactive visualization |
---
## Troubleshooting
### "Connection refused" or "Cannot connect"
- Confirm the MCP server URL with your admin
- For Claude Desktop: fully quit the app (not just close the window) and restart after config changes
- Check that the URL path ends with `/mcp` (e.g., `http://localhost:5008/mcp`)
### "Permission denied" or missing data
- Superset's RBAC controls what you can access through AI, just like in the Superset UI
- Ask your admin to verify your roles and permissions
- Try accessing the same data through the Superset web UI to confirm your access
### "Response too large"
- Ask for smaller result sets: use filters, reduce `page_size`, or request specific columns
- Example: "Show me the top 10 rows from the sales dataset" instead of "Show me all sales data"
### AI doesn't see Superset tools
- Verify the connection in your AI client (e.g., the hammer icon in Claude Desktop)
- Ask the AI "What Superset tools are available?" to confirm the connection
- Restart your AI client if you recently changed the configuration

View File

@@ -69,8 +69,8 @@
"@superset-ui/core": "^0.20.4",
"@swc/core": "^1.15.17",
"antd": "^6.3.2",
"baseline-browser-mapping": "^2.10.7",
"caniuse-lite": "^1.0.30001780",
"baseline-browser-mapping": "^2.10.0",
"caniuse-lite": "^1.0.30001775",
"docusaurus-plugin-openapi-docs": "^4.6.0",
"docusaurus-theme-openapi-docs": "^4.6.0",
"js-yaml": "^4.1.1",
@@ -106,7 +106,7 @@
"globals": "^17.4.0",
"prettier": "^3.8.1",
"typescript": "~5.9.3",
"typescript-eslint": "^8.57.1",
"typescript-eslint": "^8.56.1",
"webpack": "^5.105.4"
},
"browserslist": {

View File

@@ -202,7 +202,7 @@ curl -H "Authorization: Bearer YOUR_ACCESS_TOKEN" \\
mdx += `| Method | Endpoint | Description |\n`;
mdx += `|--------|----------|-------------|\n`;
for (const ep of tagEndpoints['Security']) {
mdx += `| \`${ep.method}\` | [${ep.summary}](/developer-docs/api/${ep.slug}) | \`${ep.path}\` |\n`;
mdx += `| \`${ep.method}\` | [${ep.summary}](./api/${ep.slug}) | \`${ep.path}\` |\n`;
}
mdx += '\n';
renderedTags.add('Security');
@@ -229,7 +229,7 @@ curl -H "Authorization: Bearer YOUR_ACCESS_TOKEN" \\
mdx += `|--------|----------|-------------|\n`;
for (const ep of endpoints) {
mdx += `| \`${ep.method}\` | [${ep.summary}](/developer-docs/api/${ep.slug}) | \`${ep.path}\` |\n`;
mdx += `| \`${ep.method}\` | [${ep.summary}](./api/${ep.slug}) | \`${ep.path}\` |\n`;
}
mdx += `\n</details>\n\n`;
@@ -252,7 +252,7 @@ curl -H "Authorization: Bearer YOUR_ACCESS_TOKEN" \\
mdx += `|--------|----------|-------------|\n`;
for (const ep of endpoints) {
mdx += `| \`${ep.method}\` | [${ep.summary}](/developer-docs/api/${ep.slug}) | \`${ep.path}\` |\n`;
mdx += `| \`${ep.method}\` | [${ep.summary}](./api/${ep.slug}) | \`${ep.path}\` |\n`;
}
mdx += `\n</details>\n\n`;

View File

@@ -75,6 +75,12 @@
"lifecycle": "development",
"description": "Expand nested types in Presto into extra columns/arrays. Experimental, doesn't work with all nested types."
},
{
"name": "SEMANTIC_LAYERS",
"default": false,
"lifecycle": "development",
"description": "Enable semantic layers and show semantic views alongside datasets"
},
{
"name": "TABLE_V2_TIME_COMPARISON_ENABLED",
"default": false,

View File

@@ -5117,100 +5117,185 @@
dependencies:
"@types/yargs-parser" "*"
"@typescript-eslint/eslint-plugin@8.57.1", "@typescript-eslint/eslint-plugin@^8.52.0":
version "8.57.1"
resolved "https://registry.yarnpkg.com/@typescript-eslint/eslint-plugin/-/eslint-plugin-8.57.1.tgz#ddfdfb30f8b5ccee7f3c21798b377c51370edd55"
integrity sha512-Gn3aqnvNl4NGc6x3/Bqk1AOn0thyTU9bqDRhiRnUWezgvr2OnhYCWCgC8zXXRVqBsIL1pSDt7T9nJUe0oM0kDQ==
"@typescript-eslint/eslint-plugin@8.56.1":
version "8.56.1"
resolved "https://registry.yarnpkg.com/@typescript-eslint/eslint-plugin/-/eslint-plugin-8.56.1.tgz#b1ce606d87221daec571e293009675992f0aae76"
integrity sha512-Jz9ZztpB37dNC+HU2HI28Bs9QXpzCz+y/twHOwhyrIRdbuVDxSytJNDl6z/aAKlaRIwC7y8wJdkBv7FxYGgi0A==
dependencies:
"@eslint-community/regexpp" "^4.12.2"
"@typescript-eslint/scope-manager" "8.57.1"
"@typescript-eslint/type-utils" "8.57.1"
"@typescript-eslint/utils" "8.57.1"
"@typescript-eslint/visitor-keys" "8.57.1"
"@typescript-eslint/scope-manager" "8.56.1"
"@typescript-eslint/type-utils" "8.56.1"
"@typescript-eslint/utils" "8.56.1"
"@typescript-eslint/visitor-keys" "8.56.1"
ignore "^7.0.5"
natural-compare "^1.4.0"
ts-api-utils "^2.4.0"
"@typescript-eslint/parser@8.57.1", "@typescript-eslint/parser@^8.56.1":
version "8.57.1"
resolved "https://registry.yarnpkg.com/@typescript-eslint/parser/-/parser-8.57.1.tgz#d523e559b148264055c0a49a29d5f50c7de659c2"
integrity sha512-k4eNDan0EIMTT/dUKc/g+rsJ6wcHYhNPdY19VoX/EOtaAG8DLtKCykhrUnuHPYvinn5jhAPgD2Qw9hXBwrahsw==
"@typescript-eslint/eslint-plugin@^8.52.0":
version "8.56.0"
resolved "https://registry.yarnpkg.com/@typescript-eslint/eslint-plugin/-/eslint-plugin-8.56.0.tgz#5aec3db807a6b8437ea5d5ebf7bd16b4119aba8d"
integrity sha512-lRyPDLzNCuae71A3t9NEINBiTn7swyOhvUj3MyUOxb8x6g6vPEFoOU+ZRmGMusNC3X3YMhqMIX7i8ShqhT74Pw==
dependencies:
"@typescript-eslint/scope-manager" "8.57.1"
"@typescript-eslint/types" "8.57.1"
"@typescript-eslint/typescript-estree" "8.57.1"
"@typescript-eslint/visitor-keys" "8.57.1"
"@eslint-community/regexpp" "^4.12.2"
"@typescript-eslint/scope-manager" "8.56.0"
"@typescript-eslint/type-utils" "8.56.0"
"@typescript-eslint/utils" "8.56.0"
"@typescript-eslint/visitor-keys" "8.56.0"
ignore "^7.0.5"
natural-compare "^1.4.0"
ts-api-utils "^2.4.0"
"@typescript-eslint/parser@8.56.1", "@typescript-eslint/parser@^8.56.1":
version "8.56.1"
resolved "https://registry.yarnpkg.com/@typescript-eslint/parser/-/parser-8.56.1.tgz#21d13b3d456ffb08614c1d68bb9a4f8d9237cdc7"
integrity sha512-klQbnPAAiGYFyI02+znpBRLyjL4/BrBd0nyWkdC0s/6xFLkXYQ8OoRrSkqacS1ddVxf/LDyODIKbQ5TgKAf/Fg==
dependencies:
"@typescript-eslint/scope-manager" "8.56.1"
"@typescript-eslint/types" "8.56.1"
"@typescript-eslint/typescript-estree" "8.56.1"
"@typescript-eslint/visitor-keys" "8.56.1"
debug "^4.4.3"
"@typescript-eslint/project-service@8.57.1":
version "8.57.1"
resolved "https://registry.yarnpkg.com/@typescript-eslint/project-service/-/project-service-8.57.1.tgz#16af9fe16eedbd7085e4fdc29baa73715c0c55c5"
integrity sha512-vx1F37BRO1OftsYlmG9xay1TqnjNVlqALymwWVuYTdo18XuKxtBpCj1QlzNIEHlvlB27osvXFWptYiEWsVdYsg==
"@typescript-eslint/project-service@8.56.0":
version "8.56.0"
resolved "https://registry.yarnpkg.com/@typescript-eslint/project-service/-/project-service-8.56.0.tgz#bb8562fecd8f7922e676fc6a1189c20dd7991d73"
integrity sha512-M3rnyL1vIQOMeWxTWIW096/TtVP+8W3p/XnaFflhmcFp+U4zlxUxWj4XwNs6HbDeTtN4yun0GNTTDBw/SvufKg==
dependencies:
"@typescript-eslint/tsconfig-utils" "^8.56.0"
"@typescript-eslint/types" "^8.56.0"
debug "^4.4.3"
"@typescript-eslint/scope-manager@8.57.1":
version "8.57.1"
resolved "https://registry.yarnpkg.com/@typescript-eslint/scope-manager/-/scope-manager-8.57.1.tgz#4524d7e7b420cb501807499684d435ae129aaf35"
integrity sha512-hs/QcpCwlwT2L5S+3fT6gp0PabyGk4Q0Rv2doJXA0435/OpnSR3VRgvrp8Xdoc3UAYSg9cyUjTeFXZEPg/3OKg==
"@typescript-eslint/project-service@8.56.1":
version "8.56.1"
resolved "https://registry.yarnpkg.com/@typescript-eslint/project-service/-/project-service-8.56.1.tgz#65c8d645f028b927bfc4928593b54e2ecd809244"
integrity sha512-TAdqQTzHNNvlVFfR+hu2PDJrURiwKsUvxFn1M0h95BB8ah5jejas08jUWG4dBA68jDMI988IvtfdAI53JzEHOQ==
dependencies:
"@typescript-eslint/tsconfig-utils" "^8.56.1"
"@typescript-eslint/types" "^8.56.1"
debug "^4.4.3"
"@typescript-eslint/tsconfig-utils@8.57.1", "@typescript-eslint/tsconfig-utils@^8.57.1":
version "8.57.1"
resolved "https://registry.yarnpkg.com/@typescript-eslint/tsconfig-utils/-/tsconfig-utils-8.57.1.tgz#9233443ec716882a6f9e240fd900a73f0235f3d7"
integrity sha512-0lgOZB8cl19fHO4eI46YUx2EceQqhgkPSuCGLlGi79L2jwYY1cxeYc1Nae8Aw1xjgW3PKVDLlr3YJ6Bxx8HkWg==
"@typescript-eslint/type-utils@8.57.1":
version "8.57.1"
resolved "https://registry.yarnpkg.com/@typescript-eslint/type-utils/-/type-utils-8.57.1.tgz#c49af1347b5869ca85155547a8f34f84ab386fd9"
integrity sha512-+Bwwm0ScukFdyoJsh2u6pp4S9ktegF98pYUU0hkphOOqdMB+1sNQhIz8y5E9+4pOioZijrkfNO/HUJVAFFfPKA==
"@typescript-eslint/scope-manager@8.56.0":
version "8.56.0"
resolved "https://registry.yarnpkg.com/@typescript-eslint/scope-manager/-/scope-manager-8.56.0.tgz#604030a4c6433df3728effdd441d47f45a86edb4"
integrity sha512-7UiO/XwMHquH+ZzfVCfUNkIXlp/yQjjnlYUyYz7pfvlK3/EyyN6BK+emDmGNyQLBtLGaYrTAI6KOw8tFucWL2w==
dependencies:
"@typescript-eslint/types" "8.56.0"
"@typescript-eslint/visitor-keys" "8.56.0"
"@typescript-eslint/scope-manager@8.56.1":
version "8.56.1"
resolved "https://registry.yarnpkg.com/@typescript-eslint/scope-manager/-/scope-manager-8.56.1.tgz#254df93b5789a871351335dd23e20bc164060f24"
integrity sha512-YAi4VDKcIZp0O4tz/haYKhmIDZFEUPOreKbfdAN3SzUDMcPhJ8QI99xQXqX+HoUVq8cs85eRKnD+rne2UAnj2w==
dependencies:
"@typescript-eslint/types" "8.56.1"
"@typescript-eslint/visitor-keys" "8.56.1"
"@typescript-eslint/tsconfig-utils@8.56.0", "@typescript-eslint/tsconfig-utils@^8.56.0":
version "8.56.0"
resolved "https://registry.yarnpkg.com/@typescript-eslint/tsconfig-utils/-/tsconfig-utils-8.56.0.tgz#2538ce83cbc376e685487960cbb24b65fe2abc4e"
integrity sha512-bSJoIIt4o3lKXD3xmDh9chZcjCz5Lk8xS7Rxn+6l5/pKrDpkCwtQNQQwZ2qRPk7TkUYhrq3WPIHXOXlbXP0itg==
"@typescript-eslint/tsconfig-utils@8.56.1", "@typescript-eslint/tsconfig-utils@^8.56.1":
version "8.56.1"
resolved "https://registry.yarnpkg.com/@typescript-eslint/tsconfig-utils/-/tsconfig-utils-8.56.1.tgz#1afa830b0fada5865ddcabdc993b790114a879b7"
integrity sha512-qOtCYzKEeyr3aR9f28mPJqBty7+DBqsdd63eO0yyDwc6vgThj2UjWfJIcsFeSucYydqcuudMOprZ+x1SpF3ZuQ==
"@typescript-eslint/type-utils@8.56.0":
version "8.56.0"
resolved "https://registry.yarnpkg.com/@typescript-eslint/type-utils/-/type-utils-8.56.0.tgz#72b4edc1fc73988998f1632b3ec99c2a66eaac6e"
integrity sha512-qX2L3HWOU2nuDs6GzglBeuFXviDODreS58tLY/BALPC7iu3Fa+J7EOTwnX9PdNBxUI7Uh0ntP0YWGnxCkXzmfA==
dependencies:
"@typescript-eslint/types" "8.56.0"
"@typescript-eslint/typescript-estree" "8.56.0"
"@typescript-eslint/utils" "8.56.0"
debug "^4.4.3"
ts-api-utils "^2.4.0"
"@typescript-eslint/types@8.57.1", "@typescript-eslint/types@^8.57.1":
version "8.57.1"
resolved "https://registry.yarnpkg.com/@typescript-eslint/types/-/types-8.57.1.tgz#54b27a8a25a7b45b4f978c3f8e00c4c78f11142c"
integrity sha512-S29BOBPJSFUiblEl6RzPPjJt6w25A6XsBqRVDt53tA/tlL8q7ceQNZHTjPeONt/3S7KRI4quk+yP9jK2WjBiPQ==
"@typescript-eslint/typescript-estree@8.57.1":
version "8.57.1"
resolved "https://registry.yarnpkg.com/@typescript-eslint/typescript-estree/-/typescript-estree-8.57.1.tgz#a9fd28d4a0ec896aa9a9a7e0cead62ea24f99e76"
integrity sha512-ybe2hS9G6pXpqGtPli9Gx9quNV0TWLOmh58ADlmZe9DguLq0tiAKVjirSbtM1szG6+QH6rVXyU6GTLQbWnMY+g==
"@typescript-eslint/type-utils@8.56.1":
version "8.56.1"
resolved "https://registry.yarnpkg.com/@typescript-eslint/type-utils/-/type-utils-8.56.1.tgz#7a6c4fabf225d674644931e004302cbbdd2f2e24"
integrity sha512-yB/7dxi7MgTtGhZdaHCemf7PuwrHMenHjmzgUW1aJpO+bBU43OycnM3Wn+DdvDO/8zzA9HlhaJ0AUGuvri4oGg==
dependencies:
"@typescript-eslint/types" "8.56.1"
"@typescript-eslint/typescript-estree" "8.56.1"
"@typescript-eslint/utils" "8.56.1"
debug "^4.4.3"
ts-api-utils "^2.4.0"
"@typescript-eslint/types@8.56.0", "@typescript-eslint/types@^8.56.0":
version "8.56.0"
resolved "https://registry.yarnpkg.com/@typescript-eslint/types/-/types-8.56.0.tgz#a2444011b9a98ca13d70411d2cbfed5443b3526a"
integrity sha512-DBsLPs3GsWhX5HylbP9HNG15U0bnwut55Lx12bHB9MpXxQ+R5GC8MwQe+N1UFXxAeQDvEsEDY6ZYwX03K7Z6HQ==
"@typescript-eslint/types@8.56.1", "@typescript-eslint/types@^8.56.1":
version "8.56.1"
resolved "https://registry.yarnpkg.com/@typescript-eslint/types/-/types-8.56.1.tgz#975e5942bf54895291337c91b9191f6eb0632ab9"
integrity sha512-dbMkdIUkIkchgGDIv7KLUpa0Mda4IYjo4IAMJUZ+3xNoUXxMsk9YtKpTHSChRS85o+H9ftm51gsK1dZReY9CVw==
"@typescript-eslint/typescript-estree@8.56.0":
version "8.56.0"
resolved "https://registry.yarnpkg.com/@typescript-eslint/typescript-estree/-/typescript-estree-8.56.0.tgz#fadbc74c14c5bac947db04980ff58bb178701c2e"
integrity sha512-ex1nTUMWrseMltXUHmR2GAQ4d+WjkZCT4f+4bVsps8QEdh0vlBsaCokKTPlnqBFqqGaxilDNJG7b8dolW2m43Q==
dependencies:
"@typescript-eslint/project-service" "8.56.0"
"@typescript-eslint/tsconfig-utils" "8.56.0"
"@typescript-eslint/types" "8.56.0"
"@typescript-eslint/visitor-keys" "8.56.0"
debug "^4.4.3"
minimatch "^9.0.5"
semver "^7.7.3"
tinyglobby "^0.2.15"
ts-api-utils "^2.4.0"
"@typescript-eslint/typescript-estree@8.56.1":
version "8.56.1"
resolved "https://registry.yarnpkg.com/@typescript-eslint/typescript-estree/-/typescript-estree-8.56.1.tgz#3b9e57d8129a860c50864c42188f761bdef3eab0"
integrity sha512-qzUL1qgalIvKWAf9C1HpvBjif+Vm6rcT5wZd4VoMb9+Km3iS3Cv9DY6dMRMDtPnwRAFyAi7YXJpTIEXLvdfPxg==
dependencies:
"@typescript-eslint/project-service" "8.56.1"
"@typescript-eslint/tsconfig-utils" "8.56.1"
"@typescript-eslint/types" "8.56.1"
"@typescript-eslint/visitor-keys" "8.56.1"
debug "^4.4.3"
minimatch "^10.2.2"
semver "^7.7.3"
tinyglobby "^0.2.15"
ts-api-utils "^2.4.0"
"@typescript-eslint/utils@8.57.1":
version "8.57.1"
resolved "https://registry.yarnpkg.com/@typescript-eslint/utils/-/utils-8.57.1.tgz#e40f5a7fcff02fd24092a7b52bd6ec029fb50465"
integrity sha512-XUNSJ/lEVFttPMMoDVA2r2bwrl8/oPx8cURtczkSEswY5T3AeLmCy+EKWQNdL4u0MmAHOjcWrqJp2cdvgjn8dQ==
"@typescript-eslint/utils@8.56.0":
version "8.56.0"
resolved "https://registry.yarnpkg.com/@typescript-eslint/utils/-/utils-8.56.0.tgz#063ce6f702ec603de1b83ee795ed5e877d6f7841"
integrity sha512-RZ3Qsmi2nFGsS+n+kjLAYDPVlrzf7UhTffrDIKr+h2yzAlYP/y5ZulU0yeDEPItos2Ph46JAL5P/On3pe7kDIQ==
dependencies:
"@eslint-community/eslint-utils" "^4.9.1"
"@typescript-eslint/scope-manager" "8.56.0"
"@typescript-eslint/types" "8.56.0"
"@typescript-eslint/typescript-estree" "8.56.0"
"@typescript-eslint/visitor-keys@8.57.1":
version "8.57.1"
resolved "https://registry.yarnpkg.com/@typescript-eslint/visitor-keys/-/visitor-keys-8.57.1.tgz#3af4f88118924d3be983d4b8ae84803f11fe4563"
integrity sha512-YWnmJkXbofiz9KbnbbwuA2rpGkFPLbAIetcCNO6mJ8gdhdZ/v7WDXsoGFAJuM6ikUFKTlSQnjWnVO4ux+UzS6A==
"@typescript-eslint/utils@8.56.1":
version "8.56.1"
resolved "https://registry.yarnpkg.com/@typescript-eslint/utils/-/utils-8.56.1.tgz#5a86acaf9f1b4c4a85a42effb217f73059f6deb7"
integrity sha512-HPAVNIME3tABJ61siYlHzSWCGtOoeP2RTIaHXFMPqjrQKCGB9OgUVdiNgH7TJS2JNIQ5qQ4RsAUDuGaGme/KOA==
dependencies:
"@eslint-community/eslint-utils" "^4.9.1"
"@typescript-eslint/scope-manager" "8.56.1"
"@typescript-eslint/types" "8.56.1"
"@typescript-eslint/typescript-estree" "8.56.1"
"@typescript-eslint/visitor-keys@8.56.0":
version "8.56.0"
resolved "https://registry.yarnpkg.com/@typescript-eslint/visitor-keys/-/visitor-keys-8.56.0.tgz#7d6592ab001827d3ce052155edf7ecad19688d7d"
integrity sha512-q+SL+b+05Ud6LbEE35qe4A99P+htKTKVbyiNEe45eCbJFyh/HVK9QXwlrbz+Q4L8SOW4roxSVwXYj4DMBT7Ieg==
dependencies:
"@typescript-eslint/types" "8.56.0"
eslint-visitor-keys "^5.0.0"
"@typescript-eslint/visitor-keys@8.56.1":
version "8.56.1"
resolved "https://registry.yarnpkg.com/@typescript-eslint/visitor-keys/-/visitor-keys-8.56.1.tgz#50e03475c33a42d123dc99e63acf1841c0231f87"
integrity sha512-KiROIzYdEV85YygXw6BI/Dx4fnBlFQu6Mq4QE4MOH9fFnhohw6wX/OAvDY2/C+ut0I3RSPKenvZJIVYqJNkhEw==
dependencies:
"@typescript-eslint/types" "8.56.1"
eslint-visitor-keys "^5.0.0"
"@ungap/structured-clone@^1.0.0":
@@ -5876,10 +5961,10 @@ base64-js@^1.3.1, base64-js@^1.5.1:
resolved "https://registry.npmjs.org/base64-js/-/base64-js-1.5.1.tgz"
integrity sha512-AKpaYlHn8t4SVbOHCy+b5+KKgvR4vrsD8vbvrbiQJps7fKDTkjkDry6ji0rUJjC0kzbNePLwzxq8iypo41qeWA==
baseline-browser-mapping@^2.10.7, baseline-browser-mapping@^2.9.0, baseline-browser-mapping@^2.9.19:
version "2.10.7"
resolved "https://registry.yarnpkg.com/baseline-browser-mapping/-/baseline-browser-mapping-2.10.7.tgz#2c017adffe4f7bbe93c2e55526cc1869d36f588c"
integrity sha512-1ghYO3HnxGec0TCGBXiDLVns4eCSx4zJpxnHrlqFQajmhfKMQBzUGDdkMK7fUW7PTHTeLf+j87aTuKuuwWzMGw==
baseline-browser-mapping@^2.10.0, baseline-browser-mapping@^2.9.0, baseline-browser-mapping@^2.9.19:
version "2.10.0"
resolved "https://registry.yarnpkg.com/baseline-browser-mapping/-/baseline-browser-mapping-2.10.0.tgz#5b09935025bf8a80e29130251e337c6a7fc8cbb9"
integrity sha512-lIyg0szRfYbiy67j9KN8IyeD7q7hcmqnJ1ddWmNt19ItGpNN64mnllmxUNFIOdOm6by97jlL6wfpTTJrmnjWAA==
batch@0.6.1:
version "0.6.1"
@@ -6124,10 +6209,15 @@ caniuse-api@^3.0.0:
lodash.memoize "^4.1.2"
lodash.uniq "^4.5.0"
caniuse-lite@^1.0.0, caniuse-lite@^1.0.30001702, caniuse-lite@^1.0.30001759, caniuse-lite@^1.0.30001780:
version "1.0.30001780"
resolved "https://registry.yarnpkg.com/caniuse-lite/-/caniuse-lite-1.0.30001780.tgz#0e413de292808868a62ed9118822683fa120a110"
integrity sha512-llngX0E7nQci5BPJDqoZSbuZ5Bcs9F5db7EtgfwBerX9XGtkkiO4NwfDDIRzHTTwcYC8vC7bmeUEPGrKlR/TkQ==
caniuse-lite@^1.0.0, caniuse-lite@^1.0.30001702, caniuse-lite@^1.0.30001759:
version "1.0.30001770"
resolved "https://registry.yarnpkg.com/caniuse-lite/-/caniuse-lite-1.0.30001770.tgz#4dc47d3b263a50fbb243448034921e0a88591a84"
integrity sha512-x/2CLQ1jHENRbHg5PSId2sXq1CIO1CISvwWAj027ltMVG2UNgW+w9oH2+HzgEIRFembL8bUlXtfbBHR1fCg2xw==
caniuse-lite@^1.0.30001775:
version "1.0.30001775"
resolved "https://registry.yarnpkg.com/caniuse-lite/-/caniuse-lite-1.0.30001775.tgz#9572266e3f7f77efee5deac1efeb4795879d1b7f"
integrity sha512-s3Qv7Lht9zbVKE9XoTyRG6wVDCKdtOFIjBGg3+Yhn6JaytuNKPIjBMTMIY1AnOH3seL5mvF+x33oGAyK3hVt3A==
ccount@^2.0.0:
version "2.0.1"
@@ -11245,6 +11335,13 @@ minimatch@^5.0.1:
dependencies:
brace-expansion "^2.0.1"
minimatch@^9.0.5:
version "9.0.5"
resolved "https://registry.npmjs.org/minimatch/-/minimatch-9.0.5.tgz"
integrity sha512-G6T0ZX48xgozx7587koeX9Ys2NYy6Gmv//P89sEte9V9whIapMNF4idKxnW2QtCcLiTWlb/wfCabAtAFWhhBow==
dependencies:
brace-expansion "^2.0.1"
minimist@^1.2.0:
version "1.2.8"
resolved "https://registry.npmjs.org/minimist/-/minimist-1.2.8.tgz"
@@ -14861,15 +14958,15 @@ types-ramda@^0.30.1:
dependencies:
ts-toolbelt "^9.6.0"
typescript-eslint@^8.57.1:
version "8.57.1"
resolved "https://registry.yarnpkg.com/typescript-eslint/-/typescript-eslint-8.57.1.tgz#573f97d3e48bbb67290b47dde1b7cb3b5d01dc4f"
integrity sha512-fLvZWf+cAGw3tqMCYzGIU6yR8K+Y9NT2z23RwOjlNFF2HwSB3KhdEFI5lSBv8tNmFkkBShSjsCjzx1vahZfISA==
typescript-eslint@^8.56.1:
version "8.56.1"
resolved "https://registry.yarnpkg.com/typescript-eslint/-/typescript-eslint-8.56.1.tgz#15a9fcc5d2150a0d981772bb36f127a816fe103f"
integrity sha512-U4lM6pjmBX7J5wk4szltF7I1cGBHXZopnAXCMXb3+fZ3B/0Z3hq3wS/CCUB2NZBNAExK92mCU2tEohWuwVMsDQ==
dependencies:
"@typescript-eslint/eslint-plugin" "8.56.1"
"@typescript-eslint/parser" "8.56.1"
"@typescript-eslint/typescript-estree" "8.56.1"
"@typescript-eslint/utils" "8.56.1"
typescript@~5.9.3:
version "5.9.3"

View File

@@ -144,7 +144,7 @@ solr = ["sqlalchemy-solr >= 0.2.0"]
elasticsearch = ["elasticsearch-dbapi>=0.2.12, <0.3.0"]
exasol = ["sqlalchemy-exasol >= 2.4.0, <3.0"]
excel = ["xlrd>=1.2.0, <1.3"]
fastmcp = ["fastmcp>=3.1.0,<4.0"]
fastmcp = ["fastmcp==2.14.3"]
firebird = ["sqlalchemy-firebird>=0.7.0, <0.8"]
firebolt = ["firebolt-sqlalchemy>=1.0.0, <2"]
gevent = ["gevent>=23.9.1"]
@@ -285,6 +285,7 @@ module = [
"superset.tags.filters",
"superset.commands.security.update",
"superset.commands.security.create",
"superset.semantic_layers.api",
]
warn_unused_ignores = false

View File

@@ -120,7 +120,7 @@ flask==2.3.3
# flask-session
# flask-sqlalchemy
# flask-wtf
flask-appbuilder==5.1.0
flask-appbuilder==5.0.2
# via
# apache-superset (pyproject.toml)
# apache-superset-core

View File

@@ -10,8 +10,6 @@
# via
# -r requirements/development.in
# apache-superset
aiofile==3.9.0
# via py-key-value-aio
alembic==1.15.2
# via
# -c requirements/base-constraint.txt
@@ -28,10 +26,8 @@ anyio==4.11.0
# via
# httpx
# mcp
# py-key-value-aio
# sse-starlette
# starlette
# watchfiles
apispec==6.6.1
# via
# -c requirements/base-constraint.txt
@@ -69,7 +65,9 @@ bcrypt==4.3.0
# -c requirements/base-constraint.txt
# paramiko
beartype==0.22.5
# via py-key-value-aio
# via
# py-key-value-aio
# py-key-value-shared
billiard==4.2.1
# via
# -c requirements/base-constraint.txt
@@ -102,8 +100,6 @@ cachetools==6.2.1
# -c requirements/base-constraint.txt
# google-auth
# py-key-value-aio
caio==0.9.25
# via aiofile
cattrs==25.1.1
# via
# -c requirements/base-constraint.txt
@@ -142,6 +138,7 @@ click==8.2.1
# click-repl
# flask
# flask-appbuilder
# typer
# uvicorn
click-didyoumean==0.3.1
# via
@@ -159,6 +156,8 @@ click-repl==0.3.0
# via
# -c requirements/base-constraint.txt
# celery
cloudpickle==3.1.2
# via pydocket
cmdstanpy==1.1.0
# via prophet
colorama==0.4.6
@@ -207,6 +206,8 @@ deprecation==2.1.0
# apache-superset
dill==0.4.0
# via pylint
diskcache==5.6.3
# via py-key-value-aio
distlib==0.3.8
# via virtualenv
dnspython==2.7.0
@@ -236,7 +237,9 @@ et-xmlfile==2.0.0
# openpyxl
exceptiongroup==1.3.0
# via fastmcp
fastmcp==3.1.0
fakeredis==2.32.1
# via pydocket
fastmcp==2.14.3
# via apache-superset
filelock==3.20.3
# via
@@ -259,7 +262,7 @@ flask==2.3.3
# flask-sqlalchemy
# flask-testing
# flask-wtf
flask-appbuilder==5.1.0
flask-appbuilder==5.0.2
# via
# -c requirements/base-constraint.txt
# apache-superset
@@ -471,8 +474,6 @@ jsonpath-ng==1.7.0
# via
# -c requirements/base-constraint.txt
# apache-superset
jsonref==1.1.0
# via fastmcp
jsonschema==4.23.0
# via
# -c requirements/base-constraint.txt
@@ -503,6 +504,8 @@ limits==5.1.0
# via
# -c requirements/base-constraint.txt
# flask-limiter
lupa==2.6
# via fakeredis
mako==1.3.10
# via
# -c requirements/base-constraint.txt
@@ -600,7 +603,7 @@ openpyxl==3.1.5
# -c requirements/base-constraint.txt
# pandas
opentelemetry-api==1.39.1
# via fastmcp
# via pydocket
ordered-set==4.1.0
# via
# -c requirements/base-constraint.txt
@@ -619,7 +622,6 @@ packaging==25.0
# deprecation
# docker
# duckdb-engine
# fastmcp
# google-cloud-bigquery
# gunicorn
# limits
@@ -651,6 +653,8 @@ parsedatetime==2.6
# apache-superset
pathable==0.4.3
# via jsonschema-path
pathvalidate==3.3.1
# via py-key-value-aio
pgsanity==0.2.9
# via
# -c requirements/base-constraint.txt
@@ -687,6 +691,8 @@ prison==0.2.1
# flask-appbuilder
progress==1.6
# via apache-superset
prometheus-client==0.23.1
# via pydocket
prompt-toolkit==3.0.51
# via
# -c requirements/base-constraint.txt
@@ -708,8 +714,12 @@ psutil==6.1.0
# via apache-superset
psycopg2-binary==2.9.9
# via apache-superset
py-key-value-aio==0.4.4
# via fastmcp
py-key-value-aio==0.3.0
# via
# fastmcp
# pydocket
py-key-value-shared==0.3.0
# via py-key-value-aio
pyarrow==16.1.0
# via
# -c requirements/base-constraint.txt
@@ -748,6 +758,8 @@ pydantic-settings==2.10.1
# via mcp
pydata-google-auth==1.9.0
# via pandas-gbq
pydocket==0.17.1
# via fastmcp
pydruid==0.6.9
# via apache-superset
pyfakefs==5.3.5
@@ -832,6 +844,8 @@ python-dotenv==1.1.0
# apache-superset
# fastmcp
# pydantic-settings
python-json-logger==4.0.0
# via pydocket
python-ldap==3.4.4
# via apache-superset
python-multipart==0.0.20
@@ -852,13 +866,15 @@ pyyaml==6.0.2
# -c requirements/base-constraint.txt
# apache-superset
# apispec
# fastmcp
# jsonschema-path
# pre-commit
redis==5.3.1
# via
# -c requirements/base-constraint.txt
# apache-superset
# fakeredis
# py-key-value-aio
# pydocket
referencing==0.36.2
# via
# -c requirements/base-constraint.txt
@@ -894,7 +910,9 @@ rich==13.9.4
# cyclopts
# fastmcp
# flask-limiter
# pydocket
# rich-rst
# typer
rich-rst==1.3.1
# via cyclopts
rpds-py==0.25.0
@@ -926,6 +944,8 @@ setuptools==80.9.0
# pydata-google-auth
# zope-event
# zope-interface
shellingham==1.5.4
# via typer
shillelagh==1.4.3
# via
# -c requirements/base-constraint.txt
@@ -953,6 +973,7 @@ sniffio==1.3.1
sortedcontainers==2.4.0
# via
# -c requirements/base-constraint.txt
# fakeredis
# trio
sqlalchemy==1.4.54
# via
@@ -996,8 +1017,6 @@ tabulate==0.9.0
# via
# -c requirements/base-constraint.txt
# apache-superset
tomli-w==1.2.0
# via apache-superset-extensions-cli
tomlkit==0.13.3
# via pylint
tqdm==4.67.1
@@ -1015,6 +1034,8 @@ trio-websocket==0.12.2
# via
# -c requirements/base-constraint.txt
# selenium
typer==0.20.0
# via pydocket
typing-extensions==4.15.0
# via
# -c requirements/base-constraint.txt
@@ -1027,14 +1048,16 @@ typing-extensions==4.15.0
# limits
# mcp
# opentelemetry-api
# py-key-value-aio
# py-key-value-shared
# pydantic
# pydantic-core
# pydocket
# pyopenssl
# referencing
# selenium
# shillelagh
# starlette
# typer
# typing-inspection
typing-inspection==0.4.1
# via
@@ -1049,8 +1072,6 @@ tzdata==2025.2
# pandas
tzlocal==5.2
# via trino
uncalled-for==0.2.0
# via fastmcp
url-normalize==2.2.1
# via
# -c requirements/base-constraint.txt
@@ -1080,8 +1101,6 @@ watchdog==6.0.0
# -c requirements/base-constraint.txt
# apache-superset
# apache-superset-extensions-cli
watchfiles==1.1.1
# via fastmcp
wcwidth==0.2.13
# via
# -c requirements/base-constraint.txt

View File

@@ -43,6 +43,8 @@ classifiers = [
]
dependencies = [
"flask-appbuilder>=5.0.2,<6",
"isodate>=0.7.0",
"pyarrow>=16.0.0",
"pydantic>=2.8.0",
"sqlalchemy>=1.4.0,<2.0",
"sqlalchemy-utils>=0.38.0, <0.43", # expanding lowerbound to work with pydoris

View File

@@ -37,13 +37,6 @@ Usage:
from typing import Any, Callable, TypeVar
try:
from mcp.types import ToolAnnotations
except (
ImportError
): # MCP extras may not be installed in superset-core-only environments
ToolAnnotations = dict
# Type variable for decorated functions
F = TypeVar("F", bound=Callable[..., Any])
@@ -55,15 +48,11 @@ def tool(
description: str | None = None,
tags: list[str] | None = None,
protect: bool = True,
class_permission_name: str | None = None,
method_permission_name: str | None = None,
annotations: ToolAnnotations | None = None,
) -> Any: # Use Any to avoid mypy issues with dependency injection
"""
Decorator to register an MCP tool with optional authentication.
This decorator combines FastMCP tool registration with optional authentication
and RBAC permission checking.
This decorator combines FastMCP tool registration with optional authentication.
Can be used as:
@tool
@@ -80,13 +69,6 @@ def tool(
description: Tool description (defaults to function docstring)
tags: List of tags for categorizing the tool (defaults to empty list)
protect: Whether to require Superset authentication (defaults to True)
class_permission_name: FAB view/resource name for RBAC checking
(e.g., "Chart", "Dashboard", "SQLLab"). When set, enables
permission checking via security_manager.can_access().
method_permission_name: FAB action name (e.g., "read", "write").
Defaults to "write" if tags includes "mutate", else "read".
annotations: MCP tool annotations (title, readOnlyHint, destructiveHint, etc.)
These hints help MCP clients understand tool behavior and safety.
Returns:
Decorator function that registers and wraps the tool, or the wrapped function
@@ -108,18 +90,6 @@ def tool(
def public_tool() -> str:
'''Public tool accessible without auth'''
return "Hello world"
@tool(class_permission_name="Chart") # RBAC: requires can_read on Chart
def list_charts() -> list:
'''List charts the user can access'''
return []
@tool( # RBAC: can_write on Chart
tags=["mutate"], class_permission_name="Chart",
)
def create_chart(name: str) -> dict:
'''Create a new chart'''
return {"name": name}
"""
raise NotImplementedError(
"MCP tool decorator not initialized. "
@@ -188,5 +158,4 @@ def prompt(
__all__ = [
"tool",
"prompt",
"ToolAnnotations",
]

View File

@@ -0,0 +1,73 @@
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership. The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.
from __future__ import annotations
from typing import Any
from pydantic import BaseModel
def build_configuration_schema(
config_class: type[BaseModel],
configuration: BaseModel | None = None,
) -> dict[str, Any]:
"""
Build a JSON schema from a Pydantic configuration class.
Handles generic boilerplate that any semantic layer with dynamic fields needs:
- Reorders properties to match model field order (Pydantic sorts alphabetically)
- When ``configuration`` is None, sets ``enum: []`` on all ``x-dynamic`` properties
so the frontend renders them as empty dropdowns
Semantic layer implementations should call this instead of
``model_json_schema()`` directly,
so they only need to add their own dynamic population logic.
"""
schema = config_class.model_json_schema()
# Pydantic sorts properties alphabetically; restore model field order
field_order = [
field.alias or name for name, field in config_class.model_fields.items()
]
schema["properties"] = {
key: schema["properties"][key]
for key in field_order
if key in schema["properties"]
}
if configuration is None:
for prop_schema in schema["properties"].values():
if prop_schema.get("x-dynamic"):
prop_schema["enum"] = []
return schema
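As a concrete illustration of the two fixups above, here is a minimal sketch using plain dicts in place of real ``model_json_schema()`` output (the field names are hypothetical):

```python
# Hand-written stand-ins for model_json_schema() output (alphabetical) and
# the model's declared field order; the names are illustrative only.
schema = {
    "properties": {
        "account": {"type": "string"},
        "database": {"type": "string", "x-dynamic": True},
        "warehouse": {"type": "string", "x-dynamic": True},
    }
}
field_order = ["account", "warehouse", "database"]

# Restore model field order (dicts preserve insertion order in Python 3.7+).
schema["properties"] = {
    key: schema["properties"][key]
    for key in field_order
    if key in schema["properties"]
}

# With no saved configuration, dynamic properties render as empty dropdowns.
for prop_schema in schema["properties"].values():
    if prop_schema.get("x-dynamic"):
        prop_schema["enum"] = []

print(list(schema["properties"]))  # ['account', 'warehouse', 'database']
```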
def check_dependencies(
prop_schema: dict[str, Any],
configuration: BaseModel,
) -> bool:
"""
Check whether a dynamic property's dependencies are satisfied.
Reads the ``x-dependsOn`` list from the property schema and returns ``True``
when every referenced attribute on ``configuration`` is truthy.
"""
dependencies = prop_schema.get("x-dependsOn", [])
return all(getattr(configuration, dep, None) for dep in dependencies)
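For example, ``check_dependencies`` can gate a dynamic ``database`` dropdown on the connection fields being filled in first. The sketch below inlines the helper so it runs standalone; the field names are hypothetical:

```python
from types import SimpleNamespace

def check_dependencies(prop_schema: dict, configuration) -> bool:
    # Inline copy of the helper above so this sketch is self-contained.
    dependencies = prop_schema.get("x-dependsOn", [])
    return all(getattr(configuration, dep, None) for dep in dependencies)

# A dynamic property that should only be populated once both connection
# fields are present (field names are illustrative only).
prop = {"x-dynamic": True, "x-dependsOn": ["account", "database"]}

partial = SimpleNamespace(account="acme", database=None)
complete = SimpleNamespace(account="acme", database="analytics")

print(check_dependencies(prop, partial))   # False: database still unset
print(check_dependencies(prop, complete))  # True: safe to fetch dynamic values
```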

View File

@@ -0,0 +1,169 @@
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership. The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.
"""
Semantic layer DAO interfaces for superset-core.
Provides abstract DAO classes for semantic layers and views that define the
interface contract. Host implementations replace these with concrete classes
backed by SQLAlchemy during initialization.
Usage:
from superset_core.semantic_layers.daos import (
AbstractSemanticLayerDAO,
AbstractSemanticViewDAO,
)
"""
from __future__ import annotations
from abc import abstractmethod
from typing import Any, ClassVar
from superset_core.common.daos import BaseDAO
from superset_core.semantic_layers.models import SemanticLayerModel, SemanticViewModel
class AbstractSemanticLayerDAO(BaseDAO[SemanticLayerModel]):
"""
Abstract DAO interface for SemanticLayer.
Host implementations will replace this class during initialization
with a concrete DAO providing actual database access.
"""
model_cls: ClassVar[type[Any] | None] = None
base_filter = None
id_column_name = "uuid"
uuid_column_name = "uuid"
@classmethod
@abstractmethod
def validate_uniqueness(cls, name: str) -> bool:
"""
Validate that a semantic layer name is unique.
:param name: Semantic layer name to validate
:return: True if the name is unique, False otherwise
"""
...
@classmethod
@abstractmethod
def validate_update_uniqueness(cls, layer_uuid: str, name: str) -> bool:
"""
Validate that a semantic layer name is unique for an update operation,
excluding the layer being updated.
:param layer_uuid: UUID of the semantic layer being updated
:param name: New name to validate
:return: True if the name is unique, False otherwise
"""
...
@classmethod
@abstractmethod
def find_by_name(cls, name: str) -> SemanticLayerModel | None:
"""
Find a semantic layer by name.
:param name: Semantic layer name
:return: SemanticLayerModel instance or None
"""
...
@classmethod
@abstractmethod
def get_semantic_views(cls, layer_uuid: str) -> list[SemanticViewModel]:
"""
Get all semantic views associated with a semantic layer.
:param layer_uuid: UUID of the semantic layer
:return: List of SemanticViewModel instances
"""
...
class AbstractSemanticViewDAO(BaseDAO[SemanticViewModel]):
"""
Abstract DAO interface for SemanticView.
Host implementations will replace this class during initialization
with a concrete DAO providing actual database access.
"""
model_cls: ClassVar[type[Any] | None] = None
base_filter = None
id_column_name = "id"
uuid_column_name = "uuid"
@classmethod
@abstractmethod
def validate_uniqueness(
cls,
name: str,
layer_uuid: str,
configuration: dict[str, Any],
) -> bool:
"""
Validate that a semantic view is unique within a semantic layer.
Uniqueness is determined by the combination of name, layer UUID, and
configuration.
:param name: View name
:param layer_uuid: UUID of the parent semantic layer
:param configuration: Configuration dict to compare
:return: True if unique, False otherwise
"""
...
@classmethod
@abstractmethod
def validate_update_uniqueness(
cls,
view_uuid: str,
name: str,
layer_uuid: str,
configuration: dict[str, Any],
) -> bool:
"""
Validate that a semantic view is unique within a semantic layer for an
update operation, excluding the view being updated.
:param view_uuid: UUID of the view being updated
:param name: New name to validate
:param layer_uuid: UUID of the parent semantic layer
:param configuration: Configuration dict to compare
:return: True if unique, False otherwise
"""
...
@classmethod
@abstractmethod
def find_by_name(cls, name: str, layer_uuid: str) -> SemanticViewModel | None:
"""
Find a semantic view by name within a semantic layer.
:param name: View name
:param layer_uuid: UUID of the parent semantic layer
:return: SemanticViewModel instance or None
"""
...
__all__ = ["AbstractSemanticLayerDAO", "AbstractSemanticViewDAO"]
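To make the contract concrete, here is a hypothetical in-memory implementation of the two uniqueness checks; real host implementations back these with SQLAlchemy queries instead:

```python
class InMemorySemanticLayerDAO:
    """Hypothetical stand-in illustrating the AbstractSemanticLayerDAO contract."""

    _layers = {}  # uuid -> name

    @classmethod
    def validate_uniqueness(cls, name: str) -> bool:
        # A name is unique if no existing layer already uses it.
        return name not in cls._layers.values()

    @classmethod
    def validate_update_uniqueness(cls, layer_uuid: str, name: str) -> bool:
        # Same check, but skip the layer that is being updated.
        return all(
            existing_name != name
            for uuid, existing_name in cls._layers.items()
            if uuid != layer_uuid
        )

InMemorySemanticLayerDAO._layers = {"u1": "Sales", "u2": "Finance"}
print(InMemorySemanticLayerDAO.validate_uniqueness("Sales"))               # False
print(InMemorySemanticLayerDAO.validate_update_uniqueness("u1", "Sales"))  # True
```

Renaming layer ``u1`` to its own current name passes the update check, which is exactly why the two methods are separate in the abstract interface.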

View File

@@ -0,0 +1,102 @@
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership. The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.
"""
Semantic layer registration decorator for Superset.
This module provides a decorator interface to register semantic layer
implementations with the host application, enabling automatic discovery
by the extensions framework.
Usage:
from superset_core.semantic_layers.decorators import semantic_layer
@semantic_layer(
id="snowflake",
name="Snowflake Cortex",
description="Snowflake semantic layer via Cortex Analyst",
)
class SnowflakeSemanticLayer(SemanticLayer[SnowflakeConfig, SnowflakeView]):
...
# Or with minimal arguments:
@semantic_layer(id="dbt", name="dbt Semantic Layer")
class DbtSemanticLayer(SemanticLayer[DbtConfig, DbtView]):
...
"""
from __future__ import annotations
from typing import Callable, TypeVar
# Type variable for decorated semantic layer classes
T = TypeVar("T")
def semantic_layer(
id: str,
name: str,
description: str | None = None,
) -> Callable[[T], T]:
"""
Decorator to register a semantic layer implementation.
Automatically detects extension context and applies appropriate
namespacing to prevent ID conflicts between host and extension
semantic layers.
Host implementations will replace this function during initialization
with a concrete implementation providing actual functionality.
Args:
id: Unique semantic layer type identifier (e.g., "snowflake",
"dbt"). Used as the key in the semantic layers registry and
stored in the ``type`` column of the ``SemanticLayer`` model.
name: Human-readable display name (e.g., "Snowflake Cortex").
Shown in the UI when listing available semantic layer types.
description: Optional description for documentation and UI
tooltips.
Returns:
Decorated semantic layer class registered with the host
application.
Raises:
NotImplementedError: If called before host implementation is
initialized.
Example:
from superset_core.semantic_layers.decorators import semantic_layer
from superset_core.semantic_layers.layer import SemanticLayer
@semantic_layer(
id="snowflake",
name="Snowflake Cortex",
description="Connect to Snowflake Cortex Analyst",
)
class SnowflakeSemanticLayer(
SemanticLayer[SnowflakeConfig, SnowflakeView]
):
...
"""
raise NotImplementedError(
"Semantic layer decorator not initialized. "
"This decorator should be replaced during Superset startup."
)
__all__ = ["semantic_layer"]

View File

@@ -0,0 +1,129 @@
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership. The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.
from __future__ import annotations
from abc import ABC, abstractmethod
from typing import Any, Generic, TypeVar
from pydantic import BaseModel
from superset_core.semantic_layers.view import SemanticView
ConfigT = TypeVar("ConfigT", bound=BaseModel)
SemanticViewT = TypeVar("SemanticViewT", bound="SemanticView")
class SemanticLayer(ABC, Generic[ConfigT, SemanticViewT]):
"""
Abstract base class for semantic layers.
"""
configuration_class: type[BaseModel]
@classmethod
@abstractmethod
def from_configuration(
cls,
configuration: dict[str, Any],
) -> SemanticLayer[ConfigT, SemanticViewT]:
"""
Create a semantic layer from its configuration.
"""
raise NotImplementedError(
"Semantic layers must implement the from_configuration method"
)
@classmethod
@abstractmethod
def get_configuration_schema(
cls,
configuration: ConfigT | None = None,
) -> dict[str, Any]:
"""
Get the JSON schema for the configuration needed to add the semantic layer.
        A partial ``configuration`` can be passed to refine the schema,
        enabling progressive validation and a better UX. For example, a
        semantic layer might require:
- auth information
- a database
If the user provides the auth information, a client can send the partial
configuration to this method, and the resulting JSON schema would include
the list of databases the user has access to, allowing a dropdown to be
populated.
The Snowflake semantic layer has an example implementation of this method, where
database and schema names are populated based on the provided connection info.
"""
raise NotImplementedError(
"Semantic layers must implement the get_configuration_schema method"
)
@classmethod
@abstractmethod
def get_runtime_schema(
cls,
configuration: ConfigT,
runtime_data: dict[str, Any] | None = None,
) -> dict[str, Any]:
"""
Get the JSON schema for the runtime parameters needed to load semantic views.
This returns the schema needed to connect to a semantic view given the
configuration for the semantic layer. For example, a semantic layer might
be configured by:
- auth information
- an optional database
If the user does not provide a database when creating the semantic layer, the
runtime schema would require the database name to be provided before loading any
semantic views. This allows users to create semantic layers that connect to a
specific database (or project, account, etc.), or that allow users to select it
at query time.
The Snowflake semantic layer has an example implementation of this method, where
database and schema names are required if they were not provided in the initial
configuration.
"""
raise NotImplementedError(
"Semantic layers must implement the get_runtime_schema method"
)
@abstractmethod
def get_semantic_views(
self,
runtime_configuration: dict[str, Any],
) -> set[SemanticViewT]:
"""
Get the semantic views available in the semantic layer.
The runtime configuration can provide information like a given project or
schema, used to restrict the semantic views returned.
"""
@abstractmethod
def get_semantic_view(
self,
name: str,
additional_configuration: dict[str, Any],
) -> SemanticViewT:
"""
Get a specific semantic view by its name and additional configuration.
"""


@@ -0,0 +1,85 @@
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership. The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.
"""
Semantic layer model interfaces for superset-core.
Provides abstract model classes for semantic layers and views that will be
replaced by the host implementation's concrete SQLAlchemy models during
initialization.
Usage:
from superset_core.semantic_layers.models import (
SemanticLayerModel,
SemanticViewModel,
)
"""
from __future__ import annotations
from datetime import datetime
from uuid import UUID
from superset_core.common.models import CoreModel
class SemanticLayerModel(CoreModel):
"""
Abstract interface for the SemanticLayer database model.
Host implementations will replace this class during initialization
with a concrete SQLAlchemy model providing actual persistence.
"""
__abstract__ = True
# Type hints for expected column attributes
uuid: UUID
name: str
description: str | None
type: str
configuration: str
configuration_version: int
cache_timeout: int | None
created_on: datetime | None
changed_on: datetime | None
class SemanticViewModel(CoreModel):
"""
Abstract interface for the SemanticView database model.
Host implementations will replace this class during initialization
with a concrete SQLAlchemy model providing actual persistence.
"""
__abstract__ = True
# Type hints for expected column attributes
id: int
uuid: UUID
name: str
description: str | None
configuration: str
configuration_version: int
cache_timeout: int | None
semantic_layer_uuid: UUID
created_on: datetime | None
changed_on: datetime | None
__all__ = ["SemanticLayerModel", "SemanticViewModel"]


@@ -0,0 +1,209 @@
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership. The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.
from __future__ import annotations
import enum
from dataclasses import dataclass
from datetime import date, datetime, time, timedelta
import isodate
import pyarrow as pa
@dataclass(frozen=True)
class Grain:
"""
Represents a time grain (e.g., day, month, year).
Attributes:
name: Human-readable name of the grain (e.g., "Second")
representation: ISO 8601 duration (e.g., "PT1S", "P1D", "P1M")
"""
name: str
representation: str
def __post_init__(self) -> None:
isodate.parse_duration(self.representation)
def __eq__(self, other: object) -> bool:
if isinstance(other, Grain):
return self.representation == other.representation
return NotImplemented
def __hash__(self) -> int:
return hash(self.representation)
class Grains:
"""Pre-defined common grains and factory for custom ones."""
SECOND = Grain("Second", "PT1S")
MINUTE = Grain("Minute", "PT1M")
HOUR = Grain("Hour", "PT1H")
DAY = Grain("Day", "P1D")
WEEK = Grain("Week", "P1W")
MONTH = Grain("Month", "P1M")
QUARTER = Grain("Quarter", "P3M")
YEAR = Grain("Year", "P1Y")
_REGISTRY: dict[str, Grain] = {
"PT1S": SECOND,
"PT1M": MINUTE,
"PT1H": HOUR,
"P1D": DAY,
"P1W": WEEK,
"P1M": MONTH,
"P3M": QUARTER,
"P1Y": YEAR,
}
@classmethod
def get(cls, representation: str, name: str | None = None) -> Grain:
"""Return a pre-defined grain or create a custom one."""
if grain := cls._REGISTRY.get(representation):
return grain
return Grain(name or representation, representation)
@dataclass(frozen=True)
class Dimension:
id: str
name: str
type: pa.DataType
definition: str | None = None
description: str | None = None
grain: Grain | None = None
@dataclass(frozen=True)
class Metric:
id: str
name: str
type: pa.DataType
definition: str
description: str | None = None
@dataclass(frozen=True)
class AdhocExpression:
id: str
definition: str
class Operator(str, enum.Enum):
EQUALS = "="
NOT_EQUALS = "!="
GREATER_THAN = ">"
LESS_THAN = "<"
GREATER_THAN_OR_EQUAL = ">="
LESS_THAN_OR_EQUAL = "<="
IN = "IN"
NOT_IN = "NOT IN"
LIKE = "LIKE"
NOT_LIKE = "NOT LIKE"
IS_NULL = "IS NULL"
IS_NOT_NULL = "IS NOT NULL"
ADHOC = "ADHOC"
FilterValues = str | int | float | bool | datetime | date | time | timedelta | None
class PredicateType(enum.Enum):
WHERE = "WHERE"
HAVING = "HAVING"
@dataclass(frozen=True, order=True)
class Filter:
type: PredicateType
column: Dimension | Metric | None
operator: Operator
value: FilterValues | frozenset[FilterValues]
class OrderDirection(enum.Enum):
ASC = "ASC"
DESC = "DESC"
OrderTuple = tuple[Metric | Dimension | AdhocExpression, OrderDirection]
@dataclass(frozen=True)
class GroupLimit:
"""
Limit query to top/bottom N combinations of specified dimensions.
The `filters` parameter allows specifying separate filter constraints for the
group limit subquery. This is useful when you want to determine the top N groups
using different criteria (e.g., a different time range) than the main query.
For example, you might want to find the top 10 products by sales over the last
30 days, but then show daily sales for those products over the last 7 days.
"""
dimensions: list[Dimension]
top: int
metric: Metric | None
direction: OrderDirection = OrderDirection.DESC
group_others: bool = False
filters: set[Filter] | None = None
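The two-phase evaluation described above, ranking groups first and then restricting the main query to them, can be sketched with stdlib code. The row format is invented for illustration:

```python
from collections import defaultdict


def top_groups(
    rows: list[dict],
    dimensions: list[str],
    metric: str,
    top: int,
) -> set[tuple]:
    # Sum the metric per combination of dimension values, keep the top N.
    totals: dict[tuple, float] = defaultdict(float)
    for row in rows:
        key = tuple(row[d] for d in dimensions)
        totals[key] += row[metric]
    ranked = sorted(totals, key=totals.__getitem__, reverse=True)
    return set(ranked[:top])


rows = [
    {"product": "a", "sales": 10},
    {"product": "b", "sales": 30},
    {"product": "c", "sales": 20},
]
# Keep only rows belonging to the top 2 products by sales.
keep = top_groups(rows, ["product"], "sales", top=2)
limited = [r for r in rows if (r["product"],) in keep]
```

When `filters` is set, a real implementation would apply those filters inside the ranking phase only, so the top-N groups can be chosen over, say, a wider time range than the main query.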
@dataclass(frozen=True)
class SemanticRequest:
"""
Represents a request made to obtain semantic results.
This could be a SQL query, an HTTP request, etc.
"""
type: str
definition: str
@dataclass(frozen=True)
class SemanticResult:
"""
Represents the results of a semantic query.
    This includes any requests (SQL queries, HTTP requests) that were performed
    to obtain the results, to help with troubleshooting.
"""
requests: list[SemanticRequest]
results: pa.Table
@dataclass(frozen=True)
class SemanticQuery:
"""
Represents a semantic query.
"""
metrics: list[Metric]
dimensions: list[Dimension]
filters: set[Filter] | None = None
order: list[OrderTuple] | None = None
limit: int | None = None
offset: int | None = None
group_limit: GroupLimit | None = None
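Because `Grain` compares and hashes on `representation` alone, two grains with different display names but the same ISO 8601 duration collapse to a single entry in a set. A stdlib-only stand-in (dropping the `isodate` validation) behaves the same way:

```python
from dataclasses import dataclass, field


@dataclass(frozen=True)
class Grain:
    # Display name is excluded from comparison; identity is the ISO duration.
    name: str = field(compare=False)
    representation: str = field(compare=True)


day = Grain("Day", "P1D")
daily = Grain("Daily", "P1D")
assert day == daily            # same representation, so equal
assert len({day, daily}) == 1  # sets dedupe on representation
```

This is why `Grains.get` can hand out custom grains freely: a caller-supplied "Daily" grain is interchangeable with the predefined `Grains.DAY`.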


@@ -0,0 +1,108 @@
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership. The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.
from __future__ import annotations
import enum
from abc import ABC, abstractmethod
from superset_core.semantic_layers.types import (
Dimension,
Filter,
Metric,
SemanticQuery,
SemanticResult,
)
# TODO (betodealmeida): move to the extension JSON
class SemanticViewFeature(enum.Enum):
"""
Custom features supported by semantic layers.
"""
ADHOC_EXPRESSIONS_IN_ORDERBY = "ADHOC_EXPRESSIONS_IN_ORDERBY"
GROUP_LIMIT = "GROUP_LIMIT"
GROUP_OTHERS = "GROUP_OTHERS"
class SemanticView(ABC):
"""
Abstract base class for semantic views.
"""
features: frozenset[SemanticViewFeature]
@abstractmethod
def uid(self) -> str:
"""
Returns a unique identifier for the semantic view.
"""
@abstractmethod
def get_dimensions(self) -> set[Dimension]:
"""
Get the dimensions defined in the semantic view.
"""
@abstractmethod
def get_metrics(self) -> set[Metric]:
"""
Get the metrics defined in the semantic view.
"""
@abstractmethod
def get_values(
self,
dimension: Dimension,
filters: set[Filter] | None = None,
) -> SemanticResult:
"""
Return distinct values for a dimension.
"""
@abstractmethod
def get_table(self, query: SemanticQuery) -> SemanticResult:
"""
Execute a semantic query and return the results.
"""
@abstractmethod
def get_row_count(self, query: SemanticQuery) -> SemanticResult:
"""
Execute a query and return the number of rows the result would have.
"""
@abstractmethod
def get_compatible_metrics(
self,
selected_metrics: set[Metric],
selected_dimensions: set[Dimension],
) -> set[Metric]:
"""
Return metrics compatible with the selected dimensions.
"""
@abstractmethod
def get_compatible_dimensions(
self,
selected_metrics: set[Metric],
selected_dimensions: set[Dimension],
) -> set[Dimension]:
"""
Return dimensions compatible with the selected metrics.
"""
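The compatibility hooks let a client grey out invalid picks as the user builds a query. One plausible, purely illustrative implementation keeps a mapping from each metric to the dimensions it can be grouped by and checks it against the current selection:

```python
def get_compatible_metrics(
    compatibility: dict[str, set[str]],
    selected_dimensions: set[str],
) -> set[str]:
    # A metric stays selectable only if it supports every chosen dimension.
    return {
        metric
        for metric, dims in compatibility.items()
        if selected_dimensions <= dims
    }


# Hypothetical compatibility data for illustration.
compat = {
    "revenue": {"region", "product"},
    "latency": {"region"},
}
```

Selecting only `region` keeps both metrics; adding `product` narrows the set to `revenue`. `get_compatible_dimensions` would be the mirror image of the same lookup.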


@@ -17,7 +17,7 @@
[project]
name = "apache-superset-extensions-cli"
-version = "0.1.0rc2"
+version = "0.1.0rc1"
description = "Official command-line interface for building, bundling, and managing Apache Superset extensions"
readme = "README.md"
authors = [
@@ -49,7 +49,6 @@ dependencies = [
"jinja2>=3.1.6",
"semver>=3.0.4",
"tomli>=2.2.1; python_version < '3.11'",
"tomli-w>=1.2.0",
"watchdog>=6.0.0",
]


@@ -50,8 +50,6 @@ from superset_extensions_cli.utils import (
validate_display_name,
validate_publisher,
validate_technical_name,
write_json,
write_toml,
)
REMOTE_ENTRY_REGEX = re.compile(r"^remoteEntry\..+\.js$")
@@ -294,7 +292,6 @@ def app() -> None:
@app.command()
def validate() -> None:
"""Validate the extension structure and metadata consistency."""
validate_npm()
cwd = Path.cwd()
@@ -375,167 +372,12 @@ def validate() -> None:
click.secho(" Convention requires: frontend/src/index.tsx", fg="yellow")
sys.exit(1)
# Validate version and license consistency across extension.json, frontend, and backend
mismatches: list[str] = []
frontend_pkg_path = cwd / "frontend" / "package.json"
frontend_pkg = None
if frontend_pkg_path.is_file():
frontend_pkg = read_json(frontend_pkg_path)
if frontend_pkg:
if frontend_pkg.get("version") != extension.version:
mismatches.append(
f" frontend/package.json version: {frontend_pkg.get('version')} "
f"(expected {extension.version})"
)
if extension.license and frontend_pkg.get("license") != extension.license:
mismatches.append(
f" frontend/package.json license: {frontend_pkg.get('license')} "
f"(expected {extension.license})"
)
backend_pyproject_path = cwd / "backend" / "pyproject.toml"
if backend_pyproject_path.is_file():
backend_pyproject = read_toml(backend_pyproject_path)
if backend_pyproject:
project = backend_pyproject.get("project", {})
if project.get("version") != extension.version:
mismatches.append(
f" backend/pyproject.toml version: {project.get('version')} "
f"(expected {extension.version})"
)
if extension.license and project.get("license") != extension.license:
mismatches.append(
f" backend/pyproject.toml license: {project.get('license')} "
f"(expected {extension.license})"
)
if mismatches:
click.secho("❌ Metadata mismatch detected:", err=True, fg="red")
for mismatch in mismatches:
click.secho(mismatch, err=True, fg="red")
click.secho(
"Run `superset-extensions update` to sync from extension.json.",
fg="yellow",
)
sys.exit(1)
click.secho("✅ Validation successful", fg="green")
@app.command()
@click.option(
"--version",
"version_opt",
is_flag=False,
flag_value="__prompt__",
default=None,
help="Set a new version. Prompts for value if none given.",
)
@click.option(
"--license",
"license_opt",
is_flag=False,
flag_value="__prompt__",
default=None,
help="Set a new license. Prompts for value if none given.",
)
def update(version_opt: str | None, license_opt: str | None) -> None:
"""Update derived and generated files in the extension project."""
cwd = Path.cwd()
extension_json_path = cwd / "extension.json"
extension_data = read_json(extension_json_path)
if not extension_data:
click.secho("❌ extension.json not found.", err=True, fg="red")
sys.exit(1)
try:
extension = ExtensionConfig.model_validate(extension_data)
except Exception as e:
click.secho(f"❌ Invalid extension.json: {e}", err=True, fg="red")
sys.exit(1)
# Resolve version: prompt if flag used without value
if version_opt == "__prompt__":
version_opt = click.prompt("Version", default=extension.version)
target_version = (
version_opt
if version_opt and version_opt != extension.version
else extension.version
)
# Resolve license: prompt if flag used without value
if license_opt == "__prompt__":
license_opt = click.prompt("License", default=extension.license or "")
target_license = (
license_opt
if license_opt and license_opt != extension.license
else extension.license
)
updated: list[str] = []
# Update extension.json if version or license changed
ext_changed = False
if version_opt and version_opt != extension.version:
extension_data["version"] = target_version
ext_changed = True
if license_opt and license_opt != extension.license:
extension_data["license"] = target_license
ext_changed = True
if ext_changed:
try:
ExtensionConfig.model_validate(extension_data)
except Exception as e:
click.secho(f"❌ Invalid value: {e}", err=True, fg="red")
sys.exit(1)
write_json(extension_json_path, extension_data)
updated.append("extension.json")
# Update frontend/package.json
frontend_pkg_path = cwd / "frontend" / "package.json"
if frontend_pkg_path.is_file():
frontend_pkg = read_json(frontend_pkg_path)
if frontend_pkg:
pkg_changed = False
if frontend_pkg.get("version") != target_version:
frontend_pkg["version"] = target_version
pkg_changed = True
if target_license and frontend_pkg.get("license") != target_license:
frontend_pkg["license"] = target_license
pkg_changed = True
if pkg_changed:
write_json(frontend_pkg_path, frontend_pkg)
updated.append("frontend/package.json")
# Update backend/pyproject.toml
backend_pyproject_path = cwd / "backend" / "pyproject.toml"
if backend_pyproject_path.is_file():
backend_pyproject = read_toml(backend_pyproject_path)
if backend_pyproject:
project = backend_pyproject.setdefault("project", {})
toml_changed = False
if project.get("version") != target_version:
project["version"] = target_version
toml_changed = True
if target_license and project.get("license") != target_license:
project["license"] = target_license
toml_changed = True
if toml_changed:
write_toml(backend_pyproject_path, backend_pyproject)
updated.append("backend/pyproject.toml")
if updated:
for path in updated:
click.secho(f"✅ Updated {path}", fg="green")
else:
click.secho("✅ All files already up to date.", fg="green")
@app.command()
@click.pass_context
def build(ctx: click.Context) -> None:
"""Build extension assets."""
ctx.invoke(validate)
cwd = Path.cwd()
frontend_dir = cwd / "frontend"
@@ -571,7 +413,6 @@ def build(ctx: click.Context) -> None:
)
@click.pass_context
def bundle(ctx: click.Context, output: Path | None) -> None:
"""Package the extension into a .supx file."""
ctx.invoke(build)
cwd = Path.cwd()
@@ -612,7 +453,6 @@ def bundle(ctx: click.Context, output: Path | None) -> None:
@app.command()
@click.pass_context
def dev(ctx: click.Context) -> None:
"""Automatically rebuild the extension as files change."""
cwd = Path.cwd()
frontend_dir = cwd / "frontend"
backend_dir = cwd / "backend"
@@ -807,7 +647,6 @@ def init(
frontend_opt: bool | None,
backend_opt: bool | None,
) -> None:
"""Scaffold a new extension project."""
# Get extension names with graceful validation
names = prompt_for_extension_info(display_name_opt, publisher_opt, name_opt)
@@ -847,7 +686,7 @@ def init(
click.secho("✅ Created extension.json", fg="green")
# Create .gitignore
-    gitignore = env.get_template("gitignore.j2").render(ctx)
+    gitignore = env.get_template(".gitignore.j2").render(ctx)
(target_dir / ".gitignore").write_text(gitignore)
click.secho("✅ Created .gitignore", fg="green")


@@ -21,8 +21,6 @@ import sys
from pathlib import Path
from typing import Any
import tomli_w
from superset_core.extensions.constants import (
DISPLAY_NAME_PATTERN,
PUBLISHER_PATTERN,
@@ -111,14 +109,6 @@ def read_json(path: Path) -> dict[str, Any] | None:
return json.loads(path.read_text())
def write_json(path: Path, data: dict[str, Any]) -> None:
path.write_text(json.dumps(data, indent=2) + "\n")
def write_toml(path: Path, data: dict[str, Any]) -> None:
path.write_text(tomli_w.dumps(data))
def _normalize_for_identifiers(name: str) -> str:
"""
Normalize display name to clean lowercase words.


@@ -17,12 +17,10 @@
from __future__ import annotations
import json
import os
from pathlib import Path
import pytest
import tomli_w
from click.testing import CliRunner
@@ -140,69 +138,3 @@ def extension_setup_for_bundling():
(backend_dir / "__init__.py").write_text("# init")
return _setup
@pytest.fixture
def extension_with_versions():
"""Create an extension directory structure with configurable versions and licenses."""
def _create(
base_path: Path,
ext_version: str = "1.0.0",
frontend_version: str | None = None,
backend_version: str | None = None,
ext_license: str | None = "Apache-2.0",
frontend_license: str | None = None,
backend_license: str | None = None,
) -> None:
extension_json = {
"publisher": "test-org",
"name": "test-extension",
"displayName": "Test Extension",
"version": ext_version,
"permissions": [],
}
if ext_license is not None:
extension_json["license"] = ext_license
(base_path / "extension.json").write_text(json.dumps(extension_json))
if frontend_version is not None:
frontend_dir = base_path / "frontend"
frontend_dir.mkdir(exist_ok=True)
(frontend_dir / "src").mkdir(exist_ok=True)
(frontend_dir / "src" / "index.tsx").write_text("// entry")
pkg = {
"name": "@test-org/test-extension",
"version": frontend_version,
}
if frontend_license is not None:
pkg["license"] = frontend_license
elif ext_license is not None:
pkg["license"] = ext_license
(frontend_dir / "package.json").write_text(json.dumps(pkg, indent=2))
if backend_version is not None:
backend_dir = base_path / "backend"
backend_dir.mkdir(exist_ok=True)
src_dir = backend_dir / "src" / "test_org" / "test_extension"
src_dir.mkdir(parents=True, exist_ok=True)
(src_dir / "entrypoint.py").write_text("# entry")
project = {
"name": "test-org-test-extension",
"version": backend_version,
}
if backend_license is not None:
project["license"] = backend_license
elif ext_license is not None:
project["license"] = ext_license
pyproject = {
"project": project,
"tool": {
"apache_superset_extensions": {
"build": {"include": ["src/**/*.py"]}
}
},
}
(backend_dir / "pyproject.toml").write_text(tomli_w.dumps(pyproject))
return _create


@@ -121,7 +121,7 @@ def test_build_command_success_flow(
# Setup mocks
mock_rebuild_frontend.return_value = "remoteEntry.abc123.js"
mock_read_toml.return_value = {
-        "project": {"name": "test", "version": "1.0.0"},
+        "project": {"name": "test"},
"tool": {
"apache_superset_extensions": {
"build": {"include": ["src/test_org/test_extension/**/*.py"]}
@@ -162,7 +162,7 @@ def test_build_command_handles_frontend_build_failure(
# Setup mocks
mock_rebuild_frontend.return_value = None # Indicates failure
mock_read_toml.return_value = {
-        "project": {"name": "test", "version": "1.0.0"},
+        "project": {"name": "test"},
"tool": {
"apache_superset_extensions": {
"build": {"include": ["src/test_org/test_extension/**/*.py"]}


@@ -1,172 +0,0 @@
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership. The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.
from __future__ import annotations
import pytest
from superset_extensions_cli.cli import app
from superset_extensions_cli.utils import read_json, read_toml
@pytest.mark.cli
def test_update_syncs_versions(
cli_runner, isolated_filesystem, extension_with_versions
):
"""Test update syncs frontend and backend versions from extension.json."""
extension_with_versions(
isolated_filesystem,
ext_version="2.0.0",
frontend_version="1.0.0",
backend_version="1.0.0",
)
result = cli_runner.invoke(app, ["update"])
assert result.exit_code == 0
assert "Updated frontend/package.json" in result.output
assert "Updated backend/pyproject.toml" in result.output
frontend_pkg = read_json(isolated_filesystem / "frontend" / "package.json")
assert frontend_pkg["version"] == "2.0.0"
backend_pyproject = read_toml(isolated_filesystem / "backend" / "pyproject.toml")
assert backend_pyproject["project"]["version"] == "2.0.0"
@pytest.mark.cli
def test_update_noop_when_all_match(
cli_runner, isolated_filesystem, extension_with_versions
):
"""Test update reports no changes when everything already matches."""
extension_with_versions(
isolated_filesystem,
ext_version="1.0.0",
frontend_version="1.0.0",
backend_version="1.0.0",
)
result = cli_runner.invoke(app, ["update"])
assert result.exit_code == 0
assert "All files already up to date" in result.output
@pytest.mark.cli
def test_update_fails_without_extension_json(cli_runner, isolated_filesystem):
"""Test update fails when extension.json is missing."""
result = cli_runner.invoke(app, ["update"])
assert result.exit_code != 0
assert "extension.json not found" in result.output
@pytest.mark.cli
def test_update_with_version_flag(
cli_runner, isolated_filesystem, extension_with_versions
):
"""Test --version updates extension.json first, then syncs all files."""
extension_with_versions(
isolated_filesystem,
ext_version="1.0.0",
frontend_version="1.0.0",
backend_version="1.0.0",
)
result = cli_runner.invoke(app, ["update", "--version", "3.0.0"])
assert result.exit_code == 0
assert "Updated extension.json" in result.output
assert "Updated frontend/package.json" in result.output
assert "Updated backend/pyproject.toml" in result.output
ext = read_json(isolated_filesystem / "extension.json")
assert ext["version"] == "3.0.0"
frontend_pkg = read_json(isolated_filesystem / "frontend" / "package.json")
assert frontend_pkg["version"] == "3.0.0"
backend_pyproject = read_toml(isolated_filesystem / "backend" / "pyproject.toml")
assert backend_pyproject["project"]["version"] == "3.0.0"
@pytest.mark.cli
def test_update_with_license_flag(
cli_runner, isolated_filesystem, extension_with_versions
):
"""Test --license updates license across all files."""
extension_with_versions(
isolated_filesystem,
ext_version="1.0.0",
frontend_version="1.0.0",
backend_version="1.0.0",
ext_license="Apache-2.0",
)
result = cli_runner.invoke(app, ["update", "--license", "MIT"])
assert result.exit_code == 0
assert "Updated extension.json" in result.output
assert "Updated frontend/package.json" in result.output
assert "Updated backend/pyproject.toml" in result.output
ext = read_json(isolated_filesystem / "extension.json")
assert ext["license"] == "MIT"
frontend_pkg = read_json(isolated_filesystem / "frontend" / "package.json")
assert frontend_pkg["license"] == "MIT"
backend_pyproject = read_toml(isolated_filesystem / "backend" / "pyproject.toml")
assert backend_pyproject["project"]["license"] == "MIT"
@pytest.mark.cli
def test_update_version_prompt_default(
cli_runner, isolated_filesystem, extension_with_versions
):
"""Test --version without value prompts with current version as default."""
extension_with_versions(
isolated_filesystem,
ext_version="1.0.0",
frontend_version="1.0.0",
backend_version="1.0.0",
)
# Hit enter to accept default — nothing should change
result = cli_runner.invoke(app, ["update", "--version"], input="\n")
assert result.exit_code == 0
assert "All files already up to date" in result.output
@pytest.mark.cli
def test_update_rejects_invalid_version(
cli_runner, isolated_filesystem, extension_with_versions
):
"""Test --version with an invalid semver string exits with error."""
extension_with_versions(
isolated_filesystem,
ext_version="1.0.0",
)
result = cli_runner.invoke(app, ["update", "--version", "not-a-version"])
assert result.exit_code != 0
assert "Invalid value" in result.output
# Verify extension.json was not modified
ext = read_json(isolated_filesystem / "extension.json")
assert ext["version"] == "1.0.0"


@@ -207,66 +207,3 @@ def test_validate_npm_with_empty_version_output_raises_error(mock_run, mock_whic
# semver.compare will raise ValueError for empty version
with pytest.raises(ValueError):
validate_npm()
# Version Consistency Tests
@pytest.mark.cli
def test_validate_fails_on_version_mismatch(
cli_runner, isolated_filesystem, extension_with_versions
):
"""Test validate fails when frontend/backend versions differ from extension.json."""
extension_with_versions(
isolated_filesystem,
ext_version="2.0.0",
frontend_version="1.0.0",
backend_version="1.0.0",
)
with patch("superset_extensions_cli.cli.validate_npm"):
result = cli_runner.invoke(app, ["validate"])
assert result.exit_code != 0
assert "Metadata mismatch" in result.output
assert "superset-extensions update" in result.output
@pytest.mark.cli
def test_validate_passes_with_matching_versions(
cli_runner, isolated_filesystem, extension_with_versions
):
"""Test validate passes when all versions match extension.json."""
extension_with_versions(
isolated_filesystem,
ext_version="1.0.0",
frontend_version="1.0.0",
backend_version="1.0.0",
)
with patch("superset_extensions_cli.cli.validate_npm"):
result = cli_runner.invoke(app, ["validate"])
assert result.exit_code == 0
assert "Validation successful" in result.output
@pytest.mark.cli
def test_validate_fails_on_license_mismatch(
cli_runner, isolated_filesystem, extension_with_versions
):
"""Test validate fails when frontend/backend licenses differ from extension.json."""
extension_with_versions(
isolated_filesystem,
ext_version="1.0.0",
frontend_version="1.0.0",
backend_version="1.0.0",
ext_license="Apache-2.0",
frontend_license="MIT",
backend_license="MIT",
)
with patch("superset_extensions_cli.cli.validate_npm"):
result = cli_runner.invoke(app, ["validate"])
assert result.exit_code != 0
assert "Metadata mismatch" in result.output
assert "license" in result.output


@@ -20,7 +20,7 @@ from __future__ import annotations
import json
import pytest
-from superset_extensions_cli.utils import read_json, read_toml, write_json, write_toml
+from superset_extensions_cli.utils import read_json, read_toml
# Read JSON Tests
@@ -269,32 +269,3 @@ def test_read_toml_with_permission_denied(isolated_filesystem):
toml_file.chmod(0o644)
except (OSError, PermissionError):
pass
# Write JSON Tests
@pytest.mark.unit
def test_write_json_round_trip(isolated_filesystem):
"""Test write_json then read_json round-trip preserves content."""
data = {"name": "test-extension", "version": "2.0.0", "nested": {"key": "value"}}
json_file = isolated_filesystem / "output.json"
write_json(json_file, data)
result = read_json(json_file)
assert result == data
# Write TOML Tests
@pytest.mark.unit
def test_write_toml_round_trip(isolated_filesystem):
"""Test write_toml then read_toml round-trip preserves content."""
data = {
"project": {"name": "test-package", "version": "1.0.0"},
"tool": {"apache_superset_extensions": {"build": {"include": ["src/**/*.py"]}}},
}
toml_file = isolated_filesystem / "output.toml"
write_toml(toml_file, data)
result = read_toml(toml_file)
assert result == data


@@ -69,7 +69,7 @@ module.exports = {
],
coverageReporters: ['lcov', 'json-summary', 'html', 'text'],
transformIgnorePatterns: [
'node_modules/(?!d3-(array|interpolate|color|time|scale|time-format|format)|internmap|@mapbox/tiny-sdf|remark-gfm|(?!@ngrx|(?!deck.gl)|d3-scale)|markdown-table|micromark-*.|decode-named-character-reference|character-entities|mdast-util-*.|unist-util-*.|ccount|escape-string-regexp|nanoid|uuid|@rjsf/*.|echarts|zrender|fetch-mock|pretty-ms|parse-ms|ol|@babel/runtime|@emotion|cheerio|cheerio/lib|parse5|dom-serializer|entities|htmlparser2|rehype-sanitize|hast-util-sanitize|unified|unist-.*|hast-.*|rehype-.*|remark-.*|mdast-.*|micromark-.*|parse-entities|property-information|space-separated-tokens|comma-separated-tokens|bail|devlop|zwitch|longest-streak|geostyler|geostyler-.*|(?!geostyler)lodash|react-error-boundary|react-json-tree|react-base16-styling|lodash-es|rbush|quickselect|react-diff-viewer-continued)',
'node_modules/(?!d3-(array|interpolate|color|time|scale|time-format|format)|internmap|@mapbox/tiny-sdf|remark-gfm|(?!@ngrx|(?!deck.gl)|d3-scale)|markdown-table|micromark-*.|decode-named-character-reference|character-entities|mdast-util-*.|unist-util-*.|ccount|escape-string-regexp|nanoid|uuid|@rjsf/*.|echarts|zrender|fetch-mock|pretty-ms|parse-ms|ol|@babel/runtime|@emotion|cheerio|cheerio/lib|parse5|dom-serializer|entities|htmlparser2|rehype-sanitize|hast-util-sanitize|unified|unist-.*|hast-.*|rehype-.*|remark-.*|mdast-.*|micromark-.*|parse-entities|property-information|space-separated-tokens|comma-separated-tokens|bail|devlop|zwitch|longest-streak|geostyler|geostyler-.*|react-error-boundary|react-json-tree|react-base16-styling|lodash-es|rbush|quickselect|react-diff-viewer-continued)',
],
preset: 'ts-jest',
transform: {

File diff suppressed because it is too large


@@ -117,7 +117,14 @@
"@luma.gl/gltf": "~9.2.5",
"@luma.gl/shadertools": "~9.2.5",
"@luma.gl/webgl": "~9.2.5",
"@fontsource/fira-code": "^5.2.7",
"@fontsource/inter": "^5.2.8",
"@great-expectations/jsonforms-antd-renderers": "^2.2.10",
"@jsonforms/core": "^3.7.0",
"@jsonforms/react": "^3.7.0",
"@jsonforms/vanilla-renderers": "^3.7.0",
"@reduxjs/toolkit": "^1.9.3",
"@rjsf/antd": "^5.24.13",
"@rjsf/core": "^5.24.13",
"@rjsf/utils": "^5.24.3",
"@rjsf/validator-ajv8": "^5.24.13",
@@ -170,11 +177,11 @@
"fs-extra": "^11.3.3",
"fuse.js": "^7.1.0",
"geolib": "^3.3.4",
"geostyler": "^18.3.1",
"geostyler": "^14.1.3",
"geostyler-data": "^1.1.0",
"geostyler-openlayers-parser": "^5.4.0",
"geostyler-style": "11.0.2",
"geostyler-wfs-parser": "^3.0.1",
"geostyler-openlayers-parser": "^4.3.0",
"geostyler-style": "7.5.0",
"geostyler-wfs-parser": "^2.0.3",
"google-auth-library": "^10.6.1",
"immer": "^11.1.4",
"interweave": "^13.1.1",
@@ -305,7 +312,7 @@
"babel-plugin-dynamic-import-node": "^2.3.3",
"babel-plugin-jsx-remove-data-test-id": "^3.0.0",
"babel-plugin-lodash": "^3.3.4",
"baseline-browser-mapping": "^2.10.7",
"baseline-browser-mapping": "^2.10.0",
"cheerio": "1.2.0",
"concurrently": "^9.2.1",
"copy-webpack-plugin": "^13.0.1",
@@ -343,9 +350,9 @@
"jsdom": "^28.1.0",
"lerna": "^8.2.3",
"lightningcss": "^1.32.0",
"mini-css-extract-plugin": "^2.10.1",
"mini-css-extract-plugin": "^2.10.0",
"open-cli": "^8.0.0",
"oxlint": "^1.53.0",
"oxlint": "^1.51.0",
"po2json": "^0.4.5",
"prettier": "3.8.1",
"prettier-plugin-packagejson": "^3.0.2",


@@ -369,28 +369,6 @@ export interface EditorProps {
theme?: SupersetTheme;
}
/**
* A single text change expressed as an offset-based replacement.
*/
export interface ContentChange {
/** Character offset in the document where the replaced range starts */
rangeOffset: number;
/** Length in characters of the replaced range (0 for pure insertions) */
rangeLength: number;
/** Text inserted at rangeOffset (empty string for pure deletions) */
text: string;
}
/**
* Payload delivered to `onDidChangeContent` listeners.
*/
export interface ContentChangeEvent {
/** Returns the full current content of the editor */
getValue(): string;
/** The individual changes that occurred in this event */
changes: ReadonlyArray<ContentChange>;
}
/**
* Imperative API for controlling the editor programmatically.
*
@@ -514,27 +492,6 @@ export interface EditorHandle {
* - CodeMirror: editor.requestMeasure()
*/
resize(): void;
/**
* Subscribe to content changes in the editor.
*
* The listener receives a {@link ContentChangeEvent} with:
* - `getValue()` — lazy accessor for the full content (call only when needed
* to avoid unnecessary O(n) string allocation on every keystroke)
* - `changes` — the individual edits that occurred, as offset-based replacements
*
* @param listener Called with a ContentChangeEvent on every change
* @param thisArgs Optional `this` context for the listener
* @returns A Disposable that unsubscribes the listener when disposed
*
* @example
* const disposable = editor.onDidChangeContent(e => {
* setStatements(parseStatements(e.getValue()));
* });
* // Later, to unsubscribe:
* disposable.dispose();
*/
onDidChangeContent: Event<ContentChangeEvent>;
}
/**


@@ -252,22 +252,6 @@ export interface QueryResult {
*/
export declare const getActivePanel: () => Panel;
/**
* Switches the active panel in the SQL Lab south pane.
* Built-in panel IDs are 'Results' and 'History'.
* Pinned table panels use the table's ID as their panel ID.
*
* @param panelId The ID of the panel to activate
* @returns Promise that resolves when the panel is activated
*
* @example
* ```typescript
* // Focus the Results panel after running a query
* await setActivePanel('Results');
* ```
*/
export declare function setActivePanel(panelId: string): Promise<void>;
/**
* Gets the currently active tab in SQL Lab.
*


@@ -426,7 +426,6 @@ export interface ThemeControllerOptions {
canUpdateTheme?: () => boolean;
canUpdateMode?: () => boolean;
isGlobalContext?: boolean;
initialMode?: ThemeMode;
}
export interface ThemeContextType {


@@ -28,9 +28,7 @@ export const getTimeOffset = (
// offset is represented as <offset>, group by list
series.name.includes(`${timeOffset},`) ||
// offset is represented as <metric>__<offset>
series.name.includes(`__${timeOffset}`) ||
// offset is represented as <metric>, <offset>
series.name.includes(`, ${timeOffset}`),
series.name.includes(`__${timeOffset}`),
);
export const hasTimeOffset = (
@@ -47,14 +45,10 @@ export const getOriginalSeries = (
): string => {
let result = seriesName;
timeCompare.forEach(compare => {
// offset in the middle: <metric>, <offset>, <dimension>
result = result.replace(`, ${compare},`, ',');
// offset at start: <offset>, <dimension>
// offset is represented as <offset>, group by list
result = result.replace(`${compare},`, '');
// offset with double underscore: <metric>__<offset>
// offset is represented as <metric>__<offset>
result = result.replace(`__${compare}`, '');
// offset at end: <metric>, <offset>
result = result.replace(`, ${compare}`, '');
});
return result.trim();
};
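The hunk above simplifies `getOriginalSeries` to strip only two offset patterns: `<offset>, group-by-list` and `<metric>__<offset>`. A minimal standalone sketch of the simplified version (illustrative only, not the actual `@superset-ui/chart-controls` source):

```typescript
// Sketch of the simplified offset-stripping shown in the diff above.
// Assumes only the two remaining patterns: "<offset>, ..." and "<metric>__<offset>".
function getOriginalSeries(seriesName: string, timeCompare: string[]): string {
  let result = seriesName;
  timeCompare.forEach(compare => {
    // offset is represented as <offset>, group by list
    result = result.replace(`${compare},`, '');
    // offset is represented as <metric>__<offset>
    result = result.replace(`__${compare}`, '');
  });
  return result.trim();
}
```

Note that `String.prototype.replace` with a string pattern replaces only the first occurrence, which is sufficient here since a series name carries at most one offset per comparison.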


@@ -25,17 +25,30 @@ export const matrixifyEnableSection: ControlPanelSectionConfig = {
controlSetRows: [
[
{
name: 'matrixify_enable',
name: 'matrixify_enable_horizontal_layout',
config: {
type: 'SwitchControl',
label: t('Enable matrixify'),
type: 'CheckboxControl',
label: t('Enable horizontal layout (columns)'),
description: t(
'Create matrix columns by placing charts side-by-side',
),
default: false,
renderTrigger: true,
},
},
],
[
{
name: 'matrixify_enable_vertical_layout',
config: {
type: 'CheckboxControl',
label: t('Enable vertical layout (rows)'),
description: t('Create matrix rows by stacking charts vertically'),
default: false,
renderTrigger: true,
},
},
],
['matrixify_mode_columns'],
['matrixify_mode_rows'],
],
tabOverride: 'matrixify',
};
@@ -44,11 +57,8 @@ export const matrixifySection: ControlPanelSectionConfig = {
label: t('Cell layout & styling'),
expanded: false,
visibility: ({ controls }) =>
controls?.matrixify_enable?.value === true &&
(controls?.matrixify_mode_rows?.value === 'metrics' ||
controls?.matrixify_mode_rows?.value === 'dimensions' ||
controls?.matrixify_mode_columns?.value === 'metrics' ||
controls?.matrixify_mode_columns?.value === 'dimensions'),
controls?.matrixify_enable_vertical_layout?.value === true ||
controls?.matrixify_enable_horizontal_layout?.value === true,
controlSetRows: [
[
{
@@ -109,13 +119,13 @@ export const matrixifySection: ControlPanelSectionConfig = {
};
export const matrixifyRowSection: ControlPanelSectionConfig = {
label: t('Vertical layout (rows)'),
expanded: false,
visibility: ({ controls }) =>
controls?.matrixify_enable?.value === true &&
(controls?.matrixify_mode_rows?.value === 'metrics' ||
controls?.matrixify_mode_rows?.value === 'dimensions'),
controls?.matrixify_enable_vertical_layout?.value === true,
controlSetRows: [
['matrixify_show_row_labels'],
['matrixify_mode_rows'],
['matrixify_rows'],
['matrixify_dimension_rows'],
['matrixify_dimension_selection_mode_rows'],
@@ -127,13 +137,13 @@ export const matrixifyRowSection: ControlPanelSectionConfig = {
};
export const matrixifyColumnSection: ControlPanelSectionConfig = {
label: t('Horizontal layout (columns)'),
expanded: false,
visibility: ({ controls }) =>
controls?.matrixify_enable?.value === true &&
(controls?.matrixify_mode_columns?.value === 'metrics' ||
controls?.matrixify_mode_columns?.value === 'dimensions'),
controls?.matrixify_enable_horizontal_layout?.value === true,
controlSetRows: [
['matrixify_show_column_headers'],
['matrixify_mode_columns'],
['matrixify_columns'],
['matrixify_dimension_columns'],
['matrixify_dimension_selection_mode_columns'],


@@ -34,18 +34,19 @@ const isMatrixifyVisible = (
controls: any,
axis: 'rows' | 'columns',
mode?: 'metrics' | 'dimensions',
selectionMode?: 'members' | 'topn' | 'all',
selectionMode?: 'members' | 'topn',
) => {
const layoutControl = `matrixify_enable_${axis === 'rows' ? 'vertical' : 'horizontal'}_layout`;
const modeControl = `matrixify_mode_${axis}`;
const selectionModeControl = `matrixify_dimension_selection_mode_${axis}`;
const modeValue = controls?.[modeControl]?.value;
const isLayoutEnabled = modeValue === 'metrics' || modeValue === 'dimensions';
const isLayoutEnabled = controls?.[layoutControl]?.value === true;
if (!isLayoutEnabled) return false;
if (mode) {
if (modeValue !== mode) return false;
const isModeMatch = controls?.[modeControl]?.value === mode;
if (!isModeMatch) return false;
if (selectionMode && mode === 'dimensions') {
return controls?.[selectionModeControl]?.value === selectionMode;
@@ -65,20 +66,22 @@ const matrixifyControls: Record<string, SharedControlConfig<any>> = {};
matrixifyControls[`matrixify_mode_${axis}`] = {
type: 'RadioButtonControl',
default: 'disabled',
label: t(`Metrics / Dimensions`),
default: axis === 'columns' ? 'metrics' : 'dimensions',
renderTrigger: true,
tabOverride: 'matrixify',
visibility: ({ controls }) => isMatrixifyVisible(controls, axis),
mapStateToProps: ({ controls }) => {
const otherAxisControlName = `matrixify_mode_${otherAxis}`;
const otherAxisValue =
controls?.[otherAxisControlName]?.value ?? 'disabled';
controls?.[otherAxisControlName]?.value ??
(otherAxis === 'columns' ? 'metrics' : 'dimensions');
const isMetricsDisabled = otherAxisValue === 'metrics';
return {
options: [
{ value: 'disabled', label: t('Disabled') },
{
value: 'metrics',
label: t('Metrics'),
@@ -89,7 +92,7 @@ const matrixifyControls: Record<string, SharedControlConfig<any>> = {};
)
: undefined,
},
{ value: 'dimensions', label: t('Dimensions') },
{ value: 'dimensions', label: t('Dimension members') },
],
};
},
@@ -122,7 +125,6 @@ const matrixifyControls: Record<string, SharedControlConfig<any>> = {};
`matrixify_topn_metric_${axis}`,
`matrixify_topn_order_${axis}`,
`matrixify_dimension_selection_mode_${axis}`,
`matrixify_all_sort_by_${axis}`,
];
return fieldsToCheck.some(
@@ -159,10 +161,7 @@ const matrixifyControls: Record<string, SharedControlConfig<any>> = {};
selectionMode,
topNMetric: getValue(`matrixify_topn_metric_${axis}`),
topNValue: getValue(`matrixify_topn_value_${axis}`),
topNOrder: getValue(`matrixify_topn_order_${axis}`, true)
? 'DESC'
: 'ASC',
allSortBy: getValue(`matrixify_all_sort_by_${axis}`, 'a_to_z'),
topNOrder: getValue(`matrixify_topn_order_${axis}`),
formData: form_data,
validators,
};
@@ -188,24 +187,19 @@ const matrixifyControls: Record<string, SharedControlConfig<any>> = {};
visibility: () => false,
};
// Add selection mode control (Dimension Members / Top N / All)
// Add selection mode control (Dimension Members vs TopN)
matrixifyControls[`matrixify_dimension_selection_mode_${axis}`] = {
type: 'VerticalRadioControl',
type: 'RadioButtonControl',
label: t(`Selection method`),
default: 'members',
options: [
['members', t('Dimension members')],
['topn', t('Top n')],
],
renderTrigger: true,
tabOverride: 'matrixify',
visibility: ({ controls }) =>
isMatrixifyVisible(controls, axis, 'dimensions'),
options: [
{ value: 'members', label: t('Dimension members') },
{ value: 'topn', label: t('Top n') },
{
value: 'all',
label: t('All dimensions'),
tooltip: t('Uses the first 25 values if the dimension has more.'),
},
],
};
// TopN controls
@@ -242,15 +236,15 @@ const matrixifyControls: Record<string, SharedControlConfig<any>> = {};
description: t(`Metric to use for ordering Top N values`),
tabOverride: 'matrixify',
visibility: ({ controls }) =>
isMatrixifyVisible(controls, axis, 'dimensions', 'topn') ||
(isMatrixifyVisible(controls, axis, 'dimensions', 'all') &&
controls?.[`matrixify_all_sort_by_${axis}`]?.value === 'metric'),
isMatrixifyVisible(controls, axis, 'dimensions', 'topn'),
mapStateToProps: (state, controlState) => {
const { controls, datasource } = state;
const isVisible =
isMatrixifyVisible(controls, axis, 'dimensions', 'topn') ||
(isMatrixifyVisible(controls, axis, 'dimensions', 'all') &&
controls?.[`matrixify_all_sort_by_${axis}`]?.value === 'metric');
const isVisible = isMatrixifyVisible(
controls,
axis,
'dimensions',
'topn',
);
const originalProps =
dndAdhocMetricControl.mapStateToProps?.(state, controlState) || {};
@@ -267,31 +261,17 @@ const matrixifyControls: Record<string, SharedControlConfig<any>> = {};
};
matrixifyControls[`matrixify_topn_order_${axis}`] = {
type: 'CheckboxControl',
label: t('Sort descending'),
default: true,
renderTrigger: true,
tabOverride: 'matrixify',
visibility: ({ controls }) =>
isMatrixifyVisible(controls, axis, 'dimensions', 'topn') ||
(isMatrixifyVisible(controls, axis, 'dimensions', 'all') &&
controls?.[`matrixify_all_sort_by_${axis}`]?.value === 'metric'),
};
matrixifyControls[`matrixify_all_sort_by_${axis}`] = {
type: 'SelectControl',
label: t('Sort by'),
default: 'a_to_z',
clearable: false,
renderTrigger: true,
tabOverride: 'matrixify',
visibility: ({ controls }) =>
isMatrixifyVisible(controls, axis, 'dimensions', 'all'),
choices: [
['a_to_z', t('A-Z')],
['z_to_a', t('Z-A')],
['metric', t('Metric')],
type: 'RadioButtonControl',
label: t(`Sort order`),
default: 'desc',
options: [
['asc', t('Ascending')],
['desc', t('Descending')],
],
renderTrigger: true,
tabOverride: 'matrixify',
visibility: ({ controls }) =>
isMatrixifyVisible(controls, axis, 'dimensions', 'topn'),
};
});
@@ -337,6 +317,24 @@ matrixifyControls.matrixify_charts_per_row = {
!controls?.matrixify_fit_columns_dynamically?.value,
};
matrixifyControls.matrixify_enable_vertical_layout = {
type: 'CheckboxControl',
label: t('Enable vertical layout (rows)'),
description: t('Create matrix rows by stacking charts vertically'),
default: false,
renderTrigger: true,
tabOverride: 'matrixify',
};
matrixifyControls.matrixify_enable_horizontal_layout = {
type: 'CheckboxControl',
label: t('Enable horizontal layout (columns)'),
description: t('Create matrix columns by placing charts side-by-side'),
default: false,
renderTrigger: true,
tabOverride: 'matrixify',
};
// Cell title control for Matrixify
matrixifyControls.matrixify_cell_title_template = {
type: 'TextControl',
@@ -347,8 +345,8 @@ matrixifyControls.matrixify_cell_title_template = {
default: '',
renderTrigger: true,
visibility: ({ controls }) =>
isMatrixifyVisible(controls, 'rows') ||
isMatrixifyVisible(controls, 'columns'),
controls?.matrixify_enable_vertical_layout?.value === true ||
controls?.matrixify_enable_horizontal_layout?.value === true,
};
// Matrix display controls
@@ -359,7 +357,8 @@ matrixifyControls.matrixify_show_row_labels = {
default: true,
renderTrigger: true,
tabOverride: 'matrixify',
visibility: ({ controls }) => isMatrixifyVisible(controls, 'rows'),
visibility: ({ controls }) =>
controls?.matrixify_enable_vertical_layout?.value === true,
};
matrixifyControls.matrixify_show_column_headers = {
@@ -369,7 +368,8 @@ matrixifyControls.matrixify_show_column_headers = {
default: true,
renderTrigger: true,
tabOverride: 'matrixify',
visibility: ({ controls }) => isMatrixifyVisible(controls, 'columns'),
visibility: ({ controls }) =>
controls?.matrixify_enable_horizontal_layout?.value === true,
};
export { matrixifyControls };
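The visibility gating that this diff converges on can be summarized in one helper: an axis's controls show only when its layout checkbox is enabled, optionally narrowed by mode and selection mode. A self-contained sketch (control names taken from the diff; the `Controls` shape is an assumption, not the real Superset control-state type):

```typescript
// Sketch of the isMatrixifyVisible gating logic from the hunks above.
// Control shape is a simplifying assumption for illustration.
type Controls = Record<string, { value?: unknown } | undefined>;

function isMatrixifyVisible(
  controls: Controls,
  axis: 'rows' | 'columns',
  mode?: 'metrics' | 'dimensions',
  selectionMode?: 'members' | 'topn',
): boolean {
  const layoutControl = `matrixify_enable_${
    axis === 'rows' ? 'vertical' : 'horizontal'
  }_layout`;
  // The axis layout checkbox is the master switch.
  if (controls?.[layoutControl]?.value !== true) return false;
  if (mode) {
    // Narrow further by the axis mode (metrics vs dimensions).
    if (controls?.[`matrixify_mode_${axis}`]?.value !== mode) return false;
    // Selection mode only applies to dimension-based axes.
    if (selectionMode && mode === 'dimensions') {
      return (
        controls?.[`matrixify_dimension_selection_mode_${axis}`]?.value ===
        selectionMode
      );
    }
  }
  return true;
}
```

This mirrors why the diff can drop the `matrixify_enable`/`'disabled'`-mode checks elsewhere: once the per-axis checkbox gates everything, the mode control no longer doubles as an on/off switch.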


@@ -16,101 +16,15 @@
* specific language governing permissions and limitations
* under the License.
*/
import {
getOriginalSeries,
getTimeOffset,
hasTimeOffset,
} from '@superset-ui/chart-controls';
import { getOriginalSeries } from '@superset-ui/chart-controls';
test('getOriginalSeries returns the series name when time compare is empty', () => {
test('returns the series name when time compare is empty', () => {
const seriesName = 'sum';
expect(getOriginalSeries(seriesName, [])).toEqual(seriesName);
});
test('getOriginalSeries returns the original series name with __ pattern', () => {
test('returns the original series name', () => {
const seriesName = 'sum__1_month_ago';
const timeCompare = ['1_month_ago'];
expect(getOriginalSeries(seriesName, timeCompare)).toEqual('sum');
});
test('getOriginalSeries returns the original series name with <offset>, pattern', () => {
const seriesName = '1 year ago, groupby_value';
const timeCompare = ['1 year ago'];
expect(getOriginalSeries(seriesName, timeCompare)).toEqual('groupby_value');
});
test('getOriginalSeries returns the original series name with , <offset> pattern', () => {
const seriesName = 'AVG(price_each), 1 year ago';
const timeCompare = ['1 year ago'];
expect(getOriginalSeries(seriesName, timeCompare)).toEqual('AVG(price_each)');
});
test('getOriginalSeries handles multiple time compares', () => {
const seriesName = 'count, 1 year ago';
const timeCompare = ['1 month ago', '1 year ago'];
expect(getOriginalSeries(seriesName, timeCompare)).toEqual('count');
});
test('getOriginalSeries strips offset in the middle with dimension', () => {
const seriesName = 'SUM(sales), 28 days ago, Medium';
const timeCompare = ['28 days ago'];
expect(getOriginalSeries(seriesName, timeCompare)).toEqual(
'SUM(sales), Medium',
);
});
test('getOriginalSeries strips offset in the middle with multiple dimensions', () => {
const seriesName = 'SUM(sales), 1 year ago, Medium, 11';
const timeCompare = ['1 year ago'];
expect(getOriginalSeries(seriesName, timeCompare)).toEqual(
'SUM(sales), Medium, 11',
);
});
test('getTimeOffset returns undefined when no time offset pattern matches', () => {
const series = { name: 'count' };
const timeCompare = ['1 year ago'];
expect(getTimeOffset(series, timeCompare)).toBeUndefined();
});
test('getTimeOffset detects __ pattern', () => {
const series = { name: 'count__1 year ago' };
const timeCompare = ['1 year ago'];
expect(getTimeOffset(series, timeCompare)).toEqual('1 year ago');
});
test('getTimeOffset detects <offset>, pattern', () => {
const series = { name: '1 year ago, groupby_value' };
const timeCompare = ['1 year ago'];
expect(getTimeOffset(series, timeCompare)).toEqual('1 year ago');
});
test('getTimeOffset detects , <offset> pattern', () => {
const series = { name: 'AVG(price_each), 1 year ago' };
const timeCompare = ['1 year ago'];
expect(getTimeOffset(series, timeCompare)).toEqual('1 year ago');
});
test('getTimeOffset detects , <offset>, pattern (offset in middle)', () => {
const series = { name: 'SUM(sales), 28 days ago, Medium' };
const timeCompare = ['28 days ago'];
expect(getTimeOffset(series, timeCompare)).toEqual('28 days ago');
});
test('hasTimeOffset returns false for original series', () => {
const series = { name: 'count' };
const timeCompare = ['1 year ago'];
expect(hasTimeOffset(series, timeCompare)).toBe(false);
});
test('hasTimeOffset returns true for derived series with , <offset> pattern', () => {
const series = { name: 'AVG(price_each), 1 year ago' };
const timeCompare = ['1 year ago'];
expect(hasTimeOffset(series, timeCompare)).toBe(true);
});
test('hasTimeOffset returns false when series name is not a string', () => {
const series = { name: 123 };
const timeCompare = ['1 year ago'];
expect(hasTimeOffset(series, timeCompare)).toBe(false);
});


@@ -41,7 +41,7 @@
"d3-scale": "^4.0.2",
"d3-time": "^3.1.0",
"d3-time-format": "^4.1.0",
"dompurify": "^3.3.3",
"dompurify": "^3.3.1",
"fetch-retry": "^6.0.0",
"handlebars": "^4.7.8",
"jed": "^1.1.1",


@@ -438,23 +438,3 @@ test('should handle metrics without labels', () => {
expect(grid!.rowHeaders).toEqual(['']);
expect(grid!.colHeaders).toEqual(['count']);
});
test('should preserve slice_id and dashboardId for embedded dashboard permissions', () => {
const formDataWithDashboardContext: TestFormData = {
...baseFormData,
slice_id: 42,
dashboardId: 123,
};
const grid = generateMatrixifyGrid(formDataWithDashboardContext);
expect(grid).not.toBeNull();
const cell = grid!.cells[0][0];
// slice_id must be preserved for embedded dashboard permission checks
// The backend uses slice_id to verify the chart belongs to the dashboard
expect(cell!.formData.slice_id).toBe(42);
// dashboardId must be preserved for embedded dashboard context
expect(cell!.formData.dashboardId).toBe(123);
});


@@ -125,9 +125,9 @@ function generateCellFormData(
});
// Override fields that could cause issues in grid cells
// Note: slice_id is intentionally preserved for embedded dashboard permission checks
const overrides: Partial<QueryFormData> = {
slice_name: undefined,
slice_id: undefined,
header_font_size: undefined,
subheader: undefined,
show_title: undefined,


@@ -22,7 +22,6 @@ import '@testing-library/jest-dom';
import { ThemeProvider } from '@apache-superset/core/theme';
import { supersetTheme } from '@apache-superset/core/theme';
import MatrixifyGridRenderer from './MatrixifyGridRenderer';
import type { MatrixifyMode } from '../../types/matrixify';
import { generateMatrixifyGrid } from './MatrixifyGridGenerator';
// Mock the MatrixifyGridGenerator
@@ -75,9 +74,8 @@ test('should create single group when fitting columns dynamically', () => {
const formData = {
viz_type: 'test_chart',
matrixify_enable: true,
matrixify_mode_rows: 'metrics' as MatrixifyMode,
matrixify_mode_columns: 'metrics' as MatrixifyMode,
matrixify_enable_vertical_layout: true,
matrixify_enable_horizontal_layout: true,
matrixify_fit_columns_dynamically: true,
matrixify_charts_per_row: 3,
matrixify_show_row_labels: true,
@@ -126,9 +124,8 @@ test('should create multiple groups when not fitting columns dynamically', () =>
const formData = {
viz_type: 'test_chart',
matrixify_enable: true,
matrixify_mode_rows: 'metrics' as MatrixifyMode,
matrixify_mode_columns: 'metrics' as MatrixifyMode,
matrixify_enable_vertical_layout: true,
matrixify_enable_horizontal_layout: true,
matrixify_fit_columns_dynamically: false,
matrixify_charts_per_row: 3,
matrixify_show_row_labels: true,
@@ -163,9 +160,8 @@ test('should handle exact division of columns', () => {
const formData = {
viz_type: 'test_chart',
matrixify_enable: true,
matrixify_mode_rows: 'metrics' as MatrixifyMode,
matrixify_mode_columns: 'metrics' as MatrixifyMode,
matrixify_enable_vertical_layout: true,
matrixify_enable_horizontal_layout: true,
matrixify_fit_columns_dynamically: false,
matrixify_charts_per_row: 2,
matrixify_show_row_labels: true,
@@ -193,9 +189,8 @@ test('should handle case where charts_per_row exceeds total columns', () => {
const formData = {
viz_type: 'test_chart',
matrixify_enable: true,
matrixify_mode_rows: 'metrics' as MatrixifyMode,
matrixify_mode_columns: 'metrics' as MatrixifyMode,
matrixify_enable_vertical_layout: true,
matrixify_enable_horizontal_layout: true,
matrixify_fit_columns_dynamically: false,
matrixify_charts_per_row: 5,
matrixify_show_row_labels: true,
@@ -225,9 +220,8 @@ test('should show headers for each group when wrapping occurs', () => {
const formData = {
viz_type: 'test_chart',
matrixify_enable: true,
matrixify_mode_rows: 'metrics' as MatrixifyMode,
matrixify_mode_columns: 'metrics' as MatrixifyMode,
matrixify_enable_vertical_layout: true,
matrixify_enable_horizontal_layout: true,
matrixify_fit_columns_dynamically: false,
matrixify_charts_per_row: 2,
matrixify_show_row_labels: true,
@@ -261,9 +255,8 @@ test('should show headers only on first row when not wrapping', () => {
const formData = {
viz_type: 'test_chart',
matrixify_enable: true,
matrixify_mode_rows: 'metrics' as MatrixifyMode,
matrixify_mode_columns: 'metrics' as MatrixifyMode,
matrixify_enable_vertical_layout: true,
matrixify_enable_horizontal_layout: true,
matrixify_fit_columns_dynamically: true, // No wrapping
matrixify_show_row_labels: true,
matrixify_show_column_headers: true,
@@ -292,9 +285,8 @@ test('should hide headers when disabled', () => {
const formData = {
viz_type: 'test_chart',
matrixify_enable: true,
matrixify_mode_rows: 'metrics' as MatrixifyMode,
matrixify_mode_columns: 'metrics' as MatrixifyMode,
matrixify_enable_vertical_layout: true,
matrixify_enable_horizontal_layout: true,
matrixify_show_row_labels: false,
matrixify_show_column_headers: false,
};
@@ -321,9 +313,8 @@ test('should place cells correctly in wrapped layout', () => {
const formData = {
viz_type: 'test_chart',
matrixify_enable: true,
matrixify_mode_rows: 'metrics' as MatrixifyMode,
matrixify_mode_columns: 'metrics' as MatrixifyMode,
matrixify_enable_vertical_layout: true,
matrixify_enable_horizontal_layout: true,
matrixify_fit_columns_dynamically: false,
matrixify_charts_per_row: 2,
matrixify_show_row_labels: true,
@@ -353,9 +344,8 @@ test('should handle null grid gracefully', () => {
const formData = {
viz_type: 'test_chart',
matrixify_enable: true,
matrixify_mode_rows: 'metrics' as MatrixifyMode,
matrixify_mode_columns: 'metrics' as MatrixifyMode,
matrixify_enable_vertical_layout: true,
matrixify_enable_horizontal_layout: true,
};
const { container } = renderWithTheme(
@@ -376,9 +366,8 @@ test('should handle empty grid gracefully', () => {
const formData = {
viz_type: 'test_chart',
matrixify_enable: true,
matrixify_mode_rows: 'metrics' as MatrixifyMode,
matrixify_mode_columns: 'metrics' as MatrixifyMode,
matrixify_enable_vertical_layout: true,
matrixify_enable_horizontal_layout: true,
};
const { container } = renderWithTheme(
@@ -402,9 +391,8 @@ test('should use default values for missing configuration', () => {
const formData = {
viz_type: 'test_chart',
matrixify_enable: true,
matrixify_mode_rows: 'metrics' as MatrixifyMode,
matrixify_mode_columns: 'metrics' as MatrixifyMode,
matrixify_enable_vertical_layout: true,
matrixify_enable_horizontal_layout: true,
// Missing optional configurations
};


@@ -130,12 +130,10 @@ function MatrixifyGridRenderer({
// Determine layout parameters - only show headers/labels if layout is enabled
const showRowLabels =
formData.matrixify_mode_rows !== undefined &&
formData.matrixify_mode_rows !== 'disabled' &&
formData.matrixify_enable_vertical_layout === true &&
(formData.matrixify_show_row_labels ?? true);
const showColumnHeaders =
formData.matrixify_mode_columns !== undefined &&
formData.matrixify_mode_columns !== 'disabled' &&
formData.matrixify_enable_horizontal_layout === true &&
(formData.matrixify_show_column_headers ?? true);
const rowHeight = formData.matrixify_row_height || DEFAULT_ROW_HEIGHT;
const fitColumnsDynamically =


@@ -37,11 +37,12 @@ test('isMatrixifyEnabled should return false when no matrixify configuration exi
expect(isMatrixifyEnabled(formData)).toBe(false);
});
test('isMatrixifyEnabled should return false when layout controls are disabled', () => {
test('isMatrixifyEnabled should return false when layout controls are false', () => {
const formData = {
viz_type: 'table',
matrixify_mode_rows: 'disabled',
matrixify_mode_columns: 'disabled',
matrixify_enable_vertical_layout: false,
matrixify_enable_horizontal_layout: false,
matrixify_mode_rows: 'metrics',
matrixify_rows: [createMetric('Revenue')],
} as MatrixifyFormData;
@@ -51,7 +52,7 @@ test('isMatrixifyEnabled should return false when layout controls are disabled',
test('isMatrixifyEnabled should return true for valid metrics mode configuration', () => {
const formData = {
viz_type: 'table',
matrixify_enable: true,
matrixify_enable_vertical_layout: true,
matrixify_mode_rows: 'metrics',
matrixify_mode_columns: 'metrics',
matrixify_rows: [createMetric('Revenue')],
@@ -64,7 +65,7 @@ test('isMatrixifyEnabled should return true for valid metrics mode configuration
test('isMatrixifyEnabled should return true for valid dimensions mode configuration', () => {
const formData = {
viz_type: 'table',
matrixify_enable: true,
matrixify_enable_vertical_layout: true,
matrixify_mode_rows: 'dimensions',
matrixify_mode_columns: 'dimensions',
matrixify_dimension_rows: { dimension: 'country', values: ['USA'] },
@@ -77,7 +78,7 @@ test('isMatrixifyEnabled should return true for valid dimensions mode configurat
test('isMatrixifyEnabled should return true for mixed mode configuration', () => {
const formData = {
viz_type: 'table',
matrixify_enable: true,
matrixify_enable_vertical_layout: true,
matrixify_mode_rows: 'metrics',
matrixify_mode_columns: 'dimensions',
matrixify_rows: [createMetric('Revenue')],
@@ -90,7 +91,7 @@ test('isMatrixifyEnabled should return true for mixed mode configuration', () =>
test('isMatrixifyEnabled should return true for topn dimension selection mode', () => {
const formData = {
viz_type: 'table',
matrixify_enable: true,
matrixify_enable_vertical_layout: true,
matrixify_mode_rows: 'dimensions',
matrixify_mode_columns: 'dimensions',
matrixify_dimension_rows: {
@@ -109,7 +110,7 @@ test('isMatrixifyEnabled should return true for topn dimension selection mode',
test('isMatrixifyEnabled should return false when both axes have empty metrics arrays', () => {
const formData = {
viz_type: 'table',
matrixify_enable: true,
matrixify_enable_vertical_layout: true,
matrixify_mode_rows: 'metrics',
matrixify_mode_columns: 'metrics',
matrixify_rows: [],
@@ -122,7 +123,7 @@ test('isMatrixifyEnabled should return false when both axes have empty metrics a
test('isMatrixifyEnabled should return false when both dimensions have empty values and no topn mode', () => {
const formData = {
viz_type: 'table',
matrixify_enable: true,
matrixify_enable_vertical_layout: true,
matrixify_mode_rows: 'dimensions',
matrixify_mode_columns: 'dimensions',
matrixify_dimension_rows: { dimension: 'country', values: [] },
@@ -140,6 +141,7 @@ test('getMatrixifyConfig should return null when no matrixify configuration exis
test('getMatrixifyConfig should return valid config for metrics mode', () => {
const formData = {
viz_type: 'table',
matrixify_enable_vertical_layout: true,
matrixify_mode_rows: 'metrics',
matrixify_mode_columns: 'metrics',
matrixify_rows: [createMetric('Revenue')],
@@ -157,6 +159,7 @@ test('getMatrixifyConfig should return valid config for metrics mode', () => {
test('getMatrixifyConfig should return valid config for dimensions mode', () => {
const formData = {
viz_type: 'table',
matrixify_enable_vertical_layout: true,
matrixify_mode_rows: 'dimensions',
matrixify_mode_columns: 'dimensions',
matrixify_dimension_rows: { dimension: 'country', values: ['USA'] },
@@ -180,6 +183,7 @@ test('getMatrixifyConfig should return valid config for dimensions mode', () =>
test('getMatrixifyConfig should handle topn selection mode', () => {
const formData = {
viz_type: 'table',
matrixify_enable_vertical_layout: true,
matrixify_mode_rows: 'dimensions',
matrixify_mode_columns: 'dimensions',
matrixify_dimension_rows: {
@@ -200,8 +204,8 @@ test('getMatrixifyConfig should handle topn selection mode', () => {
test('getMatrixifyValidationErrors should return empty array when matrixify is not enabled', () => {
const formData = {
viz_type: 'table',
matrixify_mode_rows: 'disabled',
matrixify_mode_columns: 'disabled',
matrixify_enable_vertical_layout: false,
matrixify_enable_horizontal_layout: false,
} as MatrixifyFormData;
expect(getMatrixifyValidationErrors(formData)).toEqual([]);
@@ -210,6 +214,7 @@ test('getMatrixifyValidationErrors should return empty array when matrixify is n
test('getMatrixifyValidationErrors should return empty array when properly configured', () => {
const formData = {
viz_type: 'table',
matrixify_enable_vertical_layout: true,
matrixify_mode_rows: 'metrics',
matrixify_mode_columns: 'metrics',
matrixify_rows: [createMetric('Revenue')],
@@ -222,16 +227,17 @@ test('getMatrixifyValidationErrors should return empty array when properly confi
test('getMatrixifyValidationErrors should return error when enabled but no configuration exists', () => {
const formData = {
viz_type: 'table',
matrixify_mode_rows: 'metrics',
matrixify_enable_vertical_layout: true,
} as MatrixifyFormData;
const errors = getMatrixifyValidationErrors(formData);
expect(errors.length).toBeGreaterThan(0);
expect(errors).toContain('Please configure at least one row or column axis');
});
test('getMatrixifyValidationErrors should return error when metrics mode has no metrics', () => {
const formData = {
viz_type: 'table',
matrixify_enable_vertical_layout: true,
matrixify_mode_rows: 'metrics',
matrixify_rows: [],
matrixify_columns: [],
@@ -254,28 +260,19 @@ test('should handle empty form data object', () => {
expect(isMatrixifyEnabled(formData)).toBe(false);
});
test('isMatrixifyEnabled should return false when no axis modes configured', () => {
test('isMatrixifyEnabled should return false when layout enabled but no axis modes configured', () => {
const formData = {
viz_type: 'table',
matrixify_enable: true,
matrixify_enable_vertical_layout: true,
// No matrixify_mode_rows or matrixify_mode_columns set
} as MatrixifyFormData;
expect(isMatrixifyEnabled(formData)).toBe(false);
});
test('isMatrixifyEnabled should return false when switch is off even with valid axis config', () => {
const formData = {
viz_type: 'table',
matrixify_enable: false,
matrixify_mode_rows: 'metrics',
matrixify_rows: [createMetric('Revenue')],
} as MatrixifyFormData;
expect(isMatrixifyEnabled(formData)).toBe(false);
});
test('getMatrixifyValidationErrors should return dimension error for rows when dimension has no data', () => {
const formData = {
viz_type: 'table',
matrixify_enable_vertical_layout: true,
matrixify_mode_rows: 'dimensions',
// No matrixify_dimension_rows set
matrixify_mode_columns: 'metrics',
@@ -289,6 +286,7 @@ test('getMatrixifyValidationErrors should return dimension error for rows when d
test('getMatrixifyValidationErrors should return metric error for columns when metrics array is empty', () => {
const formData = {
viz_type: 'table',
matrixify_enable_vertical_layout: true,
matrixify_mode_rows: 'metrics',
matrixify_rows: [createMetric('Revenue')],
matrixify_mode_columns: 'metrics',
@@ -302,6 +300,7 @@ test('getMatrixifyValidationErrors should return metric error for columns when m
test('getMatrixifyValidationErrors should return dimension error for columns when no dimension data', () => {
const formData = {
viz_type: 'table',
matrixify_enable_vertical_layout: true,
matrixify_mode_rows: 'metrics',
matrixify_rows: [createMetric('Revenue')],
matrixify_mode_columns: 'dimensions',
@@ -312,9 +311,10 @@ test('getMatrixifyValidationErrors should return dimension error for columns whe
expect(errors).toContain('Please select a dimension and values for columns');
});
test('getMatrixifyValidationErrors skips row check when matrixify_mode_rows is not set', () => {
test('getMatrixifyValidationErrors skips row check when matrixify_mode_rows is not set (line 240 false, line 279 || false)', () => {
const formData = {
viz_type: 'table',
matrixify_enable_vertical_layout: true,
// No matrixify_mode_rows — hasRowMode = false
matrixify_mode_columns: 'metrics',
matrixify_columns: [createMetric('Q1')],
@@ -324,9 +324,10 @@ test('getMatrixifyValidationErrors skips row check when matrixify_mode_rows is n
expect(errors).toEqual([]);
});
test('getMatrixifyValidationErrors evaluates full && expression when dimension is set but values are empty', () => {
test('getMatrixifyValidationErrors evaluates full && expression when dimension is set but values are empty (lines 244, 264, 283, 291 true branches)', () => {
const formData = {
viz_type: 'table',
matrixify_enable_vertical_layout: true,
matrixify_mode_rows: 'dimensions',
matrixify_dimension_rows: { dimension: 'country', values: [] },
matrixify_mode_columns: 'dimensions',
@@ -344,7 +345,7 @@ test('getMatrixifyValidationErrors evaluates full && expression when dimension i
test('should handle partial configuration with one axis only', () => {
const formData = {
viz_type: 'table',
matrixify_enable: true,
matrixify_enable_vertical_layout: true,
matrixify_mode_rows: 'metrics',
matrixify_rows: [createMetric('Revenue')],
// No columns configuration

View File

@@ -23,7 +23,6 @@ import { AdhocMetric } from '../../query';
* Constants for Matrixify filter generation
* These match the literal types used in Filter.ts
*/
export const MatrixifyFilterConstants = {
// Filter expression types
ExpressionType: {
@@ -47,12 +46,12 @@ export const MatrixifyFilterConstants = {
/**
* Mode for selecting matrix axis values
*/
export type MatrixifyMode = 'disabled' | 'metrics' | 'dimensions';
export type MatrixifyMode = 'metrics' | 'dimensions';
/**
* Selection method for dimension values
*/
export type MatrixifySelectionMode = 'members' | 'topn' | 'all';
export type MatrixifySelectionMode = 'members' | 'topn';
/**
* Sort order for top N selection
@@ -97,18 +96,18 @@ export interface MatrixifyAxisConfig {
* Complete Matrixify configuration in form data
*/
export interface MatrixifyFormData {
// Global enable switch
matrixify_enable?: boolean;
// Layout enable controls
matrixify_enable_vertical_layout?: boolean;
matrixify_enable_horizontal_layout?: boolean;
// Row axis configuration (mode 'disabled' means axis is off)
// Row axis configuration
matrixify_mode_rows?: MatrixifyMode;
matrixify_rows?: AdhocMetric[];
matrixify_dimension_selection_mode_rows?: MatrixifySelectionMode;
matrixify_dimension_rows?: MatrixifyDimensionValue;
matrixify_topn_value_rows?: number;
matrixify_topn_metric_rows?: AdhocMetric;
matrixify_topn_order_rows?: boolean;
matrixify_all_sort_by_rows?: 'a_to_z' | 'z_to_a' | 'metric';
matrixify_topn_order_rows?: MatrixifySortOrder;
// Column axis configuration
matrixify_mode_columns?: MatrixifyMode;
@@ -117,8 +116,7 @@ export interface MatrixifyFormData {
matrixify_dimension_columns?: MatrixifyDimensionValue;
matrixify_topn_value_columns?: number;
matrixify_topn_metric_columns?: AdhocMetric;
matrixify_topn_order_columns?: boolean;
matrixify_all_sort_by_columns?: 'a_to_z' | 'z_to_a' | 'metric';
matrixify_topn_order_columns?: MatrixifySortOrder;
// Grid layout configuration
matrixify_row_height?: number;
@@ -141,85 +139,75 @@ export interface MatrixifyConfig {
columns: MatrixifyAxisConfig;
}
/**
* Check if a given axis mode is active (not disabled)
*/
function isAxisEnabled(mode?: MatrixifyMode): boolean {
return mode === 'metrics' || mode === 'dimensions';
}
/**
* Helper function to extract Matrixify configuration from form data
*/
export function getMatrixifyConfig(
formData: MatrixifyFormData & any,
): MatrixifyConfig | null {
const rowEnabled = isAxisEnabled(formData.matrixify_mode_rows);
const colEnabled = isAxisEnabled(formData.matrixify_mode_columns);
const hasRowConfig = formData.matrixify_mode_rows;
const hasColumnConfig = formData.matrixify_mode_columns;
if (!rowEnabled && !colEnabled) {
if (!hasRowConfig && !hasColumnConfig) {
return null;
}
return {
rows: {
mode: formData.matrixify_mode_rows || 'disabled',
mode: formData.matrixify_mode_rows || 'metrics',
metrics: formData.matrixify_rows,
selectionMode: formData.matrixify_dimension_selection_mode_rows,
dimension: formData.matrixify_dimension_rows,
topnValue: formData.matrixify_topn_value_rows,
topnMetric: formData.matrixify_topn_metric_rows,
topnOrder: formData.matrixify_topn_order_rows === false ? 'asc' : 'desc',
topnOrder: formData.matrixify_topn_order_rows,
},
columns: {
mode: formData.matrixify_mode_columns || 'disabled',
mode: formData.matrixify_mode_columns || 'metrics',
metrics: formData.matrixify_columns,
selectionMode: formData.matrixify_dimension_selection_mode_columns,
dimension: formData.matrixify_dimension_columns,
topnValue: formData.matrixify_topn_value_columns,
topnMetric: formData.matrixify_topn_metric_columns,
topnOrder:
formData.matrixify_topn_order_columns === false ? 'asc' : 'desc',
topnOrder: formData.matrixify_topn_order_columns,
},
};
}
/**
* Check if Matrixify is enabled and properly configured
*/
export function isMatrixifyEnabled(formData: MatrixifyFormData): boolean {
if (formData.matrixify_enable !== true) {
return false;
}
const rowEnabled = isAxisEnabled(formData.matrixify_mode_rows);
const colEnabled = isAxisEnabled(formData.matrixify_mode_columns);
if (!rowEnabled && !colEnabled) {
// Check if either vertical or horizontal layout is enabled
const hasVerticalLayout = formData.matrixify_enable_vertical_layout === true;
const hasHorizontalLayout =
formData.matrixify_enable_horizontal_layout === true;
if (!hasVerticalLayout && !hasHorizontalLayout) {
return false;
}
// Then validate that we have proper configuration
const config = getMatrixifyConfig(formData);
if (!config) {
return false;
}
const hasRowData =
rowEnabled &&
(config.rows.mode === 'metrics'
config.rows.mode === 'metrics'
? config.rows.metrics && config.rows.metrics.length > 0
: config.rows.dimension?.dimension &&
(config.rows.selectionMode === 'topn' ||
config.rows.selectionMode === 'all' ||
(config.rows.dimension.values &&
config.rows.dimension.values.length > 0)));
config.rows.dimension.values.length > 0));
const hasColumnData =
colEnabled &&
(config.columns.mode === 'metrics'
config.columns.mode === 'metrics'
? config.columns.metrics && config.columns.metrics.length > 0
: config.columns.dimension?.dimension &&
(config.columns.selectionMode === 'topn' ||
config.columns.selectionMode === 'all' ||
(config.columns.dimension.values &&
config.columns.dimension.values.length > 0)));
config.columns.dimension.values.length > 0));
return Boolean(hasRowData || hasColumnData);
}
@@ -232,10 +220,12 @@ export function getMatrixifyValidationErrors(
): string[] {
const errors: string[] = [];
const rowEnabled = isAxisEnabled(formData.matrixify_mode_rows);
const colEnabled = isAxisEnabled(formData.matrixify_mode_columns);
// Only validate if matrixify is enabled
const hasVerticalLayout = formData.matrixify_enable_vertical_layout === true;
const hasHorizontalLayout =
formData.matrixify_enable_horizontal_layout === true;
if (!rowEnabled && !colEnabled) {
if (!hasVerticalLayout && !hasHorizontalLayout) {
return errors;
}
@@ -253,7 +243,6 @@ export function getMatrixifyValidationErrors(
? config.rows.metrics && config.rows.metrics.length > 0
: config.rows.dimension?.dimension &&
(config.rows.selectionMode === 'topn' ||
config.rows.selectionMode === 'all' ||
(config.rows.dimension.values &&
config.rows.dimension.values.length > 0));
@@ -274,7 +263,6 @@ export function getMatrixifyValidationErrors(
? config.columns.metrics && config.columns.metrics.length > 0
: config.columns.dimension?.dimension &&
(config.columns.selectionMode === 'topn' ||
config.columns.selectionMode === 'all' ||
(config.columns.dimension.values &&
config.columns.dimension.values.length > 0));
@@ -293,7 +281,6 @@ export function getMatrixifyValidationErrors(
? config.rows.metrics && config.rows.metrics.length > 0
: config.rows.dimension?.dimension &&
(config.rows.selectionMode === 'topn' ||
config.rows.selectionMode === 'all' ||
(config.rows.dimension.values &&
config.rows.dimension.values.length > 0));
@@ -302,7 +289,6 @@ export function getMatrixifyValidationErrors(
? config.columns.metrics && config.columns.metrics.length > 0
: config.columns.dimension?.dimension &&
(config.columns.selectionMode === 'topn' ||
config.columns.selectionMode === 'all' ||
(config.columns.dimension.values &&
config.columns.dimension.values.length > 0));

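The hunks above replace the single `matrixify_enable` switch with two layout flags. A minimal standalone sketch of the new enablement check, assuming a trimmed-down form-data shape (the interface and helper name here are illustrative, not the exact Superset implementation):

```typescript
// Trimmed-down mirror of the updated MatrixifyFormData fields (illustrative).
interface SketchFormData {
  matrixify_enable_vertical_layout?: boolean;
  matrixify_enable_horizontal_layout?: boolean;
  matrixify_mode_rows?: 'metrics' | 'dimensions';
  matrixify_mode_columns?: 'metrics' | 'dimensions';
}

// The old `matrixify_enable === true` gate becomes: either layout flag on.
function isLayoutEnabled(formData: SketchFormData): boolean {
  return (
    formData.matrixify_enable_vertical_layout === true ||
    formData.matrixify_enable_horizontal_layout === true
  );
}

console.log(isLayoutEnabled({ matrixify_enable_vertical_layout: true })); // true
console.log(isLayoutEnabled({ matrixify_mode_rows: 'metrics' })); // false
```

This matches the test changes above: setting only an axis mode no longer enables the feature, and both validation and config extraction short-circuit when neither flag is set.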
View File

@@ -16,7 +16,7 @@
* specific language governing permissions and limitations
* under the License.
*/
import { screen, render, fireEvent } from '@superset-ui/core/spec';
import { screen, render } from '@superset-ui/core/spec';
import { Button, DropdownContainer, Icons } from '..';
const generateItems = (n: number) =>
@@ -158,36 +158,6 @@ test('accepts custom style props', () => {
expect(container).toHaveStyle('padding: 10px');
});
test('shows dropdown button when alwaysShowDropdownButton is true even without overflow', () => {
render(
<DropdownContainer items={generateItems(3)} alwaysShowDropdownButton />,
);
expect(screen.getByTestId('dropdown-container-btn')).toBeInTheDocument();
});
test('does not open popover when alwaysShowDropdownButton is true but there is no popover content', () => {
render(
<DropdownContainer items={generateItems(3)} alwaysShowDropdownButton />,
);
const btn = screen.getByTestId('dropdown-container-btn');
expect(btn).toBeInTheDocument();
fireEvent.click(btn);
// No popover content exists, so the popover should not open
expect(screen.queryByRole('tooltip')).not.toBeInTheDocument();
});
test('does not show dropdown button when alwaysShowDropdownButton is false and not overflowing', () => {
render(
<DropdownContainer
items={generateItems(3)}
alwaysShowDropdownButton={false}
/>,
);
expect(
screen.queryByTestId('dropdown-container-btn'),
).not.toBeInTheDocument();
});
// Integration test that doesn't rely on specific overflow behavior
test('component renders and functions without throwing errors', () => {
const onOverflowingStateChange = jest.fn();

View File

@@ -53,7 +53,6 @@ export const DropdownContainer = forwardRef(
dropdownTriggerTooltip = null,
forceRender,
style,
alwaysShowDropdownButton,
}: DropdownContainerProps,
outerRef: RefObject<DropdownRef>,
) => {
@@ -315,7 +314,7 @@ export const DropdownContainer = forwardRef(
>
{notOverflowedItems.map(item => item.element)}
</div>
{(popoverContent || alwaysShowDropdownButton) && (
{popoverContent && (
<>
<Global
styles={css`
@@ -349,13 +348,8 @@ export const DropdownContainer = forwardRef(
}}
content={popoverContent}
trigger="click"
open={popoverVisible && !!popoverContent}
onOpenChange={visible => {
// When alwaysShowDropdownButton is set but there is no content
// yet (e.g. during layout recalculation), ignore open attempts
// so the button stays visible without opening an empty popover.
if (popoverContent) setPopoverVisible(visible);
}}
open={popoverVisible}
onOpenChange={visible => setPopoverVisible(visible)}
placement="bottom"
forceRender={forceRender}
>

View File

@@ -87,11 +87,6 @@ export interface DropdownContainerProps {
* Force render popover content before it's first opened
*/
forceRender?: boolean;
/**
* Always show the dropdown button, even when no items are overflowing.
* Useful to prevent button flickering during layout recalculations.
*/
alwaysShowDropdownButton?: boolean;
}
export type DropdownRef = HTMLDivElement & { open: () => void };

View File

@@ -23,7 +23,7 @@ import { Label } from '..';
// Define the prop types for DatasetTypeLabel
interface DatasetTypeLabelProps {
datasetType: 'physical' | 'virtual'; // Accepts only 'physical' or 'virtual'
datasetType: 'physical' | 'virtual' | 'semantic_view';
}
const SIZE = 's'; // Define the size as a constant
@@ -32,6 +32,24 @@ export const DatasetTypeLabel: React.FC<DatasetTypeLabelProps> = ({
datasetType,
}) => {
const theme = useTheme();
if (datasetType === 'semantic_view') {
return (
<Label
icon={
<Icons.ApartmentOutlined
iconSize={SIZE}
iconColor={theme.colorInfo}
/>
}
type="info"
style={{ color: theme.colorInfo }}
>
{t('Semantic')}
</Label>
);
}
const label: string =
datasetType === 'physical' ? t('Physical') : t('Virtual');
const icon =

View File

@@ -23,7 +23,7 @@ import { Icons } from '@superset-ui/core/components/Icons';
import { ContentType, MetadataType } from '.';
const Header = styled.div`
font-weight: ${({ theme }) => theme.fontWeightBold};
font-weight: ${({ theme }) => theme.fontWeightStrong};
`;
const Info = ({

View File

@@ -19,6 +19,15 @@
import { DatasourceType } from './types/Datasource';
const DATASOURCE_TYPE_MAP: Record<string, DatasourceType> = {
table: DatasourceType.Table,
query: DatasourceType.Query,
dataset: DatasourceType.Dataset,
sl_table: DatasourceType.SlTable,
saved_query: DatasourceType.SavedQuery,
semantic_view: DatasourceType.SemanticView,
};
export default class DatasourceKey {
readonly id: number;
@@ -27,8 +36,7 @@ export default class DatasourceKey {
constructor(key: string) {
const [idStr, typeStr] = key.split('__');
this.id = parseInt(idStr, 10);
this.type = DatasourceType.Table; // default to SqlaTable model
this.type = typeStr === 'query' ? DatasourceType.Query : this.type;
this.type = DATASOURCE_TYPE_MAP[typeStr] ?? DatasourceType.Table;
}
public toString() {

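The `DatasourceKey` change above swaps a chain of ternaries for a lookup table with a `??` fallback. A self-contained sketch of that pattern, using plain strings in place of the `DatasourceType` enum (names here are illustrative):

```typescript
// Lookup table from key suffix to datasource type; unknown suffixes
// fall back to 'table', matching the diff's `?? DatasourceType.Table`.
const TYPE_MAP: Record<string, string> = {
  table: 'table',
  query: 'query',
  dataset: 'dataset',
  sl_table: 'sl_table',
  saved_query: 'saved_query',
  semantic_view: 'semantic_view',
};

function resolveDatasourceKey(key: string): { id: number; type: string } {
  // Keys look like '42__semantic_view': numeric id, '__', type suffix.
  const [idStr, typeStr] = key.split('__');
  return { id: parseInt(idStr, 10), type: TYPE_MAP[typeStr] ?? 'table' };
}

console.log(resolveDatasourceKey('42__semantic_view').type); // 'semantic_view'
```

The table form also makes adding `semantic_view` a one-line change, which is the point of this hunk.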
View File

@@ -26,6 +26,7 @@ export enum DatasourceType {
Dataset = 'dataset',
SlTable = 'sl_table',
SavedQuery = 'saved_query',
SemanticView = 'semantic_view',
}
export interface Currency {

View File

@@ -30,7 +30,6 @@ const BINARY_OPERATORS = [
'<=',
'ILIKE',
'LIKE',
'NOT ILIKE',
'NOT LIKE',
'REGEX',
'TEMPORAL_RANGE',

View File

@@ -60,6 +60,7 @@ export enum FeatureFlag {
ListviewsDefaultCardView = 'LISTVIEWS_DEFAULT_CARD_VIEW',
Matrixify = 'MATRIXIFY',
ScheduledQueries = 'SCHEDULED_QUERIES',
SemanticLayers = 'SEMANTIC_LAYERS',
SqllabBackendPersistence = 'SQLLAB_BACKEND_PERSISTENCE',
SqlValidatorsByEngine = 'SQL_VALIDATORS_BY_ENGINE',
SshTunneling = 'SSH_TUNNELING',

View File

@@ -28,10 +28,11 @@ test('DEFAULT_METRICS', () => {
});
test('DatasourceType', () => {
expect(Object.keys(DatasourceType).length).toBe(5);
expect(Object.keys(DatasourceType).length).toBe(6);
expect(DatasourceType.Table).toBe('table');
expect(DatasourceType.Query).toBe('query');
expect(DatasourceType.Dataset).toBe('dataset');
expect(DatasourceType.SlTable).toBe('sl_table');
expect(DatasourceType.SavedQuery).toBe('saved_query');
expect(DatasourceType.SemanticView).toBe('semantic_view');
});

View File

@@ -61,9 +61,6 @@ interface MapBoxProps {
renderWhileDragging?: boolean;
rgb?: (string | number)[];
bounds?: [[number, number], [number, number]]; // May be undefined for empty datasets
viewportLongitude?: number;
viewportLatitude?: number;
viewportZoom?: number;
}
interface MapBoxState {
@@ -85,10 +82,30 @@ class MapBox extends Component<MapBoxProps, MapBoxState> {
constructor(props: MapBoxProps) {
super(props);
const fitBounds = this.computeFitBoundsViewport();
const { width = 400, height = 400, bounds } = this.props;
// Get a viewport that fits the given bounds, which allows all marks to be clustered.
// Derive lat, lon and zoom from this viewport. This is only done on initial
// render as the bounds don't update as we pan/zoom in the current design.
let latitude = 0;
let longitude = 0;
let zoom = 1;
// Guard against empty datasets where bounds may be undefined
if (bounds && bounds[0] && bounds[1]) {
const mercator = new WebMercatorViewport({
width,
height,
}).fitBounds(bounds);
({ latitude, longitude, zoom } = mercator);
}
this.state = {
viewport: this.mergeViewportWithProps(fitBounds),
viewport: {
longitude,
latitude,
zoom,
},
};
this.handleViewportChange = this.handleViewportChange.bind(this);
}
@@ -99,75 +116,6 @@ class MapBox extends Component<MapBoxProps, MapBoxState> {
onViewportChange!(viewport);
}
mergeViewportWithProps(
fitBounds: Viewport,
viewport: Viewport = fitBounds,
props: MapBoxProps = this.props,
useFitBoundsForUnset = true,
): Viewport {
const { viewportLongitude, viewportLatitude, viewportZoom } = props;
return {
...viewport,
longitude:
viewportLongitude ??
(useFitBoundsForUnset ? fitBounds.longitude : viewport.longitude),
latitude:
viewportLatitude ??
(useFitBoundsForUnset ? fitBounds.latitude : viewport.latitude),
zoom:
viewportZoom ?? (useFitBoundsForUnset ? fitBounds.zoom : viewport.zoom),
};
}
computeFitBoundsViewport(): Viewport {
const { width = 400, height = 400, bounds } = this.props;
if (bounds && bounds[0] && bounds[1]) {
const mercator = new WebMercatorViewport({ width, height }).fitBounds(
bounds,
);
return {
latitude: mercator.latitude,
longitude: mercator.longitude,
zoom: mercator.zoom,
};
}
return { latitude: 0, longitude: 0, zoom: 1 };
}
componentDidUpdate(prevProps: MapBoxProps) {
const { viewport } = this.state;
const fitBoundsInputsChanged =
prevProps.width !== this.props.width ||
prevProps.height !== this.props.height ||
prevProps.bounds !== this.props.bounds;
const viewportPropsChanged =
prevProps.viewportLongitude !== this.props.viewportLongitude ||
prevProps.viewportLatitude !== this.props.viewportLatitude ||
prevProps.viewportZoom !== this.props.viewportZoom;
if (!fitBoundsInputsChanged && !viewportPropsChanged) {
return;
}
const fitBounds = this.computeFitBoundsViewport();
const nextViewport = this.mergeViewportWithProps(
fitBounds,
viewport,
this.props,
fitBoundsInputsChanged || viewportPropsChanged,
);
const viewportChanged =
nextViewport.longitude !== viewport.longitude ||
nextViewport.latitude !== viewport.latitude ||
nextViewport.zoom !== viewport.zoom;
if (viewportChanged) {
this.setState({ viewport: nextViewport });
}
}
render() {
const {
width,

View File

@@ -241,7 +241,6 @@ const config: ControlPanelConfig = {
label: t('Opacity'),
default: 1,
isFloat: true,
renderTrigger: true,
description: t(
'Opacity of all clusters, points, and labels. Between 0 and 1.',
),
@@ -274,7 +273,7 @@ const config: ControlPanelConfig = {
type: 'TextControl',
label: t('Default longitude'),
renderTrigger: true,
default: '',
default: -122.405293,
isFloat: true,
description: t('Longitude of default viewport'),
places: 8,
@@ -288,7 +287,7 @@ const config: ControlPanelConfig = {
type: 'TextControl',
label: t('Default latitude'),
renderTrigger: true,
default: '',
default: 37.772123,
isFloat: true,
description: t('Latitude of default viewport'),
places: 8,
@@ -305,7 +304,7 @@ const config: ControlPanelConfig = {
label: t('Zoom'),
renderTrigger: true,
isFloat: true,
default: '',
default: 11,
description: t('Zoom level of the map'),
places: 8,
// Viewport zoom shouldn't prompt user to re-run query

View File

@@ -23,30 +23,6 @@ import { ChartProps } from '@superset-ui/core';
import { DEFAULT_POINT_RADIUS, DEFAULT_MAX_ZOOM } from './MapBox';
const NOOP = () => {};
const MIN_LONGITUDE = -180;
const MAX_LONGITUDE = 180;
const MIN_LATITUDE = -90;
const MAX_LATITUDE = 90;
const MIN_ZOOM = 0;
function toFiniteNumber(
value: string | number | null | undefined,
): number | undefined {
if (value === null || value === undefined) return undefined;
const normalizedValue = typeof value === 'string' ? value.trim() : value;
if (normalizedValue === '') return undefined;
const num = Number(normalizedValue);
return Number.isFinite(num) ? num : undefined;
}
function clampNumber(
value: number | undefined,
min: number,
max: number,
): number | undefined {
if (value === undefined) return undefined;
return Math.min(max, Math.max(min, value));
}
interface ClusterProperties {
metric: number;
@@ -69,9 +45,6 @@ export default function transformProps(chartProps: ChartProps) {
pandasAggfunc,
pointRadiusUnit,
renderWhileDragging,
viewportLongitude,
viewportLatitude,
viewportZoom,
} = formData;
// Validate mapbox color
@@ -120,6 +93,7 @@ export default function transformProps(chartProps: ChartProps) {
aggregatorName: pandasAggfunc,
bounds,
clusterer,
globalOpacity,
hasCustomMetric,
mapboxApiKey,
mapStyle: mapboxStyle,
@@ -142,21 +116,5 @@ export default function transformProps(chartProps: ChartProps) {
pointRadiusUnit,
renderWhileDragging,
rgb,
viewportLongitude: clampNumber(
toFiniteNumber(viewportLongitude),
MIN_LONGITUDE,
MAX_LONGITUDE,
),
viewportLatitude: clampNumber(
toFiniteNumber(viewportLatitude),
MIN_LATITUDE,
MAX_LATITUDE,
),
viewportZoom: clampNumber(
toFiniteNumber(viewportZoom),
MIN_ZOOM,
DEFAULT_MAX_ZOOM,
),
globalOpacity: Math.min(1, Math.max(0, toFiniteNumber(globalOpacity) ?? 1)),
};
}

View File

@@ -1,381 +0,0 @@
/*
* Licensed to the Apache Software Foundation (ASF) under one
* or more contributor license agreements. See the NOTICE file
* distributed with this work for additional information
* regarding copyright ownership. The ASF licenses this file
* to you under the Apache License, Version 2.0 (the
* "License"); you may not use this file except in compliance
* with the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the License is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/
import { type ReactNode } from 'react';
import { render } from '@testing-library/react';
import MapBox from '../src/MapBox';
// Capture the most recent viewport props passed to MapGL
let lastMapGLProps: Record<string, unknown> = {};
const mockFitBounds = jest.fn();
jest.mock('react-map-gl', () => {
const MockMapGL = (props: Record<string, unknown>) => {
lastMapGLProps = props;
return <div data-test="map-gl">{props.children as ReactNode}</div>;
};
return { __esModule: true, default: MockMapGL };
});
jest.mock('@math.gl/web-mercator', () => ({
WebMercatorViewport: jest
.fn()
.mockImplementation(
({ width, height }: { width: number; height: number }) => ({
fitBounds: (bounds: [[number, number], [number, number]]) =>
mockFitBounds(bounds, width, height),
}),
),
}));
jest.mock('../src/ScatterPlotGlowOverlay', () => {
const MockOverlay = (props: Record<string, unknown>) => (
<div data-test="scatter-overlay" data-opacity={props.globalOpacity} />
);
return { __esModule: true, default: MockOverlay };
});
const defaultProps = {
width: 800,
height: 600,
clusterer: {
getClusters: jest.fn().mockReturnValue([]),
},
globalOpacity: 1,
mapboxApiKey: 'test-key',
mapStyle: 'mapbox://styles/mapbox/light-v9',
pointRadius: 60,
pointRadiusUnit: 'Pixels',
renderWhileDragging: true,
rgb: ['', 255, 0, 0] as (string | number)[],
hasCustomMetric: false,
bounds: [
[-74.0, 40.7],
[-73.9, 40.8],
] as [[number, number], [number, number]],
onViewportChange: jest.fn(),
};
beforeEach(() => {
lastMapGLProps = {};
jest.clearAllMocks();
mockFitBounds.mockImplementation(
(
bounds: [[number, number], [number, number]],
width: number,
height: number,
) => ({
latitude: Number(((bounds[0][1] + bounds[1][1]) / 2).toFixed(2)),
longitude: Number(((bounds[0][0] + bounds[1][0]) / 2).toFixed(2)),
zoom: Number((10 + width / 1000 + height / 10000).toFixed(2)),
}),
);
});
test('initializes viewport from bounds', () => {
render(<MapBox {...defaultProps} />);
expect(lastMapGLProps.latitude).toBe(40.75);
expect(lastMapGLProps.longitude).toBe(-73.95);
expect(lastMapGLProps.zoom).toBe(10.86);
});
test('updates viewport when viewport props change', () => {
const { rerender } = render(
<MapBox
{...defaultProps}
viewportLongitude={-73.95}
viewportLatitude={40.75}
viewportZoom={10}
/>,
);
rerender(
<MapBox
{...defaultProps}
viewportLongitude={-122.4}
viewportLatitude={37.8}
viewportZoom={5}
/>,
);
expect(lastMapGLProps.longitude).toBe(-122.4);
expect(lastMapGLProps.latitude).toBe(37.8);
expect(lastMapGLProps.zoom).toBe(5);
});
test('does not loop when viewport state matches new props', () => {
const { rerender } = render(
<MapBox
{...defaultProps}
viewportLongitude={-73.95}
viewportLatitude={40.75}
viewportZoom={10}
/>,
);
// Re-render with same props that match the initial viewport state
rerender(
<MapBox
{...defaultProps}
viewportLongitude={-73.95}
viewportLatitude={40.75}
viewportZoom={10}
/>,
);
// Viewport should still be the fitBounds-computed values since props didn't change
expect(lastMapGLProps.longitude).toBe(-73.95);
expect(lastMapGLProps.latitude).toBe(40.75);
expect(lastMapGLProps.zoom).toBe(10);
});
test('passes globalOpacity to ScatterPlotGlowOverlay', () => {
const { getByTestId } = render(
<MapBox {...defaultProps} globalOpacity={0.5} />,
);
const overlay = getByTestId('scatter-overlay');
expect(overlay.dataset.opacity).toBe('0.5');
});
test('initializes viewport from props when provided', () => {
render(
<MapBox
{...defaultProps}
viewportLongitude={-122.4}
viewportLatitude={37.8}
viewportZoom={5}
/>,
);
expect(lastMapGLProps.longitude).toBe(-122.4);
expect(lastMapGLProps.latitude).toBe(37.8);
expect(lastMapGLProps.zoom).toBe(5);
});
test('handles undefined bounds gracefully', () => {
render(<MapBox {...defaultProps} bounds={undefined} />);
expect(lastMapGLProps.longitude).toBe(0);
expect(lastMapGLProps.latitude).toBe(0);
expect(lastMapGLProps.zoom).toBe(1);
});
test('applies partial viewport props on update', () => {
const { rerender } = render(<MapBox {...defaultProps} />);
rerender(<MapBox {...defaultProps} viewportLongitude={-122.4} />);
expect(lastMapGLProps.longitude).toBe(-122.4);
expect(lastMapGLProps.latitude).toBe(40.75);
expect(lastMapGLProps.zoom).toBe(10.86);
});
test('restores fitBounds when viewport props are cleared', () => {
const { rerender } = render(
<MapBox
{...defaultProps}
viewportLongitude={-122.4}
viewportLatitude={37.8}
viewportZoom={5}
/>,
);
// Clear all viewport props (simulates user clearing the controls)
rerender(<MapBox {...defaultProps} />);
// Should revert to fitBounds values
expect(lastMapGLProps.longitude).toBe(-73.95);
expect(lastMapGLProps.latitude).toBe(40.75);
expect(lastMapGLProps.zoom).toBe(10.86);
});
test('restores only cleared viewport props, keeps the rest', () => {
const { rerender } = render(
<MapBox
{...defaultProps}
viewportLongitude={-122.4}
viewportLatitude={37.8}
viewportZoom={5}
/>,
);
// Clear only longitude, keep lat/zoom
rerender(
<MapBox {...defaultProps} viewportLatitude={37.8} viewportZoom={5} />,
);
// Longitude reverts to fitBounds, lat/zoom stay
expect(lastMapGLProps.longitude).toBe(-73.95);
expect(lastMapGLProps.latitude).toBe(37.8);
expect(lastMapGLProps.zoom).toBe(5);
});
test('applies changed viewport props even when another is cleared simultaneously', () => {
const { rerender } = render(
<MapBox
{...defaultProps}
viewportLongitude={-122.4}
viewportLatitude={37.8}
viewportZoom={5}
/>,
);
// Clear longitude, change latitude simultaneously
rerender(
<MapBox {...defaultProps} viewportLatitude={40.0} viewportZoom={5} />,
);
// Longitude reverts to fitBounds, latitude should be the NEW value
expect(lastMapGLProps.longitude).toBe(-73.95);
expect(lastMapGLProps.latitude).toBe(40.0);
expect(lastMapGLProps.zoom).toBe(5);
});
test('falls back to default viewport when cleared with undefined bounds', () => {
const { rerender } = render(
<MapBox
{...defaultProps}
bounds={undefined}
viewportLongitude={-122.4}
viewportLatitude={37.8}
viewportZoom={5}
/>,
);
// Clear viewport props — no bounds to fitBounds to
rerender(<MapBox {...defaultProps} bounds={undefined} />);
// Should fall back to {0, 0, 1}
expect(lastMapGLProps.longitude).toBe(0);
expect(lastMapGLProps.latitude).toBe(0);
expect(lastMapGLProps.zoom).toBe(1);
});
test('recomputes fitBounds when bounds change and no explicit viewport is set', () => {
const { rerender } = render(<MapBox {...defaultProps} />);
rerender(
<MapBox
{...defaultProps}
bounds={[
[-123.2, 36.5],
[-121.8, 38.1],
]}
/>,
);
expect(lastMapGLProps.longitude).toBe(-122.5);
expect(lastMapGLProps.latitude).toBe(37.3);
expect(lastMapGLProps.zoom).toBe(10.86);
});
test('recomputes fitBounds when chart size changes and no explicit viewport is set', () => {
const { rerender } = render(<MapBox {...defaultProps} />);
rerender(<MapBox {...defaultProps} width={1200} height={900} />);
expect(lastMapGLProps.longitude).toBe(-73.95);
expect(lastMapGLProps.latitude).toBe(40.75);
expect(lastMapGLProps.zoom).toBe(11.29);
});
test('recomputes only implicit viewport fields when bounds change', () => {
const { rerender } = render(
<MapBox {...defaultProps} viewportLongitude={-122.4} />,
);
rerender(
<MapBox
{...defaultProps}
viewportLongitude={-122.4}
bounds={[
[-123.2, 36.5],
[-121.8, 38.1],
]}
/>,
);
expect(lastMapGLProps.longitude).toBe(-122.4);
expect(lastMapGLProps.latitude).toBe(37.3);
expect(lastMapGLProps.zoom).toBe(10.86);
});
test('recomputes only implicit viewport fields when chart size changes', () => {
const { rerender } = render(
<MapBox {...defaultProps} viewportLatitude={37.8} />,
);
rerender(
<MapBox
{...defaultProps}
viewportLatitude={37.8}
width={1200}
height={900}
/>,
);
expect(lastMapGLProps.longitude).toBe(-73.95);
expect(lastMapGLProps.latitude).toBe(37.8);
expect(lastMapGLProps.zoom).toBe(11.29);
});
test('recomputes implicit position when zoom stays explicit across bounds changes', () => {
const { rerender } = render(<MapBox {...defaultProps} viewportZoom={5} />);
rerender(
<MapBox
{...defaultProps}
viewportZoom={5}
bounds={[
[-123.2, 36.5],
[-121.8, 38.1],
]}
/>,
);
expect(lastMapGLProps.longitude).toBe(-122.5);
expect(lastMapGLProps.latitude).toBe(37.3);
expect(lastMapGLProps.zoom).toBe(5);
});
test('does not recompute fitBounds on bounds change when an explicit viewport is set', () => {
const { rerender } = render(
<MapBox
{...defaultProps}
viewportLongitude={-122.4}
viewportLatitude={37.8}
viewportZoom={5}
/>,
);
rerender(
<MapBox
{...defaultProps}
viewportLongitude={-122.4}
viewportLatitude={37.8}
viewportZoom={5}
bounds={[
[-123.2, 36.5],
[-121.8, 38.1],
]}
/>,
);
expect(lastMapGLProps.longitude).toBe(-122.4);
expect(lastMapGLProps.latitude).toBe(37.8);
expect(lastMapGLProps.zoom).toBe(5);
});
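The MapBox viewport tests above all exercise one resolution rule: explicit viewport controls win per field, cleared fields fall back to the fitBounds-computed viewport, and with no bounds at all the map falls back to `{0, 0, 1}`. A minimal sketch of that merge (function and type names are assumptions, not the component's actual internals):

```typescript
type Viewport = { longitude: number; latitude: number; zoom: number };

// Hypothetical per-field resolver matching what the tests assert:
// explicit control values take precedence field by field, and any
// cleared field reverts to the fitBounds result (or {0, 0, 1}).
function resolveViewport(
  explicit: Partial<Viewport>,
  fitBounds: Viewport | undefined,
): Viewport {
  const fallback = fitBounds ?? { longitude: 0, latitude: 0, zoom: 1 };
  return {
    longitude: explicit.longitude ?? fallback.longitude,
    latitude: explicit.latitude ?? fallback.latitude,
    zoom: explicit.zoom ?? fallback.zoom,
  };
}
```

With longitude cleared while latitude/zoom stay explicit, this reproduces the `-73.95 / 37.8 / 5` expectations in the first test above.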

View File

@@ -1,81 +0,0 @@
/**
* Licensed to the Apache Software Foundation (ASF) under one
* or more contributor license agreements. See the NOTICE file
* distributed with this work for additional information
* regarding copyright ownership. The ASF licenses this file
* to you under the Apache License, Version 2.0 (the
* "License"); you may not use this file except in compliance
* with the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the License is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/
import type {
ControlPanelConfig,
CustomControlItem,
} from '@superset-ui/chart-controls';
import controlPanel from '../src/controlPanel';
type ControlConfig = Required<CustomControlItem['config']>;
function isCustomControlItem(
controlItem: unknown,
): controlItem is CustomControlItem & { config: ControlConfig } {
return (
typeof controlItem === 'object' &&
controlItem !== null &&
'name' in controlItem &&
'config' in controlItem
);
}
function getControl(
panel: ControlPanelConfig,
controlName: string,
): CustomControlItem & { config: ControlConfig } {
const item = (panel.controlPanelSections || [])
.flatMap(section => section?.controlSetRows || [])
.flat()
.find(
controlItem =>
isCustomControlItem(controlItem) && controlItem.name === controlName,
);
if (!isCustomControlItem(item)) {
throw new Error(`Control "${controlName}" not found`);
}
return item;
}
test('viewport controls default to empty values and rerender without query refresh', () => {
const longitudeControl = getControl(controlPanel, 'viewport_longitude');
const latitudeControl = getControl(controlPanel, 'viewport_latitude');
const zoomControl = getControl(controlPanel, 'viewport_zoom');
expect(longitudeControl.config.default).toBe('');
expect(latitudeControl.config.default).toBe('');
expect(zoomControl.config.default).toBe('');
expect(longitudeControl.config.renderTrigger).toBe(true);
expect(latitudeControl.config.renderTrigger).toBe(true);
expect(zoomControl.config.renderTrigger).toBe(true);
expect(longitudeControl.config.dontRefreshOnChange).toBe(true);
expect(latitudeControl.config.dontRefreshOnChange).toBe(true);
expect(zoomControl.config.dontRefreshOnChange).toBe(true);
});
test('opacity control rerenders immediately when changed', () => {
const opacityControl = getControl(controlPanel, 'global_opacity');
expect(opacityControl.config.default).toBe(1);
expect(opacityControl.config.renderTrigger).toBe(true);
expect(opacityControl.config.isFloat).toBe(true);
});

View File

@@ -1,230 +0,0 @@
/*
* Licensed to the Apache Software Foundation (ASF) under one
* or more contributor license agreements. See the NOTICE file
* distributed with this work for additional information
* regarding copyright ownership. The ASF licenses this file
* to you under the Apache License, Version 2.0 (the
* "License"); you may not use this file except in compliance
* with the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the License is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/
import { ChartProps } from '@superset-ui/core';
import { supersetTheme } from '@apache-superset/core/theme';
jest.mock('supercluster', () => {
const MockSupercluster = jest.fn().mockImplementation(() => ({
load: jest.fn(),
getClusters: jest.fn().mockReturnValue([]),
}));
return { __esModule: true, default: MockSupercluster };
});
// Import after mocking supercluster to avoid ESM parse error
// eslint-disable-next-line import/first
import transformProps from '../src/transformProps';
type TransformPropsResult = {
globalOpacity?: number;
onViewportChange?: (viewport: {
latitude: number;
longitude: number;
zoom: number;
}) => void;
viewportLongitude?: number;
viewportLatitude?: number;
viewportZoom?: number;
};
const baseFormData = {
clusteringRadius: 60,
globalOpacity: 0.8,
mapboxColor: 'rgb(0, 139, 139)',
mapboxStyle: 'mapbox://styles/mapbox/light-v9',
pandasAggfunc: 'sum',
pointRadiusUnit: 'Pixels',
renderWhileDragging: true,
viewportLongitude: -73.935242,
viewportLatitude: 40.73061,
viewportZoom: 9,
};
const baseQueriesData = [
{
data: {
bounds: [
[-74.0, 40.7],
[-73.9, 40.8],
] as [[number, number], [number, number]],
geoJSON: { features: [] },
hasCustomMetric: false,
mapboxApiKey: 'test-api-key',
},
},
];
function createChartProps(overrides: Record<string, unknown> = {}) {
return new ChartProps({
formData: { ...baseFormData, ...overrides },
width: 800,
height: 600,
queriesData: baseQueriesData,
theme: supersetTheme,
});
}
function getTransformPropsResult(
overrides: Record<string, unknown> = {},
): TransformPropsResult {
return transformProps(createChartProps(overrides)) as TransformPropsResult;
}
test('extracts globalOpacity from formData', () => {
const result = getTransformPropsResult({ globalOpacity: 0.5 });
expect(result.globalOpacity).toBe(0.5);
});
test('extracts viewport values from formData', () => {
const result = getTransformPropsResult({
viewportLongitude: -122.4,
viewportLatitude: 37.8,
viewportZoom: 12,
});
expect(result).toEqual(
expect.objectContaining({
viewportLongitude: -122.4,
viewportLatitude: 37.8,
viewportZoom: 12,
}),
);
});
test('clamps viewport values to safe map ranges', () => {
const result = getTransformPropsResult({
viewportLongitude: 190,
viewportLatitude: -100,
viewportZoom: 99,
});
expect(result).toEqual(
expect.objectContaining({
viewportLongitude: 180,
viewportLatitude: -90,
viewportZoom: 16,
}),
);
});
test('provides onViewportChange callback that updates control values', () => {
const setControlValue = jest.fn();
const chartProps = new ChartProps({
formData: baseFormData,
width: 800,
height: 600,
queriesData: baseQueriesData,
hooks: { setControlValue },
theme: supersetTheme,
});
const result = transformProps(chartProps) as TransformPropsResult;
expect(result.onViewportChange).toBeDefined();
result.onViewportChange!({
latitude: 51.5,
longitude: -0.12,
zoom: 10,
});
expect(setControlValue).toHaveBeenCalledWith('viewport_longitude', -0.12);
expect(setControlValue).toHaveBeenCalledWith('viewport_latitude', 51.5);
expect(setControlValue).toHaveBeenCalledWith('viewport_zoom', 10);
});
test('normalizes string viewport values to numbers', () => {
const result = getTransformPropsResult({
viewportLongitude: '-122.4',
viewportLatitude: '37.8',
viewportZoom: '12',
});
expect(result.viewportLongitude).toBe(-122.4);
expect(result.viewportLatitude).toBe(37.8);
expect(result.viewportZoom).toBe(12);
});
test('normalizes empty viewport values to undefined', () => {
const result = getTransformPropsResult({
viewportLongitude: '',
viewportLatitude: '',
viewportZoom: '',
});
expect(result.viewportLongitude).toBeUndefined();
expect(result.viewportLatitude).toBeUndefined();
expect(result.viewportZoom).toBeUndefined();
});
test('normalizes whitespace-only viewport values to undefined', () => {
const result = getTransformPropsResult({
viewportLongitude: ' ',
viewportLatitude: '\t',
viewportZoom: ' \n ',
});
expect(result.viewportLongitude).toBeUndefined();
expect(result.viewportLatitude).toBeUndefined();
expect(result.viewportZoom).toBeUndefined();
});
test('normalizes string opacity to number', () => {
const result = getTransformPropsResult({ globalOpacity: '0.5' });
expect(result.globalOpacity).toBe(0.5);
});
test('defaults empty opacity to 1', () => {
const result = getTransformPropsResult({ globalOpacity: '' });
expect(result.globalOpacity).toBe(1);
});
test('defaults whitespace-only opacity to 1', () => {
const result = getTransformPropsResult({ globalOpacity: ' ' });
expect(result.globalOpacity).toBe(1);
});
test('clamps opacity to [0, 1] range', () => {
expect(getTransformPropsResult({ globalOpacity: 5 }).globalOpacity).toBe(1);
expect(getTransformPropsResult({ globalOpacity: -1 }).globalOpacity).toBe(0);
});
test('passes through numeric values unchanged', () => {
const result = getTransformPropsResult({
viewportLongitude: -122.4,
viewportLatitude: 37.8,
viewportZoom: 12,
globalOpacity: 0.8,
});
expect(result.viewportLongitude).toBe(-122.4);
expect(result.viewportLatitude).toBe(37.8);
expect(result.viewportZoom).toBe(12);
expect(result.globalOpacity).toBe(0.8);
});
test('calls onError and returns empty object for invalid color', () => {
const onError = jest.fn();
const chartProps = new ChartProps({
formData: { ...baseFormData, mapboxColor: 'invalid-color' },
width: 800,
height: 600,
queriesData: baseQueriesData,
hooks: { onError },
theme: supersetTheme,
});
const result = transformProps(chartProps);
expect(onError).toHaveBeenCalledWith(
"Color field must be of form 'rgb(%d, %d, %d)'",
);
expect(result).toEqual({});
});
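The normalization tests above pin down one shared behavior: string form values are parsed to numbers, empty or whitespace-only values become `undefined`, and results are clamped to a safe range. A condensed sketch of such a normalizer (the name and signature are assumptions for illustration):

```typescript
// Hypothetical normalizer matching the behavior the tests assert:
// parse strings, treat blank/whitespace-only input as unset, and
// clamp the parsed number into [min, max].
function normalizeViewportValue(
  value: number | string | undefined,
  min: number,
  max: number,
): number | undefined {
  if (value === undefined) return undefined;
  const raw = typeof value === 'string' ? value.trim() : value;
  if (raw === '') return undefined;
  const num = Number(raw);
  if (Number.isNaN(num)) return undefined;
  return Math.min(max, Math.max(min, num));
}
```

For longitude this would be called with `(-180, 180)`, for latitude `(-90, 90)`, mirroring the clamping test's `190 → 180` and `-100 → -90` expectations.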

View File

@@ -150,20 +150,14 @@ function WorldMap(element: HTMLElement, props: WorldMapProps): void {
fillColor: colorFn(d.name, sliceId),
}));
} else {
const rawExtents = d3Extent(filteredData, d => d.m1);
const extents: [number, number] =
rawExtents[0] != null && rawExtents[1] != null
? [rawExtents[0], rawExtents[1]]
: [0, 1];
const colorSchemeObj = getSequentialSchemeRegistry().get(linearColorScheme);
colorFn = colorSchemeObj
? colorSchemeObj.createLinearScale(extents)
: () => theme.colorBorder;
colorFn = getSequentialSchemeRegistry()
.get(linearColorScheme)
.createLinearScale(d3Extent(filteredData, d => d.m1));
processedData = filteredData.map(d => ({
...d,
radius: radiusScale(Math.sqrt(d.m2)),
fillColor: colorFn(d.m1) ?? theme.colorBorder,
fillColor: colorFn(d.m1),
}));
}
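The defensive variant in the hunk above guards against `d3Extent` returning `[undefined, undefined]` for empty or all-null input before handing the domain to `createLinearScale`. A dependency-free sketch of that guard (names are hypothetical; the real code uses `d3Extent` from d3-array):

```typescript
// Compute a [min, max] extent, falling back to [0, 1] when the input
// has no usable numbers — the same fallback the hunk above applies
// before building a linear color scale.
function safeExtent(values: Array<number | null | undefined>): [number, number] {
  let min: number | undefined;
  let max: number | undefined;
  for (const v of values) {
    if (v == null) continue;
    if (min === undefined || v < min) min = v;
    if (max === undefined || v > max) max = v;
  }
  return min !== undefined && max !== undefined ? [min, max] : [0, 1];
}
```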

View File

@@ -489,59 +489,6 @@ test('popupTemplate returns tooltip HTML when country data exists', () => {
expect(tooltipHtml).toContain('hoverinfo');
});
test('assigns fill colors from sequential scheme when colorBy is metric', () => {
WorldMap(container, {
...baseProps,
colorBy: ColorBy.Metric,
});
const data = lastDatamapConfig?.data as Record<
string,
WorldMapDataEntry & { fillColor: string }
>;
expect(data).toHaveProperty('USA');
expect(data).toHaveProperty('CAN');
expect(data.USA).toMatchObject({
country: 'USA',
name: 'United States',
m1: 100,
});
// fillColor should be a valid color string from the sequential scale
expect(data.USA.fillColor).toMatch(/^(#|rgb)/);
expect(data.CAN.fillColor).toMatch(/^(#|rgb)/);
});
test('falls back to theme.colorBorder when metric values are null', () => {
WorldMap(container, {
...baseProps,
colorBy: ColorBy.Metric,
data: [
{
country: 'USA',
name: 'United States',
m1: null as unknown as number,
m2: 200,
code: 'US',
latitude: 37.0902,
longitude: -95.7129,
},
],
} as any);
const data = lastDatamapConfig?.data as Record<string, { fillColor: string }>;
expect(data.USA.fillColor).toBe('#e0e0e0');
});
test('does not throw with empty data and metric coloring', () => {
expect(() => {
WorldMap(container, {
...baseProps,
colorBy: ColorBy.Metric,
data: [],
});
}).not.toThrow();
});
test('popupTemplate handles null/undefined country data gracefully', () => {
WorldMap(container, baseProps);

View File

@@ -605,140 +605,3 @@ describe('DeckMulti Component Rendering', () => {
});
});
});
test('includes parent_slice_id in child slice requests when parent has slice_id', async () => {
jest.clearAllMocks();
const mockGet = jest.fn().mockResolvedValue({
json: {
result: {
form_data: {
viz_type: 'deck_scatter',
datasource: '1__table',
},
},
data: {
features: [],
},
},
});
(SupersetClient.get as jest.Mock) = mockGet;
const parentSliceId = 99;
const dashboardId = 5;
const props = {
...baseMockProps,
formData: {
...baseMockProps.formData,
slice_id: parentSliceId,
dashboardId,
},
};
renderWithProviders(<DeckMulti {...props} />);
await waitFor(() => {
expect(mockGet).toHaveBeenCalled();
});
// Check that the child slice requests include parent_slice_id
const { calls } = mockGet.mock;
calls.forEach(call => {
const { endpoint } = call[0];
if (endpoint.includes('api/v1/explore/form_data')) {
const body = JSON.parse(call[0].body);
expect(body.form_data).toMatchObject({
dashboardId,
parent_slice_id: parentSliceId,
});
}
});
});
test('includes parent_slice_id in embedded mode', async () => {
jest.clearAllMocks();
const mockGet = jest.fn().mockResolvedValue({
json: {
result: {
form_data: {
viz_type: 'deck_scatter',
datasource: '1__table',
},
},
data: {
features: [],
},
},
});
(SupersetClient.get as jest.Mock) = mockGet;
const parentSliceId = 200;
const dashboardId = 10;
const props = {
...baseMockProps,
formData: {
...baseMockProps.formData,
slice_id: parentSliceId,
dashboardId,
embedded: true,
},
};
renderWithProviders(<DeckMulti {...props} />);
await waitFor(() => {
expect(mockGet).toHaveBeenCalled();
});
// Verify parent_slice_id is included in embedded mode
const { calls } = mockGet.mock;
calls.forEach(call => {
const { endpoint } = call[0];
if (endpoint.includes('api/v1/explore/form_data')) {
const body = JSON.parse(call[0].body);
expect(body.form_data.parent_slice_id).toBe(parentSliceId);
}
});
});
test('does not include parent_slice_id when parent has no slice_id', async () => {
jest.clearAllMocks();
const mockGet = jest.fn().mockResolvedValue({
json: {
result: {
form_data: {
viz_type: 'deck_scatter',
datasource: '1__table',
},
},
data: {
features: [],
},
},
});
(SupersetClient.get as jest.Mock) = mockGet;
const props = {
...baseMockProps,
formData: {
...baseMockProps.formData,
// No slice_id in parent
dashboardId: 5,
},
};
renderWithProviders(<DeckMulti {...props} />);
await waitFor(() => {
expect(mockGet).toHaveBeenCalled();
});
// Verify parent_slice_id is not included when parent has no slice_id
const { calls } = mockGet.mock;
calls.forEach(call => {
const { endpoint } = call[0];
if (endpoint.includes('api/v1/explore/form_data')) {
const body = JSON.parse(call[0].body);
expect(body.form_data.parent_slice_id).toBeUndefined();
}
});
});

View File

@@ -289,8 +289,6 @@ const DeckMulti = (props: DeckMultiProps) => {
adhoc_filters: adhocFilters,
// Preserve dashboard context for embedded mode permissions
...(formData.dashboardId && { dashboardId: formData.dashboardId }),
// Include parent multilayer chart ID for security checks
...(formData.slice_id && { parent_slice_id: formData.slice_id }),
},
} as any as JsonObject & { slice_id: number };

View File

@@ -289,126 +289,6 @@ describe('Polygon transformProps', () => {
expect(features[0]?.elevation).toBeUndefined();
});
test('should render polygons when boundary column contains GeoJSON Feature format', () => {
const geojsonFeatureProps = {
...mockChartProps,
queriesData: [
{
data: [
{
geom: JSON.stringify({
type: 'Feature',
geometry: {
type: 'Polygon',
coordinates: [
[
[-122.4, 37.8],
[-122.3, 37.8],
[-122.3, 37.9],
[-122.4, 37.9],
[-122.4, 37.8],
],
],
},
properties: { name: 'test' },
}),
},
],
},
],
};
const result = transformProps(geojsonFeatureProps as ChartProps);
const features = result.payload.data.features as PolygonFeature[];
expect(features).toHaveLength(1);
expect(features[0]?.polygon).toEqual([
[-122.4, 37.8],
[-122.3, 37.8],
[-122.3, 37.9],
[-122.4, 37.9],
[-122.4, 37.8],
]);
});
test('should render polygons when boundary column contains GeoJSON Geometry format', () => {
const geojsonGeometryProps = {
...mockChartProps,
queriesData: [
{
data: [
{
geom: JSON.stringify({
type: 'Polygon',
coordinates: [
[
[-122.4, 37.8],
[-122.3, 37.8],
[-122.3, 37.9],
[-122.4, 37.9],
[-122.4, 37.8],
],
],
}),
},
],
},
],
};
const result = transformProps(geojsonGeometryProps as ChartProps);
const features = result.payload.data.features as PolygonFeature[];
expect(features).toHaveLength(1);
expect(features[0]?.polygon).toEqual([
[-122.4, 37.8],
[-122.3, 37.8],
[-122.3, 37.9],
[-122.4, 37.9],
[-122.4, 37.8],
]);
});
test('should render polygons when boundary column contains JSON with nested geometry', () => {
// Real-world format: {"type":"Polygon","geometry":{"type":"Polygon","coordinates":[...]}}
const nonStandardProps = {
...mockChartProps,
queriesData: [
{
data: [
{
geom: JSON.stringify({
type: 'Polygon',
geometry: {
type: 'Polygon',
coordinates: [
[
[79.7912, 8.4641],
[79.7959, 8.4629],
[79.7994, 8.4535],
[79.7912, 8.4641],
],
],
},
}),
},
],
},
],
};
const result = transformProps(nonStandardProps as ChartProps);
const features = result.payload.data.features as PolygonFeature[];
expect(features).toHaveLength(1);
expect(features[0]?.polygon).toEqual([
[79.7912, 8.4641],
[79.7959, 8.4629],
[79.7994, 8.4535],
[79.7912, 8.4641],
]);
});
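The three tests above cover the boundary-column formats the parser accepts: a GeoJSON Feature, a bare Geometry, and a non-standard object with a nested `geometry` key. A condensed sketch of that branching (function and type names are hypothetical; the real logic lives in `processPolygonData`):

```typescript
type Ring = [number, number][];

// Extract the outer polygon ring from a serialized boundary value,
// accepting a bare coordinate array, a Geometry with `coordinates`,
// or an object with a nested `geometry.coordinates`.
function extractPolygonRing(geom: string): Ring | undefined {
  const parsed = JSON.parse(geom);
  if (Array.isArray(parsed)) return parsed as Ring;
  const coords = parsed.coordinates ?? parsed.geometry?.coordinates;
  if (!coords) return undefined;
  // Polygon coordinates are an array of rings; take the outer ring.
  return (coords[0] ?? coords) as Ring;
}
```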
test('should handle geohash decoding successfully', () => {
const props = {
...mockedChartPropsWithGeoHash,

View File

@@ -115,10 +115,6 @@ function processPolygonData(
if (parsed.coordinates) {
polygonCoords = parsed.coordinates[0] || parsed.coordinates;
} else if (parsed.geometry?.coordinates) {
// Non-standard format with nested geometry
polygonCoords =
parsed.geometry.coordinates[0] || parsed.geometry.coordinates;
} else if (Array.isArray(parsed)) {
polygonCoords = parsed;
} else {

View File

@@ -34,7 +34,7 @@
"fast-safe-stringify": "^2.1.1",
"lodash": "^4.17.23",
"nvd3-fork": "^2.0.5",
"dompurify": "^3.3.3",
"dompurify": "^3.3.1",
"prop-types": "^15.8.1",
"urijs": "^1.19.11"
},

View File

@@ -85,7 +85,6 @@ export default function TableChart<D extends DataRecord = DataRecord>(
width,
onChartStateChange,
chartState,
metricSqlExpressions,
} = props;
const [searchOptions, setSearchOptions] = useState<SearchOption[]>([]);
@@ -188,7 +187,6 @@ export default function TableChart<D extends DataRecord = DataRecord>(
lastFilteredColumn: completeFilterState.lastFilteredColumn,
lastFilteredInputPosition: completeFilterState.inputPosition,
currentPage: 0, // Reset to first page when filtering
metricSqlExpressions,
};
updateTableOwnState(setDataMask, modifiedOwnState);
@@ -199,7 +197,6 @@ export default function TableChart<D extends DataRecord = DataRecord>(
serverPaginationData,
onChartStateChange,
chartState,
metricSqlExpressions,
],
);

View File

@@ -482,77 +482,12 @@ const buildQuery: BuildQuery<TableChartFormData> = (
};
}
// Map metric/column labels to SQL expressions for WHERE/HAVING resolution
const sqlExpressionMap: Record<string, string> = {};
(metrics || []).forEach((m: QueryFormMetric) => {
if (typeof m === 'object' && 'expressionType' in m) {
const label = getMetricLabel(m);
if (m.expressionType === 'SQL' && m.sqlExpression) {
sqlExpressionMap[label] = m.sqlExpression;
} else if (
m.expressionType === 'SIMPLE' &&
m.aggregate &&
m.column?.column_name
) {
sqlExpressionMap[label] = `${m.aggregate}(${m.column.column_name})`;
}
}
});
// Map dimension columns with custom SQL expressions
(columns || []).forEach((col: QueryFormColumn) => {
if (typeof col === 'object' && 'sqlExpression' in col) {
const label = getColumnLabel(col);
if (col.sqlExpression) {
sqlExpressionMap[label] = col.sqlExpression;
}
}
});
// Merge datasource-level saved metrics and calculated columns
if (ownState.metricSqlExpressions) {
Object.entries(
ownState.metricSqlExpressions as Record<string, string>,
).forEach(([label, expression]) => {
if (!sqlExpressionMap[label]) {
sqlExpressionMap[label] = expression;
}
});
}
const resolveLabelsToSQL = (clause: string): string => {
let resolved = clause;
// Sort by label length descending to prevent substring false positives
const sortedEntries = Object.entries(sqlExpressionMap).sort(
([a], [b]) => b.length - a.length,
);
sortedEntries.forEach(([label, expression]) => {
if (resolved.includes(label)) {
const escapedLabel = label.replace(/[.*+?^${}()|[\]\\]/g, '\\$&');
// Wrap complex expressions in parentheses for valid SQL
const isExpression =
expression.includes('(') ||
expression.toUpperCase().includes('CASE') ||
expression.includes('\n');
const wrappedExpression = isExpression
? `(${expression})`
: expression;
resolved = resolved.replace(
new RegExp(`\\b${escapedLabel}\\b`, 'g'),
wrappedExpression,
);
}
});
return resolved;
};
// Resolve and apply AG Grid WHERE clause
// Add AG Grid complex WHERE clause from ownState (non-metric filters)
if (ownState.agGridComplexWhere && ownState.agGridComplexWhere.trim()) {
const resolvedWhere = resolveLabelsToSQL(ownState.agGridComplexWhere);
(ownState as Record<string, unknown>).agGridComplexWhere =
resolvedWhere;
const existingWhere = queryObject.extras?.where;
const combinedWhere = existingWhere
? `${existingWhere} AND ${resolvedWhere}`
: resolvedWhere;
? `${existingWhere} AND ${ownState.agGridComplexWhere}`
: ownState.agGridComplexWhere;
queryObject = {
...queryObject,
@@ -563,15 +498,12 @@ const buildQuery: BuildQuery<TableChartFormData> = (
};
}
// Resolve and apply AG Grid HAVING clause
// Add AG Grid HAVING clause from ownState (metric filters only)
if (ownState.agGridHavingClause && ownState.agGridHavingClause.trim()) {
const resolvedHaving = resolveLabelsToSQL(ownState.agGridHavingClause);
(ownState as Record<string, unknown>).agGridHavingClause =
resolvedHaving;
const existingHaving = queryObject.extras?.having;
const combinedHaving = existingHaving
? `${existingHaving} AND ${resolvedHaving}`
: resolvedHaving;
? `${existingHaving} AND ${ownState.agGridHavingClause}`
: ownState.agGridHavingClause;
queryObject = {
...queryObject,

View File

@@ -34,8 +34,6 @@ import {
SMART_DATE_ID,
TimeFormats,
TimeFormatter,
AgGridChartState,
AgGridFilterModel,
} from '@superset-ui/core';
import { GenericDataType } from '@apache-superset/core/common';
import { isEmpty, isEqual, merge } from 'lodash';
@@ -458,9 +456,6 @@ const getPageSize = (
return numRecords * numColumns > 5000 ? 200 : 0;
};
// Tracks slice_ids that have already applied their saved chartState filter on mount
const savedFilterAppliedSet = new Set<number>();
const transformProps = (
chartProps: TableChartProps,
): AgGridTableChartTransformedProps => {
@@ -715,36 +710,6 @@ const transformProps = (
: totalQuery?.data[0]
: undefined;
// Map saved metric/calculated column labels to their SQL expressions for filter resolution
const metricSqlExpressions: Record<string, string> = {};
chartProps.datasource.metrics.forEach(metric => {
if (metric.metric_name && metric.expression) {
metricSqlExpressions[metric.metric_name] = metric.expression;
}
});
chartProps.datasource.columns.forEach(col => {
if (col.column_name && col.expression) {
metricSqlExpressions[col.column_name] = col.expression;
if (col.verbose_name && col.verbose_name !== col.column_name) {
metricSqlExpressions[col.verbose_name] = col.expression;
}
}
});
// Strip saved filter from chartState after initial application to prevent re-injection
let chartState = serverPaginationData?.chartState as
| AgGridChartState
| undefined;
const chartStateHasFilter = !!(
chartState?.filterModel && Object.keys(chartState.filterModel).length > 0
);
if (chartStateHasFilter && savedFilterAppliedSet.has(slice_id)) {
chartState = { ...chartState!, filterModel: {} as AgGridFilterModel };
} else if (chartStateHasFilter) {
savedFilterAppliedSet.add(slice_id);
}
return {
height,
width,
@@ -777,8 +742,7 @@ const transformProps = (
basicColorColumnFormatters,
basicColorFormatters,
formData,
metricSqlExpressions,
chartState,
chartState: serverPaginationData?.chartState,
onChartStateChange,
};
};
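The saved-filter handling in the hunk above uses a module-level `Set` keyed by `slice_id` so a persisted `filterModel` is applied only on the first `transformProps` pass and stripped on re-renders. A distilled version of that pattern (names are assumptions for illustration):

```typescript
// Module-level registry of slices whose saved filter was already applied.
const appliedOnce = new Set<number>();

// Return the chart state unchanged on first sight of a saved filter,
// then strip filterModel on subsequent calls to prevent re-injection.
function stripAfterFirstApply<T extends { filterModel?: object }>(
  sliceId: number,
  state: T | undefined,
): T | undefined {
  const hasFilter = !!(
    state?.filterModel && Object.keys(state.filterModel).length > 0
  );
  if (hasFilter && appliedOnce.has(sliceId)) {
    return { ...state!, filterModel: {} } as T;
  }
  if (hasFilter) appliedOnce.add(sliceId);
  return state;
}
```

A module-level `Set` survives re-renders of the same page, which is what makes the "apply once" semantics work; it also means the flag persists until a full reload.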

View File

@@ -128,7 +128,6 @@ export interface AgGridTableChartTransformedProps<
basicColorFormatters?: { [Key: string]: BasicColorFormatterType }[];
basicColorColumnFormatters?: { [Key: string]: BasicColorFormatterType }[];
formData: TableChartFormData;
metricSqlExpressions: Record<string, string>;
onChartStateChange?: (chartState: JsonObject) => void;
chartState?: AgGridChartState;
}

View File

@@ -1090,258 +1090,4 @@ describe('plugin-chart-ag-grid-table', () => {
expect(query.metrics).toEqual([]);
});
});
describe('buildQuery - label-to-SQL resolution in WHERE/HAVING', () => {
test('should resolve inline SQL metric labels in WHERE clause', () => {
const query = buildQuery(
{
...basicFormData,
server_pagination: true,
metrics: [
{
expressionType: 'SQL',
sqlExpression: 'SUM(revenue)',
label: 'Total Revenue',
},
],
},
{
ownState: {
agGridComplexWhere: 'Total Revenue > 1000',
},
},
).queries[0];
expect(query.extras?.where).toBe('(SUM(revenue)) > 1000');
});
test('should resolve SIMPLE metric labels in HAVING clause', () => {
const query = buildQuery(
{
...basicFormData,
server_pagination: true,
metrics: [
{
expressionType: 'SIMPLE',
aggregate: 'SUM',
column: { column_name: 'revenue' },
label: 'Total Revenue',
},
],
},
{
ownState: {
agGridHavingClause: 'Total Revenue > 1000',
},
},
).queries[0];
expect(query.extras?.having).toBe('(SUM(revenue)) > 1000');
});
test('should resolve adhoc column SQL expressions in WHERE clause', () => {
const adhocColumn = createAdhocColumn('UPPER(city)', 'City Upper');
const query = buildQuery(
{
...basicFormData,
server_pagination: true,
groupby: [adhocColumn],
},
{
ownState: {
agGridComplexWhere: "City Upper = 'NEW YORK'",
},
},
).queries[0];
expect(query.extras?.where).toContain('UPPER(city)');
});
test('should wrap CASE expressions in parentheses', () => {
const query = buildQuery(
{
...basicFormData,
server_pagination: true,
},
{
ownState: {
agGridComplexWhere: "degree_level = 'High'",
metricSqlExpressions: {
degree_level:
"CASE WHEN degree = 'PhD' THEN 'High' ELSE 'Low' END",
},
},
},
).queries[0];
expect(query.extras?.where).toBe(
"(CASE WHEN degree = 'PhD' THEN 'High' ELSE 'Low' END) = 'High'",
);
});
test('should wrap aggregate expressions in parentheses', () => {
const query = buildQuery(
{
...basicFormData,
server_pagination: true,
},
{
ownState: {
agGridHavingClause: 'total_count > 100',
metricSqlExpressions: {
total_count: 'COUNT(*)',
},
},
},
).queries[0];
expect(query.extras?.having).toBe('(COUNT(*)) > 100');
});
test('should quote simple column names without parentheses', () => {
const query = buildQuery(
{
...basicFormData,
server_pagination: true,
},
{
ownState: {
agGridComplexWhere: "status = 'active'",
metricSqlExpressions: {
status: 'user_status',
},
},
},
).queries[0];
expect(query.extras?.where).toBe("user_status = 'active'");
});
test('should resolve longer labels before shorter ones', () => {
const query = buildQuery(
{
...basicFormData,
server_pagination: true,
metrics: [
{
expressionType: 'SQL',
sqlExpression: 'COUNT(*)',
label: 'count',
},
{
expressionType: 'SQL',
sqlExpression: 'COUNT(DISTINCT id)',
label: 'count_distinct',
},
],
},
{
ownState: {
agGridHavingClause: 'count_distinct > 5 AND count > 10',
},
},
).queries[0];
expect(query.extras?.having).toContain('(COUNT(DISTINCT id)) > 5');
expect(query.extras?.having).toContain('(COUNT(*)) > 10');
});
test('should prefer query-level expressions over datasource-level', () => {
const query = buildQuery(
{
...basicFormData,
server_pagination: true,
metrics: [
{
expressionType: 'SQL',
sqlExpression: 'SUM(amount)',
label: 'total',
},
],
},
{
ownState: {
agGridHavingClause: 'total > 500',
metricSqlExpressions: {
total: 'SUM(old_amount)',
},
},
},
).queries[0];
// Query-level SUM(amount) should win over datasource-level SUM(old_amount)
expect(query.extras?.having).toBe('(SUM(amount)) > 500');
});
test('should not modify clause when no labels match', () => {
const query = buildQuery(
{
...basicFormData,
server_pagination: true,
},
{
ownState: {
agGridComplexWhere: 'physical_column > 10',
},
},
).queries[0];
expect(query.extras?.where).toBe('physical_column > 10');
});
test('should resolve labels in both WHERE and HAVING simultaneously', () => {
const query = buildQuery(
{
...basicFormData,
server_pagination: true,
metrics: [
{
expressionType: 'SQL',
sqlExpression: 'SUM(sales)',
label: 'Total Sales',
},
],
},
{
ownState: {
agGridComplexWhere: "region = 'West'",
agGridHavingClause: 'Total Sales > 1000',
metricSqlExpressions: {
region: "CASE WHEN area = 'W' THEN 'West' ELSE 'East' END",
},
},
},
).queries[0];
expect(query.extras?.where).toContain('CASE WHEN');
expect(query.extras?.having).toBe('(SUM(sales)) > 1000');
});
test('should not resolve labels when server pagination is disabled', () => {
const query = buildQuery(
{
...basicFormData,
server_pagination: false,
metrics: [
{
expressionType: 'SQL',
sqlExpression: 'SUM(revenue)',
label: 'Total Revenue',
},
],
},
{
ownState: {
agGridComplexWhere: 'Total Revenue > 1000',
metricSqlExpressions: {
some_col: 'COUNT(*)',
},
},
},
).queries[0];
expect(query.extras?.where || undefined).toBeUndefined();
});
});
});
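The tests above exercise the label-to-SQL substitution shown in the buildQuery diff: longer labels resolve first to avoid substring false positives, labels are regex-escaped and matched on word boundaries, and non-trivial expressions are parenthesized so the rewritten clause stays valid SQL. A self-contained sketch of that pass (a condensed reading of the diff, not the exact production code):

```typescript
// Replace metric/column labels in a WHERE/HAVING clause with their
// SQL expressions, longest label first, wrapping complex expressions
// (function calls, CASE, multi-line) in parentheses.
function resolveLabelsToSQL(
  clause: string,
  sqlExpressionMap: Record<string, string>,
): string {
  let resolved = clause;
  Object.entries(sqlExpressionMap)
    .sort(([a], [b]) => b.length - a.length)
    .forEach(([label, expression]) => {
      const escaped = label.replace(/[.*+?^${}()|[\]\\]/g, '\\$&');
      const needsParens = /[()\n]|CASE/i.test(expression);
      const wrapped = needsParens ? `(${expression})` : expression;
      resolved = resolved.replace(new RegExp(`\\b${escaped}\\b`, 'g'), wrapped);
    });
  return resolved;
}
```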

View File

@@ -40,11 +40,11 @@
"@superset-ui/core": "*",
"@apache-superset/core": "*",
"@types/react-redux": "*",
"geostyler": "^18.3.1",
"geostyler": "^14.1.3",
"geostyler-data": "^1.0.0",
"geostyler-openlayers-parser": "^5.4.0",
"geostyler-style": "^11.0.2",
"geostyler-wfs-parser": "^3.0.1",
"geostyler-openlayers-parser": "^4.0.0",
"geostyler-style": "^7.2.0",
"geostyler-wfs-parser": "^2.0.0",
"ol": "^10.8.0",
"polished": "*",
"react": "^17.0.2",

View File

@@ -36,7 +36,6 @@ describe('layerUtil', () => {
describe('createWfsLayer', () => {
test('properly applies style', async () => {
const colorToExpect = '#123456';
const fillColor = '#ff0000';
const wfsLayerConf: WfsLayerConf = {
title: 'osm:osm-fuel',
@@ -62,7 +61,7 @@ describe('layerUtil', () => {
},
{
kind: 'Fill',
color: fillColor,
color: '#000000',
},
],
},
@@ -77,8 +76,8 @@ describe('layerUtil', () => {
expect(style!.length).toEqual(3);
// @ts-expect-error upgrade `ol` package for better type of StyleLike type.
const colorAtLayer = style![2].getFill().getColor();
expect(colorAtLayer).toEqual(fillColor);
const colorAtLayer = style![1].getImage().getFill().getColor();
expect(colorToExpect).toEqual(colorAtLayer);
});
});

View File

@@ -26,7 +26,7 @@
"dependencies": {
"@types/d3-array": "^3.2.2",
"@types/react-redux": "^7.1.34",
"acorn": "^8.16.0",
"acorn": "^8.9.0",
"d3-array": "^3.2.4",
"lodash": "^4.17.23",
"zod": "^4.3.6"

View File

@@ -158,7 +158,7 @@ const defaultFormData: EchartsTimeseriesFormData & {
xAxisTitle: '',
xAxisTitleMargin: 0,
yAxisTitle: '',
yAxisTitleMargin: 15,
yAxisTitleMargin: 0,
yAxisTitlePosition: '',
time_range: 'No filter',
granularity: undefined,

View File

@@ -46,7 +46,7 @@ export const DEFAULT_FORM_DATA: EchartsTimeseriesFormData = {
xAxisTitle: '',
xAxisTitleMargin: 0,
yAxisTitle: '',
yAxisTitleMargin: 15,
yAxisTitleMargin: 0,
yAxisTitlePosition: 'Top',
...DEFAULT_SORT_SERIES_DATA,

View File

@@ -336,7 +336,6 @@ export default function transformProps(
chartProps.rawFormData,
seriesName,
);
const lineStyle: LineStyleOption = {};
if (derivedSeries) {
// Get the time offset for this series to assign different dash patterns
@@ -371,21 +370,7 @@ export default function transformProps(
let colorScaleKey = getOriginalSeries(seriesName, array);
// When there's a single metric with dimensions, the backend replaces the metric
// with the time offset in derived series (e.g., "28 days ago, Medium" instead of
// "SUM(sales), 28 days ago, Medium"). To match colors, strip the metric label
// from original series so both produce the same key (e.g., "Medium").
if (
groupby &&
groupby.length > 0 &&
array.length > 0 &&
metrics?.length === 1
) {
const metricLabel = getMetricLabel(metrics[0]);
colorScaleKey = colorScaleKey.replace(`${metricLabel}, `, '');
}
// If series name exactly matches a time offset (single metric case, no dimensions),
// If series name exactly matches a time offset (single metric case),
// find the original series for color matching
if (derivedSeries && array.includes(seriesName)) {
const originalSeries = rawSeries.find(

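The block removed above stripped the metric-label prefix from the color-scale key when a single metric is combined with dimensions, so that "SUM(sales), Medium" and the backend's derived "28 days ago, Medium" resolve to the same key. A minimal sketch of that matching rule (helper name is hypothetical, not the actual Superset function):

```typescript
// Hypothetical helper mirroring the removed color-scale key logic: with a
// single metric plus groupby dimensions, derived series are keyed by
// offset + dimension (e.g. "28 days ago, Medium"), so the metric label
// prefix is stripped from the original series name to align the keys.
function stripMetricLabel(
  seriesName: string,
  metricLabel: string,
  groupby: string[],
  metricsCount: number,
): string {
  if (groupby.length > 0 && metricsCount === 1) {
    return seriesName.replace(`${metricLabel}, `, '');
  }
  return seriesName;
}

// "SUM(sales), Medium" reduces to "Medium", matching the derived series key.
const key = stripMetricLabel('SUM(sales), Medium', 'SUM(sales)', ['category'], 1);
```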
View File

@@ -104,7 +104,7 @@ export const DEFAULT_TITLE_FORM_DATA: TitleFormData = {
xAxisTitle: '',
xAxisTitleMargin: 0,
yAxisTitle: '',
yAxisTitleMargin: 15,
yAxisTitleMargin: 0,
yAxisTitlePosition: 'Top',
};

View File

@@ -112,7 +112,7 @@ const formData: EchartsMixedTimeseriesFormData = {
yAxisBounds: [undefined, undefined],
yAxisBoundsSecondary: [undefined, undefined],
yAxisTitle: '',
yAxisTitleMargin: 15,
yAxisTitleMargin: 0,
yAxisTitlePosition: '',
yAxisTitleSecondary: '',
zoomable: false,

View File

@@ -187,10 +187,8 @@ const ChartContextMenu = (
canDrillBy &&
isDisplayed(ContextMenuItem.DrillBy) &&
!(
(formData.matrixify_mode_rows !== undefined &&
formData.matrixify_mode_rows !== 'disabled') ||
(formData.matrixify_mode_columns !== undefined &&
formData.matrixify_mode_columns !== 'disabled')
formData.matrixify_enable_vertical_layout === true ||
formData.matrixify_enable_horizontal_layout === true
); // Disable drill by when matrixify is enabled
const datasetResource = useDatasetDrillInfo(

View File

@@ -163,7 +163,7 @@ test('should detect changes in matrixify properties', () => {
...requiredProps.formData,
datasource: '',
viz_type: VizType.Table,
matrixify_mode_rows: 'metrics',
matrixify_enable_vertical_layout: true,
matrixify_dimension_x: { dimension: 'country', values: ['USA'] },
matrixify_dimension_y: { dimension: 'category', values: ['Tech'] },
matrixify_charts_per_row: 3,
@@ -177,9 +177,9 @@ test('should detect changes in matrixify properties', () => {
// Since we can't directly test shouldComponentUpdate, we verify the component
// correctly identifies matrixify-related properties by checking the implementation
expect((initialProps.formData as JsonObject).matrixify_mode_rows).toBe(
'metrics',
);
expect(
(initialProps.formData as JsonObject).matrixify_enable_vertical_layout,
).toBe(true);
expect((initialProps.formData as JsonObject).matrixify_dimension_x).toEqual({
dimension: 'country',
values: ['USA'],
@@ -212,7 +212,7 @@ test('should identify matrixify property changes correctly', () => {
formData: {
datasource: '',
viz_type: VizType.Table,
matrixify_mode_rows: 'metrics',
matrixify_enable_vertical_layout: true,
matrixify_dimension_x: { dimension: 'country', values: ['USA'] },
matrixify_charts_per_row: 3,
},
@@ -234,7 +234,7 @@ test('should identify matrixify property changes correctly', () => {
formData: {
datasource: '',
viz_type: VizType.Table,
matrixify_mode_rows: 'metrics',
matrixify_enable_vertical_layout: true,
matrixify_dimension_x: {
dimension: 'country',
values: ['USA', 'Canada'], // Changed
@@ -277,7 +277,7 @@ test('should handle matrixify-related form data changes', () => {
formData: {
datasource: '',
viz_type: VizType.Table,
matrixify_mode_rows: 'metrics', // This is a significant change
matrixify_enable_vertical_layout: true, // This is a significant change
regular_control: 'value1',
},
};
@@ -296,7 +296,7 @@ test('should detect matrixify property addition', () => {
formData: {
datasource: '',
viz_type: VizType.Table,
matrixify_mode_rows: 'metrics',
matrixify_enable_vertical_layout: true,
// No matrixify_dimension_x initially
},
queriesResponse: [{ data: 'current' } as unknown as JsonObject],
@@ -317,7 +317,7 @@ test('should detect matrixify property addition', () => {
formData: {
datasource: '',
viz_type: VizType.Table,
matrixify_mode_rows: 'metrics',
matrixify_enable_vertical_layout: true,
matrixify_dimension_x: { dimension: 'country', values: ['USA'] }, // Added
},
};
@@ -336,7 +336,7 @@ test('should detect nested matrixify property changes', () => {
formData: {
datasource: '',
viz_type: VizType.Table,
matrixify_mode_rows: 'metrics',
matrixify_enable_vertical_layout: true,
matrixify_dimension_x: {
dimension: 'country',
values: ['USA'],
@@ -361,7 +361,7 @@ test('should detect nested matrixify property changes', () => {
formData: {
datasource: '',
viz_type: VizType.Table,
matrixify_mode_rows: 'metrics',
matrixify_enable_vertical_layout: true,
matrixify_dimension_x: {
dimension: 'country',
values: ['USA'],

View File

@@ -275,10 +275,8 @@ class ChartRenderer extends Component<ChartRendererProps, ChartRendererState> {
const nextFormData = nextProps.formData as JsonObject;
const currentFormData = this.props.formData as JsonObject;
const isMatrixifyEnabled =
(nextFormData.matrixify_mode_rows !== undefined &&
nextFormData.matrixify_mode_rows !== 'disabled') ||
(nextFormData.matrixify_mode_columns !== undefined &&
nextFormData.matrixify_mode_columns !== 'disabled');
nextFormData.matrixify_enable_vertical_layout === true ||
nextFormData.matrixify_enable_horizontal_layout === true;
if (!isMatrixifyEnabled) return false;
// Check all matrixify-related properties

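The hunks above and below replace the string-valued `matrixify_mode_rows`/`matrixify_mode_columns` checks with two explicit boolean flags. The new predicate reduces to a simple OR; a sketch under that assumption (type and helper names are illustrative):

```typescript
// Illustrative shape of the relevant form data fields.
type MatrixifyFormData = {
  matrixify_enable_vertical_layout?: boolean;
  matrixify_enable_horizontal_layout?: boolean;
};

// Matrixify is considered enabled when either layout flag is explicitly true,
// replacing the old comparison of mode strings against 'disabled'.
const isMatrixifyEnabled = (formData: MatrixifyFormData): boolean =>
  formData.matrixify_enable_vertical_layout === true ||
  formData.matrixify_enable_horizontal_layout === true;
```

An undefined or false flag leaves the predicate false, so legacy form data without the new fields keeps drill-by enabled.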
View File

@@ -272,9 +272,9 @@ test('When menu item is clicked, call onSelection with clicked column and drill
);
});
test('matrixify_mode_rows enabled should not render component', () => {
test('matrixify_enable_vertical_layout should not render component', () => {
const { container } = renderSubmenu({
formData: { ...defaultFormData, matrixify_mode_rows: 'metrics' },
formData: { ...defaultFormData, matrixify_enable_vertical_layout: true },
});
expect(container).toBeEmptyDOMElement();
});

View File

@@ -179,10 +179,8 @@ export const DrillBySubmenu = ({
}
if (
(formData.matrixify_mode_rows !== undefined &&
formData.matrixify_mode_rows !== 'disabled') ||
(formData.matrixify_mode_columns !== undefined &&
formData.matrixify_mode_columns !== 'disabled')
formData.matrixify_enable_vertical_layout === true ||
formData.matrixify_enable_horizontal_layout === true
) {
return null;
}

View File

@@ -35,35 +35,36 @@ describe('chart reducers', () => {
});
test('should update endtime on fail', () => {
const newState = chartReducer(charts, actions.chartUpdateStopped(chartKey));
expect(newState[chartKey].chartUpdateEndTime).toBeGreaterThan(0);
expect(newState[chartKey].chartStatus).toEqual('stopped');
});
test('should handle chartUpdateStopped without queryController', () => {
const newState = chartReducer(charts, actions.chartUpdateStopped(chartKey));
expect(newState[chartKey].chartStatus).toEqual('stopped');
expect(newState[chartKey].chartAlert).toContain(
'Updating chart was stopped',
);
expect(newState[chartKey].chartUpdateEndTime).toBeGreaterThan(0);
});
test('chartUpdateStopped sets state correctly', () => {
const chartsWithController = {
[chartKey]: {
...testChart,
queryController: new AbortController(),
},
const controller = new AbortController();
charts[chartKey] = {
...charts[chartKey],
queryController: controller,
};
const newState = chartReducer(
chartsWithController,
actions.chartUpdateStopped(chartKey),
charts,
actions.chartUpdateStopped(chartKey, controller),
);
// Verify the chart status and alert are set
expect(newState[chartKey].chartUpdateEndTime).toBeGreaterThan(0);
expect(newState[chartKey].chartStatus).toEqual('stopped');
expect(newState[chartKey].chartAlert).toContain(
'Updating chart was stopped',
);
});
test('should ignore stopped updates from stale controllers', () => {
const controller = new AbortController();
const staleController = new AbortController();
charts[chartKey] = {
...charts[chartKey],
chartStatus: 'loading',
queryController: controller,
};
const newState = chartReducer(
charts,
actions.chartUpdateStopped(chartKey, staleController),
);
expect(newState[chartKey].chartStatus).toEqual('loading');
expect(newState[chartKey].chartUpdateEndTime).toEqual(
charts[chartKey].chartUpdateEndTime,
);
});

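The stale-controller test above exercises a guard in the reducer: a stop action only applies when the controller it carries matches the one stored on the chart. A minimal sketch of that guard (shapes are assumed, not the actual Superset reducer):

```typescript
// Illustrative chart slice; the real reducer tracks more fields.
type ChartState = {
  chartStatus: string;
  chartUpdateEndTime: number | null;
  queryController?: AbortController;
};

// Apply a chartUpdateStopped action: ignore it when it references a stale
// controller, otherwise mark the chart stopped and stamp the end time.
function applyChartUpdateStopped(
  chart: ChartState,
  controller?: AbortController,
): ChartState {
  if (
    controller &&
    chart.queryController &&
    controller !== chart.queryController
  ) {
    return chart; // stale stop request from a superseded query: no-op
  }
  return { ...chart, chartStatus: 'stopped', chartUpdateEndTime: Date.now() };
}
```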
View File

@@ -55,8 +55,6 @@ type Range = editors.Range;
type Selection = editors.Selection;
type EditorAnnotation = editors.EditorAnnotation;
type CompletionProvider = editors.CompletionProvider;
type ContentChange = editors.ContentChange;
type ContentChangeEvent = editors.ContentChangeEvent;
/**
* Maps EditorLanguage to the corresponding Ace editor component.
@@ -119,14 +117,10 @@ const createAceEditorHandle = (
},
moveCursorToPosition: (position: Position) => {
const editor = aceEditorRef.current?.editor;
if (editor) {
editor.clearSelection();
editor.moveCursorToPosition({
row: position.line,
column: position.column,
});
}
aceEditorRef.current?.editor?.moveCursorToPosition({
row: position.line,
column: position.column,
});
},
getSelections: (): Selection[] => {
@@ -192,33 +186,6 @@ const createAceEditorHandle = (
resize: () => {
aceEditorRef.current?.editor?.resize();
},
onDidChangeContent: (listener, thisArgs?) => {
const editor = aceEditorRef.current?.editor;
if (!editor) return new Disposable(() => {});
const bound = (thisArgs ? listener.bind(thisArgs) : listener) as (
e: ContentChangeEvent,
) => void;
const handler = (delta: {
action: 'insert' | 'remove';
start: { row: number; column: number };
lines: string[];
}) => {
const rangeOffset = editor.session.doc.positionToIndex(delta.start);
const changeText = delta.lines.join(
editor.session.doc.getNewLineCharacter(),
);
const change: ContentChange =
delta.action === 'insert'
? { rangeOffset, rangeLength: 0, text: changeText }
: { rangeOffset, rangeLength: changeText.length, text: '' };
bound({ getValue: () => editor.getValue(), changes: [change] });
};
editor.session.on('change', handler);
return new Disposable(() => {
editor.session.off('change', handler);
});
},
});
/**

View File

@@ -694,12 +694,6 @@ const setSchema: typeof sqlLabApi.setSchema = async (schema: string | null) => {
store.dispatch(queryEditorSetSchema(queryEditor ?? null, schema));
};
const setActivePanel: typeof sqlLabApi.setActivePanel = async (
panelId: string,
) => {
store.dispatch({ type: SET_ACTIVE_SOUTHPANE_TAB, tabId: panelId });
};
export const sqlLab: typeof sqlLabApi = {
CTASMethod,
getActivePanel,
@@ -725,7 +719,6 @@ export const sqlLab: typeof sqlLabApi = {
setDatabase,
setCatalog,
setSchema,
setActivePanel,
};
// Export all models

View File

@@ -630,7 +630,6 @@ const FilterControls: FC<FilterControlsProps> = ({
: undefined
}
forceRender={hasRequiredFirst}
alwaysShowDropdownButton={items.length > 0}
ref={popoverRef}
onOverflowingStateChange={({ overflowed: nextOverflowedIds }) => {
if (

View File

@@ -33,7 +33,6 @@ const TitleArea = styled.div`
justify-content: space-between;
margin: 0;
padding: 0 ${theme.sizeUnit * 2}px ${theme.sizeUnit * 2}px;
padding-bottom: 0; /* Works with other changes in PR https://github.com/apache/superset/pull/38646 to reduce space between filter header and 1st filter */
& > span {
font-size: ${theme.fontSizeLG}px;
@@ -57,8 +56,9 @@ const HeaderButton = styled(Button)`
const Wrapper = styled.div`
${({ theme }) => `
padding: ${theme.sizeUnit * 3}px ${theme.sizeUnit * 2}px;
padding-bottom: 0; /* Works with other changes in PR https://github.com/apache/superset/pull/38646 to reduce space between filter header and 1st filter */
padding: ${theme.sizeUnit * 3}px ${theme.sizeUnit * 2}px ${
theme.sizeUnit
}px;
`}
`;

View File

@@ -119,7 +119,6 @@ const FilterControlsWrapper = styled.div`
flex-direction: column;
gap: ${theme.sizeUnit * 2}px;
padding: ${theme.sizeUnit * 4}px;
padding-top: 0; /* Works with other changes in PR https://github.com/apache/superset/pull/38646 to reduce space between filter header and 1st filter */
// 108px padding to make room for buttons with position: absolute
padding-bottom: ${theme.sizeUnit * 27}px;
`}

View File

@@ -45,16 +45,12 @@ import {
useEffect,
useImperativeHandle,
useMemo,
useRef,
useState,
RefObject,
memo,
} from 'react';
import rison from 'rison';
import {
PluginFilterSelectCustomizeProps,
SelectFilterOperatorType,
} from 'src/filters/components/Select/types';
import { PluginFilterSelectCustomizeProps } from 'src/filters/components/Select/types';
import { useSelector } from 'react-redux';
import { getChartDataRequest } from 'src/components/Chart/chartAction';
import {
@@ -628,49 +624,6 @@ const FiltersConfigForm = (
forceUpdate();
};
const currentOperatorType: SelectFilterOperatorType =
formFilter?.controlValues?.operatorType ??
filterToEdit?.controlValues?.operatorType ??
SelectFilterOperatorType.Exact;
const selectedColumnIsString = useMemo(() => {
const columnName = formFilter?.column;
if (!columnName || !datasetDetails?.columns) return true;
const colMeta = datasetDetails.columns.find(
(c: { column_name: string }) => c.column_name === columnName,
);
if (!colMeta) return true;
return colMeta.type_generic === GenericDataType.String;
}, [formFilter?.column, datasetDetails?.columns]);
const onOperatorTypeChanged = (value: SelectFilterOperatorType) => {
const previous = form.getFieldValue('filters')?.[filterId].controlValues;
setNativeFilterFieldValues(form, filterId, {
controlValues: {
...previous,
operatorType: value,
},
defaultDataMask: null,
});
formChanged();
forceUpdate();
};
const prevColumnRef = useRef(formFilter?.column);
const datasetLoaded = !!datasetDetails?.columns;
useEffect(() => {
const columnChanged = prevColumnRef.current !== formFilter?.column;
if (
(columnChanged || datasetLoaded) &&
!selectedColumnIsString &&
currentOperatorType !== SelectFilterOperatorType.Exact
) {
onOperatorTypeChanged(SelectFilterOperatorType.Exact);
}
prevColumnRef.current = formFilter?.column;
// eslint-disable-next-line react-hooks/exhaustive-deps
}, [formFilter?.column, selectedColumnIsString, datasetLoaded]);
const validatePreFilter = () =>
setTimeout(
() =>
@@ -732,7 +685,6 @@ const FiltersConfigForm = (
'columns.filterable',
'columns.is_dttm',
'columns.type',
'columns.type_generic',
'columns.verbose_name',
'database.id',
'database.database_name',
@@ -1549,67 +1501,6 @@ const FiltersConfigForm = (
hidden
initialValue={null}
/>
{!isChartCustomization &&
itemTypeField === 'filter_select' && (
<StyledRowFormItem
expanded={expanded}
name={[
'filters',
filterId,
'controlValues',
'operatorType',
]}
initialValue={currentOperatorType}
label={
<>
<StyledLabel>{t('Match type')}</StyledLabel>
&nbsp;
<InfoTooltip
placement="top"
tooltip={t(
'Warning: ILIKE queries may be slow on large datasets as they cannot use indexes effectively.',
)}
/>
</>
}
>
<Select
ariaLabel={t('Match type')}
options={[
{
value: SelectFilterOperatorType.Exact,
label: t('Exact match (IN)'),
},
...(selectedColumnIsString
? [
{
value:
SelectFilterOperatorType.Contains,
label: t(
'Contains text (ILIKE %x%)',
),
},
{
value:
SelectFilterOperatorType.StartsWith,
label: t('Starts with (ILIKE x%)'),
},
{
value:
SelectFilterOperatorType.EndsWith,
label: t('Ends with (ILIKE %x)'),
},
]
: []),
]}
onChange={value => {
onOperatorTypeChanged(
value as SelectFilterOperatorType,
);
}}
/>
</StyledRowFormItem>
)}
<FormItem
name={['filters', filterId, 'defaultValue']}
>

Some files were not shown because too many files have changed in this diff.