Compare commits

..

13 Commits

Author SHA1 Message Date
Maxime Beauchemin
235d4ea516 chore: trigger Showtime environment for QA testing
2026-04-15 15:44:46 +00:00
Maxime Beauchemin
860f8cbe0f fix(explore): remove flaky ag-grid header text assertion in test
ag-grid's custom header component doesn't expose header text as
simple text nodes in JSDOM. Replace with a simpler assertion that
verifies the grid container renders without crashes.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-04-09 15:52:02 +00:00
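
A minimal sketch of the replacement assertion described above, assuming React Testing Library with jest-dom; the component name, props, and selector are illustrative, not the actual test code:

```tsx
import React from 'react';
import { render } from '@testing-library/react';

// Hypothetical stand-ins for the real component and props under test.
declare const ResultsPane: React.ComponentType<Record<string, unknown>>;
declare const defaultProps: Record<string, unknown>;

test('renders the results grid without crashing', () => {
  const { container } = render(<ResultsPane {...defaultProps} />);
  // ag-grid's custom header component doesn't expose header text as
  // plain text nodes under JSDOM, so assert on the grid container
  // instead. '.ag-root-wrapper' is ag-grid's outermost wrapper element.
  expect(container.querySelector('.ag-root-wrapper')).toBeInTheDocument();
});
```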
Maxime Beauchemin
2fad87569c fix(explore): resolve CI failures for GridTable migration
- Fix TS2345 in SamplesPane: cast queryFormData for getDrillPayload
- Fix TS2345 in useResultsPane: use Number() for row_limit type coercion
- Update DrillByModal tests: remove pagination/sort-header assertions
  that relied on old TableView DOM; ag-grid virtualizes instead
- Fix backend test: update per_page validation test to use 10001
  (schema max is now 10000, not 1000)
- Apply prettier formatting to useGridResultTable

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-04-09 01:07:11 +00:00
Maxime Beauchemin
c6f54471dc fix(explore): cap Results row limit at chart's row_limit setting
Both tabs now share the same ROW_LIMIT_OPTIONS (100, 500, 1k, 5k, 10k).
The Results dropdown never overrides the chart's row_limit upward —
effective limit is min(dropdown, chart_row_limit). The Samples dropdown
has no override logic since it uses its own independent API.

Backend schema max bumped to 10000 to support higher sample limits.
The SAMPLES_ROW_LIMIT config (default 1000) still acts as the
server-side cap.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-04-09 00:31:16 +00:00
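
The capping rule described above reduces to taking the minimum of the two limits. A sketch with illustrative names:

```typescript
// Shared by the Results and Samples dropdowns per the commit message.
const ROW_LIMIT_OPTIONS = [100, 500, 1000, 5000, 10000];

// The Results dropdown can lower the effective limit but never raise
// it past the chart's own row_limit.
function getEffectiveRowLimit(
  dropdownLimit: number,
  chartRowLimit?: number,
): number {
  return chartRowLimit === undefined
    ? dropdownLimit
    : Math.min(dropdownLimit, chartRowLimit);
}
```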
Maxime Beauchemin
7539138702 fix(explore): add row limit selector to Results tab, fix padding
- Add row limit dropdown to Results tab (options: 100, 500, 1k, 5k, 10k,
  default 1k) — same pattern as Samples but with higher limits
- Override queryFormData.row_limit before fetching chart results so the
  backend respects the selected limit
- Add padding-top to TableControlsWrapper so the search input isn't
  pressed against the tab bar
- Make row limit options configurable per-consumer (SAMPLES_ROW_LIMIT_OPTIONS
  vs RESULTS_ROW_LIMIT_OPTIONS)

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-04-09 00:08:25 +00:00
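
A sketch of the override described above, with hypothetical helper names; the key point is that row_limit is replaced on a copy of the form data before the fetch:

```typescript
// Per-consumer option lists, as named in the commit message.
export const SAMPLES_ROW_LIMIT_OPTIONS = [100, 500, 1000, 5000, 10000];
export const RESULTS_ROW_LIMIT_OPTIONS = [100, 500, 1000, 5000, 10000];

// Override row_limit on a shallow copy so the backend honors the
// dropdown selection without mutating the chart's own form data.
function withRowLimit<T extends { row_limit?: number }>(
  queryFormData: T,
  rowLimit: number,
): T {
  return { ...queryFormData, row_limit: rowLimit };
}
```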
Maxime Beauchemin
e0b1b557d7 fix(explore): cap row limit options at 1k, hide redundant row count
- Remove 5k/10k options since backend SAMPLES_ROW_LIMIT defaults to
  1000 and caps higher values silently
- Revert backend schema max back to 1000
- Only show the row count badge when the returned count is less than the
  selected limit (avoids showing "1k rows" dropdown next to "1k rows"
  badge)

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-04-08 23:51:47 +00:00
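
The badge rule above is a one-line predicate; a hypothetical helper:

```typescript
// Show the row-count badge only when the returned count is below the
// selected limit; otherwise it would just repeat the dropdown value.
function shouldShowRowCount(rowCount: number, selectedLimit: number): boolean {
  return rowCount < selectedLimit;
}
```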
Maxime Beauchemin
bc5a5c2ac5 fix(explore): apply chart filters to Samples tab queries
The Samples tab was sending an empty payload {} to the samples API,
ignoring all chart filters (WHERE clause, time range, adhoc filters).
This was a pre-existing regression.

Use getDrillPayload() to extract filters, granularity, time_range, and
extras from the chart's queryFormData and pass them to the samples
endpoint. Also switch the cache from WeakSet<datasource> to
WeakMap<queryFormData> so samples re-fetch when filters change.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-04-08 23:49:05 +00:00
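
A sketch of the cache switch described above (types and helper names are illustrative). The WeakSet could only record that a datasource had been fetched once; keying a WeakMap on the queryFormData object makes a new form-data object (i.e. changed filters) a cache miss:

```typescript
type SamplesResponse = { data: Record<string, unknown>[] };

const samplesCache = new WeakMap<object, SamplesResponse>();

async function getSamples(
  queryFormData: object,
  fetchSamples: (fd: object) => Promise<SamplesResponse>,
): Promise<SamplesResponse> {
  // Same queryFormData object => cached result; changed filters produce
  // a new object, so the lookup misses and samples are re-fetched.
  const cached = samplesCache.get(queryFormData);
  if (cached) return cached;
  const result = await fetchSamples(queryFormData);
  samplesCache.set(queryFormData, result);
  return result;
}
```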
Maxime Beauchemin
3a562dbe29 feat(explore): add row limit selector to Samples tab
Default to 100 rows instead of 1000 to improve initial load performance,
especially for wide datasets. Users can increase to 500, 1k, 5k, or 10k
via a dropdown selector in the controls bar.

Also bumps the backend schema validation max from 1000 to 10000 to
support the higher limits. The SAMPLES_ROW_LIMIT config still acts as
the server-side cap (default 1000).

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-04-08 23:39:31 +00:00
Maxime Beauchemin
73b780a28c fix(explore): use callback ref for ResizeObserver to fix grid height
The useGridHeight hook used useEffect with [] deps, which only runs once
on mount. In SamplesPane, the GridSizer element doesn't exist at mount
time (component renders <Loading /> first), so the ResizeObserver was
never created and gridHeight stayed at the 400px fallback forever.

Switch to a callback ref pattern so the ResizeObserver is created when
the element actually mounts in the DOM. Also guard against 0-height
measurements from hidden tabs.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-04-08 22:44:57 +00:00
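
A minimal sketch of the callback-ref pattern described above; the hook name and 400px fallback come from the commit message, the body is illustrative:

```typescript
import { useCallback, useRef, useState } from 'react';

function useGridHeight(fallback = 400) {
  const [gridHeight, setGridHeight] = useState(fallback);
  const observerRef = useRef<ResizeObserver | null>(null);

  // Unlike useEffect with [] deps (which runs once at mount, before the
  // sizer exists when <Loading /> renders first), a callback ref fires
  // whenever the element actually mounts or unmounts.
  const sizerRef = useCallback((node: HTMLElement | null) => {
    observerRef.current?.disconnect();
    if (node) {
      observerRef.current = new ResizeObserver(entries => {
        const height = entries[0]?.contentRect.height ?? 0;
        // Guard against 0-height measurements from hidden tabs.
        if (height > 0) setGridHeight(height);
      });
      observerRef.current.observe(node);
    }
  }, []);

  return { gridHeight, sizerRef };
}
```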
Maxime Beauchemin
caeb6a6b7c fix(explore): fix grid height measurement with absolute positioning
The ResizeObserver approach had a circular dependency: GridTable needs
an explicit pixel height, but the container's height comes from flex
layout. The grid's initial 300px default overflowed the flex container.

Fix by using position: absolute + inset: 0 on an inner sizer element.
The sizer fills its relative-positioned parent (whose size comes from
flex), and ResizeObserver measures the sizer to get the correct height
for GridTable. This decouples the measurement from the content.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-04-08 22:36:42 +00:00
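
A sketch of the sizer structure described above (component and prop names are illustrative). The relative parent is sized by flex; the absolutely positioned sizer fills it without influencing its size, which breaks the circular dependency:

```tsx
import React from 'react';

function GridContainer({ sizerRef, gridHeight, children }: {
  sizerRef: (node: HTMLElement | null) => void;
  gridHeight: number;
  children: React.ReactNode;
}) {
  return (
    // Parent gets its size from the surrounding flex layout.
    <div style={{ position: 'relative', flex: 1 }}>
      {/* Sizer fills the parent; ResizeObserver measures this element. */}
      <div ref={sizerRef} style={{ position: 'absolute', inset: 0 }} />
      {/* The grid receives the measured explicit pixel height. */}
      <div style={{ height: gridHeight }}>{children}</div>
    </div>
  );
}
```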
Maxime Beauchemin
19072074c5 refactor(explore): extract shared grid hooks, fix drill-by height, clean up unused props
- Extract useGridColumns, useKeywordFilter, useGridHeight into shared
  useGridResultTable hook to eliminate duplication between SamplesPane
  and SingleQueryResultPane
- Wrap SingleQueryResultPane in a flex container so GridTable gets
  proper height in both Explore (flex parent) and drill-by (modal) contexts
- Update drill-by useResultsTableView to use flex-based ResultContainer
- Remove unused props: dataSize, isPaginationSticky from types and callers
- Fix drill-by tests for ag-grid DOM structure
- Use proper ag-grid IRowNode type instead of any

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-04-08 22:04:19 +00:00
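
As an illustration of the extraction, here is what one of the shared hooks might look like; a plausible sketch of useKeywordFilter, not the actual implementation:

```typescript
import { useMemo, useState } from 'react';

function useKeywordFilter<T extends Record<string, unknown>>(rows: T[]) {
  const [keyword, setKeyword] = useState('');
  const filteredRows = useMemo(() => {
    if (!keyword) return rows;
    const needle = keyword.toLowerCase();
    // Case-insensitive match against every cell in the row.
    return rows.filter(row =>
      Object.values(row).some(v =>
        String(v ?? '').toLowerCase().includes(needle),
      ),
    );
  }, [rows, keyword]);
  return { keyword, setKeyword, filteredRows };
}
```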
Maxime Beauchemin
f2037fa332 perf(explore): replace TableView with GridTable in SingleQueryResultPane
Apply the same virtualization fix to the Results tab — same root cause as the
Samples tab: TableView renders all columns without virtualization, freezing the
browser on wide datasets.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-04-08 21:45:35 +00:00
Maxime Beauchemin
6c71800436 perf(explore): replace TableView with GridTable in SamplesPane for virtualized rendering
The Samples tab in Explore froze the browser for ~30s on datasets with many
columns because TableView (react-table) renders all columns in the DOM without
virtualization. Switch to GridTable (ag-grid) which provides both row and column
virtualization out of the box, eliminating the freeze.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-04-08 20:58:24 +00:00
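
For context, an illustrative minimal ag-grid setup of the kind GridTable presumably wraps; row and column virtualization are ag-grid defaults, so only visible cells are rendered to the DOM:

```tsx
import React from 'react';
import { AgGridReact } from 'ag-grid-react';

function SamplesGrid({ rows }: { rows: Record<string, unknown>[] }) {
  // Derive one column per key; ag-grid virtualizes columns, so wide
  // datasets no longer freeze the browser the way react-table did.
  const columnDefs = Object.keys(rows[0] ?? {}).map(field => ({ field }));
  return (
    <div className="ag-theme-alpine" style={{ height: 400 }}>
      <AgGridReact rowData={rows} columnDefs={columnDefs} />
    </div>
  );
}
```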
959 changed files with 20067 additions and 86596 deletions

.github/CODEOWNERS (5 lines changed)

@@ -22,11 +22,6 @@
/.github/ @villebro @geido @eschutho @rusackas @betodealmeida @nytai @mistercrunch @craig-rueda @kgabryje @dpgaspar @sadpandajoe @hainenber
# Notify PMC members of changes to CI-executed scripts (supply-chain risk:
# scripts/ files run directly in CI workflows and can execute arbitrary code)
/scripts/ @villebro @geido @eschutho @rusackas @betodealmeida @nytai @mistercrunch @craig-rueda @kgabryje @dpgaspar @sadpandajoe @hainenber
# Notify PMC members of changes to required GitHub Actions
/.asf.yaml @villebro @geido @eschutho @rusackas @betodealmeida @nytai @mistercrunch @craig-rueda @kgabryje @dpgaspar @Antonio-RiveroMartnez


@@ -4,10 +4,6 @@ updates:
- package-ecosystem: "github-actions"
directory: "/"
ignore:
# Ignore temporarily as release schedule is too mentally taxing for dep-handling maintainers
# Additionally, very few PRs are reviewed by this action.
- dependency-name: anthropics/claude-code-action
schedule:
interval: "daily"
@@ -37,10 +33,6 @@ updates:
# `just-handlebars-helpers` library in plugin-chart-handlebars requires `currencyformatter.js` to be < 2
- dependency-name: "currencyformatter.js"
update-types: ["version-update:semver-major"]
# TODO: remove below clause once https://github.com/pmmmwh/react-refresh-webpack-plugin/pull/940 lands onto a future release
# and confirm the issue https://github.com/apache/superset/issues/39600 is fixed
- dependency-name: "react-checkbox-tree"
update-types: ["version-update:semver-major"]
groups:
storybook:
applies-to: version-updates
@@ -59,13 +51,15 @@ updates:
versioning-strategy: increase
- package-ecosystem: "pip"
directory: "/"
# NOTE: `uv` support is in beta, more details here:
# https://github.com/dependabot/dependabot-core/pull/10040#issuecomment-2696978430
- package-ecosystem: "uv"
directory: "requirements/"
open-pull-requests-limit: 10
schedule:
interval: "weekly"
labels:
- pip
- uv
- dependabot
- package-ecosystem: "npm"

.github/labeler.yml (15 lines changed)

@@ -17,11 +17,6 @@
- any-glob-to-any-file:
- 'superset/migrations/**'
"risk:ci-script":
- changed-files:
- any-glob-to-any-file:
- 'scripts/**'
############################################
# Dependencies
############################################
@@ -77,11 +72,6 @@
- any-glob-to-any-file:
- 'superset/translations/zh/**'
"i18n:czech":
- changed-files:
- any-glob-to-any-file:
- 'superset/translations/cs/**'
"i18n:traditional-chinese":
- changed-files:
- any-glob-to-any-file:
@@ -127,11 +117,6 @@
- any-glob-to-any-file:
- 'superset/translations/sk/**'
"i18n:latvian":
- changed-files:
- any-glob-to-any-file:
- 'superset/translations/lv/**'
"i18n:ukrainian":
- changed-files:
- any-glob-to-any-file:


@@ -127,20 +127,6 @@ playwright_testdata() {
superset load_test_users
superset load_examples
superset init
# Enable DML on the examples database so Playwright tests can create/drop
# temporary tables via SQL Lab without depending on external data sources.
superset shell <<'PYEOF'
import sys
from superset.extensions import db
from superset.models.core import Database
examples_db = db.session.query(Database).filter_by(database_name='examples').first()
if not examples_db:
sys.exit('ERROR: examples database not found. load_examples may have failed.')
examples_db.allow_dml = True
db.session.commit()
print('Enabled allow_dml on examples database')
PYEOF
say "::endgroup::"
}


@@ -27,7 +27,7 @@ jobs:
- name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6
- name: Check and notify
uses: actions/github-script@3a2844b7e9c422d3c10d287c895573f7108da1b3 # v9.0.0
uses: actions/github-script@ed597411d8f924073f98dfc5c65a23a2325f34cd # v8
with:
github-token: ${{ github.token }}
script: |


@@ -44,7 +44,7 @@ jobs:
pull-requests: write
steps:
- name: Comment access denied
uses: actions/github-script@3a2844b7e9c422d3c10d287c895573f7108da1b3 # v9.0.0
uses: actions/github-script@ed597411d8f924073f98dfc5c65a23a2325f34cd # v8
with:
script: |
const message = `👋 Hi @${{ github.event.comment.user.login || github.event.review.user.login || github.event.issue.user.login }}!
@@ -76,7 +76,7 @@ jobs:
fetch-depth: 1
- name: Run Claude PR Action
uses: anthropics/claude-code-action@5fb899572b81d2bb648d4d187173a2f423a9677c # beta
uses: anthropics/claude-code-action@6e2bd52842c65e914eba5c8badd17560bd26b5de # beta
with:
anthropic_api_key: ${{ secrets.ANTHROPIC_API_KEY }}
timeout_minutes: "60"


@@ -31,7 +31,7 @@ jobs:
working-directory: superset-embedded-sdk
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6
- uses: actions/setup-node@48b55a011bda9f5d6aeb4c2d9c7362e8dae4041e # v6
- uses: actions/setup-node@53b83947a5a98c8d113130e565377fae1a50d02f # v6
with:
node-version-file: './superset-embedded-sdk/.nvmrc'
registry-url: 'https://registry.npmjs.org'


@@ -19,7 +19,7 @@ jobs:
working-directory: superset-embedded-sdk
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6
- uses: actions/setup-node@48b55a011bda9f5d6aeb4c2d9c7362e8dae4041e # v6
- uses: actions/setup-node@53b83947a5a98c8d113130e565377fae1a50d02f # v6
with:
node-version-file: './superset-embedded-sdk/.nvmrc'
registry-url: 'https://registry.npmjs.org'


@@ -58,7 +58,7 @@ jobs:
- name: Login to Amazon ECR
if: steps.describe-services.outputs.active == 'true'
id: login-ecr
uses: aws-actions/amazon-ecr-login@19d944daaa35f0fa1d3f7f8af1d3f2e5de25c5b7 # v2
uses: aws-actions/amazon-ecr-login@f2e9fc6c2b355c1890b65e6f6f0e2ac3e6e22f78 # v2
- name: Delete ECR image tag
if: steps.describe-services.outputs.active == 'true'
@@ -71,7 +71,7 @@ jobs:
- name: Comment (success)
if: steps.describe-services.outputs.active == 'true'
uses: actions/github-script@3a2844b7e9c422d3c10d287c895573f7108da1b3 # v9.0.0
uses: actions/github-script@ed597411d8f924073f98dfc5c65a23a2325f34cd # v8
with:
github-token: ${{github.token}}
script: |


@@ -65,7 +65,7 @@ jobs:
- name: Get event SHA
id: get-sha
if: steps.eval-label.outputs.result == 'up'
uses: actions/github-script@3a2844b7e9c422d3c10d287c895573f7108da1b3 # v9.0.0
uses: actions/github-script@ed597411d8f924073f98dfc5c65a23a2325f34cd # v8
with:
github-token: ${{ secrets.GITHUB_TOKEN }}
script: |
@@ -96,7 +96,7 @@ jobs:
core.setOutput("sha", prSha);
- name: Looking for feature flags in PR description
uses: actions/github-script@3a2844b7e9c422d3c10d287c895573f7108da1b3 # v9.0.0
uses: actions/github-script@ed597411d8f924073f98dfc5c65a23a2325f34cd # v8
id: eval-feature-flags
if: steps.eval-label.outputs.result == 'up'
with:
@@ -118,7 +118,7 @@ jobs:
return results;
- name: Reply with confirmation comment
uses: actions/github-script@3a2844b7e9c422d3c10d287c895573f7108da1b3 # v9.0.0
uses: actions/github-script@ed597411d8f924073f98dfc5c65a23a2325f34cd # v8
if: steps.eval-label.outputs.result == 'up'
with:
github-token: ${{ secrets.GITHUB_TOKEN }}
@@ -199,7 +199,7 @@ jobs:
- name: Login to Amazon ECR
id: login-ecr
uses: aws-actions/amazon-ecr-login@19d944daaa35f0fa1d3f7f8af1d3f2e5de25c5b7 # v2
uses: aws-actions/amazon-ecr-login@f2e9fc6c2b355c1890b65e6f6f0e2ac3e6e22f78 # v2
- name: Load, tag and push image to ECR
id: push-image
@@ -235,7 +235,7 @@ jobs:
- name: Login to Amazon ECR
id: login-ecr
uses: aws-actions/amazon-ecr-login@19d944daaa35f0fa1d3f7f8af1d3f2e5de25c5b7 # v2
uses: aws-actions/amazon-ecr-login@f2e9fc6c2b355c1890b65e6f6f0e2ac3e6e22f78 # v2
- name: Check target image exists in ECR
id: check-image
@@ -250,7 +250,7 @@ jobs:
- name: Fail on missing container image
if: steps.check-image.outcome == 'failure'
uses: actions/github-script@3a2844b7e9c422d3c10d287c895573f7108da1b3 # v9.0.0
uses: actions/github-script@ed597411d8f924073f98dfc5c65a23a2325f34cd # v8
with:
github-token: ${{ github.token }}
script: |
@@ -324,7 +324,7 @@ jobs:
echo "ip=$(aws ec2 describe-network-interfaces --network-interface-ids ${{ steps.get-eni.outputs.eni }} | jq -r '.NetworkInterfaces | first | .Association.PublicIp')" >> $GITHUB_OUTPUT
- name: Comment (success)
if: ${{ success() }}
uses: actions/github-script@3a2844b7e9c422d3c10d287c895573f7108da1b3 # v9.0.0
uses: actions/github-script@ed597411d8f924073f98dfc5c65a23a2325f34cd # v8
with:
github-token: ${{github.token}}
script: |
@@ -337,7 +337,7 @@ jobs:
});
- name: Comment (failure)
if: ${{ failure() }}
uses: actions/github-script@3a2844b7e9c422d3c10d287c895573f7108da1b3 # v9.0.0
uses: actions/github-script@ed597411d8f924073f98dfc5c65a23a2325f34cd # v8
with:
github-token: ${{github.token}}
script: |


@@ -17,7 +17,7 @@ jobs:
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6
- name: Set up Node.js
uses: actions/setup-node@48b55a011bda9f5d6aeb4c2d9c7362e8dae4041e # v6
uses: actions/setup-node@53b83947a5a98c8d113130e565377fae1a50d02f # v6
with:
node-version: '20'


@@ -17,7 +17,7 @@ jobs:
runs-on: ubuntu-24.04
steps:
- name: Check for 'hold' label
uses: actions/github-script@3a2844b7e9c422d3c10d287c895573f7108da1b3 # v9.0.0
uses: actions/github-script@ed597411d8f924073f98dfc5c65a23a2325f34cd # v8
with:
github-token: ${{secrets.GITHUB_TOKEN}}
script: |


@@ -42,7 +42,7 @@ jobs:
echo "HOMEBREW_REPOSITORY=$HOMEBREW_REPOSITORY" >>"${GITHUB_ENV}"
brew install norwoodj/tap/helm-docs
- name: Setup Node.js
uses: actions/setup-node@48b55a011bda9f5d6aeb4c2d9c7362e8dae4041e # v6
uses: actions/setup-node@53b83947a5a98c8d113130e565377fae1a50d02f # v6
with:
node-version: '20'
@@ -57,7 +57,7 @@ jobs:
yarn install --immutable
- name: Cache pre-commit environments
uses: actions/cache@27d5ce7f107fe9357f9df03efb73ab90386fccae # v5
uses: actions/cache@668228422ae6a00e4ad889ee87cd7109ec5666a7 # v5
with:
path: ~/.cache/pre-commit
key: pre-commit-v2-${{ runner.os }}-py${{ matrix.python-version }}-${{ hashFiles('.pre-commit-config.yaml') }}


@@ -44,13 +44,13 @@ jobs:
- name: Install Node.js
if: env.HAS_TAGS
uses: actions/setup-node@48b55a011bda9f5d6aeb4c2d9c7362e8dae4041e # v6
uses: actions/setup-node@53b83947a5a98c8d113130e565377fae1a50d02f # v6
with:
node-version-file: './superset-frontend/.nvmrc'
- name: Cache npm
if: env.HAS_TAGS
uses: actions/cache@27d5ce7f107fe9357f9df03efb73ab90386fccae # v5
uses: actions/cache@668228422ae6a00e4ad889ee87cd7109ec5666a7 # v5
with:
path: ~/.npm # npm cache files are stored in `~/.npm` on Linux/macOS
key: ${{ runner.OS }}-node-${{ hashFiles('**/package-lock.json') }}
@@ -64,7 +64,7 @@ jobs:
run: echo "dir=$(npm config get cache)" >> $GITHUB_OUTPUT
- name: Cache npm
if: env.HAS_TAGS
uses: actions/cache@27d5ce7f107fe9357f9df03efb73ab90386fccae # v5
uses: actions/cache@668228422ae6a00e4ad889ee87cd7109ec5666a7 # v5
id: npm-cache # use this to check for `cache-hit` (`steps.npm-cache.outputs.cache-hit != 'true'`)
with:
path: ${{ steps.npm-cache-dir-path.outputs.dir }}


@@ -37,7 +37,7 @@ jobs:
steps:
- name: Security Check - Authorize Maintainers Only
id: auth
uses: actions/github-script@3a2844b7e9c422d3c10d287c895573f7108da1b3 # v9.0.0
uses: actions/github-script@ed597411d8f924073f98dfc5c65a23a2325f34cd # v8
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:


@@ -46,7 +46,7 @@ jobs:
persist-credentials: false
submodules: recursive
- name: Set up Node.js
uses: actions/setup-node@48b55a011bda9f5d6aeb4c2d9c7362e8dae4041e # v6
uses: actions/setup-node@53b83947a5a98c8d113130e565377fae1a50d02f # v6
with:
node-version-file: './docs/.nvmrc'
- name: Setup Python
@@ -70,7 +70,7 @@ jobs:
yarn install --check-cache
- name: Download database diagnostics (if triggered by integration tests)
if: github.event_name == 'workflow_run' && github.event.workflow_run.conclusion == 'success'
uses: dawidd6/action-download-artifact@b6e2e70617bc3265edd6dab6c906732b2f1ae151 # v21
uses: dawidd6/action-download-artifact@8305c0f1062bb0d184d09ef4493ecb9288447732 # v20
continue-on-error: true
with:
workflow: superset-python-integrationtest.yml
@@ -79,7 +79,7 @@ jobs:
path: docs/src/data/
- name: Try to download latest diagnostics (for push/dispatch triggers)
if: github.event_name != 'workflow_run'
uses: dawidd6/action-download-artifact@b6e2e70617bc3265edd6dab6c906732b2f1ae151 # v21
uses: dawidd6/action-download-artifact@8305c0f1062bb0d184d09ef4493ecb9288447732 # v20
continue-on-error: true
with:
workflow: superset-python-integrationtest.yml


@@ -72,7 +72,7 @@ jobs:
persist-credentials: false
submodules: recursive
- name: Set up Node.js
uses: actions/setup-node@48b55a011bda9f5d6aeb4c2d9c7362e8dae4041e # v6
uses: actions/setup-node@53b83947a5a98c8d113130e565377fae1a50d02f # v6
with:
node-version-file: './docs/.nvmrc'
- name: yarn install
@@ -104,14 +104,14 @@ jobs:
persist-credentials: false
submodules: recursive
- name: Set up Node.js
uses: actions/setup-node@48b55a011bda9f5d6aeb4c2d9c7362e8dae4041e # v6
uses: actions/setup-node@53b83947a5a98c8d113130e565377fae1a50d02f # v6
with:
node-version-file: './docs/.nvmrc'
- name: yarn install
run: |
yarn install --check-cache
- name: Download database diagnostics from integration tests
uses: dawidd6/action-download-artifact@b6e2e70617bc3265edd6dab6c906732b2f1ae151 # v21
uses: dawidd6/action-download-artifact@8305c0f1062bb0d184d09ef4493ecb9288447732 # v20
with:
workflow: superset-python-integrationtest.yml
run_id: ${{ github.event.workflow_run.id }}


@@ -109,7 +109,7 @@ jobs:
run: testdata
- name: Setup Node.js
if: steps.check.outputs.python || steps.check.outputs.frontend
uses: actions/setup-node@48b55a011bda9f5d6aeb4c2d9c7362e8dae4041e # v6
uses: actions/setup-node@53b83947a5a98c8d113130e565377fae1a50d02f # v6
with:
node-version-file: './superset-frontend/.nvmrc'
- name: Install npm dependencies
@@ -146,7 +146,7 @@ jobs:
SAFE_APP_ROOT=${APP_ROOT//\//_}
echo "safe_app_root=$SAFE_APP_ROOT" >> $GITHUB_OUTPUT
- name: Upload Artifacts
uses: actions/upload-artifact@043fb46d1a93c77aae656e7c1c64a875d1fc6a0a # v7
uses: actions/upload-artifact@bbbca2ddaa5d8feaa63e36b76fdaad77386f024f # v7
if: failure()
with:
path: ${{ github.workspace }}/superset-frontend/cypress-base/cypress/screenshots
@@ -226,7 +226,7 @@ jobs:
run: playwright_testdata
- name: Setup Node.js
if: steps.check.outputs.python || steps.check.outputs.frontend
uses: actions/setup-node@48b55a011bda9f5d6aeb4c2d9c7362e8dae4041e # v6
uses: actions/setup-node@53b83947a5a98c8d113130e565377fae1a50d02f # v6
with:
node-version-file: './superset-frontend/.nvmrc'
- name: Install npm dependencies
@@ -259,7 +259,7 @@ jobs:
SAFE_APP_ROOT=${APP_ROOT//\//_}
echo "safe_app_root=$SAFE_APP_ROOT" >> $GITHUB_OUTPUT
- name: Upload Playwright Artifacts
uses: actions/upload-artifact@043fb46d1a93c77aae656e7c1c64a875d1fc6a0a # v7
uses: actions/upload-artifact@bbbca2ddaa5d8feaa63e36b76fdaad77386f024f # v7
if: failure()
with:
path: |


@@ -58,7 +58,7 @@ jobs:
- name: Upload HTML coverage report
if: steps.check.outputs.superset-extensions-cli
uses: actions/upload-artifact@043fb46d1a93c77aae656e7c1c64a875d1fc6a0a # v7
uses: actions/upload-artifact@bbbca2ddaa5d8feaa63e36b76fdaad77386f024f # v7
with:
name: superset-extensions-cli-coverage-html
path: htmlcov/


@@ -58,7 +58,7 @@ jobs:
- name: Upload Docker Image Artifact
if: steps.check.outputs.frontend
uses: actions/upload-artifact@043fb46d1a93c77aae656e7c1c64a875d1fc6a0a # v7
uses: actions/upload-artifact@bbbca2ddaa5d8feaa63e36b76fdaad77386f024f # v7
with:
name: docker-image
path: docker-image.tar.zst
@@ -91,7 +91,7 @@ jobs:
"npm run test -- --coverage --shard=${{ matrix.shard }}/8 --coverageReporters=json"
- name: Upload Coverage Artifact
uses: actions/upload-artifact@043fb46d1a93c77aae656e7c1c64a875d1fc6a0a # v7
uses: actions/upload-artifact@bbbca2ddaa5d8feaa63e36b76fdaad77386f024f # v7
with:
name: coverage-artifacts-${{ matrix.shard }}
path: superset-frontend/coverage


@@ -101,7 +101,7 @@ jobs:
CR_RELEASE_NAME_TEMPLATE: "superset-helm-chart-{{ .Version }}"
- name: Open Pull Request
uses: actions/github-script@3a2844b7e9c422d3c10d287c895573f7108da1b3 # v9.0.0
uses: actions/github-script@ed597411d8f924073f98dfc5c65a23a2325f34cd # v8
with:
script: |
const branchName = '${{ env.branch_name }}';


@@ -100,7 +100,7 @@ jobs:
run: playwright_testdata
- name: Setup Node.js
if: steps.check.outputs.python || steps.check.outputs.frontend
uses: actions/setup-node@48b55a011bda9f5d6aeb4c2d9c7362e8dae4041e # v6
uses: actions/setup-node@53b83947a5a98c8d113130e565377fae1a50d02f # v6
with:
node-version-file: './superset-frontend/.nvmrc'
- name: Install npm dependencies
@@ -133,7 +133,7 @@ jobs:
SAFE_APP_ROOT=${APP_ROOT//\//_}
echo "safe_app_root=$SAFE_APP_ROOT" >> $GITHUB_OUTPUT
- name: Upload Playwright Artifacts
uses: actions/upload-artifact@043fb46d1a93c77aae656e7c1c64a875d1fc6a0a # v7
uses: actions/upload-artifact@bbbca2ddaa5d8feaa63e36b76fdaad77386f024f # v7
if: failure()
with:
path: |


@@ -101,7 +101,7 @@ jobs:
"
- name: Upload database diagnostics artifact
if: steps.check.outputs.python
uses: actions/upload-artifact@043fb46d1a93c77aae656e7c1c64a875d1fc6a0a # v7
uses: actions/upload-artifact@bbbca2ddaa5d8feaa63e36b76fdaad77386f024f # v7
with:
name: database-diagnostics
path: databases-diagnostics.json


@@ -31,7 +31,7 @@ jobs:
- name: Setup Node.js
if: steps.check.outputs.frontend
uses: actions/setup-node@48b55a011bda9f5d6aeb4c2d9c7362e8dae4041e # v6
uses: actions/setup-node@53b83947a5a98c8d113130e565377fae1a50d02f # v6
with:
node-version-file: './superset-frontend/.nvmrc'
- name: Install dependencies


@@ -26,7 +26,7 @@ jobs:
steps:
- name: Quickly add thumbs up!
if: github.event_name == 'issue_comment' && contains(github.event.comment.body, '@supersetbot')
uses: actions/github-script@3a2844b7e9c422d3c10d287c895573f7108da1b3 # v9.0.0
uses: actions/github-script@ed597411d8f924073f98dfc5c65a23a2325f34cd # v8
with:
script: |
const [owner, repo] = process.env.GITHUB_REPOSITORY.split('/')


@@ -62,7 +62,7 @@ jobs:
build: "true"
- name: Use Node.js 20
uses: actions/setup-node@48b55a011bda9f5d6aeb4c2d9c7362e8dae4041e # v6
uses: actions/setup-node@53b83947a5a98c8d113130e565377fae1a50d02f # v6
with:
node-version: 20
@@ -117,7 +117,7 @@ jobs:
fetch-depth: 0
- name: Use Node.js 20
uses: actions/setup-node@48b55a011bda9f5d6aeb4c2d9c7362e8dae4041e # v6
uses: actions/setup-node@53b83947a5a98c8d113130e565377fae1a50d02f # v6
with:
node-version: 20


@@ -35,7 +35,7 @@ jobs:
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6
- name: Set up Node.js
uses: actions/setup-node@48b55a011bda9f5d6aeb4c2d9c7362e8dae4041e # v6
uses: actions/setup-node@53b83947a5a98c8d113130e565377fae1a50d02f # v6
with:
node-version-file: './superset-frontend/.nvmrc'

.gitignore (1 line changed)

@@ -62,7 +62,6 @@ rat-results.txt
superset/app/
superset-websocket/config.json
.direnv
*.log
# Node.js, webpack artifacts, storybook
*.entry.js


@@ -29,7 +29,7 @@ ARG BUILD_TRANSLATIONS="false"
######################################################################
# superset-node-ci used as a base for building frontend assets and CI
######################################################################
FROM --platform=${BUILDPLATFORM} node:22-trixie-slim AS superset-node-ci
FROM --platform=${BUILDPLATFORM} node:20-trixie-slim AS superset-node-ci
ARG BUILD_TRANSLATIONS
ENV BUILD_TRANSLATIONS=${BUILD_TRANSLATIONS}
ARG DEV_MODE="false" # Skip frontend build in dev mode


@@ -458,7 +458,7 @@ cd ../
sed -i '' "s/version_string = .*/version_string = \"$SUPERSET_VERSION\"/" setup.py
# build the python distribution
python -m build
python setup.py sdist
```
Publish to PyPI


@@ -58,10 +58,6 @@ categories:
url: https://www.ontruck.com/
Financial Services:
- name: Aadhar Housing Finance Limited
url: https://www.aadharhousing.com
contributors: ["@thakerhardiks"]
- name: Aktia Bank plc
url: https://www.aktia.com
@@ -291,11 +287,6 @@ categories:
url: https://www.gfk.com/home
contributors: ["@mherr"]
- name: Hifadih Business & Technology
url: https://hifadih.net/en
logo: hifadih.png
contributors: ["@saintLaurent00"]
# Logo approved by @anmol-hpe on behalf of HPE
- name: HPE
url: https://www.hpe.com/in/en/home.html


@@ -24,20 +24,6 @@ assists people when migrating to a new version.
## Next
### Granular Export Controls
A new feature flag `GRANULAR_EXPORT_CONTROLS` introduces three fine-grained permissions that replace the legacy `can_csv` permission:
| Permission | Controls |
|---|---|
| `can_export_data` | CSV, Excel, JSON exports |
| `can_export_image` | Screenshot/PDF exports |
| `can_copy_clipboard` | Copy-to-clipboard operations |
When the feature flag is enabled, these permissions are enforced on both the frontend (disabled buttons with tooltips) and backend (403 responses from API endpoints). When disabled, legacy `can_csv` behavior is preserved.
**Migration behavior:** All three new permissions are granted to every role that currently has `can_csv`, preserving existing access. Admins can then selectively revoke individual export permissions from specific roles as needed.
### Deck.gl MapBox viewport and opacity controls are functional
The Deck.gl MapBox chart's **Opacity**, **Default longitude**, **Default latitude**, and **Zoom** controls were previously non-functional — changing them had no effect on the rendered map. These controls are now wired up correctly.


@@ -1,162 +0,0 @@
{/*
Licensed to the Apache Software Foundation (ASF) under one
or more contributor license agreements. See the NOTICE file
distributed with this work for additional information
regarding copyright ownership. The ASF licenses this file
to you under the Apache License, Version 2.0 (the
"License"); you may not use this file except in compliance
with the License. You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing,
software distributed under the License is distributed on an
"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
KIND, either express or implied. See the License for the
specific language governing permissions and limitations
under the License.
*/}
---
title: AWS IAM Authentication
sidebar_label: AWS IAM Authentication
sidebar_position: 15
---
# AWS IAM Authentication for AWS Databases
Superset supports IAM-based authentication for **Amazon Aurora** (PostgreSQL and MySQL) and **Amazon Redshift**. IAM auth eliminates the need for database passwords — Superset generates a short-lived auth token using temporary AWS credentials instead.
Cross-account IAM role assumption via STS `AssumeRole` is supported, allowing a Superset deployment in one AWS account to connect to databases in a different account.
## Prerequisites
- Enable the `AWS_DATABASE_IAM_AUTH` feature flag in `superset_config.py`. IAM authentication is gated behind this flag; if it is disabled, connections using `aws_iam` fail with *"AWS IAM database authentication is not enabled."*
```python
FEATURE_FLAGS = {
"AWS_DATABASE_IAM_AUTH": True,
}
```
- `boto3` must be installed in your Superset environment:
```bash
pip install boto3
```
- The Superset server's IAM role (or static credentials) must have permission to call `sts:AssumeRole` (for cross-account) or the same-account permissions for the target service:
- **Aurora (RDS)**: `rds-db:connect`
- **Redshift provisioned**: `redshift:GetClusterCredentials`
- **Redshift Serverless**: `redshift-serverless:GetCredentials` and `redshift-serverless:GetWorkgroup`
- SSL must be enabled on the Aurora / Redshift endpoint (required for IAM token auth).
## Configuration
IAM authentication is configured via the **encrypted_extra** field of the database connection. Access this field in the **Advanced** → **Security** section of the database connection form, under **Secure Extra**.
### Aurora PostgreSQL or Aurora MySQL
```json
{
"aws_iam": {
"enabled": true,
"role_arn": "arn:aws:iam::222222222222:role/SupersetDatabaseAccess",
"external_id": "superset-prod-12345",
"region": "us-east-1",
"db_username": "superset_iam_user",
"session_duration": 3600
}
}
```
| Field | Required | Description |
|-------|----------|-------------|
| `enabled` | Yes | Set to `true` to activate IAM auth |
| `role_arn` | No | ARN of the cross-account IAM role to assume via STS. Omit for same-account auth |
| `external_id` | No | External ID for the STS `AssumeRole` call, if required by the target role's trust policy |
| `region` | Yes | AWS region of the database cluster |
| `db_username` | Yes | The database username associated with the IAM identity |
| `session_duration` | No | STS session duration in seconds (default: `3600`) |
### Redshift (Serverless)
```json
{
"aws_iam": {
"enabled": true,
"role_arn": "arn:aws:iam::222222222222:role/SupersetRedshiftAccess",
"region": "us-east-1",
"workgroup_name": "my-workgroup",
"db_name": "dev"
}
}
```
### Redshift (Provisioned Cluster)
```json
{
"aws_iam": {
"enabled": true,
"role_arn": "arn:aws:iam::222222222222:role/SupersetRedshiftAccess",
"region": "us-east-1",
"cluster_identifier": "my-cluster",
"db_username": "superset_iam_user",
"db_name": "dev"
}
}
```
## Cross-Account IAM Setup
To connect to a database in Account B from a Superset deployment in Account A:
**1. In Account B — create a database-access role:**
```json
{
"Version": "2012-10-17",
"Statement": [
{
"Effect": "Allow",
"Action": ["rds-db:connect"],
"Resource": "arn:aws:rds-db:us-east-1:222222222222:dbuser/db-XXXXXXXXXXXX/superset_iam_user"
}
]
}
```
**Trust policy** (allows Account A's Superset role to assume it):
```json
{
"Version": "2012-10-17",
"Statement": [
{
"Effect": "Allow",
"Principal": {
"AWS": "arn:aws:iam::111111111111:role/SupersetInstanceRole"
},
"Action": "sts:AssumeRole",
"Condition": {
"StringEquals": {
"sts:ExternalId": "superset-prod-12345"
}
}
}
]
}
```
**2. In Account A — grant Superset's role permission to assume the Account B role:**
```json
{
"Effect": "Allow",
"Action": "sts:AssumeRole",
"Resource": "arn:aws:iam::222222222222:role/SupersetDatabaseAccess"
}
```
**3. Configure the database connection in Superset** using the `role_arn` and `external_id` from the trust policy (as shown in the configuration example above).
## Credential Caching
STS credentials are cached in memory keyed by `(role_arn, region, external_id)` with a 10-minute TTL. This reduces the number of STS API calls when multiple queries are executed with the same connection. Tokens are refreshed automatically before expiry.


@@ -138,33 +138,14 @@ THUMBNAIL_CACHE_CONFIG = init_thumbnail_cache
```
Using the above example cache keys for dashboards will be `superset_thumb__dashboard__{ID}`. You can
override the base URL for Selenium using:
override the base URL for selenium using:
```
WEBDRIVER_BASEURL = "https://superset.company.com"
```
To control which user account is used for rendering thumbnails and warming up caches, configure
`THUMBNAIL_EXECUTORS` and `CACHE_WARMUP_EXECUTORS`. Each accepts a list of executor types (which
resolve to an owner, creator, modifier, or the currently-logged-in user) and/or a `FixedExecutor`
pinned to a specific username. By default, thumbnails render as the current user
(`ExecutorType.CURRENT_USER`) and cache warmup runs as the chart/dashboard owner
(`ExecutorType.OWNER`).
To force both to run as a dedicated service account (`admin` in this example):
```python
from superset.tasks.types import ExecutorType, FixedExecutor
THUMBNAIL_EXECUTORS = [FixedExecutor("admin")]
CACHE_WARMUP_EXECUTORS = [FixedExecutor("admin")]
```
Use a dedicated read-only service account here rather than a personal admin account, so that
thumbnail rendering and cache warmup tasks don't fail if a specific user's credentials change.
Additional Selenium WebDriver configuration can be set using `WEBDRIVER_CONFIGURATION`. You can
implement a custom function to authenticate Selenium. The default function uses the `flask-login`
Additional selenium web drive configuration can be set using `WEBDRIVER_CONFIGURATION`. You can
implement a custom function to authenticate selenium. The default function uses the `flask-login`
session cookie. Here's an example of a custom function signature:
```python
@@ -178,20 +159,6 @@ Then on configuration:
WEBDRIVER_AUTH_FUNC = auth_driver
```
## ETag Support for Thumbnails
Thumbnail and screenshot endpoints return `ETag` response headers based on the cached content digest. Clients can use conditional requests to avoid downloading unchanged images:
```
GET /api/v1/chart/42/thumbnail/
If-None-Match: "abc123..."
→ 304 Not Modified (if unchanged)
→ 200 OK (with new image if changed)
```
This is particularly useful for embedded dashboards and external integrations that periodically poll for updated screenshots — unchanged thumbnails return immediately with no payload.
## Distributed Coordination Backend
Superset supports an optional distributed coordination (`DISTRIBUTED_COORDINATION_CONFIG`) for


@@ -109,14 +109,6 @@ SECRET_KEY = 'YOUR_OWN_RANDOM_GENERATED_SECRET_KEY'
You can generate a strong secure key with `openssl rand -base64 42`.
Alternatively, you can set the secret key using `SUPERSET_SECRET_KEY` environment variable:
On a Unix-based system, such as Linux or macOS, you can do so by running the following command in your terminal:
```bash
export SUPERSET_SECRET_KEY=$(openssl rand -base64 42)
```
:::caution Use a strong secret key
This key will be used for securely signing session cookies and encrypting sensitive information stored in Superset's application metadata database.
Your deployment must use a complex, unique key.
@@ -372,26 +364,6 @@ CUSTOM_SECURITY_MANAGER = CustomSsoSecurityManager
]
```
### PKCE Support
For public OAuth2 clients that cannot securely store a client secret, enable Proof Key for Code Exchange (PKCE) by adding `code_challenge_method` to the `remote_app` configuration:
```python
OAUTH_PROVIDERS = [
{
'name': 'myProvider',
'remote_app': {
'client_id': 'myClientId',
'client_secret': 'mySecret', # may be empty for pure public clients
'code_challenge_method': 'S256', # enables PKCE
'server_metadata_url': 'https://myAuthorizationServer/.well-known/openid-configuration'
}
}
]
```
PKCE (`S256`) is recommended for all OAuth2 flows, even when a client secret is present, as it protects against authorization code interception attacks.
## LDAP Authentication
FAB supports authenticating user credentials against an LDAP server.


@@ -10,10 +10,6 @@ version: 1
The superset cli allows you to import and export datasources from and to YAML. Datasources include
databases. The data is expected to be organized in the following hierarchy:
:::info
Superset's ZIP-based import/export also covers **dashboards**, **charts**, and **saved queries**, exercised through the UI and REST API. The [Dashboard Import Overwrite Behavior](#dashboard-import-overwrite-behavior) and [UUIDs in API Responses](#uuids-in-api-responses) sections below document the behavior shared across all asset types.
:::
```text
├──databases
| ├──database_1
@@ -30,10 +26,6 @@ Superset's ZIP-based import/export also covers **dashboards**, **charts**, and *
| └── ... (more databases)
```
:::note
When you export a database connection, the `masked_encrypted_extra` field (used for sensitive connection parameters such as service account JSON, OAuth tokens, and other encrypted credentials) is included in the export. When importing on another instance, these values are decrypted and re-encrypted using the destination instance's `SECRET_KEY`. Ensure the receiving instance has a valid `SECRET_KEY` configured before importing.
:::
## Exporting Datasources to YAML
You can print your current datasources to stdout by running:
@@ -83,29 +75,6 @@ The optional username flag **-u** sets the user used for the datasource import.
superset import_datasources -p <path / filename> -u 'admin'
```
## Dashboard Import Overwrite Behavior
When importing a dashboard ZIP with the **overwrite** option enabled, any existing charts that are part of the dashboard are **replaced** rather than duplicated. This applies to:
- Charts whose UUID matches a chart already present in the target instance
- The full chart configuration (query, visualization type, columns, metrics) is replaced by the imported version
If you import without the overwrite flag, existing charts with conflicting UUIDs are left unchanged and the import skips those objects. Use overwrite when you want to push a fully updated dashboard (including chart definitions) from a development or staging environment to production.
## UUIDs in API Responses
The REST API POST endpoints for **datasets**, **charts**, and **dashboards** include the auto-generated `uuid` field in the response body:
```json
{
"id": 42,
"uuid": "b8a8d5c3-1234-4abc-8def-0123456789ab",
...
}
```
UUIDs remain stable across import/export cycles and can be used for cross-environment workflows — for example, recording a UUID when creating a chart in development and using it to identify the matching chart after importing into production.
## Legacy Importing Datasources
### From older versions of Superset to current version


@@ -501,7 +501,6 @@ All MCP settings go in `superset_config.py`. Defaults are defined in `superset/m
| `MCP_SERVICE_URL` | `None` | Public base URL for MCP-generated links (set this when behind a reverse proxy) |
| `MCP_DEBUG` | `False` | Enable debug logging |
| `MCP_DEV_USERNAME` | -- | Superset username for development mode (no auth) |
| `MCP_RBAC_ENABLED` | `True` | Enforce Superset's role-based access control on MCP tool calls. When `True`, each tool checks that the authenticated user has the required FAB permission before executing. Disable only for testing or trusted-network deployments. |
### Authentication
@@ -517,7 +516,6 @@ All MCP settings go in `superset_config.py`. Defaults are defined in `superset/m
| `MCP_REQUIRED_SCOPES` | `[]` | Required JWT scopes |
| `MCP_JWT_DEBUG_ERRORS` | `False` | Log detailed JWT errors server-side (never exposed in HTTP responses per RFC 6750) |
| `MCP_AUTH_FACTORY` | `None` | Custom auth provider factory `(flask_app) -> auth_provider`. Takes precedence over built-in JWT |
| `MCP_USER_RESOLVER` | `None` | Custom function `(app, access_token) -> username` to extract a Superset username from a validated JWT token. When `None`, the default resolver checks `preferred_username`, `username`, `email`, and `sub` claims in that order. |
### Response Size Guard
@@ -601,43 +599,6 @@ MCP_STORE_CONFIG = {
| `event_store_max_events` | `100` | Maximum events retained per session |
| `event_store_ttl` | `3600` | Event TTL in seconds |
### Tool Search
By default the MCP server exposes a lightweight tool-search interface instead of advertising every tool at once. This reduces the initial context sent to the LLM by ~70%, which lowers cost and latency. The AI client discovers tools on demand by calling `search_tools` and then invokes them via `call_tool`.
```python
MCP_TOOL_SEARCH_CONFIG = {
"enabled": True,
"strategy": "bm25", # "bm25" (natural language) or "regex"
"max_results": 5,
"always_visible": [ # Tools always listed (pinned)
"health_check",
"get_instance_info",
],
"search_tool_name": "search_tools",
"call_tool_name": "call_tool",
"include_schemas": False, # False=summary mode (name + parameters_hint)
"compact_schemas": True, # Strip $defs (only applies when include_schemas=True)
"max_description_length": 300,
}
```
| Key | Default | Description |
|-----|---------|-------------|
| `enabled` | `True` | Enable tool search. When `False`, all tools are listed upfront |
| `strategy` | `"bm25"` | Search ranking algorithm. `"bm25"` supports natural language; `"regex"` supports pattern matching |
| `max_results` | `5` | Maximum tools returned per search query |
| `always_visible` | See above | Tools that always appear in `list_tools`, regardless of search |
| `include_schemas` | `False` | When `False` (default, "summary mode"), search results omit `inputSchema` entirely and include a lightweight `parameters_hint` listing top-level parameter names. Set to `True` to include the full `inputSchema` in search results. Full schemas are always used when a tool is actually invoked via `call_tool`. |
| `compact_schemas` | `True` | Strip `$defs` / `$ref` and replace with `{"type": "object"}` in search results to reduce token cost. Only takes effect when `include_schemas=True` — ignored in summary mode. |
| `max_description_length` | `300` | Truncate tool descriptions in search results (0 = no truncation). Applies in both summary and full-schema modes. |
:::tip
Set `enabled: False` to revert to the traditional "show all tools at once" behavior, which some clients or workflows may prefer.
:::
Tool search reduces the initial token cost from ~1520K tokens (full catalog) down to ~45K tokens (pinned tools + search interface) — roughly 85% savings at the start of each conversation.
### Session & CSRF
These values are flat-merged into the Flask app config used by the MCP server process:
@@ -659,102 +620,6 @@ MCP_CSRF_CONFIG = {
---
## Access Control
### RBAC Enforcement
The MCP server respects Superset's full role-based access control (RBAC). Every authenticated user can only access the data and operations their Superset roles permit — the same rules that apply in the Superset UI apply through MCP.
Each tool declares one or more required FAB permissions. The table below maps tool groups to their permission requirements:
| Tool group | Required FAB permission |
|------------|------------------------|
| `list_charts`, `get_chart_info`, `get_chart_data`, `get_chart_preview`, `generate_chart`, `update_chart` | `can_read` on `Chart` (read), `can_write` on `Chart` (mutate) |
| `list_dashboards`, `get_dashboard_info`, `generate_dashboard`, `add_chart_to_existing_dashboard` | `can_read` on `Dashboard` (read), `can_write` on `Dashboard` (mutate) |
| `list_datasets`, `get_dataset_info`, `create_virtual_dataset` | `can_read` on `Dataset` (read), `can_write` on `Dataset` (mutate) |
| `list_databases`, `get_database_info` | `can_read` on `Database` |
| `execute_sql` | `can_execute_sql_query` on `SQLLab` |
| `open_sql_lab_with_context` | `can_read` on `SQLLab` |
| `save_sql_query` | `can_write` on `SavedQuery` |
| `health_check` | None (public) |
To disable RBAC checking globally (for trusted-network deployments or testing), set:
```python
# superset_config.py
MCP_RBAC_ENABLED = False
```
:::warning
Disabling RBAC removes all permission checks from MCP tool calls. Only do this on isolated, internal deployments where all MCP users are trusted admins.
:::
### Audit Log
All MCP tool calls are recorded in Superset's action log. You can view them at **Settings → Action Log** (admin only). Each log entry records:
- The tool name (e.g., `mcp.generate_chart.db_write`)
- The authenticated user
- A timestamp
This makes MCP activity fully auditable alongside regular Superset activity. The action log uses the same event logger as the rest of Superset, so existing log ingestion pipelines (e.g., sending logs to Elasticsearch or a SIEM) capture MCP events automatically.
### Middleware Pipeline
Every MCP request passes through a middleware stack before reaching the tool function. The default stack (assembled in `build_middleware_list()` in `server.py`) is:
| Middleware | Purpose | Default |
|------------|---------|---------|
| `StructuredContentStripperMiddleware` | Strips `structuredContent` from responses for Claude.ai bridge compatibility | Enabled |
| `LoggingMiddleware` | Logs each tool call with user, parameters, and duration | Enabled |
| `GlobalErrorHandlerMiddleware` | Catches unhandled exceptions and sanitizes sensitive data before it reaches the client | Enabled |
| `ResponseSizeGuardMiddleware` | Estimates token count, warns at 80% of limit, blocks at limit | Enabled (configurable via `MCP_RESPONSE_SIZE_CONFIG`) |
| `ResponseCachingMiddleware` | Caches read-heavy tool responses (in-memory or Redis) | Disabled (enable via `MCP_CACHE_CONFIG`) |
Additional middleware classes (`RateLimitMiddleware`, `FieldPermissionsMiddleware`, `PrivateToolMiddleware`) are implemented in `superset/mcp_service/middleware.py` but are not added to the default pipeline. They are available for operators who want to layer them in via a custom startup path.
### Error Sanitization
The `GlobalErrorHandlerMiddleware` automatically redacts sensitive information from all error messages before they reach the LLM client. The following are replaced with generic messages:
- **Database connection strings** — replaced with a generic connection error message
- **API keys and tokens** — redacted from error traces
- **File system paths** — stripped to prevent information disclosure
- **IP addresses** — removed from error context
This ensures that a misconfigured database connection or an unexpected exception never leaks credentials or internal topology to the LLM or its users. All regex patterns used for redaction are bounded to prevent ReDoS attacks.
---
## Performance
### Connection Pooling
Each MCP server process maintains its own SQLAlchemy connection pool to the database. For multi-worker deployments, total open connections = **workers × pool size**.
```python
# superset_config.py
SQLALCHEMY_POOL_SIZE = 5
SQLALCHEMY_MAX_OVERFLOW = 10
SQLALCHEMY_POOL_TIMEOUT = 30
SQLALCHEMY_POOL_RECYCLE = 3600 # Recycle connections after 1 hour
```
For a 3-pod Kubernetes deployment with the defaults above, expect up to 3 × (5 + 10) = 45 connections. Size your database's `max_connections` accordingly.
### Response Caching
Enable response caching for read-heavy workloads (dashboards/datasets that don't change frequently). With the in-memory backend (default when `MCP_STORE_CONFIG` is disabled), caching is per-process. Use Redis-backed caching for consistent cache hits across multiple pods:
```python
MCP_CACHE_CONFIG = {"enabled": True, "call_tool_ttl": 3600}
MCP_STORE_CONFIG = {"enabled": True, "CACHE_REDIS_URL": "redis://redis:6379/0"}
```
Mutating tools (`generate_chart`, `update_chart`, `execute_sql`, `generate_dashboard`) are always excluded from caching regardless of this setting.
---
## Troubleshooting
### Server won't start
@@ -799,32 +664,6 @@ Mutating tools (`generate_chart`, `update_chart`, `execute_sql`, `generate_dashb
---
## Audit Events
All MCP tool calls are logged to Superset's event logger, the same system used by the web UI (viewable at **Settings → Action Log**). Each event captures:
- **Action**: `mcp.<tool_name>.<phase>` (e.g., `mcp.list_databases.query`)
- **User**: the resolved Superset username from the JWT or dev config
- **Timestamp**: when the operation ran
This means MCP activity is auditable alongside normal user activity. No additional configuration is required — logging is on by default whenever the event logger is enabled in your Superset deployment.
## Tool Pagination
MCP list tools (`list_datasets`, `list_charts`, `list_dashboards`, `list_databases`) use **offset pagination** via `page` (1-based) and `page_size` parameters. Responses include `page`, `page_size`, `total_count`, `total_pages`, `has_previous`, and `has_next`. To iterate through all results:
```python
# Example: fetch all charts across pages
all_charts = []
page = 1
while True:
result = mcp.list_charts(page=page, page_size=50)
all_charts.extend(result["charts"])
if not result.get("has_next"):
break
page += 1
```
## Security Best Practices
- **Use TLS** for all production MCP endpoints -- place the server behind a reverse proxy with HTTPS


@@ -64,7 +64,7 @@ There are two approaches to making dashboards publicly accessible:
3. Edit each dashboard's properties and add the "Public" role
4. Only dashboards with the Public role explicitly assigned are visible to anonymous users
See the [Public role documentation](/admin-docs/security/#public) for more details.
See the [Public role documentation](/admin-docs/security/security#public) for more details.
#### Embedding a Public Dashboard
@@ -111,7 +111,7 @@ FEATURE_FLAGS = {
This flag only hides the logout button when Superset detects it is running inside an iframe. Users accessing Superset directly (not embedded) will still see the logout button regardless of this setting.
:::note
When embedding with SSO, also set `SESSION_COOKIE_SAMESITE = 'None'` and `SESSION_COOKIE_SECURE = True`. See [Security documentation](/admin-docs/security/securing_superset) for details.
When embedding with SSO, also set `SESSION_COOKIE_SAMESITE = 'None'` and `SESSION_COOKIE_SECURE = True`. See [Security documentation](/docs/security/securing_superset) for details.
:::
## CSRF settings


@@ -84,35 +84,6 @@ THEME_DARK = {
# - OS preference detection is automatically enabled
```
### App Branding
The application name shown in the browser title bar and navigation can be
set via the `brandAppName` theme token:
```python
THEME_DEFAULT = {
"token": {
"brandAppName": "Acme Analytics",
# ... other tokens
}
}
```
Or in the theme CRUD UI JSON editor:
```json
{
"token": {
"brandAppName": "Acme Analytics"
}
}
```
The existing `APP_NAME` Python config key continues to work for backward compatibility.
`brandAppName` takes precedence when both are set, and allows different themes to carry different brand names.
Email and alert/report notification subjects are driven by backend settings such as
`EMAIL_REPORTS_SUBJECT_PREFIX` and `APP_NAME`, not by this theme token.
### Migration from Configuration to UI
When `ENABLE_UI_THEME_ADMINISTRATION = True`:
@@ -341,25 +312,11 @@ Available chart types for `echartsOptionsOverridesByChartType`:
- `echarts_heatmap` - Heatmaps
- `echarts_mixed_timeseries` - Mixed time series
### Array Property Overrides
Array properties (such as color palettes) are fully supported in overrides. Arrays are **replaced entirely** rather than merged, so specify the complete array:
```python
THEME_DEFAULT = {
"token": { ... },
"echartsOptionsOverrides": {
# Replace the default color palette for all ECharts visualizations
"color": ["#1f77b4", "#ff7f0e", "#2ca02c", "#d62728", "#9467bd", "#8c564b"]
}
}
```
### Best Practices
1. **Start with global overrides** for consistent styling across all charts
2. **Use chart-specific overrides** for unique requirements per visualization type
3. **Test thoroughly** as overrides use deep merge for objects, but arrays are completely replaced — always specify the full array value
3. **Test thoroughly** as overrides use deep merge - nested objects are combined, but arrays are completely replaced
4. **Document your overrides** to help team members understand custom styling
5. **Consider performance** - complex overrides may impact chart rendering speed


@@ -20,7 +20,7 @@ To help make the problem somewhat tractable—given that Apache Superset has no
To strive for data consistency (regardless of the timezone of the client) the Apache Superset backend tries to ensure that any timestamp sent to the client has an explicit (or semi-explicit as in the case with [Epoch time](https://en.wikipedia.org/wiki/Unix_time) which is always in reference to UTC) timezone encoded within.
The challenge however lies with the slew of [database engines](/user-docs/databases#installing-drivers-in-docker) which Apache Superset supports and various inconsistencies between their [Python Database API (DB-API)](https://www.python.org/dev/peps/pep-0249/) implementations combined with the fact that we use [Pandas](https://pandas.pydata.org/) to read SQL into a DataFrame prior to serializing to JSON. Regrettably Pandas ignores the DB-API [type_code](https://www.python.org/dev/peps/pep-0249/#type-objects) relying by default on the underlying Python type returned by the DB-API. Currently only a subset of the supported database engines work correctly with Pandas, i.e., ensuring timestamps without an explicit timestamp are serialized to JSON with the server timezone, thus guaranteeing the client will display timestamps in a consistent manner irrespective of the client's timezone.
The challenge however lies with the slew of [database engines](/admin-docs/databases#installing-drivers-in-docker) which Apache Superset supports and various inconsistencies between their [Python Database API (DB-API)](https://www.python.org/dev/peps/pep-0249/) implementations combined with the fact that we use [Pandas](https://pandas.pydata.org/) to read SQL into a DataFrame prior to serializing to JSON. Regrettably Pandas ignores the DB-API [type_code](https://www.python.org/dev/peps/pep-0249/#type-objects) relying by default on the underlying Python type returned by the DB-API. Currently only a subset of the supported database engines work correctly with Pandas, i.e., ensuring timestamps without an explicit timestamp are serializd to JSON with the server timezone, thus guaranteeing the client will display timestamps in a consistent manner irrespective of the client's timezone.
For example the following is a comparison of MySQL and Presto,


@@ -149,7 +149,7 @@ For production clusters it's recommended to build own image with this step done
Superset requires a Python DB-API database driver and a SQLAlchemy
dialect to be installed for each datastore you want to connect to.
See [Install Database Drivers](/user-docs/databases#installing-database-drivers) for more information.
See [Install Database Drivers](/admin-docs/databases#installing-database-drivers) for more information.
It is recommended that you refer to versions listed in
[pyproject.toml](https://github.com/apache/superset/blob/master/pyproject.toml)
instead of hard-coding them in your bootstrap script, as seen below.


@@ -52,15 +52,6 @@ only see the objects that they have access to.
The **sql_lab** role grants access to SQL Lab. Note that while **Admin** users have access
to all databases by default, both **Alpha** and **Gamma** users need to be given access on a per database basis.
Beyond the base `sql_lab` role, two additional SQL Lab permissions must be explicitly granted for users who need these capabilities:
| Permission | Feature |
|------------|---------|
| `can_estimate_query_cost` on `SQLLab` | Estimate query cost before running |
| `can_format_sql` on `SQLLab` | Format SQL using the database's dialect |
Grant these in **Security → List Roles** by adding the permissions to the relevant role.
### Public
The **Public** role is the most restrictive built-in role, designed specifically for anonymous/unauthenticated
@@ -191,8 +182,6 @@ However, it is crucial to understand the following:
By combining Superset's configurable safeguards with strong database-level security practices, you can achieve a more robust and layered security posture.
**Dataset Sample Access**: The `get_samples()` endpoint now enforces datasource-level access control. Users can only fetch sample rows from datasets they have been explicitly granted access to — the same permission check applied when running chart queries. This closes a prior gap where unauthenticated or under-privileged access could retrieve sample data.
### REST API for user & role management
Flask-AppBuilder supports a REST API for user CRUD,
@@ -250,143 +239,26 @@ based on the roles and permissions that were attributed.
### Row Level Security
Using Row Level Security filters (under the **Security** menu) you can create filters
that are assigned to a particular dataset, as well as a set of roles.
that are assigned to a particular table, as well as a set of roles.
If you want members of the Finance team to only have access to
rows where `department = "finance"`, you could:
- Create a Row Level Security filter with that clause (`department = "finance"`)
- Then assign the clause to the **Finance** role and the dataset it applies to
- Then assign the clause to the **Finance** role and the table it applies to
The **clause** field, which can contain arbitrary text, is then added to the generated
SQL statement's WHERE clause. So you could even do something like create a filter
SQL statement's WHERE clause. So you could even do something like create a filter
for the last 30 days and apply it to a specific role, with a clause
like `date_field > DATE_SUB(NOW(), INTERVAL 30 DAY)`. It can also support
multiple conditions: `client_id = 6 AND advertiser = "foo"`, etc.
RLS clauses also support **Jinja templating** when `ENABLE_TEMPLATE_PROCESSING` is enabled, so you can write dynamic filters such as
`user_id = '{{ current_username() }}'` to restrict rows based on the logged-in user.
All relevant row-level security filters will be combined (under the hood,
the different SQL clauses are joined using AND). This means it's
possible to create a situation where two roles conflict in such a way that a user's view of a table is restricted to an empty set.
#### Filter Types
There are two types of RLS filters:
- **Regular** — The filter clause is applied when the querying user belongs to one of the
roles assigned to the filter. Use this to restrict what specific roles can see.
- **Base** — The filter clause is applied to **all** users _except_ those in the assigned
roles. Use this to define a default restriction that privileged roles (e.g. Admin) are
exempt from. For example, a Base filter with clause `1 = 0` and the Admin role would
hide all rows from everyone except Admin — useful as a deny-by-default baseline.
#### Group Keys and Filter Combination
All applicable RLS filters are combined before being added to the query. The combination
rules are:
- Filters that share the **same group key** are combined with **OR** (any match within
the group is sufficient).
- Different filter groups (different group keys, or no group key) are combined with
**AND** (all groups must match).
- Filters with **no group key** are each treated as their own group and are always AND'd.
For example, if a dataset has three filters:
| Filter | Clause | Group Key |
|--------|--------|-----------|
| F1 | `department = 'Finance'` | `department` |
| F2 | `department = 'Marketing'` | `department` |
| F3 | `region = 'Europe'` | `region` |
The resulting WHERE clause would be:
```sql
(department = 'Finance' OR department = 'Marketing') AND (region = 'Europe')
```
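
The combination rules can also be expressed as a short sketch (a hypothetical helper written for illustration; this is not Superset's actual implementation):

```tsx
// Hypothetical sketch of the RLS combination rules described above:
// clauses sharing a group key are OR'd; distinct groups are AND'd.
type RlsFilter = { clause: string; groupKey?: string };

function combineRlsClauses(filters: RlsFilter[]): string {
  const groups = new Map<string, string[]>();
  filters.forEach((f, i) => {
    // A filter without a group key forms its own single-member group,
    // so it is always AND'd with everything else.
    const key = f.groupKey ?? `__ungrouped_${i}`;
    groups.set(key, [...(groups.get(key) ?? []), f.clause]);
  });
  return [...groups.values()]
    .map((clauses) => `(${clauses.join(' OR ')})`)
    .join(' AND ');
}

// combineRlsClauses([
//   { clause: "department = 'Finance'", groupKey: 'department' },
//   { clause: "department = 'Marketing'", groupKey: 'department' },
//   { clause: "region = 'Europe'", groupKey: 'region' },
// ])
// => "(department = 'Finance' OR department = 'Marketing') AND (region = 'Europe')"
```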
:::caution Conflicting filters
It is possible to create filters that conflict and produce an empty result set. For
example, the filters `client_id = 4` and `client_id = 5` **without a shared group key**
will be AND'd together, producing `client_id = 4 AND client_id = 5`, which can never
be true.
If you intend for these to be alternatives, assign them the **same group key** so they
are OR'd instead.
:::
#### RLS and Virtual (SQL-Based) Datasets
RLS filters are assigned to **datasets**, not to underlying database tables directly. This
has important implications when working with virtual (SQL-based) datasets:
- **Physical datasets** (backed directly by a table or view) — RLS filters assigned to
the dataset are added as WHERE clauses to the query.
- **Virtual datasets** (defined by a custom SQL query) — RLS filters assigned directly to
the virtual dataset are applied to the _outer_ query that wraps the dataset's SQL.
Additionally, RLS filters on the **underlying physical datasets** referenced by the
virtual dataset's SQL are injected into the inner subquery for each referenced table.
For example, if you have:
1. A physical dataset `orders` with RLS filter `region = 'US'`
2. A virtual dataset defined as `SELECT * FROM orders WHERE status = 'active'`
A user affected by the RLS filter will effectively see:
```sql
SELECT * FROM (
SELECT * FROM orders WHERE (region = 'US') AND status = 'active'
) ...
```
**Key considerations for virtual datasets:**
- You generally do **not** need to duplicate RLS filters on both the physical and virtual
dataset — filters on the physical dataset are applied automatically at query time.
- If you assign an RLS filter directly to a virtual dataset, the clause must reference
columns available in the virtual dataset's _output_, not necessarily the underlying
table's columns.
- In **SQL Lab**, RLS is enforced only when the `RLS_IN_SQLLAB` feature flag is enabled:
queries run against tables that have associated datasets with RLS filters will then have
the appropriate predicates injected automatically.
#### Checking RLS Filters via the API
You can use the RLS REST API to audit which filters are configured and which datasets
they affect. This requires the `can_read` permission on the `Row Level Security` resource.
**List all RLS rules:**
```
GET /api/v1/rowlevelsecurity/
```
**Filter RLS rules for a specific dataset** (using [Rison](https://github.com/Nanonid/rison) query syntax):
```
GET /api/v1/rowlevelsecurity/?q=(filters:!((col:tables,opr:rel_m_m,value:<dataset_id>)))
```
**Filter RLS rules by role:**
```
GET /api/v1/rowlevelsecurity/?q=(filters:!((col:roles,opr:rel_m_m,value:<role_id>)))
```
**View details of a specific rule** (including clause, assigned datasets, and roles):
```
GET /api/v1/rowlevelsecurity/<id>
```
The response includes the filter's `name`, `filter_type` (Regular or Base), `clause`,
`group_key`, assigned `tables` (with id, schema, and `table_name`), and assigned `roles`
(with id and name).
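As a sketch of how an audit script might consume these endpoints (the host, dataset ID, and token are placeholders; the bearer-token header follows the same pattern used in the curl examples on this page):

```tsx
// Hypothetical audit sketch: list the RLS rules assigned to dataset 12
// and print the fields described above. Host, ID, and token are placeholders.
const q = '(filters:!((col:tables,opr:rel_m_m,value:12)))';
const res = await fetch(
  `https://superset.example.com/api/v1/rowlevelsecurity/?q=${encodeURIComponent(q)}`,
  { headers: { Authorization: 'Bearer YOUR_ACCESS_TOKEN' } },
);
const { result } = await res.json();
for (const rule of result) {
  console.log(rule.name, rule.filter_type, rule.group_key, rule.clause);
}
```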
:::tip Auditing RLS for virtual datasets
To find all RLS rules that could affect a particular virtual dataset, query the list
endpoint filtered by that dataset's ID for any directly-assigned rules. Then also check
the physical datasets referenced in the virtual dataset's SQL, since their RLS filters
are applied at query time too.
:::
For example, the filters `client_id=4` and `client_id=5`, applied to a role,
will result in users of that role having `client_id=4` AND `client_id=5`
added to their query, which can never be true.
### User Sessions

View File

@@ -59,7 +59,7 @@ curl -H "Authorization: Bearer YOUR_ACCESS_TOKEN" \
#### Core Resources
<details>
<summary><strong>Dashboards</strong> (28 endpoints) — Create, read, update, and delete dashboards.</summary>
<summary><strong>Dashboards</strong> (26 endpoints) — Create, read, update, and delete dashboards.</summary>
| Method | Description | Endpoint |
|--------|----------|-------------|
@@ -68,25 +68,23 @@ curl -H "Authorization: Bearer YOUR_ACCESS_TOKEN" \
| `POST` | [Create a new dashboard](/developer-docs/api/create-a-new-dashboard) | `/api/v1/dashboard/` |
| `GET` | [Get metadata information about this API resource (dashboard--info)](/developer-docs/api/get-metadata-information-about-this-api-resource-dashboard-info) | `/api/v1/dashboard/_info` |
| `GET` | [Get a dashboard detail information](/developer-docs/api/get-a-dashboard-detail-information) | `/api/v1/dashboard/{id_or_slug}` |
| `GET` | [Get a dashboard's chart definitions.](/developer-docs/api/get-a-dashboards-chart-definitions) | `/api/v1/dashboard/{id_or_slug}/charts` |
| `GET` | [Get a dashboard's chart definitions.](/developer-docs/api/get-a-dashboard-s-chart-definitions) | `/api/v1/dashboard/{id_or_slug}/charts` |
| `POST` | [Create a copy of an existing dashboard](/developer-docs/api/create-a-copy-of-an-existing-dashboard) | `/api/v1/dashboard/{id_or_slug}/copy/` |
| `GET` | [Get dashboard's datasets](/developer-docs/api/get-dashboards-datasets) | `/api/v1/dashboard/{id_or_slug}/datasets` |
| `DELETE` | [Delete a dashboard's embedded configuration](/developer-docs/api/delete-a-dashboards-embedded-configuration) | `/api/v1/dashboard/{id_or_slug}/embedded` |
| `GET` | [Get the dashboard's embedded configuration](/developer-docs/api/get-the-dashboards-embedded-configuration) | `/api/v1/dashboard/{id_or_slug}/embedded` |
| `POST` | [Set a dashboard's embedded configuration](/developer-docs/api/set-a-dashboards-embedded-configuration) | `/api/v1/dashboard/{id_or_slug}/embedded` |
| `GET` | [Get dashboard's datasets](/developer-docs/api/get-dashboard-s-datasets) | `/api/v1/dashboard/{id_or_slug}/datasets` |
| `DELETE` | [Delete a dashboard's embedded configuration](/developer-docs/api/delete-a-dashboard-s-embedded-configuration) | `/api/v1/dashboard/{id_or_slug}/embedded` |
| `GET` | [Get the dashboard's embedded configuration](/developer-docs/api/get-the-dashboard-s-embedded-configuration) | `/api/v1/dashboard/{id_or_slug}/embedded` |
| `POST` | [Set a dashboard's embedded configuration](/developer-docs/api/set-a-dashboard-s-embedded-configuration) | `/api/v1/dashboard/{id_or_slug}/embedded` |
| `PUT` | [Update dashboard by id_or_slug embedded](/developer-docs/api/update-dashboard-by-id-or-slug-embedded) | `/api/v1/dashboard/{id_or_slug}/embedded` |
| `GET` | [Get dashboard's tabs](/developer-docs/api/get-dashboards-tabs) | `/api/v1/dashboard/{id_or_slug}/tabs` |
| `GET` | [Get dashboard's tabs](/developer-docs/api/get-dashboard-s-tabs) | `/api/v1/dashboard/{id_or_slug}/tabs` |
| `DELETE` | [Delete a dashboard](/developer-docs/api/delete-a-dashboard) | `/api/v1/dashboard/{pk}` |
| `PUT` | [Update a dashboard](/developer-docs/api/update-a-dashboard) | `/api/v1/dashboard/{pk}` |
| `POST` | [Compute and cache a screenshot (dashboard-pk-cache-dashboard-screenshot)](/developer-docs/api/compute-and-cache-a-screenshot-dashboard-pk-cache-dashboard-screenshot) | `/api/v1/dashboard/{pk}/cache_dashboard_screenshot/` |
| `PUT` | [Update chart customizations configuration for a dashboard.](/developer-docs/api/update-chart-customizations-configuration-for-a-dashboard) | `/api/v1/dashboard/{pk}/chart_customizations` |
| `PUT` | [Update colors configuration for a dashboard.](/developer-docs/api/update-colors-configuration-for-a-dashboard) | `/api/v1/dashboard/{pk}/colors` |
| `GET` | [Export dashboard as example bundle](/developer-docs/api/export-dashboard-as-example-bundle) | `/api/v1/dashboard/{pk}/export_as_example/` |
| `DELETE` | [Remove the dashboard from the user favorite list](/developer-docs/api/remove-the-dashboard-from-the-user-favorite-list) | `/api/v1/dashboard/{pk}/favorites/` |
| `POST` | [Mark the dashboard as favorite for the current user](/developer-docs/api/mark-the-dashboard-as-favorite-for-the-current-user) | `/api/v1/dashboard/{pk}/favorites/` |
| `PUT` | [Update native filters configuration for a dashboard.](/developer-docs/api/update-native-filters-configuration-for-a-dashboard) | `/api/v1/dashboard/{pk}/filters` |
| `GET` | [Get a computed screenshot from cache (dashboard-pk-screenshot-digest)](/developer-docs/api/get-a-computed-screenshot-from-cache-dashboard-pk-screenshot-digest) | `/api/v1/dashboard/{pk}/screenshot/{digest}/` |
| `GET` | [Get dashboard's thumbnail](/developer-docs/api/get-dashboards-thumbnail) | `/api/v1/dashboard/{pk}/thumbnail/{digest}/` |
| `GET` | [Get dashboard's thumbnail](/developer-docs/api/get-dashboard-s-thumbnail) | `/api/v1/dashboard/{pk}/thumbnail/{digest}/` |
| `GET` | [Download multiple dashboards as YAML files](/developer-docs/api/download-multiple-dashboards-as-yaml-files) | `/api/v1/dashboard/export/` |
| `GET` | [Check favorited dashboards for current user](/developer-docs/api/check-favorited-dashboards-for-current-user) | `/api/v1/dashboard/favorite_status/` |
| `POST` | [Import dashboard(s) with associated charts/datasets/databases](/developer-docs/api/import-dashboard-s-with-associated-charts-datasets-databases) | `/api/v1/dashboard/import/` |
@@ -103,8 +101,8 @@ curl -H "Authorization: Bearer YOUR_ACCESS_TOKEN" \
| `GET` | [Get a list of charts](/developer-docs/api/get-a-list-of-charts) | `/api/v1/chart/` |
| `POST` | [Create a new chart](/developer-docs/api/create-a-new-chart) | `/api/v1/chart/` |
| `GET` | [Get metadata information about this API resource (chart--info)](/developer-docs/api/get-metadata-information-about-this-api-resource-chart-info) | `/api/v1/chart/_info` |
| `GET` | [Get a chart detail information](/developer-docs/api/get-a-chart-detail-information) | `/api/v1/chart/{id_or_uuid}` |
| `DELETE` | [Delete a chart](/developer-docs/api/delete-a-chart) | `/api/v1/chart/{pk}` |
| `GET` | [Get a chart detail information](/developer-docs/api/get-a-chart-detail-information) | `/api/v1/chart/{pk}` |
| `PUT` | [Update a chart](/developer-docs/api/update-a-chart) | `/api/v1/chart/{pk}` |
| `GET` | [Compute and cache a screenshot (chart-pk-cache-screenshot)](/developer-docs/api/compute-and-cache-a-screenshot-chart-pk-cache-screenshot) | `/api/v1/chart/{pk}/cache_screenshot/` |
| `GET` | [Return payload data response for a chart](/developer-docs/api/return-payload-data-response-for-a-chart) | `/api/v1/chart/{pk}/data/` |
@@ -123,7 +121,7 @@ curl -H "Authorization: Bearer YOUR_ACCESS_TOKEN" \
</details>
<details>
<summary><strong>Datasets</strong> (19 endpoints) — Manage datasets (tables) used for building charts.</summary>
<summary><strong>Datasets</strong> (18 endpoints) — Manage datasets (tables) used for building charts.</summary>
| Method | Description | Endpoint |
|--------|----------|-------------|
@@ -131,14 +129,13 @@ curl -H "Authorization: Bearer YOUR_ACCESS_TOKEN" \
| `GET` | [Get a list of datasets](/developer-docs/api/get-a-list-of-datasets) | `/api/v1/dataset/` |
| `POST` | [Create a new dataset](/developer-docs/api/create-a-new-dataset) | `/api/v1/dataset/` |
| `GET` | [Get metadata information about this API resource (dataset--info)](/developer-docs/api/get-metadata-information-about-this-api-resource-dataset-info) | `/api/v1/dataset/_info` |
| `GET` | [Get a dataset](/developer-docs/api/get-a-dataset) | `/api/v1/dataset/{id_or_uuid}` |
| `GET` | [Get charts and dashboards count associated to a dataset](/developer-docs/api/get-charts-and-dashboards-count-associated-to-a-dataset) | `/api/v1/dataset/{id_or_uuid}/related_objects` |
| `DELETE` | [Delete a dataset](/developer-docs/api/delete-a-dataset) | `/api/v1/dataset/{pk}` |
| `GET` | [Get a dataset](/developer-docs/api/get-a-dataset) | `/api/v1/dataset/{pk}` |
| `PUT` | [Update a dataset](/developer-docs/api/update-a-dataset) | `/api/v1/dataset/{pk}` |
| `DELETE` | [Delete a dataset column](/developer-docs/api/delete-a-dataset-column) | `/api/v1/dataset/{pk}/column/{column_id}` |
| `GET` | [Get dataset drill info](/developer-docs/api/get-dataset-drill-info) | `/api/v1/dataset/{pk}/drill_info/` |
| `DELETE` | [Delete a dataset metric](/developer-docs/api/delete-a-dataset-metric) | `/api/v1/dataset/{pk}/metric/{metric_id}` |
| `PUT` | [Refresh and update columns of a dataset](/developer-docs/api/refresh-and-update-columns-of-a-dataset) | `/api/v1/dataset/{pk}/refresh` |
| `GET` | [Get charts and dashboards count associated to a dataset](/developer-docs/api/get-charts-and-dashboards-count-associated-to-a-dataset) | `/api/v1/dataset/{pk}/related_objects` |
| `GET` | [Get distinct values from field data (dataset-distinct-column-name)](/developer-docs/api/get-distinct-values-from-field-data-dataset-distinct-column-name) | `/api/v1/dataset/distinct/{column_name}` |
| `POST` | [Duplicate a dataset](/developer-docs/api/duplicate-a-dataset) | `/api/v1/dataset/duplicate` |
| `GET` | [Download multiple datasets as YAML files](/developer-docs/api/download-multiple-datasets-as-yaml-files) | `/api/v1/dataset/export/` |
@@ -150,7 +147,7 @@ curl -H "Authorization: Bearer YOUR_ACCESS_TOKEN" \
</details>
<details>
<summary><strong>Database</strong> (30 endpoints) — Manage database connections and metadata.</summary>
<summary><strong>Database</strong> (31 endpoints) — Manage database connections and metadata.</summary>
| Method | Description | Endpoint |
|--------|----------|-------------|
@@ -168,6 +165,7 @@ curl -H "Authorization: Bearer YOUR_ACCESS_TOKEN" \
| `GET` | [Get all schemas from a database](/developer-docs/api/get-all-schemas-from-a-database) | `/api/v1/database/{pk}/schemas/` |
| `GET` | [Get database select star for table (database-pk-select-star-table-name)](/developer-docs/api/get-database-select-star-for-table-database-pk-select-star-table-name) | `/api/v1/database/{pk}/select_star/{table_name}/` |
| `GET` | [Get database select star for table (database-pk-select-star-table-name-schema-name)](/developer-docs/api/get-database-select-star-for-table-database-pk-select-star-table-name-schema-name) | `/api/v1/database/{pk}/select_star/{table_name}/{schema_name}/` |
| `DELETE` | [Delete a SSH tunnel](/developer-docs/api/delete-a-ssh-tunnel) | `/api/v1/database/{pk}/ssh_tunnel/` |
| `POST` | [Re-sync all permissions for a database connection](/developer-docs/api/re-sync-all-permissions-for-a-database-connection) | `/api/v1/database/{pk}/sync_permissions/` |
| `GET` | [Get table extra metadata (database-pk-table-extra-table-name-schema-name)](/developer-docs/api/get-table-extra-metadata-database-pk-table-extra-table-name-schema-name) | `/api/v1/database/{pk}/table_extra/{table_name}/{schema_name}/` |
| `GET` | [Get table metadata](/developer-docs/api/get-table-metadata) | `/api/v1/database/{pk}/table_metadata/` |
@@ -179,7 +177,7 @@ curl -H "Authorization: Bearer YOUR_ACCESS_TOKEN" \
| `GET` | [Get names of databases currently available](/developer-docs/api/get-names-of-databases-currently-available) | `/api/v1/database/available/` |
| `GET` | [Download database(s) and associated dataset(s) as a zip file](/developer-docs/api/download-database-s-and-associated-dataset-s-as-a-zip-file) | `/api/v1/database/export/` |
| `POST` | [Import database(s) with associated datasets](/developer-docs/api/import-database-s-with-associated-datasets) | `/api/v1/database/import/` |
| `GET` | [Receive personal access tokens from OAuth2](/developer-docs/api/receive-personal-access-tokens-from-o-auth-2) | `/api/v1/database/oauth2/` |
| `GET` | [Receive personal access tokens from OAuth2](/developer-docs/api/receive-personal-access-tokens-from-oauth2) | `/api/v1/database/oauth2/` |
| `GET` | [Get related fields data (database-related-column-name)](/developer-docs/api/get-related-fields-data-database-related-column-name) | `/api/v1/database/related/{column_name}` |
| `POST` | [Test a database connection](/developer-docs/api/test-a-database-connection) | `/api/v1/database/test_connection/` |
| `POST` | [Upload a file and returns file metadata](/developer-docs/api/upload-a-file-and-returns-file-metadata) | `/api/v1/database/upload_metadata/` |
@@ -199,14 +197,13 @@ curl -H "Authorization: Bearer YOUR_ACCESS_TOKEN" \
</details>
<details>
<summary><strong>SQL Lab</strong> (7 endpoints) — Execute SQL queries and manage SQL Lab sessions.</summary>
<summary><strong>SQL Lab</strong> (6 endpoints) — Execute SQL queries and manage SQL Lab sessions.</summary>
| Method | Description | Endpoint |
|--------|----------|-------------|
| `GET` | [Get the bootstrap data for SqlLab page](/developer-docs/api/get-the-bootstrap-data-for-sql-lab-page) | `/api/v1/sqllab/` |
| `GET` | [Get the bootstrap data for SqlLab page](/developer-docs/api/get-the-bootstrap-data-for-sqllab-page) | `/api/v1/sqllab/` |
| `POST` | [Estimate the SQL query execution cost](/developer-docs/api/estimate-the-sql-query-execution-cost) | `/api/v1/sqllab/estimate/` |
| `POST` | [Execute a SQL query](/developer-docs/api/execute-a-sql-query) | `/api/v1/sqllab/execute/` |
| `POST` | [Export SQL query results to CSV with streaming](/developer-docs/api/export-sql-query-results-to-csv-with-streaming) | `/api/v1/sqllab/export_streaming/` |
| `GET` | [Export the SQL query results to a CSV](/developer-docs/api/export-the-sql-query-results-to-a-csv) | `/api/v1/sqllab/export/{client_id}/` |
| `POST` | [Format SQL code](/developer-docs/api/format-sql-code) | `/api/v1/sqllab/format_sql/` |
| `GET` | [Get the result of a SQL query execution](/developer-docs/api/get-the-result-of-a-sql-query-execution) | `/api/v1/sqllab/results/` |
@@ -239,21 +236,20 @@ curl -H "Authorization: Bearer YOUR_ACCESS_TOKEN" \
</details>
<details>
<summary><strong>Datasources</strong> (2 endpoints) — Query datasource metadata and column values.</summary>
<summary><strong>Datasources</strong> (1 endpoint) — Query datasource metadata and column values.</summary>
| Method | Description | Endpoint |
|--------|----------|-------------|
| `GET` | [Get possible values for a datasource column](/developer-docs/api/get-possible-values-for-a-datasource-column) | `/api/v1/datasource/{datasource_type}/{datasource_id}/column/{column_name}/values/` |
| `POST` | [Validate a SQL expression against a datasource](/developer-docs/api/validate-a-sql-expression-against-a-datasource) | `/api/v1/datasource/{datasource_type}/{datasource_id}/validate_expression/` |
</details>
<details>
<summary><strong>Advanced Data Type</strong> (2 endpoints) — Advanced data type operations and conversions.</summary>
<summary><strong>Advanced Data Type</strong> (2 endpoints) — Endpoints for advanced data type operations and conversions.</summary>
| Method | Description | Endpoint |
|--------|----------|-------------|
| `GET` | [Return an AdvancedDataTypeResponse](/developer-docs/api/return-an-advanced-data-type-response) | `/api/v1/advanced_data_type/convert` |
| `GET` | [Return an AdvancedDataTypeResponse](/developer-docs/api/return-an-advanceddatatyperesponse) | `/api/v1/advanced_data_type/convert` |
| `GET` | [Return a list of available advanced data types](/developer-docs/api/return-a-list-of-available-advanced-data-types) | `/api/v1/advanced_data_type/types` |
</details>
@@ -324,32 +320,32 @@ curl -H "Authorization: Bearer YOUR_ACCESS_TOKEN" \
#### Sharing & Embedding
<details>
<summary><strong>Dashboard Permanent Link</strong> (2 endpoints) — Permanent links to dashboard states.</summary>
<summary><strong>Dashboard Permanent Link</strong> (2 endpoints) — Create and retrieve permanent links to dashboard states.</summary>
| Method | Description | Endpoint |
|--------|----------|-------------|
| `POST` | [Create a new dashboard's permanent link](/developer-docs/api/create-a-new-dashboards-permanent-link) | `/api/v1/dashboard/{pk}/permalink` |
| `GET` | [Get dashboard's permanent link state](/developer-docs/api/get-dashboards-permanent-link-state) | `/api/v1/dashboard/permalink/{key}` |
| `POST` | [Create a new dashboard's permanent link](/developer-docs/api/create-a-new-dashboard-s-permanent-link) | `/api/v1/dashboard/{pk}/permalink` |
| `GET` | [Get dashboard's permanent link state](/developer-docs/api/get-dashboard-s-permanent-link-state) | `/api/v1/dashboard/permalink/{key}` |
</details>
<details>
<summary><strong>Explore Permanent Link</strong> (2 endpoints) — Permanent links to chart explore states.</summary>
<summary><strong>Explore Permanent Link</strong> (2 endpoints) — Create and retrieve permanent links to chart explore states.</summary>
| Method | Description | Endpoint |
|--------|----------|-------------|
| `POST` | [Create a new permanent link (explore-permalink)](/developer-docs/api/create-a-new-permanent-link-explore-permalink) | `/api/v1/explore/permalink` |
| `GET` | [Get chart's permanent link state](/developer-docs/api/get-charts-permanent-link-state) | `/api/v1/explore/permalink/{key}` |
| `GET` | [Get chart's permanent link state](/developer-docs/api/get-chart-s-permanent-link-state) | `/api/v1/explore/permalink/{key}` |
</details>
<details>
<summary><strong>SQL Lab Permanent Link</strong> (2 endpoints) — Permanent links to SQL Lab states.</summary>
<summary><strong>SQL Lab Permanent Link</strong> (2 endpoints) — Create and retrieve permanent links to SQL Lab states.</summary>
| Method | Description | Endpoint |
|--------|----------|-------------|
| `POST` | [Create a new permanent link (sqllab-permalink)](/developer-docs/api/create-a-new-permanent-link-sqllab-permalink) | `/api/v1/sqllab/permalink` |
| `GET` | [Get permanent link state for SQLLab editor.](/developer-docs/api/get-permanent-link-state-for-sql-lab-editor) | `/api/v1/sqllab/permalink/{key}` |
| `GET` | [Get permanent link state for SQLLab editor.](/developer-docs/api/get-permanent-link-state-for-sqllab-editor) | `/api/v1/sqllab/permalink/{key}` |
</details>
@@ -367,10 +363,10 @@ curl -H "Authorization: Bearer YOUR_ACCESS_TOKEN" \
| Method | Description | Endpoint |
|--------|----------|-------------|
| `POST` | [Create a dashboard's filter state](/developer-docs/api/create-a-dashboards-filter-state) | `/api/v1/dashboard/{pk}/filter_state` |
| `DELETE` | [Delete a dashboard's filter state value](/developer-docs/api/delete-a-dashboards-filter-state-value) | `/api/v1/dashboard/{pk}/filter_state/{key}` |
| `GET` | [Get a dashboard's filter state value](/developer-docs/api/get-a-dashboards-filter-state-value) | `/api/v1/dashboard/{pk}/filter_state/{key}` |
| `PUT` | [Update a dashboard's filter state value](/developer-docs/api/update-a-dashboards-filter-state-value) | `/api/v1/dashboard/{pk}/filter_state/{key}` |
| `POST` | [Create a dashboard's filter state](/developer-docs/api/create-a-dashboard-s-filter-state) | `/api/v1/dashboard/{pk}/filter_state` |
| `DELETE` | [Delete a dashboard's filter state value](/developer-docs/api/delete-a-dashboard-s-filter-state-value) | `/api/v1/dashboard/{pk}/filter_state/{key}` |
| `GET` | [Get a dashboard's filter state value](/developer-docs/api/get-a-dashboard-s-filter-state-value) | `/api/v1/dashboard/{pk}/filter_state/{key}` |
| `PUT` | [Update a dashboard's filter state value](/developer-docs/api/update-a-dashboard-s-filter-state-value) | `/api/v1/dashboard/{pk}/filter_state/{key}` |
</details>
@@ -410,7 +406,7 @@ curl -H "Authorization: Bearer YOUR_ACCESS_TOKEN" \
#### Security & Access Control
<details>
<summary><strong>Security Roles</strong> (11 endpoints) — Manage security roles and their permissions.</summary>
<summary><strong>Security Roles</strong> (10 endpoints) — Manage security roles and their permissions.</summary>
| Method | Description | Endpoint |
|--------|----------|-------------|
@@ -420,7 +416,6 @@ curl -H "Authorization: Bearer YOUR_ACCESS_TOKEN" \
| `DELETE` | [Delete security roles by pk](/developer-docs/api/delete-security-roles-by-pk) | `/api/v1/security/roles/{pk}` |
| `GET` | [Get security roles by pk](/developer-docs/api/get-security-roles-by-pk) | `/api/v1/security/roles/{pk}` |
| `PUT` | [Update security roles by pk](/developer-docs/api/update-security-roles-by-pk) | `/api/v1/security/roles/{pk}` |
| `PUT` | [Update security roles by role_id groups](/developer-docs/api/update-security-roles-by-role-id-groups) | `/api/v1/security/roles/{role_id}/groups` |
| `POST` | [Create security roles by role_id permissions](/developer-docs/api/create-security-roles-by-role-id-permissions) | `/api/v1/security/roles/{role_id}/permissions` |
| `GET` | [Get security roles by role_id permissions](/developer-docs/api/get-security-roles-by-role-id-permissions) | `/api/v1/security/roles/{role_id}/permissions/` |
| `PUT` | [Update security roles by role_id users](/developer-docs/api/update-security-roles-by-role-id-users) | `/api/v1/security/roles/{role_id}/users` |
@@ -468,7 +463,7 @@ curl -H "Authorization: Bearer YOUR_ACCESS_TOKEN" \
</details>
<details>
<summary><strong>Security Permissions on Resources (View Menus)</strong> (6 endpoints) — Permission-resource mappings.</summary>
<summary><strong>Security Permissions on Resources (View Menus)</strong> (6 endpoints) — Manage permission-resource mappings.</summary>
| Method | Description | Endpoint |
|--------|----------|-------------|
@@ -482,7 +477,7 @@ curl -H "Authorization: Bearer YOUR_ACCESS_TOKEN" \
</details>
<details>
<summary><strong>Row Level Security</strong> (8 endpoints) — Manage row-level security rules for data access.</summary>
<summary><strong>Row Level Security</strong> (8 endpoints) — Manage row-level security rules for data access control.</summary>
| Method | Description | Endpoint |
|--------|----------|-------------|
@@ -500,7 +495,7 @@ curl -H "Authorization: Bearer YOUR_ACCESS_TOKEN" \
#### Import/Export & Administration
<details>
<summary><strong>Import/export</strong> (2 endpoints) — Import and export Superset assets.</summary>
<summary><strong>Import/export</strong> (2 endpoints) — Import and export Superset assets (dashboards, charts, databases).</summary>
| Method | Description | Endpoint |
|--------|----------|-------------|
@@ -533,12 +528,11 @@ curl -H "Authorization: Bearer YOUR_ACCESS_TOKEN" \
#### User & System
<details>
<summary><strong>Current User</strong> (3 endpoints) — Get information about the authenticated user.</summary>
<summary><strong>Current User</strong> (2 endpoints) — Get information about the currently authenticated user.</summary>
| Method | Description | Endpoint |
|--------|----------|-------------|
| `GET` | [Get the user object](/developer-docs/api/get-the-user-object) | `/api/v1/me/` |
| `PUT` | [Update the current user](/developer-docs/api/update-the-current-user) | `/api/v1/me/` |
| `GET` | [Get the user roles](/developer-docs/api/get-the-user-roles) | `/api/v1/me/roles/` |
</details>
@@ -588,60 +582,6 @@ curl -H "Authorization: Bearer YOUR_ACCESS_TOKEN" \
</details>
#### Other
<details>
<summary><strong>Security Groups</strong> (6 endpoints) — Manage security groups.</summary>
| Method | Description | Endpoint |
|--------|----------|-------------|
| `GET` | [Get security groups](/developer-docs/api/get-security-groups) | `/api/v1/security/groups/` |
| `POST` | [Create security groups](/developer-docs/api/create-security-groups) | `/api/v1/security/groups/` |
| `GET` | [Get security groups info](/developer-docs/api/get-security-groups-info) | `/api/v1/security/groups/_info` |
| `DELETE` | [Delete security groups by pk](/developer-docs/api/delete-security-groups-by-pk) | `/api/v1/security/groups/{pk}` |
| `GET` | [Get security groups by pk](/developer-docs/api/get-security-groups-by-pk) | `/api/v1/security/groups/{pk}` |
| `PUT` | [Update security groups by pk](/developer-docs/api/update-security-groups-by-pk) | `/api/v1/security/groups/{pk}` |
</details>
<details>
<summary><strong>Themes</strong> (14 endpoints) — Manage UI themes for customizing Superset's appearance.</summary>
| Method | Description | Endpoint |
|--------|----------|-------------|
| `DELETE` | [Bulk delete themes](/developer-docs/api/bulk-delete-themes) | `/api/v1/theme/` |
| `GET` | [Get a list of themes](/developer-docs/api/get-a-list-of-themes) | `/api/v1/theme/` |
| `POST` | [Create a theme](/developer-docs/api/create-a-theme) | `/api/v1/theme/` |
| `GET` | [Get metadata information about this API resource (theme--info)](/developer-docs/api/get-metadata-information-about-this-api-resource-theme-info) | `/api/v1/theme/_info` |
| `DELETE` | [Delete a theme](/developer-docs/api/delete-a-theme) | `/api/v1/theme/{pk}` |
| `GET` | [Get a theme](/developer-docs/api/get-a-theme) | `/api/v1/theme/{pk}` |
| `PUT` | [Update a theme](/developer-docs/api/update-a-theme) | `/api/v1/theme/{pk}` |
| `PUT` | [Set a theme as the system dark theme](/developer-docs/api/set-a-theme-as-the-system-dark-theme) | `/api/v1/theme/{pk}/set_system_dark` |
| `PUT` | [Set a theme as the system default theme](/developer-docs/api/set-a-theme-as-the-system-default-theme) | `/api/v1/theme/{pk}/set_system_default` |
| `GET` | [Download multiple themes as YAML files](/developer-docs/api/download-multiple-themes-as-yaml-files) | `/api/v1/theme/export/` |
| `POST` | [Import themes from a ZIP file](/developer-docs/api/import-themes-from-a-zip-file) | `/api/v1/theme/import/` |
| `GET` | [Get related fields data (theme-related-column-name)](/developer-docs/api/get-related-fields-data-theme-related-column-name) | `/api/v1/theme/related/{column_name}` |
| `DELETE` | [Clear the system dark theme](/developer-docs/api/clear-the-system-dark-theme) | `/api/v1/theme/unset_system_dark` |
| `DELETE` | [Clear the system default theme](/developer-docs/api/clear-the-system-default-theme) | `/api/v1/theme/unset_system_default` |
</details>
<details>
<summary><strong>UserRegistrationsRestAPI</strong> (8 endpoints) — Manage user self-registration records.</summary>
| Method | Description | Endpoint |
|--------|----------|-------------|
| `GET` | [Get security user registrations](/developer-docs/api/get-security-user-registrations) | `/api/v1/security/user_registrations/` |
| `POST` | [Create security user registrations](/developer-docs/api/create-security-user-registrations) | `/api/v1/security/user_registrations/` |
| `GET` | [Get security user registrations info](/developer-docs/api/get-security-user-registrations-info) | `/api/v1/security/user_registrations/_info` |
| `DELETE` | [Delete security user registrations by pk](/developer-docs/api/delete-security-user-registrations-by-pk) | `/api/v1/security/user_registrations/{pk}` |
| `GET` | [Get security user registrations by pk](/developer-docs/api/get-security-user-registrations-by-pk) | `/api/v1/security/user_registrations/{pk}` |
| `PUT` | [Update security user registrations by pk](/developer-docs/api/update-security-user-registrations-by-pk) | `/api/v1/security/user_registrations/{pk}` |
| `GET` | [Get distinct values from field data (security-user-registrations-distinct-column-name)](/developer-docs/api/get-distinct-values-from-field-data-security-user-registrations-distinct-column-name) | `/api/v1/security/user_registrations/distinct/{column_name}` |
| `GET` | [Get related fields data (security-user-registrations-related-column-name)](/developer-docs/api/get-related-fields-data-security-user-registrations-related-column-name) | `/api/v1/security/user_registrations/related/{column_name}` |
</details>
---
### Additional Resources

View File

@@ -156,7 +156,7 @@ function SelectFilters() {
## Import
```tsx
import { DropdownContainer } from '@superset-ui/core/components';
import { DropdownContainer } from '@superset/components';
```
---

View File

@@ -186,7 +186,7 @@ function JustifyAlign() {
## Import
```tsx
import { Flex } from '@superset-ui/core/components';
import { Flex } from '@superset/components';
```
---

View File

@@ -181,7 +181,7 @@ function AlignmentDemo() {
## Import
```tsx
import { Grid } from '@superset-ui/core/components';
import { Grid } from '@superset/components';
```
---

View File

@@ -128,7 +128,7 @@ function RightSidebar() {
## Import
```tsx
import { Layout } from '@superset-ui/core/components';
import { Layout } from '@superset/components';
```
---

View File

@@ -163,7 +163,7 @@ function FullMetadata() {
## Import
```tsx
import { MetadataBar } from '@superset-ui/core/components';
import { MetadataBar } from '@superset/components';
```
---

View File

@@ -157,7 +157,7 @@ function SpaceSizes() {
## Import
```tsx
import { Space } from '@superset-ui/core/components';
import { Space } from '@superset/components';
```
---

View File

@@ -300,7 +300,7 @@ function LoadingTable() {
## Import
```tsx
import { Table } from '@superset-ui/core/components';
import { Table } from '@superset/components';
```
---

View File

@@ -23,16 +23,7 @@ sidebar_position: 0
under the License.
-->
import { ComponentIndex } from '@site/src/components/ui-components';
import componentData from '@site/static/data/components.json';
# UI Components
<ComponentIndex data={componentData} />
---
## Design System
# Superset Design System
A design system is a complete set of standards intended to manage design at scale using reusable components and patterns.
@@ -44,6 +35,19 @@ The Superset Design System uses [Atomic Design](https://bradfrost.com/blog/post/
<img src="/img/atomic-design.png" alt="Atoms = Foundations, Molecules = Components, Organisms = Patterns, Templates = Templates, Pages / Screens = Features" style={{maxWidth: '100%'}} />
---
## Component Library
Interactive documentation for Superset's UI component library. **53 components** documented across 2 categories.
### [Core Components](./ui/)
46 components — Buttons, inputs, modals, selects, and other fundamental UI elements.
### [Layout Components](./design-system/)
7 components — Grid, Layout, Table, Flex, Space, and container components for page structure.
## Usage
All components are exported from `@superset/components`:

View File

@@ -204,7 +204,7 @@ function Demo() {
## Import
```tsx
import { AutoComplete } from '@superset-ui/core/components';
import { AutoComplete } from '@superset/components';
```
---

View File

@@ -129,7 +129,7 @@ function Demo() {
## Import
```tsx
import { Avatar } from '@superset-ui/core/components';
import { Avatar } from '@superset/components';
```
---

View File

@@ -149,7 +149,7 @@ function ColorGallery() {
## Import
```tsx
import { Badge } from '@superset-ui/core/components';
import { Badge } from '@superset/components';
```
---

View File

@@ -82,7 +82,7 @@ function Demo() {
## Import
```tsx
import { Breadcrumb } from '@superset-ui/core/components';
import { Breadcrumb } from '@superset/components';
```
---

View File

@@ -43,7 +43,7 @@ The Button component from Superset's UI library.
<StoryWithControls
component="Button"
props={{
buttonStyle: "primary",
buttonStyle: "default",
buttonSize: "default",
children: "Button!"
}}
@@ -111,7 +111,7 @@ Edit the code below to experiment with the component:
function Demo() {
return (
<Button
buttonStyle="primary"
buttonStyle="default"
buttonSize="default"
>
Button!
@@ -124,14 +124,14 @@ function Demo() {
| Prop | Type | Default | Description |
|------|------|---------|-------------|
| `buttonStyle` | `string` | `"primary"` | The style variant of the button. |
| `buttonStyle` | `string` | `"default"` | The style variant of the button. |
| `buttonSize` | `string` | `"default"` | The size of the button. |
| `children` | `string` | `"Button!"` | The button text or content. |
## Import
```tsx
import { Button } from '@superset-ui/core/components';
import { Button } from '@superset/components';
```
---

View File

@@ -77,7 +77,7 @@ function Demo() {
## Import
```tsx
import { ButtonGroup } from '@superset-ui/core/components';
import { ButtonGroup } from '@superset/components';
```
---

View File

@@ -68,7 +68,7 @@ function Demo() {
## Import
```tsx
import { CachedLabel } from '@superset-ui/core/components';
import { CachedLabel } from '@superset/components';
```
---

View File

@@ -131,7 +131,7 @@ function CardStates() {
## Import
```tsx
import { Card } from '@superset-ui/core/components';
import { Card } from '@superset/components';
```
---

View File

@@ -130,7 +130,7 @@ function SelectAllDemo() {
## Import
```tsx
import { Checkbox } from '@superset-ui/core/components';
import { Checkbox } from '@superset/components';
```
---

View File

@@ -95,7 +95,7 @@ function Demo() {
## Import
```tsx
import { Collapse } from '@superset-ui/core/components';
import { Collapse } from '@superset/components';
```
---

View File

@@ -99,7 +99,7 @@ function Demo() {
## Import
```tsx
import { DatePicker } from '@superset-ui/core/components';
import { DatePicker } from '@superset/components';
```
---

View File

@@ -133,7 +133,7 @@ function Demo() {
## Import
```tsx
import { Divider } from '@superset-ui/core/components';
import { Divider } from '@superset/components';
```
---

View File

@@ -161,7 +161,7 @@ function Demo() {
## Import
```tsx
import { EditableTitle } from '@superset-ui/core/components';
import { EditableTitle } from '@superset/components';
```
---

View File

@@ -136,7 +136,7 @@ function Demo() {
## Import
```tsx
import { EmptyState } from '@superset-ui/core/components';
import { EmptyState } from '@superset/components';
```
---

View File

@@ -85,7 +85,7 @@ function Demo() {
## Import
```tsx
import { FaveStar } from '@superset-ui/core/components';
import { FaveStar } from '@superset/components';
```
---

View File

@@ -95,7 +95,7 @@ function Demo() {
## Import
```tsx
import { IconButton } from '@superset-ui/core/components';
import { IconButton } from '@superset/components';
```
---

View File

@@ -241,7 +241,7 @@ function IconWithText() {
## Import
```tsx
import { Icons } from '@superset-ui/core/components';
import { Icons } from '@superset/components';
```
---

View File

@@ -89,7 +89,7 @@ function Demo() {
## Import
```tsx
import { IconTooltip } from '@superset-ui/core/components';
import { IconTooltip } from '@superset/components';
```
---

View File

@@ -95,7 +95,7 @@ function Demo() {
## Import
```tsx
import { InfoTooltip } from '@superset-ui/core/components';
import { InfoTooltip } from '@superset/components';
```
---

View File

@@ -151,7 +151,7 @@ function Demo() {
## Import
```tsx
import { Input } from '@superset-ui/core/components';
import { Input } from '@superset/components';
```
---

View File

@@ -94,7 +94,7 @@ function Demo() {
## Import
```tsx
import { Label } from '@superset-ui/core/components';
import { Label } from '@superset/components';
```
---

View File

@@ -106,7 +106,7 @@ function Demo() {
## Import
```tsx
import { List } from '@superset-ui/core/components';
import { List } from '@superset/components';
```
---

View File

@@ -121,7 +121,7 @@ function Demo() {
## Import
```tsx
import { ListViewCard } from '@superset-ui/core/components';
import { ListViewCard } from '@superset/components';
```
---

View File

@@ -176,7 +176,7 @@ function ContextualDemo() {
## Import
```tsx
import { Loading } from '@superset-ui/core/components';
import { Loading } from '@superset/components';
```
---

View File

@@ -163,7 +163,7 @@ function MenuWithIcons() {
## Import
```tsx
import { Menu } from '@superset-ui/core/components';
import { Menu } from '@superset/components';
```
---

View File

@@ -196,7 +196,7 @@ function ConfirmationDialogs() {
## Import
```tsx
import { Modal } from '@superset-ui/core/components';
import { Modal } from '@superset/components';
```
---

View File

@@ -181,7 +181,7 @@ function DraggableModal() {
## Import
```tsx
import { ModalTrigger } from '@superset-ui/core/components';
import { ModalTrigger } from '@superset/components';
```
---

View File

@@ -188,7 +188,7 @@ function RichPopover() {
## Import
```tsx
import { Popover } from '@superset-ui/core/components';
import { Popover } from '@superset/components';
```
---

View File

@@ -195,7 +195,7 @@ function CustomColors() {
## Import
```tsx
import { ProgressBar } from '@superset-ui/core/components';
import { ProgressBar } from '@superset/components';
```
---

View File

@@ -126,7 +126,7 @@ function VerticalDemo() {
## Import
```tsx
import { Radio } from '@superset-ui/core/components';
import { Radio } from '@superset/components';
```
---

View File

@@ -74,7 +74,7 @@ function Demo() {
## Import
```tsx
import { SafeMarkdown } from '@superset-ui/core/components';
import { SafeMarkdown } from '@superset/components';
```
---

View File

@@ -297,7 +297,7 @@ function OneLineDemo() {
## Import
```tsx
import { Select } from '@superset-ui/core/components';
import { Select } from '@superset/components';
```
---

View File

@@ -129,7 +129,7 @@ function Demo() {
## Import
```tsx
import { Skeleton } from '@superset-ui/core/components';
import { Skeleton } from '@superset/components';
```
---

View File

@@ -242,7 +242,7 @@ function VerticalDemo() {
## Import
```tsx
import { Slider } from '@superset-ui/core/components';
import { Slider } from '@superset/components';
```
---

View File

@@ -261,7 +261,7 @@ function DotAndSmall() {
## Import
```tsx
import { Steps } from '@superset-ui/core/components';
import { Steps } from '@superset/components';
```
---

View File

@@ -182,7 +182,7 @@ function SettingsPanel() {
## Import
```tsx
import { Switch } from '@superset-ui/core/components';
import { Switch } from '@superset/components';
```
---

View File

@@ -52,6 +52,12 @@ function Demo() {
## Import
```tsx
import { TableCollection } from '@superset/components';
```
---
:::tip[Improve this page]

View File

@@ -283,7 +283,7 @@ function SortingDemo() {
## Import
```tsx
import { TableView } from '@superset-ui/core/components';
import { TableView } from '@superset/components';
```
---

View File

@@ -212,7 +212,7 @@ function IconTabs() {
## Import
```tsx
import { Tabs } from '@superset-ui/core/components';
import { Tabs } from '@superset/components';
```
---

View File

@@ -161,7 +161,7 @@ function StartStop() {
## Import
```tsx
import { Timer } from '@superset-ui/core/components';
import { Timer } from '@superset/components';
```
---

View File

@@ -160,7 +160,7 @@ function Triggers() {
## Import
```tsx
import { Tooltip } from '@superset-ui/core/components';
import { Tooltip } from '@superset/components';
```
---

View File

@@ -257,7 +257,7 @@ function LinesAndIcons() {
## Import
```tsx
import { Tree } from '@superset-ui/core/components';
import { Tree } from '@superset/components';
```
---

View File

@@ -275,7 +275,7 @@ function TreeLinesDemo() {
## Import
```tsx
import { TreeSelect } from '@superset-ui/core/components';
import { TreeSelect } from '@superset/components';
```
---

View File

@@ -225,7 +225,7 @@ function TextStyles() {
## Import
```tsx
import { Typography } from '@superset-ui/core/components';
import { Typography } from '@superset/components';
```
---

View File

@@ -115,7 +115,7 @@ function CustomTitle() {
## Import
```tsx
import { UnsavedChangesModal } from '@superset-ui/core/components';
import { UnsavedChangesModal } from '@superset/components';
```
---

View File

@@ -125,7 +125,7 @@ function DragDrop() {
## Import
```tsx
import { Upload } from '@superset-ui/core/components';
import { Upload } from '@superset/components';
```
---

View File

@@ -485,7 +485,7 @@ Frontend assets (TypeScript, JavaScript, CSS, and images) must be compiled in or
First, be sure you are using the following versions of Node.js and npm:
- `Node.js`: Version 22 (LTS)
- `Node.js`: Version 20
- `npm`: Version 10
We recommend using [nvm](https://github.com/nvm-sh/nvm) to manage your node environment:

View File

@@ -43,7 +43,7 @@ Extensions can provide:
## UI Components for Extensions
Extension developers have access to pre-built UI components via `@apache-superset/core/components`. Browse all available components on the [UI Components](/developer-docs/components/) page and filter by **Extension Compatible** to see components available to extensions.
Extension developers have access to pre-built UI components via `@apache-superset/core/components`. Browse all available components on the [UI Components](/docs/components/) page and filter by **Extension Compatible** to see components available to extensions.
## Next Steps

View File

@@ -91,7 +91,7 @@ or a view.
When working with tables, the solution would be to create a table that contains all the fields
needed for your analysis, most likely through some scheduled batch process.
A view is a simple logical layer that abstracts an arbitrary SQL query as a virtual table. This can
A view is a simple logical layer that abstracts an arbitrary SQL query as a virtual table. This can
allow you to join and union multiple tables and to apply some transformation using arbitrary SQL
expressions. The limitation there is your database performance, as Superset effectively will run a
query on top of your query (view). A good practice may be to limit yourself to joining your main

Some files were not shown because too many files have changed in this diff.