Compare commits


1 Commit

Author SHA1 Message Date
Maxime Beauchemin
59ddd52789 fix(sqllab): Set explicit Content-Type headers to prevent HTTP 406 errors
Fixes #36072 where SQL Lab queries with WHERE clauses failed with
"Database error: Not acceptable" in Superset v4.1+.

Root cause: Flask 2.3+ (upgraded in v4.1.0) has stricter content
negotiation that could return HTTP 406 when Content-Type headers
aren't explicitly set, particularly with ENABLE_PROXY_FIX or certain
Accept header configurations.

Changes:
- Add explicit Content-Type headers to /api/v1/sqllab/execute/ and
  /api/v1/sqllab/results/ endpoints
- Improve error handling with try-except blocks for result fetching
  and JSON serialization
- Add targeted integration test for WHERE clause queries

The fix ensures Flask 2.3+ doesn't attempt content negotiation that
could fail, while maintaining backward compatibility.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-13 10:47:44 -08:00
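To make the fix concrete, here is a minimal sketch of the pattern the commit message describes: returning a response with an explicit Content-Type so Flask never has to negotiate one. Only the endpoint path comes from the commit message; the app, handler body, and payload shape are illustrative assumptions, not the actual Superset code.

```python
# Minimal sketch, not Superset's actual handler: the endpoint path is taken
# from the commit message, everything else is hypothetical.
import json

from flask import Flask, Response, request

app = Flask(__name__)


@app.route("/api/v1/sqllab/execute/", methods=["POST"])
def execute_sql() -> Response:
    try:
        payload = request.get_json(force=True)
        result = {"status": "success", "sql": payload.get("sql")}  # placeholder result
        body = json.dumps(result)
    except Exception as ex:  # mirrors the commit's try/except around serialization
        return Response(
            json.dumps({"error": str(ex)}), status=500, mimetype="application/json"
        )
    # Setting mimetype explicitly means Flask 2.3+ never inspects the Accept
    # header to pick a Content-Type, which is the negotiation the commit
    # blames for the HTTP 406 responses.
    return Response(body, status=200, mimetype="application/json")
```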
2283 changed files with 56752 additions and 190621 deletions


@@ -1,15 +0,0 @@
{
"hooks": {
"PreToolUse": [
{
"matcher": "Bash",
"hooks": [
{
"type": "command",
"command": "jq -r '.tool_input.command // \"\"' | grep -qE '^git commit' && cd \"$CLAUDE_PROJECT_DIR\" && echo '🔍 Running pre-commit before commit...' && pre-commit run || true"
}
]
}
]
}
}


@@ -1,41 +0,0 @@
#
# Licensed to the Apache Software Foundation (ASF) under one or more
# contributor license agreements. See the NOTICE file distributed with
# this work for additional information regarding copyright ownership.
# The ASF licenses this file to You under the Apache License, Version 2.0
# (the "License"); you may not use this file except in compliance with
# the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
# Auto-configure Docker Compose for multi-instance support
# Requires direnv: https://direnv.net/
#
# Install: brew install direnv (or apt install direnv)
# Setup: Add 'eval "$(direnv hook bash)"' to ~/.bashrc (or ~/.zshrc)
# Allow: Run 'direnv allow' in this directory once
# Generate unique project name from directory
export COMPOSE_PROJECT_NAME=$(basename "$PWD" | tr '[:upper:]' '[:lower:]' | sed 's/[^a-z0-9]/-/g')
# Find available ports sequentially to avoid collisions
_is_free() { ! lsof -i ":$1" &>/dev/null 2>&1; }
_p=80; while ! _is_free $_p; do ((_p++)); done; export NGINX_PORT=$_p
_p=8088; while ! _is_free $_p; do ((_p++)); done; export SUPERSET_PORT=$_p
_p=9000; while ! _is_free $_p; do ((_p++)); done; export NODE_PORT=$_p
_p=8080; while ! _is_free $_p || [ $_p -eq $NGINX_PORT ]; do ((_p++)); done; export WEBSOCKET_PORT=$_p
_p=8081; while ! _is_free $_p || [ $_p -eq $WEBSOCKET_PORT ]; do ((_p++)); done; export CYPRESS_PORT=$_p
_p=5432; while ! _is_free $_p; do ((_p++)); done; export DATABASE_PORT=$_p
_p=6379; while ! _is_free $_p; do ((_p++)); done; export REDIS_PORT=$_p
unset _p _is_free
echo "🐳 Superset configured: http://localhost:$SUPERSET_PORT (dev: localhost:$NODE_PORT)"

.github/CODEOWNERS

@@ -33,7 +33,6 @@
# Notify PMC members of changes to extension-related files
/docs/developer_portal/extensions/ @michael-s-molina @villebro @rusackas
/superset-core/ @michael-s-molina @villebro @geido @eschutho @rusackas @kgabryje
/superset-extensions-cli/ @michael-s-molina @villebro @geido @eschutho @rusackas @kgabryje
/superset/core/ @michael-s-molina @villebro @geido @eschutho @rusackas @kgabryje


@@ -41,8 +41,8 @@ body:
label: Superset version
options:
- master / latest-dev
- "6.0.0"
- "5.0.0"
- "4.1.3"
validations:
required: true
- type: dropdown


@@ -10,7 +10,7 @@ jobs:
steps:
- name: Check if the PR is a draft
id: check-draft
uses: actions/github-script@v8
uses: actions/github-script@v6
with:
script: |
const isDraft = context.payload.pull_request.draft;


@@ -12,17 +12,10 @@ updates:
# not until React >= 18.0.0
- dependency-name: "storybook"
- dependency-name: "@storybook*"
# remark-gfm v4+ requires react-markdown v9+, which needs React 18
- dependency-name: "remark-gfm"
- dependency-name: "react-markdown"
# JSDOM v30 doesn't play well with Jest v30
# Source: https://jestjs.io/blog#known-issues
# GH thread: https://github.com/jsdom/jsdom/issues/3492
- dependency-name: "jest-environment-jsdom"
# `@swc/plugin-transform-imports` doesn't work with current Webpack-SWC hybrid setup
# See https://github.com/apache/superset/pull/37384#issuecomment-3793991389
# TODO: remove the plugin once Lodash usage has been migrated to a more readily tree-shakeable alternative
- dependency-name: "@swc/plugin-transform-imports"
directory: "/superset-frontend/"
schedule:
interval: "daily"


@@ -117,19 +117,6 @@ testdata() {
say "::endgroup::"
}
playwright_testdata() {
cd "$GITHUB_WORKSPACE"
say "::group::Load all examples for Playwright tests"
# must specify PYTHONPATH to make `tests.superset_test_config` importable
export PYTHONPATH="$GITHUB_WORKSPACE"
pip install -e .
superset db upgrade
superset load_test_users
superset load_examples
superset init
say "::endgroup::"
}
celery-worker() {
cd "$GITHUB_WORKSPACE"
say "::group::Start Celery worker"


@@ -32,7 +32,7 @@ jobs:
steps:
- name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
uses: actions/checkout@v6
uses: actions/checkout@v5
with:
persist-credentials: true
ref: master


@@ -31,7 +31,7 @@ jobs:
- name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
if: steps.check_queued.outputs.count >= 20
uses: actions/checkout@v6
uses: actions/checkout@v5
- name: Cancel duplicate workflow runs
if: steps.check_queued.outputs.count >= 20


@@ -18,7 +18,7 @@ jobs:
runs-on: ubuntu-22.04
steps:
- name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
uses: actions/checkout@v6
uses: actions/checkout@v5
with:
persist-credentials: false
submodules: recursive


@@ -25,7 +25,7 @@ jobs:
pull-requests: write
steps:
- name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
uses: actions/checkout@v6
uses: actions/checkout@v5
- name: Check and notify
uses: actions/github-script@v8
with:
@@ -69,7 +69,7 @@ jobs:
`❗ @${pull.user.login} Your base branch \`${currentBranch}\` has ` +
'also updated `superset/migrations`.\n' +
'\n' +
'**Please consider rebasing your branch and [resolving potential db migration conflicts](https://superset.apache.org/docs/contributing/development#merging-db-migrations).**',
'**Please consider rebasing your branch and [resolving potential db migration conflicts](https://github.com/apache/superset/blob/master/CONTRIBUTING.md#merging-db-migrations).**',
});
}
}


@@ -71,7 +71,7 @@ jobs:
id-token: write
steps:
- name: Checkout repository
uses: actions/checkout@v6
uses: actions/checkout@v5
with:
fetch-depth: 1


@@ -31,7 +31,7 @@ jobs:
steps:
- name: Checkout repository
uses: actions/checkout@v6
uses: actions/checkout@v5
- name: Check for file changes
id: check


@@ -27,7 +27,7 @@ jobs:
runs-on: ubuntu-24.04
steps:
- name: "Checkout Repository"
uses: actions/checkout@v6
uses: actions/checkout@v5
- name: "Dependency Review"
uses: actions/dependency-review-action@v4
continue-on-error: true
@@ -53,7 +53,7 @@ jobs:
runs-on: ubuntu-22.04
steps:
- name: "Checkout Repository"
uses: actions/checkout@v6
uses: actions/checkout@v5
- name: Setup Python
uses: ./.github/actions/setup-backend/


@@ -42,7 +42,7 @@ jobs:
steps:
- name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
uses: actions/checkout@v6
uses: actions/checkout@v5
with:
persist-credentials: false
@@ -101,23 +101,6 @@ jobs:
docker images $IMAGE_TAG
docker history $IMAGE_TAG
# Scan for vulnerabilities in built container image after pushes to mainline branch.
- name: Run Trivy container image vulnerability scan
if: github.event_name == 'push' && github.ref == 'refs/heads/master' && (steps.check.outputs.python || steps.check.outputs.frontend || steps.check.outputs.docker) && matrix.build_preset == 'lean'
uses: aquasecurity/trivy-action@b6643a29fecd7f34b3597bc6acb0a98b03d33ff8 # v0.33.1
with:
image-ref: ${{ env.IMAGE_TAG }}
format: 'sarif'
output: 'trivy-results.sarif'
vuln-type: 'os'
severity: 'CRITICAL,HIGH'
ignore-unfixed: true
- name: Upload Trivy scan results to GitHub Security tab
if: github.event_name == 'push' && github.ref == 'refs/heads/master' && (steps.check.outputs.python || steps.check.outputs.frontend || steps.check.outputs.docker) && matrix.build_preset == 'lean'
uses: github/codeql-action/upload-sarif@1b168cd39490f61582a9beae412bb7057a6b2c4e # v4.31.8
with:
sarif_file: 'trivy-results.sarif'
- name: docker-compose sanity check
if: (steps.check.outputs.python || steps.check.outputs.frontend || steps.check.outputs.docker) && matrix.build_preset == 'dev'
shell: bash
@@ -134,7 +117,7 @@ jobs:
runs-on: ubuntu-24.04
steps:
- name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
uses: actions/checkout@v6
uses: actions/checkout@v5
with:
persist-credentials: false
- name: Check for file changes


@@ -28,8 +28,8 @@ jobs:
run:
working-directory: superset-embedded-sdk
steps:
- uses: actions/checkout@v6
- uses: actions/setup-node@v6
- uses: actions/checkout@v5
- uses: actions/setup-node@v5
with:
node-version-file: './superset-embedded-sdk/.nvmrc'
registry-url: 'https://registry.npmjs.org'

View File

@@ -18,8 +18,8 @@ jobs:
run:
working-directory: superset-embedded-sdk
steps:
- uses: actions/checkout@v6
- uses: actions/setup-node@v6
- uses: actions/checkout@v5
- uses: actions/setup-node@v5
with:
node-version-file: './superset-embedded-sdk/.nvmrc'
registry-url: 'https://registry.npmjs.org'


@@ -160,7 +160,7 @@ jobs:
runs-on: ubuntu-24.04
steps:
- name: "Checkout ${{ github.ref }} ( ${{ needs.ephemeral-env-label.outputs.sha }} : ${{steps.get-sha.outputs.sha}} )"
uses: actions/checkout@v6
uses: actions/checkout@v5
with:
ref: ${{ needs.ephemeral-env-label.outputs.sha }}
persist-credentials: false
@@ -220,7 +220,7 @@ jobs:
pull-requests: write
steps:
- uses: actions/checkout@v6
- uses: actions/checkout@v5
with:
persist-credentials: false


@@ -27,7 +27,7 @@ jobs:
runs-on: ubuntu-24.04
steps:
- name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
uses: actions/checkout@v6
uses: actions/checkout@v5
with:
persist-credentials: false
submodules: recursive


@@ -14,10 +14,10 @@ jobs:
runs-on: ubuntu-24.04
steps:
- name: Checkout Repository
uses: actions/checkout@v6
uses: actions/checkout@v5
- name: Set up Node.js
uses: actions/setup-node@v6
uses: actions/setup-node@v5
with:
node-version: '20'


@@ -17,7 +17,7 @@ jobs:
steps:
- name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
uses: actions/checkout@v6
uses: actions/checkout@v5
with:
persist-credentials: false


@@ -12,7 +12,7 @@ jobs:
steps:
- name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
uses: actions/checkout@v6
uses: actions/checkout@v5
with:
persist-credentials: false
submodules: recursive


@@ -15,7 +15,7 @@ jobs:
runs-on: ubuntu-24.04
steps:
- name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
uses: actions/checkout@v6
uses: actions/checkout@v5
with:
persist-credentials: false
submodules: recursive


@@ -16,7 +16,7 @@ jobs:
pull-requests: write
steps:
- name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
uses: actions/checkout@v6
uses: actions/checkout@v5
with:
persist-credentials: false
submodules: recursive


@@ -21,7 +21,7 @@ jobs:
python-version: ["current", "previous", "next"]
steps:
- name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
uses: actions/checkout@v6
uses: actions/checkout@v5
with:
persist-credentials: false
submodules: recursive
@@ -39,7 +39,7 @@ jobs:
echo "HOMEBREW_REPOSITORY=$HOMEBREW_REPOSITORY" >>"${GITHUB_ENV}"
brew install norwoodj/tap/helm-docs
- name: Setup Node.js
uses: actions/setup-node@v6
uses: actions/setup-node@v5
with:
node-version: '20'
@@ -54,7 +54,7 @@ jobs:
yarn install --immutable
- name: Cache pre-commit environments
uses: actions/cache@v5
uses: actions/cache@v4
with:
path: ~/.cache/pre-commit
key: pre-commit-v2-${{ runner.os }}-py${{ matrix.python-version }}-${{ hashFiles('.pre-commit-config.yaml') }}
@@ -71,9 +71,7 @@ jobs:
GIT_DIFF_EXIT_CODE=$?
if [ "${PRE_COMMIT_EXIT_CODE}" -ne 0 ] || [ "${GIT_DIFF_EXIT_CODE}" -ne 0 ]; then
if [ "${PRE_COMMIT_EXIT_CODE}" -ne 0 ]; then
echo "❌ Pre-commit check failed (exit code: ${PRE_COMMIT_EXIT_CODE})."
echo "🔍 Modified files:"
git diff --name-only
echo "❌ Pre-commit check failed (exit code: ${EXIT_CODE})."
else
echo "❌ Git working directory is dirty."
echo "📌 This likely means that pre-commit made changes that were not committed."


@@ -27,7 +27,7 @@ jobs:
pull-requests: write
steps:
- name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
uses: actions/checkout@v6
uses: actions/checkout@v5
with:
persist-credentials: false
submodules: recursive


@@ -26,7 +26,7 @@ jobs:
name: Bump version and publish package(s)
runs-on: ubuntu-24.04
steps:
- uses: actions/checkout@v6
- uses: actions/checkout@v5
with:
# pulls all commits (needed for lerna / semantic release to correctly version)
fetch-depth: 0
@@ -42,13 +42,13 @@ jobs:
- name: Install Node.js
if: env.HAS_TAGS
uses: actions/setup-node@v6
uses: actions/setup-node@v5
with:
node-version-file: './superset-frontend/.nvmrc'
- name: Cache npm
if: env.HAS_TAGS
uses: actions/cache@v5
uses: actions/cache@v4
with:
path: ~/.npm # npm cache files are stored in `~/.npm` on Linux/macOS
key: ${{ runner.OS }}-node-${{ hashFiles('**/package-lock.json') }}
@@ -62,7 +62,7 @@ jobs:
run: echo "dir=$(npm config get cache)" >> $GITHUB_OUTPUT
- name: Cache npm
if: env.HAS_TAGS
uses: actions/cache@v5
uses: actions/cache@v4
id: npm-cache # use this to check for `cache-hit` (`steps.npm-cache.outputs.cache-hit != 'true'`)
with:
path: ${{ steps.npm-cache-dir-path.outputs.dir }}


@@ -7,6 +7,12 @@ on:
# Manual trigger for testing
workflow_dispatch:
inputs:
max_age_hours:
description: 'Maximum age in hours before cleanup'
required: false
default: '48'
type: string
# Common environment variables
env:
@@ -32,5 +38,13 @@ jobs:
- name: Cleanup expired environments
run: |
echo "Cleaning up environments respecting TTL labels"
python -m showtime cleanup --respect-ttl
MAX_AGE="${{ github.event.inputs.max_age_hours || '48' }}"
# Validate max_age is numeric
if [[ ! "$MAX_AGE" =~ ^[0-9]+$ ]]; then
echo "❌ Invalid max_age_hours format: $MAX_AGE (must be numeric)"
exit 1
fi
echo "Cleaning up environments older than ${MAX_AGE}h"
python -m showtime cleanup --older-than "${MAX_AGE}h"


@@ -147,7 +147,7 @@ jobs:
- name: Checkout PR code (only if build needed)
if: steps.auth.outputs.authorized == 'true' && steps.check.outputs.build_needed == 'true'
uses: actions/checkout@v6
uses: actions/checkout@v5
with:
ref: ${{ steps.check.outputs.target_sha }}
persist-credentials: false


@@ -37,7 +37,7 @@ jobs:
- 16379:6379
steps:
- name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
uses: actions/checkout@v6
uses: actions/checkout@v5
with:
persist-credentials: false
submodules: recursive


@@ -51,7 +51,7 @@ jobs:
- 16379:6379
steps:
- name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
uses: actions/checkout@v6
uses: actions/checkout@v5
with:
persist-credentials: false
submodules: recursive
@@ -63,7 +63,7 @@ jobs:
with:
run: testdata
- name: Setup Node.js
uses: actions/setup-node@v6
uses: actions/setup-node@v5
with:
node-version-file: './superset-frontend/.nvmrc'
- name: Install npm dependencies


@@ -30,13 +30,13 @@ jobs:
runs-on: ubuntu-24.04
steps:
- name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
uses: actions/checkout@v6
uses: actions/checkout@v5
with:
persist-credentials: false
submodules: recursive
ref: master
- name: Set up Node.js
uses: actions/setup-node@v6
uses: actions/setup-node@v5
with:
node-version-file: './superset-frontend/.nvmrc'
- name: Install eyes-storybook dependencies


@@ -1,13 +1,6 @@
name: Docs Deployment
on:
# Deploy after integration tests complete on master
workflow_run:
workflows: ["Python-Integration"]
types: [completed]
branches: [master]
# Also allow manual trigger and direct pushes to docs
push:
paths:
- "docs/**"
@@ -37,14 +30,13 @@ jobs:
name: Build & Deploy
runs-on: ubuntu-24.04
steps:
- name: "Checkout ${{ github.event.workflow_run.head_sha || github.sha }}"
uses: actions/checkout@v6
- name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
uses: actions/checkout@v5
with:
ref: ${{ github.event.workflow_run.head_sha || github.sha }}
persist-credentials: false
submodules: recursive
- name: Set up Node.js
uses: actions/setup-node@v6
uses: actions/setup-node@v5
with:
node-version-file: './docs/.nvmrc'
- name: Setup Python
@@ -66,35 +58,6 @@ jobs:
working-directory: docs
run: |
yarn install --check-cache
- name: Download database diagnostics (if triggered by integration tests)
if: github.event_name == 'workflow_run' && github.event.workflow_run.conclusion == 'success'
uses: dawidd6/action-download-artifact@v12
continue-on-error: true
with:
workflow: superset-python-integrationtest.yml
run_id: ${{ github.event.workflow_run.id }}
name: database-diagnostics
path: docs/src/data/
- name: Try to download latest diagnostics (for push/dispatch triggers)
if: github.event_name != 'workflow_run'
uses: dawidd6/action-download-artifact@v12
continue-on-error: true
with:
workflow: superset-python-integrationtest.yml
name: database-diagnostics
path: docs/src/data/
branch: master
search_artifacts: true
if_no_artifact_found: warn
- name: Use diagnostics artifact if available
working-directory: docs
run: |
if [ -f "src/data/databases-diagnostics.json" ]; then
echo "Using fresh diagnostics from integration tests"
mv src/data/databases-diagnostics.json src/data/databases.json
else
echo "Using committed databases.json (no artifact found)"
fi
- name: yarn build
working-directory: docs
run: |
@@ -108,5 +71,5 @@ jobs:
destination-github-username: "apache"
destination-repository-name: "superset-site"
target-branch: "asf-site"
commit-message: "deploying docs: ${{ github.event.head_commit.message || 'triggered by integration tests' }} (apache/superset@${{ github.event.workflow_run.head_sha || github.sha }})"
commit-message: "deploying docs: ${{ github.event.head_commit.message }} (apache/superset@${{ github.sha }})"
user-email: dev@superset.apache.org


@@ -4,30 +4,24 @@ on:
pull_request:
paths:
- "docs/**"
- "superset/db_engine_specs/**"
- ".github/workflows/superset-docs-verify.yml"
types: [synchronize, opened, reopened, ready_for_review]
workflow_run:
workflows: ["Python-Integration"]
types: [completed]
# cancel previous workflow jobs for PRs
concurrency:
group: ${{ github.workflow }}-${{ github.event.pull_request.number || github.event.workflow_run.head_sha || github.run_id }}
group: ${{ github.workflow }}-${{ github.event.pull_request.number || github.run_id }}
cancel-in-progress: true
jobs:
linkinator:
# See docs here: https://github.com/marketplace/actions/linkinator
# Only run on pull_request, not workflow_run
if: github.event_name == 'pull_request'
name: Link Checking
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v6
- uses: actions/checkout@v5
# Do not bump this linkinator-action version without opening
# an ASF Infra ticket to allow the new version first!
- uses: JustinBeckwith/linkinator-action@af984b9f30f63e796ae2ea5be5e07cb587f1bbd9 # v2.3
- uses: JustinBeckwith/linkinator-action@3d5ba091319fa7b0ac14703761eebb7d100e6f6d # v1.11.0
continue-on-error: true # This will make the job advisory (non-blocking, no red X)
with:
paths: "**/*.md, **/*.mdx"
@@ -56,23 +50,20 @@ jobs:
https://timbr.ai/,
https://opensource.org/license/apache-2-0,
https://www.plaidcloud.com/
build-on-pr:
# Build docs when PR changes docs/** (uses committed databases.json)
if: github.event_name == 'pull_request'
name: Build (PR trigger)
build-deploy:
name: Build & Deploy
runs-on: ubuntu-24.04
defaults:
run:
working-directory: docs
steps:
- name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
uses: actions/checkout@v6
uses: actions/checkout@v5
with:
persist-credentials: false
submodules: recursive
- name: Set up Node.js
uses: actions/setup-node@v6
uses: actions/setup-node@v5
with:
node-version-file: './docs/.nvmrc'
- name: yarn install
@@ -84,50 +75,3 @@ jobs:
- name: yarn build
run: |
yarn build
build-after-tests:
# Build docs after integration tests complete (uses fresh diagnostics)
# Only runs if integration tests succeeded
if: >
github.event_name == 'workflow_run' &&
github.event.workflow_run.conclusion == 'success'
name: Build (after integration tests)
runs-on: ubuntu-24.04
defaults:
run:
working-directory: docs
steps:
- name: "Checkout PR head: ${{ github.event.workflow_run.head_sha }}"
uses: actions/checkout@v6
with:
ref: ${{ github.event.workflow_run.head_sha }}
persist-credentials: false
submodules: recursive
- name: Set up Node.js
uses: actions/setup-node@v6
with:
node-version-file: './docs/.nvmrc'
- name: yarn install
run: |
yarn install --check-cache
- name: Download database diagnostics from integration tests
uses: dawidd6/action-download-artifact@v12
with:
workflow: superset-python-integrationtest.yml
run_id: ${{ github.event.workflow_run.id }}
name: database-diagnostics
path: docs/src/data/
- name: Use fresh diagnostics
run: |
if [ -f "src/data/databases-diagnostics.json" ]; then
echo "Using fresh diagnostics from integration tests"
mv src/data/databases-diagnostics.json src/data/databases.json
else
echo "Warning: No diagnostics artifact found, using committed data"
fi
- name: yarn typecheck
run: |
yarn typecheck
- name: yarn build
run: |
yarn build


@@ -69,21 +69,21 @@ jobs:
# Conditional checkout based on context
- name: Checkout for push or pull_request event
if: github.event_name == 'push' || github.event_name == 'pull_request'
uses: actions/checkout@v6
uses: actions/checkout@v5
with:
persist-credentials: false
submodules: recursive
ref: ${{ github.event_name == 'pull_request' && github.event.pull_request.head.sha || github.sha }}
- name: Checkout using ref (workflow_dispatch)
if: github.event_name == 'workflow_dispatch' && github.event.inputs.ref != ''
uses: actions/checkout@v6
uses: actions/checkout@v5
with:
persist-credentials: false
ref: ${{ github.event.inputs.ref }}
submodules: recursive
- name: Checkout using PR ID (workflow_dispatch)
if: github.event_name == 'workflow_dispatch' && github.event.inputs.pr_id != ''
uses: actions/checkout@v6
uses: actions/checkout@v5
with:
persist-credentials: false
ref: refs/pull/${{ github.event.inputs.pr_id }}/merge
@@ -109,7 +109,7 @@ jobs:
run: testdata
- name: Setup Node.js
if: steps.check.outputs.python || steps.check.outputs.frontend
uses: actions/setup-node@v6
uses: actions/setup-node@v5
with:
node-version-file: './superset-frontend/.nvmrc'
- name: Install npm dependencies
@@ -146,7 +146,7 @@ jobs:
SAFE_APP_ROOT=${APP_ROOT//\//_}
echo "safe_app_root=$SAFE_APP_ROOT" >> $GITHUB_OUTPUT
- name: Upload Artifacts
uses: actions/upload-artifact@v6
uses: actions/upload-artifact@v4
if: failure()
with:
path: ${{ github.workspace }}/superset-frontend/cypress-base/cypress/screenshots
@@ -186,21 +186,21 @@ jobs:
# Conditional checkout based on context (same as Cypress workflow)
- name: Checkout for push or pull_request event
if: github.event_name == 'push' || github.event_name == 'pull_request'
uses: actions/checkout@v6
uses: actions/checkout@v5
with:
persist-credentials: false
submodules: recursive
ref: ${{ github.event_name == 'pull_request' && github.event.pull_request.head.sha || github.sha }}
- name: Checkout using ref (workflow_dispatch)
if: github.event_name == 'workflow_dispatch' && github.event.inputs.ref != ''
uses: actions/checkout@v6
uses: actions/checkout@v5
with:
persist-credentials: false
ref: ${{ github.event.inputs.ref }}
submodules: recursive
- name: Checkout using PR ID (workflow_dispatch)
if: github.event_name == 'workflow_dispatch' && github.event.inputs.pr_id != ''
uses: actions/checkout@v6
uses: actions/checkout@v5
with:
persist-credentials: false
ref: refs/pull/${{ github.event.inputs.pr_id }}/merge
@@ -223,10 +223,10 @@ jobs:
if: steps.check.outputs.python || steps.check.outputs.frontend
uses: ./.github/actions/cached-dependencies
with:
run: playwright_testdata
run: testdata
- name: Setup Node.js
if: steps.check.outputs.python || steps.check.outputs.frontend
uses: actions/setup-node@v6
uses: actions/setup-node@v5
with:
node-version-file: './superset-frontend/.nvmrc'
- name: Install npm dependencies
@@ -259,7 +259,7 @@ jobs:
SAFE_APP_ROOT=${APP_ROOT//\//_}
echo "safe_app_root=$SAFE_APP_ROOT" >> $GITHUB_OUTPUT
- name: Upload Playwright Artifacts
uses: actions/upload-artifact@v6
uses: actions/upload-artifact@v4
if: failure()
with:
path: |


@@ -24,7 +24,7 @@ jobs:
working-directory: superset-extensions-cli
steps:
- name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
uses: actions/checkout@v6
uses: actions/checkout@v5
with:
persist-credentials: false
submodules: recursive
@@ -58,7 +58,7 @@ jobs:
- name: Upload HTML coverage report
if: steps.check.outputs.superset-extensions-cli
uses: actions/upload-artifact@v6
uses: actions/upload-artifact@v4
with:
name: superset-extensions-cli-coverage-html
path: htmlcov/


@@ -23,7 +23,7 @@ jobs:
should-run: ${{ steps.check.outputs.frontend }}
steps:
- name: Checkout Code
uses: actions/checkout@v6
uses: actions/checkout@v5
with:
persist-credentials: false
fetch-depth: 0
@@ -58,7 +58,7 @@ jobs:
- name: Upload Docker Image Artifact
if: steps.check.outputs.frontend
uses: actions/upload-artifact@v6
uses: actions/upload-artifact@v4
with:
name: docker-image
path: docker-image.tar.gz
@@ -73,7 +73,7 @@ jobs:
runs-on: ubuntu-24.04
steps:
- name: Download Docker Image Artifact
uses: actions/download-artifact@v7
uses: actions/download-artifact@v5
with:
name: docker-image
@@ -90,7 +90,7 @@ jobs:
"npm run test -- --coverage --shard=${{ matrix.shard }}/8 --coverageReporters=json-summary"
- name: Upload Coverage Artifact
uses: actions/upload-artifact@v6
uses: actions/upload-artifact@v4
with:
name: coverage-artifacts-${{ matrix.shard }}
path: superset-frontend/coverage
@@ -101,7 +101,7 @@ jobs:
runs-on: ubuntu-24.04
steps:
- name: Download Coverage Artifacts
uses: actions/download-artifact@v7
uses: actions/download-artifact@v5
with:
pattern: coverage-artifacts-*
path: coverage/
@@ -127,7 +127,7 @@ jobs:
runs-on: ubuntu-24.04
steps:
- name: Download Docker Image Artifact
uses: actions/download-artifact@v7
uses: actions/download-artifact@v5
with:
name: docker-image
@@ -151,7 +151,7 @@ jobs:
runs-on: ubuntu-24.04
steps:
- name: Download Docker Image Artifact
uses: actions/download-artifact@v7
uses: actions/download-artifact@v5
with:
name: docker-image
@@ -167,21 +167,3 @@ jobs:
run: |
docker run --rm $TAG bash -c \
"npm run plugins:build-storybook"
test-storybook:
needs: frontend-build
if: needs.frontend-build.outputs.should-run == 'true'
runs-on: ubuntu-24.04
steps:
- name: Download Docker Image Artifact
uses: actions/download-artifact@v7
with:
name: docker-image
- name: Load Docker Image
run: docker load < docker-image.tar.gz
- name: Build Storybook and Run Tests
run: |
docker run --rm $TAG bash -c \
"npm run build-storybook && npx playwright install-deps && npx playwright install chromium && npm run test-storybook:ci"


@@ -16,7 +16,7 @@ jobs:
runs-on: ubuntu-24.04
steps:
- name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
uses: actions/checkout@v6
uses: actions/checkout@v5
with:
persist-credentials: false
submodules: recursive


@@ -29,7 +29,7 @@ jobs:
steps:
- name: Checkout code
uses: actions/checkout@v6
uses: actions/checkout@v5
with:
ref: ${{ inputs.ref || github.ref_name }}
persist-credentials: true


@@ -60,21 +60,21 @@ jobs:
# Conditional checkout based on context (same as Cypress workflow)
- name: Checkout for push or pull_request event
if: github.event_name == 'push' || github.event_name == 'pull_request'
uses: actions/checkout@v6
uses: actions/checkout@v5
with:
persist-credentials: false
submodules: recursive
ref: ${{ github.event_name == 'pull_request' && github.event.pull_request.head.sha || github.sha }}
- name: Checkout using ref (workflow_dispatch)
if: github.event_name == 'workflow_dispatch' && github.event.inputs.ref != ''
uses: actions/checkout@v6
uses: actions/checkout@v5
with:
persist-credentials: false
ref: ${{ github.event.inputs.ref }}
submodules: recursive
- name: Checkout using PR ID (workflow_dispatch)
if: github.event_name == 'workflow_dispatch' && github.event.inputs.pr_id != ''
uses: actions/checkout@v6
uses: actions/checkout@v5
with:
persist-credentials: false
ref: refs/pull/${{ github.event.inputs.pr_id }}/merge
@@ -97,10 +97,10 @@ jobs:
if: steps.check.outputs.python || steps.check.outputs.frontend
uses: ./.github/actions/cached-dependencies
with:
run: playwright_testdata
run: testdata
- name: Setup Node.js
if: steps.check.outputs.python || steps.check.outputs.frontend
uses: actions/setup-node@v6
uses: actions/setup-node@v5
with:
node-version-file: './superset-frontend/.nvmrc'
- name: Install npm dependencies
@@ -133,7 +133,7 @@ jobs:
SAFE_APP_ROOT=${APP_ROOT//\//_}
echo "safe_app_root=$SAFE_APP_ROOT" >> $GITHUB_OUTPUT
- name: Upload Playwright Artifacts
uses: actions/upload-artifact@v6
uses: actions/upload-artifact@v4
if: failure()
with:
path: |


@@ -41,7 +41,7 @@ jobs:
- 16379:6379
steps:
- name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
uses: actions/checkout@v6
uses: actions/checkout@v5
with:
persist-credentials: false
submodules: recursive
@@ -73,36 +73,6 @@ jobs:
flags: python,mysql
token: ${{ secrets.CODECOV_TOKEN }}
verbose: true
- name: Generate database diagnostics for docs
if: steps.check.outputs.python
env:
SUPERSET_CONFIG: tests.integration_tests.superset_test_config
SUPERSET__SQLALCHEMY_DATABASE_URI: |
mysql+mysqldb://superset:superset@127.0.0.1:13306/superset?charset=utf8mb4&binary_prefix=true
run: |
python -c "
import json
from superset.app import create_app
from superset.db_engine_specs.lib import generate_yaml_docs
app = create_app()
with app.app_context():
docs = generate_yaml_docs()
# Wrap in the expected format
output = {
'generated': '$(date -Iseconds)',
'databases': docs
}
with open('databases-diagnostics.json', 'w') as f:
json.dump(output, f, indent=2, default=str)
print(f'Generated diagnostics for {len(docs)} databases')
"
- name: Upload database diagnostics artifact
if: steps.check.outputs.python
uses: actions/upload-artifact@v6
with:
name: database-diagnostics
path: databases-diagnostics.json
retention-days: 7
test-postgres:
runs-on: ubuntu-24.04
strategy:
@@ -129,7 +99,7 @@ jobs:
- 16379:6379
steps:
- name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
uses: actions/checkout@v6
uses: actions/checkout@v5
with:
persist-credentials: false
submodules: recursive
@@ -182,7 +152,7 @@ jobs:
- 16379:6379
steps:
- name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
uses: actions/checkout@v6
uses: actions/checkout@v5
with:
persist-credentials: false
submodules: recursive


@@ -48,7 +48,7 @@ jobs:
- 16379:6379
steps:
- name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
uses: actions/checkout@v6
uses: actions/checkout@v5
with:
persist-credentials: false
submodules: recursive
@@ -108,7 +108,7 @@ jobs:
- 16379:6379
steps:
- name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
uses: actions/checkout@v6
uses: actions/checkout@v5
with:
persist-credentials: false
submodules: recursive


@@ -24,7 +24,7 @@ jobs:
PYTHONPATH: ${{ github.workspace }}
steps:
- name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
uses: actions/checkout@v6
uses: actions/checkout@v5
with:
persist-credentials: false
submodules: recursive


@@ -18,7 +18,7 @@ jobs:
runs-on: ubuntu-24.04
steps:
- name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
uses: actions/checkout@v6
uses: actions/checkout@v5
with:
persist-credentials: false
submodules: recursive
@@ -31,7 +31,7 @@ jobs:
- name: Setup Node.js
if: steps.check.outputs.frontend
uses: actions/setup-node@v6
uses: actions/setup-node@v5
with:
node-version-file: './superset-frontend/.nvmrc'
- name: Install dependencies
@@ -49,7 +49,7 @@ jobs:
runs-on: ubuntu-24.04
steps:
- name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
uses: actions/checkout@v6
uses: actions/checkout@v5
with:
persist-credentials: false
submodules: recursive


@@ -21,7 +21,7 @@ jobs:
runs-on: ubuntu-24.04
steps:
- name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
uses: actions/checkout@v6
uses: actions/checkout@v5
with:
persist-credentials: false
- name: Install dependencies


@@ -38,7 +38,7 @@ jobs:
});
- name: "Checkout ( ${{ github.sha }} )"
uses: actions/checkout@v6
uses: actions/checkout@v5
with:
persist-credentials: false


@@ -47,7 +47,7 @@ jobs:
steps:
- name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
uses: actions/checkout@v6
uses: actions/checkout@v5
with:
fetch-depth: 0
@@ -60,7 +60,7 @@ jobs:
build: "true"
- name: Use Node.js 20
uses: actions/setup-node@v6
uses: actions/setup-node@v5
with:
node-version: 20
@@ -107,12 +107,12 @@ jobs:
steps:
- name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
uses: actions/checkout@v6
uses: actions/checkout@v5
with:
fetch-depth: 0
- name: Use Node.js 20
uses: actions/setup-node@v6
uses: actions/setup-node@v5
with:
node-version: 20


@@ -27,10 +27,10 @@ jobs:
name: Generate Reports
steps:
- name: Checkout Repository
uses: actions/checkout@v6
uses: actions/checkout@v5
- name: Set up Node.js
uses: actions/setup-node@v6
uses: actions/setup-node@v5
with:
node-version-file: './superset-frontend/.nvmrc'

.gitignore

@@ -33,7 +33,6 @@ cover
.env
.envrc
.idea
.roo
.mypy_cache
.python-version
.tox
@@ -61,7 +60,6 @@ tmp
rat-results.txt
superset/app/
superset-websocket/config.json
.direnv
# Node.js, webpack artifacts, storybook
*.entry.js
@@ -73,6 +71,10 @@ superset/static/assets/*
superset/static/uploads/*
!superset/static/uploads/.gitkeep
superset/static/version_info.json
superset-frontend/**/esm/*
superset-frontend/**/lib/*
superset-frontend/**/storybook-static/*
superset-frontend/migration-storybook.log
yarn-error.log
*.map
*.min.js
@@ -135,5 +137,3 @@ PROJECT.md
.claude_rc*
.env.local
oxc-custom-build/
*.code-workspace
*.duckdb


@@ -49,12 +49,12 @@ repos:
hooks:
- id: check-docstring-first
- id: check-added-large-files
exclude: ^.*\.(geojson)$|^docs/static/img/screenshots/.*|^superset-frontend/CHANGELOG\.md$|^superset/examples/.*/data\.parquet$
exclude: ^.*\.(geojson)$|^docs/static/img/screenshots/.*|^superset-frontend/CHANGELOG\.md$
- id: check-yaml
exclude: ^helm/superset/templates/
- id: debug-statements
- id: end-of-file-fixer
exclude: .*/lerna\.json$|^docs/static/img/logos/
exclude: .*/lerna\.json$
- id: trailing-whitespace
exclude: ^.*\.(snap)
args: ["--markdown-linebreak-ext=md"]
@@ -106,19 +106,12 @@ repos:
files: helm
verbose: false
args: ["--log-level", "error"]
# Using local hooks ensures ruff version matches requirements/development.txt
- repo: local
- repo: https://github.com/astral-sh/ruff-pre-commit
rev: v0.9.7
hooks:
- id: ruff-format
name: ruff-format
entry: ruff format
language: system
types: [python]
- id: ruff
name: ruff
entry: ruff check --fix --show-fixes
language: system
types: [python]
args: [--fix]
- repo: local
hooks:
- id: pylint
@@ -142,18 +135,3 @@ repos:
else
echo "No Python files to lint."
fi
- id: db-engine-spec-metadata
name: database engine spec metadata validation
entry: python superset/db_engine_specs/lint_metadata.py --strict
language: system
files: ^superset/db_engine_specs/.*\.py$
exclude: ^superset/db_engine_specs/(base|lib|lint_metadata|__init__)\.py$
pass_filenames: false
- repo: local
hooks:
- id: feature-flags-sync
name: feature flags documentation sync
entry: bash -c 'python scripts/extract_feature_flags.py > docs/static/feature-flags.json.tmp && if ! diff -q docs/static/feature-flags.json docs/static/feature-flags.json.tmp > /dev/null 2>&1; then mv docs/static/feature-flags.json.tmp docs/static/feature-flags.json && echo "Updated docs/static/feature-flags.json" && exit 1; else rm docs/static/feature-flags.json.tmp; fi'
language: system
files: ^superset/config\.py$
pass_filenames: false


@@ -53,7 +53,7 @@ extension-pkg-whitelist=pyarrow
[MESSAGES CONTROL]
disable=all
enable=json-import,disallowed-sql-import,consider-using-transaction
enable=disallowed-json-import,disallowed-sql-import,consider-using-transaction
[REPORTS]


@@ -67,19 +67,22 @@ temporary_superset_ui/*
# skip license checks for auto-generated test snapshots
.*snap
# docs third-party logos (database logos, org logos, etc.)
databases/*
logos/*
# docs overrides for third party logos we don't have the rights to
google-big-query.svg
google-sheets.svg
ibm-db2.svg
postgresql.svg
snowflake.svg
ydb.svg
loading.svg
# docs-related
erd.puml
erd.svg
intro_header.txt
TODO.md
# for LLMs
llm-context.md
llms.txt
AGENTS.md
LLMS.md
CLAUDE.md


@@ -2,27 +2,6 @@
Apache Superset is a data visualization platform with Flask/Python backend and React/TypeScript frontend.
## ⚠️ CRITICAL: Always Run Pre-commit Before Pushing
**ALWAYS run `pre-commit run --all-files` before pushing commits.** CI will fail if pre-commit checks don't pass. This is non-negotiable.
```bash
# Stage your changes first
git add .
# Run pre-commit on all files
pre-commit run --all-files
# If there are auto-fixes, stage them and commit
git add .
git commit --amend # or new commit
```
Common pre-commit failures:
- **Formatting** - black, prettier, eslint will auto-fix
- **Type errors** - mypy failures need manual fixes
- **Linting** - ruff, pylint issues need manual fixes
## ⚠️ CRITICAL: Ongoing Refactors (What NOT to Do)
**These migrations are actively happening - avoid deprecated patterns:**
@@ -101,30 +80,6 @@ superset/
- **UPDATING.md**: Add breaking changes here
- **Docstrings**: Required for new functions/classes
## Developer Portal: Storybook-to-MDX Documentation
The Developer Portal auto-generates MDX documentation from Storybook stories. **Stories are the single source of truth.**
### Core Philosophy
- **Fix issues in the STORY, not the generator** - When something doesn't render correctly, update the story file first
- **Generator should be lightweight** - It extracts and passes through data; avoid special cases
- **Stories define everything** - Props, controls, galleries, examples all come from story metadata
### Story Requirements for Docs Generation
- Use `export default { title: '...' }` (inline), not `const meta = ...; export default meta;`
- Name interactive stories `Interactive${ComponentName}` (e.g., `InteractiveButton`)
- Define `args` for default prop values
- Define `argTypes` at the story level (not meta level) with control types and descriptions
- Use `parameters.docs.gallery` for size×style variant grids
- Use `parameters.docs.sampleChildren` for components that need children
- Use `parameters.docs.liveExample` for custom live code blocks
- Use `parameters.docs.staticProps` for complex object props that can't be parsed inline
### Generator Location
- Script: `docs/scripts/generate-superset-components.mjs`
- Wrapper: `docs/src/components/StorybookWrapper.jsx`
- Output: `docs/developer_portal/components/`
## Architecture Patterns
### Security & Features


@@ -49,4 +49,3 @@ under the License.
- [4.1.3](./CHANGELOG/4.1.3.md)
- [4.1.4](./CHANGELOG/4.1.4.md)
- [5.0.0](./CHANGELOG/5.0.0.md)
- [6.0.0](./CHANGELOG/6.0.0.md)

File diff suppressed because it is too large


@@ -18,7 +18,7 @@
######################################################################
# Node stage to deal with static asset construction
######################################################################
ARG PY_VER=3.11.14-slim-trixie
ARG PY_VER=3.11.13-slim-trixie
# If BUILDPLATFORM is null, set it to 'amd64' (or leave as is otherwise).
ARG BUILDPLATFORM=${BUILDPLATFORM:-amd64}
@@ -26,6 +26,9 @@ ARG BUILDPLATFORM=${BUILDPLATFORM:-amd64}
# Include translations in the final build
ARG BUILD_TRANSLATIONS="false"
# Build arg to pre-populate examples DuckDB file
ARG LOAD_EXAMPLES_DUCKDB="false"
######################################################################
# superset-node-ci used as a base for building frontend assets and CI
######################################################################
@@ -143,6 +146,9 @@ RUN if [ "${BUILD_TRANSLATIONS}" = "true" ]; then \
######################################################################
FROM python-base AS python-common
# Re-declare build arg to receive it in this stage
ARG LOAD_EXAMPLES_DUCKDB
ENV SUPERSET_HOME="/app/superset_home" \
HOME="/app/superset_home" \
SUPERSET_ENV="production" \
@@ -154,7 +160,7 @@ ENV SUPERSET_HOME="/app/superset_home" \
COPY --chmod=755 docker/entrypoints /app/docker/entrypoints
WORKDIR /app
# Set up necessary directories
# Set up necessary directories and user
RUN mkdir -p \
${PYTHONPATH} \
superset/static \
@@ -196,9 +202,17 @@ RUN /app/docker/apt-install.sh \
libecpg-dev \
libldap2-dev
# Create data directory for DuckDB examples database
# The database file will be created at runtime when examples are loaded from Parquet files
RUN mkdir -p /app/data && chown -R superset:superset /app/data
# Pre-load examples DuckDB file if requested
RUN if [ "$LOAD_EXAMPLES_DUCKDB" = "true" ]; then \
mkdir -p /app/data && \
echo "Downloading pre-built examples.duckdb..." && \
curl -L -o /app/data/examples.duckdb \
"https://raw.githubusercontent.com/apache-superset/examples-data/master/examples.duckdb" && \
chown -R superset:superset /app/data; \
else \
mkdir -p /app/data && \
chown -R superset:superset /app/data; \
fi
# Copy compiled things from previous stages
COPY --from=superset-node /app/superset/static/assets superset/static/assets


@@ -16,20 +16,8 @@ KIND, either express or implied. See the License for the
specific language governing permissions and limitations
under the License.
-->
# Installing Apache Superset
# INSTALL / BUILD instructions for Apache Superset
For comprehensive installation instructions, please see the Apache Superset documentation:
**[📚 Installation Guide →](https://superset.apache.org/docs/installation/installation-methods)**
The documentation covers:
- [Docker Compose](https://superset.apache.org/docs/installation/docker-compose) (recommended for development)
- [Kubernetes / Helm](https://superset.apache.org/docs/installation/kubernetes)
- [PyPI](https://superset.apache.org/docs/installation/pypi)
- [Docker Builds](https://superset.apache.org/docs/installation/docker-builds)
- [Architecture Overview](https://superset.apache.org/docs/installation/architecture)
## Building from Source
For building from a source release tarball, see the Dockerfile at:
`RELEASING/Dockerfile.from_local_tarball`
At this time, the docker file at RELEASING/Dockerfile.from_local_tarball
constitutes the recipe on how to get to a working release from a source
release tarball.

LINTING_ARCHITECTURE.md

@@ -0,0 +1,121 @@
<!--
Licensed to the Apache Software Foundation (ASF) under one
or more contributor license agreements. See the NOTICE file
distributed with this work for additional information
regarding copyright ownership. The ASF licenses this file
to you under the Apache License, Version 2.0 (the
"License"); you may not use this file except in compliance
with the License. You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing,
software distributed under the License is distributed on an
"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
KIND, either express or implied. See the License for the
specific language governing permissions and limitations
under the License.
-->
# Superset Frontend Linting Architecture
## Overview
We use a hybrid linting approach combining OXC (fast, standard rules) with custom AST-based checks for Superset-specific patterns.
## Components
### 1. Primary Linter: OXC
- **What**: Oxidation Compiler's linter (oxlint)
- **Handles**: 95% of linting rules (standard ESLint rules, TypeScript, React, etc.)
- **Speed**: ~50-100x faster than ESLint
- **Config**: `oxlint.json`
### 2. Custom Rule Checker
- **What**: Node.js AST-based script
- **Handles**: Superset-specific rules:
- No literal colors (use theme)
- No FontAwesome icons (use Icons component)
- No template vars in i18n
- **Speed**: Fast enough for pre-commit
- **Script**: `scripts/check-custom-rules.js`
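To illustrate the shape of such a checker: the real script is Node.js (`scripts/check-custom-rules.js`, per the list above), but a rough Python sketch using regexes instead of an AST shows the kind of per-line rule pass involved. The rule patterns here are simplified guesses, not the actual rules.

```python
# Illustrative Python approximation of the Node.js checker described above.
# Regex-based rather than AST-based, and the patterns are simplified guesses.
import re
import sys
from pathlib import Path

RULES = [
    # (rule name, pattern that should NOT appear in frontend source)
    ("no-literal-colors", re.compile(r"#[0-9a-fA-F]{6}\b")),  # use theme colors
    ("no-fontawesome-icons", re.compile(r"\bfa-[a-z-]+\b")),  # use Icons component
]


def check_file(path: Path) -> list[str]:
    violations = []
    for lineno, line in enumerate(path.read_text(errors="ignore").splitlines(), 1):
        for name, pattern in RULES:
            if pattern.search(line):
                violations.append(f"{path}:{lineno}: {name}")
    return violations


if __name__ == "__main__":
    problems = [v for f in sys.argv[1:] for v in check_file(Path(f))]
    print("\n".join(problems))
    sys.exit(1 if problems else 0)
```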
## Developer Workflow
### Local Development
```bash
# Fast linting (OXC only)
npm run lint
# Full linting (OXC + custom rules)
npm run lint:full
# Auto-fix what's possible
npm run lint-fix
```
### Pre-commit
1. OXC runs first (via `scripts/oxlint.sh`)
2. Custom rules check runs second (lightweight, AST-based)
3. Both must pass for commit to succeed
### CI Pipeline
```yaml
- name: Lint with OXC
run: npm run lint
- name: Check custom rules
run: npm run check:custom-rules
```
## Why This Architecture?
### ✅ Pros
1. **No binary distribution issues** - ASF compatible
2. **Fast performance** - OXC for bulk, lightweight script for custom
3. **Maintainable** - Custom rules in JavaScript, not Rust
4. **Flexible** - Can evolve as OXC adds plugin support
5. **Cacheable** - Both OXC and Node.js are standard tools
### ❌ Cons
1. **Two tools** - Slightly more complex than single linter
2. **Duplicate parsing** - Files parsed twice (once by each tool)
### 🔄 Migration Path
When OXC supports JavaScript plugins:
1. Convert `check-custom-rules.js` to OXC plugin format
2. Consolidate back to single tool
3. Keep same rules and developer experience
## Implementation Checklist
- [x] OXC for standard linting
- [x] Pre-commit integration
- [ ] Custom rules script
- [ ] Combine in npm scripts
- [ ] Update CI pipeline
- [ ] Developer documentation
## Performance Targets
| Operation | Target Time | Current |
|-----------|------------|---------|
| Pre-commit (changed files) | <2s | ✅ 1.5s |
| Full lint (all files) | <10s | ✅ 8s |
| Custom rules check | <5s | 🔄 TBD |
## Caching Strategy
### Local Development
- OXC: Built-in incremental checking
- Custom rules: Use file hash cache (similar to pytest cache)
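A sketch of what that file-hash cache could look like (assumed design, nothing like this exists in the repo yet): hash each file's contents, skip files whose hash matches the last clean run, and update the cache after linting.

```python
# Assumed design for the file-hash cache idea above; no such module exists yet.
import hashlib
import json
from pathlib import Path

CACHE = Path(".lint-cache.json")  # hypothetical cache location


def sha256(path: Path) -> str:
    return hashlib.sha256(path.read_bytes()).hexdigest()


def files_needing_lint(paths: list[Path]) -> list[Path]:
    # Only re-lint files whose content hash changed since the last clean run.
    cache = json.loads(CACHE.read_text()) if CACHE.exists() else {}
    return [p for p in paths if cache.get(str(p)) != sha256(p)]


def record_clean(paths: list[Path]) -> None:
    # Remember the hashes of files that just passed linting.
    cache = json.loads(CACHE.read_text()) if CACHE.exists() else {}
    cache.update({str(p): sha256(p) for p in paths})
    CACHE.write_text(json.dumps(cache, indent=2))
```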
### CI
- Cache `node_modules` (includes oxlint binary)
- Cache custom rules results by commit hash
- Skip unchanged files using git diff
## Future Improvements
1. **When OXC adds plugin support**: Migrate custom rules to OXC plugins
2. **Consider Biome**: Another Rust-based linter with plugin support
3. **AST sharing**: Investigate sharing AST between tools to avoid double parsing


@@ -18,7 +18,7 @@
# Python version installed; we need 3.10-3.11
PYTHON=`command -v python3.11 || command -v python3.10`
.PHONY: install superset venv pre-commit up down logs ps nuke ports open
.PHONY: install superset venv pre-commit
install: superset pre-commit
@@ -112,28 +112,3 @@ report-celery-beat:
admin-user:
superset fab create-admin
# Docker Compose with auto-assigned ports (for running multiple instances)
up:
./scripts/docker-compose-up.sh
up-detached:
./scripts/docker-compose-up.sh -d
down:
./scripts/docker-compose-up.sh down
logs:
./scripts/docker-compose-up.sh logs -f
ps:
./scripts/docker-compose-up.sh ps
nuke:
./scripts/docker-compose-up.sh nuke
ports:
./scripts/docker-compose-up.sh ports
open:
./scripts/docker-compose-up.sh open

README.md

@@ -23,12 +23,8 @@ under the License.
[![Latest Release on Github](https://img.shields.io/github/v/release/apache/superset?sort=semver)](https://github.com/apache/superset/releases/latest)
[![Build Status](https://github.com/apache/superset/actions/workflows/superset-python-unittest.yml/badge.svg)](https://github.com/apache/superset/actions)
[![PyPI version](https://badge.fury.io/py/apache_superset.svg)](https://badge.fury.io/py/apache_superset)
[![Coverage Status](https://codecov.io/github/apache/superset/coverage.svg?branch=master)](https://codecov.io/github/apache/superset)
[![PyPI](https://img.shields.io/pypi/pyversions/apache_superset.svg?maxAge=2592000)](https://pypi.python.org/pypi/apache_superset)
[![GitHub Stars](https://img.shields.io/github/stars/apache/superset?style=social)](https://github.com/apache/superset/stargazers)
[![Contributors](https://img.shields.io/github/contributors/apache/superset)](https://github.com/apache/superset/graphs/contributors)
[![Last Commit](https://img.shields.io/github/last-commit/apache/superset)](https://github.com/apache/superset/commits/master)
[![Open Issues](https://img.shields.io/github/issues/apache/superset)](https://github.com/apache/superset/issues)
[![Open PRs](https://img.shields.io/github/issues-pr/apache/superset)](https://github.com/apache/superset/pulls)
[![Get on Slack](https://img.shields.io/badge/slack-join-orange.svg)](http://bit.ly/join-superset-slack)
[![Documentation](https://img.shields.io/badge/docs-apache.org-blue.svg)](https://superset.apache.org)
@@ -55,7 +51,7 @@ A modern, enterprise-ready business intelligence web application.
[**Get Involved**](#get-involved) |
[**Contributor Guide**](#contributor-guide) |
[**Resources**](#resources) |
[**Organizations Using Superset**](https://superset.apache.org/inTheWild)
[**Organizations Using Superset**](https://github.com/apache/superset/blob/master/RESOURCES/INTHEWILD.md)
## Why Superset?
@@ -89,7 +85,7 @@ Superset provides:
**Craft Beautiful, Dynamic Dashboards**
<kbd><img title="View Dashboards" src="https://superset.apache.org/img/screenshots/dashboard.jpg"/></kbd><br/>
<kbd><img title="View Dashboards" src="https://superset.apache.org/img/screenshots/slack_dash.jpg"/></kbd><br/>
**No-Code Chart Builder**
@@ -101,77 +97,51 @@ Superset provides:
## Supported Databases
Superset can query data from any SQL-speaking datastore or data engine (Presto, Trino, Athena, [and more](https://superset.apache.org/docs/databases)) that has a Python DB-API driver and a SQLAlchemy dialect.
Superset can query data from any SQL-speaking datastore or data engine (Presto, Trino, Athena, [and more](https://superset.apache.org/docs/configuration/databases)) that has a Python DB-API driver and a SQLAlchemy dialect.
Here are some of the major database solutions that are supported:
<!-- SUPPORTED_DATABASES_START -->
<p align="center">
<a href="https://superset.apache.org/docs/databases/supported/amazon-athena" title="Amazon Athena"><img src="docs/static/img/databases/amazon-athena.jpg" alt="Amazon Athena" width="76" height="40" /></a> &nbsp;
<a href="https://superset.apache.org/docs/databases/supported/amazon-dynamodb" title="Amazon DynamoDB"><img src="docs/static/img/databases/aws.png" alt="Amazon DynamoDB" width="40" height="40" /></a> &nbsp;
<a href="https://superset.apache.org/docs/databases/supported/amazon-redshift" title="Amazon Redshift"><img src="docs/static/img/databases/redshift.png" alt="Amazon Redshift" width="100" height="40" /></a> &nbsp;
<a href="https://superset.apache.org/docs/databases/supported/apache-doris" title="Apache Doris"><img src="docs/static/img/databases/doris.png" alt="Apache Doris" width="103" height="40" /></a> &nbsp;
<a href="https://superset.apache.org/docs/databases/supported/apache-drill" title="Apache Drill"><img src="docs/static/img/databases/apache-drill.png" alt="Apache Drill" width="81" height="40" /></a> &nbsp;
<a href="https://superset.apache.org/docs/databases/supported/apache-druid" title="Apache Druid"><img src="docs/static/img/databases/druid.png" alt="Apache Druid" width="117" height="40" /></a> &nbsp;
<a href="https://superset.apache.org/docs/databases/supported/apache-hive" title="Apache Hive"><img src="docs/static/img/databases/apache-hive.svg" alt="Apache Hive" width="44" height="40" /></a> &nbsp;
<a href="https://superset.apache.org/docs/databases/supported/apache-impala" title="Apache Impala"><img src="docs/static/img/databases/apache-impala.png" alt="Apache Impala" width="21" height="40" /></a> &nbsp;
<a href="https://superset.apache.org/docs/databases/supported/apache-kylin" title="Apache Kylin"><img src="docs/static/img/databases/apache-kylin.png" alt="Apache Kylin" width="44" height="40" /></a> &nbsp;
<a href="https://superset.apache.org/docs/databases/supported/apache-pinot" title="Apache Pinot"><img src="docs/static/img/databases/apache-pinot.svg" alt="Apache Pinot" width="76" height="40" /></a> &nbsp;
<a href="https://superset.apache.org/docs/databases/supported/apache-solr" title="Apache Solr"><img src="docs/static/img/databases/apache-solr.png" alt="Apache Solr" width="79" height="40" /></a> &nbsp;
<a href="https://superset.apache.org/docs/databases/supported/apache-spark-sql" title="Apache Spark SQL"><img src="docs/static/img/databases/apache-spark.png" alt="Apache Spark SQL" width="75" height="40" /></a> &nbsp;
<a href="https://superset.apache.org/docs/databases/supported/ascend" title="Ascend"><img src="docs/static/img/databases/ascend.webp" alt="Ascend" width="117" height="40" /></a> &nbsp;
<a href="https://superset.apache.org/docs/databases/supported/aurora-mysql-data-api" title="Aurora MySQL (Data API)"><img src="docs/static/img/databases/mysql.png" alt="Aurora MySQL (Data API)" width="77" height="40" /></a> &nbsp;
<a href="https://superset.apache.org/docs/databases/supported/aurora-postgresql-data-api" title="Aurora PostgreSQL (Data API)"><img src="docs/static/img/databases/postgresql.svg" alt="Aurora PostgreSQL (Data API)" width="76" height="40" /></a> &nbsp;
<a href="https://superset.apache.org/docs/databases/supported/azure-data-explorer" title="Azure Data Explorer"><img src="docs/static/img/databases/kusto.png" alt="Azure Data Explorer" width="40" height="40" /></a> &nbsp;
<a href="https://superset.apache.org/docs/databases/supported/azure-synapse" title="Azure Synapse"><img src="docs/static/img/databases/azure.svg" alt="Azure Synapse" width="40" height="40" /></a> &nbsp;
<a href="https://superset.apache.org/docs/databases/supported/clickhouse" title="ClickHouse"><img src="docs/static/img/databases/clickhouse.png" alt="ClickHouse" width="150" height="37" /></a> &nbsp;
<a href="https://superset.apache.org/docs/databases/supported/cloudflare-d1" title="Cloudflare D1"><img src="docs/static/img/databases/cloudflare.png" alt="Cloudflare D1" width="40" height="40" /></a> &nbsp;
<a href="https://superset.apache.org/docs/databases/supported/cockroachdb" title="CockroachDB"><img src="docs/static/img/databases/cockroachdb.png" alt="CockroachDB" width="150" height="24" /></a> &nbsp;
<a href="https://superset.apache.org/docs/databases/supported/couchbase" title="Couchbase"><img src="docs/static/img/databases/couchbase.svg" alt="Couchbase" width="150" height="35" /></a> &nbsp;
<a href="https://superset.apache.org/docs/databases/supported/cratedb" title="CrateDB"><img src="docs/static/img/databases/cratedb.svg" alt="CrateDB" width="180" height="24" /></a> &nbsp;
<a href="https://superset.apache.org/docs/databases/supported/databend" title="Databend"><img src="docs/static/img/databases/databend.png" alt="Databend" width="100" height="40" /></a> &nbsp;
<a href="https://superset.apache.org/docs/databases/supported/databricks" title="Databricks"><img src="docs/static/img/databases/databricks.png" alt="Databricks" width="152" height="24" /></a> &nbsp;
<a href="https://superset.apache.org/docs/databases/supported/denodo" title="Denodo"><img src="docs/static/img/databases/denodo.png" alt="Denodo" width="138" height="40" /></a> &nbsp;
<a href="https://superset.apache.org/docs/databases/supported/dremio" title="Dremio"><img src="docs/static/img/databases/dremio.png" alt="Dremio" width="126" height="40" /></a> &nbsp;
<a href="https://superset.apache.org/docs/databases/supported/duckdb" title="DuckDB"><img src="docs/static/img/databases/duckdb.png" alt="DuckDB" width="52" height="40" /></a> &nbsp;
<a href="https://superset.apache.org/docs/databases/supported/elasticsearch" title="Elasticsearch"><img src="docs/static/img/databases/elasticsearch.png" alt="Elasticsearch" width="40" height="40" /></a> &nbsp;
<a href="https://superset.apache.org/docs/databases/supported/exasol" title="Exasol"><img src="docs/static/img/databases/exasol.png" alt="Exasol" width="72" height="40" /></a> &nbsp;
<a href="https://superset.apache.org/docs/databases/supported/firebird" title="Firebird"><img src="docs/static/img/databases/firebird.png" alt="Firebird" width="100" height="40" /></a> &nbsp;
<a href="https://superset.apache.org/docs/databases/supported/firebolt" title="Firebolt"><img src="docs/static/img/databases/firebolt.png" alt="Firebolt" width="100" height="40" /></a> &nbsp;
<a href="https://superset.apache.org/docs/databases/supported/google-bigquery" title="Google BigQuery"><img src="docs/static/img/databases/google-big-query.svg" alt="Google BigQuery" width="76" height="40" /></a> &nbsp;
<a href="https://superset.apache.org/docs/databases/supported/google-sheets" title="Google Sheets"><img src="docs/static/img/databases/google-sheets.svg" alt="Google Sheets" width="76" height="40" /></a> &nbsp;
<a href="https://superset.apache.org/docs/databases/supported/greenplum" title="Greenplum"><img src="docs/static/img/databases/greenplum.png" alt="Greenplum" width="124" height="40" /></a> &nbsp;
<a href="https://superset.apache.org/docs/databases/supported/hologres" title="Hologres"><img src="docs/static/img/databases/hologres.png" alt="Hologres" width="44" height="40" /></a> &nbsp;
<a href="https://superset.apache.org/docs/databases/supported/ibm-db2" title="IBM Db2"><img src="docs/static/img/databases/ibm-db2.svg" alt="IBM Db2" width="91" height="40" /></a> &nbsp;
<a href="https://superset.apache.org/docs/databases/supported/ibm-netezza-performance-server" title="IBM Netezza Performance Server"><img src="docs/static/img/databases/netezza.png" alt="IBM Netezza Performance Server" width="40" height="40" /></a> &nbsp;
<a href="https://superset.apache.org/docs/databases/supported/mariadb" title="MariaDB"><img src="docs/static/img/databases/mariadb.png" alt="MariaDB" width="150" height="37" /></a> &nbsp;
<a href="https://superset.apache.org/docs/databases/supported/microsoft-sql-server" title="Microsoft SQL Server"><img src="docs/static/img/databases/msql.png" alt="Microsoft SQL Server" width="50" height="40" /></a> &nbsp;
<a href="https://superset.apache.org/docs/databases/supported/monetdb" title="MonetDB"><img src="docs/static/img/databases/monet-db.png" alt="MonetDB" width="100" height="40" /></a> &nbsp;
<a href="https://superset.apache.org/docs/databases/supported/mongodb" title="MongoDB"><img src="docs/static/img/databases/mongodb.png" alt="MongoDB" width="150" height="38" /></a> &nbsp;
<a href="https://superset.apache.org/docs/databases/supported/motherduck" title="MotherDuck"><img src="docs/static/img/databases/motherduck.png" alt="MotherDuck" width="40" height="40" /></a> &nbsp;
<a href="https://superset.apache.org/docs/databases/supported/oceanbase" title="OceanBase"><img src="docs/static/img/databases/oceanbase.svg" alt="OceanBase" width="175" height="24" /></a> &nbsp;
<a href="https://superset.apache.org/docs/databases/supported/oracle" title="Oracle"><img src="docs/static/img/databases/oraclelogo.png" alt="Oracle" width="111" height="40" /></a> &nbsp;
<a href="https://superset.apache.org/docs/databases/supported/presto" title="Presto"><img src="docs/static/img/databases/presto-og.png" alt="Presto" width="127" height="40" /></a> &nbsp;
<a href="https://superset.apache.org/docs/databases/supported/risingwave" title="RisingWave"><img src="docs/static/img/databases/risingwave.svg" alt="RisingWave" width="147" height="40" /></a> &nbsp;
<a href="https://superset.apache.org/docs/databases/supported/sap-hana" title="SAP HANA"><img src="docs/static/img/databases/sap-hana.png" alt="SAP HANA" width="137" height="40" /></a> &nbsp;
<a href="https://superset.apache.org/docs/databases/supported/sap-sybase" title="SAP Sybase"><img src="docs/static/img/databases/sybase.png" alt="SAP Sybase" width="100" height="40" /></a> &nbsp;
<a href="https://superset.apache.org/docs/databases/supported/shillelagh" title="Shillelagh"><img src="docs/static/img/databases/shillelagh.png" alt="Shillelagh" width="40" height="40" /></a> &nbsp;
<a href="https://superset.apache.org/docs/databases/supported/singlestore" title="SingleStore"><img src="docs/static/img/databases/singlestore.png" alt="SingleStore" width="150" height="31" /></a> &nbsp;
<a href="https://superset.apache.org/docs/databases/supported/snowflake" title="Snowflake"><img src="docs/static/img/databases/snowflake.svg" alt="Snowflake" width="76" height="40" /></a> &nbsp;
<a href="https://superset.apache.org/docs/databases/supported/sqlite" title="SQLite"><img src="docs/static/img/databases/sqlite.png" alt="SQLite" width="84" height="40" /></a> &nbsp;
<a href="https://superset.apache.org/docs/databases/supported/starrocks" title="StarRocks"><img src="docs/static/img/databases/starrocks.png" alt="StarRocks" width="149" height="40" /></a> &nbsp;
<a href="https://superset.apache.org/docs/databases/supported/superset-meta-database" title="Superset meta database"><img src="docs/static/img/databases/superset.svg" alt="Superset meta database" width="150" height="39" /></a> &nbsp;
<a href="https://superset.apache.org/docs/databases/supported/tdengine" title="TDengine"><img src="docs/static/img/databases/tdengine.png" alt="TDengine" width="140" height="40" /></a> &nbsp;
<a href="https://superset.apache.org/docs/databases/supported/teradata" title="Teradata"><img src="docs/static/img/databases/teradata.png" alt="Teradata" width="124" height="40" /></a> &nbsp;
<a href="https://superset.apache.org/docs/databases/supported/timescaledb" title="TimescaleDB"><img src="docs/static/img/databases/timescale.png" alt="TimescaleDB" width="150" height="36" /></a> &nbsp;
<a href="https://superset.apache.org/docs/databases/supported/trino" title="Trino"><img src="docs/static/img/databases/trino.png" alt="Trino" width="89" height="40" /></a> &nbsp;
<a href="https://superset.apache.org/docs/databases/supported/vertica" title="Vertica"><img src="docs/static/img/databases/vertica.png" alt="Vertica" width="128" height="40" /></a> &nbsp;
<a href="https://superset.apache.org/docs/databases/supported/ydb" title="YDB"><img src="docs/static/img/databases/ydb.svg" alt="YDB" width="110" height="40" /></a> &nbsp;
<a href="https://superset.apache.org/docs/databases/supported/yugabytedb" title="YugabyteDB"><img src="docs/static/img/databases/yugabyte.png" alt="YugabyteDB" width="150" height="26" /></a>
<img src="https://superset.apache.org/img/databases/redshift.png" alt="redshift" border="0" width="200"/>
<img src="https://superset.apache.org/img/databases/google-biquery.png" alt="google-bigquery" border="0" width="200"/>
<img src="https://superset.apache.org/img/databases/snowflake.png" alt="snowflake" border="0" width="200"/>
<img src="https://superset.apache.org/img/databases/trino.png" alt="trino" border="0" width="150" />
<img src="https://superset.apache.org/img/databases/presto.png" alt="presto" border="0" width="200"/>
<img src="https://superset.apache.org/img/databases/databricks.png" alt="databricks" border="0" width="160" />
<img src="https://superset.apache.org/img/databases/druid.png" alt="druid" border="0" width="200" />
<img src="https://superset.apache.org/img/databases/firebolt.png" alt="firebolt" border="0" width="200" />
<img src="https://superset.apache.org/img/databases/timescale.png" alt="timescale" border="0" width="200" />
<img src="https://superset.apache.org/img/databases/postgresql.png" alt="postgresql" border="0" width="200" />
<img src="https://superset.apache.org/img/databases/mysql.png" alt="mysql" border="0" width="200" />
<img src="https://superset.apache.org/img/databases/mssql-server.png" alt="mssql-server" border="0" width="200" />
<img src="https://superset.apache.org/img/databases/ibm-db2.svg" alt="db2" border="0" width="220" />
<img src="https://superset.apache.org/img/databases/sqlite.png" alt="sqlite" border="0" width="200" />
<img src="https://superset.apache.org/img/databases/sybase.png" alt="sybase" border="0" width="200" />
<img src="https://superset.apache.org/img/databases/mariadb.png" alt="mariadb" border="0" width="200" />
<img src="https://superset.apache.org/img/databases/vertica.png" alt="vertica" border="0" width="200" />
<img src="https://superset.apache.org/img/databases/oracle.png" alt="oracle" border="0" width="200" />
<img src="https://superset.apache.org/img/databases/firebird.png" alt="firebird" border="0" width="200" />
<img src="https://superset.apache.org/img/databases/greenplum.png" alt="greenplum" border="0" width="200" />
<img src="https://superset.apache.org/img/databases/clickhouse.png" alt="clickhouse" border="0" width="200" />
<img src="https://superset.apache.org/img/databases/exasol.png" alt="exasol" border="0" width="160" />
<img src="https://superset.apache.org/img/databases/monet-db.png" alt="monet-db" border="0" width="200" />
<img src="https://superset.apache.org/img/databases/apache-kylin.png" alt="apache-kylin" border="0" width="80"/>
<img src="https://superset.apache.org/img/databases/hologres.png" alt="hologres" border="0" width="80"/>
<img src="https://superset.apache.org/img/databases/netezza.png" alt="netezza" border="0" width="80"/>
<img src="https://superset.apache.org/img/databases/pinot.png" alt="pinot" border="0" width="200" />
<img src="https://superset.apache.org/img/databases/teradata.png" alt="teradata" border="0" width="200" />
<img src="https://superset.apache.org/img/databases/yugabyte.png" alt="yugabyte" border="0" width="200" />
<img src="https://superset.apache.org/img/databases/databend.png" alt="databend" border="0" width="200" />
<img src="https://superset.apache.org/img/databases/starrocks.png" alt="starrocks" border="0" width="200" />
<img src="https://superset.apache.org/img/databases/doris.png" alt="doris" border="0" width="200" />
<img src="https://superset.apache.org/img/databases/oceanbase.svg" alt="oceanbase" border="0" width="220" />
<img src="https://superset.apache.org/img/databases/sap-hana.png" alt="sap-hana" border="0" width="220" />
<img src="https://superset.apache.org/img/databases/denodo.png" alt="denodo" border="0" width="200" />
<img src="https://superset.apache.org/img/databases/ydb.svg" alt="ydb" border="0" width="200" />
<img src="https://superset.apache.org/img/databases/tdengine.png" alt="TDengine" border="0" width="200" />
</p>
<!-- SUPPORTED_DATABASES_END -->
**A more comprehensive list of supported databases** along with the configuration instructions can be found [here](https://superset.apache.org/docs/databases).
**A more comprehensive list of supported databases** along with the configuration instructions can be found [here](https://superset.apache.org/docs/configuration/databases).
Want to add support for your datastore or data engine? Read more [here](https://superset.apache.org/docs/frequently-asked-questions#does-superset-work-with-insert-database-engine-here) about the technical requirements.
@@ -191,14 +161,14 @@ Try out Superset's [quickstart](https://superset.apache.org/docs/quickstart/) gu
## Contributor Guide
Interested in contributing? Check out our
[Developer Portal](https://superset.apache.org/developer_portal/)
[CONTRIBUTING.md](https://github.com/apache/superset/blob/master/CONTRIBUTING.md)
to find resources around contributing along with a detailed guide on
how to set up a development environment.
## Resources
- [Superset "In the Wild"](https://superset.apache.org/inTheWild) - see who's using Superset, and [add your organization](https://github.com/apache/superset/edit/master/RESOURCES/INTHEWILD.yaml) to the list!
- [Feature Flags](https://superset.apache.org/docs/configuration/feature-flags) - the status of Superset's Feature Flags.
- [Superset "In the Wild"](https://github.com/apache/superset/blob/master/RESOURCES/INTHEWILD.md) - open a PR to add your org to the list!
- [Feature Flags](https://github.com/apache/superset/blob/master/RESOURCES/FEATURE_FLAGS.md) - the status of Superset's Feature Flags.
- [Standard Roles](https://github.com/apache/superset/blob/master/RESOURCES/STANDARD_ROLES.md) - How RBAC permissions map to roles.
- [Superset Wiki](https://github.com/apache/superset/wiki) - Tons of additional community resources: best practices, community content and other information.
- [Superset SIPs](https://github.com/orgs/apache/projects/170) - The status of Superset's SIPs (Superset Improvement Proposals) for both consensus and implementation status.
View File
@@ -92,7 +92,7 @@ Some of the new features in this release are disabled by default. Each has a fea
| Feature | Feature Flag | Dependencies | Documentation |
| --- | --- | --- | --- |
| Global Async Queries | `GLOBAL_ASYNC_QUERIES: True` | Redis 5.0+, celery workers configured and running | [Extra documentation](https://superset.apache.org/docs/contributing/misc#async-chart-queries)
| Global Async Queries | `GLOBAL_ASYNC_QUERIES: True` | Redis 5.0+, celery workers configured and running | [Extra documentation](https://github.com/apache/superset/blob/master/CONTRIBUTING.md#async-chart-queries)
| Dashboard Native Filters | `DASHBOARD_NATIVE_FILTERS: True` | |
| Alerts & Reporting | `ALERT_REPORTS: True` | [Celery workers configured & celery beat process](https://superset.apache.org/docs/installation/async-queries-celery) |
| Homescreen Thumbnails | `THUMBNAILS: TRUE, THUMBNAIL_CACHE_CONFIG: CacheConfig = { "CACHE_TYPE": "null", "CACHE_NO_NULL_WARNING": True}`| selenium, pillow 7, celery |
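All of the flags in the table above are set from `superset_config.py`. As a rough sketch of what enabling them might look like (flag names are taken from the table; the cache values are the placeholders shown above, not production settings):

```python
# superset_config.py (sketch; adjust backends and hosts to your deployment)
FEATURE_FLAGS = {
    "GLOBAL_ASYNC_QUERIES": True,     # needs Redis 5.0+ and running Celery workers
    "DASHBOARD_NATIVE_FILTERS": True,
    "ALERT_REPORTS": True,            # needs Celery workers and a celery beat process
    "THUMBNAILS": True,               # needs selenium, pillow, celery
}

# Thumbnail cache config as shown in the table; the "null" backend is a
# placeholder and should be swapped for Redis or similar in production.
THUMBNAIL_CACHE_CONFIG = {
    "CACHE_TYPE": "null",
    "CACHE_NO_NULL_WARNING": True,
}
```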
RESOURCES/FEATURE_FLAGS.md Normal file (103 lines added)
View File
@@ -0,0 +1,103 @@
<!--
Licensed to the Apache Software Foundation (ASF) under one
or more contributor license agreements. See the NOTICE file
distributed with this work for additional information
regarding copyright ownership. The ASF licenses this file
to you under the Apache License, Version 2.0 (the
"License"); you may not use this file except in compliance
with the License. You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing,
software distributed under the License is distributed on an
"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
KIND, either express or implied. See the License for the
specific language governing permissions and limitations
under the License.
-->
# Superset Feature Flags
This is a list of the current optional Superset features. See `config.py` for default values. These features can be turned on or off by setting them to `True`/`False` in your `superset_config.py`.
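Concretely, each flag is a key in the `FEATURE_FLAGS` dictionary; any flag you don't list keeps its default from `config.py`. A minimal sketch, using flag names from the lists below:

```python
# superset_config.py -- toggle optional features (sketch; the default for
# every flag lives in superset/config.py)
FEATURE_FLAGS = {
    "DRILL_BY": True,             # stable flag: enable drill-by on charts
    "ESTIMATE_QUERY_COST": True,  # in-testing flag: query cost estimates in SQL Lab
    "KV_STORE": False,            # deprecated flag: pin explicitly to avoid surprises
}
```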
## In Development
These features are considered **unfinished** and should only be used in development environments.
[//]: # "PLEASE KEEP THE LIST SORTED ALPHABETICALLY"
- ALERT_REPORT_TABS
- CHART_PLUGINS_EXPERIMENTAL
- DATE_RANGE_TIMESHIFTS_ENABLED
- ENABLE_ADVANCED_DATA_TYPES
- PRESTO_EXPAND_DATA
- SHARE_QUERIES_VIA_KV_STORE
- TAGGING_SYSTEM
## In Testing
These features are **finished** but currently being tested. They are usable, but may still contain some bugs.
[//]: # "PLEASE KEEP THE LIST SORTED ALPHABETICALLY"
- ALERT_REPORTS: [(docs)](https://superset.apache.org/docs/configuration/alerts-reports)
- ALLOW_FULL_CSV_EXPORT
- CACHE_IMPERSONATION
- CONFIRM_DASHBOARD_DIFF
- DATE_FORMAT_IN_EMAIL_SUBJECT: [(docs)](https://superset.apache.org/docs/configuration/alerts-reports#commons)
- DYNAMIC_PLUGINS
- ENABLE_SUPERSET_META_DB: [(docs)](https://superset.apache.org/docs/configuration/databases/#querying-across-databases)
- ESTIMATE_QUERY_COST
- GLOBAL_ASYNC_QUERIES [(docs)](https://github.com/apache/superset/blob/master/CONTRIBUTING.md#async-chart-queries)
- IMPERSONATE_WITH_EMAIL_PREFIX
- PLAYWRIGHT_REPORTS_AND_THUMBNAILS
- RLS_IN_SQLLAB
- SSH_TUNNELING [(docs)](https://superset.apache.org/docs/configuration/setup-ssh-tunneling)
- USE_ANALAGOUS_COLORS
## Stable
These feature flags are **safe for production**. They have been tested and will be supported for at least the current major version cycle.
[//]: # "PLEASE KEEP THESE LISTS SORTED ALPHABETICALLY"
### Flags on the path to feature launch and flag deprecation/removal
- DASHBOARD_VIRTUALIZATION
### Flags retained for runtime configuration
Currently, some of our feature flags act as dynamic configurations that can be changed
on the fly. This contradicts the typical ephemeral feature-flag use case,
where a flag is used to mature a feature and is eventually deprecated once the feature is
solid. Eventually we'll likely refactor these under a more formal "dynamic configurations" framework, managed
independently. This new framework will also allow for non-boolean configurations.
- ALERTS_ATTACH_REPORTS
- ALLOW_ADHOC_SUBQUERY
- DASHBOARD_RBAC [(docs)](https://superset.apache.org/docs/using-superset/creating-your-first-dashboard#manage-access-to-dashboards)
- DATAPANEL_CLOSED_BY_DEFAULT
- DRILL_BY
- DRUID_JOINS
- EMBEDDABLE_CHARTS
- EMBEDDED_SUPERSET
- ENABLE_TEMPLATE_PROCESSING
- ESCAPE_MARKDOWN_HTML
- LISTVIEWS_DEFAULT_CARD_VIEW
- SCHEDULED_QUERIES [(docs)](https://superset.apache.org/docs/configuration/alerts-reports)
- SLACK_ENABLE_AVATARS (see `superset/config.py` for more information)
- SQLLAB_BACKEND_PERSISTENCE
- SQL_VALIDATORS_BY_ENGINE [(docs)](https://superset.apache.org/docs/configuration/sql-templating)
- THUMBNAILS [(docs)](https://superset.apache.org/docs/configuration/cache)
## Deprecated Flags
These feature flags currently default to True and **will be removed in a future major release**. For the current release you can turn them off by setting your config to False, but it is advised to remove them from your local configuration or explicitly set them to **True** so that you do not experience any unexpected changes in a future release.
[//]: # "PLEASE KEEP THE LIST SORTED ALPHABETICALLY"
- AVOID_COLORS_COLLISION
- DRILL_TO_DETAIL
- ENABLE_JAVASCRIPT_CONTROLS
- KV_STORE
RESOURCES/INTHEWILD.md Normal file (226 lines added)
View File
@@ -0,0 +1,226 @@
<!--
Licensed to the Apache Software Foundation (ASF) under one
or more contributor license agreements. See the NOTICE file
distributed with this work for additional information
regarding copyright ownership. The ASF licenses this file
to you under the Apache License, Version 2.0 (the
"License"); you may not use this file except in compliance
with the License. You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing,
software distributed under the License is distributed on an
"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
KIND, either express or implied. See the License for the
specific language governing permissions and limitations
under the License.
-->
## Superset Users in the Wild
Here's a list of organizations, broken down into broad industry categories, that have taken the time to send a PR to let
the world know they are using Apache Superset. If you are a user and want to be recognized,
all you have to do is file a simple PR [like this one](https://github.com/apache/superset/pull/10122) — [just click here](https://github.com/apache/superset/edit/master/RESOURCES/INTHEWILD.md) to do so. If you think
the categorization is inaccurate, please file a PR with your correction as well.
Join our growing community!
### Sharing Economy
- [Airbnb](https://github.com/airbnb)
- [Faasos](https://faasos.com/) [@shashanksingh]
- [Free2Move](https://www.free2move.com/) [@PaoloTerzi]
- [Hostnfly](https://www.hostnfly.com/) [@alexisrosuel]
- [Lime](https://www.li.me/) [@cxmcc]
- [Lyft](https://www.lyft.com/)
- [Ontruck](https://www.ontruck.com/)
### Financial Services
- [Aktia Bank plc](https://www.aktia.com)
- [American Express](https://www.americanexpress.com) [@TheLastSultan]
- [bumper](https://www.bumper.co/) [@vasu-ram, @JamiePercival]
- [Cape Crypto](https://capecrypto.com)
- [Capital Service S.A.](https://capitalservice.pl) [@pkonarzewski]
- [Clark.de](https://clark.de/)
- [Europace](https://europace.de)
- [KarrotPay](https://www.daangnpay.com/)
- [Remita](https://remita.net) [@mujibishola]
- [Taveo](https://www.taveo.com) [@codek]
- [Unit](https://www.unit.co/about-us) [@amitmiran137]
- [Wise](https://wise.com) [@koszti]
- [Xendit](https://xendit.co/) [@LieAlbertTriAdrian]
- [Cover Genius](https://covergenius.com/)
### Gaming
- [Popoko VM Games Studio](https://popoko.live)
### E-Commerce
- [AiHello](https://www.aihello.com) [@ganeshkrishnan1]
- [Bazaar Technologies](https://www.bazaartech.com) [@umair-abro]
- [Dragonpass](https://www.dragonpass.com.cn/) [@zhxjdwh]
- [Dropit Shopping](https://www.dropit.shop/) [@dropit-dev]
- [Fanatics](https://www.fanatics.com/) [@coderfender]
- [Fordeal](https://www.fordeal.com) [@Renkai]
- [Fynd](https://www.fynd.com/) [@darpanjain07]
- [GFG - Global Fashion Group](https://global-fashion-group.com) [@ksaagariconic]
- [GoTo/Gojek](https://www.gojek.io/) [@gwthm-in]
- [HuiShouBao](https://www.huishoubao.com/) [@Yukinoshita-Yukino]
- [Now](https://www.now.vn/) [@davidkohcw]
- [Qunar](https://www.qunar.com/) [@flametest]
- [Rakuten Viki](https://www.viki.com)
- [Shopee](https://shopee.sg) [@xiaohanyu]
- [Shopkick](https://www.shopkick.com) [@LAlbertalli]
- [ShopUp](https://www.shopup.org/) [@gwthm-in]
- [Tails.com](https://tails.com/gb/) [@alanmcruickshank]
- [THE ICONIC](https://theiconic.com.au/) [@ksaagariconic]
- [Utair](https://www.utair.ru) [@utair-digital]
- [VkusVill](https://vkusvill.ru/) [@ETselikov]
- [Zalando](https://www.zalando.com) [@dmigo]
- [Zalora](https://www.zalora.com) [@ksaagariconic]
- [Zepto](https://www.zeptonow.com/) [@gwthm-in]
### Enterprise Technology
- [A3Data](https://a3data.com.br) [@neylsoncrepalde]
- [Analytics Aura](https://analyticsaura.com/) [@Analytics-Aura]
- [Apollo GraphQL](https://www.apollographql.com/) [@evans]
- [Astronomer](https://www.astronomer.io) [@ryw]
- [Avesta Technologies](https://avestatechnologies.com/) [@TheRum]
- [Caizin](https://caizin.com/) [@tejaskatariya]
- [Canonical](https://canonical.com)
- [Careem](https://www.careem.com/) [@samraHanif0340]
- [Cloudsmith](https://cloudsmith.io) [@alancarson]
- [Cyberhaven](https://www.cyberhaven.com/) [@toliver-ch]
- [Deepomatic](https://deepomatic.com/) [@Zanoellia]
- [Dial Once](https://www.dial-once.com/)
- [Dremio](https://dremio.com) [@narendrans]
- [EFinance](https://www.efinance.com.eg) [@habeeb556]
- [Elestio](https://elest.io/) [@kaiwalyakoparkar]
- [ELMO Cloud HR & Payroll](https://elmosoftware.com.au/)
- [Endress+Hauser](https://www.endress.com/) [@rumbin]
- [FBK - ICT center](https://ict.fbk.eu)
- [Formbricks](https://formbricks.com)
- [Gavagai](https://gavagai.io) [@gavagai-corp]
- [GfK Data Lab](https://www.gfk.com/home) [@mherr]
- [HPE](https://www.hpe.com/in/en/home.html) [@anmol-hpe]
- [Hydrolix](https://www.hydrolix.io/)
- [Intercom](https://www.intercom.com/) [@kate-gallo]
- [jampp](https://jampp.com/)
- [Konfío](https://konfio.mx) [@uis-rodriguez]
- [Mainstrat](https://mainstrat.com/)
- [mishmash io](https://mishmash.io/) [@mishmash-io]
- [Myra Labs](https://www.myralabs.com/) [@viksit]
- [Nielsen](https://www.nielsen.com/) [@amitNielsen]
- [Ona](https://ona.io) [@pld]
- [Orange](https://www.orange.com) [@icsu]
- [Oslandia](https://oslandia.com)
- [Oxylabs](https://oxylabs.io/) [@rytis-ulys]
- [Peak AI](https://www.peak.ai/) [@azhar22k]
- [PeopleDoc](https://www.people-doc.com) [@rodo]
- [PlaidCloud](https://www.plaidcloud.com)
- [Preset, Inc.](https://preset.io)
- [PubNub](https://pubnub.com) [@jzucker2]
- [ReadyTech](https://www.readytech.io)
- [Reward Gateway](https://www.rewardgateway.com)
- [RIADVICE](https://riadvice.tn) [@riadvice]
- [ScopeAI](https://www.getscopeai.com) [@iloveluce]
- [shipmnts](https://shipmnts.com)
- [Showmax](https://showmax.com) [@bobek]
- [SingleStore](https://www.singlestore.com/)
- [TechAudit](https://www.techaudit.info) [@ETselikov]
- [Tenable](https://www.tenable.com) [@dflionis]
- [Tentacle](https://www.linkedin.com/company/tentacle-cmi/) [@jdclarke5]
- [timbr.ai](https://timbr.ai/) [@semantiDan]
- [Tobii](https://www.tobii.com/) [@dwa]
- [Tooploox](https://www.tooploox.com/) [@jakubczaplicki]
- [Unvired](https://unvired.com) [@srinisubramanian]
- [Virtuoso QA](https://www.virtuosoqa.com)
- [Whale](https://whale.im)
- [Windsor.ai](https://www.windsor.ai/) [@octaviancorlade]
- [WinWin Network马上赢](https://brandct.cn/) [@wenbinye]
- [Zeta](https://www.zeta.tech/) [@shaikidris]
### Media & Entertainment
- [6play](https://www.6play.fr) [@CoryChaplin]
- [bilibili](https://www.bilibili.com) [@Moinheart]
- [BurdaForward](https://www.burda-forward.de/en/)
- [Douban](https://www.douban.com/) [@luchuan]
- [Kuaishou](https://www.kuaishou.com/) [@zhaoyu89730105]
- [Netflix](https://www.netflix.com/)
- [Prensa Iberica](https://www.prensaiberica.es/) [@zamar-roura]
- [TME QQMUSIC/WESING](https://www.tencentmusic.com/) [@shenyuanli,@marklaw]
- [Xite](https://xite.com/) [@shashankkoppar]
- [Zaihang](https://www.zaih.com/)
### Education
- [Aveti Learning](https://avetilearning.com/) [@TheShubhendra]
- [Brilliant.org](https://brilliant.org/)
- [Open edX](https://openedx.org/)
- [Platzi.com](https://platzi.com/)
- [Sunbird](https://www.sunbird.org/) [@eksteporg]
- [The GRAPH Network](https://thegraphnetwork.org/) [@fccoelho]
- [Udemy](https://www.udemy.com/) [@sungjuly]
- [VIPKID](https://www.vipkid.com.cn/) [@illpanda]
- [WikiMedia Foundation](https://wikimediafoundation.org) [@vg]
### Energy
- [Airboxlab](https://foobot.io) [@antoine-galataud]
- [DouroECI](https://www.douroeci.com/) [@nunohelibeires]
- [Safaricom](https://www.safaricom.co.ke/) [@mmutiso]
- [Scoot](https://scoot.co/) [@haaspt]
- [Wattbewerb](https://wattbewerb.de/) [@wattbewerb]
### Healthcare
- [Amino](https://amino.com) [@shkr]
- [Bluesquare](https://www.bluesquarehub.com/) [@madewulf]
- [Care](https://www.getcare.io/) [@alandao2021]
- [Living Goods](https://www.livinggoods.org) [@chelule]
- [Maieutical Labs](https://maieuticallabs.it) [@xrmx]
- [Medic](https://medic.org) [@1yuv]
- [REDCap Cloud](https://www.redcapcloud.com/)
- [TrustMedis](https://trustmedis.com/) [@famasya]
- [WeSure](https://www.wesure.cn/)
- [2070Health](https://2070health.com/)
### HR / Staffing
- [Swile](https://www.swile.co/) [@PaoloTerzi]
- [Symmetrics](https://www.symmetrics.fyi)
- [bluquist](https://bluquist.com/)
### Government
- [City of Ann Arbor, MI](https://www.a2gov.org/) [@sfirke]
- [RIS3 Strategy of CZ, MIT CR](https://www.ris3.cz/) [@RIS3CZ]
- [NRLM - Sarathi, India](https://pib.gov.in/PressReleasePage.aspx?PRID=1999586)
### Travel
- [Agoda](https://www.agoda.com/) [@lostseaway, @maiake, @obombayo]
- [HomeToGo](https://hometogo.com/) [@pedromartinsteenstrup]
- [Skyscanner](https://www.skyscanner.net/) [@cleslie, @stanhoucke]
### Others
- [10Web](https://10web.io/)
- [AI inside](https://inside.ai/en/)
- [Automattic](https://automattic.com/) [@Khrol, @Usiel]
- [Dropbox](https://www.dropbox.com/) [@bkyryliuk]
- [Flowbird](https://flowbird.com) [@EmmanuelCbd]
- [GEOTAB](https://www.geotab.com) [@JZ6]
- [Grassroot](https://www.grassrootinstitute.org/)
- [Increff](https://www.increff.com/) [@ishansinghania]
- [komoot](https://www.komoot.com/) [@christophlingg]
- [Let's Roam](https://www.letsroam.com/)
- [Machrent SA](https://www.machrent.com/)
- [Onebeat](https://1beat.com/) [@GuyAttia]
- [X](https://x.com/)
- [VLMedia](https://www.vlmedia.com.tr/) [@ibotheperfect]
- [Yahoo!](https://yahoo.com/)
View File
@@ -1,686 +0,0 @@
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership. The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.
# Apache Superset Users in the Wild
#
# To add your organization:
# 1. Find the appropriate category (or add a new one)
# 2. Add an entry with your organization details
# 3. Optionally add a logo file to docs/static/img/logos/
#
# Required fields:
# - name: Your organization name
# - url: Link to your organization's website
#
# Optional fields:
# - logo: Filename of logo in docs/static/img/logos/ (e.g., "mycompany.svg")
# - contributors: List of GitHub usernames who contributed (e.g., ["@username"])
categories:
Sharing Economy:
- name: Airbnb
url: https://github.com/airbnb
- name: Faasos
url: https://faasos.com/
contributors: ["@shashanksingh"]
- name: Free2Move
url: https://www.free2move.com/
contributors: ["@PaoloTerzi"]
- name: Hostnfly
url: https://www.hostnfly.com/
contributors: ["@alexisrosuel"]
- name: Lime
url: https://www.li.me/
contributors: ["@cxmcc"]
- name: Lyft
url: https://www.lyft.com/
- name: Ontruck
url: https://www.ontruck.com/
Financial Services:
- name: Aktia Bank plc
url: https://www.aktia.com
- name: American Express
url: https://www.americanexpress.com
contributors: ["@TheLastSultan"]
- name: bumper
url: https://www.bumper.co/
contributors: ["@vasu-ram", "@JamiePercival"]
- name: Cape Crypto
url: https://capecrypto.com
- name: Capital Service S.A.
url: https://capitalservice.pl
contributors: ["@pkonarzewski"]
- name: Clark.de
url: https://clark.de/
- name: EnquiryLabs
url: https://www.enquirylabs.co.uk
- name: Europace
url: https://europace.de
- name: KarrotPay
url: https://www.daangnpay.com/
- name: Remita
url: https://remita.net
contributors: ["@mujibishola"]
- name: Taveo
url: https://www.taveo.com
contributors: ["@codek"]
- name: Unit
url: https://www.unit.co/about-us
contributors: ["@amitmiran137"]
- name: Wise
url: https://wise.com
contributors: ["@koszti"]
- name: Xendit
url: https://xendit.co/
contributors: ["@LieAlbertTriAdrian"]
- name: Cover Genius
url: https://covergenius.com/
Gaming:
- name: Popoko VM Games Studio
url: https://popoko.live
E-Commerce:
- name: AiHello
url: https://www.aihello.com
contributors: ["@ganeshkrishnan1"]
- name: Bazaar Technologies
url: https://www.bazaartech.com
contributors: ["@umair-abro"]
- name: Blinkit
url: https://www.blinkit.com/
contributors: ["@amsharm2"]
- name: Dragonpass
url: https://www.dragonpass.com.cn/
contributors: ["@zhxjdwh"]
- name: Dropit Shopping
url: https://www.dropit.shop/
contributors: ["@dropit-dev"]
- name: Fordeal
url: https://www.fordeal.com
contributors: ["@Renkai"]
- name: Fynd
url: https://www.fynd.com/
contributors: ["@darpanjain07"]
- name: GFG - Global Fashion Group
url: https://global-fashion-group.com
contributors: ["@ksaagariconic"]
- name: GoTo/Gojek
url: https://www.gojek.io/
contributors: ["@gwthm-in"]
- name: HuiShouBao
url: https://www.huishoubao.com/
contributors: ["@Yukinoshita-Yukino"]
- name: Now
url: https://www.now.vn/
contributors: ["@davidkohcw"]
- name: Qunar
url: https://www.qunar.com/
contributors: ["@flametest"]
- name: Rakuten Viki
url: https://www.viki.com
- name: Shopee
url: https://shopee.sg
contributors: ["@xiaohanyu"]
- name: Shopkick
url: https://www.shopkick.com
contributors: ["@LAlbertalli"]
- name: ShopUp
url: https://www.shopup.org/
contributors: ["@gwthm-in"]
- name: Tails.com
url: https://tails.com/gb/
contributors: ["@alanmcruickshank"]
- name: THE ICONIC
url: https://theiconic.com.au/
contributors: ["@ksaagariconic"]
- name: Utair
url: https://www.utair.ru
contributors: ["@utair-digital"]
- name: VkusVill
url: https://vkusvill.ru/
contributors: ["@ETselikov"]
- name: Zalando
url: https://www.zalando.com
contributors: ["@dmigo"]
- name: Zalora
url: https://www.zalora.com
contributors: ["@ksaagariconic"]
- name: Zepto
url: https://www.zeptonow.com/
contributors: ["@gwthm-in"]
Enterprise Technology:
- name: A3Data
url: https://a3data.com.br
contributors: ["@neylsoncrepalde"]
- name: Analytics Aura
url: https://analyticsaura.com/
contributors: ["@Analytics-Aura"]
- name: Apollo GraphQL
url: https://www.apollographql.com/
contributors: ["@evans"]
- name: Astronomer
url: https://www.astronomer.io
contributors: ["@ryw"]
- name: Avesta Technologies
url: https://avestatechnologies.com/
contributors: ["@TheRum"]
- name: Caizin
url: https://caizin.com/
contributors: ["@tejaskatariya"]
- name: Canonical
url: https://canonical.com
- name: Careem
url: https://www.careem.com/
contributors: ["@samraHanif0340"]
- name: Cloudsmith
url: https://cloudsmith.io
contributors: ["@alancarson"]
- name: Cyberhaven
url: https://www.cyberhaven.com/
contributors: ["@toliver-ch"]
- name: Deepomatic
url: https://deepomatic.com/
contributors: ["@Zanoellia"]
- name: Dial Once
url: https://www.dial-once.com/
- name: Dremio
url: https://dremio.com
contributors: ["@narendrans"]
- name: EFinance
url: https://www.efinance.com.eg
contributors: ["@habeeb556"]
- name: Elestio
url: https://elest.io/
contributors: ["@kaiwalyakoparkar"]
- name: ELMO Cloud HR & Payroll
url: https://elmosoftware.com.au/
- name: Endress+Hauser
url: https://www.endress.com/
contributors: ["@rumbin"]
- name: FBK - ICT center
url: https://ict.fbk.eu
- name: Formbricks
url: https://formbricks.com
- name: Gavagai
url: https://gavagai.io
contributors: ["@gavagai-corp"]
- name: GfK Data Lab
url: https://www.gfk.com/home
contributors: ["@mherr"]
# Logo approved by @anmol-hpe on behalf of HPE
- name: HPE
url: https://www.hpe.com/in/en/home.html
logo: hpe.png
contributors: ["@anmol-hpe"]
- name: Hydrolix
url: https://www.hydrolix.io/
- name: Intercom
url: https://www.intercom.com/
contributors: ["@kate-gallo"]
- name: jampp
url: https://jampp.com/
- name: Konfío
url: https://konfio.mx
contributors: ["@uis-rodriguez"]
- name: Mainstrat
url: https://mainstrat.com/
- name: mishmash io
url: https://mishmash.io/
contributors: ["@mishmash-io"]
- name: Myra Labs
url: https://www.myralabs.com/
contributors: ["@viksit"]
- name: Nielsen
url: https://www.nielsen.com/
contributors: ["@amitNielsen"]
- name: Ona
url: https://ona.io
contributors: ["@pld"]
- name: Orange
url: https://www.orange.com
contributors: ["@icsu"]
- name: Oslandia
url: https://oslandia.com
- name: Oxylabs
url: https://oxylabs.io/
contributors: ["@rytis-ulys"]
- name: Peak AI
url: https://www.peak.ai/
contributors: ["@azhar22k"]
- name: PeopleDoc
url: https://www.people-doc.com
contributors: ["@rodo"]
- name: PlaidCloud
url: https://plaidcloud.com
logo: plaidcloud.svg
contributors: ["@rad-pat"]
- name: Preset, Inc.
url: https://preset.io
logo: preset.svg
contributors: ["@mistercrunch", "@betodealmeida", "@dpgaspar", "@rusackas", "@sadpandajoe", "@Vitor-Avila", "@kgabryje", "@geido", "@eschutho", "@Antonio-RiveroMartnez", "@yousoph"]
- name: PubNub
url: https://pubnub.com
contributors: ["@jzucker2"]
- name: ReadyTech
url: https://www.readytech.io
- name: Reward Gateway
url: https://www.rewardgateway.com
- name: RIADVICE
url: https://riadvice.tn
contributors: ["@riadvice"]
- name: ScopeAI
url: https://www.getscopeai.com
contributors: ["@iloveluce"]
- name: shipmnts
url: https://shipmnts.com
- name: Showmax
url: https://showmax.com
contributors: ["@bobek"]
- name: SingleStore
url: https://www.singlestore.com/
- name: TechAudit
url: https://www.techaudit.info
contributors: ["@ETselikov"]
- name: Tenable
url: https://www.tenable.com
contributors: ["@dflionis"]
- name: Tentacle
url: https://www.linkedin.com/company/tentacle-cmi/
contributors: ["@jdclarke5"]
- name: timbr.ai
url: https://timbr.ai/
contributors: ["@semantiDan"]
- name: Tobii
url: https://www.tobii.com/
contributors: ["@dwa"]
- name: Tooploox
url: https://www.tooploox.com/
contributors: ["@jakubczaplicki"]
- name: Unvired
url: https://unvired.com
contributors: ["@srinisubramanian"]
- name: UserGuiding
url: https://userguiding.com/
logo: userguiding.svg
contributors: ["@tzercin"]
- name: Virtuoso QA
url: https://www.virtuosoqa.com
- name: Whale
url: https://whale.im
- name: Windsor.ai
url: https://www.windsor.ai/
contributors: ["@octaviancorlade"]
- name: WinWin Network马上赢
url: https://brandct.cn/
contributors: ["@wenbinye"]
- name: Zeta
url: https://www.zeta.tech/
contributors: ["@shaikidris"]
Media & Entertainment:
- name: 6play
url: https://www.6play.fr
contributors: ["@CoryChaplin"]
- name: bilibili
url: https://www.bilibili.com
contributors: ["@Moinheart"]
- name: BurdaForward
url: https://www.burda-forward.de/en/
- name: Douban
url: https://www.douban.com/
contributors: ["@luchuan"]
- name: Kuaishou
url: https://www.kuaishou.com/
contributors: ["@zhaoyu89730105"]
- name: Netflix
url: https://www.netflix.com/
- name: Prensa Iberica
url: https://www.prensaiberica.es/
contributors: ["@zamar-roura"]
- name: TME QQMUSIC/WESING
url: https://www.tencentmusic.com/
contributors: ["@shenyuanli", "@marklaw"]
- name: Xite
url: https://xite.com/
contributors: ["@shashankkoppar"]
- name: Zaihang
url: https://www.zaih.com/
Education:
- name: Aveti Learning
url: https://avetilearning.com/
contributors: ["@TheShubhendra"]
- name: Brilliant.org
url: https://brilliant.org/
- name: Cirrus Assessment
url: https://cirrusassessment.com/
logo: cirrus.svg
contributors: ["@jeroenhabets", "@ddmm-white", "@paulrocost"]
- name: Open edX
url: https://openedx.org/
- name: Platzi.com
url: https://platzi.com/
- name: Sunbird
url: https://www.sunbird.org/
contributors: ["@eksteporg"]
- name: The GRAPH Network
url: https://thegraphnetwork.org/
contributors: ["@fccoelho"]
- name: Udemy
url: https://www.udemy.com/
contributors: ["@sungjuly"]
- name: VIPKID
url: https://www.vipkid.com.cn/
contributors: ["@illpanda"]
- name: WikiMedia Foundation
url: https://wikimediafoundation.org
contributors: ["@vg"]
Energy:
- name: Airboxlab
url: https://foobot.io
contributors: ["@antoine-galataud"]
- name: DouroECI
url: https://www.douroeci.com/
contributors: ["@nunohelibeires"]
- name: Safaricom
url: https://www.safaricom.co.ke/
contributors: ["@mmutiso"]
- name: Scoot
url: https://scoot.co/
contributors: ["@haaspt"]
- name: Wattbewerb
url: https://wattbewerb.de/
contributors: ["@wattbewerb"]
- name: Rogow
url: https://rogow.com.br/
contributors: ["@nilmonto"]
Healthcare:
- name: Amino
url: https://amino.com
contributors: ["@shkr"]
- name: Bluesquare
url: https://www.bluesquarehub.com/
contributors: ["@madewulf"]
- name: Care
url: https://www.getcare.io/
contributors: ["@alandao2021"]
- name: Living Goods
url: https://www.livinggoods.org
contributors: ["@chelule"]
- name: Maieutical Labs
url: https://maieuticallabs.it
contributors: ["@xrmx"]
- name: Medic
url: https://medic.org
contributors: ["@1yuv"]
- name: REDCap Cloud
url: https://www.redcapcloud.com/
- name: TrustMedis
url: https://trustmedis.com/
contributors: ["@famasya"]
- name: WeSure
url: https://www.wesure.cn/
- name: 2070Health
url: https://2070health.com/
HR / Staffing:
- name: Swile
url: https://www.swile.co/
contributors: ["@PaoloTerzi"]
- name: Symmetrics
url: https://www.symmetrics.fyi
- name: bluquist
url: https://bluquist.com/
Government:
- name: City of Ann Arbor, MI
url: https://www.a2gov.org/
contributors: ["@sfirke"]
- name: RIS3 Strategy of CZ, MIT CR
url: https://www.ris3.cz/
contributors: ["@RIS3CZ"]
- name: NRLM - Sarathi, India
url: https://pib.gov.in/PressReleasePage.aspx?PRID=1999586
Mobile Software:
- name: VLMedia
url: https://www.vlmedia.com.tr
logo: vlmedia.svg
contributors: ["@iercan"]
Travel:
- name: Agoda
url: https://www.agoda.com/
contributors: ["@lostseaway", "@maiake", "@obombayo"]
- name: HomeToGo
url: https://hometogo.com/
contributors: ["@pedromartinsteenstrup"]
- name: Skyscanner
url: https://www.skyscanner.net/
contributors: ["@cleslie", "@stanhoucke"]
Logistics:
- name: Stockarea
url: https://stockarea.io
Sports:
- name: Club 25 de Agosto (Femenino / Women's Team)
url: https://www.instagram.com/25deagosto.basketfemenino/
contributors: [ "@lion90" ]
logo: club25deagosto.svg
- name: Fanatics
url: https://www.fanatics.com/
contributors: [ "@coderfender" ]
- name: komoot
url: https://www.komoot.com/
contributors: [ "@christophlingg" ]
Others:
- name: 10Web
url: https://10web.io/
- name: AI inside
url: https://inside.ai/en/
- name: Automattic
url: https://automattic.com/
contributors: ["@Khrol", "@Usiel"]
- name: Dropbox
url: https://www.dropbox.com/
contributors: ["@bkyryliuk"]
- name: Flowbird
url: https://flowbird.com
contributors: ["@EmmanuelCbd"]
- name: GEOTAB
url: https://www.geotab.com
contributors: ["@JZ6"]
- name: Grassroot
url: https://www.grassrootinstitute.org/
- name: HOLLYLAND猛玛
url: https://www.hollyland.com
logo: hollyland猛玛.svg
contributors: ["@hlyda0601"]
- name: Increff
url: https://www.increff.com/
contributors: ["@ishansinghania"]
- name: Let's Roam
url: https://www.letsroam.com/
- name: Machrent SA
url: https://www.machrent.com/
- name: Onebeat
url: https://1beat.com/
contributors: ["@GuyAttia"]
- name: X
url: https://x.com/
- name: Yahoo!
url: https://yahoo.com/
View File
@@ -17,193 +17,192 @@ specific language governing permissions and limitations
under the License.
-->
| |Admin|Alpha|Gamma|Public|SQL_LAB|
|--------------------------------------------------|---|---|---|---|---|
| Permission/role description |Admins have all possible rights, including granting or revoking rights from other users and altering other people's slices and dashboards.|Alpha users have access to all data sources, but they cannot grant or revoke access from other users. They are also limited to altering the objects that they own. Alpha users can add and alter data sources.|Gamma users have limited access. They can only consume data coming from data sources they have been given access to through another complementary role. They only have access to view the slices and dashboards made from data sources that they have access to. Currently Gamma users are not able to alter or add data sources. We assume that they are mostly content consumers, though they can create slices and dashboards.|Public is the most restrictive built-in role, designed for anonymous/unauthenticated users viewing public dashboards. It provides minimal read-only access for dashboard viewing with interactive filters. Use `PUBLIC_ROLE_LIKE = "Public"` to apply these permissions to anonymous users.|The sql_lab role grants access to SQL Lab. Note that while Admin users have access to all databases by default, both Alpha and Gamma users need to be given access on a per database basis.|
| can read on SavedQuery |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|:heavy_check_mark:|
| can write on SavedQuery |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|:heavy_check_mark:|
| can read on CssTemplate |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can write on CssTemplate |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| can read on ReportSchedule |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| can write on ReportSchedule |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| can read on Chart |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can write on Chart |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| can read on Annotation |:heavy_check_mark:|:heavy_check_mark:|O|:heavy_check_mark:|O|
| can write on Annotation |:heavy_check_mark:|:heavy_check_mark:|O|O|O|
| can read on AnnotationLayerRestApi |:heavy_check_mark:|:heavy_check_mark:|O|:heavy_check_mark:|O|
| can read on Dataset |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| can write on Dataset |:heavy_check_mark:|:heavy_check_mark:|O|O|O|
| can read on Log |:heavy_check_mark:|O|O|O|O|
| can write on Log |:heavy_check_mark:|O|O|O|O|
| can read on Dashboard |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can write on Dashboard |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| can read on Database |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|:heavy_check_mark:|
| can write on Database |:heavy_check_mark:|O|O|O|O|
| can read on Query |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|:heavy_check_mark:|
| can this form get on ResetPasswordView |:heavy_check_mark:|O|O|O|O|
| can this form post on ResetPasswordView |:heavy_check_mark:|O|O|O|O|
| can this form get on ResetMyPasswordView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| can this form post on ResetMyPasswordView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| can this form get on UserInfoEditView |:heavy_check_mark:|O|O|O|O|
| can this form post on UserInfoEditView |:heavy_check_mark:|O|O|O|O|
| can show on UserDBModelView |:heavy_check_mark:|O|O|O|O|
| can edit on UserDBModelView |:heavy_check_mark:|O|O|O|O|
| can delete on UserDBModelView |:heavy_check_mark:|O|O|O|O|
| can add on UserDBModelView |:heavy_check_mark:|O|O|O|O|
| can list on UserDBModelView |:heavy_check_mark:|O|O|O|O|
| can userinfo on UserDBModelView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| resetmypassword on UserDBModelView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| resetpasswords on UserDBModelView |:heavy_check_mark:|O|O|O|O|
| userinfoedit on UserDBModelView |:heavy_check_mark:|O|O|O|O|
| can show on RoleModelView |:heavy_check_mark:|O|O|O|O|
| can edit on RoleModelView |:heavy_check_mark:|O|O|O|O|
| can delete on RoleModelView |:heavy_check_mark:|O|O|O|O|
| can add on RoleModelView |:heavy_check_mark:|O|O|O|O|
| can list on RoleModelView |:heavy_check_mark:|O|O|O|O|
| copyrole on RoleModelView |:heavy_check_mark:|O|O|O|O|
| can get on OpenApi |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| can show on SwaggerView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| can get on MenuApi |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| can list on AsyncEventsRestApi |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| can invalidate on CacheRestApi |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| can csv upload on Database |:heavy_check_mark:|O|O|O|O|
| can excel upload on Database |:heavy_check_mark:|O|O|O|O|
| can query form data on Api |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can query on Api |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can time range on Api |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can external metadata on Datasource |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can save on Datasource |:heavy_check_mark:|:heavy_check_mark:|O|O|O|
| can get on Datasource |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can my queries on SqlLab |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|:heavy_check_mark:|
| can log on Superset |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| can import dashboards on Superset |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| can schemas on Superset |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| can sqllab history on Superset |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|:heavy_check_mark:|
| can publish on Superset |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| can csv on Superset |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|:heavy_check_mark:|
| can slice on Superset |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can sync druid source on Superset |:heavy_check_mark:|O|O|O|O|
| can explore on Superset |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| can approve on Superset |:heavy_check_mark:|O|O|O|O|
| can explore json on Superset |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can fetch datasource metadata on Superset |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| can csrf token on Superset |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| can sqllab on Superset |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|:heavy_check_mark:|
| can select star on Superset |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| can warm up cache on Superset |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| can sqllab table viz on Superset |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|:heavy_check_mark:|
| can available domains on Superset |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| can request access on Superset |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| can dashboard on Superset |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can post on TableSchemaView |:heavy_check_mark:|O|O|O|:heavy_check_mark:|
| can expanded on TableSchemaView |:heavy_check_mark:|O|O|O|:heavy_check_mark:|
| can delete on TableSchemaView |:heavy_check_mark:|O|O|O|:heavy_check_mark:|
| can get on TabStateView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|:heavy_check_mark:|
| can post on TabStateView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|:heavy_check_mark:|
| can delete query on TabStateView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|:heavy_check_mark:|
| can migrate query on TabStateView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|:heavy_check_mark:|
| can activate on TabStateView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|:heavy_check_mark:|
| can delete on TabStateView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|:heavy_check_mark:|
| can put on TabStateView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|:heavy_check_mark:|
| can read on SecurityRestApi |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|:heavy_check_mark:|
| menu access on Security |:heavy_check_mark:|O|O|O|O|
| menu access on List Users |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| menu access on List Roles |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| menu access on Action Log |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| menu access on Manage |:heavy_check_mark:|:heavy_check_mark:|O|O|O|
| menu access on Annotation Layers |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| menu access on CSS Templates |:heavy_check_mark:|:heavy_check_mark:|O|O|O|
| menu access on Import Dashboards |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| menu access on Data |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| menu access on Databases |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| menu access on Datasets |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| menu access on Charts |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| menu access on Dashboards |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| menu access on SQL Lab |:heavy_check_mark:|O|O|O|:heavy_check_mark:|
| menu access on SQL Editor |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|:heavy_check_mark:|
| menu access on Saved Queries |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|:heavy_check_mark:|
| menu access on Query Search |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|:heavy_check_mark:|
| all datasource access on all_datasource_access |:heavy_check_mark:|:heavy_check_mark:|O|O|O|
| all database access on all_database_access |:heavy_check_mark:|:heavy_check_mark:|O|O|O|
| all query access on all_query_access |:heavy_check_mark:|O|O|O|O|
| can write on DynamicPlugin |:heavy_check_mark:|O|O|O|O|
| can edit on DynamicPlugin |:heavy_check_mark:|O|O|O|O|
| can list on DynamicPlugin |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| can show on DynamicPlugin |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| can download on DynamicPlugin |:heavy_check_mark:|O|O|O|O|
| can add on DynamicPlugin |:heavy_check_mark:|O|O|O|O|
| can delete on DynamicPlugin |:heavy_check_mark:|O|O|O|O|
| can external metadata by name on Datasource |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| can get value on KV |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| can store on KV |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| can tagged objects on TagView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| can suggestions on TagView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| can get on TagView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| can post on TagView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| can delete on TagView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| can edit on DashboardEmailScheduleView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| can list on DashboardEmailScheduleView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| can show on DashboardEmailScheduleView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| can add on DashboardEmailScheduleView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| can delete on DashboardEmailScheduleView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| muldelete on DashboardEmailScheduleView |:heavy_check_mark:|:heavy_check_mark:|O|O|O|
| can edit on SliceEmailScheduleView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| can list on SliceEmailScheduleView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| can show on SliceEmailScheduleView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| can add on SliceEmailScheduleView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| can delete on SliceEmailScheduleView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| muldelete on SliceEmailScheduleView |:heavy_check_mark:|:heavy_check_mark:|O|O|O|
| can edit on AlertModelView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| can list on AlertModelView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| can show on AlertModelView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| can add on AlertModelView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| can delete on AlertModelView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| can list on AlertLogModelView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| can show on AlertLogModelView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| can list on AlertObservationModelView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| can show on AlertObservationModelView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| menu access on Row Level Security |:heavy_check_mark:|O|O|O|O|
| menu access on Access requests |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| menu access on Home |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| menu access on Plugins |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| menu access on Dashboard Email Schedules |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| menu access on Chart Emails |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| menu access on Alerts |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| menu access on Alerts & Report |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| menu access on Scan New Datasources |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| can share dashboard on Superset |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| can share chart on Superset |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| can this form get on ColumnarToDatabaseView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| can this form post on ColumnarToDatabaseView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| can export on Chart |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| can write on DashboardFilterStateRestApi |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can read on DashboardFilterStateRestApi |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can write on DashboardPermalinkRestApi |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| can read on DashboardPermalinkRestApi |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can delete embedded on Dashboard |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| can set embedded on Dashboard |:heavy_check_mark:|O|O|O|O|
| can export on Dashboard |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| can get embedded on Dashboard |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| can export on Database |:heavy_check_mark:|O|O|O|O|
| can export on Dataset |:heavy_check_mark:|:heavy_check_mark:|O|O|O|
| can write on ExploreFormDataRestApi |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| can read on ExploreFormDataRestApi |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| can write on ExplorePermalinkRestApi |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| can read on ExplorePermalinkRestApi |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can export on ImportExportRestApi |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| can import on ImportExportRestApi |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| can export on SavedQuery |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|:heavy_check_mark:|
| can dashboard permalink on Superset |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can grant guest token on SecurityRestApi |:heavy_check_mark:|O|O|O|O|
| can read on AdvancedDataType |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| can read on EmbeddedDashboard |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can duplicate on Dataset |:heavy_check_mark:|:heavy_check_mark:|O|O|O|
| can read on Explore |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| can samples on Datasource |:heavy_check_mark:|:heavy_check_mark:|O|O|O|
| can read on AvailableDomains |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| can get or create dataset on Dataset |:heavy_check_mark:|:heavy_check_mark:|O|O|O|
| can get column values on Datasource |:heavy_check_mark:|:heavy_check_mark:|O|O|O|
| can export csv on SQLLab |:heavy_check_mark:|O|O|O|:heavy_check_mark:|
| can get results on SQLLab |:heavy_check_mark:|O|O|O|:heavy_check_mark:|
| can execute sql query on SQLLab |:heavy_check_mark:|O|O|O|:heavy_check_mark:|
| can recent activity on Log |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| |Admin|Alpha|Gamma|SQL_LAB|
|--------------------------------------------------|---|---|---|---|
| Permission/role description |Admins have all possible rights, including granting or revoking rights from other users and altering other people's slices and dashboards.|Alpha users have access to all data sources, but they cannot grant or revoke access from other users. They are also limited to altering the objects that they own. Alpha users can add and alter data sources.|Gamma users have limited access. They can only consume data coming from data sources they have been given access to through another complementary role. They only have access to view the slices and dashboards made from data sources that they have access to. Currently Gamma users are not able to alter or add data sources. We assume that they are mostly content consumers, though they can create slices and dashboards.|The sql_lab role grants access to SQL Lab. Note that while Admin users have access to all databases by default, both Alpha and Gamma users need to be given access on a per database basis.|
| can read on SavedQuery |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|
| can write on SavedQuery |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|
| can read on CssTemplate |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can write on CssTemplate |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can read on ReportSchedule |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can write on ReportSchedule |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can read on Chart |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can write on Chart |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can read on Annotation |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can write on Annotation |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can read on Dataset |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can write on Dataset |:heavy_check_mark:|:heavy_check_mark:|O|O|
| can read on Log |:heavy_check_mark:|O|O|O|
| can write on Log |:heavy_check_mark:|O|O|O|
| can read on Dashboard |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can write on Dashboard |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can read on Database |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|
| can write on Database |:heavy_check_mark:|O|O|O|
| can read on Query |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|
| can this form get on ResetPasswordView |:heavy_check_mark:|O|O|O|
| can this form post on ResetPasswordView |:heavy_check_mark:|O|O|O|
| can this form get on ResetMyPasswordView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can this form post on ResetMyPasswordView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can this form get on UserInfoEditView |:heavy_check_mark:|O|O|O|
| can this form post on UserInfoEditView |:heavy_check_mark:|O|O|O|
| can show on UserDBModelView |:heavy_check_mark:|O|O|O|
| can edit on UserDBModelView |:heavy_check_mark:|O|O|O|
| can delete on UserDBModelView |:heavy_check_mark:|O|O|O|
| can add on UserDBModelView |:heavy_check_mark:|O|O|O|
| can list on UserDBModelView |:heavy_check_mark:|O|O|O|
| can userinfo on UserDBModelView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| resetmypassword on UserDBModelView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| resetpasswords on UserDBModelView |:heavy_check_mark:|O|O|O|
| userinfoedit on UserDBModelView |:heavy_check_mark:|O|O|O|
| can show on RoleModelView |:heavy_check_mark:|O|O|O|
| can edit on RoleModelView |:heavy_check_mark:|O|O|O|
| can delete on RoleModelView |:heavy_check_mark:|O|O|O|
| can add on RoleModelView |:heavy_check_mark:|O|O|O|
| can list on RoleModelView |:heavy_check_mark:|O|O|O|
| copyrole on RoleModelView |:heavy_check_mark:|O|O|O|
| can get on OpenApi |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can show on SwaggerView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can get on MenuApi |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can list on AsyncEventsRestApi |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can invalidate on CacheRestApi |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can csv upload on Database |:heavy_check_mark:|O|O|O|
| can excel upload on Database |:heavy_check_mark:|O|O|O|
| can query form data on Api |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can query on Api |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can time range on Api |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can external metadata on Datasource |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can save on Datasource |:heavy_check_mark:|:heavy_check_mark:|O|O|
| can get on Datasource |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can my queries on SqlLab |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|
| can log on Superset |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can import dashboards on Superset |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can schemas on Superset |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can sqllab history on Superset |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|
| can publish on Superset |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can csv on Superset |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|
| can slice on Superset |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can sync druid source on Superset |:heavy_check_mark:|O|O|O|
| can explore on Superset |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can approve on Superset |:heavy_check_mark:|O|O|O|
| can explore json on Superset |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can fetch datasource metadata on Superset |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can csrf token on Superset |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can sqllab on Superset |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|
| can select star on Superset |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can warm up cache on Superset |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can sqllab table viz on Superset |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|
| can available domains on Superset |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can request access on Superset |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can dashboard on Superset |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can post on TableSchemaView |:heavy_check_mark:|O|O|:heavy_check_mark:|
| can expanded on TableSchemaView |:heavy_check_mark:|O|O|:heavy_check_mark:|
| can delete on TableSchemaView |:heavy_check_mark:|O|O|:heavy_check_mark:|
| can get on TabStateView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|
| can post on TabStateView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|
| can delete query on TabStateView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|
| can migrate query on TabStateView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|
| can activate on TabStateView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|
| can delete on TabStateView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|
| can put on TabStateView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|
| can read on SecurityRestApi |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|
| menu access on Security |:heavy_check_mark:|O|O|O|
| menu access on List Users |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| menu access on List Roles |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| menu access on Action Log |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| menu access on Manage |:heavy_check_mark:|:heavy_check_mark:|O|O|
| menu access on Annotation Layers |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| menu access on CSS Templates |:heavy_check_mark:|:heavy_check_mark:|O|O|
| menu access on Import Dashboards |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| menu access on Data |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| menu access on Databases |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| menu access on Datasets |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| menu access on Charts |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| menu access on Dashboards |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| menu access on SQL Lab |:heavy_check_mark:|O|O|:heavy_check_mark:|
| menu access on SQL Editor |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|
| menu access on Saved Queries |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|
| menu access on Query Search |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|
| all datasource access on all_datasource_access |:heavy_check_mark:|:heavy_check_mark:|O|O|
| all database access on all_database_access |:heavy_check_mark:|:heavy_check_mark:|O|O|
| all query access on all_query_access |:heavy_check_mark:|O|O|O|
| can write on DynamicPlugin |:heavy_check_mark:|O|O|O|
| can edit on DynamicPlugin |:heavy_check_mark:|O|O|O|
| can list on DynamicPlugin |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can show on DynamicPlugin |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can download on DynamicPlugin |:heavy_check_mark:|O|O|O|
| can add on DynamicPlugin |:heavy_check_mark:|O|O|O|
| can delete on DynamicPlugin |:heavy_check_mark:|O|O|O|
| can external metadata by name on Datasource |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can get value on KV |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can store on KV |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can tagged objects on TagView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can suggestions on TagView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can get on TagView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can post on TagView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can delete on TagView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can edit on DashboardEmailScheduleView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can list on DashboardEmailScheduleView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can show on DashboardEmailScheduleView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can add on DashboardEmailScheduleView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can delete on DashboardEmailScheduleView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| muldelete on DashboardEmailScheduleView |:heavy_check_mark:|:heavy_check_mark:|O|O|
| can edit on SliceEmailScheduleView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can list on SliceEmailScheduleView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can show on SliceEmailScheduleView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can add on SliceEmailScheduleView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can delete on SliceEmailScheduleView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| muldelete on SliceEmailScheduleView |:heavy_check_mark:|:heavy_check_mark:|O|O|
| can edit on AlertModelView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can list on AlertModelView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can show on AlertModelView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can add on AlertModelView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can delete on AlertModelView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can list on AlertLogModelView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can show on AlertLogModelView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can list on AlertObservationModelView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can show on AlertObservationModelView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| menu access on Row Level Security |:heavy_check_mark:|O|O|O|
| menu access on Access requests |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| menu access on Home |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| menu access on Plugins |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| menu access on Dashboard Email Schedules |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| menu access on Chart Emails |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| menu access on Alerts |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| menu access on Alerts & Report |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| menu access on Scan New Datasources |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can share dashboard on Superset |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can share chart on Superset |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can this form get on ColumnarToDatabaseView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can this form post on ColumnarToDatabaseView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can export on Chart |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can write on DashboardFilterStateRestApi |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can read on DashboardFilterStateRestApi |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can write on DashboardPermalinkRestApi |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can read on DashboardPermalinkRestApi |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can delete embedded on Dashboard |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can set embedded on Dashboard |:heavy_check_mark:|O|O|O|
| can export on Dashboard |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can get embedded on Dashboard |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can export on Database |:heavy_check_mark:|O|O|O|
| can export on Dataset |:heavy_check_mark:|:heavy_check_mark:|O|O|
| can write on ExploreFormDataRestApi |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can read on ExploreFormDataRestApi |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can write on ExplorePermalinkRestApi |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can read on ExplorePermalinkRestApi |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can export on ImportExportRestApi |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can import on ImportExportRestApi |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can export on SavedQuery |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|
| can dashboard permalink on Superset |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can grant guest token on SecurityRestApi |:heavy_check_mark:|O|O|O|
| can read on AdvancedDataType |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can read on EmbeddedDashboard |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can duplicate on Dataset |:heavy_check_mark:|:heavy_check_mark:|O|O|
| can read on Explore |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can samples on Datasource |:heavy_check_mark:|:heavy_check_mark:|O|O|
| can read on AvailableDomains |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can get or create dataset on Dataset |:heavy_check_mark:|:heavy_check_mark:|O|O|
| can get column values on Datasource |:heavy_check_mark:|:heavy_check_mark:|O|O|
| can export csv on SQLLab |:heavy_check_mark:|O|O|:heavy_check_mark:|
| can get results on SQLLab |:heavy_check_mark:|O|O|:heavy_check_mark:|
| can execute sql query on SQLLab |:heavy_check_mark:|O|O|:heavy_check_mark:|
| can recent activity on Log |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|

View File

@@ -23,189 +23,8 @@ This file documents any backwards-incompatible changes in Superset and
assists people when migrating to a new version.
## Next
### Example Data Loading Improvements
#### New Directory Structure
Examples are now organized by name with data and configs co-located:
```
superset/examples/
├── _shared/ # Shared database & metadata configs
├── birth_names/ # Each example is self-contained
│ ├── data.parquet # Dataset (Parquet format)
│ ├── dataset.yaml # Dataset metadata
│ ├── dashboard.yaml # Dashboard config (optional)
│ └── charts/ # Chart configs (optional)
└── ...
```
#### Simplified Parquet-based Loading
- Auto-discovery: create `superset/examples/my_dataset/data.parquet` to add a new example
- Parquet is an Apache columnar format: compressed (~27% smaller) with a self-describing schema
- YAML configs define datasets, charts, and dashboards declaratively
- Removed Python-based data generation from individual example files
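To make the auto-discovery flow concrete, here is a sketch of producing such a file with pandas; the dataset name and columns below are hypothetical:
```python
# Hypothetical sketch: any DataFrame written as data.parquet under
# superset/examples/<name>/ becomes a new auto-discovered example,
# per the rule above. Writing Parquet requires pyarrow or fastparquet.
import pandas as pd

df = pd.DataFrame(
    {
        "region": ["NA", "EMEA", "APAC"],
        "revenue": [1200, 950, 1430],
    }
)
df.to_parquet("superset/examples/my_dataset/data.parquet", index=False)
```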
#### Test Data Reorganization
- Moved `big_data.py` to `superset/cli/test_loaders.py` - better reflects its purpose as a test utility
- Fixed inverted logic for the `--load-test-data` flag (now correctly includes .test.yaml files when the flag is set)
- Clarified CLI flags:
- `--force` / `-f`: Force reload even if tables exist
- `--only-metadata` / `-m`: Create table metadata without loading data
- `--load-test-data` / `-t`: Include test dashboards and .test.yaml configs
- `--load-big-data` / `-b`: Generate synthetic stress-test data
#### Bug Fixes
- Fixed numpy array serialization for PostgreSQL (converts complex types to JSON strings)
- Fixed KeyError for `allow_csv_upload` field in database configs (now optional with default)
- Fixed test data loading logic that was incorrectly filtering files
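The numpy fix above boils down to coercing non-scalar values before they reach the database driver; a minimal sketch of that kind of conversion (the helper name is hypothetical):
```python
# Hypothetical helper illustrating the fix described above: numpy
# arrays can't be bound directly as PostgreSQL values, so complex
# types are serialized to JSON strings first.
import json

import numpy as np


def to_db_value(value):
    if isinstance(value, np.ndarray):
        return json.dumps(value.tolist())
    return value


print(to_db_value(np.array([1, 2, 3])))  # -> "[1, 2, 3]"
print(to_db_value(42))                   # scalars pass through -> 42
```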
### MCP Service
The MCP (Model Context Protocol) service enables AI assistants and automation tools to interact programmatically with Superset.
#### New Features
- MCP service infrastructure with FastMCP framework
- Tools for dashboards, charts, datasets, SQL Lab, and instance metadata
- Optional dependency: install with `pip install apache-superset[fastmcp]`
- Runs as separate process from Superset web server
- JWT-based authentication for production deployments
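As a sketch of what interacting programmatically looks like from the client side, a FastMCP client can list the service's tools; the URL assumes the default host/port from the configuration below, and the `/mcp` mount path is an assumption:
```python
# Minimal sketch of an MCP client listing the service's tools.
# Assumes the MCP service is running on localhost:5008 and that the
# HTTP transport is mounted at /mcp (an assumption, not a documented path).
import asyncio

from fastmcp import Client


async def main():
    async with Client("http://localhost:5008/mcp") as client:
        for tool in await client.list_tools():
            print(tool.name)


asyncio.run(main())
```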
#### New Configuration Options
**Development** (single-user, local testing):
```python
# superset_config.py
MCP_DEV_USERNAME = "admin" # User for MCP authentication
MCP_SERVICE_HOST = "localhost"
MCP_SERVICE_PORT = 5008
```
**Production** (JWT-based, multi-user):
```python
# superset_config.py
MCP_AUTH_ENABLED = True
MCP_JWT_ISSUER = "https://your-auth-provider.com"
MCP_JWT_AUDIENCE = "superset-mcp"
MCP_JWT_ALGORITHM = "RS256" # or "HS256" for shared secrets
# Option 1: Use JWKS endpoint (recommended for RS256)
MCP_JWKS_URI = "https://auth.example.com/.well-known/jwks.json"
# Option 2: Use static public key (RS256)
MCP_JWT_PUBLIC_KEY = "-----BEGIN PUBLIC KEY-----..."
# Option 3: Use shared secret (HS256)
MCP_JWT_ALGORITHM = "HS256"
MCP_JWT_SECRET = "your-shared-secret-key"
# Optional overrides
MCP_SERVICE_HOST = "0.0.0.0"
MCP_SERVICE_PORT = 5008
MCP_SESSION_CONFIG = {
"SESSION_COOKIE_SECURE": True,
"SESSION_COOKIE_HTTPONLY": True,
"SESSION_COOKIE_SAMESITE": "Strict",
}
```
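For the HS256 option, a token compatible with the settings above could be minted with PyJWT; which claims the service validates beyond issuer, audience, and expiry is an assumption here:
```python
# Hedged sketch: minting an HS256 token matching the config above.
# The "sub" claim (acting user) is an assumption; check the MCP
# security docs for the claims actually required.
from datetime import datetime, timedelta, timezone

import jwt  # PyJWT

token = jwt.encode(
    {
        "iss": "https://your-auth-provider.com",  # must match MCP_JWT_ISSUER
        "aud": "superset-mcp",                    # must match MCP_JWT_AUDIENCE
        "sub": "admin",                           # acting user (assumption)
        "exp": datetime.now(timezone.utc) + timedelta(hours=1),
    },
    "your-shared-secret-key",                     # must match MCP_JWT_SECRET
    algorithm="HS256",
)
print(token)
```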
#### Running the MCP Service
```bash
# Development
superset mcp run --port 5008 --debug
# Production
superset mcp run --port 5008
# With factory config
superset mcp run --port 5008 --use-factory-config
```
#### Deployment Considerations
The MCP service runs as a **separate process** from the Superset web server.
**Important**:
- Requires same Python environment and configuration as Superset
- Shares database connections with main Superset app
- Can be scaled independently from web server
- Requires `fastmcp` package (optional dependency)
**Installation**:
```bash
# Install with MCP support
pip install apache-superset[fastmcp]
# Or add to requirements.txt
apache-superset[fastmcp]>=X.Y.Z
```
**Process Management**:
Use systemd, supervisord, or Kubernetes to manage the MCP service process.
See `superset/mcp_service/PRODUCTION.md` for deployment guides.
**Security**:
- Development: Uses `MCP_DEV_USERNAME` for single-user access
- Production: **MUST** configure JWT authentication
- See `superset/mcp_service/SECURITY.md` for details
#### Documentation
- Architecture: `superset/mcp_service/ARCHITECTURE.md`
- Security: `superset/mcp_service/SECURITY.md`
- Production: `superset/mcp_service/PRODUCTION.md`
- Developer Guide: `superset/mcp_service/CLAUDE.md`
- Quick Start: `superset/mcp_service/README.md`
---
- [35621](https://github.com/apache/superset/pull/35621): The default hash algorithm has changed from MD5 to SHA-256 for improved security and FedRAMP compliance. This affects cache keys for thumbnails, dashboard digests, chart digests, and filter option names. Existing cached data will be invalidated upon upgrade. To opt out of this change and maintain backward compatibility, set `HASH_ALGORITHM = "md5"` in your `superset_config.py`. (A short illustration of the cache-key impact follows this list.)
- [35062](https://github.com/apache/superset/pull/35062): Changed the function signature of `setupExtensions` to `setupCodeOverrides` with options as arguments.
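As noted above, the MD5 to SHA-256 change invalidates existing caches because the two algorithms produce unrelated digests for the same key material (the key string here is made up), so every pre-upgrade entry becomes a cache miss:
```python
# Same input, different digest, so old cache keys no longer match.
import hashlib

key_material = b"dashboard:42:filters"  # hypothetical cache key material
print(hashlib.md5(key_material).hexdigest())     # old-style key
print(hashlib.sha256(key_material).hexdigest())  # new-style key
```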
### Breaking Changes
- [37370](https://github.com/apache/superset/pull/37370): The `APP_NAME` configuration variable no longer controls the browser window/tab title or other frontend branding. Application names should now be configured using the theme system with the `brandAppName` token. The `APP_NAME` config is still used for backend contexts (MCP service, logs, etc.) and serves as a fallback if `brandAppName` is not set.
- **Migration:**
```python
# Before (Superset 5.x)
APP_NAME = "My Custom App"
# After (Superset 6.x) - Option 1: Use theme system (recommended)
THEME_DEFAULT = {
"token": {
"brandAppName": "My Custom App", # Window titles
"brandLogoAlt": "My Custom App", # Logo alt text
"brandLogoUrl": "/static/assets/images/custom_logo.png"
}
}
# After (Superset 6.x) - Option 2: Temporary fallback
# Keep APP_NAME for now (will be used as fallback for brandAppName)
APP_NAME = "My Custom App"
# But you should migrate to THEME_DEFAULT.token.brandAppName
```
- **Note:** For dark mode, set the same tokens in `THEME_DARK` configuration.
- [36317](https://github.com/apache/superset/pull/36317): The `CUSTOM_FONT_URLS` configuration option has been removed. Use the new per-theme `fontUrls` token in `THEME_DEFAULT` or database-managed themes instead.
- **Before:**
```python
CUSTOM_FONT_URLS = [
"https://fonts.example.com/myfont.css",
]
```
- **After:**
```python
THEME_DEFAULT = {
"token": {
"fontUrls": [
"https://fonts.example.com/myfont.css",
],
# ... other tokens
}
}
```
## 6.0.0
- [33055](https://github.com/apache/superset/pull/33055): Upgrades Flask-AppBuilder to 5.0.0. The AUTH_OID authentication type has been deprecated and is no longer available as an option in Flask-AppBuilder. OpenID (OID) is considered a deprecated authentication protocol - if you are using AUTH_OID, you will need to migrate to an alternative authentication method such as OAuth, LDAP, or database authentication before upgrading.
- [35062](https://github.com/apache/superset/pull/35062): Changed the function signature of `setupExtensions` to `setupCodeOverrides` with options as arguments.
- [34871](https://github.com/apache/superset/pull/34871): Fixed Jest test hanging issue from Ant Design v5 upgrade. MessageChannel is now mocked in test environment to prevent rc-overflow from causing Jest to hang. Test environment only - no production impact.
- [34782](https://github.com/apache/superset/pull/34782): Dataset exports now include the dataset ID in their file name (similar to charts and dashboards). If managing assets as code, make sure to rename existing dataset YAMLs to include the ID (and avoid duplicated files).
- [34536](https://github.com/apache/superset/pull/34536): The `ENVIRONMENT_TAG_CONFIG` color values have changed to support only Ant Design semantic colors. Update your `superset_config.py`:
@@ -222,7 +41,7 @@ Note: Pillow is now a required dependency (previously optional) to support image
- [33116](https://github.com/apache/superset/pull/33116) In Echarts Series charts (e.g. Line, Area, Bar, etc.) charts, the `x_axis_sort_series` and `x_axis_sort_series_ascending` form data items have been renamed with `x_axis_sort` and `x_axis_sort_asc`.
There's a migration added that can potentially affect a significant number of existing charts.
- [32317](https://github.com/apache/superset/pull/32317) The horizontal filter bar feature is now out of testing/beta development and its feature flag `HORIZONTAL_FILTER_BAR` has been removed.
- [31590](https://github.com/apache/superset/pull/31590) Marks the begining of intricate work around supporting dynamic Theming, and breaks support for [THEME_OVERRIDES](https://github.com/apache/superset/blob/732de4ac7fae88e29b7f123b6cbb2d7cd411b0e4/superset/config.py#L671) in favor of a new theming system based on AntD V5. Likely this will be in disrepair until settling over the 5.x lifecycle.
- [31590](https://github.com/apache/superset/pull/31590) Marks the beginning of intricate work around supporting dynamic Theming, and breaks support for [THEME_OVERRIDES](https://github.com/apache/superset/blob/732de4ac7fae88e29b7f123b6cbb2d7cd411b0e4/superset/config.py#L671) in favor of a new theming system based on AntD V5. Likely this will be in disrepair until settling over the 5.x lifecycle.
- [32432](https://github.com/apache/superset/pull/31260) Moves the List Roles FAB view to the frontend and requires `FAB_ADD_SECURITY_API` to be enabled in the configuration and `superset init` to be executed.
- [34319](https://github.com/apache/superset/pull/34319) Drill to Detail and Drill By are now supported in Embedded mode, and also with the `DASHBOARD_RBAC` feature flag. If you don't want to expose these features in Embedded / `DASHBOARD_RBAC`, make sure the roles used for Embedded / `DASHBOARD_RBAC` don't have the required permissions to perform D2D actions.

View File

@@ -77,6 +77,7 @@ x-common-build: &common-build
INCLUDE_CHROMIUM: ${INCLUDE_CHROMIUM:-false}
INCLUDE_FIREFOX: ${INCLUDE_FIREFOX:-false}
BUILD_TRANSLATIONS: ${BUILD_TRANSLATIONS:-false}
LOAD_EXAMPLES_DUCKDB: ${LOAD_EXAMPLES_DUCKDB:-true}
services:
db-light:
@@ -115,6 +116,7 @@ services:
DATABASE_HOST: db-light
DATABASE_DB: superset_light
POSTGRES_DB: superset_light
SUPERSET__SQLALCHEMY_EXAMPLES_URI: "duckdb:////app/data/examples.duckdb"
SUPERSET_CONFIG_PATH: /app/docker/pythonpath_dev/superset_config_docker_light.py
GITHUB_HEAD_REF: ${GITHUB_HEAD_REF:-}
GITHUB_SHA: ${GITHUB_SHA:-}
@@ -137,11 +139,8 @@ services:
DATABASE_HOST: db-light
DATABASE_DB: superset_light
POSTGRES_DB: superset_light
SUPERSET__SQLALCHEMY_EXAMPLES_URI: "duckdb:////app/data/examples.duckdb"
SUPERSET_CONFIG_PATH: /app/docker/pythonpath_dev/superset_config_docker_light.py
# Override examples host to use light DB service
EXAMPLES_HOST: db-light
# Skip example loading for faster startup
SUPERSET_LOAD_EXAMPLES: "no"
healthcheck:
disable: true
@@ -164,7 +163,7 @@ services:
# configuring the dev-server to use the host.docker.internal to connect to the backend
superset: "http://superset-light:8088"
# Webpack dev server configuration
WEBPACK_DEVSERVER_HOST: "${WEBPACK_DEVSERVER_HOST:-0.0.0.0}"
WEBPACK_DEVSERVER_HOST: "${WEBPACK_DEVSERVER_HOST:-127.0.0.1}"
WEBPACK_DEVSERVER_PORT: "${WEBPACK_DEVSERVER_PORT:-9000}"
ports:
- "${NODE_PORT:-9001}:9000" # Parameterized port, accessible on all interfaces
@@ -197,6 +196,7 @@ services:
DATABASE_DB: test
POSTGRES_DB: test
SUPERSET__SQLALCHEMY_DATABASE_URI: postgresql+psycopg2://superset:superset@db-light:5432/test
SUPERSET__SQLALCHEMY_EXAMPLES_URI: "duckdb:////app/data/examples.duckdb"
SUPERSET_CONFIG: superset_test_config_light
PYTHONPATH: /app/pythonpath:/app/docker/pythonpath_dev:/app

View File

@@ -44,6 +44,7 @@ x-common-build: &common-build
INCLUDE_CHROMIUM: ${INCLUDE_CHROMIUM:-false}
INCLUDE_FIREFOX: ${INCLUDE_FIREFOX:-false}
BUILD_TRANSLATIONS: ${BUILD_TRANSLATIONS:-false}
LOAD_EXAMPLES_DUCKDB: ${LOAD_EXAMPLES_DUCKDB:-true}
services:
nginx:
@@ -53,9 +54,10 @@ services:
- path: docker/.env-local # optional override
required: false
image: nginx:latest
container_name: superset_nginx
restart: unless-stopped
ports:
- "${NGINX_PORT:-80}:80"
- "80:80"
extra_hosts:
- "host.docker.internal:host-gateway"
volumes:
@@ -64,9 +66,10 @@ services:
redis:
image: redis:7
container_name: superset_cache
restart: unless-stopped
ports:
- "127.0.0.1:${REDIS_PORT:-6379}:6379"
- "127.0.0.1:6379:6379"
volumes:
- redis:/data
@@ -77,9 +80,10 @@ services:
- path: docker/.env-local # optional override
required: false
image: postgres:16
container_name: superset_db
restart: unless-stopped
ports:
- "127.0.0.1:${DATABASE_PORT:-5432}:5432"
- "127.0.0.1:5432:5432"
volumes:
- db_home:/var/lib/postgresql/data
- ./docker/docker-entrypoint-initdb.d:/docker-entrypoint-initdb.d
@@ -92,12 +96,13 @@ services:
required: false
build:
<<: *common-build
container_name: superset_app
command: ["/app/docker/docker-bootstrap.sh", "app"]
restart: unless-stopped
ports:
- ${SUPERSET_PORT:-8088}:8088
- 8088:8088
# When in cypress-mode ->
- ${CYPRESS_PORT:-8081}:8081
- 8081:8081
extra_hosts:
- "host.docker.internal:host-gateway"
user: *superset-user
@@ -105,11 +110,14 @@ services:
superset-init:
condition: service_completed_successfully
volumes: *superset-volumes
environment:
SUPERSET__SQLALCHEMY_EXAMPLES_URI: "duckdb:////app/data/examples.duckdb"
superset-websocket:
container_name: superset_websocket
build: ./superset-websocket
ports:
- ${WEBSOCKET_PORT:-8080}:8080
- 8080:8080
extra_hosts:
- "host.docker.internal:host-gateway"
depends_on:
@@ -141,6 +149,7 @@ services:
superset-init:
build:
<<: *common-build
container_name: superset_init
command: ["/app/docker/docker-init.sh"]
env_file:
- path: docker/.env # default
@@ -154,6 +163,8 @@ services:
condition: service_started
user: *superset-user
volumes: *superset-volumes
environment:
SUPERSET__SQLALCHEMY_EXAMPLES_URI: "duckdb:////app/data/examples.duckdb"
healthcheck:
disable: true
@@ -175,10 +186,9 @@ services:
SCARF_ANALYTICS: "${SCARF_ANALYTICS:-}"
# configuring the dev-server to use the host.docker.internal to connect to the backend
superset: "http://superset:8088"
# Bind to all interfaces so Docker port mapping works
WEBPACK_DEVSERVER_HOST: "0.0.0.0"
ports:
- "127.0.0.1:${NODE_PORT:-9000}:9000" # exposing the dynamic webpack dev server
- "127.0.0.1:9000:9000" # exposing the dynamic webpack dev server
container_name: superset_node
command: ["/app/docker/docker-frontend.sh"]
env_file:
- path: docker/.env # default
@@ -190,6 +200,7 @@ services:
superset-worker:
build:
<<: *common-build
container_name: superset_worker
command: ["/app/docker/docker-bootstrap.sh", "worker"]
env_file:
- path: docker/.env # default
@@ -215,6 +226,7 @@ services:
superset-worker-beat:
build:
<<: *common-build
container_name: superset_worker_beat
command: ["/app/docker/docker-bootstrap.sh", "beat"]
env_file:
- path: docker/.env # default
@@ -232,6 +244,7 @@ services:
superset-tests-worker:
build:
<<: *common-build
container_name: superset_tests_worker
command: ["/app/docker/docker-bootstrap.sh", "worker"]
env_file:
- path: docker/.env # default

View File

@@ -21,15 +21,6 @@ PYTHONUNBUFFERED=1
COMPOSE_PROJECT_NAME=superset
DEV_MODE=true
# Port configuration (override in .env-local for multiple instances)
# NGINX_PORT=80
# SUPERSET_PORT=8088
# NODE_PORT=9000
# WEBSOCKET_PORT=8080
# CYPRESS_PORT=8081
# DATABASE_PORT=5432
# REDIS_PORT=6379
# database configurations (do not modify)
DATABASE_DB=superset
DATABASE_HOST=db

View File

@@ -1,39 +0,0 @@
#
# Licensed to the Apache Software Foundation (ASF) under one or more
# contributor license agreements. See the NOTICE file distributed with
# this work for additional information regarding copyright ownership.
# The ASF licenses this file to You under the Apache License, Version 2.0
# (the "License"); you may not use this file except in compliance with
# the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
# -----------------------------------------------------------------------
# Example .env-local file for running multiple Superset instances
# Copy this file to .env-local and customize for your setup
# -----------------------------------------------------------------------
# Unique project name prevents container/volume conflicts between clones
# Each clone should have a different name (e.g., superset-pr123, superset-feature-x)
COMPOSE_PROJECT_NAME=superset-dev2
# Port offsets for running multiple instances simultaneously
# Instance 1 (default): 80, 8088, 9000, 8080, 8081, 5432, 6379
# Instance 2 example: 81, 8089, 9001, 8082, 8083, 5433, 6380
NGINX_PORT=81
SUPERSET_PORT=8089
NODE_PORT=9001
WEBSOCKET_PORT=8082
CYPRESS_PORT=8083
DATABASE_PORT=5433
REDIS_PORT=6380
# For verbose logging during development:
# SUPERSET_LOG_LEVEL=debug

View File

@@ -77,34 +77,6 @@ To run the container, simply run: `docker compose up`
After waiting several minutes for Superset initialization to finish, you can open a browser and view [`http://localhost:8088`](http://localhost:8088)
to start your journey.
### Running Multiple Instances
If you need to run multiple Superset instances simultaneously (e.g., different branches or clones), use the make targets which automatically find available ports:
```bash
make up
```
This automatically:
- Generates a unique project name from your directory
- Finds available ports (incrementing from defaults if in use)
- Displays the assigned URLs before starting
Available commands (run from repo root):
| Command | Description |
|---------|-------------|
| `make up` | Start services (foreground) |
| `make up-detached` | Start services (background) |
| `make down` | Stop all services |
| `make ps` | Show running containers |
| `make logs` | Follow container logs |
| `make nuke` | Stop, remove volumes & local images |
From a subdirectory, use: `make -C $(git rev-parse --show-toplevel) up`
**Important**: Always use these commands instead of plain `docker compose down`, which won't know the correct project name.
## Developing
While running, the container server will reload on modification of the Superset Python and JavaScript source code.

View File

@@ -80,7 +80,7 @@ case "${1}" in
;;
app)
echo "Starting web app (using development server)..."
flask run -p $PORT --reload --debugger --without-threads --host=0.0.0.0 --exclude-patterns "*/node_modules/*:*/.venv/*:*/build/*:*/__pycache__/*"
flask run -p $PORT --reload --debugger --without-threads --host=0.0.0.0
;;
app-gunicorn)
echo "Starting web app..."

View File

@@ -19,7 +19,6 @@
# Import all settings from the main config first
from flask_caching.backends.filesystemcache import FileSystemCache
from superset_config import * # noqa: F403
# Override caching to use simple in-memory cache instead of Redis

View File

@@ -1,115 +0,0 @@
# Developer Portal Documentation Instructions
## Core Principle: Stories Are the Single Source of Truth
When working on the Storybook-to-MDX documentation system:
**ALWAYS fix the story first. NEVER add workarounds to the generator.**
## Why This Matters
The generator (`scripts/generate-superset-components.mjs`) should be lightweight - it extracts data from stories and passes it through. When you add special cases to the generator:
- It becomes harder to maintain
- Stories diverge from their docs representation
- Future stories need to know about generator quirks
When you fix stories to match the expected patterns:
- Stories work identically in Storybook and Docs
- The generator stays simple and predictable
- Patterns are consistent and learnable
## Story Patterns for Docs Generation
### Required Structure
```tsx
// Use inline export default (NOT const meta = ...; export default meta)
export default {
title: 'Components/MyComponent',
component: MyComponent,
};
// Name interactive stories with Interactive prefix
export const InteractiveMyComponent: Story = {
args: {
// Default prop values
},
argTypes: {
// Control definitions - MUST be at story level, not meta level
propName: {
control: { type: 'select' },
options: ['a', 'b', 'c'],
description: 'What this prop does',
},
},
};
```
### For Components with Variants (size × style grids)
```tsx
const sizes = ['small', 'medium', 'large'];
const variants = ['primary', 'secondary', 'danger'];
InteractiveButton.parameters = {
docs: {
gallery: {
component: 'Button',
sizes,
styles: variants,
sizeProp: 'size',
styleProp: 'variant',
},
},
};
```
### For Components Requiring Children
```tsx
InteractiveIconTooltip.parameters = {
docs: {
// Component descriptors with dot notation for nested components
sampleChildren: [{ component: 'Icons.InfoCircleOutlined', props: { iconSize: 'l' } }],
},
};
```
### For Custom Live Code Examples
```tsx
InteractiveMyComponent.parameters = {
docs: {
liveExample: `function Demo() {
return <MyComponent prop="value">Content</MyComponent>;
}`,
},
};
```
### For Complex Props (objects, arrays)
```tsx
InteractiveMenu.parameters = {
docs: {
staticProps: {
items: [
{ key: '1', label: 'Item 1' },
{ key: '2', label: 'Item 2' },
],
},
},
};
```
## Common Issues and How to Fix Them (in the Story)
| Issue | Wrong Approach | Right Approach |
|-------|---------------|----------------|
| Component not generated | Add pattern to generator | Change story to use inline `export default` |
| Control shows as text instead of select | Add special case in generator | Add `argTypes` with `control: { type: 'select' }` |
| Missing children/content | Modify StorybookWrapper | Add `parameters.docs.sampleChildren` |
| Gallery not showing | Add to generator output | Add `parameters.docs.gallery` config |
| Wrong live example | Hardcode in generator | Add `parameters.docs.liveExample` |
## Files
- **Generator**: `docs/scripts/generate-superset-components.mjs`
- **Wrapper**: `docs/src/components/StorybookWrapper.jsx`
- **Output**: `docs/developer_portal/components/`
- **Stories**: `superset-frontend/packages/superset-ui-core/src/components/*/`

docs/.gitignore
View File

@@ -23,24 +23,3 @@ docs/.zshrc
# Gets copied from the root of the project at build time (yarn start / yarn build)
docs/intro.md
# Generated badge images (downloaded at build time by remark-localize-badges plugin)
static/badges/
# Generated database documentation MDX files (regenerated at build time)
# Source of truth is in superset/db_engine_specs/*.py metadata attributes
docs/databases/
# Generated API documentation (regenerated at build time from openapi.json)
# Source of truth is static/resources/openapi.json
docs/api/
# Generated component documentation MDX files (regenerated at build time)
# Source of truth is Storybook stories in superset-frontend/packages/superset-ui-core/src/components/
developer_portal/components/
# Generated extension component documentation (regenerated at build time)
developer_portal/extensions/components/
# Note: src/data/databases.json is COMMITTED (not ignored) to preserve feature diagnostics
# that require Flask context to generate. Update it locally with: npm run gen-db-docs

View File

@@ -416,7 +416,7 @@ If versions don't appear in dropdown:
- [Docusaurus Documentation](https://docusaurus.io/docs)
- [MDX Documentation](https://mdxjs.com/)
- [Superset Developer Portal](https://superset.apache.org/developer_portal/)
- [Superset Contributing Guide](../CONTRIBUTING.md)
- [Main Superset Documentation](https://superset.apache.org/docs/intro)
## 📖 Real Examples and Patterns

View File

@@ -18,9 +18,9 @@ under the License.
-->
This is the public documentation site for Superset, built using
[Docusaurus 3](https://docusaurus.io/). See the
[Developer Portal](https://superset.apache.org/developer_portal/contributing/development-setup#documentation)
for documentation on contributing to documentation.
[Docusaurus 3](https://docusaurus.io/). See
[CONTRIBUTING.md](../CONTRIBUTING.md#documentation) for documentation on
contributing to documentation.
## Version Management

View File

@@ -19,14 +19,5 @@
*/
module.exports = {
presets: [
[
require.resolve('@docusaurus/core/lib/babel/preset'),
{
runtime: 'automatic',
importSource: '@emotion/react',
},
],
],
plugins: ['@emotion/babel-plugin'],
presets: [require.resolve('@docusaurus/core/lib/babel/preset')],
};

View File

@@ -139,41 +139,6 @@ docker volume rm superset_db_home
docker-compose up
```
### Running multiple instances
If you need to run multiple Superset clones simultaneously (e.g., testing different branches),
use `make up` instead of `docker compose up`:
```bash
make up
```
This automatically:
- Generates a unique project name from your directory name
- Finds available ports (incrementing from 8088, 9000, etc. if already in use)
- Displays the assigned URLs before starting
Each clone gets isolated containers and volumes, so you can run them side-by-side without conflicts.
Available commands (run from repo root):
| Command | Description |
|---------|-------------|
| `make up` | Start services (foreground) |
| `make up-detached` | Start services (background) |
| `make down` | Stop all services |
| `make ps` | Show running containers |
| `make logs` | Follow container logs |
| `make ports` | Show assigned URLs and ports |
| `make open` | Open browser to dev server |
| `make nuke` | Stop, remove volumes & local images |
From a subdirectory, use: `make -C $(git rev-parse --show-toplevel) up`
:::warning
Always use these commands instead of plain `docker compose down`, which won't know the correct project name for your instance.
:::
## GitHub Codespaces (Cloud Development)
GitHub Codespaces provides a complete, pre-configured development environment in the cloud. This is ideal for:
@@ -653,7 +618,7 @@ export enum FeatureFlag {
those specified under FEATURE_FLAGS in `superset_config.py`. For example, `DEFAULT_FEATURE_FLAGS = { 'FOO': True, 'BAR': False }` in `superset/config.py` and `FEATURE_FLAGS = { 'BAR': True, 'BAZ': True }` in `superset_config.py` will result
in combined feature flags of `{ 'FOO': True, 'BAR': True, 'BAZ': True }`.
The current status of the usability of each flag (stable vs testing, etc) can be found in the [Feature Flags](/docs/configuration/feature-flags) documentation.
The current status of the usability of each flag (stable vs testing, etc) can be found in `RESOURCES/FEATURE_FLAGS.md`.
## Git Hooks

View File

@@ -258,7 +258,19 @@ For debugging the Flask backend:
### Storybook
See the dedicated [Storybook documentation](../testing/storybook) for information on running Storybook locally and adding new stories.
Storybook is used for developing and testing UI components in isolation:
```bash
cd superset-frontend
# Start Storybook
npm run storybook
# Build static Storybook
npm run build-storybook
```
Access Storybook at http://localhost:6006
## Contributing Translations
@@ -330,79 +342,26 @@ ruff check --fix .
Pre-commit hooks run automatically on `git commit` if installed.
### TypeScript / JavaScript
### TypeScript
We use a hybrid linting approach combining OXC (Oxidation Compiler) for standard rules and a custom AST-based checker for Superset-specific patterns.
#### Quick Commands
We use ESLint and Prettier for TypeScript:
```bash
cd superset-frontend
# Run both OXC and custom rules
npm run lint:full
# Run OXC linter only (faster for most checks)
# Run eslint checks
npm run lint
# Fix auto-fixable issues with OXC
npm run lint-fix
# Run custom rules checker only
npm run check:custom-rules
# Run tsc (typescript) checks
npm run type
# Fix lint issues
npm run lint-fix
# Format with Prettier
npm run prettier
```
#### Architecture
The linting system consists of two components:
1. **OXC Linter** (`oxlint`) - A Rust-based linter that's 50-100x faster than ESLint
- Handles all standard JavaScript/TypeScript rules
- Configured via `oxlint.json`
- Runs via `npm run lint` or `npm run lint-fix`
2. **Custom Rules Checker** - A Node.js AST-based checker for Superset-specific patterns
- Enforces no literal colors (use theme colors)
- Prevents FontAwesome usage (use @superset-ui/core Icons)
- Validates i18n template usage (no template variables)
- Runs via `npm run check:custom-rules`
#### Why This Approach?
- **50-100x faster linting** compared to ESLint for standard rules via OXC
- **Apache-compatible** - No custom binaries, ASF-friendly
- **Maintainable** - Custom rules in JavaScript, not Rust
- **Flexible** - Can evolve as OXC adds plugin support
#### Troubleshooting
**"Plugin 'basic-custom-plugin' not found" Error**
Ensure you're using the explicit config:
```bash
npx oxlint --config oxlint.json
```
**Custom Rules Not Running**
Verify the AST parsing dependencies are installed:
```bash
npm ls @babel/parser @babel/traverse glob
```
#### Adding New Custom Rules
1. Edit `scripts/check-custom-rules.js`
2. Add a new check function following the AST visitor pattern
3. Call the function in `processFile()`
4. Test with `npm run check:custom-rules`
## GitHub Ephemeral Environments
For every PR, an ephemeral environment is automatically deployed for testing.

View File

@@ -138,6 +138,19 @@ The diagram shows:
3. **The host application** implements the APIs and manages extensions
4. **Extensions** integrate seamlessly with the host through well-defined interfaces
### Extension Dependencies
Extensions can depend on any combination of packages based on their needs. For example:
**Frontend-only extension** (e.g., a custom chart type):
- Depends on `@apache-superset/core` for UI components and React APIs
**Full-stack extension** (e.g., a custom SQL editor with new API endpoints):
- Depends on `@apache-superset/core` for frontend components
- Depends on `apache-superset-core` for backend APIs and models
This modular approach allows extension authors to choose exactly what they need while promoting consistency and reusability.
## Dynamic Module Loading
One of the most sophisticated aspects of the extension architecture is how frontend code is dynamically loaded at runtime using Webpack's Module Federation.
@@ -202,6 +215,7 @@ import {
authentication,
core,
commands,
environment,
extensions,
sqlLab,
} from 'src/extensions';
@@ -212,6 +226,7 @@ export default function setupExtensionsAPI() {
authentication,
core,
commands,
environment,
extensions,
sqlLab,
};
@@ -233,7 +248,6 @@ This architecture provides several key benefits:
Now that you understand the architecture, explore:
- **[Dependencies](./dependencies)** - Managing dependencies and understanding API stability
- **[Extension Project Structure](./extension-project-structure)** - How to organize your extension code
- **[Frontend Contribution Types](./frontend-contribution-types)** - What kinds of extensions you can build
- **[Quick Start](./quick-start)** - Build your first extension
- **[Contribution Types](./contribution-types)** - What kinds of extensions you can build
- **[Development](./development)** - Project structure, APIs, and development workflow

View File

@@ -1,151 +0,0 @@
---
title: Contribution Types
sidebar_position: 5
---
<!--
Licensed to the Apache Software Foundation (ASF) under one
or more contributor license agreements. See the NOTICE file
distributed with this work for additional information
regarding copyright ownership. The ASF licenses this file
to you under the Apache License, Version 2.0 (the
"License"); you may not use this file except in compliance
with the License. You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing,
software distributed under the License is distributed on an
"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
KIND, either express or implied. See the License for the
specific language governing permissions and limitations
under the License.
-->
# Contribution Types
To facilitate the development of extensions, we define a set of well-defined contribution types that extensions can implement. These contribution types serve as the building blocks for extensions, allowing them to interact with the host application and provide new functionality.
## Frontend
Frontend contribution types allow extensions to extend Superset's user interface with new views, commands, and menu items.
### Views
Extensions can add new views or panels to the host application, such as custom SQL Lab panels, dashboards, or other UI components. Each view is registered with a unique ID and can be activated or deactivated as needed. Contribution areas are uniquely identified (e.g., `sqllab.panels` for SQL Lab panels), enabling seamless integration into specific parts of the application.
```json
"frontend": {
"contributions": {
"views": {
"sqllab.panels": [
{
"id": "my_extension.main",
"name": "My Panel Name"
}
]
}
}
}
```
### Commands
Extensions can define custom commands that can be executed within the host application, such as context-aware actions or menu options. Each command can specify properties like a unique command identifier, an icon, a title, and a description. These commands can be invoked by users through menus, keyboard shortcuts, or other UI elements, enabling extensions to add rich, interactive functionality to Superset.
```json
"frontend": {
"contributions": {
"commands": [
{
"command": "my_extension.copy_query",
"icon": "CopyOutlined",
"title": "Copy Query",
"description": "Copy the current query to clipboard"
}
]
}
}
```
### Menus
Extensions can contribute new menu items or context menus to the host application, providing users with additional actions and options. Each menu item can specify properties such as the target view, the command to execute, its placement (primary, secondary, or context), and conditions for when it should be displayed. Menu contribution areas are uniquely identified (e.g., `sqllab.editor` for the SQL Lab editor), allowing extensions to seamlessly integrate their functionality into specific menus and workflows within Superset.
```json
"frontend": {
"contributions": {
"menus": {
"sqllab.editor": {
"primary": [
{
"view": "builtin.editor",
"command": "my_extension.copy_query"
}
],
"secondary": [
{
"view": "builtin.editor",
"command": "my_extension.prettify"
}
],
"context": [
{
"view": "builtin.editor",
"command": "my_extension.clear"
}
]
}
}
}
}
```
### Editors
Extensions can replace Superset's default text editors with custom implementations. This enables enhanced editing experiences using alternative editor frameworks like Monaco, CodeMirror, or custom solutions. When an extension registers an editor for a language, it replaces the default Ace editor in all locations that use that language (SQL Lab, Dashboard Properties, CSS editors, etc.).
```json
"frontend": {
"contributions": {
"editors": [
{
"id": "my_extension.monaco_sql",
"name": "Monaco SQL Editor",
"languages": ["sql"],
"description": "Monaco-based SQL editor with IntelliSense"
}
]
}
}
```
See [Editors Extension Point](./extension-points/editors) for implementation details.
## Backend
Backend contribution types allow extensions to extend Superset's server-side capabilities with new API endpoints, MCP tools, and MCP prompts.
### REST API Endpoints
Extensions can register custom REST API endpoints under the `/api/v1/extensions/` namespace. This dedicated namespace prevents conflicts with built-in endpoints and provides a clear separation between core and extension functionality.
```json
"backend": {
"entryPoints": ["my_extension.entrypoint"],
"files": ["backend/src/my_extension/**/*.py"]
}
```
The entry point module registers the API with Superset:
```python
from superset_core.api.rest_api import add_extension_api
from .api import MyExtensionAPI
add_extension_api(MyExtensionAPI)
```
### MCP Tools and Prompts
Extensions can contribute Model Context Protocol (MCP) tools and prompts that AI agents can discover and use. See [MCP Integration](./mcp) for detailed documentation.
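As a hedged sketch of what such a tool looks like, FastMCP tools are plain decorated functions; the registration path by which Superset discovers extension tools is an assumption here (see the MCP Integration page for the authoritative workflow):
```python
# Hedged sketch of an MCP tool defined with fastmcp. How a Superset
# extension exposes this module for discovery is an assumption.
from fastmcp import FastMCP

mcp = FastMCP("my_extension")


@mcp.tool()
def row_count(table: str) -> str:
    """Return a canned row count for a table (demo only)."""
    return f"{table}: 1234 rows"
```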

View File

@@ -1,166 +0,0 @@
---
title: Dependencies
sidebar_position: 4
---
<!--
Licensed to the Apache Software Foundation (ASF) under one
or more contributor license agreements. See the NOTICE file
distributed with this work for additional information
regarding copyright ownership. The ASF licenses this file
to you under the Apache License, Version 2.0 (the
"License"); you may not use this file except in compliance
with the License. You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing,
software distributed under the License is distributed on an
"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
KIND, either express or implied. See the License for the
specific language governing permissions and limitations
under the License.
-->
# Dependencies
This guide explains how to manage dependencies in your Superset extensions, including the difference between public APIs and internal code, and best practices for maintaining stable extensions.
## Core Packages vs Internal Code
Extensions run in the same context as Superset during runtime. This means extension developers can technically import any module from the Superset codebase, not just the public APIs. Understanding the distinction between public and internal code is critical for building maintainable extensions.
### Public APIs (Stable)
The core packages follow [semantic versioning](https://semver.org/) and provide stable, documented APIs:
| Package | Language | Description |
|---------|----------|-------------|
| `@apache-superset/core` | JavaScript/TypeScript | Frontend APIs, UI components, hooks, and utilities |
| `apache-superset-core` | Python | Backend APIs, models, DAOs, and utilities |
**Benefits of using core packages:**
- **Semantic versioning**: Breaking changes are communicated through version numbers
- **Documentation**: APIs are documented with clear usage examples
- **Stability commitment**: We strive to maintain backward compatibility
- **Type safety**: Full TypeScript and Python type definitions
### Internal Code (Unstable)
Any code that is not exported through the core packages is considered internal. This includes:
- Direct imports from `superset-frontend/src/` modules
- Direct imports from `superset/` Python modules (outside of `superset_core`)
- Undocumented functions, classes, or utilities
:::warning Use at Your Own Risk
Internal code can change at any time without notice. If you depend on internal modules, your extension may break when Superset is upgraded. There is no guarantee of backward compatibility for internal code.
:::
**Example of internal vs public imports:**
```typescript
// ✅ Public API - stable
import { Button, sqlLab } from '@apache-superset/core';
// ❌ Internal code - may break without notice
import { someInternalFunction } from 'src/explore/components/SomeComponent';
```
```python
# ✅ Public API - stable
from superset_core.api.models import Database
from superset_core.api.daos import DatabaseDAO
# ❌ Internal code - may break without notice
from superset.views.core import SomeInternalClass
```
## API Evolution
The core packages are still evolving. While we follow semantic versioning, the APIs may change as we add new extension points and refine existing ones based on community feedback.
**What this means for extension developers:**
- Check the release notes when upgrading Superset
- Test your extensions against new Superset versions before deploying
- Participate in discussions about API changes to influence the direction
- In some cases, using internal dependencies may be acceptable while the public API is being developed for your use case
### When Internal Dependencies May Be Acceptable
While public APIs are always preferred, there are situations where using internal code may be reasonable:
1. **Missing functionality**: The public API doesn't yet expose what you need
2. **Prototype/experimental extensions**: You're exploring capabilities before committing to a stable implementation
3. **Bridge period**: You need functionality that's planned for the public API but not yet released
In these cases, document your internal dependencies clearly and plan to migrate to public APIs when they become available.
## Core Library Dependencies
An important architectural principle of the Superset extension system is that **we do not provide abstractions on top of core dependencies** like React (frontend) or SQLAlchemy (backend).
### Why We Don't Abstract Core Libraries
Abstracting libraries like React or SQLAlchemy would:
- Create maintenance overhead keeping abstractions in sync with upstream
- Limit access to the full power of these libraries
- Add unnecessary abstraction layers
- Fragment the ecosystem with Superset-specific variants
### Depending on Core Libraries Directly
Extension developers should depend on and use core libraries directly:
**Frontend (examples):**
- [React](https://react.dev/) - UI framework
- [Ant Design](https://ant.design/) - UI component library (prefer Superset components from `@apache-superset/core/ui` when available to preserve visual consistency)
- [Emotion](https://emotion.sh/) - CSS-in-JS styling
- ...
**Backend (examples):**
- [SQLAlchemy](https://www.sqlalchemy.org/) - Database toolkit
- [Flask](https://flask.palletsprojects.com/) - Web framework
- [Flask-AppBuilder](https://flask-appbuilder.readthedocs.io/) - Application framework
- ...
:::info Version Compatibility
When Superset upgrades its core dependencies (e.g., a new major version of Ant Design or SQLAlchemy), extension developers should upgrade their extensions accordingly. This ensures compatibility and access to the latest features and security fixes.
:::
## API Versioning and Changelog
Once the extensions API reaches **v1**, we will maintain a dedicated `CHANGELOG.md` file to track all changes to the public APIs. This will include:
- New APIs and features
- Deprecation notices
- Breaking changes with migration guides
- Bug fixes affecting API behavior
Until then, monitor the Superset release notes and test your extensions with each new release.
## Best Practices
### Do
- **Prefer public APIs**: Always check if functionality exists in `@apache-superset/core` or `apache-superset-core` before using internal code
- **Pin versions**: Specify compatible Superset versions in your extension metadata
- **Test upgrades**: Verify your extension works with new Superset releases before deploying
- **Report missing APIs**: If you need functionality not in the public API, open a GitHub issue to request it
- **Use core libraries directly**: Leverage Ant Design, SQLAlchemy, and other core libraries directly
### Don't
- **Assume stability of internal code**: Internal modules can change or be removed in any release
- **Depend on implementation details**: Even if something works, it may not be supported
- **Skip upgrade testing**: Always test your extension against new Superset versions
- **Expect abstractions**: Use core dependencies directly rather than expecting Superset-specific abstractions
## Next Steps
- **[Architecture](./architecture)** - Understand the extension system design
- **[Development](./development)** - Learn about APIs and development workflow
- **[Quick Start](./quick-start)** - Build your first extension

View File

@@ -1,6 +1,6 @@
---
title: Deployment
sidebar_position: 7
title: Deploying an Extension
sidebar_position: 8
---
<!--
@@ -22,7 +22,7 @@ specific language governing permissions and limitations
under the License.
-->
# Deployment
# Deploying an Extension
Once an extension has been developed, the deployment process involves packaging and uploading it to the host application.
@@ -33,17 +33,13 @@ Packaging is handled by the `superset-extensions bundle` command, which:
3. Generates a `manifest.json` with build-time metadata, including the contents of `extension.json` and references to built assets.
4. Packages everything into a `.supx` file (a zip archive with a specific structure required by Superset).
To deploy an extension, place the `.supx` file in the extensions directory configured via `EXTENSIONS_PATH` in your `superset_config.py`:
Uploading is accomplished through Superset's REST API at `/api/v1/extensions/import/`. The endpoint accepts the `.supx` file as form data and processes it by:
``` python
EXTENSIONS_PATH = "/path/to/extensions"
```
1. Extracting and validating the extension metadata and manifest.
2. Storing extension assets in the metadata database for dynamic loading.
3. Registering the extension in the metadata database, including its name, version, author, and capabilities.
4. Automatically activating the extension, making it immediately available for use and management via the Superset UI or API.
During application startup, Superset automatically discovers and loads all `.supx` files from this directory:
This API-driven approach enables automated deployment workflows and simplifies extension management for administrators. Extensions can be uploaded through the Swagger UI, programmatically via scripts, or through the management interface:
1. Scans the configured directory for `.supx` files.
2. Validates each file is a properly formatted zip archive.
3. Extracts and validates the extension manifest and metadata.
4. Loads the extension, making it available for use.
This file-based approach simplifies deployment in containerized environments and enables version control of extensions alongside infrastructure configuration.
https://github.com/user-attachments/assets/98b16cdd-8ec5-4812-9d5e-9915badd8f0d
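For scripted deployments, a minimal upload sketch using Python's `requests` library (the multipart field name and token handling are assumptions, not confirmed API details):
```python
import requests

SUPERSET_URL = "https://superset.example.com"  # illustrative host
ACCESS_TOKEN = "..."  # e.g. a JWT obtained from /api/v1/security/login (assumption)

with open("dataset_references-1.0.0.supx", "rb") as bundle:
    resp = requests.post(
        f"{SUPERSET_URL}/api/v1/extensions/import/",
        headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
        files={"file": bundle},  # assumption: the form field name
    )
resp.raise_for_status()
print(resp.json())
```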

View File

@@ -0,0 +1,48 @@
---
title: Development Mode
sidebar_position: 10
---
<!--
Licensed to the Apache Software Foundation (ASF) under one
or more contributor license agreements. See the NOTICE file
distributed with this work for additional information
regarding copyright ownership. The ASF licenses this file
to you under the Apache License, Version 2.0 (the
"License"); you may not use this file except in compliance
with the License. You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing,
software distributed under the License is distributed on an
"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
KIND, either express or implied. See the License for the
specific language governing permissions and limitations
under the License.
-->
# Development Mode
Development mode accelerates extension development by letting developers see changes in Superset quickly, without the need for repeated packaging and uploading. To enable development mode, set the `LOCAL_EXTENSIONS` configuration in your `superset_config.py`:
``` python
LOCAL_EXTENSIONS = [
"/path/to/your/extension1",
"/path/to/your/extension2",
]
```
This instructs Superset to load and serve extensions directly from disk, so you can iterate quickly. Running `superset-extensions dev` watches for file changes and rebuilds assets automatically, while the Webpack development server (started separately with `npm run dev-server`) serves updated files as soon as they're modified. This enables immediate feedback for React components, styles, and other frontend code. Changes to backend files are also detected automatically and immediately synced, ensuring that both frontend and backend updates are reflected in your development environment.
Example output when running in development mode:
```
superset-extensions dev
⚙️ Building frontend assets…
✅ Frontend rebuilt
✅ Backend files synced
✅ Manifest updated
👀 Watching for changes in: /dataset_references/frontend, /dataset_references/backend
```

View File

@@ -1,305 +0,0 @@
---
title: Development
sidebar_position: 6
---
<!--
Licensed to the Apache Software Foundation (ASF) under one
or more contributor license agreements. See the NOTICE file
distributed with this work for additional information
regarding copyright ownership. The ASF licenses this file
to you under the Apache License, Version 2.0 (the
"License"); you may not use this file except in compliance
with the License. You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing,
software distributed under the License is distributed on an
"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
KIND, either express or implied. See the License for the
specific language governing permissions and limitations
under the License.
-->
# Development
This guide covers everything you need to know about developing extensions for Superset, from project structure to development workflow.
## Project Structure
The [apache-superset-extensions-cli](https://github.com/apache/superset/tree/master/superset-extensions-cli) package provides a command-line interface (CLI) that streamlines the extension development workflow. It offers the following commands:
```
superset-extensions init: Generates the initial folder structure and scaffolds a new extension project.
superset-extensions build: Builds extension assets.
superset-extensions bundle: Packages the extension into a .supx file.
superset-extensions dev: Automatically rebuilds the extension as files change.
```
When creating a new extension with `superset-extensions init <extension-name>`, the CLI generates a standardized folder structure:
```
dataset_references/
├── extension.json
├── frontend/
│   ├── src/
│   ├── webpack.config.js
│   ├── tsconfig.json
│   └── package.json
├── backend/
│   ├── src/
│   │   └── dataset_references/
│   ├── tests/
│   ├── pyproject.toml
│   └── requirements.txt
├── dist/
│   ├── manifest.json
│   ├── frontend/
│   │   └── dist/
│   │       ├── remoteEntry.d7a9225d042e4ccb6354.js
│   │       └── 900.038b20cdff6d49cfa8d9.js
│   └── backend/
│       └── dataset_references/
│           ├── __init__.py
│           ├── api.py
│           └── entrypoint.py
├── dataset_references-1.0.0.supx
└── README.md
```
The `extension.json` file serves as the declared metadata for the extension, containing the extension's name, version, author, description, and a list of capabilities. This file is essential for the host application to understand how to load and manage the extension.
The `frontend` directory contains the source code for the frontend components of the extension, including React components, styles, and assets. The `webpack.config.js` file is used to configure Webpack for building the frontend code, while the `tsconfig.json` file defines the TypeScript configuration for the project. The `package.json` file specifies the dependencies and scripts for building and testing the frontend code.
The `backend` directory contains the source code for the backend components of the extension, including Python modules, tests, and configuration files. The `pyproject.toml` file is used to define the Python package and its dependencies, while the `requirements.txt` file lists the required Python packages for the extension. The `src` folder contains the functional backend source files, while the `tests` directory contains unit tests for the backend code, ensuring that the extension behaves as expected and meets the defined requirements.
The `dist` directory is built when running the `build` or `dev` command, and contains the files that will be included in the bundle. The `manifest.json` file contains critical metadata about the extension, including the majority of the contents of the `extension.json` file, but also other build-time information, like the name of the built Webpack Module Federation remote entry file. The files in the `dist` directory will be zipped into the final `.supx` file. Although this file is technically a zip archive, the `.supx` extension makes it clear that it is a Superset extension package and follows a specific file layout. This packaged file can be distributed and installed in Superset instances.
The `README.md` file provides documentation and instructions for using the extension, including how to install, configure, and use its functionality.
## Extension Metadata
The `extension.json` file contains all metadata necessary for the host application to understand and manage the extension:
```json
{
"name": "dataset_references",
"version": "1.0.0",
"frontend": {
"contributions": {
"views": {
"sqllab.panels": [
{
"id": "dataset_references.main",
"name": "Dataset references"
}
]
}
},
"moduleFederation": {
"exposes": ["./index"]
}
},
"backend": {
"entryPoints": ["dataset_references.entrypoint"],
"files": ["backend/src/dataset_references/**/*.py"]
}
}
```
The `contributions` section declares how the extension extends Superset's functionality through views, commands, menus, and other contribution types. The `backend` section specifies entry points and files to include in the bundle.
## Interacting with the Host
Extensions interact with Superset through well-defined, versioned APIs provided by the `@apache-superset/core` (frontend) and `apache-superset-core` (backend) packages. These APIs are designed to be stable, discoverable, and consistent for both built-in and external extensions.
**Note**: The `superset_core.api` module provides abstract classes that are replaced with concrete implementations via dependency injection when Superset initializes. This allows extensions to use the same interfaces as the host application.
### Frontend APIs
The frontend extension APIs (via `@apache-superset/core`) are organized into logical namespaces such as `authentication`, `commands`, `extensions`, `sqlLab`, and others. Each namespace groups related functionality, making it easy for extension authors to discover and use the APIs relevant to their needs. For example, the `sqlLab` namespace provides events and methods specific to SQL Lab, allowing extensions to react to user actions and interact with the SQL Lab environment:
```typescript
export const getCurrentTab: () => Tab | undefined;
export const getDatabases: () => Database[];
export const getTabs: () => Tab[];
export const onDidChangeActivePanel: Event<Panel>;
export const onDidChangeTabTitle: Event<string>;
export const onDidQueryRun: Event<Editor>;
export const onDidQueryStop: Event<Editor>;
```
The following code demonstrates more examples of the existing frontend APIs:
```typescript
import { core, commands, sqlLab, authentication, Button } from '@apache-superset/core';
import MyPanel from './MyPanel';
export function activate(context) {
// Register a new panel (view) in SQL Lab and use shared UI components in your extension's React code
const panelDisposable = core.registerView('my_extension.panel', <MyPanel><Button/></MyPanel>);
// Register a custom command
const commandDisposable = commands.registerCommand('my_extension.copy_query', {
title: 'Copy Query',
execute: () => {
// Command logic here
},
});
// Listen for query run events in SQL Lab
const eventDisposable = sqlLab.onDidQueryRun(editor => {
// Handle query execution event
});
// Access a CSRF token for secure API requests
authentication.getCSRFToken().then(token => {
// Use token as needed
});
// Add all disposables for automatic cleanup on deactivation
context.subscriptions.push(panelDisposable, commandDisposable, eventDisposable);
}
```
### Backend APIs
Backend APIs (via `apache-superset-core`) follow a similar pattern, providing access to Superset's models, sessions, and query capabilities. Extensions can register REST API endpoints, access the metadata database, and interact with Superset's core functionality.
Extension endpoints are registered under a dedicated `/extensions` namespace to avoid conflicting with built-in endpoints and also because they don't share the same version constraints. By grouping all extension endpoints under `/extensions`, Superset establishes a clear boundary between core and extension functionality, making it easier to manage, document, and secure both types of APIs.
```python
from superset_core.api.models import Database, get_session
from superset_core.api.daos import DatabaseDAO
from superset_core.api.rest_api import add_extension_api
from .api import DatasetReferencesAPI
# Register a new extension REST API
add_extension_api(DatasetReferencesAPI)
# Fetch Superset entities via the DAO to apply base filters that filter out entities
# that the user doesn't have access to
databases = DatabaseDAO.find_all()
# ..or apply simple filters on top of base filters
databases = DatabaseDAO.filter_by(uuid=database.uuid)
if not databases:
raise Exception("Database not found")
return databases[0]
# Perform complex queries using SQLAlchemy Query, also filtering out
# inaccessible entities
session = get_session()
databases_query = session.query(Database).filter(
Database.database_name.ilike("%abc%")
)
return DatabaseDAO.query(databases_query)
```
In the future, we plan to expand the backend APIs to support configuring security models, database engines, SQL Alchemy dialects, etc.
## Development Mode
Development mode accelerates extension development by letting developers see changes in Superset quickly, without the need for repeated packaging and uploading. To enable development mode, set the `LOCAL_EXTENSIONS` configuration in your `superset_config.py`:
```python
LOCAL_EXTENSIONS = [
"/path/to/your/extension1",
"/path/to/your/extension2",
]
```
This instructs Superset to load and serve extensions directly from disk, so you can iterate quickly. Running `superset-extensions dev` watches for file changes and rebuilds assets automatically, while the Webpack development server (started separately with `npm run dev-server`) serves updated files as soon as they're modified. This enables immediate feedback for React components, styles, and other frontend code. Changes to backend files are also detected automatically and immediately synced, ensuring that both frontend and backend updates are reflected in your development environment.
Example output when running in development mode:
```
superset-extensions dev
⚙️ Building frontend assets…
✅ Frontend rebuilt
✅ Backend files synced
✅ Manifest updated
👀 Watching for changes in: /dataset_references/frontend, /dataset_references/backend
```
## Contributing Extension-Compatible Components
Components in `@apache-superset/core` are automatically documented in the Developer Portal. Simply add a component to the package and it will appear in the extension documentation.
### Requirements
1. **Location**: The component must be in `superset-frontend/packages/superset-core/src/ui/components/`
2. **Exported**: The component must be exported from the package's `index.ts`
3. **Story**: The component must have a Storybook story
### Creating a Story for Your Component
Create a story file with an `Interactive` export that defines args and argTypes:
```typescript
// MyComponent.stories.tsx
import { MyComponent } from '.';
export default {
title: 'Extension Components/MyComponent',
component: MyComponent,
parameters: {
docs: {
description: {
component: 'A brief description of what this component does.',
},
},
},
};
// Define an interactive story with args
export const InteractiveMyComponent = (args) => <MyComponent {...args} />;
InteractiveMyComponent.args = {
variant: 'primary',
disabled: false,
};
InteractiveMyComponent.argTypes = {
variant: {
control: { type: 'select' },
options: ['primary', 'secondary', 'danger'],
},
disabled: {
control: { type: 'boolean' },
},
};
```
### How Documentation is Generated
When the docs site is built (`yarn start` or `yarn build` in the `docs/` directory):
1. The `generate-extension-components` script scans all stories in `superset-core`
2. For each story, it generates an MDX page with:
- Component description
- **Live interactive example** with controls extracted from `argTypes`
- **Editable code playground** for experimentation
- Props table from story `args`
- Usage code snippet
- Links to source files
3. Pages appear automatically in **Developer Portal → Extensions → Components**
### Best Practices
- **Use descriptive titles**: The title path determines the component's location in docs (e.g., `Extension Components/Alert`)
- **Define argTypes**: These become interactive controls in the documentation
- **Provide default args**: These populate the initial state of the live example
- **Write clear descriptions**: Help extension developers understand when to use each component

View File

@@ -0,0 +1,55 @@
---
title: Extension Metadata
sidebar_position: 4
---
<!--
Licensed to the Apache Software Foundation (ASF) under one
or more contributor license agreements. See the NOTICE file
distributed with this work for additional information
regarding copyright ownership. The ASF licenses this file
to you under the Apache License, Version 2.0 (the
"License"); you may not use this file except in compliance
with the License. You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing,
software distributed under the License is distributed on an
"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
KIND, either express or implied. See the License for the
specific language governing permissions and limitations
under the License.
-->
# Extension Metadata
The `extension.json` file contains all metadata necessary for the host application to understand and manage the extension:
``` json
{
"name": "dataset_references",
"version": "1.0.0",
"frontend": {
"contributions": {
"views": {
"sqllab.panels": [
{
"id": "dataset_references.main",
"name": "Dataset references"
}
]
}
},
"moduleFederation": {
"exposes": ["./index"]
}
},
"backend": {
"entryPoints": ["dataset_references.entrypoint"],
"files": ["backend/src/dataset_references/**/*.py"]
  }
}
```
The `contributions` section declares how the extension extends Superset's functionality through views, commands, menus, and other contribution types. The `backend` section specifies entry points and files to include in the bundle.

View File

@@ -1,245 +0,0 @@
---
title: Editors
sidebar_position: 2
---
<!--
Licensed to the Apache Software Foundation (ASF) under one
or more contributor license agreements. See the NOTICE file
distributed with this work for additional information
regarding copyright ownership. The ASF licenses this file
to you under the Apache License, Version 2.0 (the
"License"); you may not use this file except in compliance
with the License. You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing,
software distributed under the License is distributed on an
"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
KIND, either express or implied. See the License for the
specific language governing permissions and limitations
under the License.
-->
# Editor Contributions
Extensions can replace Superset's default text editors with custom implementations. This allows you to provide enhanced editing experiences using alternative editor frameworks like Monaco, CodeMirror, or custom solutions.
## Overview
Superset uses text editors in various places throughout the application:
| Language | Locations |
|----------|-----------|
| `sql` | SQL Lab, Metric/Filter Popovers |
| `json` | Dashboard Properties, Annotation Modal, Theme Modal |
| `css` | Dashboard Properties, CSS Template Modal |
| `markdown` | Dashboard Markdown component |
| `yaml` | Template Params Editor |
By registering an editor provider for a language, your extension replaces the default Ace editor in **all** locations that use that language.
## Manifest Configuration
Declare editor contributions in your `extension.json` manifest:
```json
{
"name": "monaco-editor",
"version": "1.0.0",
"frontend": {
"contributions": {
"editors": [
{
"id": "monaco-editor.sql",
"name": "Monaco SQL Editor",
"languages": ["sql"],
"description": "Monaco-based SQL editor with IntelliSense"
}
]
}
}
}
```
## Implementing an Editor
Your editor component must implement the `EditorProps` interface and expose an `EditorHandle` via `forwardRef`. For the complete interface definitions, see `@apache-superset/core/api/editors.ts`.
### Key EditorProps
```typescript
interface EditorProps {
/** Controlled value */
value: string;
/** Content change handler */
onChange: (value: string) => void;
/** Language mode for syntax highlighting */
language: EditorLanguage;
/** Keyboard shortcuts to register */
hotkeys?: EditorHotkey[];
/** Callback when editor is ready with imperative handle */
onReady?: (handle: EditorHandle) => void;
/** Host-specific context (e.g., database info from SQL Lab) */
metadata?: Record<string, unknown>;
// ... additional props for styling, annotations, etc.
}
```
### Key EditorHandle Methods
```typescript
interface EditorHandle {
/** Focus the editor */
focus(): void;
/** Get the current editor content */
getValue(): string;
/** Get the current cursor position */
getCursorPosition(): Position;
/** Move the cursor to a specific position */
moveCursorToPosition(position: Position): void;
/** Set the selection range */
setSelection(selection: Range): void;
/** Scroll to a specific line */
scrollToLine(line: number): void;
// ... additional methods for text manipulation, annotations, etc.
}
```
## Example Implementation
Here's an example of a Monaco-based SQL editor implementing the key interfaces shown above:
### MonacoSQLEditor.tsx
```typescript
import { forwardRef, useRef, useImperativeHandle, useEffect } from 'react';
import * as monaco from 'monaco-editor';
import type { editors } from '@apache-superset/core';
const MonacoSQLEditor = forwardRef<editors.EditorHandle, editors.EditorProps>(
(props, ref) => {
const { value, onChange, hotkeys, onReady } = props;
const containerRef = useRef<HTMLDivElement>(null);
const editorRef = useRef<monaco.editor.IStandaloneCodeEditor | null>(null);
// Implement EditorHandle interface
const handle: editors.EditorHandle = {
focus: () => editorRef.current?.focus(),
getValue: () => editorRef.current?.getValue() ?? '',
getCursorPosition: () => {
const pos = editorRef.current?.getPosition();
return { line: (pos?.lineNumber ?? 1) - 1, column: (pos?.column ?? 1) - 1 };
},
// ... implement remaining methods
};
useImperativeHandle(ref, () => handle, []);
useEffect(() => {
if (!containerRef.current) return;
const editor = monaco.editor.create(containerRef.current, { value, language: 'sql' });
editorRef.current = editor;
editor.onDidChangeModelContent(() => onChange(editor.getValue()));
// Register hotkeys
hotkeys?.forEach(hotkey => {
editor.addAction({
id: hotkey.name,
label: hotkey.name,
run: () => hotkey.exec(handle),
});
});
onReady?.(handle);
return () => editor.dispose();
}, []);
return <div ref={containerRef} style={{ height: '100%', width: '100%' }} />;
},
);
export default MonacoSQLEditor;
```
### activate.ts
```typescript
import { editors } from '@apache-superset/core';
import MonacoSQLEditor from './MonacoSQLEditor';
export function activate(context) {
// Register the Monaco editor for SQL
const disposable = editors.registerEditorProvider(
{
id: 'monaco-sql-editor.sql',
name: 'Monaco SQL Editor',
languages: ['sql'],
},
MonacoSQLEditor,
);
context.subscriptions.push(disposable);
}
```
## Handling Hotkeys
Superset passes keyboard shortcuts via the `hotkeys` prop. Each hotkey includes an `exec` function that receives the `EditorHandle`:
```typescript
interface EditorHotkey {
name: string;
key: string; // e.g., "Ctrl-Enter", "Alt-Shift-F"
description?: string;
exec: (handle: EditorHandle) => void;
}
```
Your editor must register these hotkeys with your editor framework and call `exec(handle)` when triggered.
## Keywords
Superset passes static autocomplete suggestions via the `keywords` prop. These include table names, column names, and SQL functions based on the current database context:
```typescript
interface EditorKeyword {
name: string;
value?: string; // Text to insert (defaults to name)
meta?: string; // Category like "table", "column", "function"
score?: number; // Sorting priority
}
```
Your editor should convert these to your framework's completion format and register them for autocomplete.
## Completion Providers
For dynamic autocomplete (e.g., fetching suggestions as the user types), implement and register a `CompletionProvider` via the `EditorHandle`:
```typescript
const provider: CompletionProvider = {
id: 'my-sql-completions',
triggerCharacters: ['.', ' '],
provideCompletions: async (content, position, context) => {
// Use context.metadata for database info
// Return array of CompletionItem
return [
{ label: 'SELECT', insertText: 'SELECT', kind: 'keyword' },
// ...
];
},
};
// Register during editor initialization
const disposable = handle.registerCompletionProvider(provider);
```
## Next Steps
- **[SQL Lab Extension Points](./sqllab)** - Learn about other SQL Lab customizations
- **[Contribution Types](../contribution-types)** - Explore other contribution types
- **[Development](../development)** - Set up your development environment

View File

@@ -1,209 +0,0 @@
---
title: SQL Lab
sidebar_position: 1
---
<!--
Licensed to the Apache Software Foundation (ASF) under one
or more contributor license agreements. See the NOTICE file
distributed with this work for additional information
regarding copyright ownership. The ASF licenses this file
to you under the Apache License, Version 2.0 (the
"License"); you may not use this file except in compliance
with the License. You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing,
software distributed under the License is distributed on an
"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
KIND, either express or implied. See the License for the
specific language governing permissions and limitations
under the License.
-->
# SQL Lab Extension Points
SQL Lab provides 5 extension points where extensions can contribute custom UI components. Each area serves a specific purpose and can be customized to add new functionality.
## Layout Overview
```
┌──────────┬─────────────────────────────────────────┬─────────────┐
│ │ │ │
│ │ │ │
│ │ Editor │ │
│ │ │ │
│ Left │ │ Right │
│ Sidebar ├─────────────────────────────────────────┤ Sidebar │
│ │ │ │
│ │ Panels │ │
│ │ │ │
│ │ │ │
│ │ │ │
├──────────┴─────────────────────────────────────────┴─────────────┤
│ Status Bar │
└──────────────────────────────────────────────────────────────────┘
```
| Extension Point | ID | Description |
| ----------------- | --------------------- | ---------------------------------------------------------- |
| **Left Sidebar** | `sqllab.leftSidebar` | Navigation and browsing (database explorer, saved queries) |
| **Editor** | `sqllab.editor` | SQL query editor workspace |
| **Right Sidebar** | `sqllab.rightSidebar` | Contextual tools (AI assistants, query analysis) |
| **Panels** | `sqllab.panels` | Results and related views (visualizations, data profiling) |
| **Status Bar** | `sqllab.statusBar` | Connection status and query metrics |
## Area Customizations
Each extension point area supports three types of action customizations:
```
┌───────────────────────────────────────────────────────────────┐
│ Area Title [Button] [Button] [•••] │
├───────────────────────────────────────────────────────────────┤
│ │
│ │
│ Area Content │
│ │
│ (right-click for context menu) │
│ │
│ │
└───────────────────────────────────────────────────────────────┘
```
| Action Type | Location | Use Case |
| --------------------- | ----------------- | ----------------------------------------------------- |
| **Primary Actions** | Top-right buttons | Frequently used actions (e.g., run, refresh, add new) |
| **Secondary Actions** | 3-dot menu (•••) | Less common actions (e.g., export, settings) |
| **Context Actions** | Right-click menu | Context-sensitive actions on content |
## Examples
### Adding a Panel
This example adds a "Data Profiler" panel to SQL Lab:
```json
{
"name": "data_profiler",
"version": "1.0.0",
"frontend": {
"contributions": {
"views": {
"sqllab.panels": [
{
"id": "data_profiler.main",
"name": "Data Profiler"
}
]
}
}
}
}
```
```typescript
import { core } from '@apache-superset/core';
import DataProfilerPanel from './DataProfilerPanel';
export function activate(context) {
// Register the panel view with the ID declared in extension.json
const disposable = core.registerView('data_profiler.main', <DataProfilerPanel />);
context.subscriptions.push(disposable);
}
```
### Adding Actions to the Editor
This example adds primary, secondary, and context actions to the editor:
```json
{
"name": "query_tools",
"version": "1.0.0",
"frontend": {
"contributions": {
"commands": [
{
"command": "query_tools.format",
"title": "Format Query",
"icon": "FormatPainterOutlined"
},
{
"command": "query_tools.explain",
"title": "Explain Query"
},
{
"command": "query_tools.copy_as_cte",
"title": "Copy as CTE"
}
],
"menus": {
"sqllab.editor": {
"primary": [
{
"view": "builtin.editor",
"command": "query_tools.format"
}
],
"secondary": [
{
"view": "builtin.editor",
"command": "query_tools.explain"
}
],
"context": [
{
"view": "builtin.editor",
"command": "query_tools.copy_as_cte"
}
]
}
}
}
}
}
```
```typescript
import { commands, sqlLab } from '@apache-superset/core';
export function activate(context) {
// Register the commands declared in extension.json
const formatCommand = commands.registerCommand('query_tools.format', {
execute: () => {
const tab = sqlLab.getCurrentTab();
if (tab?.editor) {
// Format the SQL query
}
},
});
const explainCommand = commands.registerCommand('query_tools.explain', {
execute: () => {
const tab = sqlLab.getCurrentTab();
if (tab?.editor) {
// Show query explanation
}
},
});
const copyAsCteCommand = commands.registerCommand('query_tools.copy_as_cte', {
execute: () => {
const tab = sqlLab.getCurrentTab();
if (tab?.editor) {
// Copy selected text as CTE
}
},
});
context.subscriptions.push(formatCommand, explainCommand, copyAsCteCommand);
}
```
## Next Steps
- **[Contribution Types](../contribution-types)** - Learn about other contribution types (commands, menus)
- **[Development](../development)** - Set up your development environment
- **[Quick Start](../quick-start)** - Build a complete extension

View File

@@ -0,0 +1,78 @@
---
title: Extension Project Structure
sidebar_position: 3
---
<!--
Licensed to the Apache Software Foundation (ASF) under one
or more contributor license agreements. See the NOTICE file
distributed with this work for additional information
regarding copyright ownership. The ASF licenses this file
to you under the Apache License, Version 2.0 (the
"License"); you may not use this file except in compliance
with the License. You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing,
software distributed under the License is distributed on an
"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
KIND, either express or implied. See the License for the
specific language governing permissions and limitations
under the License.
-->
# Extension Project Structure
The `apache-superset-extensions-cli` package provides a command-line interface (CLI) that streamlines the extension development workflow. It offers the following commands:
```
superset-extensions init: Generates the initial folder structure and scaffolds a new extension project.
superset-extensions build: Builds extension assets.
superset-extensions bundle: Packages the extension into a .supx file.
superset-extensions dev: Automatically rebuilds the extension as files change.
```
When creating a new extension with `superset-extensions init <extension-name>`, the CLI generates a standardized folder structure:
```
dataset_references/
├── extension.json
├── frontend/
│   ├── src/
│   ├── webpack.config.js
│   ├── tsconfig.json
│   └── package.json
├── backend/
│   ├── src/
│   │   └── dataset_references/
│   ├── tests/
│   ├── pyproject.toml
│   └── requirements.txt
├── dist/
│   ├── manifest.json
│   ├── frontend/
│   │   └── dist/
│   │       ├── remoteEntry.d7a9225d042e4ccb6354.js
│   │       └── 900.038b20cdff6d49cfa8d9.js
│   └── backend/
│       └── dataset_references/
│           ├── __init__.py
│           ├── api.py
│           └── entrypoint.py
├── dataset_references-1.0.0.supx
└── README.md
```
The `extension.json` file serves as the declared metadata for the extension, containing the extension's name, version, author, description, and a list of capabilities. This file is essential for the host application to understand how to load and manage the extension.
The `frontend` directory contains the source code for the frontend components of the extension, including React components, styles, and assets. The `webpack.config.js` file is used to configure Webpack for building the frontend code, while the `tsconfig.json` file defines the TypeScript configuration for the project. The `package.json` file specifies the dependencies and scripts for building and testing the frontend code.
The `backend` directory contains the source code for the backend components of the extension, including Python modules, tests, and configuration files. The `pyproject.toml` file is used to define the Python package and its dependencies, while the `requirements.txt` file lists the required Python packages for the extension. The `src` folder contains the functional backend source files, while the `tests` directory contains unit tests for the backend code, ensuring that the extension behaves as expected and meets the defined requirements.
The `dist` directory is built when running the `build` or `dev` command, and contains the files that will be included in the bundle. The `manifest.json` file contains critical metadata about the extension, including the majority of the contents of the `extension.json` file, but also other build-time information, like the name of the built Webpack Module Federation remote entry file. The files in the `dist` directory will be zipped into the final `.supx` file. Although this file is technically a zip archive, the `.supx` extension makes it clear that it is a Superset extension package and follows a specific file layout. This packaged file can be distributed and installed in Superset instances.
The `README.md` file provides documentation and instructions for using the extension, including how to install, configure, and use its functionality.

View File

@@ -0,0 +1,90 @@
---
title: Frontend Contribution Types
sidebar_position: 5
---
<!--
Licensed to the Apache Software Foundation (ASF) under one
or more contributor license agreements. See the NOTICE file
distributed with this work for additional information
regarding copyright ownership. The ASF licenses this file
to you under the Apache License, Version 2.0 (the
"License"); you may not use this file except in compliance
with the License. You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing,
software distributed under the License is distributed on an
"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
KIND, either express or implied. See the License for the
specific language governing permissions and limitations
under the License.
-->
# Frontend Contribution Types
To facilitate extension development, Superset defines a set of well-defined contribution types that extensions can implement. These contribution types serve as the building blocks for extensions, allowing them to interact with the host application and provide new functionality. The initial set of contribution types includes:
## Views
Extensions can add new views or panels to the host application, such as custom SQL Lab panels, dashboards, or other UI components. Each view is registered with a unique ID and can be activated or deactivated as needed. Contribution areas are uniquely identified (e.g., `sqllab.panels` for SQL Lab panels), enabling seamless integration into specific parts of the application.
``` json
"views": {
"sqllab.panels": [
{
"id": "dataset_references.main",
"name": "Table references"
}
]
},
```
## Commands
Extensions can define custom commands that can be executed within the host application, such as context-aware actions or menu options. Each command can specify properties like a unique command identifier, an icon, a title, and a description. These commands can be invoked by users through menus, keyboard shortcuts, or other UI elements, enabling extensions to add rich, interactive functionality to Superset.
``` json
"commands": [
{
"command": "extension1.copy_query",
"icon": "CopyOutlined",
"title": "Copy Query",
"description": "Copy the current query to clipboard"
}
]
```
## Menus
Extensions can contribute new menu items or context menus to the host application, providing users with additional actions and options. Each menu item can specify properties such as the target view, the command to execute, its placement (primary, secondary, or context), and conditions for when it should be displayed. Menu contribution areas are uniquely identified (e.g., `sqllab.editor` for the SQL Lab editor), allowing extensions to seamlessly integrate their functionality into specific menus and workflows within Superset.
``` json
"menus": {
"sqllab.editor": {
"primary": [
{
"view": "builtin.editor",
"command": "extension1.copy_query"
}
],
"secondary": [
{
"view": "builtin.editor",
"command": "extension1.prettify"
}
],
"context": [
{
"view": "builtin.editor",
"command": "extension1.clear"
},
{
"view": "builtin.editor",
"command": "extension1.refresh"
}
]
}
}
```

View File

@@ -0,0 +1,120 @@
---
title: Interacting with the Host
sidebar_position: 6
---
<!--
Licensed to the Apache Software Foundation (ASF) under one
or more contributor license agreements. See the NOTICE file
distributed with this work for additional information
regarding copyright ownership. The ASF licenses this file
to you under the Apache License, Version 2.0 (the
"License"); you may not use this file except in compliance
with the License. You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing,
software distributed under the License is distributed on an
"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
KIND, either express or implied. See the License for the
specific language governing permissions and limitations
under the License.
-->
# Interacting with the Host
Extensions interact with Superset through well-defined, versioned APIs provided by the `@apache-superset/core` (frontend) and `apache-superset-core` (backend) packages. These APIs are designed to be stable, discoverable, and consistent for both built-in and external extensions.
**Frontend APIs** (via `@apache-superset/core`):
The frontend extension APIs in Superset are organized into logical namespaces such as `authentication`, `commands`, `extensions`, `sqlLab`, and others. Each namespace groups related functionality, making it easy for extension authors to discover and use the APIs relevant to their needs. For example, the `sqlLab` namespace provides events and methods specific to SQL Lab, allowing extensions to react to user actions and interact with the SQL Lab environment:
``` typescript
export const getCurrentTab: () => Tab | undefined;
export const getDatabases: () => Database[];
export const getTabs: () => Tab[];
export const onDidChangeEditorContent: Event<string>;
export const onDidClosePanel: Event<Panel>;
export const onDidChangeActivePanel: Event<Panel>;
export const onDidChangeTabTitle: Event<string>;
export const onDidQueryRun: Event<Editor>;
export const onDidQueryStop: Event<Editor>;
```
The following code demonstrates more examples of the existing frontend APIs:
``` typescript
import { core, commands, sqlLab, authentication, Button } from '@apache-superset/core';
import MyPanel from './MyPanel';
export function activate(context) {
// Register a new panel (view) in SQL Lab and use shared UI components in your extension's React code
const panelDisposable = core.registerView('my_extension.panel', <MyPanel><Button/></MyPanel>);
// Register a custom command
const commandDisposable = commands.registerCommand('my_extension.copy_query', {
title: 'Copy Query',
execute: () => {
// Command logic here
},
});
// Listen for query run events in SQL Lab
const eventDisposable = sqlLab.onDidQueryRun(editor => {
// Handle query execution event
});
// Access a CSRF token for secure API requests
authentication.getCSRFToken().then(token => {
// Use token as needed
});
// Add all disposables for automatic cleanup on deactivation
context.subscriptions.push(panelDisposable, commandDisposable, eventDisposable);
}
```
**Backend APIs** (via `apache-superset-core`):
Backend APIs follow a similar pattern, providing access to Superset's models, sessions, and query capabilities. Extensions can register REST API endpoints, access the metadata database, and interact with Superset's core functionality.
Extension endpoints are registered under a dedicated `/extensions` namespace to avoid conflicting with built-in endpoints and also because they don't share the same version constraints. By grouping all extension endpoints under `/extensions`, Superset establishes a clear boundary between core and extension functionality, making it easier to manage, document, and secure both types of APIs.
``` python
from superset_core.api import rest_api, models, query
from .api import DatasetReferencesAPI
# Register a new extension REST API
rest_api.add_extension_api(DatasetReferencesAPI)
# Access Superset models with simple queries that filter out entities that
# the user doesn't have access to
databases = models.get_databases(id=database_id)
if not databases:
return self.response_404()
database = databases[0]
# Perform complex queries using SQLAlchemy BaseQuery, also filtering
# out inaccessible entities
session = models.get_session()
db_model = models.get_database_model()
database_query = session.query(db_model).filter(
    db_model.database_name.ilike("%abc%")
)
databases_containing_abc = models.get_databases(database_query)
# Bypass security model for highly custom use cases
session = models.get_session()
db_model = models.get_database_model()
all_databases_containing_abc = session.query(db_model).filter(
    db_model.database_name.ilike("%abc%")
).all()
```
In the future, we plan to expand the backend APIs to support configuring security models, database engines, SQL Alchemy dialects, etc.

View File

@@ -1,459 +0,0 @@
---
title: MCP Integration
hide_title: true
sidebar_position: 8
version: 1
---
<!--
Licensed to the Apache Software Foundation (ASF) under one
or more contributor license agreements. See the NOTICE file
distributed with this work for additional information
regarding copyright ownership. The ASF licenses this file
to you under the Apache License, Version 2.0 (the
"License"); you may not use this file except in compliance
with the License. You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing,
software distributed under the License is distributed on an
"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
KIND, either express or implied. See the License for the
specific language governing permissions and limitations
under the License.
-->
# MCP Integration
Model Context Protocol (MCP) integration allows extensions to register custom AI agent capabilities that integrate seamlessly with Superset's MCP service. Extensions can provide both **tools** (executable functions) and **prompts** (interactive guidance) that AI agents can discover and use.
## What is MCP?
MCP enables extensions to extend Superset's AI capabilities in two ways:
### MCP Tools
Tools are Python functions that AI agents can call to perform specific tasks. They provide executable functionality that extends Superset's capabilities.
**Examples of MCP tools:**
- Data processing and transformation functions
- Custom analytics calculations
- Integration with external APIs
- Specialized report generation
- Business-specific operations
### MCP Prompts
Prompts provide interactive guidance and context to AI agents. They help agents understand how to better assist users with specific workflows or domain knowledge.
**Examples of MCP prompts:**
- Step-by-step workflow guidance
- Domain-specific context and knowledge
- Interactive troubleshooting assistance
- Template generation helpers
- Best practices recommendations
## Getting Started
## MCP Tools
### Basic Tool Registration
The simplest way to create an MCP tool is using the `@tool` decorator:
```python
from superset_core.mcp import tool
@tool
def hello_world() -> dict:
"""A simple greeting tool."""
return {"message": "Hello from my extension!"}
```
This creates a tool that AI agents can call by name. The tool name defaults to the function name.
### Decorator Parameters
The `@tool` decorator accepts several optional parameters:
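For example, a minimal sketch combining them (the name, description, and tags below are illustrative):
```python
from superset_core.mcp import tool

@tool(
    name="my_extension.summarize_sales",  # identifier agents use to call the tool
    description="Summarize sales figures for a given region.",
    tags=["extension", "analytics"],      # categories for organization and discovery
    protect=True,                         # require user authentication (the default)
)
def summarize_sales(region: str) -> dict:
    """Summarize sales figures for a region."""
    return {"status": "success", "region": region}
```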
**Parameter details:**
- **`name`**: Tool identifier (AI agents use this to call your tool)
- **`description`**: Explains what the tool does (helps AI agents decide when to use it)
- **`tags`**: Categories for organization and discovery
- **`protect`**: Whether the tool requires user authentication (defaults to `True`)
### Naming Your Tools
For extensions, include your extension ID in tool names to avoid conflicts:
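For example (assuming your extension's ID is `my_extension`):
```python
from superset_core.mcp import tool

# ✅ Prefixed with the extension ID, avoiding clashes with other extensions
@tool(name="my_extension.calculate_tax_amount")
def calculate_tax_amount(amount: float, rate: float) -> dict:
    """Calculate the tax owed for a given amount and rate."""
    return {"status": "success", "tax": amount * rate}

# ❌ An unprefixed name like "calculate" risks colliding with another tool
```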
## Complete Example
Here's a more comprehensive example showing best practices:
```python
# backend/mcp_tools.py
import random
from datetime import datetime, timezone
from pydantic import BaseModel, Field
from superset_core.mcp import tool
class RandomNumberRequest(BaseModel):
"""Request schema for random number generation."""
min_value: int = Field(
description="Minimum value (inclusive) for random number generation",
ge=-2147483648,
le=2147483647
)
max_value: int = Field(
description="Maximum value (inclusive) for random number generation",
ge=-2147483648,
le=2147483647
)
@tool(
name="example_extension.random_number",
tags=["extension", "utility", "random", "generator"]
)
def random_number_generator(request: RandomNumberRequest) -> dict:
"""
Generate a random integer between specified bounds.
This tool validates input ranges and provides detailed error messages
for invalid requests.
"""
# Validate business logic (Pydantic handles type/range validation)
if request.min_value > request.max_value:
return {
"status": "error",
"error": f"min_value ({request.min_value}) cannot be greater than max_value ({request.max_value})",
"timestamp": datetime.now(timezone.utc).isoformat()
}
# Generate random number
result = random.randint(request.min_value, request.max_value)
return {
"status": "success",
"random_number": result,
"min_value": request.min_value,
"max_value": request.max_value,
"range_size": request.max_value - request.min_value + 1,
"timestamp": datetime.now(timezone.utc).isoformat()
}
```
## Best Practices
### Response Format
Use consistent response structures:
```python
# Success response
{
"status": "success",
"result": "your_data_here",
"timestamp": "2024-01-01T00:00:00Z"
}
# Error response
{
"status": "error",
"error": "Clear error message",
"timestamp": "2024-01-01T00:00:00Z"
}
```
### Documentation
Write clear descriptions and docstrings:
```python
@tool(
name="my_extension.process_data",
description="Process customer data and generate insights. Requires valid customer ID and date range.",
tags=["analytics", "customer", "reporting"]
)
def process_data(customer_id: int, start_date: str, end_date: str) -> dict:
"""
Process customer data for the specified date range.
This tool analyzes customer behavior patterns and generates
actionable insights for business decision-making.
Args:
customer_id: Unique customer identifier
start_date: Analysis start date (YYYY-MM-DD format)
end_date: Analysis end date (YYYY-MM-DD format)
Returns:
Dictionary containing analysis results and recommendations
"""
# Implementation here
pass
```
### Tool Naming
- **Extension tools**: Use prefixed names like `my_extension.tool_name`
- **Descriptive names**: `calculate_tax_amount` vs `calculate`
- **Consistent naming**: Follow patterns within your extension
## How AI Agents Use Your Tools
Once registered, AI agents can discover and use your tools automatically:
```
User: "Generate a random number between 1 and 100"
Agent: I'll use the random number generator tool.
→ Calls: example_extension.random_number(min_value=1, max_value=100)
← Returns: {"status": "success", "random_number": 42, ...}
Agent: I generated the number 42 for you.
```
The AI agent sees your tool's:
- **Name**: How to call it
- **Description**: What it does and when to use it
- **Parameters**: What inputs it expects (from Pydantic schema)
- **Tags**: Categories for discovery
## Troubleshooting
### Tool Not Available to AI Agents
1. **Check extension registration**: Verify your tool module is listed in extension entrypoints
2. **Verify decorator**: Ensure `@tool` is correctly applied
3. **Extension loading**: Confirm your extension is installed and enabled
### Input Validation Errors
1. **Pydantic models**: Ensure field types match expected inputs
2. **Field constraints**: Check min/max values and string lengths are reasonable
3. **Required fields**: Verify which parameters are required vs optional
### Runtime Issues
1. **Error handling**: Wrap risky logic in try/except blocks with clear error messages (see the sketch after this list)
2. **Response format**: Use consistent status/error/timestamp structure
3. **Testing**: Test your tools with various input scenarios
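A minimal sketch of defensive error handling that follows the response format above (the tool name is illustrative):
```python
from datetime import datetime, timezone

from superset_core.mcp import tool

@tool(name="my_extension.safe_divide")
def safe_divide(numerator: float, denominator: float) -> dict:
    """Divide two numbers, returning a structured error instead of raising."""
    timestamp = datetime.now(timezone.utc).isoformat()
    try:
        return {
            "status": "success",
            "result": numerator / denominator,
            "timestamp": timestamp,
        }
    except ZeroDivisionError:
        return {
            "status": "error",
            "error": "denominator must be non-zero",
            "timestamp": timestamp,
        }
```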
### Development Tips
1. **Start simple**: Begin with basic tools, add complexity gradually
2. **Test locally**: Use MCP clients (like Claude Desktop) to test your tools
3. **Clear descriptions**: Write tool descriptions as if explaining to a new user
4. **Meaningful tags**: Use tags that help categorize and discover tools
5. **Error messages**: Provide specific, actionable error messages
## MCP Prompts
### Basic Prompt Registration
Create interactive prompts using the `@prompt` decorator:
```python
from superset_core.mcp import prompt
from fastmcp import Context
@prompt("my_extension.workflow_guide")
async def workflow_guide(ctx: Context) -> str:
"""Interactive guide for data analysis workflows."""
return """
# Data Analysis Workflow Guide
Here's a step-by-step approach to effective data analysis in Superset:
## 1. Data Discovery
- Start by exploring your datasets using the dataset browser
- Check data quality and identify key metrics
- Look for patterns and relationships in your data
## 2. Chart Creation
- Choose appropriate visualizations for your data types
- Apply filters to focus on relevant subsets
- Configure proper aggregations and groupings
## 3. Dashboard Assembly
- Combine related charts into coherent dashboards
- Use filters and parameters for interactivity
- Add markdown components for context and explanations
Would you like guidance on any specific step?
"""
```
### Advanced Prompt Examples
#### Domain-Specific Context
```python
@prompt(
    "sales_extension.sales_analysis_guide",
    title="Sales Analysis Guide",
    description="Specialized guidance for sales data analysis workflows",
)
async def sales_analysis_guide(ctx: Context) -> str:
    """Provides sales-specific analysis guidance and best practices."""
    return """
# Sales Data Analysis Best Practices
## Key Metrics to Track
- **Revenue Growth**: Month-over-month and year-over-year trends
- **Conversion Rates**: Lead-to-opportunity-to-close ratios
- **Customer Lifetime Value**: Total value per customer segment
- **Sales Cycle Length**: Time from lead to close by product/region
## Recommended Chart Types
- **Time Series**: Revenue trends, seasonal patterns
- **Funnel Charts**: Conversion analysis across sales stages
- **Geographic Maps**: Regional performance comparison
- **Cohort Analysis**: Customer retention and growth patterns
## Common Pitfalls to Avoid
- Don't mix different time granularities without proper context
- Always normalize for business days when comparing periods
- Consider external factors (holidays, market events) in analysis
- Segment by relevant dimensions (product, region, channel)
## Next Steps
1. Identify your primary sales KPIs
2. Create baseline trend charts for each metric
3. Build comparative views across segments
4. Set up automated alerts for significant changes
"""
```
#### Interactive Troubleshooting
```python
@prompt("support_extension.troubleshoot_charts")
async def troubleshoot_charts(ctx: Context) -> str:
"""Interactive troubleshooting assistant for chart issues."""
return """
# Chart Troubleshooting Assistant
Let's diagnose your chart issue step by step:
## Common Issues and Solutions
### 🚫 No Data Showing
**Possible causes:**
- Filters are too restrictive
- Date range doesn't match your data
- Database connection issues
- Missing permissions
**Check:** Try removing all filters and expanding the date range first.
### 📊 Unexpected Aggregation Results
**Possible causes:**
- Incorrect grouping dimensions
- Wrong aggregation function (SUM vs COUNT vs AVG)
- Data quality issues (duplicates, nulls)
- Time zone mismatches
**Check:** Verify your GROUP BY columns and aggregation logic.
### 🐌 Slow Performance
**Possible causes:**
- Large dataset without proper indexing
- Complex joins or calculations
- Missing query optimizations
- Resource constraints
**Check:** Simplify the query and add appropriate filters first.
## Debug Steps
1. **Start Simple**: Create a basic count query first
2. **Add Gradually**: Introduce complexity step by step
3. **Check SQL**: Review the generated SQL for issues
4. **Test Data**: Verify with a small sample first
What specific issue are you experiencing?
"""
```
### Prompt Best Practices
#### Content Structure
- **Use clear headings** and sections for easy navigation
- **Provide actionable steps** rather than just theory
- **Include examples** relevant to the user's domain
- **Offer next steps** to continue the workflow
#### Interactive Design
- **Ask questions** to engage the user
- **Provide options** for different scenarios
- **Reference specific Superset features** by name
- **Link to related tools** when appropriate
#### Context Awareness
```python
@prompt("analytics_extension.context_aware_guide")
async def context_aware_guide(ctx: Context) -> str:
    """Provides guidance based on current user context."""
    # Access user information if available
    user_info = getattr(ctx, 'user', None)

    guidance = "# Personalized Analytics Guide\n\n"
    if user_info:
        guidance += "Welcome back! Here's guidance tailored for your role:\n\n"

    guidance += """
## Getting Started
Based on your previous activity, here are recommended next steps:
1. **Review Recent Dashboards**: Check your most-used dashboards for updates
2. **Explore New Data**: Look for recently added datasets in your domain
3. **Share Insights**: Consider sharing successful analyses with your team
## Advanced Techniques
- Set up automated alerts for key metrics
- Create parameterized dashboards for different audiences
- Use SQL Lab for complex custom analyses
"""
    return guidance
```
## Combining Tools and Prompts
Extensions can provide both tools and prompts that work together:
```python
# Tool for data processing
@tool("analytics_extension.calculate_metrics")
def calculate_metrics(data: dict) -> dict:
"""Calculate advanced analytics metrics."""
# Implementation here
pass
# Prompt that guides users to the tool
@prompt("analytics_extension.metrics_guide")
async def metrics_guide(ctx: Context) -> str:
"""Guide users through advanced metrics calculation."""
return """
# Advanced Metrics Calculation
Use the `calculate_metrics` tool to compute specialized analytics:
## Available Metrics
- Customer Lifetime Value (CLV)
- Cohort Retention Rates
- Statistical Significance Tests
- Predictive Trend Analysis
## Usage
Call the tool with your dataset to get detailed calculations
and recommendations for visualization approaches.
Would you like to calculate metrics for your current dataset?
"""
```
## Next Steps
- **[Development](./development)** - Project structure, APIs, and dev workflow
- **[Security](./security)** - Security best practices for extensions


@@ -24,31 +24,53 @@ under the License.
# Overview
Apache Superset's extension system enables organizations to build custom features without modifying the core codebase. Inspired by the [VS Code extension model](https://code.visualstudio.com/api), this architecture addresses a long-standing challenge: teams previously had to fork Superset or make invasive modifications to add capabilities like query optimizers, custom panels, or specialized integrations—resulting in maintenance overhead and codebase fragmentation.
The extension system introduces a modular, plugin-based architecture where both built-in features and external extensions use the same well-defined APIs. This "lean core" approach ensures that any capability available to Superset's internal features is equally accessible to community-developed extensions, fostering a vibrant ecosystem while reducing the maintenance burden on core contributors.
Apache Superset's extension system allows developers to enhance and customize Superset's functionality through a modular, plugin-based architecture. Extensions can add new visualization types, custom UI components, data processing capabilities, and integration points.
## What are Superset Extensions?
Superset extensions are self-contained `.supx` packages that extend the platform's capabilities through standardized contribution points. Each extension can include both frontend (React/TypeScript) and backend (Python) components, bundled together and loaded dynamically at runtime using Webpack Module Federation.
Superset extensions are self-contained packages that extend the core platform's capabilities. They follow a standardized architecture that ensures compatibility, security, and maintainability while providing powerful customization options.
## Extension Architecture
- **[Architecture](./architecture)** - Architectural principles and high-level system overview
- **[Extension Project Structure](./extension-project-structure)** - Standard project layout and organization
- **[Extension Metadata](./extension-metadata)** - Configuration and manifest structure
## Development Guide
- **[Frontend Contribution Types](./frontend-contribution-types)** - Types of UI contributions available
- **[Interacting with Host](./interacting-with-host)** - Communication patterns with Superset core
- **[Development Mode](./development-mode)** - Tools and workflows for extension development
For information about runtime loading and dependency management, see the [Dynamic Module Loading](./architecture#dynamic-module-loading) section in the Architecture page.
## Deployment & Management
- **[Deploying Extension](./deploying-extension)** - Packaging and distribution strategies
- **[Security Implications](./security-implications)** - Security considerations and best practices
## Hands-on Examples
- **[Quick Start](./quick-start)** - Complete Hello World extension walkthrough
## Extension Capabilities
Extensions can provide:
- **Custom UI Components**: New panels, views, and interactive elements
- **Commands and Menus**: Custom actions accessible via menus and keyboard shortcuts
- **REST API Endpoints**: Backend services under the `/api/v1/extensions/` namespace
- **MCP Tools and Prompts**: AI agent capabilities for enhanced user assistance
- **Custom Visualizations**: New chart types and data visualization components
- **UI Enhancements**: Custom dashboards, panels, and interactive elements
- **Data Connectors**: Integration with external data sources and APIs
- **Workflow Automation**: Custom actions and batch processing capabilities
- **Authentication Providers**: SSO and custom authentication mechanisms
- **Theme Customization**: Custom styling and branding options
## Next Steps
## Getting Started
- **[Quick Start](./quick-start)** - Build your first extension with a complete walkthrough
- **[Architecture](./architecture)** - Design principles and system overview
- **[Dependencies](./dependencies)** - Managing dependencies and understanding API stability
- **[Contribution Types](./contribution-types)** - Available extension points
- **[Development](./development)** - Project structure, APIs, and development workflow
- **[Deployment](./deployment)** - Packaging and deploying extensions
- **[MCP Integration](./mcp)** - Adding AI agent capabilities using extensions
- **[Security](./security)** - Security considerations and best practices
- **[Community Extensions](./registry)** - Browse extensions shared by the community
1. **Learn the Architecture**: Start with [Architecture](./architecture) to understand the design philosophy
2. **Set up Development**: Follow the [Development Mode](./development-mode) guide to configure your environment
3. **Build Your First Extension**: Complete the [Quick Start](./quick-start) tutorial
4. **Deploy and Share**: Use the [Deploying Extension](./deploying-extension) guide to package your extension
## Extension Ecosystem
The extension system is designed to foster a vibrant ecosystem of community-contributed functionality. By following the established patterns and guidelines, developers can create extensions that seamlessly integrate with Superset while maintaining the platform's reliability and performance standards.


@@ -128,7 +128,7 @@ The CLI generated a basic `backend/src/hello_world/entrypoint.py`. We'll create
```python
from flask import Response
from flask_appbuilder.api import expose, protect, safe
from superset_core.api.rest_api import RestApi
from superset_core.api.types.rest_api import RestApi
class HelloWorldAPI(RestApi):
@@ -191,125 +191,7 @@ This registers your API with Superset when the extension loads.
## Step 5: Create Frontend Component
The CLI generates the frontend configuration files. Below are the key configurations that enable Module Federation integration with Superset.
**`frontend/package.json`**
The `@apache-superset/core` package must be listed in both `peerDependencies` (to declare runtime compatibility) and `devDependencies` (to provide TypeScript types during build):
```json
{
  "name": "hello_world",
  "version": "0.1.0",
  "private": true,
  "license": "Apache-2.0",
  "scripts": {
    "start": "webpack serve --mode development",
    "build": "webpack --stats-error-details --mode production"
  },
  "peerDependencies": {
    "@apache-superset/core": "^x.x.x",
    "react": "^x.x.x",
    "react-dom": "^x.x.x"
  },
  "devDependencies": {
    "@apache-superset/core": "^x.x.x",
    "@types/react": "^x.x.x",
    "ts-loader": "^x.x.x",
    "typescript": "^x.x.x",
    "webpack": "^5.x.x",
    "webpack-cli": "^x.x.x",
    "webpack-dev-server": "^x.x.x"
  }
}
```
**`frontend/webpack.config.js`**
The webpack configuration requires specific settings for Module Federation. Key settings include `externalsType: "window"` and `externals` to map `@apache-superset/core` to `window.superset` at runtime, `import: false` for shared modules to use the host's React instead of bundling a separate copy, and `remoteEntry.[contenthash].js` for cache busting:
```javascript
const path = require("path");
const { ModuleFederationPlugin } = require("webpack").container;
const packageConfig = require("./package.json");
module.exports = (env, argv) => {
  const isProd = argv.mode === "production";

  return {
    entry: isProd ? {} : "./src/index.tsx",
    mode: isProd ? "production" : "development",
    devServer: {
      port: 3001,
      headers: {
        "Access-Control-Allow-Origin": "*",
      },
    },
    output: {
      filename: isProd ? undefined : "[name].[contenthash].js",
      chunkFilename: "[name].[contenthash].js",
      clean: true,
      path: path.resolve(__dirname, "dist"),
      publicPath: `/api/v1/extensions/${packageConfig.name}/`,
    },
    resolve: {
      extensions: [".ts", ".tsx", ".js", ".jsx"],
    },
    // Map @apache-superset/core imports to window.superset at runtime
    externalsType: "window",
    externals: {
      "@apache-superset/core": "superset",
    },
    module: {
      rules: [
        {
          test: /\.tsx?$/,
          use: "ts-loader",
          exclude: /node_modules/,
        },
      ],
    },
    plugins: [
      new ModuleFederationPlugin({
        name: packageConfig.name,
        filename: "remoteEntry.[contenthash].js",
        exposes: {
          "./index": "./src/index.tsx",
        },
        shared: {
          react: {
            singleton: true,
            requiredVersion: packageConfig.peerDependencies.react,
            import: false, // Use host's React, don't bundle
          },
          "react-dom": {
            singleton: true,
            requiredVersion: packageConfig.peerDependencies["react-dom"],
            import: false,
          },
        },
      }),
    ],
  };
};
```
**`frontend/tsconfig.json`**
```json
{
  "compilerOptions": {
    "baseUrl": ".",
    "moduleResolution": "node",
    "jsx": "react",
    "strict": true,
    "esModuleInterop": true,
    "skipLibCheck": true,
    "forceConsistentCasingInFileNames": true
  },
  "include": ["src"]
}
```
The CLI generated boilerplate files. The webpack config and package.json are already properly configured with Module Federation.
**Create `frontend/src/HelloWorldPanel.tsx`**
@@ -455,7 +337,7 @@ Add the following to your `superset_config.py`:
```python
# Enable extensions feature
FEATURE_FLAGS = {
    "ENABLE_EXTENSIONS": True,
    "EXTENSIONS": True,
}
# Set the directory where extensions are stored
@@ -506,9 +388,10 @@ Here's what happens when your extension loads:
Now that you have a working extension, explore:
- **[Development](./development)** - Project structure, APIs, and development workflow
- **[Contribution Types](./contribution-types)** - Other contribution points beyond panels
- **[Deployment](./deployment)** - Packaging and deploying your extension
- **[Security](./security)** - Security best practices for extensions
- **[Development Mode](./development-mode)** - Faster iteration with local development and watch mode
- **[Extension Project Structure](./extension-project-structure)** - Best practices for organizing larger extensions
- **[Frontend Contribution Types](./frontend-contribution-types)** - Other UI contribution points beyond panels
- **[Interacting with Host](./interacting-with-host)** - Advanced APIs for interacting with Superset
- **[Security Implications](./security-implications)** - Security best practices for extensions
For a complete real-world example, examine the query insights extension in the Superset codebase.


@@ -1,55 +0,0 @@
---
title: Community Extensions
sidebar_position: 10
---
<!--
Licensed to the Apache Software Foundation (ASF) under one
or more contributor license agreements. See the NOTICE file
distributed with this work for additional information
regarding copyright ownership. The ASF licenses this file
to you under the Apache License, Version 2.0 (the
"License"); you may not use this file except in compliance
with the License. You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing,
software distributed under the License is distributed on an
"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
KIND, either express or implied. See the License for the
specific language governing permissions and limitations
under the License.
-->
# Community Extensions
This page serves as a registry of community-created Superset extensions. These extensions are developed and maintained by community members and are not officially supported or vetted by the Apache Superset project. **Before installing any community extension, administrators are responsible for evaluating the extension's source code for security vulnerabilities, performance impact, UI/UX quality, and compatibility with their Superset deployment.**
## Extensions
| Name | Description | Author | Preview |
| ------------------------------------------------------------------------------------------------------------------- | --------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- | ------------------ | ------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
| [Extensions API Explorer](https://github.com/michael-s-molina/superset-extensions/tree/main/api_explorer) | A SQL Lab panel that demonstrates the Extensions API by providing an interactive explorer for testing commands like getTabs, getCurrentTab, and getDatabases. Useful for extension developers to understand and experiment with the available APIs. | Michael S. Molina | <a href="/img/extensions/api-explorer.png" target="_blank"><img src="/img/extensions/api-explorer.png" alt="Extensions API Explorer" width="120" /></a> |
| [SQL Query Flow Visualizer](https://github.com/msyavuz/superset-sql-visualizer) | A SQL Lab panel that transforms SQL queries into interactive flow diagrams, helping developers and analysts understand query execution paths and data relationships. | Mehmet Salih Yavuz | <a href="/img/extensions/sql-flow-visualizer.png" target="_blank"><img src="/img/extensions/sql-flow-visualizer.png" alt="SQL Flow Visualizer" width="120" /></a> |
| [SQL Lab Export to Google Sheets](https://github.com/michael-s-molina/superset-extensions/tree/main/sqllab_gsheets) | A Superset extension that allows users to export SQL Lab query results directly to Google Sheets. | Michael S. Molina | <a href="/img/extensions/gsheets-export.png" target="_blank"><img src="/img/extensions/gsheets-export.png" alt="SQL Lab Export to Google Sheets" width="120" /></a> |
| [SQL Lab Export to Parquet](https://github.com/rusackas/superset-extensions/tree/main/sqllab_parquet) | Export SQL Lab query results directly to Apache Parquet format with Snappy compression. | Evan Rusackas | <a href="/img/extensions/parquet-export.png" target="_blank"><img src="/img/extensions/parquet-export.png" alt="SQL Lab Export to Parquet" width="120" /></a> |
| [SQL Lab Query Comparison](https://github.com/michael-s-molina/superset-extensions/tree/main/query_comparison) | A SQL Lab extension that enables side-by-side comparison of query results across different tabs, with GitHub-style diff visualization showing added/removed rows and columns. | Michael S. Molina | <a href="/img/extensions/query-comparison.png" target="_blank"><img src="/img/extensions/query-comparison.png" alt="Query Comparison" width="120" /></a> |
| [SQL Lab Result Stats](https://github.com/michael-s-molina/superset-extensions/tree/main/result_stats) | A SQL Lab extension that automatically computes statistics for query results, providing type-aware analysis including numeric metrics (min, max, mean, median, std dev), string analysis (length, empty counts), and date range information. | Michael S. Molina | <a href="/img/extensions/result-stats.png" target="_blank"><img src="/img/extensions/result-stats.png" alt="Result Stats" width="120" /></a> |
| [SQL Snippets](https://github.com/michael-s-molina/superset-extensions/tree/main/sql_snippets) | A SQL Lab extension that provides reusable SQL code snippets, enabling quick insertion of commonly used code blocks such as license headers, author information, and frequently used SQL patterns. | Michael S. Molina | <a href="/img/extensions/sql-snippets.png" target="_blank"><img src="/img/extensions/sql-snippets.png" alt="SQL Snippets" width="120" /></a> |
| [SQL Lab Query Estimator](https://github.com/michael-s-molina/superset-extensions/tree/main/query_estimator) | A SQL Lab panel that analyzes query execution plans to estimate resource impact, detect performance issues like Cartesian products and high-cost operations, and visualize the query plan tree. | Michael S. Molina | <a href="/img/extensions/query-estimator.png" target="_blank"><img src="/img/extensions/query-estimator.png" alt="Query Estimator" width="120" /></a> |
| [Editors Bundle](https://github.com/michael-s-molina/superset-extensions/tree/main/editors_bundle) | A Superset extension that demonstrates how to provide custom code editors for different languages. This extension showcases the editor contribution system by registering alternative editors that can replace Superset's default Ace editor. | Michael S. Molina | <a href="/img/extensions/editors-bundle.png" target="_blank"><img src="/img/extensions/editors-bundle.png" alt="Editors Bundle" width="120" /></a> |
## How to Add Your Extension
To add your extension to this registry, submit a pull request to the [Apache Superset repository](https://github.com/apache/superset) with the following changes:
1. Add a row to the **Extensions** table above using this format:
```markdown
| [Your Extension](https://github.com/your-username/your-repo) | A brief description of your extension. | Your Name | <a href="/img/extensions/your-screenshot.png" target="_blank"><img src="/img/extensions/your-screenshot.png" alt="Your Extension" width="120" /></a> |
```
2. Add a screenshot to `docs/static/img/extensions/` (recommended size: 800x450px, PNG or JPG format)
3. Submit your PR with a title like "docs: Add [Extension Name] to community extensions registry"


@@ -1,6 +1,6 @@
---
title: Security
sidebar_position: 9
title: Security Implications and Responsibilities
sidebar_position: 12
---
<!--
@@ -22,14 +22,12 @@ specific language governing permissions and limitations
under the License.
-->
# Security
# Security Implications and Responsibilities
By default, extensions are disabled and must be explicitly enabled by setting the `ENABLE_EXTENSIONS` feature flag. Built-in extensions are included as part of the Superset codebase and are held to the same security standards and review processes as the rest of the application.
For external extensions, administrators are responsible for evaluating and verifying the security of any extensions they choose to install, just as they would when installing third-party NPM or PyPI packages. At this stage, all extensions run in the same context as the host application, without additional sandboxing. This means that external extensions can impact the security and performance of a Superset environment in the same way as any other installed dependency.
We plan to introduce an optional sandboxed execution model for extensions in the future (as part of an additional SIP). Until then, administrators should exercise caution and follow best practices when selecting and deploying third-party extensions. A directory of community extensions is available in the [Community Extensions](./registry) page. Note that these extensions are not vetted by the Apache Superset project—administrators must evaluate each extension before installation.
We plan to introduce an optional sandboxed execution model for extensions in the future (as part of an additional SIP). Until then, administrators should exercise caution and follow best practices when selecting and deploying third-party extensions. A directory of known Superset extensions may be maintained in a manner similar to [this page](https://github.com/apache/superset/wiki/Superset-Third%E2%80%90Party-Plugins-Directory) on the wiki. We also discussed the possibility of introducing a shared registry for vetted extensions but decided to leave it out of the initial scope of the project. We might introduce a registry at a later stage depending on the evolution of extensions created by the community.
**Any performance or security vulnerabilities introduced by external extensions should be reported directly to the extension author, not as Superset vulnerabilities.**
Any security concerns regarding built-in extensions (included in Superset's monorepo) should be reported to the Superset Security mailing list for triage and resolution by maintainers.
Any performance or security vulnerabilities introduced by external extensions should be reported directly to the extension author, not as Superset vulnerabilities. Any security concerns regarding built-in extensions (included in Superset's monorepo) should be reported to the Superset Security mailing list for triage and resolution by maintainers.


@@ -1,6 +1,6 @@
---
title: Overview
sidebar_position: 1
title: Backend Style Guidelines
sidebar_position: 3
---
<!--
@@ -26,22 +26,22 @@ under the License.
This is a list of statements that describe how we do backend development in Superset. While they might not be 100% true for all files in the repo, they represent the gold standard we strive towards for backend quality and style.
- We use a monolithic Python/Flask/Flask-AppBuilder backend, with small single-responsibility satellite services where necessary.
- Files are generally organized by feature or object type. Within each domain, we can have api controllers, models, schemas, commands, and data access objects (DAO).
- See: [Proposal for Improving Superset's Python Code Organization](https://github.com/apache/superset/issues/9077)
- API controllers use Marshmallow Schemas to serialize/deserialize data.
- Authentication and authorization are controlled by the [security manager](https://github.com/apache/superset/blob/master/superset/security/manager).
- We use Pytest for unit and integration tests. These live in the `tests` directory.
- We add tests for every new piece of functionality added to the backend.
- We use pytest fixtures to share setup between tests.
- We use SQLAlchemy to access both Superset's application database, and users' analytics databases.
- We make changes backwards compatible whenever possible.
- If a change cannot be made backwards compatible, it goes into a major release.
- See: [Proposal For Semantic Versioning](https://github.com/apache/superset/issues/12566)
- We use Swagger for API documentation, with docs written inline on the API endpoint code.
- We prefer thin ORM models, putting shared functionality in other utilities.
- Several linters/checkers are used to maintain consistent code style and type safety: pylint, mypy, black, isort.
- `__init__.py` files are kept empty to avoid implicit dependencies.
* We use a monolithic Python/Flask/Flask-AppBuilder backend, with small single-responsibility satellite services where necessary.
* Files are generally organized by feature or object type. Within each domain, we can have api controllers, models, schemas, commands, and data access objects (dao).
* See: [Proposal for Improving Superset's Python Code Organization](https://github.com/apache/superset/issues/9077)
* API controllers use Marshmallow Schemas to serialize/deserialize data (see the sketch after this list).
* Authentication and authorization are controlled by the [security manager](https://github.com/apache/superset/blob/master/superset/security/manager).
* We use Pytest for unit and integration tests. These live in the `tests` directory.
* We add tests for every new piece of functionality added to the backend.
* We use pytest fixtures to share setup between tests.
* We use sqlalchemy to access both Superset's application database, and users' analytics databases.
* We make changes backwards compatible whenever possible.
* If a change cannot be made backwards compatible, it goes into a major release.
* See: [Proposal For Semantic Versioning](https://github.com/apache/superset/issues/12566)
* We use Swagger for API documentation, with docs written inline on the API endpoint code.
* We prefer thin ORM models, putting shared functionality in other utilities.
* Several linters/checkers are used to maintain consistent code style and type safety: pylint, mypy, black, isort.
* `__init__.py` files are kept empty to avoid implicit dependencies.
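To ground the Marshmallow point, here is a minimal sketch of how an API controller deserializes a request payload (the schema is illustrative, not an actual Superset schema):
```python
from marshmallow import Schema, ValidationError, fields

class DashboardPutSchema(Schema):  # illustrative, not Superset's real schema
    dashboard_title = fields.String(required=True)
    slug = fields.String(load_default=None)

try:
    payload = DashboardPutSchema().load({"slug": "q1-review"})
except ValidationError as err:
    # -> {'dashboard_title': ['Missing data for required field.']}
    print(err.messages)
```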
## Code Organization


@@ -1,6 +1,6 @@
---
title: DAO Style Guidelines and Best Practices
sidebar_position: 2
sidebar_position: 1
---
<!--
@@ -26,29 +26,19 @@ under the License.
A Data Access Object (DAO) is a pattern that provides an abstract interface to the SQLAlchemy Object Relational Mapper (ORM). The DAOs are critical as they form the building block of the application which are wrapped by the associated commands and RESTful API endpoints.
There are numerous inconsistencies and violations of the DRY principle within the codebase as it relates to DAOs and ORMs—unnecessary commits, non-ACID transactions, etc.—which makes the code unnecessarily complex and convoluted. Addressing the underlying issues with the DAOs _should_ help simplify the downstream operations and improve the developer experience.
Currently there are numerous inconsistencies and violations of the DRY principle within the codebase as it relates to DAOs and ORMs—unnecessary commits, non-ACID transactions, etc.—which makes the code unnecessarily complex and convoluted. Addressing the underlying issues with the DAOs _should_ help simplify the downstream operations and improve the developer experience.
To ensure consistency the following rules should be adhered to:
## Core Rules
1. All database operations (including testing) should be defined within a DAO, i.e., there should not be any explicit `db.session.add`, `db.session.merge`, etc. calls outside of a DAO.
1. **All database operations (including testing) should be defined within a DAO**, i.e., there should not be any explicit `db.session.add`, `db.session.merge`, etc. calls outside of a DAO.
2. A DAO should use `create`, `update`, `delete`, `upsert` terms—typical database operations which ensure consistency with commands—rather than action based terms like `save`, `saveas`, `override`, etc.
2. **A DAO should use `create`, `update`, `delete`, `upsert` terms**—typical database operations which ensure consistency with commands—rather than action based terms like `save`, `saveas`, `override`, etc.
3. Sessions should be managed via a [context manager](https://docs.sqlalchemy.org/en/20/orm/session_transaction.html#begin-once) which auto-commits on success and rolls back on failure, i.e., there should be no explicit `db.session.commit` or `db.session.rollback` calls within the DAO.
3. **Sessions should be managed via a [context manager](https://docs.sqlalchemy.org/en/20/orm/session_transaction.html#begin-once)** which auto-commits on success and rolls back on failure, i.e., there should be no explicit `db.session.commit` or `db.session.rollback` calls within the DAO.
4. There should be a single atomic transaction representing the entirety of the operation, i.e., when creating a dataset with associated columns and metrics either all the changes succeed when the transaction is committed, or all the changes are undone when the transaction is rolled back. SQLAlchemy supports [nested transactions](https://docs.sqlalchemy.org/en/20/orm/session_transaction.html#nested-transaction) via the `begin_nested` method which can be nested—inline with how DAOs are invoked.
4. **There should be a single atomic transaction representing the entirety of the operation**, i.e., when creating a dataset with associated columns and metrics either all the changes succeed when the transaction is committed, or all the changes are undone when the transaction is rolled back. SQLAlchemy supports [nested transactions](https://docs.sqlalchemy.org/en/20/orm/session_transaction.html#nested-transaction) via the `begin_nested` method which can be nested—inline with how DAOs are invoked.
5. **The database layer should adopt a "shift left" mentality** i.e., uniqueness/foreign key constraints, relationships, cascades, etc. should all be defined in the database layer rather than being enforced in the application layer.
6. **Exception-based validation**: Ask for forgiveness rather than permission. Try to perform the operation and rely on database constraints to verify that the model is acceptable, rather than pre-validating conditions.
7. **Bulk operations**: Provide bulk `create`, `update`, and `delete` methods where applicable for performance optimization.
8. **Sparse updates**: Updates should only modify explicitly defined attributes.
9. **Test transactions**: Tests should leverage nested transactions which should be rolled back on teardown, rather than deleting objects.
5. The database layer should adopt a "shift left" mentality i.e., uniqueness/foreign key constraints, relationships, cascades, etc. should all be defined in the database layer rather than being enforced in the application layer.
## DAO Implementation Examples
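As a first, self-contained sketch of the rules above (using a stand-in model and a plain SQLAlchemy 1.4+/2.0 session, not Superset's actual DAO base class):
```python
from sqlalchemy import Column, Integer, String, create_engine
from sqlalchemy.orm import Session, declarative_base

Base = declarative_base()

class Dataset(Base):  # stand-in model, not Superset's
    __tablename__ = "dataset"
    id = Column(Integer, primary_key=True)
    # "Shift left": uniqueness is enforced by the database, not the application.
    name = Column(String, nullable=False, unique=True)

class DatasetDAO:
    @staticmethod
    def create(session: Session, **attributes) -> Dataset:
        # The savepoint commits on success and rolls back on failure; no explicit
        # session.commit()/session.rollback() inside the DAO (rules 3 and 4).
        with session.begin_nested():
            dataset = Dataset(**attributes)
            session.add(dataset)
        return dataset

engine = create_engine("sqlite://")
Base.metadata.create_all(engine)
# The caller owns the single atomic transaction wrapping the whole operation.
with Session(engine) as session, session.begin():
    DatasetDAO.create(session, name="example")
```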


@@ -30,23 +30,21 @@ This is an area to host resources and documentation supporting the evolution and
### Sentence case
Use sentence-case capitalization for everything in the UI (except these exceptions below).
Use sentence-case capitalization for everything in the UI (except these **exceptions**).
Sentence case is predominantly lowercase. Capitalize only the initial character of the first word, and other words that require capitalization, like:
- **Proper nouns.** Objects in the product _are not_ considered proper nouns e.g. dashboards, charts, saved queries etc. Proprietary feature names e.g. SQL Lab, Preset Manager _are_ considered proper nouns
- **Acronyms** (e.g. CSS, HTML)
- When referring to **UI labels that are themselves capitalized** from sentence case (e.g. page titles - Dashboards page, Charts page, Saved queries page, etc.)
- User input that is reflected in the UI. E.g. a user-named dashboard tab
* **Proper nouns.** Objects in the product _are not_ considered proper nouns e.g. dashboards, charts, saved queries etc. Proprietary feature names e.g. SQL Lab, Preset Manager _are_ considered proper nouns
* **Acronyms** (e.g. CSS, HTML)
* When referring to **UI labels that are themselves capitalized** from sentence case (e.g. page titles - Dashboards page, Charts page, Saved queries page, etc.)
* User input that is reflected in the UI. E.g. a user-named dashboard tab
**Sentence case vs. Title case:**
- Title case: "A Dog Takes a Walk in Paris"
- Sentence case: "A dog takes a walk in Paris"
**Sentence case vs. Title case:** Title case: "A Dog Takes a Walk in Paris" Sentence case: "A dog takes a walk in Paris"
**Why sentence case?**
- It's generally accepted as the quickest to read
- It's the easiest form to distinguish between common and proper nouns
* It's generally accepted as the quickest to read
* It's the easiest form to distinguish between common and proper nouns
### How to refer to UI elements
@@ -54,38 +52,21 @@ When writing about a UI element, use the same capitalization as used in the UI.
For example, if an input field is labeled "Name" then you refer to this as the "Name input field". Similarly, if a button has the label "Save" in it, then it is correct to refer to the "Save button".
Where a product page is titled "Settings", you refer to this in writing as follows:
"Edit your personal information on the Settings page".
Where a product page is titled "Settings", you refer to this in writing as follows: "Edit your personal information on the Settings page".
Often a product page will have the same title as the objects it contains. In this case, refer to the page as it appears in the UI, and the objects as common nouns:
- Upload a dashboard on the Dashboards page
- Go to Dashboards
- View dashboard
- View all dashboards
- Upload CSS templates on the CSS templates page
- Queries that you save will appear on the Saved queries page
- Create custom queries in SQL Lab then create dashboards
* Upload a dashboard on the Dashboards page
* Go to Dashboards
* View dashboard
* View all dashboards
* Upload CSS templates on the CSS templates page
* Queries that you save will appear on the Saved queries page
* Create custom queries in SQL Lab then create dashboards
### Exceptions to sentence case
### **Exceptions to sentence case:**
1. **Acronyms and abbreviations.**
Examples: URL, CSV, XML, CSS, SQL, SSH, URI, NaN, CRON, CC, BCC
2. **Proper nouns and brand names.**
Examples: Apache, Superset, AntD, JavaScript, GeoJSON, Slack, Google Sheets, SQLAlchemy
3. **Technical terms derived from proper nouns.**
Examples: Jinja, Gaussian, European (as in European time zone)
4. **Key names.** Capitalize button labels and UI elements as they appear in the product UI.
Examples: Shift (as in the keyboard button), Enter key
5. **Named queries or specific labeled items.**
Examples: Query A, Query B
6. **Database names.** Always capitalize names of database engines and connectors.
Examples: Presto, Trino, Drill, Hive, Google Sheets
1. Acronyms and abbreviations. Examples: URL, CSV, XML
## Button Design Guidelines
@@ -117,32 +98,6 @@ Primary buttons have a fourth style: dropdown.
| Tertiary | For less prominent actions; can be used in isolation or paired with a primary button |
| Destructive | For actions that could have destructive effects on the user's data |
### Format
#### Anatomy
Button text is centered using the Label style. Icons appear left of text when combined. If no text label exists, an icon must indicate the button's function.
#### Button size
- Default dimensions: 160px width × 32px height
- Text: 11px, Inter Medium, all caps
- Corners: 4px border radius
- Minimum padding: 8px around text
- Width can decrease if space is limited, but maintain minimum padding
#### Button groups
- Group related buttons to establish visual hierarchy
- Avoid overwhelming users with too many actions
- Limit calls to action; use tertiary/ghost buttons for layouts with 3+ actions
- Maintain consistent styles within groups when possible
- Space buttons 8px apart vertically or horizontally
#### Content guidelines
Button labels should be clear and predictable. Use the "\{verb\} + \{noun\}" format, except for common actions like "Done," "Close," "Cancel," "Add," or "Delete." This formula provides necessary context and aids translation, though compact UIs or localization needs may warrant exceptions.
## Error Message Design Guidelines
### Definition
@@ -173,10 +128,10 @@ In all cases, encountering errors increases user friction and frustration while
Select one pattern per error (e.g. do not implement an inline and banner pattern for the same error).
| When the error... | Use... |
|------------------|--------|
| Is directly related to a UI control | Inline error |
| Is not directly related to a UI control | Banner error |
When the error... | Use...
---------------- | ------
Is directly related to a UI control | Inline error
Is not directly related to a UI control | Banner error
#### Inline
@@ -191,45 +146,3 @@ Use the `LabeledErrorBoundInput` component for this error pattern.
##### Implementation details
- Where and when relevant, scroll the screen to the UI control with the error
- When multiple inline errors are present, scroll to the topmost error
#### Banner
Banner errors are used when the source of the error is not directly related to a UI control (text input, selector, etc.) such as a technical failure or a loading problem.
##### Anatomy
Use the `ErrorAlert` component for this error pattern.
1. **Headline** (optional): 1-2 word summary of the error
2. **Message**: What went wrong and what users should do next
3. **Expand option** (optional): "See more"/"See less"
4. **Details** (optional): Additional helpful context
5. **Modal** (optional): For spatial constraints using `ToastType.DANGER`
##### Implementation details
- Place the banner near the content area most relevant to the error
- For chart errors in Explore, use the chart area
- For modal errors, use the modal footer
- For app-wide errors, use the top of the screen
### Content guidelines
Effective error messages communicate:
1. What went wrong
2. What users should do next
Error messages should be:
- Clear and accurate, leaving no room for misinterpretation
- Short and concise
- Understandable to non-technical users
- Non-blaming and avoiding negative language
**Example:**
❌ "Cannot delete a datasource that has slices attached to it."
✅ "Please delete all charts using this dataset before deleting the dataset."
