Compare commits

..

10 Commits

Author SHA1 Message Date
Maxime Beauchemin
58b448dfc3 temporarily set up matrixify flag 2025-08-29 10:34:42 -07:00
Maxime Beauchemin
c73e132369 fix tests 2025-08-26 15:24:56 -07:00
Maxime Beauchemin
0f9d0996a2 fix: Correct OR operator syntax in Rison filter test
Fix test_or_operator to use proper Rison syntax for OR operations:

- Changed: (OR:!(status:active,priority:high))
- To: (OR:!((status:active),(priority:high)))

The Python prison library expects OR operations to contain an array of
properly formed objects, where each condition is wrapped in parentheses.

This matches the backend parser expectation in _handle_or_operator which
iterates over a list of dict objects to build SQL expressions.

🤖 Generated with [Claude Code](https://claude.ai/code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-08-26 15:24:56 -07:00
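A minimal sketch of the corrected syntax, assuming the `prison` package Superset already depends on (`_handle_or_operator` is only referenced here from the commit message above, not reproduced):

```python
import prison

# Corrected form: each condition is its own Rison object inside the !() array,
# so the OR key maps to a list of dicts the backend can iterate over.
parsed = prison.loads("(OR:!((status:active),(priority:high)))")
print(parsed)  # {'OR': [{'status': 'active'}, {'priority': 'high'}]}

# The old form, (OR:!(status:active,priority:high)), does not decode to a list
# of well-formed condition objects, which is what the test fix addresses.
```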
Maxime Beauchemin
2f4f216e47 fix: Clean up debug logging and finalize URL cleanup
Complete the URL cleanup implementation:

- Removed debug console.log statements from production code
- Finalized risonFiltersToString() function for proper encoding
- Completed updateUrlWithUnmatchedFilters() implementation
- Verified that matched filters are properly removed from URL
- Confirmed quotes are required for Rison values with spaces

The intelligent filter injection system now works perfectly:
- Matched filters only appear in native filter bar
- Unmatched filters only appear in URL filters section
- No filter duplication between sections
- URLs progressively get cleaner as native filters are added

Example syntax:
- f=(country_name:Canada) - works, no quotes needed
- f=(region:'North America') - works, quotes required for spaces
- f=(region:North America) - fails, space breaks Rison parsing

🤖 Generated with [Claude Code](https://claude.ai/code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-08-26 15:24:55 -07:00
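A short illustration of the quoting rule listed above, again assuming the `prison` package; it simply round-trips the two example values:

```python
import prison

# Rison only quotes strings containing characters outside its identifier set,
# so plain words stay bare while values with spaces need single quotes.
print(prison.dumps({"country_name": "Canada"}))   # (country_name:Canada)
print(prison.dumps({"region": "North America"}))  # (region:'North America')
print(prison.loads("(region:'North America')"))   # {'region': 'North America'}
```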
Maxime Beauchemin
ab916bf199 fix colors 2025-08-26 15:24:55 -07:00
Maxime Beauchemin
01169bc5db tweakskies 2025-08-26 15:24:55 -07:00
Maxime Beauchemin
5c710def85 feat: Enhance Rison filters with intelligent native filter injection
Implement smart filter matching that respects dashboard native filter configuration:

- Intelligent matching: Match Rison filters to native filters by column name
- Value conversion: Convert Rison values to appropriate native filter formats
- Scoping preservation: Respect existing native filter scope and configuration
- Graceful fallback: Unmatched filters use existing brute-force approach
- Enhanced UX: Matched filters appear in filter bar with proper visual state

Benefits:
- Respects dashboard designer's filter configuration intent
- Progressive enhancement as dashboards adopt native filters
- Maintains backwards compatibility for all existing functionality
- Better user experience with visible filter state

Technical details:
- Added injectRisonFiltersIntelligently() with column-based matching
- Enhanced DashboardPage.tsx with hybrid injection logic
- Supports filter_select, filter_range, and filter_time types

🤖 Generated with [Claude Code](https://claude.ai/code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-08-26 15:24:55 -07:00
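The matching logic itself lives in `injectRisonFiltersIntelligently()` in DashboardPage.tsx; the following is only a rough Python sketch, with invented names, of the column-based matching and fallback behaviour the commit describes:

```python
SUPPORTED_TYPES = {"filter_select", "filter_range", "filter_time"}

def split_rison_filters(rison_filters, native_filter_configs):
    """Split URL filters by column name (illustrative sketch, not the real code).

    rison_filters:         e.g. {"region": "North America", "year": 2024}
    native_filter_configs: e.g. [{"id": "NATIVE_FILTER-1", "column": "region",
                                  "filterType": "filter_select"}]
    """
    by_column = {cfg["column"]: cfg for cfg in native_filter_configs}
    matched, unmatched = {}, {}
    for column, value in rison_filters.items():
        cfg = by_column.get(column)
        if cfg and cfg["filterType"] in SUPPORTED_TYPES:
            # Matched filters go into the native filter bar, keeping the
            # dashboard designer's scoping configuration intact.
            matched[cfg["id"]] = value
        else:
            # Anything unmatched falls back to the existing brute-force path
            # and stays in the URL filters section.
            unmatched[column] = value
    return matched, unmatched
```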
Maxime Beauchemin
0b0a295b83 fix: Add Rison filter display to horizontal filter bar
The Rison URL filters were only showing in the vertical filter bar layout.
This adds the same visual indicators to the horizontal filter bar for consistency.

- Add styled components for Rison filter display in horizontal layout
- Parse and display active Rison filters from URL
- Include Rison filters in hasFilters check
- Match the visual styling from vertical bar (adapted for horizontal layout)
2025-08-26 15:24:55 -07:00
Maxime Beauchemin
c9b587e363 feat: Add Rison URL filters for human-readable filter syntax
Implements Rison-based URL filters that enable human-readable filter parameters
in Superset URLs. Users can now apply filters via clean syntax like
?f=(region:'North America',year:2024) instead of opaque encoded parameters.

Key changes:
- Add RisonFilterParser for parsing Rison syntax into Superset filters
- Support logical operators (OR, NOT) and comparison operators (gt, lt, between)
- Integrate with both dashboard and explore views
- Add visual indicators in FilterBar for active URL filters
- Prevent Rison filters from being saved to backend (stay in URL only)
- Fix duplicate filter issue by processing only on frontend
- Add URL prettification to maintain readable URLs where possible

Backend:
- Disable backend Rison processing in explore/get.py to prevent duplicates
- Filters are now processed exclusively on frontend

Frontend:
- Parse 'f' parameter and convert to adhoc_filters in Chart/index.tsx
- Dashboard integration via DashboardPage.tsx with Redux state management
- Add deduplication logic in getFormDataWithExtraFilters
- Exclude Rison filters from backend storage via sanitizeFormData
- Visual feedback in FilterBar showing URL-derived filters

Known limitations:
- Dashboard filters apply to all charts (scoping not implemented)
- Browser URL encoding varies (Chrome encodes quotes, Firefox doesn't)
- Rison only supports single quotes for strings (no alternatives)

Testing: Manual testing completed for both dashboard and explore contexts

🤖 Generated with [Claude Code](https://claude.ai/code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-08-26 15:24:55 -07:00
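The parser is frontend TypeScript (RisonFilterParser and Chart/index.tsx); purely to make the mapping concrete, here is a hedged Python sketch of how a simple `f` parameter could translate into SIMPLE adhoc-filter dicts (operator handling for OR/NOT/gt/lt is omitted):

```python
import prison

def rison_to_adhoc_filters(f_param):
    """Illustrative only: map ?f=(region:'North America',year:2024) to adhoc filters."""
    adhoc = []
    for column, value in prison.loads(f_param).items():
        adhoc.append({
            "expressionType": "SIMPLE",
            "clause": "WHERE",
            "subject": column,
            "operator": "IN" if isinstance(value, list) else "==",
            "comparator": value,
        })
    return adhoc

print(rison_to_adhoc_filters("(region:'North America',year:2024)"))
# [{'expressionType': 'SIMPLE', 'clause': 'WHERE', 'subject': 'region',
#   'operator': '==', 'comparator': 'North America'}, ...]
```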
Maxime Beauchemin
bb5660512b feat: Add Rison URL filters for simplified filter syntax
- Add RisonFilterParser to parse f parameter with Rison syntax
- Integrate with explore endpoint to merge filters with form_data
- Support equality, IN, NOT, OR, comparison operators, BETWEEN, LIKE
- Add comprehensive unit tests and documentation
- Enable human-readable filter URLs like f=(country:USA,year:2024)

Examples:
- Simple: ?f=(country:USA)
- Multiple: ?f=(country:USA,year:2024)
- IN operator: ?f=(country:\!(USA,Canada))
- NOT: ?f=(NOT:(status:inactive))
- OR: ?f=(OR:\!(status:active,priority:high))
- Comparisons: ?f=(sales:(gt:100000))
- BETWEEN: ?f=(date:(between:\!(2024-01-01,2024-12-31)))
- LIKE: ?f=(name:(like:'%smith%'))
- Special chars: ?f=(region:'North America')
2025-08-26 15:24:55 -07:00
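Rather than hand-writing the syntax (the backslashes before `!` in the list above look like shell escaping), the `f` parameter can also be generated programmatically; a small sketch assuming the `prison` package and the standard library:

```python
from urllib.parse import quote
import prison

filters = {"country": ["USA", "Canada"], "year": 2024}
f_param = prison.dumps(filters)  # e.g. (country:!(USA,Canada),year:2024)
url = f"/superset/dashboard/1/?f={quote(f_param, safe='')}"
print(url)
```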
951 changed files with 11117 additions and 32134 deletions

14
.github/CODEOWNERS vendored
View File

@@ -33,10 +33,10 @@
# Notify PMC members of changes to extension-related files
/superset-core/ @michael-s-molina @villebro @geido @eschutho @rusackas @kgabryje
/superset-extensions-cli/ @michael-s-molina @villebro @geido @eschutho @rusackas @kgabryje
/superset/core/ @michael-s-molina @villebro @geido @eschutho @rusackas @kgabryje
/superset/extensions/ @michael-s-molina @villebro @geido @eschutho @rusackas @kgabryje
/superset-frontend/src/packages/superset-core/ @michael-s-molina @villebro @geido @eschutho @rusackas @kgabryje
/superset-frontend/src/core/ @michael-s-molina @villebro @geido @eschutho @rusackas @kgabryje
/superset-frontend/src/extensions/ @michael-s-molina @villebro @geido @eschutho @rusackas @kgabryje
/superset-core/ @michael-s-molina @villebro
/superset-cli/ @michael-s-molina @villebro
/superset/core/ @michael-s-molina @villebro
/superset/extensions/ @michael-s-molina @villebro
/superset-frontend/src/packages/superset-core/ @michael-s-molina @villebro
/superset-frontend/src/core/ @michael-s-molina @villebro
/superset-frontend/src/extensions/ @michael-s-molina @villebro

View File

@@ -17,9 +17,9 @@ outputs:
docs:
description: Whether docs-related files were changed
value: ${{ steps.change-detector.outputs.docs }}
superset-extensions-cli:
description: Whether superset-extensions-cli package-related files were changed
value: ${{ steps.change-detector.outputs.superset-extensions-cli }}
superset-cli:
description: Whether superset-cli package-related files were changed
value: ${{ steps.change-detector.outputs.superset-cli }}
runs:
using: composite
steps:

View File

@@ -182,76 +182,6 @@ cypress-run-all() {
kill $flaskProcessId
}
playwright-install() {
cd "$GITHUB_WORKSPACE/superset-frontend"
say "::group::Install Playwright browsers"
npx playwright install --with-deps chromium
# Create output directories for test results and debugging
mkdir -p playwright-results
mkdir -p test-results
say "::endgroup::"
}
playwright-run() {
local APP_ROOT=$1
# Start Flask from the project root (same as Cypress)
cd "$GITHUB_WORKSPACE"
local flasklog="${HOME}/flask-playwright.log"
local port=8081
PLAYWRIGHT_BASE_URL="http://localhost:${port}"
if [ -n "$APP_ROOT" ]; then
export SUPERSET_APP_ROOT=$APP_ROOT
PLAYWRIGHT_BASE_URL=${PLAYWRIGHT_BASE_URL}${APP_ROOT}/
fi
export PLAYWRIGHT_BASE_URL
nohup flask run --no-debugger -p $port >"$flasklog" 2>&1 </dev/null &
local flaskProcessId=$!
# Ensure cleanup on exit
trap "kill $flaskProcessId 2>/dev/null || true" EXIT
# Wait for server to be ready with health check
local timeout=60
say "Waiting for Flask server to start on port $port..."
while [ $timeout -gt 0 ]; do
if curl -f ${PLAYWRIGHT_BASE_URL}/health >/dev/null 2>&1; then
say "Flask server is ready"
break
fi
sleep 1
timeout=$((timeout - 1))
done
if [ $timeout -eq 0 ]; then
echo "::error::Flask server failed to start within 60 seconds"
echo "::group::Flask startup log"
cat "$flasklog"
echo "::endgroup::"
return 1
fi
# Change to frontend directory for Playwright execution
cd "$GITHUB_WORKSPACE/superset-frontend"
say "::group::Run Playwright tests"
echo "Running Playwright with baseURL: ${PLAYWRIGHT_BASE_URL}"
npx playwright test auth/login --reporter=github --output=playwright-results
local status=$?
say "::endgroup::"
# After job is done, print out Flask log for debugging
echo "::group::Flask log for Playwright run"
cat "$flasklog"
echo "::endgroup::"
# make sure the program exits
kill $flaskProcessId
return $status
}
eyes-storybook-dependencies() {
say "::group::install eyes-storyook dependencies"
sudo apt-get update -y && sudo apt-get -y install gconf-service ca-certificates libxshmfence-dev fonts-liberation libappindicator3-1 libasound2 libatk-bridge2.0-0 libatk1.0-0 libc6 libcairo2 libcups2 libdbus-1-3 libexpat1 libfontconfig1 libgbm1 libgcc1 libgconf-2-4 libglib2.0-0 libgdk-pixbuf2.0-0 libgtk-3-0 libnspr4 libnss3 libpangocairo-1.0-0 libstdc++6 libx11-6 libx11-xcb1 libxcb1 libxcomposite1 libxcursor1 libxdamage1 libxext6 libxfixes3 libxi6 libxrandr2 libxrender1 libxss1 libxtst6 lsb-release xdg-utils libappindicator1

View File

@@ -32,7 +32,7 @@ jobs:
steps:
- name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
uses: actions/checkout@v5
uses: actions/checkout@v4
with:
persist-credentials: true
ref: master

View File

@@ -31,7 +31,7 @@ jobs:
- name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
if: steps.check_queued.outputs.count >= 20
uses: actions/checkout@v5
uses: actions/checkout@v4
- name: Cancel duplicate workflow runs
if: steps.check_queued.outputs.count >= 20

View File

@@ -18,7 +18,7 @@ jobs:
runs-on: ubuntu-22.04
steps:
- name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
uses: actions/checkout@v5
uses: actions/checkout@v4
with:
persist-credentials: false
submodules: recursive

View File

@@ -25,7 +25,7 @@ jobs:
pull-requests: write
steps:
- name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
uses: actions/checkout@v5
uses: actions/checkout@v4
- name: Check and notify
uses: actions/github-script@v7
with:

View File

@@ -71,7 +71,7 @@ jobs:
id-token: write
steps:
- name: Checkout repository
uses: actions/checkout@v5
uses: actions/checkout@v4
with:
fetch-depth: 1

View File

@@ -31,7 +31,7 @@ jobs:
steps:
- name: Checkout repository
uses: actions/checkout@v5
uses: actions/checkout@v4
- name: Check for file changes
id: check

View File

@@ -27,7 +27,7 @@ jobs:
runs-on: ubuntu-24.04
steps:
- name: "Checkout Repository"
uses: actions/checkout@v5
uses: actions/checkout@v4
- name: "Dependency Review"
uses: actions/dependency-review-action@v4
continue-on-error: true
@@ -53,7 +53,7 @@ jobs:
runs-on: ubuntu-22.04
steps:
- name: "Checkout Repository"
uses: actions/checkout@v5
uses: actions/checkout@v4
- name: Setup Python
uses: ./.github/actions/setup-backend/

View File

@@ -22,7 +22,7 @@ jobs:
steps:
- id: set_matrix
run: |
MATRIX_CONFIG=$(if [ "${{ github.event_name }}" == "pull_request" ]; then echo '["dev", "lean"]'; else echo '["dev", "lean", "py310", "websocket", "dockerize", "py311", "py312"]'; fi)
MATRIX_CONFIG=$(if [ "${{ github.event_name }}" == "pull_request" ]; then echo '["dev", "lean"]'; else echo '["dev", "lean", "py310", "websocket", "dockerize", "py311"]'; fi)
echo "matrix_config=${MATRIX_CONFIG}" >> $GITHUB_OUTPUT
echo $GITHUB_OUTPUT
@@ -42,7 +42,7 @@ jobs:
steps:
- name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
uses: actions/checkout@v5
uses: actions/checkout@v4
with:
persist-credentials: false
@@ -117,7 +117,7 @@ jobs:
runs-on: ubuntu-24.04
steps:
- name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
uses: actions/checkout@v5
uses: actions/checkout@v4
with:
persist-credentials: false
- name: Check for file changes

View File

@@ -28,7 +28,7 @@ jobs:
run:
working-directory: superset-embedded-sdk
steps:
- uses: actions/checkout@v5
- uses: actions/checkout@v4
- uses: actions/setup-node@v4
with:
node-version-file: './superset-embedded-sdk/.nvmrc'

View File

@@ -18,7 +18,7 @@ jobs:
run:
working-directory: superset-embedded-sdk
steps:
- uses: actions/checkout@v5
- uses: actions/checkout@v4
- uses: actions/setup-node@v4
with:
node-version-file: './superset-embedded-sdk/.nvmrc'

View File

@@ -1,10 +1,4 @@
name: Cleanup ephemeral envs (PR close) [DEPRECATED]
# ⚠️ DEPRECATION NOTICE ⚠️
# This workflow is deprecated and will be removed in a future version.
# The new Superset Showtime workflow handles cleanup automatically.
# See .github/workflows/showtime.yml and showtime-cleanup.yml for replacements.
# Migration guide: https://github.com/mistercrunch/superset-showtime
name: Cleanup ephemeral envs (PR close)
on:
pull_request_target:
@@ -77,5 +71,5 @@ jobs:
issue_number: ${{ github.event.number }},
owner: context.repo.owner,
repo: context.repo.repo,
body: '⚠️ **DEPRECATED WORKFLOW** - Ephemeral environment shutdown and build artifacts deleted. Please migrate to the new Superset Showtime system for future PRs.'
body: 'Ephemeral environment shutdown and build artifacts deleted.'
})

View File

@@ -1,12 +1,4 @@
name: Ephemeral env workflow [DEPRECATED]
# ⚠️ DEPRECATION NOTICE ⚠️
# This workflow is deprecated and will be removed in a future version.
# Please use the new Superset Showtime workflow instead:
# - Use label "🎪 trigger-start" instead of "testenv-up"
# - Showtime provides better reliability and easier management
# - See .github/workflows/showtime.yml for the replacement
# - Migration guide: https://github.com/mistercrunch/superset-showtime
name: Ephemeral env workflow
# Example manual trigger:
# gh workflow run ephemeral-env.yml --ref fix_ephemerals --field label_name="testenv-up" --field issue_number=666
@@ -134,11 +126,8 @@ jobs:
throw new Error("Issue number is not available.");
}
const body = `⚠️ **DEPRECATED WORKFLOW** ⚠️\n\n@${user} This workflow is deprecated! Please use the new **Superset Showtime** system instead:\n\n` +
`- Replace "testenv-up" label with "🎪 trigger-start"\n` +
`- Better reliability and easier management\n` +
`- See https://github.com/mistercrunch/superset-showtime for details\n\n` +
`Processing your ephemeral environment request [here](${workflowUrl}). Action: **${action}**.` +
const body = `@${user} Processing your ephemeral environment request [here](${workflowUrl}).` +
` Action: **${action}**.` +
` More information on [how to use or configure ephemeral environments]` +
`(https://superset.apache.org/docs/contributing/howtos/#github-ephemeral-environments)`;
@@ -160,7 +149,7 @@ jobs:
runs-on: ubuntu-24.04
steps:
- name: "Checkout ${{ github.ref }} ( ${{ needs.ephemeral-env-label.outputs.sha }} : ${{steps.get-sha.outputs.sha}} )"
uses: actions/checkout@v5
uses: actions/checkout@v4
with:
ref: ${{ needs.ephemeral-env-label.outputs.sha }}
persist-credentials: false
@@ -220,7 +209,7 @@ jobs:
pull-requests: write
steps:
- uses: actions/checkout@v5
- uses: actions/checkout@v4
with:
persist-credentials: false

View File

@@ -27,12 +27,12 @@ jobs:
runs-on: ubuntu-24.04
steps:
- name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
uses: actions/checkout@v5
uses: actions/checkout@v4
with:
persist-credentials: false
submodules: recursive
- name: Setup Java
uses: actions/setup-java@v5
uses: actions/setup-java@v4
with:
distribution: "temurin"
java-version: "11"

View File

@@ -14,7 +14,7 @@ jobs:
runs-on: ubuntu-24.04
steps:
- name: Checkout Repository
uses: actions/checkout@v5
uses: actions/checkout@v4
- name: Set up Node.js
uses: actions/setup-node@v4

View File

@@ -17,7 +17,7 @@ jobs:
steps:
- name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
uses: actions/checkout@v5
uses: actions/checkout@v4
with:
persist-credentials: false

View File

@@ -12,7 +12,7 @@ jobs:
steps:
- name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
uses: actions/checkout@v5
uses: actions/checkout@v4
with:
persist-credentials: false
submodules: recursive

View File

@@ -15,12 +15,12 @@ jobs:
runs-on: ubuntu-24.04
steps:
- name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
uses: actions/checkout@v5
uses: actions/checkout@v4
with:
persist-credentials: false
submodules: recursive
- name: Setup Java
uses: actions/setup-java@v5
uses: actions/setup-java@v4
with:
distribution: 'temurin'
java-version: '11'

View File

@@ -16,7 +16,7 @@ jobs:
pull-requests: write
steps:
- name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
uses: actions/checkout@v5
uses: actions/checkout@v4
with:
persist-credentials: false
submodules: recursive

View File

@@ -21,7 +21,7 @@ jobs:
python-version: ["current", "previous", "next"]
steps:
- name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
uses: actions/checkout@v5
uses: actions/checkout@v4
with:
persist-credentials: false
submodules: recursive

View File

@@ -27,7 +27,7 @@ jobs:
pull-requests: write
steps:
- name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
uses: actions/checkout@v5
uses: actions/checkout@v4
with:
persist-credentials: false
submodules: recursive

View File

@@ -26,7 +26,7 @@ jobs:
name: Bump version and publish package(s)
runs-on: ubuntu-24.04
steps:
- uses: actions/checkout@v5
- uses: actions/checkout@v4
with:
# pulls all commits (needed for lerna / semantic release to correctly version)
fetch-depth: 0

View File

@@ -1,50 +0,0 @@
name: 🎪 Showtime Cleanup
# Scheduled cleanup of expired environments
on:
schedule:
- cron: '0 */6 * * *' # Every 6 hours
# Manual trigger for testing
workflow_dispatch:
inputs:
max_age_hours:
description: 'Maximum age in hours before cleanup'
required: false
default: '48'
type: string
# Common environment variables
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
AWS_ACCESS_KEY_ID: ${{ secrets.AWS_ACCESS_KEY_ID }}
AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
AWS_REGION: ${{ vars.AWS_REGION || 'us-west-2' }}
GITHUB_ORG: ${{ github.repository_owner }}
GITHUB_REPO: ${{ github.event.repository.name }}
jobs:
cleanup-expired:
name: Clean up expired showtime environments
runs-on: ubuntu-latest
permissions:
contents: read
pull-requests: write
steps:
- name: Install Superset Showtime
run: pip install superset-showtime
- name: Cleanup expired environments
run: |
MAX_AGE="${{ github.event.inputs.max_age_hours || '48' }}"
# Validate max_age is numeric
if [[ ! "$MAX_AGE" =~ ^[0-9]+$ ]]; then
echo "❌ Invalid max_age_hours format: $MAX_AGE (must be numeric)"
exit 1
fi
echo "Cleaning up environments older than ${MAX_AGE}h"
python -m showtime cleanup --older-than "${MAX_AGE}h"

View File

@@ -1,179 +0,0 @@
name: 🎪 Superset Showtime
# Ultra-simple: just sync on any PR state change
on:
pull_request_target:
types: [labeled, unlabeled, synchronize, closed]
# Manual testing
workflow_dispatch:
inputs:
pr_number:
description: 'PR number to sync'
required: true
type: number
sha:
description: 'Specific SHA to deploy (optional, defaults to latest)'
required: false
type: string
# Common environment variables for all jobs (non-sensitive only)
env:
AWS_REGION: us-west-2
GITHUB_ORG: ${{ github.repository_owner }}
GITHUB_REPO: ${{ github.event.repository.name }}
GITHUB_ACTOR: ${{ github.actor }}
jobs:
sync:
name: 🎪 Sync PR to desired state
runs-on: ubuntu-latest
timeout-minutes: 90
permissions:
contents: read
pull-requests: write
steps:
- name: Security Check - Authorize Maintainers Only
id: auth
uses: actions/github-script@v7
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
script: |
const actor = context.actor;
console.log(`🔍 Checking authorization for ${actor}`);
// Early exit for workflow_dispatch - assume authorized since it's manually triggered
if (context.eventName === 'workflow_dispatch') {
console.log(`✅ Workflow dispatch event - assuming authorized for ${actor}`);
core.setOutput('authorized', 'true');
return;
}
const { data: permission } = await github.rest.repos.getCollaboratorPermissionLevel({
owner: context.repo.owner,
repo: context.repo.repo,
username: actor
});
console.log(`📊 Permission level for ${actor}: ${permission.permission}`);
const authorized = ['write', 'admin'].includes(permission.permission);
// If this is a synchronize event from unauthorized user, check if Showtime is active and set blocked label
if (!authorized && context.eventName === 'pull_request_target' && context.payload.action === 'synchronize') {
console.log(`🔒 Synchronize event detected - checking if Showtime is active`);
// Check if PR has any circus tent labels (Showtime is in use)
const { data: issue } = await github.rest.issues.get({
owner: context.repo.owner,
repo: context.repo.repo,
issue_number: context.payload.pull_request.number
});
const hasCircusLabels = issue.labels.some(label => label.name.startsWith('🎪 '));
if (hasCircusLabels) {
console.log(`🎪 Circus labels found - setting blocked label to prevent auto-deployment`);
await github.rest.issues.addLabels({
owner: context.repo.owner,
repo: context.repo.repo,
issue_number: context.payload.pull_request.number,
labels: ['🎪 🔒 showtime-blocked']
});
console.log(`✅ Blocked label set - Showtime will detect and skip operations`);
} else {
console.log(` No circus labels found - Showtime not in use, skipping block`);
}
}
if (!authorized) {
console.log(`🚨 Unauthorized user ${actor} - skipping all operations`);
core.setOutput('authorized', 'false');
return;
}
console.log(`✅ Authorized maintainer: ${actor}`);
core.setOutput('authorized', 'true');
- name: Install Superset Showtime
if: steps.auth.outputs.authorized == 'true'
run: |
echo "::notice::Maintainer ${{ github.actor }} triggered deploy for PR ${{ github.event.pull_request.number || github.event.inputs.pr_number }}"
pip install --upgrade superset-showtime
showtime version
- name: Check what actions are needed
if: steps.auth.outputs.authorized == 'true'
id: check
env:
AWS_ACCESS_KEY_ID: ${{ secrets.AWS_ACCESS_KEY_ID }}
AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
run: |
# Bulletproof PR number extraction
if [[ -n "${{ github.event.pull_request.number }}" ]]; then
PR_NUM="${{ github.event.pull_request.number }}"
elif [[ -n "${{ github.event.inputs.pr_number }}" ]]; then
PR_NUM="${{ github.event.inputs.pr_number }}"
else
echo "❌ No PR number found in event or inputs"
exit 1
fi
echo "Using PR number: $PR_NUM"
# Run sync check-only with optional SHA override
if [[ -n "${{ github.event.inputs.sha }}" ]]; then
OUTPUT=$(python -m showtime sync $PR_NUM --check-only --sha "${{ github.event.inputs.sha }}")
else
OUTPUT=$(python -m showtime sync $PR_NUM --check-only)
fi
echo "$OUTPUT"
# Extract the outputs we need for conditional steps
BUILD=$(echo "$OUTPUT" | grep "build_needed=" | cut -d'=' -f2)
SYNC=$(echo "$OUTPUT" | grep "sync_needed=" | cut -d'=' -f2)
PR_NUM_OUT=$(echo "$OUTPUT" | grep "pr_number=" | cut -d'=' -f2)
TARGET_SHA=$(echo "$OUTPUT" | grep "target_sha=" | cut -d'=' -f2)
echo "build_needed=$BUILD" >> $GITHUB_OUTPUT
echo "sync_needed=$SYNC" >> $GITHUB_OUTPUT
echo "pr_number=$PR_NUM_OUT" >> $GITHUB_OUTPUT
echo "target_sha=$TARGET_SHA" >> $GITHUB_OUTPUT
- name: Checkout PR code (only if build needed)
if: steps.auth.outputs.authorized == 'true' && steps.check.outputs.build_needed == 'true'
uses: actions/checkout@v5
with:
ref: ${{ steps.check.outputs.target_sha }}
persist-credentials: false
- name: Setup Docker Environment (only if build needed)
if: steps.auth.outputs.authorized == 'true' && steps.check.outputs.build_needed == 'true'
uses: ./.github/actions/setup-docker
with:
dockerhub-user: ${{ secrets.DOCKERHUB_USER }}
dockerhub-token: ${{ secrets.DOCKERHUB_TOKEN }}
build: "true"
install-docker-compose: "false"
- name: Execute sync (handles everything)
if: steps.auth.outputs.authorized == 'true' && steps.check.outputs.sync_needed == 'true'
env:
AWS_ACCESS_KEY_ID: ${{ secrets.AWS_ACCESS_KEY_ID }}
AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
DOCKERHUB_USER: ${{ secrets.DOCKERHUB_USER }}
DOCKERHUB_TOKEN: ${{ secrets.DOCKERHUB_TOKEN }}
run: |
PR_NUM="${{ steps.check.outputs.pr_number }}"
TARGET_SHA="${{ steps.check.outputs.target_sha }}"
if [[ -n "$TARGET_SHA" ]]; then
python -m showtime sync $PR_NUM --sha "$TARGET_SHA"
else
python -m showtime sync $PR_NUM
fi

View File

@@ -37,7 +37,7 @@ jobs:
- 16379:6379
steps:
- name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
uses: actions/checkout@v5
uses: actions/checkout@v4
with:
persist-credentials: false
submodules: recursive

View File

@@ -51,7 +51,7 @@ jobs:
- 16379:6379
steps:
- name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
uses: actions/checkout@v5
uses: actions/checkout@v4
with:
persist-credentials: false
submodules: recursive

View File

@@ -30,7 +30,7 @@ jobs:
runs-on: ubuntu-24.04
steps:
- name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
uses: actions/checkout@v5
uses: actions/checkout@v4
with:
persist-credentials: false
submodules: recursive

View File

@@ -1,4 +1,4 @@
name: Superset Extensions CLI Package Tests
name: Superset CLI Package Tests
on:
push:
@@ -14,17 +14,17 @@ concurrency:
cancel-in-progress: true
jobs:
test-superset-extensions-cli-package:
test-superset-cli-package:
runs-on: ubuntu-24.04
strategy:
matrix:
python-version: ["previous", "current", "next"]
defaults:
run:
working-directory: superset-extensions-cli
working-directory: superset-cli
steps:
- name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
uses: actions/checkout@v5
uses: actions/checkout@v4
with:
persist-credentials: false
submodules: recursive
@@ -36,29 +36,29 @@ jobs:
token: ${{ secrets.GITHUB_TOKEN }}
- name: Setup Python
if: steps.check.outputs.superset-extensions-cli
if: steps.check.outputs.superset-cli
uses: ./.github/actions/setup-backend/
with:
python-version: ${{ matrix.python-version }}
requirements-type: dev
- name: Run pytest with coverage
if: steps.check.outputs.superset-extensions-cli
if: steps.check.outputs.superset-cli
run: |
pytest --cov=superset_extensions_cli --cov-report=xml --cov-report=term-missing --cov-report=html -v --tb=short
pytest --cov=superset_cli --cov-report=xml --cov-report=term-missing --cov-report=html -v --tb=short
- name: Upload coverage reports to Codecov
if: steps.check.outputs.superset-extensions-cli
uses: codecov/codecov-action@v5
if: steps.check.outputs.superset-cli
uses: codecov/codecov-action@v3
with:
file: ./coverage.xml
flags: superset-extensions-cli
name: superset-extensions-cli-coverage
flags: superset-cli
name: superset-cli-coverage
fail_ci_if_error: false
- name: Upload HTML coverage report
if: steps.check.outputs.superset-extensions-cli
if: steps.check.outputs.superset-cli
uses: actions/upload-artifact@v4
with:
name: superset-extensions-cli-coverage-html
name: superset-cli-coverage-html
path: htmlcov/

View File

@@ -31,7 +31,7 @@ jobs:
runs-on: ubuntu-24.04
steps:
- name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
uses: actions/checkout@v5
uses: actions/checkout@v4
with:
persist-credentials: false
submodules: recursive
@@ -41,7 +41,7 @@ jobs:
node-version-file: './docs/.nvmrc'
- name: Setup Python
uses: ./.github/actions/setup-backend/
- uses: actions/setup-java@v5
- uses: actions/setup-java@v4
with:
distribution: 'zulu'
java-version: '21'

View File

@@ -18,7 +18,7 @@ jobs:
name: Link Checking
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v5
- uses: actions/checkout@v4
# Do not bump this linkinator-action version without opening
# an ASF Infra ticket to allow the new version first!
- uses: JustinBeckwith/linkinator-action@v1.11.0
@@ -56,7 +56,7 @@ jobs:
working-directory: docs
steps:
- name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
uses: actions/checkout@v5
uses: actions/checkout@v4
with:
persist-credentials: false
submodules: recursive

View File

@@ -69,21 +69,21 @@ jobs:
# Conditional checkout based on context
- name: Checkout for push or pull_request event
if: github.event_name == 'push' || github.event_name == 'pull_request'
uses: actions/checkout@v5
uses: actions/checkout@v4
with:
persist-credentials: false
submodules: recursive
ref: ${{ github.event_name == 'pull_request' && github.event.pull_request.head.sha || github.sha }}
- name: Checkout using ref (workflow_dispatch)
if: github.event_name == 'workflow_dispatch' && github.event.inputs.ref != ''
uses: actions/checkout@v5
uses: actions/checkout@v4
with:
persist-credentials: false
ref: ${{ github.event.inputs.ref }}
submodules: recursive
- name: Checkout using PR ID (workflow_dispatch)
if: github.event_name == 'workflow_dispatch' && github.event.inputs.pr_id != ''
uses: actions/checkout@v5
uses: actions/checkout@v4
with:
persist-credentials: false
ref: refs/pull/${{ github.event.inputs.pr_id }}/merge

View File

@@ -23,7 +23,7 @@ jobs:
should-run: ${{ steps.check.outputs.frontend }}
steps:
- name: Checkout Code
uses: actions/checkout@v5
uses: actions/checkout@v4
with:
persist-credentials: false
fetch-depth: 0
@@ -73,7 +73,7 @@ jobs:
runs-on: ubuntu-24.04
steps:
- name: Download Docker Image Artifact
uses: actions/download-artifact@v5
uses: actions/download-artifact@v4
with:
name: docker-image
@@ -101,7 +101,7 @@ jobs:
runs-on: ubuntu-24.04
steps:
- name: Download Coverage Artifacts
uses: actions/download-artifact@v5
uses: actions/download-artifact@v4
with:
pattern: coverage-artifacts-*
path: coverage/
@@ -127,7 +127,7 @@ jobs:
runs-on: ubuntu-24.04
steps:
- name: Download Docker Image Artifact
uses: actions/download-artifact@v5
uses: actions/download-artifact@v4
with:
name: docker-image
@@ -143,7 +143,7 @@ jobs:
- name: tsc
run: |
docker run --rm $TAG bash -c \
"npm run plugins:build && npm run type"
"npm run type"
validate-frontend:
needs: frontend-build
@@ -151,7 +151,7 @@ jobs:
runs-on: ubuntu-24.04
steps:
- name: Download Docker Image Artifact
uses: actions/download-artifact@v5
uses: actions/download-artifact@v4
with:
name: docker-image

View File

@@ -16,7 +16,7 @@ jobs:
runs-on: ubuntu-24.04
steps:
- name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
uses: actions/checkout@v5
uses: actions/checkout@v4
with:
persist-credentials: false
submodules: recursive

View File

@@ -29,7 +29,7 @@ jobs:
steps:
- name: Checkout code
uses: actions/checkout@v5
uses: actions/checkout@v4
with:
ref: ${{ inputs.ref || github.ref_name }}
persist-credentials: true

View File

@@ -1,141 +0,0 @@
name: Playwright E2E Tests
on:
push:
branches:
- "master"
- "[0-9].[0-9]*"
pull_request:
types: [synchronize, opened, reopened, ready_for_review]
workflow_dispatch:
inputs:
ref:
description: 'The branch or tag to checkout'
required: false
default: ''
pr_id:
description: 'The pull request ID to checkout'
required: false
default: ''
concurrency:
group: ${{ github.workflow }}-${{ github.event.pull_request.number || github.run_id }}
cancel-in-progress: true
jobs:
playwright-tests:
runs-on: ubuntu-22.04
# Allow workflow to succeed even if tests fail during shadow mode
continue-on-error: true
permissions:
contents: read
pull-requests: read
strategy:
fail-fast: false
matrix:
browser: ["chromium"]
app_root: ["", "/app/prefix"]
env:
SUPERSET_ENV: development
SUPERSET_CONFIG: tests.integration_tests.superset_test_config
SUPERSET__SQLALCHEMY_DATABASE_URI: postgresql+psycopg2://superset:superset@127.0.0.1:15432/superset
PYTHONPATH: ${{ github.workspace }}
REDIS_PORT: 16379
GITHUB_TOKEN: ${{ github.token }}
services:
postgres:
image: postgres:16-alpine
env:
POSTGRES_USER: superset
POSTGRES_PASSWORD: superset
ports:
- 15432:5432
redis:
image: redis:7-alpine
ports:
- 16379:6379
steps:
# -------------------------------------------------------
# Conditional checkout based on context (same as Cypress workflow)
- name: Checkout for push or pull_request event
if: github.event_name == 'push' || github.event_name == 'pull_request'
uses: actions/checkout@v5
with:
persist-credentials: false
submodules: recursive
ref: ${{ github.event_name == 'pull_request' && github.event.pull_request.head.sha || github.sha }}
- name: Checkout using ref (workflow_dispatch)
if: github.event_name == 'workflow_dispatch' && github.event.inputs.ref != ''
uses: actions/checkout@v5
with:
persist-credentials: false
ref: ${{ github.event.inputs.ref }}
submodules: recursive
- name: Checkout using PR ID (workflow_dispatch)
if: github.event_name == 'workflow_dispatch' && github.event.inputs.pr_id != ''
uses: actions/checkout@v5
with:
persist-credentials: false
ref: refs/pull/${{ github.event.inputs.pr_id }}/merge
submodules: recursive
# -------------------------------------------------------
- name: Check for file changes
id: check
uses: ./.github/actions/change-detector/
with:
token: ${{ secrets.GITHUB_TOKEN }}
- name: Setup Python
uses: ./.github/actions/setup-backend/
if: steps.check.outputs.python || steps.check.outputs.frontend
- name: Setup postgres
if: steps.check.outputs.python || steps.check.outputs.frontend
uses: ./.github/actions/cached-dependencies
with:
run: setup-postgres
- name: Import test data
if: steps.check.outputs.python || steps.check.outputs.frontend
uses: ./.github/actions/cached-dependencies
with:
run: testdata
- name: Setup Node.js
if: steps.check.outputs.python || steps.check.outputs.frontend
uses: actions/setup-node@v4
with:
node-version-file: './superset-frontend/.nvmrc'
- name: Install npm dependencies
if: steps.check.outputs.python || steps.check.outputs.frontend
uses: ./.github/actions/cached-dependencies
with:
run: npm-install
- name: Build javascript packages
if: steps.check.outputs.python || steps.check.outputs.frontend
uses: ./.github/actions/cached-dependencies
with:
run: build-instrumented-assets
- name: Install Playwright
if: steps.check.outputs.python || steps.check.outputs.frontend
uses: ./.github/actions/cached-dependencies
with:
run: playwright-install
- name: Run Playwright
if: steps.check.outputs.python || steps.check.outputs.frontend
uses: ./.github/actions/cached-dependencies
env:
NODE_OPTIONS: "--max-old-space-size=4096"
with:
run: playwright-run ${{ matrix.app_root }}
- name: Set safe app root
if: failure()
id: set-safe-app-root
run: |
APP_ROOT="${{ matrix.app_root }}"
SAFE_APP_ROOT=${APP_ROOT//\//_}
echo "safe_app_root=$SAFE_APP_ROOT" >> $GITHUB_OUTPUT
- name: Upload Playwright Artifacts
uses: actions/upload-artifact@v4
if: failure()
with:
path: |
${{ github.workspace }}/superset-frontend/playwright-results/
${{ github.workspace }}/superset-frontend/test-results/
name: playwright-artifact-${{ github.run_id }}-${{ github.job }}-${{ matrix.browser }}--${{ steps.set-safe-app-root.outputs.safe_app_root }}

View File

@@ -41,7 +41,7 @@ jobs:
- 16379:6379
steps:
- name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
uses: actions/checkout@v5
uses: actions/checkout@v4
with:
persist-credentials: false
submodules: recursive
@@ -99,7 +99,7 @@ jobs:
- 16379:6379
steps:
- name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
uses: actions/checkout@v5
uses: actions/checkout@v4
with:
persist-credentials: false
submodules: recursive
@@ -152,7 +152,7 @@ jobs:
- 16379:6379
steps:
- name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
uses: actions/checkout@v5
uses: actions/checkout@v4
with:
persist-credentials: false
submodules: recursive

View File

@@ -48,7 +48,7 @@ jobs:
- 16379:6379
steps:
- name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
uses: actions/checkout@v5
uses: actions/checkout@v4
with:
persist-credentials: false
submodules: recursive
@@ -108,7 +108,7 @@ jobs:
- 16379:6379
steps:
- name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
uses: actions/checkout@v5
uses: actions/checkout@v4
with:
persist-credentials: false
submodules: recursive

View File

@@ -24,7 +24,7 @@ jobs:
PYTHONPATH: ${{ github.workspace }}
steps:
- name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
uses: actions/checkout@v5
uses: actions/checkout@v4
with:
persist-credentials: false
submodules: recursive

View File

@@ -18,7 +18,7 @@ jobs:
runs-on: ubuntu-24.04
steps:
- name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
uses: actions/checkout@v5
uses: actions/checkout@v4
with:
persist-credentials: false
submodules: recursive
@@ -49,7 +49,7 @@ jobs:
runs-on: ubuntu-24.04
steps:
- name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
uses: actions/checkout@v5
uses: actions/checkout@v4
with:
persist-credentials: false
submodules: recursive

View File

@@ -21,7 +21,7 @@ jobs:
runs-on: ubuntu-24.04
steps:
- name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
uses: actions/checkout@v5
uses: actions/checkout@v4
with:
persist-credentials: false
- name: Install dependencies

View File

@@ -38,7 +38,7 @@ jobs:
});
- name: "Checkout ( ${{ github.sha }} )"
uses: actions/checkout@v5
uses: actions/checkout@v4
with:
persist-credentials: false

View File

@@ -42,12 +42,12 @@ jobs:
runs-on: ubuntu-24.04
strategy:
matrix:
build_preset: ["dev", "lean", "py310", "websocket", "dockerize", "py311", "py312"]
build_preset: ["dev", "lean", "py310", "websocket", "dockerize", "py311"]
fail-fast: false
steps:
- name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
uses: actions/checkout@v5
uses: actions/checkout@v4
with:
fetch-depth: 0
@@ -107,7 +107,7 @@ jobs:
steps:
- name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
uses: actions/checkout@v5
uses: actions/checkout@v4
with:
fetch-depth: 0

View File

@@ -27,7 +27,7 @@ jobs:
name: Generate Reports
steps:
- name: Checkout Repository
uses: actions/checkout@v5
uses: actions/checkout@v4
- name: Set up Node.js
uses: actions/setup-node@v4

View File

@@ -12,7 +12,7 @@ jobs:
steps:
- name: Welcome Message
uses: actions/first-interaction@v3
uses: actions/first-interaction@v2
continue-on-error: true
with:
repo-token: ${{ github.token }}

View File

@@ -25,7 +25,7 @@ repos:
- id: mypy
name: mypy (main)
args: [--check-untyped-defs]
exclude: ^superset-extensions-cli/
exclude: ^superset-cli/
additional_dependencies: [
types-simplejson,
types-python-dateutil,
@@ -41,9 +41,9 @@ repos:
types-Markdown,
]
- id: mypy
name: mypy (superset-extensions-cli)
name: mypy (superset-cli)
args: [--check-untyped-defs]
files: ^superset-extensions-cli/
files: ^superset-cli/
- repo: https://github.com/pre-commit/pre-commit-hooks
rev: v5.0.0
hooks:

View File

@@ -73,7 +73,6 @@ ibm-db2.svg
postgresql.svg
snowflake.svg
ydb.svg
loading.svg
# docs-related
erd.puml

View File

@@ -44,8 +44,4 @@ under the License.
- [4.0.1](./CHANGELOG/4.0.1.md)
- [4.0.2](./CHANGELOG/4.0.2.md)
- [4.1.0](./CHANGELOG/4.1.0.md)
- [4.1.1](./CHANGELOG/4.1.1.md)
- [4.1.2](./CHANGELOG/4.1.2.md)
- [4.1.3](./CHANGELOG/4.1.3.md)
- [4.1.4](./CHANGELOG/4.1.4.md)
- [5.0.0](./CHANGELOG/5.0.0.md)

View File

@@ -1,33 +0,0 @@
<!--
Licensed to the Apache Software Foundation (ASF) under one
or more contributor license agreements. See the NOTICE file
distributed with this work for additional information
regarding copyright ownership. The ASF licenses this file
to you under the Apache License, Version 2.0 (the
"License"); you may not use this file except in compliance
with the License. You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing,
software distributed under the License is distributed on an
"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
KIND, either express or implied. See the License for the
specific language governing permissions and limitations
under the License.
-->
## Change Log
### 4.1.4 (Thu Jul 24 08:30:04 2025 -0300)
**Database Migrations**
**Features**
**Fixes**
- [#34289](https://github.com/apache/superset/pull/34289) fix: Saved queries list break if one query can't be parsed (@michael-s-molina)
- [#33059](https://github.com/apache/superset/pull/33059) fix: Adds missing __init__ file to commands/logs (@michael-s-molina)
**Others**
- [#32236](https://github.com/apache/superset/pull/32236) chore(deps): bump cryptography from 43.0.3 to 44.0.1 (@dependabot[bot])

View File

@@ -261,7 +261,7 @@ COPY requirements/*.txt requirements/
# Copy local packages needed for editable installs in development.txt
COPY superset-core superset-core
COPY superset-extensions-cli superset-extensions-cli
COPY superset-cli superset-cli
# Install Python dependencies using docker/pip-install.sh
RUN --mount=type=cache,target=${SUPERSET_HOME}/.cache/uv \

30
LLMS.md
View File

@@ -15,9 +15,8 @@ Apache Superset is a data visualization platform with Flask/Python backend and R
### Testing Strategy Migration
- **Prefer unit tests** over integration tests
- **Prefer integration tests** over end-to-end tests
- **Use Playwright for E2E tests** - Migrating from Cypress
- **Cypress is deprecated** - Will be removed once migration is completed
- **Prefer integration tests** over Cypress end-to-end tests
- **Cypress is last resort** - Actively moving away from Cypress
- **Use Jest + React Testing Library** for component testing
- **Use `test()` instead of `describe()`** - Follow [avoid nesting when testing](https://kentcdodds.com/blog/avoid-nesting-when-youre-testing) principles
@@ -108,18 +107,6 @@ superset/
npm run test # All tests
npm run test -- filename.test.tsx # Single file
# E2E Tests (Playwright - NEW)
npm run playwright:test # All Playwright tests
npm run playwright:ui # Interactive UI mode
npm run playwright:headed # See browser during tests
npx playwright test tests/auth/login.spec.ts # Single file
npm run playwright:debug tests/auth/login.spec.ts # Debug specific file
# E2E Tests (Cypress - DEPRECATED)
cd superset-frontend/cypress-base
npm run cypress-run-chrome # All Cypress tests (headless)
npm run cypress-debug # Interactive Cypress UI
# Backend
pytest # All tests
pytest tests/unit_tests/specific_test.py # Single file
@@ -149,19 +136,6 @@ curl -f http://localhost:8088/health || echo "❌ Setup required - see https://s
- **Use negation operator**: `~Model.field` instead of `== False` to avoid ruff E712 errors
- **Example**: `~Model.is_active` instead of `Model.is_active == False`
## Pull Request Guidelines
**When creating pull requests:**
1. **Read the current PR template**: Always check `.github/PULL_REQUEST_TEMPLATE.md` for the latest format
2. **Use the template sections**: Include all sections from the template (SUMMARY, BEFORE/AFTER, TESTING INSTRUCTIONS, ADDITIONAL INFORMATION)
3. **Follow PR title conventions**: Use [Conventional Commits](https://www.conventionalcommits.org/en/v1.0.0/)
- Format: `type(scope): description`
- Example: `fix(dashboard): load charts correctly`
- Types: `fix`, `feat`, `docs`, `style`, `refactor`, `perf`, `test`, `chore`
**Important**: Always reference the actual template file at `.github/PULL_REQUEST_TEMPLATE.md` instead of using cached content, as the template may be updated over time.
## Pre-commit Validation
**Use pre-commit hooks for quality validation:**

View File

@@ -469,10 +469,6 @@ an account first if you don't have one, and reference your username
while requesting access to push packages.
```bash
# Run this first to make sure you are uploading the right version.
# PyPI does not allow you to delete or retract once uploaded.
twine check dist/*
twine upload dist/*
```
@@ -522,8 +518,6 @@ takes the version (ie `3.1.1`), the git reference (any SHA, tag or branch
reference), and whether to force the `latest` Docker tag on the
generated images.
**NOTE:** If the docker image isn't built, you'll need to run this [GH action](https://github.com/apache/superset/actions/workflows/tag-release.yml) where you provide it the tag sha.
### Npm Release
You might want to publish the latest @superset-ui release to npm

View File

@@ -23,9 +23,6 @@ This file documents any backwards-incompatible changes in Superset and
assists people when migrating to a new version.
## Next
- [33055](https://github.com/apache/superset/pull/33055): Upgrades Flask-AppBuilder to 5.0.0. The AUTH_OID authentication type has been deprecated and is no longer available as an option in Flask-AppBuilder. OpenID (OID) is considered a deprecated authentication protocol - if you are using AUTH_OID, you will need to migrate to an alternative authentication method such as OAuth, LDAP, or database authentication before upgrading.
- [35062](https://github.com/apache/superset/pull/35062): Changed the function signature of `setupExtensions` to `setupCodeOverrides` with options as arguments.
- [34871](https://github.com/apache/superset/pull/34871): Fixed Jest test hanging issue from Ant Design v5 upgrade. MessageChannel is now mocked in test environment to prevent rc-overflow from causing Jest to hang. Test environment only - no production impact.
- [34782](https://github.com/apache/superset/pull/34782): Dataset exports now include the dataset ID in their file name (similar to charts and dashboards). If managing assets as code, make sure to rename existing dataset YAMLs to include the ID (and avoid duplicated files).
- [34536](https://github.com/apache/superset/pull/34536): The `ENVIRONMENT_TAG_CONFIG` color values have changed to support only Ant Design semantic colors. Update your `superset_config.py`:
- Change `"error.base"` to just `"error"` after this PR

View File

@@ -118,8 +118,6 @@ services:
POSTGRES_DB: superset_light
SUPERSET__SQLALCHEMY_EXAMPLES_URI: "duckdb:////app/data/examples.duckdb"
SUPERSET_CONFIG_PATH: /app/docker/pythonpath_dev/superset_config_docker_light.py
GITHUB_HEAD_REF: ${GITHUB_HEAD_REF:-}
GITHUB_SHA: ${GITHUB_SHA:-}
superset-init-light:
build:
@@ -162,11 +160,8 @@ services:
SCARF_ANALYTICS: "${SCARF_ANALYTICS:-}"
# configuring the dev-server to use the host.docker.internal to connect to the backend
superset: "http://superset-light:8088"
# Webpack dev server configuration
WEBPACK_DEVSERVER_HOST: "${WEBPACK_DEVSERVER_HOST:-0.0.0.0}"
WEBPACK_DEVSERVER_PORT: "${WEBPACK_DEVSERVER_PORT:-9000}"
ports:
- "${NODE_PORT:-9001}:9000" # Parameterized port, accessible on all interfaces
- "127.0.0.1:${NODE_PORT:-9001}:9000" # Parameterized port
command: ["/app/docker/docker-frontend.sh"]
env_file:
- path: docker/.env # default

View File

@@ -72,7 +72,7 @@ case "${1}" in
;;
app)
echo "Starting web app (using development server)..."
flask run -p $PORT --reload --debugger --without-threads --host=0.0.0.0
flask run -p $PORT --reload --debugger --host=0.0.0.0
;;
app-gunicorn)
echo "Starting web app..."

View File

@@ -138,7 +138,7 @@ try:
from superset_config_docker import * # noqa: F403
logger.info(
"Loaded your Docker configuration at [%s]", superset_config_docker.__file__
f"Loaded your Docker configuration at [{superset_config_docker.__file__}]"
)
except ImportError:
logger.info("Using default Docker config...")

View File

@@ -12,7 +12,7 @@ Users can configure automated alerts and reports to send dashboards or charts to
- *Alerts* are sent when a SQL condition is reached
- *Reports* are sent on a schedule
Alerts and reports are disabled by default. To turn them on, you'll need to change configuration settings and install a suitable headless browser in your environment.
Alerts and reports are disabled by default. To turn them on, you need to do some setup, described here.
## Requirements
@@ -35,14 +35,16 @@ Screenshots will be taken but no messages actually sent as long as `ALERT_REPORT
#### In your `Dockerfile`
You'll need to extend the Superset image to include a headless browser. Your options include:
- Use Playwright with Chrome: this is the recommended approach as of version >=4.1.x. A working example of a Dockerfile that installs these tools is provided under “Building your own production Docker image” on the [Docker Builds](/docs/installation/docker-builds#building-your-own-production-docker-image) page. Read the code comments there as you'll also need to change a feature flag in your config.
- Use Firefox: you'll need to install geckodriver and Firefox.
- Use Chrome without Playwright: you'll need to install Chrome and set the value of `WEBDRIVER_TYPE` to `"chrome"` in your `superset_config.py`.
- You must install a headless browser, for taking screenshots of the charts and dashboards. Only Firefox and Chrome are currently supported.
> If you choose Chrome, you must also change the value of `WEBDRIVER_TYPE` to `"chrome"` in your `superset_config.py`.
In Superset versions <=4.0x, users installed Firefox or Chrome and that was documented here.
Note: All the components required (Firefox headless browser, Redis, Postgres db, celery worker and celery beat) are present in the *dev* docker image if you are following [Installing Superset Locally](/docs/installation/docker-compose/).
All you need to do is add the required config variables described in this guide (See `Detailed Config`).
Only the worker container needs the browser.
If you are running a non-dev docker image, e.g., a stable release like `apache/superset:3.1.0`, that image does not include a headless browser. Only the `superset_worker` container needs this headless browser to browse to the target chart or dashboard.
You can either install and configure the headless browser - see "Custom Dockerfile" section below - or when deploying via `docker compose`, modify your `docker-compose.yml` file to use a dev image for the worker container and a stable release image for the `superset_app` container.
*Note*: In this context, a "dev image" is the same application software as its corresponding non-dev image, just bundled with additional tools. So an image like `3.1.0-dev` is identical to `3.1.0` when it comes to stability, functionality, and running in production. The actual "in-development" versions of Superset - cutting-edge and unstable - are not tagged with version numbers on Docker Hub and will display version `0.0.0-dev` within the Superset UI.
### Slack integration
@@ -150,8 +152,8 @@ SMTP_MAIL_FROM = "noreply@youremail.com"
EMAIL_REPORTS_SUBJECT_PREFIX = "[Superset] " # optional - overwrites default value in config.py of "[Report] "
# WebDriver configuration
# If you use Firefox or Playwright with Chrome, you can stick with default values
# If you use Chrome and are *not* using Playwright, then add the following WEBDRIVER_TYPE and WEBDRIVER_OPTION_ARGS
# If you use Firefox, you can stick with default values
# If you use Chrome, then add the following WEBDRIVER_TYPE and WEBDRIVER_OPTION_ARGS
WEBDRIVER_TYPE = "chrome"
WEBDRIVER_OPTION_ARGS = [
"--force-device-scale-factor=2.0",
@@ -217,6 +219,62 @@ def alert_dynamic_minimal_interval(**kwargs) -> int:
ALERT_MINIMUM_INTERVAL = alert_dynamic_minimal_interval
```
## Custom Dockerfile
If you're running the dev version of a released Superset image, like `apache/superset:3.1.0-dev`, you should be set with the above.
But if you're building your own image, or starting with a non-dev version, a webdriver (and headless browser) is needed to capture screenshots of the charts and dashboards which are then sent to the recipient.
Here's how you can modify your Dockerfile to take the screenshots either with Firefox or Chrome.
### Using Firefox
```docker
FROM apache/superset:3.1.0
USER root
RUN apt-get update && \
apt-get install --no-install-recommends -y firefox-esr
ENV GECKODRIVER_VERSION=0.29.0
RUN wget -q https://github.com/mozilla/geckodriver/releases/download/v${GECKODRIVER_VERSION}/geckodriver-v${GECKODRIVER_VERSION}-linux64.tar.gz && \
tar -x geckodriver -zf geckodriver-v${GECKODRIVER_VERSION}-linux64.tar.gz -O > /usr/bin/geckodriver && \
chmod 755 /usr/bin/geckodriver && \
rm geckodriver-v${GECKODRIVER_VERSION}-linux64.tar.gz
RUN pip install --no-cache gevent psycopg2 redis
USER superset
```
### Using Chrome
```docker
FROM apache/superset:3.1.0
USER root
RUN apt-get update && \
apt-get install -y wget zip libaio1
RUN export CHROMEDRIVER_VERSION=$(curl --silent https://googlechromelabs.github.io/chrome-for-testing/LATEST_RELEASE_116) && \
wget -O google-chrome-stable_current_amd64.deb -q http://dl.google.com/linux/chrome/deb/pool/main/g/google-chrome-stable/google-chrome-stable_${CHROMEDRIVER_VERSION}-1_amd64.deb && \
apt-get install -y --no-install-recommends ./google-chrome-stable_current_amd64.deb && \
rm -f google-chrome-stable_current_amd64.deb
RUN export CHROMEDRIVER_VERSION=$(curl --silent https://googlechromelabs.github.io/chrome-for-testing/LATEST_RELEASE_116) && \
wget -q https://storage.googleapis.com/chrome-for-testing-public/${CHROMEDRIVER_VERSION}/linux64/chromedriver-linux64.zip && \
unzip -j chromedriver-linux64.zip -d /usr/bin && \
chmod 755 /usr/bin/chromedriver && \
rm -f chromedriver-linux64.zip
RUN pip install --no-cache gevent psycopg2 redis
USER superset
```
Don't forget to set `WEBDRIVER_TYPE` and `WEBDRIVER_OPTION_ARGS` in your config if you use Chrome.
## Troubleshooting
There are many reasons that reports might not be working. Try these steps to check for specific issues.
@@ -235,7 +293,9 @@ This is the best source of information about the problem. In a docker compose d
To take a screenshot, the worker visits the dashboard or chart using a headless browser, then takes a screenshot. If you are able to send a chart as CSV or text but can't send as PNG, your problem may lie with the browser.
If you are handling the installation of the headless browser on your own, do your own verification to ensure that the headless browser opens successfully in the worker environment.
Superset docker images that have a tag ending with `-dev` have the Firefox headless browser and geckodriver already installed. You can test that these are installed and in the proper path by entering your Superset worker and running `firefox --headless` and then `geckodriver`. Both commands should start those applications.
If you are handling the installation of that software on your own, or wish to use Chromium instead, do your own verification to ensure that the headless browser opens successfully in the worker environment.
### Send a test email

View File

@@ -363,6 +363,110 @@ CUSTOM_SECURITY_MANAGER = CustomSsoSecurityManager
]
```
### Keycloak-Specific Configuration using Flask-OIDC
If you are using Keycloak as OpenID Connect 1.0 Provider, the above configuration based on [`Authlib`](https://authlib.org/) might not work. In this case using [`Flask-OIDC`](https://pypi.org/project/flask-oidc/) is a viable option.
Make sure the pip package [`Flask-OIDC`](https://pypi.org/project/flask-oidc/) is installed on the webserver. This was successfully tested using version 2.2.0. This package requires [`Flask-OpenID`](https://pypi.org/project/Flask-OpenID/) as a dependency.
The following code defines a new security manager. Add it to a new file named `keycloak_security_manager.py`, placed in the same directory as your `superset_config.py` file.
```python
from flask_appbuilder.security.manager import AUTH_OID
from superset.security import SupersetSecurityManager
from flask_oidc import OpenIDConnect
from flask_appbuilder.security.views import AuthOIDView
from flask_login import login_user
from urllib.parse import quote
from flask_appbuilder.views import ModelView, SimpleFormView, expose
from flask import (
redirect,
request
)
import logging
class OIDCSecurityManager(SupersetSecurityManager):
def __init__(self, appbuilder):
super(OIDCSecurityManager, self).__init__(appbuilder)
if self.auth_type == AUTH_OID:
self.oid = OpenIDConnect(self.appbuilder.get_app)
self.authoidview = AuthOIDCView
class AuthOIDCView(AuthOIDView):
@expose('/login/', methods=['GET', 'POST'])
def login(self, flag=True):
sm = self.appbuilder.sm
oidc = sm.oid
@self.appbuilder.sm.oid.require_login
def handle_login():
user = sm.auth_user_oid(oidc.user_getfield('email'))
if user is None:
info = oidc.user_getinfo(['preferred_username', 'given_name', 'family_name', 'email'])
user = sm.add_user(info.get('preferred_username'), info.get('given_name'), info.get('family_name'),
info.get('email'), sm.find_role('Gamma'))
login_user(user, remember=False)
return redirect(self.appbuilder.get_url_for_index)
return handle_login()
@expose('/logout/', methods=['GET', 'POST'])
def logout(self):
oidc = self.appbuilder.sm.oid
oidc.logout()
super(AuthOIDCView, self).logout()
redirect_url = request.url_root.strip('/') + self.appbuilder.get_url_for_login
return redirect(
oidc.client_secrets.get('issuer') + '/protocol/openid-connect/logout?redirect_uri=' + quote(redirect_url))
```
Then add to your `superset_config.py` file:
```python
from keycloak_security_manager import OIDCSecurityManager
from flask_appbuilder.security.manager import AUTH_OID, AUTH_REMOTE_USER, AUTH_DB, AUTH_LDAP, AUTH_OAUTH
import os
AUTH_TYPE = AUTH_OID
SECRET_KEY = 'SomethingNotEntirelySecret'
OIDC_CLIENT_SECRETS = '/path/to/client_secret.json'
OIDC_ID_TOKEN_COOKIE_SECURE = False
OIDC_OPENID_REALM = '<myRealm>'
OIDC_INTROSPECTION_AUTH_METHOD = 'client_secret_post'
CUSTOM_SECURITY_MANAGER = OIDCSecurityManager
# Allow user self-registration, creating Flask users from authorized users
AUTH_USER_REGISTRATION = True
# The default user self registration role
AUTH_USER_REGISTRATION_ROLE = 'Public'
```
Store your client-specific OpenID information in a file called `client_secret.json`. Create this file in the same directory as `superset_config.py`:
```json
{
"<myOpenIDProvider>": {
"issuer": "https://<myKeycloakDomain>/realms/<myRealm>",
"auth_uri": "https://<myKeycloakDomain>/realms/<myRealm>/protocol/openid-connect/auth",
"client_id": "https://<myKeycloakDomain>",
"client_secret": "<myClientSecret>",
"redirect_uris": [
"https://<SupersetWebserver>/oauth-authorized/<myOpenIDProvider>"
],
"userinfo_uri": "https://<myKeycloakDomain>/realms/<myRealm>/protocol/openid-connect/userinfo",
"token_uri": "https://<myKeycloakDomain>/realms/<myRealm>/protocol/openid-connect/token",
"token_introspection_uri": "https://<myKeycloakDomain>/realms/<myRealm>/protocol/openid-connect/token/introspect"
}
}
```
## LDAP Authentication
FAB supports authenticating user credentials against an LDAP server.

View File

@@ -10,15 +10,8 @@ version: 1
## Jinja Templates
SQL Lab and Explore support [Jinja templating](https://jinja.palletsprojects.com/en/2.11.x/) in queries.
To enable templating, the `ENABLE_TEMPLATE_PROCESSING` [feature flag](/docs/configuration/configuring-superset#feature-flags) needs to be enabled in `superset_config.py`.
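A minimal sketch of that flag in `superset_config.py`:
```python
# superset_config.py
FEATURE_FLAGS = {
    "ENABLE_TEMPLATE_PROCESSING": True,
}
```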
> #### ⚠️ Security Warning
>
> While powerful, this feature executes template code on the server. Within the Superset security model, this is **intended functionality**, as users with permissions to edit charts and virtual datasets are considered **trusted users**.
>
> If you grant these permissions to untrusted users, this feature can be exploited as a **Server-Side Template Injection (SSTI)** vulnerability. Do not enable `ENABLE_TEMPLATE_PROCESSING` unless you fully understand and accept the associated security risks.
When templating is enabled, Python code can be embedded in virtual datasets and
in Custom SQL in the filter and metric controls in Explore. By default, the following variables are
made available in the Jinja context:

View File

@@ -165,206 +165,6 @@ Or in the CRUD interface theme JSON:
This feature works with the stock Docker image - no custom build required!
## ECharts Configuration Overrides
:::note
Available since Superset 6.0
:::
Superset provides fine-grained control over ECharts visualizations through theme-level configuration overrides. This allows you to customize the appearance and behavior of all ECharts-based charts without modifying individual chart configurations.
### Global ECharts Overrides
Apply settings to all ECharts visualizations using `echartsOptionsOverrides`:
```python
THEME_DEFAULT = {
"token": {
"colorPrimary": "#2893B3",
# ... other Ant Design tokens
},
"echartsOptionsOverrides": {
"grid": {
"left": "10%",
"right": "10%",
"top": "15%",
"bottom": "15%"
},
"tooltip": {
"backgroundColor": "rgba(0, 0, 0, 0.8)",
"borderColor": "#ccc",
"textStyle": {
"color": "#fff"
}
},
"legend": {
"textStyle": {
"fontSize": 14,
"fontWeight": "bold"
}
}
}
}
```
### Chart-Specific Overrides
Target specific chart types using `echartsOptionsOverridesByChartType`:
```python
THEME_DEFAULT = {
"token": {
"colorPrimary": "#2893B3",
# ... other tokens
},
"echartsOptionsOverridesByChartType": {
"echarts_pie": {
"legend": {
"orient": "vertical",
"right": 10,
"top": "center"
}
},
"echarts_timeseries": {
"xAxis": {
"axisLabel": {
"rotate": 45,
"fontSize": 12
}
},
"dataZoom": [{
"type": "slider",
"show": True,
"start": 0,
"end": 100
}]
},
"echarts_bubble": {
"grid": {
"left": "15%",
"bottom": "20%"
}
}
}
}
```
### UI Configuration
You can also configure ECharts overrides through the theme CRUD interface:
```json
{
"token": {
"colorPrimary": "#2893B3"
},
"echartsOptionsOverrides": {
"grid": {
"left": "10%",
"right": "10%"
},
"tooltip": {
"backgroundColor": "rgba(0, 0, 0, 0.8)"
}
},
"echartsOptionsOverridesByChartType": {
"echarts_pie": {
"legend": {
"orient": "vertical",
"right": 10
}
}
}
}
```
### Override Precedence
The system applies overrides in the following order (last wins):
1. **Base ECharts theme** - Default Superset styling
2. **Plugin options** - Chart-specific configurations
3. **Global overrides** - `echartsOptionsOverrides`
4. **Chart-specific overrides** - `echartsOptionsOverridesByChartType[chartType]`
This ensures chart-specific overrides take precedence over global ones.
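Conceptually, the merge behaves like the minimal sketch below. This is an illustration of the semantics only, not Superset's actual implementation; note that nested objects are combined while arrays are replaced wholesale (see the deep-merge note under Best Practices):
```python
def deep_merge(base: dict, override: dict) -> dict:
    """Illustrative merge: nested dicts are combined, lists and scalars are replaced."""
    merged = dict(base)
    for key, value in override.items():
        if isinstance(value, dict) and isinstance(merged.get(key), dict):
            merged[key] = deep_merge(merged[key], value)
        else:
            merged[key] = value  # lists (e.g. dataZoom) and scalars win outright
    return merged

global_overrides = {"grid": {"left": "10%", "right": "10%"}, "dataZoom": [{"type": "inside"}]}
pie_overrides = {"grid": {"left": "15%"}, "dataZoom": [{"type": "slider"}]}

# Chart-specific overrides are applied last, so they win where keys collide.
print(deep_merge(global_overrides, pie_overrides))
# {'grid': {'left': '15%', 'right': '10%'}, 'dataZoom': [{'type': 'slider'}]}
```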
### Common Chart Types
Available chart types for `echartsOptionsOverridesByChartType`:
- `echarts_timeseries` - Time series/line charts
- `echarts_pie` - Pie and donut charts
- `echarts_bubble` - Bubble/scatter charts
- `echarts_funnel` - Funnel charts
- `echarts_gauge` - Gauge charts
- `echarts_radar` - Radar charts
- `echarts_boxplot` - Box plot charts
- `echarts_treemap` - Treemap charts
- `echarts_sunburst` - Sunburst charts
- `echarts_graph` - Network/graph charts
- `echarts_sankey` - Sankey diagrams
- `echarts_heatmap` - Heatmaps
- `echarts_mixed_timeseries` - Mixed time series
### Best Practices
1. **Start with global overrides** for consistent styling across all charts
2. **Use chart-specific overrides** for unique requirements per visualization type
3. **Test thoroughly** as overrides use deep merge - nested objects are combined, but arrays are completely replaced
4. **Document your overrides** to help team members understand custom styling
5. **Consider performance** - complex overrides may impact chart rendering speed
### Example: Corporate Branding
```python
# Complete corporate theme with ECharts customization
THEME_DEFAULT = {
"token": {
"colorPrimary": "#1B4D3E",
"fontFamily": "Corporate Sans, Arial, sans-serif"
},
"echartsOptionsOverrides": {
"grid": {
"left": "8%",
"right": "8%",
"top": "12%",
"bottom": "12%"
},
"textStyle": {
"fontFamily": "Corporate Sans, Arial, sans-serif"
},
"title": {
"textStyle": {
"color": "#1B4D3E",
"fontSize": 18,
"fontWeight": "bold"
}
}
},
"echartsOptionsOverridesByChartType": {
"echarts_timeseries": {
"xAxis": {
"axisLabel": {
"color": "#666",
"fontSize": 11
}
}
},
"echarts_pie": {
"legend": {
"textStyle": {
"fontSize": 12
},
"itemGap": 20
}
}
}
}
```
This feature provides powerful theming capabilities while maintaining the flexibility of ECharts' extensive configuration options.
## Advanced Features
- **System Themes**: Manage system-wide default and dark themes via UI or configuration

View File

@@ -620,10 +620,10 @@ See [how tos](/docs/contributing/howtos#linting)
:::tip
`act` compatibility of Superset's GHAs is not fully tested. Running `act` locally may or may not
work for different actions, and may require fine tuning and local secret-handling.
For those more intricate GHAs that are tricky to run locally, we recommend iterating
directly on GHA's infrastructure by pushing to a branch and monitoring the GHA logs.
For more targeted iteration, see the `gh workflow run --ref {BRANCH}` subcommand of the GitHub CLI.
:::
For automation and CI/CD, Superset makes extensive use of GitHub Actions (GHA). You
@@ -631,7 +631,7 @@ can find all of the workflows and other assets under the `.github/` folder. This
- running the backend unit test suites (`tests/`)
- running the frontend test suites (`superset-frontend/src/**.*.test.*`)
- running our Playwright end-to-end tests (`superset-frontend/playwright/`) and legacy Cypress tests (`superset-frontend/cypress-base/`)
- linting the codebase, including all Python, TypeScript and JavaScript, YAML and beyond
- checking for all sorts of other rules and conventions
@@ -747,26 +747,6 @@ To run a single test file:
npm run test -- path/to/file.js
```
#### Known Issues and Workarounds
**Jest Test Hanging (MessageChannel Issue)**
If Jest tests hang with "Jest did not exit one second after the test run has completed", this is likely due to the MessageChannel issue from rc-overflow (Ant Design v5 components).
**Root Cause**: `rc-overflow@1.4.1` creates MessageChannel handles for responsive overflow detection that remain open after test completion.
**Current Workaround**: MessageChannel is mocked as undefined in `spec/helpers/jsDomWithFetchAPI.ts`, forcing rc-overflow to use requestAnimationFrame fallback.
**To verify if still needed**: Remove the MessageChannel mocking lines and run `npm test -- --shard=4/8`. If tests hang, the workaround is still required.
**Future removal conditions**: This workaround can be removed when:
- rc-overflow updates to properly clean up MessagePorts in test environments
- Jest updates to handle MessageChannel/MessagePort cleanup better
- Ant Design switches away from rc-overflow
- We switch away from Ant Design v5
**See**: [PR #34871](https://github.com/apache/superset/pull/34871) for full technical details.
### Debugging Server App
#### Local

View File

@@ -225,57 +225,21 @@ npm run test -- path/to/file.js
### E2E Integration Testing
**Note: We are migrating from Cypress to Playwright. Use Playwright for new tests.**
#### Playwright (Recommended - NEW)
For E2E testing with Playwright, use the same `docker compose` backend:
```bash
CYPRESS_CONFIG=true docker compose up --build
```
`docker compose` will get to work and expose a test-ready Superset app on port `8081` rather than
the default backend port (`8088`). This app uses a different database schema (`superset_cypress`)
to keep it isolated from your other dev environment(s), along with a specific set of examples and
configurations that align with the expectations of the end-to-end tests.
Now in another terminal, run Playwright tests:
```bash
# Navigate to frontend directory (Playwright config is here)
cd superset-frontend
# Run all Playwright tests
npm run playwright:test
# or: npx playwright test
# Run with interactive UI for debugging
npm run playwright:ui
# or: npx playwright test --ui
# Run in headed mode (see browser)
npm run playwright:headed
# or: npx playwright test --headed
# Run specific test file
npx playwright test tests/auth/login.spec.ts
# Run with debug mode (step through tests)
npm run playwright:debug tests/auth/login.spec.ts
# or: npx playwright test --debug tests/auth/login.spec.ts
# Generate test report
npx playwright show-report
```
Configuration is in `superset-frontend/playwright.config.ts`. Base URL is automatically set to `http://localhost:8088` but will use `PLAYWRIGHT_BASE_URL` if provided.
#### Cypress (DEPRECATED - will be removed in Phase 5)
:::warning
Cypress is being phased out in favor of Playwright. Use Playwright for all new tests.
:::
```bash
# Set base URL for Cypress
CYPRESS_BASE_URL=http://localhost:8081
```
@@ -663,7 +627,7 @@ feature flag to `true`, you can add the following line to the PR body/descriptio
FEATURE_TAGGING_SYSTEM=true
```
Similarly, it's possible to disable feature flags with:
```
FEATURE_TAGGING_SYSTEM=false

View File

@@ -86,6 +86,8 @@ USER root
ENV PLAYWRIGHT_BROWSERS_PATH=/usr/local/share/playwright-browsers
# Install packages using uv into the virtual environment
# Superset started using uv after the 4.1 branch; if you are building from apache/superset:4.1.x or an older version,
# replace the first two lines with RUN pip install \
RUN . /app/.venv/bin/activate && \
uv pip install \
# install psycopg2 for using PostgreSQL metadata store - could be a MySQL package if using that backend:

View File

@@ -282,5 +282,5 @@ address.
When running `docker compose up`, docker will build what is required behind the scenes, but
may use the docker cache if assets already exist. Running `docker compose build` prior to
`docker compose up`, or the equivalent shortcut `docker compose up --build`, ensures that your
docker images match the definition in the repository. This should only apply to the main
docker-compose.yml file (default) and not to the alternative methods defined above.

View File

@@ -0,0 +1,283 @@
---
title: URL Filters
hide_title: false
sidebar_position: 4
version: 1
---
# URL Filters
Apply filters to dashboards and charts directly through the URL using a simple, human-readable syntax.
Superset URL filters use [Rison](https://github.com/Nanonid/rison), a data serialization format that's JSON-compatible but optimized for URLs - it's expressive, compact, and looks as great as URLs can look without all the percent-encoding clutter.
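Superset's backend relies on the Python [`prison`](https://pypi.org/project/prison/) package for Rison parsing, so you can preview how a filter string maps to structured data with a quick sketch like this (assuming `pip install prison`):
```python
# Inspect how a Rison filter string maps to Python data structures.
import prison

print(prison.loads("(country:USA,year:2024)"))
# {'country': 'USA', 'year': 2024}

print(prison.loads("(status:!(active,pending),escalated:!t)"))
# {'status': ['active', 'pending'], 'escalated': True}

# Going the other way is handy when building URLs programmatically:
print(prison.dumps({"region": "North America"}))
# (region:'North America')
```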
## Why URL Filters?
URL filters allow you to:
- Share specific views of data with colleagues
- Bookmark frequently used filter combinations
- Create dynamic links in external applications
- Override saved dashboard states temporarily
- Build data-driven workflows
## Quick Start
Add the `f` parameter to any dashboard or explore URL:
```
/dashboard/123?f=(country:USA)
/explore?f=(year:2024)
```
## Basic Syntax
### Single Filter
Filter by a single value:
```
f=(country:USA)
```
### Multiple Filters (AND)
Combine multiple filters with commas (AND logic):
```
f=(country:USA,year:2024)
f=(status:active,department:Sales,region:North)
```
### Lists (IN Operator)
Use `!()` to filter by multiple values (OR within the field):
```
f=(country:!(USA,Canada)) # country IN ('USA', 'Canada')
f=(status:!(active,pending,review)) # status IN ('active', 'pending', 'review')
```
## Logical Operators
### NOT Operator
Exclude specific values:
```
f=(NOT:(country:USA)) # country != 'USA'
f=(NOT:(status:deleted)) # status != 'deleted'
```
Exclude multiple values (NOT IN):
```
f=(NOT:(country:!(USA,Canada))) # country NOT IN ('USA', 'Canada')
f=(NOT:(type:!(test,demo))) # type NOT IN ('test', 'demo')
```
### OR Operator
Create OR conditions across different fields:
```
f=(OR:!(status:urgent,priority:high)) # status = 'urgent' OR priority = 'high'
f=(OR:!(region:Europe,country:USA)) # region = 'Europe' OR country = 'USA'
```
## Comparison Operators
### Numeric Comparisons
Use comparison operators for numeric fields:
```
f=(sales:(gt:100000)) # sales > 100000
f=(age:(gte:18)) # age >= 18
f=(temperature:(lt:32)) # temperature < 32
f=(price:(lte:1000)) # price <= 1000
```
### Range Queries (BETWEEN)
Filter values within a range:
```
f=(date:(between:!(2024-01-01,2024-12-31))) # Full year 2024
f=(age:(between:!(25,65))) # Age 25 to 65 inclusive
f=(revenue:(between:!(10000,50000))) # Revenue range
```
### Text Matching (LIKE)
Use SQL LIKE patterns for text fields:
```
f=(name:(like:'John%')) # Names starting with John
f=(email:(like:'%@company.com')) # Company emails
f=(description:(like:'%urgent%')) # Contains 'urgent'
```
## Complex Examples
### E-commerce Dashboard
Show high-value orders from North America, excluding test accounts:
```
f=(region:!(USA,Canada,Mexico),amount:(gt:1000),NOT:(account_type:test))
```
### Sales Analytics
Q4 data for either VIP customers or high revenue:
```
f=(quarter:Q4,OR:!(customer_type:VIP,revenue:(gt:100000)))
```
### User Activity
Active users in specific departments, excluding contractors:
```
f=(status:active,department:!(Engineering,Sales),NOT:(employee_type:contractor))
```
## Integration with Existing Features
### With Permalinks
Override saved permalink state:
```
/dashboard/permalink/xyz789?f=(region:Europe)
```
The filter will override the region saved in the permalink while preserving other settings.
### With Form Data Keys
Apply filters on top of cached explore state:
```
/explore?form_data_key=abc123&f=(metric:(gt:baseline))
```
### With Embedded Dashboards
Filter embedded dashboards:
```
/dashboard/42/embedded?f=(client:ACME)
```
## Combining Everything
Here's a complex real-world example that combines multiple features:
```
f=(
year:2024,
quarter:!(Q3,Q4),
region:!(North,South),
NOT:(status:!(cancelled,refunded)),
revenue:(gt:50000),
OR:!(priority:urgent,escalated:!t)
)
```
This filters for:
- Year 2024
- Q3 or Q4
- North or South regions
- Excluding cancelled or refunded orders
- Revenue greater than $50,000
- And either urgent priority OR escalated flag is true
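Rather than hand-writing a filter this long, you can build it as a Python dict and let `prison` handle the Rison encoding and quoting. A sketch with placeholder column names and dashboard id:
```python
# Build a long `f=` parameter programmatically instead of by hand.
from urllib.parse import quote

import prison

filters = {
    "year": 2024,
    "quarter": ["Q3", "Q4"],                       # -> quarter:!(Q3,Q4)
    "NOT": {"status": ["cancelled", "refunded"]},  # -> NOT:(status:!(cancelled,refunded))
    "revenue": {"gt": 50000},                      # -> revenue:(gt:50000)
}

url = f"/dashboard/123?f={quote(prison.dumps(filters))}"
print(url)
```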
## Tips and Best Practices
### Keep It Simple
Start with basic filters and add complexity only when needed:
- ✅ Good: `f=(country:USA,year:2024)`
- ❌ Avoid: Complex nested logic when native filters would be clearer
### Use Lists for Same-Field OR
Instead of complex OR operators, use lists when filtering one field:
- ✅ Better: `f=(status:!(active,pending,review))`
- ❌ Avoid: `f=(OR:!(status:active,OR:!(status:pending,status:review)))`
### Quote Strings with Special Characters
Use single quotes for strings containing spaces or special characters:
- `f=(city:'New York')`
- `f=(name:'O''Brien')` # Escape single quotes by doubling
### Date Formats
Use ISO 8601 format for dates:
- `f=(date:2024-01-15)`
- `f=(created:(between:!(2024-01-01,2024-12-31)))`
## Limitations
- **Complex Boolean Logic**: For nested AND/OR combinations beyond what's shown here, use Superset's native filters
- **Column Names**: Must not conflict with reserved operators (OR, NOT)
- **URL Length**: Browsers have URL length limits; for very complex filters, use native filters
- **Special Characters**: Some characters may need URL encoding
## API Reference
### Logical Operators
| Operator | Syntax | Description |
|----------|--------|-------------|
| AND | `,` (comma) | Default between conditions |
| OR | `OR:!(...)` | Explicit OR across conditions |
| NOT | `NOT:(...)` | Negation |
### Comparison Operators
| Operator | Syntax | Example |
|----------|--------|---------|
| Equals | `:` | `country:USA` |
| IN | `:!(...)` | `country:!(USA,Canada)` |
| Greater Than | `(gt:n)` | `sales:(gt:1000)` |
| Greater Than or Equal | `(gte:n)` | `age:(gte:18)` |
| Less Than | `(lt:n)` | `temp:(lt:32)` |
| Less Than or Equal | `(lte:n)` | `price:(lte:100)` |
| BETWEEN | `(between:!(a,b))` | `date:(between:!(2024-01-01,2024-12-31))` |
| LIKE | `(like:pattern)` | `name:(like:'%smith%')` |
### Data Types
| Type | Example | Notes |
|------|---------|-------|
| String | `USA` or `'North America'` | Use quotes for spaces/special chars |
| Number | `42` or `3.14` | No quotes needed |
| Boolean | `!t` or `!f` | Rison boolean syntax |
| Null | `!n` | Rison null syntax |
| Array | `!(val1,val2)` | For IN operations |
## Troubleshooting
### Filters Not Working?
1. Check that you're using the `f=()` wrapper
2. Verify column names match exactly (case-sensitive)
3. Ensure proper Rison syntax (especially for arrays and objects)
4. Check browser console for error messages
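For step 3, a quick local check with `prison` (the same library Superset uses on the backend) can save a round trip to the browser — a small sketch:
```python
# Validate an `f=` value locally before putting it in a URL.
import prison

candidate = "(region:'North America',amount:(gt:1000))"
try:
    print(prison.loads(candidate))
except Exception as exc:  # prison raises a parse error on malformed Rison
    print(f"Invalid Rison: {exc}")
```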
### URL Too Long?
If your filter URL becomes too long (browsers typically limit URLs to ~2000 characters):
1. Use dashboard native filters for complex logic
2. Use the Superset Permalink API to store complex filter state and reference it with a key
3. Create a permalink with base filters, then add `f` parameter for variations
4. Consider splitting filters across multiple parameters (future feature)
### Special Characters Issues?
- **Spaces in values**: Use single quotes around the entire value
- Example: `f=(region:'North America')` for filtering on "North America"
- Example: `f=(city:'Los Angeles')` for filtering on "Los Angeles"
- **Single quotes in values**: Double them for escaping
- Example: `f=(name:'O''Brien')` for filtering on "O'Brien"
- **URL encoding**: The browser handles this automatically
- Spaces become `%20`, special chars are encoded as needed
- You type: `f=(region:'North America')`
- Browser sends: `f=(region:'North%20America')`
- **Multiple words**: Always quote multi-word values
- Correct: `f=(status:'In Progress')`
- Wrong: `f=(status:In Progress)` (will cause parse error)

View File

@@ -35,7 +35,7 @@
"@emotion/core": "^10.0.27",
"@emotion/react": "^11.13.3",
"@emotion/styled": "^10.0.27",
"@mdx-js/react": "^3.1.1",
"@mdx-js/react": "^3.1.0",
"@saucelabs/theme-github-codeblock": "^0.3.0",
"@storybook/addon-docs": "^8.6.11",
"@storybook/blocks": "^8.6.11",
@@ -50,7 +50,7 @@
"@storybook/theming": "^8.6.11",
"@superset-ui/core": "^0.20.4",
"antd": "^5.26.7",
"caniuse-lite": "^1.0.30001739",
"caniuse-lite": "^1.0.30001707",
"docusaurus-plugin-less": "^2.0.2",
"json-bigint": "^1.0.0",
"less": "^4.4.0",
@@ -65,7 +65,7 @@
"storybook": "^8.6.11",
"swagger-ui-react": "^5.27.1",
"tinycolor2": "^1.4.2",
"ts-loader": "^9.5.4"
"ts-loader": "^9.5.2"
},
"devDependencies": {
"@docusaurus/module-type-aliases": "^3.8.1",
@@ -73,14 +73,14 @@
"@eslint/js": "^9.32.0",
"@types/react": "^19.1.8",
"@typescript-eslint/eslint-plugin": "^8.37.0",
"@typescript-eslint/parser": "^8.42.0",
"eslint": "^9.34.0",
"@typescript-eslint/parser": "^8.37.0",
"eslint": "^9.32.0",
"eslint-config-prettier": "^10.1.8",
"eslint-plugin-prettier": "^5.5.3",
"eslint-plugin-react": "^7.37.5",
"globals": "^16.3.0",
"prettier": "^3.6.2",
"typescript": "~5.9.2",
"typescript": "~5.8.3",
"typescript-eslint": "^8.39.0",
"webpack": "^5.101.0"
},

View File

@@ -962,11 +962,8 @@
"ChartDataDatasource": {
"properties": {
"id": {
"description": "Datasource id/uuid",
"oneOf": [
{ "type": "integer" },
{ "type": "string" }
]
"description": "Datasource id",
"type": "integer"
},
"type": {
"description": "Datasource type",

View File

@@ -2433,10 +2433,10 @@
minimatch "^3.1.2"
strip-json-comments "^3.1.1"
"@eslint/js@9.34.0", "@eslint/js@^9.32.0":
version "9.34.0"
resolved "https://registry.yarnpkg.com/@eslint/js/-/js-9.34.0.tgz#fc423168b9d10e08dea9088d083788ec6442996b"
integrity sha512-EoyvqQnBNsV1CWaEJ559rxXL4c8V92gxirbawSmVUOWXlsRxxQXl6LmCpdUblgxgSkDIqKnhzba2SjRTI/A5Rw==
"@eslint/js@9.33.0", "@eslint/js@^9.32.0":
version "9.33.0"
resolved "https://registry.yarnpkg.com/@eslint/js/-/js-9.33.0.tgz#475c92fdddab59b8b8cab960e3de2564a44bf368"
integrity sha512-5K1/mKhWaMfreBGJTwval43JJmkip0RmM+3+IuqupeSKNC/Th2Kc7ucaq5ovTSra/OOKB9c58CGSz3QMVbWt0A==
"@eslint/object-schema@^2.1.6":
version "2.1.6"
@@ -2598,10 +2598,10 @@
unist-util-visit "^5.0.0"
vfile "^6.0.0"
"@mdx-js/react@^3.0.0", "@mdx-js/react@^3.1.1":
version "3.1.1"
resolved "https://registry.yarnpkg.com/@mdx-js/react/-/react-3.1.1.tgz#24bda7fffceb2fe256f954482123cda1be5f5fef"
integrity sha512-f++rKLQgUVYDAtECQ6fn/is15GkEH9+nZPM3MS0RcxVqoTfawHvDlSCH7JbMhAM6uJ32v3eXLvLmLvjGu7PTQw==
"@mdx-js/react@^3.0.0", "@mdx-js/react@^3.1.0":
version "3.1.0"
resolved "https://registry.yarnpkg.com/@mdx-js/react/-/react-3.1.0.tgz#c4522e335b3897b9a845db1dbdd2f966ae8fb0ed"
integrity sha512-QjHtSaoameoalGnKDT3FoIl4+9RwyTmo9ZJGBdLOks/YOiWHoRDI3PUwEzOE7kEmGcV3AFcp9K6dYu9rEuKLAQ==
dependencies:
"@types/mdx" "^2.0.0"
@@ -4102,7 +4102,7 @@
natural-compare "^1.4.0"
ts-api-utils "^2.1.0"
"@typescript-eslint/parser@8.40.0":
"@typescript-eslint/parser@8.40.0", "@typescript-eslint/parser@^8.37.0":
version "8.40.0"
resolved "https://registry.yarnpkg.com/@typescript-eslint/parser/-/parser-8.40.0.tgz#1bc9f3701ced29540eb76ff2d95ce0d52ddc7e69"
integrity sha512-jCNyAuXx8dr5KJMkecGmZ8KI61KBUhkCob+SD+C+I5+Y1FWI2Y3QmY4/cxMCC5WAsZqoEtEETVhUiUMIGCf6Bw==
@@ -4113,17 +4113,6 @@
"@typescript-eslint/visitor-keys" "8.40.0"
debug "^4.3.4"
"@typescript-eslint/parser@^8.42.0":
version "8.42.0"
resolved "https://registry.yarnpkg.com/@typescript-eslint/parser/-/parser-8.42.0.tgz#20ea66f4867981fb5bb62cbe1454250fc4a440ab"
integrity sha512-r1XG74QgShUgXph1BYseJ+KZd17bKQib/yF3SR+demvytiRXrwd12Blnz5eYGm8tXaeRdd4x88MlfwldHoudGg==
dependencies:
"@typescript-eslint/scope-manager" "8.42.0"
"@typescript-eslint/types" "8.42.0"
"@typescript-eslint/typescript-estree" "8.42.0"
"@typescript-eslint/visitor-keys" "8.42.0"
debug "^4.3.4"
"@typescript-eslint/project-service@8.40.0":
version "8.40.0"
resolved "https://registry.yarnpkg.com/@typescript-eslint/project-service/-/project-service-8.40.0.tgz#1b7ba6079ff580c3215882fe75a43e5d3ed166b9"
@@ -4133,15 +4122,6 @@
"@typescript-eslint/types" "^8.40.0"
debug "^4.3.4"
"@typescript-eslint/project-service@8.42.0":
version "8.42.0"
resolved "https://registry.yarnpkg.com/@typescript-eslint/project-service/-/project-service-8.42.0.tgz#636eb3418b6c42c98554dce884943708bf41a583"
integrity sha512-vfVpLHAhbPjilrabtOSNcUDmBboQNrJUiNAGoImkZKnMjs2TIcWG33s4Ds0wY3/50aZmTMqJa6PiwkwezaAklg==
dependencies:
"@typescript-eslint/tsconfig-utils" "^8.42.0"
"@typescript-eslint/types" "^8.42.0"
debug "^4.3.4"
"@typescript-eslint/scope-manager@8.40.0":
version "8.40.0"
resolved "https://registry.yarnpkg.com/@typescript-eslint/scope-manager/-/scope-manager-8.40.0.tgz#2fbfcc8643340d8cd692267e61548b946190be8a"
@@ -4150,24 +4130,11 @@
"@typescript-eslint/types" "8.40.0"
"@typescript-eslint/visitor-keys" "8.40.0"
"@typescript-eslint/scope-manager@8.42.0":
version "8.42.0"
resolved "https://registry.yarnpkg.com/@typescript-eslint/scope-manager/-/scope-manager-8.42.0.tgz#36016757bc85b46ea42bae47b61f9421eddedde3"
integrity sha512-51+x9o78NBAVgQzOPd17DkNTnIzJ8T/O2dmMBLoK9qbY0Gm52XJcdJcCl18ExBMiHo6jPMErUQWUv5RLE51zJw==
dependencies:
"@typescript-eslint/types" "8.42.0"
"@typescript-eslint/visitor-keys" "8.42.0"
"@typescript-eslint/tsconfig-utils@8.40.0", "@typescript-eslint/tsconfig-utils@^8.40.0":
version "8.40.0"
resolved "https://registry.yarnpkg.com/@typescript-eslint/tsconfig-utils/-/tsconfig-utils-8.40.0.tgz#8e8fdb9b988854aedd04abdde3239c4bdd2d26e4"
integrity sha512-jtMytmUaG9d/9kqSl/W3E3xaWESo4hFDxAIHGVW/WKKtQhesnRIJSAJO6XckluuJ6KDB5woD1EiqknriCtAmcw==
"@typescript-eslint/tsconfig-utils@8.42.0", "@typescript-eslint/tsconfig-utils@^8.42.0":
version "8.42.0"
resolved "https://registry.yarnpkg.com/@typescript-eslint/tsconfig-utils/-/tsconfig-utils-8.42.0.tgz#21a3e74396fd7443ff930bc41b27789ba7e9236e"
integrity sha512-kHeFUOdwAJfUmYKjR3CLgZSglGHjbNTi1H8sTYRYV2xX6eNz4RyJ2LIgsDLKf8Yi0/GL1WZAC/DgZBeBft8QAQ==
"@typescript-eslint/type-utils@8.40.0":
version "8.40.0"
resolved "https://registry.yarnpkg.com/@typescript-eslint/type-utils/-/type-utils-8.40.0.tgz#a7e4a1f0815dd0ba3e4eef945cc87193ca32c422"
@@ -4179,16 +4146,11 @@
debug "^4.3.4"
ts-api-utils "^2.1.0"
"@typescript-eslint/types@8.40.0":
"@typescript-eslint/types@8.40.0", "@typescript-eslint/types@^8.40.0":
version "8.40.0"
resolved "https://registry.yarnpkg.com/@typescript-eslint/types/-/types-8.40.0.tgz#0b580fdf643737aa5c01285314b5c6e9543846a9"
integrity sha512-ETdbFlgbAmXHyFPwqUIYrfc12ArvpBhEVgGAxVYSwli26dn8Ko+lIo4Su9vI9ykTZdJn+vJprs/0eZU0YMAEQg==
"@typescript-eslint/types@8.42.0", "@typescript-eslint/types@^8.40.0", "@typescript-eslint/types@^8.42.0":
version "8.42.0"
resolved "https://registry.yarnpkg.com/@typescript-eslint/types/-/types-8.42.0.tgz#ae15c09cebda20473772902033328e87372db008"
integrity sha512-LdtAWMiFmbRLNP7JNeY0SqEtJvGMYSzfiWBSmx+VSZ1CH+1zyl8Mmw1TT39OrtsRvIYShjJWzTDMPWZJCpwBlw==
"@typescript-eslint/typescript-estree@8.40.0":
version "8.40.0"
resolved "https://registry.yarnpkg.com/@typescript-eslint/typescript-estree/-/typescript-estree-8.40.0.tgz#295149440ce7da81c790a4e14e327599a3a1e5c9"
@@ -4205,22 +4167,6 @@
semver "^7.6.0"
ts-api-utils "^2.1.0"
"@typescript-eslint/typescript-estree@8.42.0":
version "8.42.0"
resolved "https://registry.yarnpkg.com/@typescript-eslint/typescript-estree/-/typescript-estree-8.42.0.tgz#593c3af87d4462252c0d7239d1720b84a1b56864"
integrity sha512-ku/uYtT4QXY8sl9EDJETD27o3Ewdi72hcXg1ah/kkUgBvAYHLwj2ofswFFNXS+FL5G+AGkxBtvGt8pFBHKlHsQ==
dependencies:
"@typescript-eslint/project-service" "8.42.0"
"@typescript-eslint/tsconfig-utils" "8.42.0"
"@typescript-eslint/types" "8.42.0"
"@typescript-eslint/visitor-keys" "8.42.0"
debug "^4.3.4"
fast-glob "^3.3.2"
is-glob "^4.0.3"
minimatch "^9.0.4"
semver "^7.6.0"
ts-api-utils "^2.1.0"
"@typescript-eslint/utils@8.40.0":
version "8.40.0"
resolved "https://registry.yarnpkg.com/@typescript-eslint/utils/-/utils-8.40.0.tgz#8d0c6430ed2f5dc350784bb0d8be514da1e54054"
@@ -4239,14 +4185,6 @@
"@typescript-eslint/types" "8.40.0"
eslint-visitor-keys "^4.2.1"
"@typescript-eslint/visitor-keys@8.42.0":
version "8.42.0"
resolved "https://registry.yarnpkg.com/@typescript-eslint/visitor-keys/-/visitor-keys-8.42.0.tgz#87c6caaa1ac307bc73a87c1fc469f88f0162f27e"
integrity sha512-3WbiuzoEowaEn8RSnhJBrxSwX8ULYE9CXaPepS2C2W3NSA5NNIvBaslpBSBElPq0UGr0xVJlXFWOAKIkyylydQ==
dependencies:
"@typescript-eslint/types" "8.42.0"
eslint-visitor-keys "^4.2.1"
"@ungap/structured-clone@^1.0.0":
version "1.3.0"
resolved "https://registry.yarnpkg.com/@ungap/structured-clone/-/structured-clone-1.3.0.tgz#d06bbb384ebcf6c505fde1c3d0ed4ddffe0aaff8"
@@ -4766,9 +4704,9 @@ available-typed-arrays@^1.0.7:
possible-typed-array-names "^1.0.0"
axios@^1.9.0:
version "1.12.0"
resolved "https://registry.yarnpkg.com/axios/-/axios-1.12.0.tgz#11248459be05a5ee493485628fa0e4323d0abfc3"
integrity sha512-oXTDccv8PcfjZmPGlWsPSwtOJCZ/b6W5jAMCNcfwJbCzDckwG0jrYJFaWH1yvivfCXjVzV/SPDEhMB3Q+DSurg==
version "1.11.0"
resolved "https://registry.yarnpkg.com/axios/-/axios-1.11.0.tgz#c2ec219e35e414c025b2095e8b8280278478fdb6"
integrity sha512-1Lx3WLFQWm3ooKDYZD1eXmoGO9fxYQjrycfHFC8P0sCfQVXyROp0p9PFWBehewBOdCwHc+f/b8I0fMto5eSfwA==
dependencies:
follow-redirects "^1.15.6"
form-data "^4.0.4"
@@ -5082,10 +5020,10 @@ caniuse-api@^3.0.0:
lodash.memoize "^4.1.2"
lodash.uniq "^4.5.0"
caniuse-lite@^1.0.0, caniuse-lite@^1.0.30001702, caniuse-lite@^1.0.30001735, caniuse-lite@^1.0.30001739:
version "1.0.30001739"
resolved "https://registry.yarnpkg.com/caniuse-lite/-/caniuse-lite-1.0.30001739.tgz#b34ce2d56bfc22f4352b2af0144102d623a124f4"
integrity sha512-y+j60d6ulelrNSwpPyrHdl+9mJnQzHBr08xm48Qno0nSk4h3Qojh+ziv2qE6rXf4k3tadF4o1J/1tAbVm1NtnA==
caniuse-lite@^1.0.0, caniuse-lite@^1.0.30001702, caniuse-lite@^1.0.30001707, caniuse-lite@^1.0.30001735:
version "1.0.30001735"
resolved "https://registry.yarnpkg.com/caniuse-lite/-/caniuse-lite-1.0.30001735.tgz#ba658fd3fd24a4106fd68d5ce472a2c251494dbe"
integrity sha512-EV/laoX7Wq2J9TQlyIXRxTJqIw4sxfXS4OYgudGxBYRuTv0q7AM6yMEpU/Vo1I94thg9U6EZ2NfZx9GJq83u7w==
ccount@^2.0.0:
version "2.0.1"
@@ -6775,10 +6713,10 @@ eslint-visitor-keys@^4.2.1:
resolved "https://registry.yarnpkg.com/eslint-visitor-keys/-/eslint-visitor-keys-4.2.1.tgz#4cfea60fe7dd0ad8e816e1ed026c1d5251b512c1"
integrity sha512-Uhdk5sfqcee/9H/rCOJikYz67o0a2Tw2hGRPOG2Y1R2dg7brRe1uG0yaNQDHu+TO/uQPF/5eCapvYSmHUjt7JQ==
eslint@^9.34.0:
version "9.34.0"
resolved "https://registry.yarnpkg.com/eslint/-/eslint-9.34.0.tgz#0ea1f2c1b5d1671db8f01aa6b8ce722302016f7b"
integrity sha512-RNCHRX5EwdrESy3Jc9o8ie8Bog+PeYvvSR8sDGoZxNFTvZ4dlxUB3WzQ3bQMztFrSRODGrLLj8g6OFuGY/aiQg==
eslint@^9.32.0:
version "9.33.0"
resolved "https://registry.yarnpkg.com/eslint/-/eslint-9.33.0.tgz#cc186b3d9eb0e914539953d6a178a5b413997b73"
integrity sha512-TS9bTNIryDzStCpJN93aC5VRSW3uTx9sClUn4B87pwiCaJh220otoI0X8mJKr+VcPtniMdN8GKjlwgWGUv5ZKA==
dependencies:
"@eslint-community/eslint-utils" "^4.2.0"
"@eslint-community/regexpp" "^4.12.1"
@@ -6786,7 +6724,7 @@ eslint@^9.34.0:
"@eslint/config-helpers" "^0.3.1"
"@eslint/core" "^0.15.2"
"@eslint/eslintrc" "^3.3.1"
"@eslint/js" "9.34.0"
"@eslint/js" "9.33.0"
"@eslint/plugin-kit" "^0.3.5"
"@humanfs/node" "^0.16.6"
"@humanwhocodes/module-importer" "^1.0.1"
@@ -13201,10 +13139,10 @@ ts-dedent@^2.0.0, ts-dedent@^2.2.0:
resolved "https://registry.yarnpkg.com/ts-dedent/-/ts-dedent-2.2.0.tgz#39e4bd297cd036292ae2394eb3412be63f563bb5"
integrity sha512-q5W7tVM71e2xjHZTlgfTDoPF/SmqKG5hddq9SzR49CH2hayqRKJtQ4mtRlSxKaJlR/+9rEM+mnBHf7I2/BQcpQ==
ts-loader@^9.5.4:
version "9.5.4"
resolved "https://registry.yarnpkg.com/ts-loader/-/ts-loader-9.5.4.tgz#44b571165c10fb5a90744aa5b7e119233c4f4585"
integrity sha512-nCz0rEwunlTZiy6rXFByQU1kVVpCIgUpc/psFiKVrUwrizdnIbRFu8w7bxhUF0X613DYwT4XzrZHpVyMe758hQ==
ts-loader@^9.5.2:
version "9.5.2"
resolved "https://registry.yarnpkg.com/ts-loader/-/ts-loader-9.5.2.tgz#1f3d7f4bb709b487aaa260e8f19b301635d08020"
integrity sha512-Qo4piXvOTWcMGIgRiuFa6nHNm+54HbYaZCKqc9eeZCLRy3XqafQgwX2F7mofrbJG3g7EEb+lkiR+z2Lic2s3Zw==
dependencies:
chalk "^4.1.0"
enhanced-resolve "^5.0.0"
@@ -13331,10 +13269,10 @@ typescript-eslint@^8.39.0:
"@typescript-eslint/typescript-estree" "8.40.0"
"@typescript-eslint/utils" "8.40.0"
typescript@~5.9.2:
version "5.9.2"
resolved "https://registry.yarnpkg.com/typescript/-/typescript-5.9.2.tgz#d93450cddec5154a2d5cabe3b8102b83316fb2a6"
integrity sha512-CWBzXQrc/qOkhidw1OzBTQuYRbfyxDXJMVJ1XNwUHGROVmuaeiEm3OslpZ1RV96d7SKKjZKrSJu3+t/xlw3R9A==
typescript@~5.8.3:
version "5.8.3"
resolved "https://registry.yarnpkg.com/typescript/-/typescript-5.8.3.tgz#92f8a3e5e3cf497356f4178c34cd65a7f5e8440e"
integrity sha512-p1diW6TqL9L07nNxvRMM7hMMw4c5XOo/1ibL4aAIGmSAt9slTE1Xgw5KWuof2uTOvCg9BY7ZRi+GaF+7sfgPeQ==
ufo@^1.5.4:
version "1.6.1"

View File

@@ -35,8 +35,7 @@ classifiers = [
"Programming Language :: Python :: 3.12",
]
dependencies = [
# no bounds for apache-superset-core until we have a stable version
"apache-superset-core",
"apache-superset-core>=0.0.1, <0.2",
"backoff>=1.8.0",
"celery>=5.3.6, <6.0.0",
"click>=8.0.3",
@@ -48,7 +47,7 @@ dependencies = [
"cryptography>=42.0.4, <45.0.0",
"deprecation>=2.1.0, <2.2.0",
"flask>=2.2.5, <3.0.0",
"flask-appbuilder>=5.0.0,<6",
"flask-appbuilder>=4.8.1, <5.0.0",
"flask-caching>=2.1.0, <3",
"flask-compress>=1.13, <2.0",
"flask-talisman>=1.0.0, <2.0",
@@ -69,14 +68,13 @@ dependencies = [
"markdown>=3.0",
# marshmallow>=4 has issues: https://github.com/apache/superset/issues/33162
"marshmallow>=3.0, <4",
"marshmallow-union>=0.1",
"msgpack>=1.0.0, <1.1",
"nh3>=0.2.11, <0.3",
"numpy>1.23.5, <2.3",
"packaging",
# --------------------------
# pandas and related (wanting pandas[performance] without numba as it's 100+MB and not needed)
"pandas[excel]>=2.0.3, <2.2",
"pandas[excel]>=2.0.3, <2.1",
"bottleneck", # recommended performance dependency for pandas, see https://pandas.pydata.org/docs/getting_started/install.html#performance-dependencies-recommended
# --------------------------
"parsedatetime",
@@ -84,12 +82,11 @@ dependencies = [
"pgsanity",
"Pillow>=11.0.0, <12",
"polyline>=2.0.0, <3.0",
"pydantic>=2.8.0",
"pyparsing>=3.0.6, <4",
"python-dateutil",
"python-dotenv", # optional dependencies for Flask but required for Superset, see https://flask.palletsprojects.com/en/stable/installation/#optional-dependencies
"python-geohash",
"pyarrow>=16.1.0, <19", # before upgrading pyarrow, check that all db dependencies support this, see e.g. https://github.com/apache/superset/pull/34693
"pyarrow>=16.1.0, <17", # before upgrading pyarrow, check that all db dependencies support this, see e.g. https://github.com/apache/superset/pull/34693
"pyyaml>=6.0.0, <7.0.0",
"PyJWT>=2.4.0, <3.0",
"redis>=4.6.0, <5.0",
@@ -125,8 +122,8 @@ cockroachdb = ["cockroachdb>=0.3.5, <0.4"]
crate = ["sqlalchemy-cratedb>=0.40.1, <1"]
databend = ["databend-sqlalchemy>=0.3.2, <1.0"]
databricks = [
"databricks-sql-connector==4.1.2",
"databricks-sqlalchemy==1.0.5",
"databricks-sql-connector>=2.0.2, <3",
"sqlalchemy-databricks>=0.2.0",
]
db2 = ["ibm-db-sa>0.3.8, <=0.4.0"]
denodo = ["denodo-sqlalchemy~=1.0.6"]
@@ -195,8 +192,7 @@ doris = ["pydoris>=1.0.0, <2.0.0"]
oceanbase = ["oceanbase_py>=0.0.1"]
ydb = ["ydb-sqlalchemy>=0.1.2"]
development = [
# no bounds for apache-superset-extensions-cli until a stable version
"apache-superset-extensions-cli",
"apache-superset-cli>=0.0.1, <0.2",
"docker",
"flask-testing",
"freezegun",
@@ -228,8 +224,8 @@ documentation = "https://superset.apache.org/docs/intro"
combine_as_imports = true
include_trailing_comma = true
line_length = 88
known_first_party = "superset, apache-superset-core, apache-superset-extensions-cli"
known_third_party = "alembic, apispec, backoff, celery, click, colorama, cron_descriptor, croniter, cryptography, dateutil, deprecation, flask, flask_appbuilder, flask_babel, flask_caching, flask_compress, flask_jwt_extended, flask_login, flask_migrate, flask_sqlalchemy, flask_talisman, flask_testing, flask_wtf, freezegun, geohash, geopy, holidays, humanize, isodate, jinja2, jwt, markdown, markupsafe, marshmallow, marshmallow-union, msgpack, nh3, numpy, pandas, parameterized, parsedatetime, pgsanity, polyline, prison, progress, pyarrow, sqlalchemy_bigquery, pyhive, pyparsing, pytest, pytest_mock, pytz, redis, requests, selenium, setuptools, shillelagh, simplejson, slack, sqlalchemy, sqlalchemy_utils, typing_extensions, urllib3, werkzeug, wtforms, wtforms_json, yaml"
known_first_party = "superset, apache-superset-core, apache-superset-cli"
known_third_party = "alembic, apispec, backoff, celery, click, colorama, cron_descriptor, croniter, cryptography, dateutil, deprecation, flask, flask_appbuilder, flask_babel, flask_caching, flask_compress, flask_jwt_extended, flask_login, flask_migrate, flask_sqlalchemy, flask_talisman, flask_testing, flask_wtf, freezegun, geohash, geopy, holidays, humanize, isodate, jinja2, jwt, markdown, markupsafe, marshmallow, msgpack, nh3, numpy, pandas, parameterized, parsedatetime, pgsanity, polyline, prison, progress, pyarrow, sqlalchemy_bigquery, pyhive, pyparsing, pytest, pytest_mock, pytz, redis, requests, selenium, setuptools, shillelagh, simplejson, slack, sqlalchemy, sqlalchemy_utils, typing_extensions, urllib3, werkzeug, wtforms, wtforms_json, yaml"
multi_line_output = 3
order_by_type = false
@@ -314,7 +310,6 @@ select = [
"E",
"F",
"F",
"G",
"I",
"N",
"PT",
@@ -330,7 +325,6 @@ ignore = [
"PT006",
"T201",
"N999",
"G201",
]
extend-select = ["I"]
@@ -436,4 +430,4 @@ pyxlsb = "1" # GPL
[tool.uv.sources]
apache-superset-core = { path = "./superset-core", editable = true }
apache-superset-extensions-cli = { path = "./superset-extensions-cli", editable = true }
apache-superset-cli = { path = "./superset-cli", editable = true }

View File

@@ -6,8 +6,6 @@ alembic==1.15.2
# via flask-migrate
amqp==5.3.1
# via kombu
annotated-types==0.7.0
# via pydantic
apispec==6.6.1
# via
# -r requirements/base.in
@@ -116,11 +114,11 @@ flask==2.3.3
# flask-session
# flask-sqlalchemy
# flask-wtf
flask-appbuilder==5.0.0
flask-appbuilder==4.8.1
# via
# apache-superset (pyproject.toml)
# apache-superset-core
flask-babel==3.1.0
flask-babel==2.0.0
# via flask-appbuilder
flask-caching==2.3.1
# via apache-superset (pyproject.toml)
@@ -160,7 +158,6 @@ greenlet==3.1.1
# via
# apache-superset (pyproject.toml)
# shillelagh
# sqlalchemy
gunicorn==23.0.0
# via apache-superset (pyproject.toml)
h11==0.16.0
@@ -222,13 +219,10 @@ marshmallow==3.26.1
# apache-superset (pyproject.toml)
# flask-appbuilder
# marshmallow-sqlalchemy
# marshmallow-union
marshmallow-sqlalchemy==1.4.0
# via
# -r requirements/base.in
# flask-appbuilder
marshmallow-union==0.1.15
# via apache-superset (pyproject.toml)
mdurl==0.1.2
# via markdown-it-py
msgpack==1.0.8
@@ -267,7 +261,7 @@ packaging==25.0
# limits
# marshmallow
# shillelagh
pandas==2.1.4
pandas==2.0.3
# via apache-superset (pyproject.toml)
paramiko==3.5.1
# via
@@ -299,10 +293,6 @@ pyasn1-modules==0.4.2
# via google-auth
pycparser==2.22
# via cffi
pydantic==2.11.7
# via apache-superset (pyproject.toml)
pydantic-core==2.33.2
# via pydantic
pygments==2.19.1
# via rich
pyjwt==2.10.1
@@ -413,15 +403,10 @@ typing-extensions==4.14.0
# alembic
# cattrs
# limits
# pydantic
# pydantic-core
# pyopenssl
# referencing
# selenium
# shillelagh
# typing-inspection
typing-inspection==0.4.1
# via pydantic
tzdata==2025.2
# via
# kombu

View File

@@ -17,4 +17,4 @@
# under the License.
#
-e .[development,bigquery,druid,duckdb,gevent,gsheets,mysql,postgres,presto,prophet,trino,thumbnails]
-e ./superset-extensions-cli[test]
-e ./superset-cli[test]

View File

@@ -2,14 +2,14 @@
# uv pip compile requirements/development.in -c requirements/base-constraint.txt -o requirements/development.txt
-e .
# via -r requirements/development.in
-e ./superset-core
# via
# apache-superset
# apache-superset-extensions-cli
-e ./superset-extensions-cli
-e ./superset-cli
# via
# -r requirements/development.in
# apache-superset
-e ./superset-core
# via
# apache-superset
# apache-superset-cli
alembic==1.15.2
# via
# -c requirements/base-constraint.txt
@@ -18,10 +18,6 @@ amqp==5.3.1
# via
# -c requirements/base-constraint.txt
# kombu
annotated-types==0.7.0
# via
# -c requirements/base-constraint.txt
# pydantic
apispec==6.6.1
# via
# -c requirements/base-constraint.txt
@@ -106,7 +102,7 @@ click==8.2.1
# via
# -c requirements/base-constraint.txt
# apache-superset
# apache-superset-extensions-cli
# apache-superset-cli
# celery
# click-didyoumean
# click-option-group
@@ -212,12 +208,12 @@ flask==2.3.3
# flask-sqlalchemy
# flask-testing
# flask-wtf
flask-appbuilder==5.0.0
flask-appbuilder==4.8.1
# via
# -c requirements/base-constraint.txt
# apache-superset
# apache-superset-core
flask-babel==3.1.0
flask-babel==2.0.0
# via
# -c requirements/base-constraint.txt
# flask-appbuilder
@@ -331,7 +327,6 @@ greenlet==3.1.1
# apache-superset
# gevent
# shillelagh
# sqlalchemy
grpcio==1.71.0
# via
# apache-superset
@@ -387,7 +382,7 @@ itsdangerous==2.2.0
jinja2==3.1.6
# via
# -c requirements/base-constraint.txt
# apache-superset-extensions-cli
# apache-superset-cli
# flask
# flask-babel
jsonpath-ng==1.7.0
@@ -449,15 +444,10 @@ marshmallow==3.26.1
# apache-superset
# flask-appbuilder
# marshmallow-sqlalchemy
# marshmallow-union
marshmallow-sqlalchemy==1.4.0
# via
# -c requirements/base-constraint.txt
# flask-appbuilder
marshmallow-union==0.1.15
# via
# -c requirements/base-constraint.txt
# apache-superset
matplotlib==3.9.0
# via prophet
mccabe==0.7.0
@@ -537,7 +527,7 @@ packaging==25.0
# pytest
# shillelagh
# sqlalchemy-bigquery
pandas==2.1.4
pandas==2.0.3
# via
# -c requirements/base-constraint.txt
# apache-superset
@@ -637,14 +627,6 @@ pycparser==2.22
# via
# -c requirements/base-constraint.txt
# cffi
pydantic==2.11.7
# via
# -c requirements/base-constraint.txt
# apache-superset
pydantic-core==2.33.2
# via
# -c requirements/base-constraint.txt
# pydantic
pydata-google-auth==1.9.0
# via pandas-gbq
pydruid==0.6.9
@@ -687,17 +669,17 @@ pysocks==1.7.1
pytest==7.4.4
# via
# apache-superset
# apache-superset-extensions-cli
# apache-superset-cli
# pytest-cov
# pytest-mock
pytest-cov==6.0.0
# via
# apache-superset
# apache-superset-extensions-cli
# apache-superset-cli
pytest-mock==3.10.0
# via
# apache-superset
# apache-superset-extensions-cli
# apache-superset-cli
python-dateutil==2.9.0.post0
# via
# -c requirements/base-constraint.txt
@@ -794,7 +776,7 @@ selenium==4.32.0
# -c requirements/base-constraint.txt
# apache-superset
semver==3.0.4
# via apache-superset-extensions-cli
# via apache-superset-cli
setuptools==80.7.1
# via
# nodeenv
@@ -888,17 +870,10 @@ typing-extensions==4.14.0
# apache-superset
# cattrs
# limits
# pydantic
# pydantic-core
# pyopenssl
# referencing
# selenium
# shillelagh
# typing-inspection
typing-inspection==0.4.1
# via
# -c requirements/base-constraint.txt
# pydantic
tzdata==2025.2
# via
# -c requirements/base-constraint.txt
@@ -929,7 +904,7 @@ watchdog==6.0.0
# via
# -c requirements/base-constraint.txt
# apache-superset
# apache-superset-extensions-cli
# apache-superset-cli
wcwidth==0.2.13
# via
# -c requirements/base-constraint.txt

View File

@@ -45,9 +45,9 @@ PATTERNS = {
"docs": [
r"^docs/",
],
"superset-extensions-cli": [
r"^\.github/workflows/superset-extensions-cli\.yml",
r"^superset-extensions-cli/",
"superset-cli": [
r"^\.github/workflows/superset-cli\.yml",
r"^superset-cli/",
r"^superset-core/",
],
}

View File

@@ -17,6 +17,6 @@ specific language governing permissions and limitations
under the License.
-->
## Change Log
# Apache Superset SDK
Changelogs will be added once we have the first stable release.
This is an SDK tool used for bundling Apache Superset extensions.

View File

@@ -16,35 +16,20 @@
# under the License.
[project]
name = "apache-superset-extensions-cli"
version = "0.0.1rc2"
description = "Official command-line interface for building, bundling, and managing Apache Superset extensions"
readme = "README.md"
name = "apache-superset-cli"
version = "0.0.1"
description = "SDK to build Apache Superset extensions"
authors = [
{ name = "Apache Software Foundation", email = "dev@superset.apache.org" },
]
license = { file="LICENSE.txt" }
requires-python = ">=3.10"
keywords = ["superset", "apache", "cli", "extensions", "analytics", "business-intelligence", "development-tools"]
classifiers = [
"Development Status :: 3 - Alpha",
"Environment :: Console",
"Intended Audience :: Developers",
"License :: OSI Approved :: Apache Software License",
"Operating System :: OS Independent",
"Programming Language :: Python :: 3",
"Programming Language :: Python :: 3.10",
"Programming Language :: Python :: 3.11",
"Programming Language :: Python :: 3.12",
"Topic :: Database",
"Topic :: Scientific/Engineering :: Visualization",
"Topic :: Software Development :: Build Tools",
"Topic :: Software Development :: Libraries :: Python Modules",
"Topic :: System :: Software Distribution",
]
dependencies = [
# no bounds for apache-superset-core until we have a stable version
"apache-superset-core",
"apache-superset-core>=0.0.1, <0.2",
"click>=8.0.3",
"jinja2>=3.1.6",
"semver>=3.0.4",
@@ -52,13 +37,6 @@ dependencies = [
"watchdog>=6.0.0",
]
[project.urls]
Homepage = "https://superset.apache.org/"
Documentation = "https://superset.apache.org/docs/"
Repository = "https://github.com/apache/superset"
"Bug Tracker" = "https://github.com/apache/superset/issues"
Changelog = "https://github.com/apache/superset/blob/master/CHANGELOG.md"
[project.optional-dependencies]
test = [
"pytest",
@@ -71,17 +49,11 @@ requires = ["setuptools>=76.0.0", "wheel"]
build-backend = "setuptools.build_meta"
[tool.setuptools]
packages = ["superset_cli"]
package-dir = { "" = "src" }
include-package-data = true
[tool.setuptools.packages.find]
where = ["src"]
[tool.setuptools.package-data]
superset_extensions_cli = ["templates/**/*"]
[project.scripts]
superset-extensions = "superset_extensions_cli.cli:app"
superset-extensions = "superset_cli.cli:app"
[tool.pytest.ini_options]
testpaths = ["tests"]
@@ -92,7 +64,7 @@ addopts = [
"--strict-markers",
"--strict-config",
"--verbose",
"--cov=superset_extensions_cli",
"--cov=superset_cli",
"--cov-report=term-missing",
"--cov-report=html:htmlcov"
]
@@ -104,7 +76,7 @@ markers = [
]
[tool.coverage.run]
source = ["src/superset_extensions_cli"]
source = ["src/superset_cli"]
omit = ["*/tests/*", "*/test_*"]
[tool.coverage.report]
@@ -122,4 +94,4 @@ exclude_lines = [
]
[tool.ruff.lint.per-file-ignores]
"src/superset_extensions_cli/*" = ["TID251"]
"src/superset_cli/*" = ["TID251"]

View File

@@ -32,8 +32,8 @@ from superset_core.extensions.types import Manifest, Metadata
from watchdog.events import FileSystemEventHandler
from watchdog.observers import Observer
from superset_extensions_cli.constants import MIN_NPM_VERSION
from superset_extensions_cli.utils import read_json, read_toml
from superset_cli.constants import MIN_NPM_VERSION
from superset_cli.utils import read_json, read_toml
REMOTE_ENTRY_REGEX = re.compile(r"^remoteEntry\..+\.js$")
FRONTEND_DIST_REGEX = re.compile(r"/frontend/dist")

View File

@@ -51,7 +51,7 @@ under the License.
# Superset CLI Tests
This directory contains tests for the superset-extensions-cli package, focusing on the `init` command and other CLI functionality.
This directory contains tests for the superset-cli package, focusing on the `init` command and other CLI functionality.
## Test Structure
@@ -164,7 +164,7 @@ pytest -m cli # CLI tests only
### With coverage
```bash
pytest --cov=superset_extensions_cli --cov-report=html
pytest --cov=superset_cli --cov-report=html
```
### Specific test files

View File

@@ -21,7 +21,7 @@ import json
from unittest.mock import Mock, patch
import pytest
from superset_extensions_cli.cli import (
from superset_cli.cli import (
app,
build_manifest,
clean_dist,
@@ -79,11 +79,11 @@ def extension_with_build_structure():
# Build Command Tests
@pytest.mark.cli
@patch("superset_extensions_cli.cli.validate_npm")
@patch("superset_extensions_cli.cli.init_frontend_deps")
@patch("superset_extensions_cli.cli.rebuild_frontend")
@patch("superset_extensions_cli.cli.rebuild_backend")
@patch("superset_extensions_cli.cli.read_toml")
@patch("superset_cli.cli.validate_npm")
@patch("superset_cli.cli.init_frontend_deps")
@patch("superset_cli.cli.rebuild_frontend")
@patch("superset_cli.cli.rebuild_backend")
@patch("superset_cli.cli.read_toml")
def test_build_command_success_flow(
mock_read_toml,
mock_rebuild_backend,
@@ -115,9 +115,9 @@ def test_build_command_success_flow(
@pytest.mark.cli
@patch("superset_extensions_cli.cli.validate_npm")
@patch("superset_extensions_cli.cli.init_frontend_deps")
@patch("superset_extensions_cli.cli.rebuild_frontend")
@patch("superset_cli.cli.validate_npm")
@patch("superset_cli.cli.init_frontend_deps")
@patch("superset_cli.cli.rebuild_frontend")
def test_build_command_handles_frontend_build_failure(
mock_rebuild_frontend,
mock_init_frontend_deps,
@@ -187,7 +187,7 @@ def test_init_frontend_deps_skips_when_node_modules_exists(
@pytest.mark.unit
@patch("subprocess.run")
@patch("superset_extensions_cli.cli.validate_npm")
@patch("superset_cli.cli.validate_npm")
def test_init_frontend_deps_runs_npm_i_when_missing(
mock_validate_npm, mock_run, isolated_filesystem
):
@@ -207,7 +207,7 @@ def test_init_frontend_deps_runs_npm_i_when_missing(
@pytest.mark.unit
@patch("subprocess.run")
@patch("superset_extensions_cli.cli.validate_npm")
@patch("superset_cli.cli.validate_npm")
def test_init_frontend_deps_exits_on_npm_ci_failure(
mock_validate_npm, mock_run, isolated_filesystem
):
@@ -303,7 +303,7 @@ def test_build_manifest_exits_when_extension_json_missing(isolated_filesystem):
@pytest.mark.unit
def test_clean_dist_frontend_removes_frontend_dist(isolated_filesystem):
"""Test clean_dist_frontend removes frontend/dist directory specifically."""
from superset_extensions_cli.cli import clean_dist_frontend
from superset_cli.cli import clean_dist_frontend
# Create dist/frontend structure
dist_dir = isolated_filesystem / "dist"
@@ -322,7 +322,7 @@ def test_clean_dist_frontend_removes_frontend_dist(isolated_filesystem):
@pytest.mark.unit
def test_clean_dist_frontend_handles_nonexistent_directory(isolated_filesystem):
"""Test clean_dist_frontend handles case where frontend dist doesn't exist."""
from superset_extensions_cli.cli import clean_dist_frontend
from superset_cli.cli import clean_dist_frontend
# No dist directory exists
clean_dist_frontend(isolated_filesystem)
@@ -333,7 +333,7 @@ def test_clean_dist_frontend_handles_nonexistent_directory(isolated_filesystem):
@pytest.mark.unit
def test_run_frontend_build_with_output_messages(isolated_filesystem):
"""Test run_frontend_build produces expected output messages."""
from superset_extensions_cli.cli import run_frontend_build
from superset_cli.cli import run_frontend_build
frontend_dir = isolated_filesystem / "frontend"
frontend_dir.mkdir()
@@ -362,7 +362,7 @@ def test_rebuild_frontend_handles_build_results(
isolated_filesystem, return_code, expected_result
):
"""Test rebuild_frontend handles different build results."""
from superset_extensions_cli.cli import rebuild_frontend
from superset_cli.cli import rebuild_frontend
# Create frontend structure
frontend_dir = isolated_filesystem / "frontend"
@@ -378,7 +378,7 @@ def test_rebuild_frontend_handles_build_results(
dist_dir = isolated_filesystem / "dist"
dist_dir.mkdir()
with patch("superset_extensions_cli.cli.run_frontend_build") as mock_build:
with patch("superset_cli.cli.run_frontend_build") as mock_build:
mock_build.return_value = Mock(returncode=return_code)
result = rebuild_frontend(isolated_filesystem, frontend_dir)
@@ -390,7 +390,7 @@ def test_rebuild_frontend_handles_build_results(
@pytest.mark.unit
def test_rebuild_backend_calls_copy_and_shows_message(isolated_filesystem):
"""Test rebuild_backend calls copy_backend_files and shows success message."""
from superset_extensions_cli.cli import rebuild_backend
from superset_cli.cli import rebuild_backend
# Create extension.json
extension_json = {
@@ -401,7 +401,7 @@ def test_rebuild_backend_calls_copy_and_shows_message(isolated_filesystem):
}
(isolated_filesystem / "extension.json").write_text(json.dumps(extension_json))
with patch("superset_extensions_cli.cli.copy_backend_files") as mock_copy:
with patch("superset_cli.cli.copy_backend_files") as mock_copy:
rebuild_backend(isolated_filesystem)
mock_copy.assert_called_once_with(isolated_filesystem)

@@ -22,14 +22,14 @@ import zipfile
from unittest.mock import patch
import pytest
from superset_extensions_cli.cli import app
from superset_cli.cli import app
from tests.utils import assert_file_exists
# Bundle Command Tests
@pytest.mark.cli
@patch("superset_extensions_cli.cli.build")
@patch("superset_cli.cli.build")
def test_bundle_command_creates_zip_with_default_name(
mock_build, cli_runner, isolated_filesystem, extension_setup_for_bundling
):
@@ -59,7 +59,7 @@ def test_bundle_command_creates_zip_with_default_name(
@pytest.mark.cli
@patch("superset_extensions_cli.cli.build")
@patch("superset_cli.cli.build")
def test_bundle_command_with_custom_output_filename(
mock_build, cli_runner, isolated_filesystem, extension_setup_for_bundling
):
@@ -81,7 +81,7 @@ def test_bundle_command_with_custom_output_filename(
@pytest.mark.cli
@patch("superset_extensions_cli.cli.build")
@patch("superset_cli.cli.build")
def test_bundle_command_with_output_directory(
mock_build, cli_runner, isolated_filesystem, extension_setup_for_bundling
):
@@ -106,7 +106,7 @@ def test_bundle_command_with_output_directory(
@pytest.mark.cli
@patch("superset_extensions_cli.cli.build")
@patch("superset_cli.cli.build")
def test_bundle_command_fails_without_manifest(
mock_build, cli_runner, isolated_filesystem
):
@@ -124,7 +124,7 @@ def test_bundle_command_fails_without_manifest(
@pytest.mark.cli
@patch("superset_extensions_cli.cli.build")
@patch("superset_cli.cli.build")
def test_bundle_command_handles_zip_creation_error(
mock_build, cli_runner, isolated_filesystem, extension_setup_for_bundling
):
@@ -145,7 +145,7 @@ def test_bundle_command_handles_zip_creation_error(
@pytest.mark.cli
@patch("superset_extensions_cli.cli.build")
@patch("superset_cli.cli.build")
def test_bundle_includes_all_files_recursively(
mock_build, cli_runner, isolated_filesystem
):
@@ -214,7 +214,7 @@ def test_bundle_includes_all_files_recursively(
@pytest.mark.cli
@patch("superset_extensions_cli.cli.build")
@patch("superset_cli.cli.build")
def test_bundle_command_short_option(
mock_build, cli_runner, isolated_filesystem, extension_setup_for_bundling
):
@@ -233,7 +233,7 @@ def test_bundle_command_short_option(
@pytest.mark.cli
@pytest.mark.parametrize("output_option", ["--output", "-o"])
@patch("superset_extensions_cli.cli.build")
@patch("superset_cli.cli.build")
def test_bundle_command_output_options(
mock_build,
output_option,

@@ -23,17 +23,17 @@ import time
from unittest.mock import Mock, patch
import pytest
from superset_extensions_cli.cli import app, FrontendChangeHandler
from superset_cli.cli import app, FrontendChangeHandler
# Dev Command Tests
@pytest.mark.cli
@patch("superset_extensions_cli.cli.Observer")
@patch("superset_extensions_cli.cli.init_frontend_deps")
@patch("superset_extensions_cli.cli.rebuild_frontend")
@patch("superset_extensions_cli.cli.rebuild_backend")
@patch("superset_extensions_cli.cli.build_manifest")
@patch("superset_extensions_cli.cli.write_manifest")
@patch("superset_cli.cli.Observer")
@patch("superset_cli.cli.init_frontend_deps")
@patch("superset_cli.cli.rebuild_frontend")
@patch("superset_cli.cli.rebuild_backend")
@patch("superset_cli.cli.build_manifest")
@patch("superset_cli.cli.write_manifest")
def test_dev_command_starts_watchers(
mock_write_manifest,
mock_build_manifest,
@@ -82,11 +82,11 @@ def test_dev_command_starts_watchers(
@pytest.mark.cli
@patch("superset_extensions_cli.cli.init_frontend_deps")
@patch("superset_extensions_cli.cli.rebuild_frontend")
@patch("superset_extensions_cli.cli.rebuild_backend")
@patch("superset_extensions_cli.cli.build_manifest")
@patch("superset_extensions_cli.cli.write_manifest")
@patch("superset_cli.cli.init_frontend_deps")
@patch("superset_cli.cli.rebuild_frontend")
@patch("superset_cli.cli.rebuild_backend")
@patch("superset_cli.cli.build_manifest")
@patch("superset_cli.cli.write_manifest")
def test_dev_command_initial_build(
mock_write_manifest,
mock_build_manifest,
@@ -104,7 +104,7 @@ def test_dev_command_initial_build(
extension_setup_for_dev(isolated_filesystem)
with patch("superset_extensions_cli.cli.Observer") as mock_observer_class:
with patch("superset_cli.cli.Observer") as mock_observer_class:
mock_observer = Mock()
mock_observer_class.return_value = mock_observer
@@ -188,9 +188,9 @@ def test_frontend_watcher_function_coverage(isolated_filesystem):
dist_dir = isolated_filesystem / "dist"
dist_dir.mkdir()
with patch("superset_extensions_cli.cli.rebuild_frontend") as mock_rebuild:
with patch("superset_extensions_cli.cli.build_manifest") as mock_build:
with patch("superset_extensions_cli.cli.write_manifest") as mock_write:
with patch("superset_cli.cli.rebuild_frontend") as mock_rebuild:
with patch("superset_cli.cli.build_manifest") as mock_build:
with patch("superset_cli.cli.write_manifest") as mock_write:
mock_rebuild.return_value = "remoteEntry.abc123.js"
mock_build.return_value = {"name": "test", "version": "1.0.0"}
@@ -224,8 +224,8 @@ def test_backend_watcher_function_coverage(isolated_filesystem):
manifest_data = {"name": "test", "version": "1.0.0"}
(dist_dir / "manifest.json").write_text(json.dumps(manifest_data))
with patch("superset_extensions_cli.cli.rebuild_backend") as mock_rebuild:
with patch("superset_extensions_cli.cli.write_manifest") as mock_write:
with patch("superset_cli.cli.rebuild_backend") as mock_rebuild:
with patch("superset_cli.cli.write_manifest") as mock_write:
# Simulate backend watcher function
mock_rebuild(isolated_filesystem)

@@ -20,7 +20,7 @@ from __future__ import annotations
from pathlib import Path
import pytest
from superset_extensions_cli.cli import app
from superset_cli.cli import app
from tests.utils import (
assert_directory_exists,

@@ -20,14 +20,14 @@ from __future__ import annotations
from unittest.mock import Mock, patch
import pytest
from superset_extensions_cli.cli import app, validate_npm
from superset_cli.cli import app, validate_npm
# Validate Command Tests
@pytest.mark.cli
def test_validate_command_success(cli_runner):
"""Test validate command succeeds when npm is available and valid."""
with patch("superset_extensions_cli.cli.validate_npm") as mock_validate:
with patch("superset_cli.cli.validate_npm") as mock_validate:
result = cli_runner.invoke(app, ["validate"])
assert result.exit_code == 0
@@ -38,7 +38,7 @@ def test_validate_command_success(cli_runner):
@pytest.mark.cli
def test_validate_command_calls_npm_validation(cli_runner):
"""Test that validate command calls the npm validation function."""
with patch("superset_extensions_cli.cli.validate_npm") as mock_validate:
with patch("superset_cli.cli.validate_npm") as mock_validate:
cli_runner.invoke(app, ["validate"])
mock_validate.assert_called_once()
@@ -160,9 +160,7 @@ def test_validate_npm_handles_file_not_found_exception(mock_run, mock_which):
def test_validate_npm_does_not_catch_other_subprocess_exceptions(
mock_run, mock_which, exception_type
):
"""
Test validate_npm does not catch OSError and PermissionError (they propagate up).
"""
"""Test validate_npm does not catch OSError and PermissionError (they propagate up)."""
mock_which.return_value = "/usr/bin/npm"
mock_run.side_effect = exception_type("Test error")

@@ -27,9 +27,7 @@ from jinja2 import Environment, FileSystemLoader
@pytest.fixture
def templates_dir():
"""Get the templates directory path."""
return (
Path(__file__).parent.parent / "src" / "superset_extensions_cli" / "templates"
)
return Path(__file__).parent.parent / "src" / "superset_cli" / "templates"
@pytest.fixture

@@ -20,7 +20,7 @@ from __future__ import annotations
import json
import pytest
from superset_extensions_cli.utils import read_json, read_toml
from superset_cli.utils import read_json, read_toml
# Read JSON Tests

@@ -1,22 +0,0 @@
<!--
Licensed to the Apache Software Foundation (ASF) under one
or more contributor license agreements. See the NOTICE file
distributed with this work for additional information
regarding copyright ownership. The ASF licenses this file
to you under the Apache License, Version 2.0 (the
"License"); you may not use this file except in compliance
with the License. You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing,
software distributed under the License is distributed on an
"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
KIND, either express or implied. See the License for the
specific language governing permissions and limitations
under the License.
-->
## Change Log
Changelogs will be added once we have the first stable release.

Some files were not shown because too many files have changed in this diff.