diff --git a/.cursor/rules/dev-standard.mdc b/.cursor/rules/dev-standard.mdc
new file mode 100644
index 00000000000..e45ff299043
--- /dev/null
+++ b/.cursor/rules/dev-standard.mdc
@@ -0,0 +1,125 @@
+---
+description: Apache Superset development standards and guidelines for Cursor IDE
+globs: ["**/*.py", "**/*.ts", "**/*.tsx", "**/*.js", "**/*.jsx", "**/*.sql", "**/*.md"]
+alwaysApply: true
+---
+
+# Apache Superset Development Standards for Cursor IDE
+
+Apache Superset is a data visualization platform with a Flask/Python backend and a React/TypeScript frontend.
+
+## ⚠️ CRITICAL: Ongoing Refactors (What NOT to Do)
+
+**These migrations are actively happening - avoid deprecated patterns:**
+
+### Frontend Modernization
+- **NO `any` types** - Use proper TypeScript types
+- **NO JavaScript files** - Convert to TypeScript (.ts/.tsx)
+- **NO Enzyme** - Use React Testing Library/Jest (Enzyme fully removed)
+- **Use @superset-ui/core** - Don't import Ant Design directly
+
+### Testing Strategy Migration
+- **Prefer unit tests** over integration tests
+- **Prefer integration tests** over Cypress end-to-end tests
+- **Cypress is a last resort** - Actively moving away from Cypress
+- **Use Jest + React Testing Library** for component testing
+
+### Backend Type Safety
+- **Add type hints** - All new Python code needs proper typing
+- **MyPy compliance** - Run `pre-commit run mypy` to validate
+- **SQLAlchemy typing** - Use proper model annotations
+
+## Code Standards
+
+### TypeScript Frontend
+- **NO `any` types** - Use proper TypeScript
+- **Functional components** with hooks
+- **@superset-ui/core** for UI components (not direct antd)
+- **Jest** for testing (NO Enzyme)
+- **Redux** for global state, hooks for local
+
+### Python Backend
+- **Type hints required** for all new code
+- **MyPy compliant** - run `pre-commit run mypy`
+- **SQLAlchemy models** with proper typing
+- **pytest** for testing
+
+### Apache License Headers
+- **New files require ASF license headers** - When creating new code files, include the standard Apache Software Foundation license header
+- **LLM instruction files are excluded** - Files like LLMS.md, CLAUDE.md, etc. are in `.rat-excludes` to avoid header token overhead
+
+## Key Directory Structure
+
+```
+superset/
+├── superset/                   # Python backend (Flask, SQLAlchemy)
+│   ├── views/api/              # REST API endpoints
+│   ├── models/                 # Database models
+│   └── connectors/             # Database connections
+├── superset-frontend/src/      # React TypeScript frontend
+│   ├── components/             # Reusable components
+│   ├── explore/                # Chart builder
+│   ├── dashboard/              # Dashboard interface
+│   └── SqlLab/                 # SQL editor
+├── superset-frontend/packages/
+│   └── superset-ui-core/       # UI component library (USE THIS)
+├── tests/                      # Python/integration tests
+├── docs/                       # Documentation (UPDATE FOR CHANGES)
+└── UPDATING.md                 # Breaking changes log
+```
+
+## Architecture Patterns
+
+### Dataset-Centric Approach
+Charts are built from enriched datasets containing:
+- Dimension columns with labels/descriptions
+- Predefined metrics as SQL expressions
+- Self-service analytics within defined contexts
+
+### Security & Features
+- **RBAC**: Role-based access via Flask-AppBuilder
+- **Feature flags**: Control feature rollouts
+- **Row-level security**: SQL-based data access control
+
+## Test Utilities
+
+### Python Test Helpers
+- **`SupersetTestCase`** - Base class in `tests/integration_tests/base_tests.py`
+- **`@with_config`** - Config mocking decorator
+- **`@with_feature_flags`** - Feature flag testing
+- **`login_as()`, `login_as_admin()`** - Authentication helpers
+- **`create_dashboard()`, `create_slice()`** - Data setup utilities
+
+### TypeScript Test Helpers
+- **`superset-frontend/spec/helpers/testing-library.tsx`** - Custom render() with providers
+- **`createWrapper()`** - Redux/Router/Theme wrapper
+- **`selectOption()`** - Select component helper
+- **React Testing Library** - NO Enzyme (removed)
+
+## Pre-commit Validation
+
+**Use pre-commit hooks for quality validation:**
+
+```bash
+# Install hooks
+pre-commit install
+
+# Quick validation (faster than --all-files)
+pre-commit run           # Staged files only
+pre-commit run mypy      # Python type checking
+pre-commit run prettier  # Code formatting
+pre-commit run eslint    # Frontend linting
+```
+
+## Development Guidelines
+
+- **Documentation**: Update docs/ for any user-facing changes
+- **Breaking Changes**: Add to UPDATING.md
+- **Docstrings**: Required for new functions/classes
+- **Follow existing patterns**: Mimic code style, use existing libraries and utilities
+- **Type Safety**: This codebase is actively modernizing toward full TypeScript and type safety
+- **Always run `pre-commit run`** to validate changes before committing
+
+---
+
+**Note**: Follow the ongoing refactors section above to avoid deprecated patterns.
diff --git a/.devcontainer/Dockerfile b/.devcontainer/Dockerfile
new file mode 100644
index 00000000000..ddd39d21950
--- /dev/null
+++ b/.devcontainer/Dockerfile
@@ -0,0 +1,20 @@
+# Keep this in sync with the base image in the main Dockerfile (ARG PY_VER)
+FROM python:3.11.13-trixie AS base
+
+# Install system dependencies that Superset needs
+# This layer will be cached across Codespace sessions
+RUN apt-get update && apt-get install -y \
+    libsasl2-dev \
+    libldap2-dev \
+    libpq-dev \
+    tmux \
+    gh \
+    && rm -rf /var/lib/apt/lists/*
+
+# Install uv for fast Python package management
+# This will also be cached in the image
+RUN curl -LsSf https://astral.sh/uv/install.sh | sh && \
+    echo 'export PATH="/root/.cargo/bin:$PATH"' >> /etc/bash.bashrc
+
+# Set the cargo/bin directory in PATH for all users
+ENV PATH="/root/.cargo/bin:${PATH}"
diff --git a/.devcontainer/README.md b/.devcontainer/README.md
new file mode 100644
index 00000000000..e5dda78fe30
--- /dev/null
+++ b/.devcontainer/README.md
@@ -0,0 +1,16 @@
+# Superset Development with GitHub Codespaces
+
+For complete documentation on using GitHub Codespaces with Apache Superset, please see:
+
+**[Setting up a Development Environment - GitHub Codespaces](https://superset.apache.org/docs/contributing/development#github-codespaces-cloud-development)**
+
+## Pre-installed Development Environment
+
+When you create a new Codespace from this repository, it automatically:
+
+1. **Creates a Python virtual environment** using `uv venv`
+2. **Installs all development dependencies** via `uv pip install -r requirements/development.txt`
+3. **Sets up pre-commit hooks** with `pre-commit install`
+4. **Activates the virtual environment** automatically in all terminals
+
+The virtual environment is located at `/workspaces/{repository-name}/.venv` and is automatically activated through environment variables set in the devcontainer configuration.
diff --git a/.devcontainer/bashrc-additions b/.devcontainer/bashrc-additions
new file mode 100644
index 00000000000..28a65b51e66
--- /dev/null
+++ b/.devcontainer/bashrc-additions
@@ -0,0 +1,62 @@
+# Superset Codespaces environment setup
+# This file is appended to ~/.bashrc during Codespace setup
+
+# Find the workspace directory (handles both 'superset' and 'superset-2' names)
+WORKSPACE_DIR=$(find /workspaces -maxdepth 1 -name "superset*" -type d | head -1)
+
+if [ -n "$WORKSPACE_DIR" ]; then
+    # Check if virtual environment exists
+    if [ -d "$WORKSPACE_DIR/.venv" ]; then
+        # Activate the virtual environment
+        source "$WORKSPACE_DIR/.venv/bin/activate"
+        echo "✅ Python virtual environment activated"
+
+        # Verify pre-commit is installed and set up
+        if command -v pre-commit &> /dev/null; then
+            echo "✅ pre-commit is available ($(pre-commit --version))"
+            # Install git hooks if not already installed
+            if [ -d "$WORKSPACE_DIR/.git" ] && [ ! -f "$WORKSPACE_DIR/.git/hooks/pre-commit" ]; then
+                echo "🪝 Installing pre-commit hooks..."
+                cd "$WORKSPACE_DIR" && pre-commit install
+            fi
+        else
+            echo "⚠️ pre-commit not found. Run: pip install pre-commit"
+        fi
+    else
+        echo "⚠️ Python virtual environment not found at $WORKSPACE_DIR/.venv"
+        echo "   Run: cd $WORKSPACE_DIR && .devcontainer/setup-dev.sh"
+    fi
+
+    # Always cd to the workspace directory for convenience
+    cd "$WORKSPACE_DIR"
+fi
+
+# Add helpful aliases for Superset development
+alias start-superset="$WORKSPACE_DIR/.devcontainer/start-superset.sh"
+alias setup-dev="$WORKSPACE_DIR/.devcontainer/setup-dev.sh"
+
+# Show helpful message on login
+echo ""
+echo "🚀 Superset Codespaces Environment"
+echo "=================================="
+
+# Check if Superset is running
+if docker ps 2>/dev/null | grep -q "superset"; then
+    echo "✅ Superset is running!"
+    echo "   - Check the 'Ports' tab for your live Superset URL"
+    echo "   - Initial startup takes 10-20 minutes"
+    echo "   - Login: admin/admin"
+else
+    echo "⚠️ Superset is not running. Use: start-superset"
+    # Check if there's a startup log
+    if [ -f "/tmp/superset-startup.log" ]; then
+        echo "   📋 Startup log found: cat /tmp/superset-startup.log"
+    fi
+fi
+
+echo ""
+echo "Quick commands:"
+echo "  start-superset  - Start Superset with Docker Compose"
+echo "  setup-dev       - Set up Python environment (if not already done)"
+echo "  pre-commit run  - Run pre-commit checks on staged files"
+echo ""
diff --git a/.devcontainer/build-and-push-image.sh b/.devcontainer/build-and-push-image.sh
new file mode 100755
index 00000000000..84fa0e4dbf8
--- /dev/null
+++ b/.devcontainer/build-and-push-image.sh
@@ -0,0 +1,20 @@
+#!/bin/bash
+# Script to build and push the devcontainer image to GitHub Container Registry
+# This allows caching the image between Codespace sessions
+
+# You'll need to run this with appropriate GitHub permissions
+# gh auth login --scopes write:packages
+
+REGISTRY="ghcr.io"
+OWNER="apache"
+REPO="superset"
+TAG="devcontainer-base"
+
+echo "Building devcontainer image..."
+docker build -t $REGISTRY/$OWNER/$REPO:$TAG .devcontainer/
+
+echo "Pushing to GitHub Container Registry..."
+docker push $REGISTRY/$OWNER/$REPO:$TAG
+
+echo "Done! Update .devcontainer/devcontainer.json to use:"
+echo "  \"image\": \"$REGISTRY/$OWNER/$REPO:$TAG\""
diff --git a/.devcontainer/devcontainer.json b/.devcontainer/devcontainer.json
new file mode 100644
index 00000000000..5f47e964d31
--- /dev/null
+++ b/.devcontainer/devcontainer.json
@@ -0,0 +1,66 @@
+{
+  "name": "Apache Superset Development",
+  // Option 1: Use pre-built image directly
+  // "image": "ghcr.io/apache/superset:devcontainer-base",
+
+  // Option 2: Build from Dockerfile with cache (current approach)
+  "build": {
+    "dockerfile": "Dockerfile",
+    "context": ".",
+    // Cache from the Apache registry image
+    "cacheFrom": ["ghcr.io/apache/superset:devcontainer-base"]
+  },
+
+  "features": {
+    "ghcr.io/devcontainers/features/docker-in-docker:2": {
+      "moby": true,
+      "dockerDashComposeVersion": "v2"
+    },
+    "ghcr.io/devcontainers/features/node:1": {
+      "version": "20"
+    },
+    "ghcr.io/devcontainers/features/git:1": {},
+    "ghcr.io/devcontainers/features/common-utils:2": {
+      "configureZshAsDefaultShell": true
+    },
+    "ghcr.io/devcontainers/features/sshd:1": {
+      "version": "latest"
+    }
+  },
+
+  // Forward ports for development
+  "forwardPorts": [9001],
+  "portsAttributes": {
+    "9001": {
+      "label": "Superset (via Webpack Dev Server)",
+      "onAutoForward": "notify",
+      "visibility": "public"
+    }
+  },
+
+  // Run commands after container is created
+  "postCreateCommand": "bash .devcontainer/setup-dev.sh || echo '⚠️ Setup had issues - run .devcontainer/setup-dev.sh manually'",
+
+  // Auto-start Superset after ensuring Docker is ready
+  // Run in foreground to see any errors, but don't block on failures
+  "postStartCommand": "bash -c 'echo \"Waiting 30s for services to initialize...\"; sleep 30; .devcontainer/start-superset.sh || echo \"⚠️ Auto-start failed - run start-superset manually\"'",
+
+  // Set environment variables
+  "remoteEnv": {
+    // Removed automatic venv activation to prevent startup issues
+    // The setup script will handle this
+  },
+
+  // VS Code customizations
+  "customizations": {
+    "vscode": {
+      "extensions": [
+        "ms-python.python",
+        "ms-python.vscode-pylance",
+        "charliermarsh.ruff",
+        "dbaeumer.vscode-eslint",
+        "esbenp.prettier-vscode"
+      ]
+    }
+  }
+}
diff --git a/.devcontainer/setup-dev.sh b/.devcontainer/setup-dev.sh
new file mode 100755
index 00000000000..91482551bee
--- /dev/null
+++ b/.devcontainer/setup-dev.sh
@@ -0,0 +1,78 @@
+#!/bin/bash
+# Setup script for Superset Codespaces development environment
+
+echo "🔧 Setting up Superset development environment..."
+
+# System dependencies and uv are now pre-installed in the Docker image
+# This speeds up Codespace creation significantly!
+
+# Create virtual environment using uv
+echo "🐍 Creating Python virtual environment..."
+if ! uv venv; then
+    echo "❌ Failed to create virtual environment"
+    exit 1
+fi
+
+# Install Python dependencies
+echo "📦 Installing Python dependencies..."
+if ! uv pip install -r requirements/development.txt; then
+    echo "❌ Failed to install Python dependencies"
+    echo "💡 You may need to run this manually after the Codespace starts"
+    exit 1
+fi
+
+# Install pre-commit hooks
+echo "🪝 Installing pre-commit hooks..."
+if source .venv/bin/activate && pre-commit install; then
+    echo "✅ Pre-commit hooks installed"
+else
+    echo "⚠️ Pre-commit hooks installation failed (non-critical)"
+fi
+
+# Install Claude Code CLI via npm
+echo "🤖 Installing Claude Code..."
+if npm install -g @anthropic-ai/claude-code; then
+    echo "✅ Claude Code installed"
+else
+    echo "⚠️ Claude Code installation failed (non-critical)"
+fi
+
+# Make the start script executable
+chmod +x .devcontainer/start-superset.sh
+
+# Add bashrc additions for automatic venv activation
+echo "🔧 Setting up automatic environment activation..."
+if [ -f ~/.bashrc ]; then
+    # Check if we've already added our additions
+    if ! grep -q "Superset Codespaces environment setup" ~/.bashrc; then
+        echo "" >> ~/.bashrc
+        cat .devcontainer/bashrc-additions >> ~/.bashrc
+        echo "✅ Added automatic venv activation to ~/.bashrc"
+    else
+        echo "✅ Bashrc additions already present"
+    fi
+else
+    # Create bashrc if it doesn't exist
+    cat .devcontainer/bashrc-additions > ~/.bashrc
+    echo "✅ Created ~/.bashrc with automatic venv activation"
+fi
+
+# Also add to zshrc since that's the default shell
+if [ -f ~/.zshrc ] || [ -n "$ZSH_VERSION" ]; then
+    if ! grep -q "Superset Codespaces environment setup" ~/.zshrc 2>/dev/null; then
+        echo "" >> ~/.zshrc
+        cat .devcontainer/bashrc-additions >> ~/.zshrc
+        echo "✅ Added automatic venv activation to ~/.zshrc"
+    fi
+fi
+
+echo "✅ Development environment setup complete!"
+echo ""
+echo "📝 The virtual environment will be automatically activated in new terminals"
+echo ""
+echo "🔄 To activate in this terminal, run:"
+echo "  source ~/.bashrc"
+echo ""
+echo "🚀 To start Superset:"
+echo "  start-superset"
+echo ""
diff --git a/.devcontainer/start-superset.sh b/.devcontainer/start-superset.sh
new file mode 100755
index 00000000000..6ba990cae10
--- /dev/null
+++ b/.devcontainer/start-superset.sh
@@ -0,0 +1,108 @@
+#!/bin/bash
+# Startup script for Superset in Codespaces
+
+# Log to a file for debugging
+LOG_FILE="/tmp/superset-startup.log"
+echo "[$(date)] Starting Superset startup script" >> "$LOG_FILE"
+echo "[$(date)] User: $(whoami), PWD: $(pwd)" >> "$LOG_FILE"
+
+echo "🚀 Starting Superset in Codespaces..."
+echo "🌐 Frontend will be available at port 9001"
+
+# Find the workspace directory (Codespaces clones as 'superset', not 'superset-2')
+WORKSPACE_DIR=$(find /workspaces -maxdepth 1 -name "superset*" -type d | head -1)
+if [ -n "$WORKSPACE_DIR" ]; then
+  cd "$WORKSPACE_DIR"
+  echo "📁 Working in: $WORKSPACE_DIR"
+else
+  echo "📁 Using current directory: $(pwd)"
+fi
+
+# Wait for Docker to be available
+echo "⏳ Waiting for Docker to start..."
+echo "[$(date)] Waiting for Docker..." >> "$LOG_FILE"
+max_attempts=30
+attempt=0
+while ! docker info > /dev/null 2>&1; do
+  if [ $attempt -eq $max_attempts ]; then
+    echo "❌ Docker failed to start after $max_attempts attempts"
+    echo "[$(date)] Docker failed to start after $max_attempts attempts" >> "$LOG_FILE"
+    echo "🔄 Please restart the Codespace or run this script manually later"
+    exit 1
+  fi
+  echo "  Attempt $((attempt + 1))/$max_attempts..."
+  echo "[$(date)] Docker check attempt $((attempt + 1))/$max_attempts" >> "$LOG_FILE"
+  sleep 2
+  attempt=$((attempt + 1))
+done
+echo "✅ Docker is ready!"
+echo "[$(date)] Docker is ready" >> "$LOG_FILE"
+
+# Check if Superset containers are already running
+if docker ps | grep -q "superset"; then
+  echo "✅ Superset containers are already running!"
+  echo ""
+  echo "🌐 To access Superset:"
+  echo "  1. Click the 'Ports' tab at the bottom of VS Code"
+  echo "  2. Find port 9001 and click the globe icon to open"
+  echo "  3. Wait 10-20 minutes for initial startup"
+  echo ""
+  echo "📝 Login credentials: admin/admin"
+  exit 0
+fi
+
+# Clean up any existing containers
+echo "🧹 Cleaning up existing containers..."
+docker-compose -f docker-compose-light.yml down
+
+# Start services
+echo "🏗️ Starting Superset in background (daemon mode)..."
+echo ""
+
+# Start in detached mode, capturing the exit code for the failure check below
+docker-compose -f docker-compose-light.yml up -d
+EXIT_CODE=$?
+
+if [ $EXIT_CODE -eq 0 ]; then
+  echo ""
+  echo "✅ Docker Compose started successfully!"
+fi
+echo ""
+echo "📋 Important information:"
+echo "━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━"
+echo "⏱️ Initial startup takes 10-20 minutes"
+echo "🌐 Check the 'Ports' tab for your Superset URL (port 9001)"
+echo "👤 Login: admin / admin"
+echo ""
+echo "📊 Useful commands:"
+echo "  docker-compose -f docker-compose-light.yml logs -f  # Follow logs"
+echo "  docker-compose -f docker-compose-light.yml ps       # Check status"
+echo "  docker-compose -f docker-compose-light.yml down     # Stop services"
+echo "━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━"
+echo ""
+
+# Show final status
+docker-compose -f docker-compose-light.yml ps
+
+# If startup failed, provide helpful instructions
+if [ $EXIT_CODE -ne 0 ] && [ $EXIT_CODE -ne 130 ]; then # 130 is Ctrl+C
+  echo ""
+  echo "❌ Superset startup failed (exit code: $EXIT_CODE)"
+  echo ""
+  echo "🔄 To restart Superset, run:"
+  echo "  .devcontainer/start-superset.sh"
+  echo ""
+  echo "🔧 For troubleshooting:"
+  echo "  # View logs:"
+  echo "  docker-compose -f docker-compose-light.yml logs"
+  echo ""
+  echo "  # Clean restart (removes volumes):"
+  echo "  docker-compose -f docker-compose-light.yml down -v"
+  echo "  .devcontainer/start-superset.sh"
+  echo ""
+  echo "  # Common issues:"
+  echo "  - Network timeouts: Just retry, often transient"
+  echo "  - Port conflicts: Check 'docker ps'"
+  echo "  - Database issues: Try clean restart with -v"
+fi
diff --git a/.github/CODEOWNERS b/.github/CODEOWNERS
index 5fdc65b6a22..ea027a84896 100644
--- a/.github/CODEOWNERS
+++ b/.github/CODEOWNERS
@@ -2,7 +2,7 @@
 # https://github.com/apache/superset/issues/13351
-/superset/migrations/ @mistercrunch @michael-s-molina @betodealmeida @eschutho
+/superset/migrations/ @mistercrunch @michael-s-molina @betodealmeida @eschutho @sadpandajoe
 # Notify some committers of changes in the components
@@ -30,3
+30,13 @@ **/*.geojson @villebro @rusackas /superset-frontend/plugins/legacy-plugin-chart-country-map/ @villebro @rusackas + +# Notify PMC members of changes to extension-related files + +/superset-core/ @michael-s-molina @villebro +/superset-extensions-cli/ @michael-s-molina @villebro +/superset/core/ @michael-s-molina @villebro +/superset/extensions/ @michael-s-molina @villebro +/superset-frontend/src/packages/superset-core/ @michael-s-molina @villebro +/superset-frontend/src/core/ @michael-s-molina @villebro +/superset-frontend/src/extensions/ @michael-s-molina @villebro diff --git a/.github/ISSUE_TEMPLATE/bug-report.yml b/.github/ISSUE_TEMPLATE/bug-report.yml index b13dfedbd6f..6bfff77d279 100644 --- a/.github/ISSUE_TEMPLATE/bug-report.yml +++ b/.github/ISSUE_TEMPLATE/bug-report.yml @@ -42,7 +42,7 @@ body: options: - master / latest-dev - "5.0.0" - - "4.1.2" + - "4.1.3" validations: required: true - type: dropdown diff --git a/.github/actions/change-detector/action.yml b/.github/actions/change-detector/action.yml index d0f356e771d..da19bea6a4c 100644 --- a/.github/actions/change-detector/action.yml +++ b/.github/actions/change-detector/action.yml @@ -1,24 +1,27 @@ -name: 'Change Detector' -description: 'Detects file changes for pull request and push events' +name: Change Detector +description: Detects file changes for pull request and push events inputs: token: - description: 'GitHub token for authentication' + description: GitHub token for authentication required: true outputs: python: - description: 'Whether Python-related files were changed' + description: Whether Python-related files were changed value: ${{ steps.change-detector.outputs.python }} frontend: - description: 'Whether frontend-related files were changed' + description: Whether frontend-related files were changed value: ${{ steps.change-detector.outputs.frontend }} docker: - description: 'Whether docker-related files were changed' + description: Whether docker-related files were changed value: 
${{ steps.change-detector.outputs.docker }} docs: - description: 'Whether docs-related files were changed' + description: Whether docs-related files were changed value: ${{ steps.change-detector.outputs.docs }} + superset-extensions-cli: + description: Whether superset-extensions-cli package-related files were changed + value: ${{ steps.change-detector.outputs.superset-extensions-cli }} runs: - using: 'composite' + using: composite steps: - name: Detect file changes id: change-detector diff --git a/.github/copilot-instructions.md b/.github/copilot-instructions.md new file mode 120000 index 00000000000..3395d566a13 --- /dev/null +++ b/.github/copilot-instructions.md @@ -0,0 +1 @@ +../LLMS.md \ No newline at end of file diff --git a/.github/workflows/bump-python-package.yml b/.github/workflows/bump-python-package.yml index 36da48fbb10..0be3cc53eeb 100644 --- a/.github/workflows/bump-python-package.yml +++ b/.github/workflows/bump-python-package.yml @@ -32,7 +32,7 @@ jobs: steps: - name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )" - uses: actions/checkout@v4 + uses: actions/checkout@v5 with: persist-credentials: true ref: master diff --git a/.github/workflows/cancel_duplicates.yml b/.github/workflows/cancel_duplicates.yml index 24e1eb40afc..c221b066c8d 100644 --- a/.github/workflows/cancel_duplicates.yml +++ b/.github/workflows/cancel_duplicates.yml @@ -31,7 +31,7 @@ jobs: - name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )" if: steps.check_queued.outputs.count >= 20 - uses: actions/checkout@v4 + uses: actions/checkout@v5 - name: Cancel duplicate workflow runs if: steps.check_queued.outputs.count >= 20 diff --git a/.github/workflows/check-python-deps.yml b/.github/workflows/check-python-deps.yml index edb7d4ea5f6..da4fca4fe7b 100644 --- a/.github/workflows/check-python-deps.yml +++ b/.github/workflows/check-python-deps.yml @@ -18,12 +18,18 @@ jobs: runs-on: ubuntu-22.04 steps: - name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )" - uses: 
actions/checkout@v4 + uses: actions/checkout@v5 with: persist-credentials: false submodules: recursive fetch-depth: 1 + - name: Check for file changes + id: check + uses: ./.github/actions/change-detector/ + with: + token: ${{ secrets.GITHUB_TOKEN }} + - name: Setup Python if: steps.check.outputs.python uses: ./.github/actions/setup-backend/ @@ -33,10 +39,20 @@ jobs: run: ./scripts/uv-pip-compile.sh - name: Check for uncommitted changes + if: steps.check.outputs.python run: | - if [[ -n "$(git diff)" ]]; then + echo "Full diff (for logging/debugging):" + git diff + + echo "Filtered diff (excluding comments and whitespace):" + filtered_diff=$(git diff -U0 | grep '^[-+]' | grep -vE '^[-+]{3}' | grep -vE '^[-+][[:space:]]*#' | grep -vE '^[-+][[:space:]]*$' || true) + echo "$filtered_diff" + + if [[ -n "$filtered_diff" ]]; then + echo echo "ERROR: The pinned dependencies are not up-to-date." echo "Please run './scripts/uv-pip-compile.sh' and commit the changes." + echo "More info: https://github.com/apache/superset/tree/master/requirements" exit 1 else echo "Pinned dependencies are up-to-date." 
diff --git a/.github/workflows/check_db_migration_confict.yml b/.github/workflows/check_db_migration_confict.yml index d9a6ca85e8c..bc99b764bd2 100644 --- a/.github/workflows/check_db_migration_confict.yml +++ b/.github/workflows/check_db_migration_confict.yml @@ -25,7 +25,7 @@ jobs: pull-requests: write steps: - name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )" - uses: actions/checkout@v4 + uses: actions/checkout@v5 - name: Check and notify uses: actions/github-script@v7 with: diff --git a/.github/workflows/claude.yml b/.github/workflows/claude.yml new file mode 100644 index 00000000000..7949aebaba4 --- /dev/null +++ b/.github/workflows/claude.yml @@ -0,0 +1,82 @@ +name: Claude PR Assistant + +on: + issue_comment: + types: [created] + pull_request_review_comment: + types: [created] + +jobs: + check-permissions: + if: | + (github.event_name == 'issue_comment' && contains(github.event.comment.body, '@claude')) || + (github.event_name == 'pull_request_review_comment' && contains(github.event.comment.body, '@claude')) + runs-on: ubuntu-latest + outputs: + allowed: ${{ steps.check.outputs.allowed }} + steps: + - name: Check if user is allowed + id: check + run: | + # List of allowed users + ALLOWED_USERS="mistercrunch,rusackas" + + # Get the commenter's username + COMMENTER="${{ github.event.comment.user.login }}" + + echo "Checking permissions for user: $COMMENTER" + + # Check if user is in allowed list + if [[ ",$ALLOWED_USERS," == *",$COMMENTER,"* ]]; then + echo "allowed=true" >> $GITHUB_OUTPUT + echo "✅ User $COMMENTER is allowed to use Claude" + else + echo "allowed=false" >> $GITHUB_OUTPUT + echo "❌ User $COMMENTER is not allowed to use Claude" + fi + + deny-access: + needs: check-permissions + if: needs.check-permissions.outputs.allowed == 'false' + runs-on: ubuntu-latest + permissions: + issues: write + pull-requests: write + steps: + - name: Comment access denied + uses: actions/github-script@v7 + with: + script: | + const message = `👋 Hi @${{ 
github.event.comment.user.login || github.event.review.user.login || github.event.issue.user.login }}! + + Thanks for trying to use Claude Code, but currently only certain team members have access to this feature. + + If you believe you should have access, please contact a project maintainer.`; + + await github.rest.issues.createComment({ + owner: context.repo.owner, + repo: context.repo.repo, + issue_number: context.issue.number, + body: message + }); + + claude-code-action: + needs: check-permissions + if: needs.check-permissions.outputs.allowed == 'true' + runs-on: ubuntu-latest + permissions: + contents: write + pull-requests: write + issues: write + id-token: write + steps: + - name: Checkout repository + uses: actions/checkout@v5 + with: + fetch-depth: 1 + + - name: Run Claude PR Action + uses: anthropics/claude-code-action@beta + with: + anthropic_api_key: ${{ secrets.ANTHROPIC_API_KEY }} + timeout_minutes: "60" diff --git a/.github/workflows/codeql-analysis.yml b/.github/workflows/codeql-analysis.yml index b038a5723ad..363a7cde447 100644 --- a/.github/workflows/codeql-analysis.yml +++ b/.github/workflows/codeql-analysis.yml @@ -31,7 +31,7 @@ jobs: steps: - name: Checkout repository - uses: actions/checkout@v4 + uses: actions/checkout@v5 - name: Check for file changes id: check diff --git a/.github/workflows/dependency-review.yml b/.github/workflows/dependency-review.yml index b708d398eba..5755e4f4975 100644 --- a/.github/workflows/dependency-review.yml +++ b/.github/workflows/dependency-review.yml @@ -27,7 +27,7 @@ jobs: runs-on: ubuntu-24.04 steps: - name: "Checkout Repository" - uses: actions/checkout@v4 + uses: actions/checkout@v5 - name: "Dependency Review" uses: actions/dependency-review-action@v4 continue-on-error: true @@ -53,7 +53,7 @@ jobs: runs-on: ubuntu-22.04 steps: - name: "Checkout Repository" - uses: actions/checkout@v4 + uses: actions/checkout@v5 - name: Setup Python uses: ./.github/actions/setup-backend/ diff --git 
a/.github/workflows/docker.yml b/.github/workflows/docker.yml
index 91417cc86a5..09c5b82554b 100644
--- a/.github/workflows/docker.yml
+++ b/.github/workflows/docker.yml
@@ -22,7 +22,7 @@ jobs:
     steps:
       - id: set_matrix
         run: |
-          MATRIX_CONFIG=$(if [ "${{ github.event_name }}" == "pull_request" ]; then echo '["dev", "lean"]'; else echo '["dev", "lean", "py310", "websocket", "dockerize", "py311"]'; fi)
+          MATRIX_CONFIG=$(if [ "${{ github.event_name }}" == "pull_request" ]; then echo '["dev", "lean"]'; else echo '["dev", "lean", "py310", "websocket", "dockerize", "py311", "py312"]'; fi)
           echo "matrix_config=${MATRIX_CONFIG}" >> $GITHUB_OUTPUT
           echo $GITHUB_OUTPUT
@@ -42,7 +42,7 @@ jobs:
     steps:
       - name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
-        uses: actions/checkout@v4
+        uses: actions/checkout@v5
        with:
          persist-credentials: false
@@ -102,7 +102,7 @@ jobs:
           docker history $IMAGE_TAG
       - name: docker-compose sanity check
-        if: (steps.check.outputs.python || steps.check.outputs.frontend || steps.check.outputs.docker) && (matrix.build_preset == 'dev' || matrix.build_preset == 'lean')
+        if: (steps.check.outputs.python || steps.check.outputs.frontend || steps.check.outputs.docker) && matrix.build_preset == 'dev'
         shell: bash
         run: |
           export SUPERSET_BUILD_TARGET=${{ matrix.build_preset }}
@@ -117,7 +117,7 @@ jobs:
     runs-on: ubuntu-24.04
     steps:
       - name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
-        uses: actions/checkout@v4
+        uses: actions/checkout@v5
         with:
           persist-credentials: false
       - name: Check for file changes
diff --git a/.github/workflows/embedded-sdk-release.yml b/.github/workflows/embedded-sdk-release.yml
index fc2a8fe3626..d357de5be07 100644
--- a/.github/workflows/embedded-sdk-release.yml
+++ b/.github/workflows/embedded-sdk-release.yml
@@ -28,7 +28,7 @@ jobs:
       run:
         working-directory: superset-embedded-sdk
     steps:
-      - uses: actions/checkout@v4
+      - uses: actions/checkout@v5
       - uses: actions/setup-node@v4
         with:
           node-version-file: './superset-embedded-sdk/.nvmrc'
diff --git a/.github/workflows/embedded-sdk-test.yml b/.github/workflows/embedded-sdk-test.yml
index 999d8733709..a5309f7b6da 100644
--- a/.github/workflows/embedded-sdk-test.yml
+++ b/.github/workflows/embedded-sdk-test.yml
@@ -18,7 +18,7 @@ jobs:
       run:
         working-directory: superset-embedded-sdk
     steps:
-      - uses: actions/checkout@v4
+      - uses: actions/checkout@v5
       - uses: actions/setup-node@v4
         with:
           node-version-file: './superset-embedded-sdk/.nvmrc'
diff --git a/.github/workflows/ephemeral-env-pr-close.yml b/.github/workflows/ephemeral-env-pr-close.yml
index 60cf75dac82..fe65c6fa64a 100644
--- a/.github/workflows/ephemeral-env-pr-close.yml
+++ b/.github/workflows/ephemeral-env-pr-close.yml
@@ -1,4 +1,10 @@
-name: Cleanup ephemeral envs (PR close)
+name: Cleanup ephemeral envs (PR close) [DEPRECATED]
+
+# ⚠️ DEPRECATION NOTICE ⚠️
+# This workflow is deprecated and will be removed in a future version.
+# The new Superset Showtime workflow handles cleanup automatically.
+# See .github/workflows/showtime.yml and showtime-cleanup.yml for replacements.
+# Migration guide: https://github.com/mistercrunch/superset-showtime

 on:
   pull_request_target:
@@ -71,5 +77,5 @@ jobs:
             issue_number: ${{ github.event.number }},
             owner: context.repo.owner,
             repo: context.repo.repo,
-            body: 'Ephemeral environment shutdown and build artifacts deleted.'
+            body: '⚠️ **DEPRECATED WORKFLOW** - Ephemeral environment shutdown and build artifacts deleted. Please migrate to the new Superset Showtime system for future PRs.'
           })
diff --git a/.github/workflows/ephemeral-env.yml b/.github/workflows/ephemeral-env.yml
index af1dfaff80b..d34cf46608b 100644
--- a/.github/workflows/ephemeral-env.yml
+++ b/.github/workflows/ephemeral-env.yml
@@ -1,4 +1,12 @@
-name: Ephemeral env workflow
+name: Ephemeral env workflow [DEPRECATED]
+
+# ⚠️ DEPRECATION NOTICE ⚠️
+# This workflow is deprecated and will be removed in a future version.
+# Please use the new Superset Showtime workflow instead:
+# - Use label "🎪 trigger-start" instead of "testenv-up"
+# - Showtime provides better reliability and easier management
+# - See .github/workflows/showtime.yml for the replacement
+# - Migration guide: https://github.com/mistercrunch/superset-showtime

 # Example manual trigger:
 # gh workflow run ephemeral-env.yml --ref fix_ephemerals --field label_name="testenv-up" --field issue_number=666
@@ -126,8 +134,11 @@ jobs:
               throw new Error("Issue number is not available.");
             }

-            const body = `@${user} Processing your ephemeral environment request [here](${workflowUrl}).` +
-              ` Action: **${action}**.` +
+            const body = `⚠️ **DEPRECATED WORKFLOW** ⚠️\n\n@${user} This workflow is deprecated! Please use the new **Superset Showtime** system instead:\n\n` +
+              `- Replace "testenv-up" label with "🎪 trigger-start"\n` +
+              `- Better reliability and easier management\n` +
+              `- See https://github.com/mistercrunch/superset-showtime for details\n\n` +
+              `Processing your ephemeral environment request [here](${workflowUrl}). Action: **${action}**.` +
              ` More information on [how to use or configure ephemeral environments]` +
              `(https://superset.apache.org/docs/contributing/howtos/#github-ephemeral-environments)`;
@@ -149,7 +160,7 @@ jobs:
     runs-on: ubuntu-24.04
     steps:
       - name: "Checkout ${{ github.ref }} ( ${{ needs.ephemeral-env-label.outputs.sha }} : ${{steps.get-sha.outputs.sha}} )"
-        uses: actions/checkout@v4
+        uses: actions/checkout@v5
         with:
           ref: ${{ needs.ephemeral-env-label.outputs.sha }}
           persist-credentials: false
@@ -209,7 +220,7 @@ jobs:
       pull-requests: write

    steps:
-      - uses: actions/checkout@v4
+      - uses: actions/checkout@v5
        with:
          persist-credentials: false
diff --git a/.github/workflows/generate-FOSSA-report.yml b/.github/workflows/generate-FOSSA-report.yml
index 9f51a396c22..d9d41b94ea7 100644
--- a/.github/workflows/generate-FOSSA-report.yml
+++ b/.github/workflows/generate-FOSSA-report.yml
@@ -27,12 +27,12 @@ jobs:
     runs-on: ubuntu-24.04
     steps:
       - name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
-        uses: actions/checkout@v4
+        uses: actions/checkout@v5
         with:
           persist-credentials: false
           submodules: recursive
       - name: Setup Java
-        uses: actions/setup-java@v4
+        uses: actions/setup-java@v5
         with:
           distribution: "temurin"
           java-version: "11"
diff --git a/.github/workflows/github-action-validator.yml b/.github/workflows/github-action-validator.yml
index 3bdefddc008..28c5054f127 100644
--- a/.github/workflows/github-action-validator.yml
+++ b/.github/workflows/github-action-validator.yml
@@ -14,7 +14,7 @@ jobs:
     runs-on: ubuntu-24.04
     steps:
       - name: Checkout Repository
-        uses: actions/checkout@v4
+        uses: actions/checkout@v5

       - name: Set up Node.js
         uses: actions/setup-node@v4
diff --git a/.github/workflows/issue_creation.yml b/.github/workflows/issue_creation.yml
index 1d531e77967..7993a0563ed 100644
--- a/.github/workflows/issue_creation.yml
+++ b/.github/workflows/issue_creation.yml
@@ -17,7 +17,7 @@ jobs:
     steps:
       - name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
-        uses: actions/checkout@v4
+        uses: actions/checkout@v5
         with:
           persist-credentials: false
diff --git a/.github/workflows/latest-release-tag.yml b/.github/workflows/latest-release-tag.yml
index 72f63d1e8cc..1887faf1176 100644
--- a/.github/workflows/latest-release-tag.yml
+++ b/.github/workflows/latest-release-tag.yml
@@ -12,7 +12,7 @@ jobs:
     steps:
       - name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
-        uses: actions/checkout@v4
+        uses: actions/checkout@v5
         with:
           persist-credentials: false
           submodules: recursive
diff --git a/.github/workflows/license-check.yml b/.github/workflows/license-check.yml
index 6001eede70d..12c5a20e600 100644
--- a/.github/workflows/license-check.yml
+++ b/.github/workflows/license-check.yml
@@ -15,12 +15,12 @@ jobs:
     runs-on: ubuntu-24.04
     steps:
       - name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
-        uses: actions/checkout@v4
+        uses: actions/checkout@v5
         with:
           persist-credentials: false
           submodules: recursive
       - name: Setup Java
-        uses: actions/setup-java@v4
+        uses: actions/setup-java@v5
         with:
           distribution: 'temurin'
           java-version: '11'
diff --git a/.github/workflows/pr-lint.yml b/.github/workflows/pr-lint.yml
index 230af3d19c0..31298d302f9 100644
--- a/.github/workflows/pr-lint.yml
+++ b/.github/workflows/pr-lint.yml
@@ -16,7 +16,7 @@ jobs:
       pull-requests: write
     steps:
       - name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
-        uses: actions/checkout@v4
+        uses: actions/checkout@v5
         with:
           persist-credentials: false
           submodules: recursive
diff --git a/.github/workflows/pre-commit.yml b/.github/workflows/pre-commit.yml
index ec7031e5081..05fc96da34e 100644
--- a/.github/workflows/pre-commit.yml
+++ b/.github/workflows/pre-commit.yml
@@ -21,7 +21,7 @@ jobs:
         python-version: ["current", "previous", "next"]
     steps:
       - name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
-        uses: actions/checkout@v4
+        uses: actions/checkout@v5
         with:
           persist-credentials: false
           submodules: recursive
diff --git a/.github/workflows/prefer-typescript.yml b/.github/workflows/prefer-typescript.yml
index d243c8f5289..ffb0fcd450a 100644
--- a/.github/workflows/prefer-typescript.yml
+++ b/.github/workflows/prefer-typescript.yml
@@ -27,7 +27,7 @@ jobs:
       pull-requests: write
     steps:
       - name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
-        uses: actions/checkout@v4
+        uses: actions/checkout@v5
         with:
           persist-credentials: false
           submodules: recursive
diff --git a/.github/workflows/release.yml b/.github/workflows/release.yml
index 513b3ac97d1..69e387db425 100644
--- a/.github/workflows/release.yml
+++ b/.github/workflows/release.yml
@@ -26,7 +26,7 @@ jobs:
     name: Bump version and publish package(s)
     runs-on: ubuntu-24.04
     steps:
-      - uses: actions/checkout@v4
+      - uses: actions/checkout@v5
         with:
           # pulls all commits (needed for lerna / semantic release to correctly version)
           fetch-depth: 0
diff --git a/.github/workflows/showtime-cleanup.yml b/.github/workflows/showtime-cleanup.yml
new file mode 100644
index 00000000000..c0404bc101f
--- /dev/null
+++ b/.github/workflows/showtime-cleanup.yml
@@ -0,0 +1,50 @@
+name: 🎪 Showtime Cleanup
+
+# Scheduled cleanup of expired environments
+on:
+  schedule:
+    - cron: '0 */6 * * *' # Every 6 hours
+
+  # Manual trigger for testing
+  workflow_dispatch:
+    inputs:
+      max_age_hours:
+        description: 'Maximum age in hours before cleanup'
+        required: false
+        default: '48'
+        type: string
+
+# Common environment variables
+env:
+  GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
+  AWS_ACCESS_KEY_ID: ${{ secrets.AWS_ACCESS_KEY_ID }}
+  AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
+  AWS_REGION: ${{ vars.AWS_REGION || 'us-west-2' }}
+  GITHUB_ORG: ${{ github.repository_owner }}
+  GITHUB_REPO: ${{ github.event.repository.name }}
+
+jobs:
+  cleanup-expired:
+    name: Clean up expired showtime environments
+    runs-on: ubuntu-latest
+
+    permissions:
+      contents: read
+      pull-requests: write
+
+    steps:
+      - name: Install Superset Showtime
+        run: pip install superset-showtime
+
+      - name: Cleanup expired environments
+        run: |
+          MAX_AGE="${{ github.event.inputs.max_age_hours || '48' }}"
+
+          # Validate max_age is numeric
+          if [[ ! "$MAX_AGE" =~ ^[0-9]+$ ]]; then
+            echo "❌ Invalid max_age_hours format: $MAX_AGE (must be numeric)"
+            exit 1
+          fi
+
+          echo "Cleaning up environments older than ${MAX_AGE}h"
+          python -m showtime cleanup --older-than "${MAX_AGE}h"
diff --git a/.github/workflows/showtime-trigger.yml b/.github/workflows/showtime-trigger.yml
new file mode 100644
index 00000000000..1bff3e92f9d
--- /dev/null
+++ b/.github/workflows/showtime-trigger.yml
@@ -0,0 +1,179 @@
+name: 🎪 Superset Showtime
+
+# Ultra-simple: just sync on any PR state change
+on:
+  pull_request_target:
+    types: [labeled, unlabeled, synchronize, closed]
+
+  # Manual testing
+  workflow_dispatch:
+    inputs:
+      pr_number:
+        description: 'PR number to sync'
+        required: true
+        type: number
+      sha:
+        description: 'Specific SHA to deploy (optional, defaults to latest)'
+        required: false
+        type: string
+
+# Common environment variables for all jobs (non-sensitive only)
+env:
+  AWS_REGION: us-west-2
+  GITHUB_ORG: ${{ github.repository_owner }}
+  GITHUB_REPO: ${{ github.event.repository.name }}
+  GITHUB_ACTOR: ${{ github.actor }}
+
+jobs:
+  sync:
+    name: 🎪 Sync PR to desired state
+    runs-on: ubuntu-latest
+    timeout-minutes: 90
+
+    permissions:
+      contents: read
+      pull-requests: write
+
+    steps:
+      - name: Security Check - Authorize Maintainers Only
+        id: auth
+        uses: actions/github-script@v7
+        env:
+          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
+        with:
+          script: |
+            const actor = context.actor;
+            console.log(`🔍 Checking authorization for ${actor}`);
+
+            // Early exit for workflow_dispatch - assume authorized since it's manually triggered
+            if (context.eventName === 'workflow_dispatch') {
+              console.log(`✅ Workflow dispatch event - assuming authorized for ${actor}`);
+              core.setOutput('authorized', 'true');
+              return;
+            }
+
+            const { data: permission } = await github.rest.repos.getCollaboratorPermissionLevel({
+              owner: context.repo.owner,
+              repo: context.repo.repo,
+              username: actor
+            });
+
+            console.log(`📊 Permission level for ${actor}: ${permission.permission}`);
+            const authorized = ['write', 'admin'].includes(permission.permission);
+
+            if (!authorized) {
+              console.log(`🚨 Unauthorized user ${actor} - skipping all operations`);
+              core.setOutput('authorized', 'false');
+              return;
+            }
+
+            console.log(`✅ Authorized maintainer: ${actor}`);
+            core.setOutput('authorized', 'true');
+
+            // If this is a synchronize event, check if Showtime is active and set blocked label
+            if (context.eventName === 'pull_request_target' && context.payload.action === 'synchronize') {
+              console.log(`🔒 Synchronize event detected - checking if Showtime is active`);
+
+              // Check if PR has any circus tent labels (Showtime is in use)
+              const { data: issue } = await github.rest.issues.get({
+                owner: context.repo.owner,
+                repo: context.repo.repo,
+                issue_number: context.payload.pull_request.number
+              });
+
+              const hasCircusLabels = issue.labels.some(label => label.name.startsWith('🎪 '));
+
+              if (hasCircusLabels) {
+                console.log(`🎪 Circus labels found - setting blocked label to prevent auto-deployment`);
+
+                await github.rest.issues.addLabels({
+                  owner: context.repo.owner,
+                  repo: context.repo.repo,
+                  issue_number: context.payload.pull_request.number,
+                  labels: ['🎪 🔒 showtime-blocked']
+                });
+
+                console.log(`✅ Blocked label set - Showtime will detect and skip operations`);
+              } else {
+                console.log(`ℹ️ No circus labels found - Showtime not in use, skipping block`);
+              }
+            }
+
+      - name: Install Superset Showtime
+        if: steps.auth.outputs.authorized == 'true'
+        run: |
+          echo "::notice::Maintainer ${{ github.actor }} triggered deploy for PR ${{ github.event.pull_request.number || github.event.inputs.pr_number }}"
+          pip install --upgrade superset-showtime
+          showtime version
+
+      - name: Check what actions are needed
+        if: steps.auth.outputs.authorized == 'true'
+        id: check
+        env:
+          AWS_ACCESS_KEY_ID: ${{ secrets.AWS_ACCESS_KEY_ID }}
+          AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
+          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
+        run: |
+          # Bulletproof PR number extraction
+          if [[ -n "${{ github.event.pull_request.number }}" ]]; then
+            PR_NUM="${{ github.event.pull_request.number }}"
+          elif [[ -n "${{ github.event.inputs.pr_number }}" ]]; then
+            PR_NUM="${{ github.event.inputs.pr_number }}"
+          else
+            echo "❌ No PR number found in event or inputs"
+            exit 1
+          fi
+
+          echo "Using PR number: $PR_NUM"
+
+          # Run sync check-only with optional SHA override
+          if [[ -n "${{ github.event.inputs.sha }}" ]]; then
+            OUTPUT=$(python -m showtime sync $PR_NUM --check-only --sha "${{ github.event.inputs.sha }}")
+          else
+            OUTPUT=$(python -m showtime sync $PR_NUM --check-only)
+          fi
+          echo "$OUTPUT"
+
+          # Extract the outputs we need for conditional steps
+          BUILD=$(echo "$OUTPUT" | grep "build_needed=" | cut -d'=' -f2)
+          SYNC=$(echo "$OUTPUT" | grep "sync_needed=" | cut -d'=' -f2)
+          PR_NUM_OUT=$(echo "$OUTPUT" | grep "pr_number=" | cut -d'=' -f2)
+          TARGET_SHA=$(echo "$OUTPUT" | grep "target_sha=" | cut -d'=' -f2)
+
+          echo "build_needed=$BUILD" >> $GITHUB_OUTPUT
+          echo "sync_needed=$SYNC" >> $GITHUB_OUTPUT
+          echo "pr_number=$PR_NUM_OUT" >> $GITHUB_OUTPUT
+          echo "target_sha=$TARGET_SHA" >> $GITHUB_OUTPUT
+
+      - name: Checkout PR code (only if build needed)
+        if: steps.auth.outputs.authorized == 'true' && steps.check.outputs.build_needed == 'true'
+        uses: actions/checkout@v5
+        with:
+          ref: ${{ steps.check.outputs.target_sha }}
+          persist-credentials: false
+
+      - name: Setup Docker Environment (only if build needed)
+        if: steps.auth.outputs.authorized == 'true' && steps.check.outputs.build_needed == 'true'
+        uses: ./.github/actions/setup-docker
+        with:
+          dockerhub-user: ${{ secrets.DOCKERHUB_USER }}
+          dockerhub-token: ${{ secrets.DOCKERHUB_TOKEN }}
+          build: "true"
+          install-docker-compose: "false"
+
+      - name: Execute sync (handles everything)
+        if: steps.auth.outputs.authorized == 'true' && steps.check.outputs.sync_needed == 'true'
+        env:
+          AWS_ACCESS_KEY_ID: ${{ secrets.AWS_ACCESS_KEY_ID }}
+          AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
+          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
+          DOCKERHUB_USER: ${{ secrets.DOCKERHUB_USER }}
+          DOCKERHUB_TOKEN: ${{ secrets.DOCKERHUB_TOKEN }}
+        run: |
+          PR_NUM="${{ steps.check.outputs.pr_number }}"
+          TARGET_SHA="${{ steps.check.outputs.target_sha }}"
+          if [[ -n "$TARGET_SHA" ]]; then
+            python -m showtime sync $PR_NUM --sha "$TARGET_SHA"
+          else
+            python -m showtime sync $PR_NUM
+          fi
diff --git a/.github/workflows/superset-cli.yml b/.github/workflows/superset-app-cli.yml
similarity index 96%
rename from .github/workflows/superset-cli.yml
rename to .github/workflows/superset-app-cli.yml
index 1fe02f30e94..d9dd761218c 100644
--- a/.github/workflows/superset-cli.yml
+++ b/.github/workflows/superset-app-cli.yml
@@ -1,4 +1,4 @@
-name: Superset CLI tests
+name: Superset App CLI tests

 on:
   push:
@@ -37,7 +37,7 @@ jobs:
         - 16379:6379
     steps:
       - name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
-        uses: actions/checkout@v4
+        uses: actions/checkout@v5
         with:
           persist-credentials: false
           submodules: recursive
diff --git a/.github/workflows/superset-applitool-cypress.yml b/.github/workflows/superset-applitool-cypress.yml
index 42b5d49a349..071772f14a6 100644
--- a/.github/workflows/superset-applitool-cypress.yml
+++ b/.github/workflows/superset-applitool-cypress.yml
@@ -51,7 +51,7 @@ jobs:
         - 16379:6379
     steps:
       - name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
-        uses: actions/checkout@v4
+        uses: actions/checkout@v5
         with:
           persist-credentials: false
           submodules: recursive
diff --git a/.github/workflows/superset-applitools-storybook.yml b/.github/workflows/superset-applitools-storybook.yml
index 44ed0451ff9..1af39c115bf 100644
--- a/.github/workflows/superset-applitools-storybook.yml
+++ b/.github/workflows/superset-applitools-storybook.yml
@@ -30,7 +30,7 @@ jobs:
     runs-on: ubuntu-24.04
     steps:
       - name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
-        uses: actions/checkout@v4
+        uses: actions/checkout@v5
         with:
           persist-credentials: false
           submodules: recursive
diff --git a/.github/workflows/superset-docs-deploy.yml b/.github/workflows/superset-docs-deploy.yml
index b827889eb15..148c0472985 100644
--- a/.github/workflows/superset-docs-deploy.yml
+++ b/.github/workflows/superset-docs-deploy.yml
@@ -31,7 +31,7 @@ jobs:
     runs-on: ubuntu-24.04
     steps:
       - name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
-        uses: actions/checkout@v4
+        uses: actions/checkout@v5
         with:
           persist-credentials: false
           submodules: recursive
@@ -41,7 +41,7 @@ jobs:
           node-version-file: './docs/.nvmrc'
       - name: Setup Python
         uses: ./.github/actions/setup-backend/
-      - uses: actions/setup-java@v4
+      - uses: actions/setup-java@v5
         with:
           distribution: 'zulu'
           java-version: '21'
diff --git a/.github/workflows/superset-docs-verify.yml b/.github/workflows/superset-docs-verify.yml
index 2862541e32d..1bb81d55b55 100644
--- a/.github/workflows/superset-docs-verify.yml
+++ b/.github/workflows/superset-docs-verify.yml
@@ -18,7 +18,7 @@ jobs:
     name: Link Checking
     runs-on: ubuntu-latest
     steps:
-      - uses: actions/checkout@v4
+      - uses: actions/checkout@v5
       # Do not bump this linkinator-action version without opening
       # an ASF Infra ticket to allow the new version first!
       - uses: JustinBeckwith/linkinator-action@v1.11.0
@@ -56,7 +56,7 @@ jobs:
         working-directory: docs
     steps:
       - name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
-        uses: actions/checkout@v4
+        uses: actions/checkout@v5
         with:
           persist-credentials: false
           submodules: recursive
diff --git a/.github/workflows/superset-e2e.yml b/.github/workflows/superset-e2e.yml
index 7c77f323df6..cc1564b4f42 100644
--- a/.github/workflows/superset-e2e.yml
+++ b/.github/workflows/superset-e2e.yml
@@ -69,21 +69,21 @@ jobs:
       # Conditional checkout based on context
       - name: Checkout for push or pull_request event
         if: github.event_name == 'push' || github.event_name == 'pull_request'
-        uses: actions/checkout@v4
+        uses: actions/checkout@v5
         with:
           persist-credentials: false
           submodules: recursive
           ref: ${{ github.event_name == 'pull_request' && github.event.pull_request.head.sha || github.sha }}
       - name: Checkout using ref (workflow_dispatch)
         if: github.event_name == 'workflow_dispatch' && github.event.inputs.ref != ''
-        uses: actions/checkout@v4
+        uses: actions/checkout@v5
         with:
           persist-credentials: false
           ref: ${{ github.event.inputs.ref }}
           submodules: recursive
       - name: Checkout using PR ID (workflow_dispatch)
         if: github.event_name == 'workflow_dispatch' && github.event.inputs.pr_id != ''
-        uses: actions/checkout@v4
+        uses: actions/checkout@v5
         with:
           persist-credentials: false
           ref: refs/pull/${{ github.event.inputs.pr_id }}/merge
diff --git a/.github/workflows/superset-extensions-cli.yml b/.github/workflows/superset-extensions-cli.yml
new file mode 100644
index 00000000000..c2a654cbb7c
--- /dev/null
+++ b/.github/workflows/superset-extensions-cli.yml
@@ -0,0 +1,64 @@
+name: Superset Extensions CLI Package Tests
+
+on:
+  push:
+    branches:
+      - "master"
+      - "[0-9].[0-9]*"
+  pull_request:
+    types: [synchronize, opened, reopened, ready_for_review]
+
+# cancel previous workflow jobs for PRs
+concurrency:
+  group: ${{ github.workflow }}-${{ github.event.pull_request.number || github.run_id }}
+  cancel-in-progress: true
+
+jobs:
+  test-superset-extensions-cli-package:
+    runs-on: ubuntu-24.04
+    strategy:
+      matrix:
+        python-version: ["previous", "current", "next"]
+    defaults:
+      run:
+        working-directory: superset-extensions-cli
+    steps:
+      - name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
+        uses: actions/checkout@v5
+        with:
+          persist-credentials: false
+          submodules: recursive
+
+      - name: Check for file changes
+        id: check
+        uses: ./.github/actions/change-detector/
+        with:
+          token: ${{ secrets.GITHUB_TOKEN }}
+
+      - name: Setup Python
+        if: steps.check.outputs.superset-extensions-cli
+        uses: ./.github/actions/setup-backend/
+        with:
+          python-version: ${{ matrix.python-version }}
+          requirements-type: dev
+
+      - name: Run pytest with coverage
+        if: steps.check.outputs.superset-extensions-cli
+        run: |
+          pytest --cov=superset_extensions_cli --cov-report=xml --cov-report=term-missing --cov-report=html -v --tb=short
+
+      - name: Upload coverage reports to Codecov
+        if: steps.check.outputs.superset-extensions-cli
+        uses: codecov/codecov-action@v5
+        with:
+          file: ./coverage.xml
+          flags: superset-extensions-cli
+          name: superset-extensions-cli-coverage
+          fail_ci_if_error: false
+
+      - name: Upload HTML coverage report
+        if: steps.check.outputs.superset-extensions-cli
+        uses: actions/upload-artifact@v4
+        with:
+          name: superset-extensions-cli-coverage-html
+          path: htmlcov/
diff --git a/.github/workflows/superset-frontend.yml b/.github/workflows/superset-frontend.yml
index 3665c6f0591..308d1f4d6bd 100644
--- a/.github/workflows/superset-frontend.yml
+++ b/.github/workflows/superset-frontend.yml
@@ -23,7 +23,7 @@ jobs:
       should-run: ${{ steps.check.outputs.frontend }}
     steps:
       - name: Checkout Code
-        uses: actions/checkout@v4
+        uses: actions/checkout@v5
         with:
           persist-credentials: false
           fetch-depth: 0
@@ -47,7 +47,7 @@ jobs:
           git show -s --format=raw HEAD
           docker buildx build \
             -t $TAG \
-            --cache-from=type=registry,ref=apache/superset-cache:3.10-slim-bookworm \
+            --cache-from=type=registry,ref=apache/superset-cache:3.10-slim-trixie \
             --target superset-node-ci \
             .
@@ -73,7 +73,7 @@ jobs:
     runs-on: ubuntu-24.04
     steps:
       - name: Download Docker Image Artifact
-        uses: actions/download-artifact@v4
+        uses: actions/download-artifact@v5
         with:
           name: docker-image
@@ -101,7 +101,7 @@ jobs:
     runs-on: ubuntu-24.04
     steps:
       - name: Download Coverage Artifacts
-        uses: actions/download-artifact@v4
+        uses: actions/download-artifact@v5
         with:
           pattern: coverage-artifacts-*
           path: coverage/
@@ -127,7 +127,7 @@ jobs:
     runs-on: ubuntu-24.04
     steps:
       - name: Download Docker Image Artifact
-        uses: actions/download-artifact@v4
+        uses: actions/download-artifact@v5
         with:
           name: docker-image
@@ -151,7 +151,7 @@ jobs:
     runs-on: ubuntu-24.04
     steps:
       - name: Download Docker Image Artifact
-        uses: actions/download-artifact@v4
+        uses: actions/download-artifact@v5
         with:
           name: docker-image
diff --git a/.github/workflows/superset-helm-lint.yml b/.github/workflows/superset-helm-lint.yml
index b3b1447641f..df4024404b5 100644
--- a/.github/workflows/superset-helm-lint.yml
+++ b/.github/workflows/superset-helm-lint.yml
@@ -16,7 +16,7 @@ jobs:
     runs-on: ubuntu-24.04
     steps:
       - name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
-        uses: actions/checkout@v4
+        uses: actions/checkout@v5
         with:
           persist-credentials: false
           submodules: recursive
diff --git a/.github/workflows/superset-helm-release.yml b/.github/workflows/superset-helm-release.yml
index 639bb4e7204..ef89c37879d 100644
--- a/.github/workflows/superset-helm-release.yml
+++ b/.github/workflows/superset-helm-release.yml
@@ -29,7 +29,7 @@ jobs:

     steps:
       - name: Checkout code
-        uses: actions/checkout@v4
+        uses: actions/checkout@v5
         with:
           ref: ${{ inputs.ref || github.ref_name }}
           persist-credentials: true
diff --git a/.github/workflows/superset-python-integrationtest.yml b/.github/workflows/superset-python-integrationtest.yml
index 03da74abbfd..36c48099dec 100644
--- a/.github/workflows/superset-python-integrationtest.yml
+++ b/.github/workflows/superset-python-integrationtest.yml
@@ -41,7 +41,7 @@ jobs:
         - 16379:6379
     steps:
       - name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
-        uses: actions/checkout@v4
+        uses: actions/checkout@v5
         with:
           persist-credentials: false
           submodules: recursive
@@ -99,7 +99,7 @@ jobs:
         - 16379:6379
     steps:
       - name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
-        uses: actions/checkout@v4
+        uses: actions/checkout@v5
         with:
           persist-credentials: false
           submodules: recursive
@@ -152,7 +152,7 @@ jobs:
         - 16379:6379
     steps:
       - name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
-        uses: actions/checkout@v4
+        uses: actions/checkout@v5
         with:
           persist-credentials: false
           submodules: recursive
diff --git a/.github/workflows/superset-python-presto-hive.yml b/.github/workflows/superset-python-presto-hive.yml
index 36e2bb581b5..8090f4cbb81 100644
--- a/.github/workflows/superset-python-presto-hive.yml
+++ b/.github/workflows/superset-python-presto-hive.yml
@@ -48,7 +48,7 @@ jobs:
         - 16379:6379
     steps:
       - name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
-        uses: actions/checkout@v4
+        uses: actions/checkout@v5
         with:
           persist-credentials: false
           submodules: recursive
@@ -108,7 +108,7 @@ jobs:
         - 16379:6379
     steps:
       - name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
-        uses: actions/checkout@v4
+        uses: actions/checkout@v5
         with:
           persist-credentials: false
           submodules: recursive
diff --git a/.github/workflows/superset-python-unittest.yml b/.github/workflows/superset-python-unittest.yml
index 615993164d8..b88de912557 100644
--- a/.github/workflows/superset-python-unittest.yml
+++ b/.github/workflows/superset-python-unittest.yml
@@ -24,7 +24,7 @@ jobs:
       PYTHONPATH: ${{ github.workspace }}
     steps:
       - name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
-        uses: actions/checkout@v4
+        uses: actions/checkout@v5
         with:
           persist-credentials: false
           submodules: recursive
@@ -51,7 +51,7 @@ jobs:
           SUPERSET_TESTENV: true
           SUPERSET_SECRET_KEY: not-a-secret
         run: |
-          pytest --durations-min=0.5 --cov-report= --cov=superset/sql/ ./tests/unit_tests/sql/ --cache-clear --cov-fail-under=100
+          pytest --durations-min=0.5 --cov=superset/sql/ ./tests/unit_tests/sql/ --cache-clear --cov-fail-under=100
       - name: Upload code coverage
         uses: codecov/codecov-action@v5
         with:
diff --git a/.github/workflows/superset-translations.yml b/.github/workflows/superset-translations.yml
index 685d35cfe67..731365d3135 100644
--- a/.github/workflows/superset-translations.yml
+++ b/.github/workflows/superset-translations.yml
@@ -18,7 +18,7 @@ jobs:
     runs-on: ubuntu-24.04
     steps:
       - name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
-        uses: actions/checkout@v4
+        uses: actions/checkout@v5
         with:
           persist-credentials: false
           submodules: recursive
@@ -49,7 +49,7 @@ jobs:
     runs-on: ubuntu-24.04
     steps:
       - name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
-        uses: actions/checkout@v4
+        uses: actions/checkout@v5
         with:
           persist-credentials: false
           submodules: recursive
diff --git a/.github/workflows/superset-websocket.yml b/.github/workflows/superset-websocket.yml
index ce7ec50a92f..fe150be008e 100644
--- a/.github/workflows/superset-websocket.yml
+++ b/.github/workflows/superset-websocket.yml
@@ -21,7 +21,7 @@ jobs:
     runs-on: ubuntu-24.04
     steps:
      - name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
-        uses: actions/checkout@v4
+        uses: actions/checkout@v5
         with:
           persist-credentials: false
       - name: Install dependencies
diff --git a/.github/workflows/supersetbot.yml b/.github/workflows/supersetbot.yml
index b78fc743023..84076bb62f3 100644
--- a/.github/workflows/supersetbot.yml
+++ b/.github/workflows/supersetbot.yml
@@ -38,7 +38,7 @@ jobs:
           });

       - name: "Checkout ( ${{ github.sha }} )"
-        uses: actions/checkout@v4
+        uses: actions/checkout@v5
         with:
           persist-credentials: false
diff --git a/.github/workflows/tag-release.yml b/.github/workflows/tag-release.yml
index 8161304a1c6..a62f90b38ed 100644
--- a/.github/workflows/tag-release.yml
+++ b/.github/workflows/tag-release.yml
@@ -42,12 +42,12 @@ jobs:
     runs-on: ubuntu-24.04
     strategy:
       matrix:
-        build_preset: ["dev", "lean", "py310", "websocket", "dockerize", "py311"]
+        build_preset: ["dev", "lean", "py310", "websocket", "dockerize", "py311", "py312"]
       fail-fast: false

     steps:
       - name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
-        uses: actions/checkout@v4
+        uses: actions/checkout@v5
         with:
           fetch-depth: 0
@@ -107,7 +107,7 @@ jobs:
     steps:
       - name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
-        uses: actions/checkout@v4
+        uses: actions/checkout@v5
         with:
           fetch-depth: 0
diff --git a/.github/workflows/tech-debt.yml b/.github/workflows/tech-debt.yml
index 8c7942b7266..04dc907cf5d 100644
--- a/.github/workflows/tech-debt.yml
+++ b/.github/workflows/tech-debt.yml
@@ -27,7 +27,7 @@ jobs:
     name: Generate Reports
     steps:
       - name: Checkout Repository
-        uses: actions/checkout@v4
+        uses: actions/checkout@v5

       - name: Set up Node.js
         uses: actions/setup-node@v4
diff --git a/.github/workflows/welcome-new-users.yml b/.github/workflows/welcome-new-users.yml
index f973a243c62..e4a05700612 100644
--- a/.github/workflows/welcome-new-users.yml
+++ b/.github/workflows/welcome-new-users.yml
@@ -12,7 +12,7 @@ jobs:

     steps:
       - name: Welcome Message
-        uses: actions/first-interaction@v1
+        uses: actions/first-interaction@v3
         continue-on-error: true
         with:
           repo-token: ${{ github.token }}
diff --git a/.gitignore b/.gitignore
index 2f649d941f8..9c9fc39d173 100644
--- a/.gitignore
+++ b/.gitignore
@@ -43,7 +43,7 @@ _modules
 _static
 build
 app.db
-apache_superset.egg-info/
+*.egg-info/
 changelog.sh
 dist
 dump.rdb
@@ -92,6 +92,7 @@ scripts/*.zip
 # IntelliJ
 *.iml
 venv
+.venv
 @eaDir/

 # PyCharm
@@ -126,4 +127,10 @@ docker/*local*
 # Jest test report
 test-report.html
 superset/static/stats/statistics.html
+
+# LLM-related
+CLAUDE.local.md
+PROJECT.md
 .aider*
+.claude_rc*
+.env.local
diff --git a/.pre-commit-config.yaml b/.pre-commit-config.yaml
index 0b93b4f96f6..1a2a799bb9e 100644
--- a/.pre-commit-config.yaml
+++ b/.pre-commit-config.yaml
@@ -23,7 +23,9 @@ repos:
     rev: v1.15.0
     hooks:
       - id: mypy
+        name: mypy (main)
         args: [--check-untyped-defs]
+        exclude: ^superset-extensions-cli/
         additional_dependencies: [
           types-simplejson,
           types-python-dateutil,
@@ -38,6 +40,10 @@ repos:
           types-paramiko,
           types-Markdown,
         ]
+      - id: mypy
+        name: mypy (superset-extensions-cli)
+        args: [--check-untyped-defs]
+        files: ^superset-extensions-cli/
   - repo: https://github.com/pre-commit/pre-commit-hooks
     rev: v5.0.0
     hooks:
@@ -52,35 +58,27 @@ repos:
       - id: trailing-whitespace
         exclude: ^.*\.(snap)
         args: ["--markdown-linebreak-ext=md"]
-  - repo: https://github.com/pre-commit/mirrors-prettier
-    rev: v4.0.0-alpha.8 # Use the sha or tag you want to point at
-    hooks:
-      - id: prettier
-        additional_dependencies:
-          - prettier@3.5.3
-        args: ["--ignore-path=./superset-frontend/.prettierignore", "--exclude", "site-packages"]
-        files: "superset-frontend"
   - repo: local
     hooks:
-    - id: eslint-frontend
-      name: eslint (frontend)
-      entry: ./scripts/eslint.sh
-      language: system
-      pass_filenames: true
-      files: ^superset-frontend/.*\.(js|jsx|ts|tsx)$
-    - id: eslint-docs
-      name: eslint (docs)
-      entry: bash -c 'cd docs && FILES=$(echo "$@" | sed "s|docs/||g") && yarn eslint --fix --ext .js,.jsx,.ts,.tsx --quiet $FILES'
-      language: system
-      pass_filenames: true
-      files: ^docs/.*\.(js|jsx|ts|tsx)$
-    - id: type-checking-frontend
-      name: Type-Checking (Frontend)
-      entry: bash -c './scripts/check-type.js package=superset-frontend excludeDeclarationDir=cypress-base'
-      language: system
-      files: ^superset-frontend\/.*\.(js|jsx|ts|tsx)$
-      exclude: ^superset-frontend/cypress-base\/
-      require_serial: true
+      - id: eslint-frontend
+        name: eslint (frontend)
+        entry: ./scripts/eslint.sh
+        language: system
+        pass_filenames: true
+        files: ^superset-frontend/.*\.(js|jsx|ts|tsx)$
+      - id: eslint-docs
+        name: eslint (docs)
+        entry: bash -c 'cd docs && FILES=$(echo "$@" | sed "s|docs/||g") && yarn eslint --fix --ext .js,.jsx,.ts,.tsx --quiet $FILES'
+        language: system
+        pass_filenames: true
+        files: ^docs/.*\.(js|jsx|ts|tsx)$
+      - id: type-checking-frontend
+        name: Type-Checking (Frontend)
+        entry: ./scripts/check-type.js package=superset-frontend excludeDeclarationDir=cypress-base
+        language: system
+        files: ^superset-frontend\/.*\.(js|jsx|ts|tsx)$
+        exclude: ^superset-frontend/cypress-base\/
+        require_serial: true
 # blacklist unsafe functions like make_url (see #19526)
   - repo: https://github.com/skorokithakis/blacklist-pre-commit-hook
     rev: e2f070289d8eddcaec0b580d3bde29437e7c8221
@@ -97,25 +95,26 @@ repos:
   - repo: https://github.com/astral-sh/ruff-pre-commit
     rev: v0.9.7
     hooks:
+      - id: ruff-format
       - id: ruff
         args: [--fix]
-      - id: ruff-format
   - repo: local
     hooks:
-    - id: pylint
-      name: pylint with custom Superset plugins
-      entry: bash
-      language: system
-      types: [python]
-      exclude: ^(tests/|superset/migrations/|scripts/|RELEASING/|docker/)
-      args:
-        - -c
-        - |
-          TARGET_BRANCH=${GITHUB_BASE_REF:-master}
-          git fetch origin "$TARGET_BRANCH"
-          files=$(git diff --name-only --diff-filter=ACM origin/"$TARGET_BRANCH"..HEAD | grep '^superset/.*\.py$' || true)
-          if [ -n "$files" ]; then
-            pylint --rcfile=.pylintrc --load-plugins=superset.extensions.pylint $files
-          else
-            echo "No Python files to lint."
-          fi
+      - id: pylint
+        name: pylint with custom Superset plugins
+        entry: bash
+        language: system
+        types: [python]
+        exclude: ^(tests/|superset/migrations/|scripts/|RELEASING/|docker/)
+        args:
+          - -c
+          - |
+            TARGET_BRANCH=${GITHUB_BASE_REF:-master}
+            git fetch origin "$TARGET_BRANCH"
+            BASE=$(git merge-base origin/"$TARGET_BRANCH" HEAD)
+            files=$(git diff --name-only --diff-filter=ACM "$BASE"..HEAD | grep '^superset/.*\.py$' || true)
+            if [ -n "$files" ]; then
+              pylint --rcfile=.pylintrc --load-plugins=superset.extensions.pylint --reports=no $files
+            else
+              echo "No Python files to lint."
+            fi
diff --git a/.rat-excludes b/.rat-excludes
index 4ed267f7ccb..6d37fdd7a91 100644
--- a/.rat-excludes
+++ b/.rat-excludes
@@ -32,6 +32,8 @@ apache_superset.egg-info
 # json and csv in general cannot have comments
 .*json
 .*csv
+# jinja templates often need to be as-is
+.*j2
 # Generated doc files
 env/*
 docs/.htaccess*
@@ -71,8 +73,17 @@ ibm-db2.svg
 postgresql.svg
 snowflake.svg
 ydb.svg
+loading.svg

 # docs-related
 erd.puml
 erd.svg
 intro_header.txt
+
+# for LLMs
+llm-context.md
+LLMS.md
+CLAUDE.md
+CURSOR.md
+GEMINI.md
+GPT.md
diff --git a/CHANGELOG.md b/CHANGELOG.md
index 9e8f17c823d..99198909fbd 100644
--- a/CHANGELOG.md
+++ b/CHANGELOG.md
@@ -44,4 +44,8 @@ under the License.
 - [4.0.1](./CHANGELOG/4.0.1.md)
 - [4.0.2](./CHANGELOG/4.0.2.md)
 - [4.1.0](./CHANGELOG/4.1.0.md)
+- [4.1.1](./CHANGELOG/4.1.1.md)
+- [4.1.2](./CHANGELOG/4.1.2.md)
+- [4.1.3](./CHANGELOG/4.1.3.md)
+- [4.1.4](./CHANGELOG/4.1.4.md)
 - [5.0.0](./CHANGELOG/5.0.0.md)
diff --git a/CHANGELOG/4.1.3.md b/CHANGELOG/4.1.3.md
new file mode 100644
index 00000000000..d1514548775
--- /dev/null
+++ b/CHANGELOG/4.1.3.md
@@ -0,0 +1,58 @@
+
+
+## Change Log
+
+### 4.1.3 (Thu May 29 02:31:07 2025 -0500)
+
+**Database Migrations**
+
+**Features**
+
+**Fixes**
+
+- [#33522](https://github.com/apache/superset/pull/33522) fix(Sqllab): Autocomplete got stuck in UI when open it too fast (@rebenitez1802)
+- [#33425](https://github.com/apache/superset/pull/33425) fix(table-chart): time shift is not working (@justinpark)
+- [#32414](https://github.com/apache/superset/pull/32414) fix(api): Added uuid to list api calls (@withnale)
+- [#33354](https://github.com/apache/superset/pull/33354) fix: loading examples from raw.githubusercontent.com fails with 429 errors (@mistercrunch)
+- [#32382](https://github.com/apache/superset/pull/32382) fix(pinot): revert join and subquery flags (@yuribogomolov)
+- [#32473](https://github.com/apache/superset/pull/32473) fix(plugin-chart-echarts): remove erroneous upper bound value (@villebro)
+- [#33048](https://github.com/apache/superset/pull/33048) fix: improve error type on parse error (@justinpark)
+- [#32968](https://github.com/apache/superset/pull/32968) fix(pivot-table): Revert "fix(Pivot Table): Fix column width to respect currency config (#31414)" (@justinpark)
+- [#32795](https://github.com/apache/superset/pull/32795) fix(log): store navigation path to get correct logging path (@justinpark)
+- [#33216](https://github.com/apache/superset/pull/33216) fix: Downgrade to marshmallow<4 (@amotl)
+- [#32866](https://github.com/apache/superset/pull/32866) fix: make packages PEP 625 compliant (@sadpandajoe)
+- [#32035](https://github.com/apache/superset/pull/32035) fix(fe/dashboard-list): display modifier info for `Last modified` data (@hainenber)
+- [#32708](https://github.com/apache/superset/pull/32708) fix(logging): missing path in event data (@justinpark)
+- [#32699](https://github.com/apache/superset/pull/32699) fix: Signature of Celery pruner jobs (@michael-s-molina)
+- [#32681](https://github.com/apache/superset/pull/32681) fix(log): Update recent_activity by event name (@justinpark)
+- [#32608](https://github.com/apache/superset/pull/32608) fix(welcome): perf on distinct recent activities (@justinpark)
+- [#32572](https://github.com/apache/superset/pull/32572) fix: Log table retention policy (@michael-s-molina)
+- [#32406](https://github.com/apache/superset/pull/32406) fix(model/helper): represent RLS filter clause in proper textual SQL string (@hainenber)
+- [#32240](https://github.com/apache/superset/pull/32240) fix: upgrade to 3.11.11-slim-bookworm to address critical vulnerabilities (@gpchandran)
+- [#30858](https://github.com/apache/superset/pull/30858) fix(chart data): removing query from /chart/data payload when accessing as guest user (@fisjac)
+
+**Others**
+
+- [#33612](https://github.com/apache/superset/pull/33612) chore: update Dockerfile - Upgrade to 3.11.12 (@gpchandran)
+- [#33435](https://github.com/apache/superset/pull/33435)
docs: CVEs fixed on 4.1.2 (@sha174n) +- [#33339](https://github.com/apache/superset/pull/33339) chore(🦾): bump python h11 0.14.0 -> 0.16.0 (@github-actions[bot]) +- [#32745](https://github.com/apache/superset/pull/32745) chore(🦾): bump python sqlglot 26.1.3 -> 26.11.1 (@github-actions[bot]) +- [#32782](https://github.com/apache/superset/pull/32782) chore: Revert "chore: bump base image in Dockerfile with `ARG PY_VER=3.11.11-slim-bookworm`" (@sadpandajoe) +- [#32780](https://github.com/apache/superset/pull/32780) chore: bump base image in Dockerfile with `ARG PY_VER=3.11.11-slim-bookworm` (@gpchandran) diff --git a/CHANGELOG/4.1.4.md b/CHANGELOG/4.1.4.md new file mode 100644 index 00000000000..31c98ee3ee6 --- /dev/null +++ b/CHANGELOG/4.1.4.md @@ -0,0 +1,33 @@ + + +## Change Log + +### 4.1.4 (Thu Jul 24 08:30:04 2025 -0300) + +**Database Migrations** + +**Features** + +**Fixes** +- [#34289](https://github.com/apache/superset/pull/34289) fix: Saved queries list break if one query can't be parsed (@michael-s-molina) +- [#33059](https://github.com/apache/superset/pull/33059) fix: Adds missing __init__ file to commands/logs (@michael-s-molina) + +**Others** +- [#32236](https://github.com/apache/superset/pull/32236) chore(deps): bump cryptography from 43.0.3 to 44.0.1 (@dependabot[bot]) diff --git a/CLAUDE.md b/CLAUDE.md new file mode 120000 index 00000000000..f811a97b81a --- /dev/null +++ b/CLAUDE.md @@ -0,0 +1 @@ +LLMS.md \ No newline at end of file diff --git a/CONTRIBUTING.md b/CONTRIBUTING.md index 5dbf90157f8..06ed3b146c5 100644 --- a/CONTRIBUTING.md +++ b/CONTRIBUTING.md @@ -5,7 +5,7 @@ regarding copyright ownership. The ASF licenses this file to you under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance - with the License. You may obtain a copy of the License at + with the License. 
You may obtain a copy of the License at http://www.apache.org/licenses/LICENSE-2.0 diff --git a/Dockerfile b/Dockerfile index 223746eac98..1b689c2319c 100644 --- a/Dockerfile +++ b/Dockerfile @@ -18,7 +18,7 @@ ###################################################################### # Node stage to deal with static asset construction ###################################################################### -ARG PY_VER=3.11.13-slim-bookworm +ARG PY_VER=3.11.13-slim-trixie # If BUILDPLATFORM is null, set it to 'amd64' (or leave as is otherwise). ARG BUILDPLATFORM=${BUILDPLATFORM:-amd64} @@ -29,7 +29,7 @@ ARG BUILD_TRANSLATIONS="false" ###################################################################### # superset-node-ci used as a base for building frontend assets and CI ###################################################################### -FROM --platform=${BUILDPLATFORM} node:20-bookworm-slim AS superset-node-ci +FROM --platform=${BUILDPLATFORM} node:20-trixie-slim AS superset-node-ci ARG BUILD_TRANSLATIONS ENV BUILD_TRANSLATIONS=${BUILD_TRANSLATIONS} ARG DEV_MODE="false" # Skip frontend build in dev mode @@ -59,12 +59,12 @@ RUN mkdir -p /app/superset/static/assets \ # NOTE: we mount packages and plugins as they are referenced in package.json as workspaces # ideally we'd COPY only their package.json. Here npm ci will be cached as long # as the full content of these folders don't change, yielding a decent cache reuse rate. -# Note that's it's not possible selectively COPY of mount using blobs. +# Note that it's not possible to selectively COPY or mount using blobs. 
RUN --mount=type=bind,source=./superset-frontend/package.json,target=./package.json \ --mount=type=bind,source=./superset-frontend/package-lock.json,target=./package-lock.json \ --mount=type=cache,target=/root/.cache \ --mount=type=cache,target=/root/.npm \ - if [ "$DEV_MODE" = "false" ]; then \ + if [ "${DEV_MODE}" = "false" ]; then \ npm ci; \ else \ echo "Skipping 'npm ci' in dev mode"; \ @@ -74,13 +74,13 @@ RUN --mount=type=bind,source=./superset-frontend/package.json,target=./package.j COPY superset-frontend /app/superset-frontend ###################################################################### -# superset-node used for compile frontend assets +# superset-node is used for compiling frontend assets ###################################################################### FROM superset-node-ci AS superset-node # Build the frontend if not in dev mode RUN --mount=type=cache,target=/root/.npm \ - if [ "$DEV_MODE" = "false" ]; then \ + if [ "${DEV_MODE}" = "false" ]; then \ echo "Running 'npm run ${BUILD_CMD}'"; \ npm run ${BUILD_CMD}; \ else \ @@ -90,12 +90,11 @@ RUN --mount=type=cache,target=/root/.npm \ # Copy translation files COPY superset/translations /app/superset/translations -# Build the frontend if not in dev mode -RUN if [ "$BUILD_TRANSLATIONS" = "true" ]; then \ +# Build translations if enabled, then cleanup localization files +RUN if [ "${BUILD_TRANSLATIONS}" = "true" ]; then \ npm run build-translation; \ fi; \ - rm -rf /app/superset/translations/*/*/*.po; \ - rm -rf /app/superset/translations/*/*/*.mo; + rm -rf /app/superset/translations/*/*/*.po /app/superset/translations/*/*/*.mo; ###################################################################### @@ -106,10 +105,10 @@ FROM python:${PY_VER} AS python-base ARG SUPERSET_HOME="/app/superset_home" ENV SUPERSET_HOME=${SUPERSET_HOME} -RUN mkdir -p $SUPERSET_HOME +RUN mkdir -p ${SUPERSET_HOME} RUN useradd --user-group -d ${SUPERSET_HOME} -m --no-log-init --shell /bin/bash superset \ - && chmod -R 1777 $SUPERSET_HOME \ - && chown -R superset:superset $SUPERSET_HOME + && chmod -R 1777 ${SUPERSET_HOME} + && chown -R superset:superset ${SUPERSET_HOME} # Some bash scripts needed throughout the layers COPY --chmod=755 docker/*.sh /app/docker/ @@ -134,17 +133,19 @@ RUN --mount=type=cache,target=/root/.cache/uv \ . /app/.venv/bin/activate && /app/docker/pip-install.sh --requires-build-essential -r requirements/translations.txt COPY superset/translations/ /app/translations_mo/ -RUN if [ "$BUILD_TRANSLATIONS" = "true" ]; then \ +RUN if [ "${BUILD_TRANSLATIONS}" = "true" ]; then \ pybabel compile -d /app/translations_mo | true; \ fi; \ - rm -f /app/translations_mo/*/*/*.po; \ - rm -f /app/translations_mo/*/*/*.json; + rm -f /app/translations_mo/*/*/*.po /app/translations_mo/*/*/*.json ###################################################################### # Python APP common layer ###################################################################### FROM python-base AS python-common +# Build arg to pre-populate examples DuckDB file +ARG LOAD_EXAMPLES_DUCKDB="false" + ENV SUPERSET_HOME="/app/superset_home" \ HOME="/app/superset_home" \ SUPERSET_ENV="production" \ @@ -167,14 +168,14 @@ RUN mkdir -p \ && touch superset/static/version_info.json # Install Playwright and optionally setup headless browsers -ARG INCLUDE_CHROMIUM="true" +ARG INCLUDE_CHROMIUM="false" ARG INCLUDE_FIREFOX="false" RUN --mount=type=cache,target=${SUPERSET_HOME}/.cache/uv \ - if [ "$INCLUDE_CHROMIUM" = "true" ] || [ "$INCLUDE_FIREFOX" = "true" ]; then \ + if [ "${INCLUDE_CHROMIUM}" = "true" ] || [ "${INCLUDE_FIREFOX}" = "true" ]; then \ uv pip install playwright && \ playwright install-deps && \ - if [ "$INCLUDE_CHROMIUM" = "true" ]; then playwright install chromium; fi && \ - if [ "$INCLUDE_FIREFOX" = "true" ]; then playwright install firefox; fi; \ + if [ "${INCLUDE_CHROMIUM}" = "true" ]; then playwright install chromium; fi && \ + if [ "${INCLUDE_FIREFOX}" = "true" ]; then playwright install firefox; fi; \ else \ echo "Skipping browser installation";
\ fi @@ -196,6 +197,18 @@ RUN /app/docker/apt-install.sh \ libecpg-dev \ libldap2-dev +# Pre-load examples DuckDB file if requested +RUN if [ "$LOAD_EXAMPLES_DUCKDB" = "true" ]; then \ + mkdir -p /app/data && \ + echo "Downloading pre-built examples.duckdb..." && \ + curl -L -o /app/data/examples.duckdb \ + "https://raw.githubusercontent.com/apache-superset/examples-data/master/examples.duckdb" && \ + chown -R superset:superset /app/data; \ + else \ + mkdir -p /app/data && \ + chown -R superset:superset /app/data; \ + fi + # Copy compiled things from previous stages COPY --from=superset-node /app/superset/static/assets superset/static/assets @@ -219,11 +232,15 @@ FROM python-common AS lean # Install Python dependencies using docker/pip-install.sh COPY requirements/base.txt requirements/ + +# Copy superset-core package needed for editable install in base.txt +COPY superset-core superset-core + RUN --mount=type=cache,target=${SUPERSET_HOME}/.cache/uv \ /app/docker/pip-install.sh --requires-build-essential -r requirements/base.txt # Install the superset package RUN --mount=type=cache,target=${SUPERSET_HOME}/.cache/uv \ - uv pip install . + uv pip install -e . RUN python -m compileall /app/superset USER superset @@ -241,12 +258,17 @@ RUN /app/docker/apt-install.sh \ # Copy development requirements and install them COPY requirements/*.txt requirements/ + +# Copy local packages needed for editable installs in development.txt +COPY superset-core superset-core +COPY superset-extensions-cli superset-extensions-cli + # Install Python dependencies using docker/pip-install.sh RUN --mount=type=cache,target=${SUPERSET_HOME}/.cache/uv \ /app/docker/pip-install.sh --requires-build-essential -r requirements/development.txt # Install the superset package RUN --mount=type=cache,target=${SUPERSET_HOME}/.cache/uv \ - uv pip install . + uv pip install -e . 
RUN uv pip install .[postgres] RUN python -m compileall /app/superset @@ -258,6 +280,15 @@ USER superset ###################################################################### FROM lean AS ci USER root -RUN uv pip install .[postgres] +RUN uv pip install .[postgres,duckdb] +USER superset +CMD ["/app/docker/entrypoints/docker-ci.sh"] + +###################################################################### +# Showtime image - lean + DuckDB for examples database +###################################################################### +FROM lean AS showtime +USER root +RUN uv pip install .[duckdb] USER superset CMD ["/app/docker/entrypoints/docker-ci.sh"] diff --git a/GEMINI.md b/GEMINI.md new file mode 120000 index 00000000000..f811a97b81a --- /dev/null +++ b/GEMINI.md @@ -0,0 +1 @@ +LLMS.md \ No newline at end of file diff --git a/GPT.md b/GPT.md new file mode 120000 index 00000000000..f811a97b81a --- /dev/null +++ b/GPT.md @@ -0,0 +1 @@ +LLMS.md \ No newline at end of file diff --git a/LLMS.md b/LLMS.md new file mode 100644 index 00000000000..ad3fbf22741 --- /dev/null +++ b/LLMS.md @@ -0,0 +1,194 @@ +# LLM Context Guide for Apache Superset + +Apache Superset is a data visualization platform with Flask/Python backend and React/TypeScript frontend. 
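One shell detail worth flagging from the Dockerfile translation-cleanup hunks above: a bracket pattern such as `*.[po,mo]` looks like brace alternation (`*.{po,mo}`), but in glob syntax it is a character class matching a single character from `p`, `o`, `,`, `m`, so it silently misses `.po`/`.mo` files; explicit per-extension globs are the safe form. Python's `fnmatch` module follows the same glob rules, which makes the difference easy to demonstrate (the file names below are purely illustrative):

```python
import fnmatch

# Hypothetical artifacts resembling the translation files cleaned up above.
files = ["messages.po", "messages.mo", "messages.json", "partial.p"]

# [po,mo] is a character class (one of 'p', 'o', ',', 'm'), not the
# alternation {po,mo} it resembles, so .po/.mo files are NOT matched:
print(fnmatch.filter(files, "*.[po,mo]"))  # ['partial.p']

# Per-extension patterns match the files actually intended for deletion:
print(fnmatch.filter(files, "*.po") + fnmatch.filter(files, "*.mo"))
# ['messages.po', 'messages.mo']
```

The same reasoning applies to a pattern like `*.[po,json]`: the bracket only ever consumes one trailing character, whatever its comma-separated look suggests.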
+ +## ⚠️ CRITICAL: Ongoing Refactors (What NOT to Do) + +**These migrations are actively happening - avoid deprecated patterns:** + +### Frontend Modernization +- **NO `any` types** - Use proper TypeScript types +- **NO JavaScript files** - Convert to TypeScript (.ts/.tsx) +- **Use @superset-ui/core** - Don't import Ant Design directly, prefer Ant Design component wrappers from @superset-ui/core/components +- **Use antd theming tokens** - Prefer antd tokens over legacy theming tokens +- **Avoid custom css and styles** - Follow antd best practices and avoid styling and custom CSS whenever possible + +### Testing Strategy Migration +- **Prefer unit tests** over integration tests +- **Prefer integration tests** over Cypress end-to-end tests +- **Cypress is last resort** - Actively moving away from Cypress +- **Use Jest + React Testing Library** for component testing +- **Use `test()` instead of `describe()`** - Follow [avoid nesting when testing](https://kentcdodds.com/blog/avoid-nesting-when-youre-testing) principles + +### Backend Type Safety +- **Add type hints** - All new Python code needs proper typing +- **MyPy compliance** - Run `pre-commit run mypy` to validate +- **SQLAlchemy typing** - Use proper model annotations + +### UUID Migration +- **Prefer UUIDs over auto-incrementing IDs** - New models should use UUID primary keys +- **External API exposure** - Use UUIDs in public APIs instead of internal integer IDs +- **Existing models** - Add UUID fields alongside integer IDs for gradual migration + +## Key Directories + +``` +superset/ +├── superset/ # Python backend (Flask, SQLAlchemy) +│ ├── views/api/ # REST API endpoints +│ ├── models/ # Database models +│ └── connectors/ # Database connections +├── superset-frontend/src/ # React TypeScript frontend +│ ├── components/ # Reusable components +│ ├── explore/ # Chart builder +│ ├── dashboard/ # Dashboard interface +│ └── SqlLab/ # SQL editor +├── superset-frontend/packages/ +│ └── superset-ui-core/ # UI 
component library (USE THIS) +├── tests/ # Python/integration tests +├── docs/ # Documentation (UPDATE FOR CHANGES) +└── UPDATING.md # Breaking changes log +``` + +## Code Standards + +### TypeScript Frontend +- **Avoid `any` types** - Use proper TypeScript, reuse existing types +- **Functional components** with hooks +- **@superset-ui/core** for UI components (not direct antd) +- **Jest** for testing (NO Enzyme) +- **Redux** for global state where it exists, hooks for local + +### Python Backend +- **Type hints required** for all new code +- **MyPy compliant** - run `pre-commit run mypy` +- **SQLAlchemy models** with proper typing +- **pytest** for testing + +### Apache License Headers +- **New files require ASF license headers** - When creating new code files, include the standard Apache Software Foundation license header +- **LLM instruction files are excluded** - Files like LLMS.md, CLAUDE.md, etc. are in `.rat-excludes` to avoid header token overhead + +## Documentation Requirements + +- **docs/**: Update for any user-facing changes +- **UPDATING.md**: Add breaking changes here +- **Docstrings**: Required for new functions/classes + +## Architecture Patterns + +### Security & Features +- **RBAC**: Role-based access via Flask-AppBuilder +- **Feature flags**: Control feature rollouts +- **Row-level security**: SQL-based data access control + +## Test Utilities + +### Python Test Helpers +- **`SupersetTestCase`** - Base class in `tests/integration_tests/base_tests.py` +- **`@with_config`** - Config mocking decorator +- **`@with_feature_flags`** - Feature flag testing +- **`login_as()`, `login_as_admin()`** - Authentication helpers +- **`create_dashboard()`, `create_slice()`** - Data setup utilities + +### TypeScript Test Helpers +- **`superset-frontend/spec/helpers/testing-library.tsx`** - Custom render() with providers +- **`createWrapper()`** - Redux/Router/Theme wrapper +- **`selectOption()`** - Select component helper +- **React Testing Library** - NO Enzyme 
(removed) + +### Test Database Patterns +- **Mock patterns**: Use `MagicMock()` for config objects, avoid `AsyncMock` for synchronous code +- **API tests**: Update expected columns when adding new model fields + +### Running Tests +```bash +# Frontend +npm run test # All tests +npm run test -- filename.test.tsx # Single file + +# Backend +pytest # All tests +pytest tests/unit_tests/specific_test.py # Single file +pytest tests/unit_tests/ # Directory + +# If pytest fails with database/setup issues, ask the user to run test environment setup +``` + +## Environment Validation + +**Quick Setup Check (run this first):** + +```bash +# Verify Superset is running +curl -f http://localhost:8088/health || echo "❌ Setup required - see https://superset.apache.org/docs/contributing/development#working-with-llms" +``` + +**If health checks fail:** +"It appears you aren't set up properly. Please refer to the [Working with LLMs](https://superset.apache.org/docs/contributing/development#working-with-llms) section in the development docs for setup instructions." + +**Key Project Files:** +- `superset-frontend/package.json` - Frontend build scripts (`npm run dev` on port 9000, `npm run test`, `npm run lint`) +- `pyproject.toml` - Python tooling (ruff, mypy configs) +- `requirements/` folder - Python dependencies (base.txt, development.txt) + +## SQLAlchemy Query Best Practices +- **Use negation operator**: `~Model.field` instead of `== False` to avoid ruff E712 errors +- **Example**: `~Model.is_active` instead of `Model.is_active == False` + +## Pre-commit Validation + +**Use pre-commit hooks for quality validation:** + +```bash +# Install hooks +pre-commit install + +# IMPORTANT: Stage your changes first! +git add . 
# Pre-commit only checks staged files + +# Quick validation (faster than --all-files) +pre-commit run # Staged files only +pre-commit run mypy # Python type checking +pre-commit run prettier # Code formatting +pre-commit run eslint # Frontend linting +``` + +**Important pre-commit usage notes:** +- **Stage files first**: Run `git add .` before `pre-commit run` to check only changed files (much faster) +- **Virtual environment**: Activate your Python virtual environment before running pre-commit + ```bash + # Common virtual environment locations (yours may differ): + source .venv/bin/activate # if using .venv + source venv/bin/activate # if using venv + source ~/venvs/superset/bin/activate # if using a central location + ``` + If you get a "command not found" error, ask the user which virtual environment to activate +- **Auto-fixes**: Some hooks auto-fix issues (e.g., trailing whitespace). Re-run after fixes are applied + +## Common File Patterns + +### API Structure +- **`/api.py`** - REST endpoints with decorators and OpenAPI docstrings +- **`/schemas.py`** - Marshmallow validation schemas for OpenAPI spec +- **`/commands/`** - Business logic classes with @transaction() decorators +- **`/models/`** - SQLAlchemy database models +- **OpenAPI docs**: Auto-generated at `/swagger/v1` from docstrings and schemas + +### Migration Files +- **Location**: `superset/migrations/versions/` +- **Naming**: `YYYY-MM-DD_HH-MM_hash_description.py` +- **Utilities**: Use helpers from `superset.migrations.shared.utils` for database compatibility +- **Pattern**: Import utilities instead of raw SQLAlchemy operations + +## Platform-Specific Instructions + +- **[CLAUDE.md](CLAUDE.md)** - For Claude/Anthropic tools +- **[.github/copilot-instructions.md](.github/copilot-instructions.md)** - For GitHub Copilot +- **[GEMINI.md](GEMINI.md)** - For Google Gemini tools +- **[GPT.md](GPT.md)** - For OpenAI/ChatGPT tools +- **[.cursor/rules/dev-standard.mdc](.cursor/rules/dev-standard.mdc)** - For 
Cursor editor + +--- + +**LLM Note**: This codebase is actively modernizing toward full TypeScript and type safety. Always run `pre-commit run` to validate changes. Follow the ongoing refactors section to avoid deprecated patterns. diff --git a/Makefile b/Makefile index 1c9aa80ff53..4a7121fd34f 100644 --- a/Makefile +++ b/Makefile @@ -91,7 +91,7 @@ js-format: cd superset-frontend; npm run prettier flask-app: - flask run -p 8088 --with-threads --reload --debugger + flask run -p 8088 --reload --debugger node-app: cd superset-frontend; npm run dev-server diff --git a/RELEASING/Dockerfile.from_local_tarball b/RELEASING/Dockerfile.from_local_tarball index 3794ed4c80a..c56d75e87dc 100644 --- a/RELEASING/Dockerfile.from_local_tarball +++ b/RELEASING/Dockerfile.from_local_tarball @@ -14,7 +14,7 @@ # See the License for the specific language governing permissions and # limitations under the License. # -FROM python:3.10-slim-bookworm +FROM python:3.10-slim-trixie RUN useradd --user-group --create-home --no-log-init --shell /bin/bash superset diff --git a/RELEASING/Dockerfile.from_svn_tarball b/RELEASING/Dockerfile.from_svn_tarball index 33d0e9451b0..889724863a5 100644 --- a/RELEASING/Dockerfile.from_svn_tarball +++ b/RELEASING/Dockerfile.from_svn_tarball @@ -14,7 +14,7 @@ # See the License for the specific language governing permissions and # limitations under the License. # -FROM python:3.10-slim-bookworm +FROM python:3.10-slim-trixie RUN useradd --user-group --create-home --no-log-init --shell /bin/bash superset diff --git a/RELEASING/Dockerfile.make_docs b/RELEASING/Dockerfile.make_docs index c4bca3bb3a8..e62faa89f8b 100644 --- a/RELEASING/Dockerfile.make_docs +++ b/RELEASING/Dockerfile.make_docs @@ -14,7 +14,7 @@ # See the License for the specific language governing permissions and # limitations under the License. 
# -FROM python:3.10-slim-bookworm +FROM python:3.10-slim-trixie ARG VERSION RUN git clone --depth 1 --branch ${VERSION} https://github.com/apache/superset.git /superset diff --git a/RELEASING/Dockerfile.make_tarball b/RELEASING/Dockerfile.make_tarball index 4e701afd172..e3163273a5e 100644 --- a/RELEASING/Dockerfile.make_tarball +++ b/RELEASING/Dockerfile.make_tarball @@ -14,7 +14,7 @@ # See the License for the specific language governing permissions and # limitations under the License. # -FROM python:3.10-slim-bookworm +FROM python:3.10-slim-trixie RUN apt-get update -y RUN apt-get install -y \ diff --git a/RELEASING/README.md b/RELEASING/README.md index 0d0d8d1be43..af001ad6af9 100644 --- a/RELEASING/README.md +++ b/RELEASING/README.md @@ -469,6 +469,10 @@ an account first if you don't have one, and reference your username while requesting access to push packages. ```bash +# Run this first to make sure you are uploading the right version. +# PyPI does not allow you to delete or retract a release once uploaded. +twine check dist/* + twine upload dist/* ``` @@ -518,6 +522,8 @@ takes the version (ie `3.1.1`), the git reference (any SHA, tag or branch reference), and whether to force the `latest` Docker tag on the generated images. +**NOTE:** If the Docker image isn't built, you'll need to run this [GH action](https://github.com/apache/superset/actions/workflows/tag-release.yml), where you provide it the tag SHA.
+ ### Npm Release You might want to publish the latest @superset-ui release to npm diff --git a/RELEASING/make_tarball.sh b/RELEASING/make_tarball.sh index c4c53f979e3..0dbb9d8866d 100755 --- a/RELEASING/make_tarball.sh +++ b/RELEASING/make_tarball.sh @@ -32,11 +32,10 @@ else SUPERSET_VERSION="${1}" SUPERSET_RC="${2}" SUPERSET_PGP_FULLNAME="${3}" + SUPERSET_VERSION_RC="${SUPERSET_VERSION}rc${SUPERSET_RC}" SUPERSET_RELEASE_RC_TARBALL="apache_superset-${SUPERSET_VERSION_RC}-source.tar.gz" fi -SUPERSET_VERSION_RC="${SUPERSET_VERSION}rc${SUPERSET_RC}" - if [ -z "${SUPERSET_SVN_DEV_PATH}" ]; then SUPERSET_SVN_DEV_PATH="$HOME/svn/superset_dev" fi diff --git a/RESOURCES/FEATURE_FLAGS.md b/RESOURCES/FEATURE_FLAGS.md index 21ae05d547c..9ba7ca44738 100644 --- a/RESOURCES/FEATURE_FLAGS.md +++ b/RESOURCES/FEATURE_FLAGS.md @@ -28,6 +28,7 @@ These features are considered **unfinished** and should only be used on developm [//]: # "PLEASE KEEP THE LIST SORTED ALPHABETICALLY" - ALERT_REPORT_TABS +- DATE_RANGE_TIMESHIFTS_ENABLED - ENABLE_ADVANCED_DATA_TYPES - PRESTO_EXPAND_DATA - SHARE_QUERIES_VIA_KV_STORE diff --git a/RESOURCES/STANDARD_ROLES.md b/RESOURCES/STANDARD_ROLES.md index 207474c9abe..48c80174453 100644 --- a/RESOURCES/STANDARD_ROLES.md +++ b/RESOURCES/STANDARD_ROLES.md @@ -94,9 +94,9 @@ under the License. 
| can available domains on Superset |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O| | can request access on Superset |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O| | can dashboard on Superset |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O| -| can post on TableSchemaView |:heavy_check_mark:|:heavy_check_mark:|O|O| -| can expanded on TableSchemaView |:heavy_check_mark:|:heavy_check_mark:|O|O| -| can delete on TableSchemaView |:heavy_check_mark:|:heavy_check_mark:|O|O| +| can post on TableSchemaView |:heavy_check_mark:|O|O|:heavy_check_mark:| +| can expanded on TableSchemaView |:heavy_check_mark:|O|O|:heavy_check_mark:| +| can delete on TableSchemaView |:heavy_check_mark:|O|O|:heavy_check_mark:| | can get on TabStateView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:| | can post on TabStateView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:| | can delete query on TabStateView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:| diff --git a/UPDATING.md b/UPDATING.md index f36778beb48..08cd483775e 100644 --- a/UPDATING.md +++ b/UPDATING.md @@ -23,12 +23,25 @@ This file documents any backwards-incompatible changes in Superset and assists people when migrating to a new version. ## Next - +- [34871](https://github.com/apache/superset/pull/34871): Fixed Jest test hanging issue from Ant Design v5 upgrade. MessageChannel is now mocked in test environment to prevent rc-overflow from causing Jest to hang. Test environment only - no production impact. +- [34782](https://github.com/apache/superset/pull/34782): Dataset exports now include the dataset ID in their file name (similar to charts and dashboards). If managing assets as code, make sure to rename existing dataset YAMLs to include the ID (and avoid duplicated files). 
+- [34536](https://github.com/apache/superset/pull/34536): The `ENVIRONMENT_TAG_CONFIG` color values have changed to support only Ant Design semantic colors. Update your `superset_config.py`: + - Change `"error.base"` to just `"error"` + - Change any hex color values to one of: `"success"`, `"processing"`, `"error"`, `"warning"`, `"default"` + - Custom colors are no longer supported, to maintain consistency with Ant Design components +- [34561](https://github.com/apache/superset/pull/34561) Added tiled screenshot functionality for Playwright-based reports to handle large dashboards more efficiently. When enabled (default: `SCREENSHOT_TILED_ENABLED = True`), dashboards with 20+ charts or height exceeding 5000px will be captured using multiple viewport-sized tiles and combined into a single image. This improves report generation performance and reliability for large dashboards. +Note: Pillow is now a required dependency (previously optional) to support image processing for tiled screenshots. +The `thumbnails` optional dependency is now deprecated and will be removed in the next major release (7.0). +- [33084](https://github.com/apache/superset/pull/33084) The `DISALLOWED_SQL_FUNCTIONS` configuration now includes additional potentially sensitive database functions across PostgreSQL, MySQL, SQLite, MS SQL Server, and ClickHouse. Existing queries using these functions may now be blocked. Review your SQL Lab queries and dashboards if you encounter "disallowed function" errors after upgrading. +- [34235](https://github.com/apache/superset/pull/34235) CSV exports now use `utf-8-sig` encoding by default to include a UTF-8 BOM, improving compatibility with Excel. +- [34258](https://github.com/apache/superset/pull/34258) Changed the default in the Dockerfile to `INCLUDE_CHROMIUM="false"` (previously `"true"`). This ensures the `lean` layer is lean by default, and people can opt in to the `chromium` layer by setting the build arg `INCLUDE_CHROMIUM=true`.
This is a breaking change for anyone using the `lean` layer, as it will no longer include Chromium by default. +- [34204](https://github.com/apache/superset/pull/33603) OpenStreetView has been promoted as the new default for Deck.gl visualization since it can be enabled by default without requiring an API key. If you have Mapbox set up and want to disable OpenStreetView in your environment, please follow the steps documented [here](https://superset.apache.org/docs/configuration/map-tiles). - [33116](https://github.com/apache/superset/pull/33116) In Echarts Series charts (e.g. Line, Area, Bar, etc.), the `x_axis_sort_series` and `x_axis_sort_series_ascending` form data items have been renamed to `x_axis_sort` and `x_axis_sort_asc`. There's a migration added that can potentially affect a significant number of existing charts. - [32317](https://github.com/apache/superset/pull/32317) The horizontal filter bar feature is now out of testing/beta development and its feature flag `HORIZONTAL_FILTER_BAR` has been removed. - [31590](https://github.com/apache/superset/pull/31590) Marks the beginning of intricate work around supporting dynamic theming, and breaks support for [THEME_OVERRIDES](https://github.com/apache/superset/blob/732de4ac7fae88e29b7f123b6cbb2d7cd411b0e4/superset/config.py#L671) in favor of a new theming system based on AntD V5. Likely this will be in disrepair until settling over the 5.x lifecycle. - [32432](https://github.com/apache/superset/pull/31260) Moves the List Roles FAB view to the frontend and requires `FAB_ADD_SECURITY_API` to be enabled in the configuration and `superset init` to be executed. +- [34319](https://github.com/apache/superset/pull/34319) Drill to Detail and Drill By are now supported in Embedded mode, and also with the `DASHBOARD_RBAC` FF. If you don't want to expose these features in Embedded / `DASHBOARD_RBAC`, make sure the roles used for Embedded / `DASHBOARD_RBAC` don't have the required permissions to perform D2D actions.
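The `utf-8-sig` CSV entry in the list above has a simple mechanical effect: the encoder prefixes the output with a 3-byte byte-order mark that Excel uses to detect UTF-8. A minimal standard-library sketch (the sample row is made up for illustration):

```python
import csv
import io

# Write one CSV row through the utf-8-sig codec, as the new export default does.
buf = io.BytesIO()
text = io.TextIOWrapper(buf, encoding="utf-8-sig", newline="")
csv.writer(text).writerow(["city", "café"])
text.flush()

data = buf.getvalue()
print(data[:3] == b"\xef\xbb\xbf")  # True: the UTF-8 BOM Excel looks for
print(data[3:].decode("utf-8"))     # the payload after the BOM is plain UTF-8
```

The BOM appears only once, at the start of the stream, so the payload stays ordinary UTF-8 for consumers that ignore the marker.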
## 5.0.0 diff --git a/docker-compose-image-tag.yml b/docker-compose-image-tag.yml index 4246d80cefc..5bc24189790 100644 --- a/docker-compose-image-tag.yml +++ b/docker-compose-image-tag.yml @@ -20,6 +20,9 @@ # If you choose to use this type of deployment make sure to # create you own docker environment file (docker/.env) with your own # unique random secure passwords and SECRET_KEY. +# +# For verbose logging during development: +# - Set SUPERSET_LOG_LEVEL=debug in docker/.env-local for detailed Superset logs # ----------------------------------------------------------------------- x-superset-image: &superset-image apachesuperset.docker.scarf.sh/apache/superset:${TAG:-latest-dev} x-superset-volumes: diff --git a/docker-compose-light.yml b/docker-compose-light.yml new file mode 100644 index 00000000000..1910699be4a --- /dev/null +++ b/docker-compose-light.yml @@ -0,0 +1,204 @@ +# +# Licensed to the Apache Software Foundation (ASF) under one or more +# contributor license agreements. See the NOTICE file distributed with +# this work for additional information regarding copyright ownership. +# The ASF licenses this file to You under the Apache License, Version 2.0 +# (the "License"); you may not use this file except in compliance with +# the License. You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. +# + +# ----------------------------------------------------------------------- +# Lightweight docker-compose for running multiple Superset instances +# This includes only essential services: database and Superset app (no Redis) +# +# RUNNING SUPERSET: +# 1. 
Start services: docker-compose -f docker-compose-light.yml up +# 2. Access at: http://localhost:9001 (or NODE_PORT if specified) +# +# RUNNING MULTIPLE INSTANCES: +# - Use different project names: docker-compose -p project1 -f docker-compose-light.yml up +# - Use different NODE_PORT values: NODE_PORT=9002 docker-compose -p project2 -f docker-compose-light.yml up +# - Volumes are isolated by project name (e.g., project1_db_home_light, project2_db_home_light) +# - Database name is intentionally different (superset_light) to prevent accidental cross-connections +# +# RUNNING TESTS WITH PYTEST: +# Tests run in an isolated environment with a separate test database. +# The pytest-runner service automatically creates and initializes the test database on first use. +# +# Basic usage: +# docker-compose -f docker-compose-light.yml run --rm pytest-runner pytest tests/unit_tests/ +# +# Run specific test file: +# docker-compose -f docker-compose-light.yml run --rm pytest-runner pytest tests/unit_tests/test_foo.py +# +# Run with pytest options: +# docker-compose -f docker-compose-light.yml run --rm pytest-runner pytest -v -s -x tests/ +# +# Force reload test database and run tests (when tests are failing due to bad state): +# docker-compose -f docker-compose-light.yml run --rm -e FORCE_RELOAD=true pytest-runner pytest tests/ +# +# Run any command in test environment: +# docker-compose -f docker-compose-light.yml run --rm pytest-runner bash +# docker-compose -f docker-compose-light.yml run --rm pytest-runner pytest --collect-only +# +# For parallel test execution with different projects: +# docker-compose -p project1 -f docker-compose-light.yml run --rm pytest-runner pytest tests/ +# +# DEVELOPMENT TIPS: +# - First test run takes ~20-30 seconds (database creation + initialization) +# - Subsequent runs are fast (~2-3 seconds startup) +# - Use FORCE_RELOAD=true when you need a clean test database +# - Tests use SimpleCache instead of Redis (no Redis required) +# - Set 
SUPERSET_LOG_LEVEL=debug in docker/.env-local for detailed logs +# ----------------------------------------------------------------------- +x-superset-user: &superset-user root +x-superset-volumes: &superset-volumes + # /app/pythonpath_docker will be appended to the PYTHONPATH in the final container + - ./docker:/app/docker + - ./superset:/app/superset + - ./superset-frontend:/app/superset-frontend + - superset_home_light:/app/superset_home + - ./tests:/app/tests +x-common-build: &common-build + context: . + target: ${SUPERSET_BUILD_TARGET:-dev} # can use `dev` (default) or `lean` + cache_from: + - apache/superset-cache:3.10-slim-trixie + args: + DEV_MODE: "true" + INCLUDE_CHROMIUM: ${INCLUDE_CHROMIUM:-false} + INCLUDE_FIREFOX: ${INCLUDE_FIREFOX:-false} + BUILD_TRANSLATIONS: ${BUILD_TRANSLATIONS:-false} + LOAD_EXAMPLES_DUCKDB: ${LOAD_EXAMPLES_DUCKDB:-true} + +services: + db-light: + env_file: + - path: docker/.env # default + required: true + - path: docker/.env-local # optional override + required: false + image: postgres:16 + restart: unless-stopped + volumes: + - db_home_light:/var/lib/postgresql/data + - ./docker/docker-entrypoint-initdb.d:/docker-entrypoint-initdb.d + environment: + POSTGRES_DB: superset_light + command: postgres -c max_connections=200 + + superset-light: + env_file: + - path: docker/.env # default + required: true + - path: docker/.env-local # optional override + required: false + build: + <<: *common-build + command: ["/app/docker/docker-bootstrap.sh", "app"] + restart: unless-stopped + extra_hosts: + - "host.docker.internal:host-gateway" + user: *superset-user + depends_on: + superset-init-light: + condition: service_completed_successfully + volumes: *superset-volumes + environment: + DATABASE_HOST: db-light + DATABASE_DB: superset_light + POSTGRES_DB: superset_light + SUPERSET__SQLALCHEMY_EXAMPLES_URI: "duckdb:////app/data/examples.duckdb" + SUPERSET_CONFIG_PATH: /app/docker/pythonpath_dev/superset_config_docker_light.py + GITHUB_HEAD_REF: 
${GITHUB_HEAD_REF:-} + GITHUB_SHA: ${GITHUB_SHA:-} + + superset-init-light: + build: + <<: *common-build + command: ["/app/docker/docker-init.sh"] + env_file: + - path: docker/.env # default + required: true + - path: docker/.env-local # optional override + required: false + user: *superset-user + depends_on: + db-light: + condition: service_started + volumes: *superset-volumes + environment: + DATABASE_HOST: db-light + DATABASE_DB: superset_light + POSTGRES_DB: superset_light + SUPERSET__SQLALCHEMY_EXAMPLES_URI: "duckdb:////app/data/examples.duckdb" + SUPERSET_CONFIG_PATH: /app/docker/pythonpath_dev/superset_config_docker_light.py + healthcheck: + disable: true + + superset-node-light: + build: + context: . + target: superset-node + args: + # This prevents building the frontend bundle since we'll mount local folder + # and build it on startup while firing docker-frontend.sh in dev mode, where + # it'll mount and watch local files and rebuild as you update them + DEV_MODE: "true" + BUILD_TRANSLATIONS: ${BUILD_TRANSLATIONS:-false} + environment: + # set this to false if you have perf issues running the npm i; npm run dev in-docker + # if you do so, you have to run this manually on the host, which should perform better! 
+ BUILD_SUPERSET_FRONTEND_IN_DOCKER: true + NPM_RUN_PRUNE: false + SCARF_ANALYTICS: "${SCARF_ANALYTICS:-}" + # configuring the dev-server to use the host.docker.internal to connect to the backend + superset: "http://superset-light:8088" + ports: + - "127.0.0.1:${NODE_PORT:-9001}:9000" # Parameterized port + command: ["/app/docker/docker-frontend.sh"] + env_file: + - path: docker/.env # default + required: true + - path: docker/.env-local # optional override + required: false + volumes: *superset-volumes + + pytest-runner: + build: + <<: *common-build + entrypoint: ["/app/docker/docker-pytest-entrypoint.sh"] + env_file: + - path: docker/.env # default + required: true + - path: docker/.env-local # optional override + required: false + profiles: + - test # Only starts when --profile test is used + depends_on: + db-light: + condition: service_started + user: *superset-user + volumes: *superset-volumes + environment: + DATABASE_HOST: db-light + DATABASE_DB: test + POSTGRES_DB: test + SUPERSET__SQLALCHEMY_DATABASE_URI: postgresql+psycopg2://superset:superset@db-light:5432/test + SUPERSET__SQLALCHEMY_EXAMPLES_URI: "duckdb:////app/data/examples.duckdb" + SUPERSET_CONFIG: superset_test_config_light + PYTHONPATH: /app/pythonpath:/app/docker/pythonpath_dev:/app + +volumes: + superset_home_light: + external: false + db_home_light: + external: false diff --git a/docker-compose-non-dev.yml b/docker-compose-non-dev.yml index cde53598252..5d221ab602c 100644 --- a/docker-compose-non-dev.yml +++ b/docker-compose-non-dev.yml @@ -20,6 +20,9 @@ # If you choose to use this type of deployment make sure to # create you own docker environment file (docker/.env) with your own # unique random secure passwords and SECRET_KEY. 
+# +# For verbose logging during development: +# - Set SUPERSET_LOG_LEVEL=debug in docker/.env-local for detailed Superset logs # ----------------------------------------------------------------------- x-superset-volumes: &superset-volumes # /app/pythonpath_docker will be appended to the PYTHONPATH in the final container @@ -30,7 +33,7 @@ x-common-build: &common-build context: . target: dev cache_from: - - apache/superset-cache:3.10-slim-bookworm + - apache/superset-cache:3.10-slim-trixie services: redis: diff --git a/docker-compose.yml b/docker-compose.yml index 12bcc8dd826..d58ed84488f 100644 --- a/docker-compose.yml +++ b/docker-compose.yml @@ -20,6 +20,9 @@ # If you choose to use this type of deployment make sure to # create you own docker environment file (docker/.env) with your own # unique random secure passwords and SECRET_KEY. +# +# For verbose logging during development: +# - Set SUPERSET_LOG_LEVEL=debug in docker/.env-local for detailed Superset logs # ----------------------------------------------------------------------- x-superset-user: &superset-user root x-superset-volumes: &superset-volumes @@ -33,12 +36,13 @@ x-common-build: &common-build context: . 
target: ${SUPERSET_BUILD_TARGET:-dev} # can use `dev` (default) or `lean` cache_from: - - apache/superset-cache:3.10-slim-bookworm + - apache/superset-cache:3.10-slim-trixie args: DEV_MODE: "true" INCLUDE_CHROMIUM: ${INCLUDE_CHROMIUM:-false} INCLUDE_FIREFOX: ${INCLUDE_FIREFOX:-false} BUILD_TRANSLATIONS: ${BUILD_TRANSLATIONS:-false} + LOAD_EXAMPLES_DUCKDB: ${LOAD_EXAMPLES_DUCKDB:-true} services: nginx: @@ -104,6 +108,8 @@ services: superset-init: condition: service_completed_successfully volumes: *superset-volumes + environment: + SUPERSET__SQLALCHEMY_EXAMPLES_URI: "duckdb:////app/data/examples.duckdb" superset-websocket: container_name: superset_websocket @@ -155,6 +161,8 @@ services: condition: service_started user: *superset-user volumes: *superset-volumes + environment: + SUPERSET__SQLALCHEMY_EXAMPLES_URI: "duckdb:////app/data/examples.duckdb" healthcheck: disable: true diff --git a/docker/.env b/docker/.env index 2ae0578bfc0..a0cbb47deb5 100644 --- a/docker/.env +++ b/docker/.env @@ -53,7 +53,12 @@ PYTHONPATH=/app/pythonpath:/app/docker/pythonpath_dev REDIS_HOST=redis REDIS_PORT=6379 +# Development and logging configuration +# FLASK_DEBUG: Enables Flask dev features (auto-reload, better error pages) - keep 'true' for development FLASK_DEBUG=true +# SUPERSET_LOG_LEVEL: Controls Superset application logging verbosity (debug, info, warning, error, critical) +SUPERSET_LOG_LEVEL=info + SUPERSET_APP_ROOT="/" SUPERSET_ENV=development SUPERSET_LOAD_EXAMPLES=yes @@ -66,4 +71,3 @@ SUPERSET_SECRET_KEY=TEST_NON_DEV_SECRET ENABLE_PLAYWRIGHT=false PUPPETEER_SKIP_CHROMIUM_DOWNLOAD=true BUILD_SUPERSET_FRONTEND_IN_DOCKER=true -SUPERSET_LOG_LEVEL=info diff --git a/docker/apt-install.sh b/docker/apt-install.sh index bd9152bebbd..1c36353199e 100755 --- a/docker/apt-install.sh +++ b/docker/apt-install.sh @@ -18,7 +18,7 @@ set -euo pipefail # Ensure this script is run as root -if [[ $EUID -ne 0 ]]; then +if [[ ${EUID} -ne 0 ]]; then echo "This script must be run as root" >&2 exit 1 
fi @@ -42,7 +42,7 @@ echo -e "${GREEN}Installing packages: $@${RESET}" apt-get install -yqq --no-install-recommends "$@" echo -e "${GREEN}Autoremoving unnecessary packages...${RESET}" -apt-get autoremove -y +apt-get autoremove -yqq --purge echo -e "${GREEN}Cleaning up package cache and metadata...${RESET}" apt-get clean diff --git a/docker/docker-bootstrap.sh b/docker/docker-bootstrap.sh index fd017622a13..04709396cf9 100755 --- a/docker/docker-bootstrap.sh +++ b/docker/docker-bootstrap.sh @@ -72,7 +72,7 @@ case "${1}" in ;; app) echo "Starting web app (using development server)..." - flask run -p $PORT --with-threads --reload --debugger --host=0.0.0.0 + flask run -p $PORT --reload --debugger --without-threads --host=0.0.0.0 ;; app-gunicorn) echo "Starting web app..." diff --git a/docker/docker-init.sh b/docker/docker-init.sh index f9bd09ed14d..e4b25b5b187 100755 --- a/docker/docker-init.sh +++ b/docker/docker-init.sh @@ -69,6 +69,8 @@ echo_step "3" "Complete" "Setting up roles and perms" if [ "$SUPERSET_LOAD_EXAMPLES" = "yes" ]; then # Load some data to play with echo_step "4" "Starting" "Loading examples" + + # If this is a Cypress run (which consumes superset_test_config), load the data required for tests if [ "$CYPRESS_CONFIG" == "true" ]; then superset load_examples --load-test-data diff --git a/docker/docker-pytest-entrypoint.sh b/docker/docker-pytest-entrypoint.sh new file mode 100755 index 00000000000..f155ee4c698 --- /dev/null +++ b/docker/docker-pytest-entrypoint.sh @@ -0,0 +1,152 @@ +#!/bin/bash +# +# Licensed to the Apache Software Foundation (ASF) under one or more +# contributor license agreements. See the NOTICE file distributed with +# this work for additional information regarding copyright ownership. +# The ASF licenses this file to You under the Apache License, Version 2.0 +# (the "License"); you may not use this file except in compliance with +# the License.
You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. +# + +set -e + +# Wait for PostgreSQL to be ready +echo "Waiting for database to be ready..." +for i in {1..30}; do + if python3 -c " +import psycopg2 +try: + conn = psycopg2.connect(host='db-light', user='superset', password='superset', database='superset_light') + conn.close() + print('Database is ready!') +except: + exit(1) +" 2>/dev/null; then + echo "Database connection established!" + break + fi + echo "Waiting for database... ($i/30)" + if [ $i -eq 30 ]; then + echo "Database connection timeout after 30 seconds" + exit 1 + fi + sleep 1 +done + +# Handle database setup based on FORCE_RELOAD +if [ "${FORCE_RELOAD}" = "true" ]; then + echo "Force reload requested - resetting test database" + # Drop and recreate the test database using Python + python3 -c " +import psycopg2 +from psycopg2.extensions import ISOLATION_LEVEL_AUTOCOMMIT + +# Connect to default database +conn = psycopg2.connect(host='db-light', user='superset', password='superset', database='superset_light') +conn.set_isolation_level(ISOLATION_LEVEL_AUTOCOMMIT) +cur = conn.cursor() + +# Drop and recreate test database +try: + cur.execute('DROP DATABASE IF EXISTS test') +except: + pass + +cur.execute('CREATE DATABASE test') +conn.close() + +# Connect to test database to create schemas +conn = psycopg2.connect(host='db-light', user='superset', password='superset', database='test') +conn.set_isolation_level(ISOLATION_LEVEL_AUTOCOMMIT) +cur = conn.cursor() + +cur.execute('CREATE SCHEMA sqllab_test_db') +cur.execute('CREATE SCHEMA admin_database') + +cur.close() +conn.close() +print('Test database 
reset successfully') +" + # Use --no-reset-db since we already reset it + FLAGS="--no-reset-db" +else + echo "Using existing test database (set FORCE_RELOAD=true to reset)" + FLAGS="--no-reset-db" + + # Ensure test database exists using Python + python3 -c " +import psycopg2 +from psycopg2.extensions import ISOLATION_LEVEL_AUTOCOMMIT + +# Check if test database exists +try: + conn = psycopg2.connect(host='db-light', user='superset', password='superset', database='test') + conn.close() + print('Test database already exists') +except: + print('Creating test database...') + # Connect to default database to create test database + conn = psycopg2.connect(host='db-light', user='superset', password='superset', database='superset_light') + conn.set_isolation_level(ISOLATION_LEVEL_AUTOCOMMIT) + cur = conn.cursor() + + # Create test database + cur.execute('CREATE DATABASE test') + conn.close() + + # Connect to test database to create schemas + conn = psycopg2.connect(host='db-light', user='superset', password='superset', database='test') + conn.set_isolation_level(ISOLATION_LEVEL_AUTOCOMMIT) + cur = conn.cursor() + + cur.execute('CREATE SCHEMA IF NOT EXISTS sqllab_test_db') + cur.execute('CREATE SCHEMA IF NOT EXISTS admin_database') + + cur.close() + conn.close() + print('Test database created successfully') +" +fi + +# Always run database migrations to ensure schema is up to date +echo "Running database migrations..." +cd /app +superset db upgrade + +# Initialize test environment if needed +if [ "${FORCE_RELOAD}" = "true" ] || [ ! -f "/app/superset_home/.test_initialized" ]; then + echo "Initializing test environment..." + # Run initialization commands + superset init + echo "Loading test users..." 
+ superset load-test-users + + # Mark as initialized + touch /app/superset_home/.test_initialized +else + echo "Test environment already initialized (skipping init and load-test-users)" + echo "Tip: Use FORCE_RELOAD=true to reinitialize the test database" +fi + +# Create missing scripts needed for tests +if [ ! -f "/app/scripts/tag_latest_release.sh" ]; then + echo "Creating missing tag_latest_release.sh script for tests..." + cp /app/docker/tag_latest_release.sh /app/scripts/tag_latest_release.sh 2>/dev/null || true +fi + +# Install pip module for Shillelagh compatibility (aligns with CI environment) +echo "Installing pip module for Shillelagh compatibility..." +uv pip install pip + +# If arguments provided, execute them +if [ $# -gt 0 ]; then + exec "$@" +fi diff --git a/docker/entrypoints/run-server.sh b/docker/entrypoints/run-server.sh index 270c46f9253..28b5dae228f 100644 --- a/docker/entrypoints/run-server.sh +++ b/docker/entrypoints/run-server.sh @@ -26,7 +26,7 @@ gunicorn \ --workers ${SERVER_WORKER_AMOUNT:-1} \ --worker-class ${SERVER_WORKER_CLASS:-gthread} \ --threads ${SERVER_THREADS_AMOUNT:-20} \ - --log-level "${GUNICORN_LOGLEVEL:info}" \ + --log-level "${GUNICORN_LOGLEVEL:-info}" \ --timeout ${GUNICORN_TIMEOUT:-60} \ --keep-alive ${GUNICORN_KEEPALIVE:-2} \ --max-requests ${WORKER_MAX_REQUESTS:-0} \ diff --git a/docker/frontend-mem-nag.sh b/docker/frontend-mem-nag.sh index 40517216b64..8dd658e5660 100755 --- a/docker/frontend-mem-nag.sh +++ b/docker/frontend-mem-nag.sh @@ -23,25 +23,57 @@ MIN_MEM_FREE_GB=3 MIN_MEM_FREE_KB=$(($MIN_MEM_FREE_GB*1000000)) echo_mem_warn() { - MEM_FREE_KB=$(awk '/MemFree/ { printf "%s \n", $2 }' /proc/meminfo) - MEM_FREE_GB=$(awk '/MemFree/ { printf "%s \n", $2/1024/1024 }' /proc/meminfo) + # Check if running in Codespaces first + if [[ -n "${CODESPACES}" ]]; then + echo "Memory available: Codespaces managed" + return + fi - if [[ "${MEM_FREE_KB}" -lt "${MIN_MEM_FREE_KB}" ]]; then + # Check platform and get memory 
accordingly + if [[ -f /proc/meminfo ]]; then + # Linux + if grep -q MemAvailable /proc/meminfo; then + MEM_AVAIL_KB=$(awk '/MemAvailable/ { printf "%s \n", $2 }' /proc/meminfo) + MEM_AVAIL_GB=$(awk '/MemAvailable/ { printf "%s \n", $2/1024/1024 }' /proc/meminfo) + else + MEM_AVAIL_KB=$(awk '/MemFree/ { printf "%s \n", $2 }' /proc/meminfo) + MEM_AVAIL_GB=$(awk '/MemFree/ { printf "%s \n", $2/1024/1024 }' /proc/meminfo) + fi + elif [[ "$(uname)" == "Darwin" ]]; then + # macOS - use vm_stat to get free memory + # vm_stat reports in pages, typically 4096 bytes per page + PAGE_SIZE=$(pagesize) + FREE_PAGES=$(vm_stat | awk '/Pages free:/ {print $3}' | tr -d '.') + INACTIVE_PAGES=$(vm_stat | awk '/Pages inactive:/ {print $3}' | tr -d '.') + # Free + inactive pages give us available memory (similar to MemAvailable on Linux) + AVAIL_PAGES=$((FREE_PAGES + INACTIVE_PAGES)) + MEM_AVAIL_KB=$((AVAIL_PAGES * PAGE_SIZE / 1024)) + MEM_AVAIL_GB=$(echo "scale=2; $MEM_AVAIL_KB / 1024 / 1024" | bc) + else + # Other platforms + echo "Memory available: Unable to determine" + return + fi + + if [[ "${MEM_AVAIL_KB}" -lt "${MIN_MEM_FREE_KB}" ]]; then cat <" + echo "SKIP_TAG=true" >> $GITHUB_OUTPUT + exit 1 +fi + +if [ -z "$(git_show_ref)" ]; then + echo "The tag ${GITHUB_TAG_NAME} does not exist. Please use a different tag." + echo "SKIP_TAG=true" >> $GITHUB_OUTPUT + exit 0 +fi + +# check that this tag only contains a proper semantic version +if ! [[ ${GITHUB_TAG_NAME} =~ ^[0-9]+\.[0-9]+\.[0-9]+$ ]] +then + echo "This tag ${GITHUB_TAG_NAME} is not a valid release version. Not tagging." 
+ echo "SKIP_TAG=true" >> $GITHUB_OUTPUT + exit 1 +fi + +## split the current GITHUB_TAG_NAME into an array at the dot +THIS_TAG_NAME=$(split_string "${GITHUB_TAG_NAME}" ".") + +# look up the 'latest' tag on git +LATEST_TAG_LIST=$(get_latest_tag_list) || echo 'not found' + +# if 'latest' tag doesn't exist, then set this commit to latest +if [[ -z "$LATEST_TAG_LIST" ]] +then + echo "there are no latest tags yet, so I'm going to start by tagging this sha as the latest" + run_git_tag + exit 0 +fi + +# remove parentheses and tag: from the list of tags +LATEST_TAGS_STRINGS=$(echo "$LATEST_TAG_LIST" | sed 's/tag: \([^,]*\)/\1/g' | tr -d '()') + +LATEST_TAGS=$(split_string "$LATEST_TAGS_STRINGS" ",") +TAGS=($(split_string "$LATEST_TAGS" " ")) + +# Initialize a flag for comparison result +compare_result="" + +# Iterate through all the tags of the latest release +for tag in "${TAGS[@]}" +do + if [[ $tag == "latest" ]]; then + continue + else + ## extract just the version from this tag + LATEST_RELEASE_TAG="$tag" + echo "LATEST_RELEASE_TAG: ${LATEST_RELEASE_TAG}" + + # check that this only contains a proper semantic version + if ! [[ ${LATEST_RELEASE_TAG} =~ ^[0-9]+\.[0-9]+\.[0-9]+$ ]] + then + echo "'Latest' has been associated with tag ${LATEST_RELEASE_TAG} which is not a valid release version. Looking for another."
+ continue + fi + echo "The current release with the latest tag is version ${LATEST_RELEASE_TAG}" + # Split the version strings into arrays + THIS_TAG_NAME_ARRAY=($(split_string "$THIS_TAG_NAME" ".")) + LATEST_RELEASE_TAG_ARRAY=($(split_string "$LATEST_RELEASE_TAG" ".")) + + # Iterate through the components of the version strings, comparing numerically + # (-gt/-lt; the [[ > ]] and [[ < ]] operators compare lexicographically) + for (( j=0; j<${#THIS_TAG_NAME_ARRAY[@]}; j++ )); do + echo "Comparing ${THIS_TAG_NAME_ARRAY[$j]} to ${LATEST_RELEASE_TAG_ARRAY[$j]}" + if [[ $((THIS_TAG_NAME_ARRAY[$j])) -gt $((LATEST_RELEASE_TAG_ARRAY[$j])) ]]; then + compare_result="greater" + break + elif [[ $((THIS_TAG_NAME_ARRAY[$j])) -lt $((LATEST_RELEASE_TAG_ARRAY[$j])) ]]; then + compare_result="lesser" + break + fi + done + fi +done + +# Determine the result based on the comparison +if [[ -z "$compare_result" ]]; then + echo "Versions are equal" + echo "SKIP_TAG=true" >> $GITHUB_OUTPUT +elif [[ "$compare_result" == "greater" ]]; then + echo "This release tag ${GITHUB_TAG_NAME} is newer than the latest." + echo "SKIP_TAG=false" >> $GITHUB_OUTPUT + # Add other actions you want to perform for a newer version +elif [[ "$compare_result" == "lesser" ]]; then + echo "This release tag ${GITHUB_TAG_NAME} is older than the latest. Not tagging." + # if you've gotten this far, then we don't want to run any tags in the next step + echo "SKIP_TAG=true" >> $GITHUB_OUTPUT +fi diff --git a/docs/README.md b/docs/README.md index ef5db9ebfd7..8af3ea90da5 100644 --- a/docs/README.md +++ b/docs/README.md @@ -21,3 +21,183 @@ This is the public documentation site for Superset, built using [Docusaurus 3](https://docusaurus.io/). See [CONTRIBUTING.md](../CONTRIBUTING.md#documentation) for guidance on contributing to documentation.
+
+## Version Management
+
+The Superset documentation site uses Docusaurus versioning with three independent versioned sections:
+
+- **Main Documentation** (`/docs/`) - Core Superset documentation
+- **Developer Portal** (`/developer_portal/`) - Developer guides and tutorials
+- **Component Playground** (`/components/`) - Interactive component examples (currently disabled)
+
+Each section maintains its own version history and can be versioned independently.
+
+### Creating a New Version
+
+To create a new version for any section, use our automated version scripts:
+
+#### Using Automated Scripts (Required)
+
+**⚠️ Important:** Always use these custom commands instead of the native Docusaurus commands. These scripts ensure that both the Docusaurus versioning system AND the `versions-config.json` file are updated correctly.
+
+```bash
+# Main Documentation
+yarn version:add:docs 1.2.0
+
+# Developer Portal
+yarn version:add:developer_portal 1.2.0
+
+# Component Playground (when enabled)
+yarn version:add:components 1.2.0
+```
+
+**Do NOT use** the native Docusaurus commands directly (`yarn docusaurus docs:version`), as they will:
+- ❌ Create version files but NOT update `versions-config.json`
+- ❌ Cause versions to not appear in dropdown menus
+- ❌ Require manual fixes to synchronize the configuration
+
+### Managing Versions
+
+#### With Automated Scripts
+The automated scripts handle all configuration updates automatically. No manual editing required!
+
+#### Manual Configuration
+If creating versions manually, you'll need to:
+
+1. **Update `versions-config.json`** (or `docusaurus.config.ts` if not using dynamic config):
+   - Add the version to the `onlyIncludeVersions` array
+   - Add the version metadata to the `versions` object
+   - Update `lastVersion` if needed
+
+2.
**Files Created by Versioning**: + When a new version is created, Docusaurus generates: + - **Versioned docs folder**: `[section]_versioned_docs/version-X.X.X/` + - **Versioned sidebars**: `[section]_versioned_sidebars/version-X.X.X-sidebars.json` + - **Versions list**: `[section]_versions.json` + + Note: For main docs, the prefix is omitted (e.g., `versioned_docs/` instead of `docs_versioned_docs/`) + +3. **Important**: After adding a version, restart the development server to see changes: + ```bash + yarn stop + yarn start + ``` + +### Removing a Version + +#### Using Automated Scripts (Recommended) +```bash +# Main Documentation +yarn version:remove:docs 1.0.0 + +# Developer Portal +yarn version:remove:developer_portal 1.0.0 + +# Component Playground +yarn version:remove:components 1.0.0 +``` + +#### Manual Removal +To manually remove a version: + +1. **Delete the version folder** from the appropriate location: + - Main docs: `versioned_docs/version-X.X.X/` (no prefix for main) + - Developer Portal: `developer_portal_versioned_docs/version-X.X.X/` + - Components: `components_versioned_docs/version-X.X.X/` + +2. **Delete the version metadata file**: + - Main docs: `versioned_sidebars/version-X.X.X-sidebars.json` (no prefix) + - Developer Portal: `developer_portal_versioned_sidebars/version-X.X.X-sidebars.json` + - Components: `components_versioned_sidebars/version-X.X.X-sidebars.json` + +3. **Update the versions list file**: + - Main docs: `versions.json` + - Developer Portal: `developer_portal_versions.json` + - Components: `components_versions.json` + +4. **Update configuration**: + - If using dynamic config: Update `versions-config.json` + - If using static config: Update `docusaurus.config.ts` + +5. 
**Restart the server** to see changes + +### Version Configuration Examples + +#### Main Documentation (default plugin) +```typescript +docs: { + includeCurrentVersion: true, + lastVersion: 'current', // Makes /docs/ show Next version + onlyIncludeVersions: ['current', '1.1.0', '1.0.0'], + versions: { + current: { + label: 'Next', + path: '', // Empty path for default routing + banner: 'unreleased', + }, + '1.1.0': { + label: '1.1.0', + path: '1.1.0', + banner: 'none', + }, + }, +} +``` + +#### Developer Portal & Components (custom plugins) +```typescript +{ + id: 'developer_portal', + path: 'developer_portal', + routeBasePath: 'developer_portal', + includeCurrentVersion: true, + lastVersion: '1.1.0', // Default version + onlyIncludeVersions: ['current', '1.1.0', '1.0.0'], + versions: { + current: { + label: 'Next', + path: 'next', + banner: 'unreleased', + }, + '1.1.0': { + label: '1.1.0', + path: '1.1.0', + banner: 'none', + }, + }, +} +``` + +### Best Practices + +1. **Version naming**: Use semantic versioning (e.g., 1.0.0, 1.1.0, 2.0.0) +2. **Version banners**: Use `'unreleased'` for development versions, `'none'` for stable releases +3. **Limit displayed versions**: Use `onlyIncludeVersions` to show only relevant versions +4. **Test locally**: Always test version changes locally before deploying +5. **Independent versioning**: Each section can have different version numbers and release cycles + +### Troubleshooting + +#### Version Not Showing After Creation + +If you accidentally used `yarn docusaurus docs:version` instead of `yarn version:add`: +1. **Problem**: The version files were created but `versions-config.json` wasn't updated +2. 
**Solution**:
+   - Revert the changes: `git restore versions.json && rm -rf versioned_docs/ versioned_sidebars/`
+   - Then use the correct command: `yarn version:add:docs <version>`
+
+For other issues:
+- **Restart the server**: Changes to version configuration require a server restart
+- **Check config file**: Ensure `versions-config.json` includes the new version
+- **Verify files exist**: Check that the versioned docs folder was created
+
+#### Broken Links in Versioned Documentation
+When creating a new version, links in the documentation are preserved as-is. Common issues:
+- **Cross-section links**: Links between sections (e.g. from developer_portal to docs) need to be version-aware
+- **Absolute vs relative paths**: Use relative paths within the same section
+- **Version-specific URLs**: Update hardcoded URLs to use version variables
+
+To fix broken links:
+1. Use `type: 'doc'` with `docId` for version-aware navigation in the navbar
+2. Use relative paths within the same documentation section
+3. Test all versions after creation to identify broken links
diff --git a/docs/components/chart-components/bar-chart.md b/docs/components/chart-components/bar-chart.md
new file mode 100644
index 00000000000..2b8e336e6ea
--- /dev/null
+++ b/docs/components/chart-components/bar-chart.md
@@ -0,0 +1,105 @@
+
+---
+title: Bar Chart
+sidebar_position: 1
+---
+
+# Bar Chart Component
+
+The Bar Chart component is used to visualize categorical data with rectangular bars.
+
+## Props
+
+| Prop | Type | Default | Description |
+|------|------|---------|-------------|
+| `data` | `array` | `[]` | Array of data objects to visualize |
+| `width` | `number` | `800` | Width of the chart in pixels |
+| `height` | `number` | `600` | Height of the chart in pixels |
+| `xField` | `string` | - | Field name for x-axis values |
+| `yField` | `string` | - | Field name for y-axis values |
+| `colorField` | `string` | - | Field name for color encoding |
+| `colorScheme` | `string` | `'supersetColors'` | Color scheme to use |
+| `showLegend` | `boolean` | `true` | Whether to show the legend |
+| `showGrid` | `boolean` | `true` | Whether to show grid lines |
+| `labelPosition` | `string` | `'top'` | Position of bar labels: 'top', 'middle', 'bottom' |
+
+## Examples
+
+### Basic Bar Chart
+
+```jsx
+import { BarChart } from '@superset-ui/chart-components';
+
+const data = [
+  { category: 'A', value: 10 },
+  { category: 'B', value: 20 },
+  { category: 'C', value: 15 },
+  { category: 'D', value: 25 },
+];
+
+function Example() {
+  return (
+    <BarChart data={data} xField="category" yField="value" />
+  );
+}
+```
+
+### Grouped Bar Chart
+
+```jsx
+import { BarChart } from '@superset-ui/chart-components';
+
+const data = [
+  { category: 'A', group: 'Group 1', value: 10 },
+  { category: 'A', group: 'Group 2', value: 15 },
+  { category: 'B', group: 'Group 1', value: 20 },
+  { category: 'B', group: 'Group 2', value: 25 },
+  { category: 'C', group: 'Group 1', value: 15 },
+  { category: 'C', group: 'Group 2', value: 10 },
+];
+
+function Example() {
+  return (
+    <BarChart data={data} xField="category" yField="value" colorField="group" />
+  );
+}
+```
+
+## Best Practices
+
+- Use bar charts when comparing quantities across categories
+- Sort bars by value for better readability, unless there's a natural order to the categories
+- Use consistent colors for the same categories across different charts
+- Consider using horizontal bar charts when category labels are long
diff --git a/docs/components/index.md b/docs/components/index.md
new file mode 100644
index 00000000000..77fb2a9cca9
---
/dev/null +++ b/docs/components/index.md @@ -0,0 +1,59 @@ + +--- +title: Component Library +sidebar_position: 1 +--- + +# Superset Component Library + +Welcome to the Apache Superset Component Library documentation. This section provides comprehensive documentation for all the UI components, chart components, and layout components used in Superset. + +## What is the Component Library? + +The Component Library is a collection of reusable UI components that are used to build the Superset user interface. These components are designed to be consistent, accessible, and easy to use. + +## Component Categories + +The Component Library is organized into the following categories: + +### UI Components + +Basic UI components like buttons, inputs, dropdowns, and other form elements. + +### Chart Components + +Visualization components used to render different types of charts and graphs. + +### Layout Components + +Components used for page layout, such as containers, grids, and navigation elements. + +## Versioning + +The Component Library documentation follows its own versioning scheme, independent from the main Superset documentation. This allows us to update the component documentation as the components evolve, without affecting the main documentation. + +## Getting Started + +Browse the sidebar to explore the different components available in the library. Each component documentation includes: + +- Component description and purpose +- Props and configuration options +- Usage examples +- Best practices diff --git a/docs/components/layout-components/grid.md b/docs/components/layout-components/grid.md new file mode 100644 index 00000000000..a0980d2bdcf --- /dev/null +++ b/docs/components/layout-components/grid.md @@ -0,0 +1,113 @@ + +--- +title: Grid +sidebar_position: 1 +--- + +# Grid Component + +The Grid component provides a flexible layout system for arranging content in rows and columns. 
+
+## Props
+
+| Prop | Type | Default | Description |
+|------|------|---------|-------------|
+| `gutter` | `number` or `[number, number]` | `0` | Grid spacing between items, can be a single number or [horizontal, vertical] |
+| `columns` | `number` | `12` | Number of columns in the grid |
+| `justify` | `string` | `'start'` | Horizontal alignment: 'start', 'center', 'end', 'space-between', 'space-around' |
+| `align` | `string` | `'top'` | Vertical alignment: 'top', 'middle', 'bottom' |
+| `wrap` | `boolean` | `true` | Whether to wrap items when they overflow |
+
+### Row Props
+
+| Prop | Type | Default | Description |
+|------|------|---------|-------------|
+| `gutter` | `number` or `[number, number]` | `0` | Spacing between items in the row |
+| `justify` | `string` | `'start'` | Horizontal alignment for this row |
+| `align` | `string` | `'top'` | Vertical alignment for this row |
+
+### Col Props
+
+| Prop | Type | Default | Description |
+|------|------|---------|-------------|
+| `span` | `number` | - | Number of columns the grid item spans |
+| `offset` | `number` | `0` | Number of columns the grid item is offset |
+| `xs`, `sm`, `md`, `lg`, `xl` | `number` or `object` | - | Responsive props for different screen sizes |
+
+## Examples
+
+### Basic Grid
+
+```jsx
+import { Grid, Row, Col } from '@superset-ui/core';
+
+function Example() {
+  return (
+    <Grid>
+      <Row gutter={16}>
+        <Col span={8}>
+          <div>Column 1</div>
+        </Col>
+        <Col span={8}>
+          <div>Column 2</div>
+        </Col>
+        <Col span={8}>
+          <div>Column 3</div>
+        </Col>
+      </Row>
+    </Grid>
+  );
+}
+```
+
+### Responsive Grid
+
+```jsx
+import { Grid, Row, Col } from '@superset-ui/core';
+
+function Example() {
+  return (
+    <Grid>
+      <Row gutter={[16, 16]}>
+        <Col xs={24} sm={12} md={8} lg={6}>
+          <div>Responsive Column 1</div>
+        </Col>
+        <Col xs={24} sm={12} md={8} lg={6}>
+          <div>Responsive Column 2</div>
+        </Col>
+        <Col xs={24} sm={12} md={8} lg={6}>
+          <div>Responsive Column 3</div>
+        </Col>
+        <Col xs={24} sm={12} md={8} lg={6}>
+          <div>Responsive Column 4</div>
+        </Col>
+      </Row>
+    </Grid>
+  );
+}
+```
+
+## Best Practices
+
+- Use the Grid system for complex layouts that need to be responsive
+- Specify column widths for different screen sizes to ensure proper responsive behavior
+- Use gutters to create appropriate spacing between grid items
+- Keep the grid structure consistent throughout your application
+- Consider using the grid system for dashboard layouts to ensure consistent spacing and alignment
diff --git a/docs/components/test.mdx b/docs/components/test.mdx
new file mode 100644
index 00000000000..d5f842ad631
--- /dev/null
+++ b/docs/components/test.mdx
@@ -0,0 +1,35 @@
+
+---
+title: Test
+---
+
+import { StoryExample } from '../src/components/StorybookWrapper';
+
+# Test
+
+This is a test using our custom StorybookWrapper component.
+
+<StoryExample
+  render={() => (
+    <div>
+      This is a simple example component
+    </div>
+  )}
+/>
diff --git a/docs/components/ui-components/button.mdx b/docs/components/ui-components/button.mdx
new file mode 100644
index 00000000000..2102d146efe
--- /dev/null
+++ b/docs/components/ui-components/button.mdx
@@ -0,0 +1,146 @@
+
+---
+title: Button Component
+sidebar_position: 1
+---
+
+import { StoryExample, StoryWithControls } from '../../src/components/StorybookWrapper';
+import { Button } from '../../../superset-frontend/packages/superset-ui-core/src/components/Button';
+
+# Button Component
+
+The Button component is a fundamental UI element used throughout Superset for user interactions.
+
+## Basic Usage
+
+The default button with primary styling:
+
+<StoryExample
+  render={() => (
+    <Button buttonStyle="primary">Primary Button</Button>
+  )}
+/>
+
+## Interactive Example
+
+<StoryWithControls
+  render={(props) => (
+    <Button
+      buttonStyle={props.buttonStyle}
+      buttonSize={props.buttonSize}
+      disabled={props.disabled}
+    >
+      {props.label}
+    </Button>
+  )}
+  props={{
+    buttonStyle: 'primary',
+    buttonSize: 'default',
+    label: 'Click Me',
+    disabled: false
+  }}
+  controls={[
+    {
+      name: 'buttonStyle',
+      label: 'Button Style',
+      type: 'select',
+      options: ['primary', 'secondary', 'tertiary', 'success', 'warning', 'danger', 'default', 'link', 'dashed']
+    },
+    {
+      name: 'buttonSize',
+      label: 'Button Size',
+      type: 'select',
+      options: ['default', 'small', 'xsmall']
+    },
+    {
+      name: 'label',
+      label: 'Button Text',
+      type: 'text'
+    },
+    {
+      name: 'disabled',
+      label: 'Disabled',
+      type: 'boolean'
+    }
+  ]}
+/>
+
+## Props
+
+| Prop | Type | Default | Description |
+|------|------|---------|-------------|
+| `buttonStyle` | `'primary' \| 'secondary' \| 'tertiary' \| 'success' \| 'warning' \| 'danger' \| 'default' \| 'link' \| 'dashed'` | `'default'` | Button style |
+| `buttonSize` | `'default' \| 'small' \| 'xsmall'` | `'default'` | Button size |
+| `disabled` | `boolean` | `false` | Whether the button is disabled |
+| `cta` | `boolean` | `false` | Whether the button is a call-to-action button |
+| `tooltip` | `ReactNode` | - | Tooltip content |
+| `placement` | `TooltipProps['placement']` | - | Tooltip placement |
+| `onClick` | `function` | - | Callback when button is clicked |
+| `href` | `string` | - | Turns button into an anchor link |
+| `target` | `string` | - | Target attribute for anchor links |
+
+## Usage
+
+```jsx
+import Button from 'src/components/Button';
+
+function MyComponent() {
+  return (
+    <Button
+      buttonStyle="primary"
+      onClick={() => console.log('Button clicked')}
+    >
+      Click Me
+    </Button>
+  );
+}
+```
+
+## Button Styles
+
+Superset provides a variety of button styles for different purposes:
+
+- **Primary**: Used for primary actions
+- **Secondary**: Used for secondary actions
+- **Tertiary**: Used for less important actions
+- **Success**: Used for successful or confirming actions
+- **Warning**: Used for actions that require caution
+- **Danger**: Used for destructive actions
+- **Link**: Used for navigation
+- **Dashed**: Used for adding new items or features
+
+## Button Sizes
+
+Buttons come in three sizes:
+
+- **Default**: Standard size for most use cases
+- **Small**: Compact size for tight spaces
+- **XSmall**: Extra small size for very limited spaces
+
+## Best Practices
+
+- Use primary buttons for the main action in a form or page
+- Use secondary buttons for alternative actions
+- Use danger buttons for destructive actions
+- Limit the number of primary buttons on a page to avoid confusion
+- Use consistent button styles throughout your application
+- Add tooltips to buttons when their purpose might not be immediately clear
diff --git a/docs/components/versions.json b/docs/components/versions.json
new file mode 100644
index 00000000000..fe51488c706
--- /dev/null
+++ b/docs/components/versions.json
@@ -0,0 +1 @@
+[]
diff --git a/docs/components_versions.json b/docs/components_versions.json
new file mode 100644
index 00000000000..fe51488c706
--- /dev/null
+++ b/docs/components_versions.json
@@ -0,0 +1 @@
+[]
diff --git a/docs/developer_portal/api/frontend.md b/docs/developer_portal/api/frontend.md
new file mode 100644
index 00000000000..e2d66ae9f0a
--- /dev/null
+++ b/docs/developer_portal/api/frontend.md
@@ -0,0 +1,477 @@
+---
+title: Frontend API Reference
+sidebar_position: 1
+hide_title: true
+---
+
+
+
+# Frontend API Reference
+
+The `@apache-superset/core` package provides comprehensive APIs for frontend extension development. All APIs are organized into logical namespaces for easy discovery and use.
+
+## Core API
+
+The core namespace provides fundamental extension functionality.
+
+### registerView
+
+Registers a new view or panel in the specified contribution point.
+
+```typescript
+core.registerView(
+  id: string,
+  component: React.ComponentType
+): Disposable
+```
+
+**Example:**
+```typescript
+const panel = context.core.registerView('my-extension.panel', () => (
+  <MyPanel />
+));
+```
+
+### getActiveView
+
+Gets the currently active view in a contribution area.
+
+```typescript
+core.getActiveView(area: string): View | undefined
+```
+
+## Commands API
+
+Manages command registration and execution.
+
+### registerCommand
+
+Registers a new command that can be triggered by menus, shortcuts, or programmatically.
+
+```typescript
+commands.registerCommand(
+  id: string,
+  handler: CommandHandler
+): Disposable
+
+interface CommandHandler {
+  title: string;
+  icon?: string;
+  execute: (...args: any[]) => any;
+  isEnabled?: (...args: any[]) => boolean;
+}
+```
+
+**Example:**
+```typescript
+const cmd = context.commands.registerCommand('my-extension.analyze', {
+  title: 'Analyze Query',
+  icon: 'BarChartOutlined',
+  execute: () => {
+    const query = context.sqlLab.getCurrentQuery();
+    // Perform analysis
+  },
+  isEnabled: () => {
+    return context.sqlLab.hasActiveEditor();
+  }
+});
+```
+
+### executeCommand
+
+Executes a registered command by ID.
+
+```typescript
+commands.executeCommand(id: string, ...args: any[]): Promise<any>
+```
+
+## SQL Lab API
+
+Provides access to SQL Lab functionality and events.
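The `onDid*` events in this API all share the VS Code-style subscribe/dispose contract described later on this page. As a rough, self-contained sketch of that contract (a simplified stand-in, not the actual `@apache-superset/core` implementation):

```typescript
type Disposable = { dispose(): void };
type Listener<T> = (payload: T) => void;

// Minimal emitter matching the Event<T>/Disposable shape:
// subscribing returns a Disposable that unregisters the listener.
function createEvent<T>() {
  const listeners = new Set<Listener<T>>();
  return {
    // What extensions call, e.g. sqlLab.onDidQueryRun(listener)
    event(listener: Listener<T>): Disposable {
      listeners.add(listener);
      return { dispose: () => { listeners.delete(listener); } };
    },
    // What the host calls when the underlying action happens
    fire(payload: T): void {
      listeners.forEach((l) => l(payload));
    },
  };
}

// Usage: subscribe, receive payloads, dispose to stop receiving.
const onDidQueryRun = createEvent<{ query: string }>();
const received: string[] = [];
const sub = onDidQueryRun.event((r) => received.push(r.query));
onDidQueryRun.fire({ query: 'SELECT 1' });
sub.dispose();
onDidQueryRun.fire({ query: 'SELECT 2' });
// received now holds only 'SELECT 1'
```

This is why every subscription in the examples below is pushed onto `context.subscriptions`: disposing the returned handle is what detaches the listener.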
+
+### Query Access
+
+```typescript
+// Get current tab
+sqlLab.getCurrentTab(): Tab | undefined
+
+// Get all tabs
+sqlLab.getTabs(): Tab[]
+
+// Get current query
+sqlLab.getCurrentQuery(): string
+
+// Get selected text
+sqlLab.getSelectedText(): string | undefined
+```
+
+### Database Access
+
+```typescript
+// Get available databases
+sqlLab.getDatabases(): Database[]
+
+// Get database by ID
+sqlLab.getDatabase(id: number): Database | undefined
+
+// Get schemas for database
+sqlLab.getSchemas(databaseId: number): Promise<string[]>
+
+// Get tables for schema
+sqlLab.getTables(
+  databaseId: number,
+  schema: string
+): Promise<Table[]>
+```
+
+### Events
+
+```typescript
+// Query execution events
+sqlLab.onDidQueryRun: Event<QueryResult>
+sqlLab.onDidQueryStop: Event<Query>
+sqlLab.onDidQueryFail: Event<QueryResult>
+
+// Editor events
+sqlLab.onDidChangeEditorContent: Event<string>
+sqlLab.onDidChangeSelection: Event<string>
+
+// Tab events
+sqlLab.onDidChangeActiveTab: Event<Tab>
+sqlLab.onDidCloseTab: Event<Tab>
+sqlLab.onDidChangeTabTitle: Event<{tab: Tab, title: string}>
+
+// Panel events
+sqlLab.onDidOpenPanel: Event<Panel>
+sqlLab.onDidClosePanel: Event<Panel>
+sqlLab.onDidChangeActivePanel: Event<Panel>
+```
+
+**Event Usage Example:**
+```typescript
+const disposable = context.sqlLab.onDidQueryRun((result) => {
+  console.log('Query executed:', result.query);
+  console.log('Rows returned:', result.rowCount);
+  console.log('Execution time:', result.executionTime);
+});
+
+// Remember to dispose when done
+context.subscriptions.push(disposable);
+```
+
+## Authentication API
+
+Handles authentication and security tokens.
+
+### getCSRFToken
+
+Gets the current CSRF token for API requests.
+
+```typescript
+authentication.getCSRFToken(): Promise<string>
+```
+
+### getCurrentUser
+
+Gets information about the current user.
+
+```typescript
+authentication.getCurrentUser(): User
+
+interface User {
+  id: number;
+  username: string;
+  email: string;
+  roles: Role[];
+  permissions: Permission[];
+}
+```
+
+### hasPermission
+
+Checks if the current user has a specific permission.
+
+```typescript
+authentication.hasPermission(permission: string): boolean
+```
+
+## Extensions API
+
+Manages extension lifecycle and inter-extension communication.
+
+### getExtension
+
+Gets information about an installed extension.
+
+```typescript
+extensions.getExtension(id: string): Extension | undefined
+
+interface Extension {
+  id: string;
+  name: string;
+  version: string;
+  isActive: boolean;
+  metadata: ExtensionMetadata;
+}
+```
+
+### getActiveExtensions
+
+Gets all currently active extensions.
+
+```typescript
+extensions.getActiveExtensions(): Extension[]
+```
+
+### Events
+
+```typescript
+// Extension lifecycle events
+extensions.onDidActivateExtension: Event<Extension>
+extensions.onDidDeactivateExtension: Event<Extension>
+```
+
+## UI Components
+
+Import pre-built UI components from `@apache-superset/core`:
+
+```typescript
+import {
+  Button,
+  Select,
+  Input,
+  Table,
+  Modal,
+  Alert,
+  Tabs,
+  Card,
+  Dropdown,
+  Menu,
+  Tooltip,
+  Icon,
+  // ... many more
+} from '@apache-superset/core';
+```
+
+### Example Component Usage
+
+```typescript
+import { Button, Alert } from '@apache-superset/core';
+
+function MyExtensionPanel() {
+  return (
+    <div>
+      <Alert type="info" message="Ready to analyze the current query" />
+      <Button buttonStyle="primary" onClick={() => console.log('clicked')}>
+        Run Analysis
+      </Button>
+    </div>
+  );
+}
+```
+
+## Storage API
+
+Provides persistent storage for extension data.
+
+### Local Storage
+
+```typescript
+// Store data
+storage.local.set(key: string, value: any): Promise<void>
+
+// Retrieve data
+storage.local.get(key: string): Promise<any>
+
+// Remove data
+storage.local.remove(key: string): Promise<void>
+
+// Clear all extension data
+storage.local.clear(): Promise<void>
+```
+
+### Workspace Storage
+
+Workspace storage is shared across all users for collaborative features.
+
+```typescript
+storage.workspace.set(key: string, value: any): Promise<void>
+storage.workspace.get(key: string): Promise<any>
+storage.workspace.remove(key: string): Promise<void>
+```
+
+## Network API
+
+Utilities for making API calls to Superset.
+
+### fetch
+
+Enhanced fetch with CSRF token handling.
+
+```typescript
+network.fetch(url: string, options?: RequestInit): Promise<Response>
+```
+
+### API Client
+
+Type-safe API client for Superset endpoints.
+
+```typescript
+// Get chart data
+network.api.charts.get(id: number): Promise<Chart>
+
+// Query database
+network.api.sqlLab.execute(
+  databaseId: number,
+  query: string
+): Promise<QueryResult>
+
+// Get datasets
+network.api.datasets.list(): Promise<Dataset[]>
+```
+
+## Utility Functions
+
+### Formatting
+
+```typescript
+// Format numbers
+utils.formatNumber(value: number, format?: string): string
+
+// Format dates
+utils.formatDate(date: Date, format?: string): string
+
+// Format SQL
+utils.formatSQL(sql: string): string
+```
+
+### Validation
+
+```typescript
+// Validate SQL syntax
+utils.validateSQL(sql: string): ValidationResult
+
+// Check if valid database ID
+utils.isValidDatabaseId(id: any): boolean
+```
+
+## TypeScript Types
+
+Import common types for type safety:
+
+```typescript
+import type {
+  Database,
+  Dataset,
+  Chart,
+  Dashboard,
+  Query,
+  QueryResult,
+  Tab,
+  Panel,
+  User,
+  Role,
+  Permission,
+  ExtensionContext,
+  Disposable,
+  Event,
+  // ...
more types +} from '@apache-superset/core'; +``` + +## Extension Context + +The context object passed to your extension's `activate` function: + +```typescript +interface ExtensionContext { + // Subscription management + subscriptions: Disposable[]; + + // Extension metadata + extensionId: string; + extensionPath: string; + + // API namespaces + core: CoreAPI; + commands: CommandsAPI; + sqlLab: SqlLabAPI; + authentication: AuthenticationAPI; + extensions: ExtensionsAPI; + storage: StorageAPI; + network: NetworkAPI; + utils: UtilsAPI; + + // Logging + logger: Logger; +} +``` + +## Event Handling + +Events follow the VS Code pattern with subscribe/dispose: + +```typescript +// Subscribe to event +const disposable = sqlLab.onDidQueryRun((result) => { + // Handle event +}); + +// Dispose when done +disposable.dispose(); + +// Or add to context for automatic cleanup +context.subscriptions.push(disposable); +``` + +## Best Practices + +1. **Always dispose subscriptions** to prevent memory leaks +2. **Use TypeScript** for better IDE support and type safety +3. **Handle errors gracefully** with try-catch blocks +4. **Check permissions** before sensitive operations +5. **Use provided UI components** for consistency +6. **Cache API responses** when appropriate +7. 
**Validate user input** before processing + +## Version Compatibility + +The frontend API follows semantic versioning: + +- **Major version**: Breaking changes +- **Minor version**: New features, backward compatible +- **Patch version**: Bug fixes + +Check compatibility in your `extension.json`: + +```json +{ + "engines": { + "@apache-superset/core": "^1.0.0" + } +} +``` diff --git a/docs/developer_portal/architecture/overview.md b/docs/developer_portal/architecture/overview.md new file mode 100644 index 00000000000..452e8fe1688 --- /dev/null +++ b/docs/developer_portal/architecture/overview.md @@ -0,0 +1,348 @@ +--- +title: Architecture Overview +sidebar_position: 1 +hide_title: true +--- + + + +# Extension Architecture Overview + +The Superset extension architecture is designed to be modular, secure, and performant. This document provides a comprehensive overview of how extensions work and interact with the Superset host application. + +## Core Principles + +### 1. Lean Core +Superset's core remains minimal, with features delegated to extensions wherever possible. Built-in features use the same APIs as external extensions, ensuring API quality through dogfooding. + +### 2. Explicit Contribution Points +All extension points are clearly defined and documented. Extensions declare their capabilities in metadata files, enabling predictable lifecycle management. + +### 3. Versioned APIs +Public interfaces follow semantic versioning, ensuring backward compatibility and safe evolution of the platform. + +### 4. Lazy Loading +Extensions load only when needed, minimizing performance impact and resource consumption. + +### 5. Composability +Architecture patterns and APIs are reusable across different Superset modules, promoting consistency. + +### 6. Community-Driven +The system evolves based on real-world feedback, with new extension points added as needs emerge. 
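Concretely, the explicit-contribution and versioned-API principles surface in an extension's metadata file. The sketch below combines the manifest fragments shown elsewhere in this guide (the `engines` block and the `commands`/`views` contributions); the exact top-level layout, including the `contributions` wrapper and the `name`/`version` fields, is illustrative rather than a definitive schema:

```json
{
  "name": "my-extension",
  "version": "1.0.0",
  "engines": {
    "@apache-superset/core": "^1.0.0"
  },
  "contributions": {
    "commands": [
      {
        "command": "my-extension.run",
        "title": "Run Analysis",
        "icon": "PlayCircleOutlined"
      }
    ],
    "views": {
      "sqllab.panels": [
        {
          "id": "my-panel",
          "name": "My Panel",
          "icon": "ToolOutlined"
        }
      ]
    }
  }
}
```

Declaring capabilities up front like this is what lets the host defer loading the extension's code until one of its contributions is actually used.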
+
+## System Architecture
+
+```mermaid
+graph TB
+    subgraph "Superset Host Application"
+        Core[Core Application]
+        API[Extension APIs]
+        Loader[Extension Loader]
+        Manager[Extension Manager]
+    end
+
+    subgraph "Core Packages"
+        FrontendCore["@apache-superset/core<br/>(Frontend)"]
+        BackendCore["apache-superset-core<br/>(Backend)"]
+        CLI["apache-superset-extensions-cli"]
+    end
+
+    subgraph "Extension"
+        Metadata[extension.json]
+        Frontend[Frontend Code]
+        Backend[Backend Code]
+        Bundle[.supx Bundle]
+    end
+
+    Core --> API
+    API --> FrontendCore
+    API --> BackendCore
+    Loader --> Manager
+    Manager --> Bundle
+    Frontend --> FrontendCore
+    Backend --> BackendCore
+    CLI --> Bundle
+```
+
+## Key Components
+
+### Host Application
+
+The Superset host application provides:
+
+- **Extension APIs**: Well-defined interfaces for extensions to interact with Superset
+- **Extension Manager**: Handles lifecycle, activation, and deactivation
+- **Module Loader**: Dynamically loads extension code using Webpack Module Federation
+- **Security Context**: Manages permissions and sandboxing for extensions
+
+### Core Packages
+
+#### @apache-superset/core (Frontend)
+- Shared UI components and utilities
+- TypeScript type definitions
+- Frontend API implementations
+- Event system and command registry
+
+#### apache-superset-core (Backend)
+- Python base classes and utilities
+- Database access APIs
+- Security and permission helpers
+- REST API registration
+
+#### apache-superset-extensions-cli
+- Project scaffolding
+- Build and bundling tools
+- Development server
+- Package management
+
+### Extension Structure
+
+Each extension consists of:
+
+- **Metadata** (`extension.json`): Declares capabilities and requirements
+- **Frontend**: React components and TypeScript code
+- **Backend**: Python modules and API endpoints
+- **Assets**: Styles, images, and other resources
+- **Bundle** (`.supx`): Packaged distribution format
+
+## Module Federation
+
+Extensions use Webpack Module Federation for dynamic loading:
+
+```javascript
+// Extension webpack.config.js
+new ModuleFederationPlugin({
+  name: 'my_extension',
+  filename: 'remoteEntry.[contenthash].js',
+  exposes: {
+    './index': './src/index.tsx',
+  },
+  externals: {
+    '@apache-superset/core': 'superset',
+  },
+  shared: {
+    react: { singleton:
true }, + 'react-dom': { singleton: true }, + } +}) +``` + +This allows: +- **Independent builds**: Extensions compile separately from Superset +- **Shared dependencies**: Common libraries like React aren't duplicated +- **Dynamic loading**: Extensions load at runtime without rebuilding Superset +- **Version compatibility**: Extensions declare compatible core versions + +## Extension Lifecycle + +### 1. Registration +```typescript +// Extension registered with host +extensionManager.register({ + name: 'my-extension', + version: '1.0.0', + manifest: manifestData +}); +``` + +### 2. Activation +```typescript +// activate() called when extension loads +export function activate(context: ExtensionContext) { + // Register contributions + const disposables = []; + + // Add panel + disposables.push( + context.core.registerView('my-panel', MyPanel) + ); + + // Register command + disposables.push( + context.commands.registerCommand('my-command', { + execute: () => { /* ... */ } + }) + ); + + // Store for cleanup + context.subscriptions.push(...disposables); +} +``` + +### 3. Runtime +- Extension responds to events +- Provides UI components when requested +- Executes commands when triggered +- Accesses APIs as needed + +### 4. 
Deactivation +```typescript +// Automatic cleanup of registered items +export function deactivate() { + // context.subscriptions automatically disposed + // Additional cleanup if needed +} +``` + +## Contribution Types + +### Views +Extensions can add panels and UI components: + +```json +{ + "views": { + "sqllab.panels": [{ + "id": "my-panel", + "name": "My Panel", + "icon": "ToolOutlined" + }] + } +} +``` + +### Commands +Define executable actions: + +```json +{ + "commands": [{ + "command": "my-extension.run", + "title": "Run Analysis", + "icon": "PlayCircleOutlined" + }] +} +``` + +### Menus +Add items to existing menus: + +```json +{ + "menus": { + "sqllab.editor": { + "primary": [{ + "command": "my-extension.run", + "when": "editorHasSelection" + }] + } + } +} +``` + +### API Endpoints +Register backend REST endpoints: + +```python +from superset_core.api import rest_api + +@rest_api.route('/my-endpoint') +def my_endpoint(): + return {'data': 'value'} +``` + +## Security Model + +### Permissions +- Extensions run with user's permissions +- No elevation of privileges +- Access controlled by Superset's RBAC + +### Sandboxing +- Frontend code runs in browser context +- Backend code runs in Python process +- Future: Optional sandboxed execution + +### Validation +- Manifest validation on upload +- Signature verification (future) +- Dependency scanning + +## Performance Considerations + +### Lazy Loading +- Extensions load only when features are accessed +- Code splitting for large extensions +- Cached after first load + +### Bundle Optimization +- Tree shaking removes unused code +- Minification reduces size +- Compression for network transfer + +### Resource Management +- Automatic cleanup on deactivation +- Memory leak prevention +- Event listener management + +## Development vs Production + +### Development Mode +```python +# superset_config.py +ENABLE_EXTENSIONS = True +LOCAL_EXTENSIONS = ['/path/to/extension'] +``` +- Hot reloading +- Source maps +- Debug 
logging + +### Production Mode +- Optimized bundles +- Cached assets +- Performance monitoring + +## Future Enhancements + +### Planned Features +- Enhanced sandboxing +- Extension marketplace +- Inter-extension communication +- Theme contributions +- Chart type extensions + +### API Expansion +- Dashboard extensions +- Database connector API +- Security provider interface +- Workflow automation + +## Best Practices + +### Do's +- ✅ Use TypeScript for type safety +- ✅ Follow semantic versioning +- ✅ Handle errors gracefully +- ✅ Clean up resources properly +- ✅ Document your extension + +### Don'ts +- ❌ Access private APIs +- ❌ Modify global state directly +- ❌ Block the main thread +- ❌ Store sensitive data insecurely +- ❌ Assume API stability in 0.x versions + +## Learn More + +- [API Reference](../api/frontend) +- [Development Guide](../getting-started) +- [Security Guidelines](./security) +- [Performance Optimization](./performance) diff --git a/docs/developer_portal/cli/overview.md b/docs/developer_portal/cli/overview.md new file mode 100644 index 00000000000..6aa43dfcf4a --- /dev/null +++ b/docs/developer_portal/cli/overview.md @@ -0,0 +1,466 @@ + +--- +title: CLI Documentation +sidebar_position: 1 +hide_title: true +--- + +# Superset Extensions CLI + +The `apache-superset-extensions-cli` provides command-line tools for creating, developing, and packaging Superset extensions. + +## Installation + +```bash +pip install apache-superset-extensions-cli +``` + +## Commands + +### init + +Creates a new extension project with the standard folder structure. + +```bash +superset-extensions init [options] +``` + +**Options:** +- `--template