Mirror of https://github.com/apache/superset.git, synced 2026-04-29 04:54:21 +00:00.

Compare commits: `query-iden...standardiz` (269 commits)
Commit SHA1s:

1dfe73d19c  bbda5e2008  53999c12dd  f554036d29  33e7932491  92b02d993b
72ba972e42  b89b0bdf5d  47414e18d4  9c9588cce6  471d9fe737  b381992a75
1204507d68  c7779578f9  b225432c55  547f297171  5c3c2599db  a8be5a5a0c
e1234b2264  75af53dc3d  0a45a89786  2b2cc96f11  59c01e016d  3895b8b127
da8c0f94e6  6908a733a0  e8e1466185  ff1f7b64e2  63bb1d55a4  c568d463b9
179a6f2cfe  695a20d009  e908775fb2  af05396227  277f03c207  48699a7194
b7d076bfee  009b99bfbb  b45141b2a1  4683a0827d  ffb617a4c8  9de1706baa
a95566f114  a82e310600  691926f0e1  a42185cd3b  89eb7b207c  f99022b242
f8b9e3ace4  9cbe5a90b8  f7fe617f4c  e6c8343fd0  6969f2cf7a  852adaa6cc
1f482b42eb  31e2143c84  b89e0d74be  1127ab6c07  8d210fc7b8  8acb2fb700
a3cbc9755f  28788fd1fa  21790814db  ff6dc03ddf  fbcdf6909c  fc95c4fc89
3a007f6284  2403d8d584  47874318df  f6353bd1e8  1101182654  d79fc92a1a
bce476c4a2  ecfb9f7d7c  58ebc57285  1a57e50bd6  f3884a2db8  cb899f691b
b25722ee2b  34e10f5972  e88096f75c  6d827cf905  ab13166e41  89f09ea57c
baec438be9  5309edf3a5  f50cbd7958  2465ab4a98  1947d4da76  e452f5b70d
698de7a38d  e2a9f2dead  1f80725b0e  c3cb5c7e99  f7dd0659bf  3c17ff8445
57d0e78d40  ae986903b3  6964f9bdbf  9efa9898ff  22b44421a4  02924b3c74
99539c786e  5e621ceb34  370a24da81  732506b3fa  1af9c8dba2  1dc22a9002
ad592c717e  38e15196f2  8131c24acd  952b620465  f3e3bd0348  1e1310dbd8
adaae8ba15  a66b7e98e0  3e12d97e8e  00304f77e1  e88db9f403  53e9cf6d17
5a004590e0  53503e32ae  246181a546  6f5d9c989a  8515792b04  923b2b1d77
486b0122d0  ae090fa74c  35ec6d308a  c62a6f5cee  cdd140b3cc  09cf49c2ba
ac4b4c7646  d0a6c78966  65f2071aa4  e8f37a3f89  19d229ea12  622a62d7a1
4a556f4ac4  7a1839ba1b  8f2afb8f4d  02586981da  8700a0b939  d843fef2ce
f45654c03c  761daec53d  407fb67f1e  49689eec6c  791ea9860d  2f8939d229
ccf6290120  96a1aa60e8  2ea0368c2d  9e407e4e80  360e58c181  22d5eb7835
7c4a77a909  4e209e51d0  7191ae55c8  17725ebc83  1a7a381bd5  daf207e5c2
72294c569f  792dd08d38  1e40e7d02b  7e98c75f01  b18de05ea4  9300652277
7c2ec4ca5f  6a83b6fd87  659cd33749  cb27d5fe8d  6c9cda758a  967134f540
25bb353f9d  9cf2472291  cf5b976659  70394e79ef  ea64f3122e  50197fc33e
c480fa7fcf  6fc734da51  762a11b0bb  f168dd69a8  becd0b8883  fd4570625a
54a5b58e40  a611278e04  5c2eb0a68c  0cbf4d5d4d  6006a21378  bf967d6ba4
131ae5aa9d  eca28582b6  14e90a0f52  a1c39d4906  0964a8bb7a  8de8f95a3c
16db999067  972be15dda  c9e06714f8  32626ab707  a9cd58508b  122bb68e5a
914ce9aa4f  bb572983cd  ff76ab647f  f554848c9f  dc0c389488  22b3cc0480
604d72cc98  913e068113  1a4e2173f5  c49789167b  1be2287b3a  e741a3167f
5f11f9097a  8783579aa8  c25b4221f8  9c771fb2ba  7f44992c4b  8df5860826
b794b192d1  3177131d52  89bf77b5c9  30e5684006  3f8472ca7b  efa8cb6fa4
ab59b7e9b0  c99843b13a  da55a6c94a  7a1c056374  1e5a4e9bdc  9b88527883
800c1639ec  43775e9373  9099b0f00d  77ffe65773  32f8f33a4f  710c277681
11324607d0  9c6271136d  c444eed63e  1df5e59fdf  77f66e7434  2c81eb6c39
09c4afc894  229d92590a  f4f516c64c  fe1fddde05  7e67deead7  6e02603098
4518f6999c  aff847b3af  b24aca0304  2c453035e4  4fe11869fc  a0a49f9300
29d2fac485  0c5da6cb5d  da6947d295  2db8f809ba  5912fad745  dc41c45bec
88ee90c579  bbb2279644  1958df6b83  58bd3bfcf0  d6eb6e08d0
125  .cursor/rules/dev-standard.mdc  (new file)

@@ -0,0 +1,125 @@

---
description: Apache Superset development standards and guidelines for Cursor IDE
globs: ["**/*.py", "**/*.ts", "**/*.tsx", "**/*.js", "**/*.jsx", "**/*.sql", "**/*.md"]
alwaysApply: true
---

# Apache Superset Development Standards for Cursor IDE

Apache Superset is a data visualization platform with a Flask/Python backend and a React/TypeScript frontend.

## ⚠️ CRITICAL: Ongoing Refactors (What NOT to Do)

**These migrations are actively happening - avoid deprecated patterns:**

### Frontend Modernization

- **NO `any` types** - Use proper TypeScript types
- **NO JavaScript files** - Convert to TypeScript (.ts/.tsx)
- **NO Enzyme** - Use React Testing Library/Jest (Enzyme fully removed)
- **Use @superset-ui/core** - Don't import Ant Design directly

### Testing Strategy Migration

- **Prefer unit tests** over integration tests
- **Prefer integration tests** over Cypress end-to-end tests
- **Cypress is a last resort** - Actively moving away from Cypress
- **Use Jest + React Testing Library** for component testing

### Backend Type Safety

- **Add type hints** - All new Python code needs proper typing
- **MyPy compliance** - Run `pre-commit run mypy` to validate
- **SQLAlchemy typing** - Use proper model annotations
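As a concrete illustration of this bar, a minimal sketch of a fully typed helper that `pre-commit run mypy` should accept. The function is hypothetical, not actual Superset code:

```python
from collections.abc import Sequence


def moving_average(values: Sequence[float], window: int) -> list[float]:
    """Return the trailing moving average of ``values`` over ``window`` points."""
    if window <= 0:
        raise ValueError("window must be positive")
    out: list[float] = []
    total = 0.0
    for i, value in enumerate(values):
        total += value
        if i >= window:
            # Drop the value that just fell out of the window
            total -= values[i - window]
        out.append(total / min(i + 1, window))
    return out
```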
## Code Standards

### TypeScript Frontend

- **NO `any` types** - Use proper TypeScript
- **Functional components** with hooks
- **@superset-ui/core** for UI components (not direct antd)
- **Jest** for testing (NO Enzyme)
- **Redux** for global state, hooks for local

### Python Backend

- **Type hints required** for all new code
- **MyPy compliant** - run `pre-commit run mypy`
- **SQLAlchemy models** with proper typing
- **pytest** for testing

### Apache License Headers

- **New files require ASF license headers** - When creating new code files, include the standard Apache Software Foundation license header
- **LLM instruction files are excluded** - Files like LLMS.md, CLAUDE.md, etc. are in `.rat-excludes` to avoid header token overhead

## Key Directory Structure

```
superset/
├── superset/                 # Python backend (Flask, SQLAlchemy)
│   ├── views/api/            # REST API endpoints
│   ├── models/               # Database models
│   └── connectors/           # Database connections
├── superset-frontend/src/    # React TypeScript frontend
│   ├── components/           # Reusable components
│   ├── explore/              # Chart builder
│   ├── dashboard/            # Dashboard interface
│   └── SqlLab/               # SQL editor
├── superset-frontend/packages/
│   └── superset-ui-core/     # UI component library (USE THIS)
├── tests/                    # Python/integration tests
├── docs/                     # Documentation (UPDATE FOR CHANGES)
└── UPDATING.md               # Breaking changes log
```

## Architecture Patterns

### Dataset-Centric Approach

Charts are built from enriched datasets containing:

- Dimension columns with labels/descriptions
- Predefined metrics as SQL expressions
- Self-service analytics within defined contexts
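To make that concrete, a hypothetical sketch of what such an enriched dataset carries; the field names are invented for illustration and are not Superset's actual schema:

```python
# Hypothetical illustration only; not Superset's real dataset schema.
enriched_dataset = {
    "table_name": "orders",
    "columns": [
        {
            "column_name": "region",
            "verbose_name": "Sales Region",
            "description": "Region the order shipped to",
        },
    ],
    "metrics": [
        # A predefined metric is a labeled SQL expression
        {"metric_name": "total_revenue", "expression": "SUM(price * quantity)"},
    ],
}
```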
### Security & Features

- **RBAC**: Role-based access via Flask-AppBuilder
- **Feature flags**: Control feature rollouts
- **Row-level security**: SQL-based data access control

## Test Utilities

### Python Test Helpers

- **`SupersetTestCase`** - Base class in `tests/integration_tests/base_tests.py`
- **`@with_config`** - Config mocking decorator
- **`@with_feature_flags`** - Feature flag testing
- **`login_as()`, `login_as_admin()`** - Authentication helpers
- **`create_dashboard()`, `create_slice()`** - Data setup utilities
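A hedged sketch of how these helpers compose in an integration test. Import paths and exact signatures are assumptions to verify against `tests/integration_tests/base_tests.py` and `conftest.py`; the flag name is for illustration only:

```python
# Sketch only: verify imports and signatures against the real test suite.
from tests.integration_tests.base_tests import SupersetTestCase
from tests.integration_tests.conftest import with_feature_flags  # assumed location


class TestDashboardListing(SupersetTestCase):
    @with_feature_flags(DASHBOARD_RBAC=True)  # flag name for illustration
    def test_admin_sees_dashboard_list(self):
        self.login_as_admin()  # authentication helper from the list above
        response = self.client.get("/api/v1/dashboard/")
        assert response.status_code == 200
```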
### TypeScript Test Helpers

- **`superset-frontend/spec/helpers/testing-library.tsx`** - Custom render() with providers
- **`createWrapper()`** - Redux/Router/Theme wrapper
- **`selectOption()`** - Select component helper
- **React Testing Library** - NO Enzyme (removed)

## Pre-commit Validation

**Use pre-commit hooks for quality validation:**

```bash
# Install hooks
pre-commit install

# Quick validation (faster than --all-files)
pre-commit run           # Staged files only
pre-commit run mypy      # Python type checking
pre-commit run prettier  # Code formatting
pre-commit run eslint    # Frontend linting
```

## Development Guidelines

- **Documentation**: Update docs/ for any user-facing changes
- **Breaking Changes**: Add to UPDATING.md
- **Docstrings**: Required for new functions/classes
- **Follow existing patterns**: Mimic code style, use existing libraries and utilities
- **Type Safety**: This codebase is actively modernizing toward full TypeScript and type safety
- **Always run `pre-commit run`** to validate changes before committing

---

**Note**: This codebase is actively modernizing toward full TypeScript and type safety. Always run `pre-commit run` to validate changes. Follow the ongoing refactors section to avoid deprecated patterns.
20  .devcontainer/Dockerfile  (new file)

@@ -0,0 +1,20 @@

# Keep this in sync with the base image in the main Dockerfile (ARG PY_VER)
FROM python:3.11.13-bookworm AS base

# Install system dependencies that Superset needs
# This layer will be cached across Codespace sessions
RUN apt-get update && apt-get install -y \
    libsasl2-dev \
    libldap2-dev \
    libpq-dev \
    tmux \
    gh \
    && rm -rf /var/lib/apt/lists/*

# Install uv for fast Python package management
# This will also be cached in the image
RUN curl -LsSf https://astral.sh/uv/install.sh | sh && \
    echo 'export PATH="/root/.cargo/bin:$PATH"' >> /etc/bash.bashrc

# Set the cargo/bin directory in PATH for all users
ENV PATH="/root/.cargo/bin:${PATH}"
16  .devcontainer/README.md  (new file)

@@ -0,0 +1,16 @@

# Superset Development with GitHub Codespaces

For complete documentation on using GitHub Codespaces with Apache Superset, please see:

**[Setting up a Development Environment - GitHub Codespaces](https://superset.apache.org/docs/contributing/development#github-codespaces-cloud-development)**

## Pre-installed Development Environment

When you create a new Codespace from this repository, it automatically:

1. **Creates a Python virtual environment** using `uv venv`
2. **Installs all development dependencies** via `uv pip install -r requirements/development.txt`
3. **Sets up pre-commit hooks** with `pre-commit install`
4. **Activates the virtual environment** automatically in all terminals

The virtual environment is located at `/workspaces/{repository-name}/.venv` and is automatically activated through environment variables set in the devcontainer configuration.
62  .devcontainer/bashrc-additions  (new file)

@@ -0,0 +1,62 @@

# Superset Codespaces environment setup
# This file is appended to ~/.bashrc during Codespace setup

# Find the workspace directory (handles both 'superset' and 'superset-2' names)
WORKSPACE_DIR=$(find /workspaces -maxdepth 1 -name "superset*" -type d | head -1)

if [ -n "$WORKSPACE_DIR" ]; then
    # Check if virtual environment exists
    if [ -d "$WORKSPACE_DIR/.venv" ]; then
        # Activate the virtual environment
        source "$WORKSPACE_DIR/.venv/bin/activate"
        echo "✅ Python virtual environment activated"

        # Verify pre-commit is installed and set up
        if command -v pre-commit &> /dev/null; then
            echo "✅ pre-commit is available ($(pre-commit --version))"
            # Install git hooks if not already installed
            if [ -d "$WORKSPACE_DIR/.git" ] && [ ! -f "$WORKSPACE_DIR/.git/hooks/pre-commit" ]; then
                echo "🪝 Installing pre-commit hooks..."
                cd "$WORKSPACE_DIR" && pre-commit install
            fi
        else
            echo "⚠️ pre-commit not found. Run: pip install pre-commit"
        fi
    else
        echo "⚠️ Python virtual environment not found at $WORKSPACE_DIR/.venv"
        echo "   Run: cd $WORKSPACE_DIR && .devcontainer/setup-dev.sh"
    fi

    # Always cd to the workspace directory for convenience
    cd "$WORKSPACE_DIR"
fi

# Add helpful aliases for Superset development
alias start-superset="$WORKSPACE_DIR/.devcontainer/start-superset.sh"
alias setup-dev="$WORKSPACE_DIR/.devcontainer/setup-dev.sh"

# Show helpful message on login
echo ""
echo "🚀 Superset Codespaces Environment"
echo "=================================="

# Check if Superset is running
if docker ps 2>/dev/null | grep -q "superset"; then
    echo "✅ Superset is running!"
    echo "   - Check the 'Ports' tab for your live Superset URL"
    echo "   - Initial startup takes 10-20 minutes"
    echo "   - Login: admin/admin"
else
    echo "⚠️ Superset is not running. Use: start-superset"
    # Check if there's a startup log
    if [ -f "/tmp/superset-startup.log" ]; then
        echo "   📋 Startup log found: cat /tmp/superset-startup.log"
    fi
fi

echo ""
echo "Quick commands:"
echo "  start-superset  - Start Superset with Docker Compose"
echo "  setup-dev       - Set up Python environment (if not already done)"
echo "  pre-commit run  - Run pre-commit checks on staged files"
echo ""
20  .devcontainer/build-and-push-image.sh  (new executable file)

@@ -0,0 +1,20 @@

#!/bin/bash
# Script to build and push the devcontainer image to GitHub Container Registry
# This allows caching the image between Codespace sessions

# You'll need to run this with appropriate GitHub permissions
# gh auth login --scopes write:packages

REGISTRY="ghcr.io"
OWNER="apache"
REPO="superset"
TAG="devcontainer-base"

echo "Building devcontainer image..."
docker build -t $REGISTRY/$OWNER/$REPO:$TAG .devcontainer/

echo "Pushing to GitHub Container Registry..."
docker push $REGISTRY/$OWNER/$REPO:$TAG

echo "Done! Update .devcontainer/devcontainer.json to use:"
echo "  \"image\": \"$REGISTRY/$OWNER/$REPO:$TAG\""
66  .devcontainer/devcontainer.json  (new file)

@@ -0,0 +1,66 @@

{
  "name": "Apache Superset Development",
  // Option 1: Use pre-built image directly
  // "image": "ghcr.io/apache/superset:devcontainer-base",

  // Option 2: Build from Dockerfile with cache (current approach)
  "build": {
    "dockerfile": "Dockerfile",
    "context": ".",
    // Cache from the Apache registry image
    "cacheFrom": ["ghcr.io/apache/superset:devcontainer-base"]
  },

  "features": {
    "ghcr.io/devcontainers/features/docker-in-docker:2": {
      "moby": true,
      "dockerDashComposeVersion": "v2"
    },
    "ghcr.io/devcontainers/features/node:1": {
      "version": "20"
    },
    "ghcr.io/devcontainers/features/git:1": {},
    "ghcr.io/devcontainers/features/common-utils:2": {
      "configureZshAsDefaultShell": true
    },
    "ghcr.io/devcontainers/features/sshd:1": {
      "version": "latest"
    }
  },

  // Forward ports for development
  "forwardPorts": [9001],
  "portsAttributes": {
    "9001": {
      "label": "Superset (via Webpack Dev Server)",
      "onAutoForward": "notify",
      "visibility": "public"
    }
  },

  // Run commands after container is created
  "postCreateCommand": "bash .devcontainer/setup-dev.sh || echo '⚠️ Setup had issues - run .devcontainer/setup-dev.sh manually'",

  // Auto-start Superset after ensuring Docker is ready
  // Run in foreground to see any errors, but don't block on failures
  "postStartCommand": "bash -c 'echo \"Waiting 30s for services to initialize...\"; sleep 30; .devcontainer/start-superset.sh || echo \"⚠️ Auto-start failed - run start-superset manually\"'",

  // Set environment variables
  "remoteEnv": {
    // Removed automatic venv activation to prevent startup issues
    // The setup script will handle this
  },

  // VS Code customizations
  "customizations": {
    "vscode": {
      "extensions": [
        "ms-python.python",
        "ms-python.vscode-pylance",
        "charliermarsh.ruff",
        "dbaeumer.vscode-eslint",
        "esbenp.prettier-vscode"
      ]
    }
  }
}
78  .devcontainer/setup-dev.sh  (new executable file)

@@ -0,0 +1,78 @@

#!/bin/bash
# Setup script for Superset Codespaces development environment

echo "🔧 Setting up Superset development environment..."

# System dependencies and uv are now pre-installed in the Docker image
# This speeds up Codespace creation significantly!

# Create virtual environment using uv
echo "🐍 Creating Python virtual environment..."
if ! uv venv; then
    echo "❌ Failed to create virtual environment"
    exit 1
fi

# Install Python dependencies
echo "📦 Installing Python dependencies..."
if ! uv pip install -r requirements/development.txt; then
    echo "❌ Failed to install Python dependencies"
    echo "💡 You may need to run this manually after the Codespace starts"
    exit 1
fi

# Install pre-commit hooks
echo "🪝 Installing pre-commit hooks..."
if source .venv/bin/activate && pre-commit install; then
    echo "✅ Pre-commit hooks installed"
else
    echo "⚠️ Pre-commit hooks installation failed (non-critical)"
fi

# Install Claude Code CLI via npm
echo "🤖 Installing Claude Code..."
if npm install -g @anthropic-ai/claude-code; then
    echo "✅ Claude Code installed"
else
    echo "⚠️ Claude Code installation failed (non-critical)"
fi

# Make the start script executable
chmod +x .devcontainer/start-superset.sh

# Add bashrc additions for automatic venv activation
echo "🔧 Setting up automatic environment activation..."
if [ -f ~/.bashrc ]; then
    # Check if we've already added our additions
    if ! grep -q "Superset Codespaces environment setup" ~/.bashrc; then
        echo "" >> ~/.bashrc
        cat .devcontainer/bashrc-additions >> ~/.bashrc
        echo "✅ Added automatic venv activation to ~/.bashrc"
    else
        echo "✅ Bashrc additions already present"
    fi
else
    # Create bashrc if it doesn't exist
    cat .devcontainer/bashrc-additions > ~/.bashrc
    echo "✅ Created ~/.bashrc with automatic venv activation"
fi

# Also add to zshrc since that's the default shell
if [ -f ~/.zshrc ] || [ -n "$ZSH_VERSION" ]; then
    if ! grep -q "Superset Codespaces environment setup" ~/.zshrc; then
        echo "" >> ~/.zshrc
        cat .devcontainer/bashrc-additions >> ~/.zshrc
        echo "✅ Added automatic venv activation to ~/.zshrc"
    fi
fi

echo "✅ Development environment setup complete!"
echo ""
echo "📝 The virtual environment will be automatically activated in new terminals"
echo ""
echo "🔄 To activate in this terminal, run:"
echo "   source ~/.bashrc"
echo ""
echo "🚀 To start Superset:"
echo "   start-superset"
echo ""
108  .devcontainer/start-superset.sh  (new executable file)

@@ -0,0 +1,108 @@

#!/bin/bash
# Startup script for Superset in Codespaces

# Log to a file for debugging
LOG_FILE="/tmp/superset-startup.log"
echo "[$(date)] Starting Superset startup script" >> "$LOG_FILE"
echo "[$(date)] User: $(whoami), PWD: $(pwd)" >> "$LOG_FILE"

echo "🚀 Starting Superset in Codespaces..."
echo "🌐 Frontend will be available at port 9001"

# Find the workspace directory (Codespaces clones as 'superset', not 'superset-2')
WORKSPACE_DIR=$(find /workspaces -maxdepth 1 -name "superset*" -type d | head -1)
if [ -n "$WORKSPACE_DIR" ]; then
    cd "$WORKSPACE_DIR"
    echo "📁 Working in: $WORKSPACE_DIR"
else
    echo "📁 Using current directory: $(pwd)"
fi

# Wait for Docker to be available
echo "⏳ Waiting for Docker to start..."
echo "[$(date)] Waiting for Docker..." >> "$LOG_FILE"
max_attempts=30
attempt=0
while ! docker info > /dev/null 2>&1; do
    if [ $attempt -eq $max_attempts ]; then
        echo "❌ Docker failed to start after $max_attempts attempts"
        echo "[$(date)] Docker failed to start after $max_attempts attempts" >> "$LOG_FILE"
        echo "🔄 Please restart the Codespace or run this script manually later"
        exit 1
    fi
    echo "   Attempt $((attempt + 1))/$max_attempts..."
    echo "[$(date)] Docker check attempt $((attempt + 1))/$max_attempts" >> "$LOG_FILE"
    sleep 2
    attempt=$((attempt + 1))
done
echo "✅ Docker is ready!"
echo "[$(date)] Docker is ready" >> "$LOG_FILE"

# Check if Superset containers are already running
if docker ps | grep -q "superset"; then
    echo "✅ Superset containers are already running!"
    echo ""
    echo "🌐 To access Superset:"
    echo "   1. Click the 'Ports' tab at the bottom of VS Code"
    echo "   2. Find port 9001 and click the globe icon to open"
    echo "   3. Wait 10-20 minutes for initial startup"
    echo ""
    echo "📝 Login credentials: admin/admin"
    exit 0
fi

# Clean up any existing containers
echo "🧹 Cleaning up existing containers..."
docker-compose -f docker-compose-light.yml down

# Start services
echo "🏗️ Starting Superset in background (daemon mode)..."
echo ""

# Start in detached mode
docker-compose -f docker-compose-light.yml up -d

echo ""
echo "✅ Docker Compose started successfully!"
echo ""
echo "📋 Important information:"
echo "━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━"
echo "⏱️ Initial startup takes 10-20 minutes"
echo "🌐 Check the 'Ports' tab for your Superset URL (port 9001)"
echo "👤 Login: admin / admin"
echo ""
echo "📊 Useful commands:"
echo "  docker-compose -f docker-compose-light.yml logs -f  # Follow logs"
echo "  docker-compose -f docker-compose-light.yml ps       # Check status"
echo "  docker-compose -f docker-compose-light.yml down     # Stop services"
echo "━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━"
echo ""
echo "💤 Keeping terminal open for 60 seconds to test persistence..."
sleep 60
echo "✅ Test complete - check if this terminal is still visible!"

# Show final status
docker-compose -f docker-compose-light.yml ps
EXIT_CODE=$?

# If it failed, provide helpful instructions
if [ $EXIT_CODE -ne 0 ] && [ $EXIT_CODE -ne 130 ]; then  # 130 is Ctrl+C
    echo ""
    echo "❌ Superset startup failed (exit code: $EXIT_CODE)"
    echo ""
    echo "🔄 To restart Superset, run:"
    echo "   .devcontainer/start-superset.sh"
    echo ""
    echo "🔧 For troubleshooting:"
    echo "   # View logs:"
    echo "   docker-compose -f docker-compose-light.yml logs"
    echo ""
    echo "   # Clean restart (removes volumes):"
    echo "   docker-compose -f docker-compose-light.yml down -v"
    echo "   .devcontainer/start-superset.sh"
    echo ""
    echo "   # Common issues:"
    echo "   - Network timeouts: Just retry, often transient"
    echo "   - Port conflicts: Check 'docker ps'"
    echo "   - Database issues: Try clean restart with -v"
fi
12  .github/CODEOWNERS

@@ -2,7 +2,7 @@

 # https://github.com/apache/superset/issues/13351

-/superset/migrations/ @mistercrunch @michael-s-molina @betodealmeida @eschutho
+/superset/migrations/ @mistercrunch @michael-s-molina @betodealmeida @eschutho @sadpandajoe

 # Notify some committers of changes in the components

@@ -30,3 +30,13 @@

 **/*.geojson @villebro @rusackas
 /superset-frontend/plugins/legacy-plugin-chart-country-map/ @villebro @rusackas
+
+# Notify PMC members of changes to extension-related files
+
+/superset-core/ @michael-s-molina @villebro
+/superset-cli/ @michael-s-molina @villebro
+/superset/core/ @michael-s-molina @villebro
+/superset/extensions/ @michael-s-molina @villebro
+/superset-frontend/src/packages/superset-core/ @michael-s-molina @villebro
+/superset-frontend/src/core/ @michael-s-molina @villebro
+/superset-frontend/src/extensions/ @michael-s-molina @villebro
2  .github/ISSUE_TEMPLATE/bug-report.yml

@@ -42,7 +42,7 @@ body:
       options:
         - master / latest-dev
         - "5.0.0"
-        - "4.1.2"
+        - "4.1.3"
     validations:
       required: true
   - type: dropdown
19  .github/actions/change-detector/action.yml

@@ -1,24 +1,27 @@
-name: 'Change Detector'
-description: 'Detects file changes for pull request and push events'
+name: Change Detector
+description: Detects file changes for pull request and push events
 inputs:
   token:
-    description: 'GitHub token for authentication'
+    description: GitHub token for authentication
     required: true
 outputs:
   python:
-    description: 'Whether Python-related files were changed'
+    description: Whether Python-related files were changed
     value: ${{ steps.change-detector.outputs.python }}
   frontend:
-    description: 'Whether frontend-related files were changed'
+    description: Whether frontend-related files were changed
     value: ${{ steps.change-detector.outputs.frontend }}
   docker:
-    description: 'Whether docker-related files were changed'
+    description: Whether docker-related files were changed
     value: ${{ steps.change-detector.outputs.docker }}
   docs:
-    description: 'Whether docs-related files were changed'
+    description: Whether docs-related files were changed
     value: ${{ steps.change-detector.outputs.docs }}
+  superset-cli:
+    description: Whether superset-cli package-related files were changed
+    value: ${{ steps.change-detector.outputs.superset-cli }}
 runs:
-  using: 'composite'
+  using: composite
   steps:
     - name: Detect file changes
       id: change-detector
1  .github/copilot-instructions.md  (new symbolic link)

@@ -0,0 +1 @@
../LLMS.md
82  .github/workflows/claude.yml  (new file)

@@ -0,0 +1,82 @@

name: Claude PR Assistant

on:
  issue_comment:
    types: [created]
  pull_request_review_comment:
    types: [created]

jobs:
  check-permissions:
    if: |
      (github.event_name == 'issue_comment' && contains(github.event.comment.body, '@claude')) ||
      (github.event_name == 'pull_request_review_comment' && contains(github.event.comment.body, '@claude'))
    runs-on: ubuntu-latest
    outputs:
      allowed: ${{ steps.check.outputs.allowed }}
    steps:
      - name: Check if user is allowed
        id: check
        run: |
          # List of allowed users
          ALLOWED_USERS="mistercrunch,rusackas"

          # Get the commenter's username
          COMMENTER="${{ github.event.comment.user.login }}"

          echo "Checking permissions for user: $COMMENTER"

          # Check if user is in allowed list
          if [[ ",$ALLOWED_USERS," == *",$COMMENTER,"* ]]; then
            echo "allowed=true" >> $GITHUB_OUTPUT
            echo "✅ User $COMMENTER is allowed to use Claude"
          else
            echo "allowed=false" >> $GITHUB_OUTPUT
            echo "❌ User $COMMENTER is not allowed to use Claude"
          fi

  deny-access:
    needs: check-permissions
    if: needs.check-permissions.outputs.allowed == 'false'
    runs-on: ubuntu-latest
    permissions:
      issues: write
      pull-requests: write
    steps:
      - name: Comment access denied
        uses: actions/github-script@v7
        with:
          script: |
            const message = `👋 Hi @${{ github.event.comment.user.login || github.event.review.user.login || github.event.issue.user.login }}!

            Thanks for trying to use Claude Code, but currently only certain team members have access to this feature.

            If you believe you should have access, please contact a project maintainer.`;

            await github.rest.issues.createComment({
              owner: context.repo.owner,
              repo: context.repo.repo,
              issue_number: context.issue.number,
              body: message
            });

  claude-code-action:
    needs: check-permissions
    if: needs.check-permissions.outputs.allowed == 'true'
    runs-on: ubuntu-latest
    permissions:
      contents: write
      pull-requests: write
      issues: write
      id-token: write
    steps:
      - name: Checkout repository
        uses: actions/checkout@v4
        with:
          fetch-depth: 1

      - name: Run Claude PR Action
        uses: anthropics/claude-code-action@beta
        with:
          anthropic_api_key: ${{ secrets.ANTHROPIC_API_KEY }}
          timeout_minutes: "60"
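One detail worth calling out in the `check-permissions` job: `ALLOWED_USERS` is wrapped in commas before the substring test, so one username can never match as a prefix or suffix of another. The same check, sketched in Python for clarity:

```python
ALLOWED_USERS = "mistercrunch,rusackas"


def is_allowed(commenter: str) -> bool:
    # ",mistercrunch,rusackas," contains ",rusackas," but never ",rus,"
    return f",{commenter}," in f",{ALLOWED_USERS},"


assert is_allowed("rusackas")
assert not is_allowed("rus")      # prefix of an allowed user
assert not is_allowed("crunch")   # suffix of an allowed user
```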
2  .github/workflows/docker.yml

@@ -102,7 +102,7 @@ jobs:
           docker history $IMAGE_TAG

       - name: docker-compose sanity check
-        if: (steps.check.outputs.python || steps.check.outputs.frontend || steps.check.outputs.docker) && (matrix.build_preset == 'dev' || matrix.build_preset == 'lean')
+        if: (steps.check.outputs.python || steps.check.outputs.frontend || steps.check.outputs.docker) && matrix.build_preset == 'dev'
         shell: bash
         run: |
           export SUPERSET_BUILD_TARGET=${{ matrix.build_preset }}
67  .github/workflows/superset-app-cli.yml  (new file)

@@ -0,0 +1,67 @@

name: Superset App CLI tests

on:
  push:
    branches:
      - "master"
      - "[0-9].[0-9]*"
  pull_request:
    types: [synchronize, opened, reopened, ready_for_review]

# cancel previous workflow jobs for PRs
concurrency:
  group: ${{ github.workflow }}-${{ github.event.pull_request.number || github.run_id }}
  cancel-in-progress: true

jobs:
  test-load-examples:
    runs-on: ubuntu-24.04
    env:
      PYTHONPATH: ${{ github.workspace }}
      SUPERSET_CONFIG: tests.integration_tests.superset_test_config
      REDIS_PORT: 16379
      SUPERSET__SQLALCHEMY_DATABASE_URI: postgresql+psycopg2://superset:superset@127.0.0.1:15432/superset
    services:
      postgres:
        image: postgres:16-alpine
        env:
          POSTGRES_USER: superset
          POSTGRES_PASSWORD: superset
        ports:
          # Use custom ports for services to avoid accidentally connecting to
          # GitHub action runner's default installations
          - 15432:5432
      redis:
        image: redis:7-alpine
        ports:
          - 16379:6379
    steps:
      - name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
        uses: actions/checkout@v4
        with:
          persist-credentials: false
          submodules: recursive
      - name: Check for file changes
        id: check
        uses: ./.github/actions/change-detector/
        with:
          token: ${{ secrets.GITHUB_TOKEN }}
      - name: Setup Python
        if: steps.check.outputs.python
        uses: ./.github/actions/setup-backend/
      - name: Setup Postgres
        if: steps.check.outputs.python
        uses: ./.github/actions/cached-dependencies
        with:
          run: setup-postgres
      - name: superset init
        if: steps.check.outputs.python
        run: |
          pip install -e .
          superset db upgrade
          superset load_test_users
      - name: superset load_examples
        if: steps.check.outputs.python
        run: |
          # load examples without test data
          superset load_examples --load-big-data
69  .github/workflows/superset-cli.yml

@@ -1,4 +1,4 @@
-name: Superset CLI tests
+name: Superset CLI Package Tests

 on:
   push:
@@ -14,54 +14,51 @@ concurrency:
   cancel-in-progress: true

 jobs:
-  test-load-examples:
+  test-superset-cli-package:
     runs-on: ubuntu-24.04
-    env:
-      PYTHONPATH: ${{ github.workspace }}
-      SUPERSET_CONFIG: tests.integration_tests.superset_test_config
-      REDIS_PORT: 16379
-      SUPERSET__SQLALCHEMY_DATABASE_URI: postgresql+psycopg2://superset:superset@127.0.0.1:15432/superset
-    services:
-      postgres:
-        image: postgres:16-alpine
-        env:
-          POSTGRES_USER: superset
-          POSTGRES_PASSWORD: superset
-        ports:
-          # Use custom ports for services to avoid accidentally connecting to
-          # GitHub action runner's default installations
-          - 15432:5432
-      redis:
-        image: redis:7-alpine
-        ports:
-          - 16379:6379
+    strategy:
+      matrix:
+        python-version: ["previous", "current", "next"]
+    defaults:
+      run:
+        working-directory: superset-cli
     steps:
       - name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
         uses: actions/checkout@v4
         with:
           persist-credentials: false
           submodules: recursive

       - name: Check for file changes
         id: check
         uses: ./.github/actions/change-detector/
         with:
           token: ${{ secrets.GITHUB_TOKEN }}

       - name: Setup Python
-        if: steps.check.outputs.python
+        if: steps.check.outputs.superset-cli
         uses: ./.github/actions/setup-backend/
-      - name: Setup Postgres
-        if: steps.check.outputs.python
-        uses: ./.github/actions/cached-dependencies
-        with:
-          run: setup-postgres
-      - name: superset init
-        if: steps.check.outputs.python
-        run: |
-          pip install -e .
-          superset db upgrade
-          superset load_test_users
-      - name: superset load_examples
-        if: steps.check.outputs.python
-        run: |
-          # load examples without test data
-          superset load_examples --load-big-data
+        with:
+          python-version: ${{ matrix.python-version }}
+          requirements-type: dev
+
+      - name: Run pytest with coverage
+        if: steps.check.outputs.superset-cli
+        run: |
+          pytest --cov=superset_cli --cov-report=xml --cov-report=term-missing --cov-report=html -v --tb=short
+
+      - name: Upload coverage reports to Codecov
+        if: steps.check.outputs.superset-cli
+        uses: codecov/codecov-action@v3
+        with:
+          file: ./coverage.xml
+          flags: superset-cli
+          name: superset-cli-coverage
+          fail_ci_if_error: false
+
+      - name: Upload HTML coverage report
+        if: steps.check.outputs.superset-cli
+        uses: actions/upload-artifact@v4
+        with:
+          name: superset-cli-coverage-html
+          path: htmlcov/

@@ -51,7 +51,7 @@ jobs:
       SUPERSET_TESTENV: true
       SUPERSET_SECRET_KEY: not-a-secret
     run: |
-      pytest --durations-min=0.5 --cov-report= --cov=superset/sql/ ./tests/unit_tests/sql/ --cache-clear --cov-fail-under=100
+      pytest --durations-min=0.5 --cov=superset/sql/ ./tests/unit_tests/sql/ --cache-clear --cov-fail-under=100
     - name: Upload code coverage
       uses: codecov/codecov-action@v5
       with:
2  .github/workflows/welcome-new-users.yml

@@ -12,7 +12,7 @@ jobs:

     steps:
       - name: Welcome Message
-        uses: actions/first-interaction@v1
+        uses: actions/first-interaction@v2
         continue-on-error: true
         with:
           repo-token: ${{ github.token }}
9  .gitignore

@@ -43,7 +43,7 @@ _modules
 _static
 build
 app.db
-apache_superset.egg-info/
+*.egg-info/
 changelog.sh
 dist
 dump.rdb
@@ -92,6 +92,7 @@ scripts/*.zip
 # IntelliJ
 *.iml
 venv
+.venv
 @eaDir/

 # PyCharm
@@ -126,4 +127,10 @@ docker/*local*
 # Jest test report
 test-report.html
 superset/static/stats/statistics.html

+# LLM-related
+CLAUDE.local.md
+PROJECT.md
+.aider*
+.claude_rc*
+.env.local
.pre-commit-config.yaml

@@ -23,7 +23,9 @@ repos:
     rev: v1.15.0
     hooks:
       - id: mypy
+        name: mypy (main)
         args: [--check-untyped-defs]
+        exclude: ^superset-cli/
         additional_dependencies: [
           types-simplejson,
           types-python-dateutil,
@@ -38,6 +40,10 @@ repos:
           types-paramiko,
           types-Markdown,
         ]
+      - id: mypy
+        name: mypy (superset-cli)
+        args: [--check-untyped-defs]
+        files: ^superset-cli/
   - repo: https://github.com/pre-commit/pre-commit-hooks
     rev: v5.0.0
     hooks:
@@ -52,35 +58,27 @@ repos:
     - id: trailing-whitespace
       exclude: ^.*\.(snap)
       args: ["--markdown-linebreak-ext=md"]
-  - repo: https://github.com/pre-commit/mirrors-prettier
-    rev: v4.0.0-alpha.8 # Use the sha or tag you want to point at
-    hooks:
-      - id: prettier
-        additional_dependencies:
-          - prettier@3.5.3
-        args: ["--ignore-path=./superset-frontend/.prettierignore", "--exclude", "site-packages"]
-        files: "superset-frontend"
   - repo: local
     hooks:
       - id: eslint-frontend
        name: eslint (frontend)
        entry: ./scripts/eslint.sh
        language: system
        pass_filenames: true
        files: ^superset-frontend/.*\.(js|jsx|ts|tsx)$
      - id: eslint-docs
        name: eslint (docs)
        entry: bash -c 'cd docs && FILES=$(echo "$@" | sed "s|docs/||g") && yarn eslint --fix --ext .js,.jsx,.ts,.tsx --quiet $FILES'
        language: system
        pass_filenames: true
        files: ^docs/.*\.(js|jsx|ts|tsx)$
      - id: type-checking-frontend
        name: Type-Checking (Frontend)
-       entry: bash -c './scripts/check-type.js package=superset-frontend excludeDeclarationDir=cypress-base'
+       entry: ./scripts/check-type.js package=superset-frontend excludeDeclarationDir=cypress-base
        language: system
        files: ^superset-frontend\/.*\.(js|jsx|ts|tsx)$
        exclude: ^superset-frontend/cypress-base\/
        require_serial: true
 # blacklist unsafe functions like make_url (see #19526)
 - repo: https://github.com/skorokithakis/blacklist-pre-commit-hook
   rev: e2f070289d8eddcaec0b580d3bde29437e7c8221
@@ -97,25 +95,26 @@ repos:
 - repo: https://github.com/astral-sh/ruff-pre-commit
   rev: v0.9.7
   hooks:
-    - id: ruff-format
     - id: ruff
       args: [--fix]
+    - id: ruff-format
 - repo: local
   hooks:
     - id: pylint
       name: pylint with custom Superset plugins
      entry: bash
      language: system
      types: [python]
      exclude: ^(tests/|superset/migrations/|scripts/|RELEASING/|docker/)
      args:
        - -c
        - |
          TARGET_BRANCH=${GITHUB_BASE_REF:-master}
          git fetch origin "$TARGET_BRANCH"
-         files=$(git diff --name-only --diff-filter=ACM origin/"$TARGET_BRANCH"..HEAD | grep '^superset/.*\.py$' || true)
+         BASE=$(git merge-base origin/"$TARGET_BRANCH" HEAD)
+         files=$(git diff --name-only --diff-filter=ACM "$BASE"..HEAD | grep '^superset/.*\.py$' || true)
          if [ -n "$files" ]; then
-           pylint --rcfile=.pylintrc --load-plugins=superset.extensions.pylint $files
+           pylint --rcfile=.pylintrc --load-plugins=superset.extensions.pylint --reports=no $files
          else
            echo "No Python files to lint."
          fi
.rat-excludes

@@ -32,6 +32,8 @@ apache_superset.egg-info
 # json and csv in general cannot have comments
 .*json
 .*csv
+# jinja templates often need to be as-is
+.*j2
 # Generated doc files
 env/*
 docs/.htaccess*
@@ -76,3 +78,11 @@ ydb.svg
 erd.puml
 erd.svg
 intro_header.txt
+
+# for LLMs
+llm-context.md
+LLMS.md
+CLAUDE.md
+CURSOR.md
+GEMINI.md
+GPT.md
58  CHANGELOG/4.1.3.md  (new file)

@@ -0,0 +1,58 @@

<!--
Licensed to the Apache Software Foundation (ASF) under one
or more contributor license agreements. See the NOTICE file
distributed with this work for additional information
regarding copyright ownership. The ASF licenses this file
to you under the Apache License, Version 2.0 (the
"License"); you may not use this file except in compliance
with the License. You may obtain a copy of the License at

  http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing,
software distributed under the License is distributed on an
"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
KIND, either express or implied. See the License for the
specific language governing permissions and limitations
under the License.
-->

## Change Log

### 4.1.3 (Thu May 29 02:31:07 2025 -0500)

**Database Migrations**

**Features**

**Fixes**

- [#33522](https://github.com/apache/superset/pull/33522) fix(Sqllab): Autocomplete got stuck in UI when open it too fast (@rebenitez1802)
- [#33425](https://github.com/apache/superset/pull/33425) fix(table-chart): time shift is not working (@justinpark)
- [#32414](https://github.com/apache/superset/pull/32414) fix(api): Added uuid to list api calls (@withnale)
- [#33354](https://github.com/apache/superset/pull/33354) fix: loading examples from raw.githubusercontent.com fails with 429 errors (@mistercrunch)
- [#32382](https://github.com/apache/superset/pull/32382) fix(pinot): revert join and subquery flags (@yuribogomolov)
- [#32473](https://github.com/apache/superset/pull/32473) fix(plugin-chart-echarts): remove erroneous upper bound value (@villebro)
- [#33048](https://github.com/apache/superset/pull/33048) fix: improve error type on parse error (@justinpark)
- [#32968](https://github.com/apache/superset/pull/32968) fix(pivot-table): Revert "fix(Pivot Table): Fix column width to respect currency config (#31414)" (@justinpark)
- [#32795](https://github.com/apache/superset/pull/32795) fix(log): store navigation path to get correct logging path (@justinpark)
- [#33216](https://github.com/apache/superset/pull/33216) fix: Downgrade to marshmallow<4 (@amotl)
- [#32866](https://github.com/apache/superset/pull/32866) fix: make packages PEP 625 compliant (@sadpandajoe)
- [#32035](https://github.com/apache/superset/pull/32035) fix(fe/dashboard-list): display modifier info for `Last modified` data (@hainenber)
- [#32708](https://github.com/apache/superset/pull/32708) fix(logging): missing path in event data (@justinpark)
- [#32699](https://github.com/apache/superset/pull/32699) fix: Signature of Celery pruner jobs (@michael-s-molina)
- [#32681](https://github.com/apache/superset/pull/32681) fix(log): Update recent_activity by event name (@justinpark)
- [#32608](https://github.com/apache/superset/pull/32608) fix(welcome): perf on distinct recent activities (@justinpark)
- [#32572](https://github.com/apache/superset/pull/32572) fix: Log table retention policy (@michael-s-molina)
- [#32406](https://github.com/apache/superset/pull/32406) fix(model/helper): represent RLS filter clause in proper textual SQL string (@hainenber)
- [#32240](https://github.com/apache/superset/pull/32240) fix: upgrade to 3.11.11-slim-bookworm to address critical vulnerabilities (@gpchandran)
- [#30858](https://github.com/apache/superset/pull/30858) fix(chart data): removing query from /chart/data payload when accessing as guest user (@fisjac)

**Others**

- [#33612](https://github.com/apache/superset/pull/33612) chore: update Dockerfile - Upgrade to 3.11.12 (@gpchandran)
- [#33435](https://github.com/apache/superset/pull/33435) docs: CVEs fixed on 4.1.2 (@sha174n)
- [#33339](https://github.com/apache/superset/pull/33339) chore(🦾): bump python h11 0.14.0 -> 0.16.0 (@github-actions[bot])
- [#32745](https://github.com/apache/superset/pull/32745) chore(🦾): bump python sqlglot 26.1.3 -> 26.11.1 (@github-actions[bot])
- [#32782](https://github.com/apache/superset/pull/32782) chore: Revert "chore: bump base image in Dockerfile with `ARG PY_VER=3.11.11-slim-bookworm`" (@sadpandajoe)
- [#32780](https://github.com/apache/superset/pull/32780) chore: bump base image in Dockerfile with `ARG PY_VER=3.11.11-slim-bookworm` (@gpchandran)
47  Dockerfile

@@ -59,7 +59,7 @@ RUN mkdir -p /app/superset/static/assets \
 # NOTE: we mount packages and plugins as they are referenced in package.json as workspaces
 # ideally we'd COPY only their package.json. Here npm ci will be cached as long
 # as the full content of these folders don't change, yielding a decent cache reuse rate.
-# Note that's it's not possible selectively COPY of mount using blobs.
+# Note that it's not possible to selectively COPY or mount using blobs.
 RUN --mount=type=bind,source=./superset-frontend/package.json,target=./package.json \
     --mount=type=bind,source=./superset-frontend/package-lock.json,target=./package-lock.json \
     --mount=type=cache,target=/root/.cache \
@@ -74,7 +74,7 @@ RUN --mount=type=bind,source=./superset-frontend/package.json,target=./package.j
 COPY superset-frontend /app/superset-frontend

 ######################################################################
-# superset-node used for compile frontend assets
+# superset-node is used for compiling frontend assets
 ######################################################################
 FROM superset-node-ci AS superset-node

@@ -90,7 +90,7 @@ RUN --mount=type=cache,target=/root/.npm \
 # Copy translation files
 COPY superset/translations /app/superset/translations

-# Build the frontend if not in dev mode
+# Build translations if enabled, then cleanup localization files
 RUN if [ "$BUILD_TRANSLATIONS" = "true" ]; then \
         npm run build-translation; \
     fi; \
@@ -145,6 +145,9 @@ RUN if [ "$BUILD_TRANSLATIONS" = "true" ]; then \
 ######################################################################
 FROM python-base AS python-common

+# Build arg to pre-populate examples DuckDB file
+ARG LOAD_EXAMPLES_DUCKDB="false"
+
 ENV SUPERSET_HOME="/app/superset_home" \
     HOME="/app/superset_home" \
     SUPERSET_ENV="production" \
@@ -167,7 +170,7 @@ RUN mkdir -p \
     && touch superset/static/version_info.json

 # Install Playwright and optionally setup headless browsers
-ARG INCLUDE_CHROMIUM="true"
+ARG INCLUDE_CHROMIUM="false"
 ARG INCLUDE_FIREFOX="false"
 RUN --mount=type=cache,target=${SUPERSET_HOME}/.cache/uv \
     if [ "$INCLUDE_CHROMIUM" = "true" ] || [ "$INCLUDE_FIREFOX" = "true" ]; then \
@@ -196,6 +199,18 @@ RUN /app/docker/apt-install.sh \
     libecpg-dev \
     libldap2-dev

+# Pre-load examples DuckDB file if requested
+RUN if [ "$LOAD_EXAMPLES_DUCKDB" = "true" ]; then \
+        mkdir -p /app/data && \
+        echo "Downloading pre-built examples.duckdb..." && \
+        curl -L -o /app/data/examples.duckdb \
+            "https://raw.githubusercontent.com/apache-superset/examples-data/master/examples.duckdb" && \
+        chown -R superset:superset /app/data; \
+    else \
+        mkdir -p /app/data && \
+        chown -R superset:superset /app/data; \
+    fi
+
 # Copy compiled things from previous stages
 COPY --from=superset-node /app/superset/static/assets superset/static/assets

@@ -219,11 +234,15 @@ FROM python-common AS lean

 # Install Python dependencies using docker/pip-install.sh
 COPY requirements/base.txt requirements/
+
+# Copy superset-core package needed for editable install in base.txt
+COPY superset-core superset-core
+
 RUN --mount=type=cache,target=${SUPERSET_HOME}/.cache/uv \
     /app/docker/pip-install.sh --requires-build-essential -r requirements/base.txt
 # Install the superset package
 RUN --mount=type=cache,target=${SUPERSET_HOME}/.cache/uv \
-    uv pip install .
+    uv pip install -e .
 RUN python -m compileall /app/superset

 USER superset
@@ -241,12 +260,17 @@ RUN /app/docker/apt-install.sh \

 # Copy development requirements and install them
 COPY requirements/*.txt requirements/
+
+# Copy local packages needed for editable installs in development.txt
+COPY superset-core superset-core
+COPY superset-cli superset-cli
+
 # Install Python dependencies using docker/pip-install.sh
 RUN --mount=type=cache,target=${SUPERSET_HOME}/.cache/uv \
     /app/docker/pip-install.sh --requires-build-essential -r requirements/development.txt
 # Install the superset package
 RUN --mount=type=cache,target=${SUPERSET_HOME}/.cache/uv \
-    uv pip install .
+    uv pip install -e .

 RUN uv pip install .[postgres]
 RUN python -m compileall /app/superset
@@ -258,6 +282,15 @@ USER superset
 ######################################################################
 FROM lean AS ci
 USER root
-RUN uv pip install .[postgres]
+RUN uv pip install .[postgres,duckdb]
 USER superset
 CMD ["/app/docker/entrypoints/docker-ci.sh"]
+
+######################################################################
+# Showtime image - lean + DuckDB for examples database
+######################################################################
+FROM lean AS showtime
+USER root
+RUN uv pip install .[duckdb]
+USER superset
+CMD ["/app/docker/entrypoints/docker-ci.sh"]
194  LLMS.md  (new file)

@@ -0,0 +1,194 @@

# LLM Context Guide for Apache Superset

Apache Superset is a data visualization platform with a Flask/Python backend and a React/TypeScript frontend.

## ⚠️ CRITICAL: Ongoing Refactors (What NOT to Do)

**These migrations are actively happening - avoid deprecated patterns:**

### Frontend Modernization

- **NO `any` types** - Use proper TypeScript types
- **NO JavaScript files** - Convert to TypeScript (.ts/.tsx)
- **Use @superset-ui/core** - Don't import Ant Design directly; prefer the Ant Design component wrappers from @superset-ui/core/components
- **Use antd theming tokens** - Prefer antd tokens over legacy theming tokens
- **Avoid custom CSS and styles** - Follow antd best practices and avoid custom styling whenever possible

### Testing Strategy Migration

- **Prefer unit tests** over integration tests
- **Prefer integration tests** over Cypress end-to-end tests
- **Cypress is a last resort** - Actively moving away from Cypress
- **Use Jest + React Testing Library** for component testing
- **Use `test()` instead of `describe()`** - Follow [avoid nesting when testing](https://kentcdodds.com/blog/avoid-nesting-when-youre-testing) principles

### Backend Type Safety

- **Add type hints** - All new Python code needs proper typing
- **MyPy compliance** - Run `pre-commit run mypy` to validate
- **SQLAlchemy typing** - Use proper model annotations

### UUID Migration

- **Prefer UUIDs over auto-incrementing IDs** - New models should use UUID primary keys
- **External API exposure** - Use UUIDs in public APIs instead of internal integer IDs
- **Existing models** - Add UUID fields alongside integer IDs for gradual migration
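A minimal sketch of the gradual-migration pattern above, assuming SQLAlchemy 2.0-style typed mappings; the model and its columns are illustrative, not an actual Superset class:

```python
import uuid

from sqlalchemy.orm import DeclarativeBase, Mapped, mapped_column


class Base(DeclarativeBase):
    pass


class Report(Base):  # illustrative model, not Superset's
    __tablename__ = "report"

    # Internal integer PK kept for existing joins and foreign keys
    id: Mapped[int] = mapped_column(primary_key=True)
    # UUID added alongside; this is what public APIs should expose
    uuid: Mapped[uuid.UUID] = mapped_column(default=uuid.uuid4, unique=True)
```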
## Key Directories

```
superset/
├── superset/                 # Python backend (Flask, SQLAlchemy)
│   ├── views/api/            # REST API endpoints
│   ├── models/               # Database models
│   └── connectors/           # Database connections
├── superset-frontend/src/    # React TypeScript frontend
│   ├── components/           # Reusable components
│   ├── explore/              # Chart builder
│   ├── dashboard/            # Dashboard interface
│   └── SqlLab/               # SQL editor
├── superset-frontend/packages/
│   └── superset-ui-core/     # UI component library (USE THIS)
├── tests/                    # Python/integration tests
├── docs/                     # Documentation (UPDATE FOR CHANGES)
└── UPDATING.md               # Breaking changes log
```

## Code Standards

### TypeScript Frontend

- **Avoid `any` types** - Use proper TypeScript, reuse existing types
- **Functional components** with hooks
- **@superset-ui/core** for UI components (not direct antd)
- **Jest** for testing (NO Enzyme)
- **Redux** for global state where it exists, hooks for local

### Python Backend

- **Type hints required** for all new code
- **MyPy compliant** - run `pre-commit run mypy`
- **SQLAlchemy models** with proper typing
- **pytest** for testing

### Apache License Headers

- **New files require ASF license headers** - When creating new code files, include the standard Apache Software Foundation license header
- **LLM instruction files are excluded** - Files like LLMS.md, CLAUDE.md, etc. are in `.rat-excludes` to avoid header token overhead

## Documentation Requirements

- **docs/**: Update for any user-facing changes
- **UPDATING.md**: Add breaking changes here
- **Docstrings**: Required for new functions/classes

## Architecture Patterns

### Security & Features

- **RBAC**: Role-based access via Flask-AppBuilder
- **Feature flags**: Control feature rollouts
- **Row-level security**: SQL-based data access control
|
||||
## Test Utilities
|
||||
|
||||
### Python Test Helpers
|
||||
- **`SupersetTestCase`** - Base class in `tests/integration_tests/base_tests.py`
|
||||
- **`@with_config`** - Config mocking decorator
|
||||
- **`@with_feature_flags`** - Feature flag testing
|
||||
- **`login_as()`, `login_as_admin()`** - Authentication helpers
|
||||
- **`create_dashboard()`, `create_slice()`** - Data setup utilities
|
||||
|
||||
### TypeScript Test Helpers
|
||||
- **`superset-frontend/spec/helpers/testing-library.tsx`** - Custom render() with providers
|
||||
- **`createWrapper()`** - Redux/Router/Theme wrapper
|
||||
- **`selectOption()`** - Select component helper
|
||||
- **React Testing Library** - NO Enzyme (removed)
|
||||
|
||||
### Test Database Patterns
|
||||
- **Mock patterns**: Use `MagicMock()` for config objects, avoid `AsyncMock` for synchronous code
|
||||
- **API tests**: Update expected columns when adding new model fields
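
A minimal sketch of the mocking pattern, using only the standard library; the config attribute is hypothetical:

```python
from unittest.mock import MagicMock


def test_reads_timeout_from_config() -> None:
    # MagicMock suits synchronous code; AsyncMock would wrap attribute
    # access in coroutines and break a plain read like this one.
    config = MagicMock()
    config.SQLLAB_TIMEOUT = 30

    assert config.SQLLAB_TIMEOUT == 30
```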

### Running Tests
```bash
# Frontend
npm run test                              # All tests
npm run test -- filename.test.tsx         # Single file

# Backend
pytest                                    # All tests
pytest tests/unit_tests/specific_test.py  # Single file
pytest tests/unit_tests/                  # Directory

# If pytest fails with database/setup issues, ask the user to run test environment setup
```

## Environment Validation

**Quick Setup Check (run this first):**

```bash
# Verify Superset is running
curl -f http://localhost:8088/health || echo "❌ Setup required - see https://superset.apache.org/docs/contributing/development#working-with-llms"
```

**If health checks fail:**
"It appears you aren't set up properly. Please refer to the [Working with LLMs](https://superset.apache.org/docs/contributing/development#working-with-llms) section in the development docs for setup instructions."

**Key Project Files:**
- `superset-frontend/package.json` - Frontend build scripts (`npm run dev` on port 9000, `npm run test`, `npm run lint`)
- `pyproject.toml` - Python tooling (ruff, mypy configs)
- `requirements/` folder - Python dependencies (base.txt, development.txt)

## SQLAlchemy Query Best Practices
- **Use the negation operator**: `~Model.field` instead of `== False` to avoid ruff E712 errors
- **Example**: `~Model.is_active` instead of `Model.is_active == False` (see the sketch below)
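
A minimal sketch of the pattern in a full query, assuming a hypothetical `User` model:

```python
from sqlalchemy import Boolean, Column, Integer, select
from sqlalchemy.orm import declarative_base

Base = declarative_base()


class User(Base):
    __tablename__ = "users"
    id = Column(Integer, primary_key=True)
    is_active = Column(Boolean, default=True)


# Preferred: the negation operator passes ruff E712
inactive_users = select(User).where(~User.is_active)

# Avoid: explicit comparison to False triggers ruff E712
# inactive_users = select(User).where(User.is_active == False)
```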

## Pre-commit Validation

**Use pre-commit hooks for quality validation:**

```bash
# Install hooks
pre-commit install

# IMPORTANT: Stage your changes first!
git add .  # Pre-commit only checks staged files

# Quick validation (faster than --all-files)
pre-commit run           # Staged files only
pre-commit run mypy      # Python type checking
pre-commit run prettier  # Code formatting
pre-commit run eslint    # Frontend linting
```

**Important pre-commit usage notes:**
- **Stage files first**: Run `git add .` before `pre-commit run` to check only changed files (much faster)
- **Virtual environment**: Activate your Python virtual environment before running pre-commit
  ```bash
  # Common virtual environment locations (yours may differ):
  source .venv/bin/activate             # if using .venv
  source venv/bin/activate              # if using venv
  source ~/venvs/superset/bin/activate  # if using a central location
  ```
  If you get a "command not found" error, ask the user which virtual environment to activate.
- **Auto-fixes**: Some hooks auto-fix issues (e.g., trailing whitespace). Re-run after fixes are applied.

## Common File Patterns

### API Structure
- **`/api.py`** - REST endpoints with decorators and OpenAPI docstrings
- **`/schemas.py`** - Marshmallow validation schemas for the OpenAPI spec
- **`/commands/`** - Business logic classes with `@transaction()` decorators (a rough sketch follows below)
- **`/models/`** - SQLAlchemy database models
- **OpenAPI docs**: Auto-generated at `/swagger/v1` from docstrings and schemas
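
A rough sketch of the command-layer shape this describes; the class, its methods, and the decorator import path are illustrative assumptions rather than the exact Superset API:

```python
# Hypothetical command class for the /commands/ pattern: validate first,
# then run the mutation inside a transaction so it commits or rolls back
# atomically.
from superset.utils.decorators import transaction  # assumed import path


class UpdateWidgetCommand:
    def __init__(self, model_id: int, properties: dict) -> None:
        self._model_id = model_id
        self._properties = properties

    def validate(self) -> None:
        if not self._properties:
            raise ValueError("Nothing to update")

    @transaction()
    def run(self) -> None:
        self.validate()
        # ... fetch the model by self._model_id and apply self._properties;
        # the decorator commits on success and rolls back on error
```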

### Migration Files
- **Location**: `superset/migrations/versions/`
- **Naming**: `YYYY-MM-DD_HH-MM_hash_description.py`
- **Utilities**: Use helpers from `superset.migrations.shared.utils` for database compatibility
- **Pattern**: Import utilities instead of raw SQLAlchemy operations

## Platform-Specific Instructions

- **[CLAUDE.md](CLAUDE.md)** - For Claude/Anthropic tools
- **[.github/copilot-instructions.md](.github/copilot-instructions.md)** - For GitHub Copilot
- **[GEMINI.md](GEMINI.md)** - For Google Gemini tools
- **[GPT.md](GPT.md)** - For OpenAI/ChatGPT tools
- **[.cursor/rules/dev-standard.mdc](.cursor/rules/dev-standard.mdc)** - For Cursor editor

---

**LLM Note**: This codebase is actively modernizing toward full TypeScript and type safety. Always run `pre-commit run` to validate changes. Follow the ongoing-refactors section to avoid deprecated patterns.
@@ -32,11 +32,10 @@ else
  SUPERSET_VERSION="${1}"
  SUPERSET_RC="${2}"
  SUPERSET_PGP_FULLNAME="${3}"
  SUPERSET_VERSION_RC="${SUPERSET_VERSION}rc${SUPERSET_RC}"
  SUPERSET_RELEASE_RC_TARBALL="apache_superset-${SUPERSET_VERSION_RC}-source.tar.gz"
fi

SUPERSET_VERSION_RC="${SUPERSET_VERSION}rc${SUPERSET_RC}"

if [ -z "${SUPERSET_SVN_DEV_PATH}" ]; then
  SUPERSET_SVN_DEV_PATH="$HOME/svn/superset_dev"
fi
@@ -28,6 +28,7 @@ These features are considered **unfinished** and should only be used on developm

[//]: # "PLEASE KEEP THE LIST SORTED ALPHABETICALLY"

- ALERT_REPORT_TABS
- DATE_RANGE_TIMESHIFTS_ENABLED
- ENABLE_ADVANCED_DATA_TYPES
- PRESTO_EXPAND_DATA
- SHARE_QUERIES_VIA_KV_STORE
@@ -94,9 +94,9 @@ under the License.
| can available domains on Superset |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can request access on Superset |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can dashboard on Superset |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
-| can post on TableSchemaView |:heavy_check_mark:|:heavy_check_mark:|O|O|
-| can expanded on TableSchemaView |:heavy_check_mark:|:heavy_check_mark:|O|O|
-| can delete on TableSchemaView |:heavy_check_mark:|:heavy_check_mark:|O|O|
+| can post on TableSchemaView |:heavy_check_mark:|O|O|:heavy_check_mark:|
+| can expanded on TableSchemaView |:heavy_check_mark:|O|O|:heavy_check_mark:|
+| can delete on TableSchemaView |:heavy_check_mark:|O|O|:heavy_check_mark:|
| can get on TabStateView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|
| can post on TabStateView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|
| can delete query on TabStateView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|

UPDATING.md (14 lines changed)
@@ -23,12 +23,24 @@ This file documents any backwards-incompatible changes in Superset and
assists people when migrating to a new version.

## Next

- [34782](https://github.com/apache/superset/pull/34782): Dataset exports now include the dataset ID in their file name (similar to charts and dashboards). If managing assets as code, make sure to rename existing dataset YAMLs to include the ID (and avoid duplicated files).
- [34536](https://github.com/apache/superset/pull/34536): The `ENVIRONMENT_TAG_CONFIG` color values have changed to support only Ant Design semantic colors. Update your `superset_config.py`:
  - Change `"error.base"` to just `"error"` after this PR
  - Change any hex color values to one of: `"success"`, `"processing"`, `"error"`, `"warning"`, `"default"`
  - Custom colors are no longer supported, to maintain consistency with Ant Design components
- [34561](https://github.com/apache/superset/pull/34561) Added tiled screenshot functionality for Playwright-based reports to handle large dashboards more efficiently. When enabled (default: `SCREENSHOT_TILED_ENABLED = True`), dashboards with 20+ charts or height exceeding 5000px will be captured using multiple viewport-sized tiles and combined into a single image. This improves report generation performance and reliability for large dashboards.
  Note: Pillow is now a required dependency (previously optional) to support image processing for tiled screenshots.
  The `thumbnails` optional dependency is now deprecated and will be removed in the next major release (7.0).
- [33084](https://github.com/apache/superset/pull/33084) The DISALLOWED_SQL_FUNCTIONS configuration now includes additional potentially sensitive database functions across PostgreSQL, MySQL, SQLite, MS SQL Server, and ClickHouse. Existing queries using these functions may now be blocked. Review your SQL Lab queries and dashboards if you encounter "disallowed function" errors after upgrading.
- [34235](https://github.com/apache/superset/pull/34235) CSV exports now use `utf-8-sig` encoding by default to include a UTF-8 BOM, improving compatibility with Excel.
- [34258](https://github.com/apache/superset/pull/34258) The Dockerfile default has changed to INCLUDE_CHROMIUM="false" (previously "true"). This ensures the `lean` layer is lean by default; people can opt in to the `chromium` layer by setting the build arg `INCLUDE_CHROMIUM=true`. This is a breaking change for anyone using the `lean` layer, as it will no longer include Chromium by default.
- [34204](https://github.com/apache/superset/pull/33603) OpenStreetMap has been promoted to the new default for Deck.gl visualizations since it can be enabled by default without requiring an API key. If you have Mapbox set up and want to disable OpenStreetMap in your environment, please follow the steps documented at https://superset.apache.org/docs/configuration/map-tiles.
- [33116](https://github.com/apache/superset/pull/33116) In ECharts series charts (e.g. Line, Area, Bar), the `x_axis_sort_series` and `x_axis_sort_series_ascending` form data items have been renamed to `x_axis_sort` and `x_axis_sort_asc`.
  A migration has been added that can potentially affect a significant number of existing charts.
- [32317](https://github.com/apache/superset/pull/32317) The horizontal filter bar feature is now out of testing/beta development and its feature flag `HORIZONTAL_FILTER_BAR` has been removed.
- [31590](https://github.com/apache/superset/pull/31590) Marks the beginning of intricate work around supporting dynamic theming, and breaks support for [THEME_OVERRIDES](https://github.com/apache/superset/blob/732de4ac7fae88e29b7f123b6cbb2d7cd411b0e4/superset/config.py#L671) in favor of a new theming system based on AntD v5. This will likely be in disrepair until settling over the 5.x lifecycle.
- [32432](https://github.com/apache/superset/pull/31260) Moves the List Roles FAB view to the frontend and requires `FAB_ADD_SECURITY_API` to be enabled in the configuration and `superset init` to be executed.
- [34319](https://github.com/apache/superset/pull/34319) Drill to Detail and Drill By are now supported in Embedded mode, and also with the `DASHBOARD_RBAC` feature flag. If you don't want to expose these features in Embedded / `DASHBOARD_RBAC`, make sure the roles used for Embedded / `DASHBOARD_RBAC` don't have the required permissions to perform D2D actions.

## 5.0.0
@@ -20,6 +20,9 @@
# If you choose to use this type of deployment make sure to
# create your own docker environment file (docker/.env) with your own
# unique random secure passwords and SECRET_KEY.
#
# For verbose logging during development:
# - Set SUPERSET_LOG_LEVEL=debug in docker/.env-local for detailed Superset logs
# -----------------------------------------------------------------------
x-superset-image: &superset-image apachesuperset.docker.scarf.sh/apache/superset:${TAG:-latest-dev}
x-superset-volumes:

docker-compose-light.yml (new file, 202 lines)
@@ -0,0 +1,202 @@
#
# Licensed to the Apache Software Foundation (ASF) under one or more
# contributor license agreements.  See the NOTICE file distributed with
# this work for additional information regarding copyright ownership.
# The ASF licenses this file to You under the Apache License, Version 2.0
# (the "License"); you may not use this file except in compliance with
# the License.  You may obtain a copy of the License at
#
#    http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#

# -----------------------------------------------------------------------
# Lightweight docker-compose for running multiple Superset instances
# This includes only essential services: database and Superset app (no Redis)
#
# RUNNING SUPERSET:
# 1. Start services: docker-compose -f docker-compose-light.yml up
# 2. Access at: http://localhost:9001 (or NODE_PORT if specified)
#
# RUNNING MULTIPLE INSTANCES:
# - Use different project names: docker-compose -p project1 -f docker-compose-light.yml up
# - Use different NODE_PORT values: NODE_PORT=9002 docker-compose -p project2 -f docker-compose-light.yml up
# - Volumes are isolated by project name (e.g., project1_db_home_light, project2_db_home_light)
# - Database name is intentionally different (superset_light) to prevent accidental cross-connections
#
# RUNNING TESTS WITH PYTEST:
# Tests run in an isolated environment with a separate test database.
# The pytest-runner service automatically creates and initializes the test database on first use.
#
# Basic usage:
#   docker-compose -f docker-compose-light.yml run --rm pytest-runner pytest tests/unit_tests/
#
# Run specific test file:
#   docker-compose -f docker-compose-light.yml run --rm pytest-runner pytest tests/unit_tests/test_foo.py
#
# Run with pytest options:
#   docker-compose -f docker-compose-light.yml run --rm pytest-runner pytest -v -s -x tests/
#
# Force reload test database and run tests (when tests are failing due to bad state):
#   docker-compose -f docker-compose-light.yml run --rm -e FORCE_RELOAD=true pytest-runner pytest tests/
#
# Run any command in test environment:
#   docker-compose -f docker-compose-light.yml run --rm pytest-runner bash
#   docker-compose -f docker-compose-light.yml run --rm pytest-runner pytest --collect-only
#
# For parallel test execution with different projects:
#   docker-compose -p project1 -f docker-compose-light.yml run --rm pytest-runner pytest tests/
#
# DEVELOPMENT TIPS:
# - First test run takes ~20-30 seconds (database creation + initialization)
# - Subsequent runs are fast (~2-3 seconds startup)
# - Use FORCE_RELOAD=true when you need a clean test database
# - Tests use SimpleCache instead of Redis (no Redis required)
# - Set SUPERSET_LOG_LEVEL=debug in docker/.env-local for detailed logs
# -----------------------------------------------------------------------
x-superset-user: &superset-user root
x-superset-volumes: &superset-volumes
  # /app/pythonpath_docker will be appended to the PYTHONPATH in the final container
  - ./docker:/app/docker
  - ./superset:/app/superset
  - ./superset-frontend:/app/superset-frontend
  - superset_home_light:/app/superset_home
  - ./tests:/app/tests
x-common-build: &common-build
  context: .
  target: ${SUPERSET_BUILD_TARGET:-dev} # can use `dev` (default) or `lean`
  cache_from:
    - apache/superset-cache:3.10-slim-bookworm
  args:
    DEV_MODE: "true"
    INCLUDE_CHROMIUM: ${INCLUDE_CHROMIUM:-false}
    INCLUDE_FIREFOX: ${INCLUDE_FIREFOX:-false}
    BUILD_TRANSLATIONS: ${BUILD_TRANSLATIONS:-false}
    LOAD_EXAMPLES_DUCKDB: ${LOAD_EXAMPLES_DUCKDB:-true}

services:
  db-light:
    env_file:
      - path: docker/.env # default
        required: true
      - path: docker/.env-local # optional override
        required: false
    image: postgres:16
    restart: unless-stopped
    volumes:
      - db_home_light:/var/lib/postgresql/data
      - ./docker/docker-entrypoint-initdb.d:/docker-entrypoint-initdb.d
    environment:
      POSTGRES_DB: superset_light
    command: postgres -c max_connections=200

  superset-light:
    env_file:
      - path: docker/.env # default
        required: true
      - path: docker/.env-local # optional override
        required: false
    build:
      <<: *common-build
    command: ["/app/docker/docker-bootstrap.sh", "app"]
    restart: unless-stopped
    extra_hosts:
      - "host.docker.internal:host-gateway"
    user: *superset-user
    depends_on:
      superset-init-light:
        condition: service_completed_successfully
    volumes: *superset-volumes
    environment:
      DATABASE_HOST: db-light
      DATABASE_DB: superset_light
      POSTGRES_DB: superset_light
      SUPERSET__SQLALCHEMY_EXAMPLES_URI: "duckdb:////app/data/examples.duckdb"
      SUPERSET_CONFIG_PATH: /app/docker/pythonpath_dev/superset_config_docker_light.py

  superset-init-light:
    build:
      <<: *common-build
    command: ["/app/docker/docker-init.sh"]
    env_file:
      - path: docker/.env # default
        required: true
      - path: docker/.env-local # optional override
        required: false
    user: *superset-user
    depends_on:
      db-light:
        condition: service_started
    volumes: *superset-volumes
    environment:
      DATABASE_HOST: db-light
      DATABASE_DB: superset_light
      POSTGRES_DB: superset_light
      SUPERSET__SQLALCHEMY_EXAMPLES_URI: "duckdb:////app/data/examples.duckdb"
      SUPERSET_CONFIG_PATH: /app/docker/pythonpath_dev/superset_config_docker_light.py
    healthcheck:
      disable: true

  superset-node-light:
    build:
      context: .
      target: superset-node
      args:
        # This prevents building the frontend bundle since we'll mount the local folder
        # and build it on startup while firing docker-frontend.sh in dev mode, where
        # it'll mount and watch local files and rebuild as you update them
        DEV_MODE: "true"
        BUILD_TRANSLATIONS: ${BUILD_TRANSLATIONS:-false}
    environment:
      # set this to false if you have perf issues running the npm i; npm run dev in-docker
      # if you do so, you have to run this manually on the host, which should perform better!
      BUILD_SUPERSET_FRONTEND_IN_DOCKER: true
      NPM_RUN_PRUNE: false
      SCARF_ANALYTICS: "${SCARF_ANALYTICS:-}"
      # configuring the dev-server to use host.docker.internal to connect to the backend
      superset: "http://superset-light:8088"
    ports:
      - "127.0.0.1:${NODE_PORT:-9001}:9000" # Parameterized port
    command: ["/app/docker/docker-frontend.sh"]
    env_file:
      - path: docker/.env # default
        required: true
      - path: docker/.env-local # optional override
        required: false
    volumes: *superset-volumes

  pytest-runner:
    build:
      <<: *common-build
    entrypoint: ["/app/docker/docker-pytest-entrypoint.sh"]
    env_file:
      - path: docker/.env # default
        required: true
      - path: docker/.env-local # optional override
        required: false
    profiles:
      - test # Only starts when --profile test is used
    depends_on:
      db-light:
        condition: service_started
    user: *superset-user
    volumes: *superset-volumes
    environment:
      DATABASE_HOST: db-light
      DATABASE_DB: test
      POSTGRES_DB: test
      SUPERSET__SQLALCHEMY_DATABASE_URI: postgresql+psycopg2://superset:superset@db-light:5432/test
      SUPERSET__SQLALCHEMY_EXAMPLES_URI: "duckdb:////app/data/examples.duckdb"
      SUPERSET_CONFIG: superset_test_config_light
      PYTHONPATH: /app/pythonpath:/app/docker/pythonpath_dev:/app

volumes:
  superset_home_light:
    external: false
  db_home_light:
    external: false
@@ -20,6 +20,9 @@
# If you choose to use this type of deployment make sure to
# create your own docker environment file (docker/.env) with your own
# unique random secure passwords and SECRET_KEY.
#
# For verbose logging during development:
# - Set SUPERSET_LOG_LEVEL=debug in docker/.env-local for detailed Superset logs
# -----------------------------------------------------------------------
x-superset-volumes:
  &superset-volumes # /app/pythonpath_docker will be appended to the PYTHONPATH in the final container
@@ -20,6 +20,9 @@
# If you choose to use this type of deployment make sure to
# create your own docker environment file (docker/.env) with your own
# unique random secure passwords and SECRET_KEY.
#
# For verbose logging during development:
# - Set SUPERSET_LOG_LEVEL=debug in docker/.env-local for detailed Superset logs
# -----------------------------------------------------------------------
x-superset-user: &superset-user root
x-superset-volumes: &superset-volumes

@@ -39,6 +42,7 @@ x-common-build: &common-build
    INCLUDE_CHROMIUM: ${INCLUDE_CHROMIUM:-false}
    INCLUDE_FIREFOX: ${INCLUDE_FIREFOX:-false}
    BUILD_TRANSLATIONS: ${BUILD_TRANSLATIONS:-false}
    LOAD_EXAMPLES_DUCKDB: ${LOAD_EXAMPLES_DUCKDB:-true}

services:
  nginx:

@@ -104,6 +108,8 @@ services:
      superset-init:
        condition: service_completed_successfully
    volumes: *superset-volumes
    environment:
      SUPERSET__SQLALCHEMY_EXAMPLES_URI: "duckdb:////app/data/examples.duckdb"

  superset-websocket:
    container_name: superset_websocket

@@ -155,6 +161,8 @@ services:
        condition: service_started
    user: *superset-user
    volumes: *superset-volumes
    environment:
      SUPERSET__SQLALCHEMY_EXAMPLES_URI: "duckdb:////app/data/examples.duckdb"
    healthcheck:
      disable: true
@@ -53,7 +53,12 @@ PYTHONPATH=/app/pythonpath:/app/docker/pythonpath_dev
REDIS_HOST=redis
REDIS_PORT=6379

# Development and logging configuration
# FLASK_DEBUG: Enables Flask dev features (auto-reload, better error pages) - keep 'true' for development
FLASK_DEBUG=true
# SUPERSET_LOG_LEVEL: Controls Superset application logging verbosity (debug, info, warning, error, critical)
SUPERSET_LOG_LEVEL=info

SUPERSET_APP_ROOT="/"
SUPERSET_ENV=development
SUPERSET_LOAD_EXAMPLES=yes

@@ -66,4 +71,3 @@ SUPERSET_SECRET_KEY=TEST_NON_DEV_SECRET
ENABLE_PLAYWRIGHT=false
PUPPETEER_SKIP_CHROMIUM_DOWNLOAD=true
BUILD_SUPERSET_FRONTEND_IN_DOCKER=true
-SUPERSET_LOG_LEVEL=info
@@ -69,6 +69,8 @@ echo_step "3" "Complete" "Setting up roles and perms"
if [ "$SUPERSET_LOAD_EXAMPLES" = "yes" ]; then
  # Load some data to play with
  echo_step "4" "Starting" "Loading examples"

  # If Cypress is running (consuming superset_test_config), load the data required for tests
  if [ "$CYPRESS_CONFIG" == "true" ]; then
    superset load_examples --load-test-data

docker/docker-pytest-entrypoint.sh (new executable file, 152 lines)
@@ -0,0 +1,152 @@
#!/bin/bash
#
# Licensed to the Apache Software Foundation (ASF) under one or more
# contributor license agreements.  See the NOTICE file distributed with
# this work for additional information regarding copyright ownership.
# The ASF licenses this file to You under the Apache License, Version 2.0
# (the "License"); you may not use this file except in compliance with
# the License.  You may obtain a copy of the License at
#
#    http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#

set -e

# Wait for PostgreSQL to be ready
echo "Waiting for database to be ready..."
for i in {1..30}; do
  if python3 -c "
import psycopg2
try:
    conn = psycopg2.connect(host='db-light', user='superset', password='superset', database='superset_light')
    conn.close()
    print('Database is ready!')
except:
    exit(1)
" 2>/dev/null; then
    echo "Database connection established!"
    break
  fi
  echo "Waiting for database... ($i/30)"
  if [ $i -eq 30 ]; then
    echo "Database connection timeout after 30 seconds"
    exit 1
  fi
  sleep 1
done

# Handle database setup based on FORCE_RELOAD
if [ "${FORCE_RELOAD}" = "true" ]; then
  echo "Force reload requested - resetting test database"
  # Drop and recreate the test database using Python
  python3 -c "
import psycopg2
from psycopg2.extensions import ISOLATION_LEVEL_AUTOCOMMIT

# Connect to default database
conn = psycopg2.connect(host='db-light', user='superset', password='superset', database='superset_light')
conn.set_isolation_level(ISOLATION_LEVEL_AUTOCOMMIT)
cur = conn.cursor()

# Drop and recreate test database
try:
    cur.execute('DROP DATABASE IF EXISTS test')
except:
    pass

cur.execute('CREATE DATABASE test')
conn.close()

# Connect to test database to create schemas
conn = psycopg2.connect(host='db-light', user='superset', password='superset', database='test')
conn.set_isolation_level(ISOLATION_LEVEL_AUTOCOMMIT)
cur = conn.cursor()

cur.execute('CREATE SCHEMA sqllab_test_db')
cur.execute('CREATE SCHEMA admin_database')

cur.close()
conn.close()
print('Test database reset successfully')
"
  # Use --no-reset-db since we already reset it
  FLAGS="--no-reset-db"
else
  echo "Using existing test database (set FORCE_RELOAD=true to reset)"
  FLAGS="--no-reset-db"

  # Ensure test database exists using Python
  python3 -c "
import psycopg2
from psycopg2.extensions import ISOLATION_LEVEL_AUTOCOMMIT

# Check if test database exists
try:
    conn = psycopg2.connect(host='db-light', user='superset', password='superset', database='test')
    conn.close()
    print('Test database already exists')
except:
    print('Creating test database...')
    # Connect to default database to create test database
    conn = psycopg2.connect(host='db-light', user='superset', password='superset', database='superset_light')
    conn.set_isolation_level(ISOLATION_LEVEL_AUTOCOMMIT)
    cur = conn.cursor()

    # Create test database
    cur.execute('CREATE DATABASE test')
    conn.close()

    # Connect to test database to create schemas
    conn = psycopg2.connect(host='db-light', user='superset', password='superset', database='test')
    conn.set_isolation_level(ISOLATION_LEVEL_AUTOCOMMIT)
    cur = conn.cursor()

    cur.execute('CREATE SCHEMA IF NOT EXISTS sqllab_test_db')
    cur.execute('CREATE SCHEMA IF NOT EXISTS admin_database')

    cur.close()
    conn.close()
    print('Test database created successfully')
"
fi

# Always run database migrations to ensure schema is up to date
echo "Running database migrations..."
cd /app
superset db upgrade

# Initialize test environment if needed
if [ "${FORCE_RELOAD}" = "true" ] || [ ! -f "/app/superset_home/.test_initialized" ]; then
  echo "Initializing test environment..."
  # Run initialization commands
  superset init
  echo "Loading test users..."
  superset load-test-users

  # Mark as initialized
  touch /app/superset_home/.test_initialized
else
  echo "Test environment already initialized (skipping init and load-test-users)"
  echo "Tip: Use FORCE_RELOAD=true to reinitialize the test database"
fi

# Create missing scripts needed for tests
if [ ! -f "/app/scripts/tag_latest_release.sh" ]; then
  echo "Creating missing tag_latest_release.sh script for tests..."
  cp /app/docker/tag_latest_release.sh /app/scripts/tag_latest_release.sh 2>/dev/null || true
fi

# Install pip module for Shillelagh compatibility (aligns with CI environment)
echo "Installing pip module for Shillelagh compatibility..."
uv pip install pip

# If arguments provided, execute them
if [ $# -gt 0 ]; then
  exec "$@"
fi
@@ -26,7 +26,7 @@ gunicorn \
      --workers ${SERVER_WORKER_AMOUNT:-1} \
      --worker-class ${SERVER_WORKER_CLASS:-gthread} \
      --threads ${SERVER_THREADS_AMOUNT:-20} \
-     --log-level "${GUNICORN_LOGLEVEL:info}" \
+     --log-level "${GUNICORN_LOGLEVEL:-info}" \
      --timeout ${GUNICORN_TIMEOUT:-60} \
      --keep-alive ${GUNICORN_KEEPALIVE:-2} \
      --max-requests ${WORKER_MAX_REQUESTS:-0} \
@@ -23,25 +23,57 @@ MIN_MEM_FREE_GB=3
MIN_MEM_FREE_KB=$(($MIN_MEM_FREE_GB*1000000))

echo_mem_warn() {
-  MEM_FREE_KB=$(awk '/MemFree/ { printf "%s \n", $2 }' /proc/meminfo)
-  MEM_FREE_GB=$(awk '/MemFree/ { printf "%s \n", $2/1024/1024 }' /proc/meminfo)
+  # Check if running in Codespaces first
+  if [[ -n "${CODESPACES}" ]]; then
+    echo "Memory available: Codespaces managed"
+    return
+  fi

-  if [[ "${MEM_FREE_KB}" -lt "${MIN_MEM_FREE_KB}" ]]; then
+  # Check platform and get memory accordingly
+  if [[ -f /proc/meminfo ]]; then
+    # Linux
+    if grep -q MemAvailable /proc/meminfo; then
+      MEM_AVAIL_KB=$(awk '/MemAvailable/ { printf "%s \n", $2 }' /proc/meminfo)
+      MEM_AVAIL_GB=$(awk '/MemAvailable/ { printf "%s \n", $2/1024/1024 }' /proc/meminfo)
+    else
+      MEM_AVAIL_KB=$(awk '/MemFree/ { printf "%s \n", $2 }' /proc/meminfo)
+      MEM_AVAIL_GB=$(awk '/MemFree/ { printf "%s \n", $2/1024/1024 }' /proc/meminfo)
+    fi
+  elif [[ "$(uname)" == "Darwin" ]]; then
+    # macOS - use vm_stat to get free memory
+    # vm_stat reports in pages, typically 4096 bytes per page
+    PAGE_SIZE=$(pagesize)
+    FREE_PAGES=$(vm_stat | awk '/Pages free:/ {print $3}' | tr -d '.')
+    INACTIVE_PAGES=$(vm_stat | awk '/Pages inactive:/ {print $3}' | tr -d '.')
+    # Free + inactive pages give us available memory (similar to MemAvailable on Linux)
+    AVAIL_PAGES=$((FREE_PAGES + INACTIVE_PAGES))
+    MEM_AVAIL_KB=$((AVAIL_PAGES * PAGE_SIZE / 1024))
+    MEM_AVAIL_GB=$(echo "scale=2; $MEM_AVAIL_KB / 1024 / 1024" | bc)
+  else
+    # Other platforms
+    echo "Memory available: Unable to determine"
+    return
+  fi

+  if [[ "${MEM_AVAIL_KB}" -lt "${MIN_MEM_FREE_KB}" ]]; then
    cat <<EOF
===============================================
======== Memory Insufficient Warning =========
===============================================

-It looks like you only have ${MEM_FREE_GB}GB of
-memory free. Please increase your Docker
+It looks like you only have ${MEM_AVAIL_GB}GB of
+memory ${MEM_TYPE}. Please increase your Docker
resources to at least ${MIN_MEM_FREE_GB}GB

+Note: During builds, available memory may be
+temporarily low due to caching and compilation.

===============================================
======== Memory Insufficient Warning =========
===============================================
EOF
  else
-    echo "Memory check Ok [${MEM_FREE_GB}GB free]"
+    echo "Memory available: ${MEM_AVAIL_GB} GB"
  fi
}

docker/pythonpath_dev/.gitignore (vendored, 1 line changed)
@@ -20,4 +20,5 @@
# DON'T ignore the .gitignore
!.gitignore
!superset_config.py
+!superset_config_docker_light.py
!superset_config_local.example
@@ -49,12 +49,18 @@ SQLALCHEMY_DATABASE_URI = (
    f"{DATABASE_HOST}:{DATABASE_PORT}/{DATABASE_DB}"
)

-SQLALCHEMY_EXAMPLES_URI = (
-    f"{DATABASE_DIALECT}://"
-    f"{EXAMPLES_USER}:{EXAMPLES_PASSWORD}@"
-    f"{EXAMPLES_HOST}:{EXAMPLES_PORT}/{EXAMPLES_DB}"
+# Use environment variable if set, otherwise construct from components
+# This MUST take precedence over any other configuration
+SQLALCHEMY_EXAMPLES_URI = os.getenv(
+    "SUPERSET__SQLALCHEMY_EXAMPLES_URI",
+    (
+        f"{DATABASE_DIALECT}://"
+        f"{EXAMPLES_USER}:{EXAMPLES_PASSWORD}@"
+        f"{EXAMPLES_HOST}:{EXAMPLES_PORT}/{EXAMPLES_DB}"
+    ),
)

REDIS_HOST = os.getenv("REDIS_HOST", "redis")
REDIS_PORT = os.getenv("REDIS_PORT", "6379")
REDIS_CELERY_DB = os.getenv("REDIS_CELERY_DB", "0")

@@ -129,7 +135,7 @@ if os.getenv("CYPRESS_CONFIG") == "true":
#
try:
    import superset_config_docker
-    from superset_config_docker import *  # noqa
+    from superset_config_docker import *  # noqa: F403

    logger.info(
        f"Loaded your Docker configuration at [{superset_config_docker.__file__}]"

docker/pythonpath_dev/superset_config_docker_light.py (new file, 37 lines)
@@ -0,0 +1,37 @@
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements.  See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership.  The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License.  You may obtain a copy of the License at
#
#   http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied.  See the License for the
# specific language governing permissions and limitations
# under the License.
#
# Configuration for docker-compose-light.yml - disables Redis and uses minimal services

# Import all settings from the main config first
from flask_caching.backends.filesystemcache import FileSystemCache
from superset_config import *  # noqa: F403

# Override caching to use simple in-memory cache instead of Redis
RESULTS_BACKEND = FileSystemCache("/app/superset_home/sqllab")

CACHE_CONFIG = {
    "CACHE_TYPE": "SimpleCache",
    "CACHE_DEFAULT_TIMEOUT": 300,
    "CACHE_KEY_PREFIX": "superset_light_",
}
DATA_CACHE_CONFIG = CACHE_CONFIG
THUMBNAIL_CACHE_CONFIG = CACHE_CONFIG


# Disable Celery entirely for lightweight mode
CELERY_CONFIG = None  # type: ignore[assignment,misc]

docker/pythonpath_dev/superset_test_config_light.py (new file, 55 lines)
@@ -0,0 +1,55 @@
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements.  See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership.  The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License.  You may obtain a copy of the License at
#
#   http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied.  See the License for the
# specific language governing permissions and limitations
# under the License.
#
# Test configuration for docker-compose-light.yml - uses SimpleCache instead of Redis

# Import all settings from the main test config first
import os
import sys

# Add the tests directory to the path to import the test config
sys.path.insert(0, os.path.join(os.path.dirname(__file__), "..", ".."))
from tests.integration_tests.superset_test_config import *  # noqa: F403

# Override Redis-based caching to use simple in-memory cache
CACHE_CONFIG = {
    "CACHE_TYPE": "SimpleCache",
    "CACHE_DEFAULT_TIMEOUT": 300,
    "CACHE_KEY_PREFIX": "superset_test_",
}

DATA_CACHE_CONFIG = {
    **CACHE_CONFIG,
    "CACHE_DEFAULT_TIMEOUT": 30,
    "CACHE_KEY_PREFIX": "superset_test_data_",
}

# Keep SimpleCache for these as they're already using it
# FILTER_STATE_CACHE_CONFIG - already SimpleCache in parent
# EXPLORE_FORM_DATA_CACHE_CONFIG - already SimpleCache in parent

# Disable Celery for lightweight testing
CELERY_CONFIG = None

# Use FileSystemCache for SQL Lab results instead of Redis
from flask_caching.backends.filesystemcache import FileSystemCache  # noqa: E402

RESULTS_BACKEND = FileSystemCache("/app/superset_home/sqllab_test")

# Override WEBDRIVER_BASEURL for tests to match expected values
WEBDRIVER_BASEURL = "http://0.0.0.0:8080/"
WEBDRIVER_BASEURL_USER_FRIENDLY = WEBDRIVER_BASEURL

docker/tag_latest_release.sh (new executable file, 190 lines)
@@ -0,0 +1,190 @@
#! /bin/bash
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements.  See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership.  The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License.  You may obtain a copy of the License at
#
#   http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied.  See the License for the
# specific language governing permissions and limitations
# under the License.

run_git_tag () {
  if [[ "$DRY_RUN" == "false" ]] && [[ "$SKIP_TAG" == "false" ]]
  then
    git tag -a -f latest "${GITHUB_TAG_NAME}" -m "latest tag"
    echo "${GITHUB_TAG_NAME} has been tagged 'latest'"
  fi
  exit 0
}

###
# separating out git commands into functions so they can be mocked in unit tests
###
git_show_ref () {
  if [[ "$TEST_ENV" == "true" ]]
  then
    if [[ "$GITHUB_TAG_NAME" == "does_not_exist" ]]
    # mock return for testing only
    then
      echo ""
    else
      echo "2817aebd69dc7d199ec45d973a2079f35e5658b6 refs/tags/${GITHUB_TAG_NAME}"
    fi
  fi
  result=$(git show-ref "${GITHUB_TAG_NAME}")
  echo "${result}"
}

get_latest_tag_list () {
  if [[ "$TEST_ENV" == "true" ]]
  then
    echo "(tag: 2.1.0, apache/2.1test)"
  else
    result=$(git show-ref --tags --dereference latest | awk '{print $2}' | xargs git show --pretty=tformat:%d -s | grep tag:)
    echo "${result}"
  fi
}
###

split_string () {
  local version="$1"
  local delimiter="$2"
  local components=()
  local tmp=""
  for (( i=0; i<${#version}; i++ )); do
    local char="${version:$i:1}"
    if [[ "$char" != "$delimiter" ]]; then
      tmp="$tmp$char"
    elif [[ -n "$tmp" ]]; then
      components+=("$tmp")
      tmp=""
    fi
  done
  if [[ -n "$tmp" ]]; then
    components+=("$tmp")
  fi
  echo "${components[@]}"
}

DRY_RUN=false

# get params passed in with script when it was run
# --dry-run is optional and returns the value of SKIP_TAG, but does not run the git tag statement
# A tag name is required as a param. A SHA won't work. You must first tag a sha with a release number
# and then run this script
while [[ $# -gt 0 ]]
do
  key="$1"

  case $key in
    --dry-run)
      DRY_RUN=true
      shift # past value
      ;;
    *) # this should be the tag name
      GITHUB_TAG_NAME=$key
      shift # past value
      ;;
  esac
done

if [ -z "${GITHUB_TAG_NAME}" ]; then
  echo "Missing tag parameter, usage: ./scripts/tag_latest_release.sh <GITHUB_TAG_NAME>"
  echo "SKIP_TAG=true" >> $GITHUB_OUTPUT
  exit 1
fi

if [ -z "$(git_show_ref)" ]; then
  echo "The tag ${GITHUB_TAG_NAME} does not exist. Please use a different tag."
  echo "SKIP_TAG=true" >> $GITHUB_OUTPUT
  exit 0
fi

# check that this tag only contains a proper semantic version
if ! [[ ${GITHUB_TAG_NAME} =~ ^[0-9]+\.[0-9]+\.[0-9]+$ ]]
then
  echo "This tag ${GITHUB_TAG_NAME} is not a valid release version. Not tagging."
  echo "SKIP_TAG=true" >> $GITHUB_OUTPUT
  exit 1
fi

## split the current GITHUB_TAG_NAME into an array at the dot
THIS_TAG_NAME=$(split_string "${GITHUB_TAG_NAME}" ".")

# look up the 'latest' tag on git
LATEST_TAG_LIST=$(get_latest_tag_list) || echo 'not found'

# if 'latest' tag doesn't exist, then set this commit to latest
if [[ -z "$LATEST_TAG_LIST" ]]
then
  echo "there are no latest tags yet, so I'm going to start by tagging this sha as the latest"
  run_git_tag
  exit 0
fi

# remove parenthesis and tag: from the list of tags
LATEST_TAGS_STRINGS=$(echo "$LATEST_TAG_LIST" | sed 's/tag: \([^,]*\)/\1/g' | tr -d '()')

LATEST_TAGS=$(split_string "$LATEST_TAGS_STRINGS" ",")
TAGS=($(split_string "$LATEST_TAGS" " "))

# Initialize a flag for comparison result
compare_result=""

# Iterate through the tags of the latest release
for tag in $TAGS
do
  if [[ $tag == "latest" ]]; then
    continue
  else
    ## extract just the version from this tag
    LATEST_RELEASE_TAG="$tag"
    echo "LATEST_RELEASE_TAG: ${LATEST_RELEASE_TAG}"

    # check that this only contains a proper semantic version
    if ! [[ ${LATEST_RELEASE_TAG} =~ ^[0-9]+\.[0-9]+\.[0-9]+$ ]]
    then
      echo "'Latest' has been associated with tag ${LATEST_RELEASE_TAG} which is not a valid release version. Looking for another."
      continue
    fi
    echo "The current release with the latest tag is version ${LATEST_RELEASE_TAG}"
    # Split the version strings into arrays
    THIS_TAG_NAME_ARRAY=($(split_string "$THIS_TAG_NAME" "."))
    LATEST_RELEASE_TAG_ARRAY=($(split_string "$LATEST_RELEASE_TAG" "."))

    # Iterate through the components of the version strings
    for (( j=0; j<${#THIS_TAG_NAME_ARRAY[@]}; j++ )); do
      echo "Comparing ${THIS_TAG_NAME_ARRAY[$j]} to ${LATEST_RELEASE_TAG_ARRAY[$j]}"
      if [[ $((THIS_TAG_NAME_ARRAY[$j])) > $((LATEST_RELEASE_TAG_ARRAY[$j])) ]]; then
        compare_result="greater"
        break
      elif [[ $((THIS_TAG_NAME_ARRAY[$j])) < $((LATEST_RELEASE_TAG_ARRAY[$j])) ]]; then
        compare_result="lesser"
        break
      fi
    done
  fi
done

# Determine the result based on the comparison
if [[ -z "$compare_result" ]]; then
  echo "Versions are equal"
  echo "SKIP_TAG=true" >> $GITHUB_OUTPUT
elif [[ "$compare_result" == "greater" ]]; then
  echo "This release tag ${GITHUB_TAG_NAME} is newer than the latest."
  echo "SKIP_TAG=false" >> $GITHUB_OUTPUT
  # Add other actions you want to perform for a newer version
elif [[ "$compare_result" == "lesser" ]]; then
  echo "This release tag ${GITHUB_TAG_NAME} is older than the latest."
  echo "This release tag ${GITHUB_TAG_NAME} is not the latest. Not tagging."
  # if you've gotten this far, then we don't want to run any tags in the next step
  echo "SKIP_TAG=true" >> $GITHUB_OUTPUT
fi

docs/README.md (180 lines changed)
@@ -21,3 +21,183 @@ This is the public documentation site for Superset, built using
[Docusaurus 3](https://docusaurus.io/). See
[CONTRIBUTING.md](../CONTRIBUTING.md#documentation) for documentation on
contributing to documentation.

## Version Management

The Superset documentation site uses Docusaurus versioning with three independent versioned sections:

- **Main Documentation** (`/docs/`) - Core Superset documentation
- **Developer Portal** (`/developer_portal/`) - Developer guides and tutorials
- **Component Playground** (`/components/`) - Interactive component examples (currently disabled)

Each section maintains its own version history and can be versioned independently.

### Creating a New Version

To create a new version for any section, use the Docusaurus version command with the appropriate plugin ID, or use our automated scripts:

#### Using Automated Scripts (Required)

**⚠️ Important:** Always use these custom commands instead of the native Docusaurus commands. These scripts ensure that both the Docusaurus versioning system AND the `versions-config.json` file are updated correctly.

```bash
# Main Documentation
yarn version:add:docs 1.2.0

# Developer Portal
yarn version:add:developer_portal 1.2.0

# Component Playground (when enabled)
yarn version:add:components 1.2.0
```

**Do NOT use** the native Docusaurus commands directly (`yarn docusaurus docs:version`), as they will:
- ❌ Create version files but NOT update `versions-config.json`
- ❌ Cause versions to not appear in dropdown menus
- ❌ Require manual fixes to synchronize the configuration

### Managing Versions

#### With Automated Scripts
The automated scripts handle all configuration updates automatically. No manual editing required!

#### Manual Configuration
If creating versions manually, you'll need to:

1. **Update `versions-config.json`** (or `docusaurus.config.ts` if not using dynamic config):
   - Add the version to the `onlyIncludeVersions` array
   - Add version metadata to the `versions` object
   - Update `lastVersion` if needed

2. **Files Created by Versioning**:
   When a new version is created, Docusaurus generates:
   - **Versioned docs folder**: `[section]_versioned_docs/version-X.X.X/`
   - **Versioned sidebars**: `[section]_versioned_sidebars/version-X.X.X-sidebars.json`
   - **Versions list**: `[section]_versions.json`

   Note: For main docs, the prefix is omitted (e.g., `versioned_docs/` instead of `docs_versioned_docs/`)

3. **Important**: After adding a version, restart the development server to see changes:
   ```bash
   yarn stop
   yarn start
   ```

### Removing a Version

#### Using Automated Scripts (Recommended)
```bash
# Main Documentation
yarn version:remove:docs 1.0.0

# Developer Portal
yarn version:remove:developer_portal 1.0.0

# Component Playground
yarn version:remove:components 1.0.0
```

#### Manual Removal
To manually remove a version (a sketch of these steps follows the list):

1. **Delete the version folder** from the appropriate location:
   - Main docs: `versioned_docs/version-X.X.X/` (no prefix for main)
   - Developer Portal: `developer_portal_versioned_docs/version-X.X.X/`
   - Components: `components_versioned_docs/version-X.X.X/`

2. **Delete the version metadata file**:
   - Main docs: `versioned_sidebars/version-X.X.X-sidebars.json` (no prefix)
   - Developer Portal: `developer_portal_versioned_sidebars/version-X.X.X-sidebars.json`
   - Components: `components_versioned_sidebars/version-X.X.X-sidebars.json`

3. **Update the versions list file**:
   - Main docs: `versions.json`
   - Developer Portal: `developer_portal_versions.json`
   - Components: `components_versions.json`

4. **Update configuration**:
   - If using dynamic config: Update `versions-config.json`
   - If using static config: Update `docusaurus.config.ts`

5. **Restart the server** to see changes
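
For illustration, a rough Python sketch of what steps 1 through 4 amount to for the main docs; the version string and site root are hypothetical, and the `yarn version:remove` scripts above remain the recommended route:

```python
import json
import shutil
from pathlib import Path

VERSION = "1.0.0"  # hypothetical version to remove
ROOT = Path(".")   # docs site root

# Steps 1-2: delete the versioned docs folder and sidebar file (main docs: no prefix)
shutil.rmtree(ROOT / f"versioned_docs/version-{VERSION}", ignore_errors=True)
(ROOT / f"versioned_sidebars/version-{VERSION}-sidebars.json").unlink(missing_ok=True)

# Step 3: drop the version from the versions list file (a JSON array of strings)
versions_file = ROOT / "versions.json"
versions = json.loads(versions_file.read_text())
versions_file.write_text(json.dumps([v for v in versions if v != VERSION], indent=2) + "\n")

# Step 4: also update versions-config.json (or docusaurus.config.ts) by hand,
# then restart the dev server (step 5).
```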

### Version Configuration Examples

#### Main Documentation (default plugin)
```typescript
docs: {
  includeCurrentVersion: true,
  lastVersion: 'current', // Makes /docs/ show Next version
  onlyIncludeVersions: ['current', '1.1.0', '1.0.0'],
  versions: {
    current: {
      label: 'Next',
      path: '', // Empty path for default routing
      banner: 'unreleased',
    },
    '1.1.0': {
      label: '1.1.0',
      path: '1.1.0',
      banner: 'none',
    },
  },
}
```

#### Developer Portal & Components (custom plugins)
```typescript
{
  id: 'developer_portal',
  path: 'developer_portal',
  routeBasePath: 'developer_portal',
  includeCurrentVersion: true,
  lastVersion: '1.1.0', // Default version
  onlyIncludeVersions: ['current', '1.1.0', '1.0.0'],
  versions: {
    current: {
      label: 'Next',
      path: 'next',
      banner: 'unreleased',
    },
    '1.1.0': {
      label: '1.1.0',
      path: '1.1.0',
      banner: 'none',
    },
  },
}
```

### Best Practices

1. **Version naming**: Use semantic versioning (e.g., 1.0.0, 1.1.0, 2.0.0)
2. **Version banners**: Use `'unreleased'` for development versions, `'none'` for stable releases
3. **Limit displayed versions**: Use `onlyIncludeVersions` to show only relevant versions
4. **Test locally**: Always test version changes locally before deploying
5. **Independent versioning**: Each section can have different version numbers and release cycles

### Troubleshooting

#### Version Not Showing After Creation

If you accidentally used `yarn docusaurus docs:version` instead of `yarn version:add`:
1. **Problem**: The version files were created but `versions-config.json` wasn't updated
2. **Solution**: Either:
   - Revert the changes: `git restore versions.json && rm -rf versioned_docs/ versioned_sidebars/`
   - Then use the correct command: `yarn version:add:docs <version>`

For other issues:
- **Restart the server**: Changes to version configuration require a server restart
- **Check config file**: Ensure `versions-config.json` includes the new version
- **Verify files exist**: Check that the versioned docs folder was created

#### Broken Links in Versioned Documentation
When creating a new version, links in the documentation are preserved as-is. Common issues:
- **Cross-section links**: Links between sections (e.g., from developer_portal to docs) need to be version-aware
- **Absolute vs relative paths**: Use relative paths within the same section
- **Version-specific URLs**: Update hardcoded URLs to use version variables

To fix broken links:
1. Use `type: 'doc'` with `docId` for version-aware navigation in the navbar
2. Use relative paths within the same documentation section
3. Test all versions after creation to identify broken links

docs/components/chart-components/bar-chart.md (new file, 105 lines)
@@ -0,0 +1,105 @@
<!--
Licensed to the Apache Software Foundation (ASF) under one
or more contributor license agreements. See the NOTICE file
distributed with this work for additional information
regarding copyright ownership. The ASF licenses this file
to you under the Apache License, Version 2.0 (the
"License"); you may not use this file except in compliance
with the License. You may obtain a copy of the License at

  http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing,
software distributed under the License is distributed on an
"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
KIND, either express or implied. See the License for the
specific language governing permissions and limitations
under the License.
-->

---
title: Bar Chart
sidebar_position: 1
---

# Bar Chart Component

The Bar Chart component is used to visualize categorical data with rectangular bars.

## Props

| Prop | Type | Default | Description |
|------|------|---------|-------------|
| `data` | `array` | `[]` | Array of data objects to visualize |
| `width` | `number` | `800` | Width of the chart in pixels |
| `height` | `number` | `600` | Height of the chart in pixels |
| `xField` | `string` | - | Field name for x-axis values |
| `yField` | `string` | - | Field name for y-axis values |
| `colorField` | `string` | - | Field name for color encoding |
| `colorScheme` | `string` | `'supersetColors'` | Color scheme to use |
| `showLegend` | `boolean` | `true` | Whether to show the legend |
| `showGrid` | `boolean` | `true` | Whether to show grid lines |
| `labelPosition` | `string` | `'top'` | Position of bar labels: `'top'`, `'middle'`, or `'bottom'` |

## Examples

### Basic Bar Chart

```jsx
import { BarChart } from '@superset-ui/chart-components';

const data = [
  { category: 'A', value: 10 },
  { category: 'B', value: 20 },
  { category: 'C', value: 15 },
  { category: 'D', value: 25 },
];

function Example() {
  return (
    <BarChart
      data={data}
      width={800}
      height={400}
      xField="category"
      yField="value"
      colorScheme="supersetColors"
    />
  );
}
```

### Grouped Bar Chart

```jsx
import { BarChart } from '@superset-ui/chart-components';

const data = [
  { category: 'A', group: 'Group 1', value: 10 },
  { category: 'A', group: 'Group 2', value: 15 },
  { category: 'B', group: 'Group 1', value: 20 },
  { category: 'B', group: 'Group 2', value: 25 },
  { category: 'C', group: 'Group 1', value: 15 },
  { category: 'C', group: 'Group 2', value: 10 },
];

function Example() {
  return (
    <BarChart
      data={data}
      width={800}
      height={400}
      xField="category"
      yField="value"
      colorField="group"
      colorScheme="supersetColors"
    />
  );
}
```

## Best Practices

- Use bar charts when comparing quantities across categories
- Sort bars by value for better readability, unless there's a natural order to the categories (see the sorting sketch below)
- Use consistent colors for the same categories across different charts
- Consider using horizontal bar charts when category labels are long

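To make the sorting recommendation concrete, a minimal sketch using only the documented props:

```jsx
import { BarChart } from '@superset-ui/chart-components';

const data = [
  { category: 'A', value: 10 },
  { category: 'B', value: 25 },
  { category: 'C', value: 15 },
];

// Sort descending by value so the tallest bar reads first
const sorted = [...data].sort((a, b) => b.value - a.value);

function SortedExample() {
  return (
    <BarChart
      data={sorted}
      width={800}
      height={400}
      xField="category"
      yField="value"
    />
  );
}
```
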
59 docs/components/index.md Normal file
@@ -0,0 +1,59 @@
<!--
Licensed to the Apache Software Foundation (ASF) under one
or more contributor license agreements. See the NOTICE file
distributed with this work for additional information
regarding copyright ownership. The ASF licenses this file
to you under the Apache License, Version 2.0 (the
"License"); you may not use this file except in compliance
with the License. You may obtain a copy of the License at

  http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing,
software distributed under the License is distributed on an
"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
KIND, either express or implied. See the License for the
specific language governing permissions and limitations
under the License.
-->

---
title: Component Library
sidebar_position: 1
---

# Superset Component Library

Welcome to the Apache Superset Component Library documentation. This section provides comprehensive documentation for all the UI components, chart components, and layout components used in Superset.

## What is the Component Library?

The Component Library is a collection of reusable UI components used to build the Superset user interface. These components are designed to be consistent, accessible, and easy to use.

## Component Categories

The Component Library is organized into the following categories:

### UI Components

Basic UI components like buttons, inputs, dropdowns, and other form elements.

### Chart Components

Visualization components used to render different types of charts and graphs.

### Layout Components

Components used for page layout, such as containers, grids, and navigation elements.

## Versioning

The Component Library documentation follows its own versioning scheme, independent of the main Superset documentation. This lets us update the component documentation as the components evolve, without affecting the main documentation.

## Getting Started

Browse the sidebar to explore the different components available in the library. Each component's documentation includes:

- Component description and purpose
- Props and configuration options
- Usage examples
- Best practices

113 docs/components/layout-components/grid.md Normal file
@@ -0,0 +1,113 @@
<!--
Licensed to the Apache Software Foundation (ASF) under one
or more contributor license agreements. See the NOTICE file
distributed with this work for additional information
regarding copyright ownership. The ASF licenses this file
to you under the Apache License, Version 2.0 (the
"License"); you may not use this file except in compliance
with the License. You may obtain a copy of the License at

  http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing,
software distributed under the License is distributed on an
"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
KIND, either express or implied. See the License for the
specific language governing permissions and limitations
under the License.
-->

---
title: Grid
sidebar_position: 1
---

# Grid Component

The Grid component provides a flexible layout system for arranging content in rows and columns.

## Props

| Prop | Type | Default | Description |
|------|------|---------|-------------|
| `gutter` | `number` or `[number, number]` | `0` | Spacing between grid items; either a single number or `[horizontal, vertical]` |
| `columns` | `number` | `12` | Number of columns in the grid |
| `justify` | `string` | `'start'` | Horizontal alignment: 'start', 'center', 'end', 'space-between', 'space-around' |
| `align` | `string` | `'top'` | Vertical alignment: 'top', 'middle', 'bottom' |
| `wrap` | `boolean` | `true` | Whether to wrap items when they overflow |

### Row Props

| Prop | Type | Default | Description |
|------|------|---------|-------------|
| `gutter` | `number` or `[number, number]` | `0` | Spacing between items in the row |
| `justify` | `string` | `'start'` | Horizontal alignment for this row |
| `align` | `string` | `'top'` | Vertical alignment for this row |

### Col Props

| Prop | Type | Default | Description |
|------|------|---------|-------------|
| `span` | `number` | - | Number of columns the grid item spans |
| `offset` | `number` | `0` | Number of columns the grid item is offset by (see the sketch after the basic example) |
| `xs`, `sm`, `md`, `lg`, `xl` | `number` or `object` | - | Responsive props for different screen sizes |

## Examples

### Basic Grid

```jsx
import { Grid, Row, Col } from '@superset-ui/core';

function Example() {
  return (
    <Grid>
      <Row gutter={16}>
        <Col span={8}>
          <div>Column 1</div>
        </Col>
        <Col span={8}>
          <div>Column 2</div>
        </Col>
        <Col span={8}>
          <div>Column 3</div>
        </Col>
      </Row>
    </Grid>
  );
}
```

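The `offset` prop from the Col Props table shifts an item to the right; a minimal sketch (the exact behavior is assumed to mirror the span-based sizing above):

```jsx
import { Grid, Row, Col } from '@superset-ui/core';

function OffsetExample() {
  return (
    <Grid>
      <Row gutter={16}>
        {/* An 8-column item pushed right by 4 of the 12 grid columns */}
        <Col span={8} offset={4}>
          <div>Offset column</div>
        </Col>
      </Row>
    </Grid>
  );
}
```
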
### Responsive Grid

```jsx
import { Grid, Row, Col } from '@superset-ui/core';

function Example() {
  return (
    <Grid>
      <Row gutter={[16, 24]}>
        <Col xs={24} sm={12} md={8} lg={6}>
          <div>Responsive Column 1</div>
        </Col>
        <Col xs={24} sm={12} md={8} lg={6}>
          <div>Responsive Column 2</div>
        </Col>
        <Col xs={24} sm={12} md={8} lg={6}>
          <div>Responsive Column 3</div>
        </Col>
        <Col xs={24} sm={12} md={8} lg={6}>
          <div>Responsive Column 4</div>
        </Col>
      </Row>
    </Grid>
  );
}
```

## Best Practices

- Use the Grid system for complex layouts that need to be responsive
- Specify column widths for different screen sizes to ensure proper responsive behavior
- Use gutters to create appropriate spacing between grid items
- Keep the grid structure consistent throughout your application
- Consider using the grid system for dashboard layouts to ensure consistent spacing and alignment

35 docs/components/test.mdx Normal file
@@ -0,0 +1,35 @@
<!--
Licensed to the Apache Software Foundation (ASF) under one
or more contributor license agreements. See the NOTICE file
distributed with this work for additional information
regarding copyright ownership. The ASF licenses this file
to you under the Apache License, Version 2.0 (the
"License"); you may not use this file except in compliance
with the License. You may obtain a copy of the License at

  http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing,
software distributed under the License is distributed on an
"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
KIND, either express or implied. See the License for the
specific language governing permissions and limitations
under the License.
-->

---
title: Test
---

import { StoryExample } from '../src/components/StorybookWrapper';

# Test

This is a test using our custom StorybookWrapper component.

<StoryExample
  component={() => (
    <div style={{ padding: '10px', background: '#f0f0f0', borderRadius: '4px' }}>
      This is a simple example component
    </div>
  )}
/>

146 docs/components/ui-components/button.mdx Normal file
@@ -0,0 +1,146 @@
<!--
Licensed to the Apache Software Foundation (ASF) under one
or more contributor license agreements. See the NOTICE file
distributed with this work for additional information
regarding copyright ownership. The ASF licenses this file
to you under the Apache License, Version 2.0 (the
"License"); you may not use this file except in compliance
with the License. You may obtain a copy of the License at

  http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing,
software distributed under the License is distributed on an
"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
KIND, either express or implied. See the License for the
specific language governing permissions and limitations
under the License.
-->

---
title: Button Component
sidebar_position: 1
---

import { StoryExample, StoryWithControls } from '../../src/components/StorybookWrapper';
import { Button } from '../../../superset-frontend/packages/superset-ui-core/src/components/Button';

# Button Component

The Button component is a fundamental UI element used throughout Superset for user interactions.

## Basic Usage

The default button with primary styling:

<StoryExample
  component={() => (
    <Button buttonStyle="primary" onClick={() => console.log('Clicked!')}>
      Click Me
    </Button>
  )}
/>

## Interactive Example

<StoryWithControls
  component={({ buttonStyle, buttonSize, label, disabled }) => (
    <Button
      buttonStyle={buttonStyle}
      buttonSize={buttonSize}
      disabled={disabled}
      onClick={() => console.log('Clicked!')}
    >
      {label}
    </Button>
  )}
  props={{
    buttonStyle: 'primary',
    buttonSize: 'default',
    label: 'Click Me',
    disabled: false
  }}
  controls={[
    {
      name: 'buttonStyle',
      label: 'Button Style',
      type: 'select',
      options: ['primary', 'secondary', 'tertiary', 'success', 'warning', 'danger', 'default', 'link', 'dashed']
    },
    {
      name: 'buttonSize',
      label: 'Button Size',
      type: 'select',
      options: ['default', 'small', 'xsmall']
    },
    {
      name: 'label',
      label: 'Button Text',
      type: 'text'
    },
    {
      name: 'disabled',
      label: 'Disabled',
      type: 'boolean'
    }
  ]}
/>

## Props

| Prop | Type | Default | Description |
|------|------|---------|-------------|
| `buttonStyle` | `'primary' \| 'secondary' \| 'tertiary' \| 'success' \| 'warning' \| 'danger' \| 'default' \| 'link' \| 'dashed'` | `'default'` | Button style |
| `buttonSize` | `'default' \| 'small' \| 'xsmall'` | `'default'` | Button size |
| `disabled` | `boolean` | `false` | Whether the button is disabled |
| `cta` | `boolean` | `false` | Whether the button is a call-to-action button |
| `tooltip` | `ReactNode` | - | Tooltip content |
| `placement` | `TooltipProps['placement']` | - | Tooltip placement |
| `onClick` | `function` | - | Callback when the button is clicked |
| `href` | `string` | - | Turns the button into an anchor link |
| `target` | `string` | - | Target attribute for anchor links |

## Usage

```jsx
import Button from 'src/components/Button';

function MyComponent() {
  return (
    <Button
      buttonStyle="primary"
      onClick={() => console.log('Button clicked')}
    >
      Click Me
    </Button>
  );
}
```

## Button Styles

Superset provides a variety of button styles for different purposes:

- **Primary**: Used for primary actions
- **Secondary**: Used for secondary actions
- **Tertiary**: Used for less important actions
- **Success**: Used for successful or confirming actions
- **Warning**: Used for actions that require caution
- **Danger**: Used for destructive actions
- **Link**: Used for navigation
- **Dashed**: Used for adding new items or features

## Button Sizes

Buttons come in three sizes:

- **Default**: Standard size for most use cases
- **Small**: Compact size for tight spaces
- **XSmall**: Extra small size for very limited spaces

## Best Practices

- Use primary buttons for the main action in a form or page
- Use secondary buttons for alternative actions
- Use danger buttons for destructive actions
- Limit the number of primary buttons on a page to avoid confusion
- Use consistent button styles throughout your application
- Add tooltips to buttons when their purpose might not be immediately clear (see the sketch below)

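To illustrate the tooltip recommendation with only the props documented above, a minimal sketch:

```jsx
import Button from 'src/components/Button';

// A tooltip spells out a terse, destructive action before the user commits
function DeleteButton({ onDelete }) {
  return (
    <Button
      buttonStyle="danger"
      tooltip="Permanently delete this item"
      onClick={onDelete}
    >
      Delete
    </Button>
  );
}
```
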
1 docs/components/versions.json Normal file
@@ -0,0 +1 @@
[]

1 docs/components_versions.json Normal file
@@ -0,0 +1 @@
[]

477 docs/developer_portal/api/frontend.md Normal file
@@ -0,0 +1,477 @@
---
title: Frontend API Reference
sidebar_position: 1
hide_title: true
---

<!--
Licensed to the Apache Software Foundation (ASF) under one
or more contributor license agreements. See the NOTICE file
distributed with this work for additional information
regarding copyright ownership. The ASF licenses this file
to you under the Apache License, Version 2.0 (the
"License"); you may not use this file except in compliance
with the License. You may obtain a copy of the License at

  http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing,
software distributed under the License is distributed on an
"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
KIND, either express or implied. See the License for the
specific language governing permissions and limitations
under the License.
-->

# Frontend API Reference

The `@apache-superset/core` package provides comprehensive APIs for frontend extension development. All APIs are organized into logical namespaces for easy discovery and use.

## Core API

The core namespace provides fundamental extension functionality.

### registerView

Registers a new view or panel in the specified contribution point.

```typescript
core.registerView(
  id: string,
  component: React.ComponentType
): Disposable
```

**Example:**
```typescript
const panel = context.core.registerView('my-extension.panel', () => (
  <MyPanelComponent />
));
```

### getActiveView

Gets the currently active view in a contribution area.

```typescript
core.getActiveView(area: string): View | undefined
```

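A usage sketch; the `'sqllab.panels'` area id and the `id` field on `View` are assumptions, not documented values:

```typescript
// Check which view currently occupies a contribution area (area id illustrative)
const view = context.core.getActiveView('sqllab.panels');
if (view) {
  context.logger.info(`Active view: ${view.id}`); // assumes View exposes an id
}
```
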
## Commands API

Manages command registration and execution.

### registerCommand

Registers a new command that can be triggered from menus or keyboard shortcuts, or invoked programmatically.

```typescript
commands.registerCommand(
  id: string,
  handler: CommandHandler
): Disposable

interface CommandHandler {
  title: string;
  icon?: string;
  execute: (...args: any[]) => any;
  isEnabled?: (...args: any[]) => boolean;
}
```

**Example:**
```typescript
const cmd = context.commands.registerCommand('my-extension.analyze', {
  title: 'Analyze Query',
  icon: 'BarChartOutlined',
  execute: () => {
    const query = context.sqlLab.getCurrentQuery();
    // Perform analysis
  },
  isEnabled: () => {
    return context.sqlLab.hasActiveEditor();
  }
});
```

### executeCommand

Executes a registered command by ID.

```typescript
commands.executeCommand(id: string, ...args: any[]): Promise<any>
```

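For instance, invoking the command registered above; any extra arguments are passed through to the handler:

```typescript
// Trigger the analysis command programmatically
await context.commands.executeCommand('my-extension.analyze');
```
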
## SQL Lab API

Provides access to SQL Lab functionality and events.

### Query Access

```typescript
// Get current tab
sqlLab.getCurrentTab(): Tab | undefined

// Get all tabs
sqlLab.getTabs(): Tab[]

// Get current query
sqlLab.getCurrentQuery(): string

// Get selected text
sqlLab.getSelectedText(): string | undefined
```

### Database Access

```typescript
// Get available databases
sqlLab.getDatabases(): Database[]

// Get database by ID
sqlLab.getDatabase(id: number): Database | undefined

// Get schemas for database
sqlLab.getSchemas(databaseId: number): Promise<string[]>

// Get tables for schema
sqlLab.getTables(
  databaseId: number,
  schema: string
): Promise<Table[]>
```

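Chaining these calls, a sketch that lists the tables of the first schema of the first database; it assumes `Database` exposes an `id` field:

```typescript
const [db] = context.sqlLab.getDatabases();
if (db) {
  const schemas = await context.sqlLab.getSchemas(db.id); // assumes Database has an id
  if (schemas.length > 0) {
    const tables = await context.sqlLab.getTables(db.id, schemas[0]);
    context.logger.info(`${tables.length} tables in ${schemas[0]}`);
  }
}
```
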
### Events

```typescript
// Query execution events
sqlLab.onDidQueryRun: Event<QueryResult>
sqlLab.onDidQueryStop: Event<QueryResult>
sqlLab.onDidQueryFail: Event<QueryError>

// Editor events
sqlLab.onDidChangeEditorContent: Event<string>
sqlLab.onDidChangeSelection: Event<Selection>

// Tab events
sqlLab.onDidChangeActiveTab: Event<Tab>
sqlLab.onDidCloseTab: Event<Tab>
sqlLab.onDidChangeTabTitle: Event<{tab: Tab, title: string}>

// Panel events
sqlLab.onDidOpenPanel: Event<Panel>
sqlLab.onDidClosePanel: Event<Panel>
sqlLab.onDidChangeActivePanel: Event<Panel>
```

**Event Usage Example:**
```typescript
const disposable = context.sqlLab.onDidQueryRun((result) => {
  console.log('Query executed:', result.query);
  console.log('Rows returned:', result.rowCount);
  console.log('Execution time:', result.executionTime);
});

// Remember to dispose when done
context.subscriptions.push(disposable);
```

## Authentication API

Handles authentication and security tokens.

### getCSRFToken

Gets the current CSRF token for API requests.

```typescript
authentication.getCSRFToken(): Promise<string>
```

### getCurrentUser

Gets information about the current user.

```typescript
authentication.getCurrentUser(): User

interface User {
  id: number;
  username: string;
  email: string;
  roles: Role[];
  permissions: Permission[];
}
```

### hasPermission

Checks if the current user has a specific permission.

```typescript
authentication.hasPermission(permission: string): boolean
```

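Putting these together; the permission string here is illustrative:

```typescript
// Gate a sensitive action behind a permission check
if (context.authentication.hasPermission('can_sql_json')) {
  const user = context.authentication.getCurrentUser();
  context.logger.info(`Running as ${user.username}`);
}
```
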
## Extensions API

Manages extension lifecycle and inter-extension communication.

### getExtension

Gets information about an installed extension.

```typescript
extensions.getExtension(id: string): Extension | undefined

interface Extension {
  id: string;
  name: string;
  version: string;
  isActive: boolean;
  metadata: ExtensionMetadata;
}
```

### getActiveExtensions

Gets all currently active extensions.

```typescript
extensions.getActiveExtensions(): Extension[]
```

### Events

```typescript
// Extension lifecycle events
extensions.onDidActivateExtension: Event<Extension>
extensions.onDidDeactivateExtension: Event<Extension>
```

## UI Components

Import pre-built UI components from `@apache-superset/core`:

```typescript
import {
  Button,
  Select,
  Input,
  Table,
  Modal,
  Alert,
  Tabs,
  Card,
  Dropdown,
  Menu,
  Tooltip,
  Icon,
  // ... many more
} from '@apache-superset/core';
```

### Example Component Usage

```typescript
import { Button, Alert } from '@apache-superset/core';

function MyExtensionPanel() {
  return (
    <div>
      <Alert
        message="Extension Loaded"
        description="Your extension is ready to use"
        type="success"
      />
      <Button
        type="primary"
        onClick={() => console.log('Clicked!')}
      >
        Execute Action
      </Button>
    </div>
  );
}
```

## Storage API

Provides persistent storage for extension data.

### Local Storage

```typescript
// Store data
storage.local.set(key: string, value: any): Promise<void>

// Retrieve data
storage.local.get(key: string): Promise<any>

// Remove data
storage.local.remove(key: string): Promise<void>

// Clear all extension data
storage.local.clear(): Promise<void>
```

### Workspace Storage

Workspace storage is shared across all users for collaborative features.

```typescript
storage.workspace.set(key: string, value: any): Promise<void>
storage.workspace.get(key: string): Promise<any>
storage.workspace.remove(key: string): Promise<void>
```

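A sketch of a typical persist-and-restore round trip; the key name and payload are illustrative:

```typescript
// Persist per-user preferences, then read them back on next activation
await context.storage.local.set('my-extension.settings', { theme: 'dark' });
const settings = await context.storage.local.get('my-extension.settings');
```
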
## Network API

Utilities for making API calls to Superset.

### fetch

Enhanced fetch with CSRF token handling.

```typescript
network.fetch(url: string, options?: RequestInit): Promise<Response>
```

### API Client

Type-safe API client for Superset endpoints.

```typescript
// Get chart data
network.api.charts.get(id: number): Promise<Chart>

// Query database
network.api.sqlLab.execute(
  databaseId: number,
  query: string
): Promise<QueryResult>

// Get datasets
network.api.datasets.list(): Promise<Dataset[]>
```

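For endpoints the typed client does not cover, `network.fetch` can be called directly; a sketch with an illustrative endpoint path:

```typescript
// network.fetch attaches the CSRF token, so authenticated requests work as-is
const res = await context.network.fetch('/api/v1/chart/');
if (res.ok) {
  const charts = await res.json();
}
```
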
## Utility Functions

### Formatting

```typescript
// Format numbers
utils.formatNumber(value: number, format?: string): string

// Format dates
utils.formatDate(date: Date, format?: string): string

// Format SQL
utils.formatSQL(sql: string): string
```

### Validation

```typescript
// Validate SQL syntax
utils.validateSQL(sql: string): ValidationResult

// Check if valid database ID
utils.isValidDatabaseId(id: any): boolean
```

## TypeScript Types

Import common types for type safety:

```typescript
import type {
  Database,
  Dataset,
  Chart,
  Dashboard,
  Query,
  QueryResult,
  Tab,
  Panel,
  User,
  Role,
  Permission,
  ExtensionContext,
  Disposable,
  Event,
  // ... more types
} from '@apache-superset/core';
```

## Extension Context

The context object passed to your extension's `activate` function:

```typescript
interface ExtensionContext {
  // Subscription management
  subscriptions: Disposable[];

  // Extension metadata
  extensionId: string;
  extensionPath: string;

  // API namespaces
  core: CoreAPI;
  commands: CommandsAPI;
  sqlLab: SqlLabAPI;
  authentication: AuthenticationAPI;
  extensions: ExtensionsAPI;
  storage: StorageAPI;
  network: NetworkAPI;
  utils: UtilsAPI;

  // Logging
  logger: Logger;
}
```

## Event Handling

Events follow the VS Code pattern with subscribe/dispose:

```typescript
// Subscribe to event
const disposable = sqlLab.onDidQueryRun((result) => {
  // Handle event
});

// Dispose when done
disposable.dispose();

// Or add to context for automatic cleanup
context.subscriptions.push(disposable);
```

## Best Practices

1. **Always dispose subscriptions** to prevent memory leaks
2. **Use TypeScript** for better IDE support and type safety
3. **Handle errors gracefully** with try-catch blocks
4. **Check permissions** before sensitive operations
5. **Use provided UI components** for consistency
6. **Cache API responses** when appropriate
7. **Validate user input** before processing

## Version Compatibility

The frontend API follows semantic versioning:

- **Major version**: Breaking changes
- **Minor version**: New features, backward compatible
- **Patch version**: Bug fixes

Check compatibility in your `extension.json`:

```json
{
  "engines": {
    "@apache-superset/core": "^1.0.0"
  }
}
```

348 docs/developer_portal/architecture/overview.md Normal file
@@ -0,0 +1,348 @@
---
title: Architecture Overview
sidebar_position: 1
hide_title: true
---

<!--
Licensed to the Apache Software Foundation (ASF) under one
or more contributor license agreements. See the NOTICE file
distributed with this work for additional information
regarding copyright ownership. The ASF licenses this file
to you under the Apache License, Version 2.0 (the
"License"); you may not use this file except in compliance
with the License. You may obtain a copy of the License at

  http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing,
software distributed under the License is distributed on an
"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
KIND, either express or implied. See the License for the
specific language governing permissions and limitations
under the License.
-->

# Extension Architecture Overview

The Superset extension architecture is designed to be modular, secure, and performant. This document provides a comprehensive overview of how extensions work and interact with the Superset host application.

## Core Principles

### 1. Lean Core
Superset's core remains minimal, with features delegated to extensions wherever possible. Built-in features use the same APIs as external extensions, ensuring API quality through dogfooding.

### 2. Explicit Contribution Points
All extension points are clearly defined and documented. Extensions declare their capabilities in metadata files, enabling predictable lifecycle management.

### 3. Versioned APIs
Public interfaces follow semantic versioning, ensuring backward compatibility and safe evolution of the platform.

### 4. Lazy Loading
Extensions load only when needed, minimizing performance impact and resource consumption.

### 5. Composability
Architecture patterns and APIs are reusable across different Superset modules, promoting consistency.

### 6. Community-Driven
The system evolves based on real-world feedback, with new extension points added as needs emerge.

## System Architecture

```mermaid
graph TB
    subgraph "Superset Host Application"
        Core[Core Application]
        API[Extension APIs]
        Loader[Extension Loader]
        Manager[Extension Manager]
    end

    subgraph "Core Packages"
        FrontendCore["@apache-superset/core<br/>(Frontend)"]
        BackendCore["apache-superset-core<br/>(Backend)"]
        CLI["apache-superset-extensions-cli"]
    end

    subgraph "Extension"
        Metadata[extension.json]
        Frontend[Frontend Code]
        Backend[Backend Code]
        Bundle[.supx Bundle]
    end

    Core --> API
    API --> FrontendCore
    API --> BackendCore
    Loader --> Manager
    Manager --> Bundle
    Frontend --> FrontendCore
    Backend --> BackendCore
    CLI --> Bundle
```

## Key Components

### Host Application

The Superset host application provides:

- **Extension APIs**: Well-defined interfaces for extensions to interact with Superset
- **Extension Manager**: Handles lifecycle, activation, and deactivation
- **Module Loader**: Dynamically loads extension code using Webpack Module Federation
- **Security Context**: Manages permissions and sandboxing for extensions

### Core Packages

#### @apache-superset/core (Frontend)
- Shared UI components and utilities
- TypeScript type definitions
- Frontend API implementations
- Event system and command registry

#### apache-superset-core (Backend)
- Python base classes and utilities
- Database access APIs
- Security and permission helpers
- REST API registration

#### apache-superset-extensions-cli
- Project scaffolding
- Build and bundling tools
- Development server
- Package management

### Extension Structure

Each extension consists of:

- **Metadata** (`extension.json`): Declares capabilities and requirements
- **Frontend**: React components and TypeScript code
- **Backend**: Python modules and API endpoints
- **Assets**: Styles, images, and other resources
- **Bundle** (`.supx`): Packaged distribution format

## Module Federation

Extensions use Webpack Module Federation for dynamic loading:

```javascript
// Extension webpack.config.js
new ModuleFederationPlugin({
  name: 'my_extension',
  filename: 'remoteEntry.[contenthash].js',
  exposes: {
    './index': './src/index.tsx',
  },
  externals: {
    '@apache-superset/core': 'superset',
  },
  shared: {
    react: { singleton: true },
    'react-dom': { singleton: true },
  }
})
```

This allows:
- **Independent builds**: Extensions compile separately from Superset
- **Shared dependencies**: Common libraries like React aren't duplicated
- **Dynamic loading**: Extensions load at runtime without rebuilding Superset
- **Version compatibility**: Extensions declare compatible core versions

## Extension Lifecycle

### 1. Registration
```typescript
// Extension registered with host
extensionManager.register({
  name: 'my-extension',
  version: '1.0.0',
  manifest: manifestData
});
```

### 2. Activation
```typescript
// activate() called when extension loads
export function activate(context: ExtensionContext) {
  // Register contributions
  const disposables = [];

  // Add panel
  disposables.push(
    context.core.registerView('my-panel', MyPanel)
  );

  // Register command
  disposables.push(
    context.commands.registerCommand('my-command', {
      execute: () => { /* ... */ }
    })
  );

  // Store for cleanup
  context.subscriptions.push(...disposables);
}
```

### 3. Runtime
- Extension responds to events
- Provides UI components when requested
- Executes commands when triggered
- Accesses APIs as needed

### 4. Deactivation
```typescript
// Automatic cleanup of registered items
export function deactivate() {
  // context.subscriptions automatically disposed
  // Additional cleanup if needed
}
```

## Contribution Types

### Views
Extensions can add panels and UI components:

```json
{
  "views": {
    "sqllab.panels": [{
      "id": "my-panel",
      "name": "My Panel",
      "icon": "ToolOutlined"
    }]
  }
}
```

### Commands
Define executable actions:

```json
{
  "commands": [{
    "command": "my-extension.run",
    "title": "Run Analysis",
    "icon": "PlayCircleOutlined"
  }]
}
```

### Menus
Add items to existing menus:

```json
{
  "menus": {
    "sqllab.editor": {
      "primary": [{
        "command": "my-extension.run",
        "when": "editorHasSelection"
      }]
    }
  }
}
```

### API Endpoints
Register backend REST endpoints:

```python
from superset_core.api import rest_api

@rest_api.route('/my-endpoint')
def my_endpoint():
    return {'data': 'value'}
```

## Security Model

### Permissions
- Extensions run with the user's permissions
- No elevation of privileges
- Access controlled by Superset's RBAC

### Sandboxing
- Frontend code runs in the browser context
- Backend code runs in the Python process
- Future: optional sandboxed execution

### Validation
- Manifest validation on upload
- Signature verification (future)
- Dependency scanning

## Performance Considerations

### Lazy Loading
- Extensions load only when features are accessed
- Code splitting for large extensions
- Cached after first load

### Bundle Optimization
- Tree shaking removes unused code
- Minification reduces size
- Compression for network transfer

### Resource Management
- Automatic cleanup on deactivation
- Memory leak prevention
- Event listener management

## Development vs Production

### Development Mode
```python
# superset_config.py
ENABLE_EXTENSIONS = True
LOCAL_EXTENSIONS = ['/path/to/extension']
```
- Hot reloading
- Source maps
- Debug logging

### Production Mode
- Optimized bundles
- Cached assets
- Performance monitoring

## Future Enhancements

### Planned Features
- Enhanced sandboxing
- Extension marketplace
- Inter-extension communication
- Theme contributions
- Chart type extensions

### API Expansion
- Dashboard extensions
- Database connector API
- Security provider interface
- Workflow automation

## Best Practices

### Do's
- ✅ Use TypeScript for type safety
- ✅ Follow semantic versioning
- ✅ Handle errors gracefully
- ✅ Clean up resources properly
- ✅ Document your extension

### Don'ts
- ❌ Access private APIs
- ❌ Modify global state directly
- ❌ Block the main thread
- ❌ Store sensitive data insecurely
- ❌ Assume API stability in 0.x versions

## Learn More

- [API Reference](../api/frontend)
- [Development Guide](../getting-started)
- [Security Guidelines](./security)
- [Performance Optimization](./performance)

466 docs/developer_portal/cli/overview.md Normal file
@@ -0,0 +1,466 @@
<!--
Licensed to the Apache Software Foundation (ASF) under one
or more contributor license agreements. See the NOTICE file
distributed with this work for additional information
regarding copyright ownership. The ASF licenses this file
to you under the Apache License, Version 2.0 (the
"License"); you may not use this file except in compliance
with the License. You may obtain a copy of the License at

  http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing,
software distributed under the License is distributed on an
"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
KIND, either express or implied. See the License for the
specific language governing permissions and limitations
under the License.
-->

---
title: CLI Documentation
sidebar_position: 1
hide_title: true
---

# Superset Extensions CLI

The `apache-superset-extensions-cli` package provides command-line tools for creating, developing, and packaging Superset extensions.

## Installation

```bash
pip install apache-superset-extensions-cli
```

## Commands

### init

Creates a new extension project with the standard folder structure.

```bash
superset-extensions init <extension-name> [options]
```

**Options:**
- `--template <template>`: Use a specific template (default: basic)
- `--author <name>`: Set the author name
- `--description <text>`: Set the extension description
- `--with-backend`: Include backend code structure

**Example:**
```bash
superset-extensions init my-extension \
  --author "John Doe" \
  --description "Adds custom analytics to SQL Lab" \
  --with-backend
```

**Generated Structure:**
```
my-extension/
├── extension.json
├── frontend/
│   ├── src/
│   │   └── index.tsx
│   ├── package.json
│   ├── tsconfig.json
│   └── webpack.config.js
├── backend/
│   ├── src/
│   │   └── my_extension/
│   │       ├── __init__.py
│   │       └── entrypoint.py
│   ├── tests/
│   ├── pyproject.toml
│   └── requirements.txt
└── README.md
```

### dev

Starts the development server with hot reloading.

```bash
superset-extensions dev [options]
```

**Options:**
- `--port <port>`: Development server port (default: 9001)
- `--host <host>`: Development server host (default: localhost)
- `--no-watch`: Disable file watching
- `--verbose`: Show detailed output

**Example:**
```bash
# Start development server
superset-extensions dev

# Output:
⚙️ Building frontend assets...
✅ Frontend rebuilt
✅ Backend files synced
✅ Manifest updated
👀 Watching for changes...
```

### build

Builds the extension for production.

```bash
superset-extensions build [options]
```

**Options:**
- `--mode <mode>`: Build mode (development | production)
- `--analyze`: Generate bundle analysis
- `--source-maps`: Include source maps

**Example:**
```bash
# Production build
superset-extensions build --mode production

# With analysis
superset-extensions build --analyze
```

### bundle

Creates a `.supx` package for distribution.

```bash
superset-extensions bundle [options]
```

**Options:**
- `--output <path>`: Output directory (default: current)
- `--sign`: Sign the package (requires certificate)
- `--compress`: Compression level (0-9, default: 6)

**Example:**
```bash
# Create bundle
superset-extensions bundle

# Creates: my-extension-1.0.0.supx
```

### validate

Validates the extension configuration and structure.

```bash
superset-extensions validate [options]
```

**Options:**
- `--fix`: Auto-fix common issues
- `--strict`: Enable strict validation

**Checks:**
- Valid extension.json syntax
- Required files present
- Dependency versions
- Module exports
- TypeScript configuration

**Example:**
```bash
superset-extensions validate --strict

# Output:
✅ extension.json valid
✅ Frontend structure valid
✅ Backend structure valid
⚠️ Warning: Missing LICENSE file
✅ Validation passed with warnings
```

### test

Runs extension tests.

```bash
superset-extensions test [options]
```

**Options:**
- `--coverage`: Generate coverage report
- `--watch`: Run in watch mode
- `--frontend-only`: Run only frontend tests
- `--backend-only`: Run only backend tests

**Example:**
```bash
# Run all tests
superset-extensions test --coverage

# Watch mode for frontend
superset-extensions test --frontend-only --watch
```

### publish

Publishes the extension to a registry (future feature).

```bash
superset-extensions publish [options]
```

**Options:**
- `--registry <url>`: Registry URL
- `--token <token>`: Authentication token
- `--dry-run`: Simulate the publish

## Configuration

### Project Configuration

The CLI reads configuration from multiple sources:

1. **extension.json** - Extension metadata
2. **package.json** - Frontend dependencies
3. **pyproject.toml** - Backend configuration
4. **.extensionrc** - CLI-specific settings

### .extensionrc Example

```json
{
  "dev": {
    "port": 9001,
    "host": "localhost",
    "autoReload": true
  },
  "build": {
    "mode": "production",
    "sourceMaps": false,
    "optimization": true
  },
  "test": {
    "coverage": true,
    "threshold": {
      "statements": 80,
      "branches": 70,
      "functions": 80,
      "lines": 80
    }
  }
}
```

## Templates

### Available Templates

- **basic**: Simple extension with frontend only
- **full-stack**: Frontend and backend components
- **sql-panel**: SQL Lab panel extension
- **api-only**: Backend API extension
- **chart-plugin**: Custom chart visualization

### Using Templates

```bash
# Use specific template
superset-extensions init my-chart --template chart-plugin

# List available templates
superset-extensions init --list-templates
```

### Custom Templates

Create custom templates in `~/.superset-extensions/templates/`:

```
~/.superset-extensions/templates/
└── my-template/
    ├── template.json
    └── files/
        └── ... template files ...
```

## Development Workflow

### 1. Create Extension

```bash
superset-extensions init awesome-feature
cd awesome-feature
```

### 2. Install Dependencies

```bash
# Frontend
cd frontend && npm install

# Backend (if applicable)
cd ../backend && pip install -r requirements.txt
```

### 3. Configure Superset

```python
# superset_config.py
ENABLE_EXTENSIONS = True
LOCAL_EXTENSIONS = [
    "/path/to/awesome-feature"
]
```

### 4. Start Development

```bash
# Terminal 1: Extension dev server
superset-extensions dev

# Terminal 2: Superset
superset run -p 8088 --reload
```

### 5. Test Changes

Make changes to your code and see them reflected immediately in Superset.

### 6. Build and Package

```bash
# Validate
superset-extensions validate

# Test
superset-extensions test

# Build
superset-extensions build --mode production

# Bundle
superset-extensions bundle
```

### 7. Deploy

Upload the `.supx` file to your Superset instance.

## Environment Variables

The CLI respects these environment variables:

- `SUPERSET_EXTENSIONS_DEV_PORT`: Development server port
- `SUPERSET_EXTENSIONS_DEV_HOST`: Development server host
- `SUPERSET_BASE_URL`: Superset instance URL
- `NODE_ENV`: Node environment (development/production)
- `PYTHONPATH`: Python module search path

## Troubleshooting

### Common Issues

#### Port Already in Use

```bash
# Use a different port
superset-extensions dev --port 9002
```

#### Module Federation Errors

```bash
# Rebuild with a clean cache
rm -rf dist/ node_modules/.cache
superset-extensions build
```

#### Python Import Errors

```bash
# Ensure the virtual environment is activated
source venv/bin/activate
superset-extensions dev
```

### Debug Mode

Enable verbose output for troubleshooting:

```bash
# Verbose output
superset-extensions dev --verbose

# Debug webpack
DEBUG=webpack:* superset-extensions build
```

## Best Practices

1. **Version control**: Commit `extension.json` but not `dist/`
2. **Dependencies**: Pin versions in package.json
3. **Testing**: Write tests for critical functionality
4. **Documentation**: Keep README.md updated
5. **Validation**: Run `validate` before bundling
6. **Semantic versioning**: Follow semver for releases

## Advanced Usage

### Custom Webpack Configuration

Extend the default webpack config:

```javascript
// webpack.config.js
const path = require('path');
const baseConfig = require('./webpack.base.config');

module.exports = {
  ...baseConfig,
  // Custom modifications
  resolve: {
    ...baseConfig.resolve,
    alias: {
      '@': path.resolve(__dirname, 'src'),
    },
  },
};
```

### CI/CD Integration

```yaml
# .github/workflows/extension.yml
name: Extension CI

on: [push, pull_request]

jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - uses: actions/setup-node@v2
      - uses: actions/setup-python@v2

      - name: Install CLI
        run: pip install apache-superset-extensions-cli

      - name: Validate
        run: superset-extensions validate --strict

      - name: Test
        run: superset-extensions test --coverage

      - name: Build
        run: superset-extensions build --mode production

      - name: Bundle
        run: superset-extensions bundle
```

## Getting Help

- **Documentation**: [Developer Portal](../)
- **Examples**: [GitHub Repository](https://github.com/apache/superset/tree/master/extensions)
- **Issues**: [GitHub Issues](https://github.com/apache/superset/issues)
- **Community**: [Slack Channel](https://apache-superset.slack.com)

464 docs/developer_portal/examples/index.md Normal file
@@ -0,0 +1,464 @@
|
||||
---
|
||||
title: Extension Examples
|
||||
sidebar_position: 1
|
||||
hide_title: true
|
||||
---
|
||||
|
||||
<!--
|
||||
Licensed to the Apache Software Foundation (ASF) under one
|
||||
or more contributor license agreements. See the NOTICE file
|
||||
distributed with this work for additional information
|
||||
regarding copyright ownership. The ASF licenses this file
|
||||
to you under the Apache License, Version 2.0 (the
|
||||
"License"); you may not use this file except in compliance
|
||||
with the License. You may obtain a copy of the License at
|
||||
|
||||
http://www.apache.org/licenses/LICENSE-2.0
|
||||
|
||||
Unless required by applicable law or agreed to in writing,
|
||||
software distributed under the License is distributed on an
|
||||
"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
|
||||
KIND, either express or implied. See the License for the
|
||||
specific language governing permissions and limitations
|
||||
under the License.
|
||||
-->
|
||||
|
||||
# Extension Examples
|
||||
|
||||
Learn from real-world extension implementations that showcase different capabilities of the Superset extension system.
|
||||
|
||||
## Dataset References Panel
|
||||
|
||||
A SQL Lab panel that analyzes queries and displays information about referenced tables.
|
||||
|
||||
### Features
|
||||
- Parses SQL to extract table references
|
||||
- Shows table owners and permissions
|
||||
- Displays last partition information
|
||||
- Provides row count estimates
|
||||
|
||||
### Key Implementation
|
||||
|
||||
```typescript
|
||||
// Parse SQL and extract tables
|
||||
function extractTables(sql: string): TableReference[] {
|
||||
const tables = [];
|
||||
const tableRegex = /FROM\s+(\w+\.?\w+)/gi;
|
||||
let match;
|
||||
|
||||
while ((match = tableRegex.exec(sql)) !== null) {
|
||||
tables.push({
|
||||
schema: match[1].split('.')[0],
|
||||
table: match[1].split('.')[1] || match[1],
|
||||
});
|
||||
}
|
||||
|
||||
return tables;
|
||||
}
|
||||
|
||||
// Register panel
|
||||
export function activate(context: ExtensionContext) {
|
||||
const panel = context.core.registerView('dataset-references.panel', () => (
|
||||
<DatasetReferencesPanel />
|
||||
));
|
||||
|
||||
// Listen for query changes
|
||||
const listener = context.sqlLab.onDidChangeEditorContent((content) => {
|
||||
const tables = extractTables(content);
|
||||
updatePanelWithTables(tables);
|
||||
});
|
||||
|
||||
context.subscriptions.push(panel, listener);
|
||||
}
|
||||
```
|
||||
|
||||
### Manifest
|
||||
|
||||
```json
|
||||
{
|
||||
"name": "dataset-references",
|
||||
"contributions": {
|
||||
"views": {
|
||||
"sqllab.panels": [{
|
||||
"id": "dataset-references.panel",
|
||||
"name": "Dataset References",
|
||||
"icon": "DatabaseOutlined",
|
||||
"location": "right"
|
||||
}]
|
||||
}
|
||||
}
|
||||
}
|
||||
```
|
||||
|
||||
## Query Optimizer
|
||||
|
||||
Analyzes SQL queries and suggests optimizations.
|
||||
|
||||
### Features
|
||||
- Detects missing indexes
|
||||
- Suggests query rewrites
|
||||
- Identifies expensive operations
|
||||
- Provides execution plan analysis
|
||||
|
||||
### Implementation Highlights
|
||||
|
||||
```typescript
|
||||
// Register optimization command
|
||||
const optimizeCommand = context.commands.registerCommand('query-optimizer.analyze', {
|
||||
title: 'Analyze Query Performance',
|
||||
icon: 'ThunderboltOutlined',
|
||||
execute: async () => {
|
||||
const query = context.sqlLab.getCurrentQuery();
|
||||
const database = context.sqlLab.getCurrentDatabase();
|
||||
|
||||
// Get execution plan
|
||||
const plan = await getExecutionPlan(database.id, query);
|
||||
|
||||
// Analyze and suggest improvements
|
||||
const suggestions = analyzeExecutionPlan(plan);
|
||||
|
||||
// Show results in panel
|
||||
showOptimizationResults(suggestions);
|
||||
}
|
||||
});
|
||||
|
||||
// Add to editor menu
|
||||
"menus": {
|
||||
"sqllab.editor": {
|
||||
"primary": [{
|
||||
"command": "query-optimizer.analyze",
|
||||
"when": "editorHasContent"
|
||||
}]
|
||||
}
|
||||
}
|
||||
```
|
||||
|
||||
## Natural Language to SQL
|
||||
|
||||
Converts natural language questions to SQL queries using LLM integration.
|
||||
|
||||
### Features
|
||||
- Natural language input
|
||||
- Context-aware SQL generation
|
||||
- Query validation
|
||||
- History tracking
|
||||
|
||||
### Key Components
|
||||
|
||||
```python
# Backend API endpoint
@rest_api.route('/nl2sql/generate')
def generate_sql(prompt: str, context: dict):
    # Use LLM to generate SQL
    sql = llm_client.generate(
        prompt=prompt,
        schema=context['schema'],
        examples=context['examples'],
    )

    # Validate generated SQL
    validation = validate_sql(sql)

    return {
        'sql': sql,
        'valid': validation.is_valid,
        'errors': validation.errors,
    }
```

```typescript
// Frontend integration
function NL2SQLPanel() {
  const [prompt, setPrompt] = useState('');
  const [loading, setLoading] = useState(false);

  const generateSQL = async () => {
    setLoading(true);

    try {
      const response = await context.network.api.post('/extensions/nl2sql/generate', {
        prompt,
        context: {
          database: context.sqlLab.getCurrentDatabase(),
          schema: await context.sqlLab.getCurrentSchema(),
        },
      });

      if (response.valid) {
        // Insert the generated SQL into the editor
        context.sqlLab.insertText(response.sql);
      }
    } finally {
      // Clear the loading state even if the request fails
      setLoading(false);
    }
  };

  return (
    <div>
      <Input.TextArea
        value={prompt}
        onChange={(e) => setPrompt(e.target.value)}
        placeholder="Describe what data you want..."
      />
      <Button onClick={generateSQL} loading={loading}>
        Generate SQL
      </Button>
    </div>
  );
}
```

## Schema Visualizer

Interactive database schema visualization.

### Features

- Visual ERD diagram
- Table relationships
- Column details on hover
- Export to image

### Implementation

```typescript
import { Graph } from '@antv/g6';

function SchemaVisualizer() {
  const containerRef = useRef<HTMLDivElement>(null);
  const [graph, setGraph] = useState<Graph>();

  useEffect(() => {
    if (!containerRef.current) return;

    const g = new Graph({
      container: containerRef.current,
      layout: {
        type: 'dagre',
        rankdir: 'LR',
      },
      defaultNode: {
        type: 'sql-table-node',
      },
      defaultEdge: {
        type: 'sql-relation-edge',
      },
    });

    setGraph(g);
    loadSchemaData(g);

    return () => g.destroy();
  }, []);

  const loadSchemaData = async (g: Graph) => {
    const tables = await context.sqlLab.getTables();
    const nodes = tables.map(table => ({
      id: table.name,
      label: table.name,
      columns: table.columns,
    }));

    const edges = extractRelationships(tables);

    g.data({ nodes, edges });
    g.render();
  };

  return <div ref={containerRef} style={{ height: '100%' }} />;
}
```
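
`extractRelationships` is also left undefined; a simple version can infer edges from foreign-key metadata, assuming each table object exposes a `foreignKeys` list (an assumption - the real shape depends on what `context.sqlLab.getTables()` returns):

```typescript
// Hypothetical table shape; adjust to what getTables() actually returns
type TableLike = {
  name: string;
  foreignKeys?: { column: string; referencedTable: string }[];
};

// Build one graph edge per foreign-key reference
function extractRelationships(tables: TableLike[]) {
  return tables.flatMap(table =>
    (table.foreignKeys ?? []).map(fk => ({
      source: table.name,
      target: fk.referencedTable,
      label: fk.column,
    })),
  );
}
```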

## SQL Formatter

Formats and beautifies SQL code with customizable rules.

### Features

- Multiple formatting styles
- Custom rule configuration
- Batch formatting
- Format on save

### Simple Implementation

```typescript
import { format } from 'sql-formatter';

const formatCommand = context.commands.registerCommand('sql-formatter.format', {
  title: 'Format SQL',
  execute: () => {
    const sql = context.sqlLab.getCurrentQuery();

    const formatted = format(sql, {
      language: 'sql',
      indent: '  ',
      uppercase: true,
      linesBetweenQueries: 2,
    });

    context.sqlLab.replaceQuery(formatted);
  }
});

// Auto-format on save
context.sqlLab.onWillSaveQuery((event) => {
  if (context.storage.local.get('autoFormat')) {
    const formatted = format(event.query);
    event.waitUntil(Promise.resolve(formatted));
  }
});
```
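
A small companion command can toggle the `autoFormat` flag read above (a sketch; it assumes `context.storage.local` has a `set` counterpart to the `get` used in the example):

```typescript
const toggleAutoFormat = context.commands.registerCommand('sql-formatter.toggleAutoFormat', {
  title: 'Toggle Format on Save',
  execute: () => {
    // Flip the persisted flag that the save hook checks
    const current = context.storage.local.get('autoFormat') ?? false;
    context.storage.local.set('autoFormat', !current);
  }
});

context.subscriptions.push(formatCommand, toggleAutoFormat);
```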

## Query History Search

Enhanced query history with advanced search and filtering.

### Features

- Full-text search
- Filter by date, user, database
- Query statistics
- Export capabilities

### UI Component

```typescript
function QueryHistoryPanel() {
  const [queries, setQueries] = useState<Query[]>([]);
  const [filters, setFilters] = useState<Filters>({});

  useEffect(() => {
    loadQueries();
  }, [filters]);

  const loadQueries = async () => {
    const history = await context.network.api.get('/api/v1/query', {
      params: {
        ...filters,
        page_size: 100,
      }
    });

    setQueries(history.result);
  };

  return (
    <div>
      <SearchFilters onChange={setFilters} />
      <Table
        dataSource={queries}
        columns={[
          { title: 'Query', dataIndex: 'sql', ellipsis: true },
          { title: 'Database', dataIndex: 'database' },
          { title: 'Status', dataIndex: 'status' },
          { title: 'Duration', dataIndex: 'duration' },
          { title: 'User', dataIndex: 'user' },
          {
            title: 'Actions',
            render: (query) => (
              <Button
                icon={<CopyOutlined />}
                onClick={() => context.sqlLab.insertText(query.sql)}
              />
            ),
          },
        ]}
      />
    </div>
  );
}
```

## Git Integration

Version control for SQL queries and dashboards.

### Features

- Save queries to Git
- Track changes
- Collaborative editing
- Branch management

### Backend Integration

```python
from git import Repo

class GitExtension:
    def __init__(self, repo_path):
        self.repo = Repo(repo_path)

    def save_query(self, query, message):
        # Save query to file
        path = f"queries/{query.name}.sql"
        with open(path, 'w') as f:
            f.write(query.sql)

        # Commit to Git
        self.repo.index.add([path])
        self.repo.index.commit(message)

        return {
            'status': 'success',
            'commit': self.repo.head.commit.hexsha,
        }
```
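
On the frontend, a command can hand the current query to this backend (a sketch; the `/extensions/git/save` route is hypothetical and must match whatever your backend entrypoint registers):

```typescript
// Frontend side: send the current query to the Git backend for a commit
const saveToGit = context.commands.registerCommand('git-integration.save', {
  title: 'Save Query to Git',
  execute: async () => {
    const result = await context.network.api.post('/extensions/git/save', {
      sql: context.sqlLab.getCurrentQuery(),
      message: 'Update query from SQL Lab',
    });
    console.log('Committed as', result.commit);
  }
});
```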

## Best Practices from Examples

### 1. User Experience

- Provide clear feedback for async operations (see the sketch after this list)
- Handle errors gracefully
- Include loading states
- Add keyboard shortcuts

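The sketch below ties these together for a long-running command, using antd's `message` for feedback (antd ships with Superset's frontend); `analyzeCurrentQuery` and `renderResults` are placeholders for your extension's own logic:

```typescript
import { message } from 'antd';

const analyzeCommand = context.commands.registerCommand('my-extension.analyze', {
  title: 'Analyze Query',
  execute: async () => {
    // Indefinite loading indicator; message.loading returns a dismiss function
    const hide = message.loading('Analyzing query...', 0);
    try {
      const result = await analyzeCurrentQuery();
      message.success('Analysis complete');
      renderResults(result);
    } catch (err) {
      // Fail loudly rather than silently swallowing the error
      message.error(`Analysis failed: ${String(err)}`);
    } finally {
      hide();
    }
  },
});
```
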
### 2. Performance

- Debounce expensive operations (see the sketch after this list)
- Cache API responses
- Use virtual scrolling for large lists
- Lazy load heavy components

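For example, the editor-content listener from the Dataset References example can be debounced so the SQL is re-parsed at most once per pause in typing (a sketch using only the APIs shown above):

```typescript
// Debounce helper: delays `fn` until `delay` ms pass without another call
function debounce<T extends (...args: any[]) => void>(fn: T, delay: number) {
  let timer: ReturnType<typeof setTimeout>;
  return (...args: Parameters<T>) => {
    clearTimeout(timer);
    timer = setTimeout(() => fn(...args), delay);
  };
}

// Re-parse the SQL at most once per 300 ms of inactivity
const onContentChanged = debounce((content: string) => {
  updatePanelWithTables(extractTables(content));
}, 300);

context.sqlLab.onDidChangeEditorContent(onContentChanged);
```
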
### 3. Integration

- Respect Superset's theme
- Use provided UI components
- Follow existing UX patterns
- Integrate with existing menus

### 4. Code Organization

```
extension/
├── frontend/
│   ├── src/
│   │   ├── components/    # UI components
│   │   ├── hooks/         # Custom hooks
│   │   ├── services/      # API services
│   │   ├── utils/         # Utilities
│   │   └── index.tsx      # Entry point
│   └── tests/
├── backend/
│   ├── src/
│   │   ├── api/           # REST endpoints
│   │   ├── models/        # Data models
│   │   ├── services/      # Business logic
│   │   └── entrypoint.py
│   └── tests/
```

### 5. Testing

```typescript
// Test example
describe('DatasetReferences', () => {
  it('should extract tables from SQL', () => {
    const sql = 'SELECT * FROM users JOIN orders ON users.id = orders.user_id';
    const tables = extractTables(sql);

    expect(tables).toEqual([
      { schema: 'public', table: 'users' },
      { schema: 'public', table: 'orders' },
    ]);
  });
});
```

## Learn More

- [API Reference](../api/frontend)
- [Architecture Overview](../architecture/overview)
- [Getting Started Guide](../getting-started)
- [CLI Documentation](../cli/overview)

248  docs/developer_portal/getting-started/index.md  (new file)
@@ -0,0 +1,248 @@

---
title: Getting Started
sidebar_position: 2
hide_title: true
---

<!--
Licensed to the Apache Software Foundation (ASF) under one
or more contributor license agreements. See the NOTICE file
distributed with this work for additional information
regarding copyright ownership. The ASF licenses this file
to you under the Apache License, Version 2.0 (the
"License"); you may not use this file except in compliance
with the License. You may obtain a copy of the License at

  http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing,
software distributed under the License is distributed on an
"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
KIND, either express or implied. See the License for the
specific language governing permissions and limitations
under the License.
-->

# Getting Started with Extensions

This guide will walk you through creating, developing, and deploying your first Superset extension.

## Prerequisites

Before you begin, make sure you have:

- **Node.js 16+** and npm/yarn installed
- **Python 3.9+** and pip installed
- **Superset** running locally or access to a Superset instance
- **Git** for version control
- Basic knowledge of React and TypeScript

## Quick Start

### 1. Install the CLI

First, install the Superset Extensions CLI globally:

```bash
pip install apache-superset-extensions-cli
```

### 2. Create Your First Extension

Use the CLI to scaffold a new extension project:

```bash
superset-extensions init my-first-extension
cd my-first-extension
```

This creates the following structure:

```
my-first-extension/
├── extension.json          # Extension metadata
├── frontend/               # Frontend code
│   ├── src/
│   │   └── index.tsx       # Main entry point
│   ├── package.json
│   └── webpack.config.js
├── backend/                # Backend code (optional)
│   ├── src/
│   └── requirements.txt
└── README.md
```

### 3. Configure Your Extension

Edit `extension.json` to define your extension's capabilities:

```json
{
  "name": "my-first-extension",
  "version": "1.0.0",
  "description": "My first Superset extension",
  "author": "Your Name",
  "frontend": {
    "contributions": {
      "views": {
        "sqllab.panels": [
          {
            "id": "my-extension.main",
            "name": "My Panel",
            "icon": "ToolOutlined"
          }
        ]
      }
    }
  }
}
```

### 4. Develop Your Extension

Edit `frontend/src/index.tsx` to implement your extension:

```typescript
import React from 'react';
import { ExtensionContext } from '@apache-superset/core';

export function activate(context: ExtensionContext) {
  // Register your panel component
  const panel = context.core.registerView('my-extension.main', () => (
    <div style={{ padding: 20 }}>
      <h2>Hello from My Extension!</h2>
      <p>This is my first Superset extension.</p>
    </div>
  ));

  // Clean up on deactivation
  context.subscriptions.push(panel);
}

export function deactivate() {
  // Cleanup code if needed
}
```

### 5. Test Locally

Enable development mode in your Superset configuration:

```python
# superset_config.py
ENABLE_EXTENSIONS = True
LOCAL_EXTENSIONS = [
    "/path/to/my-first-extension"
]
```

Run the development server:

```bash
# In your extension directory
superset-extensions dev

# In a separate terminal, start Superset
superset run -p 8088 --with-threads --reload
```

Your extension will now appear in SQL Lab!

### 6. Build and Package

When ready to distribute your extension:

```bash
superset-extensions build
superset-extensions bundle
```

This creates a `my-first-extension-1.0.0.supx` file that can be uploaded to any Superset instance.

### 7. Deploy

Upload your extension via the Superset UI or API:

```bash
curl -X POST http://localhost:8088/api/v1/extensions/import/ \
  -H "Authorization: Bearer YOUR_TOKEN" \
  -F "file=@my-first-extension-1.0.0.supx"
```

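To obtain `YOUR_TOKEN`, you can log in through Superset's standard security API (a sketch; credentials and `provider` depend on your auth setup):

```typescript
// Sketch: fetch a JWT for the API call above.
// 'db' is the provider for username/password auth backends.
async function getToken(baseUrl: string) {
  const response = await fetch(`${baseUrl}/api/v1/security/login`, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({
      username: 'admin',
      password: 'admin',
      provider: 'db',
      refresh: true,
    }),
  });
  const { access_token } = await response.json();
  return access_token;
}
```
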
## What's Next?

Now that you have a basic extension working, explore:

- [Extension Architecture](../architecture/overview) - Understand how extensions work
- [API Reference](../api/frontend) - Learn about available APIs
- [Examples](../examples) - See more complex extension examples
- [Best Practices](./best-practices) - Learn extension development best practices

## Common Patterns

### Adding a SQL Lab Panel

```typescript
const panel = context.core.registerView('my-extension.panel', () => (
  <MyPanelComponent />
));
```

### Registering Commands

```typescript
const command = context.commands.registerCommand('my-extension.run', {
  title: 'Run My Command',
  execute: () => {
    // Command logic
  }
});
```

### Listening to Events

```typescript
const listener = context.sqlLab.onDidQueryRun((query) => {
  console.log('Query executed:', query);
});
```

### Adding Menu Items

```json
{
  "menus": {
    "sqllab.editor": {
      "primary": [{
        "command": "my-extension.run",
        "when": "editorHasSelection"
      }]
    }
  }
}
```

## Troubleshooting

### Extension Not Loading

- Ensure `ENABLE_EXTENSIONS = True` in your config
- Check the browser console for errors
- Verify the extension path in `LOCAL_EXTENSIONS`

### Module Not Found Errors

- Run `npm install` in the frontend directory
- Check that `@apache-superset/core` is properly externalized (see the sketch after this list)

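Here, "properly externalized" means the Superset host supplies `@apache-superset/core` (and React) at runtime rather than your bundle shipping its own copies. A sketch of the relevant Module Federation piece of `webpack.config.js` - the exact config your scaffold generates may differ:

```typescript
// webpack.config.js (excerpt) - sketch only
const { ModuleFederationPlugin } = require('webpack').container;

module.exports = {
  plugins: [
    new ModuleFederationPlugin({
      name: 'my_first_extension',
      filename: 'remoteEntry.js',
      exposes: { './index': './src/index.tsx' },
      shared: {
        // Provided by the Superset host at runtime - never bundle a copy
        '@apache-superset/core': { import: false },
        react: { singleton: true, import: false },
        'react-dom': { singleton: true, import: false },
      },
    }),
  ],
};
```
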
### Build Failures

- Ensure all dependencies are installed
- Check `webpack.config.js` for proper Module Federation setup
- Verify that `extension.json` is valid JSON

## Get Help

- Join the [Superset Slack](https://join.slack.com/t/apache-superset/shared_invite/zt-16jvzmoi8-sI1TY1Pm~y_RnSiUAN0jqQ)
- Ask questions on [GitHub Discussions](https://github.com/apache/superset/discussions)
- Report issues on [GitHub Issues](https://github.com/apache/superset/issues)

126  docs/developer_portal/index.md  (new file)
@@ -0,0 +1,126 @@

---
title: Developer Portal
sidebar_position: 1
hide_title: true
---

<!--
Licensed to the Apache Software Foundation (ASF) under one
or more contributor license agreements. See the NOTICE file
distributed with this work for additional information
regarding copyright ownership. The ASF licenses this file
to you under the Apache License, Version 2.0 (the
"License"); you may not use this file except in compliance
with the License. You may obtain a copy of the License at

  http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing,
software distributed under the License is distributed on an
"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
KIND, either express or implied. See the License for the
specific language governing permissions and limitations
under the License.
-->

# Superset Developer Portal

Welcome to the Apache Superset Developer Portal! This is your comprehensive guide to extending and customizing Superset through our new extension architecture.

## What are Superset Extensions?

Superset Extensions provide a powerful way to enhance and customize Apache Superset without modifying the core codebase. Following the successful model pioneered by VS Code, our extension architecture enables developers to:

- **Add custom features** to SQL Lab and other Superset modules
- **Create reusable components** that can be shared across organizations
- **Integrate external tools** and services seamlessly
- **Customize workflows** to match your team's specific needs

## Why Extensions?

As Superset has grown, we've recognized the need for a more modular architecture that allows:

- **Innovation without fragmentation** - Build features without forking the codebase
- **Community-driven development** - Share and reuse extensions across organizations
- **Stable APIs** - Develop against versioned, well-documented interfaces
- **Rapid iteration** - Deploy custom features without waiting for core releases

## Key Features

### 🎯 Well-Defined Extension Points

Extensions can contribute to specific areas of Superset:

- SQL Lab panels (left, right, bottom, editor)
- Custom commands and menu items
- Status bar components
- API endpoints and backend functionality

### 🔧 Modern Development Experience

- **CLI tools** for scaffolding, building, and packaging
- **Hot reloading** during development
- **TypeScript support** with full type safety
- **Module Federation** for dynamic loading

### 📦 Simple Distribution

- Package extensions as `.supx` files
- Upload via REST API or UI
- Automatic activation and lifecycle management
- Version compatibility checking

## Getting Started

Ready to build your first extension? Check out our [Getting Started Guide](./getting-started) to:

1. Set up your development environment
2. Create your first extension
3. Test it locally
4. Package and deploy it

## Architecture Overview

Our extension architecture is built on several key principles:

- **Lean Core**: Keep Superset's core minimal and delegate features to extensions
- **Explicit APIs**: Clear, versioned interfaces for extension interactions
- **Lazy Loading**: Extensions load only when needed for optimal performance
- **Security First**: Extensions run with appropriate permissions and sandboxing

Learn more in our [Architecture Documentation](./architecture/overview).

## Current Status

The extension architecture is currently in active development. The initial focus is on SQL Lab extensions, with plans to expand to:

- Dashboard extensions
- Chart plugins (enhanced from the current system)
- Database connectors
- Security providers

## Example: Dataset References Extension

See the extension system in action with our Dataset References example, which adds a SQL Lab panel showing:

- Tables referenced in queries
- Table owners for permission requests
- Last available partitions
- Estimated row counts

## Join the Community

- **Contribute**: Help shape the future of Superset extensions
- **Share**: Publish your extensions for others to use
- **Learn**: Explore extensions built by the community

## Quick Links

- [Getting Started Guide](./getting-started)
- [Extension Architecture](./architecture/overview)
- [API Reference](./api/frontend)
- [CLI Documentation](./cli/overview)
- [Examples](./examples)

---

*The Superset extension architecture is inspired by the successful model of VS Code Extensions, bringing similar flexibility and power to the data exploration domain.*

59  docs/developer_portal/sidebars.js  (new file)
@@ -0,0 +1,59 @@

/**
 * Licensed to the Apache Software Foundation (ASF) under one
 * or more contributor license agreements. See the NOTICE file
 * distributed with this work for additional information
 * regarding copyright ownership. The ASF licenses this file
 * to you under the Apache License, Version 2.0 (the
 * "License"); you may not use this file except in compliance
 * with the License. You may obtain a copy of the License at
 *
 *   http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing,
 * software distributed under the License is distributed on an
 * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
 * KIND, either express or implied. See the License for the
 * specific language governing permissions and limitations
 * under the License.
 */

module.exports = {
  developerPortalSidebar: [
    'index',
    {
      type: 'category',
      label: 'Getting Started',
      items: [
        'getting-started/index',
      ],
    },
    {
      type: 'category',
      label: 'Architecture',
      items: [
        'architecture/overview',
      ],
    },
    {
      type: 'category',
      label: 'API Reference',
      items: [
        'api/frontend',
      ],
    },
    {
      type: 'category',
      label: 'CLI',
      items: [
        'cli/overview',
      ],
    },
    {
      type: 'category',
      label: 'Examples',
      items: [
        'examples/index',
      ],
    },
  ],
};

1  docs/developer_portal/versions.json  (new file)
@@ -0,0 +1 @@
[]

1  docs/developer_portal_versions.json  (new file)
@@ -0,0 +1 @@
[]

78  docs/docs/configuration/map-tiles.mdx  (new file)
@@ -0,0 +1,78 @@

---
title: Map Tiles
sidebar_position: 12
version: 1
---

# Map tiles

Superset uses OSM and Mapbox tiles by default. OSM is free, but you still need to set `MAPBOX_API_KEY` if you want to use Mapbox maps.

## Setting map tiles

Map tiles can be set with `DECKGL_BASE_MAP` in your `superset_config.py` or `superset_config_docker.py`.
To add your own map tiles, use the following format:

```python
DECKGL_BASE_MAP = [
    ['tile://https://your_personal_url/{z}/{x}/{y}.png', 'MyTile']
]
```

OpenStreetMap tile URLs can be added without a prefix:

```python
DECKGL_BASE_MAP = [
    ['https://c.tile.openstreetmap.org/{z}/{x}/{y}.png', 'OpenStreetMap']
]
```

The default values are:

```python
DECKGL_BASE_MAP = [
    ['https://tile.openstreetmap.org/{z}/{x}/{y}.png', 'Streets (OSM)'],
    ['https://tile.osm.ch/osm-swiss-style/{z}/{x}/{y}.png', 'Topography (OSM)'],
    ['mapbox://styles/mapbox/streets-v9', 'Streets'],
    ['mapbox://styles/mapbox/dark-v9', 'Dark'],
    ['mapbox://styles/mapbox/light-v9', 'Light'],
    ['mapbox://styles/mapbox/satellite-streets-v9', 'Satellite Streets'],
    ['mapbox://styles/mapbox/satellite-v9', 'Satellite'],
    ['mapbox://styles/mapbox/outdoors-v9', 'Outdoors'],
]
```

It is possible to use only Mapbox tiles by removing the OSM entries, and vice versa.

:::warning
Setting `DECKGL_BASE_MAP` overwrites the default values.
:::

After defining your map tiles, allow their origins in these variables:

- `CORS_OPTIONS`
- the `connect-src` entry of the `TALISMAN_CONFIG` and `TALISMAN_CONFIG_DEV` variables

```python
ENABLE_CORS = True
CORS_OPTIONS: dict[Any, Any] = {
    "origins": [
        "https://tile.openstreetmap.org",
        "https://tile.osm.ch",
        "https://your_personal_url/{z}/{x}/{y}.png",
    ]
}

# ...

TALISMAN_CONFIG = {
    "content_security_policy": {
        ...
        "connect-src": [
            "'self'",
            "https://api.mapbox.com",
            "https://events.mapbox.com",
            "https://tile.openstreetmap.org",
            "https://tile.osm.ch",
            "https://your_personal_url/{z}/{x}/{y}.png",
        ],
        ...
    }
}
```

@@ -10,44 +10,183 @@ version: 1

apache-superset>=6.0
:::

Superset now rides on **Ant Design v5's token-based theming**.
Every Antd token works, plus a handful of Superset-specific ones for charts and dashboard chrome.

## Managing Themes via UI

Superset includes a built-in **Theme Management** interface accessible from the admin menu under **Settings > Themes**.

### Creating a New Theme

1. Navigate to **Settings > Themes** in the Superset interface
2. Click **+ Theme** to create a new theme
3. Use the [Ant Design Theme Editor](https://ant.design/theme-editor) to design your theme:
   - Design your palette, typography, and component overrides
   - Open the `CONFIG` modal and copy the JSON configuration
4. Paste the JSON into the theme definition field in Superset
5. Give your theme a descriptive name and save

You can also extend with Superset-specific tokens (documented in the default theme object) before you import.

### System Theme Administration

When `ENABLE_UI_THEME_ADMINISTRATION = True` is configured, administrators can manage system-wide themes directly from the UI:

#### Setting System Themes

- **System Default Theme**: Click the sun icon on any theme to set it as the system-wide default
- **System Dark Theme**: Click the moon icon on any theme to set it as the system dark mode theme
- **Automatic OS Detection**: When both default and dark themes are set, Superset automatically detects and applies the appropriate theme based on OS preferences

#### Managing System Themes

- System themes are indicated with special badges in the theme list
- Only administrators with write permissions can modify system theme settings
- Removing a system theme designation reverts to configuration file defaults

### Applying Themes to Dashboards

Once created, themes can be applied to individual dashboards:

- Edit any dashboard and select your custom theme from the theme dropdown
- Each dashboard can have its own theme, allowing for branded or context-specific styling

## Configuration Options

### Python Configuration

Configure theme behavior via `superset_config.py`:

```python
# superset_config.py
# Enable UI-based theme administration for admins
ENABLE_UI_THEME_ADMINISTRATION = True

# Optional: Set initial default themes via configuration
# These can be overridden via the UI when ENABLE_UI_THEME_ADMINISTRATION = True
THEME_DEFAULT = {
    "token": {
        "colorPrimary": "#2893B3",
        "colorSuccess": "#5ac189",
        # ... your theme JSON configuration
    }
}

# Optional: Dark theme configuration
THEME_DARK = {
    "algorithm": "dark",
    "token": {
        "colorPrimary": "#2893B3",
        # ... your dark theme overrides
    }
}

# To force a single theme on all users, set THEME_DARK = None
# When both themes are defined (via UI or config):
# - Users can manually switch between themes
# - OS preference detection is automatically enabled
```

### Migration from Configuration to UI

When `ENABLE_UI_THEME_ADMINISTRATION = True`:

1. System themes set via the UI take precedence over configuration file settings
2. The UI shows which themes are currently set as system defaults
3. Administrators can change system themes without restarting Superset
4. Configuration file themes serve as fallbacks when no UI themes are set

### Copying Themes Between Systems

To export a theme for use in configuration files or another instance:

1. Navigate to **Settings > Themes** and click the export icon on your desired theme
2. Extract the JSON configuration from the exported YAML file
3. Use this JSON in your `superset_config.py` or import it into another Superset instance

## Theme Development Workflow

1. **Design**: Use the [Ant Design Theme Editor](https://ant.design/theme-editor) to iterate on your design
2. **Test**: Create themes in Superset's CRUD interface for testing
3. **Apply**: Assign themes to specific dashboards or configure instance-wide
4. **Iterate**: Modify theme JSON directly in the CRUD interface or re-import from the theme editor

## Custom Fonts

Superset supports custom fonts through runtime configuration, allowing you to use branded or custom typefaces without rebuilding the application.

### Configuring Custom Fonts

Add font URLs to your `superset_config.py`:

```python
# Load fonts from Google Fonts, Adobe Fonts, or self-hosted sources
CUSTOM_FONT_URLS = [
    "https://fonts.googleapis.com/css2?family=Inter:wght@400;500;600;700&display=swap",
    "https://fonts.googleapis.com/css2?family=JetBrains+Mono:wght@400;500&display=swap",
]

# Update CSP to allow font sources
TALISMAN_CONFIG = {
    "content_security_policy": {
        "font-src": ["'self'", "https://fonts.googleapis.com", "https://fonts.gstatic.com"],
        "style-src": ["'self'", "'unsafe-inline'", "https://fonts.googleapis.com"],
    }
}
```

Restart Superset to apply the changes.

### Using Custom Fonts in Themes

Once configured, reference the fonts in your theme configuration:

```python
THEME_DEFAULT = {
    "token": {
        "fontFamily": "Inter, -apple-system, BlinkMacSystemFont, sans-serif",
        "fontFamilyCode": "JetBrains Mono, Monaco, monospace",
        # ... other theme tokens
    }
}
```

Or in the CRUD interface theme JSON:

```json
{
  "token": {
    "fontFamily": "Inter, -apple-system, BlinkMacSystemFont, sans-serif",
    "fontFamilyCode": "JetBrains Mono, Monaco, monospace"
  }
}
```

### Font Sources

- **Google Fonts**: Free, CDN-hosted fonts with a wide variety
- **Adobe Fonts**: Premium fonts (requires subscription and kit ID)
- **Self-hosted**: Place font files in `/static/assets/fonts/` and reference via CSS

This feature works with the stock Docker image - no custom build required!

## Advanced Features

- **System Themes**: Manage system-wide default and dark themes via UI or configuration
- **Per-Dashboard Theming**: Each dashboard can have its own visual identity
- **JSON Editor**: Edit theme configurations directly within Superset's interface
- **Custom Fonts**: Load external fonts via configuration without rebuilding
- **OS Dark Mode Detection**: Automatically switches themes based on system preferences
- **Theme Import/Export**: Share themes between instances via YAML files

## API Access

For programmatic theme management, Superset provides REST endpoints:

- `GET /api/v1/theme/` - List all themes
- `POST /api/v1/theme/` - Create a new theme
- `PUT /api/v1/theme/{id}` - Update a theme
- `DELETE /api/v1/theme/{id}` - Delete a theme
- `PUT /api/v1/theme/{id}/set_system_default` - Set as system default theme (admin only)
- `PUT /api/v1/theme/{id}/set_system_dark` - Set as system dark theme (admin only)
- `DELETE /api/v1/theme/unset_system_default` - Remove the system default designation
- `DELETE /api/v1/theme/unset_system_dark` - Remove the system dark designation
- `GET /api/v1/theme/export/` - Export themes as YAML
- `POST /api/v1/theme/import/` - Import themes from YAML

These endpoints require appropriate permissions and are subject to RBAC controls.
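
As a quick sketch, creating a theme programmatically might look like this (the endpoint is listed above; the payload field names `theme_name` and `json_data` are assumptions to verify against your instance's OpenAPI spec at `/swagger/v1`):

```typescript
// Sketch: create a theme via the REST API with a bearer token.
async function createTheme(baseUrl: string, token: string) {
  const response = await fetch(`${baseUrl}/api/v1/theme/`, {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      Authorization: `Bearer ${token}`,
    },
    body: JSON.stringify({
      theme_name: 'Brand Theme', // assumed field name
      json_data: JSON.stringify({ token: { colorPrimary: '#2893B3' } }),
    }),
  });
  if (!response.ok) {
    throw new Error(`Theme creation failed: ${response.status}`);
  }
  return response.json();
}
```
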
@@ -120,6 +120,78 @@ docker volume rm superset_db_home

docker-compose up
```

## GitHub Codespaces (Cloud Development)

GitHub Codespaces provides a complete, pre-configured development environment in the cloud. This is ideal for:

- Quick contributions without local setup
- Consistent development environments across team members
- Working from devices that can't run Docker locally
- Safe experimentation in isolated environments

:::info
We're grateful to GitHub for providing this excellent cloud development service that makes
contributing to Apache Superset more accessible to developers worldwide.
:::

### Getting Started with Codespaces

1. **Create a Codespace**: Use this pre-configured link that sets up everything you need:

   [**Launch Superset Codespace →**](https://github.com/codespaces/new?skip_quickstart=true&machine=standardLinux32gb&repo=39464018&ref=master&devcontainer_path=.devcontainer%2Fdevcontainer.json&geo=UsWest)

   :::caution
   **Important**: You must select at least the **4 CPU / 16GB RAM** machine type (pre-selected in the link above).
   Smaller instances will not have sufficient resources to run Superset effectively.
   :::

2. **Wait for Setup**: The initial setup takes several minutes. The Codespace will:
   - Build the development container
   - Install all dependencies
   - Start all required services (PostgreSQL, Redis, etc.)
   - Initialize the database with example data

3. **Access Superset**: Once ready, check the **PORTS** tab in VS Code for port `9001`.
   Click the globe icon to open Superset in your browser.
   - Default credentials: `admin` / `admin`

### Key Features

- **Auto-reload**: Both Python and TypeScript files auto-refresh on save
- **Pre-installed Extensions**: VS Code extensions for Python, TypeScript, and database tools
- **Multiple Instances**: Run multiple Codespaces for different branches/features
- **SSH Access**: Connect via terminal using `gh cs ssh` or through the GitHub web UI
- **VS Code Integration**: Works seamlessly with the VS Code desktop app

### Managing Codespaces

- **List active Codespaces**: `gh cs list`
- **SSH into a Codespace**: `gh cs ssh`
- **Stop a Codespace**: Via the GitHub UI or `gh cs stop`
- **Delete a Codespace**: Via the GitHub UI or `gh cs delete`

### Debugging and Logs

Since Codespaces uses `docker-compose-light.yml`, you can monitor all services:

```bash
# Stream logs from all services
docker compose -f docker-compose-light.yml logs -f

# Stream logs from a specific service
docker compose -f docker-compose-light.yml logs -f superset

# View last 100 lines and follow
docker compose -f docker-compose-light.yml logs --tail=100 -f

# List all running services
docker compose -f docker-compose-light.yml ps
```

:::tip
Codespaces automatically stop after 30 minutes of inactivity to save resources.
Your work is preserved and you can restart anytime.
:::

## Installing Development Tools

:::note

@@ -194,6 +266,48 @@ You can also run the pre-commit checks manually in various ways:

Replace `<hook_id>` with the ID of the specific hook you want to run. You can find the list
of available hooks in the `.pre-commit-config.yaml` file.

## Working with LLMs

### Environment Setup

Ensure Docker Compose is running before starting LLM sessions:

```bash
docker compose up
```

Validate your environment:

```bash
curl -f http://localhost:8088/health && echo "✅ Superset ready"
```

### LLM Session Best Practices

- Always validate environment setup first using the health checks above
- Use focused validation commands: `pre-commit run` (not `--all-files`)
- **Read [LLMS.md](https://github.com/apache/superset/blob/master/LLMS.md) first** - Contains comprehensive development guidelines, coding standards, and critical refactor information
- **Check platform-specific files** when available:
  - `CLAUDE.md` - For Claude/Anthropic tools
  - `CURSOR.md` - For Cursor editor
  - `GEMINI.md` - For Google Gemini tools
  - `GPT.md` - For OpenAI/ChatGPT tools
- Follow the TypeScript migration guidelines and avoid deprecated patterns listed in LLMS.md

### Key Development Commands

```bash
# Frontend development
cd superset-frontend
npm run dev                               # Development server on http://localhost:9000
npm run test                              # Run all tests
npm run test -- filename.test.tsx         # Run single test file
npm run lint                              # Linting and type checking

# Backend validation
pre-commit run mypy                       # Type checking
pytest                                    # Run all tests
pytest tests/unit_tests/specific_test.py  # Run single test file
pytest tests/unit_tests/                  # Run all tests in directory
```

For detailed development context, environment setup, and coding guidelines, see [LLMS.md](https://github.com/apache/superset/blob/master/LLMS.md).

## Alternatives to `docker compose`

:::caution

@@ -307,14 +421,6 @@ Then make sure you run your WSGI server using the right worker type:

gunicorn "superset.app:create_app()" -k "geventwebsocket.gunicorn.workers.GeventWebSocketWorker" -b 127.0.0.1:8088 --reload
```

You can log anything to the browser console, including objects:

```python
from superset import app
app.logger.error('An exception occurred!')
app.logger.info(form_data)
```

### Frontend

Frontend assets (TypeScript, JavaScript, CSS, and images) must be compiled in order to properly display the web UI. The `superset-frontend` directory contains all NPM-managed frontend assets. Note that for some legacy pages there are additional frontend assets bundled with Flask-Appbuilder (e.g. jQuery and bootstrap). These are not managed by NPM and may be phased out in the future.

@@ -26,11 +26,14 @@ Superset locally is using Docker Compose on a Linux or Mac OSX

computer. Superset does not have official support for Windows. It's also the easiest
way to launch a fully functioning **development environment** quickly.

Note that there are 4 major ways we support to run `docker compose`:

1. **docker-compose.yml:** for interactive development, where we mount your local folder with the
   frontend/backend files that you can edit and experience the changes you
   make in the app in real time
1. **docker-compose-light.yml:** a lightweight configuration with minimal services (database,
   Superset app, and frontend dev server) for development. Uses in-memory caching instead of Redis
   and is designed for running multiple instances simultaneously
1. **docker-compose-non-dev.yml** where we just build a more immutable image based on the
   local branch and get all the required images running. Changes in the local branch
   at the time you fire this up will be reflected, but changes to the code

@@ -44,7 +47,7 @@ Note that there are 3 major ways we support to run `docker compose`:

The `dev` builds include the `psycopg2-binary` required to connect
to the Postgres database launched as part of the `docker compose` builds.

More on these approaches after setting up the requirements for either.

## Requirements

@@ -103,13 +106,36 @@ and help you start fresh. In the context of `docker compose` setting

from within docker. This will slow down the startup, but will fix various npm-related issues.
:::

### Option #2 - lightweight development with multiple instances

For a lighter development setup that uses fewer resources and supports running multiple instances:

```bash
# Single lightweight instance (default port 9001)
docker compose -f docker-compose-light.yml up

# Multiple instances with different ports
NODE_PORT=9001 docker compose -p superset-1 -f docker-compose-light.yml up
NODE_PORT=9002 docker compose -p superset-2 -f docker-compose-light.yml up
NODE_PORT=9003 docker compose -p superset-3 -f docker-compose-light.yml up
```

This configuration includes:

- PostgreSQL database (internal network only)
- Superset application server
- Frontend development server with webpack hot reloading
- In-memory caching (no Redis)
- Isolated volumes and networks per instance

Access each instance at `http://localhost:{NODE_PORT}` (e.g., `http://localhost:9001`).

### Option #3 - build a set of immutable images from the local branch

```bash
docker compose -f docker-compose-non-dev.yml up
```

### Option #4 - boot up an official release

```bash
# Set the version you want to run

@@ -2,6 +2,20 @@

title: CVEs fixed by release
sidebar_position: 2
---

#### Version 5.0.0

| CVE            | Title                                                               | Affected |
|:---------------|:--------------------------------------------------------------------|---------:|
| CVE-2025-55673 | Exposure of Sensitive Information to an Unauthorized Actor           |  < 5.0.0 |
| CVE-2025-55674 | Improper Neutralization of Special Elements used in an SQL Command   |  < 5.0.0 |
| CVE-2025-55675 | Improper Access Control leading to Information Disclosure            |  < 5.0.0 |

#### Version 4.1.3

| CVE            | Title                                                               | Affected |
|:---------------|:--------------------------------------------------------------------|---------:|
| CVE-2025-55672 | Improper Neutralization of Input During Web Page Generation          |  < 4.1.3 |

#### Version 4.1.2

| CVE            | Title                                                               | Affected |

@@ -20,16 +20,125 @@

import type { Config } from '@docusaurus/types';
import type { Options, ThemeConfig } from '@docusaurus/preset-classic';
import { themes } from 'prism-react-renderer';
import remarkImportPartial from 'remark-import-partial';
import * as fs from 'fs';
import * as path from 'path';

const { github: lightCodeTheme, vsDark: darkCodeTheme } = themes;

// Load version configuration from external file
const versionsConfigPath = path.join(__dirname, 'versions-config.json');
const versionsConfig = JSON.parse(fs.readFileSync(versionsConfigPath, 'utf8'));

// Build plugins array dynamically based on disabled flags
const dynamicPlugins = [];

// Add components plugin if not disabled
if (!versionsConfig.components.disabled) {
  dynamicPlugins.push([
    '@docusaurus/plugin-content-docs',
    {
      id: 'components',
      path: 'components',
      routeBasePath: 'components',
      sidebarPath: require.resolve('./sidebarComponents.js'),
      editUrl:
        'https://github.com/apache/superset/edit/master/docs/components',
      remarkPlugins: [remarkImportPartial],
      docItemComponent: '@theme/DocItem',
      includeCurrentVersion: versionsConfig.components.includeCurrentVersion,
      lastVersion: versionsConfig.components.lastVersion,
      onlyIncludeVersions: versionsConfig.components.onlyIncludeVersions,
      versions: versionsConfig.components.versions,
      disableVersioning: false,
      showLastUpdateAuthor: true,
      showLastUpdateTime: true,
    },
  ]);
}

// Add developer_portal plugin if not disabled
if (!versionsConfig.developer_portal.disabled) {
  dynamicPlugins.push([
    '@docusaurus/plugin-content-docs',
    {
      id: 'developer_portal',
      path: 'developer_portal',
      routeBasePath: 'developer_portal',
      sidebarPath: require.resolve('./sidebarTutorials.js'),
      editUrl:
        'https://github.com/apache/superset/edit/master/docs/developer_portal',
      remarkPlugins: [remarkImportPartial],
      docItemComponent: '@theme/DocItem',
      includeCurrentVersion: versionsConfig.developer_portal.includeCurrentVersion,
      lastVersion: versionsConfig.developer_portal.lastVersion,
      onlyIncludeVersions: versionsConfig.developer_portal.onlyIncludeVersions,
      versions: versionsConfig.developer_portal.versions,
      disableVersioning: false,
      showLastUpdateAuthor: true,
      showLastUpdateTime: true,
    },
  ]);
}

// Build navbar items dynamically based on disabled flags
const dynamicNavbarItems = [];

// Add Component Playground navbar item if not disabled
if (!versionsConfig.components.disabled) {
  dynamicNavbarItems.push({
    label: 'Component Playground',
    to: '/components',
    items: [
      {
        label: 'Introduction',
        to: '/components',
      },
      {
        label: 'UI Components',
        to: '/components/ui-components/button',
      },
      {
        label: 'Chart Components',
        to: '/components/chart-components/bar-chart',
      },
      {
        label: 'Layout Components',
        to: '/components/layout-components/grid',
      },
    ],
  });
}

// Add Developer Portal navbar item if not disabled
if (!versionsConfig.developer_portal.disabled) {
  dynamicNavbarItems.push({
    label: 'Developer Portal',
    position: 'left',
    items: [
      {
        type: 'doc',
        docsPluginId: 'developer_portal',
        docId: 'index',
        label: 'Introduction',
      },
      {
        type: 'doc',
        docsPluginId: 'developer_portal',
        docId: 'getting-started/index',
        label: 'Getting Started',
      },
    ],
  });
}

const config: Config = {
  title: 'Superset',
  tagline:
    'Apache Superset is a modern data exploration and visualization platform',
  url: 'https://superset.apache.org',
  baseUrl: '/',
  onBrokenLinks: 'warn',
  onBrokenMarkdownLinks: 'throw',
  markdown: {
    mermaid: true,
@@ -39,6 +148,7 @@
  projectName: 'superset',
  themes: ['@saucelabs/theme-github-codeblock', '@docusaurus/theme-mermaid'],
  plugins: [
    require.resolve('./src/webpack.extend.ts'),
    [
      'docusaurus-plugin-less',
      {
@@ -47,11 +157,10 @@
      },
    ],
    ...dynamicPlugins,
    [
      '@docusaurus/plugin-client-redirects',
      {
        fromExtensions: ['html', 'htm'],
        toExtensions: ['exe', 'zip'],
        redirects: [
          {
            to: '/docs/installation/docker-compose',
@@ -210,6 +319,13 @@
          }
          return `https://github.com/apache/superset/edit/master/docs/${versionDocsDirPath}/${docPath}`;
        },
        includeCurrentVersion: versionsConfig.docs.includeCurrentVersion,
        lastVersion: versionsConfig.docs.lastVersion, // Make 'next' the default
        onlyIncludeVersions: versionsConfig.docs.onlyIncludeVersions,
        versions: versionsConfig.docs.versions,
        disableVersioning: false,
        showLastUpdateAuthor: true,
        showLastUpdateTime: true,
      },
      blog: {
        showReadingTime: true,
@@ -235,6 +351,13 @@
      apiKey: 'd0d22810f2e9b614ffac3a73b26891fe',
      indexName: 'superset-apache',
    },
    mermaid: {
      theme: { light: 'neutral', dark: 'dark' },
      options: {
        // Any Mermaid config options go here...
        maxTextSize: 100000,
      },
    },
    navbar: {
      logo: {
        alt: 'Superset Logo',
@@ -244,20 +367,22 @@
      items: [
        {
          label: 'Documentation',
          to: '/docs/intro',
          position: 'left',
          items: [
            {
              type: 'doc',
              docId: 'intro',
              label: 'Getting Started',
              to: '/docs/intro',
            },
            {
              type: 'doc',
              docId: 'faq',
              label: 'FAQ',
              to: '/docs/faq',
            },
          ],
        },
        {
          label: 'Community Resources',
          to: '/community',
          items: [
            {
@@ -282,6 +407,7 @@
          },
        ],
      },
      ...dynamicNavbarItems,
      {
        href: '/docs/intro',
        position: 'right',
@@ -336,7 +462,6 @@
      //   src: 'https://www.bugherd.com/sidebarv2.js?apikey=enilpiu7bgexxsnoqfjtxa',
      //   async: true,
      // },
      '/script/matomo.js',
      {
        src: 'https://widget.kapa.ai/kapa-widget.bundle.js',
        async: true,

docs/eslint.config.js
Normal file
75
docs/eslint.config.js
Normal file
@@ -0,0 +1,75 @@
|

/* eslint-env node */
/**
 * Licensed to the Apache Software Foundation (ASF) under one
 * or more contributor license agreements. See the NOTICE file
 * distributed with this work for additional information
 * regarding copyright ownership. The ASF licenses this file
 * to you under the Apache License, Version 2.0 (the
 * "License"); you may not use this file except in compliance
 * with the License. You may obtain a copy of the License at
 *
 *   http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing,
 * software distributed under the License is distributed on an
 * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
 * KIND, either express or implied. See the License for the
 * specific language governing permissions and limitations
 * under the License.
 */
const typescriptEslintParser = require('@typescript-eslint/parser');
const typescriptEslintPlugin = require('@typescript-eslint/eslint-plugin');
const eslintConfigPrettier = require('eslint-config-prettier');
const prettierEslintPlugin = require('eslint-plugin-prettier');
const js = require('@eslint/js');
const ts = require('typescript-eslint');
const react = require('eslint-plugin-react');
const globals = require('globals');
const { defineConfig, globalIgnores } = require('eslint/config');

module.exports = defineConfig([
  {
    files: ['**/*.{js,jsx,ts,tsx}'],
  },
  globalIgnores(['build/**/*', '.docusaurus/**/*', 'node_modules/**/*']),
  js.configs.recommended,
  ...ts.configs.recommended,
  eslintConfigPrettier,
  {
    files: ['eslint.config.js'],
    rules: {
      '@typescript-eslint/no-require-imports': 'off',
    },
  },
  {
    languageOptions: {
      parser: typescriptEslintParser,
      parserOptions: {
        ecmaFeatures: {
          jsx: true,
        },
        ecmaVersion: 2020,
        sourceType: 'module',
      },
      globals: {
        ...globals.browser,
        ...globals.node,
      },
    },
    plugins: {
      typescript: typescriptEslintPlugin,
      react,
      prettier: prettierEslintPlugin,
    },
    rules: {
      'react/react-in-jsx-scope': 'off',
      'react/prop-types': 'off',
      '@typescript-eslint/explicit-module-boundary-types': 'off',
    },
    settings: {
      react: {
        version: 'detect',
      },
    },
  },
]);

@@ -6,7 +6,8 @@

  "scripts": {
    "docusaurus": "docusaurus",
    "_init": "cat src/intro_header.txt ../README.md > docs/intro.md",
    "start": "yarn run _init && NODE_ENV=development docusaurus start",
    "stop": "pkill -f 'docusaurus start' || echo 'No docusaurus server running'",
    "build": "yarn run _init && DEBUG=docusaurus:* docusaurus build",
    "swizzle": "docusaurus swizzle",
    "deploy": "docusaurus deploy",
@@ -15,41 +16,73 @@
    "write-translations": "docusaurus write-translations",
    "write-heading-ids": "docusaurus write-heading-ids",
    "typecheck": "tsc",
    "eslint": "eslint .",
    "version:add": "node scripts/manage-versions.mjs add",
    "version:remove": "node scripts/manage-versions.mjs remove",
    "version:add:docs": "node scripts/manage-versions.mjs add docs",
    "version:add:developer_portal": "node scripts/manage-versions.mjs add developer_portal",
    "version:add:components": "node scripts/manage-versions.mjs add components",
    "version:remove:docs": "node scripts/manage-versions.mjs remove docs",
    "version:remove:developer_portal": "node scripts/manage-versions.mjs remove developer_portal",
    "version:remove:components": "node scripts/manage-versions.mjs remove components"
  },
  "dependencies": {
    "@ant-design/icons": "^6.0.0",
    "@docusaurus/core": "3.8.1",
    "@docusaurus/plugin-client-redirects": "3.8.1",
    "@docusaurus/preset-classic": "3.8.1",
    "@docusaurus/theme-mermaid": "^3.8.1",
    "@emotion/core": "^10.0.27",
    "@emotion/react": "^11.13.3",
    "@emotion/styled": "^10.0.27",
    "@mdx-js/react": "^3.1.0",
    "@saucelabs/theme-github-codeblock": "^0.3.0",
    "@superset-ui/style": "^0.14.23",
    "@storybook/addon-docs": "^8.6.11",
    "@storybook/blocks": "^8.6.11",
    "@storybook/channels": "^8.6.11",
    "@storybook/client-logger": "^8.6.11",
    "@storybook/components": "^8.6.11",
    "@storybook/core": "^8.6.11",
    "@storybook/core-events": "^8.6.11",
    "@storybook/csf": "^0.1.13",
    "@storybook/docs-tools": "^8.6.11",
    "@storybook/preview-api": "^8.6.11",
    "@storybook/theming": "^8.6.11",
    "@superset-ui/core": "^0.20.4",
    "antd": "^5.26.7",
    "caniuse-lite": "^1.0.30001707",
    "docusaurus-plugin-less": "^2.0.2",
    "json-bigint": "^1.0.0",
    "less": "^4.4.0",
    "less-loader": "^12.3.0",
    "prism-react-renderer": "^2.4.1",
    "react": "^18.3.1",
    "react-dom": "^18.3.1",
    "react-github-btn": "^1.4.0",
    "react-svg-pan-zoom": "^3.13.1",
    "remark-import-partial": "^0.0.2",
    "reselect": "^5.1.1",
    "storybook": "^8.6.11",
    "swagger-ui-react": "^5.27.1",
    "tinycolor2": "^1.4.2",
    "ts-loader": "^9.5.2"
  },
  "devDependencies": {
    "@docusaurus/module-type-aliases": "^3.8.1",
    "@docusaurus/tsconfig": "^3.8.1",
    "@eslint/js": "^9.32.0",
    "@types/react": "^19.1.8",
    "@typescript-eslint/eslint-plugin": "^8.37.0",
    "@typescript-eslint/parser": "^8.37.0",
    "eslint": "^9.32.0",
    "eslint-config-prettier": "^10.1.8",
    "eslint-plugin-prettier": "^5.5.3",
    "eslint-plugin-react": "^7.37.5",
    "globals": "^16.3.0",
    "prettier": "^3.6.2",
    "typescript": "~5.8.3",
    "typescript-eslint": "^8.39.0",
    "webpack": "^5.101.0"
  },
  "browserslist": {
    "production": [
@@ -62,5 +95,6 @@
      "last 1 firefox version",
      "last 1 safari version"
    ]
  },
  "packageManager": "yarn@1.22.22+sha1.ac34549e6aa8e7ead463a7407e1c7390f61a6610"
}

docs/scripts/manage-versions.mjs (new file, 242 lines)
@@ -0,0 +1,242 @@
#!/usr/bin/env node

/**
 * Licensed to the Apache Software Foundation (ASF) under one
 * or more contributor license agreements. See the NOTICE file
 * distributed with this work for additional information
 * regarding copyright ownership. The ASF licenses this file
 * to you under the Apache License, Version 2.0 (the
 * "License"); you may not use this file except in compliance
 * with the License. You may obtain a copy of the License at
 *
 *   http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing,
 * software distributed under the License is distributed on an
 * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
 * KIND, either express or implied. See the License for the
 * specific language governing permissions and limitations
 * under the License.
 */

import fs from 'fs';
import path from 'path';
import { execSync } from 'child_process';
import { fileURLToPath } from 'url';

const __filename = fileURLToPath(import.meta.url);
const __dirname = path.dirname(__filename);

const CONFIG_FILE = path.join(__dirname, '..', 'versions-config.json');

// Parse command line arguments
const args = process.argv.slice(2);
const command = args[0]; // 'add' or 'remove'
const section = args[1]; // 'docs', 'developer_portal', or 'components'
const version = args[2]; // version string like '1.2.0'

function loadConfig() {
  return JSON.parse(fs.readFileSync(CONFIG_FILE, 'utf8'));
}

function saveConfig(config) {
  fs.writeFileSync(CONFIG_FILE, JSON.stringify(config, null, 2) + '\n');
}

function fixVersionedImports(version) {
  const versionedDocsPath = path.join(__dirname, '..', 'versioned_docs', `version-${version}`);

  // Files that need import path fixes
  const filesToFix = [
    'contributing/resources.mdx',
    'configuration/country-map-tools.mdx'
  ];

  console.log(`  Fixing relative imports in versioned docs...`);

  filesToFix.forEach(filePath => {
    const fullPath = path.join(versionedDocsPath, filePath);
    if (fs.existsSync(fullPath)) {
      let content = fs.readFileSync(fullPath, 'utf8');

      // Fix imports that go up two directories to go up three instead
      content = content.replace(
        /from ['"]\.\.\/\.\.\/src\//g,
        "from '../../../src/"
      );
      content = content.replace(
        /from ['"]\.\.\/\.\.\/data\//g,
        "from '../../../data/"
      );

      fs.writeFileSync(fullPath, content);
      console.log(`  Fixed imports in ${filePath}`);
    }
  });
}

function addVersion(section, version) {
  const config = loadConfig();

  if (!config[section]) {
    console.error(`Section '${section}' not found in config`);
    process.exit(1);
  }

  // Check if version already exists
  if (config[section].onlyIncludeVersions.includes(version)) {
    console.error(`Version ${version} already exists in ${section}`);
    process.exit(1);
  }

  console.log(`Creating version ${version} for ${section}...`);

  // Run Docusaurus version command
  const docusaurusCommand = section === 'docs'
    ? `yarn docusaurus docs:version ${version}`
    : `yarn docusaurus docs:version:${section} ${version}`;

  try {
    execSync(docusaurusCommand, { stdio: 'inherit' });
  } catch (error) {
    console.error(`Failed to create version: ${error.message}`);
    process.exit(1);
  }

  // Fix relative imports in versioned docs (for main docs section only)
  if (section === 'docs') {
    fixVersionedImports(version);
  }

  // Update config
  // Add to onlyIncludeVersions array (after 'current')
  const versionIndex = config[section].onlyIncludeVersions.indexOf('current') + 1;
  config[section].onlyIncludeVersions.splice(versionIndex, 0, version);

  // Add version metadata (the URL path segment is the version string itself)
  config[section].versions[version] = {
    label: version,
    path: version,
    banner: 'none'
  };

  // Optionally update lastVersion if this is the first non-current version
  if (config[section].onlyIncludeVersions.length === 2) {
    config[section].lastVersion = version;
  }

  saveConfig(config);
  console.log(`✅ Version ${version} added successfully to ${section}`);
  console.log(`📝 Updated versions-config.json`);
}

function removeVersion(section, version) {
  const config = loadConfig();

  if (!config[section]) {
    console.error(`Section '${section}' not found in config`);
    process.exit(1);
  }

  if (version === 'current') {
    console.error(`Cannot remove 'current' version`);
    process.exit(1);
  }

  if (!config[section].onlyIncludeVersions.includes(version)) {
    console.error(`Version ${version} not found in ${section}`);
    process.exit(1);
  }

  console.log(`Removing version ${version} from ${section}...`);

  // Determine file paths based on section
  const versionedDocsDir = section === 'docs'
    ? `versioned_docs/version-${version}`
    : `${section}_versioned_docs/version-${version}`;

  const versionedSidebarsFile = section === 'docs'
    ? `versioned_sidebars/version-${version}-sidebars.json`
    : `${section}_versioned_sidebars/version-${version}-sidebars.json`;

  // Remove versioned files
  const docsPath = path.join(__dirname, '..', versionedDocsDir);
  const sidebarsPath = path.join(__dirname, '..', versionedSidebarsFile);

  if (fs.existsSync(docsPath)) {
    fs.rmSync(docsPath, { recursive: true });
    console.log(`  Removed ${versionedDocsDir}`);
  }

  if (fs.existsSync(sidebarsPath)) {
    fs.unlinkSync(sidebarsPath);
    console.log(`  Removed ${versionedSidebarsFile}`);
  }

  // Update versions.json file
  const versionsJsonFile = section === 'docs'
    ? 'versions.json'
    : `${section}_versions.json`;
  const versionsJsonPath = path.join(__dirname, '..', versionsJsonFile);

  if (fs.existsSync(versionsJsonPath)) {
    const versions = JSON.parse(fs.readFileSync(versionsJsonPath, 'utf8'));
    const versionIndex = versions.indexOf(version);
    if (versionIndex > -1) {
      versions.splice(versionIndex, 1);
      fs.writeFileSync(versionsJsonPath, JSON.stringify(versions, null, 2) + '\n');
      console.log(`  Updated ${versionsJsonFile}`);
    }
  }

  // Update config
  const versionIndex = config[section].onlyIncludeVersions.indexOf(version);
  config[section].onlyIncludeVersions.splice(versionIndex, 1);
  delete config[section].versions[version];

  // Update lastVersion if needed
  if (config[section].lastVersion === version) {
    // Set to the next available version or 'current'
    const remainingVersions = config[section].onlyIncludeVersions.filter(v => v !== 'current');
    config[section].lastVersion = remainingVersions.length > 0 ? remainingVersions[0] : 'current';
    console.log(`  Updated lastVersion to ${config[section].lastVersion}`);
  }

  saveConfig(config);
  console.log(`✅ Version ${version} removed successfully from ${section}`);
  console.log(`📝 Updated versions-config.json`);
}

function printUsage() {
  console.log(`
Usage:
  node scripts/manage-versions.mjs add <section> <version>
  node scripts/manage-versions.mjs remove <section> <version>

Where:
  - section: 'docs', 'developer_portal', or 'components'
  - version: version string (e.g., '1.2.0', '2.0.0')

Examples:
  node scripts/manage-versions.mjs add docs 2.0.0
  node scripts/manage-versions.mjs add developer_portal 1.3.0
  node scripts/manage-versions.mjs remove components 1.0.0
`);
}

// Main execution
if (!command || !section || !version) {
  printUsage();
  process.exit(1);
}

if (command === 'add') {
  addVersion(section, version);
} else if (command === 'remove') {
  removeVersion(section, version);
} else {
  console.error(`Unknown command: ${command}`);
  printUsage();
  process.exit(1);
}
docs/sidebarComponents.js (new file, 68 lines)
@@ -0,0 +1,68 @@
/* eslint-env node */
/**
 * Licensed to the Apache Software Foundation (ASF) under one
 * or more contributor license agreements. See the NOTICE file
 * distributed with this work for additional information
 * regarding copyright ownership. The ASF licenses this file
 * to you under the Apache License, Version 2.0 (the
 * "License"); you may not use this file except in compliance
 * with the License. You may obtain a copy of the License at
 *
 *   http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing,
 * software distributed under the License is distributed on an
 * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
 * KIND, either express or implied. See the License for the
 * specific language governing permissions and limitations
 * under the License.
 */

// @ts-check

/** @type {import('@docusaurus/plugin-content-docs').SidebarsConfig} */
const sidebars = {
  // By default, Docusaurus generates a sidebar from the docs folder structure
  //tutorialSidebar: [{type: 'autogenerated', dirName: '.'}],

  // But we're not doing that.
  ComponentSidebar: [
    {
      type: 'doc',
      label: 'Introduction',
      id: 'index',
    },
    {
      type: 'category',
      label: 'UI Components',
      items: [
        {
          type: 'autogenerated',
          dirName: 'ui-components',
        },
      ],
    },
    {
      type: 'category',
      label: 'Chart Components',
      items: [
        {
          type: 'autogenerated',
          dirName: 'chart-components',
        },
      ],
    },
    {
      type: 'category',
      label: 'Layout Components',
      items: [
        {
          type: 'autogenerated',
          dirName: 'layout-components',
        },
      ],
    },
  ],
};

module.exports = sidebars;
@@ -17,31 +17,32 @@
 * specific language governing permissions and limitations
 * under the License.
 */
module.exports = {
  extends: [
    'eslint:recommended',
    'plugin:@typescript-eslint/recommended',
    'plugin:react/recommended',
    'plugin:prettier/recommended',

// @ts-check

/** @type {import('@docusaurus/plugin-content-docs').SidebarsConfig} */
const sidebars = {
  // By default, Docusaurus generates a sidebar from the docs folder structure
  //tutorialSidebar: [{type: 'autogenerated', dirName: '.'}],

  // But we're not doing that.
  TutorialsSidebar: [
    {
      type: 'doc',
      label: 'Introduction',
      id: 'index',
    },
    {
      type: 'category',
      label: 'Getting Started',
      items: [
        {
          type: 'autogenerated',
          dirName: 'getting-started',
        },
      ],
    },
  ],
  parser: '@typescript-eslint/parser',
  parserOptions: {
    ecmaFeatures: {
      jsx: true,
    },
    ecmaVersion: 2020,
    sourceType: 'module',
  },
  plugins: ['@typescript-eslint', 'react', 'prettier'],
  rules: {
    'react/react-in-jsx-scope': 'off',
    'react/prop-types': 'off',
    '@typescript-eslint/explicit-module-boundary-types': 'off',
  },
  settings: {
    react: {
      version: 'detect',
    },
  },
  ignorePatterns: ['build/**/*', '.docusaurus/**/*', 'node_modules/**/*'],
};

module.exports = sidebars;
docs/src/components/Button.jsx (new file, 23 lines)
@@ -0,0 +1,23 @@
/**
 * Licensed to the Apache Software Foundation (ASF) under one
 * or more contributor license agreements. See the NOTICE file
 * distributed with this work for additional information
 * regarding copyright ownership. The ASF licenses this file
 * to you under the Apache License, Version 2.0 (the
 * "License"); you may not use this file except in compliance
 * with the License. You may obtain a copy of the License at
 *
 *   http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing,
 * software distributed under the License is distributed on an
 * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
 * KIND, either express or implied. See the License for the
 * specific language governing permissions and limitations
 * under the License.
 */

// Re-export the Button component as a default export
import { Button } from '../../../superset-frontend/packages/superset-ui-core/src/components/Button';

export default Button;
docs/src/components/StorybookWrapper.jsx (new file, 121 lines)
@@ -0,0 +1,121 @@
/**
 * Licensed to the Apache Software Foundation (ASF) under one
 * or more contributor license agreements. See the NOTICE file
 * distributed with this work for additional information
 * regarding copyright ownership. The ASF licenses this file
 * to you under the Apache License, Version 2.0 (the
 * "License"); you may not use this file except in compliance
 * with the License. You may obtain a copy of the License at
 *
 *   http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing,
 * software distributed under the License is distributed on an
 * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
 * KIND, either express or implied. See the License for the
 * specific language governing permissions and limitations
 * under the License.
 */

import React from 'react';
import { supersetTheme, ThemeProvider } from '@superset-ui/core';

// A simple component to display a story example
export function StoryExample({ component: Component, props = {} }) {
  return (
    <ThemeProvider theme={supersetTheme}>
      <div
        className="storybook-example"
        style={{
          border: '1px solid #e8e8e8',
          borderRadius: '4px',
          padding: '20px',
          marginBottom: '20px',
        }}
      >
        {Component && <Component {...props} />}
      </div>
    </ThemeProvider>
  );
}

// A simple component to display a story with controls
export function StoryWithControls({
  component: Component,
  props = {},
  controls = [],
}) {
  const [stateProps, setStateProps] = React.useState(props);

  const updateProp = (key, value) => {
    setStateProps(prev => ({
      ...prev,
      [key]: value,
    }));
  };

  return (
    <ThemeProvider theme={supersetTheme}>
      <div className="storybook-with-controls">
        <div
          className="storybook-example"
          style={{
            border: '1px solid #e8e8e8',
            borderRadius: '4px',
            padding: '20px',
            marginBottom: '20px',
          }}
        >
          {Component && <Component {...stateProps} />}
        </div>

        {controls.length > 0 && (
          <div
            className="storybook-controls"
            style={{
              border: '1px solid #e8e8e8',
              borderRadius: '4px',
              padding: '20px',
              marginBottom: '20px',
            }}
          >
            <h4>Controls</h4>
            {controls.map(control => (
              <div key={control.name} style={{ marginBottom: '10px' }}>
                <label style={{ display: 'block', marginBottom: '5px' }}>
                  {control.label || control.name}:
                </label>
                {control.type === 'select' ? (
                  <select
                    value={stateProps[control.name]}
                    onChange={e => updateProp(control.name, e.target.value)}
                    style={{ width: '100%', padding: '5px' }}
                  >
                    {control.options.map(option => (
                      <option key={option} value={option}>
                        {option}
                      </option>
                    ))}
                  </select>
                ) : control.type === 'boolean' ? (
                  <input
                    type="checkbox"
                    checked={stateProps[control.name]}
                    onChange={e => updateProp(control.name, e.target.checked)}
                  />
                ) : (
                  <input
                    type="text"
                    value={stateProps[control.name]}
                    onChange={e => updateProp(control.name, e.target.value)}
                    style={{ width: '100%', padding: '5px' }}
                  />
                )}
              </div>
            ))}
          </div>
        )}
      </div>
    </ThemeProvider>
  );
}
@@ -111,7 +111,7 @@ const StyledTitleContainer = styled('div')`
  }
`;

const StyledButton = styled(Link as React.ComponentType<any>)`
const StyledButton = styled(Link)`
  border-radius: 10px;
  font-size: 20px;
  font-weight: bold;
@@ -460,15 +460,23 @@ export default function Home(): JSX.Element {
  const changeToDark = () => {
    const navbar = document.body.querySelector('.navbar');
    const logo = document.body.querySelector('.navbar__logo img');
    navbar.classList.add('navbar--dark');
    logo.setAttribute('src', '/img/superset-logo-horiz-dark.svg');
    if (navbar) {
      navbar.classList.add('navbar--dark');
    }
    if (logo) {
      logo.setAttribute('src', '/img/superset-logo-horiz-dark.svg');
    }
  };

  const changeToLight = () => {
    const navbar = document.body.querySelector('.navbar');
    const logo = document.body.querySelector('.navbar__logo img');
    navbar.classList.remove('navbar--dark');
    logo.setAttribute('src', '/img/superset-logo-horiz.svg');
    if (navbar) {
      navbar.classList.remove('navbar--dark');
    }
    if (logo) {
      logo.setAttribute('src', '/img/superset-logo-horiz.svg');
    }
  };

  // Set up dark <-> light navbar change
@@ -476,7 +484,9 @@ export default function Home(): JSX.Element {
  changeToDark();

  const navbarToggle = document.body.querySelector('.navbar__toggle');
  navbarToggle.addEventListener('click', () => changeToLight());
  if (navbarToggle) {
    navbarToggle.addEventListener('click', () => changeToLight());
  }

  const scrollListener = () => {
    if (window.scrollY > 0) {
@@ -45,6 +45,60 @@ ul.dropdown__menu svg {
  display: none;
}

/* Consistent dropdown styling for navbar */
.navbar__item.dropdown .dropdown__menu {
  background-color: white;
  border: 1px solid rgba(0, 0, 0, 0.1);
  box-shadow: 0 2px 8px rgba(0, 0, 0, 0.1);
}

.navbar__item.dropdown .dropdown__link {
  color: #1c1e21 !important;
  background-color: transparent !important;
  border-radius: 0 !important;
  padding: 0.5rem 1rem !important;
  display: block !important;
}

.navbar__item.dropdown .dropdown__link:hover {
  background-color: #f5f5f5 !important;
  color: #1c1e21 !important;
  text-decoration: none !important;
}

/* Remove the blue box styling for doc links in dropdowns */
.navbar__item.dropdown .dropdown__link--active {
  background-color: transparent !important;
  color: #1c1e21 !important;
}

.navbar__item.dropdown .dropdown__link--active:hover {
  background-color: #f5f5f5 !important;
}

/* Dark mode support */
[data-theme='dark'] .navbar__item.dropdown .dropdown__menu {
  background-color: #242526;
  border: 1px solid rgba(255, 255, 255, 0.1);
}

[data-theme='dark'] .navbar__item.dropdown .dropdown__link {
  color: #f5f6f7 !important;
}

[data-theme='dark'] .navbar__item.dropdown .dropdown__link:hover {
  background-color: #3a3b3c !important;
  color: #f5f6f7 !important;
}

[data-theme='dark'] .navbar__item.dropdown .dropdown__link--active {
  color: #f5f6f7 !important;
}

[data-theme='dark'] .navbar__item.dropdown .dropdown__link--active:hover {
  background-color: #3a3b3c !important;
}

:root {
  --ifm-color-primary: #20a7c9;
  --ifm-color-primary-dark: #1985a0;
docs/src/theme/DocVersionBadge/index.js (new file, 119 lines)
@@ -0,0 +1,119 @@
/**
 * Licensed to the Apache Software Foundation (ASF) under one
 * or more contributor license agreements. See the NOTICE file
 * distributed with this work for additional information
 * regarding copyright ownership. The ASF licenses this file
 * to you under the Apache License, Version 2.0 (the
 * "License"); you may not use this file except in compliance
 * with the License. You may obtain a copy of the License at
 *
 *   http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing,
 * software distributed under the License is distributed on an
 * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
 * KIND, either express or implied. See the License for the
 * specific language governing permissions and limitations
 * under the License.
 */

import React from 'react';
import {
  useActivePlugin,
  useDocsVersion,
  useVersions,
} from '@docusaurus/plugin-content-docs/client';
import { useLocation } from '@docusaurus/router';
import { useDocsPreferredVersion } from '@docusaurus/theme-common';
import { Dropdown } from 'antd';
import { DownOutlined } from '@ant-design/icons';

import styles from './styles.module.css';

export default function DocVersionBadge() {
  const activePlugin = useActivePlugin();
  const { pathname } = useLocation();
  const pluginId = activePlugin?.pluginId;
  const [versionedPath, setVersionedPath] = React.useState('');

  // Show version selector for all versioned sections
  const isVersioned = [
    'default', // main docs
    'components',
    'tutorials',
    'developer_portal',
  ].includes(pluginId);

  const { preferredVersion } = useDocsPreferredVersion(pluginId);
  const versions = useVersions(pluginId);
  const version = useDocsVersion();

  // Extract the current page path relative to the version
  React.useEffect(() => {
    if (!pathname || !version || !pluginId) return;

    let relativePath = '';
    const basePath = pluginId === 'default' ? '/docs' : `/${pluginId}`;

    // Handle different version path patterns
    if (pathname.includes(basePath)) {
      // Extract the part after the base path
      const parts = pathname.split(basePath);
      if (parts.length > 1) {
        const afterBase = parts[1];
        // For versioned paths, remove the version segment
        if (afterBase.startsWith('/')) {
          const segments = afterBase.substring(1).split('/');
          // Check if first segment is a version (e.g., "1.1.0", "next")
          if (segments[0] && (segments[0].match(/^\d+\.\d+\.\d+$/) || segments[0] === 'next')) {
            // Skip the version segment
            relativePath = segments.length > 1 ? '/' + segments.slice(1).join('/') : '';
          } else {
            // No version in path (e.g., /docs/intro for current version with empty path)
            relativePath = afterBase;
          }
        }
      }
    }

    setVersionedPath(relativePath);
  }, [pathname, version, pluginId]);

  // Create dropdown items for version selection
  const items = versions.map(v => {
    // Construct the URL for this version, preserving the current page
    // v.path contains the full path including base, e.g., "/docs/1.1.0" or "/docs"
    let versionUrl = v.path;

    if (versionedPath) {
      // Append the current page path to the version base
      versionUrl = v.path + versionedPath;
    }

    return {
      key: v.name,
      label: (
        <a href={versionUrl}>
          {v.label}
          {v.name === version.name && ' (current)'}
          {v.name === preferredVersion?.name && ' (preferred)'}
        </a>
      ),
    };
  });

  if (!isVersioned) {
    return null;
  }

  return (
    <span className={styles.versionBadge}>
      Version:{' '}
      <Dropdown menu={{ items }} trigger={['click']}>
        <a onClick={e => e.preventDefault()} className={styles.versionSelector}>
          {version.label} <DownOutlined />
        </a>
      </Dropdown>
    </span>
  );
}
docs/src/theme/DocVersionBadge/styles.module.css (new file, 42 lines)
@@ -0,0 +1,42 @@
/**
 * Licensed to the Apache Software Foundation (ASF) under one
 * or more contributor license agreements. See the NOTICE file
 * distributed with this work for additional information
 * regarding copyright ownership. The ASF licenses this file
 * to you under the Apache License, Version 2.0 (the
 * "License"); you may not use this file except in compliance
 * with the License. You may obtain a copy of the License at
 *
 *   http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing,
 * software distributed under the License is distributed on an
 * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
 * KIND, either express or implied. See the License for the
 * specific language governing permissions and limitations
 * under the License.
 */

.versionBadge {
  display: inline-flex;
  align-items: center;
  font-size: 0.8rem;
  font-weight: 500;
  color: var(--ifm-color-emphasis-700);
  background-color: var(--ifm-color-emphasis-200);
  border-radius: 0.5rem;
  padding: 0.2rem 0.5rem;
  margin-right: 0.5rem;
}

.versionSelector {
  cursor: pointer;
  color: var(--ifm-color-primary);
  font-weight: 500;
  margin-left: 0.25rem;
}

.versionSelector:hover {
  text-decoration: none;
  color: var(--ifm-color-primary-darker);
}
docs/src/theme/DocVersionBanner/index.js (new file, 121 lines)
@@ -0,0 +1,121 @@
/**
 * Licensed to the Apache Software Foundation (ASF) under one
 * or more contributor license agreements. See the NOTICE file
 * distributed with this work for additional information
 * regarding copyright ownership. The ASF licenses this file
 * to you under the Apache License, Version 2.0 (the
 * "License"); you may not use this file except in compliance
 * with the License. You may obtain a copy of the License at
 *
 *   http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing,
 * software distributed under the License is distributed on an
 * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
 * KIND, either express or implied. See the License for the
 * specific language governing permissions and limitations
 * under the License.
 */

import React, { useState, useEffect } from 'react';
import DocVersionBanner from '@theme-original/DocVersionBanner';
import {
  useActivePlugin,
  useDocsVersion,
  useVersions,
} from '@docusaurus/plugin-content-docs/client';
import { useLocation } from '@docusaurus/router';
import { useDocsPreferredVersion } from '@docusaurus/theme-common';
import { Dropdown } from 'antd';
import { DownOutlined } from '@ant-design/icons';

import styles from './styles.module.css';

export default function DocVersionBannerWrapper(props) {
  const activePlugin = useActivePlugin();
  const { pathname } = useLocation();
  const pluginId = activePlugin?.pluginId;
  const [versionedPath, setVersionedPath] = useState('');

  // Only show version selector for tutorials
  // Main docs, components, and developer_portal use the DocVersionBadge component instead
  const isVersioned = pluginId && ['tutorials'].includes(pluginId);

  const { preferredVersion } = useDocsPreferredVersion(pluginId);
  const versions = useVersions(pluginId);
  const version = useDocsVersion();

  // Extract the current page path relative to the version
  // (hooks must run unconditionally, so this effect comes before the early return)
  useEffect(() => {
    if (!pathname || !version || !pluginId) return;

    let relativePath = '';

    // Handle different version path patterns
    if (pathname.includes(`/${pluginId}/`)) {
      // Extract the part after the version
      // Example: /components/1.1.0/ui-components/button -> /ui-components/button
      const parts = pathname.split(`/${pluginId}/`);
      if (parts.length > 1) {
        const afterPluginId = parts[1];
        // Find where the version part ends
        const versionParts = afterPluginId.split('/');
        if (versionParts.length > 1) {
          // Remove the version part and join the rest
          relativePath = '/' + versionParts.slice(1).join('/');
        }
      }
    }

    setVersionedPath(relativePath);
  }, [pathname, version, pluginId]);

  // Early return if required data is not available
  if (!isVersioned || !versions || !version) {
    return <DocVersionBanner {...props} />;
  }

  // Create dropdown items for version selection
  const items = versions.map(v => {
    // Construct the URL for this version, preserving the current page
    // v.path is the version-specific path like "1.0.0" or "next"
    let versionUrl = v.path;

    if (versionedPath) {
      // Construct the full URL with the version and the current page path
      versionUrl = v.path + versionedPath;
    }

    return {
      key: v.name,
      label: (
        <a href={versionUrl}>
          {v.label}
          {v.name === version.name && ' (current)'}
          {v.name === preferredVersion?.name && ' (preferred)'}
        </a>
      ),
    };
  });

  return (
    <>
      <DocVersionBanner {...props} />
      {isVersioned && (
        <div className={styles.versionBanner}>
          <div className={styles.versionContainer}>
            <span className={styles.versionLabel}>Version:</span>
            <Dropdown menu={{ items }} trigger={['click']}>
              <a
                onClick={e => e.preventDefault()}
                className={styles.versionSelector}
              >
                {version.label} <DownOutlined />
              </a>
            </Dropdown>
          </div>
        </div>
      )}
    </>
  );
}
docs/src/theme/DocVersionBanner/styles.module.css (new file, 49 lines)
@@ -0,0 +1,49 @@
/**
 * Licensed to the Apache Software Foundation (ASF) under one
 * or more contributor license agreements. See the NOTICE file
 * distributed with this work for additional information
 * regarding copyright ownership. The ASF licenses this file
 * to you under the Apache License, Version 2.0 (the
 * "License"); you may not use this file except in compliance
 * with the License. You may obtain a copy of the License at
 *
 *   http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing,
 * software distributed under the License is distributed on an
 * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
 * KIND, either express or implied. See the License for the
 * specific language governing permissions and limitations
 * under the License.
 */

.versionBanner {
  background-color: var(--ifm-color-emphasis-100);
  padding: 0.5rem 1rem;
  margin-bottom: 1rem;
  border-bottom: 1px solid var(--ifm-color-emphasis-200);
}

.versionContainer {
  display: flex;
  align-items: center;
  max-width: var(--ifm-container-width);
  margin: 0 auto;
  padding: 0 var(--ifm-spacing-horizontal);
}

.versionLabel {
  font-weight: bold;
  margin-right: 0.5rem;
}

.versionSelector {
  cursor: pointer;
  color: var(--ifm-color-primary);
  font-weight: 500;
}

.versionSelector:hover {
  text-decoration: none;
  color: var(--ifm-color-primary-darker);
}
@@ -16,7 +16,6 @@
 * specific language governing permissions and limitations
 * under the License.
 */
/* eslint-disable no-undef */
import { useEffect } from 'react';
import useDocusaurusContext from '@docusaurus/useDocusaurusContext';

@@ -50,7 +49,9 @@ export default function Root({ children }) {

  // Handle route changes for SPA
  const handleRouteChange = () => {
    devMode && console.log('Route changed to:', window.location.pathname);
    if (devMode) {
      console.log('Route changed to:', window.location.pathname);
    }

    // Short timeout to ensure the page has fully rendered
    setTimeout(() => {
@@ -58,11 +59,9 @@ export default function Root({ children }) {
      const currentTitle = document.title;
      const currentPath = window.location.pathname;

      devMode &&
        console.log('Tracking page view:', currentPath, currentTitle);

      // For testing: impersonate real domain - ONLY FOR DEVELOPMENT
      if (devMode) {
        console.log('Tracking page view:', currentPath, currentTitle);
        window._paq.push(['setDomains', ['superset.apache.org']]);
        window._paq.push([
          'setCustomUrl',
@@ -85,17 +84,22 @@ export default function Root({ children }) {
    'routeDidUpdate',
  ];

  devMode && console.log('Setting up Docusaurus route listeners');
  if (devMode) {
    console.log('Setting up Docusaurus route listeners');
  }
  possibleEvents.forEach(eventName => {
    document.addEventListener(eventName, () => {
      devMode &&
      if (devMode) {
        console.log(`Docusaurus route update detected via ${eventName}`);
      }
      handleRouteChange();
    });
  });

  // Also set up manual history tracking as fallback
  devMode && console.log('Setting up manual history tracking as fallback');
  if (devMode) {
    console.log('Setting up manual history tracking as fallback');
  }
  const originalPushState = window.history.pushState;
  window.history.pushState = function () {
    originalPushState.apply(this, arguments);
docs/src/webpack.extend.ts (new file, 115 lines)
@@ -0,0 +1,115 @@
/**
 * Licensed to the Apache Software Foundation (ASF) under one
 * or more contributor license agreements. See the NOTICE file
 * distributed with this work for additional information
 * regarding copyright ownership. The ASF licenses this file
 * to you under the Apache License, Version 2.0 (the
 * "License"); you may not use this file except in compliance
 * with the License. You may obtain a copy of the License at
 *
 *   http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing,
 * software distributed under the License is distributed on an
 * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
 * KIND, either express or implied. See the License for the
 * specific language governing permissions and limitations
 * under the License.
 */

import path from 'path';
import type { Plugin } from '@docusaurus/types';

export default function webpackExtendPlugin(): Plugin<void> {
  return {
    name: 'custom-webpack-plugin',
    configureWebpack(config, isServer, utils) {
      const { isDev } = utils;
      return {
        devtool: isDev ? 'eval-source-map' : config.devtool,
        ...(isDev && {
          optimization: {
            ...config.optimization,
            minimize: false,
            removeAvailableModules: false,
            removeEmptyChunks: false,
            splitChunks: false,
          },
        }),
        resolve: {
          alias: {
            ...config.resolve.alias,
            // Allow importing from superset-frontend
            src: path.resolve(__dirname, '../../superset-frontend/src'),
            // '@superset-ui/core': path.resolve(
            //   __dirname,
            //   '../../superset-frontend/packages/superset-ui-core',
            // ),
            // Add aliases for our components to make imports easier
            '@docs/components': path.resolve(__dirname, '../src/components'),
            '@superset/components': path.resolve(
              __dirname,
              '../../superset-frontend/packages/superset-ui-core/src/components',
            ),
            // Add proper Storybook aliases
            '@storybook/blocks': path.resolve(
              __dirname,
              '../node_modules/@storybook/blocks',
            ),
            '@storybook/components': path.resolve(
              __dirname,
              '../node_modules/@storybook/components',
            ),
            '@storybook/theming': path.resolve(
              __dirname,
              '../node_modules/@storybook/theming',
            ),
            '@storybook/client-logger': path.resolve(
              __dirname,
              '../node_modules/@storybook/client-logger',
            ),
            '@storybook/core-events': path.resolve(
              __dirname,
              '../node_modules/@storybook/core-events',
            ),
            // Add internal Storybook aliases
            'storybook/internal/components': path.resolve(
              __dirname,
              '../node_modules/@storybook/components',
            ),
            'storybook/internal/theming': path.resolve(
              __dirname,
              '../node_modules/@storybook/theming',
            ),
            'storybook/internal/client-logger': path.resolve(
              __dirname,
              '../node_modules/@storybook/client-logger',
            ),
            'storybook/internal/csf': path.resolve(
              __dirname,
              '../node_modules/@storybook/csf',
            ),
            'storybook/internal/preview-api': path.resolve(
              __dirname,
              '../node_modules/@storybook/preview-api',
            ),
            'storybook/internal/docs-tools': path.resolve(
              __dirname,
              '../node_modules/@storybook/docs-tools',
            ),
            'storybook/internal/core-events': path.resolve(
              __dirname,
              '../node_modules/@storybook/core-events',
            ),
            'storybook/internal/channels': path.resolve(
              __dirname,
              '../node_modules/@storybook/channels',
            ),
          },
        },
        // We're removing the ts-loader rule that was processing superset-frontend files
        // This will prevent TypeScript errors from files outside the docs directory
      };
    },
  };
}
docs/tutorials_versions.json (new file, 3 lines)
@@ -0,0 +1,3 @@
[
  "1.0.0"
]
docs/versioned_docs/version-6.0.0/api.mdx (new file, 35 lines)
@@ -0,0 +1,35 @@
---
title: API
hide_title: true
sidebar_position: 10
---

import SwaggerUI from 'swagger-ui-react';
import openapi from '/resources/openapi.json';
import 'swagger-ui-react/swagger-ui.css';
import { Alert } from 'antd';

## API

Superset's public **REST API** follows the
[OpenAPI specification](https://swagger.io/specification/), and is
documented here. The docs below are generated using
[Swagger React UI](https://www.npmjs.com/package/swagger-ui-react).

<Alert
  type="info"
  message={
    <div>
      <strong>NOTE! </strong>
      You can find an interactive version of this documentation on your local Superset
      instance at <strong>/swagger/v1</strong> (unless disabled)
    </div>
  }
/>

<br />
<br />
<hr />
<div className="swagger-container">
  <SwaggerUI spec={openapi} />
</div>
@@ -0,0 +1,436 @@
---
title: Alerts and Reports
hide_title: true
sidebar_position: 2
version: 2
---

# Alerts and Reports

Users can configure automated alerts and reports to send dashboards or charts to an email recipient or Slack channel.

- *Alerts* are sent when a SQL condition is reached
- *Reports* are sent on a schedule

Alerts and reports are disabled by default. To turn them on, you need to do some setup, described here.

## Requirements

### Commons

#### In your `superset_config.py` or `superset_config_docker.py`

- The `"ALERT_REPORTS"` [feature flag](/docs/configuration/configuring-superset#feature-flags) must be set to True.
- `beat_schedule` in CeleryConfig must contain a schedule for `reports.scheduler`.
- At least one of these must be configured, depending on what you want to use:
  - emails: `SMTP_*` settings
  - Slack messages: `SLACK_API_TOKEN`
- Users can customize the email subject by including date code placeholders, which will automatically be replaced with the corresponding UTC date when the email is sent. To enable this functionality, activate the `"DATE_FORMAT_IN_EMAIL_SUBJECT"` [feature flag](/docs/configuration/configuring-superset#feature-flags). This enables date formatting in email subjects, preventing all reporting emails from being grouped into the same thread (optional for the reporting feature). See the sketch after this list.
  - Use date codes from [strftime.org](https://strftime.org/) to create the email subject.
  - If no date code is provided, the original string will be used as the email subject.
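A minimal sketch of enabling this in `superset_config.py` (the flag names come from the list above; the subject string itself is entered per report in the UI):

```python
FEATURE_FLAGS = {
    "ALERT_REPORTS": True,
    "DATE_FORMAT_IN_EMAIL_SUBJECT": True,  # optional, enables strftime codes in subjects
}
```

With the flag on, a report whose subject is set to `Weekly numbers %Y-%m-%d` would go out as, for example, `Weekly numbers 2024-01-31`.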
##### Disable dry-run mode

Screenshots will be taken but no messages will actually be sent as long as `ALERT_REPORTS_NOTIFICATION_DRY_RUN = True`, its default value in `docker/pythonpath_dev/superset_config.py`. To disable dry-run mode and start receiving email/Slack notifications, set `ALERT_REPORTS_NOTIFICATION_DRY_RUN` to `False` in [superset config](https://github.com/apache/superset/blob/master/docker/pythonpath_dev/superset_config.py).
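In other words, a one-line override in your config (the setting name and default come from the paragraph above):

```python
# Actually deliver email/Slack notifications instead of only taking screenshots
ALERT_REPORTS_NOTIFICATION_DRY_RUN = False
```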
#### In your `Dockerfile`

- You must install a headless browser for taking screenshots of the charts and dashboards. Only Firefox and Chrome are currently supported.
  > If you choose Chrome, you must also change the value of `WEBDRIVER_TYPE` to `"chrome"` in your `superset_config.py`.

Note: All the required components (Firefox headless browser, Redis, Postgres db, celery worker and celery beat) are present in the *dev* docker image if you are following [Installing Superset Locally](/docs/installation/docker-compose/).
All you need to do is add the required config variables described in this guide (see `Detailed Config`).

If you are running a non-dev docker image, e.g., a stable release like `apache/superset:3.1.0`, that image does not include a headless browser. Only the `superset_worker` container needs this headless browser to browse to the target chart or dashboard.
You can either install and configure the headless browser - see the "Custom Dockerfile" section below - or, when deploying via `docker compose`, modify your `docker-compose.yml` file to use a dev image for the worker container and a stable release image for the `superset_app` container.

*Note*: In this context, a "dev image" is the same application software as its corresponding non-dev image, just bundled with additional tools. So an image like `3.1.0-dev` is identical to `3.1.0` when it comes to stability, functionality, and running in production. The actual "in-development" versions of Superset - cutting-edge and unstable - are not tagged with version numbers on Docker Hub and will display version `0.0.0-dev` within the Superset UI.

### Slack integration

To send alerts and reports to Slack channels, you need to create a new Slack Application in your workspace.

1. Connect to your Slack workspace, then head to [https://api.slack.com/apps](https://api.slack.com/apps).
2. Create a new app.
3. Go to the "OAuth & Permissions" section, and give the following scopes to your app:
   - `incoming-webhook`
   - `files:write`
   - `chat:write`
   - `channels:read`
   - `groups:read`
4. At the top of the "OAuth & Permissions" section, click "Install to Workspace".
5. Select a default channel for your app and continue.
   (You can post to any channel by inviting your Superset app into that channel.)
6. The app should now be installed in your workspace, and a "Bot User OAuth Access Token" should have been created. Copy that token into the `SLACK_API_TOKEN` variable of your `superset_config.py`.
7. Ensure the feature flag `ALERT_REPORT_SLACK_V2` is set to True in `superset_config.py`.
8. Restart the service (or run `superset init`) to pull in the new configuration. A minimal config sketch follows below.

Note: when you configure an alert or a report, the Slack channel list takes channel names without the leading '#', e.g. use `alerts` instead of `#alerts`.
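Putting steps 6 and 7 together, the relevant `superset_config.py` lines look roughly like this (the token value is a placeholder for your own "Bot User OAuth Access Token"):

```python
FEATURE_FLAGS = {
    "ALERT_REPORTS": True,
    "ALERT_REPORT_SLACK_V2": True,
}

# Token created when installing the app to your workspace; starts with "xoxb-"
SLACK_API_TOKEN = "xoxb-<your-token>"
```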
### Kubernetes-specific

- You must have a `celery beat` pod running. If you're using the chart included in the GitHub repository under [helm/superset](https://github.com/apache/superset/tree/master/helm/superset), you need to put `supersetCeleryBeat.enabled = true` in your values override.
- See the dedicated docs about [Kubernetes installation](/docs/installation/kubernetes) for more details.

### Docker Compose specific

#### You must have in your `docker-compose.yml`

- A Redis message broker
- A PostgreSQL DB instead of SQLite
- One or more `celery worker` processes
- A single `celery beat`

This process also works in a Docker swarm environment; you would just need to add `deploy:` to the Superset, Redis and Postgres services, along with your specific configs for your swarm.
### Detailed config

The following configurations need to be added to the `superset_config.py` file. This file is loaded when the image runs, and any configurations in it will override the default configurations found in `config.py`.

You can find documentation about each field in the default `config.py` in the GitHub repository under [superset/config.py](https://github.com/apache/superset/blob/master/superset/config.py).

You need to replace default values with your custom Redis, Slack and/or SMTP config.

Superset uses Celery beat and Celery worker(s) to send alerts and reports.

- The beat is the scheduler that tells the worker when to perform its tasks. This schedule is defined when you create the alert or report.
- The worker will process the tasks that need to be performed when an alert or report is fired.

In the `CeleryConfig`, only the `beat_schedule` is relevant to this feature; the rest of the `CeleryConfig` can be changed to suit your needs.
```python
from celery.schedules import crontab

FEATURE_FLAGS = {
    "ALERT_REPORTS": True
}

REDIS_HOST = "superset_cache"
REDIS_PORT = "6379"

class CeleryConfig:
    broker_url = f"redis://{REDIS_HOST}:{REDIS_PORT}/0"
    imports = (
        "superset.sql_lab",
        "superset.tasks.scheduler",
    )
    result_backend = f"redis://{REDIS_HOST}:{REDIS_PORT}/0"
    worker_prefetch_multiplier = 10
    task_acks_late = True
    task_annotations = {
        "sql_lab.get_sql_results": {
            "rate_limit": "100/s",
        },
    }
    beat_schedule = {
        "reports.scheduler": {
            "task": "reports.scheduler",
            "schedule": crontab(minute="*", hour="*"),
        },
        "reports.prune_log": {
            "task": "reports.prune_log",
            "schedule": crontab(minute=0, hour=0),
        },
    }

CELERY_CONFIG = CeleryConfig

SCREENSHOT_LOCATE_WAIT = 100
SCREENSHOT_LOAD_WAIT = 600

# Slack configuration
SLACK_API_TOKEN = "xoxb-"

# Email configuration
SMTP_HOST = "smtp.sendgrid.net"  # change to your host
SMTP_PORT = 2525  # your port, e.g. 587
SMTP_STARTTLS = True
SMTP_SSL_SERVER_AUTH = True  # If you're using an SMTP server with a valid certificate
SMTP_SSL = False
SMTP_USER = "your_user"  # use the empty string "" if using an unauthenticated SMTP server
SMTP_PASSWORD = "your_password"  # use the empty string "" if using an unauthenticated SMTP server
SMTP_MAIL_FROM = "noreply@youremail.com"
EMAIL_REPORTS_SUBJECT_PREFIX = "[Superset] "  # optional - overwrites default value in config.py of "[Report] "

# WebDriver configuration
# If you use Firefox, you can stick with default values
# If you use Chrome, then add the following WEBDRIVER_TYPE and WEBDRIVER_OPTION_ARGS
WEBDRIVER_TYPE = "chrome"
WEBDRIVER_OPTION_ARGS = [
    "--force-device-scale-factor=2.0",
    "--high-dpi-support=2.0",
    "--headless",
    "--disable-gpu",
    "--disable-dev-shm-usage",
    "--no-sandbox",
    "--disable-setuid-sandbox",
    "--disable-extensions",
]

# This is for internal use, you can keep http
WEBDRIVER_BASEURL = "http://superset:8088"  # When running using docker compose use "http://superset_app:8088"
# This is the link sent to the recipient. Change to your domain, e.g. https://superset.mydomain.com
WEBDRIVER_BASEURL_USER_FRIENDLY = "http://localhost:8088"
```
You also need to specify on behalf of which username to render the dashboards. In general, dashboards and charts are not accessible to unauthorized requests, which is why the worker needs to take over the credentials of an existing user to take a snapshot.

By default, alerts and reports are executed as the owner of the alert/report object. To use a fixed user account instead, just change the config as follows (`admin` in this example):

```python
from superset.tasks.types import FixedExecutor

ALERT_REPORTS_EXECUTORS = [FixedExecutor("admin")]
```

Please refer to `ExecutorType` in the codebase for other executor types.
**Important notes**

- Be mindful of the concurrency setting for celery (using `-c 4`). Selenium/webdriver instances can consume a lot of CPU / memory on your servers.
- In some cases, if you notice a lot of leaked geckodriver processes, try running your celery processes with `celery worker --pool=prefork --max-tasks-per-child=128 ...`
- It is recommended to run separate workers for the `sql_lab` and `email_reports` tasks. This can be done using the `queue` field in `task_annotations`.
- Adjust `WEBDRIVER_BASEURL` in your configuration file if celery workers can’t access Superset via its default value of `http://0.0.0.0:8080/`.

It's also possible to specify a minimum interval between each report's execution through the config file:

```python
# Set a minimum interval threshold between executions (for each Alert/Report)
# Value should be an integer number of seconds
from datetime import timedelta

ALERT_MINIMUM_INTERVAL = int(timedelta(minutes=10).total_seconds())
REPORT_MINIMUM_INTERVAL = int(timedelta(minutes=5).total_seconds())
```
Alternatively, you can assign a function to `ALERT_MINIMUM_INTERVAL` and/or `REPORT_MINIMUM_INTERVAL`. This is useful to dynamically retrieve a value as needed:

```python
def alert_dynamic_minimal_interval(**kwargs) -> int:
    """
    Define logic here to retrieve the value dynamically
    """

ALERT_MINIMUM_INTERVAL = alert_dynamic_minimal_interval
```
## Custom Dockerfile

If you're running the dev version of a released Superset image, like `apache/superset:3.1.0-dev`, you should be set with the above.

But if you're building your own image, or starting with a non-dev version, a webdriver (and headless browser) is needed to capture screenshots of the charts and dashboards, which are then sent to the recipient. Here's how you can modify your Dockerfile to take the screenshots either with Firefox or Chrome.

### Using Firefox

```docker
FROM apache/superset:3.1.0

USER root

RUN apt-get update && \
    apt-get install --no-install-recommends -y firefox-esr

ENV GECKODRIVER_VERSION=0.29.0
RUN wget -q https://github.com/mozilla/geckodriver/releases/download/v${GECKODRIVER_VERSION}/geckodriver-v${GECKODRIVER_VERSION}-linux64.tar.gz && \
    tar -x geckodriver -zf geckodriver-v${GECKODRIVER_VERSION}-linux64.tar.gz -O > /usr/bin/geckodriver && \
    chmod 755 /usr/bin/geckodriver && \
    rm geckodriver-v${GECKODRIVER_VERSION}-linux64.tar.gz

RUN pip install --no-cache gevent psycopg2 redis

USER superset
```
|
||||
|
||||
### Using Chrome
|
||||
|
||||
```docker
|
||||
FROM apache/superset:3.1.0
|
||||
|
||||
USER root
|
||||
|
||||
RUN apt-get update && \
|
||||
apt-get install -y wget zip libaio1
|
||||
|
||||
RUN export CHROMEDRIVER_VERSION=$(curl --silent https://googlechromelabs.github.io/chrome-for-testing/LATEST_RELEASE_116) && \
|
||||
wget -O google-chrome-stable_current_amd64.deb -q http://dl.google.com/linux/chrome/deb/pool/main/g/google-chrome-stable/google-chrome-stable_${CHROMEDRIVER_VERSION}-1_amd64.deb && \
|
||||
apt-get install -y --no-install-recommends ./google-chrome-stable_current_amd64.deb && \
|
||||
rm -f google-chrome-stable_current_amd64.deb
|
||||
|
||||
RUN export CHROMEDRIVER_VERSION=$(curl --silent https://googlechromelabs.github.io/chrome-for-testing/LATEST_RELEASE_116) && \
|
||||
wget -q https://storage.googleapis.com/chrome-for-testing-public/${CHROMEDRIVER_VERSION}/linux64/chromedriver-linux64.zip && \
|
||||
unzip -j chromedriver-linux64.zip -d /usr/bin && \
|
||||
chmod 755 /usr/bin/chromedriver && \
|
||||
rm -f chromedriver-linux64.zip
|
||||
|
||||
RUN pip install --no-cache gevent psycopg2 redis
|
||||
|
||||
USER superset
|
||||
```
|
||||
|
||||
Don't forget to set `WEBDRIVER_TYPE` and `WEBDRIVER_OPTION_ARGS` in your config if you use Chrome.
|
||||
|
||||
## Troubleshooting
|
||||
|
||||
There are many reasons that reports might not be working. Try these steps to check for specific issues.
|
||||
|
||||
### Confirm feature flag is enabled and you have sufficient permissions
|
||||
|
||||
If you don't see "Alerts & Reports" under the *Manage* section of the Settings dropdown in the Superset UI, you need to enable the `ALERT_REPORTS` feature flag (see above). Enable another feature flag and check to see that it took effect, to verify that your config file is getting loaded.
|
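For example, a minimal `superset_config.py` entry:

```python
FEATURE_FLAGS = {
    "ALERT_REPORTS": True,
}
```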
||||
|
||||
Log in as an admin user to ensure you have adequate permissions.
|
||||
|
||||
### Check the logs of your Celery worker
|
||||
|
||||
This is the best source of information about the problem. In a docker compose deployment, you can do this with a command like `docker logs superset_worker --since 1h`.
|
||||
|
||||
### Check web browser and webdriver installation
|
||||
|
||||
To take a screenshot, the worker visits the dashboard or chart in a headless browser and captures the rendered page. If you are able to send a chart as CSV or text but can't send it as a PNG, your problem may lie with the browser.
|
||||
|
||||
Superset docker images that have a tag ending with `-dev` have the Firefox headless browser and geckodriver already installed. You can test that these are installed and in the proper path by entering your Superset worker and running `firefox --headless` and then `geckodriver`. Both commands should start those applications.
|
||||
|
||||
If you are handling the installation of that software on your own, or wish to use Chromium instead, do your own verification to ensure that the headless browser opens successfully in the worker environment.
|
||||
|
||||
### Send a test email
|
||||
|
||||
One symptom of an invalid connection to an email server is receiving an error of `[Errno 110] Connection timed out` in your logs when the report tries to send.
|
||||
|
||||
Confirm via testing that your outbound email configuration is correct. Here is the simplest test, for an unauthenticated SMTP email service running on port 25. If you are sending over SSL, for instance, study how [Superset's codebase sends emails](https://github.com/apache/superset/blob/master/superset/utils/core.py#L818) and then test with those commands and arguments.
|
||||
|
||||
Start Python in your worker environment, replace all example values, and run:
|
||||
|
||||
```python
|
||||
import smtplib
|
||||
from email.mime.multipart import MIMEMultipart
|
||||
from email.mime.text import MIMEText
|
||||
|
||||
from_email = 'superset_emails@example.com'
|
||||
to_email = 'your_email@example.com'
|
||||
msg = MIMEMultipart()
|
||||
msg['From'] = from_email
|
||||
msg['To'] = to_email
|
||||
msg['Subject'] = 'Superset SMTP config test'
|
||||
message = 'It worked'
|
||||
msg.attach(MIMEText(message))
|
||||
mailserver = smtplib.SMTP('smtpmail.example.com', 25)
|
||||
mailserver.sendmail(from_email, to_email, msg.as_string())
|
||||
mailserver.quit()
|
||||
```
|
||||
|
||||
This should send an email.
|
||||
|
||||
Possible fixes:
|
||||
|
||||
- Some cloud hosts disable outgoing unauthenticated SMTP email to prevent spam. For instance, [Azure blocks port 25 by default on some machines](https://learn.microsoft.com/en-us/azure/virtual-network/troubleshoot-outbound-smtp-connectivity). Enable that port or use another sending method.
|
||||
- Use another set of SMTP credentials that you verify works in this setup.
|
||||
|
||||
### Browse to your report from the worker
|
||||
|
||||
The worker may be unable to reach the report. It will use the value of `WEBDRIVER_BASEURL` to browse to the report. If that route is invalid, or presents an authentication challenge that the worker can't pass, the report screenshot will fail.
|
||||
|
||||
Check this by attempting to `curl` the URL of a report that you see in the error logs of your worker. For instance, from the worker environment, run `curl http://superset_app:8088/superset/dashboard/1/`. You may get different responses depending on whether the dashboard exists - for example, you may need to change the `1` in that URL. If there's a URL in your logs from a failed report screenshot, that's a good place to start. The goal is to determine a valid value for `WEBDRIVER_BASEURL` and determine if an issue like HTTPS or authentication is redirecting your worker.
|
||||
|
||||
In a deployment with authentication measures enabled like HTTPS and Single Sign-On, it may make sense to have the worker navigate directly to the Superset application running in the same location, avoiding the need to sign in. For instance, you could use `WEBDRIVER_BASEURL="http://superset_app:8088"` for a docker compose deployment, and set `"force_https": False,` in your `TALISMAN_CONFIG`.
|
||||
|
||||
## Scheduling Queries as Reports
|
||||
|
||||
You can optionally allow your users to schedule queries directly in SQL Lab. This is done by adding
|
||||
extra metadata to saved queries, which are then picked up by an external scheduler (like
|
||||
[Apache Airflow](https://airflow.apache.org/)).
|
||||
|
||||
To allow scheduled queries, define `SCHEDULED_QUERIES` in your configuration file:
|
||||
|
||||
```python
|
||||
SCHEDULED_QUERIES = {
|
||||
# This information is collected when the user clicks "Schedule query",
|
||||
# and saved into the `extra` field of saved queries.
|
||||
# See: https://github.com/mozilla-services/react-jsonschema-form
|
||||
'JSONSCHEMA': {
|
||||
'title': 'Schedule',
|
||||
'description': (
|
||||
'In order to schedule a query, you need to specify when it '
|
||||
'should start running, when it should stop running, and how '
|
||||
'often it should run. You can also optionally specify '
|
||||
'dependencies that should be met before the query is '
|
||||
'executed. Please read the documentation for best practices '
|
||||
'and more information on how to specify dependencies.'
|
||||
),
|
||||
'type': 'object',
|
||||
'properties': {
|
||||
'output_table': {
|
||||
'type': 'string',
|
||||
'title': 'Output table name',
|
||||
},
|
||||
'start_date': {
|
||||
'type': 'string',
|
||||
'title': 'Start date',
|
||||
# date-time is parsed using the chrono library, see
|
||||
# https://www.npmjs.com/package/chrono-node#usage
|
||||
'format': 'date-time',
|
||||
'default': 'tomorrow at 9am',
|
||||
},
|
||||
'end_date': {
|
||||
'type': 'string',
|
||||
'title': 'End date',
|
||||
# date-time is parsed using the chrono library, see
|
||||
# https://www.npmjs.com/package/chrono-node#usage
|
||||
'format': 'date-time',
|
||||
'default': '9am in 30 days',
|
||||
},
|
||||
'schedule_interval': {
|
||||
'type': 'string',
|
||||
'title': 'Schedule interval',
|
||||
},
|
||||
'dependencies': {
|
||||
'type': 'array',
|
||||
'title': 'Dependencies',
|
||||
'items': {
|
||||
'type': 'string',
|
||||
},
|
||||
},
|
||||
},
|
||||
},
|
||||
'UISCHEMA': {
|
||||
'schedule_interval': {
|
||||
'ui:placeholder': '@daily, @weekly, etc.',
|
||||
},
|
||||
'dependencies': {
|
||||
'ui:help': (
|
||||
'Check the documentation for the correct format when '
|
||||
'defining dependencies.'
|
||||
),
|
||||
},
|
||||
},
|
||||
'VALIDATION': [
|
||||
# ensure that start_date <= end_date
|
||||
{
|
||||
'name': 'less_equal',
|
||||
'arguments': ['start_date', 'end_date'],
|
||||
'message': 'End date cannot be before start date',
|
||||
# this is where the error message is shown
|
||||
'container': 'end_date',
|
||||
},
|
||||
],
|
||||
# link to the scheduler; this example links to an Airflow pipeline
|
||||
# that uses the query id and the output table as its name
|
||||
'linkback': (
|
||||
'https://airflow.example.com/admin/airflow/tree?'
|
||||
'dag_id=query_${id}_${extra_json.schedule_info.output_table}'
|
||||
),
|
||||
}
|
||||
```
|
||||
|
||||
This configuration is based on
|
||||
[react-jsonschema-form](https://github.com/mozilla-services/react-jsonschema-form) and will add a
|
||||
menu item called “Schedule” to SQL Lab. When the menu item is clicked, a modal will show up where
|
||||
the user can add the metadata required for scheduling the query.
|
||||
|
||||
This information can then be retrieved from the endpoint `/api/v1/saved_query/` and used to
|
||||
schedule the queries that have `schedule_info` in their JSON metadata. For schedulers other than
|
||||
Airflow, additional fields can be easily added to the configuration file above.
|
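For example, a hypothetical check from the command line (the `$TOKEN` handling is an assumption; adjust host and port to your deployment):

```bash
# obtain a JWT from /api/v1/security/login first, then:
curl -s -H "Authorization: Bearer $TOKEN" \
  "http://localhost:8088/api/v1/saved_query/"
```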
||||
@@ -0,0 +1,104 @@
|
||||
---
|
||||
title: Async Queries via Celery
|
||||
hide_title: true
|
||||
sidebar_position: 4
|
||||
version: 1
|
||||
---
|
||||
|
||||
# Async Queries via Celery
|
||||
|
||||
## Celery
|
||||
|
||||
On large analytic databases, it’s common to run queries that execute for minutes or hours. To enable
|
||||
support for long running queries that execute beyond the typical web request’s timeout (30-60
|
||||
seconds), it is necessary to configure an asynchronous backend for Superset which consists of:
|
||||
|
||||
- one or many Superset workers (implemented as Celery workers), which can be started with
  the `celery worker` command; run `celery worker --help` to view the related options.
|
||||
- a celery broker (message queue) for which we recommend using Redis or RabbitMQ
|
||||
- a results backend that defines where the worker will persist the query results
|
||||
|
||||
Configuring Celery requires defining a `CELERY_CONFIG` in your `superset_config.py`. Both the worker
|
||||
and web server processes should have the same configuration.
|
||||
|
||||
```python
|
||||
class CeleryConfig(object):
|
||||
broker_url = "redis://localhost:6379/0"
|
||||
imports = (
|
||||
"superset.sql_lab",
|
||||
"superset.tasks.scheduler",
|
||||
)
|
||||
result_backend = "redis://localhost:6379/0"
|
||||
worker_prefetch_multiplier = 10
|
||||
task_acks_late = True
|
||||
task_annotations = {
|
||||
"sql_lab.get_sql_results": {
|
||||
"rate_limit": "100/s",
|
||||
},
|
||||
}
|
||||
|
||||
CELERY_CONFIG = CeleryConfig
|
||||
```
|
||||
|
||||
To start a Celery worker to leverage the configuration, run the following command:
|
||||
|
||||
```bash
|
||||
celery --app=superset.tasks.celery_app:app worker --pool=prefork -O fair -c 4
|
||||
```
|
||||
|
||||
To start a job which schedules periodic background jobs, run the following command:
|
||||
|
||||
```bash
|
||||
celery --app=superset.tasks.celery_app:app beat
|
||||
```
|
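Beat only dispatches tasks that are registered in the `beat_schedule` of your `CELERY_CONFIG`. As a sketch, assuming you want to schedule the alert/report tasks (verify the task names against your Superset version):

```python
from celery.schedules import crontab

class CeleryConfig(object):
    # broker_url, imports and result_backend as shown above
    beat_schedule = {
        "reports.scheduler": {
            "task": "reports.scheduler",
            "schedule": crontab(minute="*", hour="*"),  # every minute
        },
        "reports.prune_log": {
            "task": "reports.prune_log",
            "schedule": crontab(minute=0, hour=0),  # daily at midnight
        },
    }
```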
||||
|
||||
To set up a result backend, you need to pass an instance of a class derived from
`flask_caching.backends.base.BaseCache` to the `RESULTS_BACKEND` configuration key in your
`superset_config.py`. You can use Memcached, Redis, S3 (https://pypi.python.org/pypi/s3werkzeugcache),
memory, or the file system (in a single-server setup or for testing), or write your own caching
interface. Your `superset_config.py` may look something like:
|
||||
|
||||
```python
|
||||
# On S3
|
||||
from s3cache.s3cache import S3Cache
|
||||
S3_CACHE_BUCKET = 'foobar-superset'
|
||||
S3_CACHE_KEY_PREFIX = 'sql_lab_result'
|
||||
RESULTS_BACKEND = S3Cache(S3_CACHE_BUCKET, S3_CACHE_KEY_PREFIX)
|
||||
|
||||
# On Redis
|
||||
from flask_caching.backends.rediscache import RedisCache
|
||||
RESULTS_BACKEND = RedisCache(
|
||||
host='localhost', port=6379, key_prefix='superset_results')
|
||||
```
|
||||
|
||||
For performance gains, [MessagePack](https://github.com/msgpack/msgpack-python) and
|
||||
[PyArrow](https://arrow.apache.org/docs/python/) are now used for results serialization. This can be
|
||||
disabled by setting `RESULTS_BACKEND_USE_MSGPACK = False` in your `superset_config.py`, should any
|
||||
issues arise. Please clear your existing results cache store when upgrading an existing environment.
|
||||
|
||||
**Important Notes**
|
||||
|
||||
- It is important that all the worker nodes and web servers in the Superset cluster _share a common
|
||||
metadata database_. This means that SQLite will not work in this context since it has limited
|
||||
support for concurrency and typically lives on the local file system.
|
||||
|
||||
- There should _only be one instance of celery beat running_ in your entire setup. If not,
|
||||
background jobs can get scheduled multiple times resulting in weird behaviors like duplicate
|
||||
delivery of reports, higher than expected load/traffic, etc.
|
||||
|
||||
- SQL Lab will _only run your queries asynchronously if_ you enable **Asynchronous Query Execution**
|
||||
in your database settings (Sources > Databases > Edit record).
|
||||
|
||||
## Celery Flower
|
||||
|
||||
Flower is a web based tool for monitoring the Celery cluster which you can install from pip:
|
||||
|
||||
```bash
|
||||
pip install flower
|
||||
```
|
||||
|
||||
You can run flower using:
|
||||
|
||||
```bash
|
||||
celery --app=superset.tasks.celery_app:app flower
|
||||
```
|
||||
docs/versioned_docs/version-6.0.0/configuration/cache.mdx (new file, 154 lines)
@@ -0,0 +1,154 @@
|
||||
---
|
||||
title: Caching
|
||||
hide_title: true
|
||||
sidebar_position: 3
|
||||
version: 1
|
||||
---
|
||||
|
||||
# Caching
|
||||
|
||||
Superset uses [Flask-Caching](https://flask-caching.readthedocs.io/) for caching purposes.
|
||||
Flask-Caching supports various caching backends, including Redis (recommended), Memcached,
|
||||
SimpleCache (in-memory), or the local filesystem.
|
||||
[Custom cache backends](https://flask-caching.readthedocs.io/en/latest/#custom-cache-backends)
|
||||
are also supported.
|
||||
|
||||
Caching can be configured by providing dictionaries in
|
||||
`superset_config.py` that comply with [the Flask-Caching config specifications](https://flask-caching.readthedocs.io/en/latest/#configuring-flask-caching).
|
||||
|
||||
The following cache configurations can be customized in this way:
|
||||
|
||||
- Dashboard filter state (required): `FILTER_STATE_CACHE_CONFIG`
|
||||
- Explore chart form data (required): `EXPLORE_FORM_DATA_CACHE_CONFIG`
|
||||
- Metadata cache (optional): `CACHE_CONFIG`
|
||||
- Charting data queried from datasets (optional): `DATA_CACHE_CONFIG`
|
||||
|
||||
For example, to configure the filter state cache using Redis:
|
||||
|
||||
```python
|
||||
FILTER_STATE_CACHE_CONFIG = {
|
||||
'CACHE_TYPE': 'RedisCache',
|
||||
'CACHE_DEFAULT_TIMEOUT': 86400,
|
||||
'CACHE_KEY_PREFIX': 'superset_filter_cache',
|
||||
'CACHE_REDIS_URL': 'redis://localhost:6379/0'
|
||||
}
|
||||
```
|
||||
|
||||
## Dependencies
|
||||
|
||||
In order to use dedicated cache stores, additional Python libraries must be installed:
|
||||
|
||||
- For Redis: we recommend the [redis](https://pypi.python.org/pypi/redis) Python package
|
||||
- For Memcached: we recommend the [pylibmc](https://pypi.org/project/pylibmc/) client library, as
  `python-memcached` does not handle storing binary data correctly.
|
||||
|
||||
These libraries can be installed using pip.
|
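For example:

```bash
pip install redis pylibmc
```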
||||
|
||||
## Fallback Metastore Cache
|
||||
|
||||
Note that some form of Filter State and Explore caching is required. If either of these caches
is undefined, Superset falls back to using a built-in cache that stores data in the metadata
|
||||
database. While it is recommended to use a dedicated cache, the built-in cache can also be used
|
||||
to cache other data.
|
||||
|
||||
For example, to use the built-in cache to store chart data, use the following config:
|
||||
|
||||
```python
|
||||
DATA_CACHE_CONFIG = {
|
||||
"CACHE_TYPE": "SupersetMetastoreCache",
|
||||
"CACHE_KEY_PREFIX": "superset_results", # make sure this string is unique to avoid collisions
|
||||
"CACHE_DEFAULT_TIMEOUT": 86400, # 60 seconds * 60 minutes * 24 hours
|
||||
}
|
||||
```
|
||||
|
||||
## Chart Cache Timeout
|
||||
|
||||
The cache timeout for charts may be overridden by the settings for an individual chart, dataset, or
|
||||
database. Each of these configurations will be checked in order before falling back to the default
|
||||
value defined in `DATA_CACHE_CONFIG`.
|
||||
|
||||
Note that setting the cache timeout to `-1` disables caching for chart data, either per chart,
dataset, or database, or by default if set in `DATA_CACHE_CONFIG` (see the sketch below).
|
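A minimal sketch of disabling chart data caching globally, reusing the metastore cache config shown earlier:

```python
DATA_CACHE_CONFIG = {
    "CACHE_TYPE": "SupersetMetastoreCache",
    "CACHE_KEY_PREFIX": "superset_results",
    "CACHE_DEFAULT_TIMEOUT": -1,  # -1 disables caching for chart data
}
```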
||||
|
||||
## SQL Lab Query Results
|
||||
|
||||
Caching for SQL Lab query results is used when async queries are enabled and is configured using
|
||||
`RESULTS_BACKEND`.
|
||||
|
||||
Note that this configuration does not use a flask-caching dictionary for its configuration, but
|
||||
instead requires a cachelib object.
|
||||
|
||||
See [Async Queries via Celery](/docs/configuration/async-queries-celery) for details.
|
||||
|
||||
## Caching Thumbnails
|
||||
|
||||
This is an optional feature that can be turned on by activating its [feature flag](/docs/configuration/configuring-superset#feature-flags) on config:
|
||||
|
||||
```python
|
||||
FEATURE_FLAGS = {
|
||||
"THUMBNAILS": True,
|
||||
"THUMBNAILS_SQLA_LISTENERS": True,
|
||||
}
|
||||
```
|
||||
|
||||
By default thumbnails are rendered per user, and will fall back to the Selenium user for anonymous users.
|
||||
To always render thumbnails as a fixed user (`admin` in this example), use the following configuration:
|
||||
|
||||
```python
|
||||
from superset.tasks.types import FixedExecutor
|
||||
|
||||
THUMBNAIL_EXECUTORS = [FixedExecutor("admin")]
|
||||
```
|
||||
|
||||
For this feature you will need a cache system and Celery workers. All thumbnails are stored in the
cache and are processed asynchronously by the workers.
|
||||
|
||||
An example config where images are stored on S3 could be:
|
||||
|
||||
```python
|
||||
from flask import Flask
|
||||
from s3cache.s3cache import S3Cache
|
||||
|
||||
...
|
||||
|
||||
class CeleryConfig(object):
|
||||
broker_url = "redis://localhost:6379/0"
|
||||
imports = (
|
||||
"superset.sql_lab",
|
||||
"superset.tasks.thumbnails",
|
||||
)
|
||||
result_backend = "redis://localhost:6379/0"
|
||||
worker_prefetch_multiplier = 10
|
||||
task_acks_late = True
|
||||
|
||||
|
||||
CELERY_CONFIG = CeleryConfig
|
||||
|
||||
def init_thumbnail_cache(app: Flask) -> S3Cache:
|
||||
return S3Cache("bucket_name", 'thumbs_cache/')
|
||||
|
||||
|
||||
THUMBNAIL_CACHE_CONFIG = init_thumbnail_cache
|
||||
```
|
||||
|
||||
Using the above example, cache keys for dashboards will be `superset_thumb__dashboard__{ID}`. You can
override the base URL for Selenium using:
|
||||
|
||||
```python
|
||||
WEBDRIVER_BASEURL = "https://superset.company.com"
|
||||
```
|
||||
|
||||
Additional Selenium webdriver configuration can be set using `WEBDRIVER_CONFIGURATION`. You can
implement a custom function to authenticate Selenium. The default function uses the `flask-login`
session cookie. Here's an example of a custom function signature:
|
||||
|
||||
```python
|
||||
def auth_driver(driver: WebDriver, user: "User") -> WebDriver:
|
||||
pass
|
||||
```
|
||||
|
||||
Then in your configuration:
|
||||
|
||||
```python
|
||||
WEBDRIVER_AUTH_FUNC = auth_driver
|
||||
```
|
||||
@@ -0,0 +1,548 @@
|
||||
---
|
||||
title: Configuring Superset
|
||||
hide_title: true
|
||||
sidebar_position: 1
|
||||
version: 1
|
||||
---
|
||||
|
||||
# Configuring Superset
|
||||
|
||||
## superset_config.py
|
||||
|
||||
Superset exposes hundreds of configurable parameters through its
|
||||
[config.py module](https://github.com/apache/superset/blob/master/superset/config.py). The
|
||||
variables and objects exposed act as a public interface of the bulk of what you may want
|
||||
to configure, alter and interface with. In this python module, you'll find all these
|
||||
parameters, sensible defaults, as well as rich documentation in the form of comments.
|
||||
|
||||
To configure your application, you need to create your own configuration module, which
|
||||
will allow you to override a few or many of these parameters. Instead of altering the core module,
|
||||
you'll want to define your own module (typically a file named `superset_config.py`).
|
||||
Add this file to your `PYTHONPATH` or create an environment variable
|
||||
`SUPERSET_CONFIG_PATH` specifying the full path of the `superset_config.py`.
|
||||
|
||||
For example, if deploying Superset directly on a Linux-based system where your
`superset_config.py` is under the `/app` directory, you can run:
|
||||
|
||||
```bash
|
||||
export SUPERSET_CONFIG_PATH=/app/superset_config.py
|
||||
```
|
||||
|
||||
If you are using your own custom Dockerfile with the official Superset image as base image,
|
||||
then you can add your overrides as shown below:
|
||||
|
||||
```docker
|
||||
COPY --chown=superset superset_config.py /app/
|
||||
ENV SUPERSET_CONFIG_PATH /app/superset_config.py
|
||||
```
|
||||
|
||||
Docker compose deployments handle application configuration differently using specific conventions.
|
||||
Refer to the [docker compose tips & configuration](/docs/installation/docker-compose#docker-compose-tips--configuration)
|
||||
for details.
|
||||
|
||||
The following is an example of just a few of the parameters you can set in your `superset_config.py` file:
|
||||
|
||||
```python
|
||||
# Superset specific config
|
||||
ROW_LIMIT = 5000
|
||||
|
||||
# Flask App Builder configuration
|
||||
# Your App secret key will be used for securely signing the session cookie
|
||||
# and encrypting sensitive information on the database
|
||||
# Make sure you are changing this key for your deployment with a strong key.
|
||||
# Alternatively you can set it with `SUPERSET_SECRET_KEY` environment variable.
|
||||
# You MUST set this for production environments or the server will refuse
|
||||
# to start and you will see an error in the logs accordingly.
|
||||
SECRET_KEY = 'YOUR_OWN_RANDOM_GENERATED_SECRET_KEY'
|
||||
|
||||
# The SQLAlchemy connection string to your database backend
|
||||
# This connection defines the path to the database that stores your
|
||||
# superset metadata (slices, connections, tables, dashboards, ...).
|
||||
# Note that the connection information to connect to the datasources
|
||||
# you want to explore are managed directly in the web UI
|
||||
# The check_same_thread=false property ensures the sqlite client does not attempt
|
||||
# to enforce single-threaded access, which may be problematic in some edge cases
|
||||
SQLALCHEMY_DATABASE_URI = 'sqlite:////path/to/superset.db?check_same_thread=false'
|
||||
|
||||
# Flask-WTF flag for CSRF
|
||||
WTF_CSRF_ENABLED = True
|
||||
# Add endpoints that need to be exempt from CSRF protection
|
||||
WTF_CSRF_EXEMPT_LIST = []
|
||||
# A CSRF token that expires in 1 year
|
||||
WTF_CSRF_TIME_LIMIT = 60 * 60 * 24 * 365
|
||||
|
||||
# Set this API key to enable Mapbox visualizations
|
||||
MAPBOX_API_KEY = ''
|
||||
```
|
||||
|
||||
:::tip
|
||||
Note that it is typical to copy and paste only the portions of the
|
||||
core [superset/config.py](https://github.com/apache/superset/blob/master/superset/config.py) that
|
||||
you want to alter, along with the related comments into your own `superset_config.py` file.
|
||||
:::
|
||||
|
||||
All the parameters and default values defined
|
||||
in [superset/config.py](https://github.com/apache/superset/blob/master/superset/config.py)
|
||||
can be altered in your local `superset_config.py`. Administrators will want to read through the file
|
||||
to understand what can be configured locally as well as the default values in place.
|
||||
|
||||
Since `superset_config.py` acts as a Flask configuration module, it can be used to alter the
|
||||
settings of Flask itself, as well as Flask extensions that Superset bundles like
|
||||
`flask-wtf`, `flask-caching`, `flask-migrate`,
|
||||
and `flask-appbuilder`. Each one of these extensions offers intricate configurability.
|
||||
Flask App Builder, the web framework used by Superset, also offers many
|
||||
configuration settings. Please consult the
|
||||
[Flask App Builder Documentation](https://flask-appbuilder.readthedocs.org/en/latest/config.html)
|
||||
for more information on how to configure it.
|
||||
|
||||
At the very least, you'll want to change `SECRET_KEY` and `SQLALCHEMY_DATABASE_URI`. Continue reading for more about each of these.
|
||||
|
||||
## Specifying a SECRET_KEY
|
||||
|
||||
### Adding an initial SECRET_KEY
|
||||
|
||||
Superset requires a user-specified SECRET_KEY to start up. This requirement was [added in version 2.1.0 to force secure configurations](https://preset.io/blog/superset-security-update-default-secret_key-vulnerability/). Add a strong SECRET_KEY to your `superset_config.py` file like:
|
||||
|
||||
```python
|
||||
SECRET_KEY = 'YOUR_OWN_RANDOM_GENERATED_SECRET_KEY'
|
||||
```
|
||||
|
||||
You can generate a strong secure key with `openssl rand -base64 42`.
|
||||
|
||||
:::caution Use a strong secret key
|
||||
This key will be used for securely signing session cookies and encrypting sensitive information stored in Superset's application metadata database.
|
||||
Your deployment must use a complex, unique key.
|
||||
:::
|
||||
|
||||
### Rotating to a newer SECRET_KEY
|
||||
|
||||
If you wish to change your existing SECRET_KEY, add the existing SECRET_KEY to your `superset_config.py` file as
|
||||
`PREVIOUS_SECRET_KEY =` and provide your new key as `SECRET_KEY =`. You can find your current SECRET_KEY with these
|
||||
commands - if running Superset with Docker, execute from within the Superset application container:
|
||||
|
||||
```python
|
||||
# From your terminal, open a Flask shell inside the Superset environment:
#   superset shell
# Then, inside the shell, print the current key:
from flask import current_app; print(current_app.config["SECRET_KEY"])
|
||||
```
|
||||
|
||||
Save your `superset_config.py` with these values and then run `superset re-encrypt-secrets`.
|
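Putting it together, a minimal rotation sketch (placeholder values):

```python
PREVIOUS_SECRET_KEY = 'CURRENT_KEY_PRINTED_ABOVE'      # the key Superset uses today
SECRET_KEY = 'YOUR_NEW_RANDOM_GENERATED_SECRET_KEY'    # e.g. from `openssl rand -base64 42`
```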
||||
|
||||
## Setting up a production metadata database
|
||||
|
||||
Superset needs a database to store the information it manages, like the definitions of
|
||||
charts, dashboards, and many other things.
|
||||
|
||||
By default, Superset is configured to use [SQLite](https://www.sqlite.org/),
|
||||
a self-contained, single-file database that offers a simple and fast way to get started
|
||||
(without requiring any installation). However, for production environments,
|
||||
using SQLite is highly discouraged for security, scalability, and data integrity reasons.
|
||||
It's important to use only the supported database engines and consider using a different
|
||||
database engine on a separate host or container.
|
||||
|
||||
Superset supports the following database engines/versions:
|
||||
|
||||
| Database Engine | Supported Versions |
|
||||
| ----------------------------------------- | ---------------------------------------- |
|
||||
| [PostgreSQL](https://www.postgresql.org/) | 10.X, 11.X, 12.X, 13.X, 14.X, 15.X, 16.X |
|
||||
| [MySQL](https://www.mysql.com/) | 5.7, 8.X |
|
||||
|
||||
Use the following database drivers and connection strings:
|
||||
|
||||
| Database | PyPI package | Connection String |
|
||||
| ----------------------------------------- | ------------------------- | ---------------------------------------------------------------------- |
|
||||
| [PostgreSQL](https://www.postgresql.org/) | `pip install psycopg2` | `postgresql://<UserName>:<DBPassword>@<Database Host>/<Database Name>` |
|
||||
| [MySQL](https://www.mysql.com/) | `pip install mysqlclient` | `mysql://<UserName>:<DBPassword>@<Database Host>/<Database Name>` |
|
||||
|
||||
:::tip
|
||||
Properly setting up the metadata store is beyond the scope of this documentation. We recommend
|
||||
using a hosted managed service such as [Amazon RDS](https://aws.amazon.com/rds/) or
|
||||
[Google Cloud Databases](https://cloud.google.com/products/databases?hl=en) to handle
|
||||
service and supporting infrastructure and backup strategy.
|
||||
:::
|
||||
|
||||
To configure the Superset metastore, set the `SQLALCHEMY_DATABASE_URI` config key in
`superset_config.py` to the appropriate connection string.
|
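For example, for PostgreSQL (placeholder credentials, following the table above):

```python
SQLALCHEMY_DATABASE_URI = 'postgresql://superset:<DBPassword>@<Database Host>/superset'
```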
||||
|
||||
## Running on a WSGI HTTP Server
|
||||
|
||||
While you can run Superset on NGINX or Apache, we recommend using Gunicorn in async mode. This
enables impressive concurrency and is fairly easy to install and configure. Please refer to the
|
||||
documentation of your preferred technology to set up this Flask WSGI application in a way that works
|
||||
well in your environment. Here’s an async setup known to work well in production:
|
||||
|
||||
```bash
gunicorn \
      -w 10 \
|
||||
-k gevent \
|
||||
--worker-connections 1000 \
|
||||
--timeout 120 \
|
||||
-b 0.0.0.0:6666 \
|
||||
--limit-request-line 0 \
|
||||
--limit-request-field_size 0 \
|
||||
--statsd-host localhost:8125 \
|
||||
"superset.app:create_app()"
|
||||
```
|
||||
|
||||
Refer to the [Gunicorn documentation](https://docs.gunicorn.org/en/stable/design.html) for more
|
||||
information. _Note that the development web server (`superset run` or `flask run`) is not intended
|
||||
for production use._
|
||||
|
||||
If you're not using Gunicorn, you may want to disable the use of `flask-compress` by setting
|
||||
`COMPRESS_REGISTER = False` in your `superset_config.py`.
|
||||
|
||||
Currently, the Google BigQuery Python SDK is not compatible with `gevent`, due to gevent's dynamic
monkeypatching of the Python core library. So, if you use a BigQuery datasource in Superset, you must
use a `gunicorn` worker type other than `gevent`.
|
||||
|
||||
## HTTPS Configuration
|
||||
|
||||
You can configure HTTPS upstream via a load balancer or a reverse proxy (such as nginx) and do SSL/TLS Offloading before traffic reaches the Superset application. In this setup, local traffic from a Celery worker taking a snapshot of a chart for Alerts & Reports can access Superset at a `http://` URL, from behind the ingress point.
|
||||
You can also configure [SSL in Gunicorn](https://docs.gunicorn.org/en/stable/settings.html#ssl) (the Python webserver) if you are using an official Superset Docker image.
|
||||
|
||||
## Configuration Behind a Load Balancer
|
||||
|
||||
If you are running superset behind a load balancer or reverse proxy (e.g. NGINX or ELB on AWS), you
|
||||
may need to utilize a healthcheck endpoint so that your load balancer knows if your superset
|
||||
instance is running. This is provided at `/health` which will return a 200 response containing “OK”
|
||||
if the webserver is running.
|
||||
|
||||
If the load balancer is inserting `X-Forwarded-For/X-Forwarded-Proto` headers, you should set
|
||||
`ENABLE_PROXY_FIX = True` in the superset config file (`superset_config.py`) to extract and use the
|
||||
headers.
|
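For example, in `superset_config.py` (the `PROXY_FIX_CONFIG` knob is optional and mirrors the defaults in `superset/config.py`; adjust the hop counts to your topology):

```python
ENABLE_PROXY_FIX = True
# number of values to trust for each X-Forwarded-* header
PROXY_FIX_CONFIG = {"x_for": 1, "x_proto": 1, "x_host": 1, "x_port": 1, "x_prefix": 1}
```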
||||
|
||||
In case the reverse proxy is used for providing SSL encryption, an explicit definition of the
|
||||
`X-Forwarded-Proto` may be required. For the Apache webserver this can be set as follows:
|
||||
|
||||
```apache
|
||||
RequestHeader set X-Forwarded-Proto "https"
|
||||
```
|
||||
|
||||
## Configuring the application root
|
||||
|
||||
*Please be advised that this feature is in BETA.*
|
||||
|
||||
Superset supports running the application under a non-root path. The root path
|
||||
prefix can be specified in one of two ways:
|
||||
|
||||
- Setting the `SUPERSET_APP_ROOT` environment variable to the desired prefix.
|
||||
- Customizing the [Flask entrypoint](https://github.com/apache/superset/blob/master/superset/app.py#L29)
|
||||
by passing the `superset_app_root` variable.
|
||||
|
||||
Note, the prefix should start with a `/`.
|
||||
|
||||
### Customizing the Flask entrypoint
|
||||
|
||||
To configure a prefix, e.g. `/analytics`, pass the `superset_app_root` argument to
|
||||
`create_app` when calling flask run either through the `FLASK_APP`
|
||||
environment variable:
|
||||
|
||||
```sh
|
||||
FLASK_APP="superset:create_app(superset_app_root='/analytics')"
|
||||
```
|
||||
|
||||
or as part of the `--app` argument to `flask run`:
|
||||
|
||||
```sh
|
||||
flask --app "superset.app:create_app(superset_app_root='/analytics')" run
|
||||
```
|
||||
|
||||
### Docker builds
|
||||
|
||||
The [docker compose](/docs/installation/docker-compose#configuring-further) developer
|
||||
configuration includes an additional environment variable,
|
||||
[`SUPERSET_APP_ROOT`](https://github.com/apache/superset/blob/master/docker/.env),
|
||||
to simplify the process of setting up a non-default root path across the services.
|
||||
|
||||
In `docker/.env-local` set `SUPERSET_APP_ROOT` to the desired prefix and then bring the
|
||||
services up with `docker compose up --detach`.
|
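For example (the prefix is an arbitrary illustration):

```bash
echo 'SUPERSET_APP_ROOT="/analytics"' >> docker/.env-local
docker compose up --detach
```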
||||
|
||||
## Custom OAuth2 Configuration
|
||||
|
||||
Superset is built on Flask-AppBuilder (FAB), which supports many providers out of the box
|
||||
(GitHub, Twitter, LinkedIn, Google, Azure, etc). Beyond those, Superset can be configured to connect
|
||||
with other OAuth2 Authorization Server implementations that support “code” authorization.
|
||||
|
||||
Make sure the pip package [`Authlib`](https://authlib.org/) is installed on the webserver.
|
||||
|
||||
First, configure authorization in Superset `superset_config.py`.
|
||||
|
||||
```python
|
||||
from flask_appbuilder.security.manager import AUTH_OAUTH
|
||||
|
||||
# Set the authentication type to OAuth
|
||||
AUTH_TYPE = AUTH_OAUTH
|
||||
|
||||
OAUTH_PROVIDERS = [
|
||||
{ 'name':'egaSSO',
|
||||
'token_key':'access_token', # Name of the token in the response of access_token_url
|
||||
'icon':'fa-address-card', # Icon for the provider
|
||||
'remote_app': {
|
||||
'client_id':'myClientId', # Client Id (Identify Superset application)
|
||||
'client_secret':'MySecret', # Secret for this Client Id (Identify Superset application)
|
||||
'client_kwargs':{
|
||||
'scope': 'read' # Scope for the Authorization
|
||||
},
|
||||
'access_token_method':'POST', # HTTP Method to call access_token_url
|
||||
'access_token_params':{ # Additional parameters for calls to access_token_url
|
||||
'client_id':'myClientId'
|
||||
},
|
||||
'jwks_uri':'https://myAuthorizationServe/adfs/discovery/keys', # may be required to generate token
|
||||
'access_token_headers':{ # Additional headers for calls to access_token_url
|
||||
'Authorization': 'Basic Base64EncodedClientIdAndSecret'
|
||||
},
|
||||
'api_base_url':'https://myAuthorizationServer/oauth2AuthorizationServer/',
|
||||
'access_token_url':'https://myAuthorizationServer/oauth2AuthorizationServer/token',
|
||||
'authorize_url':'https://myAuthorizationServer/oauth2AuthorizationServer/authorize'
|
||||
}
|
||||
}
|
||||
]
|
||||
|
||||
# Allow user self-registration, creating Flask users from an authorized user
|
||||
AUTH_USER_REGISTRATION = True
|
||||
|
||||
# The default user self registration role
|
||||
AUTH_USER_REGISTRATION_ROLE = "Public"
|
||||
```
|
||||
|
||||
In case you want to assign the `Admin` role on new user registration, it can be assigned as follows:
|
||||
```python
|
||||
AUTH_USER_REGISTRATION_ROLE = "Admin"
|
||||
```
|
||||
If you encounter the [issue](https://github.com/apache/superset/issues/13243) of not being able to list users from the Superset main page settings, although a newly registered user has an `Admin` role, please re-run `superset init` to sync the required permissions. Below is the command to re-run `superset init` using docker compose.
|
||||
```bash
|
||||
docker-compose exec superset superset init
|
||||
```
|
||||
|
||||
Then, create a `CustomSsoSecurityManager` that extends `SupersetSecurityManager` and overrides
|
||||
`oauth_user_info`:
|
||||
|
||||
```python
|
||||
import logging
|
||||
from superset.security import SupersetSecurityManager
|
||||
|
||||
class CustomSsoSecurityManager(SupersetSecurityManager):
|
||||
|
||||
def oauth_user_info(self, provider, response=None):
|
||||
logging.debug("Oauth2 provider: {0}.".format(provider))
|
||||
if provider == 'egaSSO':
|
||||
            # As an example, this line requests a GET to api_base_url + 'userDetails' with Bearer authentication,
            # and expects the authorization server to validate the token and respond with the user details
|
||||
me = self.appbuilder.sm.oauth_remotes[provider].get('userDetails').data
|
||||
logging.debug("user_data: {0}".format(me))
|
||||
return { 'name' : me['name'], 'email' : me['email'], 'id' : me['user_name'], 'username' : me['user_name'], 'first_name':'', 'last_name':''}
|
||||
...
|
||||
```
|
||||
|
||||
This file must be located in the same directory as `superset_config.py` with the name
|
||||
`custom_sso_security_manager.py`. Finally, add the following 2 lines to `superset_config.py`:
|
||||
|
||||
```python
|
||||
from custom_sso_security_manager import CustomSsoSecurityManager
|
||||
CUSTOM_SECURITY_MANAGER = CustomSsoSecurityManager
|
||||
```
|
||||
|
||||
**Notes**
|
||||
|
||||
- When configuring an OAuth2 authorization provider, the redirect URL will be
  `https://<superset-webserver>/oauth-authorized/<provider-name>`. For instance, the redirect URL
  will be `https://<superset-webserver>/oauth-authorized/egaSSO` for the above configuration.
|
||||
|
||||
- If an OAuth2 authorization server supports OpenID Connect 1.0, you can configure only its
  configuration document URL, without providing `api_base_url`, `access_token_url`, `authorize_url`,
  and other required options like the user info endpoint or jwks_uri. For instance:
|
||||
|
||||
```python
|
||||
OAUTH_PROVIDERS = [
|
||||
{ 'name':'egaSSO',
|
||||
'token_key':'access_token', # Name of the token in the response of access_token_url
|
||||
'icon':'fa-address-card', # Icon for the provider
|
||||
'remote_app': {
|
||||
'client_id':'myClientId', # Client Id (Identify Superset application)
|
||||
'client_secret':'MySecret', # Secret for this Client Id (Identify Superset application)
|
||||
'server_metadata_url': 'https://myAuthorizationServer/.well-known/openid-configuration'
|
||||
}
|
||||
}
|
||||
]
|
||||
```
|
||||
|
||||
### Keycloak-Specific Configuration using Flask-OIDC
|
||||
|
||||
If you are using Keycloak as OpenID Connect 1.0 Provider, the above configuration based on [`Authlib`](https://authlib.org/) might not work. In this case using [`Flask-OIDC`](https://pypi.org/project/flask-oidc/) is a viable option.
|
||||
|
||||
Make sure the pip package [`Flask-OIDC`](https://pypi.org/project/flask-oidc/) is installed on the webserver. This was successfully tested using version 2.2.0. This package requires [`Flask-OpenID`](https://pypi.org/project/Flask-OpenID/) as a dependency.
|
||||
|
||||
The following code defines a new security manager. Add it to a new file named `keycloak_security_manager.py`, placed in the same directory as your `superset_config.py` file.
|
||||
|
||||
```python
|
||||
from flask_appbuilder.security.manager import AUTH_OID
|
||||
from superset.security import SupersetSecurityManager
|
||||
from flask_oidc import OpenIDConnect
|
||||
from flask_appbuilder.security.views import AuthOIDView
|
||||
from flask_login import login_user
|
||||
from urllib.parse import quote
|
||||
from flask_appbuilder.views import ModelView, SimpleFormView, expose
|
||||
from flask import (
|
||||
redirect,
|
||||
request
|
||||
)
|
||||
import logging
|
||||
|
||||
class OIDCSecurityManager(SupersetSecurityManager):
|
||||
|
||||
def __init__(self, appbuilder):
|
||||
super(OIDCSecurityManager, self).__init__(appbuilder)
|
||||
if self.auth_type == AUTH_OID:
|
||||
self.oid = OpenIDConnect(self.appbuilder.get_app)
|
||||
self.authoidview = AuthOIDCView
|
||||
|
||||
class AuthOIDCView(AuthOIDView):
|
||||
|
||||
@expose('/login/', methods=['GET', 'POST'])
|
||||
def login(self, flag=True):
|
||||
sm = self.appbuilder.sm
|
||||
oidc = sm.oid
|
||||
|
||||
@self.appbuilder.sm.oid.require_login
|
||||
def handle_login():
|
||||
user = sm.auth_user_oid(oidc.user_getfield('email'))
|
||||
|
||||
if user is None:
|
||||
info = oidc.user_getinfo(['preferred_username', 'given_name', 'family_name', 'email'])
|
||||
user = sm.add_user(info.get('preferred_username'), info.get('given_name'), info.get('family_name'),
|
||||
info.get('email'), sm.find_role('Gamma'))
|
||||
|
||||
login_user(user, remember=False)
|
||||
return redirect(self.appbuilder.get_url_for_index)
|
||||
|
||||
return handle_login()
|
||||
|
||||
@expose('/logout/', methods=['GET', 'POST'])
|
||||
def logout(self):
|
||||
oidc = self.appbuilder.sm.oid
|
||||
|
||||
oidc.logout()
|
||||
super(AuthOIDCView, self).logout()
|
||||
redirect_url = request.url_root.strip('/') + self.appbuilder.get_url_for_login
|
||||
|
||||
return redirect(
|
||||
oidc.client_secrets.get('issuer') + '/protocol/openid-connect/logout?redirect_uri=' + quote(redirect_url))
|
||||
```
|
||||
|
||||
Then add to your `superset_config.py` file:
|
||||
|
||||
```python
|
||||
from keycloak_security_manager import OIDCSecurityManager
|
||||
from flask_appbuilder.security.manager import AUTH_OID, AUTH_REMOTE_USER, AUTH_DB, AUTH_LDAP, AUTH_OAUTH
|
||||
import os
|
||||
|
||||
AUTH_TYPE = AUTH_OID
|
||||
SECRET_KEY = 'SomethingNotEntirelySecret'
OIDC_CLIENT_SECRETS = '/path/to/client_secret.json'
OIDC_ID_TOKEN_COOKIE_SECURE = False
OIDC_OPENID_REALM = '<myRealm>'
OIDC_INTROSPECTION_AUTH_METHOD = 'client_secret_post'
|
||||
CUSTOM_SECURITY_MANAGER = OIDCSecurityManager
|
||||
|
||||
# Allow user self-registration, creating Flask users from an authorized user
|
||||
AUTH_USER_REGISTRATION = True
|
||||
|
||||
# The default user self registration role
|
||||
AUTH_USER_REGISTRATION_ROLE = 'Public'
|
||||
```
|
||||
|
||||
Store your client-specific OpenID information in a file called `client_secret.json`. Create this file in the same directory as `superset_config.py`:
|
||||
|
||||
```json
|
||||
{
|
||||
"<myOpenIDProvider>": {
|
||||
"issuer": "https://<myKeycloakDomain>/realms/<myRealm>",
|
||||
"auth_uri": "https://<myKeycloakDomain>/realms/<myRealm>/protocol/openid-connect/auth",
|
||||
"client_id": "https://<myKeycloakDomain>",
|
||||
"client_secret": "<myClientSecret>",
|
||||
"redirect_uris": [
|
||||
"https://<SupersetWebserver>/oauth-authorized/<myOpenIDProvider>"
|
||||
],
|
||||
"userinfo_uri": "https://<myKeycloakDomain>/realms/<myRealm>/protocol/openid-connect/userinfo",
|
||||
"token_uri": "https://<myKeycloakDomain>/realms/<myRealm>/protocol/openid-connect/token",
|
||||
"token_introspection_uri": "https://<myKeycloakDomain>/realms/<myRealm>/protocol/openid-connect/token/introspect"
|
||||
}
|
||||
}
|
||||
```
|
||||
|
||||
## LDAP Authentication
|
||||
|
||||
FAB supports authenticating user credentials against an LDAP server.
|
||||
To use LDAP you must install the [python-ldap](https://www.python-ldap.org/en/latest/installing.html) package.
|
||||
See [FAB's LDAP documentation](https://flask-appbuilder.readthedocs.io/en/latest/security.html#authentication-ldap)
|
||||
for details.
|
||||
|
||||
## Mapping LDAP or OAUTH groups to Superset roles
|
||||
|
||||
`AUTH_ROLES_MAPPING` in Flask-AppBuilder is a dictionary that maps LDAP/OAUTH group names to FAB roles.
|
||||
It is used to assign roles to users who authenticate using LDAP or OAuth.
|
||||
|
||||
### Mapping OAUTH groups to Superset roles
|
||||
|
||||
The following `AUTH_ROLES_MAPPING` dictionary would map the OAUTH group "superset_users" to the Superset roles "Gamma" as well as "Alpha", and the OAUTH group "superset_admins" to the Superset role "Admin".
|
||||
|
||||
```python
|
||||
AUTH_ROLES_MAPPING = {
|
||||
"superset_users": ["Gamma","Alpha"],
|
||||
"superset_admins": ["Admin"],
|
||||
}
|
||||
```
|
||||
|
||||
### Mapping LDAP groups to Superset roles
|
||||
|
||||
The following `AUTH_ROLES_MAPPING` dictionary would map the LDAP DN "cn=superset_users,ou=groups,dc=example,dc=com" to the Superset roles "Gamma" as well as "Alpha", and the LDAP DN "cn=superset_admins,ou=groups,dc=example,dc=com" to the Superset role "Admin".
|
||||
|
||||
```python
|
||||
AUTH_ROLES_MAPPING = {
|
||||
"cn=superset_users,ou=groups,dc=example,dc=com": ["Gamma","Alpha"],
|
||||
"cn=superset_admins,ou=groups,dc=example,dc=com": ["Admin"],
|
||||
}
|
||||
```
|
||||
|
||||
Note: This requires `AUTH_LDAP_SEARCH` to be set. For more details, please see the [FAB Security documentation](https://flask-appbuilder.readthedocs.io/en/latest/security.html).
|
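A minimal sketch, assuming the example DNs above (the search base is a placeholder for your directory layout):

```python
AUTH_LDAP_SEARCH = "ou=people,dc=example,dc=com"
```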
||||
|
||||
### Syncing roles at login
|
||||
|
||||
You can also use the `AUTH_ROLES_SYNC_AT_LOGIN` configuration variable to control how often Flask-AppBuilder syncs the user's roles with the LDAP/OAUTH groups. If `AUTH_ROLES_SYNC_AT_LOGIN` is set to True, Flask-AppBuilder will sync the user's roles each time they log in. If `AUTH_ROLES_SYNC_AT_LOGIN` is set to False, Flask-AppBuilder will only sync the user's roles when they first register.
|
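For example, to re-sync roles on every login:

```python
AUTH_ROLES_SYNC_AT_LOGIN = True
```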
||||
|
||||
## Flask app Configuration Hook
|
||||
|
||||
`FLASK_APP_MUTATOR` is a configuration function that can be provided in your environment, receives
|
||||
the app object and can alter it in any way. For example, add a `FLASK_APP_MUTATOR` to your
`superset_config.py` to set the session cookie expiration time to 24 hours:
|
||||
|
||||
```python
|
||||
from datetime import timedelta
from flask import session
from flask import Flask
|
||||
|
||||
|
||||
def make_session_permanent():
|
||||
'''
|
||||
Enable maxAge for the cookie 'session'
|
||||
'''
|
||||
session.permanent = True
|
||||
|
||||
# Set up max age of session to 24 hours
|
||||
PERMANENT_SESSION_LIFETIME = timedelta(hours=24)
|
||||
def FLASK_APP_MUTATOR(app: Flask) -> None:
|
||||
app.before_request_funcs.setdefault(None, []).append(make_session_permanent)
|
||||
```
|
||||
|
||||
## Feature Flags
|
||||
|
||||
To support a diverse set of users, Superset has some features that are not enabled by default. For
|
||||
example, some users have stronger security restrictions, while others may not. So Superset allows
administrators to enable or disable some features by config. For feature owners, this makes it
possible to add optional functionality to Superset that only affects a subset of users.
|
||||
|
||||
You can enable or disable features with feature flags in `superset_config.py`:
|
||||
|
||||
```python
|
||||
FEATURE_FLAGS = {
|
||||
'PRESTO_EXPAND_DATA': False,
|
||||
}
|
||||
```
|
||||
|
||||
A current list of feature flags can be found in [RESOURCES/FEATURE_FLAGS.md](https://github.com/apache/superset/blob/master/RESOURCES/FEATURE_FLAGS.md).
|
||||
@@ -0,0 +1,40 @@
|
||||
---
|
||||
title: Country Map Tools
|
||||
sidebar_position: 10
|
||||
version: 1
|
||||
---
|
||||
|
||||
import countriesData from '../../../data/countries.json';
|
||||
|
||||
# The Country Map Visualization
|
||||
|
||||
The Country Map visualization allows you to plot lightweight choropleth maps of
your countries by province, state, or other subdivision type. It does not rely
on any third-party map services but requires you to provide the
[ISO-3166-2](https://en.wikipedia.org/wiki/ISO_3166-2) codes of your country's
top-level subdivisions. Compared to a province or state's full name, the ISO
code is less ambiguous and is unique across all regions of the world.
|
||||
|
||||
## Included Maps
|
||||
|
||||
The current list of countries can be found in the source file
|
||||
[legacy-plugin-chart-country-map/src/countries.ts](https://github.com/apache/superset/blob/master/superset-frontend/plugins/legacy-plugin-chart-country-map/src/countries.ts)
|
||||
|
||||
The Country Maps visualization already ships with the maps for the following countries:
|
||||
|
||||
<ul style={{columns: 3}}>
|
||||
{countriesData.countries.map((country, index) => (
|
||||
<li key={index}>{country}</li>
|
||||
))}
|
||||
</ul>
|
||||
|
||||
## Adding a New Country
|
||||
|
||||
To add a new country to the list, you'd have to edit files in
|
||||
[@superset-ui/legacy-plugin-chart-country-map](https://github.com/apache/superset/tree/master/superset-frontend/plugins/legacy-plugin-chart-country-map).
|
||||
|
||||
1. Generate a new GeoJSON file for your country following the guide in [this Jupyter notebook](https://github.com/apache/superset/blob/master/superset-frontend/plugins/legacy-plugin-chart-country-map/scripts/Country%20Map%20GeoJSON%20Generator.ipynb).
|
||||
2. Edit the countries list in [legacy-plugin-chart-country-map/src/countries.ts](https://github.com/apache/superset/blob/master/superset-frontend/plugins/legacy-plugin-chart-country-map/src/countries.ts).
|
||||
3. Install superset-frontend dependencies: `cd superset-frontend && npm install`
|
||||
4. Verify your countries in Superset plugins storybook: `npm run plugins:storybook`.
|
||||
5. Build and install Superset from source code.
|
||||
docs/versioned_docs/version-6.0.0/configuration/databases.mdx (new file, 1815 lines; diff too large to display)
@@ -0,0 +1,62 @@
|
||||
---
|
||||
title: Event Logging
|
||||
sidebar_position: 9
|
||||
version: 1
|
||||
---
|
||||
|
||||
# Logging
|
||||
|
||||
## Event Logging
|
||||
|
||||
Superset by default logs special action events in its internal database (DBEventLogger). These logs can be accessed
in the UI by navigating to **Security > Action Log**. You can freely customize these logs by
implementing your own event log class.
**When a custom log class is enabled, DBEventLogger is disabled and logs
stop being populated in the UI logs view.**
To achieve both, the custom log class should extend the built-in DBEventLogger class.
|
||||
|
||||
Here's an example of a simple JSON-to-stdout class:
|
||||
|
||||
```python
import json

from superset.utils.log import AbstractEventLogger


class JSONStdOutEventLogger(AbstractEventLogger):
    # Extend DBEventLogger instead (and call super().log) if you also want
    # the Action Log UI to keep being populated.

    def log(self, user_id, action, *args, **kwargs):
        records = kwargs.get('records', list())
        dashboard_id = kwargs.get('dashboard_id')
        slice_id = kwargs.get('slice_id')
        duration_ms = kwargs.get('duration_ms')
        referrer = kwargs.get('referrer')

        for record in records:
            entry = dict(
                action=action,
                json=record,
                dashboard_id=dashboard_id,
                slice_id=slice_id,
                duration_ms=duration_ms,
                referrer=referrer,
                user_id=user_id
            )
            print(json.dumps(entry))
```
|
||||
|
||||
End by updating your config to pass in an instance of the logger you want to use:
|
||||
|
||||
```python
|
||||
EVENT_LOGGER = JSONStdOutEventLogger()
|
||||
```
|
||||
|
||||
## StatsD Logging
|
||||
|
||||
Superset can be configured to log events to [StatsD](https://github.com/statsd/statsd)
|
||||
if desired. Most endpoints hit are logged as
|
||||
well as key events like query start and end in SQL Lab.
|
||||
|
||||
To set up StatsD logging, configure the logger in your `superset_config.py`.
If not already present, ensure that the `statsd` package is installed in Superset's Python environment.
|
||||
|
||||
```python
|
||||
from superset.stats_logger import StatsdStatsLogger
|
||||
STATS_LOGGER = StatsdStatsLogger(host='localhost', port=8125, prefix='superset')
|
||||
```
|
||||
|
||||
Note that it’s also possible to implement your own logger by deriving
|
||||
`superset.stats_logger.BaseStatsLogger`.
|
||||
@@ -0,0 +1,126 @@
|
||||
---
|
||||
title: Importing and Exporting Datasources
|
||||
hide_title: true
|
||||
sidebar_position: 11
|
||||
version: 1
|
||||
---
|
||||
|
||||
# Importing and Exporting Datasources
|
||||
|
||||
The Superset CLI allows you to import and export datasources from and to YAML. Datasources include
|
||||
databases. The data is expected to be organized in the following hierarchy:
|
||||
|
||||
```text
|
||||
├──databases
|
||||
| ├──database_1
|
||||
| | ├──table_1
|
||||
| | | ├──columns
|
||||
| | | | ├──column_1
|
||||
| | | | ├──column_2
|
||||
| | | | └──... (more columns)
|
||||
| | | └──metrics
|
||||
| | | ├──metric_1
|
||||
| | | ├──metric_2
|
||||
| | | └──... (more metrics)
|
||||
| | └── ... (more tables)
|
||||
| └── ... (more databases)
|
||||
```
|
||||
|
||||
## Exporting Datasources to YAML
|
||||
|
||||
You can print your current datasources to stdout by running:
|
||||
|
||||
```bash
|
||||
superset export_datasources
|
||||
```
|
||||
|
||||
To save your datasources to a ZIP file run:
|
||||
|
||||
```bash
|
||||
superset export_datasources -f <filename>
|
||||
```
|
||||
|
||||
By default, default (null) values are omitted. Use the `-d` flag to include them. If you want back
references to be included (e.g. for a column to include the id of the table it belongs to), use the `-b` flag.
|
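For example, to export everything to a ZIP file, including defaults and back references (the filename is arbitrary):

```bash
superset export_datasources -f datasources.zip -d -b
```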
||||
|
||||
Alternatively, you can export datasources using the UI:
|
||||
|
||||
1. Open **Sources -> Databases** to export all tables associated to a single or multiple databases.
|
||||
(**Tables** for one or more tables)
|
||||
2. Select the items you would like to export.
|
||||
3. Click **Actions -> Export to YAML**
|
||||
4. If you want to import an item that you exported through the UI, you will need to nest it inside
   its parent element, e.g. a database needs to be nested under `databases`, and a table needs to be
   nested inside a `database` element.
|
||||
|
||||
To obtain an **exhaustive list of all fields** you can import using the YAML import, run:
|
||||
|
||||
```bash
|
||||
superset export_datasource_schema
|
||||
```
|
||||
|
||||
As a reminder, you can use the `-b` flag to include back references.
|
||||
|
||||
## Importing Datasources
|
||||
|
||||
In order to import datasources from a ZIP file, run:
|
||||
|
||||
```bash
|
||||
superset import_datasources -p <path / filename>
|
||||
```
|
||||
|
||||
The optional username flag **-u** sets the user used for the datasource import. The default is 'admin'. Example:
|
||||
|
||||
```bash
|
||||
superset import_datasources -p <path / filename> -u 'admin'
|
||||
```
|
||||
|
||||
## Legacy Importing Datasources
|
||||
|
||||
### From older versions of Superset to current version
|
||||
|
||||
When using Superset 4.x.x to import from an older version (2.x.x or 3.x.x), use the command `legacy_import_datasources`, which expects a JSON file or a directory of JSON files. The options are `-r` for recursive and `-u` for specifying a user. Example of a legacy import without options:
|
||||
|
||||
```bash
|
||||
superset legacy_import_datasources -p <path or filename>
|
||||
```
|
||||
|
||||
### From older versions of Superset to older versions
|
||||
|
||||
When using an older version of Superset (2.x.x or 3.x.x), the command is `import_datasources`. ZIP and YAML files are supported, and the feature flag `VERSIONED_EXPORT` switches between them. When `VERSIONED_EXPORT` is `True`, `import_datasources` expects a ZIP file, otherwise YAML. Example:
|
||||
|
||||
```bash
|
||||
superset import_datasources -p <path or filename>
|
||||
```
|
||||
|
||||
When `VERSIONED_EXPORT` is `False`, if you supply a path, all files ending with **yaml** or **yml** will be parsed. You can apply
additional flags (e.g. to search the supplied path recursively):
|
||||
|
||||
```bash
|
||||
superset import_datasources -p <path> -r
|
||||
```
|
||||
|
||||
The sync flag **-s** takes parameters in order to sync the supplied elements with your file. Be
|
||||
careful this can delete the contents of your meta database. Example:
|
||||
|
||||
```bash
|
||||
superset import_datasources -p <path / filename> -s columns,metrics
|
||||
```
|
||||
|
||||
This will sync all metrics and columns for all datasources found in the `<path /filename>` in the
|
||||
Superset meta database. This means columns and metrics not specified in YAML will be deleted. If you
|
||||
would add tables to columns,metrics those would be synchronised as well.
|
||||
|
||||
If you don’t supply the sync flag (**-s**) importing will only add and update (override) fields.
|
||||
E.g. you can add a verbose_name to the column ds in the table random_time_series from the example
|
||||
datasets by saving the following YAML to file and then running the **import_datasources** command.
|
||||
|
||||
```yaml
|
||||
databases:
|
||||
- database_name: main
|
||||
tables:
|
||||
- table_name: random_time_series
|
||||
columns:
|
||||
- column_name: ds
|
||||
verbose_name: datetime
|
||||
```

---
title: Map Tiles
sidebar_position: 12
version: 1
---

# Map tiles

Superset uses OSM and Mapbox tiles by default. OSM is free, but you still need to set your
`MAPBOX_API_KEY` if you want to use Mapbox maps.

## Setting map tiles

Map tiles can be set with `DECKGL_BASE_MAP` in your `superset_config.py` or `superset_config_docker.py`.
To add your own map tiles, use the following format:

```python
DECKGL_BASE_MAP = [
    ['tile://https://your_personal_url/{z}/{x}/{y}.png', 'MyTile']
]
```

OpenStreetMap tile URLs can be added without a prefix:

```python
DECKGL_BASE_MAP = [
    ['https://c.tile.openstreetmap.org/{z}/{x}/{y}.png', 'OpenStreetMap']
]
```

The default values are:

```python
DECKGL_BASE_MAP = [
    ['https://tile.openstreetmap.org/{z}/{x}/{y}.png', 'Streets (OSM)'],
    ['https://tile.osm.ch/osm-swiss-style/{z}/{x}/{y}.png', 'Topography (OSM)'],
    ['mapbox://styles/mapbox/streets-v9', 'Streets'],
    ['mapbox://styles/mapbox/dark-v9', 'Dark'],
    ['mapbox://styles/mapbox/light-v9', 'Light'],
    ['mapbox://styles/mapbox/satellite-streets-v9', 'Satellite Streets'],
    ['mapbox://styles/mapbox/satellite-v9', 'Satellite'],
    ['mapbox://styles/mapbox/outdoors-v9', 'Outdoors'],
]
```

It is possible to use only Mapbox tiles by removing the OSM entries, or the other way around.

:::warning
Setting `DECKGL_BASE_MAP` overwrites the default values.
:::

After defining your map tiles, also add their URLs to these variables:

- `CORS_OPTIONS`
- `connect-src` of the `TALISMAN_CONFIG` and `TALISMAN_CONFIG_DEV` variables.

```python
from typing import Any  # needed for the CORS_OPTIONS type annotation

ENABLE_CORS = True
CORS_OPTIONS: dict[Any, Any] = {
    "origins": [
        "https://tile.openstreetmap.org",
        "https://tile.osm.ch",
        "https://your_personal_url/{z}/{x}/{y}.png",
    ]
}

# ...

TALISMAN_CONFIG = {
    "content_security_policy": {
        ...
        "connect-src": [
            "'self'",
            "https://api.mapbox.com",
            "https://events.mapbox.com",
            "https://tile.openstreetmap.org",
            "https://tile.osm.ch",
            "https://your_personal_url/{z}/{x}/{y}.png",
        ],
        ...
    },
}
```

---
title: Network and Security Settings
sidebar_position: 7
version: 1
---

# Network and Security Settings

## CORS

:::note
In Superset versions prior to `5.x` you have to install `flask-cors` with `pip install flask-cors` to enable CORS support.
:::

The following keys in `superset_config.py` can be specified to configure CORS:

- `ENABLE_CORS`: Must be set to `True` in order to enable CORS
- `CORS_OPTIONS`: options passed to Flask-CORS
  ([documentation](https://flask-cors.readthedocs.io/en/latest/api.html#extension))
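
For example, a minimal sketch in `superset_config.py` (the origin below is a placeholder for your own frontend domain):

```python
ENABLE_CORS = True
CORS_OPTIONS = {
    # allow cookies and auth headers to be sent cross-origin
    "supports_credentials": True,
    # placeholder: list the domains that should be allowed to call Superset
    "origins": ["https://myapp.example.com"],
}
```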

## HTTP headers

Note that Superset bundles [flask-talisman](https://pypi.org/project/talisman/), self-described as
a small Flask extension that handles setting HTTP headers that can help protect against a few
common web application security issues.

## HTML Embedding of Dashboards and Charts

There are two ways to embed a dashboard: using the [SDK](https://www.npmjs.com/package/@superset-ui/embedded-sdk) or embedding a direct link. Note that in the latter case everybody who knows the link is able to access the dashboard.

### Embedding a Public Direct Link to a Dashboard

This works by first changing the content security policy (CSP) of [flask-talisman](https://github.com/GoogleCloudPlatform/flask-talisman) to allow certain domains to display Superset content. Then a dashboard can be made publicly accessible, i.e. **bypassing authentication**. Once made public, the dashboard's URL can be added to an iframe in another website's HTML code.

#### Changing flask-talisman CSP

Add to `superset_config.py` the entire `TALISMAN_CONFIG` section from `config.py` and include a `frame-ancestors` section:

```python
TALISMAN_ENABLED = True
TALISMAN_CONFIG = {
    "content_security_policy": {
        ...
        "frame-ancestors": ["*.my-domain.com", "*.another-domain.com"],
        ...
    },
}
```

Restart Superset for this configuration change to take effect.

#### Making a Dashboard Public

1. Add the `'DASHBOARD_RBAC': True` [Feature Flag](https://github.com/apache/superset/blob/master/RESOURCES/FEATURE_FLAGS.md) to `superset_config.py`
2. Add the `Public` role to your dashboard as described [here](https://superset.apache.org/docs/using-superset/creating-your-first-dashboard/#manage-access-to-dashboards)

#### Embedding a Public Dashboard

Now anybody can directly access the dashboard's URL. You can embed it in an iframe like so:

```html
<iframe
  width="600"
  height="400"
  seamless
  frameBorder="0"
  scrolling="no"
  src="https://superset.my-domain.com/superset/dashboard/10/?standalone=1&height=400"
>
</iframe>
```

#### Embedding a Chart

A chart's embed code can be generated by going to a chart's edit view and then clicking at the top right on `...` > `Share` > `Embed code`.

### Enabling Embedding via the SDK

Clicking on `...` next to `EDIT DASHBOARD` at the top right of the dashboard's overview page should yield a drop-down menu including the entry "Embed dashboard".

To enable this entry, add the following line to the `.env` file:

```text
SUPERSET_FEATURE_EMBEDDED_SUPERSET=true
```

## CSRF settings

Similarly, [flask-wtf](https://flask-wtf.readthedocs.io/en/0.15.x/config/) is used to manage
some CSRF configurations. If you need to exempt endpoints from CSRF (e.g. if you are
running a custom auth postback endpoint), you can add the endpoints to `WTF_CSRF_EXEMPT_LIST`.
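
A minimal sketch in `superset_config.py` (the endpoint names are illustrative; use the dotted path of your own view, and preserve any defaults your version of `config.py` ships with):

```python
WTF_CSRF_EXEMPT_LIST = [
    # illustrative: an exemption similar to those in Superset's default config
    "superset.views.core.log",
    # hypothetical custom auth postback endpoint -- replace with your own
    "myapp.views.auth_postback",
]
```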

## SSH Tunneling

1. Turn on the feature flag
   - Change [`SSH_TUNNELING`](https://github.com/apache/superset/blob/eb8386e3f0647df6d1bbde8b42073850796cc16f/superset/config.py#L489) to `True` (see the configuration sketch after this list)
   - If you want to add more security when establishing the tunnel, we allow users to overwrite the `SSHTunnelManager` class [here](https://github.com/apache/superset/blob/eb8386e3f0647df6d1bbde8b42073850796cc16f/superset/config.py#L507)
   - You can also set [`SSH_TUNNEL_LOCAL_BIND_ADDRESS`](https://github.com/apache/superset/blob/eb8386e3f0647df6d1bbde8b42073850796cc16f/superset/config.py#L508); this is the host address where the tunnel will be accessible on your VPC

2. Create a database with SSH tunneling enabled
   - With the feature flag enabled, you should now see an SSH tunnel toggle.
   - Click the toggle to enable SSH tunneling and add your credentials accordingly.
   - Superset allows for two different types of authentication (Basic + Private Key). These credentials should come from your service provider.

3. Verify data is flowing
   - Once SSH tunneling has been enabled, go to SQL Lab and write a query to verify data is properly flowing.
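
A minimal configuration sketch for step 1 in `superset_config.py` (the bind address shown mirrors the shipped default and is an assumption; adjust it for your environment):

```python
# Enable the SSH tunneling feature flag
FEATURE_FLAGS = {
    "SSH_TUNNELING": True,
}

# Optional: the address on which the local end of the tunnel listens.
# "127.0.0.1" is assumed to match the shipped default; change it if the
# tunnel must be reachable from elsewhere in your VPC.
SSH_TUNNEL_LOCAL_BIND_ADDRESS = "127.0.0.1"
```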

## Domain Sharding

:::note
Domain sharding is deprecated as of Superset 5.0.0 and will be removed in Superset 6.0.0. Please enable HTTP/2 to keep more open connections per domain.
:::

Chrome allows up to 6 open connections per domain at a time. When there are more than 6 slices in a
dashboard, many fetch requests are queued up waiting for the next available socket.
[PR 5039](https://github.com/apache/superset/pull/5039) adds domain sharding to Superset;
this feature is enabled by configuration only (by default Superset doesn't allow
cross-domain requests).

Add the following setting in your `superset_config.py` file:

- `SUPERSET_WEBSERVER_DOMAINS`: list of allowed hostnames for the domain sharding feature.

Please create your domain shards as subdomains of your main domain for authorization to
work properly on the new domains. For example:

- `SUPERSET_WEBSERVER_DOMAINS=['superset-1.mydomain.com','superset-2.mydomain.com','superset-3.mydomain.com','superset-4.mydomain.com']`

or add the following setting in your `superset_config.py` file if the domain shards are not subdomains of the main domain:

- `SESSION_COOKIE_DOMAIN = '.mydomain.com'`

## Middleware

Superset allows you to add your own middleware. To add your own middleware, update the
`ADDITIONAL_MIDDLEWARE` key in your `superset_config.py`. `ADDITIONAL_MIDDLEWARE` should be a list
of your additional middleware classes.

For example, to use `AUTH_REMOTE_USER` from behind a proxy server like nginx, you have to add a
simple middleware class to add the value of `HTTP_X_PROXY_REMOTE_USER` (or any other custom header
from the proxy) to Gunicorn's `REMOTE_USER` environment variable.
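
A minimal sketch of such a WSGI middleware in `superset_config.py` (the class name is arbitrary; swap `HTTP_X_PROXY_REMOTE_USER` for whatever header your proxy sets):

```python
class RemoteUserMiddleware:
    """Copy a trusted proxy header into the WSGI REMOTE_USER variable."""

    def __init__(self, app):
        self.app = app

    def __call__(self, environ, start_response):
        # Pop the proxy-provided header and expose it as REMOTE_USER,
        # which AUTH_REMOTE_USER authentication reads.
        user = environ.pop("HTTP_X_PROXY_REMOTE_USER", None)
        environ["REMOTE_USER"] = user
        return self.app(environ, start_response)


ADDITIONAL_MIDDLEWARE = [RemoteUserMiddleware]
```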

---
title: SQL Templating
hide_title: true
sidebar_position: 5
version: 1
---

# SQL Templating

## Jinja Templates

SQL Lab and Explore support [Jinja templating](https://jinja.palletsprojects.com/en/2.11.x/) in queries.
To enable templating, the `ENABLE_TEMPLATE_PROCESSING` [feature flag](/docs/configuration/configuring-superset#feature-flags) needs to be enabled in
`superset_config.py`. When templating is enabled, Python code can be embedded in virtual datasets and
in Custom SQL in the filter and metric controls in Explore. By default, the following variables are
made available in the Jinja context:

- `columns`: columns which to group by in the query
- `filter`: filters applied in the query
- `from_dttm`: start `datetime` value from the selected time range (`None` if undefined) (deprecated beginning in version 5.0; use `get_time_filter` instead)
- `to_dttm`: end `datetime` value from the selected time range (`None` if undefined) (deprecated beginning in version 5.0; use `get_time_filter` instead)
- `groupby`: columns which to group by in the query (deprecated)
- `metrics`: aggregate expressions in the query
- `row_limit`: row limit of the query
- `row_offset`: row offset of the query
- `table_columns`: columns available in the dataset
- `time_column`: temporal column of the query (`None` if undefined)
- `time_grain`: selected time grain (`None` if undefined)

For example, to add a time range to a virtual dataset, you can write the following:

```sql
SELECT *
FROM tbl
WHERE dttm_col > '{{ from_dttm }}' and dttm_col < '{{ to_dttm }}'
```

You can also use [Jinja's logic](https://jinja.palletsprojects.com/en/2.11.x/templates/#tests)
to make your query robust to clearing the time range filter:

```sql
SELECT *
FROM tbl
WHERE (
    {% if from_dttm is not none %}
        dttm_col > '{{ from_dttm }}' AND
    {% endif %}
    {% if to_dttm is not none %}
        dttm_col < '{{ to_dttm }}' AND
    {% endif %}
    1 = 1
)
```

The `1 = 1` at the end ensures a value is present for the `WHERE` clause even when
the time filter is not set. For many database engines, this could be replaced with `true`.

Note that the Jinja parameters are called within _double_ brackets in the query and with
_single_ brackets in the logic blocks.

To add custom functionality to the Jinja context, you need to overload the default Jinja
context in your environment by defining `JINJA_CONTEXT_ADDONS` in your Superset configuration
(`superset_config.py`). Objects referenced in this dictionary are made available for users to use
where the Jinja context is made available.

```python
JINJA_CONTEXT_ADDONS = {
    'my_crazy_macro': lambda x: x * 2,
}
```
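
Once defined, the addon can be called anywhere the Jinja context is available. For example, this (hypothetical) SQL Lab snippet renders as `SELECT 42 AS the_answer`:

```sql
SELECT {{ my_crazy_macro(21) }} AS the_answer
```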

Default values for Jinja templates can be specified via the `Parameters` menu in the SQL Lab user interface.
In the UI you can assign a set of parameters as JSON:

```json
{
  "my_table": "foo"
}
```

The parameters become available in your SQL (example: `SELECT * FROM {{ my_table }}`) by using Jinja templating syntax.
SQL Lab template parameters are stored with the dataset as `TEMPLATE PARAMETERS`.

There is a special `_filters` parameter which can be used to test filters used in the Jinja template.

```json
{
  "_filters": [
    {
      "col": "action_type",
      "op": "IN",
      "val": ["sell", "buy"]
    }
  ]
}
```

```sql
SELECT action, count(*) as times
FROM logs
WHERE action in {{ filter_values('action_type')|where_in }}
GROUP BY action
```

Note that `_filters` is not stored with the dataset. It's only used within the SQL Lab UI.

Besides default Jinja templating, SQL Lab also supports self-defined template processors by setting
`CUSTOM_TEMPLATE_PROCESSORS` in your Superset configuration. The values in this dictionary
overwrite the default Jinja template processors of the specified database engine. The example below
configures a custom Presto template processor which implements its own logic of processing macro
templates with regex parsing. It uses the `$` style macro instead of the `{{ }}` style in Jinja
templating.

By configuring it with `CUSTOM_TEMPLATE_PROCESSORS`, a SQL template on a Presto database is
processed by the custom one rather than the default one.

```python
# Imports added to make this example self-contained.
import re
from datetime import datetime, timedelta
from functools import partial
from typing import Any, Dict, SupportsInt

from superset.jinja_context import PrestoTemplateProcessor


def DATE(
    ts: datetime, day_offset: SupportsInt = 0, hour_offset: SupportsInt = 0
) -> str:
    """Current day as a string."""
    day_offset, hour_offset = int(day_offset), int(hour_offset)
    offset_day = (ts + timedelta(days=day_offset, hours=hour_offset)).date()
    return str(offset_day)


class CustomPrestoTemplateProcessor(PrestoTemplateProcessor):
    """A custom Presto template processor."""

    engine = "presto"

    def process_template(self, sql: str, **kwargs) -> str:
        """Processes a SQL template with $ style macros using regex."""
        # Add custom macro functions.
        macros = {
            "DATE": partial(DATE, datetime.utcnow())
        }  # type: Dict[str, Any]
        # Update with macros defined in context and kwargs.
        macros.update(self.context)
        macros.update(kwargs)

        def replacer(match):
            """Expand $ style macros with corresponding function calls."""
            macro_name, args_str = match.groups()
            args = [a.strip() for a in args_str.split(",")]
            if args == [""]:
                args = []
            f = macros[macro_name[1:]]
            return f(*args)

        macro_names = ["$" + name for name in macros.keys()]
        pattern = r"(%s)\s*\(([^()]*)\)" % "|".join(map(re.escape, macro_names))
        return re.sub(pattern, replacer, sql)


CUSTOM_TEMPLATE_PROCESSORS = {
    CustomPrestoTemplateProcessor.engine: CustomPrestoTemplateProcessor
}
```

SQL Lab also includes a live query validation feature with pluggable backends. You can configure
which validation implementation is used with which database engine by adding a block like the
following to your configuration file:

```python
FEATURE_FLAGS = {
    'SQL_VALIDATORS_BY_ENGINE': {
        'presto': 'PrestoDBSQLValidator',
    }
}
```

The available validators and names can be found in
[sql_validators](https://github.com/apache/superset/tree/master/superset/sql_validators).

## Available Macros

In this section, we'll walk through the pre-defined Jinja macros in Superset.

**Current Username**

The `{{ current_username() }}` macro returns the `username` of the currently logged in user.

If you have caching enabled in your Superset configuration, then by default the `username` value will be used
by Superset when calculating the cache key. A cache key is a unique identifier that determines if there's a
cache hit in the future and Superset can retrieve cached data.

You can disable the inclusion of the `username` value in the calculation of the
cache key by adding the following parameter to your Jinja code:

```python
{{ current_username(add_to_cache_keys=False) }}
```
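
For example, to scope a query to the logged-in user (a sketch assuming a hypothetical `orders` table with a `created_by` column holding Superset usernames):

```sql
SELECT *
FROM orders
WHERE created_by = '{{ current_username() }}'
```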

**Current User ID**

The `{{ current_user_id() }}` macro returns the account ID of the currently logged in user.

If you have caching enabled in your Superset configuration, then by default the account `id` value will be used
by Superset when calculating the cache key. A cache key is a unique identifier that determines if there's a
cache hit in the future and Superset can retrieve cached data.

You can disable the inclusion of the account `id` value in the calculation of the
cache key by adding the following parameter to your Jinja code:

```python
{{ current_user_id(add_to_cache_keys=False) }}
```

**Current User Email**

The `{{ current_user_email() }}` macro returns the email address of the currently logged in user.

If you have caching enabled in your Superset configuration, then by default the email address value will be used
by Superset when calculating the cache key. A cache key is a unique identifier that determines if there's a
cache hit in the future and Superset can retrieve cached data.

You can disable the inclusion of the email value in the calculation of the
cache key by adding the following parameter to your Jinja code:

```python
{{ current_user_email(add_to_cache_keys=False) }}
```

**Current User Roles**

The `{{ current_user_roles() }}` macro returns an array of roles for the logged in user.

If you have caching enabled in your Superset configuration, then by default the roles value will be used
by Superset when calculating the cache key. A cache key is a unique identifier that determines if there's a
cache hit in the future and Superset can retrieve cached data.

You can disable the inclusion of the roles value in the calculation of the
cache key by adding the following parameter to your Jinja code:

```python
{{ current_user_roles(add_to_cache_keys=False) }}
```

You can JSON-stringify the array by adding `|tojson` to your Jinja code:

```python
{{ current_user_roles()|tojson }}
```

You can use the `|where_in` filter to use your roles in a SQL statement. For example, if `current_user_roles()` returns `['admin', 'viewer']`, the following template:

```python
SELECT * FROM users WHERE role IN {{ current_user_roles()|where_in }}
```

will be rendered as:

```sql
SELECT * FROM users WHERE role IN ('admin', 'viewer')
```

**Current User RLS Rules**

The `{{ current_user_rls_rules() }}` macro returns an array of RLS rules applied to the current dataset for the logged in user.

If you have caching enabled in your Superset configuration, then the list of RLS rules will be used
by Superset when calculating the cache key. A cache key is a unique identifier that determines if there's a
cache hit in the future and Superset can retrieve cached data.

**Custom URL Parameters**

The `{{ url_param('custom_variable') }}` macro lets you define arbitrary URL
parameters and reference them in your SQL code.

Here's a concrete example:

- You write the following query in SQL Lab:

  ```sql
  SELECT count(*)
  FROM ORDERS
  WHERE country_code = '{{ url_param('countrycode') }}'
  ```

- You're hosting Superset at the domain www.example.com and you send your
  coworker in Spain the following SQL Lab URL `www.example.com/superset/sqllab?countrycode=ES`
  and your coworker in the USA the following SQL Lab URL `www.example.com/superset/sqllab?countrycode=US`
- For your coworker in Spain, the SQL Lab query will be rendered as:

  ```sql
  SELECT count(*)
  FROM ORDERS
  WHERE country_code = 'ES'
  ```

- For your coworker in the USA, the SQL Lab query will be rendered as:

  ```sql
  SELECT count(*)
  FROM ORDERS
  WHERE country_code = 'US'
  ```
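
If the parameter may be missing from the URL, `url_param` also accepts a fallback as its second argument; a sketch reusing the example above, with `'US'` as the default:

```sql
SELECT count(*)
FROM ORDERS
WHERE country_code = '{{ url_param('countrycode', 'US') }}'
```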

**Explicitly Including Values in Cache Key**

The `{{ cache_key_wrapper() }}` function explicitly instructs Superset to add a value to the
accumulated list of values used in the calculation of the cache key.

This function is only needed when you want to wrap your own custom function return values
in the cache key. You can gain more context
[here](https://github.com/apache/superset/blob/efd70077014cbed62e493372d33a2af5237eaadf/superset/jinja_context.py#L133-L148).

Note that this function powers the caching of the `user_id` and `username` values
in the `current_user_id()` and `current_username()` function calls (if you have caching enabled).

**Filter Values**

You can retrieve the value for a specific filter as a list using `{{ filter_values() }}`.

This is useful if:

- You want to use a filter component to filter a query where the name of the filter component column doesn't match the one in the select statement
- You want to have the ability to filter inside the main query for performance purposes

Here's a concrete example:

```sql
SELECT action, count(*) as times
FROM logs
WHERE
  action in {{ filter_values('action_type')|where_in }}
GROUP BY action
```

The `where_in` filter converts the list of values from `filter_values('action_type')` into a string suitable for an `IN` expression.

**Filters for a Specific Column**

The `{{ get_filters() }}` macro returns the filters applied to a given column. In addition to
returning the values (similar to how `filter_values()` does), the `get_filters()` macro
returns the operator specified in the Explore UI.

This is useful if:

- You want to handle more than the IN operator in your SQL clause
- You want to handle generating custom SQL conditions for a filter
- You want to have the ability to filter inside the main query for speed purposes

Here's a concrete example:

```sql
WITH RECURSIVE
  superiors(employee_id, manager_id, full_name, level, lineage) AS (
    SELECT
      employee_id,
      manager_id,
      full_name,
      1 as level,
      employee_id as lineage
    FROM
      employees
    WHERE
      1=1

      {# Render a blank line #}
      {%- for filter in get_filters('full_name', remove_filter=True) -%}

      {%- if filter.get('op') == 'IN' -%}
        AND
        full_name IN {{ filter.get('val')|where_in }}
      {%- endif -%}

      {%- if filter.get('op') == 'LIKE' -%}
        AND
        full_name LIKE {{ "'" + filter.get('val') + "'" }}
      {%- endif -%}

      {%- endfor -%}
    UNION ALL
    SELECT
      e.employee_id,
      e.manager_id,
      e.full_name,
      s.level + 1 as level,
      s.lineage
    FROM
      employees e,
      superiors s
    WHERE s.manager_id = e.employee_id
  )

SELECT
  employee_id, manager_id, full_name, level, lineage
FROM
  superiors
order by lineage, level
```

**Time Filter**

The `{{ get_time_filter() }}` macro returns the time filter applied to a specific column. This is useful if you want
to handle time filters inside the virtual dataset, as by default the time filter is placed on the outer query. This can
considerably improve performance, as many databases and query engines are able to optimize the query better
if the temporal filter is placed on the inner query, as opposed to the outer query.

The macro takes the following parameters:

- `column`: Name of the temporal column. Leave undefined to reference the time range from a Dashboard Native Time Range
  filter (when present).
- `default`: The default value to fall back to if the time filter is not present, or has the value `No filter`
- `target_type`: The target temporal type as recognized by the target database (e.g. `TIMESTAMP`, `DATE` or
  `DATETIME`). If `column` is defined, the format will default to the type of the column. This is used to produce
  the format of the `from_expr` and `to_expr` properties of the returned `TimeFilter` object.
- `strftime`: format using the `strftime` method of `datetime` for custom time formatting
  ([see docs for valid format codes](https://docs.python.org/3/library/datetime.html#strftime-and-strptime-format-codes)).
  When defined, `target_type` will be ignored.
- `remove_filter`: When set to true, mark the filter as processed, removing it from the outer query. Useful when a
  filter should only apply to the inner query.

The return type has the following properties:

- `from_expr`: the start of the time filter (if any)
- `to_expr`: the end of the time filter (if any)
- `time_range`: the applied time range

Here's a concrete example using the `logs` table from the Superset metastore:

```
{% set time_filter = get_time_filter("dttm", remove_filter=True) %}
{% set from_expr = time_filter.from_expr %}
{% set to_expr = time_filter.to_expr %}
{% set time_range = time_filter.time_range %}
SELECT
  *,
  '{{ time_range }}' as time_range
FROM logs
{% if from_expr or to_expr %}WHERE 1 = 1
{% if from_expr %}AND dttm >= {{ from_expr }}{% endif %}
{% if to_expr %}AND dttm < {{ to_expr }}{% endif %}
{% endif %}
```

Assuming we are creating a table chart with a simple `COUNT(*)` as the metric with a time filter `Last week` on the
`dttm` column, this would render the following query on Postgres (note the formatting of the temporal filters, and
the absence of time filters on the outer query):

```
SELECT COUNT(*) AS count
FROM
  (SELECT *,
          'Last week' AS time_range
   FROM public.logs
   WHERE 1 = 1
     AND dttm >= TO_TIMESTAMP('2024-08-27 00:00:00.000000', 'YYYY-MM-DD HH24:MI:SS.US')
     AND dttm < TO_TIMESTAMP('2024-09-03 00:00:00.000000', 'YYYY-MM-DD HH24:MI:SS.US')) AS virtual_table
ORDER BY count DESC
LIMIT 1000;
```

When using the `default` parameter, the templated query can be simplified, as the endpoints will always be defined
(to use a fixed time range, you can also use something like `default="2024-08-27 : 2024-09-03"`):

```
{% set time_filter = get_time_filter("dttm", default="Last week", remove_filter=True) %}
SELECT
  *,
  '{{ time_filter.time_range }}' as time_range
FROM logs
WHERE
  dttm >= {{ time_filter.from_expr }}
  AND dttm < {{ time_filter.to_expr }}
```

**Datasets**

It's possible to query physical and virtual datasets using the `dataset` macro. This is useful if you've defined computed columns and metrics on your datasets, and want to reuse the definition in ad-hoc SQL Lab queries.

To use the macro, first you need to find the ID of the dataset. This can be done by going to the view showing all the datasets, hovering over the dataset you're interested in, and looking at its URL. For example, if the URL for a dataset is https://superset.example.org/explore/?dataset_type=table&dataset_id=42 its ID is 42.

Once you have the ID you can query it as if it were a table:

```sql
SELECT * FROM {{ dataset(42) }} LIMIT 10
```

If you want to select the metric definitions as well, in addition to the columns, you need to pass an additional keyword argument:

```sql
SELECT * FROM {{ dataset(42, include_metrics=True) }} LIMIT 10
```

Since metrics are aggregations, the resulting SQL expression will be grouped by all non-metric columns. You can specify a subset of columns to group by instead:

```sql
SELECT * FROM {{ dataset(42, include_metrics=True, columns=["ds", "category"]) }} LIMIT 10
```

**Metrics**

The `{{ metric('metric_key', dataset_id) }}` macro can be used to retrieve the metric SQL syntax from a dataset. This can be useful for different purposes:

- Overriding the metric label at the chart level
- Combining multiple metrics in a calculation
- Retrieving a metric's syntax in SQL Lab
- Reusing metrics across datasets

This macro avoids copy/paste, allowing users to centralize the metric definition in the dataset layer.

The `dataset_id` parameter is optional, and if not provided Superset will use the current dataset from context (for example, when using this macro in the Chart Builder, by default the `metric_key` will be searched in the dataset powering the chart).
The parameter can be used in SQL Lab, or when fetching a metric from another dataset.
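
For instance, a minimal sketch in SQL Lab (the metric key `count`, dataset ID `42`, and the `my_table`/`category` names are all assumptions for illustration):

```sql
-- Expands to the SQL defined for the metric with key 'count' on dataset 42
SELECT category, {{ metric('count', 42) }} AS count_label
FROM my_table
GROUP BY category
```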

## Available Filters

Superset supports [builtin filters from the Jinja2 templating package](https://jinja.palletsprojects.com/en/stable/templates/#builtin-filters). Custom filters have also been implemented:

**Where In**

Parses a list into a SQL-compatible statement. This is useful with macros that return an array (for example the `filter_values` macro):

```
Dashboard filter with "First", "Second" and "Third" options selected
{{ filter_values('column') }} => ["First", "Second", "Third"]
{{ filter_values('column')|where_in }} => ('First', 'Second', 'Third')
```

By default, this filter returns `()` (as a string) in case the value is null. The `default_to_none` parameter can be set to `True` to return null in this case:

```
Dashboard filter without any value applied
{{ filter_values('column') }} => ()
{{ filter_values('column')|where_in(default_to_none=True) }} => None
```

**To Datetime**

Loads a string as a `datetime` object. This is useful when performing date operations. For example:

```
{% set from_expr = get_time_filter("dttm", strftime="%Y-%m-%d").from_expr %}
{% set to_expr = get_time_filter("dttm", strftime="%Y-%m-%d").to_expr %}
{% if (to_expr|to_datetime(format="%Y-%m-%d") - from_expr|to_datetime(format="%Y-%m-%d")).days > 100 %}
  do something
{% else %}
  do something else
{% endif %}
```

---
title: Theming
hide_title: true
sidebar_position: 12
version: 1
---

# Theming Superset

:::note
apache-superset>=6.0
:::

Superset now rides on **Ant Design v5's token-based theming**.
Every Antd token works, plus a handful of Superset-specific ones for charts and dashboard chrome.

## Managing Themes via UI

Superset includes a built-in **Theme Management** interface accessible from the admin menu under **Settings > Themes**.

### Creating a New Theme

1. Navigate to **Settings > Themes** in the Superset interface
2. Click **+ Theme** to create a new theme
3. Use the [Ant Design Theme Editor](https://ant.design/theme-editor) to design your theme:
   - Design your palette, typography, and component overrides
   - Open the `CONFIG` modal and copy the JSON configuration
4. Paste the JSON into the theme definition field in Superset
5. Give your theme a descriptive name and save

You can also extend the JSON with Superset-specific tokens (documented in the default theme object) before you import it.

### System Theme Administration

When `ENABLE_UI_THEME_ADMINISTRATION = True` is configured, administrators can manage system-wide themes directly from the UI:

#### Setting System Themes

- **System Default Theme**: Click the sun icon on any theme to set it as the system-wide default
- **System Dark Theme**: Click the moon icon on any theme to set it as the system dark mode theme
- **Automatic OS Detection**: When both default and dark themes are set, Superset automatically detects and applies the appropriate theme based on OS preferences

#### Managing System Themes

- System themes are indicated with special badges in the theme list
- Only administrators with write permissions can modify system theme settings
- Removing a system theme designation reverts to configuration file defaults

### Applying Themes to Dashboards

Once created, themes can be applied to individual dashboards:

- Edit any dashboard and select your custom theme from the theme dropdown
- Each dashboard can have its own theme, allowing for branded or context-specific styling

## Configuration Options

### Python Configuration

Configure theme behavior via `superset_config.py`:

```python
# Enable UI-based theme administration for admins
ENABLE_UI_THEME_ADMINISTRATION = True

# Optional: Set initial default themes via configuration
# These can be overridden via the UI when ENABLE_UI_THEME_ADMINISTRATION = True
THEME_DEFAULT = {
    "token": {
        "colorPrimary": "#2893B3",
        "colorSuccess": "#5ac189",
        # ... your theme JSON configuration
    }
}

# Optional: Dark theme configuration
THEME_DARK = {
    "algorithm": "dark",
    "token": {
        "colorPrimary": "#2893B3",
        # ... your dark theme overrides
    }
}

# To force a single theme on all users, set THEME_DARK = None
# When both themes are defined (via UI or config):
# - Users can manually switch between themes
# - OS preference detection is automatically enabled
```

### Migration from Configuration to UI

When `ENABLE_UI_THEME_ADMINISTRATION = True`:

1. System themes set via the UI take precedence over configuration file settings
2. The UI shows which themes are currently set as system defaults
3. Administrators can change system themes without restarting Superset
4. Configuration file themes serve as fallbacks when no UI themes are set

### Copying Themes Between Systems

To export a theme for use in configuration files or another instance:

1. Navigate to **Settings > Themes** and click the export icon on your desired theme
2. Extract the JSON configuration from the exported YAML file
3. Use this JSON in your `superset_config.py` or import it into another Superset instance

## Theme Development Workflow

1. **Design**: Use the [Ant Design Theme Editor](https://ant.design/theme-editor) to iterate on your design
2. **Test**: Create themes in Superset's CRUD interface for testing
3. **Apply**: Assign themes to specific dashboards or configure instance-wide
4. **Iterate**: Modify theme JSON directly in the CRUD interface or re-import from the theme editor

## Custom Fonts

Superset supports custom fonts through runtime configuration, allowing you to use branded or custom typefaces without rebuilding the application.

### Configuring Custom Fonts

Add font URLs to your `superset_config.py`:

```python
# Load fonts from Google Fonts, Adobe Fonts, or self-hosted sources
CUSTOM_FONT_URLS = [
    "https://fonts.googleapis.com/css2?family=Inter:wght@400;500;600;700&display=swap",
    "https://fonts.googleapis.com/css2?family=JetBrains+Mono:wght@400;500&display=swap",
]

# Update CSP to allow font sources
TALISMAN_CONFIG = {
    "content_security_policy": {
        "font-src": ["'self'", "https://fonts.googleapis.com", "https://fonts.gstatic.com"],
        "style-src": ["'self'", "'unsafe-inline'", "https://fonts.googleapis.com"],
    }
}
```

### Using Custom Fonts in Themes

Once configured, reference the fonts in your theme configuration:

```python
THEME_DEFAULT = {
    "token": {
        "fontFamily": "Inter, -apple-system, BlinkMacSystemFont, sans-serif",
        "fontFamilyCode": "JetBrains Mono, Monaco, monospace",
        # ... other theme tokens
    }
}
```

Or in the CRUD interface theme JSON:

```json
{
  "token": {
    "fontFamily": "Inter, -apple-system, BlinkMacSystemFont, sans-serif",
    "fontFamilyCode": "JetBrains Mono, Monaco, monospace"
  }
}
```

### Font Sources

- **Google Fonts**: Free, CDN-hosted fonts with wide variety
- **Adobe Fonts**: Premium fonts (requires subscription and kit ID)
- **Self-hosted**: Place font files in `/static/assets/fonts/` and reference via CSS

This feature works with the stock Docker image - no custom build required!

## Advanced Features

- **System Themes**: Manage system-wide default and dark themes via UI or configuration
- **Per-Dashboard Theming**: Each dashboard can have its own visual identity
- **JSON Editor**: Edit theme configurations directly within Superset's interface
- **Custom Fonts**: Load external fonts via configuration without rebuilding
- **OS Dark Mode Detection**: Automatically switches themes based on system preferences
- **Theme Import/Export**: Share themes between instances via YAML files

## API Access

For programmatic theme management, Superset provides REST endpoints:

- `GET /api/v1/theme/` - List all themes
- `POST /api/v1/theme/` - Create a new theme
- `PUT /api/v1/theme/{id}` - Update a theme
- `DELETE /api/v1/theme/{id}` - Delete a theme
- `PUT /api/v1/theme/{id}/set_system_default` - Set as system default theme (admin only)
- `PUT /api/v1/theme/{id}/set_system_dark` - Set as system dark theme (admin only)
- `DELETE /api/v1/theme/unset_system_default` - Remove system default designation
- `DELETE /api/v1/theme/unset_system_dark` - Remove system dark designation
- `GET /api/v1/theme/export/` - Export themes as YAML
- `POST /api/v1/theme/import/` - Import themes from YAML

These endpoints require appropriate permissions and are subject to RBAC controls.
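
For example, a short sketch that lists themes with `requests` (the base URL and token are placeholders, and the exact response fields may vary by version):

```python
import requests

BASE_URL = "https://superset.example.org"  # placeholder for your instance
ACCESS_TOKEN = "..."  # obtain via POST /api/v1/security/login

headers = {"Authorization": f"Bearer {ACCESS_TOKEN}"}

# List all themes; Superset's list endpoints conventionally wrap rows in "result"
resp = requests.get(f"{BASE_URL}/api/v1/theme/", headers=headers)
resp.raise_for_status()
for theme in resp.json().get("result", []):
    print(theme.get("id"), theme.get("theme_name"))
```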

---
title: Timezones
hide_title: true
sidebar_position: 6
version: 1
---

# Timezones

There are four distinct timezone components which relate to Apache Superset:

1. The timezone that the underlying data is encoded in.
2. The timezone of the database engine.
3. The timezone of the Apache Superset backend.
4. The timezone of the Apache Superset client.

If a temporal field (`DATETIME`, `TIME`, `TIMESTAMP`, etc.) does not explicitly define a timezone, it defaults to the underlying timezone of the component.

To help make the problem somewhat tractable—given that Apache Superset has no control over either how the data is ingested (1) or the timezone of the client (4)—from a consistency standpoint it is highly recommended that both (2) and (3) are configured to use the same timezone, with a strong preference given to [UTC](https://en.wikipedia.org/wiki/Coordinated_Universal_Time), to ensure temporal fields without an explicit timezone are not incorrectly coerced into the wrong timezone. In fact, Apache Superset currently has implicit assumptions that timestamps are in UTC, and thus configuring (3) to a non-UTC timezone could be problematic.

To strive for data consistency (regardless of the timezone of the client), the Apache Superset backend tries to ensure that any timestamp sent to the client has an explicit (or semi-explicit, as in the case of [Epoch time](https://en.wikipedia.org/wiki/Unix_time), which is always in reference to UTC) timezone encoded within.

The challenge, however, lies with the slew of [database engines](/docs/configuration/databases#installing-drivers-in-docker-images) which Apache Superset supports and various inconsistencies between their [Python Database API (DB-API)](https://www.python.org/dev/peps/pep-0249/) implementations, combined with the fact that we use [Pandas](https://pandas.pydata.org/) to read SQL into a DataFrame prior to serializing to JSON. Regrettably, Pandas ignores the DB-API [type_code](https://www.python.org/dev/peps/pep-0249/#type-objects), relying by default on the underlying Python type returned by the DB-API. Currently only a subset of the supported database engines work correctly with Pandas, i.e., ensuring timestamps without an explicit timezone are serialized to JSON with the server timezone, thus guaranteeing the client will display timestamps in a consistent manner irrespective of the client's timezone.

For example, the following is a comparison of MySQL and Presto:

```python
import pandas as pd
from sqlalchemy import create_engine

pd.read_sql_query(
    sql="SELECT TIMESTAMP('2022-01-01 00:00:00') AS ts",
    con=create_engine("mysql://root@localhost:3360"),
).to_json()

pd.read_sql_query(
    sql="SELECT TIMESTAMP '2022-01-01 00:00:00' AS ts",
    con=create_engine("presto://localhost:8080"),
).to_json()
```

These output `{"ts":{"0":1640995200000}}` (which implies the UTC timezone per the Epoch time definition) and `{"ts":{"0":"2022-01-01 00:00:00.000"}}` (without an explicit timezone) respectively, and thus the values are treated differently in JavaScript:

```js
new Date(1640995200000)
> Sat Jan 01 2022 13:00:00 GMT+1300 (New Zealand Daylight Time)

new Date("2022-01-01 00:00:00.000")
> Sat Jan 01 2022 00:00:00 GMT+1300 (New Zealand Daylight Time)
```

---
title: Contributing to Superset
sidebar_position: 1
version: 1
---

# Contributing to Superset

Superset is an [Apache Software Foundation](https://www.apache.org/theapacheway/index.html) project.
The core contributors (or committers) to Superset communicate primarily in the following channels
(which can be joined by anyone):

- [Mailing list](https://lists.apache.org/list.html?dev@superset.apache.org)
- [Apache Superset Slack community](http://bit.ly/join-superset-slack)
- [GitHub issues](https://github.com/apache/superset/issues)
- [GitHub pull requests](https://github.com/apache/superset/pulls)
- [GitHub discussions](https://github.com/apache/superset/discussions)
- [Superset Community Calendar](https://superset.apache.org/community)

More references:

- [Superset Wiki (code guidelines and additional resources)](https://github.com/apache/superset/wiki)

## Orientation

Here's a list of repositories that contain Superset-related packages:

- [apache/superset](https://github.com/apache/superset)
  is the main repository containing the `apache_superset` Python package
  distributed on
  [pypi](https://pypi.org/project/apache_superset/). This repository
  also includes Superset's main TypeScript/JavaScript bundles and React apps under
  the [superset-frontend](https://github.com/apache/superset/tree/master/superset-frontend)
  folder.
- [github.com/apache-superset](https://github.com/apache-superset) is the
  GitHub organization under which we manage Superset-related
  small tools, forks, and Superset-related experimental ideas.

## Types of Contributions

### Report Bug

The best way to report a bug is to file an issue on GitHub. Please include:

- Your operating system name and version.
- Superset version.
- Detailed steps to reproduce the bug.
- Any details about your local setup that might be helpful in troubleshooting.

When posting Python stack traces, please quote them using
[Markdown blocks](https://help.github.com/articles/creating-and-highlighting-code-blocks/).

_Please note that feature requests opened as GitHub Issues will be moved to Discussions._

### Submit Ideas or Feature Requests

The best way is to start an ["Ideas" Discussion thread](https://github.com/apache/superset/discussions/categories/ideas) on GitHub:

- Explain in detail how it would work.
- Keep the scope as narrow as possible, to make it easier to implement.
- Remember that this is a volunteer-driven project, and that your contributions are as welcome as anyone's :)

To propose large features or major changes to the codebase, and to help usher in those changes, please create a **Superset Improvement Proposal (SIP)**. See the template in [SIP-0](https://github.com/apache/superset/issues/5602).

### Fix Bugs

Look through the GitHub issues. Issues tagged with `#bug` are
open to whoever wants to implement them.

### Implement Features

Look through the GitHub issues. Issues tagged with
`#feature` are open to whoever wants to implement them.

### Improve Documentation

Superset could always use better documentation,
whether as part of the official Superset docs,
in docstrings, `docs/*.rst`, or even on the web as blog posts or
articles. See [Documentation](/docs/contributing/howtos#contributing-to-documentation) for more details.

### Add Translations

If you are proficient in a non-English language, you can help translate
text strings from Superset's UI. You can jump into the existing
language dictionaries at
`superset/translations/<language_code>/LC_MESSAGES/messages.po`, or
even create a dictionary for a new language altogether.
See [Translating](howtos#contributing-translations) for more details.

### Ask Questions

There is a dedicated [`apache-superset` tag](https://stackoverflow.com/questions/tagged/apache-superset) on [StackOverflow](https://stackoverflow.com/). Please use it when asking questions.

## Types of Contributors

Following the project governance model of the Apache Software Foundation (ASF), Apache Superset has a specific set of contributor roles:

### PMC Member

A Project Management Committee (PMC) member is a person who has been elected by the PMC to help manage the project. PMC members are responsible for the overall health of the project, including community development, release management, and project governance. PMC members are also responsible for the technical direction of the project.

For more information about Apache Project PMCs, please refer to https://www.apache.org/foundation/governance/pmcs.html

### Committer

A committer is a person who has been elected by the PMC to have write access (commit access) to the code repository. They can modify the code, documentation, and website and accept contributions from others.

The official list of committers and PMC members can be found [here](https://projects.apache.org/committee.html?superset).

### Contributor

A contributor is a person who has contributed to the project in any way, including but not limited to code, tests, documentation, issues, and discussions.

> You can also review the Superset project's guidelines for PMC member promotion here: https://github.com/apache/superset/wiki/Guidelines-for-promoting-Superset-Committers-to-the-Superset-PMC

### Security Team

The security team is a selected subset of PMC members, committers, and non-committers who are responsible for handling security issues.

New members of the security team are selected by the PMC members in a vote. You can request to be added to the team by sending a message to private@superset.apache.org. However, the team should be small and focused on solving security issues, so requests will be evaluated on a case-by-case basis and the team size will be kept relatively small, limited to only actively security-focused contributors.

This security team must follow the [ASF vulnerability handling process](https://apache.org/security/committers.html#asf-project-security-for-committers).

Each new security issue is tracked as a JIRA ticket on the [ASF's JIRA Superset security project](https://issues.apache.org/jira/secure/RapidBoard.jspa?rapidView=588&projectKey=SUPERSETSEC).

Security team members must:

- Have an [ICLA](https://www.apache.org/licenses/contributor-agreements.html) signed with the Apache Software Foundation.
- Not reveal information about pending and unfixed security issues to anyone (including their employers) unless specifically authorised by the security team members, e.g., if the security team agrees that diagnosing and solving an issue requires the involvement of external experts.

A release manager, the contributor overseeing the release of a specific version of Apache Superset, is by default a member of the security team. However, they are not expected to be active in assessing, discussing, and fixing security issues.

Security team members should also follow these general expectations:

- Actively participate in assessing, discussing, fixing, and releasing security issues in Superset.
- Avoid discussing security fixes in public forums. Pull request (PR) descriptions should not contain any information about security issues. The corresponding JIRA ticket should contain a link to the PR.
- Security team members who contribute to a fix may be listed as remediation developers in the CVE report, along with their job affiliation (if they choose to include it).