Compare commits

...

25 Commits

Author SHA1 Message Date
Maxime Beauchemin
0796aa1c6d Update UPDATING.md
Co-authored-by: Evan Rusackas <evan@preset.io>
2025-07-30 14:46:17 -07:00
Maxime Beauchemin
14e6ec7d9f fix(examples): Load all YAML examples with --load-test-data flag
The integration tests depend on core examples like birth_names being loaded
even when using the --load-test-data flag. This fix ensures that all YAML
files are loaded when load_test_data is True, not just .test. files.

This resolves CI failures where tests couldn't find expected slices because
the birth_names examples weren't being loaded.
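The selection rule this fix describes can be sketched as a standalone helper (hypothetical, for illustration only; the real loader's function names, and its filtering behavior when the flag is off, are assumptions, not taken from the patch):

```typescript
// Hypothetical sketch of the rule described above: when loadTestData is
// set, every YAML example file is included (so core examples like
// birth_names load too); the `.test.` marker only restricts normal loads.
// The normal-load behavior here is an assumption.
function shouldLoadExample(filename: string, loadTestData: boolean): boolean {
  const isYaml = filename.endsWith(".yaml") || filename.endsWith(".yml");
  if (!isYaml) return false;
  if (loadTestData) return true; // test runs load all YAML examples
  return !filename.includes(".test."); // normal loads skip test fixtures
}
```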

🤖 Generated with [Claude Code](https://claude.ai/code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-07-30 00:11:59 -07:00
Maxime Beauchemin
14ffa69e0b fix(tests): Align test slice names with YAML examples 2025-07-29 23:13:46 -07:00
Maxime Beauchemin
ef4cf2b430 remove --max-fail 1 on integration tests to iterate faster 2025-07-29 22:36:12 -07:00
Maxime Beauchemin
48d8c91b19 feat: migrate examples from Python to YAML format with enhanced CLI
Migrates Superset's example data system from Python-based scripts to YAML configuration files, providing a cleaner, more maintainable approach to managing example datasets, charts, and dashboards.

- Converted 9 Python example modules to YAML configurations
- Exported existing examples from database and added as YAML files:
  - 11 dashboards (USA Births Names, World Bank's Data, etc.)
  - 115 charts
  - 25 datasets
- Moved test-specific fixtures to `tests/fixtures/examples/`
- Removed theme_id from dashboard exports for compatibility

- **New command group**: `superset examples` with subcommands:
  - `load` - Load example data (replaces `load-examples`)
  - `clear-old` - Remove old Python-based examples
  - `clear` - Placeholder for future YAML clearing
  - `reload` - Clear and reload in one command
- **Backwards compatibility**: `superset load-examples` still works with deprecation warning
- **Safety mechanism**: Detects old examples and preserves them to avoid data loss

- Fixed JSON data loading - examples can now load `.json.gz` files from CDN
- Fixed Docker compose configuration for isolated development
- Fixed webpack WebSocket configuration for different ports

- Import operations now log what's being created vs updated:
  - "Creating new dashboard: Sales Dashboard"
  - "Updating existing chart: World's Population"
- Provides clear visibility into the import process

- Moved import logging to individual import functions (DRY principle)
- Non-destructive migration approach - no user data is deleted
- Deterministic UUID generation for consistent example data
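Deterministic UUIDs mean reloading the examples always references the same objects. A small v5-style sketch of the idea (the namespace string and helper are hypothetical, not Superset's actual implementation):

```typescript
import { createHash } from "node:crypto";

// Sketch of a deterministic (RFC 4122 v5-style) UUID: hashing a fixed
// namespace together with the example's name always yields the same id.
// Note: true UUIDv5 hashes the namespace as UUID bytes; this string-based
// variant is a simplification for illustration.
function deterministicUuid(namespace: string, name: string): string {
  const hash = createHash("sha1").update(namespace).update(name).digest();
  const bytes = Buffer.from(hash.subarray(0, 16));
  bytes[6] = (bytes[6] & 0x0f) | 0x50; // set version nibble to 5
  bytes[8] = (bytes[8] & 0x3f) | 0x80; // set RFC 4122 variant bits
  const hex = bytes.toString("hex");
  return `${hex.slice(0, 8)}-${hex.slice(8, 12)}-${hex.slice(12, 16)}-${hex.slice(16, 20)}-${hex.slice(20)}`;
}

// Same inputs always produce the same UUID across runs and machines.
console.log(deterministicUuid("superset-examples", "birth_names"));
```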

- Tested migration from old Python examples to new YAML format
- Verified safety mechanism prevents accidental data overwrites
- Confirmed backwards compatibility with deprecated command
- All pre-commit checks pass

- Updated installation docs to use new CLI commands
- Added deprecation notice to UPDATING.md
- Updated development documentation

Breaking changes: None - the old `load-examples` command continues to work with a deprecation warning.

For users with existing Python-based examples:
1. Run `superset examples clear-old --confirm` to remove old examples
2. Run `superset examples load` to load new YAML-based examples
2025-07-29 22:23:52 -07:00
Hari Kiran
6006a21378 docs(development): fix comment in the dockerfile (#34391) 2025-07-29 21:53:46 -07:00
Maxime Beauchemin
bf967d6ba4 fix(charts): Fix unquoted 'Others' literal in series limit GROUP BY clause (#34390)
Co-authored-by: Claude <noreply@anthropic.com>
2025-07-29 17:36:10 -07:00
Hari Kiran
131ae5aa9d docs(development): fix typo in the dockerfile (#34387) 2025-07-29 14:24:18 -07:00
Cesc Bausà
eca28582b6 feat(i18n): update Spanish translations (messages.po) (#34206) 2025-07-29 13:49:40 -07:00
Maxime Beauchemin
14e90a0f52 feat: Add GitHub Codespaces support with docker-compose-light (#34376)
Co-authored-by: Claude <noreply@anthropic.com>
2025-07-29 13:10:17 -07:00
Maxime Beauchemin
a1c39d4906 feat(charts): Enable async buildQuery support for complex chart logic (#34383)
Co-authored-by: Claude <noreply@anthropic.com>
2025-07-29 13:08:55 -07:00
Maxime Beauchemin
0964a8bb7a fix(big number with trendline): running 2 identical queries for no good reason (#34296) 2025-07-29 13:07:28 -07:00
Beto Dealmeida
8de8f95a3c feat: allow creating dataset without exploring (#34380) 2025-07-29 15:43:47 -04:00
Maxime Beauchemin
16db999067 fix: rate limiting issues with example data hosted on github.com (#34381) 2025-07-29 11:19:29 -07:00
Beto Dealmeida
972be15dda feat: focus on text input when modal opens (#34379) 2025-07-29 14:01:10 -04:00
Maxime Beauchemin
c9e06714f8 fix: prevent theme initialization errors during fresh installs (#34339)
Co-authored-by: Claude <noreply@anthropic.com>
2025-07-29 09:32:53 -07:00
Beto Dealmeida
32626ab707 fix: use catalog name on generated queries (#34360) 2025-07-29 12:30:46 -04:00
dependabot[bot]
a9cd58508b chore(deps): bump cookie and @types/cookie in /superset-websocket (#34335)
Signed-off-by: dependabot[bot] <support@github.com>
Signed-off-by: hainenber <dotronghai96@gmail.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: hainenber <dotronghai96@gmail.com>
2025-07-29 20:19:31 +07:00
Beto Dealmeida
122bb68e5a fix: subquery alias in RLS (#34374) 2025-07-28 22:58:15 -04:00
Beto Dealmeida
914ce9aa4f feat: read column metadata (#34359) 2025-07-28 22:57:57 -04:00
Gabriel Torres Ruiz
bb572983cd feat(theming): Align embedded sdk with theme configs (#34273) 2025-07-28 19:26:17 -07:00
Đỗ Trọng Hải
ff76ab647f build(deps): update ag-grid to non-breaking major v34 (#34326) 2025-07-29 07:46:55 +07:00
Mehmet Salih Yavuz
f554848c9f fix(PivotTable): Render html in cells if allowRenderHtml is true (#34351) 2025-07-29 01:12:37 +03:00
Hari Kiran
dc0c389488 docs(development): fix 2 typos in the dockerfile (#34341) 2025-07-28 15:06:21 -07:00
Beto Dealmeida
22b3cc0480 chore: bump BigQuery dialect to 1.15.0 (#34371) 2025-07-28 16:39:18 -04:00
173 changed files with 14690 additions and 8842 deletions

5
.devcontainer/README.md Normal file

@@ -0,0 +1,5 @@
# Superset Development with GitHub Codespaces
For complete documentation on using GitHub Codespaces with Apache Superset, please see:
**[Setting up a Development Environment - GitHub Codespaces](https://superset.apache.org/docs/contributing/development#github-codespaces-cloud-development)**

View File

@@ -0,0 +1,52 @@
{
"name": "Apache Superset Development",
// Keep this in sync with the base image in Dockerfile (ARG PY_VER)
// Using the same base as Dockerfile, but non-slim for dev tools
"image": "python:3.11.13-bookworm",
"features": {
"ghcr.io/devcontainers/features/docker-in-docker:2": {
"moby": true,
"dockerDashComposeVersion": "v2"
},
"ghcr.io/devcontainers/features/node:1": {
"version": "20"
},
"ghcr.io/devcontainers/features/git:1": {},
"ghcr.io/devcontainers/features/common-utils:2": {
"configureZshAsDefaultShell": true
},
"ghcr.io/devcontainers/features/sshd:1": {
"version": "latest"
}
},
// Forward ports for development
"forwardPorts": [9001],
"portsAttributes": {
"9001": {
"label": "Superset (via Webpack Dev Server)",
"onAutoForward": "notify",
"visibility": "public"
}
},
// Run commands after container is created
"postCreateCommand": "chmod +x .devcontainer/setup-dev.sh && .devcontainer/setup-dev.sh",
// Auto-start Superset on Codespace resume
"postStartCommand": ".devcontainer/start-superset.sh",
// VS Code customizations
"customizations": {
"vscode": {
"extensions": [
"ms-python.python",
"ms-python.vscode-pylance",
"charliermarsh.ruff",
"dbaeumer.vscode-eslint",
"esbenp.prettier-vscode"
]
}
}
}

32
.devcontainer/setup-dev.sh Executable file

@@ -0,0 +1,32 @@
#!/bin/bash
# Setup script for Superset Codespaces development environment
echo "🔧 Setting up Superset development environment..."
# The universal image has most tools, just need Superset-specific libs
echo "📦 Installing Superset-specific dependencies..."
sudo apt-get update
sudo apt-get install -y \
libsasl2-dev \
libldap2-dev \
libpq-dev \
tmux \
gh
# Install uv for fast Python package management
echo "📦 Installing uv..."
curl -LsSf https://astral.sh/uv/install.sh | sh
# Add cargo/bin to PATH for uv
echo 'export PATH="$HOME/.cargo/bin:$PATH"' >> ~/.bashrc
echo 'export PATH="$HOME/.cargo/bin:$PATH"' >> ~/.zshrc
# Install Claude Code CLI via npm
echo "🤖 Installing Claude Code..."
npm install -g @anthropic-ai/claude-code
# Make the start script executable
chmod +x .devcontainer/start-superset.sh
echo "✅ Development environment setup complete!"
echo "🚀 Run '.devcontainer/start-superset.sh' to start Superset"

59
.devcontainer/start-superset.sh Executable file

@@ -0,0 +1,59 @@
#!/bin/bash
# Startup script for Superset in Codespaces
echo "🚀 Starting Superset in Codespaces..."
echo "🌐 Frontend will be available at port 9001"
# Find the workspace directory (Codespaces clones as 'superset', not 'superset-2')
WORKSPACE_DIR=$(find /workspaces -maxdepth 1 -name "superset*" -type d | head -1)
if [ -n "$WORKSPACE_DIR" ]; then
cd "$WORKSPACE_DIR"
echo "📁 Working in: $WORKSPACE_DIR"
else
echo "📁 Using current directory: $(pwd)"
fi
# Check if docker is running
if ! docker info > /dev/null 2>&1; then
echo "⏳ Waiting for Docker to start..."
sleep 5
fi
# Clean up any existing containers
echo "🧹 Cleaning up existing containers..."
docker-compose -f docker-compose-light.yml down
# Start services
echo "🏗️ Building and starting services..."
echo ""
echo "📝 Once started, login with:"
echo " Username: admin"
echo " Password: admin"
echo ""
echo "📋 Running in foreground with live logs (Ctrl+C to stop)..."
# Run docker-compose and capture exit code
docker-compose -f docker-compose-light.yml up
EXIT_CODE=$?
# If it failed, provide helpful instructions
if [ $EXIT_CODE -ne 0 ] && [ $EXIT_CODE -ne 130 ]; then # 130 is Ctrl+C
echo ""
echo "❌ Superset startup failed (exit code: $EXIT_CODE)"
echo ""
echo "🔄 To restart Superset, run:"
echo " .devcontainer/start-superset.sh"
echo ""
echo "🔧 For troubleshooting:"
echo " # View logs:"
echo " docker-compose -f docker-compose-light.yml logs"
echo ""
echo " # Clean restart (removes volumes):"
echo " docker-compose -f docker-compose-light.yml down -v"
echo " .devcontainer/start-superset.sh"
echo ""
echo " # Common issues:"
echo " - Network timeouts: Just retry, often transient"
echo " - Port conflicts: Check 'docker ps'"
echo " - Database issues: Try clean restart with -v"
fi

View File

@@ -55,6 +55,7 @@ esm/*
tsconfig.tsbuildinfo
.*ipynb
.*yml
.*yaml
.*iml
.esprintrc
.prettierignore

View File

@@ -59,7 +59,7 @@ RUN mkdir -p /app/superset/static/assets \
# NOTE: we mount packages and plugins as they are referenced in package.json as workspaces
# ideally we'd COPY only their package.json. Here npm ci will be cached as long
# as the full content of these folders don't change, yielding a decent cache reuse rate.
# Note that's it's not possible selectively COPY of mount using blobs.
# Note that it's not possible to selectively COPY or mount using blobs.
RUN --mount=type=bind,source=./superset-frontend/package.json,target=./package.json \
--mount=type=bind,source=./superset-frontend/package-lock.json,target=./package-lock.json \
--mount=type=cache,target=/root/.cache \
@@ -74,7 +74,7 @@ RUN --mount=type=bind,source=./superset-frontend/package.json,target=./package.j
COPY superset-frontend /app/superset-frontend
######################################################################
# superset-node used for compile frontend assets
# superset-node is used for compiling frontend assets
######################################################################
FROM superset-node-ci AS superset-node
@@ -90,7 +90,7 @@ RUN --mount=type=cache,target=/root/.npm \
# Copy translation files
COPY superset/translations /app/superset/translations
# Build the frontend if not in dev mode
# Build translations if enabled, then cleanup localization files
RUN if [ "$BUILD_TRANSLATIONS" = "true" ]; then \
npm run build-translation; \
fi; \

View File

@@ -23,6 +23,8 @@ This file documents any backwards-incompatible changes in Superset and
assists people when migrating to a new version.
## Next
- [34346](https://github.com/apache/superset/pull/34346) The examples system has been migrated from Python-based scripts to YAML configuration files. The CLI command `superset load-examples` has been deprecated in favor of `superset examples load`. The old command still works but will show a deprecation warning. Additional example management commands are available under `superset examples` including `clear-old` and `reload`. If you have old Python-based examples loaded, the new YAML-based examples will not load automatically to preserve your existing data. To migrate to the new examples, run `superset examples clear-old --confirm` followed by `superset examples load`.
**Note**: This change affects Cypress tests that rely on specific chart names from the old examples (e.g., "Num Births Trend", "Daily Totals"). These charts may not exist in the new YAML examples, causing test failures. Consider updating your Cypress tests or creating test-specific fixtures.
- [33084](https://github.com/apache/superset/pull/33084) The DISALLOWED_SQL_FUNCTIONS configuration now includes additional potentially sensitive database functions across PostgreSQL, MySQL, SQLite, MS SQL Server, and ClickHouse. Existing queries using these functions may now be blocked. Review your SQL Lab queries and dashboards if you encounter "disallowed function" errors after upgrading
- [34235](https://github.com/apache/superset/pull/34235) CSV exports now use `utf-8-sig` encoding by default to include a UTF-8 BOM, improving compatibility with Excel.
- [34258](https://github.com/apache/superset/pull/34258) changing the default in Dockerfile to INCLUDE_CHROMIUM="false" (from "true" previously). This ensures the `lean` layer is lean by default, and people can opt-in to the `chromium` layer by setting the build arg `INCLUDE_CHROMIUM=true`. This is a breaking change for anyone using the `lean` layer, as it will no longer include Chromium by default.

View File

@@ -20,9 +20,6 @@
# If you choose to use this type of deployment make sure to
# create you own docker environment file (docker/.env) with your own
# unique random secure passwords and SECRET_KEY.
#
# For verbose logging during development:
# - Set SUPERSET_LOG_LEVEL=debug in docker/.env-local for detailed Superset logs
# -----------------------------------------------------------------------
x-superset-image: &superset-image apachesuperset.docker.scarf.sh/apache/superset:${TAG:-latest-dev}
x-superset-volumes:

View File

@@ -20,9 +20,6 @@
# If you choose to use this type of deployment make sure to
# create you own docker environment file (docker/.env) with your own
# unique random secure passwords and SECRET_KEY.
#
# For verbose logging during development:
# - Set SUPERSET_LOG_LEVEL=debug in docker/.env-local for detailed Superset logs
# -----------------------------------------------------------------------
x-superset-volumes:
&superset-volumes # /app/pythonpath_docker will be appended to the PYTHONPATH in the final container

View File

@@ -20,9 +20,6 @@
# If you choose to use this type of deployment make sure to
# create you own docker environment file (docker/.env) with your own
# unique random secure passwords and SECRET_KEY.
#
# For verbose logging during development:
# - Set SUPERSET_LOG_LEVEL=debug in docker/.env-local for detailed Superset logs
# -----------------------------------------------------------------------
x-superset-user: &superset-user root
x-superset-volumes: &superset-volumes

View File

@@ -53,12 +53,7 @@ PYTHONPATH=/app/pythonpath:/app/docker/pythonpath_dev
REDIS_HOST=redis
REDIS_PORT=6379
# Development and logging configuration
# FLASK_DEBUG: Enables Flask dev features (auto-reload, better error pages) - keep 'true' for development
FLASK_DEBUG=true
# SUPERSET_LOG_LEVEL: Controls Superset application logging verbosity (debug, info, warning, error, critical)
SUPERSET_LOG_LEVEL=info
SUPERSET_APP_ROOT="/"
SUPERSET_ENV=development
SUPERSET_LOAD_EXAMPLES=yes
@@ -71,3 +66,4 @@ SUPERSET_SECRET_KEY=TEST_NON_DEV_SECRET
ENABLE_PLAYWRIGHT=false
PUPPETEER_SKIP_CHROMIUM_DOWNLOAD=true
BUILD_SUPERSET_FRONTEND_IN_DOCKER=true
SUPERSET_LOG_LEVEL=info

View File

@@ -20,5 +20,4 @@
# DON'T ignore the .gitignore
!.gitignore
!superset_config.py
!superset_config_docker_light.py
!superset_config_local.example

View File

@@ -14,6 +14,7 @@
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.
# mypy: disable-error-code="assignment,misc"
#
# This file is included in the final Docker image and SHOULD be overridden when
# deploying the image to prod. Settings configured here are intended for use in local

View File

@@ -120,6 +120,78 @@ docker volume rm superset_db_home
docker-compose up
```
## GitHub Codespaces (Cloud Development)
GitHub Codespaces provides a complete, pre-configured development environment in the cloud. This is ideal for:
- Quick contributions without local setup
- Consistent development environments across team members
- Working from devices that can't run Docker locally
- Safe experimentation in isolated environments
:::info
We're grateful to GitHub for providing this excellent cloud development service that makes
contributing to Apache Superset more accessible to developers worldwide.
:::
### Getting Started with Codespaces
1. **Create a Codespace**: Use this pre-configured link that sets up everything you need:
[**Launch Superset Codespace →**](https://github.com/codespaces/new?skip_quickstart=true&machine=standardLinux32gb&repo=39464018&ref=codespaces&geo=UsWest&devcontainer_path=.devcontainer%2Fdevcontainer.json)
:::caution
**Important**: You must select at least the **4 CPU / 16GB RAM** machine type (pre-selected in the link above).
Smaller instances will not have sufficient resources to run Superset effectively.
:::
2. **Wait for Setup**: The initial setup takes several minutes. The Codespace will:
- Build the development container
- Install all dependencies
- Start all required services (PostgreSQL, Redis, etc.)
- Initialize the database with example data
3. **Access Superset**: Once ready, check the **PORTS** tab in VS Code for port `9001`.
Click the globe icon to open Superset in your browser.
- Default credentials: `admin` / `admin`
### Key Features
- **Auto-reload**: Both Python and TypeScript files auto-refresh on save
- **Pre-installed Extensions**: VS Code extensions for Python, TypeScript, and database tools
- **Multiple Instances**: Run multiple Codespaces for different branches/features
- **SSH Access**: Connect via terminal using `gh cs ssh` or through the GitHub web UI
- **VS Code Integration**: Works seamlessly with VS Code desktop app
### Managing Codespaces
- **List active Codespaces**: `gh cs list`
- **SSH into a Codespace**: `gh cs ssh`
- **Stop a Codespace**: Via GitHub UI or `gh cs stop`
- **Delete a Codespace**: Via GitHub UI or `gh cs delete`
### Debugging and Logs
Since Codespaces uses `docker-compose-light.yml`, you can monitor all services:
```bash
# Stream logs from all services
docker compose -f docker-compose-light.yml logs -f
# Stream logs from a specific service
docker compose -f docker-compose-light.yml logs -f superset
# View last 100 lines and follow
docker compose -f docker-compose-light.yml logs --tail=100 -f
# List all running services
docker compose -f docker-compose-light.yml ps
```
:::tip
Codespaces automatically stop after 30 minutes of inactivity to save resources.
Your work is preserved and you can restart anytime.
:::
## Installing Development Tools
:::note
@@ -276,7 +348,7 @@ superset init
# Load some data to play with.
# Note: you MUST have previously created an admin user with the username `admin` for this command to work.
superset load-examples
superset examples load
# Start the Flask dev web server from inside your virtualenv.
# Note that your page may not have CSS at this point.

View File

@@ -26,14 +26,11 @@ Superset locally is using Docker Compose on a Linux or Mac OSX
computer. Superset does not have official support for Windows. It's also the easiest
way to launch a fully functioning **development environment** quickly.
Note that there are 4 major ways we support to run `docker compose`:
Note that there are 3 major ways we support to run `docker compose`:
1. **docker-compose.yml:** for interactive development, where we mount your local folder with the
frontend/backend files that you can edit and experience the changes you
make in the app in real time
1. **docker-compose-light.yml:** a lightweight configuration with minimal services (database,
Superset app, and frontend dev server) for development. Uses in-memory caching instead of Redis
and is designed for running multiple instances simultaneously
1. **docker-compose-non-dev.yml** where we just build a more immutable image based on the
local branch and get all the required images running. Changes in the local branch
at the time you fire this up will be reflected, but changes to the code
@@ -47,7 +44,7 @@ Note that there are 4 major ways we support to run `docker compose`:
The `dev` builds include the `psycopg2-binary` required to connect
to the Postgres database launched as part of the `docker compose` builds.
More on these approaches after setting up the requirements for either.
More on these two approaches after setting up the requirements for either.
## Requirements
@@ -106,36 +103,13 @@ and help you start fresh. In the context of `docker compose` setting
from within docker. This will slow down the startup, but will fix various npm-related issues.
:::
### Option #2 - lightweight development with multiple instances
For a lighter development setup that uses fewer resources and supports running multiple instances:
```bash
# Single lightweight instance (default port 9001)
docker compose -f docker-compose-light.yml up
# Multiple instances with different ports
NODE_PORT=9001 docker compose -p superset-1 -f docker-compose-light.yml up
NODE_PORT=9002 docker compose -p superset-2 -f docker-compose-light.yml up
NODE_PORT=9003 docker compose -p superset-3 -f docker-compose-light.yml up
```
This configuration includes:
- PostgreSQL database (internal network only)
- Superset application server
- Frontend development server with webpack hot reloading
- In-memory caching (no Redis)
- Isolated volumes and networks per instance
Access each instance at `http://localhost:{NODE_PORT}` (e.g., `http://localhost:9001`).
### Option #3 - build a set of immutable images from the local branch
### Option #2 - build a set of immutable images from the local branch
```bash
docker compose -f docker-compose-non-dev.yml up
```
### Option #4 - boot up an official release
### Option #3 - boot up an official release
```bash
# Set the version you want to run

View File

@@ -151,7 +151,7 @@ Finish installing by running through the following commands:
superset fab create-admin
# Load some data to play with
superset load_examples
superset examples load
# Create default roles and permissions
superset init

View File

@@ -111,7 +111,7 @@ athena = ["pyathena[pandas]>=2, <3"]
aurora-data-api = ["preset-sqlalchemy-aurora-data-api>=0.2.8,<0.3"]
bigquery = [
"pandas-gbq>=0.19.1",
"sqlalchemy-bigquery>=1.6.1",
"sqlalchemy-bigquery>=1.15.0",
"google-cloud-bigquery>=3.10.0",
]
clickhouse = ["clickhouse-connect>=0.5.14, <1.0"]

View File

@@ -795,7 +795,7 @@ sqlalchemy==1.4.54
# shillelagh
# sqlalchemy-bigquery
# sqlalchemy-utils
sqlalchemy-bigquery==1.12.0
sqlalchemy-bigquery==1.15.0
# via apache-superset
sqlalchemy-utils==0.38.3
# via

View File

@@ -33,4 +33,4 @@ superset load-test-users
echo "Running tests"
pytest --durations-min=2 --maxfail=1 --cov-report= --cov=superset ./tests/integration_tests "$@"
pytest --durations-min=2 --cov-report= --cov=superset ./tests/integration_tests "$@"

View File

@@ -53,8 +53,8 @@
"@visx/scale": "^3.5.0",
"@visx/tooltip": "^3.0.0",
"@visx/xychart": "^3.5.1",
"ag-grid-community": "33.1.1",
"ag-grid-react": "33.1.1",
"ag-grid-community": "^34.0.2",
"ag-grid-react": "34.0.2",
"antd": "^5.24.6",
"chrono-node": "^2.7.8",
"classnames": "^2.2.5",
@@ -18747,27 +18747,27 @@
}
},
"node_modules/ag-charts-types": {
"version": "11.1.1",
"resolved": "https://registry.npmjs.org/ag-charts-types/-/ag-charts-types-11.1.1.tgz",
"integrity": "sha512-bRmUcf5VVhEEekhX8Vk0NSwa8Te8YM/zchjyYKR2CX4vDYiwoohM1Jg9RFvbIhVbLC1S6QrPEbx5v2C6RDfpSA==",
"version": "12.0.2",
"resolved": "https://registry.npmjs.org/ag-charts-types/-/ag-charts-types-12.0.2.tgz",
"integrity": "sha512-AWM1Y+XW+9VMmV3AbzdVEnreh/I2C9Pmqpc2iLmtId3Xbvmv7O56DqnuDb9EXjK5uPxmyUerTP+utL13UGcztw==",
"license": "MIT"
},
"node_modules/ag-grid-community": {
"version": "33.1.1",
"resolved": "https://registry.npmjs.org/ag-grid-community/-/ag-grid-community-33.1.1.tgz",
"integrity": "sha512-CNubIro0ipj4nfQ5WJPG9Isp7UI6MMDvNzrPdHNf3W+IoM8Uv3RUhjEn7xQqpQHuu6o/tMjrqpacipMUkhzqnw==",
"version": "34.0.2",
"resolved": "https://registry.npmjs.org/ag-grid-community/-/ag-grid-community-34.0.2.tgz",
"integrity": "sha512-hVJp5vrmwHRB10YjfSOVni5YJkO/v+asLjT72S4YnIFSx8lAgyPmByNJgtojk1aJ5h6Up93jTEmGDJeuKiWWLA==",
"license": "MIT",
"dependencies": {
"ag-charts-types": "11.1.1"
"ag-charts-types": "12.0.2"
}
},
"node_modules/ag-grid-react": {
"version": "33.1.1",
"resolved": "https://registry.npmjs.org/ag-grid-react/-/ag-grid-react-33.1.1.tgz",
"integrity": "sha512-xJ+t2gpqUUwpFqAeDvKz/GLVR4unkOghfQBr8iIY9RAdGFarYFClJavsOa8XPVVUqEB9OIuPVFnOdtocbX0jeA==",
"version": "34.0.2",
"resolved": "https://registry.npmjs.org/ag-grid-react/-/ag-grid-react-34.0.2.tgz",
"integrity": "sha512-1KBXkTvwtZiYVlSuDzBkiqfHjZgsATOmpLZdAtdmsCSOOOEWai0F9zHHgBuHfyciAE4nrbQWfojkx8IdnwsKFw==",
"license": "MIT",
"dependencies": {
"ag-grid-community": "33.1.1",
"ag-grid-community": "34.0.2",
"prop-types": "^15.8.1"
},
"peerDependencies": {
@@ -61168,8 +61168,8 @@
"@react-icons/all-files": "^4.1.0",
"@types/d3-array": "^2.9.0",
"@types/react-table": "^7.7.20",
"ag-grid-community": "^33.1.1",
"ag-grid-react": "^33.1.1",
"ag-grid-community": "^34.0.2",
"ag-grid-react": "^34.0.2",
"classnames": "^2.5.1",
"d3-array": "^2.4.0",
"lodash": "^4.17.21",
@@ -61205,8 +61205,10 @@
},
"peerDependencies": {
"@ant-design/icons": "^5.2.6",
"@reduxjs/toolkit": "*",
"@superset-ui/chart-controls": "*",
"@superset-ui/core": "*",
"@types/react-redux": "*",
"geostyler": "^14.1.3",
"geostyler-data": "^1.0.0",
"geostyler-openlayers-parser": "^4.0.0",

View File

@@ -121,8 +121,8 @@
"@visx/scale": "^3.5.0",
"@visx/tooltip": "^3.0.0",
"@visx/xychart": "^3.5.1",
"ag-grid-community": "33.1.1",
"ag-grid-react": "33.1.1",
"ag-grid-community": "^34.0.2",
"ag-grid-react": "34.0.2",
"antd": "^5.24.6",
"chrono-node": "^2.7.8",
"classnames": "^2.2.5",

View File

@@ -41,6 +41,53 @@ import {
import { checkColumnType } from '../utils/checkColumnType';
import { isSortable } from '../utils/isSortable';
// Aggregation choices with computation methods for plugins and controls
export const aggregationChoices = {
raw: {
label: 'Overall value',
compute: (data: number[]) => {
if (!data.length) return null;
return data[0];
},
},
LAST_VALUE: {
label: 'Last Value',
compute: (data: number[]) => {
if (!data.length) return null;
return data[0];
},
},
sum: {
label: 'Total (Sum)',
compute: (data: number[]) =>
data.length ? data.reduce((a, b) => a + b, 0) : null,
},
mean: {
label: 'Average (Mean)',
compute: (data: number[]) =>
data.length ? data.reduce((a, b) => a + b, 0) / data.length : null,
},
min: {
label: 'Minimum',
compute: (data: number[]) => (data.length ? Math.min(...data) : null),
},
max: {
label: 'Maximum',
compute: (data: number[]) => (data.length ? Math.max(...data) : null),
},
median: {
label: 'Median',
compute: (data: number[]) => {
if (!data.length) return null;
const sorted = [...data].sort((a, b) => a - b);
const mid = Math.floor(sorted.length / 2);
return sorted.length % 2 === 0
? (sorted[mid - 1] + sorted[mid]) / 2
: sorted[mid];
},
},
} as const;
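The `compute` callbacks above are plain functions over `number[]`, so they can be exercised in isolation. A minimal standalone mirror of the median branch (same shape as the code above, not imported from the plugin):

```typescript
// Standalone mirror of the median `compute` shown above: sort a copy,
// then take the middle element (odd length) or average the two middle
// elements (even length); empty input yields null.
const median = (data: number[]): number | null => {
  if (!data.length) return null;
  const sorted = [...data].sort((a, b) => a - b);
  const mid = Math.floor(sorted.length / 2);
  return sorted.length % 2 === 0
    ? (sorted[mid - 1] + sorted[mid]) / 2
    : sorted[mid];
};

console.log(median([3, 1, 2])); // → 2, the middle of sorted [1, 2, 3]
```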
export const contributionModeControl = {
name: 'contributionMode',
config: {
@@ -69,17 +116,12 @@ export const aggregationControl = {
default: 'LAST_VALUE',
clearable: false,
renderTrigger: false,
choices: [
['raw', t('None')],
['LAST_VALUE', t('Last Value')],
['sum', t('Total (Sum)')],
['mean', t('Average (Mean)')],
['min', t('Minimum')],
['max', t('Maximum')],
['median', t('Median')],
],
choices: Object.entries(aggregationChoices).map(([value, { label }]) => [
value,
t(label),
]),
description: t(
'Aggregation method used to compute the Big Number from the Trendline.For non-additive metrics like ratios, averages, distinct counts, etc use NONE.',
'Method to compute the displayed value. "Overall value" calculates a single metric across the entire filtered time period, ideal for non-additive metrics like ratios, averages, or distinct counts. Other methods operate over the time series data points.',
),
provideFormDataToProps: true,
mapStateToProps: ({ form_data }: ControlPanelState) => ({

View File

@@ -17,7 +17,7 @@
* under the License.
*/
import { render } from '@superset-ui/core/spec';
import TelemetryPixel from '.';
import { TelemetryPixel } from '.';
const OLD_ENV = process.env;

View File

@@ -39,7 +39,7 @@ interface TelemetryPixelProps {
const PIXEL_ID = '0d3461e1-abb1-4691-a0aa-5ed50de66af0';
const TelemetryPixel = ({
export const TelemetryPixel = ({
version = 'unknownVersion',
sha = 'unknownSHA',
build = 'unknownBuild',
@@ -56,4 +56,3 @@ const TelemetryPixel = ({
/>
);
};
export default TelemetryPixel;

View File

@@ -1,116 +0,0 @@
/**
* Licensed to the Apache Software Foundation (ASF) under one
* or more contributor license agreements. See the NOTICE file
* distributed with this work for additional information
* regarding copyright ownership. The ASF licenses this file
* to you under the Apache License, Version 2.0 (the
* "License"); you may not use this file except in compliance
* with the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the License is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/
import { Dropdown, Icons } from '@superset-ui/core/components';
import type { MenuItem } from '@superset-ui/core/components/Menu';
import { t, useTheme } from '@superset-ui/core';
import { ThemeAlgorithm, ThemeMode } from '../../theme/types';
export interface ThemeSelectProps {
setThemeMode: (newMode: ThemeMode) => void;
tooltipTitle?: string;
themeMode: ThemeMode;
hasLocalOverride?: boolean;
onClearLocalSettings?: () => void;
allowOSPreference?: boolean;
}
const ThemeSelect: React.FC<ThemeSelectProps> = ({
setThemeMode,
tooltipTitle = 'Select theme',
themeMode,
hasLocalOverride = false,
onClearLocalSettings,
allowOSPreference = true,
}) => {
const theme = useTheme();
const handleSelect = (mode: ThemeMode) => {
setThemeMode(mode);
};
const themeIconMap: Record<ThemeAlgorithm | ThemeMode, React.ReactNode> = {
[ThemeAlgorithm.DEFAULT]: <Icons.SunOutlined />,
[ThemeAlgorithm.DARK]: <Icons.MoonOutlined />,
[ThemeMode.SYSTEM]: <Icons.FormatPainterOutlined />,
[ThemeAlgorithm.COMPACT]: <Icons.CompressOutlined />,
};
// Use different icon when local theme is active
const triggerIcon = hasLocalOverride ? (
<Icons.FormatPainterOutlined style={{ color: theme.colorErrorText }} />
) : (
themeIconMap[themeMode] || <Icons.FormatPainterOutlined />
);
const menuItems: MenuItem[] = [
{
type: 'group',
label: t('Theme'),
},
{
key: ThemeMode.DEFAULT,
label: t('Light'),
icon: <Icons.SunOutlined />,
onClick: () => handleSelect(ThemeMode.DEFAULT),
},
{
key: ThemeMode.DARK,
label: t('Dark'),
icon: <Icons.MoonOutlined />,
onClick: () => handleSelect(ThemeMode.DARK),
},
...(allowOSPreference
? [
{
key: ThemeMode.SYSTEM,
label: t('Match system'),
icon: <Icons.FormatPainterOutlined />,
onClick: () => handleSelect(ThemeMode.SYSTEM),
},
]
: []),
];
// Add clear settings option only when there's a local theme active
if (onClearLocalSettings && hasLocalOverride) {
menuItems.push(
{ type: 'divider' } as MenuItem,
{
key: 'clear-local',
label: t('Clear local theme'),
icon: <Icons.ClearOutlined />,
onClick: onClearLocalSettings,
} as MenuItem,
);
}
return (
<Dropdown
menu={{
items: menuItems,
selectedKeys: [themeMode],
}}
trigger={['hover']}
>
{triggerIcon}
</Dropdown>
);
};
export default ThemeSelect;

View File

@@ -0,0 +1,273 @@
/**
* Licensed to the Apache Software Foundation (ASF) under one
* or more contributor license agreements. See the NOTICE file
* distributed with this work for additional information
* regarding copyright ownership. The ASF licenses this file
* to you under the Apache License, Version 2.0 (the
* "License"); you may not use this file except in compliance
* with the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the License is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/
import {
render,
screen,
userEvent,
waitFor,
within,
} from '@superset-ui/core/spec';
import { ThemeMode } from '@superset-ui/core';
import { Menu } from '@superset-ui/core/components';
import { ThemeSubMenu } from '.';
// Mock the translation function
jest.mock('@superset-ui/core', () => ({
...jest.requireActual('@superset-ui/core'),
t: (key: string) => key,
}));
describe('ThemeSubMenu', () => {
const defaultProps = {
allowOSPreference: true,
setThemeMode: jest.fn(),
themeMode: ThemeMode.DEFAULT,
hasLocalOverride: false,
onClearLocalSettings: jest.fn(),
};
const renderThemeSubMenu = (props = defaultProps) =>
render(
<Menu>
<ThemeSubMenu {...props} />
</Menu>,
);
const findMenuWithText = async (text: string) => {
await waitFor(() => {
const found = screen
.getAllByRole('menu')
.some(m => within(m).queryByText(text));
if (!found) throw new Error(`Menu with text "${text}" not yet rendered`);
});
return screen.getAllByRole('menu').find(m => within(m).queryByText(text))!;
};
beforeEach(() => {
jest.clearAllMocks();
});
it('renders Light and Dark theme options by default', async () => {
renderThemeSubMenu();
userEvent.hover(await screen.findByRole('menuitem'));
const menu = await findMenuWithText('Light');
expect(within(menu).getByText('Light')).toBeInTheDocument();
expect(within(menu).getByText('Dark')).toBeInTheDocument();
});
it('does not render Match system option when allowOSPreference is false', async () => {
renderThemeSubMenu({ ...defaultProps, allowOSPreference: false });
userEvent.hover(await screen.findByRole('menuitem'));
await waitFor(() => {
expect(screen.queryByText('Match system')).not.toBeInTheDocument();
});
});
it('renders with allowOSPreference as true by default', async () => {
renderThemeSubMenu();
userEvent.hover(await screen.findByRole('menuitem'));
const menu = await findMenuWithText('Match system');
expect(within(menu).getByText('Match system')).toBeInTheDocument();
});
it('renders clear option when both hasLocalOverride and onClearLocalSettings are provided', async () => {
const mockClear = jest.fn();
renderThemeSubMenu({
...defaultProps,
hasLocalOverride: true,
onClearLocalSettings: mockClear,
});
userEvent.hover(await screen.findByRole('menuitem'));
const menu = await findMenuWithText('Clear local theme');
expect(within(menu).getByText('Clear local theme')).toBeInTheDocument();
});
it('does not render clear option when hasLocalOverride is false', async () => {
const mockClear = jest.fn();
renderThemeSubMenu({
...defaultProps,
hasLocalOverride: false,
onClearLocalSettings: mockClear,
});
userEvent.hover(await screen.findByRole('menuitem'));
await waitFor(() => {
expect(screen.queryByText('Clear local theme')).not.toBeInTheDocument();
});
});
it('calls setThemeMode with DEFAULT when Light is clicked', async () => {
const mockSet = jest.fn();
renderThemeSubMenu({ ...defaultProps, setThemeMode: mockSet });
userEvent.hover(await screen.findByRole('menuitem'));
const menu = await findMenuWithText('Light');
userEvent.click(within(menu).getByText('Light'));
expect(mockSet).toHaveBeenCalledWith(ThemeMode.DEFAULT);
});
it('calls setThemeMode with DARK when Dark is clicked', async () => {
const mockSet = jest.fn();
renderThemeSubMenu({ ...defaultProps, setThemeMode: mockSet });
userEvent.hover(await screen.findByRole('menuitem'));
const menu = await findMenuWithText('Dark');
userEvent.click(within(menu).getByText('Dark'));
expect(mockSet).toHaveBeenCalledWith(ThemeMode.DARK);
});
it('calls setThemeMode with SYSTEM when Match system is clicked', async () => {
const mockSet = jest.fn();
renderThemeSubMenu({ ...defaultProps, setThemeMode: mockSet });
userEvent.hover(await screen.findByRole('menuitem'));
const menu = await findMenuWithText('Match system');
userEvent.click(within(menu).getByText('Match system'));
expect(mockSet).toHaveBeenCalledWith(ThemeMode.SYSTEM);
});
it('calls onClearLocalSettings when Clear local theme is clicked', async () => {
const mockClear = jest.fn();
renderThemeSubMenu({
...defaultProps,
hasLocalOverride: true,
onClearLocalSettings: mockClear,
});
userEvent.hover(await screen.findByRole('menuitem'));
const menu = await findMenuWithText('Clear local theme');
userEvent.click(within(menu).getByText('Clear local theme'));
expect(mockClear).toHaveBeenCalledTimes(1);
});
it('displays sun icon for DEFAULT theme', () => {
renderThemeSubMenu({ ...defaultProps, themeMode: ThemeMode.DEFAULT });
expect(screen.getByTestId('sun')).toBeInTheDocument();
});
it('displays moon icon for DARK theme', () => {
renderThemeSubMenu({ ...defaultProps, themeMode: ThemeMode.DARK });
expect(screen.getByTestId('moon')).toBeInTheDocument();
});
it('displays format-painter icon for SYSTEM theme', () => {
renderThemeSubMenu({ ...defaultProps, themeMode: ThemeMode.SYSTEM });
expect(screen.getByTestId('format-painter')).toBeInTheDocument();
});
it('displays override icon when hasLocalOverride is true', () => {
renderThemeSubMenu({ ...defaultProps, hasLocalOverride: true });
expect(screen.getByTestId('format-painter')).toBeInTheDocument();
});
it('renders Theme group header', async () => {
renderThemeSubMenu();
userEvent.hover(await screen.findByRole('menuitem'));
const menu = await findMenuWithText('Theme');
expect(within(menu).getByText('Theme')).toBeInTheDocument();
});
it('renders sun icon for Light theme option', async () => {
renderThemeSubMenu();
userEvent.hover(await screen.findByRole('menuitem'));
const menu = await findMenuWithText('Light');
const lightOption = within(menu).getByText('Light').closest('li');
expect(within(lightOption!).getByTestId('sun')).toBeInTheDocument();
});
it('renders moon icon for Dark theme option', async () => {
renderThemeSubMenu();
userEvent.hover(await screen.findByRole('menuitem'));
const menu = await findMenuWithText('Dark');
const darkOption = within(menu).getByText('Dark').closest('li');
expect(within(darkOption!).getByTestId('moon')).toBeInTheDocument();
});
it('renders format-painter icon for Match system option', async () => {
renderThemeSubMenu({ ...defaultProps, allowOSPreference: true });
userEvent.hover(await screen.findByRole('menuitem'));
const menu = await findMenuWithText('Match system');
const matchOption = within(menu).getByText('Match system').closest('li');
expect(
within(matchOption!).getByTestId('format-painter'),
).toBeInTheDocument();
});
it('renders clear icon for Clear local theme option', async () => {
renderThemeSubMenu({
...defaultProps,
hasLocalOverride: true,
onClearLocalSettings: jest.fn(),
});
userEvent.hover(await screen.findByRole('menuitem'));
const menu = await findMenuWithText('Clear local theme');
const clearOption = within(menu)
.getByText('Clear local theme')
.closest('li');
expect(within(clearOption!).getByTestId('clear')).toBeInTheDocument();
});
it('renders divider before clear option when clear option is present', async () => {
renderThemeSubMenu({
...defaultProps,
hasLocalOverride: true,
onClearLocalSettings: jest.fn(),
});
userEvent.hover(await screen.findByRole('menuitem'));
const menu = await findMenuWithText('Clear local theme');
const divider = within(menu).queryByRole('separator');
expect(divider).toBeInTheDocument();
});
it('does not render divider when clear option is not present', async () => {
renderThemeSubMenu({ ...defaultProps });
userEvent.hover(await screen.findByRole('menuitem'));
const divider = document.querySelector('.ant-menu-item-divider');
expect(divider).toBeNull();
});
});

View File

@@ -0,0 +1,170 @@
/**
* Licensed to the Apache Software Foundation (ASF) under one
* or more contributor license agreements. See the NOTICE file
* distributed with this work for additional information
* regarding copyright ownership. The ASF licenses this file
* to you under the Apache License, Version 2.0 (the
* "License"); you may not use this file except in compliance
* with the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the License is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/
import { useMemo } from 'react';
import { Icons, Menu } from '@superset-ui/core/components';
import {
css,
styled,
t,
ThemeMode,
useTheme,
ThemeAlgorithm,
} from '@superset-ui/core';
const StyledThemeSubMenu = styled(Menu.SubMenu)`
${({ theme }) => css`
[data-icon='caret-down'] {
color: ${theme.colorIcon};
font-size: ${theme.fontSizeXS}px;
margin-left: ${theme.sizeUnit}px;
}
&.ant-menu-submenu-active {
.ant-menu-title-content {
color: ${theme.colorPrimary};
}
}
`}
`;
const StyledThemeSubMenuItem = styled(Menu.Item)<{ selected: boolean }>`
${({ theme, selected }) => css`
&:hover {
color: ${theme.colorPrimary} !important;
cursor: pointer !important;
}
${selected &&
css`
background-color: ${theme.colors.primary.light4} !important;
color: ${theme.colors.primary.dark1} !important;
`}
`}
`;
export interface ThemeSubMenuOption {
key: ThemeMode;
label: string;
icon: React.ReactNode;
onClick: () => void;
}
export interface ThemeSubMenuProps {
setThemeMode: (newMode: ThemeMode) => void;
themeMode: ThemeMode;
hasLocalOverride?: boolean;
onClearLocalSettings?: () => void;
allowOSPreference?: boolean;
}
export const ThemeSubMenu: React.FC<ThemeSubMenuProps> = ({
setThemeMode,
themeMode,
hasLocalOverride = false,
onClearLocalSettings,
allowOSPreference = true,
}: ThemeSubMenuProps) => {
const theme = useTheme();
const handleSelect = (mode: ThemeMode) => {
setThemeMode(mode);
};
const themeIconMap: Record<ThemeAlgorithm | ThemeMode, React.ReactNode> =
useMemo(
() => ({
[ThemeAlgorithm.DEFAULT]: <Icons.SunOutlined />,
[ThemeAlgorithm.DARK]: <Icons.MoonOutlined />,
[ThemeMode.SYSTEM]: <Icons.FormatPainterOutlined />,
[ThemeAlgorithm.COMPACT]: <Icons.CompressOutlined />,
}),
[],
);
const selectedThemeModeIcon = useMemo(
() =>
hasLocalOverride ? (
<Icons.FormatPainterOutlined
style={{ color: theme.colors.error.base }}
/>
) : (
themeIconMap[themeMode]
),
[hasLocalOverride, theme.colors.error.base, themeIconMap, themeMode],
);
const themeOptions: ThemeSubMenuOption[] = [
{
key: ThemeMode.DEFAULT,
label: t('Light'),
icon: <Icons.SunOutlined />,
onClick: () => handleSelect(ThemeMode.DEFAULT),
},
{
key: ThemeMode.DARK,
label: t('Dark'),
icon: <Icons.MoonOutlined />,
onClick: () => handleSelect(ThemeMode.DARK),
},
...(allowOSPreference
? [
{
key: ThemeMode.SYSTEM,
label: t('Match system'),
icon: <Icons.FormatPainterOutlined />,
onClick: () => handleSelect(ThemeMode.SYSTEM),
},
]
: []),
];
// Add clear settings option only when there's a local theme active
const clearOption =
onClearLocalSettings && hasLocalOverride
? {
key: 'clear-local',
label: t('Clear local theme'),
icon: <Icons.ClearOutlined />,
onClick: onClearLocalSettings,
}
: null;
return (
<StyledThemeSubMenu
key="theme-sub-menu"
title={selectedThemeModeIcon}
icon={<Icons.CaretDownOutlined iconSize="xs" />}
>
<Menu.ItemGroup title={t('Theme')} />
{themeOptions.map(option => (
<StyledThemeSubMenuItem
key={option.key}
onClick={option.onClick}
selected={option.key === themeMode}
>
{option.icon} {option.label}
</StyledThemeSubMenuItem>
))}
{clearOption && [
<Menu.Divider key="theme-divider" />,
<Menu.Item key={clearOption.key} onClick={clearOption.onClick}>
{clearOption.icon} {clearOption.label}
</Menu.Item>,
]}
</StyledThemeSubMenu>
);
};

View File

@@ -16,14 +16,8 @@
* specific language governing permissions and limitations
* under the License.
*/
import { t, css, useTheme } from '@superset-ui/core';
import {
Icons,
Modal,
Typography,
Button,
Flex,
} from '@superset-ui/core/components';
import { t } from '@superset-ui/core';
import { Icons, Modal, Typography, Button } from '@superset-ui/core/components';
import type { FC, ReactElement } from 'react';
export type UnsavedChangesModalProps = {
@@ -42,66 +36,30 @@ export const UnsavedChangesModal: FC<UnsavedChangesModalProps> = ({
onConfirmNavigation,
title = 'Unsaved Changes',
body = "If you don't save, changes will be lost.",
}): ReactElement => {
const theme = useTheme();
return (
<Modal
name={title}
centered
responsive
onHide={onHide}
show={showModal}
width="444px"
title={
<Flex>
<Icons.WarningOutlined
iconColor={theme.colorWarning}
css={css`
margin-right: ${theme.sizeUnit * 2}px;
`}
iconSize="l"
/>
<Typography.Title
css={css`
&& {
margin: 0;
margin-bottom: 0;
}
`}
level={5}
>
{title}
</Typography.Title>
</Flex>
}
footer={
<Flex
justify="flex-end"
css={css`
width: 100%;
`}
>
<Button
htmlType="button"
buttonSize="small"
buttonStyle="secondary"
onClick={onConfirmNavigation}
>
{t('Discard')}
</Button>
<Button
htmlType="button"
buttonSize="small"
buttonStyle="primary"
onClick={handleSave}
>
{t('Save')}
</Button>
</Flex>
}
>
<Typography.Text>{body}</Typography.Text>
</Modal>
);
};
}: UnsavedChangesModalProps): ReactElement => (
<Modal
centered
responsive
onHide={onHide}
show={showModal}
width="444px"
title={
<>
<Icons.WarningOutlined iconSize="m" style={{ marginRight: 8 }} />
{title}
</>
}
footer={
<>
<Button buttonStyle="secondary" onClick={onConfirmNavigation}>
{t('Discard')}
</Button>
<Button buttonStyle="primary" onClick={handleSave}>
{t('Save')}
</Button>
</>
}
>
<Typography.Text>{body}</Typography.Text>
</Modal>
);

View File

@@ -164,6 +164,8 @@ export * from './Steps';
export * from './Table';
export * from './TableView';
export * from './Tag';
export * from './TelemetryPixel';
export * from './ThemeSubMenu';
export * from './UnsavedChangesModal';
export * from './constants';
export * from './Result';

View File

@@ -26,7 +26,9 @@ import {
type ThemeStorage,
type ThemeControllerOptions,
type ThemeContextType,
type SupersetThemeConfig,
ThemeAlgorithm,
ThemeMode,
} from './types';
export {
@@ -66,7 +68,16 @@ const themeObject: Theme = Theme.fromConfig({
const { theme } = themeObject;
const supersetTheme = theme;
export { Theme, themeObject, styled, theme, supersetTheme };
export {
Theme,
ThemeAlgorithm,
ThemeMode,
themeObject,
styled,
theme,
supersetTheme,
};
export type {
SupersetTheme,
SerializableThemeConfig,
@@ -74,6 +85,7 @@ export type {
ThemeStorage,
ThemeControllerOptions,
ThemeContextType,
SupersetThemeConfig,
};
// Export theme utility functions

View File

@@ -429,3 +429,16 @@ export interface ThemeContextType {
canDetectOSPreference: () => boolean;
createDashboardThemeProvider: (themeId: string) => Promise<Theme | null>;
}
/**
* Configuration object for complete theme setup including default, dark themes and settings
*/
export interface SupersetThemeConfig {
theme_default: AnyThemeConfig;
theme_dark?: AnyThemeConfig;
theme_settings?: {
enforced?: boolean;
allowSwitching?: boolean;
allowOSPreference?: boolean;
};
}

View File
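The SupersetThemeConfig interface above bundles a default theme, an optional dark variant, and switching settings into one object. A minimal config literal might look like the following sketch; note that AnyThemeConfig is simplified here to a bare token map for illustration (the real type comes from @superset-ui/core and carries the full Ant Design theme shape):

```typescript
// Simplified stand-in for illustration; not the real AnyThemeConfig.
type AnyThemeConfig = { token?: Record<string, string | number> };

interface SupersetThemeConfig {
  theme_default: AnyThemeConfig;
  theme_dark?: AnyThemeConfig;
  theme_settings?: {
    enforced?: boolean;
    allowSwitching?: boolean;
    allowOSPreference?: boolean;
  };
}

// A hypothetical deployment config: a branded light theme, a dark
// variant, and settings that allow switching without enforcing a theme.
const themeConfig: SupersetThemeConfig = {
  theme_default: { token: { colorPrimary: '#20a7c9' } },
  theme_dark: { token: { colorPrimary: '#20a7c9', colorBgBase: '#141414' } },
  theme_settings: {
    enforced: false,
    allowSwitching: true,
    allowOSPreference: true,
  },
};
```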

@@ -1385,7 +1385,7 @@ export default function (config) {
p[0] = p[0] - __.margin.left;
p[1] = p[1] - __.margin.top;
(dims = dimensionsForPoint(p)),
((dims = dimensionsForPoint(p)),
(strum = {
p1: p,
dims: dims,
@@ -1393,7 +1393,7 @@ export default function (config) {
maxX: xscale(dims.right),
minY: 0,
maxY: h(),
});
}));
strums[dims.i] = strum;
strums.active = dims.i;
@@ -1942,7 +1942,7 @@ export default function (config) {
p[0] = p[0] - __.margin.left;
p[1] = p[1] - __.margin.top;
(dims = dimensionsForPoint(p)),
((dims = dimensionsForPoint(p)),
(arc = {
p1: p,
dims: dims,
@@ -1953,7 +1953,7 @@ export default function (config) {
startAngle: undefined,
endAngle: undefined,
arc: d3.svg.arc().innerRadius(0),
});
}));
arcs[dims.i] = arc;
arcs.active = dims.i;

View File

@@ -27,8 +27,8 @@
"@react-icons/all-files": "^4.1.0",
"@types/d3-array": "^2.9.0",
"@types/react-table": "^7.7.20",
"ag-grid-community": "^33.1.1",
"ag-grid-react": "^33.1.1",
"ag-grid-community": "^34.0.2",
"ag-grid-react": "^34.0.2",
"classnames": "^2.5.1",
"d3-array": "^2.4.0",
"lodash": "^4.17.21",

View File

@@ -49,38 +49,53 @@ describe('BigNumberWithTrendline buildQuery', () => {
aggregation: null,
};
it('creates raw metric query when aggregation is null', () => {
const queryContext = buildQuery({ ...baseFormData });
it('creates raw metric query when aggregation is "raw"', () => {
const queryContext = buildQuery({ ...baseFormData, aggregation: 'raw' });
const bigNumberQuery = queryContext.queries[1];
expect(bigNumberQuery.post_processing).toEqual([{ operation: 'pivot' }]);
expect(bigNumberQuery.is_timeseries).toBe(true);
expect(bigNumberQuery.post_processing).toEqual([]);
expect(bigNumberQuery.is_timeseries).toBe(false);
expect(bigNumberQuery.columns).toEqual([]);
});
it('adds aggregation operator when aggregation is "sum"', () => {
it('returns single query for aggregation methods that can be computed client-side', () => {
const queryContext = buildQuery({ ...baseFormData, aggregation: 'sum' });
const bigNumberQuery = queryContext.queries[1];
expect(bigNumberQuery.post_processing).toEqual([
expect(queryContext.queries.length).toBe(1);
expect(queryContext.queries[0].post_processing).toEqual([
{ operation: 'pivot' },
{ operation: 'aggregation', options: { operator: 'sum' } },
{ operation: 'rolling' },
{ operation: 'resample' },
{ operation: 'flatten' },
]);
expect(bigNumberQuery.is_timeseries).toBe(true);
});
it('skips aggregation when aggregation is LAST_VALUE', () => {
it('returns single query for LAST_VALUE aggregation', () => {
const queryContext = buildQuery({
...baseFormData,
aggregation: 'LAST_VALUE',
});
const bigNumberQuery = queryContext.queries[1];
expect(bigNumberQuery.post_processing).toEqual([{ operation: 'pivot' }]);
expect(bigNumberQuery.is_timeseries).toBe(true);
expect(queryContext.queries.length).toBe(1);
expect(queryContext.queries[0].post_processing).toEqual([
{ operation: 'pivot' },
{ operation: 'rolling' },
{ operation: 'resample' },
{ operation: 'flatten' },
]);
});
it('always returns two queries', () => {
const queryContext = buildQuery({ ...baseFormData });
it('returns two queries only for raw aggregation', () => {
const queryContext = buildQuery({ ...baseFormData, aggregation: 'raw' });
expect(queryContext.queries.length).toBe(2);
const queryContextLastValue = buildQuery({
...baseFormData,
aggregation: 'LAST_VALUE',
});
expect(queryContextLastValue.queries.length).toBe(1);
const queryContextSum = buildQuery({ ...baseFormData, aggregation: 'sum' });
expect(queryContextSum.queries.length).toBe(1);
});
});

View File

@@ -39,28 +39,37 @@ export default function buildQuery(formData: QueryFormData) {
? ensureIsArray(getXAxisColumn(formData))
: [];
return buildQueryContext(formData, baseQueryObject => [
{
...baseQueryObject,
columns: [...timeColumn],
...(timeColumn.length ? {} : { is_timeseries: true }),
post_processing: [
pivotOperator(formData, baseQueryObject),
rollingWindowOperator(formData, baseQueryObject),
resampleOperator(formData, baseQueryObject),
flattenOperator(formData, baseQueryObject),
],
},
{
...baseQueryObject,
columns: [...(isRawMetric ? [] : timeColumn)],
is_timeseries: !isRawMetric,
post_processing: isRawMetric
? []
: [
pivotOperator(formData, baseQueryObject),
aggregationOperator(formData, baseQueryObject),
],
},
]);
return buildQueryContext(formData, baseQueryObject => {
const queries = [
{
...baseQueryObject,
columns: [...timeColumn],
...(timeColumn.length ? {} : { is_timeseries: true }),
post_processing: [
pivotOperator(formData, baseQueryObject),
rollingWindowOperator(formData, baseQueryObject),
resampleOperator(formData, baseQueryObject),
flattenOperator(formData, baseQueryObject),
].filter(Boolean),
},
];
// Only add second query for raw metrics which need different query structure
// All other aggregations (sum, mean, min, max, median, LAST_VALUE) can be computed client-side from trendline data
if (formData.aggregation === 'raw') {
queries.push({
...baseQueryObject,
columns: [...(isRawMetric ? [] : timeColumn)],
is_timeseries: !isRawMetric,
post_processing: isRawMetric
? []
: ([
pivotOperator(formData, baseQueryObject),
aggregationOperator(formData, baseQueryObject),
].filter(Boolean) as any[]),
});
}
return queries;
});
}

View File
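The branching in buildQuery above reduces to one rule: 'raw' is the only aggregation that needs a second, non-timeseries query, while every other method reuses the trendline series and aggregates client-side. A standalone sketch of that rule, with query objects simplified to plain records rather than the real Superset query types:

```typescript
type AggregationMethod =
  | 'raw' | 'LAST_VALUE' | 'sum' | 'mean' | 'min' | 'max' | 'median';

interface QuerySketch {
  is_timeseries: boolean;
  post_processing: string[];
}

// Mirrors the plugin's control flow: one trendline query always,
// plus a raw big-number query only when aggregation is 'raw'.
function buildQueriesSketch(aggregation: AggregationMethod): QuerySketch[] {
  const queries: QuerySketch[] = [
    {
      is_timeseries: true,
      post_processing: ['pivot', 'rolling', 'resample', 'flatten'],
    },
  ];
  if (aggregation === 'raw') {
    queries.push({ is_timeseries: false, post_processing: [] });
  }
  return queries;
}
```

This matches the expectations in the buildQuery tests above: two queries for 'raw', one for 'sum' and 'LAST_VALUE'.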

@@ -20,6 +20,41 @@ import { GenericDataType } from '@superset-ui/core';
import transformProps from './transformProps';
import { BigNumberWithTrendlineChartProps, BigNumberDatum } from '../types';
// Mock chart-controls to avoid styled-components issues in Jest
jest.mock('@superset-ui/chart-controls', () => ({
aggregationChoices: {
raw: {
label: 'Force server-side aggregation',
compute: (data: number[]) => data[0] ?? null,
},
LAST_VALUE: {
label: 'Last Value',
compute: (data: number[]) => data[0] ?? null,
},
sum: {
label: 'Total (Sum)',
compute: (data: number[]) => data.reduce((a, b) => a + b, 0),
},
mean: {
label: 'Average (Mean)',
compute: (data: number[]) =>
data.reduce((a, b) => a + b, 0) / data.length,
},
min: { label: 'Minimum', compute: (data: number[]) => Math.min(...data) },
max: { label: 'Maximum', compute: (data: number[]) => Math.max(...data) },
median: {
label: 'Median',
compute: (data: number[]) => {
const sorted = [...data].sort((a, b) => a - b);
const mid = Math.floor(sorted.length / 2);
return sorted.length % 2 === 0
? (sorted[mid - 1] + sorted[mid]) / 2
: sorted[mid];
},
},
},
}));
jest.mock('@superset-ui/core', () => ({
GenericDataType: { Temporal: 2, String: 1 },
extractTimegrain: jest.fn(() => 'P1D'),
@@ -218,7 +253,7 @@ describe('BigNumberWithTrendline transformProps', () => {
coltypes: ['NUMERIC'],
},
],
formData: { ...baseFormData, aggregation: 'SUM' },
formData: { ...baseFormData, aggregation: 'sum' },
rawFormData: baseRawFormData,
hooks: baseHooks,
datasource: baseDatasource,

View File

@@ -29,6 +29,7 @@ import {
tooltipHtml,
} from '@superset-ui/core';
import { EChartsCoreOption, graphic } from 'echarts/core';
import { aggregationChoices } from '@superset-ui/chart-controls';
import {
BigNumberVizProps,
BigNumberDatum,
@@ -43,6 +44,31 @@ const formatPercentChange = getNumberFormatter(
NumberFormats.PERCENT_SIGNED_1_POINT,
);
// Client-side aggregation function using shared aggregationChoices
function computeClientSideAggregation(
data: [number | null, number | null][],
aggregation: string | undefined | null,
): number | null {
if (!data.length) return null;
// Find the aggregation method, handling case variations
const methodKey = Object.keys(aggregationChoices).find(
key => key.toLowerCase() === (aggregation || '').toLowerCase(),
);
// Use the compute method from aggregationChoices, fallback to LAST_VALUE
const selectedMethod = methodKey
? aggregationChoices[methodKey as keyof typeof aggregationChoices]
: aggregationChoices.LAST_VALUE;
// Extract values from tuple array and filter out nulls
const values = data
.map(([, value]) => value)
.filter((v): v is number => v !== null);
return selectedMethod.compute(values);
}
export default function transformProps(
chartProps: BigNumberWithTrendlineChartProps,
): BigNumberVizProps {
@@ -126,27 +152,33 @@ export default function transformProps(
// sort in time descending order
.sort((a, b) => (a[0] !== null && b[0] !== null ? b[0] - a[0] : 0));
}
if (hasAggregatedData && aggregatedData) {
if (
aggregatedData[metricName] !== null &&
aggregatedData[metricName] !== undefined
) {
bigNumber = aggregatedData[metricName];
} else {
const metricKeys = Object.keys(aggregatedData).filter(
key =>
key !== xAxisLabel &&
aggregatedData[key] !== null &&
typeof aggregatedData[key] === 'number',
);
bigNumber = metricKeys.length > 0 ? aggregatedData[metricKeys[0]] : null;
}
timestamp = sortedData.length > 0 ? sortedData[0][0] : null;
} else if (sortedData.length > 0) {
bigNumber = sortedData[0][1];
if (sortedData.length > 0) {
timestamp = sortedData[0][0];
// Raw aggregation uses server-side data, all others use client-side
if (aggregation === 'raw' && hasAggregatedData && aggregatedData) {
// Use server-side aggregation for raw
if (
aggregatedData[metricName] !== null &&
aggregatedData[metricName] !== undefined
) {
bigNumber = aggregatedData[metricName];
} else {
const metricKeys = Object.keys(aggregatedData).filter(
key =>
key !== xAxisLabel &&
aggregatedData[key] !== null &&
typeof aggregatedData[key] === 'number',
);
bigNumber =
metricKeys.length > 0 ? aggregatedData[metricKeys[0]] : null;
}
} else {
// Use client-side aggregation for all other methods
bigNumber = computeClientSideAggregation(sortedData, aggregation);
}
// Handle null bigNumber case
if (bigNumber === null) {
bigNumberFallback = sortedData.find(d => d[1] !== null);
bigNumber = bigNumberFallback ? bigNumberFallback[1] : null;

View File
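The computeClientSideAggregation helper above does a case-insensitive lookup into aggregationChoices, falls back to LAST_VALUE, filters nulls, and applies the chosen compute function. A self-contained sketch of the same shape, using a subset of the compute table reproduced from the Jest mock earlier in this diff (the real table lives in @superset-ui/chart-controls):

```typescript
// Subset of the aggregation compute table, mirroring the Jest mock above.
const aggregationChoices = {
  LAST_VALUE: { compute: (xs: number[]) => xs[0] ?? null },
  sum: { compute: (xs: number[]) => xs.reduce((a, b) => a + b, 0) },
  mean: {
    compute: (xs: number[]) => xs.reduce((a, b) => a + b, 0) / xs.length,
  },
  min: { compute: (xs: number[]) => Math.min(...xs) },
  max: { compute: (xs: number[]) => Math.max(...xs) },
  median: {
    compute: (xs: number[]) => {
      const sorted = [...xs].sort((a, b) => a - b);
      const mid = Math.floor(sorted.length / 2);
      return sorted.length % 2 === 0
        ? (sorted[mid - 1] + sorted[mid]) / 2
        : sorted[mid];
    },
  },
};

// Same shape as computeClientSideAggregation: case-insensitive method
// lookup, LAST_VALUE fallback, and null filtering before computing.
function computeClientSideAggregation(
  data: [number | null, number | null][],
  aggregation: string | undefined | null,
): number | null {
  if (!data.length) return null;
  const methodKey = Object.keys(aggregationChoices).find(
    key => key.toLowerCase() === (aggregation || '').toLowerCase(),
  );
  const method = methodKey
    ? aggregationChoices[methodKey as keyof typeof aggregationChoices]
    : aggregationChoices.LAST_VALUE;
  const values = data
    .map(([, value]) => value)
    .filter((v): v is number => v !== null);
  return method.compute(values);
}
```

Because transformProps sorts the series time-descending before calling this, LAST_VALUE taking index 0 yields the most recent non-filtered point.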

@@ -128,9 +128,10 @@ describe('BigNumberWithTrendline', () => {
expect(lastDatum?.[0]).toStrictEqual(100);
expect(lastDatum?.[1]).toBeNull();
// should note this is a fallback
// should get the last non-null value
expect(transformed.bigNumber).toStrictEqual(1.2345);
expect(transformed.bigNumberFallback).not.toBeNull();
// bigNumberFallback is only set when bigNumber is null after aggregation
expect(transformed.bigNumberFallback).toBeNull();
// should successfully formatTime by granularity
// @ts-ignore

View File

@@ -34,6 +34,12 @@ const parseLabel = value => {
return String(value);
};
function displayCell(value, allowRenderHtml) {
if (allowRenderHtml && typeof value === 'string') {
return safeHtmlSpan(value);
}
return parseLabel(value);
}
function displayHeaderCell(
needToggle,
ArrowIcon,
@@ -742,7 +748,7 @@ export class TableRenderer extends Component {
onContextMenu={e => this.props.onContextMenu(e, colKey, rowKey)}
style={style}
>
{agg.format(aggValue)}
{displayCell(agg.format(aggValue), allowRenderHtml)}
</td>
);
});
@@ -759,7 +765,7 @@ export class TableRenderer extends Component {
onClick={rowTotalCallbacks[flatRowKey]}
onContextMenu={e => this.props.onContextMenu(e, undefined, rowKey)}
>
{agg.format(aggValue)}
{displayCell(agg.format(aggValue), allowRenderHtml)}
</td>
);
}
@@ -823,7 +829,7 @@ export class TableRenderer extends Component {
onContextMenu={e => this.props.onContextMenu(e, colKey, undefined)}
style={{ padding: '5px' }}
>
{agg.format(aggValue)}
{displayCell(agg.format(aggValue), this.props.allowRenderHtml)}
</td>
);
});
@@ -840,7 +846,7 @@ export class TableRenderer extends Component {
onClick={grandTotalCallback}
onContextMenu={e => this.props.onContextMenu(e, undefined, undefined)}
>
{agg.format(aggValue)}
{displayCell(agg.format(aggValue), this.props.allowRenderHtml)}
</td>
);
}

View File
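The displayCell helper added above routes aggregate cells through safeHtmlSpan only when HTML rendering is enabled and the value is a string; everything else goes through parseLabel. The gate can be sketched independently, with safeHtmlSpan stubbed to a tagged wrapper for illustration (the real one, from @superset-ui/core, returns a sanitized React span):

```typescript
// Stub for illustration only; the real safeHtmlSpan returns a React
// element that renders sanitized HTML.
const safeHtmlSpan = (value: string) => ({ kind: 'html' as const, value });

const parseLabel = (value: unknown): string => String(value);

// Mirrors the displayCell gate: only strings with HTML rendering
// enabled are treated as markup; everything else is plain text.
function displayCell(value: unknown, allowRenderHtml: boolean) {
  if (allowRenderHtml && typeof value === 'string') {
    return safeHtmlSpan(value);
  }
  return parseLabel(value);
}
```

The typeof check matters: formatted numeric aggregates may arrive as numbers, and those must never be handed to the HTML path.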

@@ -164,7 +164,7 @@ const v1ChartDataRequest = async (
ownState,
parseMethod,
) => {
const payload = buildV1ChartDataPayload({
const payload = await buildV1ChartDataPayload({
formData,
resultType,
resultFormat,
@@ -255,7 +255,7 @@ export function runAnnotationQuery({
isDashboardRequest = false,
force = false,
}) {
return function (dispatch, getState) {
return async function (dispatch, getState) {
const { charts, common } = getState();
const sliceKey = key || Object.keys(charts)[0];
const queryTimeout = timeout || common.conf.SUPERSET_WEBSERVER_TIMEOUT;
@@ -310,17 +310,19 @@ export function runAnnotationQuery({
fd.annotation_layers[annotationIndex].overrides = sliceFormData;
}
const payload = await buildV1ChartDataPayload({
formData: fd,
force,
resultFormat: 'json',
resultType: 'full',
});
return SupersetClient.post({
url,
signal,
timeout: queryTimeout * 1000,
headers: { 'Content-Type': 'application/json' },
jsonPayload: buildV1ChartDataPayload({
formData: fd,
force,
resultFormat: 'json',
resultType: 'full',
}),
jsonPayload: payload,
})
.then(({ json }) => {
const data = json?.result?.[0]?.annotation_data?.[annotation.name];
@@ -420,6 +422,8 @@ export function exploreJSON(
const setDataMask = dataMask => {
dispatch(updateDataMask(formData.slice_id, dataMask));
};
dispatch(chartUpdateStarted(controller, formData, key));
const chartDataRequest = getChartDataRequest({
setDataMask,
formData,
@@ -431,8 +435,6 @@ export function exploreJSON(
ownState,
});
dispatch(chartUpdateStarted(controller, formData, key));
const [useLegacyApi] = getQuerySettings(formData);
const chartDataRequestCaught = chartDataRequest
.then(({ response, json }) =>

View File
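The chart-actions change above awaits buildV1ChartDataPayload before issuing the POST instead of passing the still-pending Promise as jsonPayload. The failure mode is easy to demonstrate: a Promise has no own enumerable properties, so serializing it yields an empty object. A sketch with a hypothetical async builder standing in for buildV1ChartDataPayload:

```typescript
// Passing an unresolved Promise as the JSON payload serializes to "{}".
const pendingPayload = Promise.resolve({ some_param: 'fake query!' });
const brokenBody = JSON.stringify(pendingPayload);

// Hypothetical async builder standing in for buildV1ChartDataPayload.
async function buildPayloadSketch(): Promise<{ some_param: string }> {
  return { some_param: 'fake query!' };
}

// Awaiting first, as the fixed runAnnotationQuery does, serializes the
// actual payload contents.
async function buildBodySketch(): Promise<string> {
  const payload = await buildPayloadSketch();
  return JSON.stringify(payload);
}
```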

@@ -64,6 +64,7 @@ describe('chart actions', () => {
let dispatch;
let getExploreUrlStub;
let getChartDataUriStub;
let buildV1ChartDataPayloadStub;
let waitForAsyncDataStub;
let fakeMetadata;
@@ -85,6 +86,13 @@ describe('chart actions', () => {
getChartDataUriStub = sinon
.stub(exploreUtils, 'getChartDataUri')
.callsFake(({ qs }) => URI(MOCK_URL).query(qs));
buildV1ChartDataPayloadStub = sinon
.stub(exploreUtils, 'buildV1ChartDataPayload')
.resolves({
some_param: 'fake query!',
result_type: 'full',
result_format: 'json',
});
fakeMetadata = { useLegacyApi: true };
getChartMetadataRegistry.mockImplementation(() => ({
get: () => fakeMetadata,
@@ -104,6 +112,7 @@ describe('chart actions', () => {
afterEach(() => {
getExploreUrlStub.restore();
getChartDataUriStub.restore();
buildV1ChartDataPayloadStub.restore();
fetchMock.resetHistory();
waitForAsyncDataStub.restore();
@@ -362,7 +371,7 @@ describe('chart actions timeout', () => {
jest.clearAllMocks();
});
it('should use the timeout from arguments when given', () => {
it('should use the timeout from arguments when given', async () => {
const postSpy = jest.spyOn(SupersetClient, 'post');
postSpy.mockImplementation(() => Promise.resolve({ json: { result: [] } }));
const timeout = 10; // Set the timeout value here
@@ -370,7 +379,7 @@ describe('chart actions timeout', () => {
const key = 'chartKey'; // Set the chart key here
const store = mockStore(initialState);
store.dispatch(
await store.dispatch(
actions.runAnnotationQuery({
annotation: {
value: 'annotationValue',
@@ -394,14 +403,14 @@ describe('chart actions timeout', () => {
expect(postSpy).toHaveBeenCalledWith(expectedPayload);
});
it('should use the timeout from common.conf when not passed as an argument', () => {
it('should use the timeout from common.conf when not passed as an argument', async () => {
const postSpy = jest.spyOn(SupersetClient, 'post');
postSpy.mockImplementation(() => Promise.resolve({ json: { result: [] } }));
const formData = { datasource: 'table__1' }; // Set the formData here
const key = 'chartKey'; // Set the chart key here
const store = mockStore(initialState);
store.dispatch(
await store.dispatch(
actions.runAnnotationQuery({
annotation: {
value: 'annotationValue',

View File

@@ -0,0 +1,93 @@
/**
* Licensed to the Apache Software Foundation (ASF) under one
* or more contributor license agreements. See the NOTICE file
* distributed with this work for additional information
* regarding copyright ownership. The ASF licenses this file
* to you under the Apache License, Version 2.0 (the
* "License"); you may not use this file except in compliance
* with the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the License is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/
import { Route } from 'react-router-dom';
import { getExtensionsRegistry } from '@superset-ui/core';
import { Provider as ReduxProvider } from 'react-redux';
import { QueryParamProvider } from 'use-query-params';
import { DndProvider } from 'react-dnd';
import { HTML5Backend } from 'react-dnd-html5-backend';
import { FlashProvider, DynamicPluginProvider } from 'src/components';
import { EmbeddedUiConfigProvider } from 'src/components/UiConfigContext';
import { SupersetThemeProvider } from 'src/theme/ThemeProvider';
import { ThemeController } from 'src/theme/ThemeController';
import type { ThemeStorage } from '@superset-ui/core';
import { store } from 'src/views/store';
import getBootstrapData from 'src/utils/getBootstrapData';
/**
* In-memory implementation of ThemeStorage interface for embedded contexts.
* Persistent storage is not required for embedded dashboards.
*/
class ThemeMemoryStorageAdapter implements ThemeStorage {
private storage = new Map<string, string>();
getItem(key: string): string | null {
return this.storage.get(key) || null;
}
setItem(key: string, value: string): void {
this.storage.set(key, value);
}
removeItem(key: string): void {
this.storage.delete(key);
}
}
const themeController = new ThemeController({
storage: new ThemeMemoryStorageAdapter(),
});
export const getThemeController = (): ThemeController => themeController;
const { common } = getBootstrapData();
const extensionsRegistry = getExtensionsRegistry();
export const EmbeddedContextProviders: React.FC = ({ children }) => {
const RootContextProviderExtension = extensionsRegistry.get(
'root.context.provider',
);
return (
<SupersetThemeProvider themeController={themeController}>
<ReduxProvider store={store}>
<DndProvider backend={HTML5Backend}>
<FlashProvider messages={common.flash_messages}>
<EmbeddedUiConfigProvider>
<DynamicPluginProvider>
<QueryParamProvider
ReactRouterRoute={Route}
stringifyOptions={{ encode: false }}
>
{RootContextProviderExtension ? (
<RootContextProviderExtension>
{children}
</RootContextProviderExtension>
) : (
children
)}
</QueryParamProvider>
</DynamicPluginProvider>
</EmbeddedUiConfigProvider>
</FlashProvider>
</DndProvider>
</ReduxProvider>
</SupersetThemeProvider>
);
};
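The file above introduces `ThemeMemoryStorageAdapter`, a `Map`-backed `ThemeStorage` that deliberately skips persistence for embedded dashboards. Its contract can be exercised in isolation; a minimal standalone sketch (the `ThemeStorage` shape is restated locally here for illustration):

```typescript
// Local restatement of the ThemeStorage shape so the sketch is self-contained.
interface ThemeStorage {
  getItem(key: string): string | null;
  setItem(key: string, value: string): void;
  removeItem(key: string): void;
}

// Map-backed storage: nothing survives a page reload, which is the point
// for embedded contexts where the host application owns theme state.
class ThemeMemoryStorageAdapter implements ThemeStorage {
  private storage = new Map<string, string>();

  getItem(key: string): string | null {
    // Map.get returns undefined for missing keys; normalize to null.
    return this.storage.get(key) || null;
  }

  setItem(key: string, value: string): void {
    this.storage.set(key, value);
  }

  removeItem(key: string): void {
    this.storage.delete(key);
  }
}

const storage = new ThemeMemoryStorageAdapter();
storage.setItem('theme-mode', 'dark');
console.log(storage.getItem('theme-mode')); // 'dark'
storage.removeItem('theme-mode');
console.log(storage.getItem('theme-mode')); // null
```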

View File

@@ -21,20 +21,27 @@ import 'src/public-path';
import { lazy, Suspense } from 'react';
import ReactDOM from 'react-dom';
import { BrowserRouter as Router, Route } from 'react-router-dom';
import { makeApi, t, logging, themeObject } from '@superset-ui/core';
import {
type SupersetThemeConfig,
makeApi,
t,
logging,
} from '@superset-ui/core';
import Switchboard from '@superset-ui/switchboard';
import getBootstrapData, { applicationRoot } from 'src/utils/getBootstrapData';
import setupClient from 'src/setup/setupClient';
import setupPlugins from 'src/setup/setupPlugins';
import { useUiConfig } from 'src/components/UiConfigContext';
import { RootContextProviders } from 'src/views/RootContextProviders';
import { store, USER_LOADED } from 'src/views/store';
import { Loading } from '@superset-ui/core/components';
import { ErrorBoundary } from 'src/components';
import { addDangerToast } from 'src/components/MessageToasts/actions';
import ToastContainer from 'src/components/MessageToasts/ToastContainer';
import { UserWithPermissionsAndRoles } from 'src/types/bootstrapTypes';
import { AnyThemeConfig } from 'packages/superset-ui-core/src/theme/types';
import {
EmbeddedContextProviders,
getThemeController,
} from './EmbeddedContextProviders';
import { embeddedApi } from './api';
import { getDataMaskChangeTrigger } from './utils';
@@ -44,9 +51,7 @@ const debugMode = process.env.WEBPACK_MODE === 'development';
const bootstrapData = getBootstrapData();
function log(...info: unknown[]) {
if (debugMode) {
logging.debug(`[superset]`, ...info);
}
if (debugMode) logging.debug(`[superset]`, ...info);
}
const LazyDashboardPage = lazy(
@@ -85,12 +90,12 @@ const EmbededLazyDashboardPage = () => {
const EmbeddedRoute = () => (
<Suspense fallback={<Loading />}>
<RootContextProviders>
<EmbeddedContextProviders>
<ErrorBoundary>
<EmbededLazyDashboardPage />
</ErrorBoundary>
<ToastContainer position="top" />
</RootContextProviders>
</EmbeddedContextProviders>
</Suspense>
);
@@ -245,12 +250,13 @@ window.addEventListener('message', function embeddedPageInitializer(event) {
Switchboard.defineMethod('getDataMask', embeddedApi.getDataMask);
Switchboard.defineMethod(
'setThemeConfig',
(payload: { themeConfig: AnyThemeConfig }) => {
(payload: { themeConfig: SupersetThemeConfig }) => {
const { themeConfig } = payload;
log('Received setThemeConfig request:', themeConfig);
try {
themeObject.setConfig(themeConfig);
const themeController = getThemeController();
themeController.setThemeConfig(themeConfig);
return { success: true, message: 'Theme applied' };
} catch (error) {
logging.error('Failed to apply theme config:', error);
@@ -258,8 +264,22 @@ window.addEventListener('message', function embeddedPageInitializer(event) {
}
},
);
Switchboard.start();
}
});
// Clean up theme controller on page unload
window.addEventListener('beforeunload', () => {
try {
const controller = getThemeController();
if (controller) {
log('Destroying theme controller');
controller.destroy();
}
} catch (error) {
logging.warn('Failed to destroy theme controller:', error);
}
});
log('embed page is ready to receive messages');

View File

@@ -91,7 +91,7 @@ afterEach(() => {
});
const getFormatSwitch = () =>
screen.getByRole('switch', { name: 'Show original SQL' });
screen.getByRole('switch', { name: 'formatted original' });
test('renders the component with Formatted SQL and buttons', async () => {
const { container } = setup(mockProps);

View File

@@ -26,11 +26,17 @@ import {
} from 'react';
import { useSelector } from 'react-redux';
import rison from 'rison';
import { styled, SupersetClient, t } from '@superset-ui/core';
import { Icons, Switch, Button, Skeleton } from '@superset-ui/core/components';
import { styled, SupersetClient, t, useTheme } from '@superset-ui/core';
import {
Icons,
Switch,
Button,
Skeleton,
Card,
Space,
} from '@superset-ui/core/components';
import { CopyToClipboard } from 'src/components';
import { RootState } from 'src/dashboard/types';
import { CopyButton } from 'src/explore/components/DataTableControl';
import { findPermission } from 'src/utils/findPermission';
import CodeSyntaxHighlighter, {
SupportedLanguage,
@@ -38,14 +44,6 @@ import CodeSyntaxHighlighter, {
} from '@superset-ui/core/components/CodeSyntaxHighlighter';
import { useHistory } from 'react-router-dom';
const CopyButtonViewQuery = styled(CopyButton)`
${({ theme }) => `
&& {
margin: 0 0 ${theme.sizeUnit}px;
}
`}
`;
export interface ViewQueryProps {
sql: string;
datasource: string;
@@ -58,26 +56,14 @@ const StyledSyntaxContainer = styled.div`
flex-direction: column;
`;
const StyledHeaderMenuContainer = styled.div`
display: flex;
flex-direction: row;
justify-content: space-between;
margin-top: ${({ theme }) => -theme.sizeUnit * 4}px;
align-items: flex-end;
`;
const StyledHeaderActionContainer = styled.div`
display: flex;
flex-direction: row;
column-gap: ${({ theme }) => theme.sizeUnit * 2}px;
`;
const StyledThemedSyntaxHighlighter = styled(CodeSyntaxHighlighter)`
flex: 1;
`;
const StyledLabel = styled.label`
font-size: ${({ theme }) => theme.fontSize}px;
const StyledFooter = styled.div`
display: flex;
justify-content: space-between;
align-items: center;
`;
const DATASET_BACKEND_QUERY = {
@@ -87,6 +73,7 @@ const DATASET_BACKEND_QUERY = {
const ViewQuery: FC<ViewQueryProps> = props => {
const { sql, language = 'sql', datasource } = props;
const theme = useTheme();
const datasetId = datasource.split('__')[0];
const [formattedSQL, setFormattedSQL] = useState<string>();
const [showFormatSQL, setShowFormatSQL] = useState(true);
@@ -153,46 +140,57 @@ const ViewQuery: FC<ViewQueryProps> = props => {
}, [sql]);
return (
<StyledSyntaxContainer key={sql}>
<StyledHeaderMenuContainer>
<StyledHeaderActionContainer>
<CopyToClipboard
text={currentSQL}
shouldShowText={false}
copyNode={
<CopyButtonViewQuery
<Card bodyStyle={{ padding: theme.sizeUnit * 4 }}>
<StyledSyntaxContainer key={sql}>
{!formattedSQL && <Skeleton active />}
{formattedSQL && (
<StyledThemedSyntaxHighlighter
language={language}
customStyle={{ flex: 1, marginBottom: theme.sizeUnit * 3 }}
>
{currentSQL}
</StyledThemedSyntaxHighlighter>
)}
<StyledFooter>
<Space size={theme.sizeUnit * 2}>
<CopyToClipboard
text={currentSQL}
shouldShowText={false}
copyNode={
<Button
buttonStyle="secondary"
buttonSize="small"
icon={<Icons.CopyOutlined />}
>
{t('Copy')}
</Button>
}
/>
{canAccessSQLLab && (
<Button
buttonStyle="secondary"
buttonSize="small"
icon={<Icons.CopyOutlined />}
onClick={navToSQLLab}
>
{t('Copy')}
</CopyButtonViewQuery>
}
/>
{canAccessSQLLab && (
<Button onClick={navToSQLLab}>{t('View in SQL Lab')}</Button>
)}
</StyledHeaderActionContainer>
<StyledHeaderActionContainer>
<Switch
id="formatSwitch"
checked={!showFormatSQL}
onChange={formatCurrentQuery}
/>
<StyledLabel htmlFor="formatSwitch">
{t('Show original SQL')}
</StyledLabel>
</StyledHeaderActionContainer>
</StyledHeaderMenuContainer>
{!formattedSQL && <Skeleton active />}
{formattedSQL && (
<StyledThemedSyntaxHighlighter
language={language}
customStyle={{ flex: 1 }}
>
{currentSQL}
</StyledThemedSyntaxHighlighter>
)}
</StyledSyntaxContainer>
{t('View in SQL Lab')}
</Button>
)}
</Space>
<Space size={theme.sizeUnit * 2} align="center">
<Icons.ConsoleSqlOutlined />
<Switch
id="formatSwitch"
checked={showFormatSQL}
onChange={formatCurrentQuery}
checkedChildren={t('formatted')}
unCheckedChildren={t('original')}
/>
</Space>
</StyledFooter>
</StyledSyntaxContainer>
</Card>
);
};

View File

@@ -42,6 +42,7 @@ const ViewQueryModalContainer = styled.div`
height: 100%;
display: flex;
flex-direction: column;
gap: ${({ theme }) => theme.sizeUnit * 4}px;
`;
const ViewQueryModal: FC<Props> = ({ latestQueryFormData }) => {
@@ -86,9 +87,10 @@ const ViewQueryModal: FC<Props> = ({ latestQueryFormData }) => {
return (
<ViewQueryModalContainer>
{result.map(item =>
{result.map((item, index) =>
item.query ? (
<ViewQuery
key={`query-${index}`}
datasource={latestQueryFormData.datasource}
sql={item.query}
language="sql"

View File

@@ -41,6 +41,9 @@ import {
import TableChartPlugin from '../../../../../plugins/plugin-chart-table/src';
import VizTypeControl, { VIZ_TYPE_CONTROL_TEST_ID } from './index';
// Mock scrollIntoView to avoid errors in test environment
jest.mock('scroll-into-view-if-needed', () => jest.fn());
jest.useFakeTimers();
class MainPreset extends Preset {
@@ -256,4 +259,22 @@ describe('VizTypeControl', () => {
expect(defaultProps.onChange).toHaveBeenCalledWith(VizType.Line);
});
it('Search input is focused when modal opens', async () => {
// Mock the focus method to track if it was called
const focusSpy = jest.fn();
const originalFocus = HTMLInputElement.prototype.focus;
HTMLInputElement.prototype.focus = focusSpy;
await waitForRenderWrapper();
const searchInput = screen.getByTestId(getTestId('search-input'));
// Verify that focus() was called on the search input
expect(focusSpy).toHaveBeenCalled();
expect(searchInput).toBeInTheDocument();
// Restore the original focus method
HTMLInputElement.prototype.focus = originalFocus;
});
});

View File

@@ -575,6 +575,13 @@ export default function VizTypeGallery(props: VizTypeGalleryProps) {
setIsSearchFocused(true);
}, []);
// Auto-focus the search input when the modal opens
useEffect(() => {
if (searchInputRef.current) {
searchInputRef.current.focus();
}
}, []);
const changeSearch: ChangeEventHandler<HTMLInputElement> = useCallback(
event => setSearchInputValue(event.target.value),
[],

View File

@@ -191,8 +191,8 @@ describe('exploreUtils', () => {
});
describe('buildV1ChartDataPayload', () => {
it('generate valid request payload despite no registered buildQuery', () => {
const v1RequestPayload = buildV1ChartDataPayload({
it('generate valid request payload despite no registered buildQuery', async () => {
const v1RequestPayload = await buildV1ChartDataPayload({
formData: { ...formData, viz_type: 'my_custom_viz' },
});
expect(v1RequestPayload.hasOwnProperty('queries')).toBeTruthy();

View File

@@ -207,7 +207,7 @@ export const getQuerySettings = formData => {
];
};
export const buildV1ChartDataPayload = ({
export const buildV1ChartDataPayload = async ({
formData,
force,
resultFormat,
@@ -242,7 +242,7 @@ export const buildV1ChartDataPayload = ({
export const getLegacyEndpointType = ({ resultType, resultFormat }) =>
resultFormat === 'csv' ? resultFormat : resultType;
export const exportChart = ({
export const exportChart = async ({
formData,
resultFormat = 'json',
resultType = 'full',
@@ -262,7 +262,7 @@ export const exportChart = ({
payload = formData;
} else {
url = ensureAppRoot('/api/v1/chart/data');
payload = buildV1ChartDataPayload({
payload = await buildV1ChartDataPayload({
formData,
force,
resultFormat,

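The hunk above turns `buildV1ChartDataPayload` and `exportChart` into `async` functions, so every caller now has to `await` the payload before using it. The knock-on effect can be sketched with stand-in names (`buildPayload` and its return shape are illustrative, not the Superset API):

```typescript
// Illustrative stand-ins: once a payload builder becomes async,
// the `await` requirement propagates to every caller up the chain.
async function buildPayload(formData: Record<string, unknown>) {
  // e.g. resolving a buildQuery implementation may now be asynchronous
  return { queries: [{ ...formData }] };
}

async function exportChart(formData: Record<string, unknown>) {
  // Forgetting `await` here would hand a pending Promise to the POST body,
  // not the payload object.
  const payload = await buildPayload(formData);
  return payload;
}

exportChart({ datasource: 'table__1' }).then(p =>
  console.log(Array.isArray(p.queries)), // true
);
```

This is also why the test hunks earlier in the page change `it(...)` callbacks to `async` and add `await` before `store.dispatch(...)`: the dispatched thunks now resolve asynchronously.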
View File

@@ -16,7 +16,12 @@
* specific language governing permissions and limitations
* under the License.
*/
import { render, screen } from 'spec/helpers/testing-library';
import {
render,
screen,
waitFor,
userEvent,
} from 'spec/helpers/testing-library';
import Footer from 'src/features/datasets/AddDataset/Footer';
const mockHistoryPush = jest.fn();
@@ -27,6 +32,14 @@ jest.mock('react-router-dom', () => ({
}),
}));
// Mock the API call
const mockCreateResource = jest.fn();
jest.mock('src/views/CRUD/hooks', () => ({
useSingleViewResource: () => ({
createResource: mockCreateResource,
}),
}));
const mockedProps = {
url: 'realwebsite.com',
};
@@ -34,7 +47,7 @@ const mockedProps = {
const mockPropsWithDataset = {
url: 'realwebsite.com',
datasetObject: {
database: {
db: {
id: '1',
database_name: 'examples',
},
@@ -47,6 +60,10 @@ const mockPropsWithDataset = {
};
describe('Footer', () => {
beforeEach(() => {
jest.clearAllMocks();
});
test('renders a Footer with a cancel button and a disabled create button', () => {
render(<Footer {...mockedProps} />, { useRedux: true });
@@ -55,21 +72,28 @@ describe('Footer', () => {
});
const createButton = screen.getByRole('button', {
name: /Create/i,
name: /Create dataset and create chart/i,
});
expect(saveButton).toBeVisible();
expect(createButton).toBeDisabled();
});
test('renders a Create Dataset button when a table is selected', () => {
test('renders a Create Dataset dropdown button when a table is selected', () => {
render(<Footer {...mockPropsWithDataset} />, { useRedux: true });
const createButton = screen.getByRole('button', {
name: /Create/i,
name: /Create dataset and create chart/i,
});
expect(createButton).toBeEnabled();
// Check that it's a dropdown button with the correct text
expect(createButton).toHaveTextContent('Create dataset and create chart');
// Check for the dropdown arrow
const dropdownArrow = screen.getByRole('img', { hidden: true });
expect(dropdownArrow).toBeInTheDocument();
});
test('create button becomes disabled when table already has a dataset', () => {
@@ -78,9 +102,119 @@ describe('Footer', () => {
});
const createButton = screen.getByRole('button', {
name: /Create/i,
name: /Create dataset and create chart/i,
});
expect(createButton).toBeDisabled();
});
test('shows dropdown menu when dropdown arrow is clicked', async () => {
render(<Footer {...mockPropsWithDataset} />, { useRedux: true });
// Find and click the dropdown trigger (the arrow part)
const dropdownTrigger = screen.getByRole('button', { name: 'down' });
userEvent.click(dropdownTrigger);
// Check that the dropdown menu option is visible
await waitFor(() => {
expect(screen.getByText('Create dataset only')).toBeVisible();
});
});
test('navigates to chart creation when main button is clicked', async () => {
mockCreateResource.mockResolvedValue(123); // Mock successful dataset creation
render(<Footer {...mockPropsWithDataset} />, { useRedux: true });
const createButton = screen.getByRole('button', {
name: /Create dataset and create chart/i,
});
userEvent.click(createButton);
await waitFor(() => {
expect(mockCreateResource).toHaveBeenCalledWith({
database: '1',
catalog: undefined,
schema: 'public',
table_name: 'real_info',
});
expect(mockHistoryPush).toHaveBeenCalledWith(
'/chart/add/?dataset=real_info',
);
});
});
test('navigates to dataset list when "Create dataset only" menu option is clicked', async () => {
mockCreateResource.mockResolvedValue(123);
render(<Footer {...mockPropsWithDataset} />, { useRedux: true });
// Open dropdown menu
const dropdownTrigger = screen.getByRole('button', { name: 'down' });
userEvent.click(dropdownTrigger);
// Click the "Create dataset only" option
await waitFor(() => {
const datasetOnlyOption = screen.getByText('Create dataset only');
userEvent.click(datasetOnlyOption);
});
await waitFor(() => {
expect(mockCreateResource).toHaveBeenCalledWith({
database: '1',
catalog: undefined,
schema: 'public',
table_name: 'real_info',
});
expect(mockHistoryPush).toHaveBeenCalledWith('/tablemodelview/list/');
});
});
test('handles dataset creation failure gracefully', async () => {
mockCreateResource.mockResolvedValue(null); // Mock failed dataset creation
render(<Footer {...mockPropsWithDataset} />, { useRedux: true });
const createButton = screen.getByRole('button', {
name: /Create dataset and create chart/i,
});
userEvent.click(createButton);
await waitFor(() => {
expect(mockCreateResource).toHaveBeenCalled();
// Should not navigate if creation failed
expect(mockHistoryPush).not.toHaveBeenCalled();
});
});
test('passes correct data to createResource with catalog', async () => {
const mockPropsWithCatalog = {
...mockPropsWithDataset,
datasetObject: {
...mockPropsWithDataset.datasetObject,
catalog: 'test_catalog',
},
};
mockCreateResource.mockResolvedValue(456);
render(<Footer {...mockPropsWithCatalog} />, { useRedux: true });
const createButton = screen.getByRole('button', {
name: /Create dataset and create chart/i,
});
userEvent.click(createButton);
await waitFor(() => {
expect(mockCreateResource).toHaveBeenCalledWith({
database: '1',
catalog: 'test_catalog',
schema: 'public',
table_name: 'real_info',
});
});
});
});

View File

@@ -17,8 +17,14 @@
* under the License.
*/
import { useHistory } from 'react-router-dom';
import { Button } from '@superset-ui/core/components';
import { t } from '@superset-ui/core';
import {
Button,
DropdownButton,
Menu,
Flex,
} from '@superset-ui/core/components';
import { t, useTheme } from '@superset-ui/core';
import { Icons } from '@superset-ui/core/components/Icons';
import { useSingleViewResource } from 'src/views/CRUD/hooks';
import { logEvent } from 'src/logger/actions';
import withToasts from 'src/components/MessageToasts/withToasts';
@@ -55,6 +61,7 @@ function Footer({
datasets,
}: FooterProps) {
const history = useHistory();
const theme = useTheme();
const { createResource } = useSingleViewResource<Partial<DatasetObject>>(
'dataset',
t('dataset'),
@@ -85,7 +92,7 @@ function Footer({
const tooltipText = t('Select a database table.');
const onSave = () => {
const onSave = (createChart: boolean = true) => {
if (datasetObject) {
const data = {
database: datasetObject.db?.id,
@@ -100,32 +107,57 @@ function Footer({
if (typeof response === 'number') {
logEvent(LOG_ACTIONS_DATASET_CREATION_SUCCESS, datasetObject);
// When a dataset is created, the response we get is its ID number
history.push(`/chart/add/?dataset=${datasetObject.table_name}`);
if (createChart) {
history.push(`/chart/add/?dataset=${datasetObject.table_name}`);
} else {
history.push('/tablemodelview/list/');
}
}
});
}
};
const onSaveOnly = () => {
onSave(false);
};
const CREATE_DATASET_TEXT = t('Create dataset and create chart');
const CREATE_DATASET_ONLY_TEXT = t('Create dataset only');
const disabledCheck =
!datasetObject?.table_name ||
!hasColumns ||
datasets?.includes(datasetObject?.table_name);
const dropdownMenu = (
<Menu>
<Menu.Item key="create-only" onClick={onSaveOnly}>
{CREATE_DATASET_ONLY_TEXT}
</Menu.Item>
</Menu>
);
return (
<>
<Flex align="center" justify="flex-end" gap="8px">
<Button buttonStyle="secondary" onClick={cancelButtonOnClick}>
{t('Cancel')}
</Button>
<Button
buttonStyle="primary"
<DropdownButton
type="primary"
disabled={disabledCheck}
tooltip={!datasetObject?.table_name ? tooltipText : undefined}
onClick={onSave}
onClick={() => onSave(true)}
popupRender={() => dropdownMenu}
icon={
<Icons.DownOutlined
iconSize="xs"
iconColor={theme.colors.grayscale.light5}
/>
}
trigger={['click']}
>
{CREATE_DATASET_TEXT}
</Button>
</>
</DropdownButton>
</Flex>
);
}
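The new `onSave(createChart)` parameter above picks between two navigation targets after a successful create. That routing decision is pure and can be isolated; `targetRoute` below is a hypothetical extraction for illustration, not a helper in the component:

```typescript
// Hypothetical extraction of Footer's post-create navigation decision:
// chart-creation flow by default, dataset list for "Create dataset only".
function targetRoute(createChart: boolean, tableName: string): string {
  return createChart
    ? `/chart/add/?dataset=${tableName}`
    : '/tablemodelview/list/';
}

console.log(targetRoute(true, 'real_info')); // '/chart/add/?dataset=real_info'
console.log(targetRoute(false, 'real_info')); // '/tablemodelview/list/'
```

Keeping the branch inside a single `onSave` (rather than two separate handlers) means the `createResource` call and success logging stay in one place, which is what the Footer tests above assert against.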

View File

@@ -17,13 +17,11 @@
* under the License.
*/
import { Fragment, useState, useEffect, FC, PureComponent } from 'react';
import rison from 'rison';
import { useSelector } from 'react-redux';
import { Link } from 'react-router-dom';
import { useQueryParams, BooleanParam } from 'use-query-params';
import { get, isEmpty } from 'lodash';
import {
t,
styled,
@@ -33,10 +31,15 @@ import {
getExtensionsRegistry,
useTheme,
} from '@superset-ui/core';
import { Menu } from '@superset-ui/core/components/Menu';
import { Label, Tooltip } from '@superset-ui/core/components';
import { Icons } from '@superset-ui/core/components/Icons';
import { Typography } from '@superset-ui/core/components/Typography';
import {
Label,
Tooltip,
ThemeSubMenu,
Menu,
Icons,
Typography,
TelemetryPixel,
} from '@superset-ui/core/components';
import { ensureAppRoot } from 'src/utils/pathUtils';
import { findPermission } from 'src/utils/findPermission';
import { isUserAdmin } from 'src/dashboard/util/permissionUtils';
@@ -49,9 +52,7 @@ import { RootState } from 'src/dashboard/types';
import DatabaseModal from 'src/features/databases/DatabaseModal';
import UploadDataModal from 'src/features/databases/UploadDataModel';
import { uploadUserPerms } from 'src/views/CRUD/utils';
import TelemetryPixel from '@superset-ui/core/components/TelemetryPixel';
import { useThemeContext } from 'src/theme/ThemeProvider';
import ThemeSelect from '@superset-ui/core/components/ThemeSelect';
import LanguagePicker from './LanguagePicker';
import {
ExtensionConfigs,
@@ -138,6 +139,7 @@ const RightMenu = ({
datasetAdded?: boolean;
}) => void;
}) => {
const theme = useTheme();
const user = useSelector<any, UserWithPermissionsAndRoles>(
state => state.user,
);
@@ -371,7 +373,6 @@ const RightMenu = ({
localStorage.removeItem('redux');
};
const theme = useTheme();
return (
<StyledDiv align={align}>
{canDatabase && (
@@ -493,16 +494,15 @@ const RightMenu = ({
})}
</StyledSubMenu>
)}
{canSetMode() && (
<span>
<ThemeSelect
setThemeMode={setThemeMode}
themeMode={themeMode}
hasLocalOverride={hasDevOverride()}
onClearLocalSettings={clearLocalOverrides}
allowOSPreference={canDetectOSPreference()}
/>
</span>
<ThemeSubMenu
setThemeMode={setThemeMode}
themeMode={themeMode}
hasLocalOverride={hasDevOverride()}
onClearLocalSettings={clearLocalOverrides}
allowOSPreference={canDetectOSPreference()}
/>
)}
<StyledSubMenu

View File

@@ -17,13 +17,15 @@
* under the License.
*/
import {
type AnyThemeConfig,
type SupersetTheme,
type SupersetThemeConfig,
type ThemeControllerOptions,
type ThemeStorage,
Theme,
AnyThemeConfig,
ThemeStorage,
ThemeControllerOptions,
ThemeMode,
themeObject as supersetThemeObject,
} from '@superset-ui/core';
import { SupersetTheme, ThemeMode } from '@superset-ui/core/theme/types';
import {
getAntdConfig,
normalizeThemeConfig,
@@ -94,7 +96,7 @@ export class ThemeController {
private currentMode: ThemeMode;
private readonly hasBootstrapThemes: boolean;
private hasCustomThemes: boolean;
private onChangeCallbacks: Set<(theme: Theme) => void> = new Set();
@@ -109,15 +111,13 @@ export class ThemeController {
private dashboardCrudTheme: AnyThemeConfig | null = null;
constructor(options: ThemeControllerOptions = {}) {
const {
storage = new LocalStorageAdapter(),
modeStorageKey = STORAGE_KEYS.THEME_MODE,
themeObject = supersetThemeObject,
defaultTheme = (supersetThemeObject.theme as AnyThemeConfig) ?? {},
onChange = null,
} = options;
constructor({
storage = new LocalStorageAdapter(),
modeStorageKey = STORAGE_KEYS.THEME_MODE,
themeObject = supersetThemeObject,
defaultTheme = (supersetThemeObject.theme as AnyThemeConfig) ?? {},
onChange = undefined,
}: ThemeControllerOptions = {}) {
this.storage = storage;
this.modeStorageKey = modeStorageKey;
@@ -129,14 +129,14 @@ export class ThemeController {
bootstrapDefaultTheme,
bootstrapDarkTheme,
bootstrapThemeSettings,
hasBootstrapThemes,
hasCustomThemes,
}: BootstrapThemeData = this.loadBootstrapData();
this.hasBootstrapThemes = hasBootstrapThemes;
this.hasCustomThemes = hasCustomThemes;
this.themeSettings = bootstrapThemeSettings || {};
// Set themes based on bootstrap data availability
if (this.hasBootstrapThemes) {
if (this.hasCustomThemes) {
this.darkTheme = bootstrapDarkTheme || bootstrapDefaultTheme || null;
this.defaultTheme =
bootstrapDefaultTheme || bootstrapDarkTheme || defaultTheme;
@@ -424,6 +424,42 @@ export class ThemeController {
return allowOSPreference === true;
}
/**
* Applies an entirely new theme configuration, replacing all existing theme data and settings.
* This method is designed for use cases like embedded dashboards where themes are provided
* dynamically from external sources.
* @param config - The complete theme configuration object
*/
public setThemeConfig(config: SupersetThemeConfig): void {
this.defaultTheme = config.theme_default;
this.darkTheme = config.theme_dark || null;
this.hasCustomThemes = true;
this.themeSettings = {
enforced: config.theme_settings?.enforced ?? false,
allowSwitching: config.theme_settings?.allowSwitching ?? true,
allowOSPreference: config.theme_settings?.allowOSPreference ?? true,
};
let newMode: ThemeMode;
try {
this.validateModeUpdatePermission(this.currentMode);
const hasRequiredTheme = this.isValidThemeMode(this.currentMode);
newMode = hasRequiredTheme
? this.currentMode
: this.determineInitialMode();
} catch {
newMode = this.determineInitialMode();
}
this.currentMode = newMode;
const themeToApply =
this.getThemeForMode(this.currentMode) || this.defaultTheme;
this.updateTheme(themeToApply);
}
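`setThemeConfig` above tries to preserve the user's current mode and falls back to a freshly determined initial mode when the current one is no longer permitted or no longer backed by a theme. A reduced sketch of that decision, under stated assumptions (`resolveMode`, the mode strings, and the boolean inputs are illustrative; the real code delegates to `validateModeUpdatePermission`, `isValidThemeMode`, and `determineInitialMode`):

```typescript
// Illustrative reduction of the mode-preservation logic in setThemeConfig:
// keep the current mode when it is still permitted and has a backing theme,
// otherwise fall back to the initial mode.
type Mode = 'default' | 'dark' | 'system';

function resolveMode(
  current: Mode,
  hasDarkTheme: boolean,
  allowOSPreference: boolean,
  initial: Mode,
): Mode {
  // 'system' mode is only valid when OS preference detection is allowed.
  const permitted = current !== 'system' || allowOSPreference;
  // 'dark' mode is only valid when a dark theme is actually configured.
  const backed = current !== 'dark' || hasDarkTheme;
  return permitted && backed ? current : initial;
}

console.log(resolveMode('dark', true, true, 'default')); // 'dark'
console.log(resolveMode('dark', false, true, 'default')); // 'default'
console.log(resolveMode('system', true, false, 'default')); // 'default'
```

This matches the behavior the `setThemeConfig` tests below exercise: an existing dark mode survives a reconfiguration that still ships a dark theme, while `allowOSPreference: false` forces a re-determination away from system mode.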
/**
* Handles system theme changes with error recovery.
*/
@@ -547,7 +583,7 @@ export class ThemeController {
bootstrapDefaultTheme: hasValidDefault ? defaultTheme : null,
bootstrapDarkTheme: hasValidDark ? darkTheme : null,
bootstrapThemeSettings: hasValidSettings ? themeSettings : null,
hasBootstrapThemes: hasValidDefault || hasValidDark,
hasCustomThemes: hasValidDefault || hasValidDark,
};
}
@@ -607,7 +643,7 @@ export class ThemeController {
resolvedMode = ThemeController.getSystemPreferredMode();
}
if (!this.hasBootstrapThemes) {
if (!this.hasCustomThemes) {
const baseTheme = this.defaultTheme.token as Partial<SupersetTheme>;
return getAntdConfig(baseTheme, resolvedMode === ThemeMode.DARK);
}

View File

@@ -24,8 +24,12 @@ import {
useMemo,
useState,
} from 'react';
import { Theme, AnyThemeConfig, ThemeContextType } from '@superset-ui/core';
import { ThemeMode } from '@superset-ui/core/theme/types';
import {
type AnyThemeConfig,
type ThemeContextType,
Theme,
ThemeMode,
} from '@superset-ui/core';
import { ThemeController } from './ThemeController';
const ThemeContext = createContext<ThemeContextType | null>(null);

View File

@@ -17,12 +17,17 @@
* under the License.
*/
import { theme as antdThemeImport } from 'antd';
import { Theme } from '@superset-ui/core';
import {
type AnyThemeConfig,
type SupersetThemeConfig,
Theme,
ThemeAlgorithm,
ThemeMode,
} from '@superset-ui/core';
import type {
BootstrapThemeDataConfig,
CommonBootstrapData,
} from 'src/types/bootstrapTypes';
import { ThemeAlgorithm, ThemeMode } from '@superset-ui/core/theme/types';
import getBootstrapData from 'src/utils/getBootstrapData';
import { LocalStorageAdapter, ThemeController } from '../ThemeController';
@@ -43,7 +48,7 @@ const mockThemeFromConfig = jest.fn();
const mockSetConfig = jest.fn();
// Mock data constants
const DEFAULT_THEME = {
const DEFAULT_THEME: AnyThemeConfig = {
token: {
colorBgBase: '#ededed',
colorTextBase: '#120f0f',
@@ -55,7 +60,7 @@ const DEFAULT_THEME = {
},
};
const DARK_THEME = {
const DARK_THEME: AnyThemeConfig = {
token: {
colorBgBase: '#141118',
colorTextBase: '#fdc7c7',
@@ -65,7 +70,7 @@ const DARK_THEME = {
colorSuccess: '#3c7c1b',
colorWarning: '#dc9811',
},
algorithm: ThemeMode.DARK,
algorithm: ThemeAlgorithm.DARK,
};
const THEME_SETTINGS = {
@@ -1049,4 +1054,298 @@ describe('ThemeController', () => {
);
});
});
describe('setThemeConfig', () => {
beforeEach(() => {
mockGetBootstrapData.mockReturnValue(
createMockBootstrapData({
default: {},
dark: {},
settings: {},
}),
);
controller = new ThemeController({
themeObject: mockThemeObject,
defaultTheme: { token: {} },
});
jest.clearAllMocks();
});
it('should set complete theme configuration', () => {
const themeConfig = {
theme_default: DEFAULT_THEME,
theme_dark: DARK_THEME,
theme_settings: {
enforced: false,
allowSwitching: true,
allowOSPreference: true,
},
};
controller.setThemeConfig(themeConfig);
expect(mockSetConfig).toHaveBeenCalledTimes(1);
expect(mockSetConfig).toHaveBeenCalledWith(
expect.objectContaining({
token: expect.objectContaining(DEFAULT_THEME.token),
algorithm: antdThemeImport.defaultAlgorithm,
}),
);
expect(controller.getCurrentMode()).toBe(ThemeMode.SYSTEM);
expect(controller.canSetTheme()).toBe(true);
expect(controller.canSetMode()).toBe(true);
});
it('should handle theme_default only', () => {
const themeConfig = {
theme_default: DEFAULT_THEME,
};
controller.setThemeConfig(themeConfig);
expect(mockSetConfig).toHaveBeenCalledTimes(1);
expect(mockSetConfig).toHaveBeenCalledWith(
expect.objectContaining({
token: expect.objectContaining(DEFAULT_THEME.token),
algorithm: antdThemeImport.defaultAlgorithm,
}),
);
expect(controller.canSetTheme()).toBe(true);
expect(controller.canSetMode()).toBe(true);
});
it('should handle theme_default and theme_dark without settings', () => {
const themeConfig = {
theme_default: DEFAULT_THEME,
theme_dark: DARK_THEME,
};
controller.setThemeConfig(themeConfig);
expect(mockSetConfig).toHaveBeenCalledTimes(1);
expect(mockSetConfig).toHaveBeenCalledWith(
expect.objectContaining({
token: expect.objectContaining(DEFAULT_THEME.token),
}),
);
jest.clearAllMocks();
controller.setThemeMode(ThemeMode.DARK);
expect(mockSetConfig).toHaveBeenCalledTimes(1);
expect(mockSetConfig).toHaveBeenCalledWith(
expect.objectContaining({
token: expect.objectContaining(DARK_THEME.token),
algorithm: antdThemeImport.darkAlgorithm,
}),
);
});
it('should handle enforced theme settings', () => {
const themeConfig = {
theme_default: DEFAULT_THEME,
theme_dark: DARK_THEME,
theme_settings: {
enforced: true,
allowSwitching: false,
allowOSPreference: false,
},
};
controller.setThemeConfig(themeConfig);
expect(controller.canSetTheme()).toBe(false);
expect(controller.canSetMode()).toBe(false);
expect(controller.getCurrentMode()).toBe(ThemeMode.DEFAULT);
expect(() => {
controller.setThemeMode(ThemeMode.DARK);
}).toThrow('User does not have permission to update the theme mode');
});
it('should handle allowOSPreference: false setting', () => {
const themeConfig = {
theme_default: DEFAULT_THEME,
theme_dark: DARK_THEME,
theme_settings: {
enforced: false,
allowSwitching: true,
allowOSPreference: false,
},
};
controller.setThemeConfig(themeConfig);
expect(controller.getCurrentMode()).toBe(ThemeMode.DEFAULT);
expect(controller.canSetMode()).toBe(true);
expect(() => {
controller.setThemeMode(ThemeMode.SYSTEM);
}).toThrow('System theme mode is not allowed');
});
it('should re-determine initial mode based on new settings', () => {
mockMatchMedia.mockReturnValue({
matches: true,
addEventListener: jest.fn(),
removeEventListener: jest.fn(),
});
const themeConfig = {
theme_default: DEFAULT_THEME,
theme_dark: DARK_THEME,
theme_settings: {
enforced: false,
allowSwitching: false,
allowOSPreference: true,
},
};
controller.setThemeConfig(themeConfig);
expect(controller.getCurrentMode()).toBe(ThemeMode.SYSTEM);
expect(controller.canSetMode()).toBe(false);
});
it('should apply appropriate theme after configuration', () => {
controller.setThemeMode(ThemeMode.DARK);
jest.clearAllMocks();
const themeConfig = {
theme_default: {
token: {
colorPrimary: '#00ff00',
},
},
theme_dark: {
token: {
colorPrimary: '#ff0000',
colorBgBase: '#000000',
},
algorithm: 'dark',
},
};
controller.setThemeConfig(themeConfig as SupersetThemeConfig);
expect(mockSetConfig).toHaveBeenCalledTimes(1);
expect(mockSetConfig).toHaveBeenCalledWith(
expect.objectContaining({
token: expect.objectContaining({
colorPrimary: '#ff0000',
colorBgBase: '#000000',
}),
algorithm: antdThemeImport.darkAlgorithm,
}),
);
});
it('should handle missing theme_dark gracefully', () => {
const themeConfig = {
theme_default: DEFAULT_THEME,
theme_settings: {
allowSwitching: true,
},
};
controller.setThemeConfig(themeConfig);
jest.clearAllMocks();
controller.setThemeMode(ThemeMode.DARK);
expect(mockSetConfig).toHaveBeenCalledTimes(1);
expect(mockSetConfig).toHaveBeenCalledWith(
expect.objectContaining({
token: expect.objectContaining(DEFAULT_THEME.token),
algorithm: antdThemeImport.defaultAlgorithm,
}),
);
});
it('should preserve existing theme mode when possible', () => {
controller.setThemeMode(ThemeMode.DARK);
const initialMode = controller.getCurrentMode();
jest.clearAllMocks();
const themeConfig = {
theme_default: DEFAULT_THEME,
theme_dark: DARK_THEME,
theme_settings: {
allowSwitching: true,
allowOSPreference: false,
},
};
controller.setThemeConfig(themeConfig);
expect(controller.getCurrentMode()).toBe(initialMode);
});
it('should trigger onChange callbacks', () => {
const changeCallback = jest.fn();
controller.onChange(changeCallback);
const themeConfig = {
theme_default: DEFAULT_THEME,
theme_dark: DARK_THEME,
};
controller.setThemeConfig(themeConfig);
expect(changeCallback).toHaveBeenCalledTimes(1);
expect(changeCallback).toHaveBeenCalledWith(mockThemeObject);
});
it('should handle partial theme_settings', () => {
const themeConfig = {
theme_default: DEFAULT_THEME,
theme_settings: {
enforced: true,
},
};
controller.setThemeConfig(themeConfig);
expect(controller.canSetTheme()).toBe(false);
expect(controller.canSetMode()).toBe(false);
});
it('should handle error in theme application', () => {
mockSetConfig.mockImplementationOnce(() => {
throw new Error('Theme application error');
});
const themeConfig = {
theme_default: DEFAULT_THEME,
};
expect(() => {
controller.setThemeConfig(themeConfig);
}).not.toThrow();
expect(consoleErrorSpy).toHaveBeenCalledWith(
'Failed to apply theme:',
expect.any(Error),
);
});
it('should update stored theme mode', () => {
const themeConfig = {
theme_default: DEFAULT_THEME,
theme_dark: DARK_THEME,
};
controller.setThemeConfig(themeConfig);
expect(mockLocalStorage.setItem).toHaveBeenCalledWith(
'superset-theme-mode',
expect.any(String),
);
});
});
});

View File

@@ -17,8 +17,7 @@
* under the License.
*/
import { ReactNode } from 'react';
import { Theme } from '@superset-ui/core';
import { ThemeContextType, ThemeMode } from '@superset-ui/core/theme/types';
import { type ThemeContextType, Theme, ThemeMode } from '@superset-ui/core';
import { act, render, screen } from '@superset-ui/core/spec';
import { renderHook } from '@testing-library/react-hooks';
import { SupersetThemeProvider, useThemeContext } from '../ThemeProvider';

View File

@@ -16,14 +16,6 @@
* specific language governing permissions and limitations
* under the License.
*/
import {
ColorSchemeConfig,
FeatureFlagMap,
JsonObject,
LanguagePack,
Locale,
SequentialSchemeConfig,
} from '@superset-ui/core';
import { FormatLocaleDefinition } from 'd3-format';
import { TimeLocaleDefinition } from 'd3-time-format';
import { isPlainObject } from 'lodash';
@@ -31,8 +23,14 @@ import { Languages } from 'src/features/home/LanguagePicker';
import type { FlashMessage } from 'src/components';
import type {
AnyThemeConfig,
ColorSchemeConfig,
FeatureFlagMap,
JsonObject,
LanguagePack,
Locale,
SequentialSchemeConfig,
SerializableThemeConfig,
} from '@superset-ui/core/theme/types';
} from '@superset-ui/core';
export type User = {
createdOn?: string;
@@ -189,7 +187,7 @@ export interface BootstrapThemeData {
bootstrapDefaultTheme: AnyThemeConfig | null;
bootstrapDarkTheme: AnyThemeConfig | null;
bootstrapThemeSettings: SerializableThemeSettings | null;
hasBootstrapThemes: boolean;
hasCustomThemes: boolean;
}
export function isUser(user: any): user is User {

View File

@@ -9,7 +9,7 @@
"version": "0.0.1",
"license": "Apache-2.0",
"dependencies": {
"cookie": "^0.7.0",
"cookie": "^1.0.2",
"hot-shots": "^11.1.0",
"ioredis": "^5.6.1",
"jsonwebtoken": "^9.0.2",
@@ -20,7 +20,6 @@
},
"devDependencies": {
"@eslint/js": "^9.25.1",
"@types/cookie": "^0.6.0",
"@types/eslint__js": "^8.42.3",
"@types/ioredis": "^4.27.8",
"@types/jest": "^29.5.14",
@@ -1721,12 +1720,6 @@
"@babel/types": "^7.3.0"
}
},
"node_modules/@types/cookie": {
"version": "0.6.0",
"resolved": "https://registry.npmjs.org/@types/cookie/-/cookie-0.6.0.tgz",
"integrity": "sha512-4Kh9a6B2bQciAhf7FSuMRRkUWecJgJu9nPnx3yzpsfXX/c50REIqpHY4C82bXP90qrLtXtkDxTZosYO3UpOwlA==",
"dev": true
},
"node_modules/@types/eslint": {
"version": "9.6.1",
"resolved": "https://registry.npmjs.org/@types/eslint/-/eslint-9.6.1.tgz",
@@ -3045,11 +3038,11 @@
"dev": true
},
"node_modules/cookie": {
"version": "0.7.0",
"resolved": "https://registry.npmjs.org/cookie/-/cookie-0.7.0.tgz",
"integrity": "sha512-qCf+V4dtlNhSRXGAZatc1TasyFO6GjohcOul807YOb5ik3+kQSnb4d7iajeCL8QHaJ4uZEjCgiCJerKXwdRVlQ==",
"version": "1.0.2",
"resolved": "https://registry.npmjs.org/cookie/-/cookie-1.0.2.tgz",
"integrity": "sha512-9Kr/j4O16ISv8zBBhJoi4bXOYNTkFLOqSL3UDB0njXxCXNezjeyVrJyGOWtgfs/q2km1gwBcfH8q1yEGoMYunA==",
"engines": {
"node": ">= 0.6"
"node": ">=18"
}
},
"node_modules/create-jest": {
@@ -8402,12 +8395,6 @@
"@babel/types": "^7.3.0"
}
},
"@types/cookie": {
"version": "0.6.0",
"resolved": "https://registry.npmjs.org/@types/cookie/-/cookie-0.6.0.tgz",
"integrity": "sha512-4Kh9a6B2bQciAhf7FSuMRRkUWecJgJu9nPnx3yzpsfXX/c50REIqpHY4C82bXP90qrLtXtkDxTZosYO3UpOwlA==",
"dev": true
},
"@types/eslint": {
"version": "9.6.1",
"resolved": "https://registry.npmjs.org/@types/eslint/-/eslint-9.6.1.tgz",
@@ -9317,9 +9304,9 @@
"dev": true
},
"cookie": {
"version": "0.7.0",
"resolved": "https://registry.npmjs.org/cookie/-/cookie-0.7.0.tgz",
"integrity": "sha512-qCf+V4dtlNhSRXGAZatc1TasyFO6GjohcOul807YOb5ik3+kQSnb4d7iajeCL8QHaJ4uZEjCgiCJerKXwdRVlQ=="
"version": "1.0.2",
"resolved": "https://registry.npmjs.org/cookie/-/cookie-1.0.2.tgz",
"integrity": "sha512-9Kr/j4O16ISv8zBBhJoi4bXOYNTkFLOqSL3UDB0njXxCXNezjeyVrJyGOWtgfs/q2km1gwBcfH8q1yEGoMYunA=="
},
"create-jest": {
"version": "29.7.0",

View File

@@ -17,7 +17,7 @@
},
"license": "Apache-2.0",
"dependencies": {
"cookie": "^0.7.0",
"cookie": "^1.0.2",
"hot-shots": "^11.1.0",
"ioredis": "^5.6.1",
"jsonwebtoken": "^9.0.2",
@@ -28,7 +28,6 @@
},
"devDependencies": {
"@eslint/js": "^9.25.1",
"@types/cookie": "^0.6.0",
"@types/eslint__js": "^8.42.3",
"@types/ioredis": "^4.27.8",
"@types/jest": "^29.5.14",
@@ -52,7 +51,7 @@
"typescript-eslint": "^8.19.0"
},
"engines": {
"node": "^16.9.1",
"npm": "^7.5.4 || ^8.1.2"
"node": "^20.19.4",
"npm": "^10.8.2"
}
}

View File

@@ -17,6 +17,7 @@
* under the License.
*/
import { buildConfig } from '../src/config';
import { expect, test } from '@jest/globals';
test('buildConfig() builds configuration and applies env var overrides', () => {
let config = buildConfig();

View File

@@ -19,15 +19,26 @@
const jwt = require('jsonwebtoken');
const config = require('../config.test.json');
import { describe, expect, test, beforeEach, afterEach } from '@jest/globals';
import {
describe,
expect,
test,
beforeEach,
afterEach,
jest,
} from '@jest/globals';
import * as http from 'http';
import * as net from 'net';
import { WebSocket } from 'ws';
interface MockedRedisXrange {
(): Promise<server.StreamResult[]>;
}
// NOTE: these mock variables need to start with "mock" because
// calls to `jest.mock` are hoisted to the top of the file.
// https://jestjs.io/docs/es6-class-mocks#calling-jestmock-with-the-module-factory-parameter
const mockRedisXrange = jest.fn();
const mockRedisXrange = jest.fn() as jest.MockedFunction<MockedRedisXrange>;
jest.mock('ws');
jest.mock('ioredis', () => {
@@ -59,7 +70,7 @@ import * as server from '../src/index';
import { statsd } from '../src/index';
describe('server', () => {
let statsdIncrementMock: jest.SpyInstance;
let statsdIncrementMock: jest.SpiedFunction<typeof statsd.increment>;
beforeEach(() => {
mockRedisXrange.mockClear();
@@ -319,10 +330,12 @@ describe('server', () => {
describe('wsConnection', () => {
let ws: WebSocket;
let wsEventMock: jest.SpyInstance;
let trackClientSpy: jest.SpyInstance;
let fetchRangeFromStreamSpy: jest.SpyInstance;
let dateNowSpy: jest.SpyInstance;
let wsEventMock: jest.SpiedFunction<typeof ws.on>;
let trackClientSpy: jest.SpiedFunction<typeof server.trackClient>;
let fetchRangeFromStreamSpy: jest.SpiedFunction<
typeof server.fetchRangeFromStream
>;
let dateNowSpy: jest.SpiedFunction<typeof Date.now>;
let socketInstanceExpected: server.SocketInstance;
const getRequest = (token: string, url: string): http.IncomingMessage => {
@@ -431,8 +444,8 @@ describe('server', () => {
describe('httpUpgrade', () => {
let socket: net.Socket;
let socketDestroySpy: jest.SpyInstance;
let wssUpgradeSpy: jest.SpyInstance;
let socketDestroySpy: jest.SpiedFunction<typeof socket.destroy>;
let wssUpgradeSpy: jest.SpiedFunction<typeof server.wss.handleUpgrade>;
const getRequest = (token: string, url: string): http.IncomingMessage => {
const request = new http.IncomingMessage(new net.Socket());
@@ -496,8 +509,8 @@ describe('server', () => {
describe('checkSockets', () => {
let ws: WebSocket;
let pingSpy: jest.SpyInstance;
let terminateSpy: jest.SpyInstance;
let pingSpy: jest.SpiedFunction<typeof ws.ping>;
let terminateSpy: jest.SpiedFunction<typeof ws.terminate>;
let socketInstance: server.SocketInstance;
beforeEach(() => {

View File

@@ -21,7 +21,7 @@ import * as net from 'net';
import WebSocket from 'ws';
import { v4 as uuidv4 } from 'uuid';
import jwt, { Algorithm } from 'jsonwebtoken';
import cookie from 'cookie';
import { parse } from 'cookie';
import Redis, { RedisOptions } from 'ioredis';
import StatsD from 'hot-shots';
@@ -285,7 +285,7 @@ export const processStreamResults = (results: StreamResult[]): void => {
* configured via 'jwtCookieName' in the config.
*/
const readChannelId = (request: http.IncomingMessage): string => {
const cookies = cookie.parse(request.headers.cookie || '');
const cookies = parse(request.headers.cookie || '');
const token = cookies[opts.jwtCookieName];
if (!token) throw new Error('JWT not present');

View File

@@ -30,6 +30,7 @@ def load_examples_run(
load_big_data: bool = False,
only_metadata: bool = False,
force: bool = False,
cleanup: bool = False,
) -> None:
if only_metadata:
logger.info("Loading examples metadata")
@@ -40,51 +41,41 @@ def load_examples_run(
# pylint: disable=import-outside-toplevel
import superset.examples.data_loading as examples
# Clear old examples if requested
if cleanup:
clear_old_examples()
examples.load_css_templates()
if load_test_data:
# Import test fixtures from tests directory
from tests.fixtures.examples.energy import load_energy
from tests.fixtures.examples.supported_charts_dashboard import (
load_supported_charts_dashboard,
)
from tests.fixtures.examples.tabbed_dashboard import load_tabbed_dashboard
logger.info("Loading energy related dataset")
examples.load_energy(only_metadata, force)
load_energy(only_metadata, force)
logger.info("Loading [World Bank's Health Nutrition and Population Stats]")
examples.load_world_bank_health_n_pop(only_metadata, force)
logger.info("Loading [Birth names]")
examples.load_birth_names(only_metadata, force)
if load_test_data:
logger.info("Loading [Tabbed dashboard]")
examples.load_tabbed_dashboard(only_metadata)
load_tabbed_dashboard(only_metadata)
logger.info("Loading [Supported Charts Dashboard]")
examples.load_supported_charts_dashboard()
load_supported_charts_dashboard()
else:
logger.info("Loading [Random long/lat data]")
examples.load_long_lat_data(only_metadata, force)
logger.info("Loading [Country Map data]")
examples.load_country_map_data(only_metadata, force)
logger.info("Loading [San Francisco population polygons]")
examples.load_sf_population_polygons(only_metadata, force)
logger.info("Loading [Flights data]")
examples.load_flights(only_metadata, force)
logger.info("Loading [BART lines]")
examples.load_bart_lines(only_metadata, force)
logger.info("Loading [Misc Charts] dashboard")
examples.load_misc_dashboard()
logger.info("Loading DECK.gl demo")
examples.load_deck_dash()
if load_big_data:
# Import test fixture from tests directory
from tests.fixtures.examples.big_data import load_big_data as load_big_data_func
logger.info("Loading big synthetic data for tests")
examples.load_big_data()
load_big_data_func()
# load examples that are stored as YAML config files
logger.info("Loading examples from YAML configuration files")
examples.load_examples_from_configs(force, load_test_data)
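The restructured `load_examples_run` above boils down to an ordered plan: core examples always load, the test fixtures replace the long tail of misc examples when `--load-test-data` is set, big data is opt-in, and the YAML configs load last. A flag-to-plan sketch (loader names are illustrative, not the real function names):

```python
def plan_example_loads(load_test_data: bool, load_big_data: bool) -> list[str]:
    """Return the ordered loader names implied by the CLI flags,
    mirroring the restructured load_examples_run() above.
    Names here are illustrative labels, not real Superset functions."""
    plan = ["css_templates", "energy", "world_bank", "birth_names"]
    if load_test_data:
        # test fixtures imported from tests/fixtures/examples/
        plan += ["tabbed_dashboard", "supported_charts_dashboard"]
    else:
        # the full set of demo datasets and dashboards
        plan += ["long_lat", "country_map", "sf_population_polygons",
                 "flights", "bart_lines", "misc_dashboard", "deck_dash"]
    if load_big_data:
        plan.append("big_data")
    # YAML-based examples always load last
    plan.append("yaml_configs")
    return plan
```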
@@ -112,4 +103,222 @@ def load_examples(
force: bool = False,
) -> None:
"""Loads a set of Slices and Dashboards and a supporting dataset"""
# Show deprecation warning
click.echo(
click.style(
"\nWARNING: 'superset load-examples' is deprecated. "
"Please use 'superset examples load' instead.\n",
fg="yellow",
),
err=True,
)
load_examples_run(load_test_data, load_big_data, only_metadata, force)
# New CLI structure
@click.group(name="examples", help="Manage example data")
def examples_cli() -> None:
"""Group for example-related commands."""
pass
@examples_cli.command(name="load", help="Load example data into the database")
@with_appcontext
@transaction()
@click.option("--load-test-data", "-t", is_flag=True, help="Load additional test data")
@click.option("--load-big-data", "-b", is_flag=True, help="Load additional big data")
@click.option(
"--only-metadata",
"-m",
is_flag=True,
help="Only load metadata, skip actual data",
)
@click.option(
"--force",
"-f",
is_flag=True,
help="Force load data even if table already exists",
)
def load(
load_test_data: bool = False,
load_big_data: bool = False,
only_metadata: bool = False,
force: bool = False,
) -> None:
"""Load example datasets, charts, and dashboards."""
load_examples_run(
load_test_data, load_big_data, only_metadata, force, cleanup=False
)
def clear_old_examples() -> bool:
"""
Clear old Python-generated examples.
Returns True if clear was performed, False otherwise.
"""
from superset import db
from superset.connectors.sqla.models import SqlaTable
from superset.examples.utils import _has_old_examples
from superset.models.core import Database
from superset.models.dashboard import Dashboard, dashboard_slices
from superset.models.slice import Slice
# Check if old examples exist
if not _has_old_examples():
logger.info("No old examples found to clear")
return False
# Find the examples database
examples_db = db.session.query(Database).filter_by(database_name="examples").first()
if not examples_db:
return False
logger.info("Found examples database (id=%s)", examples_db.id)
logger.info("Clearing old examples...")
# 1. Get all datasets from examples database
example_datasets = (
db.session.query(SqlaTable).filter_by(database_id=examples_db.id).all()
)
dataset_ids = [ds.id for ds in example_datasets]
logger.info("Found %d example datasets", len(example_datasets))
# 2. Find all charts using these datasets
example_charts = []
if dataset_ids:
example_charts = (
db.session.query(Slice)
.filter(
Slice.datasource_id.in_(dataset_ids),
Slice.datasource_type == "table",
)
.all()
)
logger.info("Found %d example charts", len(example_charts))
chart_ids = [chart.id for chart in example_charts]
# 3. Find dashboards that contain these charts
example_dashboards = []
if chart_ids:
# Get dashboards that have relationships with our example charts
example_dashboards = (
db.session.query(Dashboard)
.join(dashboard_slices)
.filter(dashboard_slices.c.slice_id.in_(chart_ids))
.distinct()
.all()
)
logger.info("Found %d example dashboards", len(example_dashboards))
# Remove dashboard-slice relationships first
db.session.execute(
dashboard_slices.delete().where(dashboard_slices.c.slice_id.in_(chart_ids))
)
logger.info(
"Removed dashboard-slice relationships for %d charts",
len(chart_ids),
)
# 4. Delete dashboards that are now empty (contain only example charts)
for dashboard in example_dashboards:
# Since we already deleted the relationships, check if dashboard is empty
remaining_charts = (
db.session.query(dashboard_slices.c.slice_id)
.filter(dashboard_slices.c.dashboard_id == dashboard.id)
.count()
)
if remaining_charts == 0:
db.session.delete(dashboard)
logger.info(
"Deleted dashboard: %s (slug: %s)",
dashboard.dashboard_title,
dashboard.slug,
)
else:
logger.info(
"Keeping dashboard %s as it contains non-example charts",
dashboard.dashboard_title,
)
# 5. Delete charts
for chart in example_charts:
db.session.delete(chart)
logger.info("Deleted %d example charts", len(example_charts))
# 6. Delete the database - this will cascade delete all datasets,
# columns, and metrics thanks to the cascade="all, delete-orphan"
db.session.delete(examples_db)
logger.info("Examples database and all related objects removed successfully")
return True
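The deletion ordering above (drop dashboard-slice links, then delete now-empty dashboards, then charts, then let the database delete cascade the datasets) can be sketched independently of SQLAlchemy. The dict shapes below are hypothetical stand-ins for the ORM models; only the step order mirrors the command:

```python
def clear_examples(datasets, charts, dashboards, links):
    """Sketch of the clear-old ordering using plain dicts instead of
    SQLAlchemy models. 'links' stands in for the dashboard_slices
    association table; all shapes here are illustrative."""
    dataset_ids = {d["id"] for d in datasets}
    chart_ids = {c["id"] for c in charts if c["dataset_id"] in dataset_ids}
    # 1. remove dashboard-chart relationships for example charts
    links = [lk for lk in links if lk["chart_id"] not in chart_ids]
    # 2. delete dashboards that are now empty (kept only if they still
    #    link to non-example charts)
    still_linked = {lk["dashboard_id"] for lk in links}
    dashboards = [d for d in dashboards if d["id"] in still_linked]
    # 3. delete the example charts; datasets then go away with the
    #    database via the cascade
    charts = [c for c in charts if c["id"] not in chart_ids]
    return charts, dashboards, links
```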
@examples_cli.command(name="clear-old", help="Clear old Python-based example data")
@with_appcontext
@transaction()
@click.option(
"--confirm",
is_flag=True,
help="Skip confirmation prompt",
)
def clear_old(confirm: bool = False) -> None:
"""Clear old Python-generated example datasets, charts, and dashboards."""
if not confirm:
click.confirm(
"This will delete old Python-based example data. Are you sure?",
abort=True,
)
try:
if clear_old_examples():
logger.info("Old examples cleared successfully")
else:
logger.info("No old examples found to clear")
except Exception as e:
logger.error("Failed to clear old examples: %s", e)
raise
@examples_cli.command(name="clear", help="Clear all example data (NOT YET IMPLEMENTED)")
@with_appcontext
def clear() -> None:
"""Clear all example data including YAML-based examples."""
click.echo(
click.style(
"Clearing YAML-based examples is NOT YET IMPLEMENTED.\n"
"Use 'superset examples clear-old' to remove old Python-based examples.",
fg="yellow",
)
)
@examples_cli.command(name="reload", help="Clear and reload example data")
@with_appcontext
@transaction()
@click.option("--load-test-data", "-t", is_flag=True, help="Load additional test data")
@click.option("--load-big-data", "-b", is_flag=True, help="Load additional big data")
@click.option(
"--only-metadata",
"-m",
is_flag=True,
help="Only load metadata, skip actual data",
)
@click.option(
"--force",
"-f",
is_flag=True,
help="Force load data even if table already exists",
)
def reload(
load_test_data: bool = False,
load_big_data: bool = False,
only_metadata: bool = False,
force: bool = False,
) -> None:
"""Clear existing examples and load fresh ones."""
# This is essentially the old --cleanup behavior
load_examples_run(load_test_data, load_big_data, only_metadata, force, cleanup=True)

View File

@@ -16,6 +16,7 @@
# under the License.
import copy
import logging
from inspect import isclass
from typing import Any
@@ -27,6 +28,8 @@ from superset.models.slice import Slice
from superset.utils import json
from superset.utils.core import AnnotationType, get_user
logger = logging.getLogger(__name__)
def filter_chart_annotations(chart_config: dict[str, Any]) -> None:
"""
@@ -63,10 +66,13 @@ def import_chart(
if not overwrite or not can_write:
return existing
config["id"] = existing.id
logger.info(f"Updating existing chart: {config.get('slice_name')}")
elif not can_write:
raise ImportFailedError(
"Chart doesn't exist and user doesn't have permission to create charts"
)
else:
logger.info(f"Creating new chart: {config.get('slice_name')}")
filter_chart_annotations(config)

View File

@@ -123,6 +123,9 @@ class ExportDashboardsCommand(ExportModelsCommand):
include_defaults=True,
export_uuids=True,
)
# Remove theme_id from export to make dashboards theme-free
payload.pop("theme_id", None)
# TODO (betodealmeida): move this logic to export_to_dict once this
# becomes the default export endpoint
for key, new_name in JSON_KEYS.items():

View File

@@ -166,11 +166,14 @@ def import_dashboard( # noqa: C901
elif not overwrite or not can_write:
return existing
config["id"] = existing.id
logger.info(f"Updating existing dashboard: {config.get('dashboard_title')}")
elif not can_write:
raise ImportFailedError(
"Dashboard doesn't exist and user doesn't "
"have permission to create dashboards"
)
else:
logger.info(f"Creating new dashboard: {config.get('dashboard_title')}")
# TODO (betodealmeida): move this logic to import_from_dict
config = config.copy()

View File

@@ -46,10 +46,13 @@ def import_database(
if not overwrite or not can_write:
return existing
config["id"] = existing.id
logger.info(f"Updating existing database: {config.get('database_name')}")
elif not can_write:
raise ImportFailedError(
"Database doesn't exist and user doesn't have permission to create databases" # noqa: E501
)
else:
logger.info(f"Creating new database: {config.get('database_name')}")
# Check if this URI is allowed
if app.config["PREVENT_UNSAFE_DB_CONNECTIONS"]:
try:

View File

@@ -124,11 +124,13 @@ def import_dataset( # noqa: C901
if not overwrite or not can_write:
return existing
config["id"] = existing.id
logger.info(f"Updating existing dataset: {config.get('table_name')}")
elif not can_write:
raise ImportFailedError(
"Dataset doesn't exist and user doesn't have permission to create datasets"
)
else:
logger.info(f"Creating new dataset: {config.get('table_name')}")
# TODO (betodealmeida): move this logic to import_from_dict
config = config.copy()
@@ -199,12 +201,22 @@ def load_data(data_uri: str, dataset: SqlaTable, database: Database) -> None:
:raises DatasetUnAllowedDataURI: If a dataset is trying
to load data from a URI that is not allowed.
"""
from superset.examples.helpers import normalize_example_data_url
# Convert example URLs to align with configuration
data_uri = normalize_example_data_url(data_uri)
validate_data_uri(data_uri)
logger.info("Downloading data from %s", data_uri)
data = request.urlopen(data_uri) # pylint: disable=consider-using-with # noqa: S310
if data_uri.endswith(".gz"):
data = gzip.open(data)
df = pd.read_csv(data, encoding="utf-8")
# Determine file format based on URI
if ".json" in data_uri:
df = pd.read_json(data, encoding="utf-8")
else:
df = pd.read_csv(data, encoding="utf-8")
dtype = get_dtype(df, dataset)
# convert temporal columns
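The branching above picks a pandas reader from the data URI: gzip-wrap when it ends in `.gz`, JSON when the URI contains `.json`, CSV otherwise. The same dispatch in isolation (helper name is hypothetical):

```python
def pick_example_reader(data_uri: str) -> tuple[bool, str]:
    """Mirror the load_data() dispatch above: gzip-decompress when the
    URI ends in .gz, and treat any URI containing ".json" as JSON,
    falling back to CSV. The helper name is illustrative."""
    needs_gzip = data_uri.endswith(".gz")
    fmt = "json" if ".json" in data_uri else "csv"
    return needs_gzip, fmt
```

Note that the `.json` check is a substring match, so `birth_names2.json.gz` is read as gzipped JSON.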

View File

@@ -195,4 +195,5 @@ class ImportExamplesCommand(ImportModelsCommand):
{"dashboard_id": dashboard_id, "slice_id": chart_id}
for (dashboard_id, chart_id) in dashboard_chart_ids
]
db.session.execute(dashboard_slices.insert(), values)
if values:
db.session.execute(dashboard_slices.insert(), values)

View File

@@ -190,6 +190,12 @@ def load_configs(
db_ssh_tunnel_priv_key_passws[config["uuid"]]
)
# Normalize example data URLs before schema validation
if prefix == "datasets" and "data" in config:
from superset.examples.helpers import normalize_example_data_url
config["data"] = normalize_example_data_url(config["data"])
schema.load(config)
configs[file_name] = config
except ValidationError as exc:

View File

@@ -1155,7 +1155,7 @@ class CeleryConfig: # pylint: disable=too-few-public-methods
}
CELERY_CONFIG: type[CeleryConfig] | None = CeleryConfig
CELERY_CONFIG: type[CeleryConfig] = CeleryConfig
# Set celery config to None to disable all the above configuration
# CELERY_CONFIG = None

View File

@@ -1368,10 +1368,23 @@ class SqlaTable(
return get_template_processor(table=self, database=self.database, **kwargs)
def get_sqla_table(self) -> TableClause:
tbl = table(self.table_name)
# For databases that support cross-catalog queries (like BigQuery),
# include the catalog in the table identifier to generate
# project.dataset.table format
if self.catalog and self.database.db_engine_spec.supports_cross_catalog_queries:
# SQLAlchemy doesn't have built-in catalog support for TableClause,
# so we need to construct the full identifier manually
if self.schema:
full_name = f"{self.catalog}.{self.schema}.{self.table_name}"
else:
full_name = f"{self.catalog}.{self.table_name}"
return table(full_name)
if self.schema:
tbl.schema = self.schema
return tbl
return table(self.table_name, schema=self.schema)
return table(self.table_name)
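The catalog/schema branching in `get_sqla_table()` above produces a dotted identifier such as BigQuery's `project.dataset.table`. As a plain-string sketch (function name hypothetical, mirroring only the branch logic):

```python
from typing import Optional


def qualified_table_name(
    table: str,
    schema: Optional[str] = None,
    catalog: Optional[str] = None,
    supports_cross_catalog: bool = False,
) -> str:
    """Build the dotted table identifier the way get_sqla_table() does:
    include the catalog only when the engine supports cross-catalog
    queries (e.g. BigQuery's project.dataset.table form)."""
    if catalog and supports_cross_catalog:
        parts = [catalog, schema, table] if schema else [catalog, table]
        return ".".join(parts)
    return f"{schema}.{table}" if schema else table
```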
def get_from_clause(
self,
@@ -1680,6 +1693,9 @@ class SqlaTable(
table=self,
)
new_column.is_dttm = new_column.is_temporal
# Set description from comment field if available
if col.get("comment"):
new_column.description = col["comment"]
db_engine_spec.alter_new_orm_column(new_column)
else:
new_column = old_column
@@ -1687,6 +1703,9 @@ class SqlaTable(
results.modified.append(col["column_name"])
new_column.type = col["type"]
new_column.expression = ""
# Set description from comment field if available
if col.get("comment"):
new_column.description = col["comment"]
new_column.groupby = True
new_column.filterable = True
columns.append(new_column)
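Both branches above copy a reflected column comment into the ORM column's description only when one is present. In isolation (dicts stand in for `TableColumn` and the inspector's column dict):

```python
def apply_column_comment(column_attrs: dict, reflected: dict) -> dict:
    """Copy a reflected 'comment' into the column's description when
    present, leaving any existing description untouched otherwise.
    Dict shapes here are illustrative stand-ins for the ORM objects."""
    if reflected.get("comment"):
        column_attrs["description"] = reflected["comment"]
    return column_attrs
```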

View File

@@ -1,71 +0,0 @@
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership. The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.
import logging
import polyline
from sqlalchemy import inspect, String, Text
from superset import db
from superset.sql.parse import Table
from superset.utils import json
from ..utils.database import get_example_database # noqa: TID252
from .helpers import get_table_connector_registry, read_example_data
logger = logging.getLogger(__name__)
def load_bart_lines(only_metadata: bool = False, force: bool = False) -> None:
tbl_name = "bart_lines"
database = get_example_database()
with database.get_sqla_engine() as engine:
schema = inspect(engine).default_schema_name
table_exists = database.has_table(Table(tbl_name, schema))
if not only_metadata and (not table_exists or force):
df = read_example_data(
"bart-lines.json.gz", encoding="latin-1", compression="gzip"
)
df["path_json"] = df.path.map(json.dumps)
df["polyline"] = df.path.map(polyline.encode)
del df["path"]
df.to_sql(
tbl_name,
engine,
schema=schema,
if_exists="replace",
chunksize=500,
dtype={
"color": String(255),
"name": String(255),
"polyline": Text,
"path_json": Text,
},
index=False,
)
logger.debug(f"Creating table {tbl_name} reference")
table = get_table_connector_registry()
tbl = db.session.query(table).filter_by(table_name=tbl_name).first()
if not tbl:
tbl = table(table_name=tbl_name, schema=schema)
db.session.add(tbl)
tbl.description = "BART lines"
tbl.database = database
tbl.filter_select_enabled = True
tbl.fetch_metadata()

View File

@@ -1,869 +0,0 @@
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership. The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.
import logging
import textwrap
from typing import Union
import pandas as pd
from sqlalchemy import DateTime, inspect, String
from sqlalchemy.sql import column
from superset import app, db, security_manager
from superset.connectors.sqla.models import SqlaTable, SqlMetric, TableColumn
from superset.models.core import Database
from superset.models.dashboard import Dashboard
from superset.models.slice import Slice
from superset.sql.parse import Table
from superset.utils import json
from superset.utils.core import DatasourceType
from ..utils.database import get_example_database # noqa: TID252
from .helpers import (
get_slice_json,
get_table_connector_registry,
merge_slice,
misc_dash_slices,
read_example_data,
update_slice_ids,
)
logger = logging.getLogger(__name__)
def gen_filter(
subject: str, comparator: str, operator: str = "=="
) -> dict[str, Union[bool, str]]:
return {
"clause": "WHERE",
"comparator": comparator,
"expressionType": "SIMPLE",
"operator": operator,
"subject": subject,
}
def load_data(tbl_name: str, database: Database, sample: bool = False) -> None:
pdf = read_example_data("birth_names2.json.gz", compression="gzip")
# TODO(bkyryliuk): move load examples data into the pytest fixture
if database.backend == "presto":
pdf.ds = pd.to_datetime(pdf.ds, unit="ms")
pdf.ds = pdf.ds.dt.strftime("%Y-%m-%d %H:%M%:%S")
else:
pdf.ds = pd.to_datetime(pdf.ds, unit="ms")
pdf = pdf.head(100) if sample else pdf
with database.get_sqla_engine() as engine:
schema = inspect(engine).default_schema_name
pdf.to_sql(
tbl_name,
engine,
schema=schema,
if_exists="replace",
chunksize=500,
dtype={
# TODO(bkyryliuk): use TIMESTAMP type for presto
"ds": DateTime if database.backend != "presto" else String(255),
"gender": String(16),
"state": String(10),
"name": String(255),
},
method="multi",
index=False,
)
logger.debug("Done loading table!")
logger.debug("-" * 80)
def load_birth_names(
only_metadata: bool = False, force: bool = False, sample: bool = False
) -> None:
"""Loading birth name dataset from a zip file in the repo"""
database = get_example_database()
with database.get_sqla_engine() as engine:
schema = inspect(engine).default_schema_name
tbl_name = "birth_names"
table_exists = database.has_table(Table(tbl_name, schema))
if not only_metadata and (not table_exists or force):
load_data(tbl_name, database, sample=sample)
table = get_table_connector_registry()
obj = db.session.query(table).filter_by(table_name=tbl_name, schema=schema).first()
if not obj:
logger.debug(f"Creating table [{tbl_name}] reference")
obj = table(table_name=tbl_name, schema=schema)
db.session.add(obj)
_set_table_metadata(obj, database)
_add_table_metrics(obj)
slices, _ = create_slices(obj)
create_dashboard(slices)
def _set_table_metadata(datasource: SqlaTable, database: "Database") -> None:
datasource.main_dttm_col = "ds"
datasource.database = database
datasource.filter_select_enabled = True
datasource.fetch_metadata()


def _add_table_metrics(datasource: SqlaTable) -> None:
    # By accessing the attribute first, we make sure `datasource.columns` and
    # `datasource.metrics` are already loaded. Otherwise accessing them later
    # may trigger an unnecessary and unexpected `after_update` event.
    columns, metrics = datasource.columns, datasource.metrics

    if not any(col.column_name == "num_california" for col in columns):
        col_state = str(column("state").compile(db.engine))
        col_num = str(column("num").compile(db.engine))
        columns.append(
            TableColumn(
                column_name="num_california",
                expression=f"CASE WHEN {col_state} = 'CA' THEN {col_num} ELSE 0 END",
            )
        )

    if not any(col.metric_name == "sum__num" for col in metrics):
        col = str(column("num").compile(db.engine))
        metrics.append(SqlMetric(metric_name="sum__num", expression=f"SUM({col})"))

    for col in columns:
        if col.column_name == "ds":  # type: ignore
            col.is_dttm = True  # type: ignore
            break

    datasource.columns = columns
    datasource.metrics = metrics


def create_slices(tbl: SqlaTable) -> tuple[list[Slice], list[Slice]]:
    owner = security_manager.get_user_by_id(1)
    metrics = [
        {
            "expressionType": "SIMPLE",
            "column": {"column_name": "num", "type": "BIGINT"},
            "aggregate": "SUM",
            "label": "Births",
            "optionName": "metric_11",
        }
    ]
    metric = "sum__num"

    defaults = {
        "compare_lag": "10",
        "compare_suffix": "o10Y",
        "limit": "25",
        "granularity_sqla": "ds",
        "groupby": [],
        "row_limit": app.config["ROW_LIMIT"],
        "time_range": "100 years ago : now",
        "viz_type": "table",
        "markup_type": "markdown",
    }

    default_query_context = {
        "result_format": "json",
        "result_type": "full",
        "datasource": {
            "id": tbl.id,
            "type": "table",
        },
        "queries": [
            {
                "columns": [],
                "metrics": [],
            },
        ],
    }

    slice_kwargs = {
        "datasource_id": tbl.id,
        "datasource_type": DatasourceType.TABLE,
    }

    logger.debug("Creating some slices")
    slices = [
        Slice(
            **slice_kwargs,
            slice_name="Participants",
            viz_type="big_number",
            params=get_slice_json(
                defaults,
                viz_type="big_number",
                granularity_sqla="ds",
                compare_lag="5",
                compare_suffix="over 5Y",
                metric=metric,
            ),
            owners=[],
        ),
        Slice(
            **slice_kwargs,
            slice_name="Genders",
            viz_type="pie",
            params=get_slice_json(
                defaults, viz_type="pie", groupby=["gender"], metric=metric
            ),
            owners=[],
        ),
        Slice(
            **slice_kwargs,
            slice_name="Trends",
            viz_type="echarts_timeseries_line",
            params=get_slice_json(
                defaults,
                viz_type="echarts_timeseries_line",
                groupby=["name"],
                granularity_sqla="ds",
                rich_tooltip=True,
                show_legend=True,
                metrics=metrics,
            ),
            owners=[],
        ),
        Slice(
            **slice_kwargs,
            slice_name="Genders by State",
            viz_type="echarts_timeseries_bar",
            params=get_slice_json(
                defaults,
                adhoc_filters=[
                    {
                        "clause": "WHERE",
                        "expressionType": "SIMPLE",
                        "filterOptionName": "2745eae5",
                        "comparator": ["other"],
                        "operator": "NOT IN",
                        "subject": "state",
                    }
                ],
                viz_type="echarts_timeseries_bar",
                metrics=[
                    {
                        "expressionType": "SIMPLE",
                        "column": {"column_name": "num_boys", "type": "BIGINT(20)"},
                        "aggregate": "SUM",
                        "label": "Boys",
                        "optionName": "metric_11",
                    },
                    {
                        "expressionType": "SIMPLE",
                        "column": {"column_name": "num_girls", "type": "BIGINT(20)"},
                        "aggregate": "SUM",
                        "label": "Girls",
                        "optionName": "metric_12",
                    },
                ],
                groupby=["state"],
            ),
            owners=[],
        ),
        Slice(
            **slice_kwargs,
            slice_name="Girls",
            viz_type="table",
            params=get_slice_json(
                defaults,
                groupby=["name"],
                adhoc_filters=[gen_filter("gender", "girl")],
                row_limit=50,
                timeseries_limit_metric=metric,
                metrics=[metric],
            ),
            owners=[],
        ),
        Slice(
            **slice_kwargs,
            slice_name="Girl Name Cloud",
            viz_type="word_cloud",
            params=get_slice_json(
                defaults,
                viz_type="word_cloud",
                size_from="10",
                series="name",
                size_to="70",
                rotation="square",
                limit="100",
                adhoc_filters=[gen_filter("gender", "girl")],
                metric=metric,
            ),
            owners=[],
        ),
        Slice(
            **slice_kwargs,
            slice_name="Boys",
            viz_type="table",
            params=get_slice_json(
                defaults,
                groupby=["name"],
                adhoc_filters=[gen_filter("gender", "boy")],
                row_limit=50,
                timeseries_limit_metric=metric,
                metrics=[metric],
            ),
            owners=[],
        ),
        Slice(
            **slice_kwargs,
            slice_name="Boy Name Cloud",
            viz_type="word_cloud",
            params=get_slice_json(
                defaults,
                viz_type="word_cloud",
                size_from="10",
                series="name",
                size_to="70",
                rotation="square",
                limit="100",
                adhoc_filters=[gen_filter("gender", "boy")],
                metric=metric,
            ),
            owners=[],
        ),
        Slice(
            **slice_kwargs,
            slice_name="Top 10 Girl Name Share",
            viz_type="echarts_area",
            params=get_slice_json(
                defaults,
                adhoc_filters=[gen_filter("gender", "girl")],
                comparison_type="values",
                groupby=["name"],
                limit=10,
                stacked_style="expand",
                time_grain_sqla="P1D",
                viz_type="echarts_area",
                x_axis_format="smart_date",
                metrics=metrics,
            ),
            owners=[],
        ),
        Slice(
            **slice_kwargs,
            slice_name="Top 10 Boy Name Share",
            viz_type="echarts_area",
            params=get_slice_json(
                defaults,
                adhoc_filters=[gen_filter("gender", "boy")],
                comparison_type="values",
                groupby=["name"],
                limit=10,
                stacked_style="expand",
                time_grain_sqla="P1D",
                viz_type="echarts_area",
                x_axis_format="smart_date",
                metrics=metrics,
            ),
            owners=[],
        ),
        Slice(
            **slice_kwargs,
            slice_name="Pivot Table v2",
            viz_type="pivot_table_v2",
            params=get_slice_json(
                defaults,
                viz_type="pivot_table_v2",
                groupbyRows=["name"],
                groupbyColumns=["state"],
                metrics=[metric],
            ),
            query_context=get_slice_json(
                default_query_context,
                queries=[
                    {
                        "columns": ["name", "state"],
                        "metrics": [metric],
                    }
                ],
            ),
            owners=[],
        ),
    ]
    misc_slices = [
        Slice(
            **slice_kwargs,
            slice_name="Average and Sum Trends",
            viz_type="mixed_timeseries",
            params=get_slice_json(
                defaults,
                viz_type="mixed_timeseries",
                metrics=[
                    {
                        "expressionType": "SIMPLE",
                        "column": {"column_name": "num", "type": "BIGINT(20)"},
                        "aggregate": "AVG",
                        "label": "AVG(num)",
                        "optionName": "metric_vgops097wej_g8uff99zhk7",
                    }
                ],
                metrics_b=["sum__num"],
                granularity_sqla="ds",
                yAxisIndex=0,
                yAxisIndexB=1,
            ),
            owners=[],
        ),
        Slice(
            **slice_kwargs,
            slice_name="Num Births Trend",
            viz_type="echarts_timeseries_line",
            params=get_slice_json(
                defaults, viz_type="echarts_timeseries_line", metrics=metrics
            ),
            owners=[],
        ),
        Slice(
            **slice_kwargs,
            slice_name="Daily Totals",
            viz_type="table",
            params=get_slice_json(
                defaults,
                groupby=["ds"],
                time_range="1983 : 2023",
                viz_type="table",
                metrics=metrics,
            ),
            query_context=get_slice_json(
                default_query_context,
                queries=[
                    {
                        "columns": ["ds"],
                        "metrics": metrics,
                        "time_range": "1983 : 2023",
                    }
                ],
            ),
            owners=[],
        ),
        Slice(
            **slice_kwargs,
            slice_name="Number of California Births",
            viz_type="big_number_total",
            params=get_slice_json(
                defaults,
                metric={
                    "expressionType": "SIMPLE",
                    "column": {
                        "column_name": "num_california",
                        "expression": "CASE WHEN state = 'CA' THEN num ELSE 0 END",
                    },
                    "aggregate": "SUM",
                    "label": "SUM(num_california)",
                },
                viz_type="big_number_total",
                granularity_sqla="ds",
            ),
            owners=[],
        ),
        Slice(
            **slice_kwargs,
            slice_name="Top 10 California Names Timeseries",
            viz_type="echarts_timeseries_line",
            params=get_slice_json(
                defaults,
                metrics=[
                    {
                        "expressionType": "SIMPLE",
                        "column": {
                            "column_name": "num_california",
                            "expression": "CASE WHEN state = 'CA' THEN num ELSE 0 END",
                        },
                        "aggregate": "SUM",
                        "label": "SUM(num_california)",
                    }
                ],
                viz_type="echarts_timeseries_line",
                granularity_sqla="ds",
                groupby=["name"],
                timeseries_limit_metric={
                    "expressionType": "SIMPLE",
                    "column": {
                        "column_name": "num_california",
                        "expression": "CASE WHEN state = 'CA' THEN num ELSE 0 END",
                    },
                    "aggregate": "SUM",
                    "label": "SUM(num_california)",
                },
                limit="10",
            ),
            owners=[owner] if owner else [],
        ),
        Slice(
            **slice_kwargs,
            slice_name="Names Sorted by Num in California",
            viz_type="table",
            params=get_slice_json(
                defaults,
                metrics=metrics,
                groupby=["name"],
                row_limit=50,
                timeseries_limit_metric={
                    "expressionType": "SIMPLE",
                    "column": {
                        "column_name": "num_california",
                        "expression": "CASE WHEN state = 'CA' THEN num ELSE 0 END",
                    },
                    "aggregate": "SUM",
                    "label": "SUM(num_california)",
                },
            ),
            owners=[],
        ),
        Slice(
            **slice_kwargs,
            slice_name="Number of Girls",
            viz_type="big_number_total",
            params=get_slice_json(
                defaults,
                metric=metric,
                viz_type="big_number_total",
                granularity_sqla="ds",
                adhoc_filters=[gen_filter("gender", "girl")],
                subheader="total female participants",
            ),
            owners=[],
        ),
        Slice(
            **slice_kwargs,
            slice_name="Pivot Table",
            viz_type="pivot_table_v2",
            params=get_slice_json(
                defaults,
                viz_type="pivot_table_v2",
                groupbyRows=["name"],
                groupbyColumns=["state"],
                metrics=metrics,
            ),
            owners=[],
        ),
    ]

    for slc in slices:
        merge_slice(slc)

    for slc in misc_slices:
        merge_slice(slc)
        misc_dash_slices.add(slc.slice_name)

    return slices, misc_slices


def create_dashboard(slices: list[Slice]) -> Dashboard:
    logger.debug("Creating a dashboard")
    dash = db.session.query(Dashboard).filter_by(slug="births").first()
    if not dash:
        dash = Dashboard()
        db.session.add(dash)

    dash.published = True
    dash.json_metadata = textwrap.dedent(
        """\
    {
        "label_colors": {
            "Girls": "#FF69B4",
            "Boys": "#ADD8E6",
            "girl": "#FF69B4",
            "boy": "#ADD8E6"
        }
    }"""
    )
    # pylint: disable=line-too-long
    pos = json.loads(
        textwrap.dedent(
            """\
        {
            "CHART-6GdlekVise": {
                "children": [],
                "id": "CHART-6GdlekVise",
                "meta": {
                    "chartId": 5547,
                    "height": 50,
                    "sliceName": "Top 10 Girl Name Share",
                    "width": 5
                },
                "parents": [
                    "ROOT_ID",
                    "GRID_ID",
                    "ROW-eh0w37bWbR"
                ],
                "type": "CHART"
            },
            "CHART-6n9jxb30JG": {
                "children": [],
                "id": "CHART-6n9jxb30JG",
                "meta": {
                    "chartId": 5540,
                    "height": 36,
                    "sliceName": "Genders by State",
                    "width": 5
                },
                "parents": [
                    "ROOT_ID",
                    "GRID_ID",
                    "ROW--EyBZQlDi"
                ],
                "type": "CHART"
            },
            "CHART-Jj9qh1ol-N": {
                "children": [],
                "id": "CHART-Jj9qh1ol-N",
                "meta": {
                    "chartId": 5545,
                    "height": 50,
                    "sliceName": "Boy Name Cloud",
                    "width": 4
                },
                "parents": [
                    "ROOT_ID",
                    "GRID_ID",
                    "ROW-kzWtcvo8R1"
                ],
                "type": "CHART"
            },
            "CHART-ODvantb_bF": {
                "children": [],
                "id": "CHART-ODvantb_bF",
                "meta": {
                    "chartId": 5548,
                    "height": 50,
                    "sliceName": "Top 10 Boy Name Share",
                    "width": 5
                },
                "parents": [
                    "ROOT_ID",
                    "GRID_ID",
                    "ROW-kzWtcvo8R1"
                ],
                "type": "CHART"
            },
            "CHART-PAXUUqwmX9": {
                "children": [],
                "id": "CHART-PAXUUqwmX9",
                "meta": {
                    "chartId": 5538,
                    "height": 34,
                    "sliceName": "Genders",
                    "width": 3
                },
                "parents": [
                    "ROOT_ID",
                    "GRID_ID",
                    "ROW-2n0XgiHDgs"
                ],
                "type": "CHART"
            },
            "CHART-_T6n_K9iQN": {
                "children": [],
                "id": "CHART-_T6n_K9iQN",
                "meta": {
                    "chartId": 5539,
                    "height": 36,
                    "sliceName": "Trends",
                    "width": 7
                },
                "parents": [
                    "ROOT_ID",
                    "GRID_ID",
                    "ROW--EyBZQlDi"
                ],
                "type": "CHART"
            },
            "CHART-eNY0tcE_ic": {
                "children": [],
                "id": "CHART-eNY0tcE_ic",
                "meta": {
                    "chartId": 5537,
                    "height": 34,
                    "sliceName": "Participants",
                    "width": 3
                },
                "parents": [
                    "ROOT_ID",
                    "GRID_ID",
                    "ROW-2n0XgiHDgs"
                ],
                "type": "CHART"
            },
            "CHART-g075mMgyYb": {
                "children": [],
                "id": "CHART-g075mMgyYb",
                "meta": {
                    "chartId": 5541,
                    "height": 50,
                    "sliceName": "Girls",
                    "width": 3
                },
                "parents": [
                    "ROOT_ID",
                    "GRID_ID",
                    "ROW-eh0w37bWbR"
                ],
                "type": "CHART"
            },
            "CHART-n-zGGE6S1y": {
                "children": [],
                "id": "CHART-n-zGGE6S1y",
                "meta": {
                    "chartId": 5542,
                    "height": 50,
                    "sliceName": "Girl Name Cloud",
                    "width": 4
                },
                "parents": [
                    "ROOT_ID",
                    "GRID_ID",
                    "ROW-eh0w37bWbR"
                ],
                "type": "CHART"
            },
            "CHART-vJIPjmcbD3": {
                "children": [],
                "id": "CHART-vJIPjmcbD3",
                "meta": {
                    "chartId": 5543,
                    "height": 50,
                    "sliceName": "Boys",
                    "width": 3
                },
                "parents": [
                    "ROOT_ID",
                    "GRID_ID",
                    "ROW-kzWtcvo8R1"
                ],
                "type": "CHART"
            },
            "DASHBOARD_VERSION_KEY": "v2",
            "GRID_ID": {
                "children": [
                    "ROW-2n0XgiHDgs",
                    "ROW--EyBZQlDi",
                    "ROW-eh0w37bWbR",
                    "ROW-kzWtcvo8R1"
                ],
                "id": "GRID_ID",
                "parents": [
                    "ROOT_ID"
                ],
                "type": "GRID"
            },
            "HEADER_ID": {
                "id": "HEADER_ID",
                "meta": {
                    "text": "Births"
                },
                "type": "HEADER"
            },
            "MARKDOWN-zaflB60tbC": {
                "children": [],
                "id": "MARKDOWN-zaflB60tbC",
                "meta": {
                    "code": "<div style=\\"text-align:center\\"> <h1>Birth Names Dashboard</h1> <img src=\\"/static/assets/images/babies.png\\" style=\\"width:50%;\\"></div>",
                    "height": 34,
                    "width": 6
                },
                "parents": [
                    "ROOT_ID",
                    "GRID_ID",
                    "ROW-2n0XgiHDgs"
                ],
                "type": "MARKDOWN"
            },
            "ROOT_ID": {
                "children": [
                    "GRID_ID"
                ],
                "id": "ROOT_ID",
                "type": "ROOT"
            },
            "ROW--EyBZQlDi": {
                "children": [
                    "CHART-_T6n_K9iQN",
                    "CHART-6n9jxb30JG"
                ],
                "id": "ROW--EyBZQlDi",
                "meta": {
                    "background": "BACKGROUND_TRANSPARENT"
                },
                "parents": [
                    "ROOT_ID",
                    "GRID_ID"
                ],
                "type": "ROW"
            },
            "ROW-2n0XgiHDgs": {
                "children": [
                    "CHART-eNY0tcE_ic",
                    "MARKDOWN-zaflB60tbC",
                    "CHART-PAXUUqwmX9"
                ],
                "id": "ROW-2n0XgiHDgs",
                "meta": {
                    "background": "BACKGROUND_TRANSPARENT"
                },
                "parents": [
                    "ROOT_ID",
                    "GRID_ID"
                ],
                "type": "ROW"
            },
            "ROW-eh0w37bWbR": {
                "children": [
                    "CHART-g075mMgyYb",
                    "CHART-n-zGGE6S1y",
                    "CHART-6GdlekVise"
                ],
                "id": "ROW-eh0w37bWbR",
                "meta": {
                    "background": "BACKGROUND_TRANSPARENT"
                },
                "parents": [
                    "ROOT_ID",
                    "GRID_ID"
                ],
                "type": "ROW"
            },
            "ROW-kzWtcvo8R1": {
                "children": [
                    "CHART-vJIPjmcbD3",
                    "CHART-Jj9qh1ol-N",
                    "CHART-ODvantb_bF"
                ],
                "id": "ROW-kzWtcvo8R1",
                "meta": {
                    "background": "BACKGROUND_TRANSPARENT"
                },
                "parents": [
                    "ROOT_ID",
                    "GRID_ID"
                ],
                "type": "ROW"
            }
        }
        """  # noqa: E501
        )
    )
    # pylint: enable=line-too-long
    # dashboard v2 doesn't allow adding markup slices
    dash.slices = [slc for slc in slices if slc.viz_type != "markup"]
    update_slice_ids(pos)
    dash.dashboard_title = "USA Births Names"
    dash.position_json = json.dumps(pos, indent=4)
    dash.slug = "births"
    return dash


@@ -0,0 +1,26 @@
slice_name: Birth in France by department in 2016
description: null
certified_by: null
certification_details: null
viz_type: country_map
params:
  entity: DEPT_ID
  granularity_sqla: ''
  metric:
    aggregate: AVG
    column:
      column_name: '2004'
      type: INT
    expressionType: SIMPLE
    label: Boys
    optionName: metric_112342
  row_limit: 500000
  select_country: france
  since: ''
  until: ''
  viz_type: country_map
query_context: null
cache_timeout: null
uuid: 6bd584f1-0ef5-44fc-8a05-61400f83bb62
version: 1.0.0
dataset_uuid: c21dd48d-9a4b-4a08-a926-47c3601c2a8d


@@ -0,0 +1,21 @@
slice_name: OSM Long/Lat
description: null
certified_by: null
certification_details: null
viz_type: osm
params:
  all_columns:
  - occupancy
  all_columns_x: LON
  all_columns_y: LAT
  granularity_sqla: day
  mapbox_style: https://tile.openstreetmap.org/{z}/{x}/{y}.png
  row_limit: 500000
  since: '2014-01-01'
  until: now
  viz_type: mapbox
query_context: null
cache_timeout: null
uuid: a4e90860-c8f5-4c50-8c04-06b2e144809c
version: 1.0.0
dataset_uuid: 605eaec7-ebf1-4fea-ac4b-07652fcb46e7


@@ -0,0 +1,31 @@
slice_name: Parallel Coordinates
description: null
certified_by: null
certification_details: null
viz_type: para
params:
  compare_lag: '10'
  compare_suffix: o10Y
  country_fieldtype: cca3
  entity: country_code
  granularity_sqla: year
  groupby: []
  limit: 100
  markup_type: markdown
  metrics:
  - sum__SP_POP_TOTL
  - sum__SP_RUR_TOTL_ZS
  - sum__SH_DYN_AIDS
  row_limit: 50000
  secondary_metric: sum__SP_POP_TOTL
  series: country_name
  show_bubbles: true
  since: '2011-01-01'
  time_range: '2014-01-01 : 2014-01-02'
  until: '2012-01-01'
  viz_type: para
query_context: null
cache_timeout: null
uuid: 041377c4-0ca9-4a40-8abd-befcd137c0dc
version: 1.0.0
dataset_uuid: 3b851597-e0e9-42a1-83e4-55547811742e


@@ -0,0 +1,30 @@
slice_name: Pivot Table v2
description: null
certified_by: null
certification_details: null
viz_type: pivot_table_v2
params:
  compare_lag: '10'
  compare_suffix: o10Y
  granularity_sqla: ds
  groupby: []
  groupbyColumns:
  - state
  groupbyRows:
  - name
  limit: '25'
  markup_type: markdown
  metrics:
  - sum__num
  row_limit: 50000
  time_range: '100 years ago : now'
  viz_type: pivot_table_v2
query_context: "{\n \"datasource\": {\n \"id\": 2,\n \"type\": \"\
  table\"\n },\n \"queries\": [\n {\n \"columns\": [\n \
  \ \"name\",\n \"state\"\n ],\n \
  \ \"metrics\": [\n \"sum__num\"\n ]\n }\n ],\n\
  \ \"result_format\": \"json\",\n \"result_type\": \"full\"\n}"
cache_timeout: null
uuid: 86778b63-19d8-4278-a79f-c90a1b31e162
version: 1.0.0
dataset_uuid: 4ec507ac-bece-4d2b-8dc3-cfb7c3515e76


@@ -0,0 +1,32 @@
slice_name: Average and Sum Trends
description: null
certified_by: null
certification_details: null
viz_type: mixed_timeseries
params:
  compare_lag: '10'
  compare_suffix: o10Y
  granularity_sqla: ds
  groupby: []
  limit: '25'
  markup_type: markdown
  metrics:
  - aggregate: AVG
    column:
      column_name: num
      type: BIGINT(20)
    expressionType: SIMPLE
    label: AVG(num)
    optionName: metric_vgops097wej_g8uff99zhk7
  metrics_b:
  - sum__num
  row_limit: 50000
  time_range: '100 years ago : now'
  viz_type: mixed_timeseries
  yAxisIndex: 0
  yAxisIndexB: 1
query_context: null
cache_timeout: null
uuid: 9c690f97-9196-5e01-bec9-8f4975ea5108
version: 1.0.0
dataset_uuid: 4ec507ac-bece-4d2b-8dc3-cfb7c3515e76


@@ -0,0 +1,31 @@
slice_name: Boy Name Cloud
description: null
certified_by: null
certification_details: null
viz_type: word_cloud
params:
  adhoc_filters:
  - clause: WHERE
    comparator: boy
    expressionType: SIMPLE
    operator: ==
    subject: gender
  compare_lag: '10'
  compare_suffix: o10Y
  granularity_sqla: ds
  groupby: []
  limit: '100'
  markup_type: markdown
  metric: sum__num
  rotation: square
  row_limit: 50000
  series: name
  size_from: '10'
  size_to: '70'
  time_range: '100 years ago : now'
  viz_type: word_cloud
query_context: null
cache_timeout: null
uuid: 6994ec83-0cf2-4a26-97e2-1e30b0002aa0
version: 1.0.0
dataset_uuid: 4ec507ac-bece-4d2b-8dc3-cfb7c3515e76


@@ -0,0 +1,30 @@
slice_name: Boys
description: null
certified_by: null
certification_details: null
viz_type: table
params:
  adhoc_filters:
  - clause: WHERE
    comparator: boy
    expressionType: SIMPLE
    operator: ==
    subject: gender
  compare_lag: '10'
  compare_suffix: o10Y
  granularity_sqla: ds
  groupby:
  - name
  limit: '25'
  markup_type: markdown
  metrics:
  - sum__num
  row_limit: 50
  time_range: '100 years ago : now'
  timeseries_limit_metric: sum__num
  viz_type: table
query_context: null
cache_timeout: null
uuid: 0af97164-82f0-42bb-a611-7093e5c56596
version: 1.0.0
dataset_uuid: 4ec507ac-bece-4d2b-8dc3-cfb7c3515e76


@@ -0,0 +1,28 @@
slice_name: Daily Totals
description: null
certified_by: null
certification_details: null
viz_type: table
params:
  granularity_sqla: ds
  groupby:
  - ds
  limit: '25'
  markup_type: markdown
  metrics:
  - aggregate: SUM
    column:
      column_name: num
      type: BIGINT
    expressionType: SIMPLE
    label: Births
    optionName: metric_11
  row_limit: 50
  time_range: '1983 : 2023'
  timeseries_limit_metric: sum__num
  viz_type: table
query_context: null
cache_timeout: null
uuid: a3d4f2e1-8c9b-4d2a-9e7f-1b6c8d5e2f4a
version: 1.0.0
dataset_uuid: 4ec507ac-bece-4d2b-8dc3-cfb7c3515e76


@@ -0,0 +1,22 @@
slice_name: Genders
description: null
certified_by: null
certification_details: null
viz_type: pie
params:
  compare_lag: '10'
  compare_suffix: o10Y
  granularity_sqla: ds
  groupby:
  - gender
  limit: '25'
  markup_type: markdown
  metric: sum__num
  row_limit: 50000
  time_range: '100 years ago : now'
  viz_type: pie
query_context: null
cache_timeout: null
uuid: fb05dca0-bd3e-4953-a0a5-94b51de3a653
version: 1.0.0
dataset_uuid: 4ec507ac-bece-4d2b-8dc3-cfb7c3515e76


@@ -0,0 +1,44 @@
slice_name: Genders by State
description: null
certified_by: null
certification_details: null
viz_type: echarts_timeseries_bar
params:
  adhoc_filters:
  - clause: WHERE
    comparator:
    - other
    expressionType: SIMPLE
    filterOptionName: 2745eae5
    operator: NOT IN
    subject: state
  compare_lag: '10'
  compare_suffix: o10Y
  granularity_sqla: ds
  groupby:
  - state
  limit: '25'
  markup_type: markdown
  metrics:
  - aggregate: SUM
    column:
      column_name: num_boys
      type: BIGINT(20)
    expressionType: SIMPLE
    label: Boys
    optionName: metric_11
  - aggregate: SUM
    column:
      column_name: num_girls
      type: BIGINT(20)
    expressionType: SIMPLE
    label: Girls
    optionName: metric_12
  row_limit: 50000
  time_range: '100 years ago : now'
  viz_type: echarts_timeseries_bar
query_context: null
cache_timeout: null
uuid: 2cc25185-3d8c-494c-aa3c-14f081ac7e54
version: 1.0.0
dataset_uuid: 4ec507ac-bece-4d2b-8dc3-cfb7c3515e76


@@ -0,0 +1,31 @@
slice_name: Girl Name Cloud
description: null
certified_by: null
certification_details: null
viz_type: word_cloud
params:
  adhoc_filters:
  - clause: WHERE
    comparator: girl
    expressionType: SIMPLE
    operator: ==
    subject: gender
  compare_lag: '10'
  compare_suffix: o10Y
  granularity_sqla: ds
  groupby: []
  limit: '100'
  markup_type: markdown
  metric: sum__num
  rotation: square
  row_limit: 50000
  series: name
  size_from: '10'
  size_to: '70'
  time_range: '100 years ago : now'
  viz_type: word_cloud
query_context: null
cache_timeout: null
uuid: ba6574fe-a6c0-41ef-9499-1ea6ff36bd2d
version: 1.0.0
dataset_uuid: 4ec507ac-bece-4d2b-8dc3-cfb7c3515e76


@@ -0,0 +1,30 @@
slice_name: Girls
description: null
certified_by: null
certification_details: null
viz_type: table
params:
  adhoc_filters:
  - clause: WHERE
    comparator: girl
    expressionType: SIMPLE
    operator: ==
    subject: gender
  compare_lag: '10'
  compare_suffix: o10Y
  granularity_sqla: ds
  groupby:
  - name
  limit: '25'
  markup_type: markdown
  metrics:
  - sum__num
  row_limit: 50
  time_range: '100 years ago : now'
  timeseries_limit_metric: sum__num
  viz_type: table
query_context: null
cache_timeout: null
uuid: 44cfa30e-af8e-4176-8612-4df0c0609516
version: 1.0.0
dataset_uuid: 4ec507ac-bece-4d2b-8dc3-cfb7c3515e76


@@ -0,0 +1,36 @@
slice_name: Names Sorted by Num in California
description: null
certified_by: null
certification_details: null
viz_type: table
params:
  compare_lag: '10'
  compare_suffix: o10Y
  granularity_sqla: ds
  groupby:
  - name
  limit: '25'
  markup_type: markdown
  metrics:
  - aggregate: SUM
    column:
      column_name: num
      type: BIGINT
    expressionType: SIMPLE
    label: Births
    optionName: metric_11
  row_limit: 50
  time_range: '100 years ago : now'
  timeseries_limit_metric:
    aggregate: SUM
    column:
      column_name: num_california
      expression: CASE WHEN state = 'CA' THEN num ELSE 0 END
    expressionType: SIMPLE
    label: SUM(num_california)
  viz_type: table
query_context: null
cache_timeout: null
uuid: e49ed2c4-b8a3-5736-bafe-4658790b113a
version: 1.0.0
dataset_uuid: 4ec507ac-bece-4d2b-8dc3-cfb7c3515e76


@@ -0,0 +1,31 @@
slice_name: Num Births Trend
description: null
certified_by: null
certification_details: null
viz_type: echarts_timeseries_line
params:
  compare_lag: '10'
  compare_suffix: o10Y
  granularity_sqla: ds
  groupby:
  - name
  limit: '25'
  markup_type: markdown
  metrics:
  - aggregate: SUM
    column:
      column_name: num
      type: BIGINT
    expressionType: SIMPLE
    label: Births
    optionName: metric_11
  rich_tooltip: true
  row_limit: 50000
  show_legend: true
  time_range: '100 years ago : now'
  viz_type: echarts_timeseries_line
query_context: null
cache_timeout: null
uuid: 5b8c76e5-0e5e-45c1-b07e-3b2cb9b9c7e8
version: 1.0.0
dataset_uuid: 4ec507ac-bece-4d2b-8dc3-cfb7c3515e76


@@ -0,0 +1,27 @@
slice_name: Number of California Births
description: null
certified_by: null
certification_details: null
viz_type: big_number_total
params:
  compare_lag: '10'
  compare_suffix: o10Y
  granularity_sqla: ds
  groupby: []
  limit: '25'
  markup_type: markdown
  metric:
    aggregate: SUM
    column:
      column_name: num_california
      expression: CASE WHEN state = 'CA' THEN num ELSE 0 END
    expressionType: SIMPLE
    label: SUM(num_california)
  row_limit: 50000
  time_range: '100 years ago : now'
  viz_type: big_number_total
query_context: null
cache_timeout: null
uuid: 400ee69f-eda4-5fe8-bc30-299184e08048
version: 1.0.0
dataset_uuid: 4ec507ac-bece-4d2b-8dc3-cfb7c3515e76


@@ -0,0 +1,28 @@
slice_name: Number of Girls
description: null
certified_by: null
certification_details: null
viz_type: big_number_total
params:
  adhoc_filters:
  - clause: WHERE
    comparator: girl
    expressionType: SIMPLE
    operator: ==
    subject: gender
  compare_lag: '10'
  compare_suffix: o10Y
  granularity_sqla: ds
  groupby: []
  limit: '25'
  markup_type: markdown
  metric: sum__num
  row_limit: 50000
  subheader: total female participants
  time_range: '100 years ago : now'
  viz_type: big_number_total
query_context: null
cache_timeout: null
uuid: 2f1a8720-7ea6-5b0f-b419-b75163f6bf17
version: 1.0.0
dataset_uuid: 4ec507ac-bece-4d2b-8dc3-cfb7c3515e76


@@ -0,0 +1,21 @@
slice_name: Participants
description: null
certified_by: null
certification_details: null
viz_type: big_number
params:
  compare_lag: '5'
  compare_suffix: over 5Y
  granularity_sqla: ds
  groupby: []
  limit: '25'
  markup_type: markdown
  metric: sum__num
  row_limit: 50000
  time_range: '100 years ago : now'
  viz_type: big_number
query_context: null
cache_timeout: null
uuid: 89ae3c32-eafa-4466-82cf-8c4328420782
version: 1.0.0
dataset_uuid: 4ec507ac-bece-4d2b-8dc3-cfb7c3515e76


@@ -0,0 +1,32 @@
slice_name: Pivot Table
description: null
certified_by: null
certification_details: null
viz_type: pivot_table_v2
params:
  compare_lag: '10'
  compare_suffix: o10Y
  granularity_sqla: ds
  groupby: []
  groupbyColumns:
  - state
  groupbyRows:
  - name
  limit: '25'
  markup_type: markdown
  metrics:
  - aggregate: SUM
    column:
      column_name: num
      type: BIGINT
    expressionType: SIMPLE
    label: Births
    optionName: metric_11
  row_limit: 50000
  time_range: '100 years ago : now'
  viz_type: pivot_table_v2
query_context: null
cache_timeout: null
uuid: b9038f33-aea3-52de-840b-0a32f4c0eb41
version: 1.0.0
dataset_uuid: 4ec507ac-bece-4d2b-8dc3-cfb7c3515e76


@@ -0,0 +1,39 @@
slice_name: Top 10 Boy Name Share
description: null
certified_by: null
certification_details: null
viz_type: echarts_area
params:
  adhoc_filters:
  - clause: WHERE
    comparator: boy
    expressionType: SIMPLE
    operator: ==
    subject: gender
  compare_lag: '10'
  compare_suffix: o10Y
  comparison_type: values
  granularity_sqla: ds
  groupby:
  - name
  limit: 10
  markup_type: markdown
  metrics:
  - aggregate: SUM
    column:
      column_name: num
      type: BIGINT
    expressionType: SIMPLE
    label: Births
    optionName: metric_11
  row_limit: 50000
  stacked_style: expand
  time_grain_sqla: P1D
  time_range: '100 years ago : now'
  viz_type: echarts_area
  x_axis_format: smart_date
query_context: null
cache_timeout: null
uuid: f35cca46-bb11-440e-8ba1-7f021bfe52a7
version: 1.0.0
dataset_uuid: 4ec507ac-bece-4d2b-8dc3-cfb7c3515e76


@@ -0,0 +1,35 @@
slice_name: Top 10 California Names Timeseries
description: null
certified_by: null
certification_details: null
viz_type: echarts_timeseries_line
params:
  compare_lag: '10'
  compare_suffix: o10Y
  granularity_sqla: ds
  groupby:
  - name
  limit: '10'
  markup_type: markdown
  metrics:
  - aggregate: SUM
    column:
      column_name: num_california
      expression: CASE WHEN state = 'CA' THEN num ELSE 0 END
    expressionType: SIMPLE
    label: SUM(num_california)
  row_limit: 50000
  time_range: '100 years ago : now'
  timeseries_limit_metric:
    aggregate: SUM
    column:
      column_name: num_california
      expression: CASE WHEN state = 'CA' THEN num ELSE 0 END
    expressionType: SIMPLE
    label: SUM(num_california)
  viz_type: echarts_timeseries_line
query_context: null
cache_timeout: null
uuid: 6a587b9e-e28b-5c2a-abb9-c6c1f4fd56b5
version: 1.0.0
dataset_uuid: 4ec507ac-bece-4d2b-8dc3-cfb7c3515e76


@@ -0,0 +1,39 @@
slice_name: Top 10 Girl Name Share
description: null
certified_by: null
certification_details: null
viz_type: echarts_area
params:
  adhoc_filters:
  - clause: WHERE
    comparator: girl
    expressionType: SIMPLE
    operator: ==
    subject: gender
  compare_lag: '10'
  compare_suffix: o10Y
  comparison_type: values
  granularity_sqla: ds
  groupby:
  - name
  limit: 10
  markup_type: markdown
  metrics:
  - aggregate: SUM
    column:
      column_name: num
      type: BIGINT
    expressionType: SIMPLE
    label: Births
    optionName: metric_11
  row_limit: 50000
  stacked_style: expand
  time_grain_sqla: P1D
  time_range: '100 years ago : now'
  viz_type: echarts_area
  x_axis_format: smart_date
query_context: null
cache_timeout: null
uuid: da76899a-d75c-467b-b0ce-cfa4819ed1b1
version: 1.0.0
dataset_uuid: 4ec507ac-bece-4d2b-8dc3-cfb7c3515e76


@@ -0,0 +1,31 @@
slice_name: Trends
description: null
certified_by: null
certification_details: null
viz_type: echarts_timeseries_line
params:
  compare_lag: '10'
  compare_suffix: o10Y
  granularity_sqla: ds
  groupby:
  - name
  limit: '25'
  markup_type: markdown
  metrics:
  - aggregate: SUM
    column:
      column_name: num
      type: BIGINT
    expressionType: SIMPLE
    label: Births
    optionName: metric_11
  rich_tooltip: true
  row_limit: 50000
  show_legend: true
  time_range: '100 years ago : now'
  viz_type: echarts_timeseries_line
query_context: null
cache_timeout: null
uuid: c6024db9-1695-4aa6-b846-42d9c96bfcbf
version: 1.0.0
dataset_uuid: 4ec507ac-bece-4d2b-8dc3-cfb7c3515e76


@@ -0,0 +1,120 @@
slice_name: Rise & Fall of Video Game Consoles
description: null
certified_by: null
certification_details: null
viz_type: echarts_area
params:
  adhoc_filters: []
  annotation_layers: []
  bottom_margin: auto
  color_scheme: supersetColors
  comparison_type: values
  contribution: false
  datasource: 21__table
  granularity_sqla: year
  groupby:
  - platform
  label_colors:
    '0': '#1FA8C9'
    '1': '#454E7C'
    '2600': '#666666'
    3DO: '#B2B2B2'
    3DS: '#D1C6BC'
    Action: '#1FA8C9'
    Adventure: '#454E7C'
    DC: '#A38F79'
    DS: '#8FD3E4'
    Europe: '#5AC189'
    Fighting: '#5AC189'
    GB: '#FDE380'
    GBA: '#ACE1C4'
    GC: '#5AC189'
    GEN: '#3CCCCB'
    GG: '#EFA1AA'
    Japan: '#FF7F44'
    Microsoft Game Studios: '#D1C6BC'
    Misc: '#FF7F44'
    N64: '#1FA8C9'
    NES: '#9EE5E5'
    NG: '#A1A6BD'
    Nintendo: '#D3B3DA'
    North America: '#666666'
    Other: '#E04355'
    PC: '#EFA1AA'
    PCFX: '#FDE380'
    PS: '#A1A6BD'
    PS2: '#FCC700'
    PS3: '#3CCCCB'
    PS4: '#B2B2B2'
    PSP: '#FEC0A1'
    PSV: '#FCC700'
    Platform: '#666666'
    Puzzle: '#E04355'
    Racing: '#FCC700'
    Role-Playing: '#A868B7'
    SAT: '#A868B7'
    SCD: '#8FD3E4'
    SNES: '#454E7C'
    Shooter: '#3CCCCB'
    Simulation: '#A38F79'
    Sports: '#8FD3E4'
    Strategy: '#A1A6BD'
    TG16: '#FEC0A1'
    Take-Two Interactive: '#9EE5E5'
    WS: '#ACE1C4'
    Wii: '#A38F79'
    WiiU: '#E04355'
    X360: '#A868B7'
    XB: '#D3B3DA'
    XOne: '#FF7F44'
  line_interpolation: linear
  metrics:
  - aggregate: SUM
    column:
      column_name: global_sales
      description: null
      expression: null
      filterable: true
      groupby: true
      id: 887
      is_dttm: false
      optionName: _col_Global_Sales
      python_date_format: null
      type: DOUBLE PRECISION
      verbose_name: null
    expressionType: SIMPLE
    hasCustomLabel: false
    isNew: false
    label: SUM(Global_Sales)
    optionName: metric_ufl75addr8c_oqqhdumirpn
    sqlExpression: null
  order_desc: true
  queryFields:
    groupby: groupby
    metrics: metrics
  rich_tooltip: true
  rolling_type: None
  row_limit: null
  show_brush: auto
  show_legend: false
  slice_id: 659
  stacked_style: stream
  time_grain_sqla: null
  time_range: No filter
  url_params:
    preselect_filters: '{"1389": {"platform": ["PS", "PS2", "PS3", "PS4"], "genre":
      null, "__time_range": "No filter"}}'
  viz_type: echarts_area
  x_axis_format: smart_date
  x_axis_label: Year Published
  x_axis_showminmax: true
  x_ticks_layout: auto
  y_axis_bounds:
  - null
  - null
  y_axis_format: SMART_NUMBER
query_context: null
cache_timeout: null
uuid: 3d926244-6e32-5e42-8ade-7302b83a65d7
version: 1.0.0
dataset_uuid: 53d47c0c-c03d-47f0-b9ac-81225f808283


@@ -0,0 +1,30 @@
slice_name: Box plot
description: null
certified_by: null
certification_details: null
viz_type: box_plot
params:
  compare_lag: '10'
  compare_suffix: o10Y
  country_fieldtype: cca3
  entity: country_code
  granularity_sqla: year
  groupby:
  - region
  limit: '25'
  markup_type: markdown
  metrics:
  - sum__SP_POP_TOTL
  row_limit: 50000
  show_bubbles: true
  since: '1960-01-01'
  time_range: '2014-01-01 : 2014-01-02'
  until: now
  viz_type: box_plot
  whisker_options: Min/max (no outliers)
  x_ticks_layout: staggered
query_context: null
cache_timeout: null
uuid: d31ba9c7-798b-4f84-87ef-ab31721680a8
version: 1.0.0
dataset_uuid: 3b851597-e0e9-42a1-83e4-55547811742e


@@ -0,0 +1,29 @@
slice_name: Growth Rate
description: null
certified_by: null
certification_details: null
viz_type: echarts_timeseries_line
params:
  compare_lag: '10'
  compare_suffix: o10Y
  country_fieldtype: cca3
  entity: country_code
  granularity_sqla: year
  groupby:
  - country_name
  limit: '25'
  markup_type: markdown
  metrics:
  - sum__SP_POP_TOTL
  num_period_compare: '10'
  row_limit: 50000
  show_bubbles: true
  since: '1960-01-01'
  time_range: '2014-01-01 : 2014-01-02'
  until: '2014-01-02'
  viz_type: echarts_timeseries_line
query_context: null
cache_timeout: null
uuid: cfcd7c5e-4759-4b28-bb7c-e2200508e978
version: 1.0.0
dataset_uuid: 3b851597-e0e9-42a1-83e4-55547811742e

Some files were not shown because too many files have changed in this diff.