diff --git a/.pre-commit-config.yaml b/.pre-commit-config.yaml index bc34e39bc24..a15303b2a3c 100644 --- a/.pre-commit-config.yaml +++ b/.pre-commit-config.yaml @@ -83,7 +83,7 @@ repos: files: ^superset-frontend/.*\.(js|jsx|ts|tsx)$ - id: eslint-docs name: eslint (docs) - entry: bash -c 'cd docs && FILES=$(echo "$@" | sed "s|docs/||g") && yarn eslint --fix --quiet $FILES' + entry: bash -c 'cd docs && FILES=$(printf "%s\n" "$@" | sed "s|^docs/||" | tr "\n" " ") && yarn eslint --fix --quiet $FILES' language: system pass_filenames: true files: ^docs/.*\.(js|jsx|ts|tsx)$ diff --git a/README.md b/README.md index e6dd708623d..e40a0b572b9 100644 --- a/README.md +++ b/README.md @@ -48,12 +48,16 @@ under the License. A modern, enterprise-ready business intelligence web application. +### Documentation + +- **[User Guide](https://superset.apache.org/user-docs/)** — For analysts and business users. Explore data, build charts, create dashboards, and connect databases. +- **[Administrator Guide](https://superset.apache.org/admin-docs/)** — Install, configure, and operate Superset. Covers security, scaling, and database drivers. +- **[Developer Guide](https://superset.apache.org/developer-docs/)** — Contribute to Superset or build on its REST API and extension framework. + [**Why Superset?**](#why-superset) | [**Supported Databases**](#supported-databases) | -[**Installation and Configuration**](#installation-and-configuration) | [**Release Notes**](https://github.com/apache/superset/blob/master/RELEASING/README.md#release-notes-for-recent-releases) | [**Get Involved**](#get-involved) | -[**Contributor Guide**](#contributor-guide) | [**Resources**](#resources) | [**Organizations Using Superset**](https://superset.apache.org/inTheWild) @@ -191,7 +195,7 @@ Try out Superset's [quickstart](https://superset.apache.org/docs/quickstart/) gu ## Contributor Guide Interested in contributing? 
Check out our -[Developer Portal](https://superset.apache.org/developer_portal/) +[Developer Guide](https://superset.apache.org/developer-docs/) to find resources around contributing along with a detailed guide on how to set up a development environment. diff --git a/docs/.claude/instructions.md b/docs/.claude/instructions.md index 54db2f6f121..4e3b5ccd538 100644 --- a/docs/.claude/instructions.md +++ b/docs/.claude/instructions.md @@ -111,5 +111,5 @@ InteractiveMenu.parameters = { - **Generator**: `docs/scripts/generate-superset-components.mjs` - **Wrapper**: `docs/src/components/StorybookWrapper.jsx` -- **Output**: `docs/developer_portal/components/` +- **Output**: `docs/developer_docs/components/` - **Stories**: `superset-frontend/packages/superset-ui-core/src/components/*/` diff --git a/docs/.gitignore b/docs/.gitignore index 37df51ce524..bdf3078bb36 100644 --- a/docs/.gitignore +++ b/docs/.gitignore @@ -33,7 +33,7 @@ docs/databases/ # Generated API documentation (regenerated at build time from openapi.json) # Source of truth is static/resources/openapi.json -docs/api/ +developer_docs/api/ # Generated component documentation MDX files (regenerated at build time) # Source of truth is Storybook stories in superset-frontend/packages/superset-ui-core/src/components/ diff --git a/docs/DOCS_CLAUDE.md b/docs/DOCS_CLAUDE.md index 039b701ba1d..1ac49462afe 100644 --- a/docs/DOCS_CLAUDE.md +++ b/docs/DOCS_CLAUDE.md @@ -416,7 +416,7 @@ If versions don't appear in dropdown: - [Docusaurus Documentation](https://docusaurus.io/docs) - [MDX Documentation](https://mdxjs.com/) -- [Superset Developer Portal](https://superset.apache.org/developer_portal/) +- [Superset Developer Docs](https://superset.apache.org/developer-docs/) - [Main Superset Documentation](https://superset.apache.org/docs/intro) ## 📖 Real Examples and Patterns diff --git a/docs/README.md b/docs/README.md index dbf7a51238b..6f3a6b4902e 100644 --- a/docs/README.md +++ b/docs/README.md @@ -19,15 +19,16 @@ under the 
License. This is the public documentation site for Superset, built using [Docusaurus 3](https://docusaurus.io/). See the -[Developer Portal](https://superset.apache.org/developer_portal/contributing/development-setup#documentation) +[Developer Docs](https://superset.apache.org/developer-docs/contributing/development-setup#documentation) for documentation on contributing to documentation. ## Version Management -The Superset documentation site uses Docusaurus versioning with three independent versioned sections: +The Superset documentation site uses Docusaurus versioning with four independent sections: -- **Main Documentation** (`/docs/`) - Core Superset documentation -- **Developer Portal** (`/developer_portal/`) - Developer guides and tutorials +- **User Documentation** (`/user-docs/`) - End-user guides and tutorials +- **Admin Documentation** (`/admin-docs/`) - Installation, configuration, and security +- **Developer Docs** (`/developer-docs/`) - Developer guides, contributing, and extensions - **Component Playground** (`/components/`) - Interactive component examples (currently disabled) Each section maintains its own version history and can be versioned independently. diff --git a/docs/docs/configuration/alerts-reports.mdx b/docs/admin_docs/configuration/alerts-reports.mdx similarity index 96% rename from docs/docs/configuration/alerts-reports.mdx rename to docs/admin_docs/configuration/alerts-reports.mdx index 9a6b1ff5732..8364b8e50ae 100644 --- a/docs/docs/configuration/alerts-reports.mdx +++ b/docs/admin_docs/configuration/alerts-reports.mdx @@ -20,12 +20,12 @@ Alerts and reports are disabled by default. To turn them on, you'll need to chan #### In your `superset_config.py` or `superset_config_docker.py` -- `"ALERT_REPORTS"` [feature flag](/docs/configuration/configuring-superset#feature-flags) must be turned to True. +- `"ALERT_REPORTS"` [feature flag](/admin-docs/configuration/configuring-superset#feature-flags) must be turned to True. 
- `beat_schedule` in CeleryConfig must contain schedule for `reports.scheduler`. - At least one of those must be configured, depending on what you want to use: - emails: `SMTP_*` settings - Slack messages: `SLACK_API_TOKEN` -- Users can customize the email subject by including date code placeholders, which will automatically be replaced with the corresponding UTC date when the email is sent. To enable this functionality, activate the `"DATE_FORMAT_IN_EMAIL_SUBJECT"` [feature flag](/docs/configuration/configuring-superset#feature-flags). This enables date formatting in email subjects, preventing all reporting emails from being grouped into the same thread (optional for the reporting feature). +- Users can customize the email subject by including date code placeholders, which will automatically be replaced with the corresponding UTC date when the email is sent. To enable this functionality, activate the `"DATE_FORMAT_IN_EMAIL_SUBJECT"` [feature flag](/admin-docs/configuration/configuring-superset#feature-flags). This enables date formatting in email subjects, preventing all reporting emails from being grouped into the same thread (optional for the reporting feature). - Use date codes from [strftime.org](https://strftime.org/) to create the email subject. - If no date code is provided, the original string will be used as the email subject. @@ -36,7 +36,7 @@ Screenshots will be taken but no messages actually sent as long as `ALERT_REPORT #### In your `Dockerfile` You'll need to extend the Superset image to include a headless browser. Your options include: -- Use Playwright with Chrome: this is the recommended approach as of version 4.1.x or greater. A working example of a Dockerfile that installs these tools is provided under "Building your own production Docker image" on the [Docker Builds](/docs/installation/docker-builds#building-your-own-production-docker-image) page. Read the code comments there as you'll also need to change a feature flag in your config. 
+- Use Playwright with Chrome: this is the recommended approach as of version 4.1.x or greater. A working example of a Dockerfile that installs these tools is provided under "Building your own production Docker image" on the [Docker Builds](/admin-docs/installation/docker-builds#building-your-own-production-docker-image) page. Read the code comments there as you'll also need to change a feature flag in your config. - Use Firefox: you'll need to install geckodriver and Firefox. - Use Chrome without Playwright: you'll need to install Chrome and set the value of `WEBDRIVER_TYPE` to `"chrome"` in your `superset_config.py`. @@ -84,7 +84,7 @@ SLACK_API_RATE_LIMIT_RETRY_COUNT = 5 ### Kubernetes-specific - You must have a `celery beat` pod running. If you're using the chart included in the GitHub repository under [helm/superset](https://github.com/apache/superset/tree/master/helm/superset), you need to put `supersetCeleryBeat.enabled = true` in your values override. -- You can see the dedicated docs about [Kubernetes installation](/docs/installation/kubernetes) for more details. +- You can see the dedicated docs about [Kubernetes installation](/admin-docs/installation/kubernetes) for more details. 
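Pulled together, the alert/report prerequisites described above (the feature flag, a `beat_schedule` entry for `reports.scheduler`, and SMTP or Slack credentials) might look roughly like this in `superset_config.py`. This is a hedged sketch, not a canonical configuration: the schedule interval, hostnames, and addresses are placeholders.

```python
from datetime import timedelta

FEATURE_FLAGS = {"ALERT_REPORTS": True}

class CeleryConfig:
    # Celery also accepts crontab() schedules; a timedelta keeps this sketch stdlib-only
    beat_schedule = {
        "reports.scheduler": {
            "task": "reports.scheduler",
            "schedule": timedelta(minutes=1),
        },
    }

CELERY_CONFIG = CeleryConfig

# Email delivery; alternatively set SLACK_API_TOKEN for Slack messages
SMTP_HOST = "smtp.example.com"
SMTP_PORT = 587
SMTP_MAIL_FROM = "superset@example.com"
```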
### Docker Compose specific diff --git a/docs/docs/configuration/async-queries-celery.mdx b/docs/admin_docs/configuration/async-queries-celery.mdx similarity index 100% rename from docs/docs/configuration/async-queries-celery.mdx rename to docs/admin_docs/configuration/async-queries-celery.mdx diff --git a/docs/docs/configuration/cache.mdx b/docs/admin_docs/configuration/cache.mdx similarity index 97% rename from docs/docs/configuration/cache.mdx rename to docs/admin_docs/configuration/cache.mdx index 73a73332ba6..4bb37483181 100644 --- a/docs/docs/configuration/cache.mdx +++ b/docs/admin_docs/configuration/cache.mdx @@ -84,11 +84,11 @@ Caching for SQL Lab query results is used when async queries are enabled and is Note that this configuration does not use a flask-caching dictionary for its configuration, but instead requires a cachelib object. -See [Async Queries via Celery](/docs/configuration/async-queries-celery) for details. +See [Async Queries via Celery](/admin-docs/configuration/async-queries-celery) for details. ## Caching Thumbnails -This is an optional feature that can be turned on by activating its [feature flag](/docs/configuration/configuring-superset#feature-flags) on config: +This is an optional feature that can be turned on by activating its [feature flag](/admin-docs/configuration/configuring-superset#feature-flags) on config: ``` FEATURE_FLAGS = { diff --git a/docs/docs/configuration/configuring-superset.mdx b/docs/admin_docs/configuration/configuring-superset.mdx similarity index 98% rename from docs/docs/configuration/configuring-superset.mdx rename to docs/admin_docs/configuration/configuring-superset.mdx index c0032f1727f..13b7edd5026 100644 --- a/docs/docs/configuration/configuring-superset.mdx +++ b/docs/admin_docs/configuration/configuring-superset.mdx @@ -37,7 +37,7 @@ ENV SUPERSET_CONFIG_PATH /app/superset_config.py ``` Docker compose deployments handle application configuration differently using specific conventions. 
-Refer to the [docker compose tips & configuration](/docs/installation/docker-compose#docker-compose-tips--configuration) +Refer to the [docker compose tips & configuration](/admin-docs/installation/docker-compose#docker-compose-tips--configuration) for details. The following is an example of just a few of the parameters you can set in your `superset_config.py` file: @@ -141,10 +141,10 @@ database engine on a separate host or container. Superset supports the following database engines/versions: -| Database Engine | Supported Versions | -| ----------------------------------------- | ---------------------------------------------- | -| [PostgreSQL](https://www.postgresql.org/) | 10.X, 11.X, 12.X, 13.X, 14.X, 15.X, 16.X, 17.X | -| [MySQL](https://www.mysql.com/) | 5.7, 8.X | +| Database Engine | Supported Versions | +| ----------------------------------------- | ---------------------------------------- | +| [PostgreSQL](https://www.postgresql.org/) | 10.X, 11.X, 12.X, 13.X, 14.X, 15.X, 16.X | +| [MySQL](https://www.mysql.com/) | 5.7, 8.X | Use the following database drivers and connection strings: @@ -246,7 +246,7 @@ flask --app "superset.app:create_app(superset_app_root='/analytics')" ### Docker builds -The [docker compose](/docs/installation/docker-compose#configuring-further) developer +The [docker compose](/admin-docs/installation/docker-compose#configuring-further) developer configuration includes an additional environmental variable, [`SUPERSET_APP_ROOT`](https://github.com/apache/superset/blob/master/docker/.env), to simplify the process of setting up a non-default root path across the services. @@ -441,7 +441,7 @@ FEATURE_FLAGS = { } ``` -A current list of feature flags can be found in the [Feature Flags](/docs/configuration/feature-flags) documentation. +A current list of feature flags can be found in the [Feature Flags](/admin-docs/configuration/feature-flags) documentation. 
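Tying the pieces above together, a metadata database from the supported-versions table plus a feature-flag override, a minimal `superset_config.py` sketch could look like the following. The hostname, credentials, and chosen flag are illustrative assumptions only; consult the Feature Flags page for the real flag list.

```python
# superset_config.py -- a minimal sketch; all values are placeholders
ROW_LIMIT = 5000

# Metadata database: one of the engines from the supported-versions table
SQLALCHEMY_DATABASE_URI = (
    "postgresql+psycopg2://superset:CHANGE_ME@metadata-db:5432/superset"
)

# Feature flags are merged over Superset's defaults
FEATURE_FLAGS = {
    "THUMBNAILS": True,  # example flag; see the Feature Flags documentation
}
```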
:::resources - [Blog: Feature Flags in Apache Superset](https://preset.io/blog/feature-flags-in-apache-superset-and-preset/) diff --git a/docs/docs/configuration/country-map-tools.mdx b/docs/admin_docs/configuration/country-map-tools.mdx similarity index 100% rename from docs/docs/configuration/country-map-tools.mdx rename to docs/admin_docs/configuration/country-map-tools.mdx diff --git a/docs/docs/configuration/event-logging.mdx b/docs/admin_docs/configuration/event-logging.mdx similarity index 100% rename from docs/docs/configuration/event-logging.mdx rename to docs/admin_docs/configuration/event-logging.mdx diff --git a/docs/docs/configuration/feature-flags.mdx b/docs/admin_docs/configuration/feature-flags.mdx similarity index 100% rename from docs/docs/configuration/feature-flags.mdx rename to docs/admin_docs/configuration/feature-flags.mdx diff --git a/docs/docs/configuration/importing-exporting-datasources.mdx b/docs/admin_docs/configuration/importing-exporting-datasources.mdx similarity index 100% rename from docs/docs/configuration/importing-exporting-datasources.mdx rename to docs/admin_docs/configuration/importing-exporting-datasources.mdx diff --git a/docs/docs/configuration/map-tiles.mdx b/docs/admin_docs/configuration/map-tiles.mdx similarity index 100% rename from docs/docs/configuration/map-tiles.mdx rename to docs/admin_docs/configuration/map-tiles.mdx diff --git a/docs/docs/configuration/networking-settings.mdx b/docs/admin_docs/configuration/networking-settings.mdx similarity index 97% rename from docs/docs/configuration/networking-settings.mdx rename to docs/admin_docs/configuration/networking-settings.mdx index 59017fa9612..f1af8a87ab6 100644 --- a/docs/docs/configuration/networking-settings.mdx +++ b/docs/admin_docs/configuration/networking-settings.mdx @@ -60,11 +60,11 @@ There are two approaches to making dashboards publicly accessible: **Option 2: Dashboard-level access (selective control)** 1. 
Set `PUBLIC_ROLE_LIKE = "Public"` in `superset_config.py` -2. Add the `'DASHBOARD_RBAC': True` [Feature Flag](/docs/configuration/feature-flags) +2. Add the `'DASHBOARD_RBAC': True` [Feature Flag](/admin-docs/configuration/feature-flags) 3. Edit each dashboard's properties and add the "Public" role 4. Only dashboards with the Public role explicitly assigned are visible to anonymous users -See the [Public role documentation](/docs/security/security#public) for more details. +See the [Public role documentation](/admin-docs/security/security#public) for more details. #### Embedding a Public Dashboard diff --git a/docs/docs/configuration/sql-templating.mdx b/docs/admin_docs/configuration/sql-templating.mdx similarity index 98% rename from docs/docs/configuration/sql-templating.mdx rename to docs/admin_docs/configuration/sql-templating.mdx index b6337e06876..893c1886419 100644 --- a/docs/docs/configuration/sql-templating.mdx +++ b/docs/admin_docs/configuration/sql-templating.mdx @@ -7,10 +7,14 @@ version: 1 # SQL Templating +:::tip Looking to use SQL templating? +For a user-focused guide on writing Jinja templates in SQL Lab and virtual datasets, see the [SQL Templating User Guide](/user-docs/using-superset/sql-templating). This page covers administrator configuration options. +::: + ## Jinja Templates SQL Lab and Explore supports [Jinja templating](https://jinja.palletsprojects.com/en/2.11.x/) in queries. -To enable templating, the `ENABLE_TEMPLATE_PROCESSING` [feature flag](/docs/configuration/configuring-superset#feature-flags) needs to be enabled in `superset_config.py`. +To enable templating, the `ENABLE_TEMPLATE_PROCESSING` [feature flag](/admin-docs/configuration/configuring-superset#feature-flags) needs to be enabled in `superset_config.py`. 
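As a minimal sketch, enabling that flag in `superset_config.py` looks like:

```python
# superset_config.py -- turns on Jinja templating in SQL Lab and Explore
FEATURE_FLAGS = {
    "ENABLE_TEMPLATE_PROCESSING": True,
}
```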
:::warning[Security Warning] diff --git a/docs/docs/configuration/theming.mdx b/docs/admin_docs/configuration/theming.mdx similarity index 100% rename from docs/docs/configuration/theming.mdx rename to docs/admin_docs/configuration/theming.mdx diff --git a/docs/docs/configuration/timezones.mdx b/docs/admin_docs/configuration/timezones.mdx similarity index 74% rename from docs/docs/configuration/timezones.mdx rename to docs/admin_docs/configuration/timezones.mdx index a1289d1cb9c..db53fcc6f46 100644 --- a/docs/docs/configuration/timezones.mdx +++ b/docs/admin_docs/configuration/timezones.mdx @@ -20,7 +20,7 @@ To help make the problem somewhat tractable—given that Apache Superset has no To strive for data consistency (regardless of the timezone of the client) the Apache Superset backend tries to ensure that any timestamp sent to the client has an explicit (or semi-explicit as in the case with [Epoch time](https://en.wikipedia.org/wiki/Unix_time) which is always in reference to UTC) timezone encoded within. -The challenge however lies with the slew of [database engines](/docs/databases#installing-drivers-in-docker) which Apache Superset supports and various inconsistencies between their [Python Database API (DB-API)](https://www.python.org/dev/peps/pep-0249/) implementations combined with the fact that we use [Pandas](https://pandas.pydata.org/) to read SQL into a DataFrame prior to serializing to JSON. Regrettably Pandas ignores the DB-API [type_code](https://www.python.org/dev/peps/pep-0249/#type-objects) relying by default on the underlying Python type returned by the DB-API. Currently only a subset of the supported database engines work correctly with Pandas, i.e., ensuring timestamps without an explicit timestamp are serializd to JSON with the server timezone, thus guaranteeing the client will display timestamps in a consistent manner irrespective of the client's timezone. 
+The challenge however lies with the slew of [database engines](/admin-docs/databases#installing-drivers-in-docker) which Apache Superset supports and various inconsistencies between their [Python Database API (DB-API)](https://www.python.org/dev/peps/pep-0249/) implementations combined with the fact that we use [Pandas](https://pandas.pydata.org/) to read SQL into a DataFrame prior to serializing to JSON. Regrettably Pandas ignores the DB-API [type_code](https://www.python.org/dev/peps/pep-0249/#type-objects) relying by default on the underlying Python type returned by the DB-API. Currently only a subset of the supported database engines work correctly with Pandas, i.e., ensuring timestamps without an explicit timezone are serialized to JSON with the server timezone, thus guaranteeing the client will display timestamps in a consistent manner irrespective of the client's timezone. For example the following is a comparison of MySQL and Presto, diff --git a/docs/admin_docs/index.md b/docs/admin_docs/index.md new file mode 100644 index 00000000000..8b1c1de4b1a --- /dev/null +++ b/docs/admin_docs/index.md @@ -0,0 +1,42 @@ +--- +title: Admin Documentation +description: Administrator guides for installing, configuring, and managing Apache Superset +--- + + + +# Admin Documentation + +This section contains documentation for system administrators and operators who deploy and manage Apache Superset installations. + +## What's in this section + +- **[Installation](/admin-docs/installation/installation-methods)** - Deploy Superset using Docker, Kubernetes, or PyPI +- **[Configuration](/admin-docs/configuration/configuring-superset)** - Configure authentication, caching, feature flags, and more +- **[Security](/admin-docs/security/)** - Set up roles, permissions, and secure your deployment + +## Related + +- **[Database Drivers](/user-docs/databases/)** - See User Docs for database connection setup (admins may need to install drivers) + +## Looking for something else?
+ +- **[User Documentation](/user-docs/)** - Guides for analysts and business users +- **[Developer Documentation](/developer-docs)** - Contributing, extensions, and development guides diff --git a/docs/docs/installation/architecture.mdx b/docs/admin_docs/installation/architecture.mdx similarity index 85% rename from docs/docs/installation/architecture.mdx rename to docs/admin_docs/installation/architecture.mdx index 7830d5e078f..e531b30017e 100644 --- a/docs/docs/installation/architecture.mdx +++ b/docs/admin_docs/installation/architecture.mdx @@ -24,10 +24,10 @@ A Superset installation is made up of these components: The optional components above are necessary to enable these features: -- [Alerts and Reports](/docs/configuration/alerts-reports) -- [Caching](/docs/configuration/cache) -- [Async Queries](/docs/configuration/async-queries-celery/) -- [Dashboard Thumbnails](/docs/configuration/cache/#caching-thumbnails) +- [Alerts and Reports](/admin-docs/configuration/alerts-reports) +- [Caching](/admin-docs/configuration/cache) +- [Async Queries](/admin-docs/configuration/async-queries-celery/) +- [Dashboard Thumbnails](/admin-docs/configuration/cache/#caching-thumbnails) If you install with Kubernetes or Docker Compose, all of these components will be created. @@ -59,7 +59,7 @@ The caching layer serves two main functions: - Store the results of queries to your data warehouse so that when a chart is loaded twice, it pulls from the cache the second time, speeding up the application and reducing load on your data warehouse. - Act as a message broker for the worker, enabling the Alerts & Reports, async queries, and thumbnail caching features. -Most people use Redis for their cache, but Superset supports other options too. See the [cache docs](/docs/configuration/cache/) for more. +Most people use Redis for their cache, but Superset supports other options too. See the [cache docs](/admin-docs/configuration/cache/) for more. 
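A hedged sketch of a Redis-backed cache configuration in `superset_config.py`; the host, port, and timeout are placeholders, and the key names follow Flask-Caching conventions:

```python
from datetime import timedelta

CACHE_CONFIG = {
    "CACHE_TYPE": "RedisCache",
    # cache chart data for one day (86400 seconds)
    "CACHE_DEFAULT_TIMEOUT": int(timedelta(days=1).total_seconds()),
    "CACHE_KEY_PREFIX": "superset_",
    "CACHE_REDIS_URL": "redis://localhost:6379/0",
}
```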
### Worker and Beat @@ -67,6 +67,6 @@ This is one or more workers who execute tasks like run async queries or take sna ## Other components -Other components can be incorporated into Superset. The best place to learn about additional configurations is the [Configuration page](/docs/configuration/configuring-superset). For instance, you could set up a load balancer or reverse proxy to implement HTTPS in front of your Superset application, or specify a Mapbox URL to enable geospatial charts, etc. +Other components can be incorporated into Superset. The best place to learn about additional configurations is the [Configuration page](/admin-docs/configuration/configuring-superset). For instance, you could set up a load balancer or reverse proxy to implement HTTPS in front of your Superset application, or specify a Mapbox URL to enable geospatial charts, etc. Superset won't even start without certain configuration settings established, so it's essential to review that page. diff --git a/docs/docs/installation/docker-builds.mdx b/docs/admin_docs/installation/docker-builds.mdx similarity index 100% rename from docs/docs/installation/docker-builds.mdx rename to docs/admin_docs/installation/docker-builds.mdx diff --git a/docs/docs/installation/docker-compose.mdx b/docs/admin_docs/installation/docker-compose.mdx similarity index 98% rename from docs/docs/installation/docker-compose.mdx rename to docs/admin_docs/installation/docker-compose.mdx index 9727b97eaeb..04d14915860 100644 --- a/docs/docs/installation/docker-compose.mdx +++ b/docs/admin_docs/installation/docker-compose.mdx @@ -17,11 +17,11 @@ Since `docker compose` is primarily designed to run a set of containers on **a s and can't support requirements for **high availability**, we do not support nor recommend using our `docker compose` constructs to support production-type use-cases. 
For single host environments, we recommend using [minikube](https://minikube.sigs.k8s.io/docs/start/) along -with our [installing on k8s](https://superset.apache.org/docs/installation/running-on-kubernetes) +with our [installing on k8s](https://superset.apache.org/admin-docs/installation/kubernetes) documentation. ::: -As mentioned in our [quickstart guide](/docs/quickstart), the fastest way to try +As mentioned in our [quickstart guide](/user-docs/quickstart), the fastest way to try Superset locally is using Docker Compose on a Linux or Mac OSX computer. Superset does not have official support for Windows. It's also the easiest way to launch a fully functioning **development environment** quickly. diff --git a/docs/docs/installation/installation-methods.mdx b/docs/admin_docs/installation/installation-methods.mdx similarity index 90% rename from docs/docs/installation/installation-methods.mdx rename to docs/admin_docs/installation/installation-methods.mdx index 51f3708b3ef..0e8f11b7d57 100644 --- a/docs/docs/installation/installation-methods.mdx +++ b/docs/admin_docs/installation/installation-methods.mdx @@ -9,11 +9,11 @@ import useBaseUrl from "@docusaurus/useBaseUrl"; # Installation Methods -How should you install Superset? Here's a comparison of the different options. It will help if you've first read the [Architecture](/docs/installation/architecture.mdx) page to understand Superset's different components. +How should you install Superset? Here's a comparison of the different options. It will help if you've first read the [Architecture](/admin-docs/installation/architecture) page to understand Superset's different components. The fundamental trade-off is between you needing to do more of the detail work yourself vs. using a more complex deployment route that handles those details.
-## [Docker Compose](/docs/installation/docker-compose.mdx) +## [Docker Compose](/admin-docs/installation/docker-compose) **Summary:** This takes advantage of containerization while remaining simpler than Kubernetes. This is the best way to try out Superset; it's also useful for developing & contributing back to Superset. @@ -27,9 +27,9 @@ You will need to back up your metadata DB. That could mean backing up the servic You will also need to extend the Superset docker image. The default `lean` images do not contain drivers needed to access your metadata database (Postgres or MySQL), nor to access your data warehouse, nor the headless browser needed for Alerts & Reports. You could run a `-dev` image while demoing Superset, which has some of this, but you'll still need to install the driver for your data warehouse. The `-dev` images run as root, which is not recommended for production. -Ideally you will build your own image of Superset that extends `lean`, adding what your deployment needs. See [Building your own production Docker image](/docs/installation/docker-builds/#building-your-own-production-docker-image). +Ideally you will build your own image of Superset that extends `lean`, adding what your deployment needs. See [Building your own production Docker image](/admin-docs/installation/docker-builds/#building-your-own-production-docker-image). -## [Kubernetes (K8s)](/docs/installation/kubernetes.mdx) +## [Kubernetes (K8s)](/admin-docs/installation/kubernetes) **Summary:** This is the best-practice way to deploy a production instance of Superset, but has the steepest skill requirement - someone who knows Kubernetes. @@ -41,7 +41,7 @@ A K8s deployment can scale up and down based on usage and deploy rolling updates You will need to build your own Docker image, and back up your metadata DB, both as described in Docker Compose above. You'll also need to customize your Helm chart values and deploy and maintain your Kubernetes cluster. 
-## [PyPI (Python)](/docs/installation/pypi.mdx) +## [PyPI (Python)](/admin-docs/installation/pypi) **Summary:** This is the only method that requires no knowledge of containers. It requires the most hands-on work to deploy, connect, and maintain each component. diff --git a/docs/docs/installation/kubernetes.mdx b/docs/admin_docs/installation/kubernetes.mdx similarity index 98% rename from docs/docs/installation/kubernetes.mdx rename to docs/admin_docs/installation/kubernetes.mdx index cf8e0542a67..e54cf47cd2e 100644 --- a/docs/docs/installation/kubernetes.mdx +++ b/docs/admin_docs/installation/kubernetes.mdx @@ -149,7 +149,7 @@ For production clusters it's recommended to build own image with this step done Superset requires a Python DB-API database driver and a SQLAlchemy dialect to be installed for each datastore you want to connect to. -See [Install Database Drivers](/docs/databases#installing-database-drivers) for more information. +See [Install Database Drivers](/admin-docs/databases#installing-database-drivers) for more information. It is recommended that you refer to versions listed in [pyproject.toml](https://github.com/apache/superset/blob/master/pyproject.toml) instead of hard-coding them in your bootstrap script, as seen below. 
@@ -310,7 +310,7 @@ configOverrides: ### Enable Alerts and Reports -For this, as per the [Alerts and Reports doc](/docs/configuration/alerts-reports), you will need to: +For this, as per the [Alerts and Reports doc](/admin-docs/configuration/alerts-reports), you will need to: #### Install a supported webdriver in the Celery worker @@ -323,7 +323,7 @@ supersetWorker: - -c - | # Install chrome webdriver - # See https://github.com/apache/superset/blob/4fa3b6c7185629b87c27fc2c0e5435d458f7b73d/docs/src/pages/docs/installation/email_reports.mdx + # See https://github.com/apache/superset/blob/4fa3b6c7185629b87c27fc2c0e5435d458f7b73d/docs/src/pages/docs/installation/email_reports.mdx apt-get update apt-get install -y wget wget https://dl.google.com/linux/direct/google-chrome-stable_current_amd64.deb diff --git a/docs/docs/installation/pypi.mdx b/docs/admin_docs/installation/pypi.mdx similarity index 97% rename from docs/docs/installation/pypi.mdx rename to docs/admin_docs/installation/pypi.mdx index 14228c20812..820ac776b95 100644 --- a/docs/docs/installation/pypi.mdx +++ b/docs/admin_docs/installation/pypi.mdx @@ -134,7 +134,7 @@ pip install apache_superset Then, define mandatory configurations, SECRET_KEY and FLASK_APP: ```bash -export SUPERSET_SECRET_KEY=YOUR-SECRET-KEY # For production use, make sure this is a strong key, for example generated using `openssl rand -base64 42`. See https://superset.apache.org/docs/configuration/configuring-superset#specifying-a-secret_key +export SUPERSET_SECRET_KEY=YOUR-SECRET-KEY # For production use, make sure this is a strong key, for example generated using `openssl rand -base64 42`.
See https://superset.apache.org/admin-docs/configuration/configuring-superset#specifying-a-secret_key export FLASK_APP=superset ``` diff --git a/docs/docs/installation/upgrading-superset.mdx b/docs/admin_docs/installation/upgrading-superset.mdx similarity index 100% rename from docs/docs/installation/upgrading-superset.mdx rename to docs/admin_docs/installation/upgrading-superset.mdx diff --git a/docs/docs/security/cves.mdx b/docs/admin_docs/security/cves.mdx similarity index 100% rename from docs/docs/security/cves.mdx rename to docs/admin_docs/security/cves.mdx diff --git a/docs/docs/security/securing_superset.mdx b/docs/admin_docs/security/securing_superset.mdx similarity index 98% rename from docs/docs/security/securing_superset.mdx rename to docs/admin_docs/security/securing_superset.mdx index a0d5f7b5914..92d42e385e0 100644 --- a/docs/docs/security/securing_superset.mdx +++ b/docs/admin_docs/security/securing_superset.mdx @@ -114,7 +114,7 @@ Superset can use Flask-Talisman to set security headers. However, it must be exp > > In Superset 4.0 and later, Talisman is disabled by default (`TALISMAN_ENABLED = False`). You **must** explicitly enable it in your `superset_config.py` for the security headers defined in `TALISMAN_CONFIG` to take effect. -Here's the documentation section how how to set up Talisman: https://superset.apache.org/docs/security/#content-security-policy-csp +Here's the documentation section on how to set up Talisman: https://superset.apache.org/admin-docs/security/#content-security-policy-csp ### **Database Security** @@ -171,7 +171,7 @@ Rotating the `SUPERSET_SECRET_KEY` is a critical security procedure. It is manda **Procedure for Rotating the Key** The procedure for safely rotating the SECRET_KEY must be followed precisely to avoid locking yourself out of your instance. The official Apache Superset documentation maintains the correct, up-to-date procedure.
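Rotation starts with generating a new strong key. As an alternative to the `openssl rand -base64 42` one-liner mentioned earlier, a comparably strong key can be produced from Python's standard library (a sketch, not the official procedure):

```python
import secrets

# 42 random bytes, comparable entropy to `openssl rand -base64 42`
new_secret_key = secrets.token_urlsafe(42)
print(new_secret_key)  # use this value for SUPERSET_SECRET_KEY
```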
Please follow the official guide here:
-https://superset.apache.org/docs/configuration/configuring-superset/#rotating-to-a-newer-secret_key
+https://superset.apache.org/admin-docs/configuration/configuring-superset/#rotating-to-a-newer-secret_key

:::resources
- [Blog: Running Apache Superset on the Open Internet](https://preset.io/blog/running-apache-superset-on-the-open-internet-a-report-from-the-fireline/)
diff --git a/docs/docs/security/security.mdx b/docs/admin_docs/security/security.mdx
similarity index 99%
rename from docs/docs/security/security.mdx
rename to docs/admin_docs/security/security.mdx
index 8983dd6d33a..cc68b1241b9 100644
--- a/docs/docs/security/security.mdx
+++ b/docs/admin_docs/security/security.mdx
@@ -431,7 +431,7 @@ TALISMAN_CONFIG = {
```

For more information on setting up Talisman, please refer to
-https://superset.apache.org/docs/configuration/networking-settings/#changing-flask-talisman-csp.
+https://superset.apache.org/admin-docs/configuration/networking-settings/#changing-flask-talisman-csp.

### Reporting Security Vulnerabilities
diff --git a/docs/docs/api.mdx b/docs/developer_docs/api.mdx
similarity index 100%
rename from docs/docs/api.mdx
rename to docs/developer_docs/api.mdx
diff --git a/docs/developer_docs/components/TODO.md b/docs/developer_docs/components/TODO.md
new file mode 100644
index 00000000000..c3564104795
--- /dev/null
+++ b/docs/developer_docs/components/TODO.md
@@ -0,0 +1,71 @@
+---
+title: Components TODO
+sidebar_class_name: hidden
+---
+
+# Components TODO
+
+These components were found but not yet supported for documentation generation.
+Future phases will add support for these sources.
+
+## Summary
+
+- **Total skipped:** 19 story files
+- **Reason:** Import path resolution not yet implemented
+
+## Skipped by Source
+
+### App Components
+
+9 components
+
+- [ ] `superset-frontend/src/components/AlteredSliceTag/AlteredSliceTag.stories.tsx`
+- [ ] `superset-frontend/src/components/Chart/DrillDetail/DrillDetailTableControls.stories.tsx`
+- [ ] `superset-frontend/src/components/CopyToClipboard/CopyToClipboard.stories.tsx`
+- [ ] `superset-frontend/src/components/ErrorMessage/ErrorAlert.stories.tsx`
+- [ ] `superset-frontend/src/components/FacePile/FacePile.stories.tsx`
+- [ ] `superset-frontend/src/components/FilterableTable/FilterableTable.stories.tsx`
+- [ ] `superset-frontend/src/components/RowCountLabel/RowCountLabel.stories.tsx`
+- [ ] `superset-frontend/src/components/Tag/Tag.stories.tsx`
+- [ ] `superset-frontend/src/components/TagsList/TagsList.stories.tsx`
+
+### Dashboard Components
+
+2 components
+
+- [ ] `superset-frontend/src/dashboard/components/AnchorLink/AnchorLink.stories.tsx`
+- [ ] `superset-frontend/src/dashboard/components/nativeFilters/FilterBar/FilterControls/FilterDivider.stories.tsx`
+
+### Explore Components
+
+4 components
+
+- [ ] `superset-frontend/src/explore/components/ControlHeader.stories.tsx`
+- [ ] `superset-frontend/src/explore/components/RunQueryButton/RunQueryButton.stories.tsx`
+- [ ] `superset-frontend/src/explore/components/controls/BoundsControl.stories.tsx`
+- [ ] `superset-frontend/src/explore/components/controls/SliderControl.stories.tsx`
+
+### Feature Components
+
+2 components
+
+- [ ] `superset-frontend/src/features/datasets/AddDataset/DatasetPanel/DatasetPanel.stories.tsx`
+- [ ] `superset-frontend/src/features/home/LanguagePicker.stories.tsx`
+
+### Filter Components
+
+2 components
+
+- [ ] `superset-frontend/src/filters/components/Range/RangeFilterPlugin.stories.tsx`
+- [ ] `superset-frontend/src/filters/components/Select/SelectFilterPlugin.stories.tsx`
+
+## How to Add Support
+
+1.
Determine the correct import path for the source +2. Update `generate-superset-components.mjs` to handle the source +3. Add source to `SUPPORTED_SOURCES` array +4. Re-run the generator + +--- + +*Auto-generated by generate-superset-components.mjs* diff --git a/docs/developer_docs/components/design-system/dropdowncontainer.mdx b/docs/developer_docs/components/design-system/dropdowncontainer.mdx new file mode 100644 index 00000000000..205a209a5fd --- /dev/null +++ b/docs/developer_docs/components/design-system/dropdowncontainer.mdx @@ -0,0 +1,167 @@ +--- +title: DropdownContainer +sidebar_label: DropdownContainer +--- + + + +import { StoryWithControls } from '../../../src/components/StorybookWrapper'; + +# DropdownContainer + +DropdownContainer arranges items horizontally and moves overflowing items into a dropdown popover. Resize the container to see the overflow behavior. + +## Live Example + + + +## Try It + +Edit the code below to experiment with the component: + +```tsx live +function Demo() { + const items = Array.from({ length: 6 }, (_, i) => ({ + id: 'item-' + i, + element: React.createElement('div', { + style: { + minWidth: 120, + padding: '4px 12px', + background: '#e6f4ff', + border: '1px solid #91caff', + borderRadius: 4, + }, + }, 'Filter ' + (i + 1)), + })); + return ( +
+ +
+ Drag the right edge to resize and see items overflow into a dropdown +
+
+ ); +} +``` + +## With Select Filters + +```tsx live +function SelectFilters() { + const items = ['Region', 'Category', 'Date Range', 'Status', 'Owner'].map( + (label, i) => ({ + id: 'filter-' + i, + element: React.createElement('div', { + style: { minWidth: 150, padding: '4px 12px', background: '#f5f5f5', border: '1px solid #d9d9d9', borderRadius: 4 }, + }, label + ': All'), + }) + ); + return ( +
+ +
+ ); +} +``` + + + +## Import + +```tsx +import { DropdownContainer } from '@superset/components'; +``` + +--- + +:::tip[Improve this page] +This documentation is auto-generated from the component's Storybook story. +Help improve it by [editing the story file](https://github.com/apache/superset/edit/master/superset-frontend/packages/superset-ui-core/src/components/DropdownContainer/DropdownContainer.stories.tsx). +::: diff --git a/docs/developer_docs/components/design-system/flex.mdx b/docs/developer_docs/components/design-system/flex.mdx new file mode 100644 index 00000000000..6b6d674a859 --- /dev/null +++ b/docs/developer_docs/components/design-system/flex.mdx @@ -0,0 +1,197 @@ +--- +title: Flex +sidebar_label: Flex +--- + + + +import { StoryWithControls } from '../../../src/components/StorybookWrapper'; + +# Flex + +The Flex component from Superset's UI library. + +## Live Example + + + +## Try It + +Edit the code below to experiment with the component: + +```tsx live +function Demo() { + return ( + + {['Item 1', 'Item 2', 'Item 3', 'Item 4', 'Item 5'].map(item => ( +
+ {item} +
+ ))} +
+ ); +} +``` + +## Vertical Layout + +```tsx live +function VerticalFlex() { + return ( + + + + + + ); +} +``` + +## Justify and Align + +```tsx live +function JustifyAlign() { + const boxStyle = { + width: '100%', + height: 120, + borderRadius: 6, + border: '1px solid #40a9ff', + }; + const itemStyle = { + width: 60, + height: 40, + backgroundColor: '#1677ff', + borderRadius: 4, + }; + return ( +
+ {['flex-start', 'center', 'flex-end', 'space-between', 'space-around'].map(justify => ( +
+ {justify} + +
+
+
+ +
+ ))} +
+ ); +} +``` + +## Props + +| Prop | Type | Default | Description | +|------|------|---------|-------------| +| `vertical` | `boolean` | `false` | - | +| `wrap` | `string` | `"nowrap"` | - | +| `justify` | `string` | `"normal"` | - | +| `align` | `string` | `"normal"` | - | +| `flex` | `string` | `"normal"` | - | +| `gap` | `string` | `"small"` | - | + +## Import + +```tsx +import { Flex } from '@superset/components'; +``` + +--- + +:::tip[Improve this page] +This documentation is auto-generated from the component's Storybook story. +Help improve it by [editing the story file](https://github.com/apache/superset/edit/master/superset-frontend/packages/superset-ui-core/src/components/Flex/Flex.stories.tsx). +::: diff --git a/docs/developer_docs/components/design-system/grid.mdx b/docs/developer_docs/components/design-system/grid.mdx new file mode 100644 index 00000000000..6400917f88c --- /dev/null +++ b/docs/developer_docs/components/design-system/grid.mdx @@ -0,0 +1,192 @@ +--- +title: Grid +sidebar_label: Grid +--- + + + +import { StoryWithControls } from '../../../src/components/StorybookWrapper'; + +# Grid + +The Grid system of Ant Design is based on a 24-grid layout. The `Row` and `Col` components are used to create flexible and responsive grid layouts. + +## Live Example + + + +## Try It + +Edit the code below to experiment with the component: + +```tsx live +function Demo() { + return ( + + +
col-12
+ + +
col-12
+ + +
col-8
+ + +
col-8
+ + +
col-8
+ +
+ ); +} +``` + +## Responsive Grid + +```tsx live +function ResponsiveGrid() { + return ( + + +
+ Responsive +
+ + +
+ Responsive +
+ + +
+ Responsive +
+ + +
+ Responsive +
+ +
+ ); +} +``` + +## Alignment + +```tsx live +function AlignmentDemo() { + const boxStyle = { background: '#e6f4ff', padding: '16px 0', border: '1px solid #91caff', textAlign: 'center' }; + return ( +
+ +
start
+
start
+
+ +
center
+
center
+
+ +
end
+
end
+
+ +
between
+
between
+
+
+ ); +} +``` + +## Props + +| Prop | Type | Default | Description | +|------|------|---------|-------------| +| `align` | `string` | `"top"` | Vertical alignment of columns within the row. | +| `justify` | `string` | `"start"` | Horizontal distribution of columns within the row. | +| `wrap` | `boolean` | `true` | Whether columns are allowed to wrap to the next line. | +| `gutter` | `number` | `16` | Spacing between columns in pixels. | + +## Import + +```tsx +import Grid from '@superset/components'; +``` + +--- + +:::tip[Improve this page] +This documentation is auto-generated from the component's Storybook story. +Help improve it by [editing the story file](https://github.com/apache/superset/edit/master/superset-frontend/packages/superset-ui-core/src/components/Grid/Grid.stories.tsx). +::: diff --git a/docs/developer_docs/components/design-system/index.mdx b/docs/developer_docs/components/design-system/index.mdx new file mode 100644 index 00000000000..2f4c4e30e84 --- /dev/null +++ b/docs/developer_docs/components/design-system/index.mdx @@ -0,0 +1,38 @@ +--- +title: Layout Components +sidebar_label: Layout Components +sidebar_position: 1 +--- + + + +# Layout Components + +7 components available in this category. + +## Components + +- [DropdownContainer](./dropdowncontainer) +- [Flex](./flex) +- [Grid](./grid) +- [Layout](./layout) +- [MetadataBar](./metadatabar) +- [Space](./space) +- [Table](./table) diff --git a/docs/developer_docs/components/design-system/layout.mdx b/docs/developer_docs/components/design-system/layout.mdx new file mode 100644 index 00000000000..9fe934308a5 --- /dev/null +++ b/docs/developer_docs/components/design-system/layout.mdx @@ -0,0 +1,139 @@ +--- +title: Layout +sidebar_label: Layout +--- + + + +import { StoryWithControls } from '../../../src/components/StorybookWrapper'; + +# Layout + +Ant Design Layout component with configurable Sider, Header, Footer, and Content. 
+ +## Live Example + + + +## Try It + +Edit the code below to experiment with the component: + +```tsx live +function Demo() { + return ( + + +
Sidebar
+
+ + + Header + + + Content + + + Footer + + +
+ ); +} +``` + +## Content Only + +```tsx live +function ContentOnly() { + return ( + + + Application Header + + + Main content area without a sidebar + + + Footer Content + + + ); +} +``` + +## Right Sidebar + +```tsx live +function RightSidebar() { + return ( + + + + Header + + + Content with right sidebar + + + +
Right Sidebar
+
+
+ ); +} +``` + +## Props + +| Prop | Type | Default | Description | +|------|------|---------|-------------| +| `hasSider` | `boolean` | `false` | Whether the layout contains a Sider sub-component. | +| `style` | `any` | `{"minHeight":200}` | - | + +## Import + +```tsx +import { Layout } from '@superset/components'; +``` + +--- + +:::tip[Improve this page] +This documentation is auto-generated from the component's Storybook story. +Help improve it by [editing the story file](https://github.com/apache/superset/edit/master/superset-frontend/packages/superset-ui-core/src/components/Layout/Layout.stories.tsx). +::: diff --git a/docs/developer_docs/components/design-system/metadatabar.mdx b/docs/developer_docs/components/design-system/metadatabar.mdx new file mode 100644 index 00000000000..a8064c23a91 --- /dev/null +++ b/docs/developer_docs/components/design-system/metadatabar.mdx @@ -0,0 +1,174 @@ +--- +title: MetadataBar +sidebar_label: MetadataBar +--- + + + +import { StoryWithControls } from '../../../src/components/StorybookWrapper'; + +# MetadataBar + +MetadataBar displays a row of metadata items (SQL info, owners, last modified, tags, dashboards, etc.) that collapse responsively based on available width. 
+ +## Live Example + + + +## Try It + +Edit the code below to experiment with the component: + +```tsx live +function Demo() { + const items = [ + { type: 'sql', title: 'Click to view query' }, + { + type: 'owner', + createdBy: 'Jane Smith', + owners: ['John Doe', 'Mary Wilson'], + createdOn: 'a week ago', + }, + { + type: 'lastModified', + value: 'a week ago', + modifiedBy: 'Jane Smith', + }, + { type: 'tags', values: ['management', 'research', 'poc'] }, + ]; + return ; +} +``` + +## Minimal Metadata + +```tsx live +function MinimalMetadata() { + const items = [ + { type: 'owner', createdBy: 'Admin', owners: ['Admin'], createdOn: 'yesterday' }, + { type: 'lastModified', value: '2 hours ago', modifiedBy: 'Admin' }, + ]; + return ; +} +``` + +## Full Metadata + +```tsx live +function FullMetadata() { + const items = [ + { type: 'sql', title: 'SELECT * FROM ...' }, + { type: 'owner', createdBy: 'Jane Smith', owners: ['Jane Smith', 'John Doe', 'Bob Wilson'], createdOn: '2 weeks ago' }, + { type: 'lastModified', value: '3 days ago', modifiedBy: 'John Doe' }, + { type: 'tags', values: ['production', 'finance', 'quarterly'] }, + { type: 'dashboards', title: 'Used in 12 dashboards' }, + { type: 'description', value: 'This chart shows quarterly revenue breakdown by region and product line.' 
}, + { type: 'rows', title: '1.2M rows' }, + { type: 'table', title: 'public.revenue_data' }, + ]; + return ; +} +``` + +## Props + +| Prop | Type | Default | Description | +|------|------|---------|-------------| +| `title` | `string` | `"Added to 3 dashboards"` | - | +| `createdBy` | `string` | `"Jane Smith"` | - | +| `modifiedBy` | `string` | `"Jane Smith"` | - | +| `description` | `string` | `"To preview the list of dashboards go to More settings."` | - | +| `items` | `any` | `[{"type":"sql","title":"Click to view query"},{"type":"owner","createdBy":"Jane Smith","owners":["John Doe","Mary Wilson"],"createdOn":"a week ago"},{"type":"lastModified","value":"a week ago","modifiedBy":"Jane Smith"},{"type":"tags","values":["management","research","poc"]},{"type":"dashboards","title":"Added to 3 dashboards","description":"To preview the list of dashboards go to More settings."}]` | - | + +## Import + +```tsx +import MetadataBar from '@superset/components'; +``` + +--- + +:::tip[Improve this page] +This documentation is auto-generated from the component's Storybook story. +Help improve it by [editing the story file](https://github.com/apache/superset/edit/master/superset-frontend/packages/superset-ui-core/src/components/MetadataBar/MetadataBar.stories.tsx). +::: diff --git a/docs/developer_docs/components/design-system/space.mdx b/docs/developer_docs/components/design-system/space.mdx new file mode 100644 index 00000000000..cbbef161fbf --- /dev/null +++ b/docs/developer_docs/components/design-system/space.mdx @@ -0,0 +1,168 @@ +--- +title: Space +sidebar_label: Space +--- + + + +import { StoryWithControls } from '../../../src/components/StorybookWrapper'; + +# Space + +The Space component from Superset's UI library. + +## Live Example + + + +## Try It + +Edit the code below to experiment with the component: + +```tsx live +function Demo() { + return ( + + {['Item 1', 'Item 2', 'Item 3', 'Item 4', 'Item 5'].map(item => ( +
+ {item} +
+ ))} +
+ ); +} +``` + +## Vertical Space + +```tsx live +function VerticalSpace() { + return ( + + + + + + ); +} +``` + +## Space Sizes + +```tsx live +function SpaceSizes() { + const items = ['Item 1', 'Item 2', 'Item 3']; + const itemStyle = { + padding: '8px 16px', + background: '#e6f4ff', + border: '1px solid #91caff', + borderRadius: 4, + }; + return ( +
+ {['small', 'middle', 'large'].map(size => ( +
+

{size}

+ + {items.map(item => ( +
{item}
+ ))} +
+
+ ))} +
+ ); +} +``` + +## Props + +| Prop | Type | Default | Description | +|------|------|---------|-------------| +| `direction` | `string` | `"horizontal"` | - | +| `size` | `string` | `"small"` | - | +| `wrap` | `boolean` | `false` | - | + +## Import + +```tsx +import { Space } from '@superset/components'; +``` + +--- + +:::tip[Improve this page] +This documentation is auto-generated from the component's Storybook story. +Help improve it by [editing the story file](https://github.com/apache/superset/edit/master/superset-frontend/packages/superset-ui-core/src/components/Space/Space.stories.tsx). +::: diff --git a/docs/developer_docs/components/design-system/table.mdx b/docs/developer_docs/components/design-system/table.mdx new file mode 100644 index 00000000000..16451533f44 --- /dev/null +++ b/docs/developer_docs/components/design-system/table.mdx @@ -0,0 +1,311 @@ +--- +title: Table +sidebar_label: Table +--- + + + +import { StoryWithControls } from '../../../src/components/StorybookWrapper'; + +# Table + +A data table component with sorting, pagination, row selection, resizable columns, reorderable columns, and virtualization for large datasets. 
+ +## Live Example + + + +## Try It + +Edit the code below to experiment with the component: + +```tsx live +function Demo() { + const data = [ + { key: 1, name: 'PostgreSQL', type: 'Database', status: 'Active' }, + { key: 2, name: 'MySQL', type: 'Database', status: 'Active' }, + { key: 3, name: 'SQLite', type: 'Database', status: 'Inactive' }, + { key: 4, name: 'Presto', type: 'Query Engine', status: 'Active' }, + ]; + const columns = [ + { title: 'Name', dataIndex: 'name', key: 'name', width: 150 }, + { title: 'Type', dataIndex: 'type', key: 'type' }, + { title: 'Status', dataIndex: 'status', key: 'status', width: 100 }, + ]; + return ; +} +``` + +## With Pagination + +```tsx live +function PaginatedTable() { + const data = Array.from({ length: 20 }, (_, i) => ({ + key: i, + name: 'Record ' + (i + 1), + value: Math.round(Math.random() * 1000), + category: ['A', 'B', 'C'][i % 3], + })); + const columns = [ + { title: 'Name', dataIndex: 'name', key: 'name' }, + { title: 'Value', dataIndex: 'value', key: 'value', width: 100 }, + { title: 'Category', dataIndex: 'category', key: 'category', width: 100 }, + ]; + return ( +
+ ); +} +``` + +## Loading State + +```tsx live +function LoadingTable() { + const columns = [ + { title: 'Name', dataIndex: 'name', key: 'name' }, + { title: 'Status', dataIndex: 'status', key: 'status' }, + ]; + return
; +} +``` + +## Props + +| Prop | Type | Default | Description | +|------|------|---------|-------------| +| `size` | `string` | `"small"` | Table size. | +| `bordered` | `boolean` | `false` | Whether to show all table borders. | +| `loading` | `boolean` | `false` | Whether the table is in a loading state. | +| `sticky` | `boolean` | `true` | Whether the table header is sticky. | +| `resizable` | `boolean` | `false` | Whether columns can be resized by dragging column edges. | +| `reorderable` | `boolean` | `false` | EXPERIMENTAL: Whether columns can be reordered by dragging. May not work in all contexts. | +| `usePagination` | `boolean` | `false` | Whether to enable pagination. When enabled, the table displays 5 rows per page. | +| `key` | `number` | `5` | - | +| `name` | `string` | `"1GB USB Flash Drive"` | - | +| `category` | `string` | `"Portable Storage"` | - | +| `price` | `number` | `9.99` | - | +| `height` | `number` | `350` | - | +| `defaultPageSize` | `number` | `5` | - | +| `pageSizeOptions` | `any` | `["5","10"]` | - | +| `data` | `any` | `[{"key":1,"name":"Floppy Disk 10 pack","category":"Disk Storage","price":9.99},{"key":2,"name":"DVD 100 pack","category":"Optical Storage","price":27.99},{"key":3,"name":"128 GB SSD","category":"Harddrive","price":49.99},{"key":4,"name":"4GB 144mhz","category":"Memory","price":19.99},{"key":5,"name":"1GB USB Flash Drive","category":"Portable Storage","price":9.99},{"key":6,"name":"256 GB SSD","category":"Harddrive","price":89.99},{"key":7,"name":"1 TB SSD","category":"Harddrive","price":349.99},{"key":8,"name":"16 GB DDR4","category":"Memory","price":59.99},{"key":9,"name":"32 GB DDR5","category":"Memory","price":129.99},{"key":10,"name":"Blu-ray 50 pack","category":"Optical Storage","price":34.99},{"key":11,"name":"64 GB USB Drive","category":"Portable Storage","price":14.99},{"key":12,"name":"2 TB HDD","category":"Harddrive","price":59.99}]` | - | +| `columns` | `any` | 
`[{"title":"Name","dataIndex":"name","key":"name","width":200},{"title":"Category","dataIndex":"category","key":"category","width":150},{"title":"Price","dataIndex":"price","key":"price","width":100}]` | - | + +## Import + +```tsx +import { Table } from '@superset/components'; +``` + +--- + +:::tip[Improve this page] +This documentation is auto-generated from the component's Storybook story. +Help improve it by [editing the story file](https://github.com/apache/superset/edit/master/superset-frontend/packages/superset-ui-core/src/components/Table/Table.stories.tsx). +::: diff --git a/docs/developer_docs/components/index.mdx b/docs/developer_docs/components/index.mdx new file mode 100644 index 00000000000..26cf64e1cdc --- /dev/null +++ b/docs/developer_docs/components/index.mdx @@ -0,0 +1,74 @@ +--- +title: UI Components Overview +sidebar_label: Overview +sidebar_position: 0 +--- + + + +# Superset Design System + +A design system is a complete set of standards intended to manage design at scale using reusable components and patterns. + +The Superset Design System uses [Atomic Design](https://bradfrost.com/blog/post/atomic-web-design/) principles with adapted terminology: + +| Atomic Design | Atoms | Molecules | Organisms | Templates | Pages / Screens | +|---|:---:|:---:|:---:|:---:|:---:| +| **Superset Design** | Foundations | Components | Patterns | Templates | Features | + +Atoms = Foundations, Molecules = Components, Organisms = Patterns, Templates = Templates, Pages / Screens = Features + +--- + +## Component Library + +Interactive documentation for Superset's UI component library. **53 components** documented across 2 categories. + +### [Core Components](./ui/) +46 components — Buttons, inputs, modals, selects, and other fundamental UI elements. + +### [Layout Components](./design-system/) +7 components — Grid, Layout, Table, Flex, Space, and container components for page structure. 
+ + +## Usage + +All components are exported from `@superset-ui/core/components`: + +```tsx +import { Button, Modal, Select } from '@superset-ui/core/components'; +``` + +## Contributing + +This documentation is auto-generated from Storybook stories. To add or update component documentation: + +1. Create or update the component's `.stories.tsx` file +2. Add a descriptive `title` and `description` in the story meta +3. Export an interactive story with `args` for configurable props +4. Run `yarn generate:superset-components` in the `docs/` directory + +:::info Work in Progress +This component library is actively being documented. See the [Components TODO](./TODO) page for a list of components awaiting documentation. +::: + +--- + +*Auto-generated from Storybook stories in the [Design System/Introduction](https://github.com/apache/superset/blob/master/superset-frontend/packages/superset-ui-core/src/components/DesignSystem.stories.tsx) story.* diff --git a/docs/developer_docs/components/ui/autocomplete.mdx b/docs/developer_docs/components/ui/autocomplete.mdx new file mode 100644 index 00000000000..e0d721a7b9a --- /dev/null +++ b/docs/developer_docs/components/ui/autocomplete.mdx @@ -0,0 +1,215 @@ +--- +title: AutoComplete +sidebar_label: AutoComplete +--- + + + +import { StoryWithControls } from '../../../src/components/StorybookWrapper'; + +# AutoComplete + +AutoComplete component for search functionality. 
+ +## Live Example + + + +## Try It + +Edit the code below to experiment with the component: + +```tsx live +function Demo() { + return ( + + ); +} +``` + +## Props + +| Prop | Type | Default | Description | +|------|------|---------|-------------| +| `placeholder` | `string` | `"Type to search..."` | Placeholder text for AutoComplete | +| `options` | `any` | `[{"value":"Dashboard","label":"Dashboard"},{"value":"Chart","label":"Chart"},{"value":"Dataset","label":"Dataset"},{"value":"Database","label":"Database"},{"value":"Query","label":"Query"}]` | The dropdown options | +| `style` | `any` | `{"width":300}` | Custom styles for AutoComplete | +| `filterOption` | `boolean` | `true` | Enable filtering of options based on input | + +## Import + +```tsx +import { AutoComplete } from '@superset/components'; +``` + +--- + +:::tip[Improve this page] +This documentation is auto-generated from the component's Storybook story. +Help improve it by [editing the story file](https://github.com/apache/superset/edit/master/superset-frontend/packages/superset-ui-core/src/components/AutoComplete/AutoComplete.stories.tsx). +::: diff --git a/docs/developer_docs/components/ui/avatar.mdx b/docs/developer_docs/components/ui/avatar.mdx new file mode 100644 index 00000000000..96c438d3721 --- /dev/null +++ b/docs/developer_docs/components/ui/avatar.mdx @@ -0,0 +1,140 @@ +--- +title: Avatar +sidebar_label: Avatar +--- + + + +import { StoryWithControls } from '../../../src/components/StorybookWrapper'; + +# Avatar + +The Avatar component from Superset's UI library. + +## Live Example + + + +## Try It + +Edit the code below to experiment with the component: + +```tsx live +function Demo() { + return ( + + AB + + ); +} +``` + +## Props + +| Prop | Type | Default | Description | +|------|------|---------|-------------| +| `children` | `string` | `"AB"` | Text or initials to display inside the avatar. | +| `alt` | `string` | `""` | - | +| `gap` | `number` | `4` | Letter spacing inside the avatar. 
| +| `shape` | `string` | `"circle"` | The shape of the avatar. | +| `size` | `string` | `"default"` | The size of the avatar. | +| `src` | `string` | `""` | Image URL for the avatar. If provided, overrides children. | +| `draggable` | `boolean` | `false` | - | + +## Import + +```tsx +import { Avatar } from '@superset/components'; +``` + +--- + +:::tip[Improve this page] +This documentation is auto-generated from the component's Storybook story. +Help improve it by [editing the story file](https://github.com/apache/superset/edit/master/superset-frontend/packages/superset-ui-core/src/components/Avatar/Avatar.stories.tsx). +::: diff --git a/docs/developer_docs/components/ui/badge.mdx b/docs/developer_docs/components/ui/badge.mdx new file mode 100644 index 00000000000..5381fb76fb3 --- /dev/null +++ b/docs/developer_docs/components/ui/badge.mdx @@ -0,0 +1,160 @@ +--- +title: Badge +sidebar_label: Badge +--- + + + +import { StoryWithControls } from '../../../src/components/StorybookWrapper'; + +# Badge + +The Badge component from Superset's UI library. + +## Live Example + + + +## Try It + +Edit the code below to experiment with the component: + +```tsx live +function Demo() { + return ( + + ); +} +``` + +## Status Badge + +```tsx live +function StatusBadgeDemo() { + const statuses = ['default', 'success', 'processing', 'warning', 'error']; + return ( +
+ {statuses.map(status => ( + + ))} +
+ ); +} +``` + +## Color Gallery + +```tsx live +function ColorGallery() { + const colors = ['pink', 'red', 'orange', 'green', 'cyan', 'blue', 'purple']; + return ( +
+ {colors.map(color => ( + + ))} +
+ ); +} +``` + +## Props + +| Prop | Type | Default | Description | +|------|------|---------|-------------| +| `count` | `number` | `5` | Number to show in the badge. | +| `size` | `string` | `"default"` | Size of the badge. | +| `showZero` | `boolean` | `false` | Whether to show badge when count is zero. | +| `overflowCount` | `number` | `99` | Max count to show. Shows count+ when exceeded (e.g., 99+). | + +## Import + +```tsx +import { Badge } from '@superset/components'; +``` + +--- + +:::tip[Improve this page] +This documentation is auto-generated from the component's Storybook story. +Help improve it by [editing the story file](https://github.com/apache/superset/edit/master/superset-frontend/packages/superset-ui-core/src/components/Badge/Badge.stories.tsx). +::: diff --git a/docs/developer_docs/components/ui/breadcrumb.mdx b/docs/developer_docs/components/ui/breadcrumb.mdx new file mode 100644 index 00000000000..60081b937cf --- /dev/null +++ b/docs/developer_docs/components/ui/breadcrumb.mdx @@ -0,0 +1,93 @@ +--- +title: Breadcrumb +sidebar_label: Breadcrumb +--- + + + +import { StoryWithControls } from '../../../src/components/StorybookWrapper'; + +# Breadcrumb + +Breadcrumb component for displaying navigation paths. + +## Live Example + + + +## Try It + +Edit the code below to experiment with the component: + +```tsx live +function Demo() { + return ( + + ); +} +``` + + + +## Import + +```tsx +import { Breadcrumb } from '@superset/components'; +``` + +--- + +:::tip[Improve this page] +This documentation is auto-generated from the component's Storybook story. +Help improve it by [editing the story file](https://github.com/apache/superset/edit/master/superset-frontend/packages/superset-ui-core/src/components/Breadcrumb/Breadcrumb.stories.tsx). 
+::: diff --git a/docs/developer_docs/components/ui/button.mdx b/docs/developer_docs/components/ui/button.mdx new file mode 100644 index 00000000000..d36acbbf680 --- /dev/null +++ b/docs/developer_docs/components/ui/button.mdx @@ -0,0 +1,142 @@ +--- +title: Button +sidebar_label: Button +--- + + + +import { StoryWithControls, ComponentGallery } from '../../../src/components/StorybookWrapper'; + +# Button + +The Button component from Superset's UI library. + +## All Variants + + + +## Live Example + + + +## Try It + +Edit the code below to experiment with the component: + +```tsx live +function Demo() { + return ( + + ); +} +``` + +## Props + +| Prop | Type | Default | Description | +|------|------|---------|-------------| +| `buttonStyle` | `string` | `"default"` | The style variant of the button. | +| `buttonSize` | `string` | `"default"` | The size of the button. | +| `children` | `string` | `"Button!"` | The button text or content. | + +## Import + +```tsx +import { Button } from '@superset/components'; +``` + +--- + +:::tip[Improve this page] +This documentation is auto-generated from the component's Storybook story. +Help improve it by [editing the story file](https://github.com/apache/superset/edit/master/superset-frontend/packages/superset-ui-core/src/components/Button/Button.stories.tsx). +::: diff --git a/docs/developer_docs/components/ui/buttongroup.mdx b/docs/developer_docs/components/ui/buttongroup.mdx new file mode 100644 index 00000000000..6f35bea6159 --- /dev/null +++ b/docs/developer_docs/components/ui/buttongroup.mdx @@ -0,0 +1,88 @@ +--- +title: ButtonGroup +sidebar_label: ButtonGroup +--- + + + +import { StoryWithControls } from '../../../src/components/StorybookWrapper'; + +# ButtonGroup + +ButtonGroup is a container that groups multiple Button components together with consistent spacing and styling. 
+ +## Live Example + + + +## Try It + +Edit the code below to experiment with the component: + +```tsx live +function Demo() { + return ( + + + + + + ); +} +``` + +## Props + +| Prop | Type | Default | Description | +|------|------|---------|-------------| +| `expand` | `boolean` | `false` | When true, buttons expand to fill available width. | + +## Import + +```tsx +import { ButtonGroup } from '@superset/components'; +``` + +--- + +:::tip[Improve this page] +This documentation is auto-generated from the component's Storybook story. +Help improve it by [editing the story file](https://github.com/apache/superset/edit/master/superset-frontend/packages/superset-ui-core/src/components/ButtonGroup/ButtonGroup.stories.tsx). +::: diff --git a/docs/developer_docs/components/ui/cachedlabel.mdx b/docs/developer_docs/components/ui/cachedlabel.mdx new file mode 100644 index 00000000000..65a83d5a7c7 --- /dev/null +++ b/docs/developer_docs/components/ui/cachedlabel.mdx @@ -0,0 +1,79 @@ +--- +title: CachedLabel +sidebar_label: CachedLabel +--- + + + +import { StoryWithControls } from '../../../src/components/StorybookWrapper'; + +# CachedLabel + +The CachedLabel component from Superset's UI library. + +## Live Example + + + +## Try It + +Edit the code below to experiment with the component: + +```tsx live +function Demo() { + return ( + + ); +} +``` + + + +## Import + +```tsx +import { CachedLabel } from '@superset/components'; +``` + +--- + +:::tip[Improve this page] +This documentation is auto-generated from the component's Storybook story. +Help improve it by [editing the story file](https://github.com/apache/superset/edit/master/superset-frontend/packages/superset-ui-core/src/components/CachedLabel/CachedLabel.stories.tsx). 
+::: diff --git a/docs/developer_docs/components/ui/card.mdx b/docs/developer_docs/components/ui/card.mdx new file mode 100644 index 00000000000..989706ed940 --- /dev/null +++ b/docs/developer_docs/components/ui/card.mdx @@ -0,0 +1,142 @@ +--- +title: Card +sidebar_label: Card +--- + + + +import { StoryWithControls } from '../../../src/components/StorybookWrapper'; + +# Card + +A container component for grouping related content. Supports titles, borders, loading states, and hover effects. + +## Live Example + + + +## Try It + +Edit the code below to experiment with the component: + +```tsx live +function Demo() { + return ( + + This card displays a summary of your dashboard metrics and recent activity. + + ); +} +``` + +## Card States + +```tsx live +function CardStates() { + return ( +
+      <Card title="Default">Default card content.</Card>
+      <Card title="Hoverable" hoverable>Hover over this card.</Card>
+      <Card title="Loading" loading>This content is hidden while loading.</Card>
+      <Card title="Borderless" bordered={false}>Borderless card.</Card>
+ ); +} +``` + +## Props + +| Prop | Type | Default | Description | +|------|------|---------|-------------| +| `padded` | `boolean` | `true` | Whether the card content has padding. | +| `title` | `string` | `"Dashboard Overview"` | Title text displayed at the top of the card. | +| `children` | `string` | `"This card displays a summary of your dashboard metrics and recent activity."` | The content inside the card. | +| `bordered` | `boolean` | `true` | Whether to show a border around the card. | +| `loading` | `boolean` | `false` | Whether to show a loading skeleton. | +| `hoverable` | `boolean` | `false` | Whether the card lifts on hover. | + +## Import + +```tsx +import { Card } from '@superset/components'; +``` + +--- + +:::tip[Improve this page] +This documentation is auto-generated from the component's Storybook story. +Help improve it by [editing the story file](https://github.com/apache/superset/edit/master/superset-frontend/packages/superset-ui-core/src/components/Card/Card.stories.tsx). +::: diff --git a/docs/developer_docs/components/ui/checkbox.mdx b/docs/developer_docs/components/ui/checkbox.mdx new file mode 100644 index 00000000000..a7b76e5b5ca --- /dev/null +++ b/docs/developer_docs/components/ui/checkbox.mdx @@ -0,0 +1,141 @@ +--- +title: Checkbox +sidebar_label: Checkbox +--- + + + +import { StoryWithControls } from '../../../src/components/StorybookWrapper'; + +# Checkbox + +Checkbox component that supports both regular and indeterminate states, built on top of Ant Design v5 Checkbox. + +## Live Example + + + +## Try It + +Edit the code below to experiment with the component: + +```tsx live +function Demo() { + return ( + + ); +} +``` + +## All Checkbox States + +```tsx live +function AllStates() { + return ( +
+      <Checkbox>Unchecked</Checkbox>
+      <Checkbox checked>Checked</Checkbox>
+      <Checkbox indeterminate>Indeterminate</Checkbox>
+      <Checkbox disabled>Disabled unchecked</Checkbox>
+      <Checkbox checked disabled>Disabled checked</Checkbox>
+ ); +} +``` + +## Select All Pattern + +```tsx live +function SelectAllDemo() { + const [selected, setSelected] = React.useState([]); + const options = ['Option A', 'Option B', 'Option C']; + + const allSelected = selected.length === options.length; + const indeterminate = selected.length > 0 && !allSelected; + + return ( +
+    <div>
+      <Checkbox
+        checked={allSelected}
+        indeterminate={indeterminate}
+        onChange={e => setSelected(e.target.checked ? [...options] : [])}
+      >
+        Select All
+      </Checkbox>
+      <div style={{ marginLeft: 24 }}>
+        {options.map(opt => (
+          <Checkbox
+            key={opt}
+            checked={selected.includes(opt)}
+            onChange={() =>
+              setSelected(prev =>
+                prev.includes(opt) ? prev.filter(x => x !== opt) : [...prev, opt]
+              )
+            }
+          >
+            {opt}
+          </Checkbox>
+        ))}
+      </div>
+    </div>
+ ); +} +``` + +## Props + +| Prop | Type | Default | Description | +|------|------|---------|-------------| +| `checked` | `boolean` | `false` | Whether the checkbox is checked. | +| `indeterminate` | `boolean` | `false` | Whether the checkbox is in indeterminate state (partially selected). | + +## Import + +```tsx +import { Checkbox } from '@superset/components'; +``` + +--- + +:::tip[Improve this page] +This documentation is auto-generated from the component's Storybook story. +Help improve it by [editing the story file](https://github.com/apache/superset/edit/master/superset-frontend/packages/superset-ui-core/src/components/Checkbox/Checkbox.stories.tsx). +::: diff --git a/docs/developer_docs/components/ui/collapse.mdx b/docs/developer_docs/components/ui/collapse.mdx new file mode 100644 index 00000000000..1bfef9f2868 --- /dev/null +++ b/docs/developer_docs/components/ui/collapse.mdx @@ -0,0 +1,106 @@ +--- +title: Collapse +sidebar_label: Collapse +--- + + + +import { StoryWithControls } from '../../../src/components/StorybookWrapper'; + +# Collapse + +The Collapse component from Superset's UI library. + +## Live Example + + + +## Try It + +Edit the code below to experiment with the component: + +```tsx live +function Demo() { + return ( + + ); +} +``` + +## Props + +| Prop | Type | Default | Description | +|------|------|---------|-------------| +| `ghost` | `boolean` | `false` | - | +| `bordered` | `boolean` | `true` | - | +| `accordion` | `boolean` | `false` | - | +| `animateArrows` | `boolean` | `false` | - | +| `modalMode` | `boolean` | `false` | - | + +## Import + +```tsx +import { Collapse } from '@superset/components'; +``` + +--- + +:::tip[Improve this page] +This documentation is auto-generated from the component's Storybook story. +Help improve it by [editing the story file](https://github.com/apache/superset/edit/master/superset-frontend/packages/superset-ui-core/src/components/Collapse/Collapse.stories.tsx). 
+::: diff --git a/docs/developer_docs/components/ui/datepicker.mdx b/docs/developer_docs/components/ui/datepicker.mdx new file mode 100644 index 00000000000..0daebc8058a --- /dev/null +++ b/docs/developer_docs/components/ui/datepicker.mdx @@ -0,0 +1,110 @@ +--- +title: DatePicker +sidebar_label: DatePicker +--- + + + +import { StoryWithControls } from '../../../src/components/StorybookWrapper'; + +# DatePicker + +The DatePicker component from Superset's UI library. + +## Live Example + + + +## Try It + +Edit the code below to experiment with the component: + +```tsx live +function Demo() { + return ( + + ); +} +``` + +## Props + +| Prop | Type | Default | Description | +|------|------|---------|-------------| +| `placeholder` | `string` | `"Select date"` | - | +| `showNow` | `boolean` | `true` | Show "Now" button to select current date and time. | +| `allowClear` | `boolean` | `false` | - | +| `autoFocus` | `boolean` | `true` | - | +| `disabled` | `boolean` | `false` | - | +| `format` | `string` | `"YYYY-MM-DD hh:mm a"` | - | +| `inputReadOnly` | `boolean` | `false` | - | +| `picker` | `string` | `"date"` | - | +| `placement` | `string` | `"bottomLeft"` | - | +| `size` | `string` | `"middle"` | - | +| `showTime` | `any` | `{"format":"hh:mm a","needConfirm":false}` | - | + +## Import + +```tsx +import { DatePicker } from '@superset/components'; +``` + +--- + +:::tip[Improve this page] +This documentation is auto-generated from the component's Storybook story. +Help improve it by [editing the story file](https://github.com/apache/superset/edit/master/superset-frontend/packages/superset-ui-core/src/components/DatePicker/DatePicker.stories.tsx). 
+::: diff --git a/docs/developer_docs/components/ui/divider.mdx b/docs/developer_docs/components/ui/divider.mdx new file mode 100644 index 00000000000..73bf9aa5b3f --- /dev/null +++ b/docs/developer_docs/components/ui/divider.mdx @@ -0,0 +1,144 @@ +--- +title: Divider +sidebar_label: Divider +--- + + + +import { StoryWithControls } from '../../../src/components/StorybookWrapper'; + +# Divider + +The Divider component from Superset's UI library. + +## Live Example + + + +## Try It + +Edit the code below to experiment with the component: + +```tsx live +function Demo() { + return ( + <> +

+      Horizontal divider with title (orientationMargin applies here):
+      <Divider orientation="left" orientationMargin="0">Left Title</Divider>
+      <Divider orientation="right">Right Title</Divider>
+      <Divider>Center Title</Divider>
+      Vertical divider (use container gap for spacing):
+      <div style={{ display: 'flex', alignItems: 'center' }}>
+        <a href="#">Link</a>
+        <Divider type="vertical" />
+        <a href="#">Link</a>
+        <Divider type="vertical" />
+        <a href="#">Link</a>
+      </div>
+    </>
+ + ); +} +``` + +## Props + +| Prop | Type | Default | Description | +|------|------|---------|-------------| +| `dashed` | `boolean` | `false` | Whether line is dashed (deprecated, use variant). | +| `variant` | `string` | `"solid"` | Line style of the divider. | +| `orientation` | `string` | `"center"` | Position of title inside divider. | +| `orientationMargin` | `string` | `""` | Margin from divider edge to title. | +| `plain` | `boolean` | `true` | Use plain style without bold title. | +| `type` | `string` | `"horizontal"` | Direction of the divider. | + +## Import + +```tsx +import { Divider } from '@superset/components'; +``` + +--- + +:::tip[Improve this page] +This documentation is auto-generated from the component's Storybook story. +Help improve it by [editing the story file](https://github.com/apache/superset/edit/master/superset-frontend/packages/superset-ui-core/src/components/Divider/Divider.stories.tsx). +::: diff --git a/docs/developer_docs/components/ui/editabletitle.mdx b/docs/developer_docs/components/ui/editabletitle.mdx new file mode 100644 index 00000000000..d5e3e2a20c5 --- /dev/null +++ b/docs/developer_docs/components/ui/editabletitle.mdx @@ -0,0 +1,172 @@ +--- +title: EditableTitle +sidebar_label: EditableTitle +--- + + + +import { StoryWithControls } from '../../../src/components/StorybookWrapper'; + +# EditableTitle + +The EditableTitle component from Superset's UI library. + +## Live Example + + + +## Try It + +Edit the code below to experiment with the component: + +```tsx live +function Demo() { + return ( + console.log('Saved:', newTitle)} + /> + ); +} +``` + +## Props + +| Prop | Type | Default | Description | +|------|------|---------|-------------| +| `canEdit` | `boolean` | `true` | Whether the title can be edited. | +| `editing` | `boolean` | `false` | Whether the title is currently in edit mode. | +| `emptyText` | `string` | `"Empty text"` | Text to display when title is empty. 
| +| `noPermitTooltip` | `string` | `"Not permitted"` | Tooltip shown when user lacks edit permission. | +| `showTooltip` | `boolean` | `true` | Whether to show tooltip on hover. | +| `title` | `string` | `"Title"` | The title text to display. | +| `defaultTitle` | `string` | `"Default title"` | Default title when none is provided. | +| `placeholder` | `string` | `"Placeholder"` | Placeholder text when editing. | +| `certifiedBy` | `string` | `""` | Name of person/team who certified this item. | +| `certificationDetails` | `string` | `""` | Additional certification details or description. | +| `maxWidth` | `number` | `100` | Maximum width of the title in pixels. | +| `autoSize` | `boolean` | `true` | Whether to auto-size based on content. | + +## Import + +```tsx +import { EditableTitle } from '@superset/components'; +``` + +--- + +:::tip[Improve this page] +This documentation is auto-generated from the component's Storybook story. +Help improve it by [editing the story file](https://github.com/apache/superset/edit/master/superset-frontend/packages/superset-ui-core/src/components/EditableTitle/EditableTitle.stories.tsx). +::: diff --git a/docs/developer_docs/components/ui/emptystate.mdx b/docs/developer_docs/components/ui/emptystate.mdx new file mode 100644 index 00000000000..aeb7938f311 --- /dev/null +++ b/docs/developer_docs/components/ui/emptystate.mdx @@ -0,0 +1,147 @@ +--- +title: EmptyState +sidebar_label: EmptyState +--- + + + +import { StoryWithControls, ComponentGallery } from '../../../src/components/StorybookWrapper'; + +# EmptyState + +The EmptyState component from Superset's UI library. 
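In practice a view swaps in EmptyState only once loading has finished and there is nothing to render; showing it while data is still in flight reads as a false "no data". A minimal sketch of that guard (the helper name and state shape are illustrative, not part of the Superset API):

```typescript
// Sketch: fall back to EmptyState only after loading completes with zero rows.
interface ViewState<T> {
  loading: boolean;
  rows: T[];
}

function shouldShowEmptyState<T>(state: ViewState<T>): boolean {
  return !state.loading && state.rows.length === 0;
}
```

While `shouldShowEmptyState` returns false during loading, a spinner (see the Loading component) is the usual placeholder.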
+ +## All Variants + + + +## Live Example + + + +## Try It + +Edit the code below to experiment with the component: + +```tsx live +function Demo() { + return ( + alert('Filters cleared!')} + /> + ); +} +``` + +## Props + +| Prop | Type | Default | Description | +|------|------|---------|-------------| +| `size` | `string` | `"medium"` | Size of the empty state component. | +| `title` | `string` | `"No Data Available"` | Main title text. | +| `description` | `string` | `"There is no data to display at this time."` | Description text below the title. | +| `image` | `string` | `"empty.svg"` | Predefined image to display. | +| `buttonText` | `string` | `""` | Text for optional action button. | + +## Import + +```tsx +import { EmptyState } from '@superset/components'; +``` + +--- + +:::tip[Improve this page] +This documentation is auto-generated from the component's Storybook story. +Help improve it by [editing the story file](https://github.com/apache/superset/edit/master/superset-frontend/packages/superset-ui-core/src/components/EmptyState/EmptyState.stories.tsx). +::: diff --git a/docs/developer_docs/components/ui/favestar.mdx b/docs/developer_docs/components/ui/favestar.mdx new file mode 100644 index 00000000000..263659ac4a6 --- /dev/null +++ b/docs/developer_docs/components/ui/favestar.mdx @@ -0,0 +1,96 @@ +--- +title: FaveStar +sidebar_label: FaveStar +--- + + + +import { StoryWithControls } from '../../../src/components/StorybookWrapper'; + +# FaveStar + +FaveStar component for marking items as favorites + +## Live Example + + + +## Try It + +Edit the code below to experiment with the component: + +```tsx live +function Demo() { + return ( + + ); +} +``` + +## Props + +| Prop | Type | Default | Description | +|------|------|---------|-------------| +| `itemId` | `number` | `1` | Unique identifier for the item | +| `isStarred` | `boolean` | `false` | Whether the item is currently starred. | +| `showTooltip` | `boolean` | `true` | Show tooltip on hover. 
| + +## Import + +```tsx +import { FaveStar } from '@superset/components'; +``` + +--- + +:::tip[Improve this page] +This documentation is auto-generated from the component's Storybook story. +Help improve it by [editing the story file](https://github.com/apache/superset/edit/master/superset-frontend/packages/superset-ui-core/src/components/FaveStar/FaveStar.stories.tsx). +::: diff --git a/docs/developer_docs/components/ui/iconbutton.mdx b/docs/developer_docs/components/ui/iconbutton.mdx new file mode 100644 index 00000000000..387b937e2e7 --- /dev/null +++ b/docs/developer_docs/components/ui/iconbutton.mdx @@ -0,0 +1,106 @@ +--- +title: IconButton +sidebar_label: IconButton +--- + + + +import { StoryWithControls } from '../../../src/components/StorybookWrapper'; + +# IconButton + +The IconButton component is a versatile button that allows you to combine an icon with a text label. It is designed for use in situations where you want to display an icon along with some text in a single clickable element. + +## Live Example + + + +## Try It + +Edit the code below to experiment with the component: + +```tsx live +function Demo() { + return ( + + ); +} +``` + +## Props + +| Prop | Type | Default | Description | +|------|------|---------|-------------| +| `buttonText` | `string` | `"IconButton"` | The text inside the button. | +| `altText` | `string` | `"Icon button alt text"` | The alt text for the button. If not provided, the button text is used as the alt text by default. | +| `padded` | `boolean` | `true` | Add padding between icon and button text. | +| `icon` | `string` | `"https://superset.apache.org/img/superset-logo-horiz.svg"` | Icon inside the button (URL or path). | + +## Import + +```tsx +import { IconButton } from '@superset/components'; +``` + +--- + +:::tip[Improve this page] +This documentation is auto-generated from the component's Storybook story. 
+Help improve it by [editing the story file](https://github.com/apache/superset/edit/master/superset-frontend/packages/superset-ui-core/src/components/IconButton/IconButton.stories.tsx). +::: diff --git a/docs/developer_docs/components/ui/icons.mdx b/docs/developer_docs/components/ui/icons.mdx new file mode 100644 index 00000000000..97b0b023862 --- /dev/null +++ b/docs/developer_docs/components/ui/icons.mdx @@ -0,0 +1,252 @@ +--- +title: Icons +sidebar_label: Icons +--- + + + +import { StoryWithControls } from '../../../src/components/StorybookWrapper'; + +# Icons + +Icon library for Apache Superset. Contains over 200 icons based on Ant Design icons with consistent sizing and theming support. + +## Live Example + + + +## Try It + +Edit the code below to experiment with the component: + +```tsx live +function Demo() { + return ( +
+    <div style={{ display: 'flex', gap: 16 }}>
+      {/* icon choices are illustrative; any icon from the set works here */}
+      <Icons.CheckCircleOutlined iconSize="xl" />
+      <Icons.InfoCircleOutlined iconSize="xl" />
+      <Icons.WarningOutlined iconSize="xl" />
+      <Icons.CloseCircleOutlined iconSize="xl" />
+    </div>
+ ); +} +``` + +## Icon Sizes + +```tsx live +function IconSizes() { + const sizes = ['s', 'm', 'l', 'xl', 'xxl']; + return ( +
+    <div style={{ display: 'flex', gap: 24, alignItems: 'flex-end' }}>
+      {sizes.map(size => (
+        <div key={size} style={{ textAlign: 'center' }}>
+          {/* illustrative icon; every icon accepts iconSize */}
+          <Icons.CheckCircleOutlined iconSize={size} />
+          <div>{size}</div>
+        </div>
+      ))}
+    </div>
+ ); +} +``` + +## Icon Gallery + +```tsx live +function IconGallery() { + const Section = ({ title, children }) => ( +
+    <div style={{ marginBottom: 24 }}>
+      <h4>{title}</h4>
+      <div style={{ display: 'flex', flexWrap: 'wrap', gap: 12 }}>{children}</div>
+    </div>
+  );
+  return (
+    <div>
+      {/* illustrative subset; the full Storybook gallery groups the 200+ icons by category */}
+      <Section title="Status">
+        <Icons.CheckCircleOutlined iconSize="l" />
+        <Icons.InfoCircleOutlined iconSize="l" />
+        <Icons.WarningOutlined iconSize="l" />
+        <Icons.CloseCircleOutlined iconSize="l" />
+      </Section>
+    </div>
+ ); +} +``` + +## Icon with Text + +```tsx live +function IconWithText() { + return ( +
+    <div style={{ display: 'flex', flexDirection: 'column', gap: 8 }}>
+      {/* icon choices are illustrative */}
+      <div style={{ display: 'flex', alignItems: 'center', gap: 8 }}>
+        <Icons.CheckCircleOutlined iconSize="m" />
+        <span>Success message</span>
+      </div>
+      <div style={{ display: 'flex', alignItems: 'center', gap: 8 }}>
+        <Icons.InfoCircleOutlined iconSize="m" />
+        <span>Information message</span>
+      </div>
+      <div style={{ display: 'flex', alignItems: 'center', gap: 8 }}>
+        <Icons.WarningOutlined iconSize="m" />
+        <span>Warning message</span>
+      </div>
+      <div style={{ display: 'flex', alignItems: 'center', gap: 8 }}>
+        <Icons.CloseCircleOutlined iconSize="m" />
+        <span>Error message</span>
+      </div>
+    </div>
+ ); +} +``` + +## Props + +| Prop | Type | Default | Description | +|------|------|---------|-------------| +| `iconSize` | `string` | `"xl"` | Size of the icons: s (12px), m (16px), l (20px), xl (24px), xxl (32px). | + +## Import + +```tsx +import { Icons } from '@superset/components'; +``` + +--- + +:::tip[Improve this page] +This documentation is auto-generated from the component's Storybook story. +Help improve it by [editing the story file](https://github.com/apache/superset/edit/master/superset-frontend/packages/superset-ui-core/src/components/Icons/Icons.stories.tsx). +::: diff --git a/docs/developer_docs/components/ui/icontooltip.mdx b/docs/developer_docs/components/ui/icontooltip.mdx new file mode 100644 index 00000000000..df7e0445e94 --- /dev/null +++ b/docs/developer_docs/components/ui/icontooltip.mdx @@ -0,0 +1,100 @@ +--- +title: IconTooltip +sidebar_label: IconTooltip +--- + + + +import { StoryWithControls } from '../../../src/components/StorybookWrapper'; + +# IconTooltip + +The IconTooltip component from Superset's UI library. + +## Live Example + + + +## Try It + +Edit the code below to experiment with the component: + +```tsx live +function Demo() { + return ( + + + + ); +} +``` + +## Props + +| Prop | Type | Default | Description | +|------|------|---------|-------------| +| `tooltip` | `string` | `"Tooltip"` | Text content to display in the tooltip. | + +## Import + +```tsx +import { IconTooltip } from '@superset/components'; +``` + +--- + +:::tip[Improve this page] +This documentation is auto-generated from the component's Storybook story. +Help improve it by [editing the story file](https://github.com/apache/superset/edit/master/superset-frontend/packages/superset-ui-core/src/components/IconTooltip/IconTooltip.stories.tsx). 
+::: diff --git a/docs/developer_docs/components/ui/index.mdx b/docs/developer_docs/components/ui/index.mdx new file mode 100644 index 00000000000..2a0d1aa8d41 --- /dev/null +++ b/docs/developer_docs/components/ui/index.mdx @@ -0,0 +1,77 @@ +--- +title: Core Components +sidebar_label: Core Components +sidebar_position: 1 +--- + + + +# Core Components + +46 components available in this category. + +## Components + +- [AutoComplete](./autocomplete) +- [Avatar](./avatar) +- [Badge](./badge) +- [Breadcrumb](./breadcrumb) +- [Button](./button) +- [ButtonGroup](./buttongroup) +- [CachedLabel](./cachedlabel) +- [Card](./card) +- [Checkbox](./checkbox) +- [Collapse](./collapse) +- [DatePicker](./datepicker) +- [Divider](./divider) +- [EditableTitle](./editabletitle) +- [EmptyState](./emptystate) +- [FaveStar](./favestar) +- [IconButton](./iconbutton) +- [Icons](./icons) +- [IconTooltip](./icontooltip) +- [InfoTooltip](./infotooltip) +- [Input](./input) +- [Label](./label) +- [List](./list) +- [ListViewCard](./listviewcard) +- [Loading](./loading) +- [Menu](./menu) +- [Modal](./modal) +- [ModalTrigger](./modaltrigger) +- [Popover](./popover) +- [ProgressBar](./progressbar) +- [Radio](./radio) +- [SafeMarkdown](./safemarkdown) +- [Select](./select) +- [Skeleton](./skeleton) +- [Slider](./slider) +- [Steps](./steps) +- [Switch](./switch) +- [TableCollection](./tablecollection) +- [TableView](./tableview) +- [Tabs](./tabs) +- [Timer](./timer) +- [Tooltip](./tooltip) +- [Tree](./tree) +- [TreeSelect](./treeselect) +- [Typography](./typography) +- [UnsavedChangesModal](./unsavedchangesmodal) +- [Upload](./upload) diff --git a/docs/developer_docs/components/ui/infotooltip.mdx b/docs/developer_docs/components/ui/infotooltip.mdx new file mode 100644 index 00000000000..28d0514bdc9 --- /dev/null +++ b/docs/developer_docs/components/ui/infotooltip.mdx @@ -0,0 +1,106 @@ +--- +title: InfoTooltip +sidebar_label: InfoTooltip +--- + + + +import { StoryWithControls } from 
'../../../src/components/StorybookWrapper'; + +# InfoTooltip + +The InfoTooltip component from Superset's UI library. + +## Live Example + + + +## Try It + +Edit the code below to experiment with the component: + +```tsx live +function Demo() { + return ( + + ); +} +``` + +## Props + +| Prop | Type | Default | Description | +|------|------|---------|-------------| +| `tooltip` | `string` | `"This is the text that will display!"` | - | + +## Import + +```tsx +import { InfoTooltip } from '@superset/components'; +``` + +--- + +:::tip[Improve this page] +This documentation is auto-generated from the component's Storybook story. +Help improve it by [editing the story file](https://github.com/apache/superset/edit/master/superset-frontend/packages/superset-ui-core/src/components/InfoTooltip/InfoTooltip.stories.tsx). +::: diff --git a/docs/developer_docs/components/ui/input.mdx b/docs/developer_docs/components/ui/input.mdx new file mode 100644 index 00000000000..1fa8fd59073 --- /dev/null +++ b/docs/developer_docs/components/ui/input.mdx @@ -0,0 +1,162 @@ +--- +title: Input +sidebar_label: Input +--- + + + +import { StoryWithControls } from '../../../src/components/StorybookWrapper'; + +# Input + +The Input component from Superset's UI library. + +## Live Example + + + +## Try It + +Edit the code below to experiment with the component: + +```tsx live +function Demo() { + return ( + + ); +} +``` + +## Props + +| Prop | Type | Default | Description | +|------|------|---------|-------------| +| `allowClear` | `boolean` | `false` | - | +| `disabled` | `boolean` | `false` | - | +| `showCount` | `boolean` | `false` | - | +| `type` | `string` | `"text"` | HTML input type | +| `variant` | `string` | `"outlined"` | Input style variant | + +## Import + +```tsx +import { Input } from '@superset/components'; +``` + +--- + +:::tip[Improve this page] +This documentation is auto-generated from the component's Storybook story. 
+Help improve it by [editing the story file](https://github.com/apache/superset/edit/master/superset-frontend/packages/superset-ui-core/src/components/Input/Input.stories.tsx). +::: diff --git a/docs/developer_docs/components/ui/label.mdx b/docs/developer_docs/components/ui/label.mdx new file mode 100644 index 00000000000..a0c8ee6bd48 --- /dev/null +++ b/docs/developer_docs/components/ui/label.mdx @@ -0,0 +1,105 @@ +--- +title: Label +sidebar_label: Label +--- + + + +import { StoryWithControls } from '../../../src/components/StorybookWrapper'; + +# Label + +The Label component from Superset's UI library. + +## Live Example + + + +## Try It + +Edit the code below to experiment with the component: + +```tsx live +function Demo() { + return ( + + ); +} +``` + +## Props + +| Prop | Type | Default | Description | +|------|------|---------|-------------| +| `type` | `string` | `"default"` | The visual style of the label. | +| `children` | `string` | `"Label text"` | The label text content. | +| `monospace` | `boolean` | `false` | Use monospace font. | + +## Import + +```tsx +import { Label } from '@superset/components'; +``` + +--- + +:::tip[Improve this page] +This documentation is auto-generated from the component's Storybook story. +Help improve it by [editing the story file](https://github.com/apache/superset/edit/master/superset-frontend/packages/superset-ui-core/src/components/Label/Label.stories.tsx). +::: diff --git a/docs/developer_docs/components/ui/list.mdx b/docs/developer_docs/components/ui/list.mdx new file mode 100644 index 00000000000..3ff0c30cd2d --- /dev/null +++ b/docs/developer_docs/components/ui/list.mdx @@ -0,0 +1,117 @@ +--- +title: List +sidebar_label: List +--- + + + +import { StoryWithControls } from '../../../src/components/StorybookWrapper'; + +# List + +The List component from Superset's UI library. 
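Because `renderItem` is called once per `dataSource` entry, React needs a stable key for each rendered row. A small hedged helper showing one way to normalize raw strings into keyed rows before handing them to the list (the row shape is illustrative; List also accepts plain strings directly):

```typescript
// Sketch: give each dataSource entry a stable key before rendering it as a row.
interface ListRow {
  key: string;
  label: string;
}

function toListRows(labels: string[]): ListRow[] {
  return labels.map((label, i) => ({ key: `row-${i}`, label }));
}
```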
+ +## Live Example + + + +## Try It + +Edit the code below to experiment with the component: + +```tsx live +function Demo() { + const data = ['Dashboard Analytics', 'User Management', 'Data Sources']; + return ( + {item}} + /> + ); +} +``` + +## Props + +| Prop | Type | Default | Description | +|------|------|---------|-------------| +| `bordered` | `boolean` | `false` | Whether to show a border around the list. | +| `split` | `boolean` | `true` | Whether to show a divider between items. | +| `size` | `string` | `"default"` | Size of the list. | +| `loading` | `boolean` | `false` | Whether to show a loading indicator. | +| `dataSource` | `any` | `["Dashboard Analytics","User Management","Data Sources"]` | - | + +## Import + +```tsx +import { List } from '@superset/components'; +``` + +--- + +:::tip[Improve this page] +This documentation is auto-generated from the component's Storybook story. +Help improve it by [editing the story file](https://github.com/apache/superset/edit/master/superset-frontend/packages/superset-ui-core/src/components/List/List.stories.tsx). +::: diff --git a/docs/developer_docs/components/ui/listviewcard.mdx b/docs/developer_docs/components/ui/listviewcard.mdx new file mode 100644 index 00000000000..4d2ea55d81b --- /dev/null +++ b/docs/developer_docs/components/ui/listviewcard.mdx @@ -0,0 +1,132 @@ +--- +title: ListViewCard +sidebar_label: ListViewCard +--- + + + +import { StoryWithControls } from '../../../src/components/StorybookWrapper'; + +# ListViewCard + +ListViewCard is a card component used to display items in list views with an image, title, description, and optional cover sections. + +## Live Example + + + +## Try It + +Edit the code below to experiment with the component: + +```tsx live +function Demo() { + return ( + + ); +} +``` + +## Props + +| Prop | Type | Default | Description | +|------|------|---------|-------------| +| `title` | `string` | `"Superset Card Title"` | Title displayed on the card. 
| +| `loading` | `boolean` | `false` | Whether the card is in loading state. | +| `url` | `string` | `"/superset/dashboard/births/"` | URL the card links to. | +| `imgURL` | `string` | `"https://picsum.photos/seed/superset/300/200"` | Primary image URL for the card. | +| `description` | `string` | `"Lorem ipsum dolor sit amet, consectetur adipiscing elit..."` | Description text displayed on the card. | +| `coverLeft` | `string` | `"Left Section"` | Content for the left section of the cover. | +| `coverRight` | `string` | `"Right Section"` | Content for the right section of the cover. | + +## Import + +```tsx +import { ListViewCard } from '@superset/components'; +``` + +--- + +:::tip[Improve this page] +This documentation is auto-generated from the component's Storybook story. +Help improve it by [editing the story file](https://github.com/apache/superset/edit/master/superset-frontend/packages/superset-ui-core/src/components/ListViewCard/ListViewCard.stories.tsx). +::: diff --git a/docs/developer_docs/components/ui/loading.mdx b/docs/developer_docs/components/ui/loading.mdx new file mode 100644 index 00000000000..4033746d776 --- /dev/null +++ b/docs/developer_docs/components/ui/loading.mdx @@ -0,0 +1,187 @@ +--- +title: Loading +sidebar_label: Loading +--- + + + +import { StoryWithControls } from '../../../src/components/StorybookWrapper'; + +# Loading + +The Loading component from Superset's UI library. + +## Live Example + + + +## Try It + +Edit the code below to experiment with the component: + +```tsx live +function Demo() { + return ( +
+    <div style={{ display: 'flex', gap: 32 }}>
+      {['normal', 'floating', 'inline'].map(position => (
+        <div key={position} style={{ position: 'relative', minHeight: 80 }}>
+          <strong>{position}</strong>
+          <Loading position={position} size="s" />
+        </div>
+      ))}
+    </div>
+ ); +} +``` + +## Size and Opacity Showcase + +```tsx live +function SizeShowcase() { + const sizes = ['s', 'm', 'l']; + return ( +
+    <div style={{ display: 'grid', gridTemplateColumns: 'repeat(4, 1fr)', gap: 16, alignItems: 'center' }}>
+      <strong>Size</strong>
+      <strong>Normal</strong>
+      <strong>Muted</strong>
+      <strong>Usage</strong>
+      {sizes.map(size => (
+        <React.Fragment key={size}>
+          <div>
+            {size.toUpperCase()} ({size === 's' ? '40px' : size === 'm' ? '70px' : '100px'})
+          </div>
+          <div>
+            <Loading size={size} position="inline" />
+          </div>
+          <div>
+            <Loading size={size} position="inline" muted />
+          </div>
+          <div>
+            {size === 's' && 'Filter bars, inline'}
+            {size === 'm' && 'Explore pages'}
+            {size === 'l' && 'Full page loading'}
+          </div>
+        </React.Fragment>
+      ))}
+    </div>
+ ); +} +``` + +## Contextual Examples + +```tsx live +function ContextualDemo() { + return ( +
+    <div style={{ display: 'flex', flexDirection: 'column', gap: 24 }}>
+      <div>
+        <strong>Filter Bar (size="s", muted)</strong>
+        <div style={{ display: 'flex', alignItems: 'center', gap: 8 }}>
+          <span>Filter 1:</span>
+          <Loading size="s" position="inline" muted />
+          <span>Filter 2:</span>
+          <Loading size="s" position="inline" muted />
+        </div>
+      </div>
+      <div>
+        <strong>Dashboard Grid (size="s", muted)</strong>
+        <div style={{ display: 'flex', gap: 16 }}>
+          {[1, 2, 3].map(i => (
+            <div key={i} style={{ border: '1px solid #eee', padding: 16 }}>
+              <Loading size="s" position="inline" muted />
+            </div>
+          ))}
+        </div>
+      </div>
+      <div>
+        <strong>Main Loading (size="l")</strong>
+        <div>
+          <Loading size="l" position="inline" />
+        </div>
+      </div>
+    </div>
+ ); +} +``` + +## Props + +| Prop | Type | Default | Description | +|------|------|---------|-------------| +| `size` | `string` | `"m"` | Size of the spinner: s (40px), m (70px), or l (100px). | +| `position` | `string` | `"normal"` | Position style: normal (inline flow), floating (overlay), or inline. | +| `muted` | `boolean` | `false` | Whether to show a muted/subtle version of the spinner. | + +## Import + +```tsx +import { Loading } from '@superset/components'; +``` + +--- + +:::tip[Improve this page] +This documentation is auto-generated from the component's Storybook story. +Help improve it by [editing the story file](https://github.com/apache/superset/edit/master/superset-frontend/packages/superset-ui-core/src/components/Loading/Loading.stories.tsx). +::: diff --git a/docs/developer_docs/components/ui/menu.mdx b/docs/developer_docs/components/ui/menu.mdx new file mode 100644 index 00000000000..7dd4baac68d --- /dev/null +++ b/docs/developer_docs/components/ui/menu.mdx @@ -0,0 +1,174 @@ +--- +title: Menu +sidebar_label: Menu +--- + + + +import { StoryWithControls } from '../../../src/components/StorybookWrapper'; + +# Menu + +Navigation menu component supporting horizontal, vertical, and inline modes. Based on Ant Design Menu with Superset styling. + +## Live Example + + + +## Try It + +Edit the code below to experiment with the component: + +```tsx live +function Demo() { + return ( + + ); +} +``` + +## Vertical Menu + +```tsx live +function VerticalMenu() { + return ( + + ); +} +``` + +## Menu with Icons + +```tsx live +function MenuWithIcons() { + return ( + Dashboards, key: 'dashboards' }, + { label: <> Charts, key: 'charts' }, + { label: <> Datasets, key: 'datasets' }, + { label: <> SQL Lab, key: 'sqllab' }, + ]} + /> + ); +} +``` + +## Props + +| Prop | Type | Default | Description | +|------|------|---------|-------------| +| `mode` | `string` | `"horizontal"` | Menu display mode: horizontal navbar, vertical sidebar, or inline collapsible. 
| +| `selectable` | `boolean` | `true` | Whether menu items can be selected. | +| `items` | `any` | `[{"label":"Dashboards","key":"dashboards"},{"label":"Charts","key":"charts"},{"label":"Datasets","key":"datasets"},{"label":"SQL Lab","key":"sqllab"}]` | - | + +## Import + +```tsx +import { Menu } from '@superset/components'; +``` + +--- + +:::tip[Improve this page] +This documentation is auto-generated from the component's Storybook story. +Help improve it by [editing the story file](https://github.com/apache/superset/edit/master/superset-frontend/packages/superset-ui-core/src/components/Menu/Menu.stories.tsx). +::: diff --git a/docs/developer_docs/components/ui/modal.mdx b/docs/developer_docs/components/ui/modal.mdx new file mode 100644 index 00000000000..67a984dabb2 --- /dev/null +++ b/docs/developer_docs/components/ui/modal.mdx @@ -0,0 +1,207 @@ +--- +title: Modal +sidebar_label: Modal +--- + + + +import { StoryWithControls } from '../../../src/components/StorybookWrapper'; + +# Modal + +Modal dialog component for displaying content that requires user attention or interaction. Supports customizable buttons, drag/resize, and confirmation dialogs. + +## Live Example + + + +## Try It + +Edit the code below to experiment with the component: + +```tsx live +function ModalDemo() { + const [isOpen, setIsOpen] = React.useState(false); + return ( + <> + + setIsOpen(false)} + title="Example Modal" + primaryButtonName="Submit" + onHandledPrimaryAction={() => { + alert('Submitted!'); + setIsOpen(false); + }} + > +

+        <div>
+          This is the modal content. Click Submit or close the modal.
+        </div>
+      </Modal>
+    </>
+  );
+}
+```
+
+## Danger Modal
+
+```tsx live
+function DangerModal() {
+  const [isOpen, setIsOpen] = React.useState(false);
+  return (
+    <>
+      <Button onClick={() => setIsOpen(true)}>Open Danger Modal</Button>
+      <Modal
+        show={isOpen}
+        onHide={() => setIsOpen(false)}
+        title="Confirm Delete"
+        primaryButtonName="Delete"
+        primaryButtonStyle="danger"
+        onHandledPrimaryAction={() => {
+          alert('Deleted!');
+          setIsOpen(false);
+        }}
+      >
+        <div>
+          Are you sure you want to delete this item? This action cannot be undone.
+        </div>
+      </Modal>
+ + ); +} +``` + +## Confirmation Dialogs + +```tsx live +function ConfirmationDialogs() { + return ( +
+ + + +
+ ); +} +``` + +## Props + +| Prop | Type | Default | Description | +|------|------|---------|-------------| +| `disablePrimaryButton` | `boolean` | `false` | Whether the primary button is disabled. | +| `primaryButtonName` | `string` | `"Submit"` | Text for the primary action button. | +| `primaryButtonStyle` | `string` | `"primary"` | The style of the primary action button. | +| `show` | `boolean` | `false` | Whether the modal is visible. Use the "Try It" example below for a working demo. | +| `title` | `string` | `"I'm a modal!"` | Title displayed in the modal header. | +| `resizable` | `boolean` | `false` | Whether the modal can be resized by dragging corners. | +| `draggable` | `boolean` | `false` | Whether the modal can be dragged by its header. | +| `width` | `number` | `500` | Width of the modal in pixels. | + +## Import + +```tsx +import { Modal } from '@superset/components'; +``` + +--- + +:::tip[Improve this page] +This documentation is auto-generated from the component's Storybook story. +Help improve it by [editing the story file](https://github.com/apache/superset/edit/master/superset-frontend/packages/superset-ui-core/src/components/Modal/Modal.stories.tsx). +::: diff --git a/docs/developer_docs/components/ui/modaltrigger.mdx b/docs/developer_docs/components/ui/modaltrigger.mdx new file mode 100644 index 00000000000..1c3eddd84fc --- /dev/null +++ b/docs/developer_docs/components/ui/modaltrigger.mdx @@ -0,0 +1,192 @@ +--- +title: ModalTrigger +sidebar_label: ModalTrigger +--- + + + +import { StoryWithControls } from '../../../src/components/StorybookWrapper'; + +# ModalTrigger + +A component that renders a trigger element which opens a modal when clicked. Useful for actions that need confirmation or additional input. + +## Live Example + + + +## Try It + +Edit the code below to experiment with the component: + +```tsx live +function Demo() { + return ( + Click to Open} + modalTitle="Example Modal" + modalBody={

+        <div>
+          This is the modal content. You can put any React elements here.
+        </div>
} + width="500px" + responsive + /> + ); +} +``` + +## With Custom Trigger + +```tsx live +function CustomTrigger() { + return ( + + Add New Item + + } + modalTitle="Add New Item" + modalBody={ +
+

+          <div>
+            Fill out the form to add a new item.
+          </div>
+ } + width="400px" + /> + ); +} +``` + +## Draggable & Resizable + +```tsx live +function DraggableModal() { + return ( + Open Draggable Modal} + modalTitle="Draggable & Resizable" + modalBody={

+        <div>
+          Try dragging the header or resizing from the corners!
+        </div>
} + draggable + resizable + width="500px" + /> + ); +} +``` + +## Props + +| Prop | Type | Default | Description | +|------|------|---------|-------------| +| `isButton` | `boolean` | `true` | Whether to wrap the trigger in a button element. | +| `modalTitle` | `string` | `"Modal Title"` | Title displayed in the modal header. | +| `modalBody` | `string` | `"This is the modal body content."` | Content displayed in the modal body. | +| `tooltip` | `string` | `"Click to open modal"` | Tooltip text shown on hover over the trigger. | +| `width` | `string` | `"600px"` | Width of the modal (e.g., "600px", "80%"). | +| `maxWidth` | `string` | `"1000px"` | Maximum width of the modal. | +| `responsive` | `boolean` | `true` | Whether the modal should be responsive. | +| `draggable` | `boolean` | `false` | Whether the modal can be dragged by its header. | +| `resizable` | `boolean` | `false` | Whether the modal can be resized by dragging corners. | +| `triggerNode` | `string` | `"Click to Open Modal"` | The clickable element that opens the modal when clicked. | + +## Import + +```tsx +import { ModalTrigger } from '@superset/components'; +``` + +--- + +:::tip[Improve this page] +This documentation is auto-generated from the component's Storybook story. +Help improve it by [editing the story file](https://github.com/apache/superset/edit/master/superset-frontend/packages/superset-ui-core/src/components/ModalTrigger/ModalTrigger.stories.tsx). +::: diff --git a/docs/developer_docs/components/ui/popover.mdx b/docs/developer_docs/components/ui/popover.mdx new file mode 100644 index 00000000000..12f04d65700 --- /dev/null +++ b/docs/developer_docs/components/ui/popover.mdx @@ -0,0 +1,199 @@ +--- +title: Popover +sidebar_label: Popover +--- + + + +import { StoryWithControls } from '../../../src/components/StorybookWrapper'; + +# Popover + +A floating card that appears when hovering or clicking a trigger element. Supports configurable placement, trigger behavior, and custom content. 
+ +## Live Example + + + +## Try It + +Edit the code below to experiment with the component: + +```tsx live +function Demo() { + return ( + + + + ); +} +``` + +## Click Trigger + +```tsx live +function ClickPopover() { + return ( + + + + ); +} +``` + +## Placements + +```tsx live +function PlacementsDemo() { + return ( +
+ {['top', 'right', 'bottom', 'left'].map(placement => ( + + + + ))} +
+ ); +} +``` + +## Rich Content + +```tsx live +function RichPopover() { + return ( + +

+          <div>
+            <div>Created by: Admin</div>
+            <div>Last modified: Jan 2025</div>
+            <div>Charts: 12</div>
+          </div>
+ + } + > + +
+ ); +} +``` + +## Props + +| Prop | Type | Default | Description | +|------|------|---------|-------------| +| `content` | `string` | `"Popover sample content"` | Content displayed inside the popover body. | +| `title` | `string` | `"Popover title"` | Title displayed in the popover header. | +| `arrow` | `boolean` | `true` | Whether to show the popover's arrow pointing to the trigger. | +| `color` | `string` | `"#fff"` | The background color of the popover. | + +## Import + +```tsx +import { Popover } from '@superset/components'; +``` + +--- + +:::tip[Improve this page] +This documentation is auto-generated from the component's Storybook story. +Help improve it by [editing the story file](https://github.com/apache/superset/edit/master/superset-frontend/packages/superset-ui-core/src/components/Popover/Popover.stories.tsx). +::: diff --git a/docs/developer_docs/components/ui/progressbar.mdx b/docs/developer_docs/components/ui/progressbar.mdx new file mode 100644 index 00000000000..7bfd4d17491 --- /dev/null +++ b/docs/developer_docs/components/ui/progressbar.mdx @@ -0,0 +1,206 @@ +--- +title: ProgressBar +sidebar_label: ProgressBar +--- + + + +import { StoryWithControls } from '../../../src/components/StorybookWrapper'; + +# ProgressBar + +Progress bar component for displaying completion status. Supports line, circle, and dashboard display types. + +## Live Example + + + +## Try It + +Edit the code below to experiment with the component: + +```tsx live +function Demo() { + return ( + + ); +} +``` + +## All Progress Types + +```tsx live +function AllTypesDemo() { + return ( +
+    <div style={{ display: 'flex', gap: 32, alignItems: 'center' }}>
+      <div>
+        <div>Line</div>
+        <ProgressBar percent={75} type="line" />
+      </div>
+      <div>
+        <div>Circle</div>
+        <ProgressBar percent={75} type="circle" />
+      </div>
+      <div>
+        <div>Dashboard</div>
+        <ProgressBar percent={75} type="dashboard" />
+      </div>
+    </div>
+ ); +} +``` + +## Status Variants + +```tsx live +function StatusDemo() { + const statuses = ['normal', 'success', 'exception', 'active']; + return ( +
+ {statuses.map(status => ( +
+ {status} + +
+ ))} +
+ ); +} +``` + +## Custom Colors + +```tsx live +function CustomColors() { + return ( +
+ + + + +
+ ); +} +``` + +## Props + +| Prop | Type | Default | Description | +|------|------|---------|-------------| +| `percent` | `number` | `75` | Completion percentage (0-100). | +| `status` | `string` | `"normal"` | Current status of the progress bar. | +| `type` | `string` | `"line"` | Display type: line, circle, or dashboard gauge. | +| `striped` | `boolean` | `false` | Whether to show striped animation on the bar. | +| `showInfo` | `boolean` | `true` | Whether to show the percentage text. | +| `strokeLinecap` | `string` | `"round"` | Shape of the progress bar endpoints. | + +## Import + +```tsx +import { ProgressBar } from '@superset/components'; +``` + +--- + +:::tip[Improve this page] +This documentation is auto-generated from the component's Storybook story. +Help improve it by [editing the story file](https://github.com/apache/superset/edit/master/superset-frontend/packages/superset-ui-core/src/components/ProgressBar/ProgressBar.stories.tsx). +::: diff --git a/docs/developer_docs/components/ui/radio.mdx b/docs/developer_docs/components/ui/radio.mdx new file mode 100644 index 00000000000..ee5c2a67bf7 --- /dev/null +++ b/docs/developer_docs/components/ui/radio.mdx @@ -0,0 +1,137 @@ +--- +title: Radio +sidebar_label: Radio +--- + + + +import { StoryWithControls } from '../../../src/components/StorybookWrapper'; + +# Radio + +Radio button component for selecting one option from a set. Supports standalone radio buttons, radio buttons styled as buttons, and grouped radio buttons with layout configuration. 
+ +## Live Example + + + +## Try It + +Edit the code below to experiment with the component: + +```tsx live +function Demo() { + return ( + + Radio + + ); +} +``` + +## Radio Button Variants + +```tsx live +function RadioButtonDemo() { + const [value, setValue] = React.useState('line'); + return ( + setValue(e.target.value)}> + Line Chart + Bar Chart + Pie Chart + + ); +} +``` + +## Vertical Radio Group + +```tsx live +function VerticalDemo() { + const [value, setValue] = React.useState('option1'); + return ( + setValue(e.target.value)}> +
+      <Radio value="option1">First option</Radio>
+      <Radio value="option2">Second option</Radio>
+      <Radio value="option3">Third option</Radio>
+    </Radio.Group>
+ ); +} +``` + +## Props + +| Prop | Type | Default | Description | +|------|------|---------|-------------| +| `value` | `string` | `"radio1"` | The value associated with this radio button. | +| `disabled` | `boolean` | `false` | Whether the radio button is disabled. | +| `checked` | `boolean` | `false` | Whether the radio button is checked (controlled mode). | +| `children` | `string` | `"Radio"` | Label text displayed next to the radio button. | + +## Import + +```tsx +import { Radio } from '@superset/components'; +``` + +--- + +:::tip[Improve this page] +This documentation is auto-generated from the component's Storybook story. +Help improve it by [editing the story file](https://github.com/apache/superset/edit/master/superset-frontend/packages/superset-ui-core/src/components/Radio/Radio.stories.tsx). +::: diff --git a/docs/developer_docs/components/ui/safemarkdown.mdx b/docs/developer_docs/components/ui/safemarkdown.mdx new file mode 100644 index 00000000000..ab41307b235 --- /dev/null +++ b/docs/developer_docs/components/ui/safemarkdown.mdx @@ -0,0 +1,85 @@ +--- +title: SafeMarkdown +sidebar_label: SafeMarkdown +--- + + + +import { StoryWithControls } from '../../../src/components/StorybookWrapper'; + +# SafeMarkdown + +The SafeMarkdown component from Superset's UI library. + +## Live Example + + + +## Try It + +Edit the code below to experiment with the component: + +```tsx live +function Demo() { + return ( + + ); +} +``` + +## Props + +| Prop | Type | Default | Description | +|------|------|---------|-------------| +| `htmlSanitization` | `boolean` | `true` | Enable HTML sanitization (recommended for user input) | + +## Import + +```tsx +import { SafeMarkdown } from '@superset/components'; +``` + +--- + +:::tip[Improve this page] +This documentation is auto-generated from the component's Storybook story. 
+Help improve it by [editing the story file](https://github.com/apache/superset/edit/master/superset-frontend/packages/superset-ui-core/src/components/SafeMarkdown/SafeMarkdown.stories.tsx). +::: diff --git a/docs/developer_docs/components/ui/select.mdx b/docs/developer_docs/components/ui/select.mdx new file mode 100644 index 00000000000..f84be6b30b0 --- /dev/null +++ b/docs/developer_docs/components/ui/select.mdx @@ -0,0 +1,308 @@ +--- +title: Select +sidebar_label: Select +--- + + + +import { StoryWithControls } from '../../../src/components/StorybookWrapper'; + +# Select + +A versatile select component supporting single and multi-select modes, search filtering, option creation, and both synchronous and asynchronous data sources. + +## Live Example + + + +## Try It + +Edit the code below to experiment with the component: + +```tsx live +function Demo() { + return ( +
+ +
+ ); +} +``` + +## Allow New Options + +```tsx live +function AllowNewDemo() { + return ( +
+ +
+ ); +} +``` + +## One Line Mode + +```tsx live +function OneLineDemo() { + return ( +
+
@@ -115,8 +115,8 @@ Everything you need to contribute to the Apache Superset project. This section i **I found a bug** 1. [Search existing issues](https://github.com/apache/superset/issues) -2. [Report the bug](/developer_portal/contributing/issue-reporting) -3. [Submit a fix](/developer_portal/contributing/submitting-pr) +2. [Report the bug](/developer-docs/contributing/issue-reporting) +3. [Submit a fix](/developer-docs/contributing/submitting-pr)
**I want to contribute code** -1. [Set up development environment](/developer_portal/contributing/development-setup) +1. [Set up development environment](/developer-docs/contributing/development-setup) 2. [Find a good first issue](https://github.com/apache/superset/labels/good%20first%20issue) -3. [Submit your first PR](/developer_portal/contributing/submitting-pr) +3. [Submit your first PR](/developer-docs/contributing/submitting-pr) **I want to build an extension** -1. [Start with Quick Start](/developer_portal/extensions/quick-start) -2. [Learn extension structure](/developer_portal/extensions/extension-project-structure) -3. [Explore architecture](/developer_portal/extensions/architecture) +1. [Start with Quick Start](/developer-docs/extensions/quick-start) +2. [Learn extension development](/developer-docs/extensions/development) +3. [Explore architecture](/developer-docs/extensions/architecture)
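The link updates in the hunks above all follow one mechanical rule: `/developer_portal/…` becomes `/developer-docs/…`. As a sketch of how stragglers elsewhere in the docs could be caught with the same rule (the `rewrite_link` helper name is illustrative, not part of this change):

```shell
# Illustrative helper (not part of this diff): apply the same
# developer_portal -> developer-docs path rewrite used above.
rewrite_link() {
  printf '%s\n' "$1" | sed 's|/developer_portal/|/developer-docs/|g'
}

rewrite_link "/developer_portal/contributing/submitting-pr"
# -> /developer-docs/contributing/submitting-pr
```

The same `sed` expression could be run over whole files (e.g. via `grep -rl '/developer_portal/' docs/`) to verify no legacy links remain.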
@@ -132,4 +132,4 @@ Everything you need to contribute to the Apache Superset project. This section i --- -Welcome to the Apache Superset community! We're excited to have you contribute. 🎉 +Welcome to the Apache Superset community! We're excited to have you contribute. diff --git a/docs/developer_portal/sidebars.js b/docs/developer_docs/sidebars.js similarity index 100% rename from docs/developer_portal/sidebars.js rename to docs/developer_docs/sidebars.js diff --git a/docs/developer_portal/testing/backend-testing.md b/docs/developer_docs/testing/backend-testing.md similarity index 100% rename from docs/developer_portal/testing/backend-testing.md rename to docs/developer_docs/testing/backend-testing.md diff --git a/docs/developer_portal/testing/ci-cd.md b/docs/developer_docs/testing/ci-cd.md similarity index 100% rename from docs/developer_portal/testing/ci-cd.md rename to docs/developer_docs/testing/ci-cd.md diff --git a/docs/developer_portal/testing/e2e-testing.md b/docs/developer_docs/testing/e2e-testing.md similarity index 100% rename from docs/developer_portal/testing/e2e-testing.md rename to docs/developer_docs/testing/e2e-testing.md diff --git a/docs/developer_portal/testing/frontend-testing.md b/docs/developer_docs/testing/frontend-testing.md similarity index 100% rename from docs/developer_portal/testing/frontend-testing.md rename to docs/developer_docs/testing/frontend-testing.md diff --git a/docs/developer_portal/testing/overview.md b/docs/developer_docs/testing/overview.md similarity index 100% rename from docs/developer_portal/testing/overview.md rename to docs/developer_docs/testing/overview.md diff --git a/docs/developer_portal/testing/storybook.md b/docs/developer_docs/testing/storybook.md similarity index 100% rename from docs/developer_portal/testing/storybook.md rename to docs/developer_docs/testing/storybook.md diff --git a/docs/developer_portal/testing/testing-guidelines.md b/docs/developer_docs/testing/testing-guidelines.md similarity index 
100% rename from docs/developer_portal/testing/testing-guidelines.md rename to docs/developer_docs/testing/testing-guidelines.md diff --git a/docs/developer_portal/versions.json b/docs/developer_docs/versions.json similarity index 100% rename from docs/developer_portal/versions.json rename to docs/developer_docs/versions.json diff --git a/docs/docs/contributing/contributing.mdx b/docs/docs/contributing/contributing.mdx deleted file mode 100644 index d3aec8e3d11..00000000000 --- a/docs/docs/contributing/contributing.mdx +++ /dev/null @@ -1,142 +0,0 @@ ---- -title: Contributing to Superset -sidebar_position: 1 -version: 1 ---- - -# Contributing to Superset - -Superset is an [Apache Software foundation](https://www.apache.org/theapacheway/index.html) project. -The core contributors (or committers) to Superset communicate primarily in the following channels ( -which can be joined by anyone): - -- [Mailing list](https://lists.apache.org/list.html?dev@superset.apache.org) -- [Apache Superset Slack community](http://bit.ly/join-superset-slack) -- [GitHub issues](https://github.com/apache/superset/issues) -- [GitHub pull requests](https://github.com/apache/superset/pulls) -- [GitHub discussions](https://github.com/apache/superset/discussions) -- [Superset Community Calendar](https://superset.apache.org/community) - -More references: - -- [Superset Wiki (code guidelines and additional resources)](https://github.com/apache/superset/wiki) - -## Orientation - -Here's a list of repositories that contain Superset-related packages: - -- [apache/superset](https://github.com/apache/superset) - is the main repository containing the `apache_superset` Python package - distributed on - [pypi](https://pypi.org/project/apache_superset/). This repository - also includes Superset's main TypeScript/JavaScript bundles and react apps under - the [superset-frontend](https://github.com/apache/superset/tree/master/superset-frontend) - folder. 
-- [github.com/apache-superset](https://github.com/apache-superset) is the - GitHub organization under which we manage Superset-related - small tools, forks and Superset-related experimental ideas. - -## Types of Contributions - -### Report Bug - -The best way to report a bug is to file an issue on GitHub. Please include: - -- Your operating system name and version. -- Superset version. -- Detailed steps to reproduce the bug. -- Any details about your local setup that might be helpful in troubleshooting. - -When posting Python stack traces, please quote them using -[Markdown blocks](https://help.github.com/articles/creating-and-highlighting-code-blocks/). - -_Please note that feature requests opened as GitHub Issues will be moved to Discussions._ - -### Submit Ideas or Feature Requests - -The best way is to start an ["Ideas" Discussion thread](https://github.com/apache/superset/discussions/categories/ideas) on GitHub: - -- Explain in detail how it would work. -- Keep the scope as narrow as possible, to make it easier to implement. -- Remember that this is a volunteer-driven project, and that your contributions are as welcome as anyone's :) - -To propose large features or major changes to codebase, and help usher in those changes, please create a **Superset Improvement Proposal (SIP)**. See template from [SIP-0](https://github.com/apache/superset/issues/5602) - -### Fix Bugs - -Look through the GitHub issues. Issues tagged with `#bug` are -open to whoever wants to implement them. - -### Implement Features - -Look through the GitHub issues. Issues tagged with -`#feature` are open to whoever wants to implement them. - -### Improve Documentation - -Superset could always use better documentation, -whether as part of the official Superset docs, -in docstrings, `docs/*.rst` or even on the web as blog posts or -articles. See [Documentation](/docs/contributing/howtos#contributing-to-documentation) for more details. 
- -### Add Translations - -If you are proficient in a non-English language, you can help translate -text strings from Superset's UI. You can jump into the existing -language dictionaries at -`superset/translations//LC_MESSAGES/messages.po`, or -even create a dictionary for a new language altogether. -See [Translating](howtos#contributing-translations) for more details. - -### Ask Questions - -There is a dedicated [`apache-superset` tag](https://stackoverflow.com/questions/tagged/apache-superset) on [StackOverflow](https://stackoverflow.com/). Please use it when asking questions. - -## Types of Contributors - -Following the project governance model of the Apache Software Foundation (ASF), Apache Superset has a specific set of contributor roles: - -### PMC Member - -A Project Management Committee (PMC) member is a person who has been elected by the PMC to help manage the project. PMC members are responsible for the overall health of the project, including community development, release management, and project governance. PMC members are also responsible for the technical direction of the project. - -For more information about Apache Project PMCs, please refer to https://www.apache.org/foundation/governance/pmcs.html - -### Committer - -A committer is a person who has been elected by the PMC to have write access (commit access) to the code repository. They can modify the code, documentation, and website and accept contributions from others. - -The official list of committers and PMC members can be found [here](https://projects.apache.org/committee.html?superset). - -### Contributor - -A contributor is a person who has contributed to the project in any way, including but not limited to code, tests, documentation, issues, and discussions. 
- -> You can also review the Superset project's guidelines for PMC member promotion here: https://github.com/apache/superset/wiki/Guidelines-for-promoting-Superset-Committers-to-the-Superset-PMC - -### Security Team - -The security team is a selected subset of PMC members, committers and non-committers who are responsible for handling security issues. - -New members of the security team are selected by the PMC members in a vote. You can request to be added to the team by sending a message to private@superset.apache.org. However, the team should be small and focused on solving security issues, so the requests will be evaluated on a case-by-case basis and the team size will be kept relatively small, limited to only actively security-focused contributors. - -This security team must follow the [ASF vulnerability handling process](https://apache.org/security/committers.html#asf-project-security-for-committers). - -Each new security issue is tracked as a JIRA ticket on the [ASF's JIRA Superset security project](https://issues.apache.org/jira/secure/RapidBoard.jspa?rapidView=588&projectKey=SUPERSETSEC) - -Security team members must: - -- Have an [ICLA](https://www.apache.org/licenses/contributor-agreements.html) signed with Apache Software Foundation. -- Not reveal information about pending and unfixed security issues to anyone (including their employers) unless specifically authorised by the security team members, e.g., if the security team agrees that diagnosing and solving an issue requires the involvement of external experts. - -A release manager, the contributor overseeing the release of a specific version of Apache Superset, is by default a member of the security team. However, they are not expected to be active in assessing, discussing, and fixing security issues. - -Security team members should also follow these general expectations: - -- Actively participate in assessing, discussing, fixing, and releasing security issues in Superset. 
-- Avoid discussing security fixes in public forums. Pull request (PR) descriptions should not contain any information about security issues. The corresponding JIRA ticket should contain a link to the PR. -- Security team members who contribute to a fix may be listed as remediation developers in the CVE report, along with their job affiliation (if they choose to include it). - -:::resources -- [Blog: Comprehensive Tutorial for Contributing Code to Apache Superset](https://preset.io/blog/tutorial-contributing-code-to-apache-superset/) -::: diff --git a/docs/docs/contributing/development.mdx b/docs/docs/contributing/development.mdx deleted file mode 100644 index b75f2d5118a..00000000000 --- a/docs/docs/contributing/development.mdx +++ /dev/null @@ -1,1205 +0,0 @@ ---- -title: Setting up a Development Environment -sidebar_position: 3 -version: 1 ---- -# Setting up a Development Environment - -The documentation in this section is a bit of a patchwork of knowledge representing the -multitude of ways that exist to run Superset (`docker compose`, just "docker", on "metal", using -a Makefile). - -:::note -Now we have evolved to recommend and support `docker compose` more actively as the main way -to run Superset for development and preserve your sanity. **Most people should stick to -the first few sections - ("Fork & Clone", "docker compose" and "Installing Dev Tools")** -::: - -## Fork and Clone - -First, [fork the repository on GitHub](https://help.github.com/articles/about-forks/), -then clone it. - -Second, you can clone the main repository directly, but you won't be able to send pull requests. - -```bash -git clone git@github.com:your-username/superset.git -cd superset -``` - -## docker compose (recommended!) 
- -Setting things up to squeeze a "hello world" into any part of Superset should be as simple as - -```bash -# getting docker compose to fire up services, and rebuilding if some docker layers have changed -# using the `--build` suffix may be slower and optional if layers like py dependencies haven't changed -docker compose up --build -``` - -Note that: - -- this will pull/build docker images and run a cluster of services, including: - - A Superset **Flask web server**, mounting the local python repo/code - - A Superset **Celery worker**, also mounting the local python repo/code - - A Superset **Node service**, mounting, compiling and bundling the JS/TS assets - - A Superset **Node websocket service** to power the async backend - - **Postgres** as the metadata database and to store example datasets, charts and dashboards which - should be populated upon startup - - **Redis** as the message queue for our async backend and caching backend -- It'll load up examples into the database upon the first startup -- all other details and pointers available in - [docker-compose.yml](https://github.com/apache/superset/blob/master/docker-compose.yml) -- The local repository is mounted within the services, meaning updating - the code on the host will be reflected in the docker images -- Superset is served at localhost:9000/ -- You can login with admin/admin - -:::note -Installing and building Node modules for Apache Superset inside `superset-node` can take a -significant amount of time. This is normal due to the size of the dependencies. Please be -patient while the process completes, as long wait times do not indicate an issue with your setup. -If delays seem excessive, check your internet connection or system resources. 
-::: - -:::caution -Since `docker compose` is primarily designed to run a set of containers on **a single host** -and can't credibly support **high availability** as a result, we do not support nor recommend -using our `docker compose` constructs to support production-type use-cases. For single host -environments, we recommend using [minikube](https://minikube.sigs.k8s.io/docs/start/) along -our [installing on k8s](https://superset.apache.org/docs/installation/running-on-kubernetes) -documentation. -configured to be secure. -::: - -### Supported environment variables - -Affecting the Docker build process: - -- **SUPERSET_BUILD_TARGET (default=dev):** which --target to build, either `lean` or `dev` are commonly used -- **INCLUDE_FIREFOX (default=false):** whether to include the Firefox headless browser in the build -- **INCLUDE_CHROMIUM (default=false):** whether to include the Chromium headless browser in the build -- **BUILD_TRANSLATIONS(default=false):** whether to compile the translations from the .po files available -- **SUPERSET_LOAD_EXAMPLES (default=yes):** whether to load the examples into the database upon startup, - save some precious time on startup by `SUPERSET_LOAD_EXAMPLES=no docker compose up` -- **SUPERSET_LOG_LEVEL (default=info)**: Can be set to debug, info, warning, error, critical - for more verbose logging - -For more env vars that affect your configuration, see this -[superset_config.py](https://github.com/apache/superset/blob/master/docker/pythonpath_dev/superset_config.py) -used in the `docker compose` context to assign env vars to the superset configuration. - -### Accessing the postgres database - -Sometimes it's useful to access the database in the docker container directly. 
-You can enter a `psql` shell (the official Postgres client) by running the following command: - -```bash -docker compose exec db psql -U superset -``` - -Also note that the database is exposed on port 5432, so you can connect to it using your favorite -Postgres client or even SQL Lab itselft directly in Superset by creating a new database connection -to `localhost:5432`. - -### Nuking the postgres database - -At times, it's possible to end up with your development database in a bad state, it's -common while switching branches that contain migrations for instance, where the database -version stamp that `alembic` manages is no longer available after switching branch. - -In that case, the easy solution is to nuke the postgres db and start fresh. Note that the full -state of the database will be gone after doing this, so be cautious. - -```bash -# first stop docker-compose if it's running -docker-compose down -# delete the volume containing the database -docker volume rm superset_db_home -# restart docker-compose, which will init a fresh database and load examples -docker-compose up -``` - -## GitHub Codespaces (Cloud Development) - -GitHub Codespaces provides a complete, pre-configured development environment in the cloud. This is ideal for: -- Quick contributions without local setup -- Consistent development environments across team members -- Working from devices that can't run Docker locally -- Safe experimentation in isolated environments - -:::info -We're grateful to GitHub for providing this excellent cloud development service that makes -contributing to Apache Superset more accessible to developers worldwide. -::: - -### Getting Started with Codespaces - -1. 
**Create a Codespace**: Use this pre-configured link that sets up everything you need: - - [**Launch Superset Codespace →**](https://github.com/codespaces/new?skip_quickstart=true&machine=standardLinux32gb&repo=39464018&ref=master&devcontainer_path=.devcontainer%2Fdevcontainer.json&geo=UsWest) - - :::caution - **Important**: You must select at least the **4 CPU / 16GB RAM** machine type (pre-selected in the link above). - Smaller instances will not have sufficient resources to run Superset effectively. - ::: - -2. **Wait for Setup**: The initial setup takes several minutes. The Codespace will: - - Build the development container - - Install all dependencies - - Start all required services (PostgreSQL, Redis, etc.) - - Initialize the database with example data - -3. **Access Superset**: Once ready, check the **PORTS** tab in VS Code for port `9001`. - Click the globe icon to open Superset in your browser. - - Default credentials: `admin` / `admin` - -### Key Features - -- **Auto-reload**: Both Python and TypeScript files auto-refresh on save -- **Pre-installed Extensions**: VS Code extensions for Python, TypeScript, and database tools -- **Multiple Instances**: Run multiple Codespaces for different branches/features -- **SSH Access**: Connect via terminal using `gh cs ssh` or through the GitHub web UI -- **VS Code Integration**: Works seamlessly with VS Code desktop app - -### Managing Codespaces - -- **List active Codespaces**: `gh cs list` -- **SSH into a Codespace**: `gh cs ssh` -- **Stop a Codespace**: Via GitHub UI or `gh cs stop` -- **Delete a Codespace**: Via GitHub UI or `gh cs delete` - -### Debugging and Logs - -Since Codespaces uses `docker-compose-light.yml`, you can monitor all services: - -```bash -# Stream logs from all services -docker compose -f docker-compose-light.yml logs -f - -# Stream logs from a specific service -docker compose -f docker-compose-light.yml logs -f superset - -# View last 100 lines and follow -docker compose -f 
docker-compose-light.yml logs --tail=100 -f - -# List all running services -docker compose -f docker-compose-light.yml ps -``` - -:::tip -Codespaces automatically stop after 30 minutes of inactivity to save resources. -Your work is preserved and you can restart anytime. -::: - -## Installing Development Tools - -:::note -While `docker compose` simplifies a lot of the setup, there are still -many things you'll want to set up locally to power your IDE, and things like -**commit hooks**, **linters**, and **test-runners**. Note that you can do these -things inside docker images with commands like `docker compose exec superset_app bash` for -instance, but many people like to run that tooling from their host. -::: - -### Python environment - -Assuming you already have a way to setup your python environments -like `pyenv`, `virtualenv` or something else, all you should have to -do is to install our dev, pinned python requirements bundle, after installing -the prerequisites mentioned in [OS Dependencies](https://superset.apache.org/docs/installation/pypi/#os-dependencies) - -```bash -pip install -r requirements/development.txt -``` - -### Git Hooks - -Superset uses Git pre-commit hooks courtesy of [pre-commit](https://pre-commit.com/). -To install run the following: - -```bash -pre-commit install -``` - -This will install the hooks in your local repository. From now on, a series of checks will -automatically run whenever you make a Git commit. - -#### Running Pre-commit Manually - -You can also run the pre-commit checks manually in various ways: - -- **Run pre-commit on all files (same as CI):** - - To run the pre-commit checks across all files in your repository, use the following command: - - ```bash - pre-commit run --all-files - ``` - - This is the same set of checks that will run during CI, ensuring your changes meet the project's standards. 
-
-- **Run pre-commit on a specific file:**
-
-  If you want to check or fix a specific file, you can do so by specifying the file path:
-
-  ```bash
-  pre-commit run --files path/to/your/file.py
-  ```
-
-  This will only run the checks on the file(s) you specify.
-
-- **Run a specific pre-commit check:**
-
-  To run a specific check (hook) across all files or a particular file, use the following command:
-
-  ```bash
-  pre-commit run <hook-id> --all-files
-  ```
-
-  Or for a specific file:
-
-  ```bash
-  pre-commit run <hook-id> --files path/to/your/file.py
-  ```
-
-  Replace `<hook-id>` with the ID of the specific hook you want to run. You can find the list
-  of available hooks in the `.pre-commit-config.yaml` file.
-
-## Working with LLMs
-
-### Environment Setup
-Ensure Docker Compose is running before starting LLM sessions:
-```bash
-docker compose up
-```
-
-Validate your environment:
-```bash
-curl -f http://localhost:8088/health && echo "✅ Superset ready"
-```
-
-### LLM Session Best Practices
-- Always validate environment setup first using the health checks above
-- Use focused validation commands: `pre-commit run` (not `--all-files`)
-- **Read [AGENTS.md](https://github.com/apache/superset/blob/master/AGENTS.md) first** - Contains comprehensive development guidelines, coding standards, and critical refactor information
-- **Check platform-specific files** when available:
-  - `CLAUDE.md` - For Claude/Anthropic tools
-  - `CURSOR.md` - For Cursor editor
-  - `GEMINI.md` - For Google Gemini tools
-  - `GPT.md` - For OpenAI/ChatGPT tools
-- Follow the TypeScript migration guidelines and avoid deprecated patterns listed in AGENTS.md
-
-### Key Development Commands
-```bash
-# Frontend development
-cd superset-frontend
-npm run dev                        # Development server on http://localhost:9000
-npm run test                       # Run all tests
-npm run test -- filename.test.tsx  # Run single test file
-npm run lint                       # Linting and type checking
-
-# Backend validation
-pre-commit run mypy                # Type checking
-pytest                             # Run all tests
-pytest tests/unit_tests/specific_test.py  # Run single test file
-pytest tests/unit_tests/           # Run all tests in directory
-```
-
-For detailed development context, environment setup, and coding guidelines, see [AGENTS.md](https://github.com/apache/superset/blob/master/AGENTS.md).
-
-## Alternatives to `docker compose`
-
-:::caution
-This part of the documentation is a patchwork of information related to setting up
-development environments without `docker compose` and is documented/supported to varying
-degrees. It's been difficult to maintain this wide array of methods and ensure they're
-functioning across environments.
-:::
-
-### Flask server
-
-#### OS Dependencies
-
-Make sure your machine meets the [OS dependencies](https://superset.apache.org/docs/installation/pypi#os-dependencies) before following these steps.
-You also need to install MySQL.
-
-Ensure that you are using Python version 3.9, 3.10, or 3.11, then proceed with:
-
-```bash
-# Create a virtual environment and activate it (recommended)
-python3 -m venv venv # setup a python3 virtualenv
-source venv/bin/activate
-
-# Install external dependencies
-pip install -r requirements/development.txt
-
-# Install Superset in editable (development) mode
-pip install -e .
-
-# Initialize the database
-superset db upgrade
-
-# Create an admin user in your metadata database (use `admin` as username to be able to load the examples)
-superset fab create-admin
-
-# Create default roles and permissions
-superset init
-
-# Load some data to play with.
-# Note: you MUST have previously created an admin user with the username `admin` for this command to work.
-superset load-examples - -# The load-examples command supports various options: -# --force / -f Force reload data even if tables exist -# --only-metadata / -m Only create table metadata without loading data (fast setup) -# --load-test-data / -t Load additional test dashboards and datasets -# --load-big-data / -b Generate synthetic data for stress testing (wide tables, many tables) - -# Start the Flask dev web server from inside your virtualenv. -# Note that your page may not have CSS at this point. -# See instructions below on how to build the front-end assets. -superset run -p 8088 --with-threads --reload --debugger --debug -``` - -Or you can install it via our Makefile - -```bash -# Create a virtual environment and activate it (recommended) -$ python3 -m venv venv # setup a python3 virtualenv -$ source venv/bin/activate - -# install pip packages + pre-commit -$ make install - -# Install superset pip packages and setup env only -$ make superset - -# Setup pre-commit only -$ make pre-commit -``` - -**Note: the FLASK_APP env var should not need to be set, as it's currently controlled -via `.flaskenv`, however, if needed, it should be set to `superset.app:create_app()`** - -If you have made changes to the FAB-managed templates, which are not built the same way as the newer, React-powered front-end assets, you need to start the app without the `--with-threads` argument like so: -`superset run -p 8088 --reload --debugger --debug` - -#### Dependencies - -If you add a new requirement or update an existing requirement (per the `install_requires` section in `setup.py`) you must recompile (freeze) the Python dependencies to ensure that for CI, testing, etc. the build is deterministic. 
This can be achieved via, - -```bash -python3 -m venv venv -source venv/bin/activate -python3 -m pip install -r requirements/development.txt -./scripts/uv-pip-compile.sh -``` - -When upgrading the version number of a single package, you should run `./scripts/uv-pip-compile.sh` with the `-P` flag: - -```bash -./scripts/uv-pip-compile.sh -P some-package-to-upgrade -``` - -To bring all dependencies up to date as per the restrictions defined in `setup.py` and `requirements/*.in`, run `./scripts/uv-pip-compile.sh --upgrade` - -```bash -./scripts/uv-pip-compile.sh --upgrade -``` - -This should be done periodically, but it is recommended to do thorough manual testing of the application to ensure no breaking changes have been introduced that aren't caught by the unit and integration tests. - -#### Logging to the browser console - -This feature is only available on Python 3. When debugging your application, you can have the server logs sent directly to the browser console using the [ConsoleLog](https://github.com/betodealmeida/consolelog) package. You need to mutate the app, by adding the following to your `config.py` or `superset_config.py`: - -```python -from console_log import ConsoleLog - -def FLASK_APP_MUTATOR(app): - app.wsgi_app = ConsoleLog(app.wsgi_app, app.logger) -``` - -Then make sure you run your WSGI server using the right worker type: - -```bash -gunicorn "superset.app:create_app()" -k "geventwebsocket.gunicorn.workers.GeventWebSocketWorker" -b 127.0.0.1:8088 --reload -``` - -### Frontend - -Frontend assets (TypeScript, JavaScript, CSS, and images) must be compiled in order to properly display the web UI. The `superset-frontend` directory contains all NPM-managed frontend assets. Note that for some legacy pages there are additional frontend assets bundled with Flask-Appbuilder (e.g. jQuery and bootstrap). These are not managed by NPM and may be phased out in the future. 
-
-#### Prerequisite
-
-##### nvm and node
-
-First, be sure you are using the following versions of Node.js and npm:
-
-- `Node.js`: Version 20
-- `npm`: Version 10
-
-We recommend using [nvm](https://github.com/nvm-sh/nvm) to manage your node environment:
-
-```bash
-curl -o- https://raw.githubusercontent.com/nvm-sh/nvm/v0.37.0/install.sh | bash
-
-# In case it shows '-bash: nvm: command not found':
-export NVM_DIR="$HOME/.nvm"
-[ -s "$NVM_DIR/nvm.sh" ] && \. "$NVM_DIR/nvm.sh" # This loads nvm
-[ -s "$NVM_DIR/bash_completion" ] && \. "$NVM_DIR/bash_completion" # This loads nvm bash_completion
-
-cd superset-frontend
-nvm install --lts
-nvm use --lts
-```
-
-Or, if you use `zsh` (the default macOS shell starting with Catalina), try:
-
-```zsh
-sh -c "$(curl -fsSL https://raw.githubusercontent.com/nvm-sh/nvm/v0.37.0/install.sh)"
-```
-
-For those interested, you may also try out [avn](https://github.com/nvm-sh/nvm#deeper-shell-integration) to automatically switch to the node version that is required to run the Superset frontend.
-
-#### Install dependencies
-
-Install third-party dependencies listed in `package.json` via:
-
-```bash
-# From the root of the repository
-cd superset-frontend
-
-# Install dependencies from `package-lock.json`
-npm ci
-```
-
-Note that Superset uses [Scarf](https://docs.scarf.sh) to capture telemetry/analytics about versions being installed, including the `scarf-js` npm package and an analytics pixel. As noted elsewhere in this documentation, Scarf gathers aggregated stats for the sake of security/release strategy and does not capture/retain PII. [You can read here](https://docs.scarf.sh/package-analytics/) about the `scarf-js` package, and various means to opt out of it, but you can opt out of the npm package _and_ the pixel by setting the `SCARF_ANALYTICS` environment variable to `false` or opt out of the pixel by adding this setting in `superset-frontend/package.json`:
-
-```json
-// your-package/package.json
-{
-  // ...
-  "scarfSettings": {
-    "enabled": false
-  }
-  // ...
-}
-```
-
-#### Build assets
-
-There are three types of assets you can build:
-
-1. `npm run build`: the production assets, CSS/JS minified and optimized
-2. `npm run dev-server`: local development assets, with sourcemaps and hot refresh support
-3. `npm run build-instrumented`: instrumented application code for collecting code coverage from Cypress tests
-
-If, while using the above commands, you encounter an error related to the limit of file watchers:
-
-```bash
-Error: ENOSPC: System limit for number of file watchers reached
-```
-
-The error is thrown because the number of files monitored by the system has reached the limit.
-You can address this error by increasing the number of inotify watchers.
-
-The current value of max watches can be checked with:
-
-```bash
-cat /proc/sys/fs/inotify/max_user_watches
-```
-
-Edit the file `/etc/sysctl.conf` to increase this value.
-The value needs to be decided based on the system memory [(see this StackOverflow answer for more context)](https://stackoverflow.com/questions/535768/what-is-a-reasonable-amount-of-inotify-watches-with-linux).
-
-Open the file in an editor and add a line at the bottom specifying the max watches value.
-
-```bash
-fs.inotify.max_user_watches=524288
-```
-
-Save the file and exit the editor.
-To confirm that the change succeeded, run the following command to load the updated value of max_user_watches from `sysctl.conf`:
-
-```bash
-sudo sysctl -p
-```
-
-#### Webpack dev server
-
-The dev server by default starts at `http://localhost:9000` and proxies the backend requests to `http://localhost:8088`.
-
-So a typical development workflow is the following:
-
-1. [run Superset locally](#flask-server) using Flask, on port `8088` — but don't access it directly,
- - ```bash - # Install Superset and dependencies, plus load your virtual environment first, as detailed above. - superset run -p 8088 --with-threads --reload --debugger --debug - ``` - -2. in parallel, run the Webpack dev server locally on port `9000`,
-
-   ```bash
-   npm run dev-server
-   ```
-
-3. access `http://localhost:9000` (the Webpack server, _not_ Flask) in your web browser. This will use the hot-reloading front-end assets from the Webpack development server while redirecting back-end queries to Flask/Superset: your changes on the Superset codebase — either front or back-end — will then be reflected live in the browser.
-
-It's possible to change the Webpack server settings:
-
-```bash
-# Start the dev server at http://localhost:9000
-npm run dev-server
-
-# Run the dev server on a non-default port
-npm run dev-server -- --port=9001
-
-# Proxy backend requests to a Flask server running on a non-default port
-npm run dev-server -- --env=--supersetPort=8081
-
-# Proxy to a remote backend but serve local assets
-npm run dev-server -- --env=--superset=https://superset-dev.example.com
-```
-
-The `--superset=` option is useful in case you want to debug a production issue or have to set up Superset behind a firewall. It allows you to run the Flask server in another environment while keeping assets building locally for the best developer experience.
-
-#### Other npm commands
-
-Alternatively, there are other NPM commands you may find useful:
-
-1. `npm run build-dev`: build assets in development mode.
-2. `npm run dev`: build dev assets in watch mode; it will automatically rebuild when a file changes
-
-#### Docker (docker compose)
-
-See docs [here](https://superset.apache.org/docs/installation/docker-compose)
-
-#### Updating NPM packages
-
-Use npm in the prescribed way, making sure that
-`superset-frontend/package-lock.json` is updated according to `npm`-prescribed
-best practices.
-
-#### Feature flags
-
-Superset supports a server-wide feature flag system, which eases the incremental development of features. To add a new feature flag, simply modify `superset_config.py` with something like the following:
-
-```python
-FEATURE_FLAGS = {
-    'SCOPED_FILTER': True,
-}
-```
-
-If you want to use the same flag in the client code, also add it to the FeatureFlag TypeScript enum in [@superset-ui/core](https://github.com/apache/superset/blob/master/superset-frontend/packages/superset-ui-core/src/utils/featureFlags.ts). For example,
-
-```typescript
-export enum FeatureFlag {
-  SCOPED_FILTER = "SCOPED_FILTER",
-}
-```
-
-`superset/config.py` contains `DEFAULT_FEATURE_FLAGS`, which will be overridden by
-those specified under FEATURE_FLAGS in `superset_config.py`. For example, `DEFAULT_FEATURE_FLAGS = { 'FOO': True, 'BAR': False }` in `superset/config.py` and `FEATURE_FLAGS = { 'BAR': True, 'BAZ': True }` in `superset_config.py` will result
-in combined feature flags of `{ 'FOO': True, 'BAR': True, 'BAZ': True }`.
-
-The current status of the usability of each flag (stable vs testing, etc) can be found in the [Feature Flags](/docs/configuration/feature-flags) documentation.
-
-## Git Hooks
-
-Superset uses Git pre-commit hooks courtesy of [pre-commit](https://pre-commit.com/). To install, run the following:
-
-```bash
-pip3 install -r requirements/development.txt
-pre-commit install
-```
-
-A series of checks will now run when you make a git commit.
-
-## Linting
-
-See [how tos](/docs/contributing/howtos#linting)
-
-## GitHub Actions and `act`
-
-:::tip
-`act` compatibility of Superset's GHAs is not fully tested. Running `act` locally may or may not
-work for different actions, and may require fine tuning and local secret-handling.
-For those more intricate GHAs that are tricky to run locally, we recommend iterating
-directly on GHA's infrastructure, by pushing directly on a branch and monitoring GHA logs.
-For more targeted iteration, see the `gh workflow run --ref {BRANCH}` subcommand of the GitHub CLI.
-:::
-
-For automation and CI/CD, Superset makes extensive use of GitHub Actions (GHA). You
-can find all of the workflows and other assets under the `.github/` folder. This includes:
-
-- running the backend unit test suites (`tests/`)
-- running the frontend test suites (`superset-frontend/src/**.*.test.*`)
-- running our Playwright end-to-end tests (`superset-frontend/playwright/`) and legacy Cypress tests (`superset-frontend/cypress-base/`)
-- linting the codebase, including all Python, TypeScript, JavaScript, YAML, and beyond
-- checking for all sorts of other rules and conventions
-
-When you open a pull request (PR), the appropriate GitHub Actions (GHA) workflows will
-automatically run depending on the changes in your branch. It's perfectly reasonable
-(and required!) to rely on this automation. However, the downside is that it's mostly an
-all-or-nothing approach and doesn't provide much control to target specific tests or
-iterate quickly.
-
-At times, it may be more convenient to run GHA workflows locally. For that purpose
-we use [act](https://github.com/nektos/act), a tool that allows you to run GitHub Actions (GHA)
-workflows locally. It simulates the GitHub Actions environment, enabling developers to
-test and debug workflows on their local machines before pushing changes to the repository. More
-on how to use it in the next section.
-
-:::note
-In both GHA and `act`, we can run a more complex matrix for our tests, executing against different
-database engines (PostgreSQL, MySQL, SQLite) and different versions of Python.
-This enables us to ensure compatibility and stability across various environments.
-::: - -### Using `act` - -First, install `act` -> https://nektosact.com/ - -To list the workflows, simply: - -```bash -act --list -``` - -To run a specific workflow: - -```bash -act pull_request --job {workflow_name} --secret GITHUB_TOKEN=$GITHUB_TOKEN --container-architecture linux/amd64 -``` - -In the example above, notice that: - -- we target a specific workflow, using `--job` -- we pass a secret using `--secret`, as many jobs require read access (public) to the repo -- we simulate a `pull_request` event by specifying it as the first arg, - similarly, we could simulate a `push` event or something else -- we specify `--container-architecture`, which tends to emulate GHA more reliably - -:::note -`act` is a rich tool that offers all sorts of features, allowing you to simulate different -events (pull_request, push, ...), semantics around passing secrets where required and much -more. For more information, refer to [act's documentation](https://nektosact.com/) -::: - -:::note -Some jobs require secrets to interact with external systems and accounts that you may -not have in your possession. In those cases you may have to rely on remote CI or parameterize the -job further to target a different environment/sandbox or your own alongside the related -secrets. -::: - ---- - -## Example Data and Test Loaders - -### Example Datasets - -Superset includes example datasets stored as Parquet files, organized by example name in the `superset/examples/` directory. Each example is self-contained: - -``` -superset/examples/ -├── _shared/ # Shared configuration -│ ├── database.yaml # Database connection config -│ └── metadata.yaml # Import metadata -├── birth_names/ # Example: US Birth Names -│ ├── data.parquet # Dataset (compressed columnar) -│ ├── dataset.yaml # Dataset metadata -│ ├── dashboard.yaml # Dashboard configuration (optional) -│ └── charts/ # Chart configurations (optional) -│ ├── Boys.yaml -│ ├── Girls.yaml -│ └── ... 
-├── energy_usage/ # Example: Energy Sankey -│ ├── data.parquet -│ ├── dataset.yaml -│ └── charts/ -└── ... (27 example directories) -``` - -#### Adding a New Example Dataset - -**Simple dataset (data only):** - -1. Create a directory: `superset/examples/my_dataset/` -2. Add your data as `data.parquet`: - ```python - import pandas as pd - df = pd.read_csv("your_data.csv") - df.to_parquet("superset/examples/my_dataset/data.parquet", compression="snappy") - ``` -3. The dataset will be auto-discovered when running `superset load-examples` - -**Complete example with dashboard:** - -1. Create your dataset directory with `data.parquet` -2. Add `dataset.yaml` with metadata (columns, metrics, etc.) -3. Add `dashboard.yaml` with dashboard layout -4. Add chart configs in `charts/` directory -5. See existing examples like `birth_names/` for reference - -#### Exporting an Existing Dashboard - -To export a dashboard and its charts as YAML configs: - -1. In Superset, go to the dashboard you want to export -2. Click the "..." menu → "Export" -3. Unzip the exported file -4. Copy the YAML files to your example directory -5. Add the `data.parquet` file - -#### Why Parquet? - -- **Apache-friendly**: Parquet is an Apache project, ideal for ASF codebases -- **Compressed**: Built-in Snappy compression (~27% smaller than CSV) -- **Self-describing**: Schema is embedded in the file -- **Widely supported**: Works with pandas, pyarrow, DuckDB, Spark, etc. 
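-
-As a quick sketch of that round trip (assuming `pandas` with a Parquet engine such as `pyarrow` is installed; the column names below are made up for illustration and are not one of the bundled datasets):
-
-```python
-import os
-import tempfile
-
-import pandas as pd
-
-# Hypothetical example data (not a real bundled dataset)
-df = pd.DataFrame({"state": ["CA", "NY", "TX"], "num_births": [100, 200, 300]})
-
-# Snappy is the default Parquet codec; shown explicitly here for clarity
-path = os.path.join(tempfile.mkdtemp(), "data.parquet")
-df.to_parquet(path, compression="snappy")
-
-# The schema (column names and dtypes) is embedded in the file,
-# so no separate schema definition is needed when reading it back
-restored = pd.read_parquet(path)
-assert restored.equals(df)
-```
-
-This mirrors the `df.to_parquet(...)` call shown in the "Adding a New Example Dataset" steps above.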
- -### Test Data Generation - -For stress testing and development, Superset includes special test data generators that create synthetic data: - -#### Big Data Loader (`--load-big-data`) - -Located in `superset/cli/test_loaders.py`, this generates: - -- **Wide Table** (`wide_table`): 100 columns of mixed types, 1000 rows -- **Many Small Tables** (`small_table_0` through `small_table_999`): 1000 tables for testing catalog performance -- **Long Name Table**: Table with 60-character random name for testing UI edge cases - -This is primarily used for: -- Performance testing with extreme data shapes -- UI edge case validation -- Database catalog stress testing -- CI/CD pipeline validation - -#### Test Dashboards (`--load-test-data`) - -Loads additional test-specific content: -- Tabbed dashboard example -- Supported charts dashboard -- Test configuration files (*.test.yaml) - ---- - -## Testing - -### Python Testing - -#### Unit Tests - -For unit tests located in `tests/unit_tests/`, it's usually easy to simply run the script locally using: - -```bash -pytest tests/unit_tests/* -``` - -#### Integration Tests - -For more complex pytest-defined integration tests (not to be confused with our end-to-end Cypress tests), many tests will require having a working test environment. Some tests require a database, Celery, and potentially other services or libraries installed. - -### Running Tests with `act` - -To run integration tests locally using `act`, ensure you have followed the setup instructions from the [GitHub Actions and `act`](#github-actions-and-act) section. You can run specific workflows or jobs that include integration tests. For example: - -```bash -act --job test-python-38 --secret GITHUB_TOKEN=$GITHUB_TOKEN --event pull_request --container-architecture linux/amd64 -``` - -#### Running locally using a test script - -There is also a utility script included in the Superset codebase to run Python integration tests. 
The [readme can be found here](https://github.com/apache/superset/tree/master/scripts/tests). - -To run all integration tests, for example, run this script from the root directory: - -```bash -scripts/tests/run.sh -``` - -You can run unit tests found in `./tests/unit_tests` with pytest. It is a simple way to run an isolated test that doesn't need any database setup: - -```bash -pytest ./link_to_test.py -``` - -### Frontend Testing - -We use [Jest](https://jestjs.io/) and [Enzyme](https://airbnb.io/enzyme/) to test TypeScript/JavaScript. Tests can be run with: - -```bash -cd superset-frontend -npm run test -``` - -To run a single test file: - -```bash -npm run test -- path/to/file.js -``` - -#### Known Issues and Workarounds - -**Jest Test Hanging (MessageChannel Issue)** - -If Jest tests hang with "Jest did not exit one second after the test run has completed", this is likely due to the MessageChannel issue from rc-overflow (Ant Design v5 components). - -**Root Cause**: `rc-overflow@1.4.1` creates MessageChannel handles for responsive overflow detection that remain open after test completion. - -**Current Workaround**: MessageChannel is mocked as undefined in `spec/helpers/jsDomWithFetchAPI.ts`, forcing rc-overflow to use requestAnimationFrame fallback. - -**To verify if still needed**: Remove the MessageChannel mocking lines and run `npm test -- --shard=4/8`. If tests hang, the workaround is still required. - -**Future removal conditions**: This workaround can be removed when: -- rc-overflow updates to properly clean up MessagePorts in test environments -- Jest updates to handle MessageChannel/MessagePort cleanup better -- Ant Design switches away from rc-overflow -- We switch away from Ant Design v5 - -**See**: [PR #34871](https://github.com/apache/superset/pull/34871) for full technical details. 
-
-### Debugging Server App
-
-#### Local
-
-For debugging locally using VSCode, you can configure a launch configuration file .vscode/launch.json such as
-
-```json
-{
-  "version": "0.2.0",
-  "configurations": [
-    {
-      "name": "Python: Flask",
-      "type": "python",
-      "request": "launch",
-      "module": "flask",
-      "env": {
-        "FLASK_APP": "superset",
-        "SUPERSET_ENV": "development"
-      },
-      "args": ["run", "-p 8088", "--with-threads", "--reload", "--debugger"],
-      "jinja": true,
-      "justMyCode": true
-    }
-  ]
-}
-```
-
-#### Raw Docker (without `docker compose`)
-
-Follow these instructions to debug the Flask app running inside a docker container. Note that
-this will run a barebones Superset web server.
-
-First, add the following to the ./docker-compose.yaml file
-
-```diff
-superset:
-    env_file: docker/.env
-    image: *superset-image
-    container_name: superset_app
-    command: ["/app/docker/docker-bootstrap.sh", "app"]
-    restart: unless-stopped
-+   cap_add:
-+     - SYS_PTRACE
-    ports:
-      - 8088:8088
-+     - 5678:5678
-    user: "root"
-    depends_on: *superset-depends-on
-    volumes: *superset-volumes
-    environment:
-      CYPRESS_CONFIG: "${CYPRESS_CONFIG}"
-```
-
-Start Superset as usual
-
-```bash
-docker compose up --build
-```
-
-Install the required libraries and packages to the docker container
-
-Enter the superset_app container
-
-```bash
-docker exec -it superset_app /bin/bash
-root@39ce8cf9d6ab:/app#
-```
-
-Run the following commands inside the container
-
-```bash
-apt update
-apt install -y gdb
-apt install -y net-tools
-pip install debugpy
-```
-
-Find the PID for the Flask process. Make sure to use the first PID. The Flask app will re-spawn a sub-process every time you change any of the python code, so it's important to use the first PID.
-
-```bash
-ps -ef
-
-UID        PID  PPID  C STIME TTY          TIME CMD
-root         1     0  0 14:09 ?        00:00:00 bash /app/docker/docker-bootstrap.sh app
-root         6     1  4 14:09 ?        00:00:04 /usr/local/bin/python /usr/bin/flask run -p 8088 --with-threads --reload --debugger --host=0.0.0.0
-root        10     6  7 14:09 ?        00:00:07 /usr/local/bin/python /usr/bin/flask run -p 8088 --with-threads --reload --debugger --host=0.0.0.0
-```
-
-Inject debugpy into the running Flask process. In this case PID 6.
-
-```bash
-python3 -m debugpy --listen 0.0.0.0:5678 --pid 6
-```
-
-Verify that debugpy is listening on port 5678
-
-```bash
-netstat -tunap
-
-Active Internet connections (servers and established)
-Proto Recv-Q Send-Q Local Address    Foreign Address   State    PID/Program name
-tcp        0      0 0.0.0.0:5678     0.0.0.0:*         LISTEN   462/python
-tcp        0      0 0.0.0.0:8088     0.0.0.0:*         LISTEN   6/python
-```
-
-You are now ready to attach a debugger to the process. Using VSCode, you can configure a launch configuration file .vscode/launch.json like so.
-
-```json
-{
-  "version": "0.2.0",
-  "configurations": [
-    {
-      "name": "Attach to Superset App in Docker Container",
-      "type": "python",
-      "request": "attach",
-      "connect": {
-        "host": "127.0.0.1",
-        "port": 5678
-      },
-      "pathMappings": [
-        {
-          "localRoot": "${workspaceFolder}",
-          "remoteRoot": "/app"
-        }
-      ]
-    }
-  ]
-}
-```
-
-VSCode will not stop on breakpoints right away. We've attached to PID 6; however, it does not yet know of any sub-processes. In order to "wake up" the debugger, you need to modify a python file. This will trigger Flask to reload the code and create a new sub-process. This new sub-process will be detected by VSCode and breakpoints will be activated.
-
-### Debugging Server App in Kubernetes Environment
-
-To debug Flask running in a pod inside a Kubernetes cluster, you'll need to make sure the pod runs as root and is granted the SYS_PTRACE capability. These settings should not be used in production environments.
-
-```yaml
-  securityContext:
-    capabilities:
-      add: ["SYS_PTRACE"]
-```
-
-See [set capabilities for a container](https://kubernetes.io/docs/tasks/configure-pod-container/security-context/#set-capabilities-for-a-container) for more details.
-
-Once the pod is running as root and has the SYS_PTRACE capability, you will be able to debug the Flask app.
-
-You can follow the same instructions as in the `docker compose` section. Enter the pod and install the required libraries and packages: gdb, net-tools, and debugpy.
-
-Often in a Kubernetes environment, nodes are not addressable from outside the cluster. VSCode will thus be unable to remotely connect to port 5678 on a Kubernetes node. In order to do this, you need to create a tunnel that port forwards 5678 to your local machine.
-
-```bash
-kubectl port-forward pod/superset-<pod-name> 5678:5678
-```
-
-You can now launch your VSCode debugger with the same config as above. VSCode will connect to 127.0.0.1:5678, which is forwarded by kubectl to your remote Kubernetes pod.
-
-### Storybook
-
-Superset includes a [Storybook](https://storybook.js.org/) to preview the layout/styling of various Superset components and variations thereof. To open and view the Storybook:
-
-```bash
-cd superset-frontend
-npm run storybook
-```
-
-When contributing new React components to Superset, please try to add a Story alongside the component's `jsx/tsx` file.
-
-#### Testing Stories
-
-Superset uses [@storybook/test-runner](https://storybook.js.org/docs/writing-tests/test-runner) to validate that all stories compile and render without errors. This helps catch broken stories before they're merged.
-
-```bash
-# Run against a running Storybook server (start with `npm run storybook` first)
-npm run test-storybook
-
-# Build static Storybook and test (CI-friendly, no server needed)
-npm run test-storybook:ci
-```
-
-The `test-storybook` job runs automatically in CI on every pull request, ensuring stories remain functional.
-
-## Tips
-
-### Adding a new datasource
-
-1. Create Models and Views for the datasource. Add them under the superset folder, e.g. a new my_models.py
-   with models for cluster, datasources, columns and metrics, and my_views.py with clustermodelview
-   and datasourcemodelview.
-
-1. Create DB migration files for the new models
-
-1. Specify this variable in config.py to add the datasource model and the module it comes from:
-
-   For example:
-
-   ```python
-   ADDITIONAL_MODULE_DS_MAP = {'superset.my_models': ['MyDatasource', 'MyOtherDatasource']}
-   ```
-
-   This means it'll register MyDatasource and MyOtherDatasource from the superset.my_models module in the source registry.
-
-### Visualization Plugins
-
-The topic of authoring new plugins, whether you'd like to contribute
-them back or not, has been well documented in the
-[documentation](https://superset.apache.org/docs/contributing/creating-viz-plugins).
-
-:::resources
-- [Blog: Building Custom Viz Plugins in Superset v2](https://preset.io/blog/building-custom-viz-plugins-in-superset-v2)
-- [Blog: Enhancing Superset Visualization Plugins](https://preset.io/blog/enhancing-superset-visualization-plugins-part-1/)
-:::
-
-To contribute a plugin to Superset, your plugin must meet the following criteria:
-
-- The plugin should be applicable to the community at large, not a particularly specialized use case
-- The plugin should be written with TypeScript
-- The plugin should contain sufficient unit/e2e tests
-- The plugin should use appropriate namespacing, e.g. a folder name of `plugin-chart-whatever` and a package name of `@superset-ui/plugin-chart-whatever`
-- The plugin should use theme variables via Emotion, as passed in by the ThemeProvider
-- The plugin should provide adequate error handling (no data returned, malformed data, invalid controls, etc.)
-- The plugin should contain documentation in the form of a populated `README.md` file -- The plugin should have a meaningful and unique icon -- Above all else, the plugin should come with a _commitment to maintenance_ from the original author(s) - -Submissions will be considered for inclusion (or removal) on a case-by-case basis. - -### Adding a DB migration - -1. Alter the model you want to change. This example will add a `Column` to the `Annotation` model. - - [Example commit](https://github.com/apache/superset/commit/6c25f549384d7c2fc288451222e50493a7b14104) - -1. Generate the migration file - - ```bash - superset db migrate -m 'add_metadata_column_to_annotation_model' - ``` - - This will generate a file in `migrations/version/{SHA}_this_will_be_in_the_migration_filename.py`. - - [Example commit](https://github.com/apache/superset/commit/d3e83b0fd572c9d6c1297543d415a332858e262) - -1. Upgrade the DB - - ```bash - superset db upgrade - ``` - - The output should look like this: - - ```log - INFO [alembic.runtime.migration] Context impl SQLiteImpl. - INFO [alembic.runtime.migration] Will assume transactional DDL. - INFO [alembic.runtime.migration] Running upgrade 1a1d627ebd8e -> 40a0a483dd12, add_metadata_column_to_annotation_model.py - ``` - -1. Add the column to the view - - Since there is a new column, we need to add it to the AppBuilder Model view. - - [Example commit](https://github.com/apache/superset/pull/5745/commits/6220966e2a0a0cf3e6d87925491f8920fe8a3458) - -1. Test the migration's `down` method - - ```bash - superset db downgrade - ``` - - The output should look like this: - - ```log - INFO [alembic.runtime.migration] Context impl SQLiteImpl. - INFO [alembic.runtime.migration] Will assume transactional DDL.
- INFO [alembic.runtime.migration] Running downgrade 40a0a483dd12 -> 1a1d627ebd8e, add_metadata_column_to_annotation_model.py - ``` - -### Merging DB migrations - -When two DB migrations collide, you'll get an error message like this one: - -```text -alembic.util.exc.CommandError: Multiple head revisions are present for -given argument 'head'; please specify a specific target -revision, '@head' to narrow to a specific head, -or 'heads' for all heads` -``` - -To fix it: - -1. Get the migration heads - - ```bash - superset db heads - ``` - - This should list two or more migration hashes. E.g. - - ```bash - 1412ec1e5a7b (head) - 67da9ef1ef9c (head) - ``` - -2. Pick one of them as the parent revision, open the script for the other revision - and update `Revises` and `down_revision` to the new parent revision. E.g.: - - ```diff - --- a/67da9ef1ef9c_add_hide_left_bar_to_tabstate.py - +++ b/67da9ef1ef9c_add_hide_left_bar_to_tabstate.py - @@ -17,14 +17,14 @@ - """add hide_left_bar to tabstate - - Revision ID: 67da9ef1ef9c - -Revises: c501b7c653a3 - +Revises: 1412ec1e5a7b - Create Date: 2021-02-22 11:22:10.156942 - - """ - - # revision identifiers, used by Alembic. - revision = "67da9ef1ef9c" - -down_revision = "c501b7c653a3" - +down_revision = "1412ec1e5a7b" - - import sqlalchemy as sa - from alembic import op - ``` - - Alternatively, you may also run `superset db merge` to create a migration script - just for merging the heads. - - ```bash - superset db merge {HASH1} {HASH2} - ``` - -3. Upgrade the DB to the new checkpoint - - ```bash - superset db upgrade - ``` diff --git a/docs/docs/contributing/guidelines.mdx b/docs/docs/contributing/guidelines.mdx deleted file mode 100644 index de734eebd75..00000000000 --- a/docs/docs/contributing/guidelines.mdx +++ /dev/null @@ -1,254 +0,0 @@ ---- -title: Guidelines -sidebar_position: 2 -version: 1 ---- - -## Pull Request Guidelines - -A philosophy we would like to strongly encourage is - -> Before creating a PR, create an issue. 
- -The purpose is to separate the problem from possible solutions. - -**Bug fixes:** If you’re only fixing a small bug, it’s fine to submit a pull request right away, but we highly recommend filing an issue detailing what you’re fixing. This is helpful in case we don’t accept that specific fix but want to keep track of the issue. Please keep in mind that the project maintainers reserve the right to accept or reject incoming PRs, so it is better to separate the issue and the code to fix it from each other. In some cases, project maintainers may request that you create an issue separate from the PR before proceeding. - -**Refactor:** For small refactors, it can be a standalone PR itself detailing what you are refactoring and why. If there are concerns, project maintainers may request you to create a `#SIP` for the PR before proceeding. - -**Feature/Large changes:** If you intend to change the public API, or make any non-trivial changes to the implementation, we require you to file a new issue as `#SIP` (Superset Improvement Proposal). This lets us reach an agreement on your proposal before you put significant effort into it. You are welcome to submit a PR along with the SIP (sometimes necessary for demonstration), but we will not review/merge the code until the SIP is approved. - -In general, small PRs are always easier to review than large PRs. The best practice is to break your work into smaller independent PRs and refer to the same issue. This will greatly reduce turnaround time. - -If you wish to share your work which is not ready to merge yet, create a [Draft PR](https://github.blog/2019-02-14-introducing-draft-pull-requests/). This will enable maintainers and the CI runner to prioritize mature PRs. - -Finally, never submit a PR that will put the master branch in a broken state.
If the PR is part of multiple PRs to complete a large feature and cannot work on its own, you can create a feature branch and merge all related PRs into the feature branch before creating a PR from the feature branch to master. - -### Protocol - -#### Authoring - -- Fill in all sections of the PR template. -- Title the PR with one of the following semantic prefixes (inspired by [Karma](http://karma-runner.github.io/0.10/dev/git-commit-msg.html)): - - - `feat` (new feature) - - `fix` (bug fix) - - `docs` (changes to the documentation) - - `style` (formatting, missing semicolons, etc.; no application logic change) - - `refactor` (refactoring code) - - `test` (adding missing tests, refactoring tests; no application logic change) - - `chore` (updating tasks, etc.; no application logic change) - - `perf` (performance-related change) - - `build` (build tooling, Docker configuration change) - - `ci` (test runner, GitHub Actions workflow changes) - - `other` (changes that don't correspond to the above -- should be rare!) - - Examples: - - `feat: export charts as ZIP files` - - `perf(api): improve API info performance` - - `fix(chart-api): cached-indicator always shows value is cached` - -- Add the prefix `[WIP]` to the title if not ready for review (WIP = work-in-progress). We recommend creating a PR with `[WIP]` first and removing it once it has passed CI tests and you have read through your code changes at least once.
-- If you believe your PR contributes a potentially breaking change, put a `!` after the semantic prefix but before the colon in the PR title, like so: `feat!: Added foo functionality to bar` -- **Screenshots/GIFs:** Changes to the user interface require before/after screenshots, or a GIF for interactions - - Recommended capture tools ([Kap](https://getkap.co/), [LICEcap](https://www.cockos.com/licecap/), [Skitch](https://download.cnet.com/Skitch/3000-13455_4-189876.html)) - - If no screenshot is provided, the committers will mark the PR with the `need:screenshot` label and will not review it until a screenshot is provided. -- **Dependencies:** Be careful about adding new dependencies and avoid unnecessary ones. - - For Python, include them in `pyproject.toml` denoting any specific restrictions and - in `requirements.txt` pinned to a specific version, which ensures that the application - build is deterministic. - - For TypeScript/JavaScript, include new libraries in `package.json` -- **Tests:** The pull request should include tests, either as doctests, unit tests, or both. Make sure to resolve all errors and test failures. See [Testing](/docs/contributing/howtos#testing) for how to run tests. -- **Documentation:** If the pull request adds functionality, the docs should be updated as part of the same PR. -- **CI:** Reviewers will not review the code until all CI tests have passed. Sometimes there can be flaky tests. You can close and reopen the PR to re-run CI tests. Please report if the issue persists. After the CI fix has been deployed to `master`, please rebase your PR. -- **Code coverage:** Please ensure that code coverage does not decrease. -- Remove `[WIP]` when ready for review. Please note that it may be merged soon after approval, so please make sure the PR is ready to merge and do not expect more time for post-approval edits. -- If the PR was not ready for review and has been inactive for > 30 days, we will close it due to inactivity. The author is welcome to re-open and update it.
- -#### Reviewing - -- Use a constructive tone when writing reviews. -- If changes are required, state clearly what needs to be done before the PR can be approved. -- If you are asked to update your pull request with some changes, there's no need to create a new one. Push your changes to the same branch. -- The committers reserve the right to reject any PR and in some cases may request the author to file an issue. - -#### Test Environments - -- Members of the Apache GitHub org can launch an ephemeral test environment directly on a pull request by creating a comment containing (only) the command `/testenv up`. - - Note that org membership must be public in order for this validation to function properly. -- Feature flags may be set for a test environment by specifying the flag name (prefixed with `FEATURE_`) and value after the command. - - Format: `/testenv up FEATURE_=true|false` - - Example: `/testenv up FEATURE_DASHBOARD_NATIVE_FILTERS=true` - - Multiple feature flags may be set in a single command, separated by whitespace -- A comment will be created by the workflow script with the address and login information for the ephemeral environment. -- Test environments may be created once the Docker build CI workflow for the PR has completed successfully. -- Test environments do not currently update automatically when new commits are added to a pull request. -- Test environments do not currently support async workers, though this is planned. -- Running test environments will be shut down upon closing the pull request. - -#### Merging - -- At least one approval is required for merging a PR. -- A PR is usually left open for at least 24 hours before merging. - -- After the PR is merged, [close the corresponding issue(s)](https://help.github.com/articles/closing-issues-using-keywords/). - -#### Post-merge Responsibility - -- Project maintainers may contact the PR author if new issues are introduced by the PR.
-- Project maintainers may revert your changes if a critical issue is found, such as breaking master branch CI. - -## Managing Issues and PRs - -To handle issues and PRs that are coming in, committers read issues/PRs and flag them with labels to categorize and help contributors spot where to take actions, as contributors usually have different expertises. - -Triaging goals - -- **For issues:** Categorize, screen issues, flag required actions from authors. -- **For PRs:** Categorize, flag required actions from authors. If PR is ready for review, flag required actions from reviewers. - -First, add **Category labels (a.k.a. hash labels)**. Every issue/PR must have one hash label (except spam entry). Labels that begin with `#` defines issue/PR type: - -| Label | for Issue | for PR | -| --------------- | ----------------------------------------------------------------------------------------------------------------------- | ------------------------------------------------------------------------------------------------------------------------------------------------- | -| `#bug` | Bug report | Bug fix | -| `#code-quality` | Describe problem with code, architecture or productivity | Refactor, tests, tooling | -| `#feature` | New feature request | New feature implementation | -| `#refine` | Propose improvement such as adjusting padding or refining UI style, excluding new features, bug fixes, and refactoring. | Implementation of improvement such as adjusting padding or refining UI style, excluding new features, bug fixes, and refactoring. | -| `#doc` | Documentation | Documentation | -| `#question` | Troubleshooting: Installation, Running locally, Ask how to do something. Can be changed to `#bug` later. | N/A | -| `#SIP` | Superset Improvement Proposal | N/A | -| `#ASF` | Tasks related to Apache Software Foundation policy | Tasks related to Apache Software Foundation policy | - -Then add other types of labels as appropriate. - -- **Descriptive labels (a.k.a. 
dot labels):** These labels begin with `.` and describe details of the issue/PR, such as `.ui`, `.js`, `.install`, `.backend`, etc. Each issue/PR can have zero or more dot labels. -- **Need labels:** These labels follow the pattern `need:xxx` and describe the work required to progress, such as `need:rebase`, `need:update`, `need:screenshot`. -- **Risk labels:** These labels follow the pattern `risk:xxx` and describe the potential risk of adopting the work, such as `risk:db-migration`. The intention was to better understand the impact and create awareness for PRs that need more rigorous testing. -- **Status labels:** These labels describe the status (`abandoned`, `wontfix`, `cant-reproduce`, etc.). Issues/PRs that are rejected or closed without completion should have one or more status labels. -- **Version labels:** These have the pattern `vx.x` such as `v0.28`. Version labels on issues describe the version the bug was reported on. Version labels on PRs describe the first release that will include the PR. - -Committers may also update the title to reflect the issue/PR content if the author-provided title is not descriptive enough. - -If the PR passes CI tests and does not have any `need:` labels, it is ready for review; add the label `review` and/or `design-review`. - -If an issue/PR has been inactive for at least 30 days, it will be closed. If it does not have any status label, add `inactive`. - -When creating a PR, if you're aiming to have it included in a specific release, please tag it with the version label. For example, to have a PR considered for inclusion in Superset 1.1, use the label `v1.1`. - -## Revert Guidelines - -Reverting changes that are causing issues in the master branch is a normal and expected part of the development process. In an open source community, the ramifications of a change cannot always be fully understood.
With that in mind, here are some factors to weigh when deciding whether to revert: - -- **Availability of the PR author:** If the original PR author or the engineer who merged the code is highly available and can provide a fix in a reasonable time frame, this would counter-indicate reverting. -- **Severity of the issue:** How severe is the problem on master? Is it keeping the project from moving forward? Is there user impact? What percentage of users will experience a problem? -- **Size of the change being reverted:** Reverting a single small PR is a much lower-risk proposition than reverting a massive, multi-PR change. -- **Age of the change being reverted:** Reverting a recently-merged PR will be more acceptable than reverting an older PR. A bug discovered in an older PR is unlikely to be causing widespread serious issues. -- **Risk inherent in reverting:** Will the reversion break critical functionality? Is the medicine more dangerous than the disease? -- **Difficulty of crafting a fix:** In the case of issues with a clear solution, it may be preferable to implement and merge a fix rather than a revert. - -Should you decide that reverting is desirable, it is the responsibility of the Contributor performing the revert to: - -- **Contact the interested parties:** The PR's author and the engineer who merged the work should both be contacted and informed of the revert. - -- **Provide concise reproduction steps:** Ensure that the issue can be clearly understood and duplicated by the original author of the PR. -- **Put the revert through code review:** The revert must be approved by another committer. - -## Design Guidelines - -### Capitalization guidelines - -#### Sentence case - -Use sentence-case capitalization for everything in the UI (except these \*\*). - -Sentence case is predominantly lowercase.
Capitalize only the initial character of the first word, and other words that require capitalization, like: - -- **Proper nouns.** Objects in the product _are not_ considered proper nouns, e.g. dashboards, charts, saved queries, etc. Proprietary feature names, e.g. SQL Lab, Preset Manager, _are_ considered proper nouns -- **Acronyms** (e.g. CSS, HTML) -- When referring to **UI labels that are themselves capitalized** from sentence case (e.g. page titles - Dashboards page, Charts page, Saved queries page, etc.) -- User input that is reflected in the UI, e.g. a user-named dashboard tab - -**Sentence case vs. Title case:** -Title case: "A Dog Takes a Walk in Paris" -Sentence case: "A dog takes a walk in Paris" - -**Why sentence case?** - -- It’s generally accepted as the quickest to read -- It’s the easiest form to distinguish between common and proper nouns - -#### How to refer to UI elements - -When writing about a UI element, use the same capitalization as used in the UI. - -For example, if an input field is labeled “Name” then you refer to this as the “Name input field”. Similarly, if a button has the label “Save” in it, then it is correct to refer to the “Save button”. - -Where a product page is titled “Settings”, you refer to this in writing as follows: -“Edit your personal information on the Settings page”. - -Often a product page will have the same title as the objects it contains. In this case, refer to the page as it appears in the UI, and the objects as common nouns: - -- Upload a dashboard on the Dashboards page -- Go to Dashboards -- View dashboard -- View all dashboards -- Upload CSS templates on the CSS templates page -- Queries that you save will appear on the Saved queries page -- Create custom queries in SQL Lab then create dashboards - -#### \*\*Exceptions to sentence case - -- Input labels, buttons and UI tabs are all caps -- User input values (e.g.
column names, SQL Lab tab names) should be in their original case - -## Programming Language Conventions - -### Python - -Parameters in the `config.py` (which are accessible via the Flask app.config dictionary) are -assumed to always be defined and thus should be accessed directly via, - -```python -blueprints = app.config["BLUEPRINTS"] -``` - -rather than, - -```python -blueprints = app.config.get("BLUEPRINTS") -``` - -or similar, as the latter will cause typing issues. The former is of type `List[Callable]` -whereas the latter is of type `Optional[List[Callable]]`. - -#### Typing / Type Hints - -To ensure clarity, consistency, and readability, _all_ new functions should use -[type hints](https://docs.python.org/3/library/typing.html) and include a -docstring. - -Note per [PEP-484](https://www.python.org/dev/peps/pep-0484/#exceptions) no -syntax for listing explicitly raised exceptions is proposed and thus the -recommendation is to put this information in a docstring, i.e., - -```python -import math -from typing import Union - - -def sqrt(x: Union[float, int]) -> Union[float, int]: - """ - Return the square root of x. - - :param x: A number - :returns: The square root of the given number - :raises ValueError: If the number is negative - """ - - return math.sqrt(x) -``` - -### TypeScript - -TypeScript is fully supported and is the recommended language for writing all new frontend -components. When modifying existing functions/components, migrating to TypeScript is -appreciated, but not required. Examples of migrating functions/components to TypeScript can be -found in [#9162](https://github.com/apache/superset/pull/9162) and [#9180](https://github.com/apache/superset/pull/9180).
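As a small illustration of what such a migration buys you (the names below are invented for this doc and do not exist in the Superset codebase), a plain JS helper gains an explicit data shape plus parameter and return types:

```typescript
// Hypothetical example only -- these names are not from the Superset codebase.
// An interface documents the shape of the data a helper or component receives.
interface ChartDataPoint {
  label: string;
  value: number;
}

// Explicit parameter and return types let the compiler catch a misspelled
// property or a wrongly-typed argument at build time instead of at runtime.
function totalValue(points: ChartDataPoint[]): number {
  return points.reduce((sum, point) => sum + point.value, 0);
}
```

Calling `totalValue([{ label: 'a', value: 2 }])` type-checks, while passing, say, an array of strings is rejected by the compiler rather than failing silently in the browser.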
diff --git a/docs/docs/contributing/howtos.mdx b/docs/docs/contributing/howtos.mdx deleted file mode 100644 index 37eb61ad817..00000000000 --- a/docs/docs/contributing/howtos.mdx +++ /dev/null @@ -1,720 +0,0 @@ ---- -title: Development How-tos -hide_title: true -sidebar_position: 4 -version: 1 ---- -# Development How-tos - -## Contributing to Documentation - -The latest documentation and tutorial are available at https://superset.apache.org/. - -The documentation site is built using [Docusaurus 3](https://docusaurus.io/), a modern -static website generator, the source for which resides in `./docs`. - -### Local Development - -To set up a local development environment with hot reloading for the documentation site: - -```shell -cd docs -yarn install # Installs NPM dependencies -yarn start # Starts development server at http://localhost:3000 -``` - -### Build - -To create and serve a production build of the documentation site: - -```shell -yarn build -yarn serve -``` - -### Deployment - -Commits to `master` trigger a rebuild and redeploy of the documentation site. Submit pull requests that modify the documentation with the `docs:` prefix. - -## Creating Visualization Plugins - -Visualizations in Superset are implemented in JavaScript or TypeScript. Superset -comes preinstalled with several visualization types (hereafter "viz plugins") that -can be found under the `superset-frontend/plugins` directory. Viz plugins are added to -the application in the `superset-frontend/src/visualizations/presets/MainPreset.js`. -The Superset project is always happy to review proposals for new high-quality viz -plugins. However, for highly custom viz types it is recommended to maintain a fork -of Superset, and add the custom-built viz plugins by hand.
- -**Note:** Additional community-generated resources about creating and deploying custom visualization plugins can be found on the [Superset Wiki](https://github.com/apache/superset/wiki/Community-Resource-Library#creating-custom-data-visualizations) - -### Prerequisites - -In order to create a new viz plugin, you need the following: - -- Run MacOS or Linux (Windows is not officially supported, but may work) -- Node.js 16 -- npm 7 or 8 - -A general familiarity with [React](https://reactjs.org/) and the npm/Node system is -also recommended. - -### Creating a simple Hello World viz plugin - -To get started, you need the Superset Yeoman Generator. It is recommended to use the -version of the template that ships with the version of Superset you are using. This -can be installed by doing the following: - -```bash -npm i -g yo -cd superset-frontend/packages/generator-superset -npm i -npm link -``` - -After this you can proceed to create your viz plugin. Create a new directory for your -viz plugin with the prefix `superset-plugin-chart` and run the Yeoman generator: - -```bash -mkdir /tmp/superset-plugin-chart-hello-world -cd /tmp/superset-plugin-chart-hello-world -``` - -Initialize the viz plugin: - -```bash -yo @superset-ui/superset -``` - -After that the generator will ask a few questions (the defaults should be fine): - -```bash -$ yo @superset-ui/superset - _-----_ ╭──────────────────────────╮ - | | │ Welcome to the │ - |--(o)--| │ generator-superset │ - `---------´ │ generator! │ - ( _´U`_ ) ╰──────────────────────────╯ - /___A___\ / - | ~ | - __'.___.'__ - ´ ` |° ´ Y ` -? Package name: superset-plugin-chart-hello-world -? Description: Hello World -? What type of chart would you like? 
Time-series chart - create package.json - create .gitignore - create babel.config.js - create jest.config.js - create README.md - create tsconfig.json - create src/index.ts - create src/plugin/buildQuery.ts - create src/plugin/controlPanel.ts - create src/plugin/index.ts - create src/plugin/transformProps.ts - create src/types.ts - create src/SupersetPluginChartHelloWorld.tsx - create test/index.test.ts - create test/__mocks__/mockExportString.js - create test/plugin/buildQuery.test.ts - create test/plugin/transformProps.test.ts - create types/external.d.ts - create src/images/thumbnail.png -``` - -To build the viz plugin, run the following commands: - -```bash -npm i --force -npm run build -``` - -Alternatively, to run the viz plugin in development mode (i.e. rebuilding whenever changes -are made), start the dev server with the following command: - -```bash -npm run dev -``` - -To add the package to Superset, go to the `superset-frontend` subdirectory in your -Superset source folder and run - -```bash -npm i -S /tmp/superset-plugin-chart-hello-world -``` - -If you publish your package to npm, you can naturally install directly from there, too. - -After this, edit the `superset-frontend/src/visualizations/presets/MainPreset.js` -and make the following changes: - -```js -import { SupersetPluginChartHelloWorld } from 'superset-plugin-chart-hello-world'; -``` - -to import the viz plugin, and later add the following to the array that's passed to the -`plugins` property: - -```js -new SupersetPluginChartHelloWorld().configure({ key: 'ext-hello-world' }), -``` - -After that the viz plugin should show up when you run Superset, e.g. the development -server: - -```bash -npm run dev-server -``` - -#### Deploying your visualization plugin - -Once your plugin is complete, you will need to deploy it to your Superset instance.
- -This step assumes you are running your own Docker image as described [here](https://superset.apache.org/docs/installation/docker-builds/#building-your-own-production-docker-image). -Instructions may vary for other kinds of deployments. - -If you have your own Superset Docker image, the first line is most likely: -`FROM apache/superset:latest` or something similar. You will need to compile -your own `"lean"` image and replace this FROM line with your own image. - -1. Publish your chart plugin to npm: it makes the build process simpler. - -Note: if your chart is not published to npm, then in the docker build below, you will need -to edit the default Dockerfile to copy your plugin source code to the appropriate -location in the container build environment. - -2. Install your chart in the frontend with `npm i `. -3. Start with a base Superset release. - -```bash -git checkout tags/X.0.0 -``` - -4. Install your chart with the instructions you followed during development. -5. Navigate to the root of your Superset directory. -6. Run `docker build -t apache/superset:mychart --target lean .` -7. Rebuild your production container using `FROM apache/superset:mychart`. - -This will create a new productized Superset container with your new chart compiled in. -You can then recreate your custom production container based on a Superset build that includes your chart. - -##### Troubleshooting - - -- If you get the following NPM error: - -``` -npm error `npm ci` can only install packages when your package.json and package-lock.json -``` - -It's because your local nodejs/npm version is different from the one being used inside Docker. - -You can resolve this by running npm install with the same version used by the container build process. - -Replace XYZ in the following command with the node tag used in the Dockerfile (search for "node:" in the Dockerfile to find the tag).
```bash -docker run --rm -v $PWD/superset-frontend:/app node:XYZ /bin/bash -c "cd /app && npm i" -``` - -## Testing - -### Python Testing - -`pytest`, backed by docker-compose, is how we recommend running tests locally. - -For a more complex test matrix (against different database backends, Python versions, ...) you -can rely on our GitHub Actions by simply opening a draft pull request. - -Note that the test environment uses a temporary directory for defining the -SQLite databases which will be cleared each time before the group of test -commands are invoked. - -There is also a utility script included in the Superset codebase to run python integration tests. The [readme can be -found here](https://github.com/apache/superset/tree/master/scripts/tests) - -To run all integration tests for example, run this script from the root directory: - -```bash -scripts/tests/run.sh -``` - -You can run unit tests found in './tests/unit_tests' for example with pytest. It is a simple way to run an isolated test that doesn't need any database setup: - -```bash -pytest ./link_to_test.py -``` - -#### Testing with local Presto connections - -If you happen to change the db engine spec for Presto/Trino, you can run a local Presto cluster with Docker: - -```bash -docker run -p 15433:15433 starburstdata/presto:350-e.6 -``` - -Then update `SUPERSET__SQLALCHEMY_EXAMPLES_URI` to point to the local Presto cluster: - -```bash -export SUPERSET__SQLALCHEMY_EXAMPLES_URI=presto://localhost:15433/memory/default -``` - -### Frontend Testing - -We use [Jest](https://jestjs.io/) and [Enzyme](https://airbnb.io/enzyme/) to test TypeScript/JavaScript. Tests can be run with: - -```bash -cd superset-frontend -npm run test -``` - -To run a single test file: - -```bash -npm run test -- path/to/file.js -``` - -### E2E Integration Testing - -**Note: We are migrating from Cypress to Playwright.
Use Playwright for new tests.** - -#### Playwright (Recommended - NEW) - -For E2E testing with Playwright, use the same `docker compose` backend: - -```bash -CYPRESS_CONFIG=true docker compose up --build -``` - -The backend setup is identical - this exposes a test-ready Superset app on port 8081 with isolated database schema (`superset_cypress`), test data, and configurations. - -Now in another terminal, run Playwright tests: - -```bash -# Navigate to frontend directory (Playwright config is here) -cd superset-frontend - -# Run all Playwright tests -npm run playwright:test -# or: npx playwright test - -# Run with interactive UI for debugging -npm run playwright:ui -# or: npx playwright test --ui - -# Run in headed mode (see browser) -npm run playwright:headed -# or: npx playwright test --headed - -# Run specific test file -npx playwright test tests/auth/login.spec.ts - -# Run with debug mode (step through tests) -npm run playwright:debug tests/auth/login.spec.ts -# or: npx playwright test --debug tests/auth/login.spec.ts - -# Generate test report -npx playwright show-report -``` - -Configuration is in `superset-frontend/playwright.config.ts`. Base URL is automatically set to `http://localhost:8088` but will use `PLAYWRIGHT_BASE_URL` if provided. - -#### Cypress (DEPRECATED - will be removed in Phase 5) - -:::warning -Cypress is being phased out in favor of Playwright. Use Playwright for all new tests. 
-::: - -```bash -# Set base URL for Cypress -CYPRESS_BASE_URL=http://localhost:8081 -``` - -```bash -# superset-frontend/cypress-base is the base folder for everything Cypress-related -# It's essentially its own npm app, with its own dependencies, configurations and utilities -cd superset-frontend/cypress-base -npm install - -# use interactive mode to run tests, while keeping memory usage contained -# this will fire up an interactive Cypress UI -# as you alter the code, the tests will re-run automatically, and you can visualize each -# and every step for debugging purposes -npx cypress open --config numTestsKeptInMemory=5 - -# to run the test suite on the command line using chrome (same as CI) -npm run cypress-run-chrome - -# run tests from a specific file -npm run cypress-run-chrome -- --spec cypress/e2e/explore/link.test.ts - -# run specific file with video capture -npm run cypress-run-chrome -- --spec cypress/e2e/dashboard/index.test.js --config video=true - -# to open the cypress ui -npm run cypress-debug - -``` - -See [`superset-frontend/cypress_build.sh`](https://github.com/apache/superset/blob/master/superset-frontend/cypress_build.sh). - -As an alternative you can use docker compose environment for testing: - -Make sure you have added below line to your /etc/hosts file: -`127.0.0.1 db` - -If you already have launched Docker environment please use the following command to assure a fresh database instance: -`docker compose down -v` - -Launch environment: - -`CYPRESS_CONFIG=true docker compose up` - -It will serve backend and frontend on port 8088. - -Run Cypress tests: - -```bash -cd cypress-base -npm install -npm run cypress open -``` - -### Debugging Server App - -Follow these instructions to debug the Flask app running inside a docker container. 
- -First add the following to the ./docker-compose.yaml file - -```diff -superset: - env_file: docker/.env - image: *superset-image - container_name: superset_app - command: ["/app/docker/docker-bootstrap.sh", "app"] - restart: unless-stopped -+ cap_add: -+ - SYS_PTRACE - ports: - - 8088:8088 -+ - 5678:5678 - user: "root" - depends_on: *superset-depends-on - volumes: *superset-volumes - environment: - CYPRESS_CONFIG: "${CYPRESS_CONFIG}" -``` - -Start Superset as usual - -```bash -docker compose up -``` - -Install the required libraries and packages to the docker container - -Enter the superset_app container - -```bash -docker exec -it superset_app /bin/bash -root@39ce8cf9d6ab:/app# -``` - -Run the following commands inside the container - -```bash -apt update -apt install -y gdb -apt install -y net-tools -pip install debugpy -``` - -Find the PID for the Flask process. Make sure to use the first PID. The Flask app will re-spawn a sub-process every time you change any of the python code. So it's important to use the first PID. - -```bash -ps -ef - -UID PID PPID C STIME TTY TIME CMD -root 1 0 0 14:09 ? 00:00:00 bash /app/docker/docker-bootstrap.sh app -root 6 1 4 14:09 ? 00:00:04 /usr/local/bin/python /usr/bin/flask run -p 8088 --with-threads --reload --debugger --host=0.0.0.0 -root 10 6 7 14:09 ? 00:00:07 /usr/local/bin/python /usr/bin/flask run -p 8088 --with-threads --reload --debugger --host=0.0.0.0 -``` - -Inject debugpy into the running Flask process. In this case PID 6. - -```bash -python3 -m debugpy --listen 0.0.0.0:5678 --pid 6 -``` - -Verify that debugpy is listening on port 5678 - -```bash -netstat -tunap - -Active Internet connections (servers and established) -Proto Recv-Q Send-Q Local Address Foreign Address State PID/Program name -tcp 0 0 0.0.0.0:5678 0.0.0.0:* LISTEN 462/python -tcp 0 0 0.0.0.0:8088 0.0.0.0:* LISTEN 6/python -``` - -You are now ready to attach a debugger to the process. 
Using VSCode, you can configure a launch configuration file `.vscode/launch.json` like so.
-
-```json
-{
-  "version": "0.2.0",
-  "configurations": [
-    {
-      "name": "Attach to Superset App in Docker Container",
-      "type": "python",
-      "request": "attach",
-      "connect": {
-        "host": "127.0.0.1",
-        "port": 5678
-      },
-      "pathMappings": [
-        {
-          "localRoot": "${workspaceFolder}",
-          "remoteRoot": "/app"
-        }
-      ]
-    }
-  ]
-}
-```
-
-VSCode will not stop on breakpoints right away. We've attached to PID 6; however, it does not yet know of any sub-processes. To "wake up" the debugger, you need to modify a Python file. This will trigger Flask to reload the code and create a new sub-process. This new sub-process will be detected by VSCode, and breakpoints will be activated.
-
-### Debugging Server App in Kubernetes Environment
-
-To debug Flask running in a pod inside a Kubernetes cluster, you'll need to make sure the pod runs as root and is granted the `SYS_PTRACE` capability. These settings should not be used in production environments.
-
-```yaml
-  securityContext:
-    capabilities:
-      add: ["SYS_PTRACE"]
-```
-
-See [set capabilities for a container](https://kubernetes.io/docs/tasks/configure-pod-container/security-context/#set-capabilities-for-a-container) for more details.
-
-Once the pod is running as root and has the `SYS_PTRACE` capability, it will be able to debug the Flask app.
-
-You can follow the same instructions as in `docker compose`. Enter the pod and install the required libraries and packages: gdb, net-tools, and debugpy.
-
-Often in a Kubernetes environment, nodes are not addressable from outside the cluster, so VSCode will be unable to connect remotely to port 5678 on a Kubernetes node. To work around this, create a tunnel that port-forwards 5678 to your local machine:
-
-```bash
-kubectl port-forward pod/superset- 5678:5678
-```
-
-You can now launch your VSCode debugger with the same config as above.
VSCode will connect to 127.0.0.1:5678, which is forwarded by kubectl to your remote Kubernetes pod.
-
-### Storybook
-
-Superset includes a [Storybook](https://storybook.js.org/) to preview the layout/styling of various Superset components, and variations thereof. To open and view the Storybook:
-
-```bash
-cd superset-frontend
-npm run storybook
-```
-
-When contributing new React components to Superset, please try to add a Story alongside the component's `jsx/tsx` file.
-
-## Contributing Translations
-
-We use [Flask-Babel](https://python-babel.github.io/flask-babel/) to translate Superset.
-In Python files, we use the following
-[translation functions](https://python-babel.github.io/flask-babel/#using-translations)
-from `Flask-Babel`:
-
-- `gettext` and `lazy_gettext` (usually aliased to `_`): for translating singular
-  strings.
-- `ngettext`: for translating strings that might become plural.
-
-```python
-from flask_babel import lazy_gettext as _
-```
-
-Then wrap the translatable strings with it, e.g. `_('Translate me')`.
-During extraction, string literals passed to `_` will be added to the
-generated `.po` file for each language for later translation.
-
-At runtime, the `_` function will return the translation of the given
-string for the current language, or the given string itself
-if no translation is available.
-
-In TypeScript/JavaScript, the technique is similar: we import `t` (simple translation) and `tn` (translation containing a number).
-
-```javascript
-import { t, tn } from "@superset-ui/translation";
-```
-
-### Enabling language selection
-
-Add the `LANGUAGES` variable to your `superset_config.py`. Having more than one
-option inside will add a language selection dropdown to the UI on the right side
-of the navigation bar.
-
-```python
-LANGUAGES = {
-    'en': {'flag': 'us', 'name': 'English'},
-    'fr': {'flag': 'fr', 'name': 'French'},
-    'zh': {'flag': 'cn', 'name': 'Chinese'},
-}
-```
-
-### Creating a new language dictionary
-
-First, check whether the
-[two-letter ISO 639-1 code](https://en.wikipedia.org/wiki/List_of_ISO_639-1_codes)
-for your target language already exists in the `superset/translations` directory:
-
-```bash
-ls superset/translations | grep -E "^[a-z]{2}\/"
-```
-
-If your language already has a translation, skip to the next section.
-
-Flask AppBuilder already ships translations for several languages, which will make it
-easier to translate the application to your target language; see the
-[Flask AppBuilder i18n documentation](https://flask-appbuilder.readthedocs.io/en/latest/i18n.html).
-
-To create a dictionary for a new language, first make sure the necessary dependencies are installed:
-
-```bash
-pip install -r superset/translations/requirements.txt
-```
-
-Then run the following, where `LANGUAGE_CODE` is replaced with the language code for your target
-language:
-
-```bash
-pybabel init -i superset/translations/messages.pot -d superset/translations -l LANGUAGE_CODE
-```
-
-For instance, to add a translation for Finnish (language code `fi`), run the following:
-
-```bash
-pybabel init -i superset/translations/messages.pot -d superset/translations -l fi
-```
-
-### Extracting new strings for translation
-
-Periodically, when working on translations, we need to extract the strings from both the
-backend and the frontend to compile a list of all strings to be translated. It doesn't
-happen automatically and is a required step to gather the strings and get them into the
-`.po` files where they can be translated, so that they can then be compiled.
-
-This script does just that:
-
-```bash
-./scripts/translations/babel_update.sh
-```
-
-### Updating language files
-
-Run the following command to update the language files with the newly extracted strings.
-
-```bash
-pybabel update -i superset/translations/messages.pot -d superset/translations --ignore-obsolete
-```
-
-You can then translate the strings gathered in files located under
-`superset/translations`, where there's one folder per language. You can use [Poedit](https://poedit.net/features)
-to translate the `po` file more conveniently.
-Here is [a tutorial](https://web.archive.org/web/20220517065036/https://wiki.lxde.org/en/Translate_*.po_files_with_Poedit).
-
-To perform the translation on macOS, you can install `poedit` via Homebrew:
-
-```bash
-brew install poedit
-```
-
-After this, just start the `poedit` application and open the `messages.po` file. In the
-case of the Finnish translation, this would be `superset/translations/fi/LC_MESSAGES/messages.po`.
-
-### Applying translations
-
-To make the translations available on the frontend, we need to convert the PO files into
-a collection of JSON files. To convert all PO files to formatted JSON files, you can use
-the `build-translation` script:
-
-```bash
-# Install dependencies if you haven't already
-cd superset-frontend/ && npm ci
-# Compile translations for the frontend
-npm run build-translation
-```
-
-Finally, for the translations to take effect, we need to compile the translation catalogs into
-binary MO files for the backend using `pybabel`.
-
-```bash
-# inside the project root
-pybabel compile -d superset/translations
-```
-
-## Linting
-
-### Python
-
-We use [ruff](https://github.com/astral-sh/ruff) for linting, which can be invoked via:
-
-```bash
-# auto-reformat using ruff
-ruff format
-
-# lint check with ruff
-ruff check
-
-# lint fix with ruff
-ruff check --fix
-```
-
-Ruff configuration is located in our
-[pyproject.toml](https://github.com/apache/superset/blob/master/pyproject.toml) file.
-
-All this is configured to run in pre-commit hooks, which we encourage you to set up
-with `pre-commit install`.
-
-### TypeScript
-
-```bash
-cd superset-frontend
-npm ci
-# run linting checks
-npm run lint
-# run tsc (typescript) checks
-npm run type
-```
-
-If using the eslint extension with VSCode, put the following in your workspace `settings.json` file:
-
-```json
-"eslint.workingDirectories": [
-  "superset-frontend"
-]
-```
-
-## GitHub Ephemeral Environments
-
-On any given pull request on GitHub, it's possible to create a temporary environment/deployment
-by simply adding the label `testenv-up` to the PR. Once you add the `testenv-up` label, a
-GitHub Action will be triggered that will:
-
-- build a Docker image
-- deploy it on EC2 (sponsored by the folks at [Preset](https://preset.io))
-- write a comment on the PR with a link to the ephemeral environment
-
-For more advanced use cases, it's possible to set a feature flag on the PR body, which will
-take effect on the ephemeral environment.
For example, if you want to set the `TAGGING_SYSTEM` -feature flag to `true`, you can add the following line to the PR body/description: - -``` -FEATURE_TAGGING_SYSTEM=true -``` - -Similarly, it's possible to disable feature flags with: - -``` -FEATURE_TAGGING_SYSTEM=false -``` diff --git a/docs/docs/contributing/misc.mdx b/docs/docs/contributing/misc.mdx deleted file mode 100644 index 6ec8a63c505..00000000000 --- a/docs/docs/contributing/misc.mdx +++ /dev/null @@ -1,55 +0,0 @@ ---- -sidebar_position: 6 -version: 1 ---- - -# Miscellaneous - -## Reporting a Security Vulnerability - -Please report security vulnerabilities to private@superset.apache.org. - -In the event a community member discovers a security flaw in Superset, it is important to follow the [Apache Security Guidelines](https://www.apache.org/security/committers.html) and release a fix as quickly as possible before public disclosure. Reporting security vulnerabilities through the usual GitHub Issues channel is not ideal as it will publicize the flaw before a fix can be applied. - -## SQL Lab Async - -It's possible to configure a local database to operate in `async` mode, -to work on `async` related features. - -To do this, you'll need to: - -- Add an additional database entry. We recommend you copy the connection - string from the database labeled `main`, and then enable `SQL Lab` and the - features you want to use. 
Don't forget to check the `Async` box -- Configure a results backend, here's a local `FileSystemCache` example, - not recommended for production, - but perfect for testing (stores cache in `/tmp`) - - ```python - from flask_caching.backends.filesystemcache import FileSystemCache - RESULTS_BACKEND = FileSystemCache('/tmp/sqllab') - ``` - -- Start up a celery worker - - ```shell script - celery --app=superset.tasks.celery_app:app worker -O fair - ``` - -Note that: - -- for changes that affect the worker logic, you'll have to - restart the `celery worker` process for the changes to be reflected. -- The message queue used is a `sqlite` database using the `SQLAlchemy` - experimental broker. Ok for testing, but not recommended in production -- In some cases, you may want to create a context that is more aligned - to your production environment, and use the similar broker as well as - results backend configuration - -## Async Chart Queries - -It's possible to configure database queries for charts to operate in `async` mode. This is especially useful for dashboards with many charts that may otherwise be affected by browser connection limits. To enable async queries for dashboards and Explore, the following dependencies are required: - -- Redis 5.0+ (the feature utilizes [Redis Streams](https://redis.io/topics/streams-intro)) -- Cache backends enabled via the `CACHE_CONFIG` and `DATA_CACHE_CONFIG` config settings -- Celery workers configured and running to process async tasks diff --git a/docs/docs/contributing/resources.mdx b/docs/docs/contributing/resources.mdx deleted file mode 100644 index 8de3bb9c8c4..00000000000 --- a/docs/docs/contributing/resources.mdx +++ /dev/null @@ -1,104 +0,0 @@ ---- -sidebar_position: 5 -version: 1 ---- - -import InteractiveSVG from '../../src/components/InteractiveERDSVG'; -import Mermaid from '@theme/Mermaid'; - -# Resources - -## High Level Architecture -
-```mermaid -flowchart TD - - %% Top Level - LB["Load Balancer(s)
(optional)"] - LB -.-> WebServers - - %% Web Servers - subgraph WebServers ["Web Server(s)"] - WS1["Frontend
(React, AntD, ECharts, AGGrid)"] - WS2["Backend
(Python, Flask, SQLAlchemy, Pandas, ...)"] - end - - %% Infra - subgraph InfraServices ["Infra"] - DB[("Metadata Database
(Postgres / MySQL)")] - - subgraph Caching ["Caching Subservices
(Redis, memcache, S3, ...)"] - direction LR - DummySpace[" "]:::invisible - QueryCache["Query Results Cache
(Accelerated Dashboards)"] - CsvCache["CSV Exports Cache"] - ThumbnailCache["Thumbnails Cache"] - AlertImageCache["Alert/Report Images Cache"] - QueryCache -- " " --> CsvCache - linkStyle 1 stroke:transparent; - ThumbnailCache -- " " --> AlertImageCache - linkStyle 2 stroke:transparent; - end - - Broker(("Message Queue
(Redis / RabbitMQ / SQS)")) - end - - AsyncBackend["Async Workers (Celery)
required for Alerts & Reports, thumbnails, CSV exports, long-running workloads, ..."] - - %% External DBs - subgraph ExternalDatabases ["Analytics Databases"] - direction LR - BigQuery[(BigQuery)] - Snowflake[(Snowflake)] - Redshift[(Redshift)] - Postgres[(Postgres)] - Postgres[(... any ...)] - end - - %% Connections - LB -.-> WebServers - WebServers --> DB - WebServers -.-> Caching - WebServers -.-> Broker - WebServers -.-> ExternalDatabases - - Broker -.-> AsyncBackend - - AsyncBackend -.-> ExternalDatabases - AsyncBackend -.-> Caching - - - - %% Legend styling - classDef requiredNode stroke-width:2px,stroke:black; - class Required requiredNode; - class Optional optionalNode; - - %% Hide real arrow - linkStyle 0 stroke:transparent; - - %% Styling - classDef optionalNode stroke-dasharray: 5 5, opacity:0.9; - class LB optionalNode; - class Caching optionalNode; - class AsyncBackend optionalNode; - class Broker optionalNode; - class QueryCache optionalNode; - class CsvCache optionalNode; - class ThumbnailCache optionalNode; - class AlertImageCache optionalNode; - class Celery optionalNode; - - classDef invisible fill:transparent,stroke:transparent; -``` -
- -## Entity-Relationship Diagram - -Here is our interactive ERD: - - - -
- -[Download the .svg](https://github.com/apache/superset/tree/master/docs/static/img/erd.svg) diff --git a/docs/docs/faq.mdx b/docs/docs/faq.mdx index d38c39eadad..e98cd70edc1 100644 --- a/docs/docs/faq.mdx +++ b/docs/docs/faq.mdx @@ -107,11 +107,11 @@ multiple tables as long as your database account has access to the tables. ## How do I create my own visualization? We recommend reading the instructions in -[Creating Visualization Plugins](/docs/contributing/howtos#creating-visualization-plugins). +[Creating Visualization Plugins](/developer-docs/contributing/howtos#creating-visualization-plugins). ## Can I upload and visualize CSV data? -Absolutely! Read the instructions [here](/docs/using-superset/exploring-data) to learn +Absolutely! Read the instructions [here](/user-docs/using-superset/exploring-data) to learn how to enable and use CSV upload. ## Why are my queries timing out? @@ -198,7 +198,7 @@ SQLALCHEMY_DATABASE_URI = 'sqlite:////new/location/superset.db?check_same_thread ``` You can read more about customizing Superset using the configuration file -[here](/docs/configuration/configuring-superset). +[here](/admin-docs/configuration/configuring-superset). ## What if the table schema changed? @@ -213,7 +213,7 @@ table afterwards to configure the Columns tab, check the appropriate boxes and s To clarify, the database backend is an OLTP database used by Superset to store its internal information like your list of users and dashboard definitions. While Superset supports a -[variety of databases as data _sources_](/docs/databases#installing-database-drivers), +[variety of databases as data _sources_](/user-docs/databases/#installing-database-drivers), only a few database engines are supported for use as the OLTP backend / metadata store. Superset is tested using MySQL, PostgreSQL, and SQLite backends. It’s recommended you install @@ -246,7 +246,7 @@ second etc). Example: ## Does Superset work with [insert database engine here]? 
-The [Connecting to Databases section](/docs/databases) provides the best +The [Connecting to Databases section](/user-docs/databases/) provides the best overview for supported databases. Database engines not listed on that page may work too. We rely on the community to contribute to this knowledge base. @@ -282,7 +282,7 @@ are typical in basic SQL: ## Does Superset offer a public API? Yes, a public REST API, and the surface of that API formal is expanding steadily. You can read more about this API and -interact with it using Swagger [here](/docs/api). +interact with it using Swagger [here](/developer-docs/api). Some of the original vision for the collection of endpoints under **/api/v1** was originally specified in @@ -322,7 +322,7 @@ Superset uses [Scarf](https://about.scarf.sh/) by default to collect basic telem We use the [Scarf Gateway](https://docs.scarf.sh/gateway/) to sit in front of container registries, the [scarf-js](https://about.scarf.sh/package-sdks) package to track `npm` installations, and a Scarf pixel to gather anonymous analytics on Superset page views. Scarf purges PII and provides aggregated statistics. Superset users can easily opt out of analytics in various ways documented [here](https://docs.scarf.sh/gateway/#do-not-track) and [here](https://docs.scarf.sh/package-analytics/#as-a-user-of-a-package-using-scarf-js-how-can-i-opt-out-of-analytics). Superset maintainers can also opt out of telemetry data collection by setting the `SCARF_ANALYTICS` environment variable to `false` in the Superset container (or anywhere Superset/webpack are run). -Additional opt-out instructions for Docker users are available on the [Docker Installation](/docs/installation/docker-compose) page. +Additional opt-out instructions for Docker users are available on the [Docker Installation](/admin-docs/installation/docker-compose) page. ## Does Superset have an archive panel or trash bin from which a user can recover deleted assets? 
diff --git a/docs/docs/index.mdx b/docs/docs/index.mdx new file mode 100644 index 00000000000..681acd6d2d5 --- /dev/null +++ b/docs/docs/index.mdx @@ -0,0 +1,269 @@ +--- +hide_title: true +sidebar_position: 1 +--- + +import DatabaseLogoWall from '@site/src/components/databases/DatabaseLogoWall'; + + + +# Superset + +[![License](https://img.shields.io/badge/License-Apache%202.0-blue.svg)](https://opensource.org/license/apache-2-0) +[![Latest Release on Github](https://img.shields.io/github/v/release/apache/superset?sort=semver)](https://github.com/apache/superset/releases/latest) +[![Build Status](https://github.com/apache/superset/actions/workflows/superset-python-unittest.yml/badge.svg)](https://github.com/apache/superset/actions) +[![PyPI version](https://badge.fury.io/py/apache_superset.svg)](https://badge.fury.io/py/apache_superset) +[![PyPI](https://img.shields.io/pypi/pyversions/apache_superset.svg?maxAge=2592000)](https://pypi.python.org/pypi/apache_superset) +[![GitHub Stars](https://img.shields.io/github/stars/apache/superset?style=social)](https://github.com/apache/superset/stargazers) +[![Contributors](https://img.shields.io/github/contributors/apache/superset)](https://github.com/apache/superset/graphs/contributors) +[![Last Commit](https://img.shields.io/github/last-commit/apache/superset)](https://github.com/apache/superset/commits/master) +[![Open Issues](https://img.shields.io/github/issues/apache/superset)](https://github.com/apache/superset/issues) +[![Open PRs](https://img.shields.io/github/issues-pr/apache/superset)](https://github.com/apache/superset/pulls) +[![Get on Slack](https://img.shields.io/badge/slack-join-orange.svg)](http://bit.ly/join-superset-slack) +[![Documentation](https://img.shields.io/badge/docs-apache.org-blue.svg)](https://superset.apache.org) + + + + Superset logo (light) + + +A modern, enterprise-ready business intelligence web application. 
+ +[**Why Superset?**](#why-superset) | +[**Supported Databases**](#supported-databases) | +[**Installation and Configuration**](#installation-and-configuration) | +[**Release Notes**](https://github.com/apache/superset/blob/master/RELEASING/README.md#release-notes-for-recent-releases) | +[**Get Involved**](#get-involved) | +[**Contributor Guide**](#contributor-guide) | +[**Resources**](#resources) | +[**Organizations Using Superset**](https://superset.apache.org/inTheWild) + +## Why Superset? + +Superset is a modern data exploration and data visualization platform. Superset can replace or augment proprietary business intelligence tools for many teams. Superset integrates well with a variety of data sources. + +Superset provides: + +- A **no-code interface** for building charts quickly +- A powerful, web-based **SQL Editor** for advanced querying +- A **lightweight semantic layer** for quickly defining custom dimensions and metrics +- Out of the box support for **nearly any SQL** database or data engine +- A wide array of **beautiful visualizations** to showcase your data, ranging from simple bar charts to geospatial visualizations +- Lightweight, configurable **caching layer** to help ease database load +- Highly extensible **security roles and authentication** options +- An **API** for programmatic customization +- A **cloud-native architecture** designed from the ground up for scale + +## Screenshots & Gifs + +**Video Overview** + + + +[superset-video-1080p.webm](https://github.com/user-attachments/assets/b37388f7-a971-409c-96a7-90c4e31322e6) + +
+ +**Large Gallery of Visualizations** + +
+ +**Craft Beautiful, Dynamic Dashboards** + +
+ +**No-Code Chart Builder** + +
+ +**Powerful SQL Editor** + +
+ +## Supported Databases + +Superset can query data from any SQL-speaking datastore or data engine (Presto, Trino, Athena, [and more](/user-docs/databases/)) that has a Python DB-API driver and a SQLAlchemy dialect. + +Here are some of the major database solutions that are supported: + + + + + + + +**A more comprehensive list of supported databases** along with the configuration instructions can be found [here](/user-docs/databases/). + +Want to add support for your datastore or data engine? Read more [here](/user-docs/faq#does-superset-work-with-insert-database-engine-here) about the technical requirements. + +## Installation and Configuration + +Try out Superset's [quickstart](/user-docs/quickstart) guide or learn about [the options for production deployments](/admin-docs/installation/installation-methods). + +## Get Involved + +- Ask and answer questions on [StackOverflow](https://stackoverflow.com/questions/tagged/apache-superset) using the **apache-superset** tag +- [Join our community's Slack](http://bit.ly/join-superset-slack) + and please read our [Slack Community Guidelines](https://github.com/apache/superset/blob/master/CODE_OF_CONDUCT.md#slack-community-guidelines) +- [Join our dev@superset.apache.org Mailing list](https://lists.apache.org/list.html?dev@superset.apache.org). To join, simply send an email to [dev-subscribe@superset.apache.org](mailto:dev-subscribe@superset.apache.org) +- If you want to help troubleshoot GitHub Issues involving the numerous database drivers that Superset supports, please consider adding your name and the databases you have access to on the [Superset Database Familiarity Rolodex](https://docs.google.com/spreadsheets/d/1U1qxiLvOX0kBTUGME1AHHi6Ywel6ECF8xk_Qy-V9R8c/edit#gid=0) +- Join Superset's Town Hall and [Operational Model](https://preset.io/blog/the-superset-operational-model-wants-you/) recurring meetings. 
Meeting info is available on the [Superset Community Calendar](https://superset.apache.org/community) + +## Contributor Guide + +Interested in contributing? Check out our +[Developer Docs](https://superset.apache.org/developer-docs/) +to find resources around contributing along with a detailed guide on +how to set up a development environment. + +## Resources + +- [Superset "In the Wild"](https://superset.apache.org/inTheWild) - see who's using Superset, and [add your organization](https://github.com/apache/superset/edit/master/RESOURCES/INTHEWILD.yaml) to the list! +- [Feature Flags](/admin-docs/configuration/feature-flags) - the status of Superset's Feature Flags. +- [Standard Roles](https://github.com/apache/superset/blob/master/RESOURCES/STANDARD_ROLES.md) - How RBAC permissions map to roles. +- [Superset Wiki](https://github.com/apache/superset/wiki) - Tons of additional community resources: best practices, community content and other information. +- [Superset SIPs](https://github.com/orgs/apache/projects/170) - The status of Superset's SIPs (Superset Improvement Proposals) for both consensus and implementation status. 
+ +Understanding the Superset Points of View + +- [The Case for Dataset-Centric Visualization](https://preset.io/blog/dataset-centric-visualization/) +- [Understanding the Superset Semantic Layer](https://preset.io/blog/understanding-superset-semantic-layer/) + +- Getting Started with Superset + - [Superset in 2 Minutes using Docker Compose](/admin-docs/installation/docker-compose#installing-superset-locally-using-docker-compose) + - [Installing Database Drivers](/admin-docs/configuration/configuring-superset#installing-database-drivers) + - [Building New Database Connectors](https://preset.io/blog/building-database-connector/) + - [Create Your First Dashboard](/user-docs/using-superset/creating-your-first-dashboard) + - [Comprehensive Tutorial for Contributing Code to Apache Superset + ](https://preset.io/blog/tutorial-contributing-code-to-apache-superset/) +- [Resources to master Superset by Preset](https://preset.io/resources/) + +- Deploying Superset + + - [Official Docker image](https://hub.docker.com/r/apache/superset) + - [Helm Chart](https://github.com/apache/superset/tree/master/helm/superset) + +- Recordings of Past [Superset Community Events](https://preset.io/events) + + - [Mixed Time Series Charts](https://preset.io/events/mixed-time-series-visualization-in-superset-workshop/) + - [How the Bing Team Customized Superset for the Internal Self-Serve Data & Analytics Platform](https://preset.io/events/how-the-bing-team-heavily-customized-superset-for-their-internal-data/) + - [Live Demo: Visualizing MongoDB and Pinot Data using Trino](https://preset.io/events/2021-04-13-visualizing-mongodb-and-pinot-data-using-trino/) + - [Introduction to the Superset API](https://preset.io/events/introduction-to-the-superset-api/) + - [Building a Database Connector for Superset](https://preset.io/events/2021-02-16-building-a-database-connector-for-superset/) + +- Visualizations + + - [Creating Viz 
Plugins](https://superset.apache.org/developer-docs/contributing/howtos#creating-visualization-plugins) + - [Managing and Deploying Custom Viz Plugins](https://medium.com/nmc-techblog/apache-superset-manage-custom-viz-plugins-in-production-9fde1a708e55) + - [Why Apache Superset is Betting on Apache ECharts](https://preset.io/blog/2021-4-1-why-echarts/) + +- [Superset API](/developer-docs/api) + +## Repo Activity + + + + + Performance Stats of apache/superset - Last 28 days + + + + + + + diff --git a/docs/docs/quickstart.mdx b/docs/docs/quickstart.mdx index eb234ca795e..2eee4d09eca 100644 --- a/docs/docs/quickstart.mdx +++ b/docs/docs/quickstart.mdx @@ -15,7 +15,7 @@ Although we recommend using `Docker Compose` for a quick start in a sandbox-type environment and for other development-type use cases, **we do not recommend this setup for production**. For this purpose please refer to our -[Installing on Kubernetes](/docs/installation/kubernetes/) +[Installing on Kubernetes](/admin-docs/installation/kubernetes) page. ::: @@ -73,16 +73,16 @@ processes by running Docker Compose `stop` command. By doing so, you can avoid d From this point on, you can head on to: -- [Create your first Dashboard](/docs/using-superset/creating-your-first-dashboard) -- [Connect to a Database](/docs/databases) -- [Using Docker Compose](/docs/installation/docker-compose) -- [Configure Superset](/docs/configuration/configuring-superset/) -- [Installing on Kubernetes](/docs/installation/kubernetes/) +- [Create your first Dashboard](/user-docs/using-superset/creating-your-first-dashboard) +- [Connect to a Database](/user-docs/databases/) +- [Using Docker Compose](/admin-docs/installation/docker-compose) +- [Configure Superset](/admin-docs/configuration/configuring-superset) +- [Installing on Kubernetes](/admin-docs/installation/kubernetes) Or just explore our [Documentation](https://superset.apache.org/docs/intro)! 
:::resources - [Video: Superset in 2 Minutes](https://www.youtube.com/watch?v=AqousXQ7YHw) - [Video: Superset 101](https://www.youtube.com/watch?v=mAIH3hUoxEE) -- [Tutorial: Creating Your First Dashboard](/docs/using-superset/creating-your-first-dashboard) +- [Tutorial: Creating Your First Dashboard](/user-docs/using-superset/creating-your-first-dashboard) ::: diff --git a/docs/docs/using-superset/creating-your-first-dashboard.mdx b/docs/docs/using-superset/creating-your-first-dashboard.mdx index 7a214037a71..febf3395a6f 100644 --- a/docs/docs/using-superset/creating-your-first-dashboard.mdx +++ b/docs/docs/using-superset/creating-your-first-dashboard.mdx @@ -31,7 +31,7 @@ your existing SQL-speaking database or data store. First things first, we need to add the connection credentials to your database to be able to query and visualize data from it. If you're using Superset locally via -[Docker compose](/docs/installation/docker-compose), you can +[Docker compose](/admin-docs/installation/docker-compose), you can skip this step because a Postgres database, named **examples**, is included and pre-configured in Superset for you. @@ -193,7 +193,7 @@ Access to dashboards is managed via owners and permissions. Non-owner access can through dataset permissions or dashboard-level roles (using the `DASHBOARD_RBAC` feature flag). For detailed information on configuring dashboard access, see the -[Dashboard Access Control](/docs/security/security#dashboard-access-control) section in the +[Dashboard Access Control](/admin-docs/security/security#dashboard-access-control) section in the Security documentation. 
diff --git a/docs/docs/using-superset/sql-templating.mdx b/docs/docs/using-superset/sql-templating.mdx new file mode 100644 index 00000000000..4791c8357e0 --- /dev/null +++ b/docs/docs/using-superset/sql-templating.mdx @@ -0,0 +1,250 @@ +--- +title: SQL Templating +sidebar_position: 4 +description: Use Jinja templates in SQL Lab and virtual datasets to create dynamic queries +keywords: [sql templating, jinja, sql lab, virtual datasets, dynamic queries] +--- + +{/* +Licensed to the Apache Software Foundation (ASF) under one +or more contributor license agreements. See the NOTICE file +distributed with this work for additional information +regarding copyright ownership. The ASF licenses this file +to you under the Apache License, Version 2.0 (the +"License"); you may not use this file except in compliance +with the License. You may obtain a copy of the License at + + http://www.apache.org/licenses/LICENSE-2.0 + +Unless required by applicable law or agreed to in writing, +software distributed under the License is distributed on an +"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY +KIND, either express or implied. See the License for the +specific language governing permissions and limitations +under the License. +*/} + +# SQL Templating + +Superset supports [Jinja templating](https://jinja.palletsprojects.com/) in SQL Lab queries and virtual datasets. This allows you to write dynamic SQL that responds to filters, user context, and URL parameters. + +:::note +SQL templating must be enabled by your administrator via the `ENABLE_TEMPLATE_PROCESSING` feature flag. +For advanced configuration options, see the [SQL Templating Configuration Guide](/admin-docs/configuration/sql-templating). +::: + +## Basic Usage + +Jinja templates use double curly braces `{{ }}` for expressions and `{% %}` for logic blocks. 
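The two delimiter styles described above can be exercised outside Superset with plain Jinja2. A standalone sketch — Superset wires in its own macros and filter context at query time, so a trivial context is supplied by hand here:

```python
from jinja2 import Template

# {{ ... }} interpolates an expression; {% ... %} wraps control flow.
template = Template(
    "SELECT *\n"
    "FROM {{ table }}\n"
    "{% if start_date %}WHERE order_date >= '{{ start_date }}'{% endif %}"
)

rendered = template.render(table="sales", start_date="2024-01-01")
print(rendered)
```

Rendering with `start_date` unset simply drops the `WHERE` clause, which is the same pattern the "Conditional Logic" section below relies on.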
+ +### Using Parameters + +You can define parameters in SQL Lab via the **Parameters** menu as JSON: + +```json +{ + "my_table": "sales", + "start_date": "2024-01-01" +} +``` + +Then reference them in your query: + +```sql +SELECT * +FROM {{ my_table }} +WHERE order_date >= '{{ start_date }}' +``` + +### Conditional Logic + +Use Jinja's logic blocks for conditional SQL: + +```sql +SELECT * +FROM orders +WHERE 1 = 1 +{% if start_date %} + AND order_date >= '{{ start_date }}' +{% endif %} +{% if end_date %} + AND order_date < '{{ end_date }}' +{% endif %} +``` + +## Available Macros + +Superset provides built-in macros for common use cases. + +### User Context + +| Macro | Description | +|-------|-------------| +| `{{ current_username() }}` | Returns the logged-in user's username | +| `{{ current_user_id() }}` | Returns the logged-in user's account ID | +| `{{ current_user_email() }}` | Returns the logged-in user's email | +| `{{ current_user_roles() }}` | Returns an array of the user's roles | + +**Example: Row-level filtering by user** + +```sql +SELECT * +FROM sales_data +WHERE sales_rep = '{{ current_username() }}' +``` + +**Example: Role-based access** + +```sql +SELECT * +FROM users +WHERE role IN {{ current_user_roles()|where_in }} +``` + +### Filter Values + +Access dashboard and chart filter values in your queries: + +| Macro | Description | +|-------|-------------| +| `{{ filter_values('column') }}` | Returns filter values as a list | +| `{{ get_filters('column') }}` | Returns filters with operators | + +**Example: Using filter values** + +```sql +SELECT product, SUM(revenue) as total +FROM sales +WHERE region IN {{ filter_values('region')|where_in }} +GROUP BY product +``` + +The `where_in` filter converts the list to SQL format: `('value1', 'value2', 'value3')` + +### Time Filters + +For charts with time range filters: + +| Macro | Description | +|-------|-------------| +| `{{ get_time_filter('column') }}` | Returns time filter with `from_expr` and 
`to_expr` | + +**Example: Time-filtered virtual dataset** + +```sql +{% set time_filter = get_time_filter("order_date", default="Last 7 days") %} +SELECT * +FROM orders +WHERE order_date >= {{ time_filter.from_expr }} + AND order_date < {{ time_filter.to_expr }} +``` + +### URL Parameters + +Pass custom values via URL query strings: + +```sql +SELECT * +FROM orders +WHERE country = '{{ url_param('country') }}' +``` + +Access via: `superset.example.com/sqllab?country=US` + +### Reusing Dataset Definitions + +Query existing datasets by ID: + +```sql +-- Query a dataset (ID 42) as a table +SELECT * FROM {{ dataset(42) }} LIMIT 100 + +-- Include computed metrics +SELECT * FROM {{ dataset(42, include_metrics=True) }} +``` + +Reuse metric definitions across queries: + +```sql +SELECT + category, + {{ metric('total_revenue') }} as revenue +FROM sales +GROUP BY category +``` + +## Testing Templates in SQL Lab + +Some variables like `from_dttm` and `filter_values()` only work when filters are applied from dashboards or charts. 
To test in SQL Lab: + +**Option 1: Use defaults** + +```sql +SELECT * +FROM orders +WHERE date >= '{{ from_dttm | default("2024-01-01", true) }}' +``` + +**Option 2: Set test parameters** + +Add to the Parameters menu: + +```json +{ + "_filters": [ + {"col": "region", "op": "IN", "val": ["US", "EU"]} + ] +} +``` + +**Option 3: Use `{% set %}`** + +```sql +{% set start_date = "2024-01-01" %} +SELECT * FROM orders WHERE date >= '{{ start_date }}' +``` + +## Common Patterns + +### Dynamic Table Selection + +```sql +{% set table_name = url_param('table') or 'default_table' %} +SELECT * FROM {{ table_name }} +``` + +### User-Specific Data Access + +```sql +SELECT * +FROM sensitive_data +WHERE department IN ( + SELECT department + FROM user_permissions + WHERE username = '{{ current_username() }}' +) +``` + +### Time-Based Partitioning + +```sql +{% set time_filter = get_time_filter("event_date", remove_filter=True) %} +SELECT * +FROM events +WHERE event_date >= {{ time_filter.from_expr }} + AND event_date < {{ time_filter.to_expr }} +``` + +Passing `remove_filter=True` marks the time filter as handled inside the query, so Superset does not apply it again to the outer query; the filter stays on the partitioned inner query, which improves performance.
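The `where_in` conversion described earlier (a list becoming `('value1', 'value2')`) can be reproduced as a custom Jinja2 filter for local experimentation. A sketch assuming string values only — Superset's built-in filter also handles numbers and other quoting edge cases:

```python
from jinja2 import Environment

def where_in(values):
    # Quote each value and escape embedded single quotes, SQL-style.
    quoted = ", ".join("'{}'".format(str(v).replace("'", "''")) for v in values)
    return "({})".format(quoted)

env = Environment()
env.filters["where_in"] = where_in  # registered like Superset's |where_in

template = env.from_string(
    "SELECT product, SUM(revenue) AS total FROM sales "
    "WHERE region IN {{ regions | where_in }} GROUP BY product"
)
result = template.render(regions=["US", "EU"])
print(result)  # ... WHERE region IN ('US', 'EU') ...
```

Registering the function on `Environment.filters` is what makes the `| where_in` pipe syntax available inside the template.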
+ +## Tips + +- Use `|where_in` filter to convert lists to SQL `IN` clauses +- Use `|tojson` to serialize arrays as JSON strings +- Test queries with explicit parameter values before relying on filter context +- For complex templating needs, ask your administrator about custom Jinja macros + +:::resources +- [Admin Guide: SQL Templating Configuration](/admin-docs/configuration/sql-templating) +- [Blog: Intro to Jinja Templating in Apache Superset](https://preset.io/blog/intro-jinja-templating-apache-superset/) +::: diff --git a/docs/docusaurus.config.ts b/docs/docusaurus.config.ts index 336b44d0edd..289ae9ed0e5 100644 --- a/docs/docusaurus.config.ts +++ b/docs/docusaurus.config.ts @@ -64,27 +64,55 @@ if (!versionsConfig.components.disabled) { ]); } -// Add developer_portal plugin if not disabled -if (!versionsConfig.developer_portal.disabled) { +// Add admin_docs plugin if not disabled +if (!versionsConfig.admin_docs.disabled) { dynamicPlugins.push([ '@docusaurus/plugin-content-docs', { - id: 'developer_portal', - path: 'developer_portal', - routeBasePath: 'developer_portal', - sidebarPath: require.resolve('./sidebarTutorials.js'), + id: 'admin_docs', + path: 'admin_docs', + routeBasePath: 'admin-docs', + sidebarPath: require.resolve('./sidebarAdminDocs.js'), editUrl: - 'https://github.com/apache/superset/edit/master/docs/developer_portal', + 'https://github.com/apache/superset/edit/master/docs/admin_docs', remarkPlugins: [remarkImportPartial, remarkLocalizeBadges, remarkTechArticleSchema], admonitions: { keywords: ['note', 'tip', 'info', 'warning', 'danger', 'resources'], extendDefaults: true, }, docItemComponent: '@theme/DocItem', - includeCurrentVersion: versionsConfig.developer_portal.includeCurrentVersion, - lastVersion: versionsConfig.developer_portal.lastVersion, - onlyIncludeVersions: versionsConfig.developer_portal.onlyIncludeVersions, - versions: versionsConfig.developer_portal.versions, + includeCurrentVersion: 
versionsConfig.admin_docs.includeCurrentVersion, + lastVersion: versionsConfig.admin_docs.lastVersion, + onlyIncludeVersions: versionsConfig.admin_docs.onlyIncludeVersions, + versions: versionsConfig.admin_docs.versions, + disableVersioning: false, + showLastUpdateAuthor: true, + showLastUpdateTime: true, + }, + ]); +} + +// Add developer_docs plugin if not disabled +if (!versionsConfig.developer_docs.disabled) { + dynamicPlugins.push([ + '@docusaurus/plugin-content-docs', + { + id: 'developer_docs', + path: 'developer_docs', + routeBasePath: 'developer-docs', + sidebarPath: require.resolve('./sidebarTutorials.js'), + editUrl: + 'https://github.com/apache/superset/edit/master/docs/developer_docs', + remarkPlugins: [remarkImportPartial, remarkLocalizeBadges, remarkTechArticleSchema], + admonitions: { + keywords: ['note', 'tip', 'info', 'warning', 'danger', 'resources'], + extendDefaults: true, + }, + docItemComponent: '@theme/ApiItem', // Required for OpenAPI docs + includeCurrentVersion: versionsConfig.developer_docs.includeCurrentVersion, + lastVersion: versionsConfig.developer_docs.lastVersion, + onlyIncludeVersions: versionsConfig.developer_docs.onlyIncludeVersions, + versions: versionsConfig.developer_docs.versions, disableVersioning: false, showLastUpdateAuthor: true, showLastUpdateTime: true, @@ -121,45 +149,79 @@ if (!versionsConfig.components.disabled) { }); } -// Add Developer Portal navbar item if not hidden from nav -if (!versionsConfig.developer_portal.disabled && !versionsConfig.developer_portal.hideFromNav) { +// Add Admin Docs navbar item if not disabled +if (!versionsConfig.admin_docs.disabled) { dynamicNavbarItems.push({ - label: 'Developer Portal', + label: 'Admins', + to: '/admin-docs/', position: 'left', + activeBaseRegex: '^/admin-docs/', items: [ { - type: 'doc', - docsPluginId: 'developer_portal', - docId: 'index', label: 'Overview', + to: '/admin-docs/', + activeBaseRegex: '^/admin-docs/$', + }, + { + label: 'Installation', + to: 
'/admin-docs/installation/installation-methods', + activeBaseRegex: '^/admin-docs/installation/', + }, + { + label: 'Configuration', + to: '/admin-docs/configuration/configuring-superset', + activeBaseRegex: '^/admin-docs/configuration/', + }, + { + label: 'Database Drivers', + href: '/user-docs/databases/', + }, + { + label: 'Security', + to: '/admin-docs/security/security', + activeBaseRegex: '^/admin-docs/security/', + }, + ], + }); +} + +// Add Developer Docs navbar item if not hidden from nav +if (!versionsConfig.developer_docs.disabled && !versionsConfig.developer_docs.hideFromNav) { + dynamicNavbarItems.push({ + label: 'Developers', + to: '/developer-docs/', + position: 'left', + activeBaseRegex: '^/developer-docs/', + items: [ + { + label: 'Overview', + to: '/developer-docs/', + activeBaseRegex: '^/developer-docs/$', }, { - type: 'doc', - docsPluginId: 'developer_portal', - docId: 'contributing/overview', label: 'Contributing', + to: '/developer-docs/contributing/overview', + activeBaseRegex: '^/developer-docs/contributing/', }, { - type: 'doc', - docsPluginId: 'developer_portal', - docId: 'extensions/overview', label: 'Extensions', + to: '/developer-docs/extensions/overview', + activeBaseRegex: '^/developer-docs/extensions/', }, { - type: 'doc', - docsPluginId: 'developer_portal', - docId: 'testing/overview', label: 'Testing', + to: '/developer-docs/testing/overview', + activeBaseRegex: '^/developer-docs/testing/', }, { - type: 'doc', - docsPluginId: 'developer_portal', - docId: 'components/index', label: 'UI Components', + to: '/developer-docs/components/', + activeBaseRegex: '^/developer-docs/components/', }, { label: 'API Reference', - href: '/docs/api', + to: '/developer-docs/api', + activeBaseRegex: '^/developer-docs/api', }, ], }); @@ -271,11 +333,11 @@ const config: Config = { 'docusaurus-plugin-openapi-docs', { id: 'api', - docsPluginId: 'classic', + docsPluginId: 'developer_docs', config: { superset: { specPath: 'static/resources/openapi.json', - 
outputDir: 'docs/api', + outputDir: 'developer_docs/api', sidebarOptions: { groupPathsBy: 'tag', categoryLinkSource: 'tag', @@ -307,147 +369,311 @@ const config: Config = { '@docusaurus/plugin-client-redirects', { redirects: [ + // Legacy HTML page redirects { - to: '/docs/installation/docker-compose', + to: '/admin-docs/installation/docker-compose', from: '/installation.html', }, { - to: '/docs/intro', + to: '/user-docs/', from: '/tutorials.html', }, { - to: '/docs/using-superset/creating-your-first-dashboard', + to: '/user-docs/using-superset/creating-your-first-dashboard', from: '/admintutorial.html', }, { - to: '/docs/using-superset/creating-your-first-dashboard', + to: '/user-docs/using-superset/creating-your-first-dashboard', from: '/usertutorial.html', }, { - to: '/docs/security/', + to: '/admin-docs/security/', from: '/security.html', }, { - to: '/docs/configuration/sql-templating', + to: '/admin-docs/configuration/sql-templating', from: '/sqllab.html', }, { - to: '/docs/intro', + to: '/user-docs/', from: '/gallery.html', }, { - to: '/docs/databases', + to: '/user-docs/databases/', from: '/druid.html', }, { - to: '/docs/configuration/country-map-tools', + to: '/admin-docs/configuration/country-map-tools', from: '/misc.html', }, { - to: '/docs/configuration/country-map-tools', + to: '/admin-docs/configuration/country-map-tools', from: '/visualization.html', }, { - to: '/docs/faq', + to: '/user-docs/faq', from: '/videos.html', }, { - to: '/docs/faq', + to: '/user-docs/faq', from: '/faq.html', }, { - to: '/docs/using-superset/creating-your-first-dashboard', + to: '/user-docs/using-superset/creating-your-first-dashboard', from: '/tutorial.html', }, { - to: '/docs/using-superset/creating-your-first-dashboard', + to: '/user-docs/using-superset/creating-your-first-dashboard', from: '/docs/creating-charts-dashboards/first-dashboard', }, { - to: '/docs/api', + to: '/developer-docs/api', from: '/docs/rest-api', }, { - to: '/docs/configuration/alerts-reports', + to: 
'/admin-docs/configuration/alerts-reports', from: '/docs/installation/alerts-reports', }, { - to: '/docs/contributing/development', + to: '/developer-docs/contributing/development-setup', from: '/docs/contributing/hooks-and-linting', }, { - to: '/docs/intro', + to: '/user-docs/', from: '/docs/roadmap', }, { - to: '/docs/contributing/', + to: '/user-docs/', + from: '/user-docs/intro', + }, + { + to: '/developer-docs/contributing/overview', from: '/docs/contributing/contribution-guidelines', }, { - to: '/docs/contributing/', + to: '/developer-docs/contributing/overview', from: '/docs/contributing/contribution-page', }, { - to: '/docs/databases', + to: '/user-docs/databases/', from: '/docs/databases/yugabyte/', }, { - to: '/docs/faq', + to: '/user-docs/faq', from: '/docs/frequently-asked-questions', }, + // Redirect old user-docs/api to developer-docs/api { - to: '/docs/installation/kubernetes', + to: '/developer-docs/api', + from: '/user-docs/api', + }, + // Redirects from old /docs/ paths to new /admin-docs/ paths + { + to: '/admin-docs/installation/installation-methods', + from: '/docs/installation/installation-methods', + }, + { + to: '/admin-docs/installation/docker-compose', + from: '/docs/installation/docker-compose', + }, + { + to: '/admin-docs/installation/docker-builds', + from: '/docs/installation/docker-builds', + }, + { + to: '/admin-docs/installation/kubernetes', + from: '/docs/installation/kubernetes', + }, + { + to: '/admin-docs/installation/pypi', + from: '/docs/installation/pypi', + }, + { + to: '/admin-docs/installation/architecture', + from: '/docs/installation/architecture', + }, + { + to: '/admin-docs/installation/upgrading-superset', + from: '/docs/installation/upgrading-superset', + }, + { + to: '/admin-docs/configuration/configuring-superset', + from: '/docs/configuration/configuring-superset', + }, + { + to: '/admin-docs/configuration/alerts-reports', + from: '/docs/configuration/alerts-reports', + }, + { + to: 
'/admin-docs/configuration/async-queries-celery', + from: '/docs/configuration/async-queries-celery', + }, + { + to: '/admin-docs/configuration/cache', + from: '/docs/configuration/cache', + }, + { + to: '/admin-docs/configuration/event-logging', + from: '/docs/configuration/event-logging', + }, + { + to: '/admin-docs/configuration/feature-flags', + from: '/docs/configuration/feature-flags', + }, + { + to: '/admin-docs/configuration/sql-templating', + from: '/docs/configuration/sql-templating', + }, + { + to: '/admin-docs/configuration/theming', + from: '/docs/configuration/theming', + }, + { + to: '/admin-docs/security/', + from: '/docs/security', + }, + { + to: '/admin-docs/security/', + from: '/docs/security/security', + }, + // Redirects from old /docs/contributing/ to Developer Portal + { + to: '/developer-docs/contributing/overview', + from: '/docs/contributing', + }, + { + to: '/developer-docs/contributing/overview', + from: '/docs/contributing/contributing', + }, + { + to: '/developer-docs/contributing/development-setup', + from: '/docs/contributing/development', + }, + { + to: '/developer-docs/contributing/guidelines', + from: '/docs/contributing/guidelines', + }, + { + to: '/developer-docs/contributing/howtos', + from: '/docs/contributing/howtos', + }, + { + to: '/admin-docs/installation/kubernetes', from: '/docs/installation/running-on-kubernetes/', }, { - to: '/docs/contributing/howtos', + to: '/developer-docs/contributing/howtos', from: '/docs/contributing/testing-locally/', }, { - to: '/docs/using-superset/creating-your-first-dashboard', + to: '/user-docs/using-superset/creating-your-first-dashboard', from: '/docs/creating-charts-dashboards/creating-your-first-dashboard/', }, { - to: '/docs/using-superset/creating-your-first-dashboard', + to: '/user-docs/using-superset/creating-your-first-dashboard', from: '/docs/creating-charts-dashboards/exploring-data/', }, { - to: '/docs/installation/docker-compose', + to: 
'/admin-docs/installation/docker-compose', from: '/docs/installation/installing-superset-using-docker-compose/', }, { - to: '/docs/contributing/howtos', + to: '/developer-docs/contributing/howtos', from: '/docs/contributing/creating-viz-plugins/', }, { - to: '/docs/installation/kubernetes', + to: '/admin-docs/installation/kubernetes', from: '/docs/installation/', }, { - to: '/docs/installation/pypi', + to: '/admin-docs/installation/pypi', from: '/docs/installation/installing-superset-from-pypi/', }, { - to: '/docs/configuration/configuring-superset', + to: '/admin-docs/configuration/configuring-superset', from: '/docs/installation/configuring-superset/', }, { - to: '/docs/configuration/cache', + to: '/admin-docs/configuration/cache', from: '/docs/installation/cache/', }, { - to: '/docs/configuration/async-queries-celery', + to: '/admin-docs/configuration/async-queries-celery', from: '/docs/installation/async-queries-celery/', }, { - to: '/docs/configuration/event-logging', + to: '/admin-docs/configuration/event-logging', from: '/docs/installation/event-logging/', }, { - to: '/docs/contributing/howtos', + to: '/developer-docs/contributing/howtos', from: '/docs/contributing/translations/', }, + // Additional configuration redirects + { + to: '/admin-docs/configuration/country-map-tools', + from: '/docs/configuration/country-map-tools', + }, + { + to: '/admin-docs/configuration/importing-exporting-datasources', + from: '/docs/configuration/importing-exporting-datasources', + }, + { + to: '/admin-docs/configuration/map-tiles', + from: '/docs/configuration/map-tiles', + }, + { + to: '/admin-docs/configuration/networking-settings', + from: '/docs/configuration/networking-settings', + }, + { + to: '/admin-docs/configuration/timezones', + from: '/docs/configuration/timezones', + }, + // Additional security redirects + { + to: '/admin-docs/security/cves', + from: '/docs/security/cves', + }, + { + to: '/admin-docs/security/securing_superset', + from: 
'/docs/security/securing_superset', + }, + // Additional contributing redirects + { + to: '/developer-docs/contributing/resources', + from: '/docs/contributing/resources', + }, + { + to: '/developer-docs/contributing/howtos', + from: '/docs/contributing/misc', + }, + { + to: '/developer-docs/contributing/overview', + from: '/docs/contributing/pkg-resources-migration', + }, ], + // Use createRedirects for pattern-based redirects + createRedirects(existingPath) { + const redirects = []; + + // Redirect all /developer_portal/* paths to /developer-docs/* + if (existingPath.startsWith('/developer-docs/')) { + redirects.push(existingPath.replace('/developer-docs/', '/developer_portal/')); + } + + // Redirect all /docs/* paths to /user-docs/* for user documentation + if (existingPath.startsWith('/user-docs/')) { + redirects.push(existingPath.replace('/user-docs/', '/docs/')); + } + + // Redirect /docs/api/* to /developer-docs/api/* (API moved to developer docs) + if (existingPath.startsWith('/developer-docs/api')) { + redirects.push(existingPath.replace('/developer-docs/', '/docs/')); + } + + return redirects.length > 0 ? 
redirects : undefined; + }, }, ], ], @@ -457,6 +683,7 @@ const config: Config = { '@docusaurus/preset-classic', { docs: { + routeBasePath: 'user-docs', sidebarPath: require.resolve('./sidebars.js'), editUrl: ({ versionDocsDirPath, docPath }) => { if (docPath === 'intro.md') { @@ -476,7 +703,7 @@ const config: Config = { disableVersioning: false, showLastUpdateAuthor: true, showLastUpdateTime: true, - docItemComponent: '@theme/ApiItem', // Required for OpenAPI docs + docItemComponent: '@theme/DocItem', }, blog: { showReadingTime: true, @@ -499,19 +726,22 @@ const config: Config = { const items = await defaultCreateSitemapItems(rest); return items.map((item) => { // Boost priority for key pages - if (item.url.includes('/docs/intro')) { + if (item.url.endsWith('/user-docs/')) { return { ...item, priority: 1.0, changefreq: 'daily' }; } - if (item.url.includes('/docs/quickstart')) { + if (item.url.includes('/user-docs/quickstart')) { return { ...item, priority: 0.9, changefreq: 'weekly' }; } - if (item.url.includes('/docs/installation/')) { + if (item.url.includes('/admin-docs/installation/')) { return { ...item, priority: 0.8, changefreq: 'weekly' }; } - if (item.url.includes('/docs/databases')) { + if (item.url.includes('/user-docs/databases')) { return { ...item, priority: 0.8, changefreq: 'weekly' }; } - if (item.url.includes('/docs/faq')) { + if (item.url.includes('/admin-docs/')) { + return { ...item, priority: 0.8, changefreq: 'weekly' }; + } + if (item.url.includes('/user-docs/faq')) { return { ...item, priority: 0.7, changefreq: 'monthly' }; } if (item.url === 'https://superset.apache.org/') { @@ -562,34 +792,50 @@ const config: Config = { srcDark: '/img/superset-logo-horiz-dark.svg', }, items: [ + // Users docs - mirrors sidebar structure { - label: 'Documentation', + label: 'Users', + to: '/user-docs/', position: 'left', + activeBaseRegex: '^/user-docs/', items: [ { - type: 'doc', - docId: 'intro', - label: 'Getting Started', + label: 'Overview', + to: 
'/user-docs/', + activeBaseRegex: '^/user-docs/$', }, { - type: 'doc', - docId: 'databases/index', - label: 'Databases', + label: 'Quickstart', + to: '/user-docs/quickstart', + }, + { + label: 'Using Superset', + to: '/user-docs/using-superset/creating-your-first-dashboard', + activeBaseRegex: '^/user-docs/using-superset/', + }, + { + label: 'Connecting to Databases', + to: '/user-docs/databases/', + activeBaseRegex: '^/user-docs/databases/', }, { - type: 'doc', - docId: 'faq', label: 'FAQ', + to: '/user-docs/faq', }, ], }, + ...dynamicNavbarItems, + // Community section { - label: 'Community Resources', + label: 'Community', to: '/community', + position: 'left', + activeBaseRegex: '^/community', items: [ { label: 'Resources', - href: '/community', + to: '/community', + activeBaseRegex: '^/community$', }, { label: 'GitHub', @@ -617,9 +863,8 @@ const config: Config = { }, ], }, - ...dynamicNavbarItems, { - href: '/docs/intro', + href: '/user-docs/', position: 'right', className: 'default-button-theme get-started-button', label: 'Get Started', @@ -646,7 +891,7 @@ const config: Config = { Divider

- Security |  + Security |  Donate |  Thanks |  Events |  diff --git a/docs/package.json b/docs/package.json index 10ac638f3bf..d42942d715b 100644 --- a/docs/package.json +++ b/docs/package.json @@ -8,7 +8,7 @@ "_init": "cat src/intro_header.txt ../README.md > docs/intro.md", "start": "yarn run _init && yarn run generate:all && NODE_OPTIONS='--max-old-space-size=8192' NODE_ENV=development docusaurus start", "start:quick": "yarn run _init && NODE_OPTIONS='--max-old-space-size=8192' NODE_ENV=development docusaurus start", - "stop": "pkill -f 'docusaurus start' || pkill -f 'docusaurus serve' || echo 'No docusaurus server running'", + "stop": "pkill -9 -f 'docusaurus start' || pkill -9 -f 'docusaurus serve' || echo 'No docusaurus server running'", "build": "yarn run _init && yarn run generate:all && NODE_OPTIONS='--max-old-space-size=8192' DEBUG=docusaurus:* docusaurus build", "generate:api-docs": "python3 scripts/fix-openapi-spec.py && docusaurus gen-api-docs superset && node scripts/convert-api-sidebar.mjs && node scripts/generate-api-index.mjs && node scripts/generate-api-tag-pages.mjs", "clean:api-docs": "docusaurus clean-api-docs superset", diff --git a/docs/scripts/convert-api-sidebar.mjs b/docs/scripts/convert-api-sidebar.mjs index 3a7854664a6..f3db2a125fa 100644 --- a/docs/scripts/convert-api-sidebar.mjs +++ b/docs/scripts/convert-api-sidebar.mjs @@ -28,8 +28,8 @@ import path from 'path'; import { fileURLToPath } from 'url'; const __dirname = path.dirname(fileURLToPath(import.meta.url)); -const sidebarTsPath = path.join(__dirname, '..', 'docs', 'api', 'sidebar.ts'); -const sidebarJsPath = path.join(__dirname, '..', 'docs', 'api', 'sidebar.js'); +const sidebarTsPath = path.join(__dirname, '..', 'developer_docs', 'api', 'sidebar.ts'); +const sidebarJsPath = path.join(__dirname, '..', 'developer_docs', 'api', 'sidebar.js'); if (!fs.existsSync(sidebarTsPath)) { console.log('No sidebar.ts found, skipping conversion'); diff --git a/docs/scripts/generate-api-index.mjs 
b/docs/scripts/generate-api-index.mjs index 0a9b095fed0..e5f6a7e6934 100644 --- a/docs/scripts/generate-api-index.mjs +++ b/docs/scripts/generate-api-index.mjs @@ -35,8 +35,8 @@ const require = createRequire(import.meta.url); const __dirname = path.dirname(fileURLToPath(import.meta.url)); const SPEC_PATH = path.join(__dirname, '..', 'static', 'resources', 'openapi.json'); -const SIDEBAR_PATH = path.join(__dirname, '..', 'docs', 'api', 'sidebar.js'); -const OUTPUT_PATH = path.join(__dirname, '..', 'docs', 'api.mdx'); +const SIDEBAR_PATH = path.join(__dirname, '..', 'developer_docs', 'api', 'sidebar.js'); +const OUTPUT_PATH = path.join(__dirname, '..', 'developer_docs', 'api.mdx'); // Category groupings for better organization const CATEGORY_GROUPS = { diff --git a/docs/scripts/generate-api-tag-pages.mjs b/docs/scripts/generate-api-tag-pages.mjs index 2057436b4f1..23ecda248b8 100644 --- a/docs/scripts/generate-api-tag-pages.mjs +++ b/docs/scripts/generate-api-tag-pages.mjs @@ -34,7 +34,7 @@ const require = createRequire(import.meta.url); const __dirname = path.dirname(fileURLToPath(import.meta.url)); const SPEC_PATH = path.join(__dirname, '..', 'static', 'resources', 'openapi.json'); -const API_DOCS_DIR = path.join(__dirname, '..', 'docs', 'api'); +const API_DOCS_DIR = path.join(__dirname, '..', 'developer_docs', 'api'); const SIDEBAR_PATH = path.join(API_DOCS_DIR, 'sidebar.js'); function slugify(text) { diff --git a/docs/scripts/generate-database-docs.mjs b/docs/scripts/generate-database-docs.mjs index 912569294ee..85e5b18ff0c 100644 --- a/docs/scripts/generate-database-docs.mjs +++ b/docs/scripts/generate-database-docs.mjs @@ -548,7 +548,7 @@ Superset to a database is to **install the proper database driver(s)** in your e You'll need to install the required packages for the database you want to use as your metadata database as well as the packages needed to connect to the databases you want to access through Superset. 
For information about setting up Superset's metadata database, please refer to -installation documentations ([Docker Compose](/docs/installation/docker-compose), [Kubernetes](/docs/installation/kubernetes)) +installation documentation ([Docker Compose](/admin-docs/installation/docker-compose), [Kubernetes](/admin-docs/installation/kubernetes)) ::: ## Supported Databases diff --git a/docs/sidebarAdminDocs.js b/docs/sidebarAdminDocs.js new file mode 100644 index 00000000000..b9d941a7318 --- /dev/null +++ b/docs/sidebarAdminDocs.js @@ -0,0 +1,73 @@ +/* eslint-env node */ +/** + * Licensed to the Apache Software Foundation (ASF) under one + * or more contributor license agreements. See the NOTICE file + * distributed with this work for additional information + * regarding copyright ownership. The ASF licenses this file + * to you under the Apache License, Version 2.0 (the + * "License"); you may not use this file except in compliance + * with the License. You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, + * software distributed under the License is distributed on an + * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY + * KIND, either express or implied. See the License for the + * specific language governing permissions and limitations + * under the License.
+ */ + +// @ts-check + +/** @type {import('@docusaurus/plugin-content-docs').SidebarsConfig} */ +const sidebars = { + AdminDocsSidebar: [ + { + type: 'doc', + label: 'Overview', + id: 'index', + }, + { + type: 'category', + label: 'Installation', + collapsed: false, + items: [ + { + type: 'autogenerated', + dirName: 'installation', + }, + ], + }, + { + type: 'category', + label: 'Configuration', + collapsed: true, + items: [ + { + type: 'autogenerated', + dirName: 'configuration', + }, + ], + }, + { + type: 'link', + label: 'Database Drivers', + href: '/user-docs/databases/', + description: 'See User Docs for database connection guides', + }, + { + type: 'category', + label: 'Security', + collapsed: true, + items: [ + { + type: 'autogenerated', + dirName: 'security', + }, + ], + }, + ], +}; + +module.exports = sidebars; diff --git a/docs/sidebarTutorials.js b/docs/sidebarTutorials.js index b786478c0b1..77839807ade 100644 --- a/docs/sidebarTutorials.js +++ b/docs/sidebarTutorials.js @@ -42,6 +42,7 @@ const sidebars = { 'contributing/howtos', 'contributing/release-process', 'contributing/resources', + 'contributing/pkg-resources-migration', 'guidelines/design-guidelines', { type: 'category', @@ -91,6 +92,7 @@ const sidebars = { collapsed: true, items: [ 'extensions/extension-points/sqllab', + 'extensions/extension-points/editors', ], }, 'extensions/development', @@ -127,9 +129,21 @@ const sidebars = { ], }, { - type: 'link', + type: 'category', label: 'API Reference', - href: '/docs/api', + link: { + type: 'doc', + id: 'api', + }, + items: (() => { + try { + // eslint-disable-next-line @typescript-eslint/no-require-imports + return require('./developer_docs/api/sidebar.js'); + } catch { + // Generated by `yarn generate:api-docs`; empty until then + return []; + } + })(), }, ], }; diff --git a/docs/sidebars.js b/docs/sidebars.js index ade80e9e652..fa7c7c5ef53 100644 --- a/docs/sidebars.js +++ b/docs/sidebars.js @@ -22,15 +22,12 @@ /** @type 
{import('@docusaurus/plugin-content-docs').SidebarsConfig} */ const sidebars = { - // By default, Docusaurus generates a sidebar from the docs folder structure - //tutorialSidebar: [{type: 'autogenerated', dirName: '.'}], - - // But we're not doing that. + // User Docs sidebar - for analysts and business users CustomSidebar: [ { type: 'doc', - label: 'Introduction', - id: 'intro', + label: 'Overview', + id: 'index', }, { type: 'doc', @@ -39,27 +36,19 @@ const sidebars = { }, { type: 'category', - label: 'Installation', + label: 'Using Superset', + collapsed: false, items: [ { type: 'autogenerated', - dirName: 'installation', + dirName: 'using-superset', }, ], }, { type: 'category', - label: 'Configuration', - items: [ - { - type: 'autogenerated', - dirName: 'configuration', - }, - ], - }, - { - type: 'category', - label: 'Databases', + label: 'Connecting to Databases', + collapsed: true, link: { type: 'doc', id: 'databases/index', @@ -71,58 +60,11 @@ const sidebars = { }, ], }, - { - type: 'category', - label: 'Using Superset', - items: [ - { - type: 'autogenerated', - dirName: 'using-superset', - }, - ], - }, - { - type: 'category', - label: 'Contributing', - items: [ - { - type: 'autogenerated', - dirName: 'contributing', - }, - ], - }, - { - type: 'category', - label: 'Security', - items: [ - { - type: 'autogenerated', - dirName: 'security', - }, - ], - }, { type: 'doc', label: 'FAQ', id: 'faq', }, - { - type: 'category', - label: 'API Reference', - link: { - type: 'doc', - id: 'api', - }, - items: (() => { - try { - // eslint-disable-next-line @typescript-eslint/no-require-imports - return require('./docs/api/sidebar.js'); - } catch { - // Generated by `yarn generate:api-docs`; empty until then - return []; - } - })(), - }, ], }; diff --git a/docs/src/components/databases/DatabaseLogoWall.tsx b/docs/src/components/databases/DatabaseLogoWall.tsx new file mode 100644 index 00000000000..a9b5b7be98b --- /dev/null +++ 
b/docs/src/components/databases/DatabaseLogoWall.tsx @@ -0,0 +1,71 @@ +/** + * Licensed to the Apache Software Foundation (ASF) under one + * or more contributor license agreements. See the NOTICE file + * distributed with this work for additional information + * regarding copyright ownership. The ASF licenses this file + * to you under the Apache License, Version 2.0 (the + * "License"); you may not use this file except in compliance + * with the License. You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, + * software distributed under the License is distributed on an + * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY + * KIND, either express or implied. See the License for the + * specific language governing permissions and limitations + * under the License. + */ +import React from 'react'; +import databaseData from '../../data/databases.json'; +import type { DatabaseData } from './types'; + +const typedData = databaseData as DatabaseData; + +const seenLogos = new Set(); +const databases = Object.entries(typedData.databases) + .filter(([, db]) => db.documentation?.logo && db.documentation?.homepage_url) + .sort(([a], [b]) => a.localeCompare(b)) + .filter(([, db]) => { + const logo = db.documentation.logo!; + if (seenLogos.has(logo)) return false; + seenLogos.add(logo); + return true; + }) + .map(([name, db]) => ({ + name, + logo: db.documentation.logo!, + docPath: `/user-docs/databases/supported/${name.toLowerCase().replace(/[^a-z0-9]+/g, '-').replace(/^-|-$/g, '')}`, + })); + +export default function DatabaseLogoWall(): React.JSX.Element { + return ( +

+ {databases.map(({ name, logo, docPath }) => ( + + {name} + + ))} +
+ ); +} diff --git a/docs/src/data/databases.json b/docs/src/data/databases.json index 67cfdc3aebd..18eac0ddf34 100644 --- a/docs/src/data/databases.json +++ b/docs/src/data/databases.json @@ -1,5 +1,5 @@ { - "generated": "2026-02-16T04:47:37.257Z", + "generated": "2026-02-26T01:18:11.347Z", "statistics": { "totalDatabases": 72, "withDocumentation": 72, diff --git a/docs/src/pages/index.tsx b/docs/src/pages/index.tsx index 489019613aa..dba54bc6065 100644 --- a/docs/src/pages/index.tsx +++ b/docs/src/pages/index.tsx @@ -103,6 +103,41 @@ const features = [ }, ]; +const docSections = [ + { + title: 'User Guide', + description: + 'For analysts and business users. Learn to explore data, build charts, create dashboards, and connect to databases.', + cta: 'Browse User Docs', + href: '/user-docs/', + accent: '#20a7c9', + }, + { + title: 'Administrator Guide', + description: + 'For teams installing and operating Superset. Covers installation, configuration, security, and database drivers.', + cta: 'Browse Admin Docs', + href: '/admin-docs/', + accent: '#457f8d', + }, + { + title: 'Developer Guide', + description: + 'For contributors and engineers building on Superset. Covers the REST API, extensions, and contributing workflows.', + cta: 'Browse Developer Docs', + href: '/developer-docs/', + accent: '#2d6a4f', + }, + { + title: 'Community', + description: + 'Join the Superset community. 
Find resources on Slack, GitHub, the mailing list, and upcoming meetups.', + cta: 'Join the Community', + href: '/community', + accent: '#6d4c7e', + }, +]; + const StyledMain = styled('main')` text-align: center; `; @@ -289,6 +324,81 @@ const StyledFeaturesList = styled('ul')` } `; +interface StyledDocSectionCardProps { + accent: string; +} + +const StyledDocSectionsHeader = styled('div')` + & > div { + max-width: 960px; + } +`; + +const StyledDocSectionsGrid = styled('div')` + display: grid; + grid-template-columns: repeat(4, minmax(0, 1fr)); + gap: 20px; + max-width: 1170px; + width: 100%; + margin: 30px auto 0; + padding: 0 20px 10px; + ${mq[2]} { + grid-template-columns: repeat(2, minmax(0, 1fr)); + } + ${mq[0]} { + grid-template-columns: repeat(1, minmax(0, 1fr)); + } +`; + +const StyledDocSectionCard = styled(Link)` + display: flex; + flex-direction: column; + align-items: flex-start; + text-align: left; + border: 1px solid var(--ifm-border-color); + border-top: 4px solid ${({ accent }) => accent}; + border-radius: 10px; + padding: 24px; + text-decoration: none; + color: var(--ifm-font-base-color); + background: transparent; + transition: transform 0.2s ease, box-shadow 0.2s ease; + &:hover { + transform: translateY(-4px); + box-shadow: 0 8px 24px rgba(0, 0, 0, 0.1); + text-decoration: none; + color: var(--ifm-font-base-color); + } + .card-title { + font-size: 20px; + font-weight: 700; + margin: 0 0 8px; + color: var(--ifm-font-base-color); + } + .card-description { + font-size: 15px; + line-height: 22px; + margin: 0 0 16px; + color: var(--ifm-font-base-color); + flex: 1; + } + .card-cta { + font-size: 14px; + font-weight: 700; + color: ${({ accent }) => accent}; + margin: 0; + } + ${mq[1]} { + padding: 20px; + .card-title { + font-size: 18px; + } + .card-description { + font-size: 14px; + } + } +`; + const StyledSliderSection = styled('div')` position: relative; padding: 60px 20px; @@ -622,6 +732,24 @@ export default function Home(): JSX.Element {
+ + + + + + {docSections.map(({ title, description, cta, href, accent }) => ( + +

{title}

+

{description}

+ {cta} → +
+ ))} +
+
.navbar__link) { + display: flex; + height: 100%; + align-items: center; + padding-bottom: 0 !important; + } +} + +.navbar__item.dropdown:has(> .navbar__link.active) { + box-shadow: inset 0 -2px 0 var(--ifm-color-primary); +} + /* Dark mode support */ [data-theme='dark'] .navbar__item.dropdown .dropdown__menu { background-color: #242526; diff --git a/docs/versioned_docs/version-6.0.0/configuration/alerts-reports.mdx b/docs/versioned_docs/version-6.0.0/configuration/alerts-reports.mdx index b6ad789ca22..a989bc63b8d 100644 --- a/docs/versioned_docs/version-6.0.0/configuration/alerts-reports.mdx +++ b/docs/versioned_docs/version-6.0.0/configuration/alerts-reports.mdx @@ -20,12 +20,12 @@ Alerts and reports are disabled by default. To turn them on, you need to do some #### In your `superset_config.py` or `superset_config_docker.py` -- `"ALERT_REPORTS"` [feature flag](/docs/configuration/configuring-superset#feature-flags) must be turned to True. +- `"ALERT_REPORTS"` [feature flag](/docs/6.0.0/configuration/configuring-superset#feature-flags) must be turned to True. - `beat_schedule` in CeleryConfig must contain schedule for `reports.scheduler`. - At least one of those must be configured, depending on what you want to use: - emails: `SMTP_*` settings - Slack messages: `SLACK_API_TOKEN` -- Users can customize the email subject by including date code placeholders, which will automatically be replaced with the corresponding UTC date when the email is sent. To enable this functionality, activate the `"DATE_FORMAT_IN_EMAIL_SUBJECT"` [feature flag](/docs/configuration/configuring-superset#feature-flags). This enables date formatting in email subjects, preventing all reporting emails from being grouped into the same thread (optional for the reporting feature). +- Users can customize the email subject by including date code placeholders, which will automatically be replaced with the corresponding UTC date when the email is sent. 
To enable this functionality, activate the `"DATE_FORMAT_IN_EMAIL_SUBJECT"` [feature flag](/docs/6.0.0/configuration/configuring-superset#feature-flags). This enables date formatting in email subjects, preventing all reporting emails from being grouped into the same thread (optional for the reporting feature). - Use date codes from [strftime.org](https://strftime.org/) to create the email subject. - If no date code is provided, the original string will be used as the email subject. @@ -38,7 +38,7 @@ Screenshots will be taken but no messages actually sent as long as `ALERT_REPORT - You must install a headless browser, for taking screenshots of the charts and dashboards. Only Firefox and Chrome are currently supported. > If you choose Chrome, you must also change the value of `WEBDRIVER_TYPE` to `"chrome"` in your `superset_config.py`. -Note: All the components required (Firefox headless browser, Redis, Postgres db, celery worker and celery beat) are present in the *dev* docker image if you are following [Installing Superset Locally](/docs/installation/docker-compose/). +Note: All the components required (Firefox headless browser, Redis, Postgres db, celery worker and celery beat) are present in the *dev* docker image if you are following [Installing Superset Locally](/docs/6.0.0/installation/docker-compose/). All you need to do is add the required config variables described in this guide (See `Detailed Config`). If you are running a non-dev docker image, e.g., a stable release like `apache/superset:3.1.0`, that image does not include a headless browser. Only the `superset_worker` container needs this headless browser to browse to the target chart or dashboard. @@ -70,7 +70,7 @@ Note: when you configure an alert or a report, the Slack channel list takes chan ### Kubernetes-specific - You must have a `celery beat` pod running. 
If you're using the chart included in the GitHub repository under [helm/superset](https://github.com/apache/superset/tree/master/helm/superset), you need to put `supersetCeleryBeat.enabled = true` in your values override. -- You can see the dedicated docs about [Kubernetes installation](/docs/installation/kubernetes) for more details. +- You can see the dedicated docs about [Kubernetes installation](/docs/6.0.0/installation/kubernetes) for more details. ### Docker Compose specific diff --git a/docs/versioned_docs/version-6.0.0/configuration/cache.mdx b/docs/versioned_docs/version-6.0.0/configuration/cache.mdx index ccd4daba6d1..2f60785c5ed 100644 --- a/docs/versioned_docs/version-6.0.0/configuration/cache.mdx +++ b/docs/versioned_docs/version-6.0.0/configuration/cache.mdx @@ -78,11 +78,11 @@ Caching for SQL Lab query results is used when async queries are enabled and is Note that this configuration does not use a flask-caching dictionary for its configuration, but instead requires a cachelib object. -See [Async Queries via Celery](/docs/configuration/async-queries-celery) for details. +See [Async Queries via Celery](/docs/6.0.0/configuration/async-queries-celery) for details. 
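The `DATE_FORMAT_IN_EMAIL_SUBJECT` behavior described in the alerts-and-reports section above is plain `strftime` substitution; the sketch below illustrates it with a hypothetical subject template and a fixed date (not Superset's actual implementation):

```python
from datetime import datetime, timezone

# Hypothetical subject template using strftime date codes (see strftime.org)
subject_template = "Weekly metrics report for %Y-%m-%d"

# Superset substitutes the current UTC date at send time; a fixed instant is
# used here so the result is deterministic
send_time = datetime(2024, 3, 1, tzinfo=timezone.utc)
subject = send_time.strftime(subject_template)
print(subject)  # Weekly metrics report for 2024-03-01
```

A template containing no date codes passes through `strftime` unchanged, which matches the documented fallback of using the original string as the subject.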
## Caching Thumbnails -This is an optional feature that can be turned on by activating its [feature flag](/docs/configuration/configuring-superset#feature-flags) on config: +This is an optional feature that can be turned on by activating its [feature flag](/docs/6.0.0/configuration/configuring-superset#feature-flags) in your config: ``` FEATURE_FLAGS = { diff --git a/docs/versioned_docs/version-6.0.0/configuration/configuring-superset.mdx index 845fd91e3cf..d9fb2ca41a0 100644 --- a/docs/versioned_docs/version-6.0.0/configuration/configuring-superset.mdx +++ b/docs/versioned_docs/version-6.0.0/configuration/configuring-superset.mdx @@ -37,7 +37,7 @@ ENV SUPERSET_CONFIG_PATH /app/superset_config.py ``` Docker compose deployments handle application configuration differently using specific conventions. -Refer to the [docker compose tips & configuration](/docs/installation/docker-compose#docker-compose-tips--configuration) +Refer to the [docker compose tips & configuration](/docs/6.0.0/installation/docker-compose#docker-compose-tips--configuration) for details. The following is an example of just a few of the parameters you can set in your `superset_config.py` file: @@ -246,7 +246,7 @@ flask --app "superset.app:create_app(superset_app_root='/analytics')" ### Docker builds -The [docker compose](/docs/installation/docker-compose#configuring-further) developer +The [docker compose](/docs/6.0.0/installation/docker-compose#configuring-further) developer configuration includes an additional environmental variable, [`SUPERSET_APP_ROOT`](https://github.com/apache/superset/blob/master/docker/.env), to simplify the process of setting up a non-default root path across the services. @@ -441,4 +441,4 @@ FEATURE_FLAGS = { } ```
+A current list of feature flags can be found in the [Feature Flags](/docs/6.0.0/configuration/feature-flags) documentation. diff --git a/docs/versioned_docs/version-6.0.0/configuration/databases.mdx index 2b293e79c86..5c344c70223 100644 --- a/docs/versioned_docs/version-6.0.0/configuration/databases.mdx +++ b/docs/versioned_docs/version-6.0.0/configuration/databases.mdx @@ -14,7 +14,7 @@ in your environment. You’ll need to install the required packages for the database you want to use as your metadata database as well as the packages needed to connect to the databases you want to access through Superset. For information about setting up Superset's metadata database, please refer to -installation documentations ([Docker Compose](/docs/installation/docker-compose), [Kubernetes](/docs/installation/kubernetes)) +installation documentation ([Docker Compose](/docs/6.0.0/installation/docker-compose), [Kubernetes](/docs/6.0.0/installation/kubernetes)) ::: This documentation tries to keep pointer to the different drivers for commonly used database @@ -26,7 +26,7 @@ Superset requires a Python [DB-API database driver](https://peps.python.org/pep- and a [SQLAlchemy dialect](https://docs.sqlalchemy.org/en/20/dialects/) to be installed for each database engine you want to connect to. -You can read more [here](/docs/configuration/databases#installing-drivers-in-docker-images) about how to +You can read more [here](/docs/6.0.0/configuration/databases#installing-drivers-in-docker-images) about how to install new database drivers into your Superset configuration. ### Supported Databases and Dependencies @@ -37,53 +37,53 @@ are compatible with Superset. |
Database
| PyPI package | Connection String | | --------------------------------------------------------- | ---------------------------------------------------------------------------------- | ------------------------------------------------------------------------------------------------------------------------------------------------------ | -| [AWS Athena](/docs/configuration/databases#aws-athena) | `pip install pyathena[pandas]` , `pip install PyAthenaJDBC` | `awsathena+rest://{access_key_id}:{access_key}@athena.{region}.amazonaws.com/{schema}?s3_staging_dir={s3_staging_dir}&...` | -| [AWS DynamoDB](/docs/configuration/databases#aws-dynamodb) | `pip install pydynamodb` | `dynamodb://{access_key_id}:{secret_access_key}@dynamodb.{region_name}.amazonaws.com?connector=superset` | -| [AWS Redshift](/docs/configuration/databases#aws-redshift) | `pip install sqlalchemy-redshift` | `redshift+psycopg2://:@:5439/` | -| [Apache Doris](/docs/configuration/databases#apache-doris) | `pip install pydoris` | `doris://:@:/.` | -| [Apache Drill](/docs/configuration/databases#apache-drill) | `pip install sqlalchemy-drill` | `drill+sadrill://:@:/`, often useful: `?use_ssl=True/False` | -| [Apache Druid](/docs/configuration/databases#apache-druid) | `pip install pydruid` | `druid://:@:/druid/v2/sql` | -| [Apache Hive](/docs/configuration/databases#hive) | `pip install pyhive` | `hive://hive@{hostname}:{port}/{database}` | -| [Apache Impala](/docs/configuration/databases#apache-impala) | `pip install impyla` | `impala://{hostname}:{port}/{database}` | -| [Apache Kylin](/docs/configuration/databases#apache-kylin) | `pip install kylinpy` | `kylin://:@:/?=&=` | -| [Apache Pinot](/docs/configuration/databases#apache-pinot) | `pip install pinotdb` | `pinot://BROKER:5436/query?server=http://CONTROLLER:5983/` | -| [Apache Solr](/docs/configuration/databases#apache-solr) | `pip install sqlalchemy-solr` | `solr://{username}:{password}@{hostname}:{port}/{server_path}/{collection}` | -| [Apache Spark 
SQL](/docs/configuration/databases#apache-spark-sql) | `pip install pyhive` | `hive://hive@{hostname}:{port}/{database}` | -| [Ascend.io](/docs/configuration/databases#ascendio) | `pip install impyla` | `ascend://{username}:{password}@{hostname}:{port}/{database}?auth_mechanism=PLAIN;use_ssl=true` | -| [Azure MS SQL](/docs/configuration/databases#sql-server) | `pip install pymssql` | `mssql+pymssql://UserName@presetSQL:TestPassword@presetSQL.database.windows.net:1433/TestSchema` | -| [ClickHouse](/docs/configuration/databases#clickhouse) | `pip install clickhouse-connect` | `clickhousedb://{username}:{password}@{hostname}:{port}/{database}` | -| [CockroachDB](/docs/configuration/databases#cockroachdb) | `pip install cockroachdb` | `cockroachdb://root@{hostname}:{port}/{database}?sslmode=disable` | -| [Couchbase](/docs/configuration/databases#couchbase) | `pip install couchbase-sqlalchemy` | `couchbase://{username}:{password}@{hostname}:{port}?truststorepath={ssl certificate path}` | -| [CrateDB](/docs/configuration/databases#cratedb) | `pip install sqlalchemy-cratedb` | `crate://{username}:{password}@{hostname}:{port}`, often useful: `?ssl=true/false` or `?schema=testdrive`. | -| [Denodo](/docs/configuration/databases#denodo) | `pip install denodo-sqlalchemy` | `denodo://{username}:{password}@{hostname}:{port}/{database}` | -| [Dremio](/docs/configuration/databases#dremio) | `pip install sqlalchemy_dremio` |`dremio+flight://{username}:{password}@{host}:32010`, often useful: `?UseEncryption=true/false`. 
For Legacy ODBC: `dremio+pyodbc://{username}:{password}@{host}:31010` | -| [Elasticsearch](/docs/configuration/databases#elasticsearch) | `pip install elasticsearch-dbapi` | `elasticsearch+http://{user}:{password}@{host}:9200/` | -| [Exasol](/docs/configuration/databases#exasol) | `pip install sqlalchemy-exasol` | `exa+pyodbc://{username}:{password}@{hostname}:{port}/my_schema?CONNECTIONLCALL=en_US.UTF-8&driver=EXAODBC` | -| [Google BigQuery](/docs/configuration/databases#google-bigquery) | `pip install sqlalchemy-bigquery` | `bigquery://{project_id}` | -| [Google Sheets](/docs/configuration/databases#google-sheets) | `pip install shillelagh[gsheetsapi]` | `gsheets://` | -| [Firebolt](/docs/configuration/databases#firebolt) | `pip install firebolt-sqlalchemy` | `firebolt://{client_id}:{client_secret}@{database}/{engine_name}?account_name={name}` | -| [Hologres](/docs/configuration/databases#hologres) | `pip install psycopg2` | `postgresql+psycopg2://:@/` | -| [IBM Db2](/docs/configuration/databases#ibm-db2) | `pip install ibm_db_sa` | `db2+ibm_db://` | -| [IBM Netezza Performance Server](/docs/configuration/databases#ibm-netezza-performance-server) | `pip install nzalchemy` | `netezza+nzpy://:@/` | -| [MySQL](/docs/configuration/databases#mysql) | `pip install mysqlclient` | `mysql://:@/` | -| [OceanBase](/docs/configuration/databases#oceanbase) | `pip install oceanbase_py` | `oceanbase://:@/` | -| [Oracle](/docs/configuration/databases#oracle) | `pip install cx_Oracle` | `oracle://:@:` | -| [Parseable](/docs/configuration/databases#parseable) | `pip install sqlalchemy-parseable` | `parseable://:@/` | -| [PostgreSQL](/docs/configuration/databases#postgres) | `pip install psycopg2` | `postgresql://:@/` | -| [Presto](/docs/configuration/databases#presto) | `pip install pyhive` | `presto://{username}:{password}@{hostname}:{port}/{database}` | -| [SAP Hana](/docs/configuration/databases#hana) | `pip install hdbcli sqlalchemy-hana` or `pip install apache_superset[hana]` 
| `hana://{username}:{password}@{host}:{port}` | -| [SingleStore](/docs/configuration/databases#singlestore) | `pip install sqlalchemy-singlestoredb` | `singlestoredb://{username}:{password}@{host}:{port}/{database}` | -| [StarRocks](/docs/configuration/databases#starrocks) | `pip install starrocks` | `starrocks://:@:/.` | -| [Snowflake](/docs/configuration/databases#snowflake) | `pip install snowflake-sqlalchemy` | `snowflake://{user}:{password}@{account}.{region}/{database}?role={role}&warehouse={warehouse}` | +| [AWS Athena](/docs/6.0.0/configuration/databases#aws-athena) | `pip install pyathena[pandas]` , `pip install PyAthenaJDBC` | `awsathena+rest://{access_key_id}:{access_key}@athena.{region}.amazonaws.com/{schema}?s3_staging_dir={s3_staging_dir}&...` | +| [AWS DynamoDB](/docs/6.0.0/configuration/databases#aws-dynamodb) | `pip install pydynamodb` | `dynamodb://{access_key_id}:{secret_access_key}@dynamodb.{region_name}.amazonaws.com?connector=superset` | +| [AWS Redshift](/docs/6.0.0/configuration/databases#aws-redshift) | `pip install sqlalchemy-redshift` | `redshift+psycopg2://:@:5439/` | +| [Apache Doris](/docs/6.0.0/configuration/databases#apache-doris) | `pip install pydoris` | `doris://:@:/.` | +| [Apache Drill](/docs/6.0.0/configuration/databases#apache-drill) | `pip install sqlalchemy-drill` | `drill+sadrill://:@:/`, often useful: `?use_ssl=True/False` | +| [Apache Druid](/docs/6.0.0/configuration/databases#apache-druid) | `pip install pydruid` | `druid://:@:/druid/v2/sql` | +| [Apache Hive](/docs/6.0.0/configuration/databases#hive) | `pip install pyhive` | `hive://hive@{hostname}:{port}/{database}` | +| [Apache Impala](/docs/6.0.0/configuration/databases#apache-impala) | `pip install impyla` | `impala://{hostname}:{port}/{database}` | +| [Apache Kylin](/docs/6.0.0/configuration/databases#apache-kylin) | `pip install kylinpy` | `kylin://:@:/?=&=` | +| [Apache Pinot](/docs/6.0.0/configuration/databases#apache-pinot) | `pip install pinotdb` | 
`pinot://BROKER:5436/query?server=http://CONTROLLER:5983/` | +| [Apache Solr](/docs/6.0.0/configuration/databases#apache-solr) | `pip install sqlalchemy-solr` | `solr://{username}:{password}@{hostname}:{port}/{server_path}/{collection}` | +| [Apache Spark SQL](/docs/6.0.0/configuration/databases#apache-spark-sql) | `pip install pyhive` | `hive://hive@{hostname}:{port}/{database}` | +| [Ascend.io](/docs/6.0.0/configuration/databases#ascendio) | `pip install impyla` | `ascend://{username}:{password}@{hostname}:{port}/{database}?auth_mechanism=PLAIN;use_ssl=true` | +| [Azure MS SQL](/docs/6.0.0/configuration/databases#sql-server) | `pip install pymssql` | `mssql+pymssql://UserName@presetSQL:TestPassword@presetSQL.database.windows.net:1433/TestSchema` | +| [ClickHouse](/docs/6.0.0/configuration/databases#clickhouse) | `pip install clickhouse-connect` | `clickhousedb://{username}:{password}@{hostname}:{port}/{database}` | +| [CockroachDB](/docs/6.0.0/configuration/databases#cockroachdb) | `pip install cockroachdb` | `cockroachdb://root@{hostname}:{port}/{database}?sslmode=disable` | +| [Couchbase](/docs/6.0.0/configuration/databases#couchbase) | `pip install couchbase-sqlalchemy` | `couchbase://{username}:{password}@{hostname}:{port}?truststorepath={ssl certificate path}` | +| [CrateDB](/docs/6.0.0/configuration/databases#cratedb) | `pip install sqlalchemy-cratedb` | `crate://{username}:{password}@{hostname}:{port}`, often useful: `?ssl=true/false` or `?schema=testdrive`. | +| [Denodo](/docs/6.0.0/configuration/databases#denodo) | `pip install denodo-sqlalchemy` | `denodo://{username}:{password}@{hostname}:{port}/{database}` | +| [Dremio](/docs/6.0.0/configuration/databases#dremio) | `pip install sqlalchemy_dremio` |`dremio+flight://{username}:{password}@{host}:32010`, often useful: `?UseEncryption=true/false`. 
For Legacy ODBC: `dremio+pyodbc://{username}:{password}@{host}:31010` | +| [Elasticsearch](/docs/6.0.0/configuration/databases#elasticsearch) | `pip install elasticsearch-dbapi` | `elasticsearch+http://{user}:{password}@{host}:9200/` | +| [Exasol](/docs/6.0.0/configuration/databases#exasol) | `pip install sqlalchemy-exasol` | `exa+pyodbc://{username}:{password}@{hostname}:{port}/my_schema?CONNECTIONLCALL=en_US.UTF-8&driver=EXAODBC` | +| [Google BigQuery](/docs/6.0.0/configuration/databases#google-bigquery) | `pip install sqlalchemy-bigquery` | `bigquery://{project_id}` | +| [Google Sheets](/docs/6.0.0/configuration/databases#google-sheets) | `pip install shillelagh[gsheetsapi]` | `gsheets://` | +| [Firebolt](/docs/6.0.0/configuration/databases#firebolt) | `pip install firebolt-sqlalchemy` | `firebolt://{client_id}:{client_secret}@{database}/{engine_name}?account_name={name}` | +| [Hologres](/docs/6.0.0/configuration/databases#hologres) | `pip install psycopg2` | `postgresql+psycopg2://:@/` | +| [IBM Db2](/docs/6.0.0/configuration/databases#ibm-db2) | `pip install ibm_db_sa` | `db2+ibm_db://` | +| [IBM Netezza Performance Server](/docs/6.0.0/configuration/databases#ibm-netezza-performance-server) | `pip install nzalchemy` | `netezza+nzpy://:@/` | +| [MySQL](/docs/6.0.0/configuration/databases#mysql) | `pip install mysqlclient` | `mysql://:@/` | +| [OceanBase](/docs/6.0.0/configuration/databases#oceanbase) | `pip install oceanbase_py` | `oceanbase://:@/` | +| [Oracle](/docs/6.0.0/configuration/databases#oracle) | `pip install cx_Oracle` | `oracle://:@:` | +| [Parseable](/docs/6.0.0/configuration/databases#parseable) | `pip install sqlalchemy-parseable` | `parseable://:@/` | +| [PostgreSQL](/docs/6.0.0/configuration/databases#postgres) | `pip install psycopg2` | `postgresql://:@/` | +| [Presto](/docs/6.0.0/configuration/databases#presto) | `pip install pyhive` | `presto://{username}:{password}@{hostname}:{port}/{database}` | +| [SAP 
Hana](/docs/6.0.0/configuration/databases#hana) | `pip install hdbcli sqlalchemy-hana` or `pip install apache_superset[hana]` | `hana://{username}:{password}@{host}:{port}` | +| [SingleStore](/docs/6.0.0/configuration/databases#singlestore) | `pip install sqlalchemy-singlestoredb` | `singlestoredb://{username}:{password}@{host}:{port}/{database}` | +| [StarRocks](/docs/6.0.0/configuration/databases#starrocks) | `pip install starrocks` | `starrocks://:@:/.` | +| [Snowflake](/docs/6.0.0/configuration/databases#snowflake) | `pip install snowflake-sqlalchemy` | `snowflake://{user}:{password}@{account}.{region}/{database}?role={role}&warehouse={warehouse}` | | SQLite | No additional library needed | `sqlite://path/to/file.db?check_same_thread=false` | -| [SQL Server](/docs/configuration/databases#sql-server) | `pip install pymssql` | `mssql+pymssql://:@:/` | -| [TDengine](/docs/configuration/databases#tdengine) | `pip install taospy` `pip install taos-ws-py` | `taosws://:@:` | -| [Teradata](/docs/configuration/databases#teradata) | `pip install teradatasqlalchemy` | `teradatasql://{user}:{password}@{host}` | -| [TimescaleDB](/docs/configuration/databases#timescaledb) | `pip install psycopg2` | `postgresql://:@:/` | -| [Trino](/docs/configuration/databases#trino) | `pip install trino` | `trino://{username}:{password}@{hostname}:{port}/{catalog}` | -| [Vertica](/docs/configuration/databases#vertica) | `pip install sqlalchemy-vertica-python` | `vertica+vertica_python://:@/` | -| [YDB](/docs/configuration/databases#ydb) | `pip install ydb-sqlalchemy` | `ydb://{host}:{port}/{database_name}` | -| [YugabyteDB](/docs/configuration/databases#yugabytedb) | `pip install psycopg2` | `postgresql://:@/` | +| [SQL Server](/docs/6.0.0/configuration/databases#sql-server) | `pip install pymssql` | `mssql+pymssql://:@:/` | +| [TDengine](/docs/6.0.0/configuration/databases#tdengine) | `pip install taospy` `pip install taos-ws-py` | `taosws://:@:` | +| 
[Teradata](/docs/6.0.0/configuration/databases#teradata) | `pip install teradatasqlalchemy` | `teradatasql://{user}:{password}@{host}` | +| [TimescaleDB](/docs/6.0.0/configuration/databases#timescaledb) | `pip install psycopg2` | `postgresql://:@:/` | +| [Trino](/docs/6.0.0/configuration/databases#trino) | `pip install trino` | `trino://{username}:{password}@{hostname}:{port}/{catalog}` | +| [Vertica](/docs/6.0.0/configuration/databases#vertica) | `pip install sqlalchemy-vertica-python` | `vertica+vertica_python://:@/` | +| [YDB](/docs/6.0.0/configuration/databases#ydb) | `pip install ydb-sqlalchemy` | `ydb://{host}:{port}/{database_name}` | +| [YugabyteDB](/docs/6.0.0/configuration/databases#yugabytedb) | `pip install psycopg2` | `postgresql://:@/` | --- @@ -109,7 +109,7 @@ The connector library installation process is the same for all additional librar #### 1. Determine the driver you need -Consult the [list of database drivers](/docs/configuration/databases) +Consult the [list of database drivers](/docs/6.0.0/configuration/databases) and find the PyPI package needed to connect to your database. In this example, we're connecting to a MySQL database, so we'll need the `mysqlclient` connector library. @@ -165,11 +165,11 @@ to your database via the Superset web UI. As an admin user, go to Settings -> Data: Database Connections and click the +DATABASE button. From there, follow the steps on the -[Using Database Connection UI page](/docs/configuration/databases#connecting-through-the-ui). +[Using Database Connection UI page](/docs/6.0.0/configuration/databases#connecting-through-the-ui). Consult the page for your specific database type in the Superset documentation to determine the connection string and any other parameters you need to input. 
For instance, -on the [MySQL page](/docs/configuration/databases#mysql), we see that the connection string +on the [MySQL page](/docs/6.0.0/configuration/databases#mysql), we see that the connection string to a local MySQL database differs depending on whether the setup is running on Linux or Mac. Click the “Test Connection” button, which should result in a popup message saying, @@ -407,7 +407,7 @@ this: crate://:@.cratedb.net:4200/?ssl=true ``` -Follow the steps [here](/docs/configuration/databases#installing-database-drivers) +Follow the steps [here](/docs/6.0.0/configuration/databases#installing-database-drivers) to install the CrateDB connector package when setting up Superset locally using Docker Compose. @@ -782,7 +782,7 @@ The recommended connector library for BigQuery is ##### Install BigQuery Driver -Follow the steps [here](/docs/configuration/databases#installing-drivers-in-docker-images) about how to +Follow the steps [here](/docs/6.0.0/configuration/databases#installing-drivers-in-docker-images) about how to install new database drivers when setting up Superset locally via docker compose. ```bash @@ -1177,7 +1177,7 @@ risingwave://root@{hostname}:{port}/{database}?sslmode=disable ##### Install Snowflake Driver -Follow the steps [here](/docs/configuration/databases#installing-database-drivers) about how to +Follow the steps [here](/docs/6.0.0/configuration/databases#installing-database-drivers) about how to install new database drivers when setting up Superset locally via docker compose. ```bash diff --git a/docs/versioned_docs/version-6.0.0/configuration/networking-settings.mdx b/docs/versioned_docs/version-6.0.0/configuration/networking-settings.mdx index 020071ed91e..d26d87382a3 100644 --- a/docs/versioned_docs/version-6.0.0/configuration/networking-settings.mdx +++ b/docs/versioned_docs/version-6.0.0/configuration/networking-settings.mdx @@ -51,7 +51,7 @@ Restart Superset for this configuration change to take effect. 
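The SQLAlchemy URIs referenced throughout the databases documentation all share the same `dialect+driver://user:password@host:port/database` anatomy; a small stdlib sketch (with made-up credentials) shows how the pieces decompose before you paste a string into the UI:

```python
from urllib.parse import urlsplit

# Hypothetical MySQL SQLAlchemy URI; parsing it does not open a connection
uri = "mysql://analyst:s3cret@localhost:3306/sales"
parts = urlsplit(uri)

print(parts.scheme)    # mysql      (the SQLAlchemy dialect)
print(parts.username)  # analyst
print(parts.hostname)  # localhost
print(parts.port)      # 3306
print(parts.path)      # /sales     (database name, with leading slash)
```

This is only a sanity check on the URI's shape; the actual connection is validated by the "Test Connection" button, which exercises the installed driver.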
#### Making a Dashboard Public -1. Add the `'DASHBOARD_RBAC': True` [Feature Flag](/docs/configuration/feature-flags) to `superset_config.py` +1. Add the `'DASHBOARD_RBAC': True` [Feature Flag](/docs/6.0.0/configuration/feature-flags) to `superset_config.py` 2. Add the `Public` role to your dashboard as described [here](https://superset.apache.org/docs/using-superset/creating-your-first-dashboard/#manage-access-to-dashboards) #### Embedding a Public Dashboard diff --git a/docs/versioned_docs/version-6.0.0/configuration/sql-templating.mdx b/docs/versioned_docs/version-6.0.0/configuration/sql-templating.mdx index 09004ba6fb5..9af4f61dbbe 100644 --- a/docs/versioned_docs/version-6.0.0/configuration/sql-templating.mdx +++ b/docs/versioned_docs/version-6.0.0/configuration/sql-templating.mdx @@ -10,7 +10,7 @@ version: 1 ## Jinja Templates SQL Lab and Explore supports [Jinja templating](https://jinja.palletsprojects.com/en/2.11.x/) in queries. -To enable templating, the `ENABLE_TEMPLATE_PROCESSING` [feature flag](/docs/configuration/configuring-superset#feature-flags) needs to be enabled in +To enable templating, the `ENABLE_TEMPLATE_PROCESSING` [feature flag](/docs/6.0.0/configuration/configuring-superset#feature-flags) needs to be enabled in `superset_config.py`. When templating is enabled, python code can be embedded in virtual datasets and in Custom SQL in the filter and metric controls in Explore. 
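Flags like `DASHBOARD_RBAC` and `ENABLE_TEMPLATE_PROCESSING` above are both enabled through the same `FEATURE_FLAGS` dictionary; a minimal `superset_config.py` fragment (placement assumed, merge with any flags you already define rather than overwriting them):

```python
# superset_config.py — minimal sketch; merge into your existing FEATURE_FLAGS
# dict instead of replacing it
FEATURE_FLAGS = {
    "ENABLE_TEMPLATE_PROCESSING": True,
}
```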
By default, the following variables are made available in the Jinja context: diff --git a/docs/versioned_docs/version-6.0.0/configuration/timezones.mdx b/docs/versioned_docs/version-6.0.0/configuration/timezones.mdx index 233e4786fcd..d27901970dd 100644 --- a/docs/versioned_docs/version-6.0.0/configuration/timezones.mdx +++ b/docs/versioned_docs/version-6.0.0/configuration/timezones.mdx @@ -20,7 +20,7 @@ To help make the problem somewhat tractable—given that Apache Superset has no To strive for data consistency (regardless of the timezone of the client) the Apache Superset backend tries to ensure that any timestamp sent to the client has an explicit (or semi-explicit as in the case with [Epoch time](https://en.wikipedia.org/wiki/Unix_time) which is always in reference to UTC) timezone encoded within. -The challenge however lies with the slew of [database engines](/docs/configuration/databases#installing-drivers-in-docker-images) which Apache Superset supports and various inconsistencies between their [Python Database API (DB-API)](https://www.python.org/dev/peps/pep-0249/) implementations combined with the fact that we use [Pandas](https://pandas.pydata.org/) to read SQL into a DataFrame prior to serializing to JSON. Regrettably Pandas ignores the DB-API [type_code](https://www.python.org/dev/peps/pep-0249/#type-objects) relying by default on the underlying Python type returned by the DB-API. Currently only a subset of the supported database engines work correctly with Pandas, i.e., ensuring timestamps without an explicit timestamp are serializd to JSON with the server timezone, thus guaranteeing the client will display timestamps in a consistent manner irrespective of the client's timezone. 
+The challenge however lies with the slew of [database engines](/docs/6.0.0/configuration/databases#installing-drivers-in-docker-images) which Apache Superset supports and various inconsistencies between their [Python Database API (DB-API)](https://www.python.org/dev/peps/pep-0249/) implementations combined with the fact that we use [Pandas](https://pandas.pydata.org/) to read SQL into a DataFrame prior to serializing to JSON. Regrettably Pandas ignores the DB-API [type_code](https://www.python.org/dev/peps/pep-0249/#type-objects) relying by default on the underlying Python type returned by the DB-API. Currently only a subset of the supported database engines work correctly with Pandas, i.e., ensuring timestamps without an explicit timezone are serialized to JSON with the server timezone, thus guaranteeing the client will display timestamps in a consistent manner irrespective of the client's timezone.
For example the following is a comparison of MySQL and Presto,
diff --git a/docs/versioned_docs/version-6.0.0/contributing/contributing.mdx b/docs/versioned_docs/version-6.0.0/contributing/contributing.mdx
index 109a3692df9..c151eb28b7c 100644
--- a/docs/versioned_docs/version-6.0.0/contributing/contributing.mdx
+++ b/docs/versioned_docs/version-6.0.0/contributing/contributing.mdx
@@ -77,7 +77,7 @@ Look through the GitHub issues. Issues tagged with
Superset could always use better documentation, whether as part of the official Superset docs, in docstrings, `docs/*.rst` or even on the web as blog posts or
-articles. See [Documentation](/docs/contributing/howtos#contributing-to-documentation) for more details.
+articles. See [Documentation](/docs/6.0.0/contributing/howtos#contributing-to-documentation) for more details.
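The serialization concern in the timezones section above can be seen with plain Python datetimes: a timezone-aware value carries an explicit offset when rendered, while a naive one does not. This is a stdlib sketch only; Superset's actual path goes through Pandas and the DB-API as described.

```python
from datetime import datetime, timezone

naive = datetime(2024, 1, 1, 12, 0)                       # no timezone attached
aware = datetime(2024, 1, 1, 12, 0, tzinfo=timezone.utc)  # explicit UTC

# Only the aware value encodes its timezone when serialized.
print(naive.isoformat())  # 2024-01-01T12:00:00
print(aware.isoformat())  # 2024-01-01T12:00:00+00:00
```

A client receiving the first string has to guess which timezone it is in; the second is unambiguous regardless of the client's locale, which is exactly the consistency property the backend tries to guarantee.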
### Add Translations diff --git a/docs/versioned_docs/version-6.0.0/contributing/development.mdx b/docs/versioned_docs/version-6.0.0/contributing/development.mdx index 8e6822adc5b..afe11139957 100644 --- a/docs/versioned_docs/version-6.0.0/contributing/development.mdx +++ b/docs/versioned_docs/version-6.0.0/contributing/development.mdx @@ -599,7 +599,7 @@ export enum FeatureFlag { those specified under FEATURE_FLAGS in `superset_config.py`. For example, `DEFAULT_FEATURE_FLAGS = { 'FOO': True, 'BAR': False }` in `superset/config.py` and `FEATURE_FLAGS = { 'BAR': True, 'BAZ': True }` in `superset_config.py` will result in combined feature flags of `{ 'FOO': True, 'BAR': True, 'BAZ': True }`. -The current status of the usability of each flag (stable vs testing, etc) can be found in the [Feature Flags](/docs/configuration/feature-flags) documentation. +The current status of the usability of each flag (stable vs testing, etc) can be found in the [Feature Flags](/docs/6.0.0/configuration/feature-flags) documentation. ## Git Hooks @@ -614,7 +614,7 @@ A series of checks will now run when you make a git commit. ## Linting -See [how tos](/docs/contributing/howtos#linting) +See [how tos](/docs/6.0.0/contributing/howtos#linting) ## GitHub Actions and `act` diff --git a/docs/versioned_docs/version-6.0.0/contributing/guidelines.mdx b/docs/versioned_docs/version-6.0.0/contributing/guidelines.mdx index 1ba1e6af93c..26861553469 100644 --- a/docs/versioned_docs/version-6.0.0/contributing/guidelines.mdx +++ b/docs/versioned_docs/version-6.0.0/contributing/guidelines.mdx @@ -57,7 +57,7 @@ Finally, never submit a PR that will put master branch in broken state. If the P in `requirements.txt` pinned to a specific version which ensures that the application build is deterministic. - For TypeScript/JavaScript, include new libraries in `package.json` -- **Tests:** The pull request should include tests, either as doctests, unit tests, or both. 
Make sure to resolve all errors and test failures. See [Testing](/docs/contributing/howtos#testing) for how to run tests. +- **Tests:** The pull request should include tests, either as doctests, unit tests, or both. Make sure to resolve all errors and test failures. See [Testing](/docs/6.0.0/contributing/howtos#testing) for how to run tests. - **Documentation:** If the pull request adds functionality, the docs should be updated as part of the same PR. - **CI:** Reviewers will not review the code until all CI tests are passed. Sometimes there can be flaky tests. You can close and open PR to re-run CI test. Please report if the issue persists. After the CI fix has been deployed to `master`, please rebase your PR. - **Code coverage:** Please ensure that code coverage does not decrease. diff --git a/docs/versioned_docs/version-6.0.0/faq.mdx b/docs/versioned_docs/version-6.0.0/faq.mdx index d168eacde24..10c6b267d1e 100644 --- a/docs/versioned_docs/version-6.0.0/faq.mdx +++ b/docs/versioned_docs/version-6.0.0/faq.mdx @@ -51,7 +51,7 @@ multiple tables as long as your database account has access to the tables. ## How do I create my own visualization? We recommend reading the instructions in -[Creating Visualization Plugins](/docs/contributing/howtos#creating-visualization-plugins). +[Creating Visualization Plugins](/docs/6.0.0/contributing/howtos#creating-visualization-plugins). ## Can I upload and visualize CSV data? @@ -142,7 +142,7 @@ SQLALCHEMY_DATABASE_URI = 'sqlite:////new/location/superset.db?check_same_thread ``` You can read more about customizing Superset using the configuration file -[here](/docs/configuration/configuring-superset). +[here](/docs/6.0.0/configuration/configuring-superset). ## What if the table schema changed? 
@@ -157,7 +157,7 @@ table afterwards to configure the Columns tab, check the appropriate boxes and s To clarify, the database backend is an OLTP database used by Superset to store its internal information like your list of users and dashboard definitions. While Superset supports a -[variety of databases as data _sources_](/docs/configuration/databases#installing-database-drivers), +[variety of databases as data _sources_](/docs/6.0.0/configuration/databases#installing-database-drivers), only a few database engines are supported for use as the OLTP backend / metadata store. Superset is tested using MySQL, PostgreSQL, and SQLite backends. It’s recommended you install @@ -190,7 +190,7 @@ second etc). Example: ## Does Superset work with [insert database engine here]? -The [Connecting to Databases section](/docs/configuration/databases) provides the best +The [Connecting to Databases section](/docs/6.0.0/configuration/databases) provides the best overview for supported databases. Database engines not listed on that page may work too. We rely on the community to contribute to this knowledge base. @@ -266,7 +266,7 @@ Superset uses [Scarf](https://about.scarf.sh/) by default to collect basic telem We use the [Scarf Gateway](https://docs.scarf.sh/gateway/) to sit in front of container registries, the [scarf-js](https://about.scarf.sh/package-sdks) package to track `npm` installations, and a Scarf pixel to gather anonymous analytics on Superset page views. Scarf purges PII and provides aggregated statistics. Superset users can easily opt out of analytics in various ways documented [here](https://docs.scarf.sh/gateway/#do-not-track) and [here](https://docs.scarf.sh/package-analytics/#as-a-user-of-a-package-using-scarf-js-how-can-i-opt-out-of-analytics). Superset maintainers can also opt out of telemetry data collection by setting the `SCARF_ANALYTICS` environment variable to `false` in the Superset container (or anywhere Superset/webpack are run). 
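The `SCARF_ANALYTICS` opt-out described above is just an environment variable read where Superset/webpack run. As a sketch, setting it from Python before Superset starts is equivalent to `export SCARF_ANALYTICS=false` in the container's shell:

```python
import os

# Opt out of Scarf telemetry; per the FAQ, Superset/webpack check this
# variable wherever they run.
os.environ["SCARF_ANALYTICS"] = "false"
print(os.environ["SCARF_ANALYTICS"])  # false
```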
-Additional opt-out instructions for Docker users are available on the [Docker Installation](/docs/installation/docker-compose) page. +Additional opt-out instructions for Docker users are available on the [Docker Installation](/docs/6.0.0/installation/docker-compose) page. ## Does Superset have an archive panel or trash bin from which a user can recover deleted assets? diff --git a/docs/versioned_docs/version-6.0.0/installation/architecture.mdx b/docs/versioned_docs/version-6.0.0/installation/architecture.mdx index 7830d5e078f..427fc0a4e5b 100644 --- a/docs/versioned_docs/version-6.0.0/installation/architecture.mdx +++ b/docs/versioned_docs/version-6.0.0/installation/architecture.mdx @@ -24,10 +24,10 @@ A Superset installation is made up of these components: The optional components above are necessary to enable these features: -- [Alerts and Reports](/docs/configuration/alerts-reports) -- [Caching](/docs/configuration/cache) -- [Async Queries](/docs/configuration/async-queries-celery/) -- [Dashboard Thumbnails](/docs/configuration/cache/#caching-thumbnails) +- [Alerts and Reports](/docs/6.0.0/configuration/alerts-reports) +- [Caching](/docs/6.0.0/configuration/cache) +- [Async Queries](/docs/6.0.0/configuration/async-queries-celery/) +- [Dashboard Thumbnails](/docs/6.0.0/configuration/cache/#caching-thumbnails) If you install with Kubernetes or Docker Compose, all of these components will be created. @@ -59,7 +59,7 @@ The caching layer serves two main functions: - Store the results of queries to your data warehouse so that when a chart is loaded twice, it pulls from the cache the second time, speeding up the application and reducing load on your data warehouse. - Act as a message broker for the worker, enabling the Alerts & Reports, async queries, and thumbnail caching features. -Most people use Redis for their cache, but Superset supports other options too. See the [cache docs](/docs/configuration/cache/) for more. 
+Most people use Redis for their cache, but Superset supports other options too. See the [cache docs](/docs/6.0.0/configuration/cache/) for more. ### Worker and Beat @@ -67,6 +67,6 @@ This is one or more workers who execute tasks like run async queries or take sna ## Other components -Other components can be incorporated into Superset. The best place to learn about additional configurations is the [Configuration page](/docs/configuration/configuring-superset). For instance, you could set up a load balancer or reverse proxy to implement HTTPS in front of your Superset application, or specify a Mapbox URL to enable geospatial charts, etc. +Other components can be incorporated into Superset. The best place to learn about additional configurations is the [Configuration page](/docs/6.0.0/configuration/configuring-superset). For instance, you could set up a load balancer or reverse proxy to implement HTTPS in front of your Superset application, or specify a Mapbox URL to enable geospatial charts, etc. Superset won't even start without certain configuration settings established, so it's essential to review that page. diff --git a/docs/versioned_docs/version-6.0.0/installation/installation-methods.mdx b/docs/versioned_docs/version-6.0.0/installation/installation-methods.mdx index 51f3708b3ef..17f7c4f35f9 100644 --- a/docs/versioned_docs/version-6.0.0/installation/installation-methods.mdx +++ b/docs/versioned_docs/version-6.0.0/installation/installation-methods.mdx @@ -9,11 +9,11 @@ import useBaseUrl from "@docusaurus/useBaseUrl"; # Installation Methods -How should you install Superset? Here's a comparison of the different options. It will help if you've first read the [Architecture](/docs/installation/architecture.mdx) page to understand Superset's different components. +How should you install Superset? Here's a comparison of the different options. 
It will help if you've first read the [Architecture](/docs/6.0.0/installation/architecture) page to understand Superset's different components.
The fundamental trade-off is between you needing to do more of the detail work yourself vs. using a more complex deployment route that handles those details.
-## [Docker Compose](/docs/installation/docker-compose.mdx)
+## [Docker Compose](/docs/6.0.0/installation/docker-compose)
**Summary:** This takes advantage of containerization while remaining simpler than Kubernetes. This is the best way to try out Superset; it's also useful for developing & contributing back to Superset.
@@ -27,9 +27,9 @@ You will need to back up your metadata DB. That could mean backing up the servic
You will also need to extend the Superset docker image. The default `lean` images do not contain drivers needed to access your metadata database (Postgres or MySQL), nor to access your data warehouse, nor the headless browser needed for Alerts & Reports. You could run a `-dev` image while demoing Superset, which has some of this, but you'll still need to install the driver for your data warehouse. The `-dev` images run as root, which is not recommended for production.
-Ideally you will build your own image of Superset that extends `lean`, adding what your deployment needs. See [Building your own production Docker image](/docs/installation/docker-builds/#building-your-own-production-docker-image).
+Ideally you will build your own image of Superset that extends `lean`, adding what your deployment needs. See [Building your own production Docker image](/docs/6.0.0/installation/docker-builds/#building-your-own-production-docker-image).
-## [Kubernetes (K8s)](/docs/installation/kubernetes.mdx)
+## [Kubernetes (K8s)](/docs/6.0.0/installation/kubernetes)
**Summary:** This is the best-practice way to deploy a production instance of Superset, but has the steepest skill requirement - someone who knows Kubernetes.
@@ -41,7 +41,7 @@ A K8s deployment can scale up and down based on usage and deploy rolling updates
You will need to build your own Docker image, and back up your metadata DB, both as described in Docker Compose above. You'll also need to customize your Helm chart values and deploy and maintain your Kubernetes cluster.
-## [PyPI (Python)](/docs/installation/pypi.mdx)
+## [PyPI (Python)](/docs/6.0.0/installation/pypi)
**Summary:** This is the only method that requires no knowledge of containers. It requires the most hands-on work to deploy, connect, and maintain each component.
diff --git a/docs/versioned_docs/version-6.0.0/installation/kubernetes.mdx b/docs/versioned_docs/version-6.0.0/installation/kubernetes.mdx
index 9f515a63010..cdb0cccddd7 100644
--- a/docs/versioned_docs/version-6.0.0/installation/kubernetes.mdx
+++ b/docs/versioned_docs/version-6.0.0/installation/kubernetes.mdx
@@ -149,7 +149,7 @@ For production clusters it's recommended to build own image with this step done
Superset requires a Python DB-API database driver and a SQLAlchemy dialect to be installed for each datastore you want to connect to.
-See [Install Database Drivers](/docs/configuration/databases) for more information.
+See [Install Database Drivers](/docs/6.0.0/configuration/databases) for more information.
It is recommended that you refer to versions listed in [pyproject.toml](https://github.com/apache/superset/blob/master/pyproject.toml) instead of hard-coding them in your bootstrap script, as seen below.
@@ -310,7 +310,7 @@ configOverrides: ### Enable Alerts and Reports -For this, as per the [Alerts and Reports doc](/docs/configuration/alerts-reports), you will need to: +For this, as per the [Alerts and Reports doc](/docs/6.0.0/configuration/alerts-reports), you will need to: #### Install a supported webdriver in the Celery worker diff --git a/docs/versioned_docs/version-6.0.0/intro.md b/docs/versioned_docs/version-6.0.0/intro.md index 9ef45fc9947..841ee0f7e55 100644 --- a/docs/versioned_docs/version-6.0.0/intro.md +++ b/docs/versioned_docs/version-6.0.0/intro.md @@ -165,14 +165,14 @@ Try out Superset's [quickstart](https://superset.apache.org/docs/quickstart/) gu ## Contributor Guide Interested in contributing? Check out our -[Developer Portal](https://superset.apache.org/developer_portal/) +[Developer Docs](https://superset.apache.org/developer-docs/) to find resources around contributing along with a detailed guide on how to set up a development environment. ## Resources - [Superset "In the Wild"](https://github.com/apache/superset/blob/master/RESOURCES/INTHEWILD.md) - open a PR to add your org to the list! -- [Feature Flags](/docs/configuration/feature-flags) - the status of Superset's Feature Flags. +- [Feature Flags](/docs/6.0.0/configuration/feature-flags) - the status of Superset's Feature Flags. - [Standard Roles](https://github.com/apache/superset/blob/master/RESOURCES/STANDARD_ROLES.md) - How RBAC permissions map to roles. - [Superset Wiki](https://github.com/apache/superset/wiki) - Tons of additional community resources: best practices, community content and other information. - [Superset SIPs](https://github.com/orgs/apache/projects/170) - The status of Superset's SIPs (Superset Improvement Proposals) for both consensus and implementation status. 
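The Feature Flags resource linked above documents flags whose defaults in `superset/config.py` are merged with overrides from `superset_config.py`, the override winning on conflicts. A minimal sketch of that merge, reusing the FOO/BAR/BAZ example from the development docs:

```python
# Merge semantics described in the development docs: deployment
# overrides (FEATURE_FLAGS) win over shipped defaults.
DEFAULT_FEATURE_FLAGS = {"FOO": True, "BAR": False}  # from superset/config.py
FEATURE_FLAGS = {"BAR": True, "BAZ": True}           # from superset_config.py

combined = {**DEFAULT_FEATURE_FLAGS, **FEATURE_FLAGS}
print(combined)  # {'FOO': True, 'BAR': True, 'BAZ': True}
```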
diff --git a/docs/versioned_docs/version-6.0.0/quickstart.mdx b/docs/versioned_docs/version-6.0.0/quickstart.mdx index bbd4e5d0d44..640ebe5794b 100644 --- a/docs/versioned_docs/version-6.0.0/quickstart.mdx +++ b/docs/versioned_docs/version-6.0.0/quickstart.mdx @@ -15,7 +15,7 @@ Although we recommend using `Docker Compose` for a quick start in a sandbox-type environment and for other development-type use cases, **we do not recommend this setup for production**. For this purpose please refer to our -[Installing on Kubernetes](/docs/installation/kubernetes/) +[Installing on Kubernetes](/docs/6.0.0/installation/kubernetes/) page. ::: @@ -73,10 +73,10 @@ processes by running Docker Compose `stop` command. By doing so, you can avoid d From this point on, you can head on to: -- [Create your first Dashboard](/docs/using-superset/creating-your-first-dashboard) -- [Connect to a Database](/docs/configuration/databases) -- [Using Docker Compose](/docs/installation/docker-compose) -- [Configure Superset](/docs/configuration/configuring-superset/) -- [Installing on Kubernetes](/docs/installation/kubernetes/) +- [Create your first Dashboard](/docs/6.0.0/using-superset/creating-your-first-dashboard) +- [Connect to a Database](/docs/6.0.0/configuration/databases) +- [Using Docker Compose](/docs/6.0.0/installation/docker-compose) +- [Configure Superset](/docs/6.0.0/configuration/configuring-superset/) +- [Installing on Kubernetes](/docs/6.0.0/installation/kubernetes/) Or just explore our [Documentation](https://superset.apache.org/docs/intro)! 
diff --git a/docs/versioned_docs/version-6.0.0/using-superset/creating-your-first-dashboard.mdx b/docs/versioned_docs/version-6.0.0/using-superset/creating-your-first-dashboard.mdx index a976e4b60dd..0ad727b9181 100644 --- a/docs/versioned_docs/version-6.0.0/using-superset/creating-your-first-dashboard.mdx +++ b/docs/versioned_docs/version-6.0.0/using-superset/creating-your-first-dashboard.mdx @@ -31,7 +31,7 @@ your existing SQL-speaking database or data store. First things first, we need to add the connection credentials to your database to be able to query and visualize data from it. If you're using Superset locally via -[Docker compose](/docs/installation/docker-compose), you can +[Docker compose](/docs/6.0.0/installation/docker-compose), you can skip this step because a Postgres database, named **examples**, is included and pre-configured in Superset for you. @@ -188,7 +188,7 @@ Access to dashboards is managed via owners (users that have edit permissions to Non-owner users access can be managed in two different ways. The dashboard needs to be published to be visible to other users. 1. Dataset permissions - if you add to the relevant role permissions to datasets it automatically grants implicit access to all dashboards that uses those permitted datasets. -2. Dashboard roles - if you enable [**DASHBOARD_RBAC** feature flag](/docs/configuration/configuring-superset#feature-flags) then you will be able to manage which roles can access the dashboard +2. Dashboard roles - if you enable [**DASHBOARD_RBAC** feature flag](/docs/6.0.0/configuration/configuring-superset#feature-flags) then you will be able to manage which roles can access the dashboard - Granting a role access to a dashboard will bypass dataset level checks. Having dashboard access implicitly grants read access to all the featured charts in the dashboard, and thereby also all the associated datasets. - If no roles are specified for a dashboard, regular **Dataset permissions** will apply. 
diff --git a/docs/versions-config.json b/docs/versions-config.json index d96bfc498d0..985b075e669 100644 --- a/docs/versions-config.json +++ b/docs/versions-config.json @@ -20,7 +20,22 @@ } } }, - "developer_portal": { + "admin_docs": { + "disabled": false, + "lastVersion": "current", + "includeCurrentVersion": true, + "onlyIncludeVersions": [ + "current" + ], + "versions": { + "current": { + "label": "Next", + "path": "", + "banner": "none" + } + } + }, + "developer_docs": { "disabled": false, "hideFromNav": false, "lastVersion": "current",