mirror of
https://github.com/apache/superset.git
synced 2026-04-19 08:04:53 +00:00
docs: cleanup markdown warnings (#29511)
@@ -18,10 +18,10 @@ which can be joined by anyone):

- [Superset Community Calendar](https://superset.apache.org/community)

More references:

- [Comprehensive Tutorial for Contributing Code to Apache Superset](https://preset.io/blog/tutorial-contributing-code-to-apache-superset/)
- [Superset Wiki (code guidelines and additional resources)](https://github.com/apache/superset/wiki)

## Orientation

Here's a list of repositories that contain Superset-related packages:
@@ -37,7 +37,6 @@ Here's a list of repositories that contain Superset-related packages:

GitHub organization under which we manage Superset-related
small tools, forks and Superset-related experimental ideas.

## Types of Contributions

### Report Bug
@@ -34,7 +34,9 @@ Setting things up to squeeze an "hello world" into any part of Superset should b

```bash
docker-compose up
```

Note that:

- this will pull/build docker images and run a cluster of services, including:
  - A Superset **Flask web server**, mounting the local python repo/code
  - A Superset **Celery worker**, also mounting the local python repo/code
@@ -287,22 +289,28 @@ If while using the above commands you encounter an error related to the limit of

```bash
Error: ENOSPC: System limit for number of file watchers reached
```

The error is thrown because the number of files monitored by the system has reached the limit.
You can address this error by increasing the number of inotify watchers.

The current value of max watches can be checked with:

```bash
cat /proc/sys/fs/inotify/max_user_watches
```

-Edit the file /etc/sysctl.conf to increase this value.
+Edit the file `/etc/sysctl.conf` to increase this value.
The value needs to be decided based on the system memory [(see this StackOverflow answer for more context)](https://stackoverflow.com/questions/535768/what-is-a-reasonable-amount-of-inotify-watches-with-linux).

Open the file in an editor and add a line at the bottom specifying the max watches value.

```bash
fs.inotify.max_user_watches=524288
```

Save the file and exit the editor.
-To confirm that the change succeeded, run the following command to load the updated value of max_user_watches from sysctl.conf:
+To confirm that the change succeeded, run the following command to load the updated value of max_user_watches from `sysctl.conf`:

```bash
sudo sysctl -p
```
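
Before committing to a number, it can help to sanity-check what the suggested limit costs in kernel memory. Each watch uses roughly 1 KB on 64-bit systems — an approximation from the StackOverflow discussion linked above, not an exact kernel figure:

```bash
# Back-of-the-envelope memory cost of the suggested inotify limit.
watches=524288
kb_per_watch=1   # approximate; see the linked discussion
echo "$(( watches * kb_per_watch / 1024 )) MB"   # prints "512 MB"
```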
@@ -314,14 +322,18 @@ The dev server by default starts at `http://localhost:9000` and proxies the back

So a typical development workflow is the following:

1. [run Superset locally](#flask-server) using Flask, on port `8088` — but don't access it directly,<br/>

   ```bash
   # Install Superset and dependencies, plus load your virtual environment first, as detailed above.
   superset run -p 8088 --with-threads --reload --debugger --debug
   ```

2. in parallel, run the Webpack dev server locally on port `9000`,<br/>

   ```bash
   npm run dev-server
   ```

3. access `http://localhost:9000` (the Webpack server, _not_ Flask) in your web browser. This will use the hot-reloading front-end assets from the Webpack development server while redirecting back-end queries to Flask/Superset: your changes on the Superset codebase — either front- or back-end — will then be reflected live in the browser.

It's possible to change the Webpack server settings:
@@ -704,9 +716,9 @@ VSCode will not stop on breakpoints right away. We've attached to PID 6 however

### Debugging Server App in Kubernetes Environment

-To debug Flask running in POD inside kubernetes cluster. You'll need to make sure the pod runs as root and is granted the SYS_TRACE capability.These settings should not be used in production environments.
+To debug Flask running in a pod inside a Kubernetes cluster, you'll need to make sure the pod runs as root and is granted the `SYS_PTRACE` capability. These settings should not be used in production environments.

-```
+```yaml
securityContext:
  capabilities:
    add: ["SYS_PTRACE"]
```

@@ -720,7 +732,7 @@ You can follow the same instructions as in the docker-compose. Enter the pod and

Often in a Kubernetes environment nodes are not addressable from outside the cluster. VSCode will thus be unable to remotely connect to port 5678 on a Kubernetes node. In order to do this you need to create a tunnel that forwards port 5678 to your local machine.

-```
+```bash
kubectl port-forward pod/superset-<some random id> 5678:5678
```
@@ -801,7 +813,7 @@ Submissions will be considered for submission (or removal) on a case-by-case bas

The output should look like this:

-```
+```log
INFO  [alembic.runtime.migration] Context impl SQLiteImpl.
INFO  [alembic.runtime.migration] Will assume transactional DDL.
INFO  [alembic.runtime.migration] Running upgrade 1a1d627ebd8e -> 40a0a483dd12, add_metadata_column_to_annotation_model.py
```
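
If you script around migrations, the revision ids can be pulled out of log lines like the sample above; a small `sed` sketch (assumes a `sed` that supports `-E`):

```bash
# Extract the "from" and "to" revision ids from an alembic log line.
line='INFO  [alembic.runtime.migration] Running upgrade 1a1d627ebd8e -> 40a0a483dd12, add_metadata_column_to_annotation_model.py'
echo "$line" | sed -E 's/.*Running upgrade ([0-9a-f]+) -> ([0-9a-f]+),.*/\1 \2/'
# prints: 1a1d627ebd8e 40a0a483dd12
```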
@@ -821,7 +833,7 @@ Submissions will be considered for submission (or removal) on a case-by-case bas

The output should look like this:

-```
+```log
INFO  [alembic.runtime.migration] Context impl SQLiteImpl.
INFO  [alembic.runtime.migration] Will assume transactional DDL.
INFO  [alembic.runtime.migration] Running downgrade 40a0a483dd12 -> 1a1d627ebd8e, add_metadata_column_to_annotation_model.py
```
@@ -96,7 +96,6 @@ Finally, never submit a PR that will put master branch in broken state. If the P

- Project maintainers may contact the PR author if new issues are introduced by the PR.
- Project maintainers may revert your changes if a critical issue is found, such as breaking master branch CI.

## Managing Issues and PRs

To handle issues and PRs that are coming in, committers read issues/PRs and flag them with labels to categorize and help contributors spot where to take action, as contributors usually have different areas of expertise.
@@ -152,10 +151,8 @@ Should you decide that reverting is desirable, it is the responsibility of the C

- **Provide concise reproduction steps:** Ensure that the issue can be clearly understood and duplicated by the original author of the PR.
- **Put the revert through code review:** The revert must be approved by another committer.

## Design Guidelines

### Capitalization guidelines

#### Sentence case
@@ -197,12 +194,11 @@ Often a product page will have the same title as the objects it contains. In thi

- Queries that you save will appear on the Saved queries page
- Create custom queries in SQL Lab then create dashboards

-#### \*\*Exceptions to sentence case:
+#### \*\*Exceptions to sentence case

- Input labels, buttons and UI tabs are all caps
- User input values (e.g. column names, SQL Lab tab names) should be in their original case

## Programming Language Conventions

### Python
@@ -88,7 +88,7 @@ yo @superset-ui/superset

After that the generator will ask a few questions (the defaults should be fine):

-```
+```bash
$ yo @superset-ui/superset

     _-----_     ╭──────────────────────────╮
    |       |    │      Welcome to the      │
@@ -125,7 +125,7 @@ $ yo @superset-ui/superset

To build the viz plugin, run the following commands:

-```
+```bash
npm i --force
npm run build
```
@@ -133,7 +133,7 @@ npm run build

Alternatively, to run the viz plugin in development mode (=rebuilding whenever changes
are made), start the dev server with the following command:

-```
+```bash
npm run dev
```
@@ -399,7 +399,7 @@ tcp 0 0 0.0.0.0:8088 0.0.0.0:* LISTEN

You are now ready to attach a debugger to the process. Using VSCode you can configure a launch configuration file `.vscode/launch.json` like so.

-```
+```json
{
  "version": "0.2.0",
  "configurations": [
@@ -426,9 +426,9 @@ VSCode will not stop on breakpoints right away. We've attached to PID 6 however

### Debugging Server App in Kubernetes Environment

-To debug Flask running in POD inside kubernetes cluster. You'll need to make sure the pod runs as root and is granted the SYS_TRACE capability.These settings should not be used in production environments.
+To debug Flask running in a pod inside a Kubernetes cluster, you'll need to make sure the pod runs as root and is granted the `SYS_PTRACE` capability. These settings should not be used in production environments.

-```
+```yaml
securityContext:
  capabilities:
    add: ["SYS_PTRACE"]
```

@@ -436,13 +436,13 @@ To debug Flask running in POD inside kubernetes cluster. You'll need to make sur

See [set capabilities for a container](https://kubernetes.io/docs/tasks/configure-pod-container/security-context/#set-capabilities-for-a-container) for more details.

-Once the pod is running as root and has the SYS_PTRACE capability it will be able to debug the Flask app.
+Once the pod is running as root and has the `SYS_PTRACE` capability, it will be able to debug the Flask app.

You can follow the same instructions as in the docker-compose. Enter the pod and install the required libraries and packages: gdb, netstat and debugpy.

Often in a Kubernetes environment nodes are not addressable from outside the cluster. VSCode will thus be unable to remotely connect to port 5678 on a Kubernetes node. In order to do this you need to create a tunnel that forwards port 5678 to your local machine.

-```
+```bash
kubectl port-forward pod/superset-<some random id> 5678:5678
```
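
Since the pod name suffix changes with every deployment, it is convenient to resolve it rather than type it. A sketch, where the pod name and the `app=superset` label selector are stand-ins — check the labels your chart actually applies:

```bash
# Hypothetical pod name; in a live cluster you would resolve it with e.g.:
#   pod=$(kubectl get pod -l app=superset -o jsonpath='{.items[0].metadata.name}')
pod="superset-6c9f8b4d7-abcde"
echo "kubectl port-forward pod/${pod} 5678:5678"
# prints: kubectl port-forward pod/superset-6c9f8b4d7-abcde 5678:5678
```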
@@ -465,6 +465,7 @@ We use [Flask-Babel](https://python-babel.github.io/flask-babel/) to translate S

In Python files, we use the following
[translation functions](https://python-babel.github.io/flask-babel/#using-translations)
from `Flask-Babel`:

- `gettext` and `lazy_gettext` (usually aliased to `_`): for translating singular
  strings.
- `ngettext`: for translating strings that might become plural.

@@ -502,7 +503,6 @@ LANGUAGES = {
}
```

### Creating a new language dictionary

First check if the language code for your target language already exists. Check if the
@@ -520,6 +520,7 @@ easier to translate the application to your target language:

[Flask AppBuilder i18n documentation](https://flask-appbuilder.readthedocs.io/en/latest/i18n.html)

To create a dictionary for a new language, first make sure the necessary dependencies are installed:

```bash
pip install -r superset/translations/requirements.txt
```
@@ -576,14 +577,14 @@ case of the Finnish translation, this would be `superset/translations/fi/LC_MESS

To make the translations available on the frontend, we need to convert the PO file into
a collection of JSON files. To convert all PO files to formatted JSON files you can use
-the build-translation script
+the `build-translation` script

```bash
npm run build-translation
```

Finally, for the translations to take effect we need to compile translation catalogs into
-binary MO files for the backend using pybabel.
+binary MO files for the backend using `pybabel`.

```bash
pybabel compile -d superset/translations
```
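
The per-language files follow the standard gettext layout, so the path for any new language can be derived mechanically. A sketch, where the `LC_MESSAGES/messages.po` suffix is the conventional Flask-Babel layout (an assumption here) and `fi` mirrors the Finnish example above:

```bash
# Derive where a target language's PO file lives in the repo.
lang="fi"
echo "superset/translations/${lang}/LC_MESSAGES/messages.po"
# prints: superset/translations/fi/LC_MESSAGES/messages.po
```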
@@ -4,7 +4,6 @@ sidebar_position: 9

# FAQ

## How big of a dataset can Superset handle?

Superset can work with even gigantic databases! Superset acts as a thin layer above your underlying

@@ -28,7 +27,6 @@ to occur in spikes, e.g., if everyone in a meeting loads the same dashboard at o

Superset's application metadata does not require a very large database to store it, though
the log file grows over time.

## Can I join / query multiple tables at one time?

Not in the Explore or Visualization UI. A Superset SQLAlchemy datasource can only be a single table
@@ -178,7 +176,7 @@ You can take a look at this Flask-AppBuilder

It is possible on a per-dashboard basis by providing a mapping of labels to colors in the JSON
Metadata attribute using the `label_colors` key.

-```
+```json
{
  "label_colors": {
    "Girls": "#FF69B4",
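
Hand-edited JSON is easy to break with a stray comma, so it can be worth checking a `label_colors` snippet before pasting it into the JSON Metadata field. A sketch — the `Boys` entry is a made-up example value, and `python3 -m json.tool` is just a convenient stdlib validator:

```bash
# Prints "valid" only if the fragment parses as JSON.
echo '{"label_colors": {"Girls": "#FF69B4", "Boys": "#1E90FF"}}' \
  | python3 -m json.tool > /dev/null && echo "valid"
```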
@@ -23,13 +23,14 @@ Different sets of images are built and/or published at different times:

- **Merges to the main branch** (`push`): resulting in new SHAs, with tags
  prefixed with `master` for the latest `master` version.

-# Build presets
+## Build presets

We have a set of build "presets", each representing a combination of build
parameters, mostly pointing to a different target layer and/or base image.

Here are the build presets that are exposed through the `build_docker.py` script:

- `lean`: The default Docker image, including both frontend and backend. Tags
  without a build_preset are lean builds, e.g., `latest`.
- `dev`: For development, with a headless browser, dev-related utilities and root access.

@@ -92,7 +93,7 @@ configured in that way). Setting the environment
variable `DOCKER_DEFAULT_PLATFORM` to `linux/amd64` seems to work for
leveraging and building upon the Superset builds provided here.

-```
+```bash
export DOCKER_DEFAULT_PLATFORM=linux/amd64
```
@@ -9,7 +9,6 @@ import useBaseUrl from "@docusaurus/useBaseUrl";

# Using Docker Compose

<img src={useBaseUrl("/img/docker-compose.webp" )} width="150" />
<br /><br />

@@ -22,13 +21,13 @@ our [installing on k8s](https://superset.apache.org/docs/installation/running-on
documentation.
:::

As mentioned in our [quickstart guide](/docs/quickstart), the fastest way to try
Superset locally is using Docker Compose on a Linux or Mac OSX
computer. Superset does not have official support for Windows. It's also the easiest
way to launch a fully functioning **development environment** quickly.

Note that there are 3 major ways we support running docker-compose:

1. **docker-compose.yml:** for interactive development, where we mount your local folder with the
   frontend/backend files that you can edit and experience the changes you
   make in the app in real time

@@ -49,7 +48,6 @@ Note that this documentation assumes that you have [Docker](https://www.docker.c
[docker-compose](https://docs.docker.com/compose/), and
[git](https://git-scm.com/) installed.

## 1. Clone Superset's GitHub repository

[Clone Superset's repo](https://github.com/apache/superset) in your terminal with the

@@ -151,7 +149,6 @@ located in your `PYTHONPATH`, note that it can be done by providing a

The mechanics of this are in `docker/pythonpath_dev/superset_config.py` where you can see
that the logic runs a `from superset_config_docker import *`

:::note
Users often want to connect to other databases from Superset. Currently, the easiest way to
do this is to modify the `docker-compose-non-dev.yml` file and add your database as a service that
@@ -9,7 +9,6 @@ import useBaseUrl from "@docusaurus/useBaseUrl";

# Installing on Kubernetes

<img src={useBaseUrl("/img/k8s.png" )} width="150" />
<br /><br />

@@ -27,7 +26,6 @@ For simpler, single host environments, we recommend using
and works fantastically well with the Helm chart referenced here.
:::

## Running

1. Add the Superset helm repository

@@ -434,9 +432,12 @@ configOverrides:
      "--disable-extensions",
    ]
```

### Load the Examples data and dashboards

If you are trying Superset out and want some data and dashboards to explore, you can load some examples by creating a `my_values.yaml` and deploying it as described above in the **Configure your setting overrides** step of the **Running** section.
To load the examples, add the following to the `my_values.yaml` file:

```yaml
init:
  loadExamples: true
```
@@ -24,13 +24,13 @@ level dependencies.

The following command will ensure that the required dependencies are installed:

-```
+```bash
sudo apt-get install build-essential libssl-dev libffi-dev python-dev python-pip libsasl2-dev libldap2-dev default-libmysqlclient-dev
```

In Ubuntu 20.04 the following command will ensure that the required dependencies are installed:

-```
+```bash
sudo apt-get install build-essential libssl-dev libffi-dev python3-dev python3-pip libsasl2-dev libldap2-dev default-libmysqlclient-dev
```

@@ -38,19 +38,19 @@ sudo apt-get install build-essential libssl-dev libffi-dev python3-dev python3-p

Install the following packages using the `yum` package manager:

-```
+```bash
sudo yum install gcc gcc-c++ libffi-devel python-devel python-pip python-wheel openssl-devel cyrus-sasl-devel openldap-devel
```

In more recent versions of CentOS and Fedora, you may need to install a slightly different set of packages using `dnf`:

-```
+```bash
sudo dnf install gcc gcc-c++ libffi-devel python3-devel python3-pip python3-wheel openssl-devel cyrus-sasl-devel openldap-devel
```

Also, on CentOS, you may need to upgrade pip for the install to work:

-```
+```bash
pip3 install --upgrade pip
```
@@ -60,14 +60,14 @@ If you're not on the latest version of OS X, we recommend upgrading because we'v
issues people have run into are linked to older versions of Mac OS X. After updating, install the
latest version of XCode command line tools:

-```
+```bash
xcode-select --install
```

We don't recommend using the system installed Python. Instead, first install the
[homebrew](https://brew.sh/) manager and then run the following commands:

-```
+```bash
brew install readline pkg-config libffi openssl mysql postgresql@14
```

@@ -83,13 +83,13 @@ To identify the Python version used by the official docker image, see the [Docke

Let's also make sure we have the latest version of `pip` and `setuptools`:

-```
+```bash
pip install --upgrade setuptools pip
```

Lastly, you may need to set LDFLAGS and CFLAGS for certain Python packages to properly build. You can export these variables with:

-```
+```bash
export LDFLAGS="-L$(brew --prefix openssl)/lib"
export CFLAGS="-I$(brew --prefix openssl)/include"
```
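
The two exports above just interpolate whatever prefix `brew --prefix openssl` reports. With a stand-in prefix (real values vary per machine and Homebrew install) the expansion looks like this:

```bash
# Stand-in for "$(brew --prefix openssl)"; varies per machine.
prefix="/opt/homebrew/opt/openssl"
LDFLAGS="-L${prefix}/lib"
CFLAGS="-I${prefix}/include"
echo "$LDFLAGS"   # prints: -L/opt/homebrew/opt/openssl/lib
echo "$CFLAGS"    # prints: -I/opt/homebrew/opt/openssl/include
```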
@@ -101,13 +101,13 @@ These will now be available when pip installing requirements.

We highly recommend installing Superset inside of a virtual environment. Python ships with
`virtualenv` out of the box. If you're using [pyenv](https://github.com/pyenv/pyenv), you can install [pyenv-virtualenv](https://github.com/pyenv/pyenv-virtualenv). Or you can install it with `pip`:

-```
+```bash
pip install virtualenv
```

You can create and activate a virtual environment using:

-```
+```bash
# virtualenv is shipped in Python 3.6+ as venv instead of pyvenv.
# See https://docs.python.org/3.6/library/venv.html
python3 -m venv venv

@@ -116,7 +116,7 @@ python3 -m venv venv

Or with pyenv-virtualenv:

-```
+```bash
# Here we name the virtual env 'superset'
pyenv virtualenv superset
pyenv activate superset

@@ -130,13 +130,13 @@ command line.

First, start by installing `apache-superset`:

-```
+```bash
pip install apache-superset
```

Then, you need to initialize the database:

-```
+```bash
superset db upgrade
```

@@ -146,7 +146,7 @@ Note that some configuration is mandatory for production instances of Superset.

Finish installing by running through the following commands:

-```
+```bash
# Create an admin user in your metadata database (use `admin` as username to be able to load the examples)
export FLASK_APP=superset
superset fab create-admin
@@ -34,6 +34,7 @@ $ cd superset

# Fire up Superset using Docker Compose
$ docker compose -f docker-compose-image-tag.yml up
```

This may take a moment as Docker Compose will fetch the underlying
container images and will load up some examples. Once all containers
are downloaded and the output settles, you're ready to log in.

@@ -41,7 +42,9 @@ are downloaded and the output settles, you're ready to log in.

⚠️ If you get an error message like `validating superset\docker-compose-image-tag.yml: services.superset-worker-beat.env_file.0 must be a string`, you need to update your version of `docker-compose`.

### 3. Log into Superset

Now head over to [http://localhost:8088](http://localhost:8088) and log in with the default created account:

```bash
username: admin
password: admin
```

@@ -50,10 +53,13 @@ password: admin

#### 🎉 Congratulations! Superset is now up and running on your machine! 🎉

### Wrapping Up

Once you're done with Superset, you can stop and delete it just like any other container environment:

```bash
$ docker compose down
```

:::tip
You can use the same environment more than once, as Superset will persist data locally. However, make sure to properly stop all
processes by running the Docker Compose `stop` command. By doing so, you can avoid data corruption and/or loss of data.

@@ -62,6 +68,7 @@ processes by running Docker Compose `stop` command. By doing so, you can avoid d

## What's next?

From this point on, you can head on to:

- [Create your first Dashboard](/docs/using-superset/creating-your-first-dashboard)
- [Connect to a Database](/docs/configuration/databases)
- [Using Docker Compose](/docs/installation/docker-compose)
@@ -38,7 +38,6 @@ sidebar_position: 2

| CVE-2023-49736 | SQL Injection on where_in JINJA macro | < 2.1.3, >= 3.0.0, < 3.0.2 |
| CVE-2023-49734 | Privilege Escalation Vulnerability | < 2.1.3, >= 3.0.0, < 3.0.2 |

#### Version 3.0.0

| CVE | Title | Affected |

@@ -46,14 +45,12 @@ sidebar_position: 2

| CVE-2023-42502 | Open Redirect Vulnerability | < 3.0.0 |
| CVE-2023-42505 | Sensitive information disclosure on db connection details | < 3.0.0 |

#### Version 2.1.3

| CVE | Title | Affected |
|:---------------|:------------------------------------------------------------------------|---------:|
| CVE-2023-42504 | Lack of rate limiting allows for possible denial of service | < 2.1.3 |

#### Version 2.1.2

| CVE | Title | Affected |

@@ -62,7 +59,6 @@ sidebar_position: 2

| CVE-2023-42501 | Unnecessary read permissions within the Gamma role | < 2.1.2 |
| CVE-2023-43701 | Stored XSS on API endpoint | < 2.1.2 |

#### Version 2.1.1

| CVE | Title | Affected |

@@ -76,7 +72,6 @@ sidebar_position: 2

| CVE-2023-37941 | Metadata db write access can lead to remote code execution | < 2.1.1 |
| CVE-2023-32672 | SQL parser edge case bypasses data access authorization | < 2.1.1 |

#### Version 2.1.0

| CVE | Title | Affected |

@@ -86,7 +81,6 @@ sidebar_position: 2

| CVE-2023-27525 | Incorrect default permissions for Gamma role | < 2.1.0 |
| CVE-2023-30776 | Database connection password leak | < 2.1.0 |

#### Version 2.0.1

| CVE | Title | Affected |