Compare commits: `2.0` ... `v2021.19.2` (18 commits)
| Author | SHA1 | Date |
|---|---|---|
| | 4fa307566d | |
| | 097d1fb85c | |
| | 5646bcb9c8 | |
| | 1393c5660c | |
| | 3cea7b4199 | |
| | 15126ef17d | |
| | d99d173817 | |
| | 5f2bb51393 | |
| | 2da0c347db | |
| | c5637dba34 | |
| | 574d3a1a49 | |
| | b6ef99a51c | |
| | 8843b895ce | |
| | b39dae95c5 | |
| | e5c6472d4f | |
| | e48d3f5315 | |
| | 925a05e67e | |
| | 4daa87fcd0 | |
`.asf.yaml` (10 lines changed)

```
@@ -66,12 +66,12 @@ github:
          - cypress-matrix (3, chrome)
          - docker-build
          - frontend-build
          - pre-commit (3.8)
          - python-lint (3.8)
          - test-mysql (3.8)
          - pre-commit (3.7)
          - python-lint (3.7)
          - test-mysql (3.7)
          - test-postgres (3.7)
          - test-postgres (3.8)
          - test-postgres (3.9)
          - test-sqlite (3.8)
          - test-sqlite (3.7)

      required_pull_request_reviews:
        dismiss_stale_reviews: false
```
`.codecov.yml` (12 lines changed)

```
@@ -3,7 +3,6 @@ codecov:
    after_n_builds: 4
ignore:
  - "superset/migrations/versions/*.py"
  - "superset-frontend/packages/superset-ui-demo/**/*"
  - "**/*.stories.tsx"
  - "**/*.stories.jsx"
coverage:
@@ -15,17 +14,6 @@ coverage:
        # project coverage decrease:
        target: auto
        threshold: 0%
      core-packages-ts:
        target: 100%
        paths:
          - 'superset-frontend/packages'
          - '!superset-frontend/packages/**/*.jsx'
          - '!superset-frontend/packages/**/*.tsx'
      core-packages-tsx:
        target: 50%
        paths:
          - 'superset-frontend/packages/**/*.jsx'
          - 'superset-frontend/packages/**/*.tsx'
  patch:
    default:
      informational: true
```
`.gitattributes` (1 line changed, vendored)

```
@@ -1 +0,0 @@
docker/**/*.sh text eol=lf
```
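The removed `.gitattributes` rule (`docker/**/*.sh text eol=lf`) normalized the Docker shell scripts to LF line endings. The failure mode it guards against can be sketched outside Git (a toy normalization using `tr`, not Git's actual implementation):

```shell
# A CRLF-terminated line, as a script edited on Windows might contain;
# stripping the carriage return yields the LF-only form /bin/sh expects.
printf 'echo hello\r\n' | tr -d '\r'   # prints: echo hello
```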
`.github/CODEOWNERS` (17 lines changed, vendored)

```
@@ -1,19 +1,8 @@
# Notify all committers of DB migration changes, per SIP-59

# https://github.com/apache/superset/issues/13351

/superset/migrations/ @apache/superset-committers

# Notify Preset team when ephemeral env settings are changed

.github/workflows/ecs-task-definition.json @robdiciuccio @craig-rueda @rusackas @eschutho @dpgaspar @nytai @mistercrunch
.github/workflows/docker-ephemeral-env.yml @robdiciuccio @craig-rueda @rusackas @eschutho @dpgaspar @nytai @mistercrunch
.github/workflows/ephemeral*.yml @robdiciuccio @craig-rueda @rusackas @eschutho @dpgaspar @nytai @mistercrunch

# Notify some committers of changes in the Select component

/superset-frontend/src/components/Select/ @michael-s-molina @geido @ktmud

# Notify Helm Chart maintainers about changes in it

/helm/superset/ @craig-rueda @dpgaspar @villebro
.github/workflows/ecs-task-definition.json @robdiciuccio @craig-rueda @benjreinhart
.github/workflows/docker-ephemeral-env.yml @robdiciuccio @craig-rueda @benjreinhart
.github/workflows/ephemeral*.yml @robdiciuccio @craig-rueda @benjreinhart
```
`.github/ISSUE_TEMPLATE/bug_report.md` (15 lines changed, vendored)

```
@@ -7,13 +7,6 @@ labels: "#bug"

A clear and concise description of what the bug is.

#### How to reproduce the bug

1. Go to '...'
2. Click on '....'
3. Scroll down to '....'
4. See error

### Expected results

what you expected to happen.
@@ -26,16 +19,20 @@ what actually happens.

If applicable, add screenshots to help explain your problem.

#### How to reproduce the bug

1. Go to '...'
2. Click on '....'
3. Scroll down to '....'
4. See error

### Environment

(please complete the following information):

- browser type and version:
- superset version: `superset version`
- python version: `python --version`
- node.js version: `node -v`
- any feature flags active:

### Checklist
```
`.github/ISSUE_TEMPLATE/feature_request.md` (12 lines changed, vendored)

```
@@ -5,10 +5,14 @@ labels: "#enhancement"

---

Github Discussions is our new home for discussing features and improvements!
**Is your feature request related to a problem? Please describe.**
A clear and concise description of what the problem is. Ex. I'm always frustrated when [...]

https://github.com/apache/superset/discussions/categories/ideas
**Describe the solution you'd like**
A clear and concise description of what you want to happen.

We'd like to keep Github Issues focuses on bugs and SIP's (Superset Improvement Proposals)!
**Describe alternatives you've considered**
A clear and concise description of any alternative solutions or features you've considered.

Please note that feature requests opened as Github Issues will be moved to Discussions.
**Additional context**
Add any other context or screenshots about the feature request here.
```
`.github/ISSUE_TEMPLATE/security_vulnerability.md` (12 lines changed, vendored, new file)

```
@@ -0,0 +1,12 @@
---
name: Security vulnerability
about: Report a security vulnerability or issue
labels: "#security"

---

## DO NOT REPORT SECURITY VULNERABILITIES HERE

Please report security vulnerabilities to private@superset.apache.org.

In the event a community member discovers a security flaw in Superset, it is important to follow the [Apache Security Guidelines](https://www.apache.org/security/committers.html) and release a fix as quickly as possible before public disclosure. Reporting security vulnerabilities through the usual GitHub Issues channel is not ideal as it will publicize the flaw before a fix can be applied.
```
`.github/ISSUE_TEMPLATE/sip.md` (4 lines changed, vendored)

```
@@ -6,9 +6,9 @@ labels: "#SIP"

---

*Please make sure you are familiar with the SIP process documented*
(here)[https://github.com/apache/superset/issues/5602]. The SIP number should be the next number after the latest SIP listed [here](https://github.com/apache/superset/issues?q=is%3Aissue+label%3Asip).
(here)[https://github.com/apache/superset/issues/5602]

## [SIP-\<number>] Proposal for <title>
## [SIP] Proposal for XXX

### Motivation
```
`.github/PULL_REQUEST_TEMPLATE.md` (11 lines changed, vendored)

```
@@ -1,23 +1,16 @@
<!---
Please write the PR title following the conventions at https://www.conventionalcommits.org/en/v1.0.0/
Example:
fix(dashboard): load charts correctly
-->

### SUMMARY
<!--- Describe the change below, including rationale and design decisions -->

### BEFORE/AFTER SCREENSHOTS OR ANIMATED GIF
<!--- Skip this if not applicable -->

### TESTING INSTRUCTIONS
<!--- Required! What steps can be taken to manually verify the changes? -->
### TEST PLAN
<!--- What steps should be taken to verify the changes -->

### ADDITIONAL INFORMATION
<!--- Check any relevant boxes with "x" -->
<!--- HINT: Include "Fixes #nnn" if you are fixing an existing issue -->
- [ ] Has associated issue:
- [ ] Required feature flags:
- [ ] Changes UI
- [ ] Includes DB Migration (follow approval process in [SIP-59](https://github.com/apache/superset/issues/13351))
- [ ] Migration is atomic, supports rollback & is backwards-compatible
```
`.github/dependabot.yml` (6 lines changed, vendored)

```
@@ -21,9 +21,3 @@ updates:
    schedule:
      interval: "daily"
    open-pull-requests-limit: 0

  - package-ecosystem: "npm"
    directory: "/docs/"
    schedule:
      interval: "daily"
    open-pull-requests-limit: 0
```
`.github/workflows/bashlib.sh` (49 lines changed, vendored)

```
@@ -38,10 +38,10 @@ default-setup-command() {
}

apt-get-install() {
  say "::group::apt-get install dependencies"
  sudo apt-get update && sudo apt-get install --yes \
    libsasl2-dev
  say "::endgroup::"
  say "::group::apt-get install dependencies"
  sudo apt-get update && sudo apt-get install --yes \
    libsasl2-dev
  say "::endgroup::"
}

pip-upgrade() {
@@ -55,6 +55,7 @@ npm-install() {
  cd "$GITHUB_WORKSPACE/superset-frontend"

  # cache-restore npm

  say "::group::Install npm packages"
  echo "npm: $(npm --version)"
  echo "node: $(node --version)"
@@ -68,7 +69,7 @@ build-assets() {
  cd "$GITHUB_WORKSPACE/superset-frontend"

  say "::group::Build static assets"
  npm run build
  npm run build -- --no-progress
  say "::endgroup::"
}

@@ -80,7 +81,7 @@ build-instrumented-assets() {
  if [[ -f "$ASSETS_MANIFEST" ]]; then
    echo 'Skip frontend build because instrumented static assets already exist.'
  else
    npm run build-instrumented
    npm run build-instrumented -- --no-progress
    cache-save instrumented-assets
  fi
  say "::endgroup::"
@@ -161,7 +162,7 @@ cypress-run() {
  if [[ -z $CYPRESS_KEY ]]; then
    $cypress --spec "cypress/integration/$page" --browser "$browser"
  else
    export CYPRESS_RECORD_KEY=$(echo $CYPRESS_KEY | base64 --decode)
    export CYPRESS_RECORD_KEY=`echo $CYPRESS_KEY | base64 --decode`
    # additional flags for Cypress dashboard recording
    $cypress --spec "cypress/integration/$page" --browser "$browser" \
      --record --group "$group" --tag "${GITHUB_REPOSITORY},${GITHUB_EVENT_NAME}" \
@@ -183,22 +184,22 @@ cypress-run-all() {
  nohup flask run --no-debugger -p $port >"$flasklog" 2>&1 </dev/null &
  local flaskProcessId=$!

  cypress-run "*/**/!(*.applitools.test.ts)"
  cypress-run "*/**/*"

  # After job is done, print out Flask log for debugging
  say "::group::Flask log for default run"
  cat "$flasklog"
  say "::endgroup::"

  # Rerun SQL Lab tests with backend persist disabled
  export SUPERSET_CONFIG=tests.integration_tests.superset_test_config_sqllab_backend_persist_off
  # Rerun SQL Lab tests with backend persist enabled
  export SUPERSET_CONFIG=tests.superset_test_config_sqllab_backend_persist

  # Restart Flask with new configs
  kill $flaskProcessId
  nohup flask run --no-debugger -p $port >"$flasklog" 2>&1 </dev/null &
  local flaskProcessId=$!

  cypress-run "sqllab/!(*.applitools.test.ts)" "Backend persist"
  cypress-run "sqllab/*" "Backend persist"

  # Upload code coverage separately so each page can have separate flags
  # -c will clean existing coverage reports, -F means add flags
@@ -212,29 +213,3 @@ cypress-run-all() {
  # make sure the program exits
  kill $flaskProcessId
}

eyes-storybook-dependencies() {
  say "::group::install eyes-storyook dependencies"
  sudo apt-get update -y && sudo apt-get -y install gconf-service ca-certificates libxshmfence-dev fonts-liberation libappindicator3-1 libasound2 libatk-bridge2.0-0 libatk1.0-0 libc6 libcairo2 libcups2 libdbus-1-3 libexpat1 libfontconfig1 libgbm1 libgcc1 libgconf-2-4 libglib2.0-0 libgdk-pixbuf2.0-0 libgtk-3-0 libnspr4 libnss3 libpangocairo-1.0-0 libstdc++6 libx11-6 libx11-xcb1 libxcb1 libxcomposite1 libxcursor1 libxdamage1 libxext6 libxfixes3 libxi6 libxrandr2 libxrender1 libxss1 libxtst6 lsb-release xdg-utils libappindicator1
  say "::endgroup::"
}

cypress-run-applitools() {
  local flasklog="${HOME}/flask.log"
  local port=8081
  export CYPRESS_BASE_URL="http://localhost:${port}"

  nohup flask run --no-debugger -p $port >"$flasklog" 2>&1 </dev/null &
  local flaskProcessId=$!

  cypress-run "*/**/*.applitools.test.ts"

  codecov -c -F "cypress" || true

  say "::group::Flask log for default run"
  cat "$flasklog"
  say "::endgroup::"

  # make sure the program exits
  kill $flaskProcessId
}
```
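One hunk in `cypress-run` above toggles between `$(...)` and backtick command substitution when deriving `CYPRESS_RECORD_KEY` from the base64-encoded `CYPRESS_KEY`; the two forms behave identically for a simple pipeline. A self-contained sketch of the same round trip, using a dummy value rather than a real key:

```shell
# Encode a dummy secret, then decode it the same way the script
# derives CYPRESS_RECORD_KEY from CYPRESS_KEY.
CYPRESS_KEY=$(printf 'dummy-record-key' | base64)
CYPRESS_RECORD_KEY=$(echo "$CYPRESS_KEY" | base64 --decode)
echo "$CYPRESS_RECORD_KEY"   # prints: dummy-record-key
```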
```
@@ -50,11 +50,10 @@ jobs:
              repo: context.repo.repo,
              issue_number: pull.number,
              body:
                `# 🙅♂️ 🙅♂️ 🙅♂️ 🙅♂️ 🙅♂️ 🙅♂️ 🙅♂️ 🙅♂️ 🙅♂️ 🙅♂️ 🙅♂️ 🙅♂️ 🙅♂️ 🙅♂️ 🙅♂️ 🙅♂️` +
                `❗ @${pull.user.login} Your base branch \`${currentBranch}\` has ` +
                `⚠️ @${pull.user.login} Your base branch \`${currentBranch}\` has just ` +
                'also updated `superset/migrations`.\n' +
                '\n' +
                '**Please consider rebasing your branch and [resolving potential db migration conflicts](https://github.com/apache/superset/blob/master/CONTRIBUTING.md#merging-db-migrations).**',
                '❗ **Please consider rebasing your branch to avoid db migration conflicts.**',
            });
          }
        }
```
`.github/workflows/embedded-sdk-release.yml` (23 lines changed, vendored)

```
@@ -1,23 +0,0 @@
name: Embedded SDK Release

on:
  push:
    branches:
      - 'master'

jobs:
  build:
    runs-on: ubuntu-20.04
    defaults:
      run:
        working-directory: superset-embedded-sdk
    steps:
      - uses: actions/checkout@v2
      - uses: actions/setup-node@v2
        with:
          node-version: "16"
          registry-url: 'https://registry.npmjs.org'
      - run: npm ci
      - run: npm run ci:release
        env:
          NODE_AUTH_TOKEN: ${{ secrets.NPM_TOKEN }}
```
`.github/workflows/embedded-sdk-test.yml` (24 lines changed, vendored)

```
@@ -1,24 +0,0 @@
name: Embedded SDK PR Checks

on:
  pull_request:
    paths:
      - "superset-embedded-sdk/**"
    types: [synchronize, opened, reopened, ready_for_review]

jobs:
  embedded-sdk-test:
    if: github.event.pull_request.draft == false
    runs-on: ubuntu-20.04
    defaults:
      run:
        working-directory: superset-embedded-sdk
    steps:
      - uses: actions/checkout@v2
      - uses: actions/setup-node@v2
        with:
          node-version: "16"
          registry-url: 'https://registry.npmjs.org'
      - run: npm ci
      - run: npm test
      - run: npm run build
```
`.github/workflows/ephemeral-env.yml` (1 line changed, vendored)

```
@@ -17,6 +17,7 @@ jobs:
      - name: Debug
        run: |
          echo "Comment on PR #${{ github.event.issue.number }} by ${{ github.event.issue.user.login }}, ${{ github.event.comment.author_association }}"
          echo "Comment body: ${{ github.event.comment.body }}"

      - name: Eval comment body for /testenv slash command
        uses: actions/github-script@v3
```
`.github/workflows/release.yml` (86 lines changed, vendored)

```
@@ -1,86 +0,0 @@
name: release-workflow

on:
  push:
    branches:
      - 'master'

jobs:
  build:
    name: Bump version and publish package(s)

    runs-on: ubuntu-20.04

    strategy:
      matrix:
        node-version: [16]

    steps:
      - uses: actions/checkout@v2
        with:
          # pulls all commits (needed for lerna / semantic release to correctly version)
          fetch-depth: 0
      - name: Get tags and filter trigger tags
        run: |
          git fetch --depth=1 origin "+refs/tags/*:refs/tags/*"
          git fetch --prune --unshallow
          git tag -d `git tag | grep -E '^trigger-'`

      - name: Use Node.js ${{ matrix.node-version }}
        uses: actions/setup-node@v1
        with:
          node-version: ${{ matrix.node-version }}

      - name: Cache npm
        uses: actions/cache@v1
        with:
          path: ~/.npm # npm cache files are stored in `~/.npm` on Linux/macOS
          key: ${{ runner.OS }}-node-${{ hashFiles('**/package-lock.json') }}
          restore-keys: |
            ${{ runner.OS }}-node-
            ${{ runner.OS }}-

      - name: Get npm cache directory path
        id: npm-cache-dir-path
        run: echo "::set-output name=dir::$(npm config get cache)"
      - name: Cache npm
        uses: actions/cache@v1
        id: npm-cache # use this to check for `cache-hit` (`steps.npm-cache.outputs.cache-hit != 'true'`)
        with:
          path: ${{ steps.npm-cache-dir-path.outputs.dir }}
          key: ${{ runner.os }}-npm-${{ hashFiles('**/package-lock.json') }}
          restore-keys: |
            ${{ runner.os }}-npm-

      - name: Install dependencies
        working-directory: ./superset-frontend
        run: npm ci
      - name: Run unit tests
        working-directory: ./superset-frontend
        run: npm run test -- plugins packages
      - name: Build packages
        working-directory: ./superset-frontend
        run: npm run plugins:build

      - name: Configure npm and git
        run: |
          echo "@superset-ui:registry=https://registry.npmjs.org/" > .npmrc
          echo "registry=https://registry.npmjs.org/" >> .npmrc
          echo "//registry.npmjs.org/:_authToken=\${NPM_TOKEN}" >> $HOME/.npmrc 2> /dev/null
          npm whoami
          git config --local user.email "action@github.com"
          git config --local user.name "GitHub Action"
          git remote set-url origin "https://${GITHUB_TOKEN}@github.com/apache-superset/superset-ui.git" > /dev/null 2>&1
        env:
          NPM_TOKEN: ${{ secrets.NPM_TOKEN }}
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}

      - name: Bump version and publish package(s)
        working-directory: ./superset-frontend
        run: |
          git tag -d `git tag | grep -E '^trigger-'`
          npm run plugins:release-from-tag
        env:
          NPM_TOKEN: ${{ secrets.NPM_TOKEN }}
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
          GH_TOKEN: ${{ secrets.GH_PERSONAL_ACCESS_TOKEN }}
```
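The deleted workflow above prunes helper tags with `` git tag -d `git tag | grep -E '^trigger-'` ``. The filtering half of that pipeline can be exercised without a repository; the tag names below are made up for illustration:

```shell
# Stand-in for `git tag` output; grep keeps only the names the workflow
# would delete, i.e. those starting with "trigger-".
printf 'v1.0.0\ntrigger-release-1\nv1.1.0\ntrigger-release-2\n' \
  | grep -E '^trigger-'
```

With this sample input, only `trigger-release-1` and `trigger-release-2` survive the anchored pattern `^trigger-`.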
`.github/workflows/superset-applitool-cypress.yml` (88 lines changed, vendored)

```
@@ -1,88 +0,0 @@
name: Applitools Cypress

on:
  schedule:
    - cron: "0 1 * * *"

jobs:
  cypress-applitools:
    runs-on: ubuntu-20.04
    strategy:
      fail-fast: false
      matrix:
        browser: ["chrome"]
        node: [16]
    env:
      FLASK_ENV: development
      SUPERSET_CONFIG: tests.integration_tests.superset_test_config
      SUPERSET__SQLALCHEMY_DATABASE_URI: postgresql+psycopg2://superset:superset@127.0.0.1:15432/superset
      PYTHONPATH: ${{ github.workspace }}
      REDIS_PORT: 16379
      GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
      APPLITOOLS_APP_NAME: Superset
      APPLITOOLS_API_KEY: ${{ secrets.APPLITOOLS_API_KEY }}
      APPLITOOLS_BATCH_ID: ${{ github.sha }}
      APPLITOOLS_BATCH_NAME: Superset Cypress
    services:
      postgres:
        image: postgres:14-alpine
        env:
          POSTGRES_USER: superset
          POSTGRES_PASSWORD: superset
        ports:
          - 15432:5432
      redis:
        image: redis:5-alpine
        ports:
          - 16379:6379
    steps:
      - name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
        uses: actions/checkout@v3
        with:
          persist-credentials: false
          submodules: recursive
          ref: master
      - name: Setup Python
        uses: actions/setup-python@v2
        with:
          python-version: "3.8"
      - name: OS dependencies
        uses: ./.github/actions/cached-dependencies
        with:
          run: apt-get-install
      - name: Install python dependencies
        uses: ./.github/actions/cached-dependencies
        with:
          run: |
            pip-upgrade
            pip install -r requirements/testing.txt
      - name: Setup postgres
        uses: ./.github/actions/cached-dependencies
        with:
          run: setup-postgres
      - name: Import test data
        uses: ./.github/actions/cached-dependencies
        with:
          run: testdata
      - name: Setup Node.js
        uses: actions/setup-node@v2
        with:
          node-version: ${{ matrix.node }}
      - name: Install npm dependencies
        uses: ./.github/actions/cached-dependencies
        with:
          run: npm-install
      - name: Build javascript packages
        uses: ./.github/actions/cached-dependencies
        with:
          run: build-instrumented-assets
      - name: Install cypress
        uses: ./.github/actions/cached-dependencies
        with:
          run: cypress-install
      - name: Run Cypress
        uses: ./.github/actions/cached-dependencies
        env:
          CYPRESS_BROWSER: ${{ matrix.browser }}
        with:
          run: cypress-run-applitools
```
```
@@ -1,40 +0,0 @@
name: Applitools Storybook

on:
  schedule:
    - cron: "0 0 * * *"

env:
  APPLITOOLS_APP_NAME: Superset
  APPLITOOLS_API_KEY: ${{ secrets.APPLITOOLS_API_KEY }}
  APPLITOOLS_BATCH_ID: ${{ github.sha }}
  APPLITOOLS_BATCH_NAME: Superset Storybook

jobs:
  cron:
    runs-on: ubuntu-20.04
    strategy:
      matrix:
        node: [16]
    steps:
      - name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
        uses: actions/checkout@v3
        with:
          persist-credentials: false
          submodules: recursive
          ref: master
      - name: Set up Node.js
        uses: actions/setup-node@v3.1.1
        with:
          node-version: ${{ matrix.node }}
      - name: Install eyes-storybook dependencies
        uses: ./.github/actions/cached-dependencies
        with:
          run: eyes-storybook-dependencies
      - name: Install NPM dependencies
        uses: ./.github/actions/cached-dependencies
        with:
          run: npm-install
      - name: Run Applitools Eyes-Storybook
        working-directory: ./superset-frontend
        run: npx eyes-storybook -u https://superset-storybook.netlify.app/
```
`.github/workflows/superset-cli.yml` (76 lines changed, vendored)

```
@@ -1,76 +0,0 @@
name: Superset CLI tests

on:
  push:
    branches-ignore:
      - "dependabot/npm_and_yarn/**"
  pull_request:
    types: [synchronize, opened, reopened, ready_for_review]

jobs:
  test-load-examples:
    if: github.event.pull_request.draft == false
    runs-on: ubuntu-20.04
    strategy:
      matrix:
        python-version: [3.9]
    env:
      PYTHONPATH: ${{ github.workspace }}
      SUPERSET_CONFIG: tests.integration_tests.superset_test_config
      REDIS_PORT: 16379
      SUPERSET__SQLALCHEMY_DATABASE_URI: postgresql+psycopg2://superset:superset@127.0.0.1:15432/superset
    services:
      postgres:
        image: postgres:14-alpine
        env:
          POSTGRES_USER: superset
          POSTGRES_PASSWORD: superset
        ports:
          # Use custom ports for services to avoid accidentally connecting to
          # GitHub action runner's default installations
          - 15432:5432
      redis:
        image: redis:5-alpine
        ports:
          - 16379:6379
    steps:
      - name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
        uses: actions/checkout@v2
        with:
          persist-credentials: false
          submodules: recursive
      - name: Check if python changes are present
        id: check
        env:
          GITHUB_REPO: ${{ github.repository }}
          PR_NUMBER: ${{ github.event.pull_request.number }}
        continue-on-error: true
        run: ./scripts/ci_check_no_file_changes.sh python
      - name: Setup Python
        if: steps.check.outcome == 'failure'
        uses: actions/setup-python@v2
        with:
          python-version: ${{ matrix.python-version }}
          cache: 'pip'
          cache-dependency-path: 'requirements/testing.txt'
      - name: Install dependencies
        if: steps.check.outcome == 'failure'
        uses: ./.github/actions/cached-dependencies
        with:
          run: |
            apt-get-install
            pip-upgrade
            pip install wheel
            pip install -r requirements/testing.txt
            setup-postgres
      - name: superset init
        if: steps.check.outcome == 'failure'
        run: |
          pip install -e .
          superset db upgrade
          superset load_test_users
      - name: superset load_examples
        if: steps.check.outcome == 'failure'
        run: |
          # load examples without test data
          superset load_examples --load-big-data
```
`.github/workflows/superset-docs.yml` (19 lines changed, vendored)

```
@@ -12,28 +12,31 @@ jobs:
  build-deploy:
    name: Build & Deploy
    runs-on: ubuntu-20.04
    defaults:
      run:
        working-directory: docs
    steps:
      - name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
        uses: actions/checkout@v2
        with:
          persist-credentials: false
          submodules: recursive
      - name: yarn install
      - name: npm install
        working-directory: ./docs
        run: |
          yarn install --check-cache
      - name: yarn build
          npm install
      - name: lint
        working-directory: ./docs
        run: |
          yarn build
          npm run lint
      - name: gatsby build
        working-directory: ./docs
        run: |
          npm run build
      - name: deploy docs
        if: github.ref == 'refs/heads/master'
        uses: ./.github/actions/github-action-push-to-another-repository
        env:
          API_TOKEN_GITHUB: ${{ secrets.SUPERSET_SITE_BUILD }}
        with:
          source-directory: './docs/build'
          source-directory: './docs/public'
          destination-github-username: 'apache'
          destination-repository-name: 'superset-site'
          target-branch: 'asf-site'
```
`.github/workflows/superset-e2e.yml` (29 lines changed, vendored)

```
@@ -24,14 +24,15 @@ jobs:
        browser: ["chrome"]
    env:
      FLASK_ENV: development
      SUPERSET_CONFIG: tests.integration_tests.superset_test_config
      ENABLE_REACT_CRUD_VIEWS: true
      SUPERSET_CONFIG: tests.superset_test_config
      SUPERSET__SQLALCHEMY_DATABASE_URI: postgresql+psycopg2://superset:superset@127.0.0.1:15432/superset
      PYTHONPATH: ${{ github.workspace }}
      REDIS_PORT: 16379
      GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
    services:
      postgres:
        image: postgres:14-alpine
        image: postgres:10-alpine
        env:
          POSTGRES_USER: superset
          POSTGRES_PASSWORD: superset
@@ -66,12 +67,13 @@ jobs:
        if: steps.check.outcome == 'failure'
        uses: actions/setup-python@v2
        with:
          python-version: "3.8"
          python-version: "3.7"
      - name: OS dependencies
        if: steps.check.outcome == 'failure'
        uses: ./.github/actions/cached-dependencies
        with:
          run: apt-get-install
          run: |
            apt-get-install
      - name: Install python dependencies
        if: steps.check.outcome == 'failure'
        uses: ./.github/actions/cached-dependencies
@@ -83,31 +85,32 @@ jobs:
        if: steps.check.outcome == 'failure'
        uses: ./.github/actions/cached-dependencies
        with:
          run: setup-postgres
          run: |
            setup-postgres
      - name: Import test data
        if: steps.check.outcome == 'failure'
        uses: ./.github/actions/cached-dependencies
        with:
          run: testdata
      - name: Setup Node.js
        uses: actions/setup-node@v2
        with:
          node-version: "16"
          run: |
            testdata
      - name: Install npm dependencies
        if: steps.check.outcome == 'failure'
        uses: ./.github/actions/cached-dependencies
        with:
          run: npm-install
          run: |
            npm-install
      - name: Build javascript packages
        if: steps.check.outcome == 'failure'
        uses: ./.github/actions/cached-dependencies
        with:
          run: build-instrumented-assets
          run: |
            build-instrumented-assets
      - name: Install cypress
        if: steps.check.outcome == 'failure'
        uses: ./.github/actions/cached-dependencies
        with:
          run: cypress-install
          run: |
            cypress-install
      - name: Run Cypress
        if: steps.check.outcome == 'failure'
        uses: ./.github/actions/cached-dependencies
```
`.github/workflows/superset-frontend.yml` (20 lines changed, vendored)

```
@@ -18,8 +18,6 @@ jobs:
        with:
          persist-credentials: false
          submodules: recursive
      - name: Check npm lock file version
        run: ./scripts/ci_check_npm_lock_version.sh ./superset-frontend/package-lock.json
      - name: Check if frontend changes are present
        id: check
        env:
@@ -27,11 +25,6 @@ jobs:
          PR_NUMBER: ${{ github.event.pull_request.number }}
        continue-on-error: true
        run: ./scripts/ci_check_no_file_changes.sh frontend
      - name: Setup Node.js
        if: steps.check.outcome == 'failure'
        uses: actions/setup-node@v2
        with:
          node-version: "16"
      - name: Install dependencies
        if: steps.check.outcome == 'failure'
        uses: ./.github/actions/cached-dependencies
@@ -43,24 +36,11 @@ jobs:
        run: |
          npm run lint
          npm run prettier-check
      - name: Build plugins packages
        if: steps.check.outcome == 'failure'
        working-directory: ./superset-frontend
        run: npm run plugins:build
      - name: Build plugins Storybook
        if: steps.check.outcome == 'failure'
        working-directory: ./superset-frontend
        run: npm run plugins:build-storybook
      - name: unit tests
        if: steps.check.outcome == 'failure'
        working-directory: ./superset-frontend
        run: |
          npm run test -- --coverage
      # todo: remove this step when fix generator as a project in root jest.config.js
      - name: generator-superset unit tests
        if: steps.check.outcome == 'failure'
        working-directory: ./superset-frontend/packages/generator-superset
        run: npx jest
      - name: Upload code coverage
        if: steps.check.outcome == 'failure'
        working-directory: ./superset-frontend
```
`.github/workflows/superset-helm-lint.yml` (2 lines changed, vendored)

```
@@ -22,7 +22,7 @@ jobs:
      - uses: actions/setup-python@v2
        with:
          python-version: 3.8
          python-version: 3.7

      - name: Set up chart-testing
        uses: ./.github/actions/chart-testing-action
```
@@ -1,202 +0,0 @@
# Python integration tests
name: Python-Integration

on:
  push:
    branches-ignore:
      - "dependabot/npm_and_yarn/**"
  pull_request:
    types: [synchronize, opened, reopened, ready_for_review]

jobs:
  test-mysql:
    if: github.event.pull_request.draft == false
    runs-on: ubuntu-20.04
    strategy:
      matrix:
        python-version: [3.8]
    env:
      PYTHONPATH: ${{ github.workspace }}
      SUPERSET_CONFIG: tests.integration_tests.superset_test_config
      REDIS_PORT: 16379
      SUPERSET__SQLALCHEMY_DATABASE_URI: |
        mysql+mysqldb://superset:superset@127.0.0.1:13306/superset?charset=utf8mb4&binary_prefix=true
    services:
      mysql:
        image: mysql:5.7
        env:
          MYSQL_ROOT_PASSWORD: root
        ports:
          - 13306:3306
      redis:
        image: redis:5-alpine
        options: --entrypoint redis-server
        ports:
          - 16379:6379
    steps:
      - name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
        uses: actions/checkout@v2
        with:
          persist-credentials: false
          submodules: recursive
      - name: Check if python changes are present
        id: check
        env:
          GITHUB_REPO: ${{ github.repository }}
          PR_NUMBER: ${{ github.event.pull_request.number }}
        continue-on-error: true
        run: ./scripts/ci_check_no_file_changes.sh python
      - name: Setup Python
        if: steps.check.outcome == 'failure'
        uses: actions/setup-python@v2
        with:
          python-version: ${{ matrix.python-version }}
          cache: 'pip'
          cache-dependency-path: 'requirements/testing.txt'
      - name: Install dependencies
        if: steps.check.outcome == 'failure'
        uses: ./.github/actions/cached-dependencies
        with:
          run: |
            apt-get-install
            pip-upgrade
            pip install wheel
            pip install -r requirements/testing.txt
            setup-mysql
      - name: Run celery
        if: steps.check.outcome == 'failure'
        run: celery --app=superset.tasks.celery_app:app worker -Ofair -c 2 &
      - name: Python integration tests (MySQL)
        if: steps.check.outcome == 'failure'
        run: |
          ./scripts/python_tests.sh
      - name: Upload code coverage
        if: steps.check.outcome == 'failure'
        run: |
          bash .github/workflows/codecov.sh -c -F python -F mysql

  test-postgres:
    if: github.event.pull_request.draft == false
    runs-on: ubuntu-20.04
    strategy:
      matrix:
        python-version: [3.8, 3.9]
    env:
      PYTHONPATH: ${{ github.workspace }}
      SUPERSET_CONFIG: tests.integration_tests.superset_test_config
      REDIS_PORT: 16379
      SUPERSET__SQLALCHEMY_DATABASE_URI: postgresql+psycopg2://superset:superset@127.0.0.1:15432/superset
    services:
      postgres:
        image: postgres:14-alpine
        env:
          POSTGRES_USER: superset
          POSTGRES_PASSWORD: superset
        ports:
          # Use custom ports for services to avoid accidentally connecting to
          # GitHub action runner's default installations
          - 15432:5432
      redis:
        image: redis:5-alpine
        ports:
          - 16379:6379
    steps:
      - name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
        uses: actions/checkout@v2
        with:
          persist-credentials: false
          submodules: recursive
      - name: Check if python changes are present
        id: check
        env:
          GITHUB_REPO: ${{ github.repository }}
          PR_NUMBER: ${{ github.event.pull_request.number }}
        continue-on-error: true
        run: ./scripts/ci_check_no_file_changes.sh python
      - name: Setup Python
        if: steps.check.outcome == 'failure'
        uses: actions/setup-python@v2
        with:
          python-version: ${{ matrix.python-version }}
          cache: 'pip'
          cache-dependency-path: 'requirements/testing.txt'
      - name: Install dependencies
        if: steps.check.outcome == 'failure'
        uses: ./.github/actions/cached-dependencies
        with:
          run: |
            apt-get-install
            pip-upgrade
            pip install wheel
            pip install -r requirements/testing.txt
            setup-postgres
      - name: Run celery
        if: steps.check.outcome == 'failure'
        run: celery --app=superset.tasks.celery_app:app worker -Ofair -c 2 &
      - name: Python integration tests (PostgreSQL)
        if: steps.check.outcome == 'failure'
        run: |
          ./scripts/python_tests.sh
      - name: Upload code coverage
        if: steps.check.outcome == 'failure'
        run: |
          bash .github/workflows/codecov.sh -c -F python -F postgres

  test-sqlite:
    if: github.event.pull_request.draft == false
    runs-on: ubuntu-20.04
    strategy:
      matrix:
        python-version: [3.8]
    env:
      PYTHONPATH: ${{ github.workspace }}
      SUPERSET_CONFIG: tests.integration_tests.superset_test_config
      REDIS_PORT: 16379
      SUPERSET__SQLALCHEMY_DATABASE_URI: |
        sqlite:///${{ github.workspace }}/.temp/unittest.db
    services:
      redis:
        image: redis:5-alpine
        ports:
          - 16379:6379
    steps:
      - name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
        uses: actions/checkout@v2
        with:
          persist-credentials: false
          submodules: recursive
      - name: Check if python changes are present
        id: check
        env:
          GITHUB_REPO: ${{ github.repository }}
          PR_NUMBER: ${{ github.event.pull_request.number }}
        continue-on-error: true
        run: ./scripts/ci_check_no_file_changes.sh python
      - name: Setup Python
        if: steps.check.outcome == 'failure'
        uses: actions/setup-python@v2
        with:
          python-version: ${{ matrix.python-version }}
          cache: 'pip'
          cache-dependency-path: 'requirements/testing.txt'
      - name: Install dependencies
        if: steps.check.outcome == 'failure'
        uses: ./.github/actions/cached-dependencies
        with:
          run: |
            apt-get-install
            pip-upgrade
            pip install wheel
            pip install -r requirements/testing.txt
            mkdir ${{ github.workspace }}/.temp
      - name: Run celery
        if: steps.check.outcome == 'failure'
        run: celery --app=superset.tasks.celery_app:app worker -Ofair -c 2 &
      - name: Python integration tests (SQLite)
        if: steps.check.outcome == 'failure'
        run: |
          ./scripts/python_tests.sh
      - name: Upload code coverage
        if: steps.check.outcome == 'failure'
        run: |
          bash .github/workflows/codecov.sh -c -F python -F sqlite
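Every step in the workflow above is gated on `steps.check.outcome == 'failure'`: the change-detection step runs with `continue-on-error: true` and is expected to exit non-zero when relevant files changed, so the expensive test steps only run when there is Python to test. A minimal sketch of that gating idea (hypothetical illustration only, not the contents of the actual `scripts/ci_check_no_file_changes.sh`; the file list is made up):

```shell
# Hypothetical sketch of the change-detection gate: scan the changed
# files and flag whether any match the watched suffix (.py here).
# In the real workflow a non-zero exit from this check is what makes
# the `if: steps.check.outcome == 'failure'` steps run.
changed_files="superset/models/core.py docs/README.md"

needs_tests=0
for f in $changed_files; do
  case "$f" in
    *.py) needs_tests=1 ;;  # a Python file changed: run the test matrix
  esac
done
echo "needs_tests=$needs_tests"
```

With the sample list above, the `.py` entry flips the flag, which is the case where CI would proceed to run the integration tests.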
19  .github/workflows/superset-python-misc.yml  vendored
@@ -14,7 +14,7 @@ jobs:
    runs-on: ubuntu-20.04
    strategy:
      matrix:
        python-version: [3.8]
        python-version: [3.7]
    steps:
      - name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
        uses: actions/checkout@v2
@@ -29,12 +29,9 @@ jobs:
        continue-on-error: true
        run: ./scripts/ci_check_no_file_changes.sh python
      - name: Setup Python
        if: steps.check.outcome == 'failure'
        uses: actions/setup-python@v2
        with:
          python-version: ${{ matrix.python-version }}
          cache: 'pip'
          cache-dependency-path: 'requirements/testing.txt'
      - name: Install dependencies
        if: steps.check.outcome == 'failure'
        uses: ./.github/actions/cached-dependencies
@@ -42,7 +39,6 @@ jobs:
          run: |
            apt-get-install
            pip-upgrade
            pip install wheel
            pip install -r requirements/testing.txt
      - name: pylint
        if: steps.check.outcome == 'failure'
@@ -54,7 +50,7 @@ jobs:
    runs-on: ubuntu-20.04
    strategy:
      matrix:
        python-version: [3.8]
        python-version: [3.7]
    steps:
      - name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
        uses: actions/checkout@v2
@@ -65,18 +61,12 @@ jobs:
        uses: actions/setup-python@v2
        with:
          python-version: ${{ matrix.python-version }}
          cache: 'pip'
          cache-dependency-path: |
            requirements/base.txt
            requirements/integration.txt
      - name: Install dependencies
        uses: ./.github/actions/cached-dependencies
        with:
          run: |
            apt-get-install
            pip-upgrade
            pip install wheel
            pip install -r requirements/base.txt
            pip install -r requirements/integration.txt
      - name: pre-commit
        run: pre-commit run --all-files
@@ -86,7 +76,7 @@ jobs:
    runs-on: ubuntu-20.04
    strategy:
      matrix:
        python-version: [3.8]
        python-version: [3.7]
    steps:
      - name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
        uses: actions/checkout@v2
@@ -97,15 +87,12 @@ jobs:
        uses: actions/setup-python@v2
        with:
          python-version: ${{ matrix.python-version }}
          cache: 'pip'
          cache-dependency-path: 'requirements/base.txt'
      - name: Install dependencies
        uses: ./.github/actions/cached-dependencies
        with:
          run: |
            apt-get-install
            pip-upgrade
            pip install wheel
            pip install -r requirements/base.txt
      - name: Test babel extraction
        run: flask fab babel-extract --target superset/translations --output superset/translations/messages.pot --config superset/translations/babel.cfg -k _,__,t,tn,tct
@@ -17,13 +17,13 @@ jobs:
        python-version: [3.8]
    env:
      PYTHONPATH: ${{ github.workspace }}
      SUPERSET_CONFIG: tests.integration_tests.superset_test_config
      SUPERSET_CONFIG: tests.superset_test_config
      REDIS_PORT: 16379
      SUPERSET__SQLALCHEMY_DATABASE_URI: postgresql+psycopg2://superset:superset@127.0.0.1:15432/superset
      SUPERSET__SQLALCHEMY_EXAMPLES_URI: presto://localhost:15433/memory/default
    services:
      postgres:
        image: postgres:14-alpine
        image: postgres:10-alpine
        env:
          POSTGRES_USER: superset
          POSTGRES_PASSWORD: superset
@@ -62,8 +62,6 @@ jobs:
        uses: actions/setup-python@v2
        with:
          python-version: ${{ matrix.python-version }}
          cache: 'pip'
          cache-dependency-path: 'requirements/testing.txt'
      - name: Install dependencies
        if: steps.check.outcome == 'failure'
        uses: ./.github/actions/cached-dependencies
@@ -71,16 +69,15 @@ jobs:
          run: |
            apt-get-install
            pip-upgrade
            pip install wheel
            pip install -r requirements/testing.txt
            setup-postgres
      - name: Run celery
        if: steps.check.outcome == 'failure'
        run: celery --app=superset.tasks.celery_app:app worker -Ofair -c 2 &
        run: celery worker --app=superset.tasks.celery_app:app -Ofair -c 2 &
      - name: Python unit tests (PostgreSQL)
        if: steps.check.outcome == 'failure'
        run: |
          ./scripts/python_tests.sh -m 'chart_data_flow or sql_json_flow'
          ./scripts/python_tests.sh
      - name: Upload code coverage
        if: steps.check.outcome == 'failure'
        run: |
@@ -94,14 +91,14 @@ jobs:
        python-version: [3.8]
    env:
      PYTHONPATH: ${{ github.workspace }}
      SUPERSET_CONFIG: tests.integration_tests.superset_test_config
      SUPERSET_CONFIG: tests.superset_test_config
      REDIS_PORT: 16379
      SUPERSET__SQLALCHEMY_DATABASE_URI: postgresql+psycopg2://superset:superset@127.0.0.1:15432/superset
      SUPERSET__SQLALCHEMY_EXAMPLES_URI: hive://localhost:10000/default
      UPLOAD_FOLDER: /tmp/.superset/uploads/
    services:
      postgres:
        image: postgres:14-alpine
        image: postgres:10-alpine
        env:
          POSTGRES_USER: superset
          POSTGRES_PASSWORD: superset
@@ -140,8 +137,6 @@ jobs:
        uses: actions/setup-python@v2
        with:
          python-version: ${{ matrix.python-version }}
          cache: 'pip'
          cache-dependency-path: 'requirements/testing.txt'
      - name: Install dependencies
        if: steps.check.outcome == 'failure'
        uses: ./.github/actions/cached-dependencies
@@ -149,16 +144,15 @@ jobs:
          run: |
            apt-get-install
            pip-upgrade
            pip install wheel
            pip install -r requirements/testing.txt
            setup-postgres
      - name: Run celery
        if: steps.check.outcome == 'failure'
        run: celery --app=superset.tasks.celery_app:app worker -Ofair -c 2 &
        run: celery worker --app=superset.tasks.celery_app:app -Ofair -c 2 &
      - name: Python unit tests (PostgreSQL)
        if: steps.check.outcome == 'failure'
        run: |
          ./scripts/python_tests.sh -m 'chart_data_flow or sql_json_flow'
          ./scripts/python_tests.sh
      - name: Upload code coverage
        if: steps.check.outcome == 'failure'
        run: |
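The hunks above swap the order of the Celery worker arguments. This matches the CLI change between Celery 4 and Celery 5: in Celery 5, app-level options such as `--app` must come before the subcommand, whereas Celery 4 accepted them after it. A side-by-side sketch of the two invocations as they appear in the diff:

```shell
# Celery 4 style: options after the `worker` subcommand
# (the form being replaced in the hunks above).
CELERY4="celery worker --app=superset.tasks.celery_app:app -Ofair -c 2"

# Celery 5 style: the global --app option precedes the subcommand
# (the form the newer side of the diff uses).
CELERY5="celery --app=superset.tasks.celery_app:app worker -Ofair -c 2"

echo "$CELERY5"
```

Both spawn the same worker against the `superset.tasks.celery_app:app` application; only the argument ordering differs between major Celery versions.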
157  .github/workflows/superset-python-unittest.yml  vendored
@@ -1,5 +1,5 @@
# Python unit tests
name: Python-Unit
name: Python

on:
  push:
@@ -9,14 +9,30 @@ on:
    types: [synchronize, opened, reopened, ready_for_review]

jobs:
  unit-tests:
  test-mysql:
    if: github.event.pull_request.draft == false
    runs-on: ubuntu-20.04
    strategy:
      matrix:
        python-version: [3.8, 3.9]
        python-version: [3.7]
    env:
      PYTHONPATH: ${{ github.workspace }}
      SUPERSET_CONFIG: tests.superset_test_config
      REDIS_PORT: 16379
      SUPERSET__SQLALCHEMY_DATABASE_URI: |
        mysql+mysqldb://superset:superset@127.0.0.1:13306/superset?charset=utf8mb4&binary_prefix=true
    services:
      mysql:
        image: mysql:5.7
        env:
          MYSQL_ROOT_PASSWORD: root
        ports:
          - 13306:3306
      redis:
        image: redis:5-alpine
        options: --entrypoint redis-server
        ports:
          - 16379:6379
    steps:
      - name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
        uses: actions/checkout@v2
@@ -35,9 +51,6 @@ jobs:
        uses: actions/setup-python@v2
        with:
          python-version: ${{ matrix.python-version }}
          cache: 'pip'
          cache-dependency-path: 'requirements/testing.txt'
      # TODO: separated requirements.txt file just for unit tests
      - name: Install dependencies
        if: steps.check.outcome == 'failure'
        uses: ./.github/actions/cached-dependencies
@@ -45,14 +58,136 @@ jobs:
          run: |
            apt-get-install
            pip-upgrade
            pip install wheel
            pip install -r requirements/testing.txt
            mkdir ${{ github.workspace }}/.temp
      - name: Python unit tests
            setup-mysql
      - name: Run celery
        if: steps.check.outcome == 'failure'
        run: celery worker --app=superset.tasks.celery_app:app -Ofair -c 2 &
      - name: Python unit tests (MySQL)
        if: steps.check.outcome == 'failure'
        run: |
          pytest --durations-min=0.5 --cov-report= --cov=superset ./tests/common ./tests/unit_tests --cache-clear
          ./scripts/python_tests.sh
      - name: Upload code coverage
        if: steps.check.outcome == 'failure'
        run: |
          bash .github/workflows/codecov.sh -c -F python -F unit
          bash .github/workflows/codecov.sh -c -F python -F mysql

  test-postgres:
    if: github.event.pull_request.draft == false
    runs-on: ubuntu-20.04
    strategy:
      matrix:
        python-version: [3.7, 3.8]
    env:
      PYTHONPATH: ${{ github.workspace }}
      SUPERSET_CONFIG: tests.superset_test_config
      REDIS_PORT: 16379
      SUPERSET__SQLALCHEMY_DATABASE_URI: postgresql+psycopg2://superset:superset@127.0.0.1:15432/superset
    services:
      postgres:
        image: postgres:10-alpine
        env:
          POSTGRES_USER: superset
          POSTGRES_PASSWORD: superset
        ports:
          # Use custom ports for services to avoid accidentally connecting to
          # GitHub action runner's default installations
          - 15432:5432
      redis:
        image: redis:5-alpine
        ports:
          - 16379:6379
    steps:
      - name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
        uses: actions/checkout@v2
        with:
          persist-credentials: false
          submodules: recursive
      - name: Check if python changes are present
        id: check
        env:
          GITHUB_REPO: ${{ github.repository }}
          PR_NUMBER: ${{ github.event.pull_request.number }}
        continue-on-error: true
        run: ./scripts/ci_check_no_file_changes.sh python
      - name: Setup Python
        if: steps.check.outcome == 'failure'
        uses: actions/setup-python@v2
        with:
          python-version: ${{ matrix.python-version }}
      - name: Install dependencies
        if: steps.check.outcome == 'failure'
        uses: ./.github/actions/cached-dependencies
        with:
          run: |
            apt-get-install
            pip-upgrade
            pip install -r requirements/testing.txt
            setup-postgres
      - name: Run celery
        if: steps.check.outcome == 'failure'
        run: celery worker --app=superset.tasks.celery_app:app -Ofair -c 2 &
      - name: Python unit tests (PostgreSQL)
        if: steps.check.outcome == 'failure'
        run: |
          ./scripts/python_tests.sh
      - name: Upload code coverage
        if: steps.check.outcome == 'failure'
        run: |
          bash .github/workflows/codecov.sh -c -F python -F postgres

  test-sqlite:
    if: github.event.pull_request.draft == false
    runs-on: ubuntu-20.04
    strategy:
      matrix:
        python-version: [3.7]
    env:
      PYTHONPATH: ${{ github.workspace }}
      SUPERSET_CONFIG: tests.superset_test_config
      REDIS_PORT: 16379
      SUPERSET__SQLALCHEMY_DATABASE_URI: |
        sqlite:///${{ github.workspace }}/.temp/unittest.db
    services:
      redis:
        image: redis:5-alpine
        ports:
          - 16379:6379
    steps:
      - name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
        uses: actions/checkout@v2
        with:
          persist-credentials: false
          submodules: recursive
      - name: Check if python changes are present
        id: check
        env:
          GITHUB_REPO: ${{ github.repository }}
          PR_NUMBER: ${{ github.event.pull_request.number }}
        continue-on-error: true
        run: ./scripts/ci_check_no_file_changes.sh python
      - name: Setup Python
        if: steps.check.outcome == 'failure'
        uses: actions/setup-python@v2
        with:
          python-version: ${{ matrix.python-version }}
      - name: Install dependencies
        if: steps.check.outcome == 'failure'
        uses: ./.github/actions/cached-dependencies
        with:
          run: |
            apt-get-install
            pip-upgrade
            pip install -r requirements/testing.txt
            mkdir ${{ github.workspace }}/.temp
      - name: Run celery
        if: steps.check.outcome == 'failure'
        run: celery worker --app=superset.tasks.celery_app:app -Ofair -c 2 &
      - name: Python unit tests (SQLite)
        if: steps.check.outcome == 'failure'
        run: |
          ./scripts/python_tests.sh
      - name: Upload code coverage
        if: steps.check.outcome == 'failure'
        run: |
          bash .github/workflows/codecov.sh -c -F python -F sqlite
11  .github/workflows/superset-translations.yml  vendored
@@ -17,10 +17,6 @@ jobs:
        with:
          persist-credentials: false
          submodules: recursive
      - name: Setup Node.js
        uses: actions/setup-node@v2
        with:
          node-version: '16'
      - name: Install dependencies
        uses: ./.github/actions/cached-dependencies
        with:
@@ -35,7 +31,7 @@ jobs:
    runs-on: ubuntu-20.04
    strategy:
      matrix:
        python-version: [3.8]
        python-version: [3.7]
    steps:
      - name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
        uses: actions/checkout@v2
@@ -54,4 +50,7 @@ jobs:
            pip-upgrade
            pip install -r requirements/base.txt
      - name: Test babel extraction
        run: ./scripts/babel_update.sh
        run: |
          flask fab babel-extract --target superset/translations \
            --output superset/translations/messages.pot \
            --config superset/translations/babel.cfg -k _,__,t,tn,tct
25  .github/workflows/welcome-new-users.yml  vendored
@@ -1,25 +0,0 @@
name: Welcome New Contributor

on:
  pull_request_target:
    types: [opened]

jobs:
  welcome:
    runs-on: ubuntu-latest
    permissions:
      issues: write

    steps:
      - name: Welcome Message
        uses: actions/first-interaction@v1.0.0
        with:
          repo-token: ${{ secrets.GITHUB_TOKEN }}
          pr-message: |-
            Congrats on making your first PR and thank you for contributing to Superset! :tada: :heart:
            We hope to see you in our [Slack](https://apache-superset.slack.com/) community too!
      - name: First Time Label
        uses: andymckay/labeler@master
        with:
          add-labels: "new:contributor"
          repo-token: ${{ secrets.GITHUB_TOKEN }}
23  .gitignore  vendored
@@ -14,6 +14,7 @@
# See the License for the specific language governing permissions and
# limitations under the License.
#
*.ipynb
*.bak
*.db
*.pyc
@@ -60,38 +61,20 @@ tmp
rat-results.txt
superset/app/

# Node.js, webpack artifacts, storybook
# Node.js, webpack artifacts
*.entry.js
*.js.map
node_modules
npm-debug.log*
superset/static/assets
superset/static/version_info.json
superset-frontend/**/esm/*
superset-frontend/**/lib/*
superset-frontend/**/storybook-static/*
yarn-error.log
*.map
*.min.js
test-changelog.md
*.tsbuildinfo

# Ignore package-lock in packages
plugins/*/package-lock.json
packages/*/package-lock.json

# For country map geojson conversion script
.ipynb_checkpoints/
scripts/*.zip

# IntelliJ
*.iml
venv
@eaDir/

# PyCharm
.run

# Test data
celery_results.sqlite
celerybeat-schedule
@@ -108,5 +91,3 @@ release.json
messages.mo

docker/requirements-local.txt

cache/
@@ -15,17 +15,20 @@
# limitations under the License.
#
repos:
  - repo: https://github.com/PyCQA/isort
    rev: 5.9.3
  - repo: https://github.com/asottile/seed-isort-config
    rev: v1.9.3
    hooks:
      - id: seed-isort-config
  - repo: https://github.com/pre-commit/mirrors-isort
    rev: v4.3.21
    hooks:
      - id: isort
  - repo: https://github.com/pre-commit/mirrors-mypy
    rev: v0.941
    rev: v0.790
    hooks:
      - id: mypy
        additional_dependencies: [types-all]
  - repo: https://github.com/peterdemin/pip-compile-multi
    rev: v2.4.1
    rev: v1.5.8
    hooks:
      - id: pip-compile-multi-verify
  - repo: https://github.com/pre-commit/pre-commit-hooks
@@ -33,7 +36,6 @@ repos:
    hooks:
      - id: check-docstring-first
      - id: check-added-large-files
        exclude: \.(geojson)$
      - id: check-yaml
        exclude: ^helm/superset/templates/
      - id: debug-statements
@@ -41,19 +43,7 @@ repos:
      - id: trailing-whitespace
        args: ["--markdown-linebreak-ext=md"]
  - repo: https://github.com/psf/black
    rev: 22.3.0
    rev: 19.10b0
    hooks:
      - id: black
        language_version: python3
  - repo: https://github.com/pre-commit/mirrors-prettier
    rev: v2.4.1  # Use the sha or tag you want to point at
    hooks:
      - id: prettier
        args: ['--ignore-path=./superset-frontend/.prettierignore']
        files: 'superset-frontend'
  # blacklist unsafe functions like make_url (see #19526)
  - repo: https://github.com/skorokithakis/blacklist-pre-commit-hook
    rev: e2f070289d8eddcaec0b580d3bde29437e7c8221
    hooks:
      - id: blacklist
        args: ["--blacklisted-names=make_url", "--ignore=tests/"]
21  .pylintrc
@@ -70,8 +70,7 @@ confidence=
# either give multiple identifier separated by comma (,) or put this option
# multiple time (only on the command line, not in the configuration file where
# it should appear only once). See also the "--disable" option for examples.
enable=
    useless-suppression,
#enable=

# Disable the message, report, category or checker with the given id(s). You
# can either give multiple identifiers separated by comma (,) or put this
@@ -82,12 +81,8 @@ enable=
# --enable=similarities". If you want to run only the classes checker, but have
# no Warning level messages displayed, use"--disable=all --enable=classes
# --disable=W"
disable=
    missing-docstring,
    duplicate-code,
    unspecified-encoding,
    # re-enable once this no longer raises false positives
    too-many-instance-attributes
disable=long-builtin,dict-view-method,intern-builtin,suppressed-message,no-absolute-import,unpacking-in-except,apply-builtin,delslice-method,indexing-exception,old-raise-syntax,print-statement,cmp-builtin,reduce-builtin,useless-suppression,coerce-method,input-builtin,cmp-method,raw_input-builtin,nonzero-method,backtick,basestring-builtin,setslice-method,reload-builtin,oct-method,map-builtin-not-iterating,execfile-builtin,old-octal-literal,zip-builtin-not-iterating,buffer-builtin,getslice-method,metaclass-assignment,xrange-builtin,long-suffix,round-builtin,range-builtin-not-iterating,next-method-called,parameter-unpacking,unicode-builtin,unichr-builtin,import-star-module-level,raising-string,filter-builtin-not-iterating,using-cmp-argument,coerce-builtin,file-builtin,old-division,hex-method,missing-docstring,too-many-lines,ungrouped-imports,import-outside-toplevel,raise-missing-from,super-with-arguments,bad-option-value,too-few-public-methods


[REPORTS]

@@ -120,7 +115,7 @@ evaluation=10.0 - ((float(5 * error + warning + refactor + convention) / stateme
[BASIC]

# Good variable names which should always be accepted, separated by a comma
good-names=_,df,ex,f,i,id,j,k,l,o,pk,Run,ts,v,x,y
good-names=_,df,ex,f,i,id,j,k,l,o,pk,Run,ts,v,x

# Bad variable names which should always be refused, separated by a comma
bad-names=fd,foo,bar,baz,toto,tutu,tata
@@ -214,7 +209,7 @@ max-nested-blocks=5
[FORMAT]

# Maximum number of characters on a single line.
max-line-length=90
max-line-length=88

# Regexp for a line that is allowed to be longer than the limit.
ignore-long-lines=^\s*(# )?<?https?://\S+>?$
@@ -313,7 +308,7 @@ generated-members=
# List of decorators that produce context managers, such as
# contextlib.contextmanager. Add to this list to register other decorators that
# produce valid context managers.
contextmanager-decorators=contextlib.contextmanager
contextmanager-decorators=contextlib.contextmanager,contextlib2.contextmanager


[VARIABLES]

@@ -370,7 +365,7 @@ max-locals=15
max-returns=10

# Maximum number of branch for function / method body
max-branches=15
max-branches=12

# Maximum number of statements in function / method body
max-statements=50
@@ -379,7 +374,7 @@ max-statements=50
max-parents=7

# Maximum number of attributes for a class (see R0902).
max-attributes=8
max-attributes=7

# Minimum number of public methods for a class (see R0903).
min-public-methods=2
@@ -35,9 +35,8 @@ apache_superset.egg-info
.*csv
# Generated doc files
env/*
docs/README.md
docs/.htaccess*
docs-v2/.htaccess*
.nojekyll
_build/*
_static/*
.buildinfo
@@ -49,16 +48,3 @@ vendor/*
# github configuration
.github/*
.*mdx

# skip license check in superset-ui
tmp/*
lib/*
esm/*
tsconfig.tsbuildinfo
.*ipynb
.*yml
.*iml
.esprintrc
.prettierignore
generator-superset/*
temporary_superset_ui/*
8972  CHANGELOG.md
@@ -106,7 +106,7 @@ This statement thanks the following, on which it draws for content and inspirati

# Slack Community Guidelines

If you decide to join the [Community Slack](https://join.slack.com/t/apache-superset/shared_invite/zt-16jvzmoi8-sI7jKWp~xc2zYRe~NqiY9Q), please adhere to the following rules:
If you decide to join the [Community Slack](https://join.slack.com/t/apache-superset/shared_invite/zt-l5f5e0av-fyYu8tlfdqbMdz_sPLwUqQ), please adhere to the following rules:

**1. Treat everyone in the community with respect.**

@@ -119,7 +119,7 @@ If you decide to join the [Community Slack](https://join.slack.com/t/apache-supe

**3. Ask thoughtful questions.**

- We’re all here to help each other out. The best way to get help is by investing effort into your questions. First check and see if your question is answered in [the Superset documentation](https://superset.apache.org/faq.html) or on [Stack Overflow](https://stackoverflow.com/search?q=apache+superset). You can also check [GitHub issues](https://github.com/apache/superset/issues) to see if your question or feature request has been submitted before. Then, use Slack search to see if your question has already been asked and answered in the past. If you still feel the need to ask a question, make sure you include:
- We’re all here to help each other out. The best way to get help is by investing effort into your questions. First check and see if your question is answered in [the Superset documentation](https://superset.apache.org/faq.html) or on [Stack Overflow](https://stackoverflow.com/search?q=apache+superset). You can also check [Github issues](https://github.com/apache/superset/issues) to see if your question or feature request has been submitted before. Then, use Slack search to see if your question has already been asked and answered in the past. If you still feel the need to ask a question, make sure you include:

  - The steps you’ve already taken
  - Relevant details presented cleanly (text stacktraces, formatted markdown, or screenshots. Please don’t paste large blocks of code unformatted or post photos of your screen from your phone)
277
CONTRIBUTING.md
@@ -55,7 +55,6 @@ little bit helps, and credit will always be given.
|
||||
- [Images](#images)
|
||||
- [Flask server](#flask-server)
|
||||
- [OS Dependencies](#os-dependencies)
|
||||
- [Dependencies](#dependencies)
|
||||
- [Logging to the browser console](#logging-to-the-browser-console)
|
||||
- [Frontend](#frontend)
|
||||
- [Prerequisite](#prerequisite)
|
||||
@@ -69,33 +68,31 @@ little bit helps, and credit will always be given.
|
||||
- [Feature flags](#feature-flags)
|
||||
- [Git Hooks](#git-hooks)
|
||||
- [Linting](#linting)
|
||||
- [Python](#python)
|
||||
- [TypeScript](#typescript)
|
||||
- [Conventions](#conventions)
|
||||
- [Python Conventions](#python-conventions)
|
||||
- [Python](#python)
|
||||
- [Typing](#typing)
|
||||
- [Python Typing](#python-typing)
|
||||
- [TypeScript Typing](#typescript-typing)
|
||||
- [Python](#python-1)
|
||||
- [TypeScript](#typescript)
|
||||
- [Testing](#testing)
|
||||
- [Python Testing](#python-testing)
|
||||
- [Frontend Testing](#frontend-testing)
|
||||
- [Integration Testing](#integration-testing)
|
||||
- [Debugging Server App](#debugging-server-app)
|
||||
- [Debugging Server App in Kubernetes Environment](#debugging-server-app-in-kubernetes-environment)
|
||||
- [Storybook](#storybook)
|
||||
- [Translating](#translating)
|
||||
- [Enabling language selection](#enabling-language-selection)
|
||||
- [Extracting new strings for translation](#extracting-new-strings-for-translation)
|
||||
- [Updating language files](#updating-language-files)
|
||||
- [Creating a new language dictionary](#creating-a-new-language-dictionary)
|
||||
- [Tips](#tips)
|
||||
- [Adding a new datasource](#adding-a-new-datasource)
|
||||
- [Improving visualizations](#improving-visualizations)
|
||||
- [Visualization Plugins](#visualization-plugins)
|
||||
- [Adding a DB migration](#adding-a-db-migration)
|
||||
- [Merging DB migrations](#merging-db-migrations)
|
||||
- [SQL Lab Async](#sql-lab-async)
|
||||
- [Async Chart Queries](#async-chart-queries)
|
||||
- [Chart Parameters](#chart-parameters)
|
||||
- [Datasource \& Chart Type](#datasource--chart-type)
|
||||
- [Datasource & Chart Type](#datasource--chart-type)
|
||||
- [Time](#time)
|
||||
- [GROUP BY](#group-by)
|
||||
- [NOT GROUPED BY](#not-grouped-by)
|
||||
Here's a list of repositories that contain Superset-related packages:

also includes Superset's main TypeScript/JavaScript bundles and react apps under
the [superset-frontend](https://github.com/apache/superset/tree/master/superset-frontend)
folder.
- [apache-superset/superset-ui](https://github.com/apache-superset/superset-ui)
  contains core Superset's
  [npm packages](https://github.com/apache-superset/superset-ui/tree/master/packages).
  These packages are shared across the React apps in the main repository,
  and in visualization plugins.
- [apache-superset/superset-ui-plugins](https://github.com/apache-superset/superset-ui-plugins)
  contains the code for the default visualizations that ship with Superset
  and are maintained by the core community.
- [apache-superset/superset-ui-plugins-deckgl](https://github.com/apache-superset/superset-ui-plugins-deckgl)
  contains the code for the geospatial visualizations that ship with Superset
  and are maintained by the core community.
- [github.com/apache-superset](https://github.com/apache-superset) is the
  GitHub organization under which we manage Superset-related
  small tools, forks and Superset-related experimental ideas.

## Types of Contributions
The purpose is to separate problem from possible solutions.

**Refactor:** For small refactors, it can be a standalone PR itself detailing what you are refactoring and why. If there are concerns, project maintainers may request you to create a `#SIP` for the PR before proceeding.

**Feature/Large changes:** If you intend to change the public API, or make any non-trivial changes to the implementation, we require you to file a new issue as `#SIP` (Superset Improvement Proposal). This lets us reach an agreement on your proposal before you put significant effort into it. You are welcome to submit a PR along with the SIP (sometimes necessary for demonstration), but we will not review/merge the code until the SIP is approved.

In general, small PRs are always easier to review than large PRs. The best practice is to break your work into smaller independent PRs and refer to the same issue. This will greatly reduce turnaround time.
Finally, never submit a PR that will put the master branch in a broken state.

#### Authoring

- Fill in all sections of the PR template.
- Title the PR with one of the following semantic prefixes (inspired by [Karma](http://karma-runner.github.io/0.10/dev/git-commit-msg.html)):

  - `feat` (new feature)
  - `fix` (bug fix)
  - `chore` (updating tasks etc; no application logic change)
  - `perf` (performance-related change)
  - `build` (build tooling, Docker configuration change)
  - `ci` (test runner, GitHub Actions workflow changes)
  - `other` (changes that don't correspond to the above -- should be rare!)
- Examples:
  - `feat: export charts as ZIP files`
Reverting changes that are causing issues in the master branch is a normal and expected part of the development process. In an open source community, the ramifications of a change cannot always be fully understood. With that in mind, here are some considerations to keep in mind when considering a revert:

- **Availability of the PR author:** If the original PR author or the engineer who merged the code is highly available and can provide a fix in a reasonable time frame, this would counter-indicate reverting.
- **Severity of the issue:** How severe is the problem on master? Is it keeping the project from moving forward? Is there user impact? What percentage of users will experience a problem?
- **Size of the change being reverted:** Reverting a single small PR is a much lower-risk proposition than reverting a massive, multi-PR change.
- **Age of the change being reverted:** Reverting a recently-merged PR will be more acceptable than reverting an older PR. A bug discovered in an older PR is unlikely to be causing widespread serious issues.
referenced in the rst, e.g.

.. image:: _static/images/tutorial/tutorial_01_sources_database.png

aren't actually stored in that directory. Instead, you should add and commit
images (and any other static assets) to the `superset-frontend/src/assets/images` directory.
When the docs are deployed to https://superset.apache.org/, images
are copied from there to the `_static/images` directory, just like they're referenced
in the docs.

For example, the image referenced above actually lives in `superset-frontend/src/assets/images/tutorial`. Since the image is moved during the documentation build process, the docs reference the image in `_static/images/tutorial` instead.

### Flask server

#### OS Dependencies

Make sure your machine meets the [OS dependencies](https://superset.apache.org/docs/installation/installing-superset-from-scratch#os-dependencies) before following these steps.
You also need to install MySQL or [MariaDB](https://mariadb.com/downloads).

Ensure that you are using Python version 3.7 or 3.8, then proceed with:

```bash
# Create a virtual environment and activate it (recommended)
python3 -m venv venv # setup a python3 virtualenv
source venv/bin/activate

# Install external dependencies
pip install -r requirements/testing.txt

# Install Superset in editable (development) mode
pip install -e .

# Initialize the database
superset db upgrade

# Create an admin user in your metadata database (use `admin` as username to be able to load the examples)
superset fab create-admin

# Create default roles and permissions
superset init

# Load some data to play with.
# Note: you MUST have previously created an admin user with the username `admin` for this command to work.
superset load-examples

# Start the Flask dev web server from inside your virtualenv.
# Note that your page may not have CSS at this point.
# See instructions below on how to build the front-end assets.
FLASK_ENV=development superset run -p 8088 --with-threads --reload --debugger
```

Or you can install via our Makefile

via `.flaskenv`, however if needed, it should be set to `superset.app:create_app`

If you have made changes to the FAB-managed templates, which are not built the same way as the newer, React-powered front-end assets, you need to start the app without the `--with-threads` argument like so:
`FLASK_ENV=development superset run -p 8088 --reload --debugger`

#### Dependencies

If you add a new requirement or update an existing requirement (per the `install_requires` section in `setup.py`) you must recompile (freeze) the Python dependencies to ensure that for CI, testing, etc. the build is deterministic. This can be achieved via,

```bash
$ python3 -m venv venv
$ source venv/bin/activate
$ python3 -m pip install -r requirements/integration.txt
$ pip-compile-multi --no-upgrade
```

When upgrading the version number of a single package, you should run `pip-compile-multi` with the `-P` flag:

```bash
$ pip-compile-multi -P my-package
```

To bring all dependencies up to date as per the restrictions defined in `setup.py` and `requirements/*.in`, run `pip-compile-multi` without any flags:

```bash
$ pip-compile-multi
```

This should be done periodically, but it is recommended to do thorough manual testing of the application to ensure no breaking changes have been introduced that aren't caught by the unit and integration tests.

#### Logging to the browser console

This feature is only available on Python 3. When debugging your application, you can have the server logs sent directly to the browser console using the [ConsoleLog](https://github.com/betodealmeida/consolelog) package. You need to mutate the app by adding the following to your `config.py` or `superset_config.py`:
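The snippet itself is elided in this view; as a sketch based on the ConsoleLog package's documented usage (treat the exact call signature as an assumption and verify against the package's README):

```python
# Assumed usage of the third-party console_log package
from console_log import ConsoleLog

def FLASK_APP_MUTATOR(app):
    # Wrap the WSGI app so server log records are streamed to the browser console
    app.wsgi_app = ConsoleLog(app.wsgi_app, app.logger)
```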
##### nvm and node

First, be sure you are using the following versions of Node.js and npm:

- `Node.js`: Version 16
- `npm`: Version 7

We recommend using [nvm](https://github.com/nvm-sh/nvm) to manage your node environment:

```bash
curl -o- https://raw.githubusercontent.com/nvm-sh/nvm/v0.37.0/install.sh | bash
nvm use --lts
```

Or if you use `zsh`, the default shell starting with macOS Catalina, try:

```zsh
sh -c "$(curl -fsSL https://raw.githubusercontent.com/nvm-sh/nvm/v0.37.0/install.sh)"
```

For those interested, you may also try out [avn](https://github.com/nvm-sh/nvm#deeper-shell-integration) to automatically switch to the node version that is required to run the Superset frontend.

We have upgraded our `package-lock.json` to use `lockfileVersion: 2` from npm 7, so please make sure you have installed npm 7, too:

```bash
npm install -g npm@7
```

#### Install dependencies

Install third-party dependencies listed in `package.json` via:
There are three types of assets you can build:

#### Webpack dev server

The dev server by default starts at `http://localhost:9000` and proxies the backend requests to `http://localhost:8088`.

So a typical development workflow is the following:

1. [run Superset locally](#flask-server) using Flask, on port `8088` — but don't access it directly,<br/>
   ```bash
   # Install Superset and dependencies, plus load your virtual environment first, as detailed above.
   FLASK_ENV=development superset run -p 8088 --with-threads --reload --debugger
   ```
2. in parallel, run the Webpack dev server locally on port `9000`,<br/>
   ```bash
   npm run dev-server
   ```
3. access `http://localhost:9000` (the Webpack server, _not_ Flask) in your web browser. This will use the hot-reloading front-end assets from the Webpack development server while redirecting back-end queries to Flask/Superset: your changes on Superset codebase — either front or back-end — will then be reflected live in the browser.

It's possible to change the Webpack server settings:

```bash
# Start the dev server at http://localhost:9000
npm run dev-server

# Run the dev server on a non-default port
npm run dev-server -- --port=9001

# Proxy backend requests to a Flask server running on a non-default port
npm run dev-server -- --supersetPort=8081
```
```python
FEATURE_FLAGS = {
}
```

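The individual flag entries are elided in this view; a hypothetical `superset_config.py` entry might look like the following (the flag name is for illustration only):

```python
# Illustrative only: enable a hypothetical feature flag in superset_config.py
FEATURE_FLAGS = {
    "THUMBNAILS": True,
}
```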
If you want to use the same flag in the client code, also add it to the FeatureFlag TypeScript enum in [@superset-ui/core](https://github.com/apache-superset/superset-ui/blob/master/packages/superset-ui-core/src/utils/featureFlags.ts). For example,

```typescript
export enum FeatureFlag {
}
```
```bash
tox -e pre-commit
```

Or by running pre-commit manually:

```bash
pre-commit run --all-files
```

## Linting

### Python

We use [Pylint](https://pylint.org/) for linting which can be invoked via:

```bash
# for python
tox -e pylint
```

In terms of best practices please avoid blanket disablement of Pylint messages globally (via `.pylintrc`) or top-level within the file header, although there are a few exceptions. Disablement should occur inline as it prevents masking issues and provides context as to why said message is disabled.

Additionally, the Python code is auto-formatted using [Black](https://github.com/python/black) which
is configured as a pre-commit hook. There are also numerous [editor integrations](https://black.readthedocs.io/en/stable/integrations/editors.html).

### TypeScript

```bash
# for frontend
cd superset-frontend
npm ci
npm run lint
```

If using the eslint extension with vscode, put the following in your workspace `settings.json` file:

```json
"eslint.workingDirectories": [
  "superset-frontend"
]
```

## Conventions

### Python Conventions

Parameters in the `config.py` (which are accessible via the Flask app.config dictionary) are assumed to always be defined and thus should be accessed directly via,

```python
blueprints = app.config["BLUEPRINTS"]
```
or similar as the latter will cause typing issues.

## Typing

### Python Typing

To ensure clarity, consistency, and readability, _all_ new functions should use
[type hints](https://docs.python.org/3/library/typing.html) and include a
```python
def sqrt(x: Union[float, int]) -> Union[float, int]:
    return math.sqrt(x)
```

### TypeScript Typing

TypeScript is fully supported and is the recommended language for writing all new frontend components. When modifying existing functions/components, migrating to TypeScript is appreciated, but not required. Examples of migrating functions/components to TypeScript can be found in [#9162](https://github.com/apache/superset/pull/9162) and [#9180](https://github.com/apache/superset/pull/9180).
Note that the test environment uses a temporary directory for defining the
SQLite databases which will be cleared each time before the group of test
commands are invoked.

There is also a utility script included in the Superset codebase to run python integration tests. The [readme can be
found here](https://github.com/apache/superset/tree/master/scripts/tests)

To run all integration tests for example, run this script from the root directory:

```bash
scripts/tests/run.sh
```

You can run unit tests found in `./tests/unit_tests` with pytest, for example. It is a simple way to run an isolated test that doesn't need any database setup:

```bash
pytest ./link_to_test.py
```

### Frontend Testing

We use [Jest](https://jestjs.io/) and [Enzyme](https://airbnb.io/enzyme/) to test TypeScript/JavaScript. Tests can be run with:
```bash
npm run test -- path/to/file.js
```

We use [Cypress](https://www.cypress.io/) for integration tests. Tests can be run by `tox -e cypress`. To open Cypress and explore tests, first set up and run the test server:

```bash
export SUPERSET_CONFIG=tests.integration_tests.superset_test_config
export SUPERSET_TESTENV=true
export ENABLE_REACT_CRUD_VIEWS=true
export CYPRESS_BASE_URL="http://localhost:8081"
superset db upgrade
superset load_test_users
```
```bash
npm install
npm run cypress-run-chrome

# run tests from a specific file
npm run cypress-run-chrome -- --spec cypress/integration/explore/link.test.ts

# run specific file with video capture
npm run cypress-run-chrome -- --spec cypress/integration/dashboard/index.test.js --config video=true
```
```yaml
superset:
```

Start Superset as usual

```bash
docker-compose up
```
Install the required libraries and packages to the docker container

Enter the superset_app container

```bash
docker exec -it superset_app /bin/bash
root@39ce8cf9d6ab:/app#
```

Run the following commands inside the container

```bash
apt update
apt install -y gdb
apt install -y net-tools
pip install debugpy
```

Find the PID for the Flask process. Make sure to use the first PID. The Flask app will re-spawn a sub-process every time you change any of the python code. So it's important to use the first PID.

```bash
ps -ef
root 10 6 7 14:09 ? 00:00:07 /usr/local/bin/python /usr/bin/f
```

Inject debugpy into the running Flask process. In this case PID 6.

```bash
python3 -m debugpy --listen 0.0.0.0:5678 --pid 6
```

Verify that debugpy is listening on port 5678

```bash
netstat -tunap

tcp 0 0 0.0.0.0:8088 0.0.0.0:* LISTEN
```

You are now ready to attach a debugger to the process. Using VSCode you can configure a launch configuration file `.vscode/launch.json` like so.

```
{
  "version": "0.2.0",
}
```

VSCode will not stop on breakpoints right away. We've attached to PID 6; however, it does not yet know of any sub-processes. In order to "wake up" the debugger you need to modify a python file. This will trigger Flask to reload the code and create a new sub-process. This new sub-process will be detected by VSCode and breakpoints will be activated.


### Debugging Server App in Kubernetes Environment
You can follow the same instructions as in the docker-compose setup. Enter the pod and install the required libraries and packages: gdb, net-tools and debugpy.

Often in a Kubernetes environment nodes are not addressable from outside the cluster. VSCode will thus be unable to remotely connect to port 5678 on a Kubernetes node. In order to do this you need to create a tunnel that port forwards 5678 to your local machine.

```
kubectl port-forward pod/superset-<some random id> 5678:5678
```
You can now launch your VSCode debugger with the same config as above. VSCode will connect to 127.0.0.1:5678, which is forwarded by kubectl to your remote Kubernetes pod.


### Storybook

Superset includes a [Storybook](https://storybook.js.org/) to preview the layout/styling of various Superset components, and variations thereof. To open and view the Storybook:
```python
LANGUAGES = {
}
```

### Extracting new strings for translation

```bash
pybabel extract -F superset/translations/babel.cfg -o superset/translations/messages.pot -k _ -k __ -k t -k tn -k tct .
```

This will update the template file `superset/translations/messages.pot` with current application strings. Do not forget to update
this file with the appropriate license information.

### Updating language files

```bash
./scripts/babel_update.sh
```

This script will:

1. update the template file `superset/translations/messages.pot` with current application strings.
2. update language files with the new extracted strings.

You can then translate the strings gathered in files located under
`superset/translation`, where there's one per language. You can use [Poedit](https://poedit.net/features)
```bash
pip install -r superset/translations/requirements.txt
pybabel init -i superset/translations/messages.pot -d superset/translations -l LANGUAGE_CODE
```

Then, [update the language files](#updating-language-files).

## Tips

This means it'll register MyDatasource and MyOtherDatasource in the `superset.my_models` module in the source registry.

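As a sketch, such a registration might look like the following in `superset_config.py` (the setting name `ADDITIONAL_MODULE_DS_MAP` is an assumption based on historical Superset configs; verify it against your Superset version):

```python
# Hypothetical registration of custom datasource classes, keyed by module path
ADDITIONAL_MODULE_DS_MAP = {
    "superset.my_models": ["MyDatasource", "MyOtherDatasource"],
}
```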
### Improving visualizations

To edit the frontend code for visualizations, you will have to check out a copy of [apache-superset/superset-ui](https://github.com/apache-superset/superset-ui):

```bash
git clone https://github.com/apache-superset/superset-ui.git
cd superset-ui
yarn
yarn build
```

Then use `npm link` to create symlinks of the plugins/superset-ui packages you want to edit in `superset-frontend/node_modules`:

```bash
# Since npm 7, you have to install plugin dependencies separately, too
cd ../../superset-ui/plugins/[PLUGIN NAME] && npm install --legacy-peer-deps

cd superset/superset-frontend
npm link ../../superset-ui/plugins/[PLUGIN NAME]

# Or to link all core superset-ui and plugin packages:
# npm link ../../superset-ui/{packages,plugins}/*

# Start developing
npm run dev-server
```

When `superset-ui` packages are linked with `npm link`, the dev server will automatically load a package's source code from its `/src` directory, instead of the built modules in `lib/` or `esm/`.

Note that every time you do `npm install`, you will lose the symlink(s) and may have to run `npm link` again.

### Visualization Plugins

The topic of authoring new plugins, whether you'd like to contribute
them back or not, has been well documented in
[the documentation](https://superset.apache.org/docs/contributing/creating-viz-plugins), and in [this blog post](https://preset.io/blog/building-custom-viz-plugins-in-superset-v2).

To contribute a plugin to Superset, your plugin must meet the following criteria:

- The plugin should be applicable to the community at large, not a particularly specialized use case
- The plugin should be written with TypeScript
To do this, you'll need to:

- Start up a celery worker

```shell script
celery --app=superset.tasks.celery_app:app worker -Ofair
```

Note that:
The following configuration settings are available for async queries (see `config.py`):

- `GLOBAL_ASYNC_QUERIES_JWT_COOKIE_SECURE` - JWT cookie secure option
- `GLOBAL_ASYNC_QUERIES_JWT_COOKIE_DOMAIN` - JWT cookie domain option ([see docs for set_cookie](https://tedboy.github.io/flask/interface_api.response_object.html#flask.Response.set_cookie))
- `GLOBAL_ASYNC_QUERIES_JWT_SECRET` - JWTs use a secret key to sign and validate the contents. This value should be at least 32 bytes and have sufficient randomness for proper security
- `GLOBAL_ASYNC_QUERIES_TRANSPORT` - available options: "polling" (HTTP, default), "ws" (WebSocket, requires running superset-websocket server)
- `GLOBAL_ASYNC_QUERIES_POLLING_DELAY` - the time (in ms) between polling requests

More information on the async query feature can be found in [SIP-39](https://github.com/apache/superset/issues/9190).
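Since `GLOBAL_ASYNC_QUERIES_JWT_SECRET` needs at least 32 bytes of randomness, one way to generate a suitable value is a short standard-library snippet (a sketch; the setting name comes from the list above):

```python
import secrets

# 32 random bytes rendered as a 64-character hex string,
# suitable for GLOBAL_ASYNC_QUERIES_JWT_SECRET in superset_config.py
jwt_secret = secrets.token_hex(32)
print(len(jwt_secret))  # 64
```

The resulting string can then be pasted into your `superset_config.py` as the secret value.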
Chart parameters are stored as a JSON encoded string in the `slices.params` column and are often referenced throughout the code as form-data. Currently the form-data is neither versioned nor typed and thus is somewhat free-form. Note in the future there may be merit in using something like [JSON Schema](https://json-schema.org/) to both annotate and validate the JSON object in addition to using a Mypy `TypedDict` (introduced in Python 3.8) for typing the form-data in the backend. This section serves as a potential primer for that work.

The following tables provide a non-exhaustive list of the various fields which can be present in the JSON object grouped by the Explorer pane sections. These values were obtained by extracting the distinct fields from a legacy deployment consisting of tens of thousands of charts and thus some fields may be missing whilst others may be deprecated.

Note not all fields are correctly categorized. The fields vary based on visualization type and may appear in different sections depending on the type. Verified deprecated columns may indicate a missing migration and/or prior migrations which were unsuccessful and thus future work may be required to clean up the form-data.
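To make the encoding concrete, here is a minimal sketch of decoding a chart's form-data (the field names come from the tables below; the sample values are hypothetical):

```python
import json

# A hypothetical slices.params value as stored in the metadata database
params = '{"viz_type": "table", "datasource": "12__table", "row_limit": 100}'

# Form-data is simply the decoded JSON object
form_data = json.loads(params)
print(form_data["viz_type"])  # table
```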
### Datasource & Chart Type

| Field             | Type     | Notes                                |
| ----------------- | -------- | ------------------------------------ |
| `database_name`   | _string_ | _Deprecated?_                        |
| `datasource`      | _string_ | `<datasource_id>__<datasource_type>` |
| `datasource_id`   | _string_ | _Deprecated?_ See `datasource`       |
| `datasource_name` | _string_ | _Deprecated?_                        |
| `datasource_type` | _string_ | _Deprecated?_ See `datasource`       |
| `viz_type`        | _string_ | The **Visualization Type** widget    |

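The `datasource` field above packs two values into one string; a hypothetical helper to split it could look like this (the function name is illustrative, not part of Superset's API):

```python
def split_datasource(datasource: str):
    """Split a '<datasource_id>__<datasource_type>' form-data value."""
    id_part, ds_type = datasource.split("__", 1)
    return int(id_part), ds_type

print(split_datasource("12__table"))  # (12, 'table')
```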
### Time

### Query

| Field | Type | Notes |
| ----- | ---- | ----- |
| `adhoc_filters` | _array(object)_ | The **Filters** widget |
| `extra_filters` | _array(object)_ | Another pathway to the **Filters** widget.<br/>It is generally used to pass dashboard filter parameters to a chart.<br/>It can be used for appending additional filters to a chart that has been saved with its own filters on an ad-hoc basis if the chart is being used as a standalone widget.<br/><br/>For implementation examples see: [utils test.py](https://github.com/apache/superset/blob/66a4c94a1ed542e69fe6399bab4c01d4540486cf/tests/utils_tests.py#L181)<br/>For insight into how superset processes the contents of this parameter see: [exploreUtils/index.js](https://github.com/apache/superset/blob/93c7f5bb446ec6895d7702835f3157426955d5a9/superset-frontend/src/explore/exploreUtils/index.js#L159) |
| `columns` | _array(string)_ | The **Breakdowns** widget |
| `groupby` | _array(string)_ | The **Group by** or **Series** widget |
| `limit` | _number_ | The **Series Limit** widget |
| `metric`<br>`metric_2`<br>`metrics`<br>`percent_metrics`<br>`secondary_metric`<br>`size`<br>`x`<br>`y` | _string_,_object_,_array(string)_,_array(object)_ | The metric(s) depending on the visualization type |
| `order_asc` | _boolean_ | The **Sort Descending** widget |
| `row_limit` | _number_ | The **Row limit** widget |
| `timeseries_limit_metric` | _object_ | The **Sort By** widget |
|
||||
| `order_asc` | _boolean_ | The **Sort Descending** widget |
|
||||
| `row_limit` | _number_ | The **Row limit** widget |
|
||||
| `timeseries_limit_metric` | _object_ | The **Sort By** widget |
|
||||
|
||||
The `metric` (or equivalent) and `timeseries_limit_metric` fields are all composed of either metric names or the JSON representation of the `AdhocMetric` TypeScript type. The `adhoc_filters` is composed of the JSON represent of the `AdhocFilter` TypeScript type (which can comprise of columns or metrics depending on whether it is a WHERE or HAVING clause). The `all_columns`, `all_columns_x`, `columns`, `groupby`, and `order_by_cols` fields all represent column names.
|
||||
|
||||
@@ -1432,6 +1410,7 @@ Note the `y_axis_format` is defined under various section for some charts.
|
||||
| `default_filters` | _N/A_ | |
|
||||
| `entity` | _N/A_ | |
|
||||
| `expanded_slices` | _N/A_ | |
|
||||
| `extra_filters` | _N/A_ | |
|
||||
| `filter_immune_slice_fields` | _N/A_ | |
|
||||
| `filter_immune_slices` | _N/A_ | |
|
||||
| `flt_col_0` | _N/A_ | |
|
||||
|
||||
28
Dockerfile
@@ -18,7 +18,7 @@
|
||||
######################################################################
|
||||
# PY stage that simply does a pip install on our requirements
|
||||
######################################################################
|
||||
ARG PY_VER=3.8.12
|
||||
ARG PY_VER=3.7.9
|
||||
FROM python:${PY_VER} AS superset-py
|
||||
|
||||
RUN mkdir /app \
|
||||
@@ -45,7 +45,7 @@ RUN cd /app \
|
||||
######################################################################
|
||||
# Node stage to deal with static asset construction
|
||||
######################################################################
|
||||
FROM node:16 AS superset-node
|
||||
FROM node:14 AS superset-node
|
||||
|
||||
ARG NPM_VER=7
|
||||
RUN npm install -g npm@${NPM_VER}
|
||||
@@ -57,12 +57,14 @@ ENV BUILD_CMD=${NPM_BUILD_CMD}
|
||||
RUN mkdir -p /app/superset-frontend
|
||||
RUN mkdir -p /app/superset/assets
|
||||
COPY ./docker/frontend-mem-nag.sh /
|
||||
COPY ./superset-frontend /app/superset-frontend
|
||||
COPY ./superset-frontend/package* /app/superset-frontend/
|
||||
RUN /frontend-mem-nag.sh \
|
||||
&& cd /app/superset-frontend \
|
||||
&& npm ci
|
||||
|
||||
# This seems to be the most expensive step
|
||||
# Next, copy in the rest and let webpack do its thing
|
||||
COPY ./superset-frontend /app/superset-frontend
|
||||
# This is BY FAR the most expensive step (thanks Terser!)
|
||||
RUN cd /app/superset-frontend \
|
||||
&& npm run ${BUILD_CMD} \
|
||||
&& rm -rf node_modules
|
||||
@@ -71,7 +73,7 @@ RUN cd /app/superset-frontend \
|
||||
######################################################################
|
||||
# Final lean image...
|
||||
######################################################################
|
||||
ARG PY_VER=3.8.12
|
||||
ARG PY_VER=3.7.9
|
||||
FROM python:${PY_VER} AS lean
|
||||
|
||||
ENV LANG=C.UTF-8 \
|
||||
@@ -82,18 +84,17 @@ ENV LANG=C.UTF-8 \
|
||||
SUPERSET_HOME="/app/superset_home" \
|
||||
SUPERSET_PORT=8088
|
||||
|
||||
RUN mkdir -p ${PYTHONPATH} \
|
||||
&& useradd --user-group -d ${SUPERSET_HOME} -m --no-log-init --shell /bin/bash superset \
|
||||
RUN useradd --user-group --no-create-home --no-log-init --shell /bin/bash superset \
|
||||
&& mkdir -p ${SUPERSET_HOME} ${PYTHONPATH} \
|
||||
&& apt-get update -y \
|
||||
&& apt-get install -y --no-install-recommends \
|
||||
build-essential \
|
||||
default-libmysqlclient-dev \
|
||||
libsasl2-modules-gssapi-mit \
|
||||
libpq-dev \
|
||||
libecpg-dev \
|
||||
&& rm -rf /var/lib/apt/lists/*
|
||||
|
||||
COPY --from=superset-py /usr/local/lib/python3.8/site-packages/ /usr/local/lib/python3.8/site-packages/
|
||||
COPY --from=superset-py /usr/local/lib/python3.7/site-packages/ /usr/local/lib/python3.7/site-packages/
|
||||
# Copying site-packages doesn't move the CLIs, so let's copy them one by one
|
||||
COPY --from=superset-py /usr/local/bin/gunicorn /usr/local/bin/celery /usr/local/bin/flask /usr/bin/
|
||||
COPY --from=superset-node /app/superset/static/assets /app/superset/static/assets
|
||||
@@ -104,12 +105,9 @@ COPY superset /app/superset
|
||||
COPY setup.py MANIFEST.in README.md /app/
|
||||
RUN cd /app \
|
||||
&& chown -R superset:superset * \
|
||||
&& pip install -e . \
|
||||
&& flask fab babel-compile --target superset/translations
|
||||
&& pip install -e .
|
||||
|
||||
COPY ./docker/run-server.sh /usr/bin/
|
||||
|
||||
RUN chmod a+x /usr/bin/run-server.sh
|
||||
COPY ./docker/docker-entrypoint.sh /usr/bin/
|
||||
|
||||
WORKDIR /app
|
||||
|
||||
@@ -119,7 +117,7 @@ HEALTHCHECK CMD curl -f "http://localhost:$SUPERSET_PORT/health"
|
||||
|
||||
EXPOSE ${SUPERSET_PORT}
|
||||
|
||||
CMD /usr/bin/run-server.sh
|
||||
ENTRYPOINT ["/usr/bin/docker-entrypoint.sh"]
|
||||
|
||||
######################################################################
|
||||
# Dev image...
|
||||
|
||||
@@ -18,6 +18,6 @@ under the License.
|
||||
-->
|
||||
# INSTALL / BUILD instructions for Apache Superset
|
||||
|
||||
At this time, the docker file at RELEASING/Dockerfile.from_local_tarball
|
||||
At this time, the docker file at RELEASING/Dockerfile.from_tarball
|
||||
constitutes the recipe on how to get to a working release from a source
|
||||
release tarball.
|
||||
|
||||
64
Makefile
@@ -15,9 +15,6 @@
|
||||
# limitations under the License.
|
||||
#
|
||||
|
||||
# Python version installed; we need 3.8-3.9
|
||||
PYTHON=`command -v python3.9 || command -v python3.8`
|
||||
|
||||
.PHONY: install superset venv pre-commit
|
||||
|
||||
install: superset pre-commit
|
||||
@@ -30,12 +27,7 @@ superset:
|
||||
pip install -e .
|
||||
|
||||
# Create an admin user in your metadata database
|
||||
superset fab create-admin \
|
||||
--username admin \
|
||||
--firstname "Admin I."\
|
||||
--lastname Strator \
|
||||
--email admin@superset.io \
|
||||
--password general
|
||||
superset fab create-admin
|
||||
|
||||
# Initialize the database
|
||||
superset db upgrade
|
||||
@@ -46,36 +38,10 @@ superset:
|
||||
# Load some data to play with
|
||||
superset load-examples
|
||||
|
||||
# Install node packages
|
||||
cd superset-frontend; npm install
|
||||
|
||||
update: update-py update-js
|
||||
|
||||
update-py:
|
||||
# Install external dependencies
|
||||
pip install -r requirements/local.txt
|
||||
|
||||
# Install Superset in editable (development) mode
|
||||
pip install -e .
|
||||
|
||||
# Initialize the database
|
||||
superset db upgrade
|
||||
|
||||
# Create default roles and permissions
|
||||
superset init
|
||||
|
||||
update-js:
|
||||
# Install js packages
|
||||
cd superset-frontend; npm ci
|
||||
|
||||
venv:
|
||||
# Create a virtual environment and activate it (recommended)
|
||||
if ! [ -x "${PYTHON}" ]; then echo "You need Python 3.8 or 3.9 installed"; exit 1; fi
|
||||
test -d venv || ${PYTHON} -m venv venv # setup a python3 virtualenv
|
||||
. venv/bin/activate
|
||||
|
||||
activate:
|
||||
. venv/bin/activate
|
||||
python3 -m venv venv # setup a python3 virtualenv
|
||||
source venv/bin/activate
|
||||
|
||||
pre-commit:
|
||||
# setup pre commit dependencies
|
||||
@@ -84,28 +50,8 @@ pre-commit:
|
||||
|
||||
format: py-format js-format
|
||||
|
||||
py-format: pre-commit
|
||||
pre-commit run black --all-files
|
||||
|
||||
py-lint: pre-commit
|
||||
pylint -j 0 superset
|
||||
py-format:
|
||||
python -m black superset
|
||||
|
||||
js-format:
|
||||
cd superset-frontend; npm run prettier
|
||||
|
||||
flask-app:
|
||||
flask run -p 8088 --with-threads --reload --debugger
|
||||
|
||||
node-app:
|
||||
cd superset-frontend; npm run dev-server
|
||||
|
||||
build-cypress:
|
||||
cd superset-frontend; npm run build-instrumented
|
||||
cd superset-frontend/cypress-base; npm install
|
||||
|
||||
open-cypress:
|
||||
if ! [ $(port) ]; then cd superset-frontend/cypress-base; CYPRESS_BASE_URL=http://localhost:9000 npm run cypress open; fi
|
||||
cd superset-frontend/cypress-base; CYPRESS_BASE_URL=http://localhost:$(port) npm run cypress open
|
||||
|
||||
admin-user:
|
||||
superset fab create-admin
|
||||
|
||||
151
README.md
@@ -25,8 +25,9 @@ under the License.
|
||||
[](https://badge.fury.io/py/apache-superset)
|
||||
[](https://codecov.io/github/apache/superset)
|
||||
[](https://pypi.python.org/pypi/apache-superset)
|
||||
[](https://join.slack.com/t/apache-superset/shared_invite/zt-16jvzmoi8-sI7jKWp~xc2zYRe~NqiY9Q)
|
||||
[](https://join.slack.com/t/apache-superset/shared_invite/zt-l5f5e0av-fyYu8tlfdqbMdz_sPLwUqQ)
|
||||
[](https://superset.apache.org)
|
||||
[](https://david-dm.org/apache/superset?path=superset-frontend)
|
||||
|
||||
<img
|
||||
src="https://github.com/apache/superset/raw/master/superset-frontend/src/assets/branding/superset-logo-horiz-apache.png"
|
||||
@@ -45,76 +46,81 @@ A modern, enterprise-ready business intelligence web application.
|
||||
[**Resources**](#resources) |
|
||||
[**Organizations Using Superset**](RESOURCES/INTHEWILD.md)
|
||||
|
||||
## Why Superset?
|
||||
## Screenshots & Gifs
|
||||
|
||||
Superset is a modern data exploration and data visualization platform. Superset can replace or augment proprietary business intelligence tools for many teams. Superset integrates well with a variety of data sources.
|
||||
**Gallery**
|
||||
|
||||
<kbd><a href="https://superset.apache.org/gallery"><img title="Gallery" src="superset-frontend/images/screenshots/gallery.jpg"/></a></kbd><br/>
|
||||
|
||||
**View Dashboards**
|
||||
|
||||
<kbd><img title="View Dashboards" src="superset-frontend/images/screenshots/slack_dash.jpg"/></kbd><br/>
|
||||
|
||||
**Slice & dice your data**
|
||||
|
||||
<kbd><img title="Slice & dice your data" src="superset-frontend/images/screenshots/explore.jpg"/></kbd><br/>
|
||||
|
||||
**Query and visualize your data with SQL Lab**
|
||||
|
||||
<kbd><img title="SQL Lab" src="superset-frontend/images/screenshots/sql_lab.jpg"/></kbd><br/>
|
||||
|
||||
**Visualize geospatial data with deck.gl**
|
||||
|
||||
<kbd><img title="Geospatial" src="superset-frontend/images/screenshots/geospatial_dash.jpg"/></kbd><br/>
|
||||
|
||||
**Choose from a wide array of visualizations**
|
||||
|
||||
<kbd><img title="Visualizations" src="superset-frontend/images/screenshots/explore_visualizations.jpg"/></kbd><br/>
|
||||
|
||||
## Why Superset?
|
||||
|
||||
Superset provides:
|
||||
|
||||
- A **no-code interface** for building charts quickly
|
||||
- A powerful, web-based **SQL Editor** for advanced querying
|
||||
- A **lightweight semantic layer** for quickly defining custom dimensions and metrics
|
||||
- Out of the box support for **nearly any SQL** database or data engine
|
||||
- A wide array of **beautiful visualizations** to showcase your data, ranging from simple bar charts to geospatial visualizations
|
||||
- Lightweight, configurable **caching layer** to help ease database load
|
||||
- Highly extensible **security roles and authentication** options
|
||||
- An **API** for programmatic customization
|
||||
- A **cloud-native architecture** designed from the ground up for scale
|
||||
|
||||
## Screenshots & Gifs
|
||||
|
||||
**Large Gallery of Visualizations**
|
||||
|
||||
<kbd><img title="Gallery" src="superset-frontend/src/assets/images/screenshots/gallery.jpg"/></kbd><br/>
|
||||
|
||||
**Craft Beautiful, Dynamic Dashboards**
|
||||
|
||||
<kbd><img title="View Dashboards" src="superset-frontend/src/assets/images/screenshots/slack_dash.jpg"/></kbd><br/>
|
||||
|
||||
**No-Code Chart Builder**
|
||||
|
||||
<kbd><img title="Slice & dice your data" src="superset-frontend/src/assets/images/screenshots/explore.jpg"/></kbd><br/>
|
||||
|
||||
**Powerful SQL Editor**
|
||||
|
||||
<kbd><img title="SQL Lab" src="superset-frontend/src/assets/images/screenshots/sql_lab.jpg"/></kbd><br/>
|
||||
- An intuitive interface for visualizing datasets and
|
||||
crafting interactive dashboards
|
||||
- A wide array of beautiful visualizations to showcase your data
|
||||
- Code-free visualization builder to extract and present datasets
|
||||
- A world-class SQL IDE for preparing data for visualization, including a rich metadata browser
|
||||
- A lightweight semantic layer which empowers data analysts to quickly define custom dimensions and metrics
|
||||
- Out-of-the-box support for most SQL-speaking databases
|
||||
- Seamless, in-memory asynchronous caching and queries
|
||||
- An extensible security model that allows configuration of very intricate rules
|
||||
on who can access which product features and datasets.
|
||||
- Integration with major
|
||||
authentication backends (database, OpenID, LDAP, OAuth, REMOTE_USER, etc)
|
||||
- The ability to add custom visualization plugins
|
||||
- An API for programmatic customization
|
||||
- A cloud-native architecture designed from the ground up for scale
|
||||
|
||||
## Supported Databases
|
||||
|
||||
Superset can query data from any SQL-speaking datastore or data engine (Presto, Trino, Athena, [and more](https://superset.apache.org/docs/databases/installing-database-drivers/)) that has a Python DB-API driver and a SQLAlchemy dialect.
|
||||
Superset can query data from any SQL-speaking datastore or data engine (e.g. Presto or Athena) that has a Python DB-API driver and a SQLAlchemy dialect.
|
||||
|
||||
Here are some of the major database solutions that are supported:
|
||||
|
||||
<p align="center">
|
||||
<img src="superset-frontend/src/assets/images/redshift.png" alt="redshift" border="0" width="106" height="41"/>
|
||||
<img src="superset-frontend/src/assets/images/google-biquery.png" alt="google-biquery" border="0" width="114" height="43"/>
|
||||
<img src="superset-frontend/src/assets/images/snowflake.png" alt="snowflake" border="0" width="152" height="46"/>
|
||||
<img src="superset-frontend/src/assets/images/trino.png" alt="trino" border="0" width="46" height="46"/>
|
||||
<img src="superset-frontend/src/assets/images/presto.png" alt="presto" border="0" width="152" height="46"/>
|
||||
<img src="superset-frontend/src/assets/images/druid.png" alt="druid" border="0" width="135" height="37" />
|
||||
<img src="superset-frontend/src/assets/images/firebolt.png" alt="firebolt" border="0" width="133" height="21.5" />
|
||||
<img src="superset-frontend/src/assets/images/timescale.png" alt="timescale" border="0" width="102" height="26.8" />
|
||||
<img src="superset-frontend/src/assets/images/rockset.png" alt="rockset" border="0" width="125" height="51" />
|
||||
<img src="superset-frontend/src/assets/images/postgresql.png" alt="postgresql" border="0" width="132" height="81" />
|
||||
<img src="superset-frontend/src/assets/images/mysql.png" alt="mysql" border="0" width="119" height="62" />
|
||||
<img src="superset-frontend/src/assets/images/mssql-server.png" alt="mssql-server" border="0" width="93" height="74" />
|
||||
<img src="superset-frontend/src/assets/images/db2.png" alt="db2" border="0" width="62" height="62" />
|
||||
<img src="superset-frontend/src/assets/images/sqlite.png" alt="sqlite" border="0" width="102" height="45" />
|
||||
<img src="superset-frontend/src/assets/images/sybase.png" alt="sybase" border="0" width="128" height="47" />
|
||||
<img src="superset-frontend/src/assets/images/mariadb.png" alt="mariadb" border="0" width="83" height="63" />
|
||||
<img src="superset-frontend/src/assets/images/vertica.png" alt="vertica" border="0" width="128" height="40" />
|
||||
<img src="superset-frontend/src/assets/images/oracle.png" alt="oracle" border="0" width="121" height="66" />
|
||||
<img src="superset-frontend/src/assets/images/firebird.png" alt="firebird" border="0" width="86" height="56" />
|
||||
<img src="superset-frontend/src/assets/images/greenplum.png" alt="greenplum" border="0" width="140" height="45" />
|
||||
<img src="superset-frontend/src/assets/images/clickhouse.png" alt="clickhouse" border="0" width="133" height="34" />
|
||||
<img src="superset-frontend/src/assets/images/exasol.png" alt="exasol" border="0" width="106" height="59" />
|
||||
<img src="superset-frontend/src/assets/images/monet-db.png" alt="monet-db" border="0" width="106" height="46" />
|
||||
<img src="superset-frontend/src/assets/images/apache-kylin.png" alt="apache-kylin" border="0" width="56" height="64"/>
|
||||
<img src="superset-frontend/src/assets/images/hologres.png" alt="hologres" border="0" width="71" height="64"/>
|
||||
<img src="superset-frontend/src/assets/images/netezza.png" alt="netezza" border="0" width="64" height="64"/>
|
||||
<img src="superset-frontend/src/assets/images/pinot.png" alt="pinot" border="0" width="165" height="64"/>
|
||||
<img src="superset-frontend/src/assets/images/teradata.png" alt="teradata" border="0" width="165" height="64"/>
|
||||
<img src="superset-frontend/src/assets/images/yugabyte.png" alt="yugabyte" border="0" width="180" height="31"/>
|
||||
<img src="superset-frontend/images/redshift.png" alt="redshift" border="0" width="106" height="41"/>
|
||||
<img src="superset-frontend/images/google-biquery.png" alt="google-biquery" border="0" width="114" height="43"/>
|
||||
<img src="superset-frontend/images/snowflake.png" alt="snowflake" border="0" width="152" height="46"/>
|
||||
<img src="superset-frontend/images/trino.png" alt="trino" border="0" width="46" height="46"/>
|
||||
<img src="superset-frontend/images/presto.png" alt="presto" border="0" width="152" height="46"/>
|
||||
<img src="superset-frontend/images/druid.png" alt="druid" border="0" width="135" height="37" />
|
||||
<img src="superset-frontend/images/postgresql.png" alt="postgresql" border="0" width="132" height="81" />
|
||||
<img src="superset-frontend/images/mysql.png" alt="mysql" border="0" width="119" height="62" />
|
||||
<img src="superset-frontend/images/mssql-server.png" alt="mssql-server" border="0" width="93" height="74" />
|
||||
<img src="superset-frontend/images/db2.png" alt="db2" border="0" width="62" height="62" />
|
||||
<img src="superset-frontend/images/sqlite.png" alt="sqlite" border="0" width="102" height="45" />
|
||||
<img src="superset-frontend/images/sybase.png" alt="sybase" border="0" width="128" height="47" />
|
||||
<img src="superset-frontend/images/mariadb.png" alt="mariadb" border="0" width="83" height="63" />
|
||||
<img src="superset-frontend/images/vertica.png" alt="vertica" border="0" width="128" height="40" />
|
||||
<img src="superset-frontend/images/oracle.png" alt="oracle" border="0" width="121" height="66" />
|
||||
<img src="superset-frontend/images/firebird.png" alt="firebird" border="0" width="86" height="56" />
|
||||
<img src="superset-frontend/images/greenplum.png" alt="greenplum" border="0" width="140" height="45" />
|
||||
<img src="superset-frontend/images/clickhouse.png" alt="clickhouse" border="0" width="133" height="34" />
|
||||
<img src="superset-frontend/images/exasol.png" alt="exasol" border="0" width="106" height="59" />
|
||||
<img src="superset-frontend/images/monet-db.png" alt="monet-db" border="0" width="106" height="46" />
|
||||
<img src="superset-frontend/images/apache-kylin.png" alt="apache-kylin" border="0" width="56" height="64"/>
|
||||
<img src="superset-frontend/images/hologres.png" alt="hologres" border="0" width="71" height="64"/>
|
||||
</p>
|
||||
|
||||
**A more comprehensive list of supported databases** along with the configuration instructions can be found
|
||||
@@ -129,7 +135,7 @@ Want to add support for your datastore or data engine? Read more [here](https://
|
||||
## Get Involved
|
||||
|
||||
- Ask and answer questions on [StackOverflow](https://stackoverflow.com/questions/tagged/apache-superset) using the **apache-superset** tag
|
||||
- [Join our community's Slack](https://join.slack.com/t/apache-superset/shared_invite/zt-16jvzmoi8-sI7jKWp~xc2zYRe~NqiY9Q)
|
||||
- [Join our community's Slack](https://join.slack.com/t/apache-superset/shared_invite/zt-l5f5e0av-fyYu8tlfdqbMdz_sPLwUqQ)
|
||||
and please read our [Slack Community Guidelines](https://github.com/apache/superset/blob/master/CODE_OF_CONDUCT.md#slack-community-guidelines)
|
||||
- [Join our dev@superset.apache.org Mailing list](https://lists.apache.org/list.html?dev@superset.apache.org)
|
||||
|
||||
@@ -142,27 +148,28 @@ how to set up a development environment.
|
||||
|
||||
## Resources
|
||||
|
||||
- Superset 1.0
|
||||
- [Superset 1.0 Milestone](https://superset.apache.org/docs/version-one)
|
||||
- [Superset 1.0 Release Notes](https://github.com/apache/superset/tree/master/RELEASING/release-notes-1-0)
|
||||
- Getting Started with Superset
|
||||
- [Superset in 2 Minutes using Docker Compose](https://superset.apache.org/docs/installation/installing-superset-using-docker-compose#installing-superset-locally-using-docker-compose)
|
||||
- [Installing Database Drivers](https://superset.apache.org/docs/databases/docker-add-drivers/)
|
||||
- [Installing Database Drivers](https://superset.apache.org/docs/databases/dockeradddrivers)
|
||||
- [Building New Database Connectors](https://preset.io/blog/building-database-connector/)
|
||||
- [Create Your First Dashboard](https://superset.apache.org/docs/creating-charts-dashboards/first-dashboard)
|
||||
- [Comprehensive Tutorial for Contributing Code to Apache Superset
|
||||
](https://preset.io/blog/tutorial-contributing-code-to-apache-superset/)
|
||||
- [Documentation for Superset End-Users (by Preset)](https://docs.preset.io/docs/terminology)
|
||||
- Deploying Superset
|
||||
- [Official Docker image](https://hub.docker.com/r/apache/superset)
|
||||
- [Helm Chart](https://github.com/apache/superset/tree/master/helm/superset)
|
||||
- Recordings of Past [Superset Community Events](https://preset.io/events)
|
||||
- [Live Demo: Interactive Time-series Analysis with Druid and Superset](https://preset.io/events/2021-03-02-interactive-time-series-analysis-with-druid-and-superset/)
|
||||
- [Live Demo: Visualizing MongoDB and Pinot Data using Trino](https://preset.io/events/2021-04-13-visualizing-mongodb-and-pinot-data-using-trino/)
|
||||
- [Superset Contributor Bootcamp](https://preset.io/events/superset-contributor-bootcamp-dec-21/)
|
||||
- [Introduction to the Superset API](https://preset.io/events/introduction-to-the-superset-api/)
|
||||
- [Apache Superset 1.3 Meetup](https://preset.io/events/apache-superset-1-3/)
|
||||
- [Building a Database Connector for Superset](https://preset.io/events/2021-02-16-building-a-database-connector-for-superset/)
|
||||
- [Recordings of Past Community Events](https://www.youtube.com/channel/UCMuwrvBsg_jjI2gLcm04R0g)
|
||||
- [Meetup: Superset 1.0](https://www.youtube.com/watch?v=gEZkFF2kokk)
|
||||
- [Live Demo: Interactive Time-series Analysis with Druid and Superset](https://www.youtube.com/watch?v=4eh7OTfMln8)
|
||||
- [Live Demo: Visualizing MongoDB and Pinot Data using Trino](https://www.youtube.com/watch?v=Dw_al_26F6o)
|
||||
- Upcoming Superset Events
|
||||
- [Superset + Star Wars: May the 4th Be With You](https://preset.io/events/2021-05-04-superset-star-wars-may-the-4th-be-with-you)
|
||||
- [Meetup - Developing and Deploying Custom Visualization Plugins in Superset](https://www.meetup.com/Global-Apache-Superset-Community-Meetup/events/277835486/)
|
||||
- [Visualize Your Data Lake Using Athena and Superset](https://preset.io/events/2021-05-18-visualize-your-data-lake-using-athena-and-superset)
|
||||
- Visualizations
|
||||
- [Building Custom Viz Plugins](https://superset.apache.org/docs/installation/building-custom-viz-plugins)
|
||||
- [Managing and Deploying Custom Viz Plugins](https://medium.com/nmc-techblog/apache-superset-manage-custom-viz-plugins-in-production-9fde1a708e55)
|
||||
- [Why Apache Superset is Betting on Apache ECharts](https://preset.io/blog/2021-4-1-why-echarts/)
|
||||
|
||||
- [Superset API](https://superset.apache.org/docs/rest-api)
|
||||
|
||||
@@ -14,7 +14,7 @@
|
||||
# See the License for the specific language governing permissions and
|
||||
# limitations under the License.
|
||||
#
|
||||
FROM python:3.8-buster
|
||||
FROM python:3.7-buster
|
||||
|
||||
RUN useradd --user-group --create-home --no-log-init --shell /bin/bash superset
|
||||
|
||||
@@ -34,7 +34,7 @@ RUN apt-get install -y build-essential libssl-dev \
|
||||
|
||||
# Install nodejs for custom build
|
||||
# https://nodejs.org/en/download/package-manager/
|
||||
RUN curl -sL https://deb.nodesource.com/setup_16.x | bash - \
|
||||
RUN curl -sL https://deb.nodesource.com/setup_12.x | bash - \
|
||||
&& apt-get install -y nodejs
|
||||
|
||||
RUN mkdir -p /home/superset
|
||||
|
||||
@@ -14,7 +14,7 @@
|
||||
# See the License for the specific language governing permissions and
|
||||
# limitations under the License.
|
||||
#
|
||||
FROM python:3.8-buster
|
||||
FROM python:3.7-buster
|
||||
|
||||
RUN useradd --user-group --create-home --no-log-init --shell /bin/bash superset
|
||||
|
||||
@@ -34,7 +34,7 @@ RUN apt-get install -y build-essential libssl-dev \
|
||||
|
||||
# Install nodejs for custom build
|
||||
# https://nodejs.org/en/download/package-manager/
|
||||
RUN curl -sL https://deb.nodesource.com/setup_16.x | bash - \
|
||||
RUN curl -sL https://deb.nodesource.com/setup_12.x | bash - \
|
||||
&& apt-get install -y nodejs
|
||||
|
||||
RUN mkdir -p /home/superset
|
||||
|
||||
@@ -14,7 +14,7 @@
|
||||
# See the License for the specific language governing permissions and
|
||||
# limitations under the License.
|
||||
#
|
||||
FROM python:3.8-buster
|
||||
FROM python:3.7-buster
|
||||
ARG VERSION
|
||||
|
||||
RUN git clone --depth 1 --branch ${VERSION} https://github.com/apache/superset.git /superset
|
||||
|
||||
@@ -14,7 +14,7 @@
|
||||
# See the License for the specific language governing permissions and
|
||||
# limitations under the License.
|
||||
#
|
||||
FROM python:3.8-buster
|
||||
FROM python:3.7-buster
|
||||
|
||||
RUN apt-get update -y
|
||||
RUN apt-get install -y jq
|
||||
|
||||
@@ -29,13 +29,7 @@ on the Superset Slack. People crafting releases and those interested in
|
||||
partaking in the process should join the channel.
|
||||
|
||||
## Release notes for recent releases
|
||||
|
||||
- [1.5](release-notes-1-5/README.md)
|
||||
- [1.4](release-notes-1-4/README.md)
|
||||
- [1.3](release-notes-1-3/README.md)
|
||||
- [1.2](release-notes-1-2/README.md)
|
||||
- [1.1](release-notes-1-1/README.md)
|
||||
- [1.0](release-notes-1-0/README.md)
|
||||
- [1.0.0](release-notes-1-0/README.md)
|
||||
- [0.38](release-notes-0-38/README.md)
|
||||
|
||||
## Release setup (First Time Only)
|
||||
@@ -62,35 +56,6 @@ need to be done at every release.
|
||||
|
||||
# Commit the changes
|
||||
svn commit -m "Add PGP keys of new Superset committer"
|
||||
|
||||
# push the changes
|
||||
svn update
|
||||
```
|
||||
|
||||
To minimize the risk of mixing up your local development environment, it's recommended to work on the
|
||||
release in a different directory than where the devenv is located. In this example, we'll clone
|
||||
the repo directly from the main `apache/superset` repo to a new directory `superset-release`:
|
||||
|
||||
```bash
|
||||
cd <MY PROJECTS PATH>
|
||||
git clone git@github.com:apache/superset.git superset-release
|
||||
cd superset-release
|
||||
```
|
||||
|
||||
We recommend setting up a virtual environment to isolate the python dependencies from your main
|
||||
setup:
|
||||
|
||||
```bash
|
||||
virtualenv venv
|
||||
source venv/bin/activate
|
||||
```
|
||||
|
||||
In addition, we recommend using the [`cherrytree`](https://pypi.org/project/cherrytree/) tool for
|
||||
automating cherry picking, as it will help speed up the release process. To install `cherrytree`
|
||||
and other dependencies that are required for the release process, run the following commands:
|
||||
|
||||
```bash
|
||||
pip install -r RELEASING/requirements.txt
|
||||
```
|
||||
|
||||
## Setting up the release environment (do every time)
|
||||
@@ -104,41 +69,35 @@ the wrong files/using wrong names. There's a script to help you set correctly al
|
||||
necessary environment variables. Change your current directory to `superset/RELEASING`
|
||||
and execute the `set_release_env.sh` script with the relevant parameters:
|
||||
|
||||
Usage (MacOS/ZSH):
|
||||
|
||||
```bash
|
||||
cd RELEASING
|
||||
source set_release_env.sh <SUPERSET_RC_VERSION> <PGP_KEY_FULLNAME>
|
||||
```
|
||||
|
||||
Usage (BASH):
|
||||
|
||||
```bash
|
||||
. set_release_env.sh <SUPERSET_RC_VERSION> <PGP_KEY_FULLNAME>
|
||||
```
|
||||
|
||||
Example:
|
||||
|
||||
Usage (ZSH):
|
||||
```bash
|
||||
source set_release_env.sh 1.5.1rc1 myid@apache.org
|
||||
source set_release_env.sh <SUPERSET_RC_VERSION> <PGP_KEY_FULLNAME>
|
||||
```
|
||||
|
||||
The script will output the exported variables. Here's example for 1.5.1rc1:
|
||||
Example:
|
||||
```bash
|
||||
source set_release_env.sh 0.38.0rc1 myid@apache.org
|
||||
```
|
||||
|
||||
The script will output the exported variables. Here's example for 0.38.0rc1:
|
||||
|
||||
```
-------------------------------
Set Release env variables
SUPERSET_VERSION=1.5.1
SUPERSET_VERSION=0.38.0
SUPERSET_RC=1
SUPERSET_GITHUB_BRANCH=1.5
SUPERSET_PGP_FULLNAME=villebro@apache.org
SUPERSET_VERSION_RC=1.5.1rc1
SUPERSET_RELEASE=apache-superset-1.5.1
SUPERSET_RELEASE_RC=apache-superset-1.5.1rc1
SUPERSET_RELEASE_TARBALL=apache-superset-1.5.1-source.tar.gz
SUPERSET_RELEASE_RC_TARBALL=apache-superset-1.5.1rc1-source.tar.gz
SUPERSET_TMP_ASF_SITE_PATH=/tmp/incubator-superset-site-1.5.1
-------------------------------
SUPERSET_GITHUB_BRANCH=0.38
SUPERSET_PGP_FULLNAME=myid@apache.org
SUPERSET_VERSION_RC=0.38.0rc1
SUPERSET_RELEASE=apache-superset-0.38.0
SUPERSET_RELEASE_RC=apache-superset-0.38.0rc1
SUPERSET_RELEASE_TARBALL=apache-superset-0.38.0-source.tar.gz
SUPERSET_RELEASE_RC_TARBALL=apache-superset-0.38.0rc1-source.tar.gz
SUPERSET_TMP_ASF_SITE_PATH=/tmp/superset-site-0.38.0
```

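Every variable in that listing can be derived from the single `<SUPERSET_RC_VERSION>` argument. The following is a hedged sketch of that derivation using plain shell parameter expansion; the variable names come from the documented output above, but the parsing logic itself is an assumption for illustration, not the actual contents of `set_release_env.sh`:

```shell
# Hypothetical derivation of the release variables from one "1.5.1rc1"-style
# argument. Names match the script's documented output; the parsing is a sketch.
SUPERSET_VERSION_RC="1.5.1rc1"
SUPERSET_VERSION="${SUPERSET_VERSION_RC%rc*}"      # strip the rc suffix -> 1.5.1
SUPERSET_RC="${SUPERSET_VERSION_RC##*rc}"          # keep only the rc number -> 1
SUPERSET_GITHUB_BRANCH="${SUPERSET_VERSION%.*}"    # drop the patch digit -> 1.5
SUPERSET_RELEASE="apache-superset-${SUPERSET_VERSION}"
SUPERSET_RELEASE_RC="apache-superset-${SUPERSET_VERSION_RC}"
SUPERSET_RELEASE_TARBALL="${SUPERSET_RELEASE}-source.tar.gz"
echo "${SUPERSET_VERSION} ${SUPERSET_RC} ${SUPERSET_GITHUB_BRANCH}"
```

The same `%pattern` / `##pattern` expansions work in both bash and zsh, which is why the script can be sourced from either.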
## Crafting a source release
@@ -148,133 +107,41 @@ a branch named with the release MAJOR.MINOR version (on this example 0.37).
This new branch will hold all PATCH and release candidates
that belong to the MAJOR.MINOR version.

### Creating an initial minor release (e.g. 1.5.0)

The MAJOR.MINOR branch is normally a "cut" from a specific point in time from the master branch.
When creating the initial minor release (e.g. 1.5.0), create a new branch:
Then (if needed) apply all cherries that will make the PATCH.

```bash
git checkout master
git pull
git checkout -b ${SUPERSET_GITHUB_BRANCH}
git push origin $SUPERSET_GITHUB_BRANCH
git checkout -b $SUPERSET_GITHUB_BRANCH
git push upstream $SUPERSET_GITHUB_BRANCH
```

Note that this initializes a new "release cut", and is NOT needed when creating a patch release
(e.g. 1.5.1).

### Creating a patch release (e.g. 1.5.1)

When getting ready to bake a patch release, simply checkout the relevant branch:

```bash
git checkout master
git pull
git checkout ${SUPERSET_GITHUB_BRANCH}
```

### Cherry picking

It is customary to label PRs that have been introduced after the cut with the label
`v<MAJOR>.<MINOR>`. For example, for any PRs that should be included in the 1.5 branch, the
label `v1.5` should be added.

To see how well the labelled PRs would apply to the current branch, run the following command:

```bash
cherrytree bake -r apache/superset -m master -l v${SUPERSET_GITHUB_BRANCH} ${SUPERSET_GITHUB_BRANCH}
```

This requires the presence of an environment variable `GITHUB_TOKEN`. Alternatively,
you can pass the token directly via the `--access-token` parameter (`-at` for short).

#### Happy path: no conflicts

This will show how many cherries will apply cleanly. If there are no conflicts, you can simply apply all cherries
by adding the `--no-dry-run` flag (`-nd` for short):

```bash
cherrytree bake -r apache/superset -m master -l v${SUPERSET_GITHUB_BRANCH} -nd ${SUPERSET_GITHUB_BRANCH}
```

#### Resolving conflicts

If there are conflicts, you can issue the following command to apply all cherries up until the conflict automatically, and then
break by adding the `-error-mode break` flag (`-e break` for short):

```bash
cherrytree bake -r apache/superset -m master -l v${SUPERSET_GITHUB_BRANCH} -nd -e break ${SUPERSET_GITHUB_BRANCH}
```

After applying the cleanly merged cherries, `cherrytree` will specify the SHA of the conflicted cherry. To resolve the conflict,
simply issue the following command:

```bash
git cherry-pick <SHA>
```

Then fix all conflicts, followed by

```bash
git add -u # add all changes
git cherry-pick --continue
```

After this, rerun all the above steps until all cherries have been picked, finally pushing all new commits to the release branch
on the main repo:

```bash
git push
```

### Updating changelog

Next, update the `CHANGELOG.md` with all the changes that are included in the release.
Make sure the branch has been pushed to `origin` to ensure the changelog generator
Make sure the branch has been pushed to `upstream` to ensure the changelog generator
can pick up changes since the previous release.
Similar to `cherrytree`, the change log script requires a github token, either as an env var
(`GITHUB_TOKEN`) or as the parameter `--access_token`.

#### Initial release (e.g. 1.5.0)

When generating the changelog for an initial minor release, you should compare with
the previous release (in the example, the previous release branch is `1.4`, so remember to
update it accordingly):
Change log script requires a github token and will try to use your env var GITHUB_TOKEN.
you can also pass the token using the parameter `--access_token`.

Example:
```bash
python changelog.py --previous_version 1.4 --current_version ${SUPERSET_GITHUB_BRANCH} changelog
python changelog.py --previous_version 0.37 --current_version 0.38 changelog
```

You can get a list of pull requests with labels starting with blocking, risk, hold, revert and security by using the parameter `--risk`.
Example:

```bash
python changelog.py --previous_version 0.37 --current_version 0.38 changelog --access_token {GITHUB_TOKEN} --risk
```
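As a rough illustration of what `--risk` filters on (the label prefixes listed above: blocking, risk, hold, revert, security), here is a self-contained sketch; the helper name and the data shape are hypothetical, not taken from `changelog.py`:

```python
# Hypothetical sketch of the --risk label filter described above.
# RISK_PREFIXES comes from the doc's list; the function itself is illustrative.
from typing import Dict, List

RISK_PREFIXES = ("blocking", "risk", "hold", "revert", "security")


def risky_prs(prs: List[Dict[str, object]]) -> List[Dict[str, object]]:
    """Keep only PRs carrying at least one label that starts with a risk prefix."""
    return [
        pr
        for pr in prs
        if any(
            str(label).lower().startswith(RISK_PREFIXES)
            for label in pr.get("labels", [])
        )
    ]


prs = [
    {"id": 1, "labels": ["risk:db-migration"]},
    {"id": 2, "labels": ["size/M"]},
    {"id": 3, "labels": ["Hold!", "v1.5"]},
]
print([pr["id"] for pr in risky_prs(prs)])  # -> [1, 3]
```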

The script will checkout both branches, compare all the PRs, and output the lines that are needed to be added to the
`CHANGELOG.md` file in the root of the repo. Remember to also make sure to update the branch id (with the above command
`1.5` needs to be changed to `1.5.0`)
The script will checkout both branches and compare all the PRs, copy the output and paste it on the `CHANGELOG.md`

Then, in `UPDATING.md`, a file that contains a list of notifications around
deprecations and upgrading-related topics,
make sure to move the content now under the `Next Version` section under a new
section for the new release.

#### Patch release (e.g. 1.5.1)
Finally bump the version number on `superset-frontend/package.json` (replace with whichever version is being released excluding the RC version):

To compare the forthcoming patch release with the latest release from the same branch, set
`--previous_version` as the tag of the previous release (in this example `1.5.0`; remember to update accordingly)

```bash
python changelog.py --previous_version 1.5.0 --current_version ${SUPERSET_GITHUB_BRANCH} changelog
```

### Set version number

Finally, bump the version number on `superset-frontend/package.json` (replace with whichever version is being released excluding the RC version):

```json
"version": "0.38.0"
```

@@ -286,13 +153,9 @@ git add ...
git commit ...
# push new tag
git tag ${SUPERSET_VERSION_RC}
git push origin ${SUPERSET_VERSION_RC}
git push upstream ${SUPERSET_VERSION_RC}
```

### Create a release on Github

After submitting the tag, follow the steps [here](https://docs.github.com/en/repositories/releasing-projects-on-github/managing-releases-in-a-repository) to create the release. Use the vote email text as the content for the release description. Make sure to check the "This is a pre-release" checkbox for release candidates. You can check previous releases if you need an example.

## Preparing the release candidate

The first step of preparing an Apache Release is packaging a release candidate
@@ -308,15 +171,14 @@ the tag and create a signed source tarball from it:

Note that `make_tarball.sh`:

- By default, the script assumes you have already executed an SVN checkout to `$HOME/svn/superset_dev`.
  This can be overridden by setting `SUPERSET_SVN_DEV_PATH` environment var to a different svn dev directory
- By default assumes you have already executed an SVN checkout to `$HOME/svn/superset_dev`.
  This can be overridden by setting `SUPERSET_SVN_DEV_PATH` environment var to a different svn dev directory
- Will refuse to craft a new release candidate if a release already exists on your local svn dev directory
- Will check the `package.json` version number and fail if it's not correctly set

### Build and test the created source tarball

To build and run the **local copy** of the recently created tarball:

```bash
# Build and run a release candidate tarball
./test_run_tarball.sh local
@@ -332,13 +194,11 @@ Now let's ship this RC into svn's dev folder
cd ~/svn/superset_dev/
svn add ${SUPERSET_VERSION_RC}
svn commit -m "Release ${SUPERSET_VERSION_RC}"
svn update
```

### Build and test from SVN source tarball

To build and run the recently created tarball **from SVN**:

```bash
# Build and run a release candidate tarball
./test_run_tarball.sh
@@ -347,7 +207,6 @@ To build and run the recently created tarball **from SVN**:
```

### Voting

Now you're ready to start the [VOTE] thread. Here's an example of a
previous release vote thread:
https://lists.apache.org/thread.html/e60f080ebdda26896214f7d3d5be1ccadfab95d48fbe813252762879@<dev.superset.apache.org>
@@ -356,10 +215,17 @@ To easily send a voting request to Superset community, still on the `superset/RE

```bash
# Note: use Superset's virtualenv
(venv)$ python generate_email.py vote_pmc
(venv)$ python send_email.py vote_pmc
```

The script will generate the email text that should be sent to dev@superset.apache.org using an email client. The release version and release candidate number are fetched from the previously set environment variables.
The script will interactively ask for extra information so it can authenticate on the Apache Email Relay.
The release version and release candidate number are fetched from the previously set environment variables.

```
Sender email (ex: user@apache.org): your_apache_email@apache.org
Apache username: your_apache_user
Apache password: your_apache_password
```

Once 3+ binding votes (by PMC members) have been cast and at
least 72 hours have passed, you can post a [RESULT] thread:
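The outcome rule used by the `result_pmc.j2` template that appears further down this diff (any -1 vote fails the release; otherwise more than two binding +1s pass it; anything else is non-conclusive) can be mirrored in a few lines of Python; the function name `vote_outcome` is hypothetical:

```python
# Mirrors the branch logic of RELEASING/email_templates/result_pmc.j2
# shown later in this diff; the function name is illustrative only.
def vote_outcome(bindings: int, nonbindings: int, negatives: int) -> str:
    if negatives > 0:          # any -1 vote blocks the release
        return "NOT PASS"
    if bindings > 2:           # 3+ binding +1 votes required
        return "PASS"
    return "non conclusive"    # not enough binding votes either way


print(vote_outcome(bindings=4, nonbindings=1, negatives=0))  # -> PASS
```

Note that non-binding +1 votes are reported in the email but do not affect the outcome.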
@@ -369,20 +235,23 @@ To easily send the result email, still on the `superset/RELEASING` directory:

```bash
# Note: use Superset's virtualenv
python generate_email.py result_pmc
python send_email.py result_pmc
```

The script will interactively ask for extra information needed to fill out the email template. Based on the
voting description, it will generate a passing, non-passing or non-conclusive email.
Here's an example:
here's an example:

```
Sender email (ex: user@apache.org): your_apache_email@apache.org
Apache username: your_apache_user
Apache password: your_apache_password
A List of people with +1 binding vote (ex: Max,Grace,Krist): Daniel,Alan,Max,Grace
A List of people with +1 non binding vote (ex: Ville): Ville
A List of people with -1 vote (ex: John):
```

The script will generate the email text that should be sent to dev@superset.apache.org using an email client. The release version and release candidate number are fetched from the previously set environment variables.
Following the result thread, yet another [VOTE] thread should be

### Validating a release

@@ -391,7 +260,6 @@ https://www.apache.org/info/verification.html
## Publishing a successful release

Upon a successful vote, you'll have to copy the folder into the non-"dev/" folder.

```bash
cp -r ~/svn/superset_dev/${SUPERSET_VERSION_RC}/ ~/svn/superset/${SUPERSET_VERSION}/
cd ~/svn/superset/
@@ -399,11 +267,9 @@ cd ~/svn/superset/
for f in ${SUPERSET_VERSION}/*; do mv "$f" "${f/${SUPERSET_VERSION_RC}/${SUPERSET_VERSION}}"; done
svn add ${SUPERSET_VERSION}
svn commit -m "Release ${SUPERSET_VERSION}"
svn update
```

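The `${f/…/…}` expression in the rename loop above is bash pattern substitution: it replaces the first occurrence of the rc-suffixed version in each file name with the final version. One iteration can be sketched in isolation (the file name below follows the tarball naming convention used in this guide):

```shell
# One iteration of the rename loop above, isolated: bash's ${var/pattern/string}
# substitution swaps the rc-suffixed version for the final version in the name.
SUPERSET_VERSION_RC="1.5.1rc1"
SUPERSET_VERSION="1.5.1"
f="1.5.1/apache-superset-${SUPERSET_VERSION_RC}-source.tar.gz"
renamed="${f/${SUPERSET_VERSION_RC}/${SUPERSET_VERSION}}"
echo "$renamed"   # -> 1.5.1/apache-superset-1.5.1-source.tar.gz
```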
Then tag the final release:

```bash
# Go to the root directory of the repo, e.g. `~/src/superset`
cd ~/src/superset/
@@ -411,8 +277,6 @@ cd ~/src/superset/
git branch
# Create the release tag
git tag -f ${SUPERSET_VERSION}
# push the tag to the remote
git push origin ${SUPERSET_VERSION}
```

### Update CHANGELOG and UPDATING on superset
@@ -422,61 +286,25 @@ with the changes on `CHANGELOG.md` and `UPDATING.md`.

### Publishing a Convenience Release to PyPI

Extract the release to the `/tmp` folder to build the PyPI release. Files in the `/tmp` folder will be automatically deleted by the OS.

```bash
mkdir -p /tmp/superset && cd /tmp/superset
tar xfvz ~/svn/superset/${SUPERSET_VERSION}/${SUPERSET_RELEASE_TARBALL}
```

Create a virtual environment and install the dependencies

```bash
cd ${SUPERSET_RELEASE_RC}
python3 -m venv venv
source venv/bin/activate
pip install -r requirements/base.txt
pip install twine
```

Create the distribution

```bash
cd superset-frontend/
npm ci && npm run build
cd ../
flask fab babel-compile --target superset/translations
python setup.py sdist
```

Publish to PyPI

You may need to ask a fellow committer to grant
Using the final release tarball, unpack it and run `./pypi_push.sh`.
This script will build the Javascript bundle and echo the twine command
allowing you to publish to PyPI. You may need to ask a fellow committer to grant
you access to it if you don't have access already. Make sure to create
an account first if you don't have one, and reference your username
while requesting access to push packages.

```bash
twine upload dist/apache-superset-${SUPERSET_VERSION}.tar.gz

# Set your username to token
# Set your password to the token value, including the pypi- prefix
```

### Announcing

Once it's all done, an [ANNOUNCE] thread announcing the release to the dev@ mailing list is the final step.

```bash
# Note: use Superset's virtualenv
python generate_email.py announce
python send_email.py announce
```

The script will generate the email text that should be sent to dev@superset.apache.org using an email client. The release version is fetched from the previously set environment variables.
### Github Release

### GitHub Release

Finally, so the GitHub UI reflects the latest release, you should create a release from the
Finally, so the Github UI reflects the latest release, you should create a release from the
tag corresponding with the new version. Go to https://github.com/apache/superset/tags,
click the 3-dot icon and select `Create Release`, paste the content of the ANNOUNCE thread in the
release notes, and publish the new release.

@@ -13,6 +13,9 @@
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
# pylint: disable=no-value-for-parameter

import csv as lib_csv
import os
import re
@@ -21,12 +24,11 @@ from dataclasses import dataclass
from typing import Any, Dict, Iterator, List, Optional, Union

import click
from click.core import Context

try:
    from github import BadCredentialsException, Github, PullRequest, Repository
except ModuleNotFoundError:
    print("PyGitHub is a required package for this script")
    print("PyGithub is a required package for this script")
    exit(1)

SUPERSET_REPO = "apache/superset"
@@ -48,7 +50,7 @@ class GitLog:
    author_email: str = ""

    def __eq__(self, other: object) -> bool:
        """A log entry is considered equal if it has the same PR number"""
        """ A log entry is considered equal if it has the same PR number """
        if isinstance(other, self.__class__):
            return other.pr_number == self.pr_number
        return False
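The `__eq__` above is what lets the changelog script de-duplicate log entries that refer to the same pull request (e.g. a commit and its cherry-pick). A trimmed-down, self-contained stand-in for the real `GitLog` dataclass shows the idea; the class name, fields, and sample values here are illustrative:

```python
# Trimmed-down stand-in for the GitLog dataclass above: equality (and thus
# de-duplication) is keyed on pr_number alone, not on the full commit metadata.
# Note: @dataclass does not overwrite an __eq__ defined in the class body.
from dataclasses import dataclass
from typing import Optional


@dataclass
class MiniGitLog:
    sha: str
    message: str
    pr_number: Optional[int] = None

    def __eq__(self, other: object) -> bool:
        """A log entry is considered equal if it has the same PR number"""
        if isinstance(other, self.__class__):
            return other.pr_number == self.pr_number
        return False


a = MiniGitLog(sha="abc1234", message="fix: typo", pr_number=10605)
b = MiniGitLog(sha="def5678", message="fix: typo (cherry-pick)", pr_number=10605)
print(a == b)  # -> True: same PR, different commits
```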
@@ -164,20 +166,11 @@ class GitChangeLog:
        return False

    def _get_changelog_version_head(self) -> str:
        if not len(self._logs):
            print(
                f"No changes found between revisions. "
                f"Make sure your branch is up to date."
            )
            sys.exit(1)
        return f"### {self._version} ({self._logs[0].time})"

    def _parse_change_log(
        self,
        changelog: Dict[str, str],
        pr_info: Dict[str, str],
        github_login: str,
    ) -> None:
        self, changelog: Dict[str, str], pr_info: Dict[str, str], github_login: str,
    ):
        formatted_pr = (
            f"- [#{pr_info.get('id')}]"
            f"(https://github.com/{SUPERSET_REPO}/pull/{pr_info.get('id')}) "
@@ -331,8 +324,8 @@ def print_title(message: str) -> None:
@click.pass_context
@click.option("--previous_version", help="The previous release version", required=True)
@click.option("--current_version", help="The current release version", required=True)
def cli(ctx: Context, previous_version: str, current_version: str) -> None:
    """Welcome to change log generator"""
def cli(ctx, previous_version: str, current_version: str) -> None:
    """ Welcome to change log generator """
    previous_logs = GitLogs(previous_version)
    current_logs = GitLogs(current_version)
    previous_logs.fetch()
@@ -344,7 +337,7 @@ def cli(ctx: Context, previous_version: str, current_version: str) -> None:
@cli.command("compare")
@click.pass_obj
def compare(base_parameters: BaseParameters) -> None:
    """Compares both versions (by PR)"""
    """ Compares both versions (by PR) """
    previous_logs = base_parameters.previous_logs
    current_logs = base_parameters.current_logs
    print_title(
@@ -364,8 +357,7 @@ def compare(base_parameters: BaseParameters) -> None:

@cli.command("changelog")
@click.option(
    "--csv",
    help="The csv filename to export the changelog to",
    "--csv", help="The csv filename to export the changelog to",
)
@click.option(
    "--access_token",
@@ -377,7 +369,7 @@ def compare(base_parameters: BaseParameters) -> None:
def change_log(
    base_parameters: BaseParameters, csv: str, access_token: str, risk: bool
) -> None:
    """Outputs a changelog (by PR)"""
    """ Outputs a changelog (by PR) """
    previous_logs = base_parameters.previous_logs
    current_logs = base_parameters.current_logs
    previous_diff_logs = previous_logs.diff(current_logs)

@@ -17,7 +17,7 @@
under the License.
-#}
To: {{ receiver_email }}

From: {{ sender_email }}
Subject: [ANNOUNCE] Apache {{ project_name }} version {{ version }} Released

Hello Community,
@@ -31,12 +31,12 @@ The official source release:

https://www.apache.org/dist/{{ project_module }}/{{ version }}

The PyPI package:
The Pypi package:

https://pypi.org/project/apache-superset/

If you have any usage questions or have problems when upgrading or
find any issues with enhancements included in this release, please
If you have any usage questions, or have problems when upgrading or
find any problems about enhancements included in this release, please
don't hesitate to let us know by sending feedback to this mailing
list.


@@ -17,18 +17,18 @@
under the License.
-#}
To: {{ receiver_email }}

From: {{ sender_email }}
Subject: [RESULT] [VOTE] Release Apache {{ project_name }} {{ version }} based on Superset {{ version_rc }}

Thanks to everyone that participated. The vote to release
Apache {{ project_name }} version {{ version }} based on {{ version_rc }} is now closed.

{% if vote_negatives|length > 0 -%}
The vote did NOT PASS with {{vote_bindings|length}} binding +1, {{ vote_nonbindings|length}} non-binding +1, and {{vote_negatives|length}} -1 votes:
The vote did NOT PASS with {{vote_bindings|length}} binding +1, {{ vote_nonbindings|length}} non binding +1 and {{vote_negatives|length}} -1 votes:
{% elif vote_bindings|length > 2 -%}
The vote PASSED with {{vote_bindings|length}} binding +1, {{ vote_nonbindings|length}} non-binding +1, and {{vote_negatives|length}} -1 votes:
The vote PASSED with {{vote_bindings|length}} binding +1, {{ vote_nonbindings|length}} non binding +1 and {{vote_negatives|length}} -1 votes:
{% else -%}
The vote is non conclusive with {{vote_bindings|length}} binding +1, {{ vote_nonbindings|length}} non-binding -1, and {{vote_negatives|length}} -1 votes:
The vote is non conclusive with {{vote_bindings|length}} binding +1, {{ vote_nonbindings|length}} non binding -1 and {{vote_negatives|length}} -1 votes:
{%- endif %}

{% if vote_bindings|length > 0 -%}
@@ -39,7 +39,7 @@ Binding votes:
{%- endif %}

{% if vote_nonbindings|length > 0 -%}
Non-binding votes:
Non binding votes:
{% for voter in vote_nonbindings -%}
- {{ voter }}
{% endfor -%}

@@ -17,7 +17,7 @@
under the License.
-#}
To: {{ receiver_email }}

From: {{ sender_email }}
Subject: [VOTE] Release Apache {{ project_name }} {{ version }} based on Superset {{ version_rc }}

Hello {{ project_name }} Community,
@@ -36,7 +36,8 @@ https://github.com/apache/{{ project_module }}/blob/{{ version_rc }}/CHANGELOG.m
The Updating instructions for the release:
https://github.com/apache/{{ project_module }}/blob/{{ version_rc }}/UPDATING.md

Public keys are available at:
public keys are available at:

https://www.apache.org/dist/{{ project_module }}/KEYS

The vote will be open for at least 72 hours or until the necessary number

@@ -1,154 +0,0 @@
#!/usr/bin/python3
#
# Licensed to the Apache Software Foundation (ASF) under one or more
# contributor license agreements. See the NOTICE file distributed with
# this work for additional information regarding copyright ownership.
# The ASF licenses this file to You under the Apache License, Version 2.0
# (the "License"); you may not use this file except in compliance with
# the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
from typing import Any, Dict, List

from click.core import Context

try:
    import jinja2
except ModuleNotFoundError:
    exit("Jinja2 is a required dependency for this script")
try:
    import click
except ModuleNotFoundError:
    exit("Click is a required dependency for this script")

RECEIVER_EMAIL = "dev@superset.apache.org"
PROJECT_NAME = "Superset"
PROJECT_MODULE = "superset"
PROJECT_DESCRIPTION = "Apache Superset is a modern, enterprise-ready business intelligence web application"


def string_comma_to_list(message: str) -> List[str]:
    if not message:
        return []
    return [element.strip() for element in message.split(",")]

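`string_comma_to_list` above is the parser behind the comma-separated voter prompts in `result_pmc` (e.g. `Daniel,Alan,Max,Grace`); it is reproduced here verbatim so its trimming behavior can be shown in isolation:

```python
# string_comma_to_list as defined in send_email.py above: splits a
# comma-separated prompt answer and strips whitespace around each name.
from typing import List


def string_comma_to_list(message: str) -> List[str]:
    if not message:
        return []
    return [element.strip() for element in message.split(",")]


print(string_comma_to_list("Daniel, Alan ,Max,Grace"))  # -> ['Daniel', 'Alan', 'Max', 'Grace']
print(string_comma_to_list(""))                         # -> []
```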

def render_template(template_file: str, **kwargs: Any) -> str:
    """
    Simple render template based on named parameters

    :param template_file: The template file location
    :kwargs: Named parameters to use when rendering the template
    :return: Rendered template
    """
    template = jinja2.Template(open(template_file).read())
    return template.render(kwargs)


class BaseParameters(object):
    def __init__(
        self,
        version: str,
        version_rc: str,
    ) -> None:
        self.version = version
        self.version_rc = version_rc
        self.template_arguments: Dict[str, Any] = {}

    def __repr__(self) -> str:
        return f"Apache Credentials: {self.version}/{self.version_rc}"


@click.group()
@click.pass_context
@click.option("--version", envvar="SUPERSET_VERSION")
@click.option("--version_rc", envvar="SUPERSET_VERSION_RC")
def cli(
    ctx: Context,
    version: str,
    version_rc: str,
) -> None:
    """Welcome to releasing send email CLI interface!"""
    base_parameters = BaseParameters(version, version_rc)
    base_parameters.template_arguments["receiver_email"] = RECEIVER_EMAIL
    base_parameters.template_arguments["project_name"] = PROJECT_NAME
    base_parameters.template_arguments["project_module"] = PROJECT_MODULE
    base_parameters.template_arguments["project_description"] = PROJECT_DESCRIPTION
    base_parameters.template_arguments["version"] = base_parameters.version
    base_parameters.template_arguments["version_rc"] = base_parameters.version_rc
    ctx.obj = base_parameters

@cli.command("vote_pmc")
@click.pass_obj
def vote_pmc(base_parameters: BaseParameters) -> None:
    template_file = "email_templates/vote_pmc.j2"
    message = render_template(template_file, **base_parameters.template_arguments)
    print(message)


@cli.command("result_pmc")
@click.option(
    "--vote_bindings",
    default="",
    type=str,
    prompt="A List of people with +1 binding vote (ex: Max,Grace,Krist)",
)
@click.option(
    "--vote_nonbindings",
    default="",
    type=str,
    prompt="A List of people with +1 non binding vote (ex: Ville)",
)
@click.option(
    "--vote_negatives",
    default="",
    type=str,
    prompt="A List of people with -1 vote (ex: John)",
)
@click.option(
    "--vote_thread",
    default="",
    type=str,
    prompt="Permalink to the vote thread "
    "(see https://lists.apache.org/list.html?dev@superset.apache.org)",
)
@click.pass_obj
def result_pmc(
    base_parameters: BaseParameters,
    vote_bindings: str,
    vote_nonbindings: str,
    vote_negatives: str,
    vote_thread: str,
) -> None:
    template_file = "email_templates/result_pmc.j2"
    base_parameters.template_arguments["vote_bindings"] = string_comma_to_list(
        vote_bindings
    )
    base_parameters.template_arguments["vote_nonbindings"] = string_comma_to_list(
        vote_nonbindings
    )
    base_parameters.template_arguments["vote_negatives"] = string_comma_to_list(
        vote_negatives
    )
    base_parameters.template_arguments["vote_thread"] = vote_thread
    message = render_template(template_file, **base_parameters.template_arguments)
    print(message)


@cli.command("announce")
@click.pass_obj
def announce(base_parameters: BaseParameters) -> None:
    template_file = "email_templates/announce.j2"
    message = render_template(template_file, **base_parameters.template_arguments)
    print(message)


cli()
@@ -167,7 +167,7 @@ Other features

Alerts (send notification when a condition is met) ([Roadmap](https://github.com/apache-superset/superset-roadmap/issues/54))
- feat: add test email functionality to SQL-based email alerts (#[10476](https://github.com/apache/superset/pull/10476))
- feat: refactored SQL-based alerting framework (#[10605](https://github.com/apache/superset/pull/10605))
- feat: refractored SQL-based alerting framework (#[10605](https://github.com/apache/superset/pull/10605))


[SIP-34] Proposal to establish a new design direction, system, and process for Superset ([SIP](https://github.com/apache/superset/issues/8976))

@@ -96,6 +96,7 @@ Some of the new features in this release are disabled by default. Each has a fea
| Dashboard Native Filters | `DASHBOARD_NATIVE_FILTERS: True` | |
| Alerts & Reporting | `ALERT_REPORTS: True` | [Celery workers configured & celery beat process](https://superset.apache.org/docs/installation/async-queries-celery) |
| Homescreen Thumbnails | `THUMBNAILS: TRUE, THUMBNAIL_CACHE_CONFIG: CacheConfig = { "CACHE_TYPE": "null", "CACHE_NO_NULL_WARNING": True}` | selenium, pillow 7, celery |
| Row Level Security | `ROW_LEVEL_SECURITY` | [Extra Documentation](https://superset.apache.org/docs/security#row-level-security) |
| Dynamic Viz Plugin Import | `DYNAMIC_PLUGINS: True` | |

# Stability and Bugfixes

@@ -1,122 +0,0 @@
|
||||
<!--
|
||||
Licensed to the Apache Software Foundation (ASF) under one
|
||||
or more contributor license agreements. See the NOTICE file
|
||||
distributed with this work for additional information
|
||||
regarding copyright ownership. The ASF licenses this file
|
||||
to you under the Apache License, Version 2.0 (the
|
||||
"License"); you may not use this file except in compliance
|
||||
with the License. You may obtain a copy of the License at
|
||||
|
||||
http://www.apache.org/licenses/LICENSE-2.0
|
||||
|
||||
Unless required by applicable law or agreed to in writing,
|
||||
software distributed under the License is distributed on an
|
||||
"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
|
||||
KIND, either express or implied. See the License for the
|
||||
specific language governing permissions and limitations
|
||||
under the License.
|
||||
-->
|
||||

# Release Notes for Superset 1.2

Superset 1.2 continues the Apache ECharts migration by introducing several new chart types. It also brings tons of user experience improvements, API improvements, bug fixes, and continued development of the experimental features included in previous releases. Keep reading for more details on these categories:

- [**User Experience**](#user-experience)
- [**Dashboard Level Security**](#dashboard-level-security)
- [**Database Connectivity**](#database-connectivity)
- [**Developer Experience**](#developer-experience)
- [**PR Highlights**](#pr-highlights)
- [**Breaking Changes and Full Changelog**](#breaking-changes-and-full-changelog)

# User Experience

The migration to Apache ECharts continues with several new high-quality visualizations in this release.

The mixed time-series chart allows different kinds of time-series visualizations to be overlaid.



The radar chart provides a good way to compare two or more groups across various features of interest.



By popular demand, we have also introduced a new and improved version of the pivot table visualization.



Several UI tweaks in Explore and SQL Lab made it into this release as well, including new buttons and menu options that make common workflows easier, along with more communicative error messages, particularly in the database connection menus.

The dashboard native filter feature, [while still behind a feature flag in this release](https://github.com/apache/superset/blob/master/RELEASING/release-notes-1-0/README.md#feature-flags), has received plenty of new functionality and is closer than ever to being ready for prime time. This feature provides a way to apply and manipulate filters across many charts at the dashboard level. 1.2 adds more controls, more options for aggregations, and better support for temporal filters, among other things.





Last but not least, the alerts and reports feature and its dependencies have been added to the [docker-compose](https://superset.apache.org/docs/installation/installing-superset-using-docker-compose) setup, making it easier to use outside of well-supported enterprise deployments.

# Dashboard Level Security

Superset has so far relied on a role-based access system implemented at the dataset level. While this provides granular security options that satisfy many use cases, some organizations need more. [SIP-51](https://github.com/apache/superset/issues/10408) lays out a vision for dashboard-level role-based access control as a fully backwards-compatible extension of Superset's security options.

The 1.1 release took steps in the direction of this vision, and 1.2 builds on it with new permissions for sharing charts and dashboards that can be assigned to roles. **Note that this functionality is still experimental and hidden behind a feature flag as of 1.2.**



# Database Connectivity

The 1.2 release adds engine specs for [CrateDB](https://github.com/apache/superset/pull/13152) and [Databricks](https://github.com/apache/superset/pull/13682).

# Developer Experience

Expanding the API has been an ongoing effort, and 1.2 introduces several new API routes that let developers list available databases, fetch a given dashboard's charts, and import saved queries, among other things.

# PR Highlights

**New Charts and User Experience**

- [14197](https://github.com/apache/superset/pull/14197) feat(viz): add mixed and radar chart (#14197) (@Ville Brofeldt)
- [14187](https://github.com/apache/superset/pull/14187) Enable the new pivot table (#14187) (@Kamil Gabryjelski)
- [13210](https://github.com/apache/superset/pull/13210) feat(explore): ColumnSelectControl with drag-and-drop (#13210) (@Yongjie Zhao)
- [13598](https://github.com/apache/superset/pull/13598) feat(explore): Drag and drop UX improvements (#13598) (@Kamil Gabryjelski)
- [13294](https://github.com/apache/superset/pull/13294) feat(explore): Postgres datatype conversion (#13294) (@Nikola Gigić)
- [13758](https://github.com/apache/superset/pull/13758) feat(explore): adhoc column formatting for Table chart (#13758) (@Jesse Yang)

**Progress On Dashboard Native Filters**

- [13726](https://github.com/apache/superset/pull/13726) feat(native-filters): Add default first value to select filter (#13726) (@simcha90)
- [14461](https://github.com/apache/superset/pull/14461) feat(native-filters): Auto apply changes in FiltersConfigModal (#14461) (@simcha90)
- [13507](https://github.com/apache/superset/pull/13507) feat(native-filters): Filter set tabs (#13507) (@simcha90)
- [14313](https://github.com/apache/superset/pull/14313) feat(native-filters): Implement adhoc filters and time picker in Range and Select native filters (#14313) (@Kamil Gabryjelski)
- [14261](https://github.com/apache/superset/pull/14261) feat(native-filters): Show/Hide filter bar by metadata ff (#14261) (@simcha90)
- [13506](https://github.com/apache/superset/pull/13506) feat(native-filters): Update filter bar buttons (#13506) (@simcha90)
- [14374](https://github.com/apache/superset/pull/14374) feat(native-filters): Use datasets in dashboard as default options for native filters (#14374) (@Kamil Gabryjelski)
- [14314](https://github.com/apache/superset/pull/14314) feat(native-filters): add option to create value in select filter (#14314) (@Ville Brofeldt)
- [14346](https://github.com/apache/superset/pull/14346) feat(native-filters): add optional sort metric to select filter (#14346) (@Ville Brofeldt)
- [14375](https://github.com/apache/superset/pull/14375) feat(native-filters): add refresh button to default value picker (#14375) (@Ville Brofeldt)
- [13569](https://github.com/apache/superset/pull/13569) feat(native-filters): add sort option to select filter (#13569) (@Ville Brofeldt)
- [13622](https://github.com/apache/superset/pull/13622) feat(native-filters): add temporal support to select filter (#13622) (@Ville Brofeldt)
- [13484](https://github.com/apache/superset/pull/13484) feat(native-filters): add timegrain and column filter (#13484) (@Ville Brofeldt)
- [14312](https://github.com/apache/superset/pull/14312) feat(native-filters): add tooltip to control values (#14312) (@Ville Brofeldt)
- [14217](https://github.com/apache/superset/pull/14217) feat(native-filters): select group by support (#14217) (@Amit Miran)

**Progress On Dashboard Level Access**

- [13145](https://github.com/apache/superset/pull/13145) feat(dashboard_rbac): manage roles for dashboard (#13145) (@simcha90)
- [13992](https://github.com/apache/superset/pull/13992) feat(dashboard_rbac): provide data access based on dashboard access (#13992) (@Amit Miran)
- [12865](https://github.com/apache/superset/pull/12865) feat(dashboard_rbac): dashboards API support for roles create/update + roles validation (@amitmiran137)

**Improvements to Developer Experience**

- [14208](https://github.com/apache/superset/pull/14208) feat: add endpoint to fetch available DBs (#14208) (@Beto Dealmeida)
- [13331](https://github.com/apache/superset/pull/13331) fix(query-object): extra time-range-endpoints (#13331) (@John Bodley)
- [13893](https://github.com/apache/superset/pull/13893) feat: create backend routes and API for importing saved queries (#13893) (@AAfghahi)
- [13960](https://github.com/apache/superset/pull/13960) feat: initial work to make v1 API compatible with SIP-40 and SIP-41 (#13960) (@Beto Dealmeida)
- [13444](https://github.com/apache/superset/pull/13444) fix: API to allow importing old exports (JSON/YAML) (#13444) (@Beto Dealmeida)

## Breaking Changes and Full Changelog

- To see the complete changelog in this release, head to [CHANGELOG.MD](https://github.com/apache/superset/blob/master/CHANGELOG.md).
- In line with the semantic versioning scheme adopted by the community, 1.2.0 does not contain any backwards-incompatible changes.
@@ -1,73 +0,0 @@
# Release Notes for Superset 1.3

Superset 1.3 focuses on hardening and polishing the Superset user experience, with tons of UX improvements and bug fixes focused on charts, dashboards, and the new dashboard-native filters.

- [**User Experience**](#user-experience)
- [**Developer Experience**](#developer-experience)
- [**Database Connectivity**](#database-connectivity)
- [**PR Highlights**](#pr-highlights)
- [**Breaking Changes and Full Changelog**](#breaking-changes-and-full-changelog)

# User Experience

One major goal of this release is to improve and harden dashboard-native filters. These filters live at the dashboard level rather than within a chart and affect all charts within their scope on a dashboard. Improvements in this release include clearer visual indicators of which charts are within the scope of a selected filter.



Native filters can also be set to load collapsed, which also improves the connected thumbnail and alerts/reports functionality.



For charts, we've added a new funnel chart.



Users can also now use Jinja templating in calculated columns and SQL metrics.



At the dashboard level, work has focused on improving the available information and UX ergonomics. Users can now download a .csv of the full dataset behind a table chart directly from the dashboard.



Continuing the theme of making more things accessible directly from the dashboard, users can now view the SQL query behind any chart from the dashboard as well.



# Developer Experience

The API has received a new endpoint that lets developers pass DB-specific parameters instead of a full SQLAlchemy URI.

# Database Connectivity

We have improved support for Ascend.io's engine spec and fixed a long list of bugs.

Also in the works is a new database connection UI, which should make connecting to a database easier without having to put together a SQLAlchemy URI. It's behind a feature flag for now, but it can be turned on in config.py with `FORCE_DATABASE_CONNECTIONS_SSL = True`.

# PR Highlights

- [14682](https://github.com/apache/superset/pull/14682) add ascend engine spec (#14682) (@Daniel Wood)
- [14420](https://github.com/apache/superset/pull/14420) feat: API endpoint to validate databases using separate parameters (#14420) (@Beto Dealmeida)
- [14934](https://github.com/apache/superset/pull/14934) feat: Adding FORCE_SSL as feature flag in config.py (#14934) (@AAfghahi)
- [14480](https://github.com/apache/superset/pull/14480) feat(viz): add funnel chart (#14480) (@Ville Brofeldt)

## Breaking Changes and Full Changelog

- To see the complete changelog in this release, head to [CHANGELOG.MD](../../CHANGELOG.md).
- 1.3.0 does not contain any backwards-incompatible changes.
@@ -1,78 +0,0 @@
# Release Notes for Superset 1.4

Superset 1.4 focuses heavily on continuing to polish the core Superset experience. This release has a very long list of fixes from across the community.

- [**User Experience**](#user-facing-features)
- [**Database Experience**](#database-experience)
- [**Developer Experience**](#developer-experience)
- [**Breaking Changes and Full Changelog**](#breaking-changes-and-full-changelog)

## User Facing Features

- Charts and dashboards in Superset can now be certified! In addition, the Edit Dataset modal now more accurately reflects the state of certification (especially for calculated columns). ([#17335](https://github.com/apache/superset/pull/17335), [#16454](https://github.com/apache/superset/pull/16454))



- Parquet files can now be uploaded into an existing connected database that has Data Upload enabled. Eventually, the contributor hopes that this foundation can be used to accommodate `feather` and `orc` files as well. ([#14449](https://github.com/apache/superset/pull/14449))

- Tabs can now be added to Column elements in dashboards. ([#16593](https://github.com/apache/superset/pull/16593))



- The experience of using alerts and reports has improved in a few minor ways. ([#16335](https://github.com/apache/superset/pull/16335), [#16281](https://github.com/apache/superset/pull/16281))

- Drag and drop now has a clickable ghost button for an improved user experience. ([#16119](https://github.com/apache/superset/pull/16119))

## Database Experience

- Apache Drill: Superset can now connect to Apache Drill (through ODBC/JDBC) and impersonate the currently logged-in user. ([#17353](https://github.com/apache/superset/pull/17353/files))

- Firebolt: Superset now supports the cloud data warehouse Firebolt! ([#16903](https://github.com/apache/superset/pull/16903))

- Databricks: Superset now supports the new [SQL Endpoints in Databricks](https://docs.databricks.com/sql/admin/sql-endpoints.html). ([#16862](https://github.com/apache/superset/pull/16862))

- Apache Druid: Superset Explore can now take advantage of support for JOINs in Druid (note: the `DRUID_JOINS` feature flag needs to be enabled). ([#16770](https://github.com/apache/superset/pull/16770))

- AWS Aurora: Superset now has a separate db_engine_spec for Amazon Aurora. ([#16535](https://github.com/apache/superset/pull/16535))

- ClickHouse: Superset now includes function names in the SQL Lab auto-complete. ([#16234](https://github.com/apache/superset/pull/16234))

- Google Sheets: Better support for private Google Sheets was added. ([#16628](https://github.com/apache/superset/pull/16628))

## Developer Experience

- The `Makefile` for Superset has gone through a number of improvements. ([#16327](https://github.com/apache/superset/pull/16327), [#16533](https://github.com/apache/superset/pull/16533))

- Python instrumentation can be added to pages, showing the method calls used to build a page and how long each one took. This requires a configuration flag (see the PR for more info). ([#16136](https://github.com/apache/superset/pull/16136))



## Breaking Changes and Full Changelog

**Breaking Changes**

- [16660](https://github.com/apache/superset/pull/16660): The `columns` Jinja parameter has been renamed `table_columns` to make the `columns` query object parameter available in the Jinja context.
- [16711](https://github.com/apache/superset/pull/16711): The `url_param` Jinja function now escapes the result by default. For instance, the value `O'Brien` will now be changed to `O''Brien`. To disable this behavior, call `url_param` with `escape_result` set to `False`: `url_param("my_key", "my default", escape_result=False)`.
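The new default escaping doubles single quotes, the standard way of keeping a value safe inside a SQL string literal. A minimal sketch of the behavior described above (an illustration, not Superset's actual implementation):

```python
def escape_single_quotes(value: str) -> str:
    # Double every single quote so the substituted value cannot
    # terminate the surrounding SQL string literal.
    return value.replace("'", "''")

print(escape_single_quotes("O'Brien"))  # O''Brien
```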

**Changelog**

To see the complete changelog in this release, head to [CHANGELOG.MD](https://github.com/apache/superset/blob/master/CHANGELOG.md). As mentioned earlier, this release has a MASSIVE amount of bug fixes. The full changelog lists all of them!
@@ -1,142 +0,0 @@
# Release Notes for Superset 1.5

Superset 1.5 focuses on polishing the dashboard native filters experience while improving performance and stability. Superset 1.5 is likely the last minor release of version 1 of Superset and will be succeeded by Superset 2.0. The 1.5 branch introduces the notion of a Long Term Support (LTS) version of Superset and will receive security and other critical fixes even after Superset 2.x is released. Users therefore have the choice of staying on the 1.5 branch or upgrading to 2.x when available.

- [**User Experience**](#user-facing-features)
- [**Feature flags**](#feature-flags)
- [**Database Experience**](#database-experience)
- [**Developer Experience**](#developer-experience)
- [**Breaking Changes and Full Changelog**](#breaking-changes-and-full-changelog)

## User Facing Features

- Complex dashboards with lots of native filters and charts now render considerably faster. See the videos showing the rendering time of a complex dashboard drop from 11 to 3 seconds: [#19064](https://github.com/apache/superset/pull/19064). In addition, applying filters and switching tabs is much smoother.
- The Native Filter Bar has been redesigned, and the "Apply" and "Clear all" buttons have moved to the bottom:



- Native filters can now be made dependent on multiple other filters. This makes it possible to restrict the available values in a filter based on the selections in other filters.



- In addition to being able to write Custom SQL for adhoc metrics and filters, the column control now also features a Custom SQL tab. This makes it possible to write custom expressions directly in charts without adding them to the dataset as saved expressions.



- A new `SupersetMetastoreCache` has been added, which makes it possible to cache data in the Superset metastore without running a dedicated cache like Redis or Memcached. The new cache is used by default for required caches, but can also be used for caching chart and other data. See the [documentation](https://superset.apache.org/docs/installation/cache#caching) for details on using the new cache.
- Previously it was possible for dashboards with lots of filters to cause an error; a similar issue existed in Explore. Superset now stores Dashboard and Explore state in the cache (as opposed to the URL), eliminating the infamous [Long URL Problem](https://github.com/apache/superset/issues/17086).
- Previously, permanent links to Dashboard and Explore pages were in fact shortened URLs that relied on state being stored in the URL (see the Long URL Problem above). In addition, the links used numerical ids and didn't check user permissions, making it easy to iterate through links stored in the metastore. Permalink state is now stored as JSON objects in the metastore, making it possible to store arbitrarily large Dashboard and Explore state in permalinks. In addition, the ids are encoded using [`hashids`](https://hashids.org/) and permissions are checked, making permalink state more secure.



## Feature flags

- A new feature flag `GENERIC_CHART_AXES` has been added that makes it possible to use a non-temporal x-axis on the ECharts Timeseries chart ([#17917](https://github.com/apache/superset/pull/17917)). When enabled, a new "X Axis" control is added to the control panel of ECharts line, area, bar, step and scatter charts, making it possible to use categorical or numerical x-axes on those charts.



## Database Experience

- DuckDB: Add support for the database: [#19317](https://github.com/apache/superset/pull/19317)

- Kusto: Add support for Azure Data Explorer (Kusto): [#17898](https://github.com/apache/superset/pull/17898)

- Trino: Add server cert support and new auth methods: [#17593](https://github.com/apache/superset/pull/17593) and [#16346](https://github.com/apache/superset/pull/16346)

- Microsoft SQL Server (MSSQL): Support using CTEs in virtual tables: [#18567](https://github.com/apache/superset/pull/18567)

- Teradata and MSSQL: Add support for TOP limit syntax: [#18746](https://github.com/apache/superset/pull/18746) and [#18240](https://github.com/apache/superset/pull/18240)

- Apache Drill: User impersonation using `drill+sadrill`: [#19252](https://github.com/apache/superset/pull/19252)

## Developer Experience

- `superset-ui` has been integrated into the Superset codebase as per [SIP-58](https://github.com/apache/superset/issues/13013), dubbed "Monorepo". This makes development of the plugins that ship with Superset considerably simpler. In addition, it makes it possible to align `superset-ui` releases with official Superset releases.

## Breaking Changes and Full Changelog

**Breaking Changes**

- Bump `mysqlclient` from v1 to v2: [#17556](https://github.com/apache/superset/pull/17556)
- Single and double quotes are no longer removed from filter values: [#17881](https://github.com/apache/superset/pull/17881)
- Previously `QUERY_COST_FORMATTERS_BY_ENGINE`, `SQL_VALIDATORS_BY_ENGINE` and `SCHEDULED_QUERIES` were expected to be defined in the feature flag dictionary in the `config.py` file. These should now be defined as top-level config keys, with the feature flag dictionary reserved for boolean-only values: [#15254](https://github.com/apache/superset/pull/15254)
- All Superset CLI commands (`init`, `load_examples`, etc.) require setting the `FLASK_APP` environment variable (which is set by default when `.flaskenv` is loaded): [#17539](https://github.com/apache/superset/pull/17539)
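Concretely, a `superset_config.py` that previously nested those three settings under `FEATURE_FLAGS` would now look something like this (a sketch; the flag shown and the empty values are illustrative, not required):

```python
# superset_config.py (sketch)

# The feature flag dictionary is now reserved for boolean-only values.
FEATURE_FLAGS = {
    "DASHBOARD_NATIVE_FILTERS": True,  # illustrative flag
}

# These moved out of FEATURE_FLAGS and are now top-level settings.
SCHEDULED_QUERIES = {}
QUERY_COST_FORMATTERS_BY_ENGINE = {}
SQL_VALIDATORS_BY_ENGINE = {}
```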

**Changelog**

To see the complete changelog in this release, head to [CHANGELOG.MD](https://github.com/apache/superset/blob/1.5/CHANGELOG.md). As mentioned earlier, this release has a MASSIVE amount of bug fixes. The full changelog lists all of them!
@@ -1,19 +0,0 @@
cherrytree
jinja2
255
RELEASING/send_email.py
Executable file
@@ -0,0 +1,255 @@

#!/usr/bin/python3
#
# Licensed to the Apache Software Foundation (ASF) under one or more
# contributor license agreements.  See the NOTICE file distributed with
# this work for additional information regarding copyright ownership.
# The ASF licenses this file to You under the Apache License, Version 2.0
# (the "License"); you may not use this file except in compliance with
# the License.  You may obtain a copy of the License at
#
#    http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
import smtplib
import ssl
from typing import List

try:
    import jinja2
except ModuleNotFoundError:
    exit("Jinja2 is a required dependency for this script")
try:
    import click
except ModuleNotFoundError:
    exit("Click is a required dependency for this script")


SMTP_PORT = 587
SMTP_SERVER = "mail-relay.apache.org"
PROJECT_NAME = "Superset"
PROJECT_MODULE = "superset"
PROJECT_DESCRIPTION = "Apache Superset is a modern, enterprise-ready business intelligence web application"


def string_comma_to_list(message: str) -> List[str]:
    """Split a comma-separated string into a list of stripped elements."""
    if not message:
        return []
    return [element.strip() for element in message.split(",")]


def send_email(
    smtp_server: str,
    smtp_port: int,
    username: str,
    password: str,
    sender_email: str,
    receiver_email: str,
):
    """
    Send a simple text email (SMTP)
    """
    context = ssl.create_default_context()
    with smtplib.SMTP(smtp_server, smtp_port) as server:
        server.starttls(context=context)
        server.login(username, password)
        server.sendmail(sender_email, receiver_email, message)


def render_template(template_file: str, **kwargs) -> str:
    """
    Simple template rendering based on named parameters

    :param template_file: The template file location
    :param kwargs: Named parameters to use when rendering the template
    :return: Rendered template
    """
    with open(template_file) as fp:
        template = jinja2.Template(fp.read())
    return template.render(kwargs)


def inter_send_email(username, password, sender_email, receiver_email, message):
    """Print the message, ask for confirmation, then send it over SMTP."""
    print("--------------------------")
    print("SMTP Message")
    print("--------------------------")
    print(message)
    print("--------------------------")
    confirm = input("Is the Email message ok? (yes/no): ")
    if confirm.lower() not in ("yes", "y"):
        exit("Exit by user request")

    try:
        send_email(
            SMTP_SERVER,
            SMTP_PORT,
            username,
            password,
            sender_email,
            receiver_email,
            message,
        )
        print("Email sent successfully")
    except smtplib.SMTPAuthenticationError:
        exit("SMTP user authentication error, email not sent!")
    except Exception as e:
        exit(f"SMTP exception {e}")


class BaseParameters:
    def __init__(
        self, email=None, username=None, password=None, version=None, version_rc=None
    ):
        self.email = email
        self.username = username
        self.password = password
        self.version = version
        self.version_rc = version_rc
        self.template_arguments = {}

    def __repr__(self):
        return f"Apache Credentials: {self.email}/{self.username}/{self.version}/{self.version_rc}"


@click.group()
@click.pass_context
@click.option(
    "--apache_email",
    prompt="Apache Email",
    help="Your Apache email; this will be used as the SMTP From address",
)
@click.option(
    "--apache_username", prompt="Apache username", help="Your LDAP Apache username"
)
@click.option(
    "--apache_password",
    prompt="Apache password",
    hide_input=True,
    help="Your LDAP Apache password",
)
@click.option("--version", envvar="SUPERSET_VERSION")
@click.option("--version_rc", envvar="SUPERSET_VERSION_RC")
def cli(ctx, apache_email, apache_username, apache_password, version, version_rc):
    """Welcome to the releasing send email CLI interface!"""
    base_parameters = BaseParameters(
        apache_email, apache_username, apache_password, version, version_rc
    )
    base_parameters.template_arguments["project_name"] = PROJECT_NAME
    base_parameters.template_arguments["project_module"] = PROJECT_MODULE
    base_parameters.template_arguments["project_description"] = PROJECT_DESCRIPTION
    base_parameters.template_arguments["version"] = base_parameters.version
    base_parameters.template_arguments["version_rc"] = base_parameters.version_rc
    base_parameters.template_arguments["sender_email"] = base_parameters.email
    ctx.obj = base_parameters


@cli.command("vote_pmc")
@click.option(
    "--receiver_email",
    default="dev@superset.apache.org",
    type=str,
    prompt="The receiver email (To:)",
)
@click.pass_obj
def vote_pmc(base_parameters, receiver_email):
    template_file = "email_templates/vote_pmc.j2"
    base_parameters.template_arguments["receiver_email"] = receiver_email
    message = render_template(template_file, **base_parameters.template_arguments)
    inter_send_email(
        base_parameters.username,
        base_parameters.password,
        base_parameters.template_arguments["sender_email"],
        base_parameters.template_arguments["receiver_email"],
        message,
    )


@cli.command("result_pmc")
@click.option(
    "--receiver_email",
    default="dev@superset.apache.org",
    type=str,
    prompt="The receiver email (To:)",
)
@click.option(
    "--vote_bindings",
    default="",
    type=str,
    prompt="A list of people with +1 binding votes (ex: Max,Grace,Krist)",
)
@click.option(
    "--vote_nonbindings",
    default="",
    type=str,
    prompt="A list of people with +1 non-binding votes (ex: Ville)",
)
@click.option(
    "--vote_negatives",
    default="",
    type=str,
    prompt="A list of people with -1 votes (ex: John)",
)
@click.option(
    "--vote_thread",
    default="",
    type=str,
    prompt="Permalink to the vote thread "
    "(see https://lists.apache.org/list.html?dev@superset.apache.org)",
)
@click.pass_obj
def result_pmc(
    base_parameters,
    receiver_email,
    vote_bindings,
    vote_nonbindings,
    vote_negatives,
    vote_thread,
):
    template_file = "email_templates/result_pmc.j2"
    base_parameters.template_arguments["receiver_email"] = receiver_email
    base_parameters.template_arguments["vote_bindings"] = string_comma_to_list(
        vote_bindings
    )
    base_parameters.template_arguments["vote_nonbindings"] = string_comma_to_list(
        vote_nonbindings
    )
    base_parameters.template_arguments["vote_negatives"] = string_comma_to_list(
        vote_negatives
    )
    base_parameters.template_arguments["vote_thread"] = vote_thread
    message = render_template(template_file, **base_parameters.template_arguments)
    inter_send_email(
        base_parameters.username,
        base_parameters.password,
        base_parameters.template_arguments["sender_email"],
        base_parameters.template_arguments["receiver_email"],
        message,
    )
@cli.command("announce")
|
||||
@click.option(
|
||||
"--receiver_email",
|
||||
default="dev@superset.apache.org",
|
||||
type=str,
|
||||
prompt="The receiver email (To:)",
|
||||
)
|
||||
@click.pass_obj
|
||||
def announce(base_parameters, receiver_email):
|
||||
template_file = "email_templates/announce.j2"
|
||||
base_parameters.template_arguments["receiver_email"] = receiver_email
|
||||
message = render_template(template_file, **base_parameters.template_arguments)
|
||||
inter_send_email(
|
||||
base_parameters.username,
|
||||
base_parameters.password,
|
||||
base_parameters.template_arguments["sender_email"],
|
||||
base_parameters.template_arguments["receiver_email"],
|
||||
message,
|
||||
)
|
||||
|
||||
|
||||
cli()
|
||||
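The `string_comma_to_list` helper called by `result_pmc` above is defined elsewhere in the repo and not shown in this excerpt. A minimal sketch of what it needs to do — split the comma-separated prompt input (e.g. `Max,Grace,Krist`) into a list of names — might look like:

```python
def string_comma_to_list(message: str):
    """Split a comma-separated prompt value into a list of names.

    Hypothetical implementation for illustration; the real helper
    lives in the release tooling module.
    """
    if not message:
        return []
    return [item.strip() for item in message.split(",")]
```

With this, `string_comma_to_list("Max,Grace,Krist")` yields `["Max", "Grace", "Krist"]`, and an empty prompt yields an empty list so the Jinja templates can iterate safely.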
@@ -25,6 +25,7 @@ These features are considered **unfinished** and should only be used on developm

- CLIENT_CACHE
- DASHBOARD_CACHE
- DASHBOARD_NATIVE_FILTERS_SET
- DASHBOARD_RBAC
- DISABLE_DATASET_SOURCE_EDIT
- ENABLE_EXPLORE_JSON_CSRF_PROTECTION
- KV_STORE

@@ -32,7 +33,6 @@ These features are considered **unfinished** and should only be used on developm

- REMOVE_SLICE_LEVEL_LABEL_COLORS
- SHARE_QUERIES_VIA_KV_STORE
- TAGGING_SYSTEM
- ENABLE_TEMPLATE_REMOVE_FILTERS

## In Testing

These features are **finished** but currently being tested. They are usable, but may still contain some bugs.

@@ -41,17 +41,17 @@ These features are **finished** but currently being tested. They are usable, but

- DYNAMIC_PLUGINS: [(docs)](https://superset.apache.org/docs/installation/running-on-kubernetes)
- DASHBOARD_NATIVE_FILTERS
- GLOBAL_ASYNC_QUERIES [(docs)](https://github.com/apache/superset/blob/master/CONTRIBUTING.md#async-chart-queries)
- OMNIBAR
- VERSIONED_EXPORT
- ENABLE_JAVASCRIPT_CONTROLS

## Stable

These feature flags are **safe for production** and have been tested.

- DASHBOARD_CROSS_FILTERS
- DASHBOARD_RBAC [(docs)](https://superset.apache.org/docs/creating-charts-dashboards/first-dashboard#manage-access-to-dashboards)
- ESCAPE_MARKDOWN_HTML
- ENABLE_TEMPLATE_PROCESSING
- LISTVIEWS_DEFAULT_CARD_VIEW
- ROW_LEVEL_SECURITY
- SCHEDULED_QUERIES [(docs)](https://superset.apache.org/docs/installation/alerts-reports)
- SQL_VALIDATORS_BY_ENGINE [(docs)](https://superset.apache.org/docs/installation/sql-templating)
- SQLLAB_BACKEND_PERSISTENCE

@@ -62,3 +62,4 @@ These features flags currently default to True and **will be removed in a future

- ALLOW_DASHBOARD_DOMAIN_SHARDING
- DISPLAY_MARKDOWN_HTML
- ENABLE_REACT_CRUD_VIEWS
@@ -32,7 +32,6 @@ Join our growing community!

- [Hostnfly](https://www.hostnfly.com/) [@alexisrosuel]
- [Lime](https://www.limebike.com/) [@cxmcc]
- [Lyft](https://www.lyft.com/)
- [Ontruck](https://www.ontruck.com/)

### Financial Services

- [Aktia Bank plc](https://www.aktia.com) [@villebro]

@@ -61,7 +60,6 @@ Join our growing community!

- [Tails.com](https://tails.com) [@alanmcruickshank]
- [THE ICONIC](http://theiconic.com.au/) [@ksaagariconic]
- [Utair](https://www.utair.ru) [@utair-digital]
- [VkusVill](https://www.vkusvill.ru) [@ETselikov]
- [Zalando](https://www.zalando.com) [@dmigo]
- [Zalora](https://www.zalora.com) [@ksaagariconic]

@@ -80,11 +78,9 @@ Join our growing community!

- [FBK - ICT center](http://ict.fbk.eu)
- [GfK Data Lab](http://datalab.gfk.com) [@mherr]
- [GrowthSimple](https://growthsimple.ai/)
- [Hydrolix](https://www.hydrolix.io/)
- [Intercom](https://www.intercom.com/) [@kate-gallo]
- [jampp](https://jampp.com/)
- [Konfío](http://konfio.mx) [@uis-rodriguez]
- [mishmash io](https://mishmash.io/) [@mishmash-io]
- [Myra Labs](http://www.myralabs.com/) [@viksit]
- [Nielsen](http://www.nielsen.com/) [@amitNielsen]
- [Ona](https://ona.io) [@pld]

@@ -98,57 +94,48 @@ Join our growing community!

- [Showmax](https://tech.showmax.com) [@bobek]
- [source{d}](https://www.sourced.tech) [@marnovo]
- [Steamroot](https://streamroot.io/)
- [TechAudit](https://www.techaudit.info) [@ETselikov]
- [Tenable](https://www.tenable.com) [@dflionis]
- [timbr.ai](https://timbr.ai/) [@semantiDan]
- [Tobii](http://www.tobii.com/) [@dwa]
- [Tooploox](https://www.tooploox.com/) [@jakubczaplicki]
- [Unvired](https://unvired.com) [@srinisubramanian]
- [Whale](http://whale.im)
- [Windsor.ai](https://www.windsor.ai/) [@octaviancorlade]
- [Zeta](https://www.zeta.tech/) [@shaikidris]

### Entertainment

- [6play](https://www.6play.fr) [@CoryChaplin]
- [bilibili](https://www.bilibili.com) [@Moinheart]
- [Douban](https://www.douban.com/) [@luchuan]
- [Kuaishou](https://www.kuaishou.com/) [@zhaoyu89730105]
- [Netflix](https://www.netflix.com/)
- [TME QQMUSIC/WESING](https://www.tencentmusic.com/) [@shenyuanli, @marklaw]
- [Xite](https://xite.com/) [@shashankkoppar]
- [Zaihang](http://www.zaih.com/)

### Education

- [Brilliant.org](https://brilliant.org/)
- [Sunbird](https://www.sunbird.org/) [@eksteporg]
- [The GRAPH Network](https://thegraphnetwork.org/) [@fccoelho]
- [Udemy](https://www.udemy.com/) [@sungjuly]
- [VIPKID](https://www.vipkid.com.cn/) [@illpanda]

### Energy

- [Airboxlab](https://foobot.io) [@antoine-galataud]
- [DouroECI](https://www.douroeci.com/) [@nunohelibeires]
- [Safaricom](https://www.safaricom.co.ke/) [@mmutiso]
- [Scoot](https://scoot.co/) [@haaspt]

### Healthcare

- [Amino](https://amino.com) [@shkr]
- [Beans](https://www.beans.fi) [@kakoni]
- [Care](https://www.getcare.io/) [@alandao2021]
- [Living Goods](https://www.livinggoods.org) [@chelule]
- [Maieutical Labs](https://maieuticallabs.it) [@xrmx]
- [QPID Health](http://www.qpidhealth.com/)
- [TrustMedis](https://trustmedis.com) [@famasya]
- [WeSure](https://www.wesure.cn/)

### HR / Staffing

- [Symmetrics](https://www.symmetrics.fyi)

### Others

- [Dropbox](https://www.dropbox.com/) [@bkyryliuk]
- [Grassroot](https://www.grassrootinstitute.org/)
- [komoot](https://www.komoot.com/) [@christophlingg]
- [Let's Roam](https://www.letsroam.com/)
- [Twitter](https://twitter.com/)
- [VLMedia](https://www.vlmedia.com.tr/) [@ibotheperfect]
- [Yahoo!](https://yahoo.com/)
UPDATING.md

@@ -22,127 +22,27 @@ under the License.

This file documents any backwards-incompatible changes in Superset and
assists people when migrating to a new version.

## 2.0.1

- [21895](https://github.com/apache/superset/pull/21895): Markdown components had their security increased by adhering to the same sanitization process enforced by GitHub. This means that some HTML elements found in markdowns are no longer allowed due to the security risks they pose. If you're deploying Superset in a trusted environment and wish to use some of the blocked elements, you can use the HTML_SANITIZATION_SCHEMA_EXTENSIONS configuration to extend the default sanitization schema. There's also the option to disable HTML sanitization using the HTML_SANITIZATION configuration, but we do not recommend this approach because of the security risks. Given the provided configurations, we don't view the improved sanitization as a breaking change but as a security patch.

### Breaking Changes

### Potential Downtime

### Other

## 2.0.0

- [19046](https://github.com/apache/superset/pull/19046): Enables the drag and drop interface in the Explore control panel by default. Flips the `ENABLE_EXPLORE_DRAG_AND_DROP` and `ENABLE_DND_WITH_CLICK_UX` feature flags to `True`.
- [18936](https://github.com/apache/superset/pull/18936): Removes legacy SIP-15 interim logic/flags—specifically the `SIP_15_ENABLED`, `SIP_15_GRACE_PERIOD_END`, `SIP_15_DEFAULT_TIME_RANGE_ENDPOINTS`, and `SIP_15_TOAST_MESSAGE` flags. Time range endpoints are no longer configurable and strictly adhere to the `[start, end)` paradigm, i.e., inclusive of the start and exclusive of the end. Additionally, this change removes the now obsolete `time_range_endpoints` from the form data, which busts the cache.
- [19570](https://github.com/apache/superset/pull/19570): Makes [sqloxide](https://pypi.org/project/sqloxide/) optional so the SIP-68 migration can be run on aarch64. If the migration is taking too long, installing sqloxide manually should improve performance.
- [20170](https://github.com/apache/superset/pull/20170): Introduced a new endpoint for getting dataset samples.

### Breaking Changes

- [19981](https://github.com/apache/superset/pull/19981): Per [SIP-81](https://github.com/apache/superset/issues/19953), the /explore/form_data API now requires a `datasource_type` in addition to a `datasource_id` for POST and PUT requests.
- [19770](https://github.com/apache/superset/pull/19770): Per [SIP-11](https://github.com/apache/superset/issues/6032) and [SIP-68](https://github.com/apache/superset/issues/14909), the native NoSQL Druid connector is deprecated and has been removed. Druid is still supported through SQLAlchemy via pydruid. The config keys `DRUID_IS_ACTIVE` and `DRUID_METADATA_LINKS_ENABLED` have also been removed.
- [19274](https://github.com/apache/superset/pull/19274): The `PUBLIC_ROLE_LIKE_GAMMA` config key has been removed; set `PUBLIC_ROLE_LIKE = "Gamma"` to have the same functionality.
- [19273](https://github.com/apache/superset/pull/19273): The `SUPERSET_CELERY_WORKERS` and `SUPERSET_WORKERS` config keys have been removed. Configure Celery directly using `CELERY_CONFIG` on Superset.
- [19231](https://github.com/apache/superset/pull/19231): The `ENABLE_REACT_CRUD_VIEWS` feature flag has been removed (permanently enabled). Any deployments which had set this flag to false will need to verify that the React views support their use case.
- [19230](https://github.com/apache/superset/pull/19230): The `ROW_LEVEL_SECURITY` feature flag has been removed (permanently enabled). Any deployments which had set this flag to false will need to verify that the presence of the Row Level Security feature does not interfere with their use case.
- [19168](https://github.com/apache/superset/pull/19168): The Celery upgrade to 5.X resulted in breaking changes to its command line invocation. Please follow [these](https://docs.celeryq.dev/en/stable/whatsnew-5.2.html#step-1-adjust-your-command-line-invocation) instructions for adjustments. Also consider migrating your Celery config per [here](https://docs.celeryq.dev/en/stable/userguide/configuration.html#conf-old-settings-map).
- [19142](https://github.com/apache/superset/pull/19142): The `VERSIONED_EXPORT` config key is now `True` by default.
- [19113](https://github.com/apache/superset/pull/19113): The `ENABLE_JAVASCRIPT_CONTROLS` config key has moved from an app config to a feature flag. Any deployments that overrode this setting will now need to override the feature flag from here onward.
- [19107](https://github.com/apache/superset/pull/19107): The `SQLLAB_BACKEND_PERSISTENCE` feature flag is now `True` by default, which enables persisting SQL Lab tabs in the backend instead of the browser's `localStorage`.
- [19083](https://github.com/apache/superset/pull/19083): Updates the mutator function in the config file to take a SQL argument and a list of kwargs. Any `SQL_QUERY_MUTATOR` config function overrides will need to be updated to match the new set of params. Regardless of the dictionary args that you list in your function arguments, it is advised to keep `**kwargs` as the last argument to allow for any new kwargs to be passed in.
- [19049](https://github.com/apache/superset/pull/19049): The `APP_ICON_WIDTH` config key has been removed. Superset should now be able to handle different logo sizes without having to explicitly set an `APP_ICON_WIDTH`. This might affect the size of existing custom logos, as the UI will now resize them according to the specified space of maximum 148px and not according to the value of `APP_ICON_WIDTH`.
- [19017](https://github.com/apache/superset/pull/19017): Removes Python 3.7 support.
- [18970](https://github.com/apache/superset/pull/18970): The `DISABLE_LEGACY_DATASOURCE_EDITOR` feature flag is now `True` by default, which disables the legacy datasource editor from being shown in the client.
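For the `SQL_QUERY_MUTATOR` change in [19083] above, a hedged sketch of a new-style override in `superset_config.py` — the audit comment is an arbitrary example, and the exact kwargs Superset passes may vary by version, which is why `**kwargs` should stay last:

```python
# superset_config.py (sketch, assuming the post-19083 signature)
def SQL_QUERY_MUTATOR(sql, **kwargs):
    # Prepend an audit tag to every statement. Accept all keyword
    # arguments so newly added kwargs don't break this override.
    return f"-- superset\n{sql}"
```

Any override written against the old positional signature (e.g. `(sql, username, security_manager, database)`) needs updating to this form.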
## 1.5.0

### Breaking Changes

- [18976](https://github.com/apache/superset/pull/18976): When running the app in debug mode, the app will default to using `SimpleCache` for `FILTER_STATE_CACHE_CONFIG` and `EXPLORE_FORM_DATA_CACHE_CONFIG`. When running in non-debug mode, a cache backend will need to be defined, otherwise the application will fail to start. For installations using Redis or other caching backends, it is recommended to use the same backend for both cache configs.
- [17881](https://github.com/apache/superset/pull/17881): Previously, simple adhoc filter values on string columns were stripped of enclosing single and double quotes. To fully support literal quotes in filters, both single and double quotes will no longer be removed from filter values.
- [17556](https://github.com/apache/superset/pull/17556): Bumps `mysqlclient` from v1 to v2.
- [17539](https://github.com/apache/superset/pull/17539): All Superset CLI commands, e.g. `init`, `load_examples`, etc., require setting the `FLASK_APP` environment variable (which is set by default when `.flaskenv` is loaded).
- [15254](https://github.com/apache/superset/pull/15254): The `QUERY_COST_FORMATTERS_BY_ENGINE`, `SQL_VALIDATORS_BY_ENGINE` and `SCHEDULED_QUERIES` feature flags are now defined as config keys, given that feature flags are reserved for boolean-only values.

### Potential Downtime

- [16756](https://github.com/apache/incubator-superset/pull/16756): A change which renames the `dbs.allow_csv_upload` column to `dbs.allow_file_upload` via a (potentially locking) DDL operation.
- [17539](https://github.com/apache/superset/pull/17539): All Superset CLI commands (`init`, `load_examples`, etc.) require setting the FLASK_APP environment variable (which is set by default when `.flaskenv` is loaded).
- [17360](https://github.com/apache/superset/pull/17360): Changes the column type from `VARCHAR(32)` to `TEXT` in table `table_columns`, potentially requiring a table lock on MySQL dbs or taking some time to complete on large deployments.
- [17543](https://github.com/apache/superset/pull/17543): Introduces new models from SIP-68. The database migration migrates the old models (`SqlaTable`, `TableColumn`, `SqlMetric`) to the new models (`Column`, `Table`, `Dataset`), and the PR introduces logic to keep the old models in sync with the new ones until they are fully removed. The migration might take considerable time depending on the number of datasets.

### Deprecations

- [18960](https://github.com/apache/superset/pull/18960): Persisting URL params in chart metadata is no longer supported. To set a default value for URL params in Jinja code, use the optional second argument: `url_param("my-param", "my-default-value")`.

### Other

- [17589](https://github.com/apache/superset/pull/17589): It is now possible to limit access to users' recent activity data by setting the `ENABLE_BROAD_ACTIVITY_ACCESS` config flag to false, or customizing the `raise_for_user_activity_access` method in the security manager.
- [17536](https://github.com/apache/superset/pull/17536): Introduced a key-value endpoint to store dashboard filter state. This endpoint is backed by Flask-Caching and the default configuration assumes that the values will be stored in the file system. If you are already using another cache backend like Redis or Memcached, you'll probably want to change this setting in `superset_config.py`. The key is `FILTER_STATE_CACHE_CONFIG` and the available settings can be found in the Flask-Caching [docs](https://flask-caching.readthedocs.io/en/latest/).
- [17882](https://github.com/apache/superset/pull/17882): Introduced a key-value endpoint to store Explore form data. This endpoint is backed by Flask-Caching and the default configuration assumes that the values will be stored in the file system. If you are already using another cache backend like Redis or Memcached, you'll probably want to change this setting in `superset_config.py`. The key is `EXPLORE_FORM_DATA_CACHE_CONFIG` and the available settings can be found in the Flask-Caching [docs](https://flask-caching.readthedocs.io/en/latest/).
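To illustrate the `FILTER_STATE_CACHE_CONFIG` / `EXPLORE_FORM_DATA_CACHE_CONFIG` entries above, here is a sketch of Redis-backed settings for `superset_config.py` — the Redis URL, timeout, and key prefixes are placeholder assumptions; see the Flask-Caching docs for the full option list:

```python
# superset_config.py (sketch; values are placeholders)
FILTER_STATE_CACHE_CONFIG = {
    "CACHE_TYPE": "RedisCache",
    "CACHE_DEFAULT_TIMEOUT": 86400,  # 24 hours
    "CACHE_KEY_PREFIX": "superset_filter_state",
    "CACHE_REDIS_URL": "redis://localhost:6379/0",
}

# Reuse the same backend for Explore form data, with a distinct prefix.
EXPLORE_FORM_DATA_CACHE_CONFIG = {
    **FILTER_STATE_CACHE_CONFIG,
    "CACHE_KEY_PREFIX": "superset_explore_form_data",
}
```

Pointing both configs at the same backend, as recommended in the 1.5.0 breaking changes above, keeps operational overhead down.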
## 1.4.1

### Breaking Changes

- [17984](https://github.com/apache/superset/pull/17984): The default Flask SECRET_KEY has changed for security reasons. You should always override it with your own secret. Set `PREVIOUS_SECRET_KEY` (ex: PREVIOUS_SECRET_KEY = "\2\1thisismyscretkey\1\2\\e\\y\\y\\h") to your previous key and use `superset re-encrypt-secrets` to rotate your current secrets.

### Potential Downtime

### Deprecations

### Other

## 1.4.0

### Breaking Changes

- [16660](https://github.com/apache/superset/pull/16660): The `columns` Jinja parameter has been renamed to `table_columns` to make the `columns` query object parameter available in the Jinja context.
- [16711](https://github.com/apache/superset/pull/16711): The `url_param` Jinja function will now by default escape the result. For instance, the value `O'Brien` will now be changed to `O''Brien`. To disable this behavior, call `url_param` with `escape_result` set to `False`: `url_param("my_key", "my default", escape_result=False)`.

### Potential Downtime

### Deprecations

### Other

- [16809](https://github.com/apache/superset/pull/16809): When building the Superset frontend assets manually, you should now use Node 16 (previously Node 14 was required/recommended). Node 14 will most likely still work for at least some time, but is no longer actively tested on CI.
## 1.3.0

### Breaking Changes

- [15909](https://github.com/apache/superset/pull/15909): A change which drops a uniqueness criterion (which may or may not have existed) from the tables table. This constraint was obsolete as it is handled by the ORM due to differences in how MySQL, PostgreSQL, etc. handle uniqueness for NULL values.

### Potential Downtime

- [14234](https://github.com/apache/superset/pull/14234): Adds the `limiting_factor` column to the `query` table. Given the migration includes a DDL operation on a heavily trafficked table, potential service downtime may be required.
- [16454](https://github.com/apache/superset/pull/16454): Adds the `extra` column to the `table_columns` table. Users using MySQL will either need to schedule downtime or use the percona toolkit (or similar) to perform the migration.

## 1.2.0

### Deprecations

- [13440](https://github.com/apache/superset/pull/13440): Dashboard/Charts reports and old Alerts are deprecated. The following config keys are deprecated:
  - ENABLE_ALERTS
  - SCHEDULED_EMAIL_DEBUG_MODE
  - EMAIL_REPORTS_CRON_RESOLUTION
  - EMAIL_ASYNC_TIME_LIMIT_SEC
  - EMAIL_REPORT_BCC_ADDRESS
  - EMAIL_REPORTS_USER

### Other

## Next

- [13772](https://github.com/apache/superset/pull/13772): Row level security (RLS) is now enabled by default. To activate the feature, please run `superset init` to expose the RLS menus to Admin users.

- [13980](https://github.com/apache/superset/pull/13980): Data health checks no longer use the metadata database as an interim cache. Though non-breaking, deployments which implement complex logic should likely memoize the callback function. Refer to the documentation in the config.py file for more detail.

- [14255](https://github.com/apache/superset/pull/14255): The default `CSV_TO_HIVE_UPLOAD_DIRECTORY_FUNC` callable logic has been updated to leverage the specified database and schema to ensure the upload S3 key prefix is unique. Previously, tables generated via upload from CSV with the same name but a different schema and/or cluster would use the same S3 key prefix. Note this change does not impact previously imported tables.

### Breaking Changes

### Potential Downtime

- [14234](https://github.com/apache/superset/pull/14234): Adds the `limiting_factor` column to the `query` table. Given the migration includes a DDL operation on a heavily trafficked table, potential service downtime may be required.

### Deprecations

- [13440](https://github.com/apache/superset/pull/13440): Dashboard/Charts reports and old Alerts are deprecated. The following config keys are deprecated:
  - ENABLE_ALERTS
  - SCHEDULED_EMAIL_DEBUG_MODE
  - EMAIL_REPORTS_CRON_RESOLUTION
  - EMAIL_ASYNC_TIME_LIMIT_SEC
  - EMAIL_REPORT_BCC_ADDRESS
  - EMAIL_REPORTS_USER

### Other

## 1.1.0

### Breaking Changes

@@ -170,21 +70,19 @@ assists people when migrating to a new version.

## 1.0.0

### Breaking Changes

- [11509](https://github.com/apache/superset/pull/12491): Dataset metadata updates check user ownership; only owners or an Admin are allowed.
- Security simplification (SIP-19); the following permission domains were simplified:
  - [12072](https://github.com/apache/superset/pull/12072): `Query` with `can_read`, `can_write`.
  - [12036](https://github.com/apache/superset/pull/12036): `Database` with `can_read`, `can_write`.
  - [12012](https://github.com/apache/superset/pull/12036): `Dashboard` with `can_read`, `can_write`.
  - [12061](https://github.com/apache/superset/pull/12061): `Log` with `can_read`, `can_write`.
  - [12000](https://github.com/apache/superset/pull/12000): `Dataset` with `can_read`, `can_write`.
  - [12014](https://github.com/apache/superset/pull/12014): `Annotation` with `can_read`, `can_write`.
  - [11981](https://github.com/apache/superset/pull/11981): `Chart` with `can_read`, `can_write`.
  - [11853](https://github.com/apache/superset/pull/11853): `ReportSchedule` with `can_read`, `can_write`.
  - [11856](https://github.com/apache/superset/pull/11856): `CssTemplate` with `can_read`, `can_write`.
  - [11764](https://github.com/apache/superset/pull/11764): `SavedQuery` with `can_read`, `can_write`.

  Old permissions will be automatically migrated to these new permissions and applied to all existing security Roles.

- [11499](https://github.com/apache/superset/pull/11499): Breaking change: `STORE_CACHE_KEYS_IN_METADATA_DB` config flag added (default=`False`) to write `CacheKey` records to the metadata DB. `CacheKey` recording was enabled by default previously.

@@ -194,49 +92,47 @@ assists people when migrating to a new version.

- [11575](https://github.com/apache/superset/pull/11575): The Row Level Security (RLS) config flag has been moved to a feature flag. To migrate, add `ROW_LEVEL_SECURITY: True` to the `FEATURE_FLAGS` dict in `superset_config.py`.

- [11259](https://github.com/apache/superset/pull/11259): Config flag ENABLE_REACT_CRUD_VIEWS has been set to `True` by default; set it to `False` if you prefer the vintage look and feel. However, we may discontinue support for the vintage list view in the future.

- [11244](https://github.com/apache/superset/pull/11244): The `REDUCE_DASHBOARD_BOOTSTRAP_PAYLOAD` feature flag has been removed after being set to True for multiple months.

- [11172](https://github.com/apache/superset/pull/11172): Turning off language selectors by default as i18n is incomplete in most languages and requires more work. You can easily turn on the languages you want to expose in your environment in superset_config.py.

- [11172](https://github.com/apache/superset/pull/11172): Breaking change: SQL templating is turned off by default. To turn it on, set `ENABLE_TEMPLATE_PROCESSING` to `True` in `FEATURE_FLAGS`.
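The two flag migrations above ([11575] and [11172]) both land in the `FEATURE_FLAGS` dict; a minimal `superset_config.py` fragment, using the flag names stated in this changelog:

```python
# superset_config.py (sketch; enable only the flags you need)
FEATURE_FLAGS = {
    "ROW_LEVEL_SECURITY": True,          # moved from a config flag (#11575)
    "ENABLE_TEMPLATE_PROCESSING": True,  # SQL templating, off by default (#11172)
}
```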
### Potential Downtime
|
||||
|
||||
- [11920](https://github.com/apache/superset/pull/11920): Undoes the DB migration from [11714](https://github.com/apache/superset/pull/11714) to prevent adding new columns to the logs table. Deploying a sha between these two PRs may result in locking your DB.
|
||||
- [11920](https://github.com/apache/superset/pull/11920): Undos the DB migration from [11714](https://github.com/apache/superset/pull/11714) to prevent adding new columns to the logs table. Deploying a sha between these two PRs may result in locking your DB.
|
||||
|
||||
- [11714](https://github.com/apache/superset/pull/11714): Logs
|
||||
significantly more analytics events (roughly double?), and when
|
||||
using DBEventLogger (default) could result in stressing the metadata
|
||||
database more.
|
||||
significantly more analytics events (roughly double?), and when
|
||||
using DBEventLogger (default) could result in stressing the metadata
|
||||
database more.
|
||||
|
||||
- [11098](https://github.com/apache/superset/pull/11098): includes a database migration that adds a `uuid` column to most models, and updates `Dashboard.position_json` to include chart UUIDs. Depending on number of objects, the migration may take up to 5 minutes, requiring planning for downtime.
|
||||
|
||||
### Deprecations
|
||||
|
||||
- [11155](https://github.com/apache/superset/pull/11155): The `FAB_UPDATE_PERMS` config parameter is no longer required as the Superset application correctly informs FAB under which context permissions should be updated.
|
||||
|
||||
## 0.38.0

- [10887](https://github.com/apache/superset/pull/10887): Breaking change: the custom cache backend changed in order to support the Flask-Caching factory method approach and thus must be registered as a custom type. See [here](https://flask-caching.readthedocs.io/en/latest/#custom-cache-backends) for specifics.

- [10674](https://github.com/apache/superset/pull/10674): Breaking change: PUBLIC_ROLE_LIKE_GAMMA was removed in favor of the new PUBLIC_ROLE_LIKE, so it can be set to whatever role you want.

- [10590](https://github.com/apache/superset/pull/10590): Breaking change: this PR converts iframe charts into dashboard markdown components, and removes all `iframe`, `separator`, and `markup` slices (and support for them) from Superset. If you have important data in those slices, please back it up manually.

- [10562](https://github.com/apache/superset/pull/10562): EMAIL_REPORTS_WEBDRIVER is deprecated; use WEBDRIVER_TYPE instead.

- [10567](https://github.com/apache/superset/pull/10567): The default WEBDRIVER_OPTION_ARGS are Chrome-specific. If you're using Firefox, it should be `--headless` only.

- [10241](https://github.com/apache/superset/pull/10241): a change to the Alpha role: users now have access to "Annotation Layers", "Css Templates" and "Import Dashboards".

- [10324](https://github.com/apache/superset/pull/10324): Facebook Prophet has been introduced as an optional dependency to add support for timeseries forecasting in the chart data API. To enable this feature, install Superset with the optional dependency `prophet` or directly `pip install fbprophet`.

- [10320](https://github.com/apache/superset/pull/10320): References to blacklist/whitelist language have been replaced with more appropriate alternatives. All configs containing `WHITE`/`BLACK` have been replaced with `ALLOW`/`DENY`. Affected config variables that need to be updated: `TIME_GRAIN_BLACKLIST`, `VIZ_TYPE_BLACKLIST`, `DRUID_DATA_SOURCE_BLACKLIST`.
## 0.37.1

@@ -250,7 +146,7 @@ assists people when migrating to a new version.

- [10222](https://github.com/apache/superset/pull/10222): a change to how payloads are cached. Previously cached objects cannot be decoded and thus will be reloaded from source.

- [10130](https://github.com/apache/superset/pull/10130): a change which deprecates the `dbs.perm` column in favor of SQLAlchemy [hybrid attributes](https://docs.sqlalchemy.org/en/13/orm/extensions/hybrid.html).

- [10034](https://github.com/apache/superset/pull/10034): a change which deprecates the public security manager `assert_datasource_permission`, `assert_query_context_permission`, `assert_viz_permission`, and `rejected_tables` methods in favor of the `raise_for_access` method, which also handles assertion logic for SQL tables.
@@ -304,7 +200,7 @@ assists people when migrating to a new version.

- [8699](https://github.com/apache/superset/pull/8699): A `row_level_security_filters` table has been added, which is many-to-many with `tables` and `ab_roles`. The applicable filters are added to the sqla query, and the RLS ids are added to the query cache keys. If RLS is enabled in config.py (`ENABLE_ROW_LEVEL_SECURITY = True`; by default, it is disabled), they can be accessed through the `Security` menu, or when editing a table.

- [8732](https://github.com/apache/superset/pull/8732): The Swagger user interface is now enabled by default. A new permission `show on SwaggerView` is created by `superset init` and given to the `Admin` Role. To disable the UI,
@@ -343,7 +239,7 @@ assists people when migrating to a new version.

- We're deprecating the concept of "restricted metric"; this feature was not fully working anyhow.
- [8117](https://github.com/apache/superset/pull/8117): If you are using `ENABLE_PROXY_FIX = True`, review the newly-introduced variable, `PROXY_FIX_CONFIG`, which changes the proxy behavior in accordance with [Werkzeug](https://werkzeug.palletsprojects.com/en/0.15.x/middleware/proxy_fix/)
@@ -483,7 +379,7 @@ Superset 0.25.0 contains a backwards incompatible changes.

If you run a production system you should schedule downtime for this upgrade.

The PRs below have more information around the breaking changes:

- [9825](https://github.com/apache/superset/pull/9825): Support for Excel sheet upload added. To enable support, install Superset with the optional dependency `excel`
@@ -14,7 +14,7 @@
# See the License for the specific language governing permissions and
# limitations under the License.
#
x-superset-image: &superset-image apache/superset:${TAG:-latest-dev}
x-superset-image: &superset-image apache/superset:latest-dev
x-superset-depends-on: &superset-depends-on
  - db
  - redis
@@ -33,7 +33,7 @@ services:
      - redis:/data

  db:
    env_file: docker/.env-non-dev
    env_file: docker/.env
    image: postgres:10
    container_name: superset_db
    restart: unless-stopped
@@ -14,8 +14,7 @@
# See the License for the specific language governing permissions and
# limitations under the License.
#
x-superset-image: &superset-image apache/superset:2.0.0
x-superset-user: &superset-user root
x-superset-image: &superset-image apache/superset:latest-dev
x-superset-depends-on: &superset-depends-on
  - db
  - redis
@@ -40,7 +39,7 @@ services:

  db:
    env_file: docker/.env
    image: postgres:14
    image: postgres:10
    container_name: superset_db
    restart: unless-stopped
    ports:
@@ -56,7 +55,7 @@ services:
    restart: unless-stopped
    ports:
      - 8088:8088
    user: *superset-user
    user: "root"
    depends_on: *superset-depends-on
    volumes: *superset-volumes
    environment:
@@ -96,13 +95,13 @@ services:
    command: ["/app/docker/docker-init.sh"]
    env_file: docker/.env
    depends_on: *superset-depends-on
    user: *superset-user
    user: "root"
    volumes: *superset-volumes
    environment:
      CYPRESS_CONFIG: "${CYPRESS_CONFIG}"

  superset-node:
    image: node:16
    image: node:14
    container_name: superset_node
    command: ["/app/docker/docker-frontend.sh"]
    env_file: docker/.env
@@ -116,11 +115,8 @@ services:
    env_file: docker/.env
    restart: unless-stopped
    depends_on: *superset-depends-on
    user: *superset-user
    user: "root"
    volumes: *superset-volumes
    # Bump memory limit if processing selenium / thumbnails on superset-worker
    # mem_limit: 2038m
    # mem_reservation: 128M

  superset-worker-beat:
    image: *superset-image
@@ -129,7 +125,7 @@ services:
    env_file: docker/.env
    restart: unless-stopped
    depends_on: *superset-depends-on
    user: *superset-user
    user: "root"
    volumes: *superset-volumes

  superset-tests-worker:
@@ -145,7 +141,7 @@ services:
      REDIS_HOST: localhost
    network_mode: host
    depends_on: *superset-depends-on
    user: *superset-user
    user: "root"
    volumes: *superset-volumes

volumes:
@@ -24,52 +24,58 @@ Docker is an easy way to get started with Superset.

## Prerequisites

1. Docker! [link](https://www.docker.com/get-started)
2. Docker-compose [link](https://docs.docker.com/compose/install/)

## Configuration

The `/app/pythonpath` folder is mounted from [`./docker/pythonpath_dev`](./pythonpath_dev)
which contains a base configuration [`./docker/pythonpath_dev/superset_config.py`](./pythonpath_dev/superset_config.py)
intended for use with local development.

### Local overrides

In order to override configuration settings locally, simply make a copy of [`./docker/pythonpath_dev/superset_config_local.example`](./pythonpath_dev/superset_config_local.example)
into `./docker/pythonpath_dev/superset_config_docker.py` (git ignored) and fill in your overrides.

### Local packages

If you want to add Python packages in order to test things like databases locally, you can simply add a local requirements.txt (`./docker/requirements-local.txt`)
and rebuild your Docker stack.

Steps:

1. Create `./docker/requirements-local.txt`
2. Add your new packages
3. Rebuild docker-compose
   1. `docker-compose down -v`
   2. `docker-compose up`
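The steps above can be sketched as follows; the two package names are purely illustrative examples, not packages Superset requires:

```shell
# Sketch of steps 1-2: create the local requirements file and list the
# extra driver packages you want inside the containers (names are examples).
mkdir -p ./docker
printf '%s\n' 'clickhouse-sqlalchemy' 'elasticsearch-dbapi' > ./docker/requirements-local.txt
cat ./docker/requirements-local.txt
```

Then rebuild the stack with `docker-compose down -v` followed by `docker-compose up`.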
## Initializing Database

The database will initialize itself upon startup via the init container ([`superset-init`](./docker-init.sh)). This may take a minute.

## Normal Operation

To run the container, simply run: `docker-compose up`

After waiting several minutes for Superset initialization to finish, you can open a browser and view [`http://localhost:8088`](http://localhost:8088)
to start your journey.

## Developing

While running, the container server will reload on modification of the Superset Python and JavaScript source code.
Don't forget to reload the page to take the new frontend into account though.

## Production

It is possible to run Superset in non-development mode by using [`docker-compose-non-dev.yml`](../docker-compose-non-dev.yml). This file excludes the volumes needed for development and uses [`./docker/.env-non-dev`](./.env-non-dev) which sets the variable `SUPERSET_ENV` to `production`.

## Resource Constraints

If you are attempting to build on macOS and it exits with 137 you need to increase your Docker resources. See instructions [here](https://docs.docker.com/docker-for-mac/#advanced) (search for memory)
@@ -21,8 +21,9 @@ set -eo pipefail

REQUIREMENTS_LOCAL="/app/docker/requirements-local.txt"
# If Cypress run – overwrite the password for admin and export env variables
if [ "$CYPRESS_CONFIG" == "true" ]; then
  export SUPERSET_CONFIG=tests.integration_tests.superset_test_config
  export SUPERSET_CONFIG=tests.superset_test_config
  export SUPERSET_TESTENV=true
  export ENABLE_REACT_CRUD_VIEWS=true
  export SUPERSET__SQLALCHEMY_DATABASE_URI=postgresql+psycopg2://superset:superset@db:5432/superset
fi
#
@@ -37,14 +38,14 @@ fi

if [[ "${1}" == "worker" ]]; then
  echo "Starting Celery worker..."
  celery --app=superset.tasks.celery_app:app worker -Ofair -l INFO
  celery worker --app=superset.tasks.celery_app:app -Ofair -l INFO
elif [[ "${1}" == "beat" ]]; then
  echo "Starting Celery beat..."
  celery --app=superset.tasks.celery_app:app beat --pidfile /tmp/celerybeat.pid -l INFO -s "${SUPERSET_HOME}"/celerybeat-schedule
  celery beat --app=superset.tasks.celery_app:app --pidfile /tmp/celerybeat.pid -l INFO
elif [[ "${1}" == "app" ]]; then
  echo "Starting web app..."
  flask run -p 8088 --with-threads --reload --debugger --host=0.0.0.0
elif [[ "${1}" == "app-gunicorn" ]]; then
  echo "Starting web app..."
  /usr/bin/run-server.sh
  /app/docker/docker-entrypoint.sh
fi
@@ -20,7 +20,16 @@
# TODO: copy config overrides from ENV vars

# TODO: run celery in detached state
export SERVER_THREADS_AMOUNT=8
# start up the web server

/usr/bin/run-server.sh
# start up the web server
gunicorn \
    --bind "0.0.0.0:${SUPERSET_PORT}" \
    --access-logfile '-' \
    --error-logfile '-' \
    --workers 1 \
    --worker-class gthread \
    --threads 8 \
    --timeout 60 \
    --limit-request-line 0 \
    --limit-request-field_size 0 \
    "${FLASK_APP}"
docker/docker-entrypoint.sh (new executable file, 34 lines)
@@ -0,0 +1,34 @@
#!/bin/bash
#
# Licensed to the Apache Software Foundation (ASF) under one or more
# contributor license agreements. See the NOTICE file distributed with
# this work for additional information regarding copyright ownership.
# The ASF licenses this file to You under the Apache License, Version 2.0
# (the "License"); you may not use this file except in compliance with
# the License. You may obtain a copy of the License at
#
#    http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
set -eo pipefail

if [ "${#}" -ne 0 ]; then
    exec "${@}"
else
    gunicorn \
        --bind "0.0.0.0:${SUPERSET_PORT}" \
        --access-logfile '-' \
        --error-logfile '-' \
        --workers 1 \
        --worker-class gthread \
        --threads 20 \
        --timeout 60 \
        --limit-request-line 0 \
        --limit-request-field_size 0 \
        "${FLASK_APP}"
fi
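The `exec "${@}"` branch above means any arguments handed to the container replace the default gunicorn server. A minimal sketch of the same run-given-command-or-default dispatch (the function name and messages here are illustrative; the real script uses `exec` so the command becomes PID 1):

```shell
#!/bin/bash
# If arguments were given, run them as the command; otherwise fall back
# to a default action standing in for the gunicorn invocation.
entrypoint() {
  if [ "$#" -ne 0 ]; then
    "$@"
  else
    echo "starting default server"
  fi
}

entrypoint echo "custom command"   # runs the given command
entrypoint                         # no args: falls back to the default
```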
@@ -18,7 +18,6 @@
set -e

cd /app/superset-frontend
npm install -g npm@7
npm install -f --no-optional --global webpack webpack-cli
npm install -f --no-optional
@@ -41,8 +41,9 @@ ADMIN_PASSWORD="admin"
# If Cypress run – overwrite the password for admin and export env variables
if [ "$CYPRESS_CONFIG" == "true" ]; then
  ADMIN_PASSWORD="general"
  export SUPERSET_CONFIG=tests.integration_tests.superset_test_config
  export SUPERSET_CONFIG=tests.superset_test_config
  export SUPERSET_TESTENV=true
  export ENABLE_REACT_CRUD_VIEWS=true
  export SUPERSET__SQLALCHEMY_DATABASE_URI=postgresql+psycopg2://superset:superset@db:5432/superset
fi
# Initialize the database
@@ -23,7 +23,6 @@
import logging
import os
from datetime import timedelta
from typing import Optional

from cachelib.file import FileSystemCache
from celery.schedules import crontab
@@ -31,7 +30,7 @@ from celery.schedules import crontab
logger = logging.getLogger()


def get_env_variable(var_name: str, default: Optional[str] = None) -> str:
def get_env_variable(var_name, default=None):
    """Get the environment variable or raise exception."""
    try:
        return os.environ[var_name]
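The hunk above cuts off before the helper's `except` branch. As a sketch, the full function (using the typed signature from the newer side of the diff) behaves like this; the `except KeyError` body and the error message are reconstructed assumptions, not shown in the diff:

```python
import os
from typing import Optional


def get_env_variable(var_name: str, default: Optional[str] = None) -> str:
    """Get the environment variable or raise an exception (reconstructed sketch)."""
    try:
        return os.environ[var_name]
    except KeyError:
        # Fall back to the default if one was given; otherwise fail loudly.
        if default is not None:
            return default
        raise EnvironmentError(f"The environment variable {var_name} is missing")


os.environ["DATABASE_HOST"] = "db"
print(get_env_variable("DATABASE_HOST"))         # value read from the environment
print(get_env_variable("REDIS_CELERY_DB", "0"))  # unset variable: default is used
```

Passing string defaults (e.g. `"0"` rather than `0`, as in the Redis settings below) keeps the declared `-> str` return type honest.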
@@ -64,25 +63,15 @@ SQLALCHEMY_DATABASE_URI = "%s://%s:%s@%s:%s/%s" % (

REDIS_HOST = get_env_variable("REDIS_HOST")
REDIS_PORT = get_env_variable("REDIS_PORT")
REDIS_CELERY_DB = get_env_variable("REDIS_CELERY_DB", "0")
REDIS_RESULTS_DB = get_env_variable("REDIS_RESULTS_DB", "1")
REDIS_CELERY_DB = get_env_variable("REDIS_CELERY_DB", 0)
REDIS_RESULTS_DB = get_env_variable("REDIS_RESULTS_DB", 1)

RESULTS_BACKEND = FileSystemCache("/app/superset_home/sqllab")

CACHE_CONFIG = {
    "CACHE_TYPE": "redis",
    "CACHE_DEFAULT_TIMEOUT": 300,
    "CACHE_KEY_PREFIX": "superset_",
    "CACHE_REDIS_HOST": REDIS_HOST,
    "CACHE_REDIS_PORT": REDIS_PORT,
    "CACHE_REDIS_DB": REDIS_RESULTS_DB,
}
DATA_CACHE_CONFIG = CACHE_CONFIG


class CeleryConfig(object):
    BROKER_URL = f"redis://{REDIS_HOST}:{REDIS_PORT}/{REDIS_CELERY_DB}"
    CELERY_IMPORTS = ("superset.sql_lab",)
    CELERY_IMPORTS = ("superset.sql_lab", "superset.tasks")
    CELERY_RESULT_BACKEND = f"redis://{REDIS_HOST}:{REDIS_PORT}/{REDIS_RESULTS_DB}"
    CELERYD_LOG_LEVEL = "DEBUG"
    CELERYD_PREFETCH_MULTIPLIER = 1
@@ -1,35 +0,0 @@
#!/usr/bin/env bash
#
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership. The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License. You may obtain a copy of the License at
#
#   http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.
#
HYPHEN_SYMBOL='-'

gunicorn \
    --bind "${SUPERSET_BIND_ADDRESS:-0.0.0.0}:${SUPERSET_PORT:-8088}" \
    --access-logfile "${ACCESS_LOG_FILE:-$HYPHEN_SYMBOL}" \
    --error-logfile "${ERROR_LOG_FILE:-$HYPHEN_SYMBOL}" \
    --workers ${SERVER_WORKER_AMOUNT:-1} \
    --worker-class ${SERVER_WORKER_CLASS:-gthread} \
    --threads ${SERVER_THREADS_AMOUNT:-20} \
    --timeout ${GUNICORN_TIMEOUT:-60} \
    --keep-alive ${GUNICORN_KEEPALIVE:-2} \
    --max-requests ${WORKER_MAX_REQUESTS:-0} \
    --max-requests-jitter ${WORKER_MAX_REQUESTS_JITTER:-0} \
    --limit-request-line ${SERVER_LIMIT_REQUEST_LINE:-0} \
    --limit-request-field_size ${SERVER_LIMIT_REQUEST_FIELD_SIZE:-0} \
    "${FLASK_APP}"
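Every tunable in the removed script above relies on Bash's `${VAR:-default}` parameter expansion, which substitutes the default only when the variable is unset or empty. A quick sketch:

```shell
# ${VAR:-default} yields the default when VAR is unset or empty,
# and the variable's own value otherwise.
unset SERVER_THREADS_AMOUNT
echo "threads: ${SERVER_THREADS_AMOUNT:-20}"   # prints "threads: 20"

SERVER_THREADS_AMOUNT=8
echo "threads: ${SERVER_THREADS_AMOUNT:-20}"   # prints "threads: 8"
```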
docs/.eslintrc.js (new file, 48 lines)
@@ -0,0 +1,48 @@
/**
 * Licensed to the Apache Software Foundation (ASF) under one
 * or more contributor license agreements. See the NOTICE file
 * distributed with this work for additional information
 * regarding copyright ownership. The ASF licenses this file
 * to you under the Apache License, Version 2.0 (the
 * "License"); you may not use this file except in compliance
 * with the License. You may obtain a copy of the License at
 *
 *   http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing,
 * software distributed under the License is distributed on an
 * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
 * KIND, either express or implied. See the License for the
 * specific language governing permissions and limitations
 * under the License.
 */
module.exports = {
  parser: 'babel-eslint',
  rules: {
    strict: 0,
    'react/jsx-filename-extension': [
      2,
      { extensions: ['.js', '.jsx', '.ts', '.tsx'] },
    ],
    'import/extensions': 'off',
    'react/jsx-props-no-spreading': 'off',
    'jsx-a11y/click-events-have-key-events': 'off',
    'jsx-a11y/iframe-has-title': 'off',
    'jsx-a11y/interactive-supports-focus': 'off',
    'react-hooks/rules-of-hooks': 'off',
    'jsx-a11y/no-noninteractive-element-interactions': 'off',
  },
  extends: ['airbnb', 'airbnb/hooks'],
  env: {
    browser: true,
    node: true,
    jasmine: true,
  },
  settings: {
    'import/resolver': {
      node: {
        extensions: ['.js', '.jsx', '.ts', '.tsx'],
      },
    },
  },
};
docs/.gitignore (vendored, 23 lines changed)
@@ -1,20 +1,3 @@
# Dependencies
/node_modules

# Production
/build

# Generated files
.docusaurus
.cache-loader

# Misc
.DS_Store
.env.local
.env.development.local
.env.test.local
.env.production.local

npm-debug.log*
yarn-debug.log*
yarn-error.log*
.docz
.cache
public
docs/.nvmrc (new file, 1 line)
@@ -0,0 +1 @@
v12
@@ -17,36 +17,53 @@ specific language governing permissions and limitations
under the License.
-->

# Website

Here's the source to the documentation hosted at
<a href="https://superset.apache.org">superset.apache.org</a>

This website is built using [Docusaurus 2](https://docusaurus.io/), a modern static website generator.
The site runs on the Gatsby framework and uses docz for its
`Documentation` subsection.

### Installation
## Getting Started

```
$ yarn install
```
```bash
cd docs/
npm install
npm run start
# navigate to localhost:8000
```

### Local Development
## To Publish

Github Actions CI automatically publishes the site after changes are
merged to master.

To manually publish, the static site that Gatsby generates needs to be pushed
to the `asf-site` branch on the
[apache/superset-site](https://github.com/apache/superset-site/)
repository. No need to PR here, simply `git push`.

```bash
# Get in the docs/ folder in the main repo
cd ~/repos/superset/docs
# have Gatsby build the static website, this puts it under `docs/public`
npm run build

# go to the docs repo
cd ~/repos/superset-site
# checkout the proper branch
git checkout asf-site

# BE CAREFUL WITH THIS COMMAND
# wipe the content of the repo
rm -rf *

# copy the static site here
cp -r ~/repos/superset/docs/public/ ./

# git push
git add .
git commit -m "relevant commit msg"
git push origin asf-site

# SUCCESS - it should take minutes to take effect on superset.apache.org
```

```
$ yarn start
```

This command starts a local development server and opens up a browser window. Most changes are reflected live without having to restart the server.

### Build

```
$ yarn build
```

This command generates static content into the `build` directory and can be served using any static contents hosting service.

### Deployment

```
$ GIT_USER=<Your GitHub username> USE_SSH=true yarn deploy
```

If you are using GitHub pages for hosting, this command is a convenient way to build the website and push to the `gh-pages` branch.
@@ -1,22 +0,0 @@
/**
 * Licensed to the Apache Software Foundation (ASF) under one
 * or more contributor license agreements. See the NOTICE file
 * distributed with this work for additional information
 * regarding copyright ownership. The ASF licenses this file
 * to you under the Apache License, Version 2.0 (the
 * "License"); you may not use this file except in compliance
 * with the License. You may obtain a copy of the License at
 *
 *   http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing,
 * software distributed under the License is distributed on an
 * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
 * KIND, either express or implied. See the License for the
 * specific language governing permissions and limitations
 * under the License.
 */

module.exports = {
  presets: [require.resolve('@docusaurus/core/lib/babel/preset')],
};
@@ -1,37 +0,0 @@
---
title: API
hide_title: true
sidebar_position: 9
---

import { Buffer } from 'buffer';
global.Buffer = Buffer;
import SwaggerUI from 'swagger-ui-react';
import openapi from '/resources/openapi.json';
import 'swagger-ui-react/swagger-ui.css';
import { Alert } from 'antd';

## API

Superset's public **REST API** follows the
[OpenAPI specification](https://swagger.io/specification/), and is
documented here. The docs below are generated using
[Swagger React UI](https://www.npmjs.com/package/swagger-ui-react).

<Alert
  type="info"
  message={
    <div>
      <strong>NOTE! </strong>
      You can find an interactive version of this documentation on your local Superset
      instance at <strong>/swagger/v1</strong> (unless disabled)
    </div>
  }
/>

<br />
<br />
<hr />
<div className="swagger-container">
  <SwaggerUI spec={openapi} />
</div>
@@ -1,4 +0,0 @@
{
  "label": "Contributing",
  "position": 6
}
@@ -1,20 +0,0 @@
---
title: Contributing to Superset
hide_title: true
sidebar_position: 1
version: 1
---

## Contributing to Superset

Superset is an [Apache Software Foundation](https://www.apache.org/theapacheway/index.html) project.
The core contributors (or committers) to Superset communicate primarily in the following channels
(which can be joined by anyone):

- [Mailing list](https://lists.apache.org/list.html?dev@superset.apache.org)
- [Apache Superset Slack community](https://join.slack.com/t/apache-superset/shared_invite/zt-16jvzmoi8-sI7jKWp~xc2zYRe~NqiY9Q)
- [GitHub issues and PR's](https://github.com/apache/superset/issues)

More references:

- [Comprehensive Tutorial for Contributing Code to Apache Superset](https://preset.io/blog/tutorial-contributing-code-to-apache-superset/)
- [CONTRIBUTING Guide on GitHub](https://github.com/apache/superset/blob/master/CONTRIBUTING.md)