Compare commits

233 Commits

Author SHA1 Message Date
Joe Li
da48584012 chore: updating 4.1.3rc2 change log 2025-06-27 11:25:27 -07:00
gpchandran
da39ad800b chore: update Dockerfile - Upgrade to 3.11.12 (#33612)
(cherry picked from commit f0b6e87091)
2025-06-26 17:45:42 -07:00
Rafael Benitez
7e0564489b fix(Sqllab): Autocomplete got stuck in UI when open it too fast (#33522)
(cherry picked from commit b4e2406385)
2025-06-26 17:45:42 -07:00
JUST.in DO IT
a271c39b62 fix(table-chart): time shift is not working (#33425)
(cherry picked from commit dc4474889d)
2025-06-26 17:45:42 -07:00
Paul Rhodes
8fd27b362a feat(api): Added uuid to list api calls (#32414)
(cherry picked from commit 8decc9e45f)
2025-06-26 17:45:42 -07:00
sha174n
50cdf86b7c docs: CVEs fixed on 4.1.2 (#33435)
(cherry picked from commit 8a8fb49617)
2025-06-02 16:40:40 -07:00
github-actions[bot]
59836cc326 chore(🦾): bump python h11 0.14.0 -> 0.16.0 (#33339)
Co-authored-by: GitHub Action <action@github.com>
(cherry picked from commit 82526865d2)
2025-06-02 16:40:40 -07:00
github-actions[bot]
c1552fa236 chore(🦾): bump python sqlglot 26.1.3 -> 26.11.1 (#32745)
Co-authored-by: GitHub Action <action@github.com>
(cherry picked from commit 66c1a6a875)
2025-06-02 16:40:37 -07:00
Joe Li
d16bea8949 chore: creating 4.1.3rc1 change log and updating frontend json
(cherry picked from commit 72cf9b6d770ae93ec2cf28bce528a5cb65fafd6d)
2025-05-06 13:15:16 +08:00
Maxime Beauchemin
3464323ac9 fix: loading examples from raw.githubusercontent.com fails with 429 errors (#33354)
(cherry picked from commit f045a73e2d)
2025-05-06 10:14:05 +08:00
Yuri
c74a6c8f15 fix(pinot): revert join and subquery flags (#32382)
(cherry picked from commit 822d72c57d)
2025-05-06 10:14:04 +08:00
Ville Brofeldt
cc1764e98e fix(plugin-chart-echarts): remove erroneous upper bound value (#32473)
(cherry picked from commit 5766c36372)
2025-05-06 10:14:04 +08:00
JUST.in DO IT
df8109f966 fix: improve error type on parse error (#33048)
(cherry picked from commit ed0cd5e7b0)
2025-05-06 10:14:04 +08:00
JUST.in DO IT
83e3aa23ff fix(pivot-table): Revert "fix(Pivot Table): Fix column width to respect currency config (#31414)" (#32968)
(cherry picked from commit a36e636a58)
2025-05-06 10:14:04 +08:00
JUST.in DO IT
31e59b1635 fix(log): store navigation path to get correct logging path (#32795)
(cherry picked from commit 4a70065e5f)
2025-05-06 10:13:58 +08:00
Andreas Motl
0f82cbaffd fix: Downgrade to marshmallow<4 (#33216) 2025-05-06 09:13:50 +08:00
Joe Li
1bdc33fe89 fix: make packages PEP 625 compliant (#32866)
Co-authored-by: Michael S. Molina <michael.s.molina@gmail.com>
(cherry picked from commit 6e02d19b0d)
2025-04-01 10:43:58 -07:00
Đỗ Trọng Hải
ccfc07de83 fix(fe/dashboard-list): display modifier info for Last modified data (#32035)
Signed-off-by: hainenber <dotronghai96@gmail.com>
(cherry picked from commit 88cf2d5c39)
2025-03-27 08:49:50 -07:00
JUST.in DO IT
e4d1735657 fix(logging): missing path in event data (#32708)
(cherry picked from commit cd5a94305c)
2025-03-27 08:44:44 -07:00
Michael S. Molina
ade8abb87f fix: Signature of Celery pruner jobs (#32699)
(cherry picked from commit df06bdf33b)
2025-03-27 08:28:32 -07:00
JUST.in DO IT
d0376596a1 fix(log): Update recent_activity by event name (#32681)
(cherry picked from commit 449f51aed5)
2025-03-26 17:26:28 -07:00
JUST.in DO IT
5167443ce4 fix(welcome): perf on distinct recent activities (#32608)
(cherry picked from commit 832e028b39)
2025-03-26 16:29:49 -07:00
Michael S. Molina
3f04866c83 fix: Log table retention policy (#32572)
(cherry picked from commit 89b6d7fb68)
2025-03-26 16:02:22 -07:00
Đỗ Trọng Hải
81e3d69e4b fix(model/helper): represent RLS filter clause in proper textual SQL string (#32406)
Signed-off-by: hainenber <dotronghai96@gmail.com>
(cherry picked from commit ff0529c932)
2025-03-26 10:59:50 -07:00
gpchandran
f1e0c9c8ed fix: upgrade to 3.11.11-slim-bookworm to address critical vulnerabilities (#32240)
(cherry picked from commit ad057324b7)
2025-03-24 15:25:04 -07:00
Jack
804b653eca fix(chart data): removing query from /chart/data payload when accessing as guest user (#30858)
(cherry picked from commit dd39138e6e)
2025-03-24 15:23:25 -07:00
Joe Li
0c9577c605 chore: Revert "chore: bump base image in Dockerfile with ARG PY_VER=3.11.11-slim-bookworm" (#32782) 2025-03-20 18:40:10 -07:00
gpchandran
7c8dc95170 chore: bump base image in Dockerfile with ARG PY_VER=3.11.11-slim-bookworm (#32780) 2025-03-20 18:31:07 -07:00
Joe Li
72538bda06 chore: creating 4.1.2rc1 change log and updating frontend json 2025-03-12 11:33:31 -07:00
JUST.in DO IT
735552e795 fix(sqllab): Allow clear on schema and catalog (#32515)
(cherry picked from commit 4c3aae7583)
2025-03-12 11:13:12 -07:00
Daniel Vaz Gaspar
fdbc40d7ff fix: dashboard, chart and dataset import validation (#32500)
(cherry picked from commit fc844d3dfd)
2025-03-12 10:54:59 -07:00
Usiel Riedl
69dd88720e feat(sqllab): Adds refresh button to table metadata in SQL Lab (#29974)
(cherry picked from commit 9d5268ab6d)
2025-03-12 10:54:59 -07:00
Antonio Rivero
a43f6bde06 fix(migrations): Handle comparator None in old time comparison migration (#32538)
(cherry picked from commit 20e5df501e)
2025-03-07 14:10:11 -08:00
Joe Li
e2fcf0bb20 Revert "fix(sqllab): duplicate error message (#31353)"
This reverts commit adb9be9532.
2025-03-04 10:39:03 -08:00
Fardin Mustaque
4e76eb7372 fix: Big Number side cut fixed (#31407)
Co-authored-by: Geido <60598000+geido@users.noreply.github.com>
(cherry picked from commit 640dac1eff)
2025-03-04 10:20:36 -08:00
Michael Gerber
e16d574f28 fix(sunburst): Use metric label from verbose map (#31480)
(cherry picked from commit a1adb7f31c)
2025-03-04 10:20:03 -08:00
Ville Brofeldt
7a2c06e89c fix(tags): clean up bulk create api and schema (#31427)
(cherry picked from commit bf56a327f4)
2025-03-04 10:15:42 -08:00
Daniel Grossberg
1fa0450977 fix(docs): add custom editUrl path for intro page (#31334)
Co-authored-by: Evan Rusackas <evan@preset.io>
(cherry picked from commit 878c7f0267)
2025-03-04 10:08:45 -08:00
Beto Dealmeida
adb9be9532 fix(sqllab): duplicate error message (#31353)
(cherry picked from commit fc45647440)
2025-03-04 10:07:49 -08:00
Oleg Ovcharuk
de67435816 fix: Use clickhouse sqlglot dialect for YDB (#31323)
(cherry picked from commit 48c5ee4f8b)
2025-03-04 10:03:19 -08:00
Daniel Vaz Gaspar
dcaeb7615b fix: add more clickhouse disallowed functions on config (#31198)
(cherry picked from commit 25f4226dbb)
2025-03-04 10:00:10 -08:00
Vitor Avila
ed0981aa02 fix(embedded): Hide anchor links in embedded mode (#31194)
(cherry picked from commit 14682b9054)
2025-03-04 09:56:32 -08:00
Antonio Rivero
6966f48515 fix(migrations): Handle no params in time comparison migration (#32155)
(cherry picked from commit 6ed9dae2f7)
2025-02-05 16:15:38 -08:00
Vitor Avila
0110154520 chore: Skip the creation of secondary perms during catalog migrations (#32043)
(cherry picked from commit 3f46bcf142)
2025-01-30 14:05:01 -08:00
JUST.in DO IT
c262cc6c24 fix(sqllab): Missing allowHTML props in ResultTableExtension (#31960)
(cherry picked from commit 1d6423e71f)
2025-01-22 14:12:33 -08:00
Elizabeth Thompson
955a1a42aa fix: prevent multiple pvm errors on migration (#31332)
(cherry picked from commit cd200f07a5)
2025-01-21 17:10:23 -08:00
Vitor Avila
fed0d3707d fix(database import): Gracefully handle error to get catalog schemas (#31437)
(cherry picked from commit 21e794a66f)
2025-01-21 16:34:12 -08:00
nsivarajan
31d966ba64 fix: cache-warmup fails (#31173)
Co-authored-by: Sivarajan Narayanan <narayanan_sivarajan@apple.com>
Co-authored-by: Pat Heard <patrick.heard@cds-snc.ca>
(cherry picked from commit 592564b623)
2025-01-21 16:30:57 -08:00
Đỗ Trọng Hải
9946596586 fix(fe/src/dashboard): optional chaining for possibly nullable parent attribute in LayoutItem type (#30442)
Signed-off-by: hainenber <dotronghai96@gmail.com>
(cherry picked from commit 2a458a4802)
2025-01-21 16:15:41 -08:00
Damian Pendrak
41246c7c1f fix(sqllab): unable to update saved queries (#31639)
(cherry picked from commit 3a6fdf8bdf)
2025-01-21 16:04:46 -08:00
Elizabeth Thompson
4f436d4512 fix: parse pandas pivot null values (#29898)
(cherry picked from commit 0e8fa54f81)
2024-12-12 15:37:20 -08:00
Vitor Avila
da4d911891 fix(Pivot Table): Fix column width to respect currency config (#31414)
(cherry picked from commit 43314dc8db)
2024-12-12 15:08:28 -08:00
Tatiana Cherne
b40a819377 fix(histogram): axis margin padding consistent with other graphs (#31335)
Co-authored-by: Evan Rusackas <evan@preset.io>
(cherry picked from commit 73d21a87ae)
2024-12-12 15:07:47 -08:00
alexandrusoare
bfa868ac08 fix(AllEntitiesTable): show Tags (#31301)
(cherry picked from commit 0133bab038)
2024-12-12 15:03:11 -08:00
Beto Dealmeida
2774adfe29 fix: pass string to process_template (#31329)
(cherry picked from commit 9315a8838c)
2024-12-12 15:02:00 -08:00
Yuri
91fe814981 fix(pinot): remove query aliases from SELECT and ORDER BY clauses in Pinot (#31341)
(cherry picked from commit 931f69d6c7)
2024-12-10 10:04:40 -08:00
Damian Pendrak
04b1ccd647 fix: annotations on horizontal bar chart (#31308)
(cherry picked from commit 2816a70af3)
2024-12-10 09:59:51 -08:00
JUST.in DO IT
db779d1665 fix(sqllab): Remove update_saved_query_exec_info to reduce lag (#31294)
(cherry picked from commit 48864ce8c7)
2024-12-10 09:59:10 -08:00
Joe Li
25eb7142b6 fix: linting to pass pre-commit 2024-12-05 09:14:52 -08:00
Michael S. Molina
1c08d5ef79 fix: Exception handling for SQL Lab views (#30897)
(cherry picked from commit c2885a166e)
2024-12-05 09:05:54 -08:00
Vitor Avila
2f134ff919 fix(Databricks): Escape catalog and schema names in pre-queries (#31199)
(cherry picked from commit d66ac9f3f4)
2024-12-04 14:07:23 -08:00
Joe Li
e7e9cfe530 fix: check for column before adding in migrations (#31185)
(cherry picked from commit 8020729ced)
2024-12-04 14:04:38 -08:00
JUST.in DO IT
e77e158ff6 fix(trino): db session error in handle cursor (#31265)
(cherry picked from commit 1e0c04fc15)
2024-12-04 14:04:17 -08:00
Beto Dealmeida
5e7299431d fix(dataset): use sqlglot for DML check (#31024)
(cherry picked from commit 832fed1db5)
2024-12-04 14:03:57 -08:00
Elizabeth Thompson
6e092dd5a6 fix: add mutator to get_columns_description (#29885)
(cherry picked from commit 38d64e8dd2)
2024-12-04 14:03:26 -08:00
Damian Pendrak
fd7f1d3052 fix: x axis title disappears when editing bar chart (#30821)
(cherry picked from commit 97dde8c485)
2024-12-04 11:54:49 -08:00
Michael S. Molina
43fe5cae7b fix: Time-series Line Chart Display unnecessary total (#31181)
(cherry picked from commit dbcb473040)
2024-12-04 11:53:19 -08:00
Geido
15740f8e81 fix(Dashboard): Backward compatible shared_label_colors field (#31163)
(cherry picked from commit f077323e6f)
2024-12-04 11:52:36 -08:00
Beto Dealmeida
e151dda5c4 fix: check orderby (#31156)
(cherry picked from commit 7f2e752796)
2024-12-04 11:51:38 -08:00
Michael S. Molina
caaf5bff54 fix: Remove unwanted commit on Trino's handle_cursor (#31154)
(cherry picked from commit 547a4adef5)
2024-12-04 11:51:02 -08:00
Michael S. Molina
b31595a533 fix: Revert "feat(trino): Add functionality to upload data (#29164)" (#31151)
(cherry picked from commit ff282492a1)
2024-12-04 11:50:09 -08:00
Geido
71d4bf7212 fix(Dashboard): Ensure shared label colors are updated (#31031)
(cherry picked from commit 91301bcd5b)
2024-12-04 11:46:52 -08:00
Evan Rusackas
f20a8de769 fix(release validation): scripts now support RSA and EDDSA keys. (#30967)
(cherry picked from commit 4f899dd164)
2024-12-04 11:41:18 -08:00
Geido
cebd45778f fix(Dashboard): Native & Cross-Filters Scoping Performance (#30881)
(cherry picked from commit f4c36a6d05)
2024-11-22 11:23:37 -08:00
Linden
1b2ecc6955 fix(imports): import query_context for imports with charts (#30887)
(cherry picked from commit 8905508d8f)
2024-11-22 11:19:02 -08:00
JUST.in DO IT
1f5e567645 fix(explore): verified props is not updated (#31008)
(cherry picked from commit 9e5b568cc9)
2024-11-22 11:18:33 -08:00
Geido
c467fb566d fix(Dashboard): Retain colors when color scheme not set (#30646)
(cherry picked from commit 90572be95a)
2024-11-22 11:17:55 -08:00
Geido
cc0ed0fef4 fix(Dashboard): Exclude edit param in async screenshot (#30962)
(cherry picked from commit 1b63b8f3c7)
2024-11-22 11:17:17 -08:00
yousoph
4f463399a7 docs: Updating 4.1 Release Notes (#30865)
(cherry picked from commit 88eb95c39a)
2024-11-22 11:16:43 -08:00
Joe Li
6264ff5165 chore: Adds 4.1.1 RC1 data to CHANGELOG.md and update frontend package 2024-11-15 14:35:11 -08:00
Sukuna
1410e528a4 fix: blocks UI elements on right side (#30886)
Co-authored-by: Evan Rusackas <evan@preset.io>
(cherry picked from commit df479940a6)
2024-11-15 09:27:39 -08:00
Geido
f704b0f556 fix(package.json): Pin luxon version to unblock master (#30859)
(cherry picked from commit de8282cea0)
2024-11-13 15:50:29 -08:00
Maxime Beauchemin
bdfd5cd4ec fix(explore): column data type tooltip format (#30588)
(cherry picked from commit 73768f6313)
2024-11-13 15:48:42 -08:00
Elizabeth Thompson
af44b14fbe chore: add link to Superset when report error (#30576)
(cherry picked from commit 4d5f70c694)
2024-11-13 12:23:27 -08:00
Ayush Tripathi
29c76ef1d5 fix: Rename database from 'couchbasedb' to 'couchbase' in documentation and db_engine_specs (#29911)
(cherry picked from commit f5d614d80d)
2024-11-13 12:21:04 -08:00
Geido
7953c89d51 fix(TimezoneSelector): Failing unit tests due to timezone change (#30828)
(cherry picked from commit 5820d31b5c)
2024-11-13 12:17:14 -08:00
Joe Li
fd4c3dce44 fix: don't show metadata for embedded dashboards (#30875)
(cherry picked from commit ac3a10d8f1)
2024-11-13 12:10:59 -08:00
Michael S. Molina
234f8c94d1 fix: Graph chart colors (#30851)
(cherry picked from commit 0e165c1a21)
2024-11-13 12:10:17 -08:00
Evan Rusackas
bc2e51d8d0 fix(capitalization): Capitalizing a button. (#29867)
(cherry picked from commit 052b38bdf3)
2024-11-13 12:08:26 -08:00
Geido
2787167abe refactor(Slider): Upgrade Slider to Antd 5 (#29786)
(cherry picked from commit d877d46557)
2024-11-13 12:07:26 -08:00
Ross Mabbett
9e84e13888 refactor(ChartCreation): Migrate tests to RTL (#29674)
(cherry picked from commit 6bc8567802)
2024-11-13 12:06:44 -08:00
Ross Mabbett
cfd24e3ccd refactor(controls): Migrate AdhocMetricOption.test to RTL (#29843)
(cherry picked from commit 819597faf6)
2024-11-13 12:06:19 -08:00
Ross Mabbett
aaecec2e03 refactor(controls): Migrate MetricDefinitionValue.test to RTL (#29845)
(cherry picked from commit 27c08d0e0e)
2024-11-13 12:05:47 -08:00
Evan Rusackas
66fe0b0594 fix(translations): Translate embedded errors (#29782)
(cherry picked from commit 0d62bb2261)
2024-11-13 12:01:13 -08:00
Evan Rusackas
0d0b43062e fix: Fixing incomplete string escaping. (#29772)
(cherry picked from commit 2bce20f790)
2024-11-13 12:00:03 -08:00
Đỗ Trọng Hải
72df46a729 fix(frontend/docker, ci): fix borked Docker build due to Lerna v8 uplift (#29725)
Signed-off-by: hainenber <dotronghai96@gmail.com>
(cherry picked from commit 8891f04f11)
2024-11-13 11:58:44 -08:00
Evan Rusackas
f7cfd9182a docs: Check markdown files for bad links using linkinator (#28424)
(cherry picked from commit c3702be9d4)
2024-11-13 11:57:22 -08:00
Sam Firke
5faaaf978b docs(contributing): fix broken link to translations sub-section (#29768)
(cherry picked from commit c22dfa1abb)
2024-11-13 11:56:47 -08:00
Joe Li
855f4c4897 chore: Adds 4.1.0 RC4 data to CHANGELOG.md 2024-11-01 16:49:06 -07:00
Ville Brofeldt
008ab202f3 fix(plugin-chart-echarts): sort tooltip correctly (#30819)
(cherry picked from commit b02d18a39e)
2024-11-01 16:11:26 -07:00
Daniel Vaz Gaspar
db311eb376 chore: bump werkzeug to address vulnerability (#30729)
(cherry picked from commit f19c4280c0)
2024-11-01 10:25:05 -07:00
Geido
d5f33c4c02 fix(Dashboard): Sync/Async Dashboard Screenshot Generation and Default Cache (#30755)
Co-authored-by: Michael S. Molina <michael.s.molina@gmail.com>
Co-authored-by: Michael S. Molina <70410625+michael-s-molina@users.noreply.github.com>
(cherry picked from commit 3e29777526)
2024-11-01 10:08:29 -07:00
Beto Dealmeida
ad82a8c14e fix: catalog migration w/o connection (#30773)
(cherry picked from commit 402c29c2bc)
2024-11-01 09:57:26 -07:00
Phil Dumbreck
45c18368f6 ci: Add Python 3.11 images to Docker Hub (#30733)
(cherry picked from commit eecb537808)
2024-10-31 09:48:44 -07:00
Maxime Beauchemin
6f656914fe fix: CI remove cypress command --headed (#30429)
(cherry picked from commit 63e17ca546)
2024-10-30 15:55:58 -07:00
Maxime Beauchemin
6706d1308f chore: alter scripts/cypress_run to run one file per command + retry (#30397)
Co-authored-by: Joe Li <joe@preset.io>
(cherry picked from commit a3bfbd0186)
2024-10-30 15:55:02 -07:00
Elizabeth Thompson
cbf1aeec7d chore: split cypress files for less memory (#30354)
(cherry picked from commit 43721f1206)
2024-10-30 15:54:28 -07:00
Geido
3f7907b266 chore(Dashboard): Simplify scoping logic for cross/native filters (#30719)
(cherry picked from commit d5a98e0189)
2024-10-30 10:10:06 -07:00
Vitor Avila
ba0d118fdd fix(Jinja): Extra cache keys for calculated columns and metrics using Jinja (#30735)
(cherry picked from commit 09d3f60d85)
2024-10-29 09:47:12 -07:00
Michael S. Molina
49aa74cec8 fix: Nested transaction is inactive when embedding dashboard (#30699)
(cherry picked from commit c9ff09a418)
2024-10-28 10:40:06 -07:00
Vitor Avila
7c569abaf6 fix(dashboard): Include urlParams in the screenshot generation (#30675)
(cherry picked from commit 16981d6316)
2024-10-28 10:36:59 -07:00
Geido
a70f2cee72 fix(Jinja): Extra cache keys for Jinja columns (#30715)
(cherry picked from commit a12ccf2c1d)
2024-10-28 10:36:35 -07:00
JUST.in DO IT
7b343f7fac fix(chart): Table and page entries misaligned (#30680)
(cherry picked from commit 87deb19bcb)
2024-10-23 09:53:56 -07:00
Kamil Gabryjelski
742ad92189 fix(explore): Missing markarea component broke annotations in echarts (#30348)
(cherry picked from commit 038ef32454)
2024-10-23 09:51:34 -07:00
Joe Li
03b72628fa chore: update change log for 4.1.0rc3 and linting 2024-10-16 10:01:42 -07:00
Michael S. Molina
27ca7ba7d7 fix: First item hovered on stacked bar (#30628)
(cherry picked from commit c8edd1fb25)
2024-10-16 09:11:28 -07:00
Joe Li
1074d1e618 chore: Update to Dockerfile to get creating releases to work (#29937)
(cherry picked from commit 955db48c59)
2024-10-16 09:06:20 -07:00
Sam Firke
b6edf148e2 fix(docs): address two linkinator failures (#30617)
(cherry picked from commit 53a121d9e1)
2024-10-15 16:37:46 -07:00
Beto Dealmeida
046770cf76 feat: use dialect when tokenizing (#30614)
(cherry picked from commit 4cac7feb67)
2024-10-15 16:26:56 -07:00
Geido
0f1064eab8 fix(Filters): Apply native & cross filters on common columns (#30438)
(cherry picked from commit 362948324c)
2024-10-15 16:25:10 -07:00
ObservabilityTeam
82fc8879b0 fix(filters): Adds a fix for saving time range adhoc_filters (#30581)
Co-authored-by: Muhammad Musfir <muhammad.musfir@de-cix.net>
(cherry picked from commit 2c3ba95768)
2024-10-15 16:23:57 -07:00
David Markey
159958e577 feat(embedded): add hook to allow superset admins to validate guest token parameters (#30132)
Co-authored-by: David Markey <markey@rapidraitngs.com>
(cherry picked from commit a31a4eebdd)
2024-10-15 16:19:51 -07:00
Kamil Gabryjelski
2a98780d2c perf: Implement Echarts treeshaking (#29874)
(cherry picked from commit c220245414)
2024-10-15 16:18:25 -07:00
Beto Dealmeida
a84da1c5cc fix: sqlparse fallback for formatting queries (#30578)
(cherry picked from commit 47c1e09c75)
2024-10-11 13:02:52 -07:00
Joe Li
8c329c445f fix: update html rendering to true from false (#30565) 2024-10-10 16:32:26 -07:00
Beto Dealmeida
05cccf6404 fix: adhoc metrics (#30202)
(cherry picked from commit 0db59b45b8)
2024-10-10 16:29:59 -07:00
Geido
92808ffe38 fix(Jinja): Extra cache keys to consider vars with set (#30549)
(cherry picked from commit 318eff7327)
2024-10-09 14:14:57 -07:00
Felix
0a7635fc05 fix(dashboard-export): Fixes datasetId is not replaced with datasetUuid in Dashboard export in 4.1.x (#30425)
(cherry picked from commit 211564a6da)
2024-10-09 14:12:59 -07:00
Michael S. Molina
4fe51c6db9 fix: Horizon Chart are not working any more (#30563)
(cherry picked from commit 7b47e43fd0)
2024-10-09 11:12:12 -07:00
Michael S. Molina
0eaa8c5894 fix: Incorrect type in config.py (#30564)
(cherry picked from commit 7a8e8f890f)
2024-10-09 11:11:42 -07:00
Michael S. Molina
f56dfb35b2 fix: Unable to parse escaped tables (#30560)
(cherry picked from commit fc857d987b)
2024-10-09 11:10:48 -07:00
JUST.in DO IT
597e207eff fix(explore): don't discard controls on deprecated (#30447)
(cherry picked from commit b627011463)
2024-10-07 11:17:58 -07:00
JUST.in DO IT
95ae663e88 chore(chart-controls): migrate enzyme to RTL (#26257)
(cherry picked from commit c049771a7f)
2024-10-07 11:17:30 -07:00
Ville Brofeldt
d0def80d3b fix(migration): replace unquote with double percentages (#30532)
(cherry picked from commit 163b71e019)
2024-10-07 10:41:31 -07:00
Geido
9f5f0895f6 fix(Explore): Apply RLS at column values (#30490)
Co-authored-by: Beto Dealmeida <roberto@dealmeida.net>
(cherry picked from commit f314685a8e)
2024-10-07 09:56:02 -07:00
Jack
dce7e47399 fix(imports): Error when importing charts / dashboards with missing DB credentials (#30503)
(cherry picked from commit 95325c4673)
2024-10-07 09:53:14 -07:00
Beto Dealmeida
4b9ae07fe5 fix: don't reformat generated queries (#30350)
(cherry picked from commit 0b34197815)
2024-10-04 09:04:22 -07:00
Michael S. Molina
7519cab379 fix: Open control with Simple tab selected when there is no column selected (#30502)
(cherry picked from commit 03146b21be)
2024-10-03 16:56:36 -07:00
Beto Dealmeida
84c1ad97dc fix(embedded): sankey charts (#30491)
(cherry picked from commit e0172a24b8)
2024-10-02 14:50:44 -07:00
Michael S. Molina
f743ae36dc fix: Histogram chart not able to use decimal datatype column (#30416)
(cherry picked from commit 4834390e6a)
2024-09-30 10:37:56 -07:00
Beto Dealmeida
ca5ed8b7b0 chore: improve DML check (#30417)
(cherry picked from commit cc9fd88c0d)
2024-09-30 10:36:45 -07:00
Beto Dealmeida
f1a6aaad63 chore: organize SQL parsing files (#30258)
(cherry picked from commit bdf29cb7c2)
2024-09-30 10:36:40 -07:00
Michael S. Molina
995182270c fix: Incorrect hovered items in tooltips (#30405)
(cherry picked from commit 36f7a3f524)
2024-09-27 18:30:54 -07:00
Michael S. Molina
c864e6cd2b fix: Allows X-Axis Sort By for custom SQL (#30393)
(cherry picked from commit abf2943e4d)
2024-09-25 17:46:46 -07:00
Michael S. Molina
a3d6ef07c1 fix: Pre-query normalization with custom SQL (#30389)
(cherry picked from commit ad2998598f)
2024-09-25 17:46:16 -07:00
Michael S. Molina
072540f321 fix: KeyError 'sql' when opening a Trino virtual dataset (#30339)
(cherry picked from commit ef9e5e523d)
2024-09-19 14:03:38 -07:00
Antonio Rivero
2561b267ab fix(table): Use extras in queries (#30335)
(cherry picked from commit 6c2bd2a968)
2024-09-19 09:42:54 -07:00
Ville Brofeldt
8fc4c50050 fix(migration): 87d38ad83218 failing on upgrade (#30275)
(cherry picked from commit 78099b0d1f)
2024-09-18 18:18:45 -07:00
JUST.in DO IT
359d7baaf5 fix(dashboard): Invalid owner's name displayed after updates (#30272)
(cherry picked from commit 2f0c9947ce)
2024-09-18 10:33:10 -07:00
JUST.in DO IT
437151a95f fix: unable to disallow csv upload on header menu (#30271)
(cherry picked from commit cd8b56706b)
2024-09-18 10:32:54 -07:00
Maxime Beauchemin
2157fe3f28 chore: move SLACK_ENABLE_AVATARS from config to feature flag (#30274)
(cherry picked from commit f315a4f02c)
2024-09-16 16:13:34 -07:00
JUST.in DO IT
1f6ef6a870 chore(sqllab): Add shortcuts for switching tabs (#30173)
(cherry picked from commit f553344aa1)
2024-09-16 09:28:05 -07:00
Geido
35de980081 fix(Screenshot): Dashboard screenshot cache key to include state (#30265)
(cherry picked from commit 0679454b48)
2024-09-16 09:27:05 -07:00
Geido
90ce1b5012 fix(CrossFilters): Do not reload unrelated filters in global scope (#30252)
Co-authored-by: JUST.in DO IT <justin.park@airbnb.com>
(cherry picked from commit dbab2fb955)
2024-09-13 10:59:01 -07:00
Beto Dealmeida
4a6dd94a6c chore: remove duplicate _process_sql_expression (#30213)
(cherry picked from commit cddf1530da)
2024-09-12 11:05:05 -07:00
Geido
860c9c08a1 fix(Fave): Charts and Dashboards fave/unfave do not commit transactions (#30215)
(cherry picked from commit 23467bd7e4)
2024-09-12 10:57:15 -07:00
JUST.in DO IT
f0c42b0a01 feat(sqllab): Add timeout on fetching query results (#29959)
(cherry picked from commit ff3b86b5ff)
2024-09-12 10:55:39 -07:00
Sam Firke
889ab36dff fix(uploads): respect db engine spec's supports_multivalues_insert value for file uploads & enable multi-insert for MSSQL (#30222)
Co-authored-by: Ville Brofeldt <33317356+villebro@users.noreply.github.com>
(cherry picked from commit f8a77537a7)
2024-09-12 09:16:14 -07:00
JUST.in DO IT
d85fdf4bf9 fix: filters panel broken due to tabs scroll (#30180)
(cherry picked from commit be0a0ced25)
2024-09-12 09:14:58 -07:00
Sam Firke
afd5379bb0 chore(docs): note that release-tagged docker images no longer ship with metadata db drivers as of 4.1.0 (#30243)
(cherry picked from commit 4385b44e86)
2024-09-11 11:15:51 -07:00
Geido
789ca738dc fix(Celery): Pass guest_token as user context is not available in Celery (#30224)
(cherry picked from commit 1b34ad65fa)
2024-09-11 09:27:07 -07:00
Vitor Avila
40568fd1ff fix(Dashboard download): Download dashboard screenshot/PDF using SupersetClient (#30212)
(cherry picked from commit d191e67e51)
2024-09-11 09:26:50 -07:00
Geido
6205fb4e48 fix(Embedded): Dashboard screenshot should use GuestUser (#30200)
(cherry picked from commit 52a03f18a1)
2024-09-11 09:26:34 -07:00
Beto Dealmeida
c3bc7de75f feat: is_mutating method (#30177)
(cherry picked from commit 1f890718a2)
2024-09-11 09:25:53 -07:00
Ross Masters
d33f1534e2 fix: Chart cache-warmup task fails on Superset 4.0 (#28706)
(cherry picked from commit 0744abe87b)
2024-09-11 09:24:12 -07:00
Maxime Beauchemin
1ccc147670 fix: set default mysql isolation level to 'READ COMMITTED' (#30174)
Co-authored-by: Ville Brofeldt <33317356+villebro@users.noreply.github.com>
(cherry picked from commit 6baeb659a7)
2024-09-09 17:14:22 -07:00
Kamil Gabryjelski
d8b9f38609 fix: Disable cross filtering on charts with no dimensions (#30176)
(cherry picked from commit 3aafd29768)
2024-09-09 09:55:21 -07:00
Michael S. Molina
e8d5ff1264 fix: Delete modal button with lowercase text (#30060)
(cherry picked from commit cd6b8b2f6d)
2024-09-06 11:10:08 -07:00
JUST.in DO IT
3becd6b72e chore(shared components): Migrate enzyme to RTL (#26258)
(cherry picked from commit 1a1548da3b)
2024-09-06 11:09:26 -07:00
JUST.in DO IT
cea8ede3f0 fix(sqllab): Skip AceEditor in inactive tabs (#30171)
(cherry picked from commit 4d1db9e32c)
2024-09-06 10:44:32 -07:00
JUST.in DO IT
e94667820f fix(native filter): undefined layout type on filterInScope (#30164)
(cherry picked from commit e02b18c63c)
2024-09-06 10:44:03 -07:00
Jonathan Schneider
41e611b413 fix(plugins): display correct tooltip (fixes #3342) (#30023)
(cherry picked from commit c428108713)
2024-09-06 10:43:28 -07:00
Michael S. Molina
d47430ac21 fix: FacePile is requesting avatars when SLACK_ENABLE_AVATARS is false (#30156)
(cherry picked from commit de3de541e7)
2024-09-05 10:35:41 -07:00
JUST.in DO IT
f2c0d3aa48 fix(sqllab): race condition when updating cursor position (#30154)
(cherry picked from commit 2097b716f4)
2024-09-04 17:31:59 -07:00
Maxime Beauchemin
f49a426ada docs: document how docker-compose-image-tag requires -dev suffixed images (#30144)
Co-authored-by: Sam Firke <sfirke@users.noreply.github.com>
(cherry picked from commit 34e240ef0e)
2024-09-04 17:30:50 -07:00
Antonio Rivero
acf3e12230 fix(catalog): Table Schema View with no catalog (#30139)
(cherry picked from commit 6009023fad)
2024-09-04 10:21:51 -07:00
Michael S. Molina
1d90ee3517 fix: New tooltip inappropriately combines series on mixed chart (#30137)
(cherry picked from commit 9cb9e5beee)
2024-09-04 10:20:34 -07:00
Michael S. Molina
0f32116734 fix: JSON loading logs (#30138)
(cherry picked from commit 5c5b4d0f5f)
2024-09-03 16:23:39 -07:00
Michael S. Molina
8d7ceebbc3 fix: DeckGL legend layout (#30140)
(cherry picked from commit af066a4630)
2024-09-03 16:22:59 -07:00
Evan Rusackas
45da3f4519 fix(accessibility): logo outline on tab navigation, but not on click (#30077)
(cherry picked from commit 9c3eb8f51f)
2024-09-03 16:21:43 -07:00
Joe Li
122057bac5 fix: pass if table is already removed on upgrade (#30017)
(cherry picked from commit c929f5ed7a)
2024-09-03 16:20:58 -07:00
hao-zhuventures
997cd60d43 fix: use StrEnum type for GuestTokenResourceType to fix token parsing (#30042)
(cherry picked from commit e2c4435cab)
2024-09-03 10:42:07 -07:00
Michael S. Molina
c57f47ddce fix: When hovering Drill By the dashboard is scrolled to the top (#30073)
(cherry picked from commit 548d543efe)
2024-09-03 10:41:18 -07:00
Michael S. Molina
b4068f1fca fix: Retrieving Slack channels when Slack is disabled (#30074)
(cherry picked from commit 72a520fba4)
2024-09-03 10:40:39 -07:00
Antonio Rivero
e7b136b822 fix(migrations): Fix the time comparison migration (#30029)
(cherry picked from commit d80f23ed94)
2024-08-27 14:36:26 -07:00
Michael S. Molina
86ca2b3d08 fix: Partition calls from Jinja context (#30019)
(cherry picked from commit 07a90ad4fe)
2024-08-27 10:44:33 -07:00
Michael S. Molina
36b229cd18 fix: Dashboard list row height does not match other lists (#30025)
(cherry picked from commit 2afb66d68d)
2024-08-27 10:44:12 -07:00
Ville Brofeldt
fff9f874b1 fix(user-dao): return user model instances (#30020)
(cherry picked from commit fcf0450294)
2024-08-27 10:43:41 -07:00
Jack
7dc65072c0 fix(screenshots): dashboard screenshots do not capture filter state (#29989)
(cherry picked from commit 7db34b994e)
2024-08-27 10:42:09 -07:00
squalou
5411d40a7a fix: set columns numeric datatypes when exporting to excel (#27229)
Co-authored-by: Elizabeth Thompson <eschutho@gmail.com>
(cherry picked from commit ce72a0ac27)
2024-08-27 10:40:55 -07:00
Ville Brofeldt
a7eb28ddd4 fix(trino): handle missing db in migration (#29997)
(cherry picked from commit 17eecb1981)
2024-08-27 10:39:29 -07:00
Beto Dealmeida
d488c78472 chore: improve mask/unmask encrypted_extra (#29943)
(cherry picked from commit 4b59e42d3f)
2024-08-27 10:38:37 -07:00
Hugh A. Miles II
fe33689917 fix: Gamma users shouldn't be able to create roles (#29687)
(cherry picked from commit 7650c47e72)
2024-08-27 10:37:07 -07:00
Michael S. Molina
b0a2aea760 fix: Security manager incorrect calls (#29884)
(cherry picked from commit d497dcad41)
2024-08-23 10:03:16 -07:00
Joe Li
8f93ad7068 chore: Adds 4.1.0 RC2 data to CHANGELOG.md 2024-08-22 11:49:10 -07:00
Michael S. Molina
cced1c5a4e fix: Duplicated example dataset (#29993)
(cherry picked from commit eb2d69a5e6)
2024-08-22 11:03:00 -07:00
Daniel Vaz Gaspar
c332eebc37 fix: trino thread app missing full context (#29981)
(cherry picked from commit 4d821f44ae)
2024-08-22 11:01:32 -07:00
JUST.in DO IT
106d755931 fix(sqllab): flaky json explore modal due to shallow equality checks for extra data (#29978)
(cherry picked from commit 1ca5947a7d)
2024-08-22 11:01:09 -07:00
Đỗ Trọng Hải
ef31710c2b fix(ci): remove unused "type: ignore" comment to unblock precommit check in CI (#29830)
Signed-off-by: hainenber <dotronghai96@gmail.com>
(cherry picked from commit 71786dba64)
2024-08-21 10:19:17 -07:00
JUST.in DO IT
6a5c293a04 fix(sqllab): Add abort call on query refresh timeout (#29956)
(cherry picked from commit 6e1ef193dd)
2024-08-19 11:02:18 -07:00
Joe Li
86bfb2ade6 fix: try to prevent deadlocks when running upgrade (#29625) 2024-08-19 11:00:43 -07:00
Michael S. Molina
f8ed0cec74 chore: Allow auto pruning of the query table (#29936) 2024-08-19 09:59:09 -07:00
Michael S. Molina
b70c5e1d9d fix: upgrade_catalog_perms and downgrade_catalog_perms implementation (#29860)
(cherry picked from commit e8f5d7680f)
2024-08-16 09:03:25 -07:00
Vitor Avila
f4b201857e fix(embedded): Remove CSRF requirement for dashboard download API (#29953)
(cherry picked from commit 47715c39d0)
2024-08-16 09:02:59 -07:00
JUST.in DO IT
16385322db fix(explore): missing column autocomplete in custom SQL (#29672)
(cherry picked from commit 3c971455e7)
2024-08-14 18:05:26 -07:00
Beto Dealmeida
9677fa97ff fix: handle empty catalog when DB supports them (#29840)
(cherry picked from commit 39209c2b40)
2024-08-13 13:23:17 -07:00
Markus Eriksson
16295b086a fix: Add user filtering to changed_by. Fixes #27986 (#29287)
Co-authored-by: Markus Eriksson <markus.eriksson@sinch.com>
(cherry picked from commit 922128f6e0)
2024-08-12 17:04:53 -07:00
Joe Li
afe580bb8a fix: add imports back to celery file (#29921)
(cherry picked from commit 9f5eb899e8)
2024-08-12 17:04:05 -07:00
Michael S. Molina
d102b45692 fix: Error when downgrading add_catalog_perm_to_tables migration (#29906)
(cherry picked from commit fb7f50868d)
2024-08-12 17:03:49 -07:00
Geido
c0c6486e70 fix(Embedded): Deleting Embedded Dashboards does not commit the transaction (#29894)
(cherry picked from commit b323bf0fb6)
2024-08-12 17:03:32 -07:00
Michael S. Molina
a2d8590f0a chore: Logs the duration of migrations execution (#29893)
(cherry picked from commit 57a4199f52)
2024-08-12 17:03:18 -07:00
Maxime Beauchemin
bfb6ff3394 fix: update celery config imports (#29862)
(cherry picked from commit 9fed576cb4)
2024-08-12 17:02:59 -07:00
Elizabeth Thompson
8ea94916d9 fix: load slack channels earlier (#29846)
(cherry picked from commit 0c3aa7d8fe)
2024-08-12 17:02:38 -07:00
Elizabeth Thompson
642de0ad63 fix: bump packages to unblock ci (#29805)
(cherry picked from commit 2cbd945692)
2024-08-12 17:01:37 -07:00
Beto Dealmeida
6954db023c fix: create permissions on DB import (#29802)
(cherry picked from commit 61c0970968)
2024-08-12 16:55:26 -07:00
Michael S. Molina
eca7c57083 fix: Downgrade of revision 678eefb4ab44 throws error (#29799)
(cherry picked from commit 249f5ec31a)
2024-08-12 16:55:08 -07:00
Beto Dealmeida
4dca9bceed fix: catalog upgrade/downgrade (#29780)
(cherry picked from commit 525e837c5b)
2024-08-12 16:54:51 -07:00
Geido
7219310267 fix(Dashboard): Copying a Dashboard does not commit the transaction (#29776)
(cherry picked from commit 4c52ecc4d8)
2024-08-12 16:54:23 -07:00
Elizabeth Thompson
77ade18107 fix: pass slack recipients correctly (#29721)
(cherry picked from commit 57e8cd2ba2)
2024-08-12 16:53:57 -07:00
Geido
bca2366d5a fix(Database): Refresh catalogs on db update returns database error (#29681)
(cherry picked from commit 134ca38b8d)
2024-08-12 16:52:59 -07:00
Joe Li
de2eedd16f chore: Add the 4.1 release notes (#29262)
(cherry picked from commit 422aa6b657)
2024-08-12 16:50:24 -07:00
Geido
0f1663b2ec refactor(ProgressBar): Upgrade ProgressBar to Antd 5 (#29666)
(cherry picked from commit 3de2b7c989)
2024-08-12 16:47:58 -07:00
Kamil Gabryjelski
604fe27ed1 fix: Use default custom time range time without timezone (#29669)
(cherry picked from commit cd713a239e)
2024-08-12 16:47:33 -07:00
Kamil Gabryjelski
3d7f6dae90 fix: Dashboard editable title weird behavior when adding spaces (#29667)
(cherry picked from commit 453e6deb97)
2024-08-12 16:47:02 -07:00
nsivarajan
a8c6bb5b52 feat(alert/report): Added optional CC and BCC fields for email notifi… (#29088)
Co-authored-by: Sivarajan Narayanan <sivarajannarayanan@Sivarajans-MacBook-Pro.local>
Co-authored-by: Sivarajan Narayanan <narayanan_sivarajan@apple.com>
(cherry picked from commit 27dde2a811)
2024-08-12 16:46:46 -07:00
Jaswanth-Sriram-Veturi
30fbfa1b14 docs: update creating-your-first-dashboard.mdx (#29631)
(cherry picked from commit 2a9a1d3194)
2024-08-12 16:46:14 -07:00
Michael S. Molina
3e297d130e fix: Layout of native filters modal with lengthy columns (#29648)
(cherry picked from commit be833dce4f)
2024-08-12 16:45:54 -07:00
Michael S. Molina
dc754e2d26 fix: Loading of native filter column (#29647)
(cherry picked from commit 92537f1fd5)
2024-08-12 16:45:35 -07:00
Beto Dealmeida
f59fb6f780 chore: add catalog_access to OBJECT_SPEC_PERMISSIONS (#29650)
(cherry picked from commit ae0edbfdce)
2024-08-12 16:45:18 -07:00
Michael S. Molina
fea187a36a fix: Required native filter message wrongfully appearing (#29643)
(cherry picked from commit 9487d6c9d6)
2024-08-12 16:45:04 -07:00
JUST.in DO IT
a9ba3b325f fix(sqllab): prev schema/table options remained on fail (#29638)
(cherry picked from commit 5539f87912)
2024-08-12 16:44:24 -07:00
Michael S. Molina
c8008e6225 refactor: Remove dead code from the Word Cloud plugin (#29594)
(cherry picked from commit 85b66946ed)
2024-08-12 16:43:46 -07:00
Joe Li
4369967732 chore: Adds 4.1.0 RC1 data to CHANGELOG.md and UPDATING.md (#29637) 2024-07-22 18:07:56 -07:00
1014 changed files with 65373 additions and 55843 deletions


@@ -70,9 +70,8 @@ github:
- cypress-matrix (4, chrome)
- cypress-matrix (5, chrome)
- frontend-build
- pre-commit (current)
- pre-commit (next)
- pre-commit (previous)
- pre-commit
- python-lint
- test-mysql
- test-postgres (current)
- test-postgres (next)

.github/CODEOWNERS vendored

@@ -2,7 +2,7 @@
# https://github.com/apache/superset/issues/13351
/superset/migrations/ @mistercrunch @michael-s-molina @betodealmeida @eschutho
/superset/migrations/ @apache/superset-committers
# Notify some committers of changes in the components
@@ -12,7 +12,7 @@
# Notify Helm Chart maintainers about changes in it
/helm/superset/ @craig-rueda @dpgaspar @villebro @nytai @michael-s-molina
/helm/superset/ @craig-rueda @dpgaspar @villebro
# Notify E2E test maintainers of changes
@@ -22,7 +22,7 @@
/.github/ @villebro @geido @eschutho @rusackas @betodealmeida @nytai @mistercrunch @craig-rueda @john-bodley @kgabryje @dpgaspar
# Notify PMC members of changes to required GitHub Actions
# Notify PMC members of changes to required Github Actions
/.asf.yaml @villebro @geido @eschutho @rusackas @betodealmeida @nytai @mistercrunch @craig-rueda @john-bodley @kgabryje @dpgaspar


@@ -15,9 +15,14 @@ body:
id: bug-description
attributes:
label: Bug description
description: A clear description of what the bug is, including reproduction steps and expected behavior.
description: A clear and concise description of what the bug is.
validations:
required: true
- type: textarea
id: repro-steps
attributes:
label: How to reproduce the bug
placeholder: |
The bug is that...
1. Go to '...'
2. Click on '....'
3. Scroll down to '....'
@@ -41,7 +46,7 @@ body:
label: Superset version
options:
- master / latest-dev
- "4.1.0"
- "4.0.2"
- "3.1.3"
validations:
required: true


@@ -8,9 +8,8 @@ updates:
- package-ecosystem: "npm"
ignore:
# not until React >= 18.0.0
- dependency-name: "storybook"
- dependency-name: "@storybook*"
# not until node >= 18.12.0
- dependency-name: "css-minimizer-webpack-plugin"
directory: "/superset-frontend/"
schedule:
interval: "monthly"


@@ -165,7 +165,7 @@ cypress-run-all() {
# UNCOMMENT the next few commands to monitor memory usage
# monitor_memory & # Start memory monitoring in the background
# memoryMonitorPid=$!
python ../../scripts/cypress_run.py --parallelism $PARALLELISM --parallelism-id $PARALLEL_ID --group $PARALLEL_ID --retries 5 $USE_DASHBOARD_FLAG
python ../../scripts/cypress_run.py --parallelism $PARALLELISM --parallelism-id $PARALLEL_ID --retries 5 $USE_DASHBOARD_FLAG
# kill $memoryMonitorPid
# After job is done, print out Flask log for debugging


@@ -31,7 +31,7 @@ jobs:
- uses: actions/checkout@v4
- uses: actions/setup-node@v4
with:
node-version: "20"
node-version: "18"
registry-url: 'https://registry.npmjs.org'
- run: npm ci
- run: npm run ci:release


@@ -21,7 +21,7 @@ jobs:
- uses: actions/checkout@v4
- uses: actions/setup-node@v4
with:
node-version: "20"
node-version: "18"
registry-url: 'https://registry.npmjs.org'
- run: npm ci
- run: npm test


@@ -233,7 +233,7 @@ jobs:
- name: Deploy Amazon ECS task definition
id: deploy-task
uses: aws-actions/amazon-ecs-deploy-task-definition@v2
uses: aws-actions/amazon-ecs-deploy-task-definition@v1
with:
task-definition: ${{ steps.task-def.outputs.task-definition }}
service: pr-${{ github.event.issue.number }}-service


@@ -19,7 +19,7 @@ jobs:
- name: Set up Node.js
uses: actions/setup-node@v4
with:
node-version: '20'
node-version: '18'
- name: Install Dependencies
run: npm install -g @action-validator/core @action-validator/cli --save-dev


@@ -16,9 +16,6 @@ concurrency:
jobs:
pre-commit:
runs-on: ubuntu-22.04
strategy:
matrix:
python-version: ["current", "next", "previous"]
steps:
- name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
uses: actions/checkout@v4
@@ -27,8 +24,6 @@ jobs:
submodules: recursive
- name: Setup Python
uses: ./.github/actions/setup-backend/
with:
python-version: ${{ matrix.python-version }}
- name: Enable brew and helm-docs
# Add brew to the path - see https://github.com/actions/runner-images/issues/6283
run: |
@@ -40,11 +35,8 @@ jobs:
brew install norwoodj/tap/helm-docs
- name: pre-commit
run: |
set +e # Don't exit immediately on failure
pre-commit run --all-files
if [ $? -ne 0 ] || ! git diff --quiet --exit-code; then
echo "❌ Pre-commit check failed."
echo "🚒 To prevent/address this CI issue, please install/use pre-commit locally."
echo "📖 More details here: https://superset.apache.org/docs/contributing/development#git-hooks"
if ! pre-commit run --all-files; then
git status
git diff
exit 1
fi


@@ -29,7 +29,7 @@ jobs:
strategy:
matrix:
node-version: [20]
node-version: [18]
steps:
- uses: actions/checkout@v4


@@ -26,7 +26,7 @@ jobs:
fail-fast: false
matrix:
browser: ["chrome"]
node: [20]
node: [18]
env:
SUPERSET_ENV: development
SUPERSET_CONFIG: tests.integration_tests.superset_test_config


@@ -30,7 +30,7 @@ jobs:
runs-on: ubuntu-22.04
strategy:
matrix:
node: [20]
node: [18]
steps:
- name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
uses: actions/checkout@v4


@@ -4,7 +4,6 @@ on:
pull_request:
paths:
- "docs/**"
- ".github/workflows/superset-docs-verify.yml"
types: [synchronize, opened, reopened, ready_for_review]
# cancel previous workflow jobs for PRs
@@ -14,41 +13,15 @@ concurrency:
jobs:
linkinator:
# See docs here: https://github.com/marketplace/actions/linkinator
name: Link Checking
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
# Do not bump this linkinator-action version without opening
# an ASF Infra ticket to allow the new version first!
- uses: JustinBeckwith/linkinator-action@v1.11.0
continue-on-error: true # This will make the job advisory (non-blocking, no red X)
- uses: JustinBeckwith/linkinator-action@v1.10.4
with:
paths: "**/*.md, **/*.mdx"
linksToSkip: >-
^https://github.com/apache/(superset|incubator-superset)/(pull|issue)/\d+,
http://localhost:8088/,
docker/.env-non-dev,
http://127.0.0.1:3000/,
http://localhost:9001/,
https://charts.bitnami.com/bitnami,
https://www.li.me/,
https://www.fanatics.com/,
https://tails.com/gb/,
https://www.techaudit.info/,
https://avetilearning.com/,
https://www.udemy.com/,
https://trustmedis.com/,
http://theiconic.com.au/,
https://dev.mysql.com/doc/refman/5.7/en/innodb-limits.html,
^https://img\.shields\.io/.*,
https://vkusvill.ru/
https://www.linkedin.com/in/mark-thomas-b16751158/
https://theiconic.com.au/
https://wattbewerb.de/
https://timbr.ai/
https://opensource.org/license/apache-2-0
https://www.plaidcloud.com/
linksToSkip: '^https://github.com/apache/(superset|incubator-superset)/(pull|issue)/\d+, http://localhost:8088/, docker/.env-non-dev, http://127.0.0.1:3000/, http://localhost:9001/, https://charts.bitnami.com/bitnami, https://www.li.me/, https://www.fanatics.com/, https://tails.com/gb/, https://www.techaudit.info/, https://avetilearning.com/, https://www.udemy.com/, https://trustmedis.com/, http://theiconic.com.au/, https://dev.mysql.com/doc/refman/5.7/en/innodb-limits.html, https://img.shields.io/librariesio/release/npm/%40superset-ui%2Fembedded-sdk?style=flat, https://img.shields.io/librariesio/release/npm/%40superset-ui%2Fplugin-chart-pivot-table?style=flat, https://vkusvill.ru/'
# verbosity: 'ERROR'
build-deploy:
name: Build & Deploy
runs-on: ubuntu-22.04


@@ -48,8 +48,7 @@ jobs:
PYTHONPATH: ${{ github.workspace }}
REDIS_PORT: 16379
GITHUB_TOKEN: ${{ github.token }}
# use the dashboard feature when running manually OR merging to master
USE_DASHBOARD: ${{ github.event.inputs.use_dashboard == 'true'|| (github.ref == 'refs/heads/master' && 'true') || 'false' }}
USE_DASHBOARD: ${{ github.event.inputs.use_dashboard || (github.ref == 'refs/heads/master' && 'true') || 'false' }}
services:
postgres:
image: postgres:15-alpine
@@ -108,7 +107,7 @@ jobs:
if: steps.check.outputs.python || steps.check.outputs.frontend
uses: actions/setup-node@v4
with:
node-version: "20"
node-version: "18"
- name: Install npm dependencies
if: steps.check.outputs.python || steps.check.outputs.frontend
uses: ./.github/actions/cached-dependencies
@@ -132,7 +131,6 @@ jobs:
PARALLEL_ID: ${{ matrix.parallel_id }}
PARALLELISM: 6
CYPRESS_RECORD_KEY: ${{ secrets.CYPRESS_RECORD_KEY }}
NODE_OPTIONS: "--max-old-space-size=4096"
with:
run: cypress-run-all ${{ env.USE_DASHBOARD }}
- name: Upload Artifacts


@@ -33,7 +33,7 @@ jobs:
if: steps.check.outputs.frontend
uses: actions/setup-node@v4
with:
node-version: "20"
node-version: "18"
- name: Install dependencies
if: steps.check.outputs.frontend
uses: ./.github/actions/cached-dependencies
@@ -49,6 +49,11 @@ jobs:
working-directory: ./superset-frontend
run: |
npm run type
- name: prettier
if: steps.check.outputs.frontend
working-directory: ./superset-frontend
run: |
npm run prettier-check
- name: Build plugins packages
if: steps.check.outputs.frontend
working-directory: ./superset-frontend


@@ -36,7 +36,7 @@ jobs:
run: helm repo add bitnami https://charts.bitnami.com/bitnami
- name: Run chart-releaser
uses: ./.github/actions/chart-releaser-action
uses: helm/chart-releaser-action@v1.6.0
with:
charts_dir: helm
mark_as_latest: false


@@ -0,0 +1,53 @@
# Python Misc unit tests
name: Python Misc
on:
push:
branches:
- "master"
- "[0-9].[0-9]*"
pull_request:
types: [synchronize, opened, reopened, ready_for_review]
# cancel previous workflow jobs for PRs
concurrency:
group: ${{ github.workflow }}-${{ github.event.pull_request.number || github.run_id }}
cancel-in-progress: true
jobs:
python-lint:
runs-on: ubuntu-22.04
steps:
- name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
uses: actions/checkout@v4
with:
persist-credentials: false
submodules: recursive
- name: Check for file changes
id: check
uses: ./.github/actions/change-detector/
with:
token: ${{ secrets.GITHUB_TOKEN }}
- name: Setup Python
uses: ./.github/actions/setup-backend/
if: steps.check.outputs.python
babel-extract:
runs-on: ubuntu-22.04
steps:
- name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
uses: actions/checkout@v4
with:
persist-credentials: false
submodules: recursive
- name: Check for file changes
id: check
uses: ./.github/actions/change-detector/
with:
token: ${{ secrets.GITHUB_TOKEN }}
- name: Setup Python
if: steps.check.outputs.python
uses: ./.github/actions/setup-backend/
- name: Test babel extraction
if: steps.check.outputs.python
run: scripts/translations/babel_update.sh


@@ -54,13 +54,10 @@ jobs:
- name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
uses: actions/checkout@v4
with:
persist-credentials: false
tags: true
fetch-depth: 0
- name: Use Node.js 20
uses: actions/setup-node@v4
with:
node-version: 20
- name: Setup supersetbot
uses: ./.github/actions/setup-supersetbot/
@@ -97,38 +94,16 @@ jobs:
--platform "linux/arm64" \
--platform "linux/amd64"
# Returning to master to support closing setup-supersetbot
git checkout master
update-prs-with-release-info:
needs: config
if: needs.config.outputs.has-secrets
runs-on: ubuntu-22.04
permissions:
contents: read
pull-requests: write
steps:
# Going back on original branch to allow "post" GHA operations
- name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
uses: actions/checkout@v4
with:
fetch-depth: 0
- name: Use Node.js 20
uses: actions/setup-node@v4
with:
node-version: 20
- name: Setup supersetbot
uses: ./.github/actions/setup-supersetbot/
persist-credentials: false
- name: Label the PRs with the right release-related labels
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
run: |
export GITHUB_ACTOR=""
git fetch --all --tags
git checkout master
RELEASE="${{ github.event.release.tag_name }}"
if [ "${{ github.event_name }}" = "workflow_dispatch" ]; then
# in the case of a manually-triggered run, read release from input


@@ -32,7 +32,7 @@ jobs:
- name: Set up Node.js
uses: actions/setup-node@v4
with:
node-version: '20'
node-version: '18'
- name: Install Dependencies
run: npm install

.gitignore vendored

@@ -104,6 +104,7 @@ ghostdriver.log
testCSV.csv
.terser-plugin-cache/
apache-superset-*.tar.gz*
apache_superset-*.tar.gz*
release.json
# Translation-related files
@@ -121,4 +122,3 @@ docker/*local*
# Jest test report
test-report.html
superset/static/stats/statistics.html


@@ -53,14 +53,11 @@ repos:
- id: debug-statements
- id: end-of-file-fixer
- id: trailing-whitespace
exclude: ^.*\.(snap)
args: ["--markdown-linebreak-ext=md"]
- repo: https://github.com/pre-commit/mirrors-prettier
rev: v3.1.0 # Use the sha or tag you want to point at
hooks:
- id: prettier
additional_dependencies:
- prettier@3.3.3
args: ["--ignore-path=./superset-frontend/.prettierignore"]
files: "superset-frontend"
# blacklist unsafe functions like make_url (see #19526)


@@ -61,9 +61,6 @@ tsconfig.tsbuildinfo
generator-superset/*
temporary_superset_ui/*
# skip license checks for auto-generated test snapshots
.*snap
# docs overrides for third party logos we don't have the rights to
google-big-query.svg
google-sheets.svg

CHANGELOG/4.1.1.md Normal file

@@ -0,0 +1,50 @@
<!--
Licensed to the Apache Software Foundation (ASF) under one
or more contributor license agreements. See the NOTICE file
distributed with this work for additional information
regarding copyright ownership. The ASF licenses this file
to you under the Apache License, Version 2.0 (the
"License"); you may not use this file except in compliance
with the License. You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing,
software distributed under the License is distributed on an
"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
KIND, either express or implied. See the License for the
specific language governing permissions and limitations
under the License.
-->
## Change Log
### 4.1.1 (Fri Nov 15 22:13:57 2024 +0530)
**Database Migrations**
**Features**
**Fixes**
- [#30886](https://github.com/apache/superset/pull/30886) fix: blocks UI elements on right side (@samarsrivastav)
- [#30859](https://github.com/apache/superset/pull/30859) fix(package.json): Pin luxon version to unblock master (@geido)
- [#30588](https://github.com/apache/superset/pull/30588) fix(explore): column data type tooltip format (@mistercrunch)
- [#29911](https://github.com/apache/superset/pull/29911) fix: Rename database from 'couchbasedb' to 'couchbase' in documentation and db_engine_specs (@ayush-couchbase)
- [#30828](https://github.com/apache/superset/pull/30828) fix(TimezoneSelector): Failing unit tests due to timezone change (@geido)
- [#30875](https://github.com/apache/superset/pull/30875) fix: don't show metadata for embedded dashboards (@sadpandajoe)
- [#30851](https://github.com/apache/superset/pull/30851) fix: Graph chart colors (@michael-s-molina)
- [#29867](https://github.com/apache/superset/pull/29867) fix(capitalization): Capitalizing a button. (@rusackas)
- [#29782](https://github.com/apache/superset/pull/29782) fix(translations): Translate embedded errors (@rusackas)
- [#29772](https://github.com/apache/superset/pull/29772) fix: Fixing incomplete string escaping. (@rusackas)
- [#29725](https://github.com/apache/superset/pull/29725) fix(frontend/docker, ci): fix borked Docker build due to Lerna v8 uplift (@hainenber)
**Others**
- [#30576](https://github.com/apache/superset/pull/30576) chore: add link to Superset when report error (@eschutho)
- [#29786](https://github.com/apache/superset/pull/29786) refactor(Slider): Upgrade Slider to Antd 5 (@geido)
- [#29674](https://github.com/apache/superset/pull/29674) refactor(ChartCreation): Migrate tests to RTL (@rtexelm)
- [#29843](https://github.com/apache/superset/pull/29843) refactor(controls): Migrate AdhocMetricOption.test to RTL (@rtexelm)
- [#29845](https://github.com/apache/superset/pull/29845) refactor(controls): Migrate MetricDefinitionValue.test to RTL (@rtexelm)
- [#28424](https://github.com/apache/superset/pull/28424) docs: Check markdown files for bad links using linkinator (@rusackas)
- [#29768](https://github.com/apache/superset/pull/29768) docs(contributing): fix broken link to translations sub-section (@sfirke)

CHANGELOG/4.1.2.md Normal file

@@ -0,0 +1,83 @@
<!--
Licensed to the Apache Software Foundation (ASF) under one
or more contributor license agreements. See the NOTICE file
distributed with this work for additional information
regarding copyright ownership. The ASF licenses this file
to you under the Apache License, Version 2.0 (the
"License"); you may not use this file except in compliance
with the License. You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing,
software distributed under the License is distributed on an
"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
KIND, either express or implied. See the License for the
specific language governing permissions and limitations
under the License.
-->
## Change Log
### 4.1.2 (Fri Mar 7 13:28:05 2025 -0800)
**Database Migrations**
- [#32538](https://github.com/apache/superset/pull/32538) fix(migrations): Handle comparator None in old time comparison migration (@Antonio-RiveroMartnez)
- [#32155](https://github.com/apache/superset/pull/32155) fix(migrations): Handle no params in time comparison migration (@Antonio-RiveroMartnez)
- [#31185](https://github.com/apache/superset/pull/31185) fix: check for column before adding in migrations (@betodealmeida)
**Features**
- [#29974](https://github.com/apache/superset/pull/29974) feat(sqllab): Adds refresh button to table metadata in SQL Lab (@Usiel)
**Fixes**
- [#32515](https://github.com/apache/superset/pull/32515) fix(sqllab): Allow clear on schema and catalog (@justinpark)
- [#32500](https://github.com/apache/superset/pull/32500) fix: dashboard, chart and dataset import validation (@dpgaspar)
- [#31353](https://github.com/apache/superset/pull/31353) fix(sqllab): duplicate error message (@betodealmeida)
- [#31407](https://github.com/apache/superset/pull/31407) fix: Big Number side cut fixed (@fardin-developer)
- [#31480](https://github.com/apache/superset/pull/31480) fix(sunburst): Use metric label from verbose map (@gerbermichi)
- [#31427](https://github.com/apache/superset/pull/31427) fix(tags): clean up bulk create api and schema (@villebro)
- [#31334](https://github.com/apache/superset/pull/31334) fix(docs): add custom editUrl path for intro page (@dwgrossberg)
- [#31353](https://github.com/apache/superset/pull/31353) fix(sqllab): duplicate error message (@betodealmeida)
- [#31323](https://github.com/apache/superset/pull/31323) fix: Use clickhouse sqlglot dialect for YDB (@vgvoleg)
- [#31198](https://github.com/apache/superset/pull/31198) fix: add more clickhouse disallowed functions on config (@dpgaspar)
- [#31194](https://github.com/apache/superset/pull/31194) fix(embedded): Hide anchor links in embedded mode (@Vitor-Avila)
- [#31960](https://github.com/apache/superset/pull/31960) fix(sqllab): Missing allowHTML props in ResultTableExtension (@justinpark)
- [#31332](https://github.com/apache/superset/pull/31332) fix: prevent multiple pvm errors on migration (@eschutho)
- [#31437](https://github.com/apache/superset/pull/31437) fix(database import): Gracefully handle error to get catalog schemas (@Vitor-Avila)
- [#31173](https://github.com/apache/superset/pull/31173) fix: cache-warmup fails (@nsivarajan)
- [#30442](https://github.com/apache/superset/pull/30442) fix(fe/src/dashboard): optional chaining for possibly nullable parent attribute in LayoutItem type (@hainenber)
- [#31639](https://github.com/apache/superset/pull/31639) fix(sqllab): unable to update saved queries (@DamianPendrak)
- [#29898](https://github.com/apache/superset/pull/29898) fix: parse pandas pivot null values (@eschutho)
- [#31414](https://github.com/apache/superset/pull/31414) fix(Pivot Table): Fix column width to respect currency config (@Vitor-Avila)
- [#31335](https://github.com/apache/superset/pull/31335) fix(histogram): axis margin padding consistent with other graphs (@tatiana-cherne)
- [#31301](https://github.com/apache/superset/pull/31301) fix(AllEntitiesTable): show Tags (@alexandrusoare)
- [#31329](https://github.com/apache/superset/pull/31329) fix: pass string to `process_template` (@betodealmeida)
- [#31341](https://github.com/apache/superset/pull/31341) fix(pinot): remove query aliases from SELECT and ORDER BY clauses in Pinot (@yuribogomolov)
- [#31308](https://github.com/apache/superset/pull/31308) fix: annotations on horizontal bar chart (@DamianPendrak)
- [#31294](https://github.com/apache/superset/pull/31294) fix(sqllab): Remove update_saved_query_exec_info to reduce lag (@justinpark)
- [#30897](https://github.com/apache/superset/pull/30897) fix: Exception handling for SQL Lab views (@michael-s-molina)
- [#31199](https://github.com/apache/superset/pull/31199) fix(Databricks): Escape catalog and schema names in pre-queries (@Vitor-Avila)
- [#31265](https://github.com/apache/superset/pull/31265) fix(trino): db session error in handle cursor (@justinpark)
- [#31024](https://github.com/apache/superset/pull/31024) fix(dataset): use sqlglot for DML check (@betodealmeida)
- [#29885](https://github.com/apache/superset/pull/29885) fix: add mutator to get_columns_description (@eschutho)
- [#30821](https://github.com/apache/superset/pull/30821) fix: x axis title disappears when editing bar chart (@DamianPendrak)
- [#31181](https://github.com/apache/superset/pull/31181) fix: Time-series Line Chart Display unnecessary total (@michael-s-molina)
- [#31163](https://github.com/apache/superset/pull/31163) fix(Dashboard): Backward compatible shared_label_colors field (@geido)
- [#31156](https://github.com/apache/superset/pull/31156) fix: check orderby (@betodealmeida)
- [#31154](https://github.com/apache/superset/pull/31154) fix: Remove unwanted commit on Trino's handle_cursor (@michael-s-molina)
- [#31151](https://github.com/apache/superset/pull/31151) fix: Revert "feat(trino): Add functionality to upload data (#29164)" (@michael-s-molina)
- [#31031](https://github.com/apache/superset/pull/31031) fix(Dashboard): Ensure shared label colors are updated (@geido)
- [#30967](https://github.com/apache/superset/pull/30967) fix(release validation): scripts now support RSA and EDDSA keys. (@rusackas)
- [#30881](https://github.com/apache/superset/pull/30881) fix(Dashboard): Native & Cross-Filters Scoping Performance (@geido)
- [#30887](https://github.com/apache/superset/pull/30887) fix(imports): import query_context for imports with charts (@lindenh)
- [#31008](https://github.com/apache/superset/pull/31008) fix(explore): verified props is not updated (@justinpark)
- [#30646](https://github.com/apache/superset/pull/30646) fix(Dashboard): Retain colors when color scheme not set (@geido)
- [#30962](https://github.com/apache/superset/pull/30962) fix(Dashboard): Exclude edit param in async screenshot (@geido)
**Others**
- [#32043](https://github.com/apache/superset/pull/32043) chore: Skip the creation of secondary perms during catalog migrations (@Vitor-Avila)
- [#30865](https://github.com/apache/superset/pull/30865) docs: Updating 4.1 Release Notes (@yousoph)

CHANGELOG/4.1.3.md Normal file

@@ -0,0 +1,58 @@
<!--
Licensed to the Apache Software Foundation (ASF) under one
or more contributor license agreements. See the NOTICE file
distributed with this work for additional information
regarding copyright ownership. The ASF licenses this file
to you under the Apache License, Version 2.0 (the
"License"); you may not use this file except in compliance
with the License. You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing,
software distributed under the License is distributed on an
"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
KIND, either express or implied. See the License for the
specific language governing permissions and limitations
under the License.
-->
## Change Log
### 4.1.3 (Thu May 29 02:31:07 2025 -0500)
**Database Migrations**
**Features**
**Fixes**
- [#33522](https://github.com/apache/superset/pull/33522) fix(Sqllab): Autocomplete got stuck in UI when open it too fast (@rebenitez1802)
- [#33425](https://github.com/apache/superset/pull/33425) fix(table-chart): time shift is not working (@justinpark)
- [#32414](https://github.com/apache/superset/pull/32414) fix(api): Added uuid to list api calls (@withnale)
- [#33354](https://github.com/apache/superset/pull/33354) fix: loading examples from raw.githubusercontent.com fails with 429 errors (@mistercrunch)
- [#32382](https://github.com/apache/superset/pull/32382) fix(pinot): revert join and subquery flags (@yuribogomolov)
- [#32473](https://github.com/apache/superset/pull/32473) fix(plugin-chart-echarts): remove erroneous upper bound value (@villebro)
- [#33048](https://github.com/apache/superset/pull/33048) fix: improve error type on parse error (@justinpark)
- [#32968](https://github.com/apache/superset/pull/32968) fix(pivot-table): Revert "fix(Pivot Table): Fix column width to respect currency config (#31414)" (@justinpark)
- [#32795](https://github.com/apache/superset/pull/32795) fix(log): store navigation path to get correct logging path (@justinpark)
- [#33216](https://github.com/apache/superset/pull/33216) fix: Downgrade to marshmallow<4 (@amotl)
- [#32866](https://github.com/apache/superset/pull/32866) fix: make packages PEP 625 compliant (@sadpandajoe)
- [#32035](https://github.com/apache/superset/pull/32035) fix(fe/dashboard-list): display modifier info for `Last modified` data (@hainenber)
- [#32708](https://github.com/apache/superset/pull/32708) fix(logging): missing path in event data (@justinpark)
- [#32699](https://github.com/apache/superset/pull/32699) fix: Signature of Celery pruner jobs (@michael-s-molina)
- [#32681](https://github.com/apache/superset/pull/32681) fix(log): Update recent_activity by event name (@justinpark)
- [#32608](https://github.com/apache/superset/pull/32608) fix(welcome): perf on distinct recent activities (@justinpark)
- [#32572](https://github.com/apache/superset/pull/32572) fix: Log table retention policy (@michael-s-molina)
- [#32406](https://github.com/apache/superset/pull/32406) fix(model/helper): represent RLS filter clause in proper textual SQL string (@hainenber)
- [#32240](https://github.com/apache/superset/pull/32240) fix: upgrade to 3.11.11-slim-bookworm to address critical vulnerabilities (@gpchandran)
- [#30858](https://github.com/apache/superset/pull/30858) fix(chart data): removing query from /chart/data payload when accessing as guest user (@fisjac)
**Others**
- [#33612](https://github.com/apache/superset/pull/33612) chore: update Dockerfile - Upgrade to 3.11.12 (@gpchandran)
- [#33435](https://github.com/apache/superset/pull/33435) docs: CVEs fixed on 4.1.2 (@sha174n)
- [#33339](https://github.com/apache/superset/pull/33339) chore(🦾): bump python h11 0.14.0 -> 0.16.0 (@github-actions[bot])
- [#32745](https://github.com/apache/superset/pull/32745) chore(🦾): bump python sqlglot 26.1.3 -> 26.11.1 (@github-actions[bot])
- [#32782](https://github.com/apache/superset/pull/32782) chore: Revert "chore: bump base image in Dockerfile with `ARG PY_VER=3.11.11-slim-bookworm`" (@sadpandajoe)
- [#32780](https://github.com/apache/superset/pull/32780) chore: bump base image in Dockerfile with `ARG PY_VER=3.11.11-slim-bookworm` (@gpchandran)


@@ -80,9 +80,9 @@ If you believe someone is violating this code of conduct, you may reply to them
Or one of our volunteers:
* [Mark Thomas](https://www.linkedin.com/in/mark-thomas-b16751158/)
* [Joan Touzet](https://www.apache.org/foundation/conduct-team/wohali.html)
* [Sharan Foga](https://www.linkedin.com/in/sfoga/)
* [Mark Thomas](http://home.apache.org/~markt/coc.html)
* [Joan Touzet](http://home.apache.org/~wohali/)
* [Sharan Foga](http://home.apache.org/~sharan/coc.html)
If the violation is in documentation or code, for example inappropriate pronoun usage or word choice within official documentation, we ask that people report these privately to the project in question at <private@project.apache.org>, and, if they have sufficient ability within the project, to resolve or remove the concerning material, being mindful of the perspective of the person originally reporting the issue.


@@ -18,117 +18,84 @@
######################################################################
# Node stage to deal with static asset construction
######################################################################
ARG PY_VER=3.10-slim-bookworm
ARG PY_VER=3.11.12-slim-bookworm
# if BUILDPLATFORM is null, set it to 'amd64' (or leave as is otherwise).
ARG BUILDPLATFORM=${BUILDPLATFORM:-amd64}
FROM --platform=${BUILDPLATFORM} node:20-bullseye-slim AS superset-node
FROM --platform=${BUILDPLATFORM} node:18-bullseye-slim AS superset-node
ARG NPM_BUILD_CMD="build"
# Include translations in the final build. The default supports en only to
# reduce complexity and weight for those only using en
ARG BUILD_TRANSLATIONS="false"
# Used by docker-compose to skip the frontend build,
# in dev we mount the repo and build the frontend inside docker
ARG DEV_MODE="false"
# Include headless browsers? Allows for alerts, reports & thumbnails, but bloats the images
ARG INCLUDE_CHROMIUM="true"
ARG INCLUDE_FIREFOX="false"
# Somehow we need python3 + build-essential on this side of the house to install node-gyp
RUN apt-get update -qq \
&& apt-get install \
-yqq --no-install-recommends \
build-essential \
python3 \
zstd
&& apt-get install \
-yqq --no-install-recommends \
build-essential \
python3
ENV BUILD_CMD=${NPM_BUILD_CMD} \
PUPPETEER_SKIP_CHROMIUM_DOWNLOAD=true
PUPPETEER_SKIP_CHROMIUM_DOWNLOAD=true
# NPM ci first, as to NOT invalidate previous steps except for when package.json changes
RUN --mount=type=bind,target=/frontend-mem-nag.sh,src=./docker/frontend-mem-nag.sh \
/frontend-mem-nag.sh
/frontend-mem-nag.sh
WORKDIR /app/superset-frontend
# Creating empty folders to avoid errors when running COPY later on
RUN mkdir -p /app/superset/static/assets
RUN --mount=type=bind,target=./package.json,src=./superset-frontend/package.json \
--mount=type=bind,target=./package-lock.json,src=./superset-frontend/package-lock.json \
if [ "$DEV_MODE" = "false" ]; then \
npm ci; \
else \
echo "Skipping 'npm ci' in dev mode"; \
fi
--mount=type=bind,target=./package-lock.json,src=./superset-frontend/package-lock.json \
npm ci
# Runs the webpack build process
COPY superset-frontend /app/superset-frontend
RUN npm run ${BUILD_CMD}
# This copies the .po files needed for translation
RUN mkdir -p /app/superset/translations
COPY superset/translations /app/superset/translations
RUN if [ "$DEV_MODE" = "false" ]; then \
BUILD_TRANSLATIONS=$BUILD_TRANSLATIONS npm run ${BUILD_CMD}; \
else \
echo "Skipping 'npm run ${BUILD_CMD}' in dev mode"; \
fi
# Compiles .json files from the .po files, then deletes the .po files
RUN if [ "$BUILD_TRANSLATIONS" = "true" ]; then \
npm run build-translation; \
else \
echo "Skipping translations as requested by build flag"; \
fi
RUN npm run build-translation
RUN rm /app/superset/translations/*/LC_MESSAGES/*.po
RUN rm /app/superset/translations/messages.pot
FROM python:${PY_VER} AS python-base
######################################################################
# Final lean image...
######################################################################
FROM python-base AS lean
# Include translations in the final build. The default supports en only to
# reduce complexity and weight for those only using en
ARG BUILD_TRANSLATIONS="false"
FROM python:${PY_VER} AS lean
WORKDIR /app
ENV LANG=C.UTF-8 \
LC_ALL=C.UTF-8 \
SUPERSET_ENV=production \
FLASK_APP="superset.app:create_app()" \
PYTHONPATH="/app/pythonpath" \
SUPERSET_HOME="/app/superset_home" \
SUPERSET_PORT=8088
LC_ALL=C.UTF-8 \
SUPERSET_ENV=production \
FLASK_APP="superset.app:create_app()" \
PYTHONPATH="/app/pythonpath" \
SUPERSET_HOME="/app/superset_home" \
SUPERSET_PORT=8088
RUN mkdir -p ${PYTHONPATH} superset/static requirements superset-frontend apache_superset.egg-info requirements \
&& useradd --user-group -d ${SUPERSET_HOME} -m --no-log-init --shell /bin/bash superset \
&& apt-get update -qq && apt-get install -yqq --no-install-recommends \
curl \
libsasl2-dev \
libsasl2-modules-gssapi-mit \
libpq-dev \
libecpg-dev \
libldap2-dev \
&& touch superset/static/version_info.json \
&& chown -R superset:superset ./* \
&& rm -rf /var/lib/apt/lists/*
&& useradd --user-group -d ${SUPERSET_HOME} -m --no-log-init --shell /bin/bash superset \
&& apt-get update -qq && apt-get install -yqq --no-install-recommends \
curl \
default-libmysqlclient-dev \
libsasl2-dev \
libsasl2-modules-gssapi-mit \
libpq-dev \
libecpg-dev \
libldap2-dev \
&& touch superset/static/version_info.json \
&& chown -R superset:superset ./* \
&& rm -rf /var/lib/apt/lists/*
COPY --chown=superset:superset pyproject.toml setup.py MANIFEST.in README.md ./
# setup.py uses the version information in package.json
COPY --chown=superset:superset superset-frontend/package.json superset-frontend/
COPY --chown=superset:superset requirements/base.txt requirements/
COPY --chown=superset:superset scripts/check-env.py scripts/
RUN --mount=type=cache,target=/root/.cache/pip \
apt-get update -qq && apt-get install -yqq --no-install-recommends \
build-essential \
&& pip install --no-cache-dir --upgrade setuptools pip \
&& pip install --no-cache-dir -r requirements/base.txt \
&& apt-get autoremove -yqq --purge build-essential \
&& rm -rf /var/lib/apt/lists/*
apt-get update -qq && apt-get install -yqq --no-install-recommends \
build-essential \
&& pip install --upgrade setuptools pip \
&& pip install -r requirements/base.txt \
&& apt-get autoremove -yqq --purge build-essential \
&& rm -rf /var/lib/apt/lists/*
# Copy the compiled frontend assets
COPY --chown=superset:superset --from=superset-node /app/superset/static/assets superset/static/assets
@@ -136,21 +103,17 @@ COPY --chown=superset:superset --from=superset-node /app/superset/static/assets
## Lastly, let's install superset itself
COPY --chown=superset:superset superset superset
RUN --mount=type=cache,target=/root/.cache/pip \
pip install --no-cache-dir -e .
pip install -e .
# Copy the .json translations from the frontend layer
COPY --chown=superset:superset --from=superset-node /app/superset/translations superset/translations
# Compile translations for the backend - this generates .mo files, then deletes the .po files
COPY ./scripts/translations/generate_mo_files.sh ./scripts/translations/
RUN if [ "$BUILD_TRANSLATIONS" = "true" ]; then \
./scripts/translations/generate_mo_files.sh \
&& chown -R superset:superset superset/translations \
&& rm superset/translations/messages.pot \
&& rm superset/translations/*/LC_MESSAGES/*.po; \
else \
echo "Skipping translations as requested by build flag"; \
fi
RUN ./scripts/translations/generate_mo_files.sh \
&& chown -R superset:superset superset/translations \
&& rm superset/translations/messages.pot \
&& rm superset/translations/*/LC_MESSAGES/*.po
COPY --chmod=755 ./docker/run-server.sh /usr/bin/
USER superset
@@ -168,52 +131,42 @@ FROM lean AS dev
USER root
RUN apt-get update -qq \
&& apt-get install -yqq --no-install-recommends \
libnss3 \
libdbus-glib-1-2 \
libgtk-3-0 \
libx11-xcb1 \
libasound2 \
libxtst6 \
git \
pkg-config \
&& rm -rf /var/lib/apt/lists/*
&& apt-get install -yqq --no-install-recommends \
libnss3 \
libdbus-glib-1-2 \
libgtk-3-0 \
libx11-xcb1 \
libasound2 \
libxtst6 \
git \
pkg-config \
&& rm -rf /var/lib/apt/lists/*
RUN --mount=type=cache,target=/root/.cache/pip \
pip install --no-cache-dir playwright
pip install playwright
RUN playwright install-deps
RUN if [ "$INCLUDE_CHROMIUM" = "true" ]; then \
playwright install chromium; \
else \
echo "Skipping translations in dev mode"; \
fi
RUN playwright install chromium
# Install GeckoDriver WebDriver
ARG GECKODRIVER_VERSION=v0.34.0 \
FIREFOX_VERSION=125.0.3
FIREFOX_VERSION=125.0.3
RUN if [ "$INCLUDE_FIREFOX" = "true" ]; then \
apt-get update -qq \
&& apt-get install -yqq --no-install-recommends wget bzip2 \
&& wget -q https://github.com/mozilla/geckodriver/releases/download/${GECKODRIVER_VERSION}/geckodriver-${GECKODRIVER_VERSION}-linux64.tar.gz -O - | tar xfz - -C /usr/local/bin \
&& wget -q https://download-installer.cdn.mozilla.net/pub/firefox/releases/${FIREFOX_VERSION}/linux-x86_64/en-US/firefox-${FIREFOX_VERSION}.tar.bz2 -O - | tar xfj - -C /opt \
&& ln -s /opt/firefox/firefox /usr/local/bin/firefox \
&& apt-get autoremove -yqq --purge wget bzip2 && rm -rf /var/[log,tmp]/* /tmp/* /var/lib/apt/lists/*; \
fi
# Installing mysql client os-level dependencies in dev image only because GPL
RUN apt-get install -yqq --no-install-recommends \
default-libmysqlclient-dev \
&& rm -rf /var/lib/apt/lists/*
RUN apt-get update -qq \
&& apt-get install -yqq --no-install-recommends wget bzip2 \
&& wget -q https://github.com/mozilla/geckodriver/releases/download/${GECKODRIVER_VERSION}/geckodriver-${GECKODRIVER_VERSION}-linux64.tar.gz -O - | tar xfz - -C /usr/local/bin \
# Install Firefox
&& wget -q https://download-installer.cdn.mozilla.net/pub/firefox/releases/${FIREFOX_VERSION}/linux-x86_64/en-US/firefox-${FIREFOX_VERSION}.tar.bz2 -O - | tar xfj - -C /opt \
&& ln -s /opt/firefox/firefox /usr/local/bin/firefox \
&& apt-get autoremove -yqq --purge wget bzip2 && rm -rf /var/[log,tmp]/* /tmp/* /var/lib/apt/lists/*
# Cache everything for dev purposes...
COPY --chown=superset:superset requirements/development.txt requirements/
RUN --mount=type=cache,target=/root/.cache/pip \
apt-get update -qq && apt-get install -yqq --no-install-recommends \
build-essential \
&& pip install --no-cache-dir -r requirements/development.txt \
&& apt-get autoremove -yqq --purge build-essential \
&& rm -rf /var/lib/apt/lists/*
apt-get update -qq && apt-get install -yqq --no-install-recommends \
build-essential \
&& pip install -r requirements/development.txt \
&& apt-get autoremove -yqq --purge build-essential \
&& rm -rf /var/lib/apt/lists/*
USER superset
######################################################################


@@ -19,12 +19,12 @@ under the License.
# Superset
[![License](https://img.shields.io/badge/License-Apache%202.0-blue.svg)](https://opensource.org/license/apache-2-0)
[![License](https://img.shields.io/badge/License-Apache%202.0-blue.svg)](https://opensource.org/licenses/Apache-2.0)
[![GitHub release (latest SemVer)](https://img.shields.io/github/v/release/apache/superset?sort=semver)](https://github.com/apache/superset/tree/latest)
[![Build Status](https://github.com/apache/superset/workflows/Python/badge.svg)](https://github.com/apache/superset/actions)
[![PyPI version](https://badge.fury.io/py/apache-superset.svg)](https://badge.fury.io/py/apache-superset)
[![PyPI version](https://badge.fury.io/py/apache_superset.svg)](https://badge.fury.io/py/apache_superset)
[![Coverage Status](https://codecov.io/github/apache/superset/coverage.svg?branch=master)](https://codecov.io/github/apache/superset)
[![PyPI](https://img.shields.io/pypi/pyversions/apache-superset.svg?maxAge=2592000)](https://pypi.python.org/pypi/apache-superset)
[![PyPI](https://img.shields.io/pypi/pyversions/apache_superset.svg?maxAge=2592000)](https://pypi.python.org/pypi/apache_superset)
[![Get on Slack](https://img.shields.io/badge/slack-join-orange.svg)](http://bit.ly/join-superset-slack)
[![Documentation](https://img.shields.io/badge/docs-apache.org-blue.svg)](https://superset.apache.org)
@@ -72,8 +72,10 @@ Superset provides:
## Screenshots & Gifs
**Video Overview**
<!-- File hosted here https://github.com/apache/superset-site/raw/lfs/superset-video-4k.mp4 -->
[superset-video-4k.webm](https://github.com/apache/superset/assets/812905/da036bc2-150c-4ee7-80f9-75e63210ff76)
[superset-video-1080p.webm](https://github.com/user-attachments/assets/b37388f7-a971-409c-96a7-90c4e31322e6)
<br/>
@@ -134,8 +136,6 @@ Here are some of the major database solutions that are supported:
<img src="https://superset.apache.org/img/databases/starrocks.png" alt="starrocks" border="0" width="200" />
<img src="https://superset.apache.org/img/databases/doris.png" alt="doris" border="0" width="200" />
<img src="https://superset.apache.org/img/databases/oceanbase.svg" alt="oceanbase" border="0" width="220" />
<img src="https://superset.apache.org/img/databases/sap-hana.png" alt="sap-hana" border="0" width="220" />
<img src="https://superset.apache.org/img/databases/denodo.png" alt="denodo" border="0" width="200" />
</p>
**A more comprehensive list of supported databases** along with the configuration instructions can be found [here](https://superset.apache.org/docs/configuration/databases).
@@ -153,7 +153,7 @@ Want to add support for your datastore or data engine? Read more [here](https://
and please read our [Slack Community Guidelines](https://github.com/apache/superset/blob/master/CODE_OF_CONDUCT.md#slack-community-guidelines)
- [Join our dev@superset.apache.org Mailing list](https://lists.apache.org/list.html?dev@superset.apache.org). To join, simply send an email to [dev-subscribe@superset.apache.org](mailto:dev-subscribe@superset.apache.org)
- If you want to help troubleshoot GitHub Issues involving the numerous database drivers that Superset supports, please consider adding your name and the databases you have access to on the [Superset Database Familiarity Rolodex](https://docs.google.com/spreadsheets/d/1U1qxiLvOX0kBTUGME1AHHi6Ywel6ECF8xk_Qy-V9R8c/edit#gid=0)
- Join Superset's Town Hall and [Operational Model](https://preset.io/blog/the-superset-operational-model-wants-you/) recurring meetings. Meeting info is available on the [Superset Community Calendar](https://superset.apache.org/community)
- Join Superset's Town Hall and [Operational Model](https://preset.io/blog/the-superset-operational-model-wants-you/) recurring meetings. Meeting info is available on the [Superset Community Calendar](https://superset.apache.org/community)
## Contributor Guide
@@ -181,14 +181,16 @@ Understanding the Superset Points of View
- [Building New Database Connectors](https://preset.io/blog/building-database-connector/)
- [Create Your First Dashboard](https://superset.apache.org/docs/using-superset/creating-your-first-dashboard/)
- [Comprehensive Tutorial for Contributing Code to Apache Superset
](https://preset.io/blog/tutorial-contributing-code-to-apache-superset/)
](https://preset.io/blog/tutorial-contributing-code-to-apache-superset/)
- [Resources to master Superset by Preset](https://preset.io/resources/)
- Deploying Superset
- [Official Docker image](https://hub.docker.com/r/apache/superset)
- [Helm Chart](https://github.com/apache/superset/tree/master/helm/superset)
- Recordings of Past [Superset Community Events](https://preset.io/events)
- [Mixed Time Series Charts](https://preset.io/events/mixed-time-series-visualization-in-superset-workshop/)
- [How the Bing Team Customized Superset for the Internal Self-Serve Data & Analytics Platform](https://preset.io/events/how-the-bing-team-heavily-customized-superset-for-their-internal-data/)
- [Live Demo: Visualizing MongoDB and Pinot Data using Trino](https://preset.io/events/2021-04-13-visualizing-mongodb-and-pinot-data-using-trino/)
@@ -196,6 +198,7 @@ Understanding the Superset Points of View
- [Building a Database Connector for Superset](https://preset.io/events/2021-02-16-building-a-database-connector-for-superset/)
- Visualizations
- [Creating Viz Plugins](https://superset.apache.org/docs/contributing/creating-viz-plugins/)
- [Managing and Deploying Custom Viz Plugins](https://medium.com/nmc-techblog/apache-superset-manage-custom-viz-plugins-in-production-9fde1a708e55)
- [Why Apache Superset is Betting on Apache ECharts](https://preset.io/blog/2021-4-1-why-echarts/)


@@ -20,7 +20,7 @@ RUN useradd --user-group --create-home --no-log-init --shell /bin/bash superset
# Configure environment
ENV LANG=C.UTF-8 \
LC_ALL=C.UTF-8
LC_ALL=C.UTF-8
RUN apt-get update -y
@@ -30,14 +30,14 @@ RUN apt-get install -y apt-transport-https apt-utils
# Install superset dependencies
# https://superset.apache.org/docs/installation/installing-superset-from-scratch
RUN apt-get install -y build-essential libssl-dev \
libffi-dev python3-dev libsasl2-dev libldap2-dev libxi-dev chromium
libffi-dev python3-dev libsasl2-dev libldap2-dev libxi-dev chromium zstd
# Install nodejs for custom build
# https://nodejs.org/en/download/package-manager/
RUN set -eux; \
curl -sL https://deb.nodesource.com/setup_18.x | bash -; \
apt-get install -y nodejs; \
node --version;
curl -sL https://deb.nodesource.com/setup_20.x | bash -; \
apt-get install -y nodejs; \
node --version;
RUN if ! which npm; then apt-get install -y npm; fi
RUN mkdir -p /home/superset
@@ -50,21 +50,21 @@ ARG SUPERSET_RELEASE_RC_TARBALL
# Can fetch source from svn or copy tarball from local mounted directory
COPY $SUPERSET_RELEASE_RC_TARBALL ./
RUN tar -xvf *.tar.gz
WORKDIR /home/superset/apache-superset-$VERSION/superset-frontend
WORKDIR /home/superset/apache_superset-$VERSION/superset-frontend
RUN npm ci \
&& npm run build \
&& rm -rf node_modules
&& npm run build \
&& rm -rf node_modules
WORKDIR /home/superset/apache-superset-$VERSION
WORKDIR /home/superset/apache_superset-$VERSION
RUN pip install --upgrade setuptools pip \
&& pip install -r requirements/base.txt \
&& pip install --no-cache-dir .
&& pip install -r requirements/base.txt \
&& pip install --no-cache-dir .
RUN flask fab babel-compile --target superset/translations
ENV PATH=/home/superset/superset/bin:$PATH \
PYTHONPATH=/home/superset/superset/:$PYTHONPATH \
SUPERSET_TESTENV=true
PYTHONPATH=/home/superset/superset/ \
SUPERSET_TESTENV=true
COPY from_tarball_entrypoint.sh /entrypoint.sh
ENTRYPOINT ["/entrypoint.sh"]


@@ -20,7 +20,7 @@ RUN useradd --user-group --create-home --no-log-init --shell /bin/bash superset
# Configure environment
ENV LANG=C.UTF-8 \
LC_ALL=C.UTF-8
LC_ALL=C.UTF-8
RUN apt-get update -y
@@ -29,13 +29,16 @@ RUN apt-get install -y apt-transport-https apt-utils
# Install superset dependencies
# https://superset.apache.org/docs/installation/installing-superset-from-scratch
RUN apt-get install -y build-essential libssl-dev \
libffi-dev python3-dev libsasl2-dev libldap2-dev libxi-dev chromium
RUN apt-get install -y subversion build-essential libssl-dev \
libffi-dev python3-dev libsasl2-dev libldap2-dev libxi-dev chromium zstd
# Install nodejs for custom build
# https://nodejs.org/en/download/package-manager/
RUN curl -sL https://deb.nodesource.com/setup_16.x | bash - \
&& apt-get install -y nodejs
RUN set -eux; \
curl -sL https://deb.nodesource.com/setup_20.x | bash -; \
apt-get install -y nodejs; \
node --version;
RUN if ! which npm; then apt-get install -y npm; fi
RUN mkdir -p /home/superset
RUN chown superset /home/superset
@@ -46,22 +49,20 @@ ARG VERSION
# Can fetch source from svn or copy tarball from local mounted directory
RUN svn co https://dist.apache.org/repos/dist/dev/superset/$VERSION ./
RUN tar -xvf *.tar.gz
WORKDIR apache-superset-$VERSION
WORKDIR /home/superset/apache_superset-$VERSION/superset-frontend
RUN cd superset-frontend \
&& npm ci \
&& npm run build \
&& rm -rf node_modules
RUN npm ci \
&& npm run build \
&& rm -rf node_modules
WORKDIR /home/superset/apache-superset-$VERSION
WORKDIR /home/superset/apache_superset-$VERSION
RUN pip install --upgrade setuptools pip \
&& pip install -r requirements/base.txt \
&& pip install --no-cache-dir .
&& pip install -r requirements/base.txt \
&& pip install --no-cache-dir .
RUN flask fab babel-compile --target superset/translations
ENV PATH=/home/superset/superset/bin:$PATH \
PYTHONPATH=/home/superset/superset/:$PYTHONPATH
PYTHONPATH=/home/superset/superset/
COPY from_tarball_entrypoint.sh /entrypoint.sh
ENTRYPOINT ["/entrypoint.sh"]


@@ -123,10 +123,10 @@ SUPERSET_RC=1
SUPERSET_GITHUB_BRANCH=1.5
SUPERSET_PGP_FULLNAME=villebro@apache.org
SUPERSET_VERSION_RC=1.5.1rc1
SUPERSET_RELEASE=apache-superset-1.5.1
SUPERSET_RELEASE_RC=apache-superset-1.5.1rc1
SUPERSET_RELEASE_TARBALL=apache-superset-1.5.1-source.tar.gz
SUPERSET_RELEASE_RC_TARBALL=apache-superset-1.5.1rc1-source.tar.gz
SUPERSET_RELEASE=apache_superset-1.5.1
SUPERSET_RELEASE_RC=apache_superset-1.5.1rc1
SUPERSET_RELEASE_TARBALL=apache_superset-1.5.1-source.tar.gz
SUPERSET_RELEASE_RC_TARBALL=apache_superset-1.5.1rc1-source.tar.gz
SUPERSET_TMP_ASF_SITE_PATH=/tmp/incubator-superset-site-1.5.1
-------------------------------
```
@@ -380,7 +380,7 @@ Official instructions:
https://www.apache.org/info/verification.html
We now have a handy script that anyone validating a release can use. The core of it is in this very folder, `verify_release.py`. Just make sure you have all three release files in the same directory (`{some version}.tar.gz`, `{some version}.tar.gz.asc` and `{some version}.tar.gz.sha512`). Then you can pass this script the path to the `.gz` file like so:
`python verify_release.py ~/path/to/apache-superset-{version/candidate}-source.tar.gz`
`python verify_release.py ~/path/to/apache_superset-{version/candidate}-source.tar.gz`
If all goes well, you will see this result in your terminal:
@@ -437,7 +437,7 @@ cd ${SUPERSET_RELEASE_RC}
python3 -m venv venv
source venv/bin/activate
pip install -r requirements/base.txt
pip install build twine
pip install twine
```
Create the distribution
@@ -455,7 +455,7 @@ cd ../
./scripts/translations/generate_po_files.sh
# build the python distribution
python -m build
python setup.py sdist
```
Publish to PyPI
@@ -467,7 +467,7 @@ while requesting access to push packages.
```bash
twine upload dist/apache_superset-${SUPERSET_VERSION}-py3-none-any.whl
twine upload dist/apache-superset-${SUPERSET_VERSION}.tar.gz
twine upload dist/apache_superset-${SUPERSET_VERSION}.tar.gz
```
Set your username to `__token__`


@@ -31,7 +31,7 @@ The official source release:
https://downloads.apache.org/{{ project_module }}/{{ version }}
The PyPI package:
https://pypi.org/project/apache-superset/{{ version }}
https://pypi.org/project/apache_superset/{{ version }}
The CHANGELOG for the release:
https://github.com/apache/{{ project_module }}/blob/{{ version }}/CHANGELOG/{{ version }}.md


@@ -32,7 +32,7 @@ else
SUPERSET_VERSION="${1}"
SUPERSET_RC="${2}"
SUPERSET_PGP_FULLNAME="${3}"
SUPERSET_RELEASE_RC_TARBALL="apache-superset-${SUPERSET_VERSION_RC}-source.tar.gz"
SUPERSET_RELEASE_RC_TARBALL="apache_superset-${SUPERSET_VERSION_RC}-source.tar.gz"
fi
SUPERSET_VERSION_RC="${SUPERSET_VERSION}rc${SUPERSET_RC}"


@@ -22,7 +22,7 @@ if [ -z "${SUPERSET_VERSION_RC}" ] || [ -z "${SUPERSET_SVN_DEV_PATH}" ] || [ -z
exit 1
fi
SUPERSET_RELEASE_RC=apache-superset-"${SUPERSET_VERSION_RC}"
SUPERSET_RELEASE_RC=apache_superset-"${SUPERSET_VERSION_RC}"
SUPERSET_RELEASE_RC_TARBALL="${SUPERSET_RELEASE_RC}"-source.tar.gz
SUPERSET_RELEASE_RC_BASE_PATH="${SUPERSET_SVN_DEV_PATH}"/"${SUPERSET_VERSION_RC}"
SUPERSET_RELEASE_RC_TARBALL_PATH="${SUPERSET_RELEASE_RC_BASE_PATH}"/"${SUPERSET_RELEASE_RC_TARBALL}"


@@ -137,4 +137,4 @@ There is now a [metadata bar](https://github.com/apache/superset/pull/27857) add
## Change to Docker image builds
Starting in 4.1.0, the release's docker image does not ship with drivers needed to operate Superset. Users may need to install a driver for their metadata database (MySQL or Postgres) as well as the driver for their data warehouse. This is a result of changes to the `lean` docker image that official releases come from; see [Docker Build Presets](/docs/docs/installation/docker-builds.mdx#build-presets) for more details.
Starting in 4.1.0, the release's docker image does not ship with drivers needed to operate Superset. Users may need to install a driver for their metadata database (MySQL or Postgres) as well as the driver for their data warehouse. This is a result of changes to the `lean` docker image that official releases come from; see [Docker Build Presets](/docs/installation/docker-builds#build-presets) for more details.
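Since the `lean` image ships without drivers, a deployment typically layers them back on in a small derived image. A minimal sketch, assuming a Postgres metadata database (the image tag and the `psycopg2-binary` package below are illustrative examples, not part of the release notes; substitute the drivers your metadata database and warehouse actually need):

```dockerfile
# Hypothetical derived image; tag and packages are examples only
FROM apache/superset:4.1.0
USER root
# Driver for a Postgres metadata DB; append your warehouse driver here
RUN pip install --no-cache-dir psycopg2-binary
USER superset
```

Building this with `docker build` and running it in place of the stock image restores connectivity without modifying the upstream release image.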


@@ -50,8 +50,8 @@ else
export SUPERSET_GITHUB_BRANCH="${VERSION_MAJOR}.${VERSION_MINOR}"
export SUPERSET_PGP_FULLNAME="${2}"
export SUPERSET_VERSION_RC="${SUPERSET_VERSION}rc${VERSION_RC}"
export SUPERSET_RELEASE=apache-superset-"${SUPERSET_VERSION}"
export SUPERSET_RELEASE_RC=apache-superset-"${SUPERSET_VERSION_RC}"
export SUPERSET_RELEASE=apache_superset-"${SUPERSET_VERSION}"
export SUPERSET_RELEASE_RC=apache_superset-"${SUPERSET_VERSION_RC}"
export SUPERSET_RELEASE_TARBALL="${SUPERSET_RELEASE}"-source.tar.gz
export SUPERSET_RELEASE_RC_TARBALL="${SUPERSET_RELEASE_RC}"-source.tar.gz
export SUPERSET_TMP_ASF_SITE_PATH="/tmp/incubator-superset-site-${SUPERSET_VERSION}"


@@ -27,7 +27,7 @@ if [ -z "${SUPERSET_SVN_DEV_PATH}" ]; then
fi
if [[ -n ${1} ]] && [[ ${1} == "local" ]]; then
SUPERSET_RELEASE_RC=apache-superset-"${SUPERSET_VERSION_RC}"
SUPERSET_RELEASE_RC=apache_superset-"${SUPERSET_VERSION_RC}"
SUPERSET_RELEASE_RC_TARBALL="${SUPERSET_RELEASE_RC}"-source.tar.gz
SUPERSET_TARBALL_PATH="${SUPERSET_SVN_DEV_PATH}"/${SUPERSET_VERSION_RC}/${SUPERSET_RELEASE_RC_TARBALL}
SUPERSET_TMP_TARBALL_FILENAME=_tmp_"${SUPERSET_VERSION_RC}".tar.gz


@@ -38,7 +38,7 @@ get_pip_command() {
PYTHON=$(get_python_command)
PIP=$(get_pip_command)
# Get the release directory's path. If you unzip an Apache release and just run the npm script to validate the release, this will be a file name like `apache-superset-x.x.xrcx-source.tar.gz`
# Get the release directory's path. If you unzip an Apache release and just run the npm script to validate the release, this will be a file name like `apache_superset-x.x.xrcx-source.tar.gz`
RELEASE_ZIP_PATH="../../$(basename "$(dirname "$(pwd)")")-source.tar.gz"
# Install dependencies from requirements.txt if the file exists


@@ -25,54 +25,56 @@ all you have to do is file a simple PR [like this one](https://github.com/apache
the categorization is inaccurate, please file a PR with your correction as well.
Join our growing community!
### Sharing Economy
- [Airbnb](https://github.com/airbnb)
- [Faasos](https://faasos.com/) [@shashanksingh]
- [Free2Move](https://www.free2move.com/) [@PaoloTerzi]
- [Faasos](http://faasos.com/) [@shashanksingh]
- [Hostnfly](https://www.hostnfly.com/) [@alexisrosuel]
- [Lime](https://www.li.me/) [@cxmcc]
- [Lyft](https://www.lyft.com/)
- [Ontruck](https://www.ontruck.com/)
### Financial Services
- [Aktia Bank plc](https://www.aktia.com)
- [Aktia Bank plc](https://www.aktia.com) [@villebro]
- [American Express](https://www.americanexpress.com) [@TheLastSultan]
- [bumper](https://www.bumper.co/) [@vasu-ram, @JamiePercival]
- [Cape Crypto](https://capecrypto.com)
- [Capital Service S.A.](https://capitalservice.pl) [@pkonarzewski]
- [Clark.de](https://clark.de/)
- [Capital Service S.A.](http://capitalservice.pl) [@pkonarzewski]
- [Clark.de](http://clark.de/)
- [KarrotPay](https://www.daangnpay.com/)
- [Taveo](https://www.taveo.com) [@codek]
- [Unit](https://www.unit.co/about-us) [@amitmiran137]
- [Wise](https://wise.com) [@koszti]
- [Xendit](https://xendit.co/) [@LieAlbertTriAdrian]
- [Xendit](http://xendit.co/) [@LieAlbertTriAdrian]
- [bumper](https://www.bumper.co/) [@vasu-ram, @JamiePercival]
### Gaming
- [Popoko VM Games Studio](https://popoko.live)
### E-Commerce
- [AiHello](https://www.aihello.com) [@ganeshkrishnan1]
- [Bazaar Technologies](https://www.bazaartech.com) [@umair-abro]
- [Dragonpass](https://www.dragonpass.com.cn/) [@zhxjdwh]
- [Dropit Shopping](https://www.dropit.shop/) [@dropit-dev]
- [Fanatics](https://www.fanatics.com/) [@coderfender]
- [Fordeal](https://www.fordeal.com) [@Renkai]
- [Fordeal](http://www.fordeal.com) [@Renkai]
- [GFG - Global Fashion Group](https://global-fashion-group.com) [@ksaagariconic]
- [HuiShouBao](https://www.huishoubao.com/) [@Yukinoshita-Yukino]
- [HuiShouBao](http://www.huishoubao.com/) [@Yukinoshita-Yukino]
- [Now](https://www.now.vn/) [@davidkohcw]
- [Qunar](https://www.qunar.com/) [@flametest]
- [Rakuten Viki](https://www.viki.com)
- [Shopee](https://shopee.sg) [@xiaohanyu]
- [Shopkick](https://www.shopkick.com) [@LAlbertalli]
- [Tails.com](https://tails.com/gb/) [@alanmcruickshank]
- [THE ICONIC](https://theiconic.com.au/) [@ksaagariconic]
- [THE ICONIC](http://theiconic.com.au/) [@ksaagariconic]
- [Utair](https://www.utair.ru) [@utair-digital]
- [VkusVill](https://vkusvill.ru/) [@ETselikov]
- [Zalando](https://www.zalando.com) [@dmigo]
- [Zalora](https://www.zalora.com) [@ksaagariconic]
### Enterprise Technology
- [A3Data](https://a3data.com.br) [@neylsoncrepalde]
- [Analytics Aura](https://analyticsaura.com/) [@Analytics-Aura]
- [Apollo GraphQL](https://www.apollographql.com/) [@evans]
@@ -81,31 +83,29 @@ Join our growing community!
- [Caizin](https://caizin.com/) [@tejaskatariya]
- [Careem](https://www.careem.com/) [@SamraHanifCareem]
- [Cloudsmith](https://cloudsmith.io) [@alancarson]
- [CnOvit](https://www.cnovit.com/) [@xieshaohu]
- [Cyberhaven](https://www.cyberhaven.com/) [@toliver-ch]
- [Deepomatic](https://deepomatic.com/) [@Zanoellia]
- [Dial Once](https://www.dial-once.com/)
- [Dremio](https://dremio.com) [@narendrans]
- [EFinance](https://www.efinance.com.eg) [@habeeb556]
- [Elestio](https://elest.io/) [@kaiwalyakoparkar]
- [ELMO Cloud HR & Payroll](https://elmosoftware.com.au/)
- [Endress+Hauser](https://www.endress.com/) [@rumbin]
- [FBK - ICT center](https://ict.fbk.eu)
- [Endress+Hauser](http://www.endress.com/) [@rumbin]
- [FBK - ICT center](http://ict.fbk.eu)
- [Gavagai](https://gavagai.io) [@gavagai-corp]
- [GfK Data Lab](https://www.gfk.com/home) [@mherr]
- [Hydrolix](https://www.hydrolix.io/)
- [Intercom](https://www.intercom.com/) [@kate-gallo]
- [jampp](https://jampp.com/)
- [Konfío](https://konfio.mx) [@uis-rodriguez]
- [Konfío](http://konfio.mx) [@uis-rodriguez]
- [Mainstrat](https://mainstrat.com/)
- [mishmash io](https://mishmash.io/) [@mishmash-io]
- [Myra Labs](https://www.myralabs.com/) [@viksit]
- [Nielsen](https://www.nielsen.com/) [@amitNielsen]
- [mishmash io](https://mishmash.io/)[@mishmash-io]
- [Myra Labs](http://www.myralabs.com/) [@viksit]
- [Nielsen](http://www.nielsen.com/) [@amitNielsen]
- [Ona](https://ona.io) [@pld]
- [Orange](https://www.orange.com) [@icsu]
- [Oslandia](https://oslandia.com)
- [Peak AI](https://www.peak.ai/) [@azhar22k]
- [PeopleDoc](https://www.people-doc.com) [@rodo]
- [PlaidCloud](https://www.plaidcloud.com)
- [Preset, Inc.](https://preset.io)
- [PubNub](https://pubnub.com) [@jzucker2]
- [ReadyTech](https://www.readytech.io)
@@ -114,16 +114,17 @@ Join our growing community!
- [Showmax](https://showmax.com) [@bobek]
- [TechAudit](https://www.techaudit.info) [@ETselikov]
- [Tenable](https://www.tenable.com) [@dflionis]
- [Tentacle](https://www.linkedin.com/company/tentacle-cmi/) [@jdclarke5]
- [Tentacle](https://tentaclecmi.com) [@jdclarke5]
- [timbr.ai](https://timbr.ai/) [@semantiDan]
- [Tobii](https://www.tobii.com/) [@dwa]
- [Tobii](http://www.tobii.com/) [@dwa]
- [Tooploox](https://www.tooploox.com/) [@jakubczaplicki]
- [Unvired](https://unvired.com) [@srinisubramanian]
- [Whale](https://whale.im)
- [Unvired](https://unvired.com)[@srinisubramanian]
- [Whale](http://whale.im)
- [Windsor.ai](https://www.windsor.ai/) [@octaviancorlade]
- [Zeta](https://www.zeta.tech/) [@shaikidris]
### Media & Entertainment
- [6play](https://www.6play.fr) [@CoryChaplin]
- [bilibili](https://www.bilibili.com) [@Moinheart]
- [BurdaForward](https://www.burda-forward.de/en/)
@@ -131,21 +132,23 @@ Join our growing community!
- [Kuaishou](https://www.kuaishou.com/) [@zhaoyu89730105]
- [Netflix](https://www.netflix.com/)
- [Prensa Iberica](https://www.prensaiberica.es/) [@zamar-roura]
- [TME QQMUSIC/WESING](https://www.tencentmusic.com/) [@shenyuanli,@marklaw]
- [Xite](https://xite.com/) [@shashankkoppar]
- [Zaihang](https://www.zaih.com/)
### Education
- [Aveti Learning](https://avetilearning.com/) [@TheShubhendra]
- [Brilliant.org](https://brilliant.org/)
- [Platzi.com](https://platzi.com/)
- [Sunbird](https://www.sunbird.org/) [@eksteporg]
- [The GRAPH Network](https://thegraphnetwork.org/) [@fccoelho]
- [Udemy](https://www.udemy.com/) [@sungjuly]
- [VIPKID](https://www.vipkid.com.cn/) [@illpanda]
- [WikiMedia Foundation](https://wikimediafoundation.org) [@vg]
### Energy
- [Airboxlab](https://foobot.io) [@antoine-galataud]
- [DouroECI](https://www.douroeci.com/) [@nunohelibeires]
- [Safaricom](https://www.safaricom.co.ke/) [@mmutiso]
@@ -153,32 +156,35 @@ Join our growing community!
- [Wattbewerb](https://wattbewerb.de/) [@wattbewerb]
### Healthcare
- [Amino](https://amino.com) [@shkr]
- [Bluesquare](https://www.bluesquarehub.com/) [@madewulf]
- [Care](https://www.getcare.io/) [@alandao2021]
- [Living Goods](https://www.livinggoods.org) [@chelule]
- [Maieutical Labs](https://maieuticallabs.it) [@xrmx]
- [Medic](https://medic.org) [@1yuv]
- [QPID Health](http://www.qpidhealth.com/)
- [REDCap Cloud](https://www.redcapcloud.com/)
- [TrustMedis](https://trustmedis.com/) [@famasya]
- [WeSure](https://www.wesure.cn/)
- [2070Health](https://2070health.com/)
### HR / Staffing
- [Swile](https://www.swile.co/) [@PaoloTerzi]
- [Symmetrics](https://www.symmetrics.fyi)
- [bluquist](https://bluquist.com/)
### Government / Non-Profit
- [City of Ann Arbor, MI](https://www.a2gov.org/) [@sfirke]
- [RIS3 Strategy of CZ, MIT CR](https://www.ris3.cz/) [@RIS3CZ]
- [NRLM - Sarathi, India](https://pib.gov.in/PressReleasePage.aspx?PRID=1999586)
### Travel
- [Agoda](https://www.agoda.com/) [@lostseaway, @maiake, @obombayo]
- [Skyscanner](https://www.skyscanner.net/) [@cleslie, @stanhoucke]
### Others
- [10Web](https://10web.io/)
- [AI inside](https://inside.ai/en/)
- [Automattic](https://automattic.com/) [@Khrol, @Usiel]
@@ -189,6 +195,6 @@ Join our growing community!
- [komoot](https://www.komoot.com/) [@christophlingg]
- [Let's Roam](https://www.letsroam.com/)
- [Onebeat](https://1beat.com/) [@GuyAttia]
- [X](https://x.com/)
- [VLMedia](https://www.vlmedia.com.tr/) [@ibotheperfect]
- [Yahoo!](https://yahoo.com/)

View File

@@ -22,11 +22,10 @@ under the License.
This file documents any backwards-incompatible changes in Superset and
assists people when migrating to a new version.
## Next
## 4.1.2
- [29798](https://github.com/apache/superset/pull/29798) Since 3.1.0, the initial schedule for an alert or report was mistakenly offset by the specified timezone's relation to UTC. The initial schedule should now begin at the correct time.
- [30021](https://github.com/apache/superset/pull/30021) The `dev` layer in our Dockerfile no longer includes Firefox binaries, only Chromium, to reduce bloat/docker-build-time.
- [30099](https://github.com/apache/superset/pull/30099) Translations are no longer included in the default docker image builds. If your environment requires translations, you'll want to set the docker build arg `BUILD_TRANSLATIONS=true`.
- [31198](https://github.com/apache/superset/pull/31198) Disallows by default the use of the following ClickHouse functions: "version", "currentDatabase", "hostName".
- [31173](https://github.com/apache/superset/pull/31173) Modified `fetch_csrf_token` to align with HTTP standards, particularly regarding how cookies are handled. If you encounter any issues related to CSRF functionality, please report them as a new issue and reference this PR for context.
### Potential Downtime
@@ -40,9 +39,9 @@ assists people when migrating to a new version.
`requirements/` folder. If you use these files for your builds you may want to double
check that your builds are not affected. `base.txt` should be the same as before, though
`development.txt` becomes a bigger set, incorporating the now defunct local, testing, integration, and docker
- [27434](https://github.com/apache/superset/pull/27434/files): DO NOT USE our docker compose.\*
files for production use cases! While we never really supported
or should have tried to support docker compose for production use cases, we now actively
have taken a stance against supporting it. See the PR for details.
- [24112](https://github.com/apache/superset/pull/24112): Python 3.10 is now the recommended python version to use; 3.9 is still
supported but will be deprecated in the nearish future. CI/CD runs on py310 so you probably want to align. If you
@@ -66,7 +65,7 @@ assists people when migrating to a new version.
backend, as well as the .json files used by the frontend. If you were doing anything before
as part of your bundling to expose translation packages, it's probably not needed anymore.
- [29264](https://github.com/apache/superset/pull/29264) Slack has updated its file upload API, and we now support the new API in Superset, although the Slack API is not backward compatible. The original Slack integration is deprecated, and we will require a new Slack scope `channels:read` to be added to Slack workspaces in order to use this new API. In an upcoming release, we will make this new Slack scope mandatory and remove the old Slack functionality.
- [30274](https://github.com/apache/superset/pull/30274) Moved SLACK_ENABLE_AVATAR from config.py to the feature flag framework, please adapt your configs.
### Potential Downtime
@@ -124,7 +123,7 @@ assists people when migrating to a new version.
- [24911](https://github.com/apache/superset/pull/24911): Changes the column type from `TEXT` to `MediumText` in table `logs`, potentially requiring a table lock on MySQL dbs or taking some time to complete on large deployments.
- [24939](https://github.com/apache/superset/pull/24939): Augments the foreign key constraints for the `embedded_dashboards` table to include an explicit CASCADE ON DELETE to ensure the relevant records are deleted when a dashboard is deleted. Scheduled downtime may be advised.
- [24938](https://github.com/apache/superset/pull/24938): Augments the foreign key constraints for the `dashboard_slices` table to include an explicit CASCADE ON DELETE to ensure the relevant records are deleted when a dashboard or slice is deleted. Scheduled downtime may be advised.
- [24628](https://github.com/apache/superset/pull/24628): Augments the foreign key constraints for the `dashboard_owner`, `report_schedule_owner`, and `slice_owner` tables to include an explicit CASCADE ON DELETE to ensure the relevant ownership records are deleted when a dataset is deleted. Scheduled downtime may be advised.
- [24488](https://github.com/apache/superset/pull/24488): Augments the foreign key constraints for the `sql_metrics`, `sqlatable_user`, and `table_columns` tables which reference the `tables` table to include an explicit CASCADE ON DELETE to ensure the relevant records are deleted when a dataset is deleted. Scheduled downtime may be advised.
- [24232](https://github.com/apache/superset/pull/24232): Enables ENABLE_TEMPLATE_REMOVE_FILTERS, DRILL_TO_DETAIL, DASHBOARD_CROSS_FILTERS by default, marks VERSIONED_EXPORT and ENABLE_TEMPLATE_REMOVE_FILTERS as deprecated.
- [23652](https://github.com/apache/superset/pull/23652): Enables GENERIC_CHART_AXES feature flag by default.
@@ -140,7 +139,7 @@ assists people when migrating to a new version.
### Breaking Changes
- [24686](https://github.com/apache/superset/pull/24686): All datasets' custom `explore_url` values are handled as relative URLs on the frontend, behaviour controlled by `PREVENT_UNSAFE_DEFAULT_URLS_ON_DATASET`.
- [24262](https://github.com/apache/superset/pull/24262): Enabled `TALISMAN_ENABLED` flag by default and provided stricter default Content Security Policy
- [24415](https://github.com/apache/superset/pull/24415): Removed the obsolete Druid NoSQL REGEX operator.
- [24423](https://github.com/apache/superset/pull/24423): Removed deprecated APIs `/superset/slice_json/...`, `/superset/annotation_json/...`
@@ -325,7 +324,8 @@ assists people when migrating to a new version.
### Potential Downtime
- [14234](https://github.com/apache/superset/pull/14234): Adds the `limiting_factor` column to the `query` table. Given the migration includes a DDL operation on a heavily trafficked table, potential service downtime may be required.
- [16454](https://github.com/apache/superset/pull/16454): Adds the `extra` column to the `table_columns` table. Users using MySQL will either need to schedule downtime or use the percona toolkit (or similar) to perform the migration.
## 1.2.0

View File

@@ -16,7 +16,7 @@
#
# -----------------------------------------------------------------------
# We don't support docker compose for production environments.
# If you choose to use this type of deployment make sure to
# create your own docker environment file (docker/.env) with your own
# unique random secure passwords and SECRET_KEY.

View File

@@ -16,7 +16,7 @@
#
# -----------------------------------------------------------------------
# We don't support docker compose for production environments.
# If you choose to use this type of deployment make sure to
# create your own docker environment file (docker/.env) with your own
# unique random secure passwords and SECRET_KEY.

View File

@@ -16,7 +16,7 @@
#
# -----------------------------------------------------------------------
# We don't support docker compose for production environments.
# If you choose to use this type of deployment make sure to
# create your own docker environment file (docker/.env) with your own
# unique random secure passwords and SECRET_KEY.
@@ -25,7 +25,6 @@ x-superset-user: &superset-user root
x-superset-depends-on: &superset-depends-on
- db
- redis
- superset-checks
x-superset-volumes: &superset-volumes
# /app/pythonpath_docker will be appended to the PYTHONPATH in the final container
- ./docker:/app/docker
@@ -39,8 +38,6 @@ x-common-build: &common-build
target: dev
cache_from:
- apache/superset-cache:3.10-slim-bookworm
args:
DEV_MODE: "true"
services:
nginx:
@@ -123,7 +120,7 @@ services:
- /home/superset-websocket/dist
# Mounting a config file that contains a dummy secret required to boot up.
# do not use this docker compose in production
- ./docker/superset-websocket/config.json:/home/superset-websocket/config.json
environment:
- PORT=8080
@@ -131,23 +128,6 @@ services:
- REDIS_PORT=6379
- REDIS_SSL=false
superset-checks:
build:
context: .
target: python-base
cache_from:
- apache/superset-cache:3.10-slim-bookworm
container_name: superset_checks
command: ["/app/scripts/check-env.py"]
env_file:
- path: docker/.env # default
required: true
- path: docker/.env-local # optional override
required: false
user: *superset-user
healthcheck:
disable: true
superset-init:
build:
<<: *common-build
@@ -167,18 +147,10 @@ services:
disable: true
superset-node:
build:
context: .
target: superset-node
args:
# This prevents building the frontend bundle since we'll mount local folder
# and build it on startup while firing docker-frontend.sh in dev mode, where
# it'll mount and watch local files and rebuild as you update them
DEV_MODE: "true"
environment:
# set this to false if you have perf issues running the npm i; npm run dev in-docker
# if you do so, you have to run this manually on the host, which should perform better!
BUILD_SUPERSET_FRONTEND_IN_DOCKER: true
SCARF_ANALYTICS: "${SCARF_ANALYTICS:-}"
container_name: superset_node
command: ["/app/docker/docker-frontend.sh"]

View File

@@ -24,16 +24,12 @@ if [ "$PUPPETEER_SKIP_CHROMIUM_DOWNLOAD" = "false" ]; then
fi
if [ "$BUILD_SUPERSET_FRONTEND_IN_DOCKER" = "true" ]; then
echo "Building Superset frontend in dev mode inside docker container"
cd /app/superset-frontend
echo "Running 'npm install'"
npm install
echo "Running frontend"
npm run dev
else
echo "Skipping frontend build steps - YOU NEED TO RUN IT MANUALLY ON THE HOST!"
echo "https://superset.apache.org/docs/contributing/development/#webpack-dev-server"
fi

View File

@@ -22,11 +22,7 @@ set -e
#
/app/docker/docker-bootstrap.sh
if [ "$SUPERSET_LOAD_EXAMPLES" = "yes" ]; then
STEP_CNT=4
else
STEP_CNT=3
fi
echo_step() {
cat <<EOF

View File

@@ -26,7 +26,6 @@ gunicorn \
--workers ${SERVER_WORKER_AMOUNT:-1} \
--worker-class ${SERVER_WORKER_CLASS:-gthread} \
--threads ${SERVER_THREADS_AMOUNT:-20} \
--log-level "${GUNICORN_LOGLEVEL:-info}" \
--timeout ${GUNICORN_TIMEOUT:-60} \
--keep-alive ${GUNICORN_KEEPALIVE:-2} \
--max-requests ${WORKER_MAX_REQUESTS:-0} \
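The `${VAR:-default}` expansion used throughout these gunicorn flags substitutes a fallback value when the variable is unset or empty (note the hyphen; `${VAR:info}` without it is a substring expansion, not a default). A minimal sketch:

```shell
#!/usr/bin/env bash
# Sketch of the ${VAR:-default} expansion used by the gunicorn flags above.
unset GUNICORN_LOGLEVEL
echo "${GUNICORN_LOGLEVEL:-info}"    # variable unset -> falls back to "info"

GUNICORN_LOGLEVEL=debug
echo "${GUNICORN_LOGLEVEL:-info}"    # variable set -> "debug"
```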

View File

@@ -1 +1 @@
v20.16.0

View File

@@ -16,7 +16,6 @@ KIND, either express or implied. See the License for the
specific language governing permissions and limitations
under the License.
-->
This is the public documentation site for Superset, built using
[Docusaurus 2](https://docusaurus.io/). See
[CONTRIBUTING.md](../CONTRIBUTING.md#documentation) for documentation on

View File

@@ -77,7 +77,6 @@
"Guyana",
"Haiti",
"Honduras",
"Hungary",
"Iceland",
"India",
"Indonesia",

View File

@@ -251,18 +251,15 @@ FROM apache/superset:3.1.0
USER root
RUN apt-get update && \
apt-get install -y wget zip libaio1
RUN export CHROMEDRIVER_VERSION=$(curl --silent https://googlechromelabs.github.io/chrome-for-testing/LATEST_RELEASE_116) && \
wget -O google-chrome-stable_current_amd64.deb -q http://dl.google.com/linux/chrome/deb/pool/main/g/google-chrome-stable/google-chrome-stable_${CHROMEDRIVER_VERSION}-1_amd64.deb && \
apt-get install -y --no-install-recommends ./google-chrome-stable_current_amd64.deb && \
rm -f google-chrome-stable_current_amd64.deb
RUN export CHROMEDRIVER_VERSION=$(curl --silent https://googlechromelabs.github.io/chrome-for-testing/LATEST_RELEASE_116) && \
wget -q https://storage.googleapis.com/chrome-for-testing-public/${CHROMEDRIVER_VERSION}/linux64/chromedriver-linux64.zip && \
unzip -j chromedriver-linux64.zip -d /usr/bin && \
chmod 755 /usr/bin/chromedriver && \
rm -f chromedriver-linux64.zip
RUN pip install --no-cache gevent psycopg2 redis

View File

@@ -13,8 +13,8 @@ SimpleCache (in-memory), or the local filesystem.
[Custom cache backends](https://flask-caching.readthedocs.io/en/latest/#custom-cache-backends)
are also supported.
Caching can be configured by providing dictionaries in
`superset_config.py` that comply with [the Flask-Caching config specifications](https://flask-caching.readthedocs.io/en/latest/#configuring-flask-caching).
The following cache configurations can be customized in this way:
- Dashboard filter state (required): `FILTER_STATE_CACHE_CONFIG`.
@@ -22,7 +22,7 @@ The following cache configurations can be customized in this way:
- Metadata cache (optional): `CACHE_CONFIG`
- Charting data queried from datasets (optional): `DATA_CACHE_CONFIG`
For example, to configure the filter state cache using Redis:
```python
FILTER_STATE_CACHE_CONFIG = {

View File

@@ -37,7 +37,7 @@ ENV SUPERSET_CONFIG_PATH /app/superset_config.py
```
Docker compose deployments handle application configuration differently using specific conventions.
Refer to the [docker compose tips & configuration](/docs/installation/docker-compose#docker-compose-tips--configuration)
for details.
The following is an example of just a few of the parameters you can set in your `superset_config.py` file:
@@ -314,9 +314,9 @@ CUSTOM_SECURITY_MANAGER = CustomSsoSecurityManager
]
```
### Keycloak-Specific Configuration using Flask-OIDC
If you are using Keycloak as OpenID Connect 1.0 Provider, the above configuration based on [`Authlib`](https://authlib.org/) might not work. In this case using [`Flask-OIDC`](https://pypi.org/project/flask-oidc/) is a viable option.
Make sure the pip package [`Flask-OIDC`](https://pypi.org/project/flask-oidc/) is installed on the webserver. This was successfully tested using version 2.2.0. This package requires [`Flask-OpenID`](https://pypi.org/project/Flask-OpenID/) as a dependency.
The following code defines a new security manager. Add it to a new file named `keycloak_security_manager.py`, placed in the same directory as your `superset_config.py` file.
```python

View File

@@ -55,9 +55,7 @@ are compatible with Superset.
| [ClickHouse](/docs/configuration/databases#clickhouse) | `pip install clickhouse-connect` | `clickhousedb://{username}:{password}@{hostname}:{port}/{database}` |
| [CockroachDB](/docs/configuration/databases#cockroachdb) | `pip install cockroachdb` | `cockroachdb://root@{hostname}:{port}/{database}?sslmode=disable` |
| [Couchbase](/docs/configuration/databases#couchbase) | `pip install couchbase-sqlalchemy` | `couchbase://{username}:{password}@{hostname}:{port}?truststorepath={ssl certificate path}` |
| [CrateDB](/docs/configuration/databases#cratedb) | `pip install sqlalchemy-cratedb` | `crate://{username}:{password}@{hostname}:{port}`, often useful: `?ssl=true/false` or `?schema=testdrive`. |
| [Denodo](/docs/configuration/databases#denodo) | `pip install denodo-sqlalchemy` | `denodo://{username}:{password}@{hostname}:{port}/{database}` |
| [Dremio](/docs/configuration/databases#dremio) | `pip install sqlalchemy_dremio` |`dremio+flight://{username}:{password}@{host}:32010`, often useful: `?UseEncryption=true/false`. For Legacy ODBC: `dremio+pyodbc://{username}:{password}@{host}:31010` |
| [Elasticsearch](/docs/configuration/databases#elasticsearch) | `pip install elasticsearch-dbapi` | `elasticsearch+http://{user}:{password}@{host}:9200/` |
| [Exasol](/docs/configuration/databases#exasol) | `pip install sqlalchemy-exasol` | `exa+pyodbc://{username}:{password}@{hostname}:{port}/my_schema?CONNECTIONLCALL=en_US.UTF-8&driver=EXAODBC` |
| [Google BigQuery](/docs/configuration/databases#google-bigquery) | `pip install sqlalchemy-bigquery` | `bigquery://{project_id}` |
@@ -72,7 +70,7 @@ are compatible with Superset.
| [PostgreSQL](/docs/configuration/databases#postgres) | `pip install psycopg2` | `postgresql://<UserName>:<DBPassword>@<Database Host>/<Database Name>` |
| [Presto](/docs/configuration/databases#presto) | `pip install pyhive` | `presto://` |
| [Rockset](/docs/configuration/databases#rockset) | `pip install rockset-sqlalchemy` | `rockset://<api_key>:@<api_server>` |
| [SAP Hana](/docs/configuration/databases#hana) | `pip install hdbcli sqlalchemy-hana` or `pip install apache-superset[hana]` | `hana://{username}:{password}@{host}:{port}` |
| [StarRocks](/docs/configuration/databases#starrocks) | `pip install starrocks` | `starrocks://<User>:<Password>@<Host>:<Port>/<Catalog>.<Database>` |
| [Snowflake](/docs/configuration/databases#snowflake) | `pip install snowflake-sqlalchemy` | `snowflake://{user}:{password}@{account}.{region}/{database}?role={role}&warehouse={warehouse}` |
| SQLite | No additional library needed | `sqlite://path/to/file.db?check_same_thread=false` |
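All of the entries in this table are standard SQLAlchemy URLs, so they can also be composed programmatically. A minimal sketch using SQLAlchemy's `URL` helper (hostname and credentials below are made up), which percent-escapes special characters in credentials:

```python
from sqlalchemy.engine import URL

# Compose a SQLAlchemy URL like the table rows above; URL.create escapes
# special characters (e.g. "@", "/") in the password for us.
url = URL.create(
    drivername="postgresql",
    username="superset",
    password="p@ss/word",  # hypothetical credentials
    host="db.example.com",
    port=5432,
    database="analytics",
)
print(url.render_as_string(hide_password=False))
```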
@@ -395,33 +393,21 @@ couchbase://{username}:{password}@{hostname}:{port}?truststorepath={certificate
#### CrateDB
The connector library for CrateDB is [sqlalchemy-cratedb].
We recommend adding the following item to your `requirements.txt` file:
```
sqlalchemy-cratedb>=0.40.1,<1
```
An SQLAlchemy connection string for [CrateDB Self-Managed] on localhost,
for evaluation purposes, looks like this:
```
crate://crate@127.0.0.1:4200
```
An SQLAlchemy connection string for connecting to [CrateDB Cloud] looks like
this:
```
crate://<username>:<password>@<clustername>.cratedb.net:4200/?ssl=true
```
Follow the steps [here](/docs/configuration/databases#installing-database-drivers)
to install the CrateDB connector package when setting up Superset locally using
Docker Compose.
```
echo "sqlalchemy-cratedb" >> ./docker/requirements-local.txt
```
[CrateDB Cloud]: https://cratedb.com/product/cloud
[CrateDB Self-Managed]: https://cratedb.com/product/self-managed
[sqlalchemy-cratedb]: https://pypi.org/project/sqlalchemy-cratedb/
#### Databend
@@ -526,16 +512,6 @@ For a connection to a SQL endpoint you need to use the HTTP path from the endpoi
```
#### Denodo
The recommended connector library for Denodo is
[denodo-sqlalchemy](https://pypi.org/project/denodo-sqlalchemy/).
The expected connection string is formatted as follows (default port is 9996):
```
denodo://{username}:{password}@{hostname}:{port}/{database}
```
#### Dremio
@@ -546,7 +522,7 @@ The recommended connector library for Dremio is
The expected connection string for ODBC (Default port is 31010) is formatted as follows:
```
dremio+pyodbc://{username}:{password}@{host}:{port}/{database_name}/dremio?SSL=1
```
The expected connection string for Arrow Flight (Dremio 4.9.1+. Default port is 32010) is formatted as follows:
@@ -1331,10 +1307,6 @@ Here's what the connection string looks like:
starrocks://<User>:<Password>@<Host>:<Port>/<Catalog>.<Database>
```
:::note
StarRocks maintains their Superset documentation [here](https://docs.starrocks.io/docs/integrations/BI_integrations/Superset/).
:::
#### Teradata
The recommended connector library is

View File

@@ -11,7 +11,7 @@ version: 1
To configure CORS, or cross-origin resource sharing, the following dependency must be installed:
```python
pip install apache-superset[cors]
```
The following keys in `superset_config.py` can be specified to configure CORS:
@@ -24,65 +24,9 @@ The following keys in `superset_config.py` can be specified to configure CORS:
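As a sketch, a `superset_config.py` fragment enabling CORS might look like the following (`CORS_OPTIONS` is passed through to Flask-CORS, so any Flask-CORS option is valid; the origin below is made up):

```python
# Hypothetical superset_config.py fragment.
ENABLE_CORS = True
CORS_OPTIONS = {
    "supports_credentials": True,
    "allow_headers": ["X-CSRFToken", "Content-Type", "Authorization"],
    "resources": ["/api/*"],
    "origins": ["https://dashboards.example.com"],  # made-up origin
}
```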
## HTTP headers
Note that Superset bundles [flask-talisman](https://pypi.org/project/talisman/),
self-described as a small Flask extension that handles setting HTTP headers that can help
protect against a few common web application security issues.
## HTML Embedding of Dashboards and Charts
There are two ways to embed a dashboard: Using the [SDK](https://www.npmjs.com/package/@superset-ui/embedded-sdk) or embedding a direct link. Note that in the latter case everybody who knows the link is able to access the dashboard.
### Embedding a Public Direct Link to a Dashboard
This works by first changing the content security policy (CSP) of [flask-talisman](https://github.com/GoogleCloudPlatform/flask-talisman) to allow for certain domains to display Superset content. Then a dashboard can be made publicly accessible, i.e. **bypassing authentication**. Once made public, the dashboard's URL can be added to an iframe in another website's HTML code.
#### Changing flask-talisman CSP
Add to `superset_config.py` the entire `TALISMAN_CONFIG` section from `config.py` and include a `frame-ancestors` section:
```python
TALISMAN_ENABLED = True
TALISMAN_CONFIG = {
"content_security_policy": {
...
"frame-ancestors": ["*.my-domain.com", "*.another-domain.com"],
...
```
Restart Superset for this configuration change to take effect.
#### Making a Dashboard Public
1. Add the `'DASHBOARD_RBAC': True` [Feature Flag](https://github.com/apache/superset/blob/master/RESOURCES/FEATURE_FLAGS.md) to `superset_config.py`
2. Add the `Public` role to your dashboard as described [here](https://superset.apache.org/docs/using-superset/creating-your-first-dashboard/#manage-access-to-dashboards)
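As a sketch, step 1 can be expressed as the following `superset_config.py` fragment (the `FEATURE_FLAGS` dict is Superset's standard feature-flag mechanism):

```python
# Hypothetical superset_config.py fragment enabling dashboard RBAC.
FEATURE_FLAGS = {
    "DASHBOARD_RBAC": True,
}
```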
#### Embedding a Public Dashboard
Now anybody can directly access the dashboard's URL. You can embed it in an iframe like so:
```html
<iframe
width="600"
height="400"
seamless
frameBorder="0"
scrolling="no"
src="https://superset.my-domain.com/superset/dashboard/10/?standalone=1&height=400"
>
</iframe>
```
#### Embedding a Chart
A chart's embed code can be generated by going to a chart's edit view and then clicking at the top right on `...` > `Share` > `Embed code`
### Enabling Embedding via the SDK
Clicking on `...` next to `EDIT DASHBOARD` on the top right of the dashboard's overview page should yield a drop-down menu including the entry "Embed dashboard".
To enable this entry, add the following line to the `.env` file:
```text
SUPERSET_FEATURE_EMBEDDED_SUPERSET=true
```
## CSRF settings
Similarly, [flask-wtf](https://flask-wtf.readthedocs.io/en/0.15.x/config/) is used to manage

View File

@@ -17,8 +17,8 @@ made available in the Jinja context:
- `columns`: the columns to group by in the query
- `filter`: filters applied in the query
- `from_dttm`: start `datetime` value from the selected time range (`None` if undefined) (deprecated beginning in version 5.0, use `get_time_filter` instead)
- `to_dttm`: end `datetime` value from the selected time range (`None` if undefined). (deprecated beginning in version 5.0, use `get_time_filter` instead)
- `groupby`: the columns to group by in the query (deprecated)
- `metrics`: aggregate expressions in the query
- `row_limit`: row limit of the query
@@ -48,15 +48,12 @@ WHERE (
{% if to_dttm is not none %}
dttm_col < '{{ to_dttm }}' AND
{% endif %}
1 = 1
)
```
The `1 = 1` at the end ensures a value is present for the `WHERE` clause even when
the time filter is not set. For many database engines, this could be replaced with `true`.
Note that the Jinja parameters are called within _double_ brackets in the query and with
_single_ brackets in the logic blocks.
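A standalone sketch of this bracket convention, using the `jinja2` library directly rather than Superset's templating layer:

```python
from jinja2 import Template

# {{ ... }} (double brackets) emits a value; {% ... %} (single brackets)
# drives logic, mirroring the virtual-dataset example above.
template = Template(
    "SELECT * FROM logs WHERE "
    "{% if from_dttm is not none %}dttm_col >= '{{ from_dttm }}' AND {% endif %}"
    "1 = 1"
)
print(template.render(from_dttm="2024-01-01"))
# -> SELECT * FROM logs WHERE dttm_col >= '2024-01-01' AND 1 = 1
```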
To add custom functionality to the Jinja context, you need to overload the default Jinja
context in your environment by defining the `JINJA_CONTEXT_ADDONS` in your superset configuration
@@ -97,7 +94,7 @@ There is a special ``_filters`` parameter which can be used to test filters used
```sql
SELECT action, count(*) as times
FROM logs
WHERE action in {{ filter_values('action_type')|where_in }}
GROUP BY action
```
@@ -349,78 +346,6 @@ Here's a concrete example:
order by lineage, level
```
**Time Filter**
The `{{ get_time_filter() }}` macro returns the time filter applied to a specific column. This is useful if you want
to handle time filters inside the virtual dataset, as by default the time filter is placed on the outer query. This can
considerably improve performance, as many databases and query engines are able to optimize the query better
if the temporal filter is placed on the inner query, as opposed to the outer query.
The macro takes the following parameters:
- `column`: Name of the temporal column. Leave undefined to reference the time range from a Dashboard Native Time Range
filter (when present).
- `default`: The default value to fall back to if the time filter is not present, or has the value `No filter`
- `target_type`: The target temporal type as recognized by the target database (e.g. `TIMESTAMP`, `DATE` or
`DATETIME`). If `column` is defined, the format will default to the type of the column. This is used to produce
the format of the `from_expr` and `to_expr` properties of the returned `TimeFilter` object.
- `strftime`: format using the `strftime` method of `datetime` for custom time formatting.
([see docs for valid format codes](https://docs.python.org/3/library/datetime.html#strftime-and-strptime-format-codes)).
When defined `target_type` will be ignored.
- `remove_filter`: When set to true, mark the filter as processed, removing it from the outer query. Useful when a
filter should only apply to the inner query.
The return type has the following properties:
- `from_expr`: the start of the time filter (if any)
- `to_expr`: the end of the time filter (if any)
- `time_range`: The applied time range
Here's a concrete example using the `logs` table from the Superset metastore:
```
{% set time_filter = get_time_filter("dttm", remove_filter=True) %}
{% set from_expr = time_filter.from_expr %}
{% set to_expr = time_filter.to_expr %}
{% set time_range = time_filter.time_range %}
SELECT
*,
'{{ time_range }}' as time_range
FROM logs
{% if from_expr or to_expr %}WHERE 1 = 1
{% if from_expr %}AND dttm >= {{ from_expr }}{% endif %}
{% if to_expr %}AND dttm < {{ to_expr }}{% endif %}
{% endif %}
```
Assuming we are creating a table chart with a simple `COUNT(*)` metric and a `Last week` time filter on the
`dttm` column, this would render the following query on Postgres (note the formatting of the temporal filters and
the absence of time filters on the outer query):
```
SELECT COUNT(*) AS count
FROM
(SELECT *,
'Last week' AS time_range
FROM public.logs
WHERE 1 = 1
AND dttm >= TO_TIMESTAMP('2024-08-27 00:00:00.000000', 'YYYY-MM-DD HH24:MI:SS.US')
AND dttm < TO_TIMESTAMP('2024-09-03 00:00:00.000000', 'YYYY-MM-DD HH24:MI:SS.US')) AS virtual_table
ORDER BY count DESC
LIMIT 1000;
```
When using the `default` parameter, the templated query can be simplified, as the endpoints will always be defined.
To use a fixed time range, you can also pass something like `default="2024-08-27 : 2024-09-03"`:
```
{% set time_filter = get_time_filter("dttm", default="Last week", remove_filter=True) %}
SELECT
*,
'{{ time_filter.time_range }}' as time_range
FROM logs
WHERE
dttm >= {{ time_filter.from_expr }}
AND dttm < {{ time_filter.to_expr }}
```
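For the same `Last week` time range as above, this simplified template would render on Postgres roughly as follows (a sketch based on the earlier rendered example; the exact timestamps depend on the current date):

```
SELECT
  *,
  'Last week' AS time_range
FROM logs
WHERE
  dttm >= TO_TIMESTAMP('2024-08-27 00:00:00.000000', 'YYYY-MM-DD HH24:MI:SS.US')
  AND dttm < TO_TIMESTAMP('2024-09-03 00:00:00.000000', 'YYYY-MM-DD HH24:MI:SS.US')
```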
**Datasets**
It's possible to query physical and virtual datasets using the `dataset` macro. This is useful if you've defined computed columns and metrics on your datasets and want to reuse those definitions in ad-hoc SQL Lab queries.
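As a minimal sketch, assuming a dataset with ID 42 that has a `ds` column (the dataset ID, and the `include_metrics`/`columns` keyword arguments shown here, are illustrative assumptions about the macro's signature):

```
-- query the dataset as-is
SELECT * FROM {{ dataset(42) }} LIMIT 10

-- group by a column and include the dataset's metric definitions
SELECT * FROM {{ dataset(42, include_metrics=True, columns=["ds"]) }} LIMIT 10
```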

@@ -26,9 +26,9 @@ More references:
Here's a list of repositories that contain Superset-related packages:
- [apache/superset](https://github.com/apache/superset)
is the main repository containing the `apache-superset` Python package
is the main repository containing the `apache_superset` Python package
distributed on
[pypi](https://pypi.org/project/apache-superset/). This repository
[pypi](https://pypi.org/project/apache_superset/). This repository
also includes Superset's main TypeScript/JavaScript bundles and react apps under
the [superset-frontend](https://github.com/apache/superset/tree/master/superset-frontend)
folder.

@@ -6,13 +6,13 @@ version: 1
# Setting up a Development Environment
The documentation in this section is a bit of a patchwork of knowledge representing the
multitude of ways that exist to run Superset (`docker compose`, just "docker", on "metal", using
multitude of ways that exist to run Superset (`docker-compose`, just "docker", on "metal", using
a Makefile).
:::note
Now we have evolved to recommend and support `docker compose` more actively as the main way
Now we have evolved to recommend and support `docker-compose` more actively as the main way
to run Superset for development and preserve your sanity. **Most people should stick to
the first few sections - ("Fork & Clone", "docker compose" and "Installing Dev Tools")**
the first few sections - ("Fork & Clone", "docker-compose" and "Installing Dev Tools")**
:::
## Fork and Clone
@@ -27,12 +27,12 @@ git clone git@github.com:your-username/superset.git
cd superset
```
## docker compose (recommended!)
## docker-compose (recommended!)
Setting things up to squeeze a "hello world" into any part of Superset should be as simple as
Setting things up to squeeze an "hello world" into any part of Superset should be as simple as
```bash
docker compose up
docker-compose up
```
Note that:
@@ -45,7 +45,7 @@ Note that:
- **Postgres** as the metadata database and to store example datasets, charts and dashboards which
should be populated upon startup
- **Redis** as the message queue for our async backend and caching backend
- It'll load up examples into the database upon the first startup
- It'll load up examples into the database upon first startup
- all other details and pointers available in
[docker-compose.yml](https://github.com/apache/superset/blob/master/docker-compose.yml)
- The local repository is mounted within the services, meaning updating
@@ -53,17 +53,10 @@ Note that:
- Superset is served at localhost:8088/
- You can login with admin/admin
:::note
Installing and building Node modules for Apache Superset inside `superset-node` can take a
significant amount of time. This is normal due to the size of the dependencies. Please be
patient while the process completes, as long wait times do not indicate an issue with your setup.
If delays seem excessive, check your internet connection or system resources.
:::
:::caution
Since `docker compose` is primarily designed to run a set of containers on **a single host**
Since `docker-compose` is primarily designed to run a set of containers on **a single host**
and can't credibly support **high availability** as a result, we do not support nor recommend
using our `docker compose` constructs to support production-type use-cases. For single host
using our `docker-compose` constructs to support production-type use-cases. For single host
environments, we recommend using [minikube](https://minikube.sigs.k8s.io/docs/start/) along
our [installing on k8s](https://superset.apache.org/docs/installation/running-on-kubernetes)
documentation.
@@ -73,10 +66,10 @@ configured to be secure.
## Installing Development Tools
:::note
While `docker compose` simplifies a lot of the setup, there are still
While docker-compose simplifies a lot of the setup, there are still
many things you'll want to set up locally to power your IDE, and things like
**commit hooks**, **linters**, and **test-runners**. Note that you can do these
things inside docker images with commands like `docker compose exec superset_app bash` for
things inside docker images with commands like `docker-compose exec superset_app bash` for
instance, but many people like to run that tooling from their host.
:::
@@ -99,55 +92,13 @@ To install run the following:
pre-commit install
```
This will install the hooks in your local repository. From now on, a series of checks will
automatically run whenever you make a Git commit.
A series of checks will now run when you make a git commit.
#### Running Pre-commit Manually
You can also run the pre-commit checks manually in various ways:
- **Run pre-commit on all files (same as CI):**
To run the pre-commit checks across all files in your repository, use the following command:
```bash
pre-commit run --all-files
```
This is the same set of checks that will run during CI, ensuring your changes meet the project's standards.
- **Run pre-commit on a specific file:**
If you want to check or fix a specific file, you can do so by specifying the file path:
```bash
pre-commit run --files path/to/your/file.py
```
This will only run the checks on the file(s) you specify.
- **Run a specific pre-commit check:**
To run a specific check (hook) across all files or a particular file, use the following command:
```bash
pre-commit run <hook_id> --all-files
```
Or for a specific file:
```bash
pre-commit run <hook_id> --files path/to/your/file.py
```
Replace `<hook_id>` with the ID of the specific hook you want to run. You can find the list
of available hooks in the `.pre-commit-config.yaml` file.
## Alternatives to `docker compose`
## Alternatives to docker-compose
:::caution
This part of the documentation is a patchwork of information related to setting up
development environments without `docker compose` and is documented/supported to varying
development environments without `docker-compose` and are documented/supported to varying
degrees. It's been difficult to maintain this wide array of methods and ensure they're
functioning across environments.
:::
@@ -157,7 +108,7 @@ functioning across environments.
#### OS Dependencies
Make sure your machine meets the [OS dependencies](https://superset.apache.org/docs/installation/pypi#os-dependencies) before following these steps.
You also need to install MySQL.
You also need to install MySQL or [MariaDB](https://mariadb.com/downloads).
Ensure that you are using Python version 3.9, 3.10 or 3.11, then proceed with:
@@ -187,11 +138,11 @@ superset load-examples
# Start the Flask dev web server from inside your virtualenv.
# Note that your page may not have CSS at this point.
# See instructions below on how to build the front-end assets.
# See instructions below how to build the front-end assets.
superset run -p 8088 --with-threads --reload --debugger --debug
```
Or you can install it via our Makefile
Or you can install via our Makefile
```bash
# Create a virtual environment and activate it (recommended)
@@ -209,7 +160,7 @@ $ make pre-commit
```
**Note: the FLASK_APP env var should not need to be set, as it's currently controlled
via `.flaskenv`, however, if needed, it should be set to `superset.app:create_app()`**
via `.flaskenv`, however if needed, it should be set to `superset.app:create_app()`**
If you have made changes to the FAB-managed templates, which are not built the same way as the newer, React-powered front-end assets, you need to start the app without the `--with-threads` argument like so:
`superset run -p 8088 --reload --debugger --debug`
@@ -274,7 +225,7 @@ Frontend assets (TypeScript, JavaScript, CSS, and images) must be compiled in or
First, be sure you are using the following versions of Node.js and npm:
- `Node.js`: Version 20
- `Node.js`: Version 18
- `npm`: Version 10
We recommend using [nvm](https://github.com/nvm-sh/nvm) to manage your node environment:
@@ -312,7 +263,7 @@ cd superset-frontend
npm ci
```
Note that Superset uses [Scarf](https://docs.scarf.sh) to capture telemetry/analytics about versions being installed, including the `scarf-js` npm package and an analytics pixel. As noted elsewhere in this documentation, Scarf gathers aggregated stats for the sake of security/release strategy and does not capture/retain PII. [You can read here](https://docs.scarf.sh/package-analytics/) about the `scarf-js` package, and various means to opt out of it, but you can opt out of the npm package _and_ the pixel by setting the `SCARF_ANALYTICS` environment variable to `false` or opt out of the pixel by adding this setting in `superset-frontent/package.json`:
Note that Superset uses [Scarf](https://docs.scarf.sh) to capture telemetry/analytics about versions being installed, including the `scarf-js` npm package and an analytics pixel. As noted elsewhere in this documentation, Scarf gathers aggregated stats for the sake of security/release strategy, and does not capture/retain PII. [You can read here](https://docs.scarf.sh/package-analytics/) about the `scarf-js` package, and various means to opt out of it, but you can opt out of the npm package _and_ the pixel by setting the `SCARF_ANALYTICS` environment variable to `false` or opt out of the pixel by adding this setting in `superset-frontent/package.json`:
```json
// your-package/package.json
@@ -340,7 +291,7 @@ Error: ENOSPC: System limit for number of file watchers reached
```
The error is thrown because the number of files monitored by the system has reached the limit.
You can address this error by increasing the number of inotify watchers.
You can address this this error by increasing the number of inotify watchers.
The current value of max watches can be checked with:
@@ -351,13 +302,13 @@ cat /proc/sys/fs/inotify/max_user_watches
Edit the file `/etc/sysctl.conf` to increase this value.
The value needs to be decided based on the system memory [(see this StackOverflow answer for more context)](https://stackoverflow.com/questions/535768/what-is-a-reasonable-amount-of-inotify-watches-with-linux).
Open the file in an editor and add a line at the bottom specifying the max watches values.
Open the file in editor and add a line at the bottom specifying the max watches values.
```bash
fs.inotify.max_user_watches=524288
```
Save the file and exit the editor.
Save the file and exit editor.
To confirm that the change succeeded, run the following command to load the updated value of max_user_watches from `sysctl.conf`:
```bash
@@ -455,7 +406,7 @@ pre-commit install
A series of checks will now run when you make a git commit.
Alternatively, it is possible to run pre-commit via tox:
Alternatively it is possible to run pre-commit via tox:
```bash
tox -e pre-commit
@@ -539,7 +490,7 @@ commands are invoked.
There is also a utility script included in the Superset codebase to run python integration tests. The [readme can be
found here](https://github.com/apache/superset/tree/master/scripts/tests)
To run all integration tests, for example, run this script from the root directory:
To run all integration tests for example, run this script from the root directory:
```bash
scripts/tests/run.sh
@@ -614,14 +565,14 @@ As an alternative you can use docker compose environment for testing:
Make sure you have added below line to your /etc/hosts file:
`127.0.0.1 db`
If you already have launched Docker environment please use the following command to ensure a fresh database instance:
If you already have launched Docker environment please use the following command to assure a fresh database instance:
`docker compose down -v`
Launch environment:
`CYPRESS_CONFIG=true docker compose up`
It will serve the backend and frontend on port 8088.
It will serve backend and frontend on port 8088.
Run Cypress tests:
@@ -658,12 +609,12 @@ For debugging locally using VSCode, you can configure a launch configuration fil
}
```
#### Raw Docker (without `docker compose`)
#### Raw Docker (without docker-compose)
Follow these instructions to debug the Flask app running inside a docker container. Note that
this will run a barebones Superset web server,
First, add the following to the ./docker-compose.yaml file
First add the following to the ./docker-compose.yaml file
```diff
superset:
@@ -777,7 +728,7 @@ See (set capabilities for a container)[https://kubernetes.io/docs/tasks/configur
Once the pod is running as root and has the SYS_PTRACE capability it will be able to debug the Flask app.
You can follow the same instructions as in `docker compose`. Enter the pod and install the required library and packages; gdb, netstat and debugpy.
You can follow the same instructions as in the docker-compose. Enter the pod and install the required library and packages; gdb, netstat and debugpy.
Often in a Kubernetes environment nodes are not addressable from outside the cluster. VSCode will thus be unable to remotely connect to port 5678 on a Kubernetes node. In order to do this you need to create a tunnel that port forwards 5678 to your local machine.
@@ -785,11 +736,11 @@ Often in a Kubernetes environment nodes are not addressable from outside the clu
kubectl port-forward pod/superset-<some random id> 5678:5678
```
You can now launch your VSCode debugger with the same config as above. VSCode will connect to 127.0.0.1:5678 which is forwarded by kubectl to your remote kubernetes POD.
You can now launch your VSCode debugger with the same config as above. VSCode will connect to to 127.0.0.1:5678 which is forwarded by kubectl to your remote kubernetes POD.
### Storybook
Superset includes a [Storybook](https://storybook.js.org/) to preview the layout/styling of various Superset components and variations thereof. To open and view the Storybook:
Superset includes a [Storybook](https://storybook.js.org/) to preview the layout/styling of various Superset components, and variations thereof. To open and view the Storybook:
```bash
cd superset-frontend
@@ -939,7 +890,7 @@ To fix it:
from alembic import op
```
Alternatively, you may also run `superset db merge` to create a migration script
Alternatively you may also run `superset db merge` to create a migration script
just for merging the heads.
```bash

@@ -438,7 +438,7 @@ See [set capabilities for a container](https://kubernetes.io/docs/tasks/configur
Once the pod is running as root and has the `SYS_PTRACE` capability it will be able to debug the Flask app.
You can follow the same instructions as in `docker compose`. Enter the pod and install the required library and packages; gdb, netstat and debugpy.
You can follow the same instructions as in the docker-compose. Enter the pod and install the required library and packages; gdb, netstat and debugpy.
Often in a Kubernetes environment nodes are not addressable from outside the cluster. VSCode will thus be unable to remotely connect to port 5678 on a Kubernetes node. In order to do this you need to create a tunnel that port forwards 5678 to your local machine.

@@ -174,16 +174,13 @@ You can take a look at this Flask-AppBuilder
## Is there a way to force the dashboard to use specific colors?
It is possible on a per-dashboard basis by providing a mapping of labels to colors in the JSON
Metadata attribute using the `label_colors` key. You can use either the full hex color, a named color,
like `red`, `coral` or `lightblue`, or the index in the current color palette (0 for first color, 1 for
second etc). Example:
Metadata attribute using the `label_colors` key.
```json
{
"label_colors": {
"foo": "#FF69B4",
"bar": "lightblue",
"baz": 0
"Girls": "#FF69B4",
"Boys": "#ADD8E6"
}
}
```

@@ -59,29 +59,11 @@ Here are the build presets that are exposed through the `build_docker.py` script
this specific SHA, which could be from a `master` merge, or release.
- `websocket-latest`: The WebSocket image for use in a Superset cluster.
For insights or modifications to the build matrix and tagging conventions,
check the [build_docker.py](https://github.com/apache/superset/blob/master/scripts/build_docker.py)
script and the [docker.yml](https://github.com/apache/superset/blob/master/.github/workflows/docker.yml)
GitHub action.
## Key ARGs in Dockerfile
- `BUILD_TRANSLATIONS`: whether to build the translations into the image. For the
frontend build this tells webpack to strip out all locales other than `en` from
the `moment-timezone` library. For the backend this skips compiling the
`*.po` translation files.
- `DEV_MODE`: whether to skip the frontend build, this is used by our `docker-compose` dev setup
where we mount the local volume and build using `webpack` in `--watch` mode, meaning as you
alter the code in the local file system, webpack, from within a docker image used for this
purpose, will constantly rebuild the frontend as you go. This ARG enables the initial
`docker-compose` build to take much less time and resources
- `INCLUDE_CHROMIUM`: whether to include chromium in the backend build so that it can be
used as a headless browser for workloads related to "Alerts & Reports" and thumbnail generation
- `INCLUDE_FIREFOX`: same as above, but for firefox
- `PY_VER`: specifying the base image for the python backend, we don't recommend altering
this setting if you're not working on forwards or backwards compatibility
## Caching
To accelerate builds, we follow Docker best practices and use `apache/superset-cache`.
@@ -101,7 +83,7 @@ add database support for the database you need.
Currently all automated builds are multi-platform, supporting both `linux/arm64`
and `linux/amd64`. This enables higher level constructs like `helm` and
`docker compose` to point to these images and effectively be multi-platform
docker-compose to point to these images and effectively be multi-platform
as well.
Pull requests and master builds

@@ -13,9 +13,9 @@ import useBaseUrl from "@docusaurus/useBaseUrl";
<br /><br />
:::caution
Since `docker compose` is primarily designed to run a set of containers on **a single host**
Since `docker-compose` is primarily designed to run a set of containers on **a single host**
and can't support requirements for **high availability**, we do not support nor recommend
using our `docker compose` constructs to support production-type use-cases. For single host
using our `docker-compose` constructs to support production-type use-cases. For single host
environments, we recommend using [minikube](https://minikube.sigs.k8s.io/docs/start/) along
our [installing on k8s](https://superset.apache.org/docs/installation/running-on-kubernetes)
documentation.
@@ -26,7 +26,7 @@ Superset locally is using Docker Compose on a Linux or Mac OSX
computer. Superset does not have official support for Windows. It's also the easiest
way to launch a fully functioning **development environment** quickly.
Note that there are 3 major ways we support to run `docker compose`:
Note that there are 3 major ways we support to run docker-compose:
1. **docker-compose.yml:** for interactive development, where we mount your local folder with the
frontend/backend files that you can edit and experience the changes you
@@ -49,9 +49,9 @@ More on these two approaches after setting up the requirements for either.
## Requirements
Note that this documentation assumes that you have [Docker](https://www.docker.com) and
[git](https://git-scm.com/) installed. Note also that we used to use `docker-compose` but that
is on the path to deprecation so we now use `docker compose` instead.
Note that this documentation assumes that you have [Docker](https://www.docker.com),
[docker-compose](https://docs.docker.com/compose/), and
[git](https://git-scm.com/) installed.
## 1. Clone Superset's GitHub repository
@@ -67,7 +67,7 @@ current directory.
## 2. Launch Superset Through Docker Compose
First let's assume you're familiar with `docker compose` mechanics. Here we'll refer generally
First let's assume you're familiar with docker-compose mechanics. Here we'll refer generally
to `docker compose up` even though in some cases you may want to force a check for newer remote
images using `docker compose pull`, force a build with `docker compose build` or force a build
on latest base images using `docker compose build --pull`. In most cases though, the simple
@@ -112,7 +112,7 @@ Here various release tags, github SHA, and latest `master` can be referenced by
Refer to the docker-related documentation to learn more about existing tags you can point to
from Docker Hub.
## `docker compose` tips & configuration
## docker-compose tips & configuration
:::caution
All of the content belonging to a Superset instance - charts, dashboards, users, etc. - is stored in
@@ -137,7 +137,7 @@ You can install additional python packages and apply config overrides by followi
mentioned in [docker/README.md](https://github.com/apache/superset/tree/master/docker#configuration)
Note that `docker/.env` sets the default environment variables for all the docker images
used by `docker compose`, and that `docker/.env-local` can be used to override those defaults.
used by `docker-compose`, and that `docker/.env-local` can be used to override those defaults.
Also note that `docker/.env-local` is referenced in our `.gitignore`,
preventing developers from risking committing potentially sensitive configuration to the repository.

@@ -153,7 +153,9 @@ See [Install Database Drivers](/docs/configuration/databases) for more informati
:::
The following example installs the drivers for BigQuery and Elasticsearch, allowing you to connect to these data sources within your Superset setup:
The following example installs the Big Query and Elasticsearch database drivers so that you can
connect to those datasources in your Superset installation:
```yaml
bootstrapScript: |
#!/bin/bash

@@ -12,7 +12,7 @@ import useBaseUrl from "@docusaurus/useBaseUrl";
<img src={useBaseUrl("/img/pypi.png" )} width="150" />
<br /><br />
This page describes how to install Superset using the `apache-superset` package [published on PyPI](https://pypi.org/project/apache-superset/).
This page describes how to install Superset using the `apache_superset` package [published on PyPI](https://pypi.org/project/apache_superset/).
## OS Dependencies
@@ -22,18 +22,18 @@ level dependencies.
**Debian and Ubuntu**
In Ubuntu **20.04 and 22.04** the following command will ensure that the required dependencies are installed:
```bash
sudo apt-get install build-essential libssl-dev libffi-dev python3-dev python3-pip libsasl2-dev libldap2-dev default-libmysqlclient-dev
```
In Ubuntu **before 20.04** the following command will ensure that the required dependencies are installed:
The following command will ensure that the required dependencies are installed:
```bash
sudo apt-get install build-essential libssl-dev libffi-dev python-dev python-pip libsasl2-dev libldap2-dev default-libmysqlclient-dev
```
In Ubuntu 20.04 the following command will ensure that the required dependencies are installed:
```bash
sudo apt-get install build-essential libssl-dev libffi-dev python3-dev python3-pip libsasl2-dev libldap2-dev default-libmysqlclient-dev
```
**Fedora and RHEL-derivative Linux distributions**
Install the following packages using the `yum` package manager:
@@ -128,10 +128,10 @@ command line.
### Installing and Initializing Superset
First, start by installing `apache-superset`:
First, start by installing `apache_superset`:
```bash
pip install apache-superset
pip install apache_superset
```
Then, you need to initialize the database:

@@ -32,7 +32,7 @@ docker compose up
To upgrade superset in a native installation, run the following commands:
```bash
pip install apache-superset --upgrade
pip install apache_superset --upgrade
```
## Upgrading the Metadata Database

@@ -40,7 +40,6 @@ container images and will load up some examples. Once all containers
are downloaded and the output settles, you're ready to log in.
⚠️ If you get an error message like `validating superset\docker-compose-image-tag.yml: services.superset-worker-beat.env_file.0 must be a string`, you need to update your version of `docker-compose`.
Note that `docker-compose` is on the path to deprecation and you should now use `docker compose` instead.
### 3. Log into Superset

@@ -2,6 +2,21 @@
title: CVEs fixed by release
sidebar_position: 2
---
#### Version 4.1.2
| CVE | Title | Affected |
|:---------------|:-----------------------------------------------------------------------------------|---------:|
| CVE-2025-27696 | Improper authorization leading to resource ownership takeover | < 4.1.2 |
#### Version 4.1.0
| CVE | Title | Affected |
|:---------------|:-----------------------------------------------------------------------------------|---------:|
| CVE-2024-53947 | Improper SQL authorisation, parse for specific postgres functions | < 4.1.0 |
| CVE-2024-53948 | Error verbosity exposes metadata in analytics databases | < 4.1.0 |
| CVE-2024-53949 | Lower privilege users are able to create Role when FAB_ADD_SECURITY_API is enabled | < 4.1.0 |
| CVE-2024-55633 | SQLLab Improper readonly query validation allows unauthorized write access | < 4.1.0 |
#### Version 4.0.2
| CVE | Title | Affected |

@@ -27,34 +27,33 @@ following information about each flight is given:
You may need to enable the functionality to upload a CSV or Excel file to your database. The following section
explains how to enable this functionality for the examples database.
In the top menu, select **Settings ‣ Data ‣ Database Connections**. Find the **examples** database in the list and
In the top menu, select **Data ‣ Databases**. Find the **examples** database in the list and
select the **Edit** button.
<img src={useBaseUrl("/img/tutorial/edit-record.png" )} />
In the resulting modal window, switch to the **Advanced** tab and open **Security** section.
Then, tick the checkbox for **Allow file uploads to database**. End by clicking the **Finish** button.
In the resulting modal window, switch to the **Extra** tab and
tick the checkbox for **Allow Data Upload**. End by clicking the **Save** button.
<img src={useBaseUrl("/img/tutorial/allow-file-uploads.png" )} />
<img src={useBaseUrl("/img/tutorial/add-data-upload.png" )} />
### Loading CSV Data
Download the CSV dataset to your computer from
[GitHub](https://raw.githubusercontent.com/apache-superset/examples-data/master/tutorial_flights.csv).
In the top menu, select **Settings ‣ Data ‣ Database Connections**. Then, **Upload file to database ‣ Upload CSV**.
In the Superset menu, select **Data ‣ Upload a CSV**.
<img src={useBaseUrl("/img/tutorial/upload_a_csv.png" )} />
Then, select the CSV file from your computer, select **Database** and **Schema**, and enter the **Table Name**
as _tutorial_flights_.
Then, enter the **Table Name** as _tutorial_flights_ and select the CSV file from your computer.
<img src={useBaseUrl("/img/tutorial/csv_to_database_configuration.png" )} />
Next enter the text _Travel Date_ into the **File settings ‣ Columns to be parsed as dates** field.
Next enter the text _Travel Date_ into the **Parse Dates** field.
<img src={useBaseUrl("/img/tutorial/parse_dates_column.png" )} />
Leaving all the other options in their default settings, select **Upload** at the bottom of the page.
Leaving all the other options in their default settings, select **Save** at the bottom of the page.
### Table Visualization

@@ -203,13 +203,18 @@ const config = {
({
docs: {
sidebarPath: require.resolve('./sidebars.js'),
editUrl: 'https://github.com/apache/superset/edit/master/docs',
editUrl:
({versionDocsDirPath, docPath}) => {
if (docPath === 'intro.md') {
return 'https://github.com/apache/superset/edit/master/README.md'
}
return `https://github.com/apache/superset/edit/master/docs/${versionDocsDirPath}/${docPath}`
}
},
blog: {
showReadingTime: true,
// Please change this to your repo.
editUrl:
'https://github.com/facebook/docusaurus/edit/main/website/blog/',
editUrl: 'https://github.com/facebook/docusaurus/edit/main/website/blog/',
},
theme: {
customCss: require.resolve('./src/styles/custom.css'),

@@ -17,40 +17,40 @@
"typecheck": "tsc"
},
"dependencies": {
"@algolia/client-search": "^5.12.0",
"@ant-design/icons": "^5.4.0",
"@docsearch/react": "^3.6.3",
"@docusaurus/core": "^3.5.2",
"@docusaurus/plugin-client-redirects": "^3.5.2",
"@docusaurus/preset-classic": "^3.5.2",
"@algolia/client-search": "^4.24.0",
"@ant-design/icons": "^5.3.7",
"@docsearch/react": "^3.6.0",
"@docusaurus/core": "^3.4.0",
"@docusaurus/plugin-client-redirects": "^3.4.0",
"@docusaurus/preset-classic": "^3.4.0",
"@emotion/core": "^10.1.1",
"@emotion/styled": "^10.0.27",
"@mdx-js/react": "^3.1.0",
"@saucelabs/theme-github-codeblock": "^0.3.0",
"@mdx-js/react": "^3.0.0",
"@saucelabs/theme-github-codeblock": "^0.2.3",
"@superset-ui/style": "^0.14.23",
"@svgr/webpack": "^8.1.0",
"antd": "^5.21.6",
"antd": "^4.19.3",
"buffer": "^6.0.3",
"clsx": "^2.1.1",
"docusaurus-plugin-less": "^2.0.2",
"file-loader": "^6.2.0",
"less": "^4.2.0",
"less-loader": "^11.0.0",
"prism-react-renderer": "^2.4.0",
"prism-react-renderer": "^2.3.1",
"react": "^18.3.1",
"react-dom": "^18.3.1",
"react-github-btn": "^1.4.0",
"react-svg-pan-zoom": "^3.13.1",
"react-svg-pan-zoom": "^3.12.1",
"stream": "^0.0.3",
"swagger-ui-react": "^5.17.14",
"url-loader": "^4.1.1"
},
"devDependencies": {
"@docusaurus/module-type-aliases": "^3.5.2",
"@docusaurus/tsconfig": "^3.5.2",
"@types/react": "^18.3.12",
"typescript": "^5.6.3",
"webpack": "^5.96.1"
"@docusaurus/module-type-aliases": "^3.4.0",
"@docusaurus/tsconfig": "^3.4.0",
"@types/react": "^18.3.3",
"typescript": "^5.5.2",
"webpack": "^5.92.1"
},
"browserslist": {
"production": [

@@ -132,9 +132,4 @@ export const Databases = [
href: 'https://www.couchbase.com/',
imgName: 'couchbase.svg',
},
{
title: 'Denodo',
href: 'https://www.denodo.com/',
imgName: 'denodo.png',
},
];

@@ -16,6 +16,8 @@
* specific language governing permissions and limitations
* under the License.
*/
@import '~antd/lib/style/themes/default.less';
@import '~antd/dist/antd.less'; // Import Ant Design styles by less entry
@import 'antd-theme.less';
body {

(7 binary image files changed, not shown; 2 file diffs were suppressed because they are too large.)


@@ -15,7 +15,7 @@
# limitations under the License.
#
apiVersion: v2
appVersion: "4.1.1"
appVersion: "4.0.1"
description: Apache Superset is a modern, enterprise-ready business intelligence web application
name: superset
icon: https://artifacthub.io/image/68c1d717-0e97-491f-b046-754e46f46922@2x
@@ -29,7 +29,7 @@ maintainers:
- name: craig-rueda
email: craig@craigrueda.com
url: https://github.com/craig-rueda
- version: 0.13.3
+ version: 0.12.12
dependencies:
- name: postgresql
version: 12.1.6


@@ -23,7 +23,7 @@ NOTE: This file is generated by helm-docs: https://github.com/norwoodj/helm-docs
# superset
- ![Version: 0.13.3](https://img.shields.io/badge/Version-0.13.3-informational?style=flat-square)
+ ![Version: 0.12.12](https://img.shields.io/badge/Version-0.12.12-informational?style=flat-square)
Apache Superset is a modern, enterprise-ready business intelligence web application
@@ -69,7 +69,6 @@ On helm this can be set on `extraSecretEnv.SUPERSET_SECRET_KEY` or `configOverri
| extraConfigs | object | `{}` | Extra files to mount on `/app/pythonpath` |
| extraEnv | object | `{}` | Extra environment variables that will be passed into pods |
| extraEnvRaw | list | `[]` | Extra environment variables in RAW format that will be passed into pods |
- | extraLabels | object | `{}` | Labels to be added to all resources |
| extraSecretEnv | object | `{}` | Extra environment variables to pass as secrets |
| extraSecrets | object | `{}` | Extra files to mount on `/app/pythonpath` as secrets |
| extraVolumeMounts | list | `[]` | |


@@ -28,9 +28,6 @@ metadata:
chart: {{ template "superset.chart" . }}
release: {{ .Release.Name }}
heritage: {{ .Release.Service }}
- {{- if .Values.extraLabels }}
- {{- toYaml .Values.extraLabels | nindent 4 }}
- {{- end }}
data:
{{- range $path, $config := .Values.extraConfigs }}
{{ $path }}: |
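
The `{{- if .Values.extraLabels }}` blocks in these templates render a chart-level `extraLabels` value into each resource's labels. A hypothetical `values.yaml` fragment (keys invented for illustration) of the kind such templates consume:

```yaml
# Hypothetical values.yaml fragment: these key/value pairs would be
# merged into the labels of every resource rendered by the chart
extraLabels:
  team: analytics
  environment: staging
```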


@@ -28,9 +28,6 @@ metadata:
chart: {{ template "superset.chart" . }}
release: {{ .Release.Name }}
heritage: {{ .Release.Service }}
- {{- if .Values.extraLabels }}
- {{- toYaml .Values.extraLabels | nindent 4 }}
- {{- end }}
{{- if .Values.supersetCeleryBeat.deploymentAnnotations }}
annotations: {{- toYaml .Values.supersetCeleryBeat.deploymentAnnotations | nindent 4 }}
{{- end }}
@@ -61,9 +58,6 @@ spec:
labels:
app: "{{ template "superset.name" . }}-celerybeat"
release: {{ .Release.Name }}
- {{- if .Values.extraLabels }}
- {{- toYaml .Values.extraLabels | nindent 8 }}
- {{- end }}
{{- if .Values.supersetCeleryBeat.podLabels }}
{{- toYaml .Values.supersetCeleryBeat.podLabels | nindent 8 }}
{{- end }}


@@ -28,9 +28,6 @@ metadata:
chart: {{ template "superset.chart" . }}
release: {{ .Release.Name }}
heritage: {{ .Release.Service }}
- {{- if .Values.extraLabels }}
- {{- toYaml .Values.extraLabels | nindent 4 }}
- {{- end }}
{{- if .Values.supersetCeleryFlower.deploymentAnnotations }}
annotations: {{- toYaml .Values.supersetCeleryFlower.deploymentAnnotations | nindent 4 }}
{{- end }}
@@ -50,9 +47,6 @@ spec:
labels:
app: "{{ template "superset.name" . }}-flower"
release: {{ .Release.Name }}
- {{- if .Values.extraLabels }}
- {{- toYaml .Values.extraLabels | nindent 8 }}
- {{- end }}
{{- if .Values.supersetCeleryFlower.podLabels }}
{{- toYaml .Values.supersetCeleryFlower.podLabels | nindent 8 }}
{{- end }}


@@ -27,9 +27,6 @@ metadata:
chart: {{ template "superset.chart" . }}
release: {{ .Release.Name }}
heritage: {{ .Release.Service }}
- {{- if .Values.extraLabels }}
- {{- toYaml .Values.extraLabels | nindent 4 }}
- {{- end }}
{{- if .Values.supersetWorker.deploymentLabels }}
{{- toYaml .Values.supersetWorker.deploymentLabels | nindent 4 }}
{{- end }}
@@ -67,9 +64,6 @@ spec:
labels:
app: {{ template "superset.name" . }}-worker
release: {{ .Release.Name }}
- {{- if .Values.extraLabels }}
- {{- toYaml .Values.extraLabels | nindent 8 }}
- {{- end }}
{{- if .Values.supersetWorker.podLabels }}
{{- toYaml .Values.supersetWorker.podLabels | nindent 8 }}
{{- end }}


@@ -28,9 +28,6 @@ metadata:
chart: {{ template "superset.chart" . }}
release: {{ .Release.Name }}
heritage: {{ .Release.Service }}
- {{- if .Values.extraLabels }}
- {{- toYaml .Values.extraLabels | nindent 4 }}
- {{- end }}
{{- if .Values.supersetWebsockets.deploymentAnnotations }}
annotations: {{- toYaml .Values.supersetWebsockets.deploymentAnnotations | nindent 4 }}
{{- end }}
@@ -53,9 +50,6 @@ spec:
labels:
app: "{{ template "superset.name" . }}-ws"
release: {{ .Release.Name }}
- {{- if .Values.extraLabels }}
- {{- toYaml .Values.extraLabels | nindent 8 }}
- {{- end }}
{{- if .Values.supersetWebsockets.podLabels }}
{{- toYaml .Values.supersetWebsockets.podLabels | nindent 8 }}
{{- end }}


@@ -27,9 +27,6 @@ metadata:
chart: {{ template "superset.chart" . }}
release: {{ .Release.Name }}
heritage: {{ .Release.Service }}
- {{- if .Values.extraLabels }}
- {{- toYaml .Values.extraLabels | nindent 4 }}
- {{- end }}
{{- if .Values.supersetNode.deploymentLabels }}
{{- toYaml .Values.supersetNode.deploymentLabels | nindent 4 }}
{{- end }}
@@ -69,9 +66,6 @@ spec:
labels:
app: {{ template "superset.name" . }}
release: {{ .Release.Name }}
- {{- if .Values.extraLabels }}
- {{- toYaml .Values.extraLabels | nindent 8 }}
- {{- end }}
{{- if .Values.supersetNode.podLabels }}
{{- toYaml .Values.supersetNode.podLabels | nindent 8 }}
{{- end }}


@@ -27,9 +27,6 @@ metadata:
chart: {{ template "superset.chart" . }}
release: {{ .Release.Name }}
heritage: {{ .Release.Service }}
- {{- if .Values.extraLabels }}
- {{- toYaml .Values.extraLabels | nindent 4 }}
- {{- end }}
spec:
scaleTargetRef:
apiVersion: apps/v1


@@ -27,9 +27,6 @@ metadata:
chart: {{ template "superset.chart" . }}
release: {{ .Release.Name }}
heritage: {{ .Release.Service }}
- {{- if .Values.extraLabels }}
- {{- toYaml .Values.extraLabels | nindent 4 }}
- {{- end }}
spec:
scaleTargetRef:
apiVersion: apps/v1


@@ -29,9 +29,6 @@ metadata:
chart: {{ template "superset.chart" . }}
release: {{ .Release.Name }}
heritage: {{ .Release.Service }}
- {{- if .Values.extraLabels }}
- {{- toYaml .Values.extraLabels | nindent 4 }}
- {{- end }}
{{- with .Values.ingress.annotations }}
annotations: {{- toYaml . | nindent 4 }}
{{- end }}


@@ -23,10 +23,6 @@ kind: Job
metadata:
name: {{ template "superset.fullname" . }}-init-db
namespace: {{ .Release.Namespace }}
- {{- if .Values.extraLabels }}
- labels:
- {{- toYaml .Values.extraLabels | nindent 4 }}
- {{- end }}
{{- if .Values.init.jobAnnotations }}
annotations: {{- toYaml .Values.init.jobAnnotations | nindent 4 }}
{{- end }}


@@ -31,9 +31,6 @@ metadata:
chart: {{ template "superset.chart" $ }}
release: {{ $.Release.Name }}
heritage: {{ $.Release.Service }}
- {{- if $.Values.extraLabels }}
- {{- toYaml $.Values.extraLabels | nindent 4 }}
- {{- end }}
spec:
{{- if .minAvailable }}
minAvailable: {{ .minAvailable }}


@@ -31,9 +31,6 @@ metadata:
chart: {{ template "superset.chart" $ }}
release: {{ $.Release.Name }}
heritage: {{ $.Release.Service }}
- {{- if $.Values.extraLabels }}
- {{- toYaml $.Values.extraLabels | nindent 4 }}
- {{- end }}
spec:
{{- if .minAvailable }}
minAvailable: {{ .minAvailable }}


@@ -31,9 +31,6 @@ metadata:
chart: {{ template "superset.chart" $ }}
release: {{ $.Release.Name }}
heritage: {{ $.Release.Service }}
- {{- if $.Values.extraLabels }}
- {{- toYaml $.Values.extraLabels | nindent 4 }}
- {{- end }}
spec:
{{- if .minAvailable }}
minAvailable: {{ .minAvailable }}


@@ -31,9 +31,6 @@ metadata:
chart: {{ template "superset.chart" $ }}
release: {{ $.Release.Name }}
heritage: {{ $.Release.Service }}
- {{- if $.Values.extraLabels }}
- {{- toYaml $.Values.extraLabels | nindent 4 }}
- {{- end }}
spec:
{{- if .minAvailable }}
minAvailable: {{ .minAvailable }}


@@ -31,9 +31,6 @@ metadata:
chart: {{ template "superset.chart" $ }}
release: {{ $.Release.Name }}
heritage: {{ $.Release.Service }}
- {{- if $.Values.extraLabels }}
- {{- toYaml $.Values.extraLabels | nindent 4 }}
- {{- end }}
spec:
{{- if .minAvailable }}
minAvailable: {{ .minAvailable }}


@@ -27,9 +27,6 @@ metadata:
chart: {{ template "superset.chart" . }}
release: "{{ .Release.Name }}"
heritage: "{{ .Release.Service }}"
- {{- if .Values.extraLabels }}
- {{- toYaml .Values.extraLabels | nindent 4 }}
- {{- end }}
type: Opaque
stringData:
REDIS_HOST: {{ tpl .Values.supersetNode.connections.redis_host . | quote }}
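
For context, when `extraLabels` is set, the `toYaml … nindent` lines append those pairs beneath the standard chart labels. A sketch of the resulting rendered metadata, with release name and label values assumed for illustration:

```yaml
# Hypothetical rendered metadata for a chart resource
metadata:
  labels:
    app: superset
    chart: superset-0.13.3
    release: my-release
    heritage: Helm
    team: analytics        # injected from .Values.extraLabels
```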

Some files were not shown because too many files have changed in this diff.