Compare commits


159 Commits

Author SHA1 Message Date
Michael S. Molina
a5f0b24806 chore: Adds RC3 data to CHANGELOG.md 2025-05-28 16:47:24 -03:00
Michael S. Molina
f9ff3259b3 fix: Makes time compare migration more resilient (#33592)
(cherry picked from commit b7ba50033a)
2025-05-28 08:34:38 -03:00
Michael S. Molina
00cd069af2 fix: Missing processor context when rendering Jinja (#33596)
(cherry picked from commit ce9759785a)
2025-05-28 08:34:38 -03:00
Luiz Otavio
cfba5cdc57 fix: Adjust viz migrations to also migrate the queries object (#33285)
Co-authored-by: Michael S. Molina <michael.s.molina@gmail.com>
Co-authored-by: Michael S. Molina <70410625+michael-s-molina@users.noreply.github.com>
(cherry picked from commit 57183da315)
2025-05-26 15:08:17 -03:00
Richard Fogaca Nienkotter
685b259f6f fix(sankey): incorrect nodeValues (#33431)
Co-authored-by: richardfn <richard.fogaca@appsilon.com>
(cherry picked from commit 38868f9ff4)
2025-05-26 15:08:17 -03:00
Vitor Avila
8021a8ce42 fix(AllEntities): Display action buttons according to the user permissions (#33553)
(cherry picked from commit 546945e7a6)
2025-05-23 09:48:07 -03:00
Mike Klumpenaar
979e3a228f fix(user settings): Update forked cosmo theme to resolve down chevron in caret style (#30514) (#30577)
Co-authored-by: garriscp <garriscp@gmail.com>
(cherry picked from commit b7d3ff1e85)
2025-05-23 09:48:07 -03:00
amaannawab923
405fcfc5b1 fix(table): table sort by fix (#33540)
Co-authored-by: Amaan Nawab <nelsondrew07@gmail.com>
Co-authored-by: Geido <60598000+geido@users.noreply.github.com>
2025-05-23 09:47:47 -03:00
Rafael Benitez
a86a156e49 fix(Sqllab): Autocomplete got stuck in UI when open it too fast (#33522)
(cherry picked from commit b4e2406385)
2025-05-23 09:42:21 -03:00
Elizabeth Thompson
ec2b168d7a fix: allow metadata to parse json (#33444)
(cherry picked from commit b050897ebd)
2025-05-23 09:42:21 -03:00
JUST.in DO IT
c54778d7e7 fix(table-chart): time shift is not working (#33425)
(cherry picked from commit dc4474889d)
2025-05-23 09:42:21 -03:00
Syed Bariman Jan
6bdfb7ad6c fix(deckgl): fix deckgl multiple layers chart filter and viewport (#33364)
(cherry picked from commit 29ac507d56)
2025-05-23 09:42:21 -03:00
Mehmet Salih Yavuz
cd12f30db2 fix(Row): don't unload charts while embedded to reduce rerenders (#33422)
(cherry picked from commit 21ca26acd7)
2025-05-23 09:42:21 -03:00
Maxime Beauchemin
602208d68a fix: loading examples from raw.githubusercontent.com fails with 429 errors (#33354)
(cherry picked from commit f045a73e2d)
2025-05-12 13:46:39 -03:00
Đỗ Trọng Hải
8ffcd7e960 fix(be/utils): sync cache timeout for memoized function (#31917)
Signed-off-by: hainenber <dotronghai96@gmail.com>
(cherry picked from commit 4ed05f4ff1)
2025-05-12 12:07:52 -03:00
Shao Yu-Lung (Allen)
782e11b59c fix(i18n): zh_TW pybabel compile error: placeholders are incompatible (#33345) 2025-05-12 12:07:37 -03:00
github-actions[bot]
aeb50c505f chore(🦾): bump python h11 0.14.0 -> 0.16.0 (#33339)
Co-authored-by: GitHub Action <action@github.com>
(cherry picked from commit 82526865d2)
2025-05-12 12:07:20 -03:00
Vitor Avila
2f3658c750 fix: Edge case with metric not getting quoted in sort by when normalize_columns is enabled (#33337)
(cherry picked from commit 9f0ae77341)
2025-05-12 12:07:20 -03:00
Michael S. Molina
7f57df5fa4 fix: Temporal filter conversion in viz migrations (#33224)
(cherry picked from commit 6db3a4d9d2)
2025-05-12 12:07:20 -03:00
Beto Dealmeida
ce6d0d5963 fix: improve function detection (#33306)
(cherry picked from commit 339ba96600)
2025-05-12 12:07:20 -03:00
JUST.in DO IT
8cc6f260c7 fix(echarts): rename time series shifted colnames (#33269)
(cherry picked from commit ef14b529b8)
2025-05-12 12:07:20 -03:00
Beto Dealmeida
ee8edcf4b4 fix: mask password on DB import (#33267) 2025-05-12 12:07:07 -03:00
Daniel Vaz Gaspar
f56757a0e9 fix: LocalProxy is not mapped warning (#33025)
(cherry picked from commit c029b532d4)
2025-05-12 12:06:50 -03:00
Evan Rusackas
850f35add1 fix(histogram): remove extra single quotes (#33248)
(cherry picked from commit d2360b533b)
2025-05-12 12:06:50 -03:00
Vitor Avila
f8b0632f99 fix(DB update): Gracefully handle query error during DB update (#33250)
(cherry picked from commit de84a534ac)
2025-05-12 12:06:50 -03:00
Sam Firke
a6895e07b0 fix(heatmap): correctly render int and boolean falsy values on axes (#33238)
(cherry picked from commit ac636c73ae)
2025-05-12 12:06:50 -03:00
Vitor Avila
a3c08fd9bf fix(sqllab permalink): Commit SQL Lab permalinks (#33237)
(cherry picked from commit fbd8ae2888)
2025-05-12 12:06:50 -03:00
Vitor Avila
5b178dfbc7 fix(standalone): Ensure correct URL param value for standalone mode (#33234) 2025-05-12 12:06:33 -03:00
JUST.in DO IT
692d47c79d fix(antd): Invalid dashed border in tertiary button (#33291) 2025-04-30 08:17:57 -03:00
Vitor Avila
84b437d0a7 fix(export): Full CSV/Excel exports respecting SQL_MAX_ROW config (#33214)
(cherry picked from commit f7b7aace38)
2025-04-24 15:43:08 -03:00
JUST.in DO IT
f4ea15e477 fix(sqllab): Invalid SQL Error breaks SQL Lab (#33164) 2025-04-24 15:42:55 -03:00
Evan Rusackas
7c46374202 fix(deckgl): Update Arc to properly adjust line width (#33154)
(cherry picked from commit b589d44dfb)
2025-04-24 15:41:24 -03:00
Jacob Amrany
d9c1ee67f5 fix: os.makedirs race condition (#33161)
(cherry picked from commit 00f1fdb3c4)
2025-04-24 15:41:24 -03:00
JUST.in DO IT
35d7e15841 fix(echart): Thrown errors shown after resized (#33143)
(cherry picked from commit 172e5dd095)
2025-04-24 15:41:24 -03:00
JUST.in DO IT
454397fc17 fix(echart): Tooltip date format doesn't follow time grain (#33138)
(cherry picked from commit 7333ffd41e)
2025-04-17 09:46:45 -03:00
Jillian
aab564fa58 fix(lang): patch FAB's LocaleView to redirect to previous page (#31692) 2025-04-17 09:46:26 -03:00
JUST.in DO IT
5e475ecb7b fix(dashboard): invalid active tab state (#33106)
(cherry picked from commit 342e6f3ab0)
2025-04-17 09:43:54 -03:00
Michael S. Molina
0c80ab07f7 fix: Viz migration error handling (#33037)
(cherry picked from commit bc0ffe0d10)
2025-04-17 09:43:54 -03:00
Daniel Höxtermann
03255819a4 fix(playwright): allow screenshotting empty dashboards (#33107)
(cherry picked from commit 2233c02720)
2025-04-17 09:43:54 -03:00
Michael S. Molina
49541333d0 fix: Bump FAB to 4.5.5 2025-04-14 14:14:27 -03:00
Maxime Beauchemin
65c76024f6 fix: resolve recent merge collision (#33110) 2025-04-14 14:08:42 -03:00
Michael S. Molina
d0aa6338b8 fix: Allows configuration of Selenium Webdriver binary (#33103)
(cherry picked from commit e1f5c49df7)
2025-04-14 14:08:42 -03:00
Daniel Höxtermann
bf2e014cde fix(thumbnails): ensure consistent cache_key (#33109)
(cherry picked from commit 347c174099)
2025-04-14 14:08:42 -03:00
Erkka Tahvanainen
65c9375624 fix(dashboard): Generate screenshot via celery (#32193)
Co-authored-by: Erkka Tahvanainen <erkka.tahvanainen@confidently.fi>
(cherry picked from commit 5656d69c04)
2025-04-14 14:08:42 -03:00
Hossein Khalilian
1a1986a191 fix(docker): fallback to pip if uv is not available (#33087)
(cherry picked from commit 164a07e2be)
2025-04-14 14:08:42 -03:00
Michael S. Molina
d0355d57bf fix: Adds missing __init__ file to commands/logs (#33059)
(cherry picked from commit c1159c53e3)
2025-04-14 14:08:41 -03:00
JUST.in DO IT
e0e2d329e0 fix: improve error type on parse error (#33048)
(cherry picked from commit ed0cd5e7b0)
2025-04-14 14:08:41 -03:00
EmmanuelCbd
f85f3f8b56 fix(export): charts csv export in dashboards (#31720)
(cherry picked from commit 6b7394e789)
2025-04-14 14:08:41 -03:00
JUST.in DO IT
ddecaa4c11 fix(log): Missing failed query log on async queries (#33024)
(cherry picked from commit 9b15e04bc4)
2025-04-14 14:08:41 -03:00
Levis Mbote
8df4b6d4e0 fix: fix bug where dashboard did not enter fullscreen mode. (#32839) 2025-04-14 14:08:41 -03:00
Hugo Lavernhe
052392e24d fix(dashboard): chart fullscreen issue when filter pane is collapsed (#28428)
(cherry picked from commit 629b137bb0)
2025-04-14 14:08:41 -03:00
Hex Café
1b60d36513 fix: show_filters URL parameter is not working (#29422)
Co-authored-by: Evan Rusackas <evan@preset.io>
Co-authored-by: Vitor Avila <vitor.avila@preset.io>
(cherry picked from commit bcb43327b1)
2025-04-14 14:08:41 -03:00
Michael S. Molina
4e8aa6de07 fix: Bar Chart (legacy) migration to keep labels layout (#32965)
(cherry picked from commit 24b1666273)
2025-04-14 14:08:41 -03:00
SBIN2010
e756094efc fix: fixed Add Metrics to Tree Chart (#29158) (#30679)
(cherry picked from commit f5d64176f6)
2025-04-14 14:08:41 -03:00
JUST.in DO IT
6a78260b66 fix(pivot-table): Revert "fix(Pivot Table): Fix column width to respect currency config (#31414)" (#32968)
(cherry picked from commit a36e636a58)
2025-04-14 14:08:41 -03:00
notHuman9504
5620acac61 fix: Clicking in the body of a Markdown component does not put it into edit mode (#32384)
(cherry picked from commit 26743dfcee)
2025-04-14 14:08:41 -03:00
github-actions[bot]
766c795106 chore(🦾): bump python sqlglot 26.1.3 -> 26.11.1 (#32745)
Co-authored-by: GitHub Action <action@github.com>
(cherry picked from commit 66c1a6a875)
2025-04-14 14:08:41 -03:00
Michael S. Molina
5bc3ca295c chore: Adds RC2 data to CHANGELOG.md 2025-04-14 14:08:41 -03:00
JUST.in DO IT
e4d34902d3 fix(sqllab): Invalid display of table column keys (#32763)
(cherry picked from commit 56bf17f879)
2025-04-14 14:08:41 -03:00
Vitor Avila
f27bf9ea64 fix(Jinja): Emit time grain to table charts even if they don't have a temporal column (#32871)
(cherry picked from commit ab22bb1878)
2025-04-14 14:08:41 -03:00
Đỗ Trọng Hải
56f6e1196c fix(backend/async_events): allow user to configure username for Redis authentication in GLOBAL_ASYNC_QUERIES_CACHE_BACKEND (#32372)
Signed-off-by: hainenber <dotronghai96@gmail.com>
Co-authored-by: Ville Brofeldt <33317356+villebro@users.noreply.github.com>
(cherry picked from commit e0ed652ed8)
2025-04-14 14:08:41 -03:00
Luke Hart
2c6cdfe7ad fix: use role_model from security manager (#32873)
(cherry picked from commit 103fedaf92)
2025-04-14 14:08:41 -03:00
SBIN2010
2e0363ec36 fix(ColorPickerControl): change color picker control width (#32851)
(cherry picked from commit 37f626f5e2)
2025-04-14 14:08:41 -03:00
Vitor Avila
34b4edb372 fix(table-chart): Do not show comparison columns config if time_compare is set to [] (#32863)
(cherry picked from commit f0dc1e7527)
2025-04-14 14:08:41 -03:00
Christiaan Baartse
b5bb3c76a0 fix(translation): Dutch translations for Current datetime filter (#31869)
(cherry picked from commit 6c7f089ebb)
2025-04-14 14:08:41 -03:00
Beto Dealmeida
c63a33e76b fix: update dataset/query catalog on DB changes (#32829) 2025-04-14 14:08:41 -03:00
Vitor Avila
ea3a823fff fix(echarts): Sort series by name using naturalCompare (#32850)
(cherry picked from commit 5222f940cc)
2025-04-14 14:08:41 -03:00
JUST.in DO IT
e593bc0a2e fix(log): store navigation path to get correct logging path (#32795)
(cherry picked from commit 4a70065e5f)
2025-04-14 14:08:41 -03:00
Fardin Mustaque
7ed236170d fix: Time Comparison Feature Reverts Metric Labels to Metric Keys in Table Charts (#32665)
Co-authored-by: Fardin Mustaque <fardinmustaque@Fardins-Mac-mini.local>
(cherry picked from commit 7d77dc4fd2)
2025-04-14 14:08:41 -03:00
Chris
c070d9e8e1 fix: key error in frontend on disallowed GSheets (#32792)
(cherry picked from commit 6f69c84d10)
2025-04-14 14:08:41 -03:00
SBIN2010
2bdfe52075 fix: CSV/Excel upload form change column dates description (#32797) 2025-04-14 14:08:41 -03:00
Đỗ Trọng Hải
fd1e44b8f6 fix(sec): resolve CVE-2025-29907 and CVE-2025-25977 by pinning jspdf to v3 (#32802)
Signed-off-by: hainenber <dotronghai96@gmail.com>
2025-04-14 14:08:41 -03:00
Đỗ Trọng Hải
e943604db8 fix(model/helper): represent RLS filter clause in proper textual SQL string (#32406)
Signed-off-by: hainenber <dotronghai96@gmail.com>
(cherry picked from commit ff0529c932)
2025-04-14 14:08:41 -03:00
CharlesNkdl
59d03a3847 fix(excel export): big number truncation handling (#32739)
(cherry picked from commit c0f83a7467)
2025-04-14 14:08:41 -03:00
V9 Developer
231d9a321a fix(config): correct slack image url in talisman (#32778)
(cherry picked from commit 9bb3a5782d)
2025-04-14 14:08:40 -03:00
Ruslan
1c3c6dbe0f fix(css): typos in styles (#28350)
(cherry picked from commit 5ec710efc6)
2025-04-14 14:08:40 -03:00
Vladislav Korenkov
a530da2a36 fix(import): Missing catalog field in saved query schema (#32775)
Co-authored-by: Vladislav Koren'kov <korenkov.vv@dns-shop.ru>
(cherry picked from commit 5866f3ec83)
2025-04-14 14:08:40 -03:00
Antonio Rivero
a6576a1769 fix(sqllab): Pass query_id as kwarg so backoff can see it (#32774)
(cherry picked from commit 01801e3c36)
2025-04-14 14:08:40 -03:00
Vladislav Korenkov
13b97af7f6 fix(chart control): Change default of "Y Axis Title Margin" (#32720)
(cherry picked from commit d319543377)
2025-04-14 14:08:40 -03:00
Elizabeth Thompson
5d333ac6dc fix: do not add calculated columns when syncing (#32761)
(cherry picked from commit 89ce7ba0b0)
2025-03-20 09:44:56 -03:00
Giampaolo Capelli
4a7014b5aa fix: Changing language doesn't affect echarts charts (#31751)
Co-authored-by: Giampaolo Capelli <giampaolo.capelli@docaposte.fr>
(cherry picked from commit 78efb62781)
2025-03-20 09:44:56 -03:00
sowo
f6f1ffae2f fix(contextmenu): uncaught TypeError (#28203)
(cherry picked from commit 29b62f7c0a)
2025-03-20 09:44:56 -03:00
Daniel Höxtermann
6917362d78 fix: ensure datasource permission in explore (#32679)
(cherry picked from commit 9e3052968b)
2025-03-20 09:44:56 -03:00
Paul Rhodes
6119d797e4 fix(import): Ensure import exceptions are logged (#32410)
(cherry picked from commit bc3e19d0a2)
2025-03-20 09:44:56 -03:00
Beto Dealmeida
8ba265ca2b fix: coerce datetime conversion errors (#32683) 2025-03-20 09:44:40 -03:00
JUST.in DO IT
4a189945d8 fix(logging): missing path in event data (#32708)
(cherry picked from commit cd5a94305c)
2025-03-20 09:43:48 -03:00
Beto Dealmeida
9b8194fd8a fix: boolean filters in Explore (#32701)
(cherry picked from commit 41bf215367)
2025-03-20 09:43:48 -03:00
Sam Firke
30e3e2e437 fix(spreadsheet uploads): make file extension comparisons case-insensitive (#32696)
(cherry picked from commit 6a13ab8920)
2025-03-20 09:43:48 -03:00
Đỗ Trọng Hải
980d912cc6 fix(cosmetics): allow toast message to be toggled off when modal is opened (#32691)
Signed-off-by: hainenber <dotronghai96@gmail.com>
(cherry picked from commit f1a222d356)
2025-03-20 09:43:48 -03:00
Michael S. Molina
d2ba0fc9ae fix: Signature of Celery pruner jobs (#32699)
(cherry picked from commit df06bdf33b)
2025-03-20 09:43:48 -03:00
JUST.in DO IT
a1d7f7adcd fix(log): Update recent_activity by event name (#32681)
(cherry picked from commit 449f51aed5)
2025-03-20 09:43:48 -03:00
Michael S. Molina
2ff9faba0e fix: Update RELEASING/README.md (#32678)
(cherry picked from commit b4dd64aa24)
2025-03-20 09:43:47 -03:00
Beto Dealmeida
3f2ddbac82 fix(gsheets): update params from encrypted extra (#32661) 2025-03-20 09:43:29 -03:00
Vitor Avila
24c120135c fix(import): Import a DB connection with expanded rows enabled (#32657)
(cherry picked from commit 0c6d868483)
2025-03-20 09:40:09 -03:00
Andrey Yakir
45cf969d4b fix(dashboard): Ensure dashboardId is included in form_data for embedded mode (#32646)
(cherry picked from commit 777760b096)
2025-03-20 09:40:09 -03:00
Dolph Mathews
43e68c7a0f fix: Upgrade node base image to Debian 12 bookworm (#32652)
(cherry picked from commit 2f6f5c6778)
2025-03-20 09:38:02 -03:00
JUST.in DO IT
1f34e3cf7c fix(welcome): perf on distinct recent activities (#32608)
(cherry picked from commit 832e028b39)
2025-03-17 12:01:23 -03:00
JUST.in DO IT
2f9edd3b0e fix(dashboard): Support bigint value in native filters (#32549)
(cherry picked from commit e7721a8c4d)
2025-03-17 12:01:23 -03:00
Vitor Avila
bdb9f48044 fix(Slack V2): Specify the filename for the Slack upload method (#32599)
(cherry picked from commit 8e021b0c82)
2025-03-17 12:01:23 -03:00
Michael S. Molina
a0321945cc fix: Log table retention policy (#32572)
(cherry picked from commit 89b6d7fb68)
2025-03-17 12:01:23 -03:00
Elizabeth Thompson
586d88c9ca fix: add DateOffset to json serializer (#32532) 2025-03-17 12:01:06 -03:00
JUST.in DO IT
91ba5b3c9e fix(sqllab): Allow clear on schema and catalog (#32515)
(cherry picked from commit 4c3aae7583)
2025-03-17 11:54:59 -03:00
Antonio Rivero
d493c9d3ca fix(migrations): Handle comparator None in old time comparison migration (#32538)
(cherry picked from commit 20e5df501e)
2025-03-17 11:54:59 -03:00
Elizabeth Thompson
d33b81a43c fix: keep calculated columns when datasource is updated (#32523)
(cherry picked from commit 99238dccbb)
2025-03-17 11:52:58 -03:00
Elizabeth Thompson
2df776e944 fix: Show response message as default error (#32507)
(cherry picked from commit c2de749d0e)
2025-03-17 11:52:58 -03:00
Vitor Avila
405bc269d4 fix(Slack): Fix Slack recipients migration to V2 (#32336)
(cherry picked from commit d2e0e2b79c)
2025-03-17 11:52:58 -03:00
Usiel Riedl
d0a5bd83c3 fix(beat): prune_query celery task args fix (#32511)
(cherry picked from commit e98194cdd3)
2025-03-17 11:52:58 -03:00
Kamil Gabryjelski
266fb7f2ad fix(explore): Glitch in a tooltip with metric's name (#32499)
(cherry picked from commit b3dfd4930a)
2025-03-05 14:17:58 -03:00
Daniel Vaz Gaspar
98c0eeccb2 fix: dashboard, chart and dataset import validation (#32500)
(cherry picked from commit fc844d3dfd)
2025-03-05 14:17:58 -03:00
Beto Dealmeida
5f1a5a1f98 fix: skip DB filter when doing OAuth2 (#32486)
(cherry picked from commit 813e79fa9f)
2025-03-05 14:17:58 -03:00
Evan Rusackas
6dd10ab1ca fix(tooltip): displaying <a> tags correctly (#32488)
(cherry picked from commit 6c3886aad0)
2025-03-05 14:17:58 -03:00
Ville Brofeldt
92549ef5c6 fix(plugin-chart-echarts): remove erroneous upper bound value (#32473)
(cherry picked from commit 5766c36372)
2025-03-05 14:17:58 -03:00
Đỗ Trọng Hải
f1291390a3 fix(com/grid-comp/markdown): pin remark-gfm to v3 to allow inline code block by backticks in Markdown (#32420)
Signed-off-by: hainenber <dotronghai96@gmail.com>
2025-03-05 14:16:40 -03:00
Le Xich Long
8a09fd93e1 fix(clickhouse): get_parameters_from_uri failing when secure is true (#32423)
(cherry picked from commit 84b52b2323)
2025-03-05 14:12:31 -03:00
Damian Pendrak
b11fd2f22b fix(viz): update nesting logic to handle multiple dimensions in PartitionViz (#32290)
(cherry picked from commit 6317a91541)
2025-03-05 14:12:31 -03:00
Yuri
d611c25f2a fix(pinot): revert join and subquery flags (#32382)
(cherry picked from commit 822d72c57d)
2025-03-05 13:37:50 -03:00
Daniel Vaz Gaspar
ae3493aee9 fix: bump FAB to 4.5.4 (#32325)
(cherry picked from commit c02a0a00f4)
2025-03-05 13:37:50 -03:00
Beto Dealmeida
654062db27 fix: ensure metric_macro expands templates (#32344) 2025-03-05 10:01:38 -03:00
Dino
09a93d2803 fix: clickhouse-connect engine SSH parameter (#32348)
(cherry picked from commit 8dcae810d4)
2025-03-05 09:47:51 -03:00
Vedant Prajapati
98c82ad24e fix(docker): Configure nginx for consistent port mapping and hot reloading (#32362)
(cherry picked from commit 0f07d78e01)
2025-03-05 09:47:51 -03:00
Beto Dealmeida
3eda2223ca fix(firebolt): allow backslash escape for single quotes (#32350)
(cherry picked from commit 22fe985cfc)
2025-03-05 09:47:51 -03:00
Enzo Martellucci
7ef38fed24 fix(SSHTunnelForm): make the password tooltip visible (#32356)
(cherry picked from commit 4c4b5e8c64)
2025-03-05 09:47:51 -03:00
Levis Mbote
724f2dc5fe fix(roles): Add SqlLabPermalinkRestApi as default sqlab roles. (#32284)
(cherry picked from commit 2c37ddb2f6)
2025-03-05 09:47:51 -03:00
Đỗ Trọng Hải
7aa4cd4eef fix(fe/dashboard-list): display modifier info for Last modified data (#32035)
Signed-off-by: hainenber <dotronghai96@gmail.com>
(cherry picked from commit 88cf2d5c39)
2025-03-05 09:47:51 -03:00
Elizabeth Thompson
b2bd39cc28 fix: revert "fix: remove sort values on stacked totals (#31333)" (#32337)
(cherry picked from commit 422a07b382)
2025-03-05 09:47:51 -03:00
Dmitry Kochnev
e8246ea786 fix: oauth2 trino (#31993)
(cherry picked from commit 7ce1a3445c)
2025-02-21 13:23:31 -03:00
Kamil Gabryjelski
2c03455f61 fix: Download as PDF fails due to cache error (#32332)
(cherry picked from commit 42a3c523ae)
2025-02-21 13:23:31 -03:00
Steven Liu
45045d3a1c fix: keep the tab order (#30888)
Co-authored-by: Steven Liu <steven.l@covergenius.com>
(cherry picked from commit d5a5bd46d2)
2025-02-21 13:23:31 -03:00
Đỗ Trọng Hải
146311101a fix(viz/table): selected column not shown in Conditional Formatting popover (#32272)
Signed-off-by: hainenber <dotronghai96@gmail.com>
(cherry picked from commit dcc9628f31)
2025-02-21 13:23:31 -03:00
Michael S. Molina
c7cc436b0b fix: Decimal values for Histogram bins (#32253)
(cherry picked from commit ffe9244458)
2025-02-14 09:12:31 -03:00
Erkka Tahvanainen
ed217ce903 fix(Datasource): handle undefined datasource_type in fetchSyncedColumns (#32218)
Co-authored-by: Erkka Tahvanainen <erkka.tahvanainen@confidently.fi>
(cherry picked from commit 9da30956c0)
2025-02-14 09:12:31 -03:00
gpchandran
4a298e83d7 fix: upgrade to 3.11.11-slim-bookworm to address critical vulnerabilities (#32240)
(cherry picked from commit ad057324b7)
2025-02-14 09:12:31 -03:00
Elizabeth Thompson
39859e0d17 fix: remove sort values on stacked totals (#31333)
(cherry picked from commit 15fbb195e9)
2025-02-14 09:12:31 -03:00
Maxime Beauchemin
adce138484 docs: adding notes about using uv instead of raw pip (#32239) 2025-02-14 09:12:20 -03:00
Fardin Mustaque
e90b359d3e fix: Update 'Last modified' time when modifying RLS rules (#32227)
Co-authored-by: Fardin Mustaque <fardinmustaque@Fardins-Mac-mini.local>
(cherry picked from commit 52563d3eea)
2025-02-14 09:10:04 -03:00
Daniel Vaz Gaspar
6b95ecda98 chore(ci): fix ephemeral env null issue number (v2) (#32221)
(cherry picked from commit 31d6f5a639)
2025-02-12 16:23:54 -03:00
Daniel Vaz Gaspar
d0417c37b6 chore(ci): fix ephemeral env null issue number (#32220)
(cherry picked from commit 60424c4ccd)
2025-02-12 16:23:54 -03:00
Levis Mbote
c6e764ebbe fix(Scope): Correct issue where filters appear out of scope when sort is unchecked. (#32115)
(cherry picked from commit af3589fe91)
2025-02-12 11:40:29 -03:00
JUST.in DO IT
8b1de4c12c fix(sqllab): close the table tab (#32224)
(cherry picked from commit 937d40cdde)
2025-02-12 11:40:29 -03:00
Maxime Beauchemin
699d22c5fe fix: set Rich tooltip -> 'Show percentage' to false by default (#32212)
(cherry picked from commit d3b854a833)
2025-02-12 11:40:29 -03:00
Enzo Martellucci
8ba36abc87 fix(SaveDatasetModal): repairs field alignment in the SaveDatasetModal component (#32222)
Co-authored-by: Geido <60598000+geido@users.noreply.github.com>
(cherry picked from commit 650fa5ccfb)
2025-02-12 11:40:29 -03:00
Beto Dealmeida
000f2e3780 fix: hydrate datasetsStatus (#32211)
(cherry picked from commit eec54affc3)
2025-02-12 11:40:28 -03:00
Damian Pendrak
9ea2ad5105 fix: handlebars html and css templates reset on dataset update (#32195)
(cherry picked from commit 0f6bd5ea83)
2025-02-12 11:40:28 -03:00
Alex Duan
d83ae362cf fix: TDengine move tdengine.png to databases/ subfolder (#32176)
(cherry picked from commit 06f8f8e608)
2025-02-12 11:40:28 -03:00
Michael S. Molina
ed82e6ac75 fix: Adds an entry to UPDATING.md about DISABLE_LEGACY_DATASOURCE_EDITOR (#32185) 2025-02-12 11:40:16 -03:00
Levis Mbote
53d05a460d fix(sqllab): correct URL format for SQL Lab permalinks (#32154)
(cherry picked from commit f9f8c5d07a)
2025-02-12 11:38:52 -03:00
Jack
8ad24fabc9 fix(virtual dataset sync): Sync virtual dataset columns when changing the SQL query (#30903)
Co-authored-by: Kamil Gabryjelski <kamil.gabryjelski@gmail.com>
(cherry picked from commit f3e7c64de6)
2025-02-12 11:38:52 -03:00
EmmanuelCbd
cbebac9404 fix(docker): Docker python-translation-build (#32163)
(cherry picked from commit 5a8488af36)
2025-02-12 11:38:52 -03:00
Beto Dealmeida
50705e7253 fix: ScreenshotCachePayload serialization (#32156)
(cherry picked from commit e8990f4a36)
2025-02-12 11:38:52 -03:00
Antonio Rivero
e67089b5ce fix(migrations): Handle no params in time comparison migration (#32155)
(cherry picked from commit 6ed9dae2f7)
2025-02-12 11:38:52 -03:00
Đỗ Trọng Hải
923d076603 fix(releasing): fix borked SVN-based image building process (#32151)
Signed-off-by: hainenber <dotronghai96@gmail.com>
(cherry picked from commit ea5879bf2b)
2025-02-12 11:38:52 -03:00
Beto Dealmeida
92bc43e38b fix: move oauth2 capture to get_sqla_engine (#32137)
(cherry picked from commit c7c3b1b0e9)
2025-02-12 11:38:52 -03:00
Michael S. Molina
deb88e4174 fix: Local tarball Docker container is missing zstd dependency (#32135)
(cherry picked from commit c64018d421)
2025-02-12 11:38:52 -03:00
Elizabeth Thompson
faae9cc326 chore(timeseries charts): adjust legend width by padding (#32030)
(cherry picked from commit 8984f88a3e)
2025-02-12 11:38:51 -03:00
Michael S. Molina
890186a8e2 chore: Adds RC1 data to CHANGELOG.md and UPDATING.md 2025-02-04 14:00:05 -03:00
Michael S. Molina
b80aa864a1 fix: No virtual environment when running Docker translation compiler (#32133)
(cherry picked from commit 53d944d013)
2025-02-04 11:50:13 -03:00
Daniel Vaz Gaspar
b63256786d fix(ci): ephemeral env, handle different label, create comment (#32040)
(cherry picked from commit 0cd0fcdecb)
2025-02-04 11:50:13 -03:00
Mehmet Salih Yavuz
95694aa233 fix(datepicker): Full width datepicker on filter value select (#32064)
(cherry picked from commit cde2d49c95)
2025-02-04 11:50:13 -03:00
Michael S. Molina
15c60e8f2b fix: Histogram examples config (#32122) 2025-02-03 14:07:16 -03:00
2523 changed files with 83404 additions and 189733 deletions


@@ -17,12 +17,6 @@
# https://cwiki.apache.org/confluence/display/INFRA/.asf.yaml+features+for+git+repositories
---
notifications:
commits: commits@superset.apache.org
issues: notifications@superset.apache.org
pullrequests: notifications@superset.apache.org
discussions: notifications@superset.apache.org
github:
del_branch_on_merge: true
description: "Apache Superset is a Data Visualization and Data Exploration Platform"
@@ -54,8 +48,6 @@ github:
projects: true
# Enable wiki for documentation
wiki: true
# Enable discussions
discussions: true
enabled_merge_buttons:
squash: true


@@ -1,36 +0,0 @@
# .coveragerc to control coverage.py
[run]
branch = True
source = superset
# omit = bad_file.py
[paths]
source =
superset/
*/site-packages/
[report]
# Regexes for lines to exclude from consideration
exclude_lines =
# Have to re-enable the standard pragma
pragma: no cover
# Don't complain about missing debug-only code:
def __repr__
if self\.debug
# Don't complain if tests don't hit defensive assertion code:
raise AssertionError
raise NotImplementedError
# Don't complain if non-runnable code isn't run:
if 0:
if __name__ == .__main__.:
# Ignore importlib backport
from importlib
if TYPE_CHECKING:
#fail_under = 100
show_missing = True


@@ -1,125 +0,0 @@
---
description: Apache Superset development standards and guidelines for Cursor IDE
globs: ["**/*.py", "**/*.ts", "**/*.tsx", "**/*.js", "**/*.jsx", "**/*.sql", "**/*.md"]
alwaysApply: true
---
# Apache Superset Development Standards for Cursor IDE
Apache Superset is a data visualization platform with Flask/Python backend and React/TypeScript frontend.
## ⚠️ CRITICAL: Ongoing Refactors (What NOT to Do)
**These migrations are actively happening - avoid deprecated patterns:**
### Frontend Modernization
- **NO `any` types** - Use proper TypeScript types
- **NO JavaScript files** - Convert to TypeScript (.ts/.tsx)
- **NO Enzyme** - Use React Testing Library/Jest (Enzyme fully removed)
- **Use @superset-ui/core** - Don't import Ant Design directly
### Testing Strategy Migration
- **Prefer unit tests** over integration tests
- **Prefer integration tests** over Cypress end-to-end tests
- **Cypress is last resort** - Actively moving away from Cypress
- **Use Jest + React Testing Library** for component testing
### Backend Type Safety
- **Add type hints** - All new Python code needs proper typing
- **MyPy compliance** - Run `pre-commit run mypy` to validate
- **SQLAlchemy typing** - Use proper model annotations
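A minimal sketch of the typed-model pattern these points describe, assuming SQLAlchemy 2.0-style `Mapped` annotations (illustrative only; the model and column names are hypothetical, not actual Superset code):
```python
# Illustrative sketch only, not actual Superset code.
from __future__ import annotations

from sqlalchemy.orm import DeclarativeBase, Mapped, mapped_column


class Base(DeclarativeBase):
    pass


class ExampleChart(Base):
    """Hypothetical model showing the fully-typed pattern MyPy can check."""

    __tablename__ = "example_chart"

    id: Mapped[int] = mapped_column(primary_key=True)
    slice_name: Mapped[str] = mapped_column()
    description: Mapped[str | None] = mapped_column(default=None)

    def summary(self) -> str:
        """Return a short human-readable label for this chart."""
        return f"{self.slice_name} (#{self.id})"
```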
## Code Standards
### TypeScript Frontend
- **NO `any` types** - Use proper TypeScript
- **Functional components** with hooks
- **@superset-ui/core** for UI components (not direct antd)
- **Jest** for testing (NO Enzyme)
- **Redux** for global state, hooks for local
### Python Backend
- **Type hints required** for all new code
- **MyPy compliant** - run `pre-commit run mypy`
- **SQLAlchemy models** with proper typing
- **pytest** for testing
### Apache License Headers
- **New files require ASF license headers** - When creating new code files, include the standard Apache Software Foundation license header
- **LLM instruction files are excluded** - Files like LLMS.md, CLAUDE.md, etc. are in `.rat-excludes` to avoid header token overhead
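For a Python file, the standard header is the familiar ASF block (the same one visible atop the deleted `.pylintrc` further down this page):
```python
#
# Licensed to the Apache Software Foundation (ASF) under one or more
# contributor license agreements. See the NOTICE file distributed with
# this work for additional information regarding copyright ownership.
# The ASF licenses this file to You under the Apache License, Version 2.0
# (the "License"); you may not use this file except in compliance with
# the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
```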
## Key Directory Structure
```
superset/
├── superset/ # Python backend (Flask, SQLAlchemy)
│ ├── views/api/ # REST API endpoints
│ ├── models/ # Database models
│ └── connectors/ # Database connections
├── superset-frontend/src/ # React TypeScript frontend
│ ├── components/ # Reusable components
│ ├── explore/ # Chart builder
│ ├── dashboard/ # Dashboard interface
│ └── SqlLab/ # SQL editor
├── superset-frontend/packages/
│ └── superset-ui-core/ # UI component library (USE THIS)
├── tests/ # Python/integration tests
├── docs/ # Documentation (UPDATE FOR CHANGES)
└── UPDATING.md # Breaking changes log
```
## Architecture Patterns
### Dataset-Centric Approach
Charts built from enriched datasets containing:
- Dimension columns with labels/descriptions
- Predefined metrics as SQL expressions
- Self-service analytics within defined contexts
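For illustration only, a hypothetical enriched dataset could be described like this (a sketch; the field names loosely echo Superset's dataset export format but are not the authoritative schema):
```python
# Hypothetical enriched-dataset description; a sketch, not the actual schema.
dataset = {
    "table_name": "orders",
    "columns": [
        # Dimension columns carry labels and descriptions for self-service users
        {"column_name": "country", "verbose_name": "Country", "description": "Customer country"},
    ],
    "metrics": [
        # Metrics are predefined as SQL expressions, not computed client-side
        {"metric_name": "revenue", "expression": "SUM(price * quantity)"},
    ],
}
```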
### Security & Features
- **RBAC**: Role-based access via Flask-AppBuilder
- **Feature flags**: Control feature rollouts
- **Row-level security**: SQL-based data access control
## Test Utilities
### Python Test Helpers
- **`SupersetTestCase`** - Base class in `tests/integration_tests/base_tests.py`
- **`@with_config`** - Config mocking decorator
- **`@with_feature_flags`** - Feature flag testing
- **`login_as()`, `login_as_admin()`** - Authentication helpers
- **`create_dashboard()`, `create_slice()`** - Data setup utilities
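A rough sketch of how these Python helpers fit together (import paths, decorator arguments, and method placement are assumptions; check the definitions under `tests/integration_tests/` before relying on them):
```python
# Rough sketch; helper import paths and signatures are assumptions.
from tests.integration_tests.base_tests import SupersetTestCase
from tests.integration_tests.conftest import with_feature_flags  # assumed location


class TestDashboardApi(SupersetTestCase):
    @with_feature_flags(SOME_HYPOTHETICAL_FLAG=True)  # flag name is made up
    def test_list_dashboards_as_admin(self):
        """Admins should be able to list dashboards via the REST API."""
        self.login_as_admin()  # assumed to be available on the base class
        response = self.client.get("/api/v1/dashboard/")
        assert response.status_code == 200
```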
### TypeScript Test Helpers
- **`superset-frontend/spec/helpers/testing-library.tsx`** - Custom render() with providers
- **`createWrapper()`** - Redux/Router/Theme wrapper
- **`selectOption()`** - Select component helper
- **React Testing Library** - NO Enzyme (removed)
## Pre-commit Validation
**Use pre-commit hooks for quality validation:**
```bash
# Install hooks
pre-commit install
# Quick validation (faster than --all-files)
pre-commit run # Staged files only
pre-commit run mypy # Python type checking
pre-commit run prettier # Code formatting
pre-commit run eslint # Frontend linting
```
## Development Guidelines
- **Documentation**: Update docs/ for any user-facing changes
- **Breaking Changes**: Add to UPDATING.md
- **Docstrings**: Required for new functions/classes
- **Follow existing patterns**: Mimic code style, use existing libraries and utilities
- **Type Safety**: This codebase is actively modernizing toward full TypeScript and type safety
- **Always run `pre-commit run`** to validate changes before committing
---
**Note**: This codebase is actively modernizing toward full TypeScript and type safety. Always run `pre-commit run` to validate changes. Follow the ongoing refactors section to avoid deprecated patterns.

.gitattributes

@@ -1,4 +1,3 @@
docker/**/*.sh text eol=lf
*.svg binary
*.ipynb binary
*.geojson binary

.github/CODEOWNERS

@@ -16,7 +16,7 @@
# Notify E2E test maintainers of changes
/superset-frontend/cypress-base/ @sadpandajoe @geido @eschutho @rusackas @betodealmeida @mistercrunch
/superset-frontend/cypress-base/ @sadpandajoe @geido @eschutho @rusackas @betodealmeida
# Notify PMC members of changes to GitHub Actions


@@ -41,8 +41,8 @@ body:
label: Superset version
options:
- master / latest-dev
- "5.0.0"
- "4.1.3"
- "4.1.1"
- "4.0.2"
validations:
required: true
- type: dropdown


@@ -1 +0,0 @@
../LLMS.md


@@ -1,5 +1,4 @@
version: 2
enable-beta-ecosystems: true
updates:
- package-ecosystem: "github-actions"
@@ -12,10 +11,6 @@ updates:
# not until React >= 18.0.0
- dependency-name: "storybook"
- dependency-name: "@storybook*"
# JSDOM v30 doesn't play well with Jest v30
# Source: https://jestjs.io/blog#known-issues
# GH thread: https://github.com/jsdom/jsdom/issues/3492
- dependency-name: "jest-environment-jsdom"
directory: "/superset-frontend/"
schedule:
interval: "monthly"
@@ -26,16 +21,9 @@ updates:
versioning-strategy: increase
# NOTE: `uv` support is in beta, more details here:
# https://github.com/dependabot/dependabot-core/pull/10040#issuecomment-2696978430
- package-ecosystem: "uv"
directory: "requirements/"
open-pull-requests-limit: 10
schedule:
interval: "weekly"
labels:
- uv
- dependabot
# - package-ecosystem: "pip"
# NOTE: as dependabot isn't compatible with our usage of `uv pip compile` we're using
# `supersetbot` instead
- package-ecosystem: "npm"
directory: ".github/actions"

.github/labeler.yml

@@ -127,11 +127,6 @@
- any-glob-to-any-file:
- 'superset/translations/es/**'
"i18n:persian":
- changed-files:
- any-glob-to-any-file:
- 'superset/translations/fa/**'
############################################
# Sub-projects and monorepo packages
############################################


@@ -145,7 +145,6 @@ cypress-install() {
cypress-run-all() {
local USE_DASHBOARD=$1
local APP_ROOT=$2
cd "$GITHUB_WORKSPACE/superset-frontend/cypress-base"
# Start Flask and run it in background
@@ -153,12 +152,7 @@ cypress-run-all() {
# so errors can print to stderr.
local flasklog="${HOME}/flask.log"
local port=8081
CYPRESS_BASE_URL="http://localhost:${port}"
if [ -n "$APP_ROOT" ]; then
export SUPERSET_APP_ROOT=$APP_ROOT
CYPRESS_BASE_URL=${CYPRESS_BASE_URL}${APP_ROOT}
fi
export CYPRESS_BASE_URL
export CYPRESS_BASE_URL="http://localhost:${port}"
nohup flask run --no-debugger -p $port >"$flasklog" 2>&1 </dev/null &
local flaskProcessId=$!


@@ -17,18 +17,13 @@ jobs:
check-python-deps:
runs-on: ubuntu-22.04
steps:
- name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
uses: actions/checkout@v4
with:
persist-credentials: false
submodules: recursive
fetch-depth: 1
- name: Check for file changes
id: check
uses: ./.github/actions/change-detector/
with:
token: ${{ secrets.GITHUB_TOKEN }}
depth: 1
- name: Setup Python
if: steps.check.outputs.python
@@ -39,20 +34,10 @@ jobs:
run: ./scripts/uv-pip-compile.sh
- name: Check for uncommitted changes
if: steps.check.outputs.python
run: |
echo "Full diff (for logging/debugging):"
git diff
echo "Filtered diff (excluding comments and whitespace):"
filtered_diff=$(git diff -U0 | grep '^[-+]' | grep -vE '^[-+]{3}' | grep -vE '^[-+][[:space:]]*#' | grep -vE '^[-+][[:space:]]*$' || true)
echo "$filtered_diff"
if [[ -n "$filtered_diff" ]]; then
echo
if [[ -n "$(git diff)" ]]; then
echo "ERROR: The pinned dependencies are not up-to-date."
echo "Please run './scripts/uv-pip-compile.sh' and commit the changes."
echo "More info: https://github.com/apache/superset/tree/master/requirements"
exit 1
else
echo "Pinned dependencies are up-to-date."


@@ -1,82 +0,0 @@
name: Claude PR Assistant
on:
issue_comment:
types: [created]
pull_request_review_comment:
types: [created]
jobs:
check-permissions:
if: |
(github.event_name == 'issue_comment' && contains(github.event.comment.body, '@claude')) ||
(github.event_name == 'pull_request_review_comment' && contains(github.event.comment.body, '@claude'))
runs-on: ubuntu-latest
outputs:
allowed: ${{ steps.check.outputs.allowed }}
steps:
- name: Check if user is allowed
id: check
run: |
# List of allowed users
ALLOWED_USERS="mistercrunch,rusackas"
# Get the commenter's username
COMMENTER="${{ github.event.comment.user.login }}"
echo "Checking permissions for user: $COMMENTER"
# Check if user is in allowed list
if [[ ",$ALLOWED_USERS," == *",$COMMENTER,"* ]]; then
echo "allowed=true" >> $GITHUB_OUTPUT
echo "✅ User $COMMENTER is allowed to use Claude"
else
echo "allowed=false" >> $GITHUB_OUTPUT
echo "❌ User $COMMENTER is not allowed to use Claude"
fi
deny-access:
needs: check-permissions
if: needs.check-permissions.outputs.allowed == 'false'
runs-on: ubuntu-latest
permissions:
issues: write
pull-requests: write
steps:
- name: Comment access denied
uses: actions/github-script@v7
with:
script: |
const message = `👋 Hi @${{ github.event.comment.user.login || github.event.review.user.login || github.event.issue.user.login }}!
Thanks for trying to use Claude Code, but currently only certain team members have access to this feature.
If you believe you should have access, please contact a project maintainer.`;
await github.rest.issues.createComment({
owner: context.repo.owner,
repo: context.repo.repo,
issue_number: context.issue.number,
body: message
});
claude-code-action:
needs: check-permissions
if: needs.check-permissions.outputs.allowed == 'true'
runs-on: ubuntu-latest
permissions:
contents: write
pull-requests: write
issues: write
id-token: write
steps:
- name: Checkout repository
uses: actions/checkout@v4
with:
fetch-depth: 1
- name: Run Claude PR Action
uses: anthropics/claude-code-action@beta
with:
anthropic_api_key: ${{ secrets.ANTHROPIC_API_KEY }}
timeout_minutes: "60"


@@ -48,8 +48,6 @@ jobs:
allow-dependencies-licenses: pkg:npm/store2@2.14.2, pkg:npm/applitools/core, pkg:npm/applitools/core-base, pkg:npm/applitools/css-tree, pkg:npm/applitools/ec-client, pkg:npm/applitools/eg-socks5-proxy-server, pkg:npm/applitools/eyes, pkg:npm/applitools/eyes-cypress, pkg:npm/applitools/nml-client, pkg:npm/applitools/tunnel-client, pkg:npm/applitools/utils, pkg:npm/node-forge@1.3.1, pkg:npm/rgbcolor, pkg:npm/jszip@3.10.1
python-dependency-liccheck:
# NOTE: Configuration for liccheck lives in our pyproject.yml.
# You cannot use a liccheck.ini file in this workflow.
runs-on: ubuntu-22.04
steps:
- name: "Checkout Repository"


@@ -111,9 +111,6 @@ jobs:
docker compose up superset-init --exit-code-from superset-init
docker-compose-image-tag:
# Run this job only on pushes to master (not for PRs)
# goal is to check that building the latest image works, not required for all PR pushes
if: github.event_name == 'push' && github.ref == 'refs/heads/master'
runs-on: ubuntu-24.04
steps:
- name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"


@@ -31,7 +31,7 @@ jobs:
- uses: actions/checkout@v4
- uses: actions/setup-node@v4
with:
node-version-file: './superset-embedded-sdk/.nvmrc'
node-version: "20"
registry-url: 'https://registry.npmjs.org'
- run: npm ci
- run: npm run ci:release


@@ -21,7 +21,7 @@ jobs:
- uses: actions/checkout@v4
- uses: actions/setup-node@v4
with:
node-version-file: './superset-embedded-sdk/.nvmrc'
node-version: "20"
registry-url: 'https://registry.npmjs.org'
- run: npm ci
- run: npm test


@@ -126,11 +126,7 @@ jobs:
throw new Error("Issue number is not available.");
}
const body = `@${user} Processing your ephemeral environment request [here](${workflowUrl}).` +
` Action: **${action}**.` +
` More information on [how to use or configure ephemeral environments]` +
`(https://superset.apache.org/docs/contributing/howtos/#github-ephemeral-environments)`;
const body = `@${user} Processing your ephemeral environment request [here](${workflowUrl}). Action: **${action}**.`;
await github.rest.issues.createComment({
owner: context.repo.owner,
@@ -300,7 +296,7 @@ jobs:
- name: Get network interface
id: get-eni
run: |
echo "eni=$(aws ecs describe-tasks --cluster superset-ci --tasks ${{ steps.list-tasks.outputs.task }} | jq '.tasks[0].attachments[0].details | map(select(.name=="networkInterfaceId"))[0].value')" >> $GITHUB_OUTPUT
echo "eni=$(aws ecs describe-tasks --cluster superset-ci --tasks ${{ steps.list-tasks.outputs.task }} | jq '.tasks[0].attachments[0].details | map(select(.name==\"networkInterfaceId\"))[0].value')" >> $GITHUB_OUTPUT
- name: Get public IP
id: get-ip
run: |


@@ -18,7 +18,7 @@ jobs:
runs-on: ubuntu-24.04
strategy:
matrix:
python-version: ["current", "previous", "next"]
python-version: ["current", "previous"]
steps:
- name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
uses: actions/checkout@v4
@@ -38,47 +38,14 @@ jobs:
echo "HOMEBREW_CELLAR=$HOMEBREW_CELLAR" >>"${GITHUB_ENV}"
echo "HOMEBREW_REPOSITORY=$HOMEBREW_REPOSITORY" >>"${GITHUB_ENV}"
brew install norwoodj/tap/helm-docs
- name: Setup Node.js
uses: actions/setup-node@v4
with:
node-version: '20'
- name: Install Frontend Dependencies
run: |
cd superset-frontend
npm ci
- name: Install Docs Dependencies
run: |
cd docs
yarn install --immutable
- name: Cache pre-commit environments
uses: actions/cache@v4
with:
path: ~/.cache/pre-commit
key: pre-commit-v2-${{ runner.os }}-py${{ matrix.python-version }}-${{ hashFiles('.pre-commit-config.yaml') }}
restore-keys: |
pre-commit-v2-${{ runner.os }}-py${{ matrix.python-version }}-
- name: pre-commit
run: |
set +e # Don't exit immediately on failure
export SKIP=eslint-frontend,type-checking-frontend
# Skip eslint as it requires `npm ci` and is executed in another job
export SKIP=eslint
pre-commit run --all-files
PRE_COMMIT_EXIT_CODE=$?
git diff --quiet --exit-code
GIT_DIFF_EXIT_CODE=$?
if [ "${PRE_COMMIT_EXIT_CODE}" -ne 0 ] || [ "${GIT_DIFF_EXIT_CODE}" -ne 0 ]; then
if [ "${PRE_COMMIT_EXIT_CODE}" -ne 0 ]; then
echo "❌ Pre-commit check failed (exit code: ${EXIT_CODE})."
else
echo "❌ Git working directory is dirty."
echo "📌 This likely means that pre-commit made changes that were not committed."
echo "🔍 Modified files:"
git diff --name-only
fi
if [ $? -ne 0 ] || ! git diff --quiet --exit-code; then
echo "❌ Pre-commit check failed."
echo "🚒 To prevent/address this CI issue, please install/use pre-commit locally."
echo "📖 More details here: https://superset.apache.org/docs/contributing/development#git-hooks"
exit 1


@@ -24,7 +24,13 @@ jobs:
needs: config
if: needs.config.outputs.has-secrets
name: Bump version and publish package(s)
runs-on: ubuntu-24.04
strategy:
matrix:
node-version: [20]
steps:
- uses: actions/checkout@v4
with:
@@ -40,11 +46,11 @@ jobs:
git fetch --prune --unshallow
git tag -d `git tag | grep -E '^trigger-'`
- name: Install Node.js
- name: Use Node.js ${{ matrix.node-version }}
if: env.HAS_TAGS
uses: actions/setup-node@v4
with:
node-version-file: './superset-frontend/.nvmrc'
node-version: ${{ matrix.node-version }}
- name: Cache npm
if: env.HAS_TAGS


@@ -26,6 +26,7 @@ jobs:
fail-fast: false
matrix:
browser: ["chrome"]
node: [20]
env:
SUPERSET_ENV: development
SUPERSET_CONFIG: tests.integration_tests.superset_test_config
@@ -39,7 +40,7 @@ jobs:
APPLITOOLS_BATCH_NAME: Superset Cypress
services:
postgres:
image: postgres:16-alpine
image: postgres:15-alpine
env:
POSTGRES_USER: superset
POSTGRES_PASSWORD: superset
@@ -65,7 +66,7 @@ jobs:
- name: Setup Node.js
uses: actions/setup-node@v4
with:
node-version-file: './superset-frontend/.nvmrc'
node-version: ${{ matrix.node }}
- name: Install npm dependencies
uses: ./.github/actions/cached-dependencies
with:


@@ -28,6 +28,9 @@ jobs:
needs: config
if: needs.config.outputs.has-secrets
runs-on: ubuntu-24.04
strategy:
matrix:
node: [20]
steps:
- name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
uses: actions/checkout@v4
@@ -38,7 +41,7 @@ jobs:
- name: Set up Node.js
uses: actions/setup-node@v4
with:
node-version-file: './superset-frontend/.nvmrc'
node-version: ${{ matrix.node }}
- name: Install eyes-storybook dependencies
uses: ./.github/actions/cached-dependencies
with:


@@ -23,7 +23,7 @@ jobs:
SUPERSET__SQLALCHEMY_DATABASE_URI: postgresql+psycopg2://superset:superset@127.0.0.1:15432/superset
services:
postgres:
image: postgres:16-alpine
image: postgres:15-alpine
env:
POSTGRES_USER: superset
POSTGRES_PASSWORD: superset


@@ -35,10 +35,10 @@ jobs:
with:
persist-credentials: false
submodules: recursive
- name: Set up Node.js
- name: Set up Node.js 20
uses: actions/setup-node@v4
with:
node-version-file: './docs/.nvmrc'
node-version: '20'
- name: Setup Python
uses: ./.github/actions/setup-backend/
- uses: actions/setup-java@v4


@@ -20,7 +20,7 @@ jobs:
steps:
- uses: actions/checkout@v4
# Do not bump this linkinator-action version without opening
# an ASF Infra ticket to allow the new version first!
# an ASF Infra ticket to allow the new verison first!
- uses: JustinBeckwith/linkinator-action@v1.11.0
continue-on-error: true # This will make the job advisory (non-blocking, no red X)
with:
@@ -60,10 +60,10 @@ jobs:
with:
persist-credentials: false
submodules: recursive
- name: Set up Node.js
- name: Set up Node.js 20
uses: actions/setup-node@v4
with:
node-version-file: './docs/.nvmrc'
node-version: '20'
- name: yarn install
run: |
yarn install --check-cache


@@ -42,7 +42,6 @@ jobs:
matrix:
parallel_id: [0, 1, 2, 3, 4, 5]
browser: ["chrome"]
app_root: ["", "/app/prefix"]
env:
SUPERSET_ENV: development
SUPERSET_CONFIG: tests.integration_tests.superset_test_config
@@ -50,11 +49,11 @@ jobs:
PYTHONPATH: ${{ github.workspace }}
REDIS_PORT: 16379
GITHUB_TOKEN: ${{ github.token }}
# Only use dashboard when explicitly requested via workflow_dispatch
USE_DASHBOARD: ${{ github.event.inputs.use_dashboard == 'true' || 'false' }}
# use the dashboard feature when running manually OR merging to master
USE_DASHBOARD: ${{ github.event.inputs.use_dashboard == 'true'|| (github.ref == 'refs/heads/master' && 'true') || 'false' }}
services:
postgres:
image: postgres:16-alpine
image: postgres:15-alpine
env:
POSTGRES_USER: superset
POSTGRES_PASSWORD: superset
@@ -73,7 +72,6 @@ jobs:
with:
persist-credentials: false
submodules: recursive
ref: ${{ github.event_name == 'pull_request' && github.event.pull_request.head.sha || github.sha }}
- name: Checkout using ref (workflow_dispatch)
if: github.event_name == 'workflow_dispatch' && github.event.inputs.ref != ''
uses: actions/checkout@v4
@@ -111,7 +109,7 @@ jobs:
if: steps.check.outputs.python || steps.check.outputs.frontend
uses: actions/setup-node@v4
with:
node-version-file: './superset-frontend/.nvmrc'
node-version: "20"
- name: Install npm dependencies
if: steps.check.outputs.python || steps.check.outputs.frontend
uses: ./.github/actions/cached-dependencies
@@ -137,17 +135,10 @@ jobs:
CYPRESS_RECORD_KEY: ${{ secrets.CYPRESS_RECORD_KEY }}
NODE_OPTIONS: "--max-old-space-size=4096"
with:
run: cypress-run-all ${{ env.USE_DASHBOARD }} ${{ matrix.app_root }}
- name: Set safe app root
if: failure()
id: set-safe-app-root
run: |
APP_ROOT="${{ matrix.app_root }}"
SAFE_APP_ROOT=${APP_ROOT//\//_}
echo "safe_app_root=$SAFE_APP_ROOT" >> $GITHUB_OUTPUT
run: cypress-run-all ${{ env.USE_DASHBOARD }}
- name: Upload Artifacts
uses: actions/upload-artifact@v4
if: failure()
with:
path: ${{ github.workspace }}/superset-frontend/cypress-base/cypress/screenshots
name: cypress-artifact-${{ github.run_id }}-${{ github.job }}-${{ matrix.browser }}-${{ matrix.parallel_id }}--${{ steps.set-safe-app-root.outputs.safe_app_root }}
name: cypress-artifact-${{ github.run_id }}-${{ github.job }}-${{ matrix.browser }}-${{ matrix.parallel_id }}


@@ -26,8 +26,6 @@ jobs:
uses: actions/checkout@v4
with:
persist-credentials: false
fetch-depth: 0
ref: ${{ github.event_name == 'pull_request' && github.event.pull_request.head.sha || github.sha }}
- name: Check for File Changes
id: check
@@ -41,10 +39,6 @@ jobs:
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
run: |
echo "git rev-parse --short HEAD"
git rev-parse --short HEAD
echo "git show -s --format=raw HEAD"
git show -s --format=raw HEAD
docker buildx build \
-t $TAG \
--cache-from=type=registry,ref=apache/superset-cache:3.10-slim-bookworm \
@@ -121,6 +115,24 @@ jobs:
files: merged-output/coverage-summary.json
slug: apache/superset
core-cover:
needs: frontend-build
if: needs.frontend-build.outputs.should-run == 'true'
runs-on: ubuntu-24.04
steps:
- name: Download Docker Image Artifact
uses: actions/download-artifact@v4
with:
name: docker-image
- name: Load Docker Image
run: docker load < docker-image.tar.gz
- name: superset-ui/core coverage
run: |
docker run --rm $TAG bash -c \
"npm run core:cover"
lint-frontend:
needs: frontend-build
if: needs.frontend-build.outputs.should-run == 'true'
@@ -132,8 +144,7 @@ jobs:
name: docker-image
- name: Load Docker Image
run: |
docker load < docker-image.tar.gz
run: docker load < docker-image.tar.gz
- name: eslint
run: |


@@ -77,7 +77,7 @@ jobs:
runs-on: ubuntu-24.04
strategy:
matrix:
python-version: ["current", "previous", "next"]
python-version: ["current", "previous"]
env:
PYTHONPATH: ${{ github.workspace }}
SUPERSET_CONFIG: tests.integration_tests.superset_test_config
@@ -85,7 +85,7 @@ jobs:
SUPERSET__SQLALCHEMY_DATABASE_URI: postgresql+psycopg2://superset:superset@127.0.0.1:15432/superset
services:
postgres:
image: postgres:16-alpine
image: postgres:15-alpine
env:
POSTGRES_USER: superset
POSTGRES_PASSWORD: superset


@@ -25,7 +25,7 @@ jobs:
SUPERSET__SQLALCHEMY_EXAMPLES_URI: presto://localhost:15433/memory/default
services:
postgres:
image: postgres:16-alpine
image: postgres:15-alpine
env:
POSTGRES_USER: superset
POSTGRES_PASSWORD: superset
@@ -94,7 +94,7 @@ jobs:
UPLOAD_FOLDER: /tmp/.superset/uploads/
services:
postgres:
image: postgres:16-alpine
image: postgres:15-alpine
env:
POSTGRES_USER: superset
POSTGRES_PASSWORD: superset


@@ -19,7 +19,7 @@ jobs:
runs-on: ubuntu-24.04
strategy:
matrix:
python-version: ["previous", "current", "next"]
python-version: ["previous", "current"]
env:
PYTHONPATH: ${{ github.workspace }}
steps:
@@ -44,14 +44,7 @@ jobs:
SUPERSET_TESTENV: true
SUPERSET_SECRET_KEY: not-a-secret
run: |
pytest --durations-min=0.5 --cov-report= --cov=superset ./tests/common ./tests/unit_tests --cache-clear --maxfail=50
- name: Python 100% coverage unit tests
if: steps.check.outputs.python
env:
SUPERSET_TESTENV: true
SUPERSET_SECRET_KEY: not-a-secret
run: |
pytest --durations-min=0.5 --cov-report= --cov=superset/sql/ ./tests/unit_tests/sql/ --cache-clear --cov-fail-under=100
pytest --durations-min=0.5 --cov-report= --cov=superset ./tests/common ./tests/unit_tests --cache-clear
- name: Upload code coverage
uses: codecov/codecov-action@v5
with:


@@ -33,7 +33,7 @@ jobs:
if: steps.check.outputs.frontend
uses: actions/setup-node@v4
with:
node-version-file: './superset-frontend/.nvmrc'
node-version: '18'
- name: Install dependencies
if: steps.check.outputs.frontend
uses: ./.github/actions/cached-dependencies


@@ -32,10 +32,10 @@ jobs:
- name: Set up Node.js
uses: actions/setup-node@v4
with:
node-version-file: './superset-frontend/.nvmrc'
node-version: '20'
- name: Install Dependencies
run: npm ci
run: npm install
working-directory: ./superset-frontend
- name: Run Script

.gitignore

@@ -21,7 +21,6 @@
*.swp
__pycache__
.aider*
.local
.cache
.bento*
@@ -92,7 +91,6 @@ scripts/*.zip
# IntelliJ
*.iml
venv
.venv
@eaDir/
# PyCharm
@@ -108,7 +106,6 @@ ghostdriver.log
testCSV.csv
.terser-plugin-cache/
apache-superset-*.tar.gz*
apache_superset-*.tar.gz*
release.json
# Translation-related files
@@ -127,7 +124,4 @@ docker/*local*
# Jest test report
test-report.html
superset/static/stats/statistics.html
# LLM-related
CLAUDE.local.md
.aider*


@@ -20,7 +20,7 @@ repos:
hooks:
- id: auto-walrus
- repo: https://github.com/pre-commit/mirrors-mypy
rev: v1.15.0
rev: v1.13.0
hooks:
- id: mypy
args: [--check-untyped-defs]
@@ -52,27 +52,22 @@ repos:
- id: trailing-whitespace
exclude: ^.*\.(snap)
args: ["--markdown-linebreak-ext=md"]
- repo: https://github.com/pre-commit/mirrors-prettier
rev: v4.0.0-alpha.8 # Use the sha or tag you want to point at
hooks:
- id: prettier
additional_dependencies:
- prettier@3.3.3
args: ["--ignore-path=./superset-frontend/.prettierignore"]
files: "superset-frontend"
- repo: local
hooks:
- id: eslint-frontend
name: eslint (frontend)
entry: ./scripts/eslint.sh
language: system
pass_filenames: true
files: ^superset-frontend/.*\.(js|jsx|ts|tsx)$
- id: eslint-docs
name: eslint (docs)
entry: bash -c 'cd docs && FILES=$(echo "$@" | sed "s|docs/||g") && yarn eslint --fix --ext .js,.jsx,.ts,.tsx --quiet $FILES'
language: system
pass_filenames: true
files: ^docs/.*\.(js|jsx|ts|tsx)$
- id: type-checking-frontend
name: Type-Checking (Frontend)
entry: ./scripts/check-type.js package=superset-frontend excludeDeclarationDir=cypress-base
language: system
files: ^superset-frontend\/.*\.(js|jsx|ts|tsx)$
exclude: ^superset-frontend/cypress-base\/
require_serial: true
- id: eslint
name: eslint
entry: bash -c 'cd superset-frontend && npm run eslint -- $(echo "$@" | sed "s|superset-frontend/||g")'
language: system
pass_filenames: true
files: \.(js|jsx|ts|tsx)$
# blacklist unsafe functions like make_url (see #19526)
- repo: https://github.com/skorokithakis/blacklist-pre-commit-hook
rev: e2f070289d8eddcaec0b580d3bde29437e7c8221
@@ -84,30 +79,9 @@ repos:
hooks:
- id: helm-docs
files: helm
verbose: false
args: ["--log-level", "error"]
- repo: https://github.com/astral-sh/ruff-pre-commit
rev: v0.9.7
rev: v0.8.0
hooks:
- id: ruff
args: [--fix]
args: [ --fix ]
- id: ruff-format
- repo: local
hooks:
- id: pylint
name: pylint with custom Superset plugins
entry: bash
language: system
types: [python]
exclude: ^(tests/|superset/migrations/|scripts/|RELEASING/|docker/)
args:
- -c
- |
TARGET_BRANCH=${GITHUB_BASE_REF:-master}
git fetch origin "$TARGET_BRANCH"
files=$(git diff --name-only --diff-filter=ACM origin/"$TARGET_BRANCH"..HEAD | grep '^superset/.*\.py$' || true)
if [ -n "$files" ]; then
pylint --rcfile=.pylintrc --load-plugins=superset.extensions.pylint $files
else
echo "No Python files to lint."
fi

.pylintrc

@@ -1,355 +0,0 @@
#
# Licensed to the Apache Software Foundation (ASF) under one or more
# contributor license agreements. See the NOTICE file distributed with
# this work for additional information regarding copyright ownership.
# The ASF licenses this file to You under the Apache License, Version 2.0
# (the "License"); you may not use this file except in compliance with
# the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
[MASTER]
# Specify a configuration file.
#rcfile=
# Python code to execute, usually for sys.path manipulation such as
# pygtk.require().
#init-hook=
# Add files or directories to the blacklist. They should be base names, not
# paths.
ignore=CVS,migrations
# Add files or directories matching the regex patterns to the blacklist. The
# regex matches against base names, not paths.
ignore-patterns=
# Pickle collected data for later comparisons.
persistent=yes
# List of plugins (as comma separated values of python modules names) to load,
# usually to register additional checkers.
load-plugins=superset.extensions.pylint
# Use multiple processes to speed up Pylint.
jobs=2
# Allow loading of arbitrary C extensions. Extensions are imported into the
# active Python interpreter and may run arbitrary code.
unsafe-load-any-extension=no
# A comma-separated list of package or module names from where C extensions may
# be loaded. Extensions are loading into the active Python interpreter and may
# run arbitrary code
extension-pkg-whitelist=pyarrow
[MESSAGES CONTROL]
disable=all
enable=disallowed-json-import,disallowed-sql-import,consider-using-transaction
[REPORTS]
# Set the output format. Available formats are text, parseable, colorized, msvs
# (visual studio) and html. You can also give a reporter class, eg
# mypackage.mymodule.MyReporterClass.
output-format=text
# Tells whether to display a full report or only the messages
reports=yes
# Python expression which should return a note less than 10 (10 is the highest
# note). You have access to the variables errors warning, statement which
# respectively contain the number of errors / warnings messages and the total
# number of statements analyzed. This is used by the global evaluation report
# (RP0004).
evaluation=10.0 - ((float(5 * error + warning + refactor + convention) / statement) * 10)
# Template used to display messages. This is a python new-style format string
# used to format the message information. See doc for all details
#msg-template=
[BASIC]
# Good variable names which should always be accepted, separated by a comma
good-names=_,df,ex,f,i,id,j,k,l,o,pk,Run,ts,v,x,y
# Bad variable names which should always be refused, separated by a comma
bad-names=bar,baz,db,fd,foo,sesh,session,tata,toto,tutu
# Colon-delimited sets of names that determine each other's naming style when
# the name regexes allow several styles.
name-group=
# Include a hint for the correct naming format with invalid-name
include-naming-hint=no
# List of decorators that produce properties, such as abc.abstractproperty. Add
# to this list to register other decorators that produce valid properties.
property-classes=
abc.abstractproperty,
sqlalchemy.ext.hybrid.hybrid_property
# Regular expression matching correct argument names
argument-rgx=[a-z_][a-z0-9_]{2,30}$
# Regular expression matching correct method names
method-rgx=[a-z_][a-z0-9_]{2,30}$
# Regular expression matching correct variable names
variable-rgx=[a-z_][a-z0-9_]{1,30}$
# Regular expression matching correct inline iteration names
inlinevar-rgx=[A-Za-z_][A-Za-z0-9_]*$
# Regular expression matching correct constant names
const-rgx=(([A-Za-z_][A-Za-z0-9_]*)|(__.*__))$
# Regular expression matching correct class names
class-rgx=[A-Z_][a-zA-Z0-9]+$
# Regular expression matching correct class attribute names
class-attribute-rgx=([A-Za-z_][A-Za-z0-9_]{2,30}|(__.*__))$
# Regular expression matching correct module names
module-rgx=(([a-z_][a-z0-9_]*)|([A-Z][a-zA-Z0-9]+))$
# Regular expression matching correct attribute names
attr-rgx=[a-z_][a-z0-9_]{2,30}$
# Regular expression matching correct function names
function-rgx=[a-z_][a-z0-9_]{2,30}$
# Regular expression which should only match function or class names that do
# not require a docstring.
no-docstring-rgx=^_
# Minimum line length for functions/classes that require docstrings, shorter
# ones are exempt.
docstring-min-length=10
[ELIF]
# Maximum number of nested blocks for function / method body
max-nested-blocks=5
[FORMAT]
# Maximum number of characters on a single line.
max-line-length=100
# Regexp for a line that is allowed to be longer than the limit.
ignore-long-lines=^\s*(# )?<?https?://\S+>?$
# Allow the body of an if to be on the same line as the test if there is no
# else.
single-line-if-stmt=no
# Maximum number of lines in a module
max-module-lines=1000
# String used as indentation unit. This is usually "    " (4 spaces) or "\t" (1
# tab).
indent-string='    '
# Number of spaces of indent required inside a hanging or continued line.
indent-after-paren=4
# Expected format of line ending, e.g. empty (any line ending), LF or CRLF.
expected-line-ending-format=
[LOGGING]
# Logging modules to check that the string format arguments are in logging
# function parameter format
logging-modules=logging
[MISCELLANEOUS]
# List of note tags to take in consideration, separated by a comma.
notes=FIXME,XXX
[SIMILARITIES]
# Minimum number of lines for a similarity.
min-similarity-lines=5
# Ignore comments when computing similarities.
ignore-comments=yes
# Ignore docstrings when computing similarities.
ignore-docstrings=yes
# Ignore imports when computing similarities.
ignore-imports=no
[SPELLING]
# Spelling dictionary name. Available dictionaries: none. To make it work,
# install the python-enchant package.
spelling-dict=
# List of comma separated words that should not be checked.
spelling-ignore-words=
# A path to a file that contains private dictionary; one word per line.
spelling-private-dict-file=
# Tells whether to store unknown words to the indicated private dictionary in
# the --spelling-private-dict-file option instead of raising a message.
spelling-store-unknown-words=no
[TYPECHECK]
# Tells whether missing members accessed in mixin class should be ignored. A
# mixin class is detected if its name ends with "mixin" (case insensitive).
ignore-mixin-members=yes
# List of module names for which member attributes should not be checked
# (useful for modules/projects where namespaces are manipulated during runtime
# and thus existing member attributes cannot be deduced by static analysis).
# It supports qualified module names, as well as Unix pattern matching.
ignored-modules=numpy,pandas,alembic.op,sqlalchemy,alembic.context,flask_appbuilder.security.sqla.PermissionView.role,flask_appbuilder.Model.metadata,flask_appbuilder.Base.metadata
# List of class names for which member attributes should not be checked (useful
# for classes with dynamically set attributes). This supports the use of
# qualified names.
ignored-classes=contextlib.closing,optparse.Values,thread._local,_thread._local
# List of members which are set dynamically and missed by pylint inference
# system, and so shouldn't trigger E1101 when accessed. Python regular
# expressions are accepted.
generated-members=
# List of decorators that produce context managers, such as
# contextlib.contextmanager. Add to this list to register other decorators that
# produce valid context managers.
contextmanager-decorators=contextlib.contextmanager
[VARIABLES]
# Tells whether we should check for unused import in __init__ files.
init-import=no
# A regular expression matching the name of dummy variables (i.e. expectedly
# not used).
dummy-variables-rgx=(_+[a-zA-Z0-9]*?$)|dummy
# List of additional names supposed to be defined in builtins. Remember that
# you should avoid defining new builtins when possible.
additional-builtins=
# List of strings which can identify a callback function by name. A callback
# name must start or end with one of those strings.
callbacks=cb_,_cb
# List of qualified module names which can have objects that can redefine
# builtins.
redefining-builtins-modules=six.moves,future.builtins
[CLASSES]
# List of method names used to declare (i.e. assign) instance attributes.
defining-attr-methods=__init__,__new__,setUp
# List of valid names for the first argument in a class method.
valid-classmethod-first-arg=cls
# List of valid names for the first argument in a metaclass class method.
valid-metaclass-classmethod-first-arg=mcs
# List of member names, which should be excluded from the protected access
# warning.
exclude-protected=_asdict,_fields,_replace,_source,_make
[DESIGN]
# Maximum number of arguments for function / method
max-args=5
# Argument names that match this expression will be ignored. Defaults to names
# with a leading underscore.
ignored-argument-names=_.*
# Maximum number of locals for function / method body
max-locals=15
# Maximum number of return / yield for function / method body
max-returns=10
# Maximum number of branches for function / method body
max-branches=15
# Maximum number of statements in function / method body
max-statements=50
# Maximum number of parents for a class (see R0901).
max-parents=7
# Maximum number of attributes for a class (see R0902).
max-attributes=8
# Minimum number of public methods for a class (see R0903).
min-public-methods=2
# Maximum number of public methods for a class (see R0904).
max-public-methods=20
# Maximum number of boolean expressions in an if statement
max-bool-expr=5
[IMPORTS]
# Deprecated modules which should not be used, separated by a comma
deprecated-modules=optparse
# Create a graph of every (i.e. internal and external) dependency in the given
# file (report RP0402 must not be disabled)
import-graph=
# Create a graph of external dependencies in the given file (report RP0402 must
# not be disabled)
ext-import-graph=
# Create a graph of internal dependencies in the given file (report RP0402 must
# not be disabled)
int-import-graph=
# Force import order to recognize a module as part of the standard
# compatibility libraries.
known-standard-library=
# Force import order to recognize a module as part of a third party library.
known-third-party=enchant
# Analyse import fallback blocks. This can be used to support both Python 2 and
# 3 compatible code, which means that the block might have code that exists
# only in one or another interpreter, leading to false positives when analysed.
analyse-fallback-blocks=no
[EXCEPTIONS]
# Exceptions that will emit a warning when being caught. Defaults to
# "Exception"
overgeneral-exceptions=builtins.Exception
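
The `evaluation` expression in the `[REPORTS]` section above is plain Python over pylint's message counters; a quick worked example with hypothetical counts (not taken from any real run):
```python
# Hypothetical counts: 2 errors, 10 warnings, 0 refactors, 4 conventions
# across 1,000 analyzed statements, plugged into the formula above.
error, warning, refactor, convention, statement = 2, 10, 0, 4, 1000
score = 10.0 - ((float(5 * error + warning + refactor + convention) / statement) * 10)
print(score)  # 10.0 - (24 / 1000 * 10) = 9.76
```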

View File

@@ -76,11 +76,3 @@ ydb.svg
erd.puml
erd.svg
intro_header.txt
# for LLMs
llm-context.md
LLMS.md
CLAUDE.md
CURSOR.md
GEMINI.md
GPT.md

View File

@@ -1,50 +0,0 @@
<!--
Licensed to the Apache Software Foundation (ASF) under one
or more contributor license agreements. See the NOTICE file
distributed with this work for additional information
regarding copyright ownership. The ASF licenses this file
to you under the Apache License, Version 2.0 (the
"License"); you may not use this file except in compliance
with the License. You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing,
software distributed under the License is distributed on an
"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
KIND, either express or implied. See the License for the
specific language governing permissions and limitations
under the License.
-->
## Change Log
### 4.1 (Fri Nov 15 22:13:57 2024 +0530)
**Database Migrations**
**Features**
**Fixes**
- [#30886](https://github.com/apache/superset/pull/30886) fix: blocks UI elements on right side (@samarsrivastav)
- [#30859](https://github.com/apache/superset/pull/30859) fix(package.json): Pin luxon version to unblock master (@geido)
- [#30588](https://github.com/apache/superset/pull/30588) fix(explore): column data type tooltip format (@mistercrunch)
- [#29911](https://github.com/apache/superset/pull/29911) fix: Rename database from 'couchbasedb' to 'couchbase' in documentation and db_engine_specs (@ayush-couchbase)
- [#30828](https://github.com/apache/superset/pull/30828) fix(TimezoneSelector): Failing unit tests due to timezone change (@geido)
- [#30875](https://github.com/apache/superset/pull/30875) fix: don't show metadata for embedded dashboards (@sadpandajoe)
- [#30851](https://github.com/apache/superset/pull/30851) fix: Graph chart colors (@michael-s-molina)
- [#29867](https://github.com/apache/superset/pull/29867) fix(capitalization): Capitalizing a button. (@rusackas)
- [#29782](https://github.com/apache/superset/pull/29782) fix(translations): Translate embedded errors (@rusackas)
- [#29772](https://github.com/apache/superset/pull/29772) fix: Fixing incomplete string escaping. (@rusackas)
- [#29725](https://github.com/apache/superset/pull/29725) fix(frontend/docker, ci): fix borked Docker build due to Lerna v8 uplift (@hainenber)
**Others**
- [#30576](https://github.com/apache/superset/pull/30576) chore: add link to Superset when report error (@eschutho)
- [#29786](https://github.com/apache/superset/pull/29786) refactor(Slider): Upgrade Slider to Antd 5 (@geido)
- [#29674](https://github.com/apache/superset/pull/29674) refactor(ChartCreation): Migrate tests to RTL (@rtexelm)
- [#29843](https://github.com/apache/superset/pull/29843) refactor(controls): Migrate AdhocMetricOption.test to RTL (@rtexelm)
- [#29845](https://github.com/apache/superset/pull/29845) refactor(controls): Migrate MetricDefinitionValue.test to RTL (@rtexelm)
- [#28424](https://github.com/apache/superset/pull/28424) docs: Check markdown files for bad links using linkinator (@rusackas)
- [#29768](https://github.com/apache/superset/pull/29768) docs(contributing): fix broken link to translations sub-section (@sfirke)

View File

@@ -1,83 +0,0 @@
<!--
Licensed to the Apache Software Foundation (ASF) under one
or more contributor license agreements. See the NOTICE file
distributed with this work for additional information
regarding copyright ownership. The ASF licenses this file
to you under the Apache License, Version 2.0 (the
"License"); you may not use this file except in compliance
with the License. You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing,
software distributed under the License is distributed on an
"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
KIND, either express or implied. See the License for the
specific language governing permissions and limitations
under the License.
-->
## Change Log
### 4.1.2 (Fri Mar 7 13:28:05 2025 -0800)
**Database Migrations**
- [#32538](https://github.com/apache/superset/pull/32538) fix(migrations): Handle comparator None in old time comparison migration (@Antonio-RiveroMartnez)
- [#32155](https://github.com/apache/superset/pull/32155) fix(migrations): Handle no params in time comparison migration (@Antonio-RiveroMartnez)
- [#31185](https://github.com/apache/superset/pull/31185) fix: check for column before adding in migrations (@betodealmeida)
**Features**
- [#29974](https://github.com/apache/superset/pull/29974) feat(sqllab): Adds refresh button to table metadata in SQL Lab (@Usiel)
**Fixes**
- [#32515](https://github.com/apache/superset/pull/32515) fix(sqllab): Allow clear on schema and catalog (@justinpark)
- [#32500](https://github.com/apache/superset/pull/32500) fix: dashboard, chart and dataset import validation (@dpgaspar)
- [#31353](https://github.com/apache/superset/pull/31353) fix(sqllab): duplicate error message (@betodealmeida)
- [#31407](https://github.com/apache/superset/pull/31407) fix: Big Number side cut fixed (@fardin-developer)
- [#31480](https://github.com/apache/superset/pull/31480) fix(sunburst): Use metric label from verbose map (@gerbermichi)
- [#31427](https://github.com/apache/superset/pull/31427) fix(tags): clean up bulk create api and schema (@villebro)
- [#31334](https://github.com/apache/superset/pull/31334) fix(docs): add custom editUrl path for intro page (@dwgrossberg)
- [#31353](https://github.com/apache/superset/pull/31353) fix(sqllab): duplicate error message (@betodealmeida)
- [#31323](https://github.com/apache/superset/pull/31323) fix: Use clickhouse sqlglot dialect for YDB (@vgvoleg)
- [#31198](https://github.com/apache/superset/pull/31198) fix: add more clickhouse disallowed functions on config (@dpgaspar)
- [#31194](https://github.com/apache/superset/pull/31194) fix(embedded): Hide anchor links in embedded mode (@Vitor-Avila)
- [#31960](https://github.com/apache/superset/pull/31960) fix(sqllab): Missing allowHTML props in ResultTableExtension (@justinpark)
- [#31332](https://github.com/apache/superset/pull/31332) fix: prevent multiple pvm errors on migration (@eschutho)
- [#31437](https://github.com/apache/superset/pull/31437) fix(database import): Gracefully handle error to get catalog schemas (@Vitor-Avila)
- [#31173](https://github.com/apache/superset/pull/31173) fix: cache-warmup fails (@nsivarajan)
- [#30442](https://github.com/apache/superset/pull/30442) fix(fe/src/dashboard): optional chaining for possibly nullable parent attribute in LayoutItem type (@hainenber)
- [#31639](https://github.com/apache/superset/pull/31639) fix(sqllab): unable to update saved queries (@DamianPendrak)
- [#29898](https://github.com/apache/superset/pull/29898) fix: parse pandas pivot null values (@eschutho)
- [#31414](https://github.com/apache/superset/pull/31414) fix(Pivot Table): Fix column width to respect currency config (@Vitor-Avila)
- [#31335](https://github.com/apache/superset/pull/31335) fix(histogram): axis margin padding consistent with other graphs (@tatiana-cherne)
- [#31301](https://github.com/apache/superset/pull/31301) fix(AllEntitiesTable): show Tags (@alexandrusoare)
- [#31329](https://github.com/apache/superset/pull/31329) fix: pass string to `process_template` (@betodealmeida)
- [#31341](https://github.com/apache/superset/pull/31341) fix(pinot): remove query aliases from SELECT and ORDER BY clauses in Pinot (@yuribogomolov)
- [#31308](https://github.com/apache/superset/pull/31308) fix: annotations on horizontal bar chart (@DamianPendrak)
- [#31294](https://github.com/apache/superset/pull/31294) fix(sqllab): Remove update_saved_query_exec_info to reduce lag (@justinpark)
- [#30897](https://github.com/apache/superset/pull/30897) fix: Exception handling for SQL Lab views (@michael-s-molina)
- [#31199](https://github.com/apache/superset/pull/31199) fix(Databricks): Escape catalog and schema names in pre-queries (@Vitor-Avila)
- [#31265](https://github.com/apache/superset/pull/31265) fix(trino): db session error in handle cursor (@justinpark)
- [#31024](https://github.com/apache/superset/pull/31024) fix(dataset): use sqlglot for DML check (@betodealmeida)
- [#29885](https://github.com/apache/superset/pull/29885) fix: add mutator to get_columns_description (@eschutho)
- [#30821](https://github.com/apache/superset/pull/30821) fix: x axis title disappears when editing bar chart (@DamianPendrak)
- [#31181](https://github.com/apache/superset/pull/31181) fix: Time-series Line Chart Display unnecessary total (@michael-s-molina)
- [#31163](https://github.com/apache/superset/pull/31163) fix(Dashboard): Backward compatible shared_label_colors field (@geido)
- [#31156](https://github.com/apache/superset/pull/31156) fix: check orderby (@betodealmeida)
- [#31154](https://github.com/apache/superset/pull/31154) fix: Remove unwanted commit on Trino's handle_cursor (@michael-s-molina)
- [#31151](https://github.com/apache/superset/pull/31151) fix: Revert "feat(trino): Add functionality to upload data (#29164)" (@michael-s-molina)
- [#31031](https://github.com/apache/superset/pull/31031) fix(Dashboard): Ensure shared label colors are updated (@geido)
- [#30967](https://github.com/apache/superset/pull/30967) fix(release validation): scripts now support RSA and EDDSA keys. (@rusackas)
- [#30881](https://github.com/apache/superset/pull/30881) fix(Dashboard): Native & Cross-Filters Scoping Performance (@geido)
- [#30887](https://github.com/apache/superset/pull/30887) fix(imports): import query_context for imports with charts (@lindenh)
- [#31008](https://github.com/apache/superset/pull/31008) fix(explore): verified props is not updated (@justinpark)
- [#30646](https://github.com/apache/superset/pull/30646) fix(Dashboard): Retain colors when color scheme not set (@geido)
- [#30962](https://github.com/apache/superset/pull/30962) fix(Dashboard): Exclude edit param in async screenshot (@geido)
**Others**
- [#32043](https://github.com/apache/superset/pull/32043) chore: Skip the creation of secondary perms during catalog migrations (@Vitor-Avila)
- [#30865](https://github.com/apache/superset/pull/30865) docs: Updating 4.1 Release Notes (@yousoph)

View File

@@ -1,58 +0,0 @@
<!--
Licensed to the Apache Software Foundation (ASF) under one
or more contributor license agreements. See the NOTICE file
distributed with this work for additional information
regarding copyright ownership. The ASF licenses this file
to you under the Apache License, Version 2.0 (the
"License"); you may not use this file except in compliance
with the License. You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing,
software distributed under the License is distributed on an
"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
KIND, either express or implied. See the License for the
specific language governing permissions and limitations
under the License.
-->
## Change Log
### 4.1.3 (Thu May 29 02:31:07 2025 -0500)
**Database Migrations**
**Features**
**Fixes**
- [#33522](https://github.com/apache/superset/pull/33522) fix(Sqllab): Autocomplete got stuck in UI when open it too fast (@rebenitez1802)
- [#33425](https://github.com/apache/superset/pull/33425) fix(table-chart): time shift is not working (@justinpark)
- [#32414](https://github.com/apache/superset/pull/32414) fix(api): Added uuid to list api calls (@withnale)
- [#33354](https://github.com/apache/superset/pull/33354) fix: loading examples from raw.githubusercontent.com fails with 429 errors (@mistercrunch)
- [#32382](https://github.com/apache/superset/pull/32382) fix(pinot): revert join and subquery flags (@yuribogomolov)
- [#32473](https://github.com/apache/superset/pull/32473) fix(plugin-chart-echarts): remove erroneous upper bound value (@villebro)
- [#33048](https://github.com/apache/superset/pull/33048) fix: improve error type on parse error (@justinpark)
- [#32968](https://github.com/apache/superset/pull/32968) fix(pivot-table): Revert "fix(Pivot Table): Fix column width to respect currency config (#31414)" (@justinpark)
- [#32795](https://github.com/apache/superset/pull/32795) fix(log): store navigation path to get correct logging path (@justinpark)
- [#33216](https://github.com/apache/superset/pull/33216) fix: Downgrade to marshmallow<4 (@amotl)
- [#32866](https://github.com/apache/superset/pull/32866) fix: make packages PEP 625 compliant (@sadpandajoe)
- [#32035](https://github.com/apache/superset/pull/32035) fix(fe/dashboard-list): display modifier info for `Last modified` data (@hainenber)
- [#32708](https://github.com/apache/superset/pull/32708) fix(logging): missing path in event data (@justinpark)
- [#32699](https://github.com/apache/superset/pull/32699) fix: Signature of Celery pruner jobs (@michael-s-molina)
- [#32681](https://github.com/apache/superset/pull/32681) fix(log): Update recent_activity by event name (@justinpark)
- [#32608](https://github.com/apache/superset/pull/32608) fix(welcome): perf on distinct recent activities (@justinpark)
- [#32572](https://github.com/apache/superset/pull/32572) fix: Log table retention policy (@michael-s-molina)
- [#32406](https://github.com/apache/superset/pull/32406) fix(model/helper): represent RLS filter clause in proper textual SQL string (@hainenber)
- [#32240](https://github.com/apache/superset/pull/32240) fix: upgrade to 3.11.11-slim-bookworm to address critical vulnerabilities (@gpchandran)
- [#30858](https://github.com/apache/superset/pull/30858) fix(chart data): removing query from /chart/data payload when accessing as guest user (@fisjac)
**Others**
- [#33612](https://github.com/apache/superset/pull/33612) chore: update Dockerfile - Upgrade to 3.11.12 (@gpchandran)
- [#33435](https://github.com/apache/superset/pull/33435) docs: CVEs fixed on 4.1.2 (@sha174n)
- [#33339](https://github.com/apache/superset/pull/33339) chore(🦾): bump python h11 0.14.0 -> 0.16.0 (@github-actions[bot])
- [#32745](https://github.com/apache/superset/pull/32745) chore(🦾): bump python sqlglot 26.1.3 -> 26.11.1 (@github-actions[bot])
- [#32782](https://github.com/apache/superset/pull/32782) chore: Revert "chore: bump base image in Dockerfile with `ARG PY_VER=3.11.11-slim-bookworm`" (@sadpandajoe)
- [#32780](https://github.com/apache/superset/pull/32780) chore: bump base image in Dockerfile with `ARG PY_VER=3.11.11-slim-bookworm` (@gpchandran)

View File

@@ -19,7 +19,7 @@ under the License.
## Change Log
### 5.0.0 (Wed Jun 18 13:54:10 2025 -0300)
### 5.0.0 (Tue May 27 17:02:10 2025 -0300)
**Database Migrations**
@@ -108,19 +108,6 @@ under the License.
**Fixes**
- [#33817](https://github.com/apache/superset/pull/33817) fix: SQL Lab warning message sizes (@michael-s-molina)
- [#33779](https://github.com/apache/superset/pull/33779) fix(Echarts): Echarts Legend Scroll fix (@amaannawab923)
- [#33765](https://github.com/apache/superset/pull/33765) fix(tooltip): Sanitize tooltip html (@msyavuz)
- [#33759](https://github.com/apache/superset/pull/33759) fix: apply d3 format to BigNumber(s) (@betodealmeida)
- [#33752](https://github.com/apache/superset/pull/33752) fix(create chart page): add missing space between words (@Quatters)
- [#33748](https://github.com/apache/superset/pull/33748) fix: sync dot color between dashboard chart and edit chart (@anantaoutlook)
- [#33743](https://github.com/apache/superset/pull/33743) fix(dataset): Fix plural toast messages (@rad-pat)
- [#33717](https://github.com/apache/superset/pull/33717) fix(explore): add gap to the "Cached" button (@Quatters)
- [#33719](https://github.com/apache/superset/pull/33719) fix(Alerts & reports): invalid "Last updated" time formatting (@Quatters)
- [#33726](https://github.com/apache/superset/pull/33726) fix(dashboard): show dashboard thumbnail images when retrieved (@rad-pat)
- [#33296](https://github.com/apache/superset/pull/33296) fix(template_processing): get_filters now works for IS_NULL and IS_NOT_NULL operators (@Prokos)
- [#32414](https://github.com/apache/superset/pull/32414) fix(api): Added uuid to list api calls (@withnale)
- [#33710](https://github.com/apache/superset/pull/33710) fix: Migrate charts with empty query_context (@luizotavio32)
- [#33592](https://github.com/apache/superset/pull/33592) fix: Makes time compare migration more resilient (@michael-s-molina)
- [#33596](https://github.com/apache/superset/pull/33596) fix: Missing processor context when rendering Jinja (@michael-s-molina)
- [#33285](https://github.com/apache/superset/pull/33285) fix: Adjust viz migrations to also migrate the queries object (@luizotavio32)
@@ -395,8 +382,6 @@ under the License.
**Others**
- [#33745](https://github.com/apache/superset/pull/33745) build: update Dockerfile to 3.11.13-slim-bookworm (@gpchandran)
- [#33612](https://github.com/apache/superset/pull/33612) chore: update Dockerfile - Upgrade to 3.11.12 (@gpchandran)
- [#33339](https://github.com/apache/superset/pull/33339) chore(🦾): bump python h11 0.14.0 -> 0.16.0 (@github-actions[bot])
- [#32745](https://github.com/apache/superset/pull/32745) chore(🦾): bump python sqlglot 26.1.3 -> 26.11.1 (@github-actions[bot])
- [#32239](https://github.com/apache/superset/pull/32239) docs: adding notes about using uv instead of raw pip (@mistercrunch)

View File

@@ -1 +0,0 @@
LLMS.md

View File

@@ -5,7 +5,7 @@
regarding copyright ownership. The ASF licenses this file
to you under the Apache License, Version 2.0 (the
"License"); you may not use this file except in compliance
with the License. You may obtain a copy of the License at
with the License. You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0

View File

@@ -18,7 +18,7 @@
######################################################################
# Node stage to deal with static asset construction
######################################################################
ARG PY_VER=3.11.13-slim-bookworm
ARG PY_VER=3.11.11-slim-bookworm
# If BUILDPLATFORM is null, set it to 'amd64' (or leave as is otherwise).
ARG BUILDPLATFORM=${BUILDPLATFORM:-amd64}
@@ -167,7 +167,7 @@ RUN mkdir -p \
&& touch superset/static/version_info.json
# Install Playwright and optionally setup headless browsers
ARG INCLUDE_CHROMIUM="false"
ARG INCLUDE_CHROMIUM="true"
ARG INCLUDE_FIREFOX="false"
RUN --mount=type=cache,target=${SUPERSET_HOME}/.cache/uv \
if [ "$INCLUDE_CHROMIUM" = "true" ] || [ "$INCLUDE_FIREFOX" = "true" ]; then \
@@ -208,7 +208,7 @@ RUN rm superset/translations/*/*/*.po
COPY --from=superset-node /app/superset/translations superset/translations
COPY --from=python-translation-compiler /app/translations_mo superset/translations
HEALTHCHECK CMD /app/docker/docker-healthcheck.sh
HEALTHCHECK CMD curl -f "http://localhost:${SUPERSET_PORT}/health"
CMD ["/app/docker/entrypoints/run-server.sh"]
EXPOSE ${SUPERSET_PORT}
@@ -223,7 +223,7 @@ RUN --mount=type=cache,target=${SUPERSET_HOME}/.cache/uv \
/app/docker/pip-install.sh --requires-build-essential -r requirements/base.txt
# Install the superset package
RUN --mount=type=cache,target=${SUPERSET_HOME}/.cache/uv \
uv pip install -e .
uv pip install .
RUN python -m compileall /app/superset
USER superset
@@ -246,7 +246,7 @@ RUN --mount=type=cache,target=${SUPERSET_HOME}/.cache/uv \
/app/docker/pip-install.sh --requires-build-essential -r requirements/development.txt
# Install the superset package
RUN --mount=type=cache,target=${SUPERSET_HOME}/.cache/uv \
uv pip install -e .
uv pip install .
RUN uv pip install .[postgres]
RUN python -m compileall /app/superset

View File

@@ -1 +0,0 @@
LLMS.md

1
GPT.md
View File

@@ -1 +0,0 @@
LLMS.md

148
LLMS.md
View File

@@ -1,148 +0,0 @@
# LLM Context Guide for Apache Superset
Apache Superset is a data visualization platform with a Flask/Python backend and a React/TypeScript frontend.
## ⚠️ CRITICAL: Ongoing Refactors (What NOT to Do)
**These migrations are actively happening - avoid deprecated patterns:**
### Frontend Modernization
- **NO `any` types** - Use proper TypeScript types
- **NO JavaScript files** - Convert to TypeScript (.ts/.tsx)
- **Use @superset-ui/core** - Don't import Ant Design directly
### Testing Strategy Migration
- **Prefer unit tests** over integration tests
- **Prefer integration tests** over Cypress end-to-end tests
- **Cypress is last resort** - Actively moving away from Cypress
- **Use Jest + React Testing Library** for component testing
### Backend Type Safety
- **Add type hints** - All new Python code needs proper typing
- **MyPy compliance** - Run `pre-commit run mypy` to validate
- **SQLAlchemy typing** - Use proper model annotations
## Key Directories
```
superset/
├── superset/ # Python backend (Flask, SQLAlchemy)
│ ├── views/api/ # REST API endpoints
│ ├── models/ # Database models
│ └── connectors/ # Database connections
├── superset-frontend/src/ # React TypeScript frontend
│ ├── components/ # Reusable components
│ ├── explore/ # Chart builder
│ ├── dashboard/ # Dashboard interface
│ └── SqlLab/ # SQL editor
├── superset-frontend/packages/
│ └── superset-ui-core/ # UI component library (USE THIS)
├── tests/ # Python/integration tests
├── docs/ # Documentation (UPDATE FOR CHANGES)
└── UPDATING.md # Breaking changes log
```
## Code Standards
### TypeScript Frontend
- **Avoid `any` types** - Use proper TypeScript, reuse existing types
- **Functional components** with hooks
- **@superset-ui/core** for UI components (not direct antd)
- **Jest** for testing (NO Enzyme)
- **Redux** for global state where it exists, hooks for local
### Python Backend
- **Type hints required** for all new code
- **MyPy compliant** - run `pre-commit run mypy`
- **SQLAlchemy models** with proper typing
- **pytest** for testing
### Apache License Headers
- **New files require ASF license headers** - When creating new code files, include the standard Apache Software Foundation license header
- **LLM instruction files are excluded** - Files like LLMS.md, CLAUDE.md, etc. are in `.rat-excludes` to avoid header token overhead
## Documentation Requirements
- **docs/**: Update for any user-facing changes
- **UPDATING.md**: Add breaking changes here
- **Docstrings**: Required for new functions/classes
## Architecture Patterns
### Security & Features
- **RBAC**: Role-based access via Flask-AppBuilder
- **Feature flags**: Control feature rollouts
- **Row-level security**: SQL-based data access control
## Test Utilities
### Python Test Helpers
- **`SupersetTestCase`** - Base class in `tests/integration_tests/base_tests.py`
- **`@with_config`** - Config mocking decorator
- **`@with_feature_flags`** - Feature flag testing
- **`login_as()`, `login_as_admin()`** - Authentication helpers
- **`create_dashboard()`, `create_slice()`** - Data setup utilities
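A minimal sketch combining these helpers (import paths and signatures are assumed from the list above, not verified against the current tree):
```python
# Sketch only: module paths and decorator names are assumptions; adjust
# to what's actually in-tree.
from tests.integration_tests.base_tests import SupersetTestCase
from tests.integration_tests.conftest import with_feature_flags


class TestDashboardListApi(SupersetTestCase):
    @with_feature_flags(THUMBNAILS=True)
    def test_list_dashboards(self) -> None:
        self.login(username="admin")  # or the login_as_admin() helper
        response = self.client.get("/api/v1/dashboard/")
        assert response.status_code == 200
```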
### TypeScript Test Helpers
- **`superset-frontend/spec/helpers/testing-library.tsx`** - Custom render() with providers
- **`createWrapper()`** - Redux/Router/Theme wrapper
- **`selectOption()`** - Select component helper
- **React Testing Library** - NO Enzyme (removed)
### Running Tests
```bash
# Frontend
npm run test # All tests
npm run test -- filename.test.tsx # Single file
# Backend
pytest # All tests
pytest tests/unit_tests/specific_test.py # Single file
pytest tests/unit_tests/ # Directory
# If pytest fails with database/setup issues, ask the user to run test environment setup
```
## Environment Validation
**Quick Setup Check (run this first):**
```bash
# Verify Superset is running
curl -f http://localhost:8088/health || echo "❌ Setup required - see https://superset.apache.org/docs/contributing/development#working-with-llms"
```
**If health checks fail:**
"It appears you aren't set up properly. Please refer to the [Working with LLMs](https://superset.apache.org/docs/contributing/development#working-with-llms) section in the development docs for setup instructions."
**Key Project Files:**
- `superset-frontend/package.json` - Frontend build scripts (`npm run dev` on port 9000, `npm run test`, `npm run lint`)
- `pyproject.toml` - Python tooling (ruff, mypy configs)
- `requirements/` folder - Python dependencies (base.txt, development.txt)
## Pre-commit Validation
**Use pre-commit hooks for quality validation:**
```bash
# Install hooks
pre-commit install
# Quick validation (faster than --all-files)
pre-commit run # Staged files only
pre-commit run mypy # Python type checking
pre-commit run prettier # Code formatting
pre-commit run eslint # Frontend linting
```
## Platform-Specific Instructions
- **[CLAUDE.md](CLAUDE.md)** - For Claude/Anthropic tools
- **[.github/copilot-instructions.md](.github/copilot-instructions.md)** - For GitHub Copilot
- **[GEMINI.md](GEMINI.md)** - For Google Gemini tools
- **[GPT.md](GPT.md)** - For OpenAI/ChatGPT tools
- **[.cursor/rules/dev-standard.mdc](.cursor/rules/dev-standard.mdc)** - For Cursor editor
---
**LLM Note**: This codebase is actively modernizing toward full TypeScript and type safety. Always run `pre-commit run` to validate changes. Follow the ongoing refactors section to avoid deprecated patterns.

View File

@@ -20,11 +20,11 @@ under the License.
# Superset
[![License](https://img.shields.io/badge/License-Apache%202.0-blue.svg)](https://opensource.org/license/apache-2-0)
[![Latest Release on Github](https://img.shields.io/github/v/release/apache/superset?sort=semver)](https://github.com/apache/superset/releases/latest)
[![Build Status](https://github.com/apache/superset/actions/workflows/superset-python-unittest.yml/badge.svg)](https://github.com/apache/superset/actions)
[![PyPI version](https://badge.fury.io/py/apache_superset.svg)](https://badge.fury.io/py/apache_superset)
[![GitHub release (latest SemVer)](https://img.shields.io/github/v/release/apache/superset?sort=semver)](https://github.com/apache/superset/tree/latest)
[![Build Status](https://github.com/apache/superset/workflows/Python/badge.svg)](https://github.com/apache/superset/actions)
[![PyPI version](https://badge.fury.io/py/apache-superset.svg)](https://badge.fury.io/py/apache-superset)
[![Coverage Status](https://codecov.io/github/apache/superset/coverage.svg?branch=master)](https://codecov.io/github/apache/superset)
[![PyPI](https://img.shields.io/pypi/pyversions/apache_superset.svg?maxAge=2592000)](https://pypi.python.org/pypi/apache_superset)
[![PyPI](https://img.shields.io/pypi/pyversions/apache-superset.svg?maxAge=2592000)](https://pypi.python.org/pypi/apache-superset)
[![Get on Slack](https://img.shields.io/badge/slack-join-orange.svg)](http://bit.ly/join-superset-slack)
[![Documentation](https://img.shields.io/badge/docs-apache.org-blue.svg)](https://superset.apache.org)
@@ -72,10 +72,8 @@ Superset provides:
## Screenshots & Gifs
**Video Overview**
<!-- File hosted here https://github.com/apache/superset-site/raw/lfs/superset-video-4k.mp4 -->
[superset-video-1080p.webm](https://github.com/user-attachments/assets/b37388f7-a971-409c-96a7-90c4e31322e6)
[superset-video-4k.webm](https://github.com/apache/superset/assets/812905/da036bc2-150c-4ee7-80f9-75e63210ff76)
<br/>
@@ -103,7 +101,7 @@ Here are some of the major database solutions that are supported:
<p align="center">
<img src="https://superset.apache.org/img/databases/redshift.png" alt="redshift" border="0" width="200"/>
<img src="https://superset.apache.org/img/databases/google-biquery.png" alt="google-bigquery" border="0" width="200"/>
<img src="https://superset.apache.org/img/databases/google-biquery.png" alt="google-biquery" border="0" width="200"/>
<img src="https://superset.apache.org/img/databases/snowflake.png" alt="snowflake" border="0" width="200"/>
<img src="https://superset.apache.org/img/databases/trino.png" alt="trino" border="0" width="150" />
<img src="https://superset.apache.org/img/databases/presto.png" alt="presto" border="0" width="200"/>
@@ -111,6 +109,7 @@ Here are some of the major database solutions that are supported:
<img src="https://superset.apache.org/img/databases/druid.png" alt="druid" border="0" width="200" />
<img src="https://superset.apache.org/img/databases/firebolt.png" alt="firebolt" border="0" width="200" />
<img src="https://superset.apache.org/img/databases/timescale.png" alt="timescale" border="0" width="200" />
<img src="https://superset.apache.org/img/databases/rockset.png" alt="rockset" border="0" width="200" />
<img src="https://superset.apache.org/img/databases/postgresql.png" alt="postgresql" border="0" width="200" />
<img src="https://superset.apache.org/img/databases/mysql.png" alt="mysql" border="0" width="200" />
<img src="https://superset.apache.org/img/databases/mssql-server.png" alt="mssql-server" border="0" width="200" />
@@ -135,10 +134,9 @@ Here are some of the major database solutions that are supported:
<img src="https://superset.apache.org/img/databases/starrocks.png" alt="starrocks" border="0" width="200" />
<img src="https://superset.apache.org/img/databases/doris.png" alt="doris" border="0" width="200" />
<img src="https://superset.apache.org/img/databases/oceanbase.svg" alt="oceanbase" border="0" width="220" />
<img src="https://superset.apache.org/img/databases/sap-hana.png" alt="sap-hana" border="0" width="220" />
<img src="https://superset.apache.org/img/databases/sap-hana.png" alt="oceanbase" border="0" width="220" />
<img src="https://superset.apache.org/img/databases/denodo.png" alt="denodo" border="0" width="200" />
<img src="https://superset.apache.org/img/databases/ydb.svg" alt="ydb" border="0" width="200" />
<img src="https://superset.apache.org/img/databases/tdengine.png" alt="TDengine" border="0" width="200" />
</p>
**A more comprehensive list of supported databases** along with the configuration instructions can be found [here](https://superset.apache.org/docs/configuration/databases).
@@ -147,7 +145,7 @@ Want to add support for your datastore or data engine? Read more [here](https://
## Installation and Configuration
Try out Superset's [quickstart](https://superset.apache.org/docs/quickstart/) guide or learn about [the options for production deployments](https://superset.apache.org/docs/installation/architecture/).
[Extended documentation for Superset](https://superset.apache.org/docs/installation/docker-compose)
## Get Involved
@@ -156,7 +154,7 @@ Try out Superset's [quickstart](https://superset.apache.org/docs/quickstart/) gu
and please read our [Slack Community Guidelines](https://github.com/apache/superset/blob/master/CODE_OF_CONDUCT.md#slack-community-guidelines)
- [Join our dev@superset.apache.org Mailing list](https://lists.apache.org/list.html?dev@superset.apache.org). To join, simply send an email to [dev-subscribe@superset.apache.org](mailto:dev-subscribe@superset.apache.org)
- If you want to help troubleshoot GitHub Issues involving the numerous database drivers that Superset supports, please consider adding your name and the databases you have access to on the [Superset Database Familiarity Rolodex](https://docs.google.com/spreadsheets/d/1U1qxiLvOX0kBTUGME1AHHi6Ywel6ECF8xk_Qy-V9R8c/edit#gid=0)
- Join Superset's Town Hall and [Operational Model](https://preset.io/blog/the-superset-operational-model-wants-you/) recurring meetings. Meeting info is available on the [Superset Community Calendar](https://superset.apache.org/community)
- Join Superset's Town Hall and [Operational Model](https://preset.io/blog/the-superset-operational-model-wants-you/) recurring meetings. Meeting info is available on the [Superset Community Calendar](https://superset.apache.org/community)
## Contributor Guide
@@ -184,16 +182,14 @@ Understanding the Superset Points of View
- [Building New Database Connectors](https://preset.io/blog/building-database-connector/)
- [Create Your First Dashboard](https://superset.apache.org/docs/using-superset/creating-your-first-dashboard/)
- [Comprehensive Tutorial for Contributing Code to Apache Superset
](https://preset.io/blog/tutorial-contributing-code-to-apache-superset/)
](https://preset.io/blog/tutorial-contributing-code-to-apache-superset/)
- [Resources to master Superset by Preset](https://preset.io/resources/)
- Deploying Superset
- [Official Docker image](https://hub.docker.com/r/apache/superset)
- [Helm Chart](https://github.com/apache/superset/tree/master/helm/superset)
- Recordings of Past [Superset Community Events](https://preset.io/events)
- [Mixed Time Series Charts](https://preset.io/events/mixed-time-series-visualization-in-superset-workshop/)
- [How the Bing Team Customized Superset for the Internal Self-Serve Data & Analytics Platform](https://preset.io/events/how-the-bing-team-heavily-customized-superset-for-their-internal-data/)
- [Live Demo: Visualizing MongoDB and Pinot Data using Trino](https://preset.io/events/2021-04-13-visualizing-mongodb-and-pinot-data-using-trino/)
@@ -201,7 +197,6 @@ Understanding the Superset Points of View
- [Building a Database Connector for Superset](https://preset.io/events/2021-02-16-building-a-database-connector-for-superset/)
- Visualizations
- [Creating Viz Plugins](https://superset.apache.org/docs/contributing/creating-viz-plugins/)
- [Managing and Deploying Custom Viz Plugins](https://medium.com/nmc-techblog/apache-superset-manage-custom-viz-plugins-in-production-9fde1a708e55)
- [Why Apache Superset is Betting on Apache ECharts](https://preset.io/blog/2021-4-1-why-echarts/)

View File

@@ -20,7 +20,7 @@ RUN useradd --user-group --create-home --no-log-init --shell /bin/bash superset
# Configure environment
ENV LANG=C.UTF-8 \
LC_ALL=C.UTF-8
LC_ALL=C.UTF-8
RUN apt-get update -y
@@ -30,14 +30,14 @@ RUN apt-get install -y apt-transport-https apt-utils
# Install superset dependencies
# https://superset.apache.org/docs/installation/installing-superset-from-scratch
RUN apt-get install -y build-essential libssl-dev \
libffi-dev python3-dev libsasl2-dev libldap2-dev libxi-dev chromium zstd
libffi-dev python3-dev libsasl2-dev libldap2-dev libxi-dev chromium zstd
# Install nodejs for custom build
# https://nodejs.org/en/download/package-manager/
RUN set -eux; \
curl -sL https://deb.nodesource.com/setup_20.x | bash -; \
apt-get install -y nodejs; \
node --version;
curl -sL https://deb.nodesource.com/setup_20.x | bash -; \
apt-get install -y nodejs; \
node --version;
RUN if ! which npm; then apt-get install -y npm; fi
RUN mkdir -p /home/superset
@@ -50,21 +50,21 @@ ARG SUPERSET_RELEASE_RC_TARBALL
# Can fetch source from svn or copy tarball from local mounted directory
COPY $SUPERSET_RELEASE_RC_TARBALL ./
RUN tar -xvf *.tar.gz
WORKDIR /home/superset/apache_superset-$VERSION/superset-frontend
WORKDIR /home/superset/apache-superset-$VERSION/superset-frontend
RUN npm ci \
&& npm run build \
&& rm -rf node_modules
&& npm run build \
&& rm -rf node_modules
WORKDIR /home/superset/apache_superset-$VERSION
WORKDIR /home/superset/apache-superset-$VERSION
RUN pip install --upgrade setuptools pip \
&& pip install -r requirements/base.txt \
&& pip install --no-cache-dir .
&& pip install -r requirements/base.txt \
&& pip install --no-cache-dir .
RUN flask fab babel-compile --target superset/translations
ENV PATH=/home/superset/superset/bin:$PATH \
PYTHONPATH=/home/superset/superset/ \
SUPERSET_TESTENV=true
PYTHONPATH=/home/superset/superset/ \
SUPERSET_TESTENV=true
COPY from_tarball_entrypoint.sh /entrypoint.sh
ENTRYPOINT ["/entrypoint.sh"]

View File

@@ -20,7 +20,7 @@ RUN useradd --user-group --create-home --no-log-init --shell /bin/bash superset
# Configure environment
ENV LANG=C.UTF-8 \
LC_ALL=C.UTF-8
LC_ALL=C.UTF-8
RUN apt-get update -y
@@ -30,14 +30,14 @@ RUN apt-get install -y apt-transport-https apt-utils
# Install superset dependencies
# https://superset.apache.org/docs/installation/installing-superset-from-scratch
RUN apt-get install -y subversion build-essential libssl-dev \
libffi-dev python3-dev libsasl2-dev libldap2-dev libxi-dev chromium zstd
libffi-dev python3-dev libsasl2-dev libldap2-dev libxi-dev chromium zstd
# Install nodejs for custom build
# https://nodejs.org/en/download/package-manager/
RUN set -eux; \
curl -sL https://deb.nodesource.com/setup_20.x | bash -; \
apt-get install -y nodejs; \
node --version;
curl -sL https://deb.nodesource.com/setup_20.x | bash -; \
apt-get install -y nodejs; \
node --version;
RUN if ! which npm; then apt-get install -y npm; fi
RUN mkdir -p /home/superset
@@ -49,20 +49,20 @@ ARG VERSION
# Can fetch source from svn or copy tarball from local mounted directory
RUN svn co https://dist.apache.org/repos/dist/dev/superset/$VERSION ./
RUN tar -xvf *.tar.gz
WORKDIR /home/superset/apache_superset-$VERSION/superset-frontend
WORKDIR /home/superset/apache-superset-$VERSION/superset-frontend
RUN npm ci \
&& npm run build \
&& rm -rf node_modules
&& npm run build \
&& rm -rf node_modules
WORKDIR /home/superset/apache_superset-$VERSION
WORKDIR /home/superset/apache-superset-$VERSION
RUN pip install --upgrade setuptools pip \
&& pip install -r requirements/base.txt \
&& pip install --no-cache-dir .
&& pip install -r requirements/base.txt \
&& pip install --no-cache-dir .
RUN flask fab babel-compile --target superset/translations
ENV PATH=/home/superset/superset/bin:$PATH \
PYTHONPATH=/home/superset/superset/
PYTHONPATH=/home/superset/superset/
COPY from_tarball_entrypoint.sh /entrypoint.sh
ENTRYPOINT ["/entrypoint.sh"]

View File

@@ -123,10 +123,10 @@ SUPERSET_RC=1
SUPERSET_GITHUB_BRANCH=1.5
SUPERSET_PGP_FULLNAME=villebro@apache.org
SUPERSET_VERSION_RC=1.5.1rc1
SUPERSET_RELEASE=apache_superset-1.5.1
SUPERSET_RELEASE_RC=apache_superset-1.5.1rc1
SUPERSET_RELEASE_TARBALL=apache_superset-1.5.1-source.tar.gz
SUPERSET_RELEASE_RC_TARBALL=apache_superset-1.5.1rc1-source.tar.gz
SUPERSET_RELEASE=apache-superset-1.5.1
SUPERSET_RELEASE_RC=apache-superset-1.5.1rc1
SUPERSET_RELEASE_TARBALL=apache-superset-1.5.1-source.tar.gz
SUPERSET_RELEASE_RC_TARBALL=apache-superset-1.5.1rc1-source.tar.gz
SUPERSET_TMP_ASF_SITE_PATH=/tmp/incubator-superset-site-1.5.1
-------------------------------
```
@@ -380,7 +380,7 @@ Official instructions:
https://www.apache.org/info/verification.html
We now have a handy script for anyone validating a release to use. The core of it is in this very folder, `verify_release.py`. Just make sure you have all three release files in the same directory (`{some version}.tar.gz`, `{some version}.tar.gz.asc` and `{some version}.tar.gz.sha512`). Then you can pass this script the path to the `.gz` file like so:
`python verify_release.py ~/path/to/apache_superset-{version/candidate}-source.tar.gz`
`python verify_release.py ~/path/to/apache-superset-{version/candidate}-source.tar.gz`
If all goes well, you will see this result in your terminal:
@@ -454,11 +454,8 @@ cd ../
# Compile translations for the backend
./scripts/translations/generate_mo_files.sh
# update build version number
sed -i '' "s/version_string = .*/version_string = \"$SUPERSET_VERSION\"/" setup.py
# build the python distribution
python setup.py sdist
python -m build
```
Publish to PyPI
@@ -469,7 +466,8 @@ an account first if you don't have one, and reference your username
while requesting access to push packages.
```bash
twine upload dist/*
twine upload dist/apache_superset-${SUPERSET_VERSION}-py3-none-any.whl
twine upload dist/apache-superset-${SUPERSET_VERSION}.tar.gz
```
Set your username to `__token__`

View File

@@ -232,7 +232,8 @@ class GitChangeLog:
for log in self._logs:
yield {
"pr_number": log.pr_number,
"pr_link": f"https://github.com/{SUPERSET_REPO}/pull/{log.pr_number}",
"pr_link": f"https://github.com/{SUPERSET_REPO}/pull/"
f"{log.pr_number}",
"message": log.message,
"time": log.time,
"author": log.author,
@@ -322,9 +323,9 @@ class BaseParameters:
def print_title(message: str) -> None:
print(f"{50 * '-'}")
print(f"{50*'-'}")
print(message)
print(f"{50 * '-'}")
print(f"{50*'-'}")
@click.group()
@@ -348,14 +349,14 @@ def compare(base_parameters: BaseParameters) -> None:
previous_logs = base_parameters.previous_logs
current_logs = base_parameters.current_logs
print_title(
f"Pull requests from {current_logs.git_ref} not in {previous_logs.git_ref}"
f"Pull requests from " f"{current_logs.git_ref} not in {previous_logs.git_ref}"
)
previous_diff_logs = previous_logs.diff(current_logs)
for diff_log in previous_diff_logs:
print(f"{diff_log}")
print_title(
f"Pull requests from {previous_logs.git_ref} not in {current_logs.git_ref}"
f"Pull requests from " f"{previous_logs.git_ref} not in {current_logs.git_ref}"
)
current_diff_logs = current_logs.diff(previous_logs)
for diff_log in current_diff_logs:

View File

@@ -31,7 +31,7 @@ The official source release:
https://downloads.apache.org/{{ project_module }}/{{ version }}
The PyPI package:
https://pypi.org/project/apache_superset/{{ version }}
https://pypi.org/project/apache-superset/{{ version }}
The CHANGELOG for the release:
https://github.com/apache/{{ project_module }}/blob/{{ version }}/CHANGELOG/{{ version }}.md

View File

@@ -32,7 +32,7 @@ else
SUPERSET_VERSION="${1}"
SUPERSET_RC="${2}"
SUPERSET_PGP_FULLNAME="${3}"
SUPERSET_RELEASE_RC_TARBALL="apache_superset-${SUPERSET_VERSION_RC}-source.tar.gz"
SUPERSET_RELEASE_RC_TARBALL="apache-superset-${SUPERSET_VERSION_RC}-source.tar.gz"
fi
SUPERSET_VERSION_RC="${SUPERSET_VERSION}rc${SUPERSET_RC}"

View File

@@ -22,7 +22,7 @@ if [ -z "${SUPERSET_VERSION_RC}" ] || [ -z "${SUPERSET_SVN_DEV_PATH}" ] || [ -z
exit 1
fi
SUPERSET_RELEASE_RC=apache_superset-"${SUPERSET_VERSION_RC}"
SUPERSET_RELEASE_RC=apache-superset-"${SUPERSET_VERSION_RC}"
SUPERSET_RELEASE_RC_TARBALL="${SUPERSET_RELEASE_RC}"-source.tar.gz
SUPERSET_RELEASE_RC_BASE_PATH="${SUPERSET_SVN_DEV_PATH}"/"${SUPERSET_VERSION_RC}"
SUPERSET_RELEASE_RC_TARBALL_PATH="${SUPERSET_RELEASE_RC_BASE_PATH}"/"${SUPERSET_RELEASE_RC_TARBALL}"

View File

@@ -50,8 +50,8 @@ else
export SUPERSET_GITHUB_BRANCH="${VERSION_MAJOR}.${VERSION_MINOR}"
export SUPERSET_PGP_FULLNAME="${2}"
export SUPERSET_VERSION_RC="${SUPERSET_VERSION}rc${VERSION_RC}"
export SUPERSET_RELEASE=apache_superset-"${SUPERSET_VERSION}"
export SUPERSET_RELEASE_RC=apache_superset-"${SUPERSET_VERSION_RC}"
export SUPERSET_RELEASE=apache-superset-"${SUPERSET_VERSION}"
export SUPERSET_RELEASE_RC=apache-superset-"${SUPERSET_VERSION_RC}"
export SUPERSET_RELEASE_TARBALL="${SUPERSET_RELEASE}"-source.tar.gz
export SUPERSET_RELEASE_RC_TARBALL="${SUPERSET_RELEASE_RC}"-source.tar.gz
export SUPERSET_TMP_ASF_SITE_PATH="/tmp/incubator-superset-site-${SUPERSET_VERSION}"

View File

@@ -27,7 +27,7 @@ if [ -z "${SUPERSET_SVN_DEV_PATH}" ]; then
fi
if [[ -n ${1} ]] && [[ ${1} == "local" ]]; then
SUPERSET_RELEASE_RC=apache_superset-"${SUPERSET_VERSION_RC}"
SUPERSET_RELEASE_RC=apache-superset-"${SUPERSET_VERSION_RC}"
SUPERSET_RELEASE_RC_TARBALL="${SUPERSET_RELEASE_RC}"-source.tar.gz
SUPERSET_TARBALL_PATH="${SUPERSET_SVN_DEV_PATH}"/${SUPERSET_VERSION_RC}/${SUPERSET_RELEASE_RC_TARBALL}
SUPERSET_TMP_TARBALL_FILENAME=_tmp_"${SUPERSET_VERSION_RC}".tar.gz

View File

@@ -38,7 +38,7 @@ get_pip_command() {
PYTHON=$(get_python_command)
PIP=$(get_pip_command)
# Get the release directory's path. If you unzip an Apache release and just run the npm script to validate the release, this will be a file name like `apache_superset-x.x.xrcx-source.tar.gz`
# Get the release directory's path. If you unzip an Apache release and just run the npm script to validate the release, this will be a file name like `apache-superset-x.x.xrcx-source.tar.gz`
RELEASE_ZIP_PATH="../../$(basename "$(dirname "$(pwd)")")-source.tar.gz"
# Install dependencies from requirements.txt if the file exists

View File

@@ -49,6 +49,7 @@ These features are **finished** but currently being tested. They are usable, but
- ENABLE_SUPERSET_META_DB: [(docs)](https://superset.apache.org/docs/configuration/databases/#querying-across-databases)
- ESTIMATE_QUERY_COST
- GLOBAL_ASYNC_QUERIES [(docs)](https://github.com/apache/superset/blob/master/CONTRIBUTING.md#async-chart-queries)
- HORIZONTAL_FILTER_BAR
- IMPERSONATE_WITH_EMAIL_PREFIX
- PLAYWRIGHT_REPORTS_AND_THUMBNAILS
- RLS_IN_SQLLAB
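
For reference, flags like these are toggled through the `FEATURE_FLAGS` mapping in `superset_config.py`; a minimal sketch enabling the flag added above:
```python
# superset_config.py (sketch): enable one of the testing-phase flags listed above
FEATURE_FLAGS = {
    "HORIZONTAL_FILTER_BAR": True,
}
```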

View File

@@ -25,8 +25,8 @@ all you have to do is file a simple PR [like this one](https://github.com/apache
the categorization is inaccurate, please file a PR with your correction as well.
Join our growing community!
### Sharing Economy
### Sharing Economy
- [Airbnb](https://github.com/airbnb)
- [Faasos](https://faasos.com/) [@shashanksingh]
- [Free2Move](https://www.free2move.com/) [@PaoloTerzi]
@@ -36,14 +36,12 @@ Join our growing community!
- [Ontruck](https://www.ontruck.com/)
### Financial Services
- [Aktia Bank plc](https://www.aktia.com)
- [American Express](https://www.americanexpress.com) [@TheLastSultan]
- [bumper](https://www.bumper.co/) [@vasu-ram, @JamiePercival]
- [Cape Crypto](https://capecrypto.com)
- [Capital Service S.A.](https://capitalservice.pl) [@pkonarzewski]
- [Clark.de](https://clark.de/)
- [Europace](https://europace.de)
- [KarrotPay](https://www.daangnpay.com/)
- [Remita](https://remita.net) [@mujibishola]
- [Taveo](https://www.taveo.com) [@codek]
@@ -53,11 +51,9 @@ Join our growing community!
- [Cover Genius](https://covergenius.com/)
### Gaming
- [Popoko VM Games Studio](https://popoko.live)
### E-Commerce
- [AiHello](https://www.aihello.com) [@ganeshkrishnan1]
- [Bazaar Technologies](https://www.bazaartech.com) [@umair-abro]
- [Dragonpass](https://www.dragonpass.com.cn/) [@zhxjdwh]
@@ -83,14 +79,12 @@ Join our growing community!
- [Zepto](https://www.zeptonow.com/) [@gwthm-in]
### Enterprise Technology
- [A3Data](https://a3data.com.br) [@neylsoncrepalde]
- [Analytics Aura](https://analyticsaura.com/) [@Analytics-Aura]
- [Apollo GraphQL](https://www.apollographql.com/) [@evans]
- [Astronomer](https://www.astronomer.io) [@ryw]
- [Avesta Technologies](https://avestatechnologies.com/) [@TheRum]
- [Caizin](https://caizin.com/) [@tejaskatariya]
- [Canonical](https://canonical.com)
- [Careem](https://www.careem.com/) [@samraHanif0340]
- [Cloudsmith](https://cloudsmith.io) [@alancarson]
- [Cyberhaven](https://www.cyberhaven.com/) [@toliver-ch]
@@ -102,10 +96,8 @@ Join our growing community!
- [ELMO Cloud HR & Payroll](https://elmosoftware.com.au/)
- [Endress+Hauser](https://www.endress.com/) [@rumbin]
- [FBK - ICT center](https://ict.fbk.eu)
- [Formbricks](https://formbricks.com)
- [Gavagai](https://gavagai.io) [@gavagai-corp]
- [GfK Data Lab](https://www.gfk.com/home) [@mherr]
- [HPE](https://www.hpe.com/in/en/home.html) [@anmol-hpe]
- [Hydrolix](https://www.hydrolix.io/)
- [Intercom](https://www.intercom.com/) [@kate-gallo]
- [jampp](https://jampp.com/)
@@ -117,7 +109,6 @@ Join our growing community!
- [Ona](https://ona.io) [@pld]
- [Orange](https://www.orange.com) [@icsu]
- [Oslandia](https://oslandia.com)
- [Oxylabs](https://oxylabs.io/) [@rytis-ulys]
- [Peak AI](https://www.peak.ai/) [@azhar22k]
- [PeopleDoc](https://www.people-doc.com) [@rodo]
- [PlaidCloud](https://www.plaidcloud.com)
@@ -125,11 +116,8 @@ Join our growing community!
- [PubNub](https://pubnub.com) [@jzucker2]
- [ReadyTech](https://www.readytech.io)
- [Reward Gateway](https://www.rewardgateway.com)
- [RIADVICE](https://riadvice.tn) [@riadvice]
- [ScopeAI](https://www.getscopeai.com) [@iloveluce]
- [shipmnts](https://shipmnts.com)
- [Showmax](https://showmax.com) [@bobek]
- [SingleStore](https://www.singlestore.com/)
- [TechAudit](https://www.techaudit.info) [@ETselikov]
- [Tenable](https://www.tenable.com) [@dflionis]
- [Tentacle](https://www.linkedin.com/company/tentacle-cmi/) [@jdclarke5]
@@ -140,11 +128,9 @@ Join our growing community!
- [Virtuoso QA](https://www.virtuosoqa.com)
- [Whale](https://whale.im)
- [Windsor.ai](https://www.windsor.ai/) [@octaviancorlade]
- [WinWin Network马上赢](https://brandct.cn/) [@wenbinye]
- [Zeta](https://www.zeta.tech/) [@shaikidris]
### Media & Entertainment
- [6play](https://www.6play.fr) [@CoryChaplin]
- [bilibili](https://www.bilibili.com) [@Moinheart]
- [BurdaForward](https://www.burda-forward.de/en/)
@@ -157,7 +143,6 @@ Join our growing community!
- [Zaihang](https://www.zaih.com/)
### Education
- [Aveti Learning](https://avetilearning.com/) [@TheShubhendra]
- [Brilliant.org](https://brilliant.org/)
- [Open edX](https://openedx.org/)
@@ -169,7 +154,6 @@ Join our growing community!
- [WikiMedia Foundation](https://wikimediafoundation.org) [@vg]
### Energy
- [Airboxlab](https://foobot.io) [@antoine-galataud]
- [DouroECI](https://www.douroeci.com/) [@nunohelibeires]
- [Safaricom](https://www.safaricom.co.ke/) [@mmutiso]
@@ -177,7 +161,6 @@ Join our growing community!
- [Wattbewerb](https://wattbewerb.de/) [@wattbewerb]
### Healthcare
- [Amino](https://amino.com) [@shkr]
- [Bluesquare](https://www.bluesquarehub.com/) [@madewulf]
- [Care](https://www.getcare.io/) [@alandao2021]
@@ -190,36 +173,29 @@ Join our growing community!
- [2070Health](https://2070health.com/)
### HR / Staffing
- [Swile](https://www.swile.co/) [@PaoloTerzi]
- [Symmetrics](https://www.symmetrics.fyi)
- [bluquist](https://bluquist.com/)
### Government
### Government / Non-Profit
- [City of Ann Arbor, MI](https://www.a2gov.org/) [@sfirke]
- [RIS3 Strategy of CZ, MIT CR](https://www.ris3.cz/) [@RIS3CZ]
- [NRLM - Sarathi, India](https://pib.gov.in/PressReleasePage.aspx?PRID=1999586)
### Travel
- [Agoda](https://www.agoda.com/) [@lostseaway, @maiake, @obombayo]
- [HomeToGo](https://hometogo.com/) [@pedromartinsteenstrup]
- [Skyscanner](https://www.skyscanner.net/) [@cleslie, @stanhoucke]
### Others
- [10Web](https://10web.io/)
- [AI inside](https://inside.ai/en/)
- [Automattic](https://automattic.com/) [@Khrol, @Usiel]
- [Dropbox](https://www.dropbox.com/) [@bkyryliuk]
- [Flowbird](https://flowbird.com) [@EmmanuelCbd]
- [GEOTAB](https://www.geotab.com) [@JZ6]
- [Grassroot](https://www.grassrootinstitute.org/)
- [Increff](https://www.increff.com/) [@ishansinghania]
- [komoot](https://www.komoot.com/) [@christophlingg]
- [Let's Roam](https://www.letsroam.com/)
- [Machrent SA](https://www.machrent.com/)
- [Onebeat](https://1beat.com/) [@GuyAttia]
- [X](https://x.com/)
- [VLMedia](https://www.vlmedia.com.tr/) [@ibotheperfect]

View File

@@ -43,8 +43,8 @@ under the License.
| can this form post on ResetPasswordView |:heavy_check_mark:|O|O|O|
| can this form get on ResetMyPasswordView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can this form post on ResetMyPasswordView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can this form get on UserInfoEditView |:heavy_check_mark:|O|O|O|
| can this form post on UserInfoEditView |:heavy_check_mark:|O|O|O|
| can this form get on UserInfoEditView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can this form post on UserInfoEditView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can show on UserDBModelView |:heavy_check_mark:|O|O|O|
| can edit on UserDBModelView |:heavy_check_mark:|O|O|O|
| can delete on UserDBModelView |:heavy_check_mark:|O|O|O|
@@ -65,6 +65,7 @@ under the License.
| can get on MenuApi |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can list on AsyncEventsRestApi |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can invalidate on CacheRestApi |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can function names on Database |:heavy_check_mark:|O|O|O|
| can csv upload on Database |:heavy_check_mark:|O|O|O|
| can excel upload on Database |:heavy_check_mark:|O|O|O|
| can query form data on Api |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
@@ -75,6 +76,7 @@ under the License.
| can get on Datasource |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can my queries on SqlLab |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|
| can log on Superset |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can schemas access for csv upload on Superset |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can import dashboards on Superset |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can schemas on Superset |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can sqllab history on Superset |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|
@@ -116,6 +118,8 @@ under the License.
| menu access on Data |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| menu access on Databases |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| menu access on Datasets |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| menu access on Upload a CSV |:heavy_check_mark:|:heavy_check_mark:|O|O|
| menu access on Upload Excel |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| menu access on Charts |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| menu access on Dashboards |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| menu access on SQL Lab |:heavy_check_mark:|O|O|:heavy_check_mark:|
@@ -125,6 +129,13 @@ under the License.
| all datasource access on all_datasource_access |:heavy_check_mark:|:heavy_check_mark:|O|O|
| all database access on all_database_access |:heavy_check_mark:|:heavy_check_mark:|O|O|
| all query access on all_query_access |:heavy_check_mark:|O|O|O|
| can edit on UserOAuthModelView |:heavy_check_mark:|O|O|O|
| can list on UserOAuthModelView |:heavy_check_mark:|O|O|O|
| can show on UserOAuthModelView |:heavy_check_mark:|O|O|O|
| can userinfo on UserOAuthModelView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can add on UserOAuthModelView |:heavy_check_mark:|O|O|O|
| can delete on UserOAuthModelView |:heavy_check_mark:|O|O|O|
| userinfoedit on UserOAuthModelView |:heavy_check_mark:|O|O|O|
| can write on DynamicPlugin |:heavy_check_mark:|O|O|O|
| can edit on DynamicPlugin |:heavy_check_mark:|O|O|O|
| can list on DynamicPlugin |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
@@ -132,6 +143,13 @@ under the License.
| can download on DynamicPlugin |:heavy_check_mark:|O|O|O|
| can add on DynamicPlugin |:heavy_check_mark:|O|O|O|
| can delete on DynamicPlugin |:heavy_check_mark:|O|O|O|
| can edit on RowLevelSecurityFiltersModelView |:heavy_check_mark:|O|O|O|
| can list on RowLevelSecurityFiltersModelView |:heavy_check_mark:|O|O|O|
| can show on RowLevelSecurityFiltersModelView |:heavy_check_mark:|O|O|O|
| can download on RowLevelSecurityFiltersModelView |:heavy_check_mark:|O|O|O|
| can add on RowLevelSecurityFiltersModelView |:heavy_check_mark:|O|O|O|
| can delete on RowLevelSecurityFiltersModelView |:heavy_check_mark:|O|O|O|
| muldelete on RowLevelSecurityFiltersModelView |:heavy_check_mark:|O|O|O|
| can external metadata by name on Datasource |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can get value on KV |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can store on KV |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
@@ -174,6 +192,7 @@ under the License.
| can share chart on Superset |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can this form get on ColumnarToDatabaseView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can this form post on ColumnarToDatabaseView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| menu access on Upload a Columnar file |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can export on Chart |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can write on DashboardFilterStateRestApi |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can read on DashboardFilterStateRestApi |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|

View File

@@ -22,16 +22,6 @@ under the License.
This file documents any backwards-incompatible changes in Superset and
assists people when migrating to a new version.
## Next
- [34235](https://github.com/apache/superset/pull/34235) CSV exports now use `utf-8-sig` encoding by default to include a UTF-8 BOM, improving compatibility with Excel.
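  For illustration (plain Python, not Superset-specific), the BOM is just three bytes prepended by the codec:

  ```python
  # utf-8-sig prepends a byte-order mark (BOM) so Excel detects the file as UTF-8.
  assert "café".encode("utf-8-sig")[:3] == b"\xef\xbb\xbf"
  assert "café".encode("utf-8-sig")[3:] == "café".encode("utf-8")
  ```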
- [34258](https://github.com/apache/superset/pull/34258) The Dockerfile default has changed to `INCLUDE_CHROMIUM="false"` (previously `"true"`). This keeps the `lean` layer lean by default, and people can opt in to the `chromium` layer by setting the build arg `INCLUDE_CHROMIUM=true`. This is a breaking change for anyone relying on the `lean` layer including Chromium.
- [34204](https://github.com/apache/superset/pull/33603) OpenStreetMap has been promoted to the new default for Deck.gl visualizations, since it can be enabled by default without requiring an API key. If you have Mapbox set up and want to disable OpenStreetMap in your environment, please follow the steps documented at [superset.apache.org/docs/configuration/map-tiles](https://superset.apache.org/docs/configuration/map-tiles).
- [33116](https://github.com/apache/superset/pull/33116) In ECharts series charts (e.g. Line, Area, Bar), the `x_axis_sort_series` and `x_axis_sort_series_ascending` form data items have been renamed to `x_axis_sort` and `x_axis_sort_asc`.
There's a migration added that can potentially affect a significant number of existing charts.
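  For illustration, a hypothetical before/after of the renamed keys (only the two key names come from this entry; the surrounding form-data shape is an assumption):

  ```python
  # Hypothetical sketch of the form-data rename described above.
  old = {"x_axis_sort_series": "name", "x_axis_sort_series_ascending": True}
  new = {"x_axis_sort": old["x_axis_sort_series"],
         "x_axis_sort_asc": old["x_axis_sort_series_ascending"]}
  ```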
- [32317](https://github.com/apache/superset/pull/32317) The horizontal filter bar feature is now out of testing/beta development and its feature flag `HORIZONTAL_FILTER_BAR` has been removed.
- [31590](https://github.com/apache/superset/pull/31590) Marks the beginning of intricate work around supporting dynamic theming, and breaks support for [THEME_OVERRIDES](https://github.com/apache/superset/blob/732de4ac7fae88e29b7f123b6cbb2d7cd411b0e4/superset/config.py#L671) in favor of a new theming system based on AntD v5. This will likely remain in flux until it settles over the 5.x lifecycle.
- [32432](https://github.com/apache/superset/pull/31260) Moves the List Roles FAB view to the frontend and requires `FAB_ADD_SECURITY_API` to be enabled in the configuration and `superset init` to be executed.
## 5.0.0
- [31976](https://github.com/apache/superset/pull/31976) Removed the `DISABLE_LEGACY_DATASOURCE_EDITOR` feature flag. The previous value of the feature flag was `True` and now the feature is permanently removed.
@@ -45,7 +35,7 @@ assists people when migrating to a new version.
- [31198](https://github.com/apache/superset/pull/31198) Disallows by default the use of the following ClickHouse functions: "version", "currentDatabase", "hostName".
- [29798](https://github.com/apache/superset/pull/29798) Since 3.1.0, the initial schedule for an alert or report was mistakenly offset by the specified timezone's relation to UTC. The initial schedule should now begin at the correct time.
- [30021](https://github.com/apache/superset/pull/30021) The `dev` layer in our Dockerfile no longer includes Firefox binaries, only Chromium, to reduce bloat and docker build time.
- [30099](https://github.com/apache/superset/pull/30099) Translations are no longer included in the default docker image builds. If your environment requires translations, you'll want to set the docker build arg `BUILD_TRANSLATIONS=true`.
- [30099](https://github.com/apache/superset/pull/30099) Translations are no longer included in the default docker image builds. If your environment requires translations, you'll want to set the docker build arg `BUILD_TRANSACTION=true`.
- [31262](https://github.com/apache/superset/pull/31262) NOTE: deprecated `pylint` in favor of `ruff` as our only python linter. This only affects development workflows (positively), not the release itself. It should cover the most important rules and be much faster, but some linting rules that were enforced before may not be enforced in exactly the same way.
- [31173](https://github.com/apache/superset/pull/31173) Modified `fetch_csrf_token` to align with HTTP standards, particularly regarding how cookies are handled. If you encounter any issues related to CSRF functionality, please report them as a new issue and reference this PR for context.
- [31413](https://github.com/apache/superset/pull/31413) Enable the DATE_FORMAT_IN_EMAIL_SUBJECT feature flag to allow users to specify a date format for the email subject, which will then be replaced with the actual date.
@@ -58,13 +48,6 @@ assists people when migrating to a new version.
- [31961](https://github.com/apache/superset/pull/31961) Upgraded React from version 16.13.1 to 17.0.2. If you are using custom frontend extensions or plugins, you may need to update them to be compatible with React 17.
- [31260](https://github.com/apache/superset/pull/31260) Docker images now use `uv pip install` instead of `pip install` to manage the python environment. Most docker-based deployments will be affected, whether you derive from one of the published images or have a custom bootstrap script that installs python libraries (drivers).
### Potential Downtime
## 4.1.2
- [31198](https://github.com/apache/superset/pull/31198) Disallows by default the use of the following ClickHouse functions: "version", "currentDatabase", "hostName".
- [31173](https://github.com/apache/superset/pull/31173) Modified `fetch_csrf_token` to align with HTTP standards, particularly regarding how cookies are handled. If you encounter any issues related to CSRF functionality, please report them as a new issue and reference this PR for context.
## 4.1.0
- [29274](https://github.com/apache/superset/pull/29274): We made it easier to trigger CI on your

View File

@@ -41,7 +41,7 @@ services:
required: true
- path: docker/.env-local # optional override
required: false
image: postgres:16
image: postgres:15
container_name: superset_db
restart: unless-stopped
volumes:
@@ -65,6 +65,8 @@ services:
superset-init:
condition: service_completed_successfully
volumes: *superset-volumes
environment:
SUPERSET_LOG_LEVEL: "${SUPERSET_LOG_LEVEL:-info}"
superset-init:
image: *superset-image
@@ -84,6 +86,9 @@ services:
volumes: *superset-volumes
healthcheck:
disable: true
environment:
SUPERSET_LOAD_EXAMPLES: "${SUPERSET_LOAD_EXAMPLES:-yes}"
SUPERSET_LOG_LEVEL: "${SUPERSET_LOG_LEVEL:-info}"
superset-worker:
image: *superset-image
@@ -106,6 +111,8 @@ services:
"CMD-SHELL",
"celery -A superset.tasks.celery_app:app inspect ping -d celery@$$HOSTNAME",
]
environment:
SUPERSET_LOG_LEVEL: "${SUPERSET_LOG_LEVEL:-info}"
superset-worker-beat:
image: *superset-image
@@ -124,6 +131,8 @@ services:
volumes: *superset-volumes
healthcheck:
disable: true
environment:
SUPERSET_LOG_LEVEL: "${SUPERSET_LOG_LEVEL:-info}"
volumes:
superset_home:

View File

@@ -46,7 +46,7 @@ services:
required: true
- path: docker/.env-local # optional override
required: false
image: postgres:16
image: postgres:15
container_name: superset_db
restart: unless-stopped
volumes:
@@ -71,6 +71,8 @@ services:
superset-init:
condition: service_completed_successfully
volumes: *superset-volumes
environment:
SUPERSET_LOG_LEVEL: "${SUPERSET_LOG_LEVEL:-info}"
superset-init:
container_name: superset_init
@@ -91,6 +93,9 @@ services:
volumes: *superset-volumes
healthcheck:
disable: true
environment:
SUPERSET_LOAD_EXAMPLES: "${SUPERSET_LOAD_EXAMPLES:-yes}"
SUPERSET_LOG_LEVEL: "${SUPERSET_LOG_LEVEL:-info}"
superset-worker:
build:
@@ -114,6 +119,8 @@ services:
"CMD-SHELL",
"celery -A superset.tasks.celery_app:app inspect ping -d celery@$$HOSTNAME",
]
environment:
SUPERSET_LOG_LEVEL: "${SUPERSET_LOG_LEVEL:-info}"
superset-worker-beat:
build:
@@ -133,6 +140,8 @@ services:
volumes: *superset-volumes
healthcheck:
disable: true
environment:
SUPERSET_LOG_LEVEL: "${SUPERSET_LOG_LEVEL:-info}"
volumes:
superset_home:

View File

@@ -29,6 +29,7 @@ x-superset-volumes: &superset-volumes
- ./superset-frontend:/app/superset-frontend
- superset_home:/app/superset_home
- ./tests:/app/tests
x-common-build: &common-build
context: .
target: ${SUPERSET_BUILD_TARGET:-dev} # can use `dev` (default) or `lean`
@@ -42,11 +43,6 @@ x-common-build: &common-build
services:
nginx:
env_file:
- path: docker/.env # default
required: true
- path: docker/.env-local # optional override
required: false
image: nginx:latest
container_name: superset_nginx
restart: unless-stopped
@@ -56,8 +52,6 @@ services:
- "host.docker.internal:host-gateway"
volumes:
- ./docker/nginx/nginx.conf:/etc/nginx/nginx.conf:ro
- ./docker/nginx/templates:/etc/nginx/templates:ro
redis:
image: redis:7
container_name: superset_cache
@@ -73,7 +67,7 @@ services:
required: true
- path: docker/.env-local # optional override
required: false
image: postgres:16
image: postgres:15
container_name: superset_db
restart: unless-stopped
ports:
@@ -104,6 +98,9 @@ services:
superset-init:
condition: service_completed_successfully
volumes: *superset-volumes
environment:
CYPRESS_CONFIG: "${CYPRESS_CONFIG:-}"
SUPERSET_LOG_LEVEL: "${SUPERSET_LOG_LEVEL:-info}"
superset-websocket:
container_name: superset_websocket
@@ -155,6 +152,10 @@ services:
condition: service_started
user: *superset-user
volumes: *superset-volumes
environment:
CYPRESS_CONFIG: "${CYPRESS_CONFIG:-}"
SUPERSET_LOAD_EXAMPLES: "${SUPERSET_LOAD_EXAMPLES:-yes}"
SUPERSET_LOG_LEVEL: "${SUPERSET_LOG_LEVEL:-info}"
healthcheck:
disable: true
@@ -175,7 +176,7 @@ services:
NPM_RUN_PRUNE: false
SCARF_ANALYTICS: "${SCARF_ANALYTICS:-}"
# configuring the dev-server to use the host.docker.internal to connect to the backend
superset: "http://superset:8088"
superset: "http://host.docker.internal:8088"
ports:
- "127.0.0.1:9000:9000" # exposing the dynamic webpack dev server
container_name: superset_node
@@ -199,6 +200,8 @@ services:
required: false
environment:
CELERYD_CONCURRENCY: 2
CYPRESS_CONFIG: "${CYPRESS_CONFIG:-}"
SUPERSET_LOG_LEVEL: "${SUPERSET_LOG_LEVEL:-info}"
restart: unless-stopped
depends_on:
superset-init:
@@ -230,6 +233,9 @@ services:
volumes: *superset-volumes
healthcheck:
disable: true
environment:
CYPRESS_CONFIG: "${CYPRESS_CONFIG:-}"
SUPERSET_LOG_LEVEL: "${SUPERSET_LOG_LEVEL:-info}"
superset-tests-worker:
build:
@@ -250,6 +256,7 @@ services:
REDIS_RESULTS_DB: 3
REDIS_HOST: localhost
CELERYD_CONCURRENCY: 8
SUPERSET_LOG_LEVEL: "${SUPERSET_LOG_LEVEL:-info}"
network_mode: host
depends_on:
superset-init:

View File

@@ -54,7 +54,6 @@ REDIS_HOST=redis
REDIS_PORT=6379
FLASK_DEBUG=true
SUPERSET_APP_ROOT="/"
SUPERSET_ENV=development
SUPERSET_LOAD_EXAMPLES=yes
CYPRESS_CONFIG=false
@@ -63,6 +62,7 @@ MAPBOX_API_KEY=''
# Make sure you set this to a unique secure random value on production
SUPERSET_SECRET_KEY=TEST_NON_DEV_SECRET
ENABLE_PLAYWRIGHT=false
PUPPETEER_SKIP_CHROMIUM_DOWNLOAD=true
BUILD_SUPERSET_FRONTEND_IN_DOCKER=true

View File

@@ -1,19 +0,0 @@
#!/usr/bin/env bash
#
# Licensed to the Apache Software Foundation (ASF) under one or more
# contributor license agreements. See the NOTICE file distributed with
# this work for additional information regarding copyright ownership.
# The ASF licenses this file to You under the Apache License, Version 2.0
# (the "License"); you may not use this file except in compliance with
# the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
curl -f "http://localhost:${SUPERSET_PORT}/${SUPERSET_APP_ROOT/\//}/health" || exit 1

View File

@@ -90,5 +90,44 @@ http {
client_max_body_size 10m;
include /etc/nginx/conf.d/superset.conf;
upstream superset_app {
server host.docker.internal:8088;
keepalive 100;
}
upstream superset_websocket {
server host.docker.internal:8080;
keepalive 100;
}
server {
listen 80 default_server;
server_name _;
location /ws {
proxy_pass http://superset_websocket;
proxy_http_version 1.1;
proxy_set_header Upgrade $http_upgrade;
proxy_set_header Connection "Upgrade";
proxy_set_header Host $host;
}
location /static {
proxy_pass http://host.docker.internal:9000; # Proxy to superset-node
proxy_http_version 1.1;
proxy_set_header Host $host;
}
location / {
proxy_pass http://superset_app;
proxy_set_header Host $host;
proxy_set_header X-Real-IP $remote_addr;
proxy_set_header X-Forwarded-For $remote_addr;
proxy_set_header X-Forwarded-Host $host;
proxy_set_header X-Forwarded-Proto $scheme;
proxy_http_version 1.1;
port_in_redirect off;
proxy_connect_timeout 300;
}
}
}

View File

@@ -1,57 +0,0 @@
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership. The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.
upstream superset_app {
server host.docker.internal:8088;
keepalive 100;
}
upstream superset_websocket {
server host.docker.internal:8080;
keepalive 100;
}
server {
listen 80 default_server;
server_name _;
location /ws {
proxy_pass http://superset_websocket;
proxy_http_version 1.1;
proxy_set_header Upgrade $http_upgrade;
proxy_set_header Connection "Upgrade";
proxy_set_header Host $host;
}
location ${SUPERSET_APP_ROOT}/static {
proxy_pass http://host.docker.internal:9000; # Proxy to superset-node
proxy_http_version 1.1;
proxy_set_header Host $host;
}
location ${SUPERSET_APP_ROOT} {
proxy_pass http://superset_app;
proxy_set_header Host $http_host;
proxy_set_header X-Real-IP $remote_addr;
proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
proxy_set_header X-Forwarded-Proto $scheme;
proxy_http_version 1.1;
port_in_redirect off;
proxy_connect_timeout 300;
}
}

View File

@@ -71,7 +71,6 @@ CACHE_CONFIG = {
"CACHE_REDIS_DB": REDIS_RESULTS_DB,
}
DATA_CACHE_CONFIG = CACHE_CONFIG
THUMBNAIL_CACHE_CONFIG = CACHE_CONFIG
class CeleryConfig:
@@ -101,11 +100,9 @@ CELERY_CONFIG = CeleryConfig
FEATURE_FLAGS = {"ALERT_REPORTS": True}
ALERT_REPORTS_NOTIFICATION_DRY_RUN = True
WEBDRIVER_BASEURL = f"http://superset_app{os.environ.get('SUPERSET_APP_ROOT', '/')}/" # When using docker compose baseurl should be http://superset_nginx{ENV{BASEPATH}}/ # noqa: E501
WEBDRIVER_BASEURL = "http://superset:8088/" # When using docker compose baseurl should be http://superset_app:8088/ # noqa: E501
# The base URL for the email report hyperlinks.
WEBDRIVER_BASEURL_USER_FRIENDLY = (
f"http://localhost:8888/{os.environ.get('SUPERSET_APP_ROOT', '/')}/"
)
WEBDRIVER_BASEURL_USER_FRIENDLY = WEBDRIVER_BASEURL
SQLLAB_CTAS_NO_LIMIT = True
log_level_text = os.getenv("SUPERSET_LOG_LEVEL", "INFO")
@@ -132,7 +129,7 @@ try:
from superset_config_docker import * # noqa
logger.info(
f"Loaded your Docker configuration at [{superset_config_docker.__file__}]"
f"Loaded your Docker configuration at " f"[{superset_config_docker.__file__}]"
)
except ImportError:
logger.info("Using default Docker config...")

View File

@@ -1 +1 @@
v20.18.3
v20.16.0

View File

@@ -18,6 +18,6 @@ under the License.
-->
This is the public documentation site for Superset, built using
[Docusaurus 3](https://docusaurus.io/). See
[Docusaurus 2](https://docusaurus.io/). See
[CONTRIBUTING.md](../CONTRIBUTING.md#documentation) for documentation on
contributing to documentation.

View File

@@ -1,4 +1,3 @@
/* eslint-env node */
/**
* Licensed to the Apache Software Foundation (ASF) under one
* or more contributor license agreements. See the NOTICE file

View File

@@ -86,7 +86,6 @@
"Israel",
"Italy",
"Italy (regions)",
"Ivory Coast",
"Japan",
"Jordan",
"Kazakhstan",
@@ -144,7 +143,6 @@
"Poland",
"Portugal",
"Qatar",
"Republic Of Serbia",
"Romania",
"Russia",
"Rwanda",

View File

@@ -4,6 +4,7 @@ hide_title: true
sidebar_position: 10
---
import { Buffer } from 'buffer/index.js';
import SwaggerUI from 'swagger-ui-react';
import openapi from '/resources/openapi.json';
import 'swagger-ui-react/swagger-ui.css';

View File

@@ -92,7 +92,6 @@ You can find documentation about each field in the default `config.py` in the Gi
You need to replace default values with your custom Redis, Slack and/or SMTP config.
Superset uses Celery beat and Celery worker(s) to send alerts and reports.
- The beat is the scheduler that tells the worker when to perform its tasks. This schedule is defined when you create the alert or report.
- The worker will process the tasks that need to be performed when an alert or report is fired (a minimal schedule sketch follows this list).
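A rough sketch of what that beat schedule can look like in `superset_config.py` (the broker URL and cadence are placeholders; `reports.scheduler` mirrors the task name used by Superset's reports scheduler, and the `imports` entry is an assumption):

```python
# superset_config.py: hypothetical Celery beat schedule for alerts & reports.
from celery.schedules import crontab

class CeleryConfig:
    broker_url = "redis://localhost:6379/0"  # placeholder broker URL
    imports = ("superset.tasks.scheduler",)  # assumed module registering report tasks
    beat_schedule = {
        "reports.scheduler": {
            "task": "reports.scheduler",
            "schedule": crontab(minute="*", hour="*"),  # check for due reports every minute
        },
    }

CELERY_CONFIG = CeleryConfig
```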
@@ -144,7 +143,7 @@ SLACK_API_TOKEN = "xoxb-"
SMTP_HOST = "smtp.sendgrid.net" # change to your host
SMTP_PORT = 2525 # your port, e.g. 587
SMTP_STARTTLS = True
SMTP_SSL_SERVER_AUTH = True # If you're using an SMTP server with a valid certificate
SMTP_SSL_SERVER_AUTH = True # If your using an SMTP server with a valid certificate
SMTP_SSL = False
SMTP_USER = "your_user" # use the empty string "" if using an unauthenticated SMTP server
SMTP_PASSWORD = "your_password" # use the empty string "" if using an unauthenticated SMTP server
@@ -188,6 +187,7 @@ ALERT_REPORTS_EXECUTORS = [FixedExecutor("admin")]
Please refer to `ExecutorType` in the codebase for other executor types.
**Important notes**
- Be mindful of the concurrency setting for celery (using `-c 4`). Selenium/webdriver instances can
@@ -199,6 +199,7 @@ Please refer to `ExecutorType` in the codebase for other executor types.
- Adjust `WEBDRIVER_BASEURL` in your configuration file if celery workers can't access Superset via
its default value of `http://0.0.0.0:8080/`.
It's also possible to specify a minimum interval between each report's execution through the config file:
``` python
@@ -304,7 +305,6 @@ One symptom of an invalid connection to an email server is receiving an error of
Confirm via testing that your outbound email configuration is correct. Here is the simplest test, for an unauthenticated SMTP email service running on port 25. If you are sending over SSL, for instance, study how [Superset's codebase sends emails](https://github.com/apache/superset/blob/master/superset/utils/core.py#L818) and then test with those commands and arguments.
Start Python in your worker environment, replace all example values, and run:
```python
import smtplib
from email.mime.multipart import MIMEMultipart
@@ -326,7 +326,6 @@ mailserver.quit()
This should send an email.
Possible fixes:
- Some cloud hosts disable outgoing unauthenticated SMTP email to prevent spam. For instance, [Azure blocks port 25 by default on some machines](https://learn.microsoft.com/en-us/azure/virtual-network/troubleshoot-outbound-smtp-connectivity). Enable that port or use another sending method.
- Use another set of SMTP credentials that you verify works in this setup.

View File

@@ -42,13 +42,13 @@ CELERY_CONFIG = CeleryConfig
To start a Celery worker to leverage the configuration, run the following command:
```bash
```
celery --app=superset.tasks.celery_app:app worker --pool=prefork -O fair -c 4
```
To start a job which schedules periodic background jobs, run the following command:
```bash
```
celery --app=superset.tasks.celery_app:app beat
```
@@ -93,12 +93,12 @@ issues arise. Please clear your existing results cache store when upgrading an e
Flower is a web-based tool for monitoring the Celery cluster; you can install it from pip:
```bash
```python
pip install flower
```
You can run flower using:
```bash
```
celery --app=superset.tasks.celery_app:app flower
```

View File

@@ -17,7 +17,6 @@ Caching can be configured by providing dictionaries in
`superset_config.py` that comply with [the Flask-Caching config specifications](https://flask-caching.readthedocs.io/en/latest/#configuring-flask-caching).
The following cache configurations can be customized in this way (a sketch follows the list):
- Dashboard filter state (required): `FILTER_STATE_CACHE_CONFIG`.
- Explore chart form data (required): `EXPLORE_FORM_DATA_CACHE_CONFIG`
- Metadata cache (optional): `CACHE_CONFIG`
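As a sketch, a Redis-backed Flask-Caching config for one of these might look like the following (the URL, prefix, and timeout are placeholder values):

```python
# superset_config.py: hypothetical Flask-Caching config for the filter state cache.
FILTER_STATE_CACHE_CONFIG = {
    "CACHE_TYPE": "RedisCache",                     # Flask-Caching backend
    "CACHE_DEFAULT_TIMEOUT": 86400,                 # expire entries after 24h (placeholder)
    "CACHE_KEY_PREFIX": "superset_filter_",         # namespace keys in Redis (placeholder)
    "CACHE_REDIS_URL": "redis://localhost:6379/0",  # placeholder Redis URL
}
```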
@@ -82,7 +81,7 @@ See [Async Queries via Celery](/docs/configuration/async-queries-celery) for det
## Caching Thumbnails
This is an optional feature that can be turned on by activating its [feature flag](/docs/configuration/configuring-superset#feature-flags) on config:
```
FEATURE_FLAGS = {
@@ -100,6 +99,7 @@ from superset.tasks.types import FixedExecutor
THUMBNAIL_EXECUTORS = [FixedExecutor("admin")]
```
For this feature you will need a cache system and celery workers. All thumbnails are stored in the cache
and are processed asynchronously by the workers.

View File

@@ -117,7 +117,7 @@ Your deployment must use a complex, unique key.
### Rotating to a newer SECRET_KEY
If you wish to change your existing SECRET_KEY, add the existing SECRET_KEY to your `superset_config.py` file as
`PREVIOUS_SECRET_KEY =`and provide your new key as `SECRET_KEY =`. You can find your current SECRET_KEY with these
`PREVIOUS_SECRET_KEY = `and provide your new key as `SECRET_KEY =`. You can find your current SECRET_KEY with these
commands - if running Superset with Docker, execute from within the Superset application container:
```python
@@ -141,10 +141,10 @@ database engine on a separate host or container.
Superset supports the following database engines/versions:
| Database Engine | Supported Versions |
| ----------------------------------------- | ---------------------------------------- |
| [PostgreSQL](https://www.postgresql.org/) | 10.X, 11.X, 12.X, 13.X, 14.X, 15.X, 16.X |
| [MySQL](https://www.mysql.com/) | 5.7, 8.X |
| Database Engine | Supported Versions |
| ----------------------------------------- | ---------------------------------- |
| [PostgreSQL](https://www.postgresql.org/) | 10.X, 11.X, 12.X, 13.X, 14.X, 15.X |
| [MySQL](https://www.mysql.com/) | 5.7, 8.X |
Use the following database drivers and connection strings:
@@ -215,45 +215,6 @@ In case the reverse proxy is used for providing SSL encryption, an explicit defi
RequestHeader set X-Forwarded-Proto "https"
```
## Configuring the application root
*Please be advised that this feature is in BETA.*
Superset supports running the application under a non-root path. The root path
prefix can be specified in one of two ways:
- Setting the `SUPERSET_APP_ROOT` environment variable to the desired prefix.
- Customizing the [Flask entrypoint](https://github.com/apache/superset/blob/master/superset/app.py#L29)
by passing the `superset_app_root` variable.
Note, the prefix should start with a `/`.
### Customizing the Flask entrypoint
To configure a prefix, e.g. `/analytics`, pass the `superset_app_root` argument to
`create_app` when calling flask run either through the `FLASK_APP`
environment variable:
```sh
FLASK_APP="superset:create_app(superset_app_root='/analytics')"
```
or as part of the `--app` argument to `flask run`:
```sh
flask --app "superset.app:create_app(superset_app_root='/analytics')" run
```
### Docker builds
The [docker compose](/docs/installation/docker-compose#configuring-further) developer
configuration includes an additional environmental variable,
[`SUPERSET_APP_ROOT`](https://github.com/apache/superset/blob/master/docker/.env),
to simplify the process of setting up a non-default root path across the services.
In `docker/.env-local` set `SUPERSET_APP_ROOT` to the desired prefix and then bring the
services up with `docker compose up --detach`.
## Custom OAuth2 Configuration
Superset is built on Flask-AppBuilder (FAB), which supports many providers out of the box
@@ -302,15 +263,6 @@ AUTH_USER_REGISTRATION = True
AUTH_USER_REGISTRATION_ROLE = "Public"
```
In case you want to assign the `Admin` role on new user registration, it can be assigned as follows:
```python
AUTH_USER_REGISTRATION_ROLE = "Admin"
```
If you encounter the [issue](https://github.com/apache/superset/issues/13243) of not being able to list users from the Superset main page settings, although a newly registered user has an `Admin` role, please re-run `superset init` to sync the required permissions. Below is the command to re-run `superset init` using docker compose.
```
docker-compose exec superset superset init
```
Then, create a `CustomSsoSecurityManager` that extends `SupersetSecurityManager` and overrides
`oauth_user_info`:
@@ -331,7 +283,7 @@ class CustomSsoSecurityManager(SupersetSecurityManager):
...
```
This file must be located in the same directory as `superset_config.py` with the name
This file must be located at the same directory than `superset_config.py` with the name
`custom_sso_security_manager.py`. Finally, add the following 2 lines to `superset_config.py`:
```
@@ -348,7 +300,6 @@ CUSTOM_SECURITY_MANAGER = CustomSsoSecurityManager
- If an OAuth2 authorization server supports OpenID Connect 1.0, you can configure just its configuration
document URL, without providing `api_base_url`, `access_token_url`, `authorize_url` and other
required options like the user info endpoint, jwks uri etc. For instance:
```python
OAUTH_PROVIDERS = [
{ 'name':'egaSSO',
@@ -362,15 +313,12 @@ CUSTOM_SECURITY_MANAGER = CustomSsoSecurityManager
}
]
```
### Keycloak-Specific Configuration using Flask-OIDC
If you are using Keycloak as an OpenID Connect 1.0 provider, the above configuration based on [`Authlib`](https://authlib.org/) might not work. In this case, using [`Flask-OIDC`](https://pypi.org/project/flask-oidc/) is a viable option.
Make sure the pip package [`Flask-OIDC`](https://pypi.org/project/flask-oidc/) is installed on the webserver. This was successfully tested using version 2.2.0. This package requires [`Flask-OpenID`](https://pypi.org/project/Flask-OpenID/) as a dependency.
Make sure the pip package [`Flask-OIDC`](https://pypi.org/project/flask-oidc/) is installed on the webserver. This was succesfully tested using version 2.2.0. This package requires [`Flask-OpenID`](https://pypi.org/project/Flask-OpenID/) as a dependency.
The following code defines a new security manager. Add it to a new file named `keycloak_security_manager.py`, placed in the same directory as your `superset_config.py` file.
```python
from flask_appbuilder.security.manager import AUTH_OID
from superset.security import SupersetSecurityManager
@@ -425,9 +373,7 @@ class AuthOIDCView(AuthOIDView):
return redirect(
oidc.client_secrets.get('issuer') + '/protocol/openid-connect/logout?redirect_uri=' + quote(redirect_url))
```
Then add to your `superset_config.py` file:
```python
from keycloak_security_manager import OIDCSecurityManager
from flask_appbuilder.security.manager import AUTH_OID, AUTH_REMOTE_USER, AUTH_DB, AUTH_LDAP, AUTH_OAUTH
@@ -447,9 +393,7 @@ AUTH_USER_REGISTRATION = True
# The default user self registration role
AUTH_USER_REGISTRATION_ROLE = 'Public'
```
Store your client-specific OpenID information in a file called `client_secret.json`. Create this file in the same directory as `superset_config.py`:
```json
{
"<myOpenIDProvider>": {
@@ -466,7 +410,6 @@ Store your client-specific OpenID information in a file called `client_secret.js
}
}
```
## LDAP Authentication
FAB supports authenticating user credentials against an LDAP server.
@@ -489,7 +432,6 @@ AUTH_ROLES_MAPPING = {
"superset_admins": ["Admin"],
}
```
### Mapping LDAP groups to Superset roles
The following `AUTH_ROLES_MAPPING` dictionary would map the LDAP DN "cn=superset_users,ou=groups,dc=example,dc=com" to the Superset roles "Gamma" as well as "Alpha", and the LDAP DN "cn=superset_admins,ou=groups,dc=example,dc=com" to the Superset role "Admin".
@@ -500,7 +442,6 @@ AUTH_ROLES_MAPPING = {
"cn=superset_admins,ou=groups,dc=example,dc=com": ["Admin"],
}
```
Note: This requires `AUTH_LDAP_SEARCH` to be set. For more details, please see the [FAB Security documentation](https://flask-appbuilder.readthedocs.io/en/latest/security.html).
### Syncing roles at login
@@ -534,7 +475,7 @@ def FLASK_APP_MUTATOR(app: Flask) -> None:
To support a diverse set of users, Superset has some features that are not enabled by default. For
example, some users have stronger security restrictions, while some others may not. So Superset
allows users to enable or disable some features by config. For feature owners, you can add optional
allow users to enable or disable some features by config. For feature owners, you can add optional
functionalities in Superset, but they will only affect a subset of users.
You can enable or disable features with flags from `superset_config.py`:
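A minimal sketch (`ALERT_REPORTS` appears elsewhere in these docs; any other flag name here is just an example, and availability varies by version):

```python
# superset_config.py: minimal sketch of toggling feature flags.
FEATURE_FLAGS = {
    "ALERT_REPORTS": True,    # enable alerts & reports
    "DASHBOARD_RBAC": False,  # example flag; check your version's defaults
}
```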

View File

@@ -31,17 +31,18 @@ install new database drivers into your Superset configuration.
### Supported Databases and Dependencies
Some of the recommended packages are shown below. Please refer to
[pyproject.toml](https://github.com/apache/superset/blob/master/pyproject.toml) for the versions that
are compatible with Superset.
| <div style={{width: '150px'}}>Database</div> | PyPI package | Connection String |
| --------------------------------------------------------- | ---------------------------------------------------------------------------------- | ------------------------------------------------------------------------------------------------------------------------------------------------------ |
| [AWS Athena](/docs/configuration/databases#aws-athena) | `pip install pyathena[pandas]` , `pip install PyAthenaJDBC` | `awsathena+rest://{access_key_id}:{access_key}@athena.{region}.amazonaws.com/{schema}?s3_staging_dir={s3_staging_dir}&...` |
| [AWS Athena](/docs/configuration/databases#aws-athena) | `pip install pyathena[pandas]` , `pip install PyAthenaJDBC` | `awsathena+rest://{access_key_id}:{access_key}@athena.{region}.amazonaws.com/{schema}?s3_staging_dir={s3_staging_dir}&... ` |
| [AWS DynamoDB](/docs/configuration/databases#aws-dynamodb) | `pip install pydynamodb` | `dynamodb://{access_key_id}:{secret_access_key}@dynamodb.{region_name}.amazonaws.com?connector=superset` |
| [AWS Redshift](/docs/configuration/databases#aws-redshift) | `pip install sqlalchemy-redshift` | `redshift+psycopg2://<userName>:<DBPassword>@<AWS End Point>:5439/<Database Name>` |
| [AWS Redshift](/docs/configuration/databases#aws-redshift) | `pip install sqlalchemy-redshift` | ` redshift+psycopg2://<userName>:<DBPassword>@<AWS End Point>:5439/<Database Name>` |
| [Apache Doris](/docs/configuration/databases#apache-doris) | `pip install pydoris` | `doris://<User>:<Password>@<Host>:<Port>/<Catalog>.<Database>` |
| [Apache Drill](/docs/configuration/databases#apache-drill) | `pip install sqlalchemy-drill` | `drill+sadrill://<username>:<password>@<host>:<port>/<storage_plugin>`, often useful: `?use_ssl=True/False` |
| [Apache Drill](/docs/configuration/databases#apache-drill) | `pip install sqlalchemy-drill` | `drill+sadrill:// For JDBC drill+jdbc://` |
| [Apache Druid](/docs/configuration/databases#apache-druid) | `pip install pydruid` | `druid://<User>:<password>@<Host>:<Port-default-9088>/druid/v2/sql` |
| [Apache Hive](/docs/configuration/databases#hive) | `pip install pyhive` | `hive://hive@{hostname}:{port}/{database}` |
| [Apache Impala](/docs/configuration/databases#apache-impala) | `pip install impyla` | `impala://{hostname}:{port}/{database}` |
@@ -67,24 +68,22 @@ are compatible with Superset.
| [IBM Netezza Performance Server](/docs/configuration/databases#ibm-netezza-performance-server) | `pip install nzalchemy` | `netezza+nzpy://<UserName>:<DBPassword>@<Database Host>/<Database Name>` |
| [MySQL](/docs/configuration/databases#mysql) | `pip install mysqlclient` | `mysql://<UserName>:<DBPassword>@<Database Host>/<Database Name>` |
| [OceanBase](/docs/configuration/databases#oceanbase) | `pip install oceanbase_py` | `oceanbase://<UserName>:<DBPassword>@<Database Host>/<Database Name>` |
| [Oracle](/docs/configuration/databases#oracle) | `pip install cx_Oracle` | `oracle://<username>:<password>@<hostname>:<port>` |
| [Oracle](/docs/configuration/databases#oracle) | `pip install cx_Oracle` | `oracle://` |
| [Parseable](/docs/configuration/databases#parseable) | `pip install sqlalchemy-parseable` | `parseable://<UserName>:<DBPassword>@<Database Host>/<Stream Name>` |
| [PostgreSQL](/docs/configuration/databases#postgres) | `pip install psycopg2` | `postgresql://<UserName>:<DBPassword>@<Database Host>/<Database Name>` |
| [Presto](/docs/configuration/databases#presto) | `pip install pyhive` | `presto://{username}:{password}@{hostname}:{port}/{database}` |
| [SAP Hana](/docs/configuration/databases#hana) | `pip install hdbcli sqlalchemy-hana` or `pip install apache_superset[hana]` | `hana://{username}:{password}@{host}:{port}` |
| [SingleStore](/docs/configuration/databases#singlestore) | `pip install sqlalchemy-singlestoredb` | `singlestoredb://{username}:{password}@{host}:{port}/{database}` |
| [Presto](/docs/configuration/databases#presto) | `pip install pyhive` | `presto://` |
| [Rockset](/docs/configuration/databases#rockset) | `pip install rockset-sqlalchemy` | `rockset://<api_key>:@<api_server>` |
| [SAP Hana](/docs/configuration/databases#hana) | `pip install hdbcli sqlalchemy-hana` or `pip install apache-superset[hana]` | `hana://{username}:{password}@{host}:{port}` |
| [StarRocks](/docs/configuration/databases#starrocks) | `pip install starrocks` | `starrocks://<User>:<Password>@<Host>:<Port>/<Catalog>.<Database>` |
| [Snowflake](/docs/configuration/databases#snowflake) | `pip install snowflake-sqlalchemy` | `snowflake://{user}:{password}@{account}.{region}/{database}?role={role}&warehouse={warehouse}` |
| SQLite | No additional library needed | `sqlite://path/to/file.db?check_same_thread=false` |
| [SQL Server](/docs/configuration/databases#sql-server) | `pip install pymssql` | `mssql+pymssql://<Username>:<Password>@<Host>:<Port-default:1433>/<Database Name>` |
| [TDengine](/docs/configuration/databases#tdengine) | `pip install taospy` `pip install taos-ws-py` | `taosws://<user>:<password>@<host>:<port>` |
| [SQL Server](/docs/configuration/databases#sql-server) | `pip install pymssql` | `mssql+pymssql://` |
| [Teradata](/docs/configuration/databases#teradata) | `pip install teradatasqlalchemy` | `teradatasql://{user}:{password}@{host}` |
| [TimescaleDB](/docs/configuration/databases#timescaledb) | `pip install psycopg2` | `postgresql://<UserName>:<DBPassword>@<Database Host>:<Port>/<Database Name>` |
| [Trino](/docs/configuration/databases#trino) | `pip install trino` | `trino://{username}:{password}@{hostname}:{port}/{catalog}` |
| [Vertica](/docs/configuration/databases#vertica) | `pip install sqlalchemy-vertica-python` | `vertica+vertica_python://<UserName>:<DBPassword>@<Database Host>/<Database Name>` |
| [YDB](/docs/configuration/databases#ydb) | `pip install ydb-sqlalchemy` | `ydb://{host}:{port}/{database_name}` |
| [YugabyteDB](/docs/configuration/databases#yugabytedb) | `pip install psycopg2` | `postgresql://<UserName>:<DBPassword>@<Database Host>/<Database Name>` |
---
Note that many other databases are supported, the main criterion being the existence of a functional
@@ -185,6 +184,7 @@ purposes of isolating the problem.
Repeat this process for each type of database you want Superset to connect to.
### Database-specific Instructions
#### Ascend.io
@@ -210,12 +210,14 @@ You'll need the following setting values to form the connection string:
- **Catalog**: Catalog Name
- **Database**: Database Name
Here's what the connection string looks like:
```
doris://<User>:<Password>@<Host>:<Port>/<Catalog>.<Database>
```
#### AWS Athena
##### PyAthenaJDBC
@@ -245,7 +247,6 @@ awsathena+rest://{aws_access_key_id}:{aws_secret_access_key}@athena.{region_name
```
The PyAthena library also allows you to assume a specific IAM role, which you can define by adding the following parameters in Superset's Athena database connection UI under ADVANCED --> Other --> ENGINE PARAMETERS.
```json
{
"connect_args": {
@@ -268,6 +269,7 @@ dynamodb://{aws_access_key_id}:{aws_secret_access_key}@dynamodb.{region_name}.am
To get more documentation, please visit: [PyDynamoDB WIKI](https://github.com/passren/PyDynamoDB/wiki/5.-Superset).
#### AWS Redshift
The [sqlalchemy-redshift](https://pypi.org/project/sqlalchemy-redshift/) library is the recommended
@@ -283,6 +285,7 @@ You'll need to set the following values to form the connection string:
- **Database Name**: Database Name
- **Port**: default 5439
##### psycopg2
Here's what the SQLALCHEMY URI looks like:
@@ -291,6 +294,7 @@ Here's what the SQLALCHEMY URI looks like:
redshift+psycopg2://<userName>:<DBPassword>@<AWS End Point>:5439/<Database Name>
```
##### redshift_connector
Here's what the SQLALCHEMY URI looks like:
@@ -299,7 +303,8 @@ Here's what the SQLALCHEMY URI looks like:
redshift+redshift_connector://<userName>:<DBPassword>@<AWS End Point>:5439/<Database Name>
```
###### Using IAM-based credentials with Redshift cluster
###### Using IAM-based credentials with Redshift cluster:
[Amazon redshift cluster](https://docs.aws.amazon.com/redshift/latest/mgmt/working-with-clusters.html) also supports generating temporary IAM-based database user credentials.
@@ -310,10 +315,10 @@ You have to define the following arguments in Superset's redshift database conne
```
{"connect_args":{"iam":true,"database":"<database>","cluster_identifier":"<cluster_identifier>","db_user":"<db_user>"}}
```
and SQLALCHEMY URI should be set to `redshift+redshift_connector://`
###### Using IAM-based credentials with Redshift serverless
###### Using IAM-based credentials with Redshift serverless:
[Redshift serverless](https://docs.aws.amazon.com/redshift/latest/mgmt/serverless-whatis.html) supports connection using IAM roles.
@@ -325,6 +330,8 @@ You have to define the following arguments in Superset's redshift database conne
{"connect_args":{"iam":true,"is_serverless":true,"serverless_acct_id":"<aws account number>","serverless_work_group":"<redshift work group>","database":"<database>","user":"IAMR:<superset iam role name>"}}
```
#### ClickHouse
To use ClickHouse with Superset, you will need to install the `clickhouse-connect` Python library:
@@ -357,6 +364,8 @@ uses the default user without a password (and doesn't encrypt the connection):
clickhousedb://localhost/default
```
#### CockroachDB
The recommended connector library for CockroachDB is
@@ -368,12 +377,13 @@ The expected connection string is formatted as follows:
cockroachdb://root@{hostname}:{port}/{database}?sslmode=disable
```
#### Couchbase
Couchbase's Superset connection is designed to support two services: Couchbase Analytics and Couchbase Columnar.
The recommended connector library for Couchbase is
[couchbase-sqlalchemy](https://github.com/couchbase/couchbase-sqlalchemy).
```
pip install couchbase-sqlalchemy
```
@@ -384,25 +394,22 @@ The expected connection string is formatted as follows:
couchbase://{username}:{password}@{hostname}:{port}?truststorepath={certificate path}&ssl={true/false}
```
#### CrateDB
The connector library for CrateDB is [sqlalchemy-cratedb].
We recommend adding the following item to your `requirements.txt` file:
```
sqlalchemy-cratedb>=0.40.1,<1
```
An SQLAlchemy connection string for [CrateDB Self-Managed] on localhost,
for evaluation purposes, looks like this:
```
crate://crate@127.0.0.1:4200
```
An SQLAlchemy connection string for connecting to [CrateDB Cloud] looks like
this:
```
crate://<username>:<password>@<clustername>.cratedb.net:4200/?ssl=true
```
@@ -410,7 +417,6 @@ crate://<username>:<password>@<clustername>.cratedb.net:4200/?ssl=true
Follow the steps [here](/docs/configuration/databases#installing-database-drivers)
to install the CrateDB connector package when setting up Superset locally using
Docker Compose.
```
echo "sqlalchemy-cratedb" >> ./docker/requirements-local.txt
```
@@ -419,6 +425,7 @@ echo "sqlalchemy-cratedb" >> ./docker/requirements-local.txt
[CrateDB Self-Managed]: https://cratedb.com/product/self-managed
[sqlalchemy-cratedb]: https://pypi.org/project/sqlalchemy-cratedb/
#### Databend
The recommended connector library for Databend is [databend-sqlalchemy](https://pypi.org/project/databend-sqlalchemy/).
@@ -436,6 +443,7 @@ Here's a connection string example of Superset connecting to a Databend database
databend://user:password@localhost:8000/default?secure=false
```
#### Databricks
Databricks now offers a native DB API 2.0 driver, `databricks-sql-connector`, that can be used with the `sqlalchemy-databricks` dialect. You can install both with:
@@ -519,6 +527,7 @@ For a connection to a SQL endpoint you need to use the HTTP path from the endpoi
{"connect_args": {"http_path": "/sql/1.0/endpoints/****", "driver_path": "/path/to/odbc/driver"}}
```
#### Denodo
The recommended connector library for Denodo is
@@ -530,6 +539,7 @@ The expected connection string is formatted as follows (default port is 9996):
denodo://{username}:{password}@{hostname}:{port}/{database}
```
#### Dremio
The recommended connector library for Dremio is
@@ -550,6 +560,7 @@ dremio+flight://{username}:{password}@{host}:{port}/dremio
This [blog post by Dremio](https://www.dremio.com/tutorials/dremio-apache-superset/) has some
additional helpful instructions on connecting Superset to Dremio.
#### Apache Drill
##### SQLAlchemy
@@ -591,6 +602,8 @@ We recommend reading the
the [GitHub README](https://github.com/JohnOmernik/sqlalchemy-drill#usage-with-odbc) to learn how to
work with Drill through ODBC.
import useBaseUrl from "@docusaurus/useBaseUrl";
#### Apache Druid
@@ -604,7 +617,6 @@ The connection string looks like:
```
druid://<User>:<password>@<Host>:<Port-default-9088>/druid/v2/sql
```
Here's a breakdown of the key components of this connection string:
- `User`: username portion of the credentials needed to connect to your database
@@ -633,7 +645,7 @@ To disable SSL verification, add the following to the **Extras** field:
```
engine_params:
{"connect_args":
{"scheme": "https", "ssl_verify_cert": false}}
{"scheme": "https", "ssl_verify_cert": false}}
```
##### Aggregations
@@ -657,6 +669,7 @@ much like you would create an aggregation manually, but specify `postagg` as a `
then have to provide a valid json post-aggregation definition (as specified in the Druid docs) in
the JSON field.
#### Elasticsearch
The recommended connector library for Elasticsearch is
@@ -705,7 +718,7 @@ Then register your table with the alias name logstash_all
By default, Superset uses the UTC time zone for Elasticsearch queries. If you need to specify a time zone,
please edit your Database and enter your time zone settings in the Other > ENGINE PARAMETERS field:
```json
```
{
"connect_args": {
"time_zone": "Asia/Shanghai"
@@ -727,6 +740,8 @@ To disable SSL verification, add the following to the **SQLALCHEMY URI** field:
elasticsearch+https://{user}:{password}@{host}:9200/?verify_certs=False
```
#### Exasol
The recommended connector library for Exasol is
@@ -738,6 +753,7 @@ The connection string for Exasol looks like this:
exa+pyodbc://{username}:{password}@{hostname}:{port}/my_schema?CONNECTIONLCALL=en_US.UTF-8&driver=EXAODBC
```
#### Firebird
The recommended connector library for Firebird is [sqlalchemy-firebird](https://pypi.org/project/sqlalchemy-firebird/).
@@ -755,6 +771,7 @@ Here's a connection string example of Superset connecting to a local Firebird da
firebird+fdb://SYSDBA:masterkey@192.168.86.38:3050//Library/Frameworks/Firebird.framework/Versions/A/Resources/examples/empbuild/employee.fdb
```
#### Firebolt
The recommended connector library for Firebolt is [firebolt-sqlalchemy](https://pypi.org/project/firebolt-sqlalchemy/).
@@ -785,7 +802,7 @@ The recommended connector library for BigQuery is
Follow the steps [here](/docs/configuration/databases#installing-drivers-in-docker-images) about how to
install new database drivers when setting up Superset locally via docker compose.
```bash
```
echo "sqlalchemy-bigquery" >> ./docker/requirements-local.txt
```
@@ -798,7 +815,7 @@ credentials file (as a JSON).
appropriate BigQuery datasets, and download the JSON configuration file for the service account.
2. In Superset, you can either upload that JSON or add the JSON blob in the following format (this should be the content of your credential JSON file):
```json
```
{
"type": "service_account",
"project_id": "...",
@@ -826,7 +843,7 @@ credentials file (as a JSON).
Go to the **Advanced** tab, Add a JSON blob to the **Secure Extra** field in the database configuration form with
the following format:
```json
```
{
"credentials_info": <contents of credentials JSON file>
}
@@ -834,7 +851,7 @@ credentials file (as a JSON).
The resulting file should have this structure:
```json
```
{
"credentials_info": {
"type": "service_account",
@@ -861,6 +878,8 @@ To be able to upload CSV or Excel files to BigQuery in Superset, you'll need to
Currently, the Google BigQuery Python SDK is not compatible with `gevent`, due to `gevent`'s dynamic monkeypatching of the python core library.
So, when you deploy Superset with the `gunicorn` server, you have to use a worker type other than `gevent`.
#### Google Sheets
Google Sheets has a very limited
@@ -871,6 +890,7 @@ There are a few steps involved in connecting Superset to Google Sheets. This
[tutorial](https://preset.io/blog/2020-06-01-connect-superset-google-sheets/) has the most up to date
instructions on setting up this connection.
#### Hana
The recommended connector library is [sqlalchemy-hana](https://github.com/SAP/sqlalchemy-hana).
@@ -881,6 +901,7 @@ The connection string is formatted as follows:
hana://{username}:{password}@{host}:{port}
```
#### Apache Hive
The [pyhive](https://pypi.org/project/PyHive/) library is the recommended way to connect to Hive through SQLAlchemy.
@@ -891,6 +912,7 @@ The expected connection string is formatted as follows:
hive://hive@{hostname}:{port}/{database}
```
#### Hologres
Hologres is a real-time interactive analytics service developed by Alibaba Cloud. It is fully compatible with PostgreSQL 11 and integrates seamlessly with the big data ecosystem.
@@ -909,6 +931,7 @@ The connection string looks like:
postgresql+psycopg2://{username}:{password}@{host}:{port}/{database}
```
#### IBM DB2
The [IBM_DB_SA](https://github.com/ibmdb/python-ibmdbsa/tree/master/ibm_db_sa) library provides a
@@ -926,6 +949,7 @@ There are two DB2 dialect versions implemented in SQLAlchemy. If you are connect
ibm_db_sa://{username}:{password}@{hostname}:{port}/{database}
```
#### Apache Impala
The recommended connector library to Apache Impala is [impyla](https://github.com/cloudera/impyla).
@@ -936,6 +960,7 @@ The expected connection string is formatted as follows:
impala://{hostname}:{port}/{database}
```
#### Kusto
The recommended connector library for Kusto is
@@ -956,6 +981,7 @@ kustokql+https://{cluster_url}/{database}?azure_ad_client_id={azure_ad_client_id
Make sure the user has privileges to access and use all required
databases/tables/views.
#### Apache Kylin
The recommended connector library for Apache Kylin is
@@ -967,6 +993,10 @@ The expected connection string is formatted as follows:
kylin://<username>:<password>@<hostname>:<port>/<project>?<param1>=<value1>&<param2>=<value2>
```
#### MySQL
The recommended connector library for MySQL is [mysqlclient](https://pypi.org/project/mysqlclient/).
@@ -991,6 +1021,7 @@ One problem with `mysqlclient` is that it will fail to connect to newer MySQL da
mysql+mysqlconnector://{username}:{password}@{host}/{database}
```
#### IBM Netezza Performance Server
The [nzalchemy](https://pypi.org/project/nzalchemy/) library provides a
@@ -1007,19 +1038,21 @@ netezza+nzpy://{username}:{password}@{hostname}:{port}/{database}
The [sqlalchemy-oceanbase](https://pypi.org/project/oceanbase_py/) library is the recommended
way to connect to OceanBase through SQLAlchemy.
The connection string for OceanBase looks like this:
```
oceanbase://<User>:<Password>@<Host>:<Port>/<Database>
```
#### Ocient DB
The recommended connector library for Ocient is [sqlalchemy-ocient](https://pypi.org/project/sqlalchemy-ocient).
##### Install the Ocient Driver
```bash
```
pip install sqlalchemy-ocient
```
@@ -1060,7 +1093,7 @@ parseable://admin:admin@demo.parseable.com:443/ingress-nginx
Note: The `stream_name` in the URI represents the Parseable logstream you want to query. You can use both HTTP (port 80) and HTTPS (port 443) connections.
#### Apache Pinot
The recommended connector library for Apache Pinot is [pinotdb](https://pypi.org/project/pinotdb/).
@@ -1079,8 +1112,7 @@ pinot://<username>:<password>@<pinot-broker-host>:<pinot-broker-port>/query/sql?
If you want to use the Explore view or joins, window functions, etc., then enable the [multi-stage query engine](https://docs.pinot.apache.org/reference/multi-stage-engine).
Add the argument below while creating the database connection in Advanced -> Other -> ENGINE PARAMETERS:
```json
```
{"connect_args":{"use_multistage_engine":"true"}}
```
@@ -1120,6 +1152,7 @@ More information about PostgreSQL connection options can be found in the
and the
[PostgreSQL docs](https://www.postgresql.org/docs/9.1/libpq-connect.html#LIBPQ-PQCONNECTDBPARAMS).
#### Presto
The [pyhive](https://pypi.org/project/PyHive/) library is the recommended way to connect to Presto through SQLAlchemy.
@@ -1145,7 +1178,7 @@ presto://datascientist:securepassword@presto.example.com:8080/hive
By default Superset assumes the most recent version of Presto is being used when querying the
datasource. If you're using an older version of Presto, you can configure it in the extra parameter:
```json
```
{
"version": "0.123"
}
@@ -1153,7 +1186,7 @@ datasource. If youre using an older version of Presto, you can configure it i
To enable SSL, add the following JSON config to the Secure Extra connection field:
```json
```
{
"connect_args":
{"protocol": "https",
@@ -1162,6 +1195,8 @@ SSL Secure extra add json config to extra connection information.
}
```
#### RisingWave
The recommended connector library for RisingWave is
@@ -1173,6 +1208,27 @@ The expected connection string is formatted as follows:
risingwave://root@{hostname}:{port}/{database}?sslmode=disable
```
#### Rockset
The connection string for Rockset is:
```
rockset://{api key}:@{api server}
```
Get your API key from the [Rockset console](https://console.rockset.com/apikeys).
Find your API server from the [API reference](https://rockset.com/docs/rest-api/#introduction). Omit the `https://` portion of the URL.
To target a specific virtual instance, use this URI format:
```
rockset://{api key}:@{api server}/{VI ID}
```
For more complete instructions, we recommend the [Rockset documentation](https://docs.rockset.com/apache-superset/).
#### Snowflake
##### Install Snowflake Driver
@@ -1180,7 +1236,7 @@ risingwave://root@{hostname}:{port}/{database}?sslmode=disable
Follow the steps [here](/docs/configuration/databases#installing-database-drivers) about how to
install new database drivers when setting up Superset locally via docker compose.
```bash
echo "snowflake-sqlalchemy" >> ./docker/requirements-local.txt
```
@@ -1213,7 +1269,7 @@ To connect Snowflake with Key Pair Authentication, you need to add the following
***Please note that you need to merge the multi-line private key content into one line and insert `\n` between each line.***
```json
{
"auth_method": "keypair",
"auth_params": {
@@ -1225,7 +1281,7 @@ To connect Snowflake with Key Pair Authentication, you need to add the following
If your private key is stored on the server, you can replace `privatekey_body` with `privatekey_path` in the parameters.
```json
{
"auth_method": "keypair",
"auth_params": {
@@ -1246,6 +1302,7 @@ The connection string for Solr looks like this:
solr://{username}:{password}@{host}:{port}/{server_path}/{collection}[/?use_ssl=true|false]
```
#### Apache Spark SQL
The recommended connector library for Apache Spark SQL is [pyhive](https://pypi.org/project/PyHive/).
@@ -1263,34 +1320,16 @@ The recommended connector library for SQL Server is [pymssql](https://github.com
The connection string for SQL Server looks like this:
```
mssql+pymssql://<Username>:<Password>@<Host>:<Port-default:1433>/<Database Name>
mssql+pymssql://<Username>:<Password>@<Host>:<Port-default:1433>/<Database Name>/?Encrypt=yes
```
It is also possible to connect using [pyodbc](https://pypi.org/project/pyodbc) with the parameter [odbc_connect](https://docs.sqlalchemy.org/en/14/dialects/mssql.html#pass-through-exact-pyodbc-string)
The connection string for SQL Server looks like this:
```
mssql+pyodbc:///?odbc_connect=Driver%3D%7BODBC+Driver+17+for+SQL+Server%7D%3BServer%3Dtcp%3A%3Cmy_server%3E%2C1433%3BDatabase%3Dmy_database%3BUid%3Dmy_user_name%3BPwd%3Dmy_password%3BEncrypt%3Dyes%3BConnection+Timeout%3D30
```
:::note
You might have noticed that some special characters are used in the above connection string. For example, see the `odbc_connect` parameter: the value `Driver%3D%7BODBC+Driver+17+for+SQL+Server%7D%3B` is the URL-encoded form of `Driver={ODBC+Driver+17+for+SQL+Server};`. It's important that the connection string is URL-encoded.
For more information, check the [SQLAlchemy documentation](https://docs.sqlalchemy.org/en/20/core/engines.html#escaping-special-characters-such-as-signs-in-passwords), which says: `When constructing a fully formed URL string to pass to create_engine(), special characters such as those that may be used in the user and password need to be URL encoded to be parsed correctly. This includes the @ sign.`
:::
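For illustration, here is a minimal Python sketch (not from the Superset docs; the server, database, and credential values are placeholders) of producing that URL-encoded `odbc_connect` value with the standard library:
```python
# Build a URL-encoded odbc_connect value for a mssql+pyodbc SQLAlchemy URI.
# All connection details below are placeholders.
from urllib.parse import quote_plus

odbc_str = (
    "Driver={ODBC Driver 17 for SQL Server};"
    "Server=tcp:<my_server>,1433;"
    "Database=my_database;"
    "Uid=my_user_name;"
    "Pwd=my_password;"
    "Encrypt=yes;"
    "Connection Timeout=30"
)
sqlalchemy_uri = "mssql+pyodbc:///?odbc_connect=" + quote_plus(odbc_str)
print(sqlalchemy_uri)
```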
#### SingleStore
The recommended connector library for SingleStore is
[sqlalchemy-singlestoredb](https://github.com/singlestore-labs/sqlalchemy-singlestoredb).
The expected connection string is formatted as follows:
```
singlestoredb://{username}:{password}@{host}:{port}/{database}
```
#### StarRocks
The [sqlalchemy-starrocks](https://pypi.org/project/starrocks/) library is the recommended
@@ -1315,24 +1354,6 @@ starrocks://<User>:<Password>@<Host>:<Port>/<Catalog>.<Database>
StarRocks maintains their Superset documentation [here](https://docs.starrocks.io/docs/integrations/BI_integrations/Superset/).
:::
#### TDengine
[TDengine](https://www.tdengine.com) is a High-Performance, Scalable Time-Series Database for Industrial IoT that provides a SQL-like query interface.
The recommended connector libraries for TDengine are [taospy](https://pypi.org/project/taospy/) and [taos-ws-py](https://pypi.org/project/taos-ws-py/).
The expected connection string is formatted as follows:
```
taosws://<user>:<password>@<host>:<port>
```
For example:
```
taosws://root:taosdata@127.0.0.1:6041
```
#### Teradata
The recommended connector library is
@@ -1354,7 +1375,7 @@ here: https://downloads.teradata.com/download/connectivity/odbc-driver/linux
Here are the required environment variables:
```bash
export ODBCINI=/.../teradata/client/ODBC_64/odbc.ini
export ODBCINST=/.../teradata/client/ODBC_64/odbcinst.ini
```
@@ -1363,8 +1384,8 @@ We recommend using the first library because of the
lack of requirement around ODBC drivers and
because it's more regularly updated.
#### TimescaleDB
[TimescaleDB](https://www.timescale.com) is the open-source relational database for time-series and analytics to build powerful data-intensive applications.
TimescaleDB is a PostgreSQL extension, and you can use the standard PostgreSQL connector library, [psycopg2](https://www.psycopg.org/docs/), to connect to the database.
@@ -1396,38 +1417,31 @@ postgresql://{username}:{password}@{host}:{port}/{database name}?sslmode=require
[Learn more about TimescaleDB!](https://docs.timescale.com/)
#### Trino
Superset supports Trino version 352 and higher.
##### Connection String
The connection string format is as follows:
```
trino://{username}:{password}@{hostname}:{port}/{catalog}
```
If you are running Trino with Docker on your local machine, use the following connection URL:
```
trino://trino@host.docker.internal:8080
```
##### Authentications
###### 1. Basic Authentication
You can provide `username`/`password` in the connection string or in the `Secure Extra` field at `Advanced / Security`
- In Connection String
```
trino://{username}:{password}@{hostname}:{port}/{catalog}
```
- In `Secure Extra` field
```json
{
"auth_method": "basic",
@@ -1441,9 +1455,7 @@ You can provide `username`/`password` in the connection string or in the `Secure
NOTE: if both are provided, `Secure Extra` always takes priority.
###### 2. Kerberos Authentication
In the `Secure Extra` field, configure as in the following example:
```json
{
"auth_method": "kerberos",
@@ -1460,9 +1472,7 @@ All fields in `auth_params` are passed directly to the [`KerberosAuthentication`
NOTE: Kerberos authentication requires installing the [`trino-python-client`](https://github.com/trinodb/trino-python-client) locally with either the `all` or `kerberos` optional features, i.e., installing `trino[all]` or `trino[kerberos]` respectively.
###### 3. Certificate Authentication
In the `Secure Extra` field, configure as in the following example:
```json
{
"auth_method": "certificate",
@@ -1476,9 +1486,7 @@ In `Secure Extra` field, config as following example:
All fields in `auth_params` are passed directly to the [`CertificateAuthentication`](https://github.com/trinodb/trino-python-client/blob/0.315.0/trino/auth.py#L416) class.
###### 4. JWT Authentication
Set `auth_method` and provide the token in the `Secure Extra` field:
```json
{
"auth_method": "jwt",
@@ -1489,10 +1497,8 @@ Config `auth_method` and provide token in `Secure Extra` field
```
###### 5. Custom Authentication
To use custom authentication, first you need to add it to the
`ALLOWED_EXTRA_AUTHENTICATIONS` allow list in your Superset config file:
```python
from your.module import AuthClass
from another.extra import auth_method
@@ -1506,7 +1512,6 @@ ALLOWED_EXTRA_AUTHENTICATIONS: Dict[str, Dict[str, Callable[..., Any]]] = {
```
Then in `Secure Extra` field:
```json
{
"auth_method": "custom_auth",
@@ -1522,8 +1527,8 @@ or factory function (which returns an `Authentication` instance) to `auth_method
All fields in `auth_params` are passed directly to your class/function.
**Reference**:
- [Trino-Superset-Podcast](https://trino.io/episodes/12.html)
#### Vertica
@@ -1550,6 +1555,8 @@ Other parameters:
- Load Balancer - Backup Host
#### YDB
The recommended connector library for [YDB](https://ydb.tech/) is
@@ -1564,7 +1571,6 @@ ydb://{host}:{port}/{database_name}
```
##### Protocol
You can specify `protocol` in the `Secure Extra` field at `Advanced / Security`:
```
@@ -1575,10 +1581,9 @@ You can specify `protocol` in the `Secure Extra` field at `Advanced / Security`:
Default is `grpc`.
##### Authentication Methods
###### Static Credentials
To use `Static Credentials` you should provide `username`/`password` in the `Secure Extra` field at `Advanced / Security`:
```
@@ -1590,8 +1595,8 @@ To use `Static Credentials` you should provide `username`/`password` in the `Sec
}
```
###### Access Token Credentials
To use `Access Token Credentials` you should provide `token` in the `Secure Extra` field at `Advanced / Security`:
```
@@ -1602,8 +1607,8 @@ To use `Access Token Credentials` you should provide `token` in the `Secure Extr
}
```
##### Service Account Credentials
To use Service Account Credentials, you should provide `service_account_json` in the `Secure Extra` field at `Advanced / Security`:
```
@@ -1621,6 +1626,8 @@ To use Service Account Credentials, you should provide `service_account_json` in
}
```
#### YugabyteDB
[YugabyteDB](https://www.yugabyte.com/) is a distributed SQL database built on top of PostgreSQL.
@@ -1635,6 +1642,8 @@ The connection string looks like:
postgresql://{username}:{password}@{host}:{port}/{database}
```
## Connecting through the UI
Here is the documentation on how to leverage the new DB Connection UI. This will provide admins the ability to enhance the UX for users who want to connect to new databases.
@@ -1707,6 +1716,9 @@ For databases like MySQL and Postgres that use the standard format of `engine+dr
For other databases you need to implement these methods yourself. The BigQuery DB engine spec is a good example of how to do that.
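As a rough illustration of that standard format (plain SQLAlchemy, not Superset code; the driver name and connection details are placeholders), such a URI can be assembled with SQLAlchemy's `URL` helper:
```python
# Assemble an engine+driver://user:password@host:port/dbname URI with SQLAlchemy.
# Driver name and connection details are placeholders.
from sqlalchemy.engine import URL

uri = URL.create(
    drivername="mysql+mysqldb",
    username="user",
    password="password",
    host="db.example.com",
    port=3306,
    database="analytics",
)
print(repr(uri))  # the password is masked in the repr output
```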
### Extra Database Settings
##### Deeper SQLAlchemy Integration
@@ -1770,7 +1782,9 @@ You can use the `Extra` field in the **Edit Databases** form to configure SSL:
}
```
## Misc
### Querying across databases

View File

@@ -51,7 +51,6 @@ if desired. Most endpoints hit are logged as
well as key events like query start and end in SQL Lab.
To set up StatsD logging, it's a matter of configuring the logger in your `superset_config.py`.
If not already present, ensure that the `statsd` package is installed in Superset's Python environment.
```python
from superset.stats_logger import StatsdStatsLogger
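# The snippet is truncated in this view; a likely completion (assumed, with
# illustrative host/port/prefix values) registers the logger in superset_config.py:
STATS_LOGGER = StatsdStatsLogger(host="localhost", port=8125, prefix="superset")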

View File

@@ -10,7 +10,7 @@ version: 1
The Superset CLI allows you to import and export datasources from and to YAML. Datasources include
databases. The data is expected to be organized in the following hierarchy:
```text
├──databases
| ├──database_1
| | ├──table_1
@@ -30,13 +30,13 @@ databases. The data is expected to be organized in the following hierarchy:
You can print your current datasources to stdout by running:
```bash
superset export_datasources
```
To save your datasources to a ZIP file run:
```bash
superset export_datasources -f <filename>
```
@@ -55,7 +55,7 @@ Alternatively, you can export datasources using the UI:
In order to obtain an **exhaustive list of all fields** you can import using the YAML import, run:
```bash
superset export_datasource_schema
```
@@ -65,13 +65,13 @@ As a reminder, you can use the `-b` flag to include back references.
In order to import datasources from a ZIP file, run:
```bash
superset import_datasources -p <path / filename>
```
The optional username flag **-u** sets the user used for the datasource import. The default is 'admin'. Example:
```bash
superset import_datasources -p <path / filename> -u 'admin'
```
@@ -81,7 +81,7 @@ superset import_datasources -p <path / filename> -u 'admin'
When using Superset version 4.x.x to import from an older version (2.x.x or 3.x.x), importing is supported via the command `legacy_import_datasources`, which expects a JSON file or a directory of JSONs. The options are `-r` for recursive and `-u` for specifying a user. Example of legacy import without options:
```bash
superset legacy_import_datasources -p <path or filename>
```
@@ -89,21 +89,21 @@ superset legacy_import_datasources -p <path or filename>
When using an older Superset version (2.x.x & 3.x.x), the command is `import_datasources`. ZIP and YAML files are supported; to switch between them, the feature flag `VERSIONED_EXPORT` is used. When `VERSIONED_EXPORT` is `True`, `import_datasources` expects a ZIP file, otherwise YAML. Example:
```bash
superset import_datasources -p <path or filename>
```
When `VERSIONED_EXPORT` is `False`, if you supply a path all files ending with **yaml** or **yml** will be parsed. You can apply
additional flags (e.g. to search the supplied path recursively):
```bash
superset import_datasources -p <path> -r
```
The sync flag **-s** takes parameters in order to sync the supplied elements with your file. Be
careful: this can delete the contents of your meta database. Example:
```bash
superset import_datasources -p <path / filename> -s columns,metrics
```
@@ -115,7 +115,7 @@ If you dont supply the sync flag (**-s**) importing will only add and update
E.g. you can add a `verbose_name` to the column `ds` in the table `random_time_series` from the example
datasets by saving the following YAML to a file and then running the **import_datasources** command.
```yaml
databases:
- database_name: main
tables:

View File

@@ -1,78 +0,0 @@
---
title: Map Tiles
sidebar_position: 12
version: 1
---
# Map tiles
Superset uses OSM and Mapbox tiles by default. OSM is free, but you still need to set your MAPBOX_API_KEY if you want to use Mapbox maps.
## Setting map tiles
Map tiles can be set with `DECKGL_BASE_MAP` in your `superset_config.py` or `superset_config_docker.py`.
To add your own map tiles, use the following format:
```python
DECKGL_BASE_MAP = [
['tile://https://your_personal_url/{z}/{x}/{y}.png', 'MyTile']
]
```
OpenStreetMap tile URLs can be added without the `tile://` prefix:
```python
DECKGL_BASE_MAP = [
['https://c.tile.openstreetmap.org/{z}/{x}/{y}.png', 'OpenStreetMap']
]
```
Default values are:
```python
DECKGL_BASE_MAP = [
['https://tile.openstreetmap.org/{z}/{x}/{y}.png', 'Streets (OSM)'],
['https://tile.osm.ch/osm-swiss-style/{z}/{x}/{y}.png', 'Topography (OSM)'],
['mapbox://styles/mapbox/streets-v9', 'Streets'],
['mapbox://styles/mapbox/dark-v9', 'Dark'],
['mapbox://styles/mapbox/light-v9', 'Light'],
['mapbox://styles/mapbox/satellite-streets-v9', 'Satellite Streets'],
['mapbox://styles/mapbox/satellite-v9', 'Satellite'],
['mapbox://styles/mapbox/outdoors-v9', 'Outdoors'],
]
```
It is possible to use only Mapbox tiles by removing the OSM entries, and vice versa.
:::warning
Setting `DECKGL_BASE_MAP` overwrites the default values.
:::
After defining your map tiles, also add them to these variables:
- `CORS_OPTIONS`
- the `connect-src` section of the `TALISMAN_CONFIG` and `TALISMAN_CONFIG_DEV` variables
```python
ENABLE_CORS = True
CORS_OPTIONS: dict[Any, Any] = {
"origins": [
"https://tile.openstreetmap.org",
"https://tile.osm.ch",
"https://your_personal_url/{z}/{x}/{y}.png",
]
}
...
TALISMAN_CONFIG = {
"content_security_policy": {
...
"connect-src": [
"'self'",
"https://api.mapbox.com",
"https://events.mapbox.com",
"https://tile.openstreetmap.org",
"https://tile.osm.ch",
"https://your_personal_url/{z}/{x}/{y}.png",
],
...
}
```

View File

@@ -8,11 +8,11 @@ version: 1
## CORS
To configure CORS, or cross-origin resource sharing, the following dependency must be installed:
:::note
In Superset versions prior to `5.x` you have to install `flask-cors` with `pip install flask-cors` to enable CORS support.
:::
```bash
pip install apache-superset[cors]
```
The following keys in `superset_config.py` can be specified to configure CORS:
@@ -20,12 +20,14 @@ The following keys in `superset_config.py` can be specified to configure CORS:
- `CORS_OPTIONS`: options passed to Flask-CORS
([documentation](https://flask-cors.readthedocs.io/en/latest/api.html#extension))
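For example, a minimal sketch in `superset_config.py` (the origin is a placeholder; `supports_credentials` is one of the Flask-CORS options documented above):
```python
# superset_config.py: minimal CORS sketch; adjust the origins for your deployment.
ENABLE_CORS = True
CORS_OPTIONS = {
    "supports_credentials": True,
    "origins": ["https://dashboards.example.com"],
}
```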
## HTTP headers
Note that Superset bundles [flask-talisman](https://pypi.org/project/talisman/),
self-described as a small Flask extension that handles setting HTTP headers that can help
protect against a few common web application security issues.
## HTML Embedding of Dashboards and Charts
There are two ways to embed a dashboard: Using the [SDK](https://www.npmjs.com/package/@superset-ui/embedded-sdk) or embedding a direct link. Note that in the latter case everybody who knows the link is able to access the dashboard.
@@ -37,16 +39,14 @@ This works by first changing the content security policy (CSP) of [flask-talisma
#### Changing flask-talisman CSP
Add to `superset_config.py` the entire `TALISMAN_CONFIG` section from `config.py` and include a `frame-ancestors` section:
```python
TALISMAN_ENABLED = True
TALISMAN_CONFIG = {
"content_security_policy": {
...
"frame-ancestors": ["*.my-domain.com", "*.another-domain.com"],
"frame-ancestors": ["*.my-domain.com", "*.another-domain.com"],
...
```
Restart Superset for this configuration change to take effect.
#### Making a Dashboard Public
@@ -69,7 +69,6 @@ Now anybody can directly access the dashboard's URL. You can embed it in an ifra
>
</iframe>
```
#### Embedding a Chart
A chart's embed code can be generated by going to a chart's edit view and then clicking at the top right on `...` > `Share` > `Embed code`
@@ -86,10 +85,11 @@ SUPERSET_FEATURE_EMBEDDED_SUPERSET=true
## CSRF settings
Similarly, [flask-wtf](https://flask-wtf.readthedocs.io/en/0.15.x/config/) is used to manage
some CSRF configurations. If you need to exempt endpoints from CSRF (e.g. if you are
running a custom auth postback endpoint), you can add the endpoints to `WTF_CSRF_EXEMPT_LIST`:
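For example, a minimal sketch (the dotted endpoint path is a hypothetical placeholder for your own view):
```python
# superset_config.py: exempt a custom endpoint from CSRF protection.
# The endpoint below is a placeholder, not a real Superset view.
WTF_CSRF_EXEMPT_LIST = ["myapp.views.auth_postback"]
```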
## SSH Tunneling
1. Turn on the feature flag (see the sketch after this list)
@@ -105,6 +105,7 @@ running a custom auth postback endpoint), you can add the endpoints to `WTF_CSRF
3. Verify data is flowing
- Once SSH tunneling has been enabled, go to SQL Lab and write a query to verify data is properly flowing.
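For step 1, a minimal sketch of turning on the flag in `superset_config.py` (the feature flag is named `SSH_TUNNELING`):
```python
# superset_config.py: enable the SSH tunneling feature flag.
FEATURE_FLAGS = {
    "SSH_TUNNELING": True,
}
```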
## Domain Sharding
:::note
@@ -138,4 +139,4 @@ of your additional middleware classes.
For example, to use `AUTH_REMOTE_USER` from behind a proxy server like nginx, you have to add a
simple middleware class to add the value of `HTTP_X_PROXY_REMOTE_USER` (or any other custom header
from the proxy) to Gunicorn's `REMOTE_USER` environment variable:
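A minimal WSGI middleware sketch (illustrative, not the exact documented snippet; the class name is arbitrary and the header matches the prose above):
```python
# superset_config.py: copy the proxy-supplied header into REMOTE_USER so that
# AUTH_REMOTE_USER can pick it up.
class RemoteUserMiddleware:
    def __init__(self, app):
        self.app = app

    def __call__(self, environ, start_response):
        user = environ.pop("HTTP_X_PROXY_REMOTE_USER", None)
        environ["REMOTE_USER"] = user
        return self.app(environ, start_response)

ADDITIONAL_MIDDLEWARE = [RemoteUserMiddleware]
```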

View File

@@ -0,0 +1,6 @@
---
title: Setup SSH Tunneling
hide_title: true
sidebar_position: 8
version: 1
---

View File

@@ -77,7 +77,6 @@ In the UI you can assign a set of parameters as JSON
"my_table": "foo"
}
```
The parameters become available in your SQL (example: `SELECT * FROM {{ my_table }}`) by using Jinja templating syntax.
SQL Lab template parameters are stored with the dataset as `TEMPLATE PARAMETERS`.
@@ -104,6 +103,7 @@ GROUP BY action
Note ``_filters`` is not stored with the dataset. It's only used within the SQL Lab UI.
Besides default Jinja templating, SQL Lab also supports self-defined template processors by setting
the `CUSTOM_TEMPLATE_PROCESSORS` in your Superset configuration. The values in this dictionary
overwrite the default Jinja template processors of the specified database engine. The example below
@@ -186,7 +186,7 @@ cache hit in the future and Superset can retrieve cached data.
You can disable the inclusion of the `username` value in the calculation of the
cache key by adding the following parameter to your Jinja code:
```python
{{ current_username(add_to_cache_keys=False) }}
```
@@ -201,7 +201,7 @@ cache hit in the future and Superset can retrieve cached data.
You can disable the inclusion of the account `id` value in the calculation of the
cache key by adding the following parameter to your Jinja code:
```python
{{ current_user_id(add_to_cache_keys=False) }}
```
@@ -216,48 +216,10 @@ cache hit in the future and Superset can retrieve cached data.
You can disable the inclusion of the email value in the calculation of the
cache key by adding the following parameter to your Jinja code:
```python
{{ current_user_email(add_to_cache_keys=False) }}
```
**Current User Roles**
The `{{ current_user_roles() }}` macro returns an array of roles for the logged in user.
If you have caching enabled in your Superset configuration, then by default the roles value will be used
by Superset when calculating the cache key. A cache key is a unique identifier that determines if there's a
cache hit in the future and Superset can retrieve cached data.
You can disable the inclusion of the roles value in the calculation of the
cache key by adding the following parameter to your Jinja code:
```python
{{ current_user_roles(add_to_cache_keys=False) }}
```
You can json-stringify the array by adding `|tojson` to your Jinja code:
```python
{{ current_user_roles()|tojson }}
```
You can use the `|where_in` filter to use your roles in a SQL statement. For example, if `current_user_roles()` returns `['admin', 'viewer']`, the following template:
```python
SELECT * FROM users WHERE role IN {{ current_user_roles()|where_in }}
```
Will be rendered as:
```sql
SELECT * FROM users WHERE role IN ('admin', 'viewer')
```
**Current User RLS Rules**
The `{{ current_user_rls_rules() }}` macro returns an array of RLS rules applied to the current dataset for the logged in user.
If you have caching enabled in your Superset configuration, then the list of RLS Rules will be used
by Superset when calculating the cache key. A cache key is a unique identifier that determines if there's a
cache hit in the future and Superset can retrieve cached data.
**Custom URL Parameters**
The `{{ url_param('custom_variable') }}` macro lets you define arbitrary URL
@@ -311,7 +273,7 @@ You can retrieve the value for a specific filter as a list using `{{ filter_valu
This is useful if:
- You want to use a filter component to filter a query where the name of filter component column doesn't match the one in the select statement
- You want to have the ability to filter inside the main query for performance purposes
Here's a concrete example:
@@ -339,7 +301,7 @@ This is useful if:
Here's a concrete example:
```sql
WITH RECURSIVE
superiors(employee_id, manager_id, full_name, level, lineage) AS (
SELECT
@@ -395,7 +357,6 @@ considerably improve performance, as many databases and query engines are able t
if the temporal filter is placed on the inner query, as opposed to the outer query.
The macro takes the following parameters:
- `column`: Name of the temporal column. Leave undefined to reference the time range from a Dashboard Native Time Range
filter (when present).
- `default`: The default value to fall back to if the time filter is not present, or has the value `No filter`
@@ -409,7 +370,6 @@ The macro takes the following parameters:
filter should only apply to the inner query.
The return type has the following properties:
- `from_expr`: the start of the time filter (if any)
- `to_expr`: the end of the time filter (if any)
- `time_range`: The applied time range
@@ -450,7 +410,6 @@ LIMIT 1000;
When using the `default` parameter, the templated query can be simplified, as the endpoints will always be defined
(to use a fixed time range, you can also use something like `default="2024-08-27 : 2024-09-03"`)
```
{% set time_filter = get_time_filter("dttm", default="Last week", remove_filter=True) %}
SELECT
@@ -470,19 +429,19 @@ To use the macro, first you need to find the ID of the dataset. This can be done
Once you have the ID you can query it as if it were a table:
```sql
SELECT * FROM {{ dataset(42) }} LIMIT 10
```
If you want to select the metric definitions as well, in addition to the columns, you need to pass an additional keyword argument:
```sql
SELECT * FROM {{ dataset(42, include_metrics=True) }} LIMIT 10
```
Since metrics are aggregations, the resulting SQL expression will be grouped by all non-metric columns. You can specify a subset of columns to group by instead:
```sql
SELECT * FROM {{ dataset(42, include_metrics=True, columns=["ds", "category"]) }} LIMIT 10
```
@@ -499,37 +458,3 @@ This macro avoids copy/paste, allowing users to centralize the metric definition
The `dataset_id` parameter is optional, and if not provided Superset will use the current dataset from context (for example, when using this macro in the Chart Builder, by default the `macro_key` will be searched in the dataset powering the chart).
The parameter can be used in SQL Lab, or when fetching a metric from another dataset.
## Available Filters
Superset supports [builtin filters from the Jinja2 templating package](https://jinja.palletsprojects.com/en/stable/templates/#builtin-filters). Custom filters have also been implemented:
**Where In**
Parses a list into a SQL-compatible statement. This is useful with macros that return an array (for example the `filter_values` macro):
```
Dashboard filter with "First", "Second" and "Third" options selected
{{ filter_values('column') }} => ["First", "Second", "Third"]
{{ filter_values('column')|where_in }} => ('First', 'Second', 'Third')
```
By default, this filter returns `()` (as a string) in case the value is null. The `default_to_none` parameter can be set to `True` to return null in this case:
```
Dashboard filter without any value applied
{{ filter_values('column') }} => ()
{{ filter_values('column')|where_in(default_to_none=True) }} => None
```
**To Datetime**
Loads a string as a `datetime` object. This is useful when performing date operations. For example:
```
{% set from_expr = get_time_filter("dttm", strftime="%Y-%m-%d").from_expr %}
{% set to_expr = get_time_filter("dttm", strftime="%Y-%m-%d").to_expr %}
{% if (to_expr|to_datetime(format="%Y-%m-%d") - from_expr|to_datetime(format="%Y-%m-%d")).days > 100 %}
do something
{% else %}
do something else
{% endif %}
```

View File

@@ -1,53 +0,0 @@
---
title: Theming
hide_title: true
sidebar_position: 12
version: 1
---
# Theming Superset
:::note
apache-superset>=6.0
:::
Superset now rides on **Ant Design v5's token-based theming**.
Every Antd token works, plus a handful of Superset-specific ones for charts and dashboard chrome.
## 1 — Create a theme
1. Open the official [Ant Design Theme Editor](https://ant.design/theme-editor)
2. Design your palette, typography, and component overrides.
3. Open the `CONFIG` modal and paste the JSON.
You can also extend with Superset-specific tokens (documented in the default theme object) before you import.
## 2 — Apply it instance-wide
```python
# superset_config.py
THEME = {
# Paste your JSON theme definition here
}
```
Restart Superset to apply the changes.
## 3 — Tweak live in the app (beta)
Set the feature flag in your `superset_config`
```python
DEFAULT_FEATURE_FLAGS: dict[str, bool] = {
    # ...
    "THEME_ALLOW_THEME_EDITOR_BETA": True,
}
```
- Enables a JSON editor panel inside Superset as a new icon in the navbar
- Intended for testing/design and rapid in-context iteration
- End-user theme switching & preferences coming later
## 4 — Potential Next Steps
- CRUD UI for managing multiple themes
- Per-dashboard & per-workspace theme assignment
- User-selectable theme preferences

View File

@@ -24,7 +24,7 @@ The challenge however lies with the slew of [database engines](/docs/configurati
For example the following is a comparison of MySQL and Presto,
```python
import pandas as pd
from sqlalchemy import create_engine
@@ -41,7 +41,7 @@ pd.read_sql_query(
which outputs `{"ts":{"0":1640995200000}}` (which infers the UTC timezone per the Epoch time definition) and `{"ts":{"0":"2022-01-01 00:00:00.000"}}` (without an explicit timezone) respectively, and thus they are treated differently in JavaScript:
```js
new Date(1640995200000)
> Sat Jan 01 2022 13:00:00 GMT+1300 (New Zealand Daylight Time)

View File

@@ -26,9 +26,9 @@ More references:
Here's a list of repositories that contain Superset-related packages:
- [apache/superset](https://github.com/apache/superset)
is the main repository containing the `apache-superset` Python package
distributed on
[pypi](https://pypi.org/project/apache-superset/). This repository
also includes Superset's main TypeScript/JavaScript bundles and react apps under
the [superset-frontend](https://github.com/apache/superset/tree/master/superset-frontend)
folder.

View File

@@ -52,7 +52,7 @@ Note that:
[docker-compose.yml](https://github.com/apache/superset/blob/master/docker-compose.yml)
- The local repository is mounted within the services, meaning updating
the code on the host will be reflected in the docker images
- Superset is served at localhost:9000/
- You can login with admin/admin
:::note
@@ -72,13 +72,13 @@ documentation.
configured to be secure.
:::
### Supported environment variables
Affecting the Docker build process:
- **SUPERSET_BUILD_TARGET (default=dev):** which --target to build, either `lean` or `dev` are commonly used
- **INCLUDE_FIREFOX (default=false):** whether to include the Firefox headless browser in the build
- **INCLUDE_CHROMIUM (default=false):** whether to include the Chromium headless browser in the build
- **BUILD_TRANSLATIONS (default=false):** whether to compile the translations from the .po files available
- **SUPERSET_LOAD_EXAMPLES (default=yes):** whether to load the examples into the database upon startup,
save some precious time on startup by `SUPERSET_LOAD_EXAMPLES=no docker compose up`
@@ -90,7 +90,6 @@ For more env vars that affect your configuration, see this
used in the `docker compose` context to assign env vars to the superset configuration.
### Accessing the postgres database
Sometimes it's useful to access the database in the docker container directly.
You can enter a `psql` shell (the official Postgres client) by running the following command:
@@ -194,48 +193,6 @@ You can also run the pre-commit checks manually in various ways:
Replace `<hook_id>` with the ID of the specific hook you want to run. You can find the list
of available hooks in the `.pre-commit-config.yaml` file.
## Working with LLMs
### Environment Setup
Ensure Docker Compose is running before starting LLM sessions:
```bash
docker compose up
```
Validate your environment:
```bash
curl -f http://localhost:8088/health && echo "✅ Superset ready"
```
### LLM Session Best Practices
- Always validate environment setup first using the health checks above
- Use focused validation commands: `pre-commit run` (not `--all-files`)
- **Read [LLMS.md](https://github.com/apache/superset/blob/master/LLMS.md) first** - Contains comprehensive development guidelines, coding standards, and critical refactor information
- **Check platform-specific files** when available:
- `CLAUDE.md` - For Claude/Anthropic tools
- `CURSOR.md` - For Cursor editor
- `GEMINI.md` - For Google Gemini tools
- `GPT.md` - For OpenAI/ChatGPT tools
- Follow the TypeScript migration guidelines and avoid deprecated patterns listed in LLMS.md
### Key Development Commands
```bash
# Frontend development
cd superset-frontend
npm run dev # Development server on http://localhost:9000
npm run test # Run all tests
npm run test -- filename.test.tsx # Run single test file
npm run lint # Linting and type checking
# Backend validation
pre-commit run mypy # Type checking
pytest # Run all tests
pytest tests/unit_tests/specific_test.py # Run single test file
pytest tests/unit_tests/ # Run all tests in directory
```
For detailed development context, environment setup, and coding guidelines, see [LLMS.md](https://github.com/apache/superset/blob/master/LLMS.md).
## Alternatives to `docker compose`
:::caution
@@ -312,22 +269,22 @@ If you have made changes to the FAB-managed templates, which are not built the s
If you add a new requirement or update an existing requirement (per the `install_requires` section in `setup.py`) you must recompile (freeze) the Python dependencies to ensure that for CI, testing, etc. the build is deterministic. This can be achieved via,
```bash
python3 -m venv venv
source venv/bin/activate
python3 -m pip install -r requirements/development.txt
./scripts/uv-pip-compile.sh
```
When upgrading the version number of a single package, you should run `./scripts/uv-pip-compile.sh` with the `-P` flag:
```bash
./scripts/uv-pip-compile.sh -P some-package-to-upgrade
```
To bring all dependencies up to date as per the restrictions defined in `setup.py` and `requirements/*.in`, run `./scripts/uv-pip-compile.sh --upgrade`
```bash
./scripts/uv-pip-compile.sh --upgrade
```
This should be done periodically, but it is recommended to do thorough manual testing of the application to ensure no breaking changes have been introduced that aren't caught by the unit and integration tests.
@@ -548,10 +505,12 @@ pre-commit install
A series of checks will now run when you make a git commit.
## Linting
See [how tos](/docs/contributing/howtos#linting)
## GitHub Actions and `act`
:::tip
@@ -564,7 +523,6 @@ For more targetted iteration, see the `gh workflow run --ref {BRANCH}` subcomman
For automation and CI/CD, Superset makes extensive use of GitHub Actions (GHA). You
can find all of the workflows and other assets under the `.github/` folder. This includes:
- running the backend unit test suites (`tests/`)
- running the frontend test suites (`superset-frontend/src/**.*.test.*`)
- running our Cypress end-to-end tests (`superset-frontend/cypress-base/`)
@@ -606,7 +564,6 @@ act pull_request --job {workflow_name} --secret GITHUB_TOKEN=$GITHUB_TOKEN --con
```
In the example above, notice that:
- we target a specific workflow, using `--job`
- we pass a secret using `--secret`, as many jobs require read access (public) to the repo
- we simulate a `pull_request` event by specifying it as the first arg,
@@ -656,6 +613,9 @@ act --job test-python-38 --secret GITHUB_TOKEN=$GITHUB_TOKEN --event pull_reques
There is also a utility script included in the Superset codebase to run Python integration tests. The [readme can be found here](https://github.com/apache/superset/tree/master/scripts/tests).
To run all integration tests, for example, run this script from the root directory:
```bash
@@ -825,7 +785,7 @@ To debug Flask running in POD inside a kubernetes cluster, you'll need to make s
add: ["SYS_PTRACE"]
```
See [set capabilities for a container](https://kubernetes.io/docs/tasks/configure-pod-container/security-context/#set-capabilities-for-a-container) for more details.
Once the pod is running as root and has the SYS_PTRACE capability it will be able to debug the Flask app.

View File

@@ -10,7 +10,7 @@ version: 1
The latest documentation and tutorial are available at https://superset.apache.org/.
The documentation site is built using [Docusaurus 3](https://docusaurus.io/), a modern
static website generator, the source for which resides in `./docs`.
### Local Development
@@ -223,9 +223,9 @@ To run a single test file:
npm run test -- path/to/file.js
```
### E2E Integration Testing
For E2E testing, we recommend that you use a `docker compose` backend
```bash
CYPRESS_CONFIG=true docker compose up --build
@@ -411,7 +411,7 @@ See [set capabilities for a container](https://kubernetes.io/docs/tasks/configur
Once the pod is running as root and has the `SYS_PTRACE` capability it will be able to debug the Flask app.
You can follow the same instructions as in `docker compose`. Enter the pod and install the required library and packages: gdb, netstat and debugpy.
Often in a Kubernetes environment nodes are not addressable from outside the cluster. VSCode will thus be unable to remotely connect to port 5678 on a Kubernetes node. In order to do this you need to create a tunnel that port forwards 5678 to your local machine.
@@ -608,27 +608,3 @@ If using the eslint extension with vscode, put the following in your workspace `
"superset-frontend"
]
```
## GitHub Ephemeral Environments
On any given pull request on GitHub, it's possible to create a temporary environment/deployment
by simply adding the label `testenv-up` to the PR. Once you add the `testenv-up` label, a
GitHub Action will be triggered that will:
- build a docker image
- deploy it in EC2 (sponsored by the folks at [Preset](https://preset.io))
- write a comment on the PR with a link to the ephemeral environment
For more advanced use cases, it's possible to set a feature flag on the PR body, which will
take effect on the ephemeral environment. For example, if you want to set the `TAGGING_SYSTEM`
feature flag to `true`, you can add the following line to the PR body/description:
```
FEATURE_TAGGING_SYSTEM=true
```
Similarly, it's possible to disable feature flags with:
```
FEATURE_TAGGING_SYSTEM=false
```

View File

@@ -3,7 +3,7 @@ sidebar_position: 6
version: 1
---
# Miscellaneous
## Reporting a Security Vulnerability
@@ -11,7 +11,7 @@ Please report security vulnerabilities to private@superset.apache.org.
In the event a community member discovers a security flaw in Superset, it is important to follow the [Apache Security Guidelines](https://www.apache.org/security/committers.html) and release a fix as quickly as possible before public disclosure. Reporting security vulnerabilities through the usual GitHub Issues channel is not ideal as it will publicize the flaw before a fix can be applied.
## SQL Lab Async
It's possible to configure a local database to operate in `async` mode,
to work on `async` related features.
@@ -46,7 +46,7 @@ Note that:
to your production environment, and use the similar broker as well as
results backend configuration
## Async Chart Queries
It's possible to configure database queries for charts to operate in `async` mode. This is especially useful for dashboards with many charts that may otherwise be affected by browser connection limits. To enable async queries for dashboards and Explore, the following dependencies are required:

View File

@@ -4,96 +4,10 @@ version: 1
---
import InteractiveSVG from '../../src/components/InteractiveERDSVG';
import Mermaid from '@theme/Mermaid';
# Resources
## High Level Architecture
<div style={{ maxWidth: "600px", margin: "0 auto", marginLeft: 0, marginRight: "auto" }}>
```mermaid
flowchart TD
%% Top Level
LB["<b>Load Balancer(s)</b><br/>(optional)"]
LB -.-> WebServers
%% Web Servers
subgraph WebServers ["<b>Web Server(s)</b>"]
WS1["<b>Frontend</b><br/>(React, AntD, ECharts, AGGrid)"]
WS2["<b>Backend</b><br/>(Python, Flask, SQLAlchemy, Pandas, ...)"]
end
%% Infra
subgraph InfraServices ["<b>Infra</b>"]
DB[("<b>Metadata Database</b><br/>(Postgres / MySQL)")]
subgraph Caching ["<b>Caching Subservices<br/></b>(Redis, memcache, S3, ...)"]
direction LR
DummySpace[" "]:::invisible
QueryCache["<b>Query Results Cache</b><br/>(Accelerated Dashboards)"]
CsvCache["<b>CSV Exports Cache</b>"]
ThumbnailCache["<b>Thumbnails Cache</b>"]
AlertImageCache["<b>Alert/Report Images Cache</b>"]
QueryCache -- " " --> CsvCache
linkStyle 1 stroke:transparent;
ThumbnailCache -- " " --> AlertImageCache
linkStyle 2 stroke:transparent;
end
Broker(("<b>Message Queue</b><br/>(Redis / RabbitMQ / SQS)"))
end
AsyncBackend["<b>Async Workers (Celery)</b><br>required for Alerts & Reports, thumbnails, CSV exports, long-running workloads, ..."]
%% External DBs
subgraph ExternalDatabases ["<b>Analytics Databases</b>"]
direction LR
BigQuery[(BigQuery)]
Snowflake[(Snowflake)]
Redshift[(Redshift)]
Postgres[(Postgres)]
Postgres[(... any ...)]
end
%% Connections
LB -.-> WebServers
WebServers --> DB
WebServers -.-> Caching
WebServers -.-> Broker
WebServers -.-> ExternalDatabases
Broker -.-> AsyncBackend
AsyncBackend -.-> ExternalDatabases
AsyncBackend -.-> Caching
%% Legend styling
classDef requiredNode stroke-width:2px,stroke:black;
class Required requiredNode;
class Optional optionalNode;
%% Hide real arrow
linkStyle 0 stroke:transparent;
%% Styling
classDef optionalNode stroke-dasharray: 5 5, opacity:0.9;
class LB optionalNode;
class Caching optionalNode;
class AsyncBackend optionalNode;
class Broker optionalNode;
class QueryCache optionalNode;
class CsvCache optionalNode;
class ThumbnailCache optionalNode;
class AlertImageCache optionalNode;
class Celery optionalNode;
classDef invisible fill:transparent,stroke:transparent;
```
</div>
## Entity-Relationship Diagram
Here is our interactive ERD:

View File

@@ -66,7 +66,7 @@ For running long query from Sql Lab, by default Superset allows it run as long a
being killed by Celery. If you want to increase the time allowed for a running query, you can specify the
timeout in the configuration. For example:
```python
SQLLAB_ASYNC_TIME_LIMIT_SEC = 60 * 60 * 6
```
@@ -78,7 +78,7 @@ come back within client-side timeout (60 seconds by default), Superset will disp
to avoid a gateway timeout message. If you have a longer gateway timeout limit, you can change the
timeout settings in **superset_config.py**:
```python
SUPERSET_WEBSERVER_TIMEOUT = 60
```
@@ -87,7 +87,7 @@ SUPERSET_WEBSERVER_TIMEOUT = 60
You need to register a free account at [Mapbox.com](https://www.mapbox.com), obtain an API key, and add it
to **.env** at the key MAPBOX_API_KEY:
```python
MAPBOX_API_KEY = "longstringofalphanumer1c"
```
@@ -99,7 +99,7 @@ refreshed - especially if some data is slow moving, or run heavy queries. To exc
from the timed refresh process, add the `timed_refresh_immune_slices` key to the dashboard JSON
Metadata field:
```json
{
"filter_immune_slices": [],
"expanded_slices": {},
@@ -115,7 +115,7 @@ Slice refresh will also be staggered over the specified period. You can turn off
setting the `stagger_refresh` to false and modify the stagger period by setting `stagger_time` to a
value in milliseconds in the JSON Metadata field:
```json
{
"stagger_refresh": false,
"stagger_time": 2500
@@ -125,7 +125,7 @@ value in milliseconds in the JSON Metadata field:
Here, the entire dashboard will refresh at once if periodic refresh is on. The stagger time of 2.5
seconds is ignored.
**Why does flask fab or superset freeze/hang/stop responding when started (my home directory is
NFS mounted)?**
By default, Superset creates and uses an SQLite database at `~/.superset/superset.db`. SQLite is
@@ -137,7 +137,7 @@ You can override this path using the **SUPERSET_HOME** environment variable.
Another workaround is to change where Superset stores the SQLite database by adding the following in
`superset_config.py`:
```python
SQLALCHEMY_DATABASE_URI = 'sqlite:////new/location/superset.db?check_same_thread=false'
```
@@ -157,12 +157,12 @@ table afterwards to configure the Columns tab, check the appropriate boxes and s
To clarify, the database backend is an OLTP database used by Superset to store its internal
information like your list of users and dashboard definitions. While Superset supports a
[variety of databases as data _sources_](/docs/configuration/databases#installing-database-drivers),
only a few database engines are supported for use as the OLTP backend / metadata store.
Superset is tested using MySQL, PostgreSQL, and SQLite backends. It's recommended you install
Superset on one of these database servers for production. Installation on other OLTP databases
may work but isn't tested. It has been reported that [Microsoft SQL Server does _not_
work as a Superset backend](https://github.com/apache/superset/issues/18961). Column-store,
non-OLTP databases are not designed for this type of workload.
@@ -213,7 +213,7 @@ SQLAlchemy and DBAPI scope. This includes features like:
Beyond the SQLAlchemy connector, it's also possible, though much more involved, to extend Superset
and write your own connector. The only example of this at the moment is the Druid connector, which
is getting superseded by Druid's growing SQL support and the recent availability of a DBAPI and
SQLAlchemy driver. If the database you are considering integrating has any kind of SQL support,
it's probably preferable to go the SQLAlchemy route. Note that for a native connector to be possible
the database needs to have support for running OLAP-type queries and should be able to do things that
are typical in basic SQL:
@@ -236,7 +236,7 @@ made to cover more and more use cases.
The API available is documented using [Swagger](https://swagger.io/) and the documentation can be
made available under **/swagger/v1** by enabling the following flag in `superset_config.py`:
```python
FAB_API_SWAGGER_UI = True
```
@@ -275,11 +275,3 @@ No. Currently, there is no way to recover a deleted Superset dashboard/chart/dat
Hence, it is recommended to take periodic backups of the metadata database. For recovery, you can launch a recovery instance of a Superset server with the backed-up copy of the DB attached and use the Export Dashboard button in the Superset UI (or the `superset export-dashboards` CLI command). Then, take the .zip file and import it into the current Superset instance.
Alternatively, you can programmatically take regular exports of the assets as a backup.
## I ran a security scan of the Superset container image and it showed dozens of "high" and "critical" vulnerabilities! Can you release a version of Superset without these?
You are talking about dependency CVEs: identified vulnerabilities in software that Superset uses. Most of these CVEs are in the Linux kernel or Python, both of which have many other people working on their security.
We address these dependency CVEs as best we can by regularly updating our dependencies to newer versions. We use bots to assist with that and cheerfully welcome pull requests from humans that fix dependency CVEs.
The Superset [security team](https://superset.apache.org/docs/security/#reporting-security-vulnerabilities) focuses primarily on vulnerabilities _in Superset itself_. See our [CVEs page](https://superset.apache.org/docs/security/cves) for a list of past Superset CVEs.

View File

@@ -14,7 +14,6 @@ This page is meant to give new administrators an understanding of Superset's com
## Components
A Superset installation is made up of these components:
1. The Superset application itself
2. A metadata database
3. A caching layer (optional, but necessary for some features)
@@ -23,7 +22,6 @@ A Superset installation is made up of these components:
### Optional components and associated features
The optional components above are necessary to enable these features:
- [Alerts and Reports](/docs/configuration/alerts-reports)
- [Caching](/docs/configuration/cache)
- [Async Queries](/docs/configuration/async-queries-celery/)
@@ -38,7 +36,6 @@ Here are further details on each component.
### The Superset Application
This is the core application. Superset operates like this:
- A user visits a chart or dashboard
- That triggers a SQL query to the data warehouse holding the underlying dataset
- The resulting data is served up in a data visualization
@@ -48,14 +45,13 @@ This is the core application. Superset operates like this:
This is where chart and dashboard definitions, user information, logs, etc. are stored. Superset is tested to work with PostgreSQL and MySQL databases as the metadata database (not to be confused with a data source like your data warehouse, which could be a much greater variety of options like Snowflake, Redshift, etc.).
Some installation methods like our Quickstart and PyPI come configured by default to use a SQLite on-disk database. And in a Docker Compose installation, the data would be stored in a PostgreSQL container volume. Neither of these cases are recommended for production instances of Superset.
For production, a properly-configured, managed, standalone database is recommended. No matter what database you use, you should plan to back it up regularly.
### Caching Layer
The caching layer serves two main functions:
- Store the results of queries to your data warehouse so that when a chart is loaded twice, it pulls from the cache the second time, speeding up the application and reducing load on your data warehouse.
- Act as a message broker for the worker, enabling the Alerts & Reports, async queries, and thumbnail caching features.

View File

@@ -1,7 +1,7 @@
---
title: Docker Builds
hide_title: true
sidebar_position: 7
version: 1
---
@@ -16,7 +16,7 @@ https://hub.docker.com/r/apache/superset) using GitHub Actions.
Different sets of images are built and/or published at different times:
- **Published releases** (`release`): published using
tags like `5.0.0` and the `latest` tag.
- **Pull request iterations** (`pull_request`): for each pull request, while
we actively build the docker to validate the build, we do
not publish those images for security reasons, we simply `docker build --load`
@@ -32,7 +32,7 @@ for the build, and/or base image.
Here are the build presets that are exposed through the `supersetbot docker` utility:
- `lean`: The default Docker image, including both frontend and backend. Tags
without a build_preset are lean builds (ie: `latest`, `5.0.0`, `4.1.2`, ...). `lean`
builds do not contain database
drivers, meaning you need to install your own. That applies to analytics databases **AND
the metadata database**. You'll likely want to layer either `mysqlclient` or `psycopg2-binary`
@@ -44,7 +44,7 @@ Here are the build presets that are exposed through the `supersetbot docker` uti
- `py311`, e.g., Py311: Similar to lean but with a different Python version (in this example, 3.11).
- `ci`: For certain CI workloads.
- `websocket`: For Superset clusters supporting advanced features.
- `dockerize`: Used by Helm in initContainers to wait for database dependencies to be available.
## Key tags examples
@@ -59,63 +59,14 @@ Here are the build presets that are exposed through the `supersetbot docker` uti
this specific SHA, which could be from a `master` merge, or release.
- `websocket-latest`: The WebSocket image for use in a Superset cluster.
For insights or modifications to the build matrix and tagging conventions,
check the [supersetbot docker](https://github.com/apache-superset/supersetbot)
subcommand and the [docker.yml](https://github.com/apache/superset/blob/master/.github/workflows/docker.yml)
GitHub action.
## Building your own production Docker image
Every Superset deployment will require its own set of drivers depending on the data warehouse(s),
etc. so we recommend that users build their own Docker image by extending the `lean` image.
Here's an example Dockerfile that does this. Follow the in-line comments to customize it for
your desired Superset version and database drivers. The comments also note that a certain feature flag will
have to be enabled in your config file.
You would build the image with `docker build -t mysuperset:latest .` or `docker build -t ourcompanysuperset:5.0.0 .`
```Dockerfile
# change this to apache/superset:5.0.0 or whatever version you want to build from;
# otherwise the default is the latest commit on GitHub master branch
FROM apache/superset:master
USER root
# Set environment variable for Playwright
ENV PLAYWRIGHT_BROWSERS_PATH=/usr/local/share/playwright-browsers
# Install packages using uv into the virtual environment
# Superset started using uv after the 4.1 branch; if you are building from apache/superset:4.1.x or an older version,
# replace the first two lines with RUN pip install \
RUN . /app/.venv/bin/activate && \
uv pip install \
# install psycopg2 for using PostgreSQL metadata store - could be a MySQL package if using that backend:
psycopg2-binary \
# add the driver(s) for your data warehouse(s), in this example we're showing for Microsoft SQL Server:
pymssql \
# package needed for using single-sign on authentication:
Authlib \
# openpyxl to be able to upload Excel files
openpyxl \
# Pillow for Alerts & Reports to generate PDFs of dashboards
Pillow \
# install Playwright for taking screenshots for Alerts & Reports. This assumes the feature flag PLAYWRIGHT_REPORTS_AND_THUMBNAILS is enabled
# That feature flag will default to True starting in 6.0.0
# Playwright works only with Chrome.
# If you are still using Selenium instead of Playwright, you would instead install here the selenium package and a headless browser & webdriver
playwright \
&& playwright install-deps \
&& PLAYWRIGHT_BROWSERS_PATH=/usr/local/share/playwright-browsers playwright install chromium
# Switch back to the superset user
USER superset
CMD ["/app/docker/entrypoints/run-server.sh"]
```
## Key ARGs in Dockerfile
- `BUILD_TRANSLATIONS`: whether to build the translations into the image. For the
frontend build this tells webpack to strip out all locales other than `en` from
the `moment-timezone` library. For the backend, this skips compiling the

View File

@@ -1,7 +1,7 @@
---
title: Docker Compose
hide_title: true
sidebar_position: 5
version: 1
---
@@ -17,7 +17,7 @@ Since `docker compose` is primarily designed to run a set of containers on **a s
and can't support requirements for **high availability**, we do not support nor recommend
using our `docker compose` constructs to support production-type use-cases. For single host
environments, we recommend using [minikube](https://minikube.sigs.k8s.io/docs/start/) along
with our [installing on k8s](https://superset.apache.org/docs/installation/running-on-kubernetes)
documentation.
:::
@@ -36,13 +36,14 @@ Note that there are 3 major ways we support to run `docker compose`:
at the time you fire this up will be reflected, but changes to the code
while `up` won't be reflected in the app
1. **docker-compose-image-tag.yml** where we fetch an image from docker-hub say for the
`5.0.0` release for instance, and fire it up so you can try it. Here, what's in
the local branch has no effect on what's running; we just fetch and run
pre-built images from docker-hub. For `docker compose` to work along with the
Postgres image it boots up, you'll want to point to a `-dev`-suffixed TAG, as in
`export TAG=5.0.0-dev` or `export TAG=4.1.2-dev`, with `latest-dev` being the default.
The `dev` builds include the `psycopg2-binary` required to connect
to the Postgres database launched as part of the `docker compose` builds.
More on these two approaches after setting up the requirements for either.
@@ -112,15 +113,7 @@ docker compose -f docker-compose-non-dev.yml up
### Option #3 - boot up an official release
```bash
-# Set the version you want to run
-export TAG=5.0.0
-# Fetch the tag you're about to check out (assuming you shallow-cloned the repo)
-git fetch --depth=1 origin tag $TAG
-# Could also fetch all tags too if you've got bandwidth to spare
-# git fetch --tags
-# Checkout the corresponding git ref
-git checkout $TAG
-# Fire up docker compose
+export TAG=3.1.1
docker compose -f docker-compose-image-tag.yml up
```
@@ -129,8 +122,8 @@ Refer to the docker-related documentation to learn more about existing tags you
from Docker Hub.
:::note
-For option #2 and #3, we recommend checking out the release tag from the git repository
-(ie: `git checkout 5.0.0`) for more guaranteed results. This ensures that the `docker-compose.*.yml`
+For option #2 and #3, we recommend checking out the release tag from the git repository
+(ie: `git checkout 4.0.0`) for more guaranteed results. This ensures that the `docker-compose.*.yml`
configurations and the mounted `docker/` scripts are in sync with the image you are
looking to fire up.
:::
@@ -143,7 +136,7 @@ its metadata database. In production, this database should be backed up. The de
with docker compose will store that data in a PostgreSQL database contained in a Docker
[volume](https://docs.docker.com/storage/volumes/), which is not backed up.
-Again, **THE DOCKER-COMPOSE INSTALLATION IS NOT PRODUCTION-READY OUT OF THE BOX.**
+Again, **DO NOT USE THIS FOR PRODUCTION**
:::
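If you do run this way for a while, arrange backups of that volume yourself. A minimal sketch, assuming the default `db` service name and the `superset` user/database that the compose setup ships with (verify both against your `docker/.env`):
```bash
# Dump the metadata database out of the running compose stack
docker compose exec db pg_dump -U superset superset > superset_metadata_backup.sql
```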


@@ -1,56 +0,0 @@
---
title: Installation Methods
hide_title: true
sidebar_position: 2
version: 1
---
import useBaseUrl from "@docusaurus/useBaseUrl";
# Installation Methods
How should you install Superset? Here's a comparison of the different options. It will help if you've first read the [Architecture](/docs/installation/architecture.mdx) page to understand Superset's different components.
The fundamental trade-off is between doing more of the detail work yourself and using a more complex deployment route that handles those details for you.
## [Docker Compose](/docs/installation/docker-compose.mdx)
**Summary:** This takes advantage of containerization while remaining simpler than Kubernetes. This is the best way to try out Superset; it's also useful for developing & contributing back to Superset.
If you're not just demoing the software, you'll need a moderate understanding of Docker to customize your deployment and avoid a few risks. Even when fully optimized, this is not as robust a method as Kubernetes for large-scale production deployments.
You manage a superset-config.py file and a docker-compose.yml file. Docker Compose brings up all the needed services - the Superset application, a Postgres metadata DB, Redis cache, Celery worker and beat. They are automatically connected to each other.
**Responsibilities**
You will need to back up your metadata DB. That could mean backing up the service running as a Docker container and its volume; ideally you are running Postgres as a service outside of that container and backing up that service.
You will also need to extend the Superset docker image. The default `lean` images do not contain drivers needed to access your metadata database (Postgres or MySQL), nor to access your data warehouse, nor the headless browser needed for Alerts & Reports. You could run a `-dev` image while demoing Superset, which has some of this, but you'll still need to install the driver for your data warehouse. The `-dev` images run as root, which is not recommended for production.
Ideally you will build your own image of Superset that extends `lean`, adding what your deployment needs. See [Building your own production Docker image](/docs/installation/docker-builds/#building-your-own-production-docker-image).
## [Kubernetes (K8s)](/docs/installation/kubernetes.mdx)
**Summary:** This is the best-practice way to deploy a production instance of Superset, but has the steepest skill requirement - someone who knows Kubernetes.
You will deploy Superset into a K8s cluster. The most common method is using the community-maintained Helm chart, though work is now underway to implement [SIP-149 - a Kubernetes Operator for Superset](https://github.com/apache/superset/issues/31408).
A K8s deployment can scale up and down based on usage and deploy rolling updates with zero downtime - features that big deployments appreciate.
**Responsibilities**
You will need to build your own Docker image, and back up your metadata DB, both as described in Docker Compose above. You'll also need to customize your Helm chart values and deploy and maintain your Kubernetes cluster.
## [PyPI (Python)](/docs/installation/pypi.mdx)
**Summary:** This is the only method that requires no knowledge of containers. It requires the most hands-on work to deploy, connect, and maintain each component.
You install Superset as a Python package and run it that way, providing your own metadata database. Superset has documentation on how to install this way, but it is updated infrequently.
If you want caching, you'll set up Redis; if you want Alerts & Reports, you'll set up Celery with Redis or RabbitMQ as its broker.
**Responsibilities**
You will need to get the component services running and communicating with each other. You'll need to arrange backups of your metadata database.
When upgrading, you'll need to manage the system environment and packages and ensure all components have functional dependencies.


@@ -1,16 +1,17 @@
---
-title: Kubernetes
+title: Kubernetes
hide_title: true
-sidebar_position: 3
+sidebar_position: 2
version: 1
---
-import useBaseUrl from "@docusaurus/useBaseUrl";
+import useBaseUrl from '@docusaurus/useBaseUrl';
# Installing on Kubernetes
-<img src={useBaseUrl("/img/k8s.png" )} width="150" />
-<br /><br />
+<img src={useBaseUrl('/img/k8s.png')} width="150" />
+<br />
+<br />
Running Superset on Kubernetes is supported with the provided [Helm](https://helm.sh/) chart
found in the official [Superset helm repository](https://apache.github.io/superset/index.yaml).
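As a sketch of the usual flow against that repository (the `superset` release name and namespace are arbitrary choices here):
```bash
# Register the chart repository and deploy the chart
helm repo add superset https://apache.github.io/superset
helm repo update
helm upgrade --install superset superset/superset \
  --namespace superset --create-namespace
```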
@@ -134,7 +135,7 @@ init:
```
:::note
-Superset uses [Scarf Gateway](https://about.scarf.sh/scarf-gateway) to collect telemetry data. Knowing the installation counts for different Superset versions informs the project's decisions about patching and long-term support. Scarf purges personally identifiable information (PII) and provides only aggregated statistics.
+Superset uses [Scarf Gateway](https://about.scarf.sh/scarf-gateway) to collect telemetry data. Knowing the installation counts for different Superset versions informs the project's decisions about patching and long-term support. Scarf purges personally identifiable information (PII) and provides only aggregated statistics.
To opt-out of this data collection in your Helm-based installation, edit the `repository:` line in your `helm/superset/values.yaml` file, replacing `apachesuperset.docker.scarf.sh/apache/superset` with `apache/superset` to pull the image directly from Docker Hub.
:::
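A sketch of that override in `values.yaml`, assuming the chart's standard `image.repository` key:
```yaml
image:
  # Pull directly from Docker Hub instead of the Scarf gateway
  repository: apache/superset
```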
@@ -150,9 +151,6 @@ Superset requires a Python DB-API database driver and a SQLAlchemy
dialect to be installed for each datastore you want to connect to.
See [Install Database Drivers](/docs/configuration/databases) for more information.
-It is recommended that you refer to versions listed in
-[pyproject.toml](https://github.com/apache/superset/blob/master/pyproject.toml)
-instead of hard-coding them in your bootstrap script, as seen below.
:::
@@ -161,9 +159,9 @@ The following example installs the drivers for BigQuery and Elasticsearch, allow
```yaml
bootstrapScript: |
#!/bin/bash
-  uv pip install .[postgres] \
-    .[bigquery] \
-    .[elasticsearch] &&\
+  uv pip install psycopg2==2.9.6 \
+    sqlalchemy-bigquery==1.6.1 \
+    elasticsearch-dbapi==0.2.5 &&\
if [ ! -f ~/bootstrap ]; then echo "Running Superset with uid {{ .Values.runAsUser }}" > ~/bootstrap; fi
```
@@ -195,7 +193,7 @@ Those can be passed as key/values either with `extraEnv` or `extraSecretEnv` if
extraEnv:
SMTP_HOST: smtp.gmail.com
SMTP_USER: user@gmail.com
SMTP_PORT: "587"
SMTP_PORT: '587'
SMTP_MAIL_FROM: user@gmail.com
extraSecretEnv:
@@ -356,7 +354,7 @@ supersetCeleryBeat:
extraEnv:
SMTP_HOST: smtp.gmail.com
SMTP_USER: user@gmail.com
SMTP_PORT: "587"
SMTP_PORT: '587'
SMTP_MAIL_FROM: user@gmail.com
extraSecretEnv:
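Secrets such as the SMTP password belong under `extraSecretEnv`, which the chart stores in a Kubernetes Secret rather than a plain environment variable. An illustrative sketch (`SMTP_PASSWORD` is the key Superset's default configuration reads; confirm against your `superset_config.py`):
```yaml
extraSecretEnv:
  SMTP_PASSWORD: change-me  # placeholder; inject from your secret manager in practice
```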


@@ -1,7 +1,7 @@
---
title: PyPI
hide_title: true
-sidebar_position: 4
+sidebar_position: 3
version: 1
---
@@ -12,7 +12,7 @@ import useBaseUrl from "@docusaurus/useBaseUrl";
<img src={useBaseUrl("/img/pypi.png" )} width="150" />
<br /><br />
-This page describes how to install Superset using the `apache_superset` package [published on PyPI](https://pypi.org/project/apache_superset/).
+This page describes how to install Superset using the `apache-superset` package [published on PyPI](https://pypi.org/project/apache-superset/).
## OS Dependencies
@@ -22,13 +22,6 @@ level dependencies.
**Debian and Ubuntu**
-Ubuntu **24.04** uses python 3.12 by default, which currently is not supported by Superset. You need to add a second python installation of 3.11 and install the required additional dependencies.
-```bash
-sudo add-apt-repository ppa:deadsnakes/ppa
-sudo apt update
-sudo apt install python3.11 python3.11-dev python3.11-venv build-essential libssl-dev libffi-dev libsasl2-dev libldap2-dev default-libmysqlclient-dev
-```
In Ubuntu **20.04 and 22.04** the following command will ensure that the required dependencies are installed:
```bash
@@ -101,9 +94,14 @@ These will now be available when pip installing requirements.
## Python Virtual Environment
-We highly recommend installing Superset inside of a virtual environment.
+We highly recommend installing Superset inside of a virtual environment. Python ships with
+`virtualenv` out of the box. If you're using [pyenv](https://github.com/pyenv/pyenv), you can install [pyenv-virtualenv](https://github.com/pyenv/pyenv-virtualenv). Or you can install it with `pip`:
-You can create and activate a virtual environment using the following commands. Ensure you are using a compatible version of python. You might have to explicitly use for example `python3.11` instead of `python3`.
+```bash
+pip install virtualenv
+```
+You can create and activate a virtual environment using:
```bash
# virtualenv is shipped in Python 3.6+ as venv instead of pyvenv.
@@ -126,15 +124,15 @@ command line.
### Installing and Initializing Superset
-First, start by installing `apache_superset`:
+First, start by installing `apache-superset`:
```bash
-pip install apache_superset
+pip install apache-superset
```
Then, define mandatory configurations, SECRET_KEY and FLASK_APP:
```bash
-export SUPERSET_SECRET_KEY=YOUR-SECRET-KEY # For production use, make sure this is a strong key, for example generated using `openssl rand -base64 42`. See https://superset.apache.org/docs/configuration/configuring-superset#specifying-a-secret_key
+export SUPERSET_SECRET_KEY=YOUR-SECRET-KEY
export FLASK_APP=superset
```


@@ -1,7 +1,7 @@
---
title: Upgrading Superset
hide_title: true
-sidebar_position: 6
+sidebar_position: 5
version: 1
---
@@ -32,7 +32,7 @@ docker compose up
To upgrade Superset in a native installation, run the following commands:
```bash
-pip install apache_superset --upgrade
+pip install apache-superset --upgrade
```
## Upgrading the Metadata Database
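After upgrading the package, apply any pending metadata migrations and re-sync roles and permissions. A minimal sketch of the standard commands:
```bash
# Apply pending migrations to the metadata database
superset db upgrade

# Re-sync default roles and permissions
superset init
```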


@@ -22,7 +22,7 @@ page.
### 1. Get Superset
```bash
-git clone https://github.com/apache/superset
+$ git clone https://github.com/apache/superset
```
### 2. Start the latest official release of Superset
@@ -32,7 +32,7 @@ git clone https://github.com/apache/superset
$ cd superset
# Set the repo to the state associated with the latest official version
-$ git checkout tags/5.0.0
+$ git checkout tags/4.1.1
# Fire up Superset using Docker Compose
$ docker compose -f docker-compose-image-tag.yml up
@@ -61,7 +61,7 @@ password: admin
Once you're done with Superset, you can stop and delete just like any other container environment:
```bash
-docker compose down
+$ docker compose down
```
:::tip
