Compare commits

..

118 Commits

Author SHA1 Message Date
Elizabeth Thompson
507a7562e0 update changelog 2022-12-15 10:43:47 -08:00
Michael S. Molina
8b8e215947 fix: Allow empty CSS in Handlebars (#22422) 2022-12-15 11:23:31 -05:00
geido
d2410df01c Move to Handlebars 2022-12-15 11:20:16 -05:00
AAfghahi
e628608b33 changelog 2022-12-15 11:10:42 -05:00
Michael S. Molina
ee7d16c110 feat: Improves SafeMarkdown HTML sanitization 2022-12-15 11:10:42 -05:00
AAfghahi
45cf96aee4 pylint 2022-12-15 11:10:41 -05:00
AAfghahi
3e8e5c242b changelog 2022-12-15 11:10:41 -05:00
Eric Briscoe
5677400ba8 fix(CustomFrame): Resolves issue #21731 where date range in explore throws runtime error (#21776) 2022-12-15 11:10:41 -05:00
AAfghahi
a991e1ae0e changelog 2022-12-15 11:10:41 -05:00
Mayur
580dbaff29 fix: respect chart cache timeout setting (#21637) 2022-12-15 11:10:41 -05:00
Mayur
c9921f8d72 fix: allow adhoc columns in non-aggregate query (#21729) 2022-12-15 11:10:41 -05:00
Ville Brofeldt
8156d860c7 chore(sqla): refactor query utils (#21811)
Co-authored-by: Ville Brofeldt <ville.brofeldt@apple.com>
2022-12-15 11:10:41 -05:00
AAfghahi
c26be921c4 add fixture 2022-12-15 11:10:41 -05:00
Mayur
98c0774513 pylint 2022-12-15 11:10:41 -05:00
AAfghahi
92450c6068 added fixtures 2022-12-15 11:10:41 -05:00
Ville Brofeldt
292d1b66df fix(cache): respect default cache timeout on v1 chart data requests (#21441) 2022-12-15 11:10:41 -05:00
AAfghahi
3653cd39fb add physical dataset 2022-12-15 11:10:41 -05:00
AAfghahi
7144b22680 integration test 2022-12-15 11:10:41 -05:00
Daniel Vaz Gaspar
421e5be12f fix: datasource save, improve data validation (#22038) 2022-12-15 11:10:41 -05:00
Daniel Vaz Gaspar
338dcc7ff9 fix: deprecate approve and request_access endpoint (#22022)
Co-authored-by: Michael S. Molina <70410625+michael-s-molina@users.noreply.github.com>
2022-12-15 11:10:41 -05:00
AAfghahi
ca98c77ea4 added test fix 2022-12-15 11:10:41 -05:00
Elizabeth Thompson
2f4096bfd1 fix tests 2022-12-15 11:10:41 -05:00
Daniel Vaz Gaspar
51753e4b43 fix: dashboard api cache decorator (#21964) 2022-12-15 11:10:41 -05:00
Beto Dealmeida
aa65263c3a fix: check that imports are ZIPs (#21875) 2022-12-15 11:10:41 -05:00
Ville Brofeldt
315d49c2bb chore(sqla): refactor query utils (#21811)
Co-authored-by: Ville Brofeldt <ville.brofeldt@apple.com>
2022-12-15 11:10:41 -05:00
Michael S. Molina
ea3d609f78 feat: Adds a Content Security Policy (CSP) check for production environments (#21874)
(cherry picked from commit f4da74ce8d)
2022-12-15 11:10:41 -05:00
Michael S. Molina
05f0c1dbfb feat: Disables HTML rendering in Toast by default (#21853)
(cherry picked from commit 47b1e0ca9d)
2022-12-15 11:10:41 -05:00
Daniel Vaz Gaspar
d6bf83c211 fix: flash message on database data upload forms (#21761)
(cherry picked from commit ba3275a4d0)
2022-12-15 11:10:41 -05:00
Daniel Vaz Gaspar
80b1b3050e fix: database schema selector on import data (#21759)
(cherry picked from commit 91f0de0c5d)
2022-12-15 11:10:41 -05:00
Elizabeth Thompson
0c611d21bb update changelog 2022-12-15 11:10:41 -05:00
dependabot[bot]
4d3aae9a1a chore(deps): bump moment from 2.29.2 to 2.29.4 in /superset-frontend (#20644)
Bumps [moment](https://github.com/moment/moment) from 2.29.2 to 2.29.4.
- [Release notes](https://github.com/moment/moment/releases)
- [Changelog](https://github.com/moment/moment/blob/develop/CHANGELOG.md)
- [Commits](https://github.com/moment/moment/compare/2.29.2...2.29.4)

---
updated-dependencies:
- dependency-name: moment
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>

Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2022-12-15 11:10:41 -05:00
Elizabeth Thompson
6f95c1e13a fix test 2022-12-15 11:10:40 -05:00
Elizabeth Thompson
9039866681 update changelog 2022-12-15 11:10:40 -05:00
Elizabeth Thompson
a0b0d9dd23 xit cypress test 2022-12-15 11:10:40 -05:00
Mayur
bac190e043 fix: allow adhoc columns in non-aggregate query (#21729) 2022-12-15 11:10:40 -05:00
Elizabeth Thompson
f7e664bd68 fix: remove deprecated ETagResponseMixin (#21773)
(cherry picked from commit 75e6a04269)
2022-10-18 13:55:45 -07:00
Rui Zhao
b1c9adb52b fix(report): Fix permission check for set up email report on charts/dashboards. Fixes #21559 (#21561)
Co-authored-by: Rui Zhao <zhaorui@dropbox.com>
(cherry picked from commit 7f971b4103)
2022-10-18 13:55:41 -07:00
Yongjie Zhao
9e907be7a8 fix: annotation broken (#20651)
* fix: annotation broken

* fix UT

* add annotation data to mixed timeseries chart

(cherry picked from commit 7f918a4ec0)
2022-10-18 13:55:37 -07:00
Elizabeth Thompson
61b64492a2 bump package-lock version 2022-10-06 15:45:55 -07:00
AAfghahi
2c65ef967e build: changelog for 2.0.1 (#21721) 2022-10-06 14:32:46 -07:00
AAfghahi
ad52469a04 test fix 2022-10-06 14:45:06 -04:00
AAfghahi
548311c424 linting issues 2022-10-06 14:45:06 -04:00
Hugh A. Miles II
40c5b2a688 remove eleement reference (#20830)
(cherry picked from commit 2263a76f4d)
2022-10-06 14:45:06 -04:00
aehanno
511cafa790 fix: Add locale for DatePicker component (#20063)
Co-authored-by: Kevin Dethelot <kevin.dethelot@kosmos.fr>
Co-authored-by: Yongjie Zhao <yongjie.zhao@gmail.com>
2022-10-06 14:45:06 -04:00
Daniel Vaz Gaspar
dd919bc176 fix: disallow users from viewing other user's profile on config (#21302) 2022-10-06 14:45:06 -04:00
Cody Leff
143c5f1ecc fix(explore): Prevent unnecessary series limit subquery (#21154)
* Prevent series limit when no series limit columns specified.

* Add timeseries check for legacy charts.

* Apply fix to helpers.py.

* Skip Cypress color consistency tests.
2022-10-06 14:45:06 -04:00
Mayur
dfcb66a098 fix: set correct favicon from config for login and FAB list views (#21498)
(cherry picked from commit b29e7e7d9e)
2022-10-06 14:45:06 -04:00
MichaelHintz
440ab64c53 fix(sqllab): Fix cursor alignment in SQL lab editor by avoiding Lucida Console font on Windows (#21380)
(cherry picked from commit 3098e657e5)
2022-10-06 14:45:06 -04:00
aehanno
5b662f7874 fix: Add french translation missing (#20061)
(cherry picked from commit 944808a0ce)
2022-10-06 14:45:06 -04:00
JUST.in DO IT
08052a7db3 fix(plugin-chart-echarts): missing value format in mixed timeseries (#21044) 2022-10-06 14:45:06 -04:00
Daniel Vaz Gaspar
2583f7fde3 fix: cached common bootstrap Revert (#21018) (#21419)
(cherry picked from commit 094400c308)
2022-10-06 14:45:05 -04:00
Ville Brofeldt
db661ec17f fix(plugin-chart-echarts): show zero value in tooltip (#21296)
Co-authored-by: Ville Brofeldt <ville.brofeldt@apple.com>
(cherry picked from commit 1aeb8fd6b7)
2022-10-06 14:45:05 -04:00
Kamil Gabryjelski
7f8d6b3400 fix(explore): Time column label not formatted when GENERIC_X_AXES enabled (#21294)
(cherry picked from commit c3a00d43d0)
2022-10-06 14:45:05 -04:00
Daniel Vaz Gaspar
2b760d0775 feat: adds TLS certificate validation option for SMTP (#21272)
(cherry picked from commit 9fd752057e)
2022-09-14 11:38:28 -07:00
ʈᵃᵢ
4298690e16 fix(celery cache warmup): add auth and use warm_up_cache endpoint (#21076)
(cherry picked from commit 04dd8d414d)
2022-09-14 11:38:27 -07:00
Stephen Liu
32736680da fix(database-list): hidden upload file button if no permission (#21216)
(cherry picked from commit 0c43190e04)
2022-09-14 11:38:27 -07:00
JUST.in DO IT
9337dec038 fix(sqllab): missing zero values while copy-to-clipboard (#21153)
(cherry picked from commit 4e23d62d4f)
2022-09-14 11:38:26 -07:00
stevetracvc
05d7c3d74d fix(native filters): groupby filter issue (#21084)
(cherry picked from commit d79b0bfc74)
2022-09-14 11:38:25 -07:00
Stephen Liu
c3e04d7ebf fix(plugin-chart-handlebars): order by control not work (#21005)
(cherry picked from commit e70699fb43)
2022-09-14 11:38:25 -07:00
EugeneTorap
47c3cd10bd fix(dashboard): Fix scroll behaviour in DashboardBuilderSidepane (#20969)
(cherry picked from commit 6f3a555e58)
2022-09-14 11:38:24 -07:00
Bogdan
b4df82591e Memoize the common_bootstrap_payload (#21018)
Try patch

Co-authored-by: Bogdan Kyryliuk <bogdankyryliuk@dropbox.com>
(cherry picked from commit 495a205dec)
2022-09-14 11:38:24 -07:00
Stephen Liu
884e2f1ca7 fix(plugin-chart-echarts): gauge chart enhancements and fixes (#21007)
* fix(plugin-chart-echarts): gauge chart enhancements and fixes

* fix lint

(cherry picked from commit b303d1e156)
2022-09-14 11:38:24 -07:00
Erik Cederstrand
e91222eb65 chore(deps): unpin holidays dependency version (#21091)
The blocking issue has been fixed upstream

Co-authored-by: Erik Cederstrand <erik@adamatics.com>
(cherry picked from commit d817a1dc87)
2022-09-14 11:38:23 -07:00
JUST.in DO IT
2db82d578c fix(plugin-chart-echarts): invalid total label location for negative values in stacked bar chart (#21032)
(cherry picked from commit a8ba544e60)
2022-09-14 11:38:23 -07:00
Kamil Gabryjelski
346c035690 fix: Explore scrolled down when navigating from dashboard (#20962)
(cherry picked from commit e4fc5564ce)
2022-09-14 11:38:22 -07:00
Antonio Rivero Martinez
6a5b12ec8c Big Number Viz: (#20946)
- When the value is zero we still render the percent change and suffix if present

(cherry picked from commit aa53c10312)
2022-09-14 11:38:22 -07:00
Diego Medina
8c2ca2d8d8 Temporal X Axis values are not properly displayed if the time column has a custom label defined (#20819)
(cherry picked from commit 51869f32ac)
2022-09-14 11:38:21 -07:00
Yongjie Zhao
43b8f18a21 fix: getting default value in run-server.sh (#20736)
(cherry picked from commit 5990ea639e)
2022-09-14 11:38:21 -07:00
Multazim Deshmukh
5efee17def fix: make max-requests and max-requests-jitter adjustable (#20733)
Co-authored-by: Multazim Deshmukh <multazim.deshmukh@morningstar.com>
(cherry picked from commit 883241070f)
2022-09-14 11:38:21 -07:00
Beto Dealmeida
56137ebbe5 fix: logger message (#20714)
(cherry picked from commit c70d102b73)
2022-09-14 11:38:21 -07:00
Michael S. Molina
067495d954 Updates CHANGELOG.md 2022-07-06 13:48:18 -03:00
Reese
bba486b08c fix: Allow dataset owners to explore their datasets (#20382)
* fix: Allow dataset owners to explore their datasets

* Re-order imports

* Give owners security manager permissions to their datasets

* Update test suite

* Add SqlaTable to is_owner types

* Add owners to datasource mock

* Fix VSCode import error

* Fix merge error
2022-07-06 11:43:19 -03:00
Daniel Vaz Gaspar
6e74f3e82c docs: fix link for Apache Superset source code (#20620)
(cherry picked from commit b39a3d8f78)
2022-07-06 11:40:43 -03:00
Daniel Vaz Gaspar
3e97c60c8d chore: bump FAB to 4.1.3 (#20621)
(cherry picked from commit 183871b002)
2022-07-06 11:40:43 -03:00
Rui Zhao
747b011bb7 fix(embedded): Retry when executing alert queries to avoid sending transient errors to users as alert failure notifications (#20419)
Co-authored-by: Rui Zhao <zhaorui@dropbox.com>
(cherry picked from commit 818962cc89)
2022-07-06 11:40:43 -03:00
Evan Rusackas
dc71454416 fix: Respecting max/min opacities, and adding tests. (#20555)
* Respecting max/min opacities, and adding tests.

* revising tests

* Adding missing test case for maximum coverage :)

* removing unnecessary logic and test

* adding another unit test for (hopefully) full coverage.

* no more ternary operator

* New approach with Math.min  - take THAT codecov.

* one more stab at making codecov happy... ignoring the file next.

* lint fixes

(cherry picked from commit ac8e502228)
2022-07-06 11:38:33 -03:00
Evan Rusackas
4dcc805dee Revert "feat(plugin-chart-echarts): Support stacking negative and positive values (#20408)" (#20571)
This reverts commit c959d92dd1.

(cherry picked from commit f5f8ddec3e)
2022-07-06 11:38:33 -03:00
Stephen Liu
3b5513be36 fix(database-modal): forms in database modal will be effected by external form values (#20487)
(cherry picked from commit 932e304ffb)
2022-07-06 11:38:33 -03:00
Stephen Liu
4b2397f0c7 fix(big-number): big number gets cut off on a Dashboard (#20488)
(cherry picked from commit 24a53c38c6)
2022-07-06 11:38:32 -03:00
yourssvk
9a26a211d4 fix: SQL Lab cancel query in Redshift database connection does not wo… (#16326)
* fix: SQL Lab cancel query in Redshift database connection does not work #16325

Co-authored-by: Venkata Krishnan Somasundaram <venkata_cred@Venkatas-MacBook-Pro.local>
Co-authored-by: Elizabeth Thompson <eschutho@gmail.com>
(cherry picked from commit 90d486a643)
2022-07-06 11:38:32 -03:00
Diego Medina
eedcefc64a fix: Unable to download the Dashboard as image in case there's an image added through Markdown (#20362)
* fix: Unable to download the Dashboard as image in case there's an image added through Markdown

* licence

(cherry picked from commit c5d3678a31)
2022-07-06 11:38:32 -03:00
Alex Lauderbaugh
eba63b4add Updated copy in chart drop down to "View as table" (#20486)
(cherry picked from commit 93fbfe9d28)
2022-07-06 11:38:32 -03:00
Michael S. Molina
edbbf886af Updates CHANGELOG.md 2022-06-29 09:27:48 -03:00
Michael S. Molina
80e2f1abe7 fix: Removes psycopg2 as a required dependency (#20543)
* fix: Removes psycopg2 as a required dependency

* Disables lint warning

(cherry picked from commit cb3cd41dcd)
2022-06-29 09:25:30 -03:00
Michael S. Molina
b5a4c06d82 Updates CHANGELOG.md, UPDATING.md and package.json 2022-06-28 15:55:58 -03:00
smileydev
789f99341b fix(db): Show the only db install guide when the db is already installed and error is existed while importing file. (#20442)
* fix(db): make to show the db error msg when db installed and error is exist

* fix(db): make to replace dbinstall str into showDbInstallInstructions

(cherry picked from commit 23e62d3782)
2022-06-28 14:22:22 -03:00
Daniel Vaz Gaspar
63229dcf56 fix: bump FAB to 4.1.2 (#20483) 2022-06-28 14:22:12 -03:00
Multazim Deshmukh
92038db579 fix: correction from mmsql to mssql in setup.py (#20493)
Co-authored-by: Multazim Deshmukh <multazim.deshmukh@morningstar.com>
(cherry picked from commit 5a2abfab65)
2022-06-28 14:21:25 -03:00
Yongjie Zhao
b8d9208b6e feat(standardized form data): keep all columns and metrics (#20377)
(cherry picked from commit bbbe102887)
2022-06-28 14:21:25 -03:00
Elizabeth Thompson
885bbdde95 remove autoflush for queries during dual write (#20460)
(cherry picked from commit 44f0b511dd)
2022-06-28 14:21:25 -03:00
John Bodley
25a6f02cd6 fix: Re-add filter-box time granularity/column (#20485)
Co-authored-by: John Bodley <john.bodley@airbnb.com>
(cherry picked from commit 661ab35bd0)
2022-06-28 14:21:25 -03:00
Stephen Liu
bbca109ea3 fix(docs): prevent some symbols from being copied with (#20480)
(cherry picked from commit aa4068048a)
2022-06-28 14:19:23 -03:00
stevetracvc
bcc23bbacd fix: issue with sorting by multiple columns in a table (#19920)
Recent commit to sort alphanumeric columns via case insensitive
comparison broke the multi-column sort option. React-table only sorts
by the second (or third...) column if the first column matches.
Since the alphanumeric sort only returned -1 or 1, it never would move
to the subsequent columns when the earlier column values matched.

(cherry picked from commit a45d011e74)
2022-06-28 14:19:23 -03:00
John Bodley
23061d6822 fix(migration): Ensure key_value LargeBinary is encoded as a MEDIUMBLOB as opposed to BLOB for MySQL (#20385)
* fix(migration): Ensure key_value LargeBinary is encoded as a MEDIUMBLOB as opposed to BLOB for MySQL

* Update 2022-06-14_15-28_e09b4ae78457_resize_key_value_blob.py

Co-authored-by: John Bodley <john.bodley@airbnb.com>
(cherry picked from commit f5cb23e0a3)
2022-06-28 14:19:23 -03:00
Diego Medina
40a9257311 fix: alert & reports active toggle optimistic update (#20402)
(cherry picked from commit 4dc30441b7)
2022-06-28 14:19:23 -03:00
Michael S. Molina
ca0544a573 fix: Changes the return type of get_permissions to be JSON friendly (#20472)
* fix: Changes the return type of get_permissions to be JSON friendly

* Removes dangling comma

* Removes unused import

* Fixes typing errors

(cherry picked from commit a169b60712)
2022-06-28 14:19:23 -03:00
AAfghahi
d789f376b3 async queries limit bug (#20468)
(cherry picked from commit 2c16be42e1)
2022-06-28 14:19:23 -03:00
smileydev
3d850426ff fix(home): Show home page tabs as pills instead of links (#20257)
* fix(home): make to update the css style of links

* fix(home): make to fix lint issue

* fix(home): make to remove underline when tab is active

* fix(homes): make to fix the issue of tab

* fix(home): make to move styles to a tag

(cherry picked from commit a833674a8d)
2022-06-28 14:19:23 -03:00
Beto Dealmeida
4728d8f49b fix: ensure column name in description is string (#20340)
* fix: ensure column name in description is string

* Add unit test

(cherry picked from commit f3b289d3c3)
2022-06-28 14:19:23 -03:00
Diego Medina
9e6a3e1a4e fix(viz): BigQuery time grain 'minute'/'second' throws an error (#20350)
(cherry picked from commit 5afeba34bd)
2022-06-28 14:19:22 -03:00
smileydev
2d551faaf4 fix(chart & table): make to prevent dates from wrapping (#20384)
(cherry picked from commit 1ae935379f)
2022-06-28 14:19:22 -03:00
Yongjie Zhao
f58ad259ac fix: suppress translation warning in jest (#20404)
(cherry picked from commit 9fad26fa19)
2022-06-28 14:19:22 -03:00
Yongjie Zhao
4e93690e19 fix: should raise exception when apply a categorical axis (#20451)
(cherry picked from commit 8bbbd6f03f)
2022-06-28 14:19:22 -03:00
Diego Medina
d1ac6e5db4 fix: table viz sort icon bottom aligned (#20447)
(cherry picked from commit 93774d1860)
2022-06-28 14:19:22 -03:00
Samira El Aabidi
59fbf2a202 feat(chart): Enable caching per user when user impersonation is enabled (#20114)
* add username to extra cache keys when impersonation is enabled.

* don't put effective_user in extra_cache_key

* get_impersonation_key method in engine_spec class  to construct an impersonation key

* pass datasource when creating query objects

* adding an impersonation key when construction cache key

* add feature flag to control caching per user

* revert changes

* make precommit and pylint happy

* pass a User instance

* remove unnecessary import

(cherry picked from commit 68af5980ea)
2022-06-23 09:12:14 -03:00
Sam Firke
4841e8fb9c style(typo): occured -> occurred (#20116)
* Occured -> Occurred

* Occured -> Occurred

* Occured -> Occurred

* Occured - > Occurred

* Update FallbackComponent.tsx

* Update FallbackComponent.tsx

Co-authored-by: John Bodley <4567245+john-bodley@users.noreply.github.com>
(cherry picked from commit b7eb235440)
2022-06-23 09:12:14 -03:00
John Bodley
06592180ea [fbprophet] Fix frequencies (#20326)
(cherry picked from commit 8b0bee5e8b)
2022-06-23 09:12:14 -03:00
Simon Thelin
cb270034f3 fix(20428): Address-Presto/Trino-Poll-Issue-Refactor (#20434)
* fix(20428)-Address-Presto/Trino-Poll-Issue-Refacto
r

Update linter

* Update to only use BaseEngineSpec handle_cursor

* Fix CI

Co-authored-by: John Bodley <4567245+john-bodley@users.noreply.github.com>
(cherry picked from commit 8b7262fa90)
2022-06-23 09:12:14 -03:00
Diego Medina
7e48de484a fix(dashboard): new created chart did not have high lighted effect when using the permalink of chart share in dashboard (#20411)
(cherry picked from commit c2f01a676c)
2022-06-23 09:12:14 -03:00
Lily Kuang
a08499d88d fix(embedded): CSV download for chart (#20261)
* move postForm to superset client

* lint

* fix lint

* fix type

* update tests

* add tests

* add test for form submit

* add test for request form

* lint

* fix test

* fix tests

* more tests

* more tests

* test

* lint

* more test for postForm

* lint

* Update superset-frontend/packages/superset-ui-core/test/connection/SupersetClientClass.test.ts

Co-authored-by: David Aaron Suddjian <1858430+suddjian@users.noreply.github.com>

* update tests

* remove useless test

* make test cover happy

* make test cover happy

* make test cover happy

* make codecov happy

* make codecov happy

Co-authored-by: David Aaron Suddjian <1858430+suddjian@users.noreply.github.com>
(cherry picked from commit ab9f72f1a1)
2022-06-23 09:12:14 -03:00
jiAng
bd721cd86c fix(cosmetic): cannot find m-r-10 class in superset.less (#20276)
* fix(cosmetic): cannot find m-r-10 class in superset.less

* fix: remove .m-r-10 class and use emotion instead

* Update superset-frontend/src/components/Datasource/CollectionTable.tsx

Co-authored-by: Michael S. Molina <70410625+michael-s-molina@users.noreply.github.com>

* Update superset-frontend/src/components/Datasource/DatasourceEditor.jsx

Co-authored-by: Michael S. Molina <70410625+michael-s-molina@users.noreply.github.com>

Co-authored-by: Michael S. Molina <70410625+michael-s-molina@users.noreply.github.com>
(cherry picked from commit f6f93aad37)
2022-06-23 09:12:14 -03:00
Stephen Liu
67c853790d fix: rm eslint-plugin-translation-vars engine requirement (#20420)
(cherry picked from commit fa7f144a68)
2022-06-23 09:12:14 -03:00
Stephen Liu
3b3c3be9b2 fix(bar-chart-v2): remove marker from bar chart V2 (#20409)
(cherry picked from commit b32288fddf)
2022-06-23 09:12:13 -03:00
mohittt8
2f07a88c32 fix(presto): use correct timespec for presto (#20333)
(cherry picked from commit 41bbf62e58)
2022-06-23 09:12:13 -03:00
Elizabeth Thompson
128722085c fix key error on permalink fetch for old permalinks (#20414)
(cherry picked from commit 12436e47c9)
2022-06-23 09:12:13 -03:00
Smart-Codi
78c577b515 adding extra metrics after chart configuration (#20410)
(cherry picked from commit a8a6b732e9)
2022-06-23 09:12:13 -03:00
chuancy
9aa047cf12 Chinese translation and English translation do not match (#20405)
(cherry picked from commit 11d94ce56c)
2022-06-23 09:12:13 -03:00
Kamil Gabryjelski
ce9807941b feat(plugin-chart-echarts): Support stacking negative and positive values (#20408)
(cherry picked from commit c959d92dd1)
2022-06-16 09:03:57 -03:00
8674 changed files with 599810 additions and 1396761 deletions

View File

@@ -17,16 +17,7 @@
# https://cwiki.apache.org/confluence/display/INFRA/.asf.yaml+features+for+git+repositories
---
notifications:
commits: commits@superset.apache.org
issues: notifications@superset.apache.org
pullrequests: notifications@superset.apache.org
discussions: notifications@superset.apache.org
github:
pull_requests:
del_branch_on_merge: true
allow_update_branch: true
description: "Apache Superset is a Data Visualization and Data Exploration Platform"
homepage: https://superset.apache.org/
labels:
@@ -56,44 +47,31 @@ github:
projects: true
# Enable wiki for documentation
wiki: true
# Enable discussions
discussions: true
enabled_merge_buttons:
squash: true
merge: false
rebase: false
ghp_branch: gh-pages
ghp_path: /
protected_branches:
master:
required_status_checks:
# strict means "Require branches to be up to date before merging".
strict: false
# contexts are the names of checks that must pass
# unfortunately AFAICT for `matrix:` jobs, we have to itemize every
# combination here.
contexts:
- lint-check
- cypress-matrix (0, chrome)
- check
- cypress-matrix (1, chrome)
- cypress-matrix (2, chrome)
- cypress-matrix (3, chrome)
- cypress-matrix (4, chrome)
- cypress-matrix (5, chrome)
- dependency-review
- docker-build
- frontend-build
- playwright-tests (chromium)
- pre-commit (current)
- pre-commit (previous)
- test-mysql
- test-postgres (current)
- test-postgres-hive
- test-postgres-presto
- test-sqlite
- unit-tests (current)
- pre-commit (3.8)
- python-lint (3.8)
- test-mysql (3.8)
- test-postgres (3.8)
- test-postgres (3.9)
- test-sqlite (3.8)
required_pull_request_reviews:
dismiss_stale_reviews: false
@@ -101,10 +79,3 @@ github:
required_approving_review_count: 1
required_signatures: false
gh-pages:
required_pull_request_reviews:
dismiss_stale_reviews: false
require_code_owner_reviews: true
required_approving_review_count: 1
required_signatures: false

View File

@@ -1,10 +0,0 @@
# JavaScript to TypeScript Migration Command
## Usage
```
/js-to-ts <core-filename>
```
- `<core-filename>` - Path to CORE file relative to `superset-frontend/` (e.g., `src/utils/common.js`, `src/middleware/loggerMiddleware.js`)
## Agent Instructions
**See:** [../projects/js-to-ts/AGENT.md](../projects/js-to-ts/AGENT.md) for complete migration guide.

View File

@@ -1,684 +0,0 @@
# JavaScript to TypeScript Migration Agent Guide
**Complete technical reference for converting JavaScript/JSX files to TypeScript/TSX in Apache Superset frontend.**
**Agent Role:** Atomic migration unit - migrate the core file + ALL related tests/mocks as one cohesive unit. Use `git mv` to preserve history, NO `git commit`. NO global import changes. Report results upon completion.
---
## 🎯 Migration Principles
1. **Atomic migration units** - Core file + all related tests/mocks migrate together
2. **Zero `any` types** - Use proper TypeScript throughout
3. **Leverage existing types** - Reuse established definitions
4. **Type inheritance** - Derivatives extend base component types
5. **Strategic placement** - File types for maximum discoverability
6. **Surgical improvements** - Enhance existing types during migration
---
## Step 0: Dependency Check (MANDATORY)
**Command:**
```bash
grep -E "from '\.\./.*\.jsx?'|from '\./.*\.jsx?'|from 'src/.*\.jsx?'" superset-frontend/{filename}
```
**Decision:**
- ✅ No matches → Proceed with atomic migration (core + tests + mocks)
- ❌ Matches found → EXIT with dependency report (see format below)
---
## Step 1: Identify Related Files (REQUIRED)
**Atomic Migration Scope:**
For core file `src/utils/example.js`, also migrate:
- `src/utils/example.test.js` / `src/utils/example.test.jsx`
- `src/utils/example.spec.js` / `src/utils/example.spec.jsx`
- `src/utils/__mocks__/example.js`
- Any other related test/mock files found by pattern matching
**Find all related test and mock files:**
```bash
# Pattern-based search for related files
basename=$(basename {filename} .js)
dirname=$(dirname superset-frontend/{filename})
# Find test files
find "$dirname" -name "${basename}.test.js" -o -name "${basename}.test.jsx"
find "$dirname" -name "${basename}.spec.js" -o -name "${basename}.spec.jsx"
# Find mock files
find "$dirname" -name "__mocks__/${basename}.js"
find "$dirname" -name "${basename}.mock.js"
```
**Migration Requirement:** All discovered related files MUST be migrated together as one atomic unit.
**Test File Creation:** If NO test files exist for the core file, CREATE a minimal test file using the following pattern (a sketch follows this list):
- Location: Same directory as core file
- Name: `{basename}.test.ts` (e.g., `DebouncedMessageQueue.test.ts`)
- Content: Basic test structure importing and testing the main functionality
- Use proper TypeScript types in test file
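A minimal sketch of such a created test file, assuming a hypothetical `formatLabel` core module (the real module name and API come from the file being migrated):
```typescript
// src/utils/formatLabel.test.ts -- hypothetical core module, for illustration only
import { formatLabel } from './formatLabel';

describe('formatLabel', () => {
  it('returns a typed string for a known input', () => {
    const result: string = formatLabel('revenue');
    expect(result).toBe('Revenue');
  });

  it('handles empty input without throwing', () => {
    expect(() => formatLabel('')).not.toThrow();
  });
});
```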
---
## 🗺️ Type Reference Map
### From `@superset-ui/core`
```typescript
// Data & Query
QueryFormData, QueryData, JsonObject, AnnotationData, AdhocMetric
LatestQueryFormData, GenericDataType, DatasourceType, ExtraFormData
DataMaskStateWithId, NativeFilterScope, NativeFiltersState, NativeFilterTarget
// UI & Theme
FeatureFlagMap, LanguagePack, ColorSchemeConfig, SequentialSchemeConfig
```
### From `@superset-ui/chart-controls`
```typescript
Dataset, ColumnMeta, ControlStateMapping
```
### From Local Types (`src/types/`)
```typescript
// Authentication
User, UserWithPermissionsAndRoles, BootstrapUser, PermissionsAndRoles
// Dashboard
Dashboard, DashboardState, DashboardInfo, DashboardLayout, LayoutItem
ComponentType, ChartConfiguration, ActiveFilters
// Charts
Chart, ChartState, ChartStatus, ChartLinkedDashboard, Slice, SaveActionType
// Data
Datasource, Database, Owner, Role
// UI Components
TagType, FavoriteStatus, Filter, ImportResourceName
```
### From Domain Types
```typescript
// src/dashboard/types.ts
RootState, ChartsState, DatasourcesState, FilterBarOrientation
ChartCrossFiltersConfig, ActiveTabs, MenuKeys
// src/explore/types.ts
ExplorePageInitialData, ExplorePageState, ExploreResponsePayload, OptionSortType
// src/SqlLab/types.ts
[SQL Lab specific types]
```
---
## 🏗️ Type Organization Strategy
### Type Placement Hierarchy
1. **Component-Colocated** (90% of cases)
```typescript
// Same file as component
interface MyComponentProps {
title: string;
onClick: () => void;
}
```
2. **Feature-Shared**
```typescript
// src/[domain]/components/[Feature]/types.ts
export interface FilterConfiguration {
filterId: string;
targets: NativeFilterTarget[];
}
```
3. **Domain-Wide**
```typescript
// src/[domain]/types.ts
export interface ExploreFormData extends QueryFormData {
viz_type: string;
}
```
4. **Global**
```typescript
// src/types/[TypeName].ts
export interface ApiResponse<T> {
result: T;
count?: number;
}
```
### Type Discovery Commands
```bash
# Search existing types before creating
find superset-frontend/src -name "types.ts" -exec grep -l "[TypeConcept]" {} \;
grep -r "interface.*Props\|type.*Props" superset-frontend/src/
```
### Derivative Component Patterns
**Rule:** Components that extend others should extend their type interfaces.
```typescript
// ✅ Base component type
interface SelectProps {
value: string | number;
options: SelectOption[];
onChange: (value: string | number) => void;
disabled?: boolean;
}
// ✅ Derivative extends base
interface ChartSelectProps extends SelectProps {
charts: Chart[];
onChartSelect: (chart: Chart) => void;
}
// ✅ Derivative with modified props
interface DatabaseSelectProps extends Omit<SelectProps, 'value' | 'onChange'> {
value: number; // Narrowed type
onChange: (databaseId: number) => void; // Specific signature
}
```
**Common Patterns:**
- **Extension:** `extends BaseProps` - adds new props
- **Omission:** `Omit<BaseProps, 'prop'>` - removes props
- **Modification:** `Omit<BaseProps, 'prop'> & { prop: NewType }` - changes prop type
- **Restriction:** Override with narrower types (union → specific)
---
## 📋 Migration Recipe
### Step 2: File Conversion
```bash
# Use git mv to preserve history
git mv component.js component.ts
git mv Component.jsx Component.tsx
```
### Step 3: Import & Type Setup
```typescript
// Import order (enforced by linting)
import { FC, ReactNode } from 'react';
import { JsonObject, QueryFormData } from '@superset-ui/core';
import { Dataset } from '@superset-ui/chart-controls';
import type { Dashboard } from 'src/types/Dashboard';
```
### Step 4: Function & Component Typing
```typescript
// Functions with proper parameter/return types
export function processData(
data: Dataset[],
config: JsonObject
): ProcessedData[] {
// implementation
}
// Component props with inheritance
interface ComponentProps extends BaseProps {
data: Chart[];
onSelect: (id: number) => void;
}
const Component: FC<ComponentProps> = ({ data, onSelect }) => {
// implementation
};
```
### Step 5: State & Redux Typing
```typescript
// Hooks with specific types
const [data, setData] = useState<Chart[]>([]);
const [selected, setSelected] = useState<number | null>(null);
// Redux with existing RootState
const mapStateToProps = (state: RootState) => ({
charts: state.charts,
user: state.user,
});
```
---
## 🧠 Type Debugging Strategies (Real-World Learnings)
### The Evolution of Type Approaches
When you hit type errors, follow this debugging evolution:
#### 1. ❌ Idealized Union Types (First Attempt)
```typescript
// Looks clean but doesn't match reality
type DatasourceInput = Datasource | QueryEditor;
```
**Problem**: Real calling sites pass variations, not exact types.
#### 2. ❌ Overly Precise Types (Second Attempt)
```typescript
// Tried to match exact calling signatures
type DatasourceInput =
| IDatasource // From DatasourcePanel
| (QueryEditor & { columns: ColumnMeta[] }); // From SaveQuery
```
**Problem**: Too rigid, doesn't handle legacy variations.
#### 3. ✅ Flexible Interface (Final Solution)
```typescript
// Captures what the function actually needs
interface DatasourceInput {
name?: string | null; // Allow null for compatibility
datasource_name?: string | null; // Legacy variations
columns?: any[]; // Multiple column types accepted
database?: { id?: number };
// ... other optional properties
}
```
**Success**: Works with all calling sites, focuses on function needs.
### Type Debugging Process
1. **Start with compilation errors** - they show exact mismatches
2. **Examine actual usage** - look at calling sites, not idealized types
3. **Build flexible interfaces** - capture what functions need, not rigid contracts
4. **Iterate based on downstream validation** - let calling sites guide your types
---
## 🚨 Anti-Patterns to Avoid
```typescript
// ❌ Never use any
const obj: any = {};
// ✅ Use proper types
const obj: Record<string, JsonObject> = {};
// ❌ Don't recreate base component props
interface ChartSelectProps {
value: string; // Duplicated from SelectProps
onChange: () => void; // Duplicated from SelectProps
charts: Chart[]; // New prop
}
// ✅ Inherit and extend
interface ChartSelectProps extends SelectProps {
charts: Chart[]; // Only new props
}
// ❌ Don't create ad-hoc type variations
interface UserInfo {
name: string;
email: string;
}
// ✅ Extend existing types (DRY principle)
import { User } from 'src/types/bootstrapTypes';
type UserDisplayInfo = Pick<User, 'firstName' | 'lastName' | 'email'>;
// ❌ Don't create overly rigid unions
type StrictInput = ExactTypeA | ExactTypeB;
// ✅ Create flexible interfaces for function parameters
interface FlexibleInput {
// Focus on what the function actually needs
commonProperty: string;
optionalVariations?: any; // Allow for legacy variations
}
```
## 📍 DRY Type Guidelines (WHERE TYPES BELONG)
### Type Placement Rules
**CRITICAL**: Type variations must live close to where they belong, not scattered across files.
#### ✅ Proper Type Organization
```typescript
// ❌ Don't create one-off interfaces in utility files
// src/utils/datasourceUtils.ts
interface DatasourceInput { /* custom interface */ } // Wrong!
// ✅ Use existing types or extend them in their proper domain
// src/utils/datasourceUtils.ts
import { IDatasource } from 'src/explore/components/DatasourcePanel';
import { QueryEditor } from 'src/SqlLab/types';
// Create flexible interface that references existing types
interface FlexibleDatasourceInput {
// Properties that actually exist across variations
}
```
#### Type Location Hierarchy
1. **Domain Types**: `src/{domain}/types.ts` (dashboard, explore, SqlLab)
2. **Component Types**: Co-located with components
3. **Global Types**: `src/types/` directory
4. **Utility Types**: Only when they truly don't belong elsewhere
#### ✅ DRY Type Patterns
```typescript
// ✅ Extend existing domain types
interface SaveQueryData extends Pick<QueryEditor, 'sql' | 'dbId' | 'catalog'> {
columns: ColumnMeta[]; // Add what's needed
}
// ✅ Create flexible interfaces for cross-domain utilities
interface CrossDomainInput {
// Common properties that exist across different source types
name?: string | null; // Accommodate legacy null values
// Only include properties the function actually uses
}
```
---
## 🎯 PropTypes Auto-Generation (Elegant Approach)
**IMPORTANT**: Superset has `babel-plugin-typescript-to-proptypes` configured to automatically generate PropTypes from TypeScript interfaces. Use this instead of manual PropTypes duplication!
### ❌ Manual PropTypes Duplication (Avoid This)
```typescript
export interface MyComponentProps {
title: string;
count?: number;
}
// 8+ lines of manual PropTypes duplication 😱
const propTypes = PropTypes.shape({
title: PropTypes.string.isRequired,
count: PropTypes.number,
});
export default propTypes;
```
### ✅ Auto-Generated PropTypes (Use This)
```typescript
import { InferProps } from 'prop-types';
export interface MyComponentProps {
title: string;
count?: number;
}
// Single validator function - babel plugin auto-generates PropTypes! ✨
export default function MyComponentValidator(props: MyComponentProps) {
return null; // PropTypes auto-assigned by babel-plugin-typescript-to-proptypes
}
// Optional: For consumers needing PropTypes type inference
export type MyComponentPropsInferred = InferProps<typeof MyComponentValidator>;
```
### Migration Pattern for Type-Only Files
**When migrating type-only files with manual PropTypes:**
1. **Keep the TypeScript interfaces** (single source of truth)
2. **Replace manual PropTypes** with validator function
3. **Remove PropTypes imports** and manual shape definitions
4. **Add InferProps import** if type inference needed
**Example Migration:**
```typescript
// Before: 25+ lines with manual PropTypes duplication
export interface AdhocFilterType { /* ... */ }
const adhocFilterTypePropTypes = PropTypes.oneOfType([...]);
// After: 3 lines with auto-generation
export interface AdhocFilterType { /* ... */ }
export default function AdhocFilterValidator(props: { filter: AdhocFilterType }) {
return null; // Auto-generated PropTypes by babel plugin
}
```
### Component PropTypes Pattern
**For React components, the babel plugin works automatically:**
```typescript
interface ComponentProps {
title: string;
onClick: () => void;
}
const MyComponent: FC<ComponentProps> = ({ title, onClick }) => {
// Component implementation
};
// PropTypes automatically generated by babel plugin - no manual work needed!
export default MyComponent;
```
### Auto-Generation Benefits
- ✅ **Single source of truth**: TypeScript interfaces drive PropTypes
- ✅ **No duplication**: Eliminate 15-20 lines of manual PropTypes code
- ✅ **Automatic updates**: Changes to TypeScript automatically update PropTypes
- ✅ **Type safety**: Compile-time checking ensures PropTypes match interfaces
- ✅ **Backward compatibility**: Existing JavaScript components continue working
### Babel Plugin Configuration
The plugin is already configured in `babel.config.js`:
```javascript
['babel-plugin-typescript-to-proptypes', { loose: true }]
```
**No additional setup required** - just use TypeScript interfaces and the plugin handles the rest!
---
## 🧪 Test File Migration Patterns
### Test File Priority
- **Always migrate test files** alongside production files
- **Test files are often leaf nodes** - good starting candidates
- **Create tests if missing** - Leverage new TypeScript types for better test coverage
### Test-Specific Type Patterns
```typescript
// Mock interfaces for testing
interface MockStore {
getState: () => Partial<RootState>; // Partial allows minimal mocking
}
// Type-safe mocking for complex objects
const mockDashboardInfo = {
  id: 123,
  json_metadata: '{}',
} as Partial<DashboardInfo> as DashboardInfo;
// Sinon stub typing
let postStub: sinon.SinonStub;
beforeEach(() => {
postStub = sinon.stub(SupersetClient, 'post');
});
// Use stub reference instead of original method
expect(postStub.callCount).toBe(1);
expect(postStub.getCall(0).args[0].endpoint).toMatch('/api/');
```
### Test Migration Recipe
1. **Migrate production file first** (if both need migration)
2. **Update test imports** to point to `.ts/.tsx` files
3. **Add proper mock typing** using `Partial<T> as T` pattern
4. **Fix stub typing** - Use stub references, not original methods
5. **Verify all tests pass** with TypeScript compilation
---
## 🔧 Type Conflict Resolution
### Multiple Type Definitions Issue
**Problem**: Same type name defined in multiple files causes compilation errors.
**Example**: `DashboardInfo` defined in both:
- `src/dashboard/reducers/types.ts` (minimal)
- `src/dashboard/components/Header/types.ts` (different shape)
- `src/dashboard/types.ts` (complete - used by RootState)
### Resolution Strategy
1. **Identify the authoritative type**:
```bash
# Find which type is used by RootState/main interfaces
grep -r "DashboardInfo" src/dashboard/types.ts
```
2. **Use import from authoritative source**:
```typescript
// ✅ Import from main domain types
import { RootState, DashboardInfo } from 'src/dashboard/types';
// ❌ Don't import from component-specific files
import { DashboardInfo } from 'src/dashboard/components/Header/types';
```
3. **Mock complex types in tests**:
```typescript
// For testing - provide minimal required fields
const mockInfo = {
  id: 123,
  json_metadata: '{}',
  // Only provide fields actually used in the test
} as Partial<DashboardInfo> as DashboardInfo;
```
### Type Hierarchy Discovery Commands
```bash
# Find all definitions of a type
grep -r "interface.*TypeName\|type.*TypeName" src/
# Find import usage patterns
grep -r "import.*TypeName" src/
# Check what RootState uses
grep -A 10 -B 10 "TypeName" src/*/types.ts
```
---
## Agent Constraints (CRITICAL)
1. **Use git mv** - Run `git mv file.js file.ts` to preserve git history, but NO `git commit`
2. **NO global import changes** - Don't update imports across codebase
3. **Type files OK** - Can modify existing type files to improve/align types
4. **Single-File TypeScript Validation** (CRITICAL) - tsc has known issues with multi-file compilation:
- **Core Issue**: TypeScript's `tsc` has documented problems validating multiple files simultaneously in complex projects
- **Solution**: ALWAYS validate files one at a time using individual `tsc` calls
- **Command Pattern**: `cd superset-frontend && npx tscw --noEmit --allowJs --composite false --project tsconfig.json {single-file-path}`
- **Why**: Multi-file validation can produce false positives, miss real errors, and conflict during parallel agent execution
5. **Downstream Impact Validation** (CRITICAL) - Your migration affects calling sites:
- **Find downstream files**: `find superset-frontend/src -name "*.tsx" -o -name "*.ts" | xargs grep -l "your-core-filename" 2>/dev/null || echo "No files found"`
- **Validate each downstream file individually**: `cd superset-frontend && npx tscw --noEmit --allowJs --composite false --project tsconfig.json {each-downstream-file}`
- **Fix type mismatches** you introduced in calling sites
- **NEVER ignore downstream errors** - they indicate your types don't match reality
6. **Avoid Project-Wide Validation During Migration**:
- **NEVER use `npm run type`** during parallel agent execution - produces unreliable results
- **Single-file validation is authoritative** - trust individual file checks over project-wide scans
7. **ESLint validation** - Run `npm run eslint -- --fix {file}` for each migrated file to auto-fix formatting/linting issues
8. Zero `any` types - use proper TypeScript types
9. Search existing types before creating new ones
10. Follow patterns from this guide
---
## Success Report Format
```
SUCCESS: Atomic Migration of {core-filename}
## Files Migrated (Atomic Unit)
- Core: {core-filename} → {core-filename.ts/tsx}
- Tests: {list-of-test-files} → {list-of-test-files.ts/tsx} OR "CREATED: {basename}.test.ts"
- Mocks: {list-of-mock-files} → {list-of-mock-files.ts}
- Type files modified: {list-of-type-files}
## Types Created/Improved
- {TypeName}: {location} ({scope}) - {rationale}
- {ExistingType}: enhanced in {location} - {improvement-description}
## Documentation Recommendations
- ADD_TO_DIRECTORY: {TypeName} - {reason}
- NO_DOCUMENTATION: {TypeName} - {reason}
## Quality Validation
- **Single-File TypeScript Validation**: ✅ PASS - Core files individually validated
- Core file: `npx tscw --noEmit --allowJs --composite false --project tsconfig.json {core-file}`
- Test files: `npx tscw --noEmit --allowJs --composite false --project tsconfig.json {test-file}` (if exists)
- **Downstream Impact Check**: ✅ PASS - Found {N} files importing this module, all validate individually
- Downstream files: {list-of-files-that-import-your-module}
- Individual validation: `npx tscw --noEmit --allowJs --composite false --project tsconfig.json {each-downstream-file}`
- **ESLint validation**: ✅ PASS (using `npm run eslint -- --fix {files}` to auto-fix formatting)
- **Zero any types**: ✅ PASS
- **Local imports resolved**: ✅ PASS
- **Functionality preserved**: ✅ PASS
- **Tests pass** (if test file): ✅ PASS
- **Follow-up action required**: {YES/NO}
## Validation Strategy Notes
- **Single-file approach used**: Avoided multi-file tsc validation due to known TypeScript compilation issues
- **Project-wide validation skipped**: `npm run type` not used during parallel migration to prevent false positives
## Migration Learnings
- Type conflicts encountered: {describe any multiple type definitions}
- Mock patterns used: {describe test mocking approaches}
- Import hierarchy decisions: {note authoritative type sources used}
- PropTypes strategy: {AUTO_GENERATED via babel plugin | MANUAL_DUPLICATION_REMOVED | N/A}
## Improvement Suggestions for Documentation
- AGENT.md enhancement: {suggest additions to migration guide}
- Common pattern identified: {note reusable patterns for future migrations}
```
---
## Dependency Block Report Format
```
DEPENDENCY_BLOCK: Cannot migrate {filename}
## Blocking Dependencies
- {path}: {type} - {usage} - {priority}
## Impact Analysis
- Estimated types: {number}
- Expected locations: {list}
- Cross-domain: {YES/NO}
## Recommended Order
{ordered-list}
```
---
## 📚 Quick Reference
**Type Utilities:**
- `Record<K, V>` - Object with specific key/value types
- `Partial<T>` - All properties optional
- `Pick<T, K>` - Subset of properties
- `Omit<T, K>` - Exclude specific properties
- `NonNullable<T>` - Exclude null/undefined
**Event Types:**
- `MouseEvent<HTMLButtonElement>`
- `ChangeEvent<HTMLInputElement>`
- `FormEvent<HTMLFormElement>`
**React Types:**
- `FC<Props>` - Functional component
- `ReactNode` - Any renderable content
- `CSSProperties` - Style objects
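A compact sketch tying these together; the `Chart` shape below is illustrative only, not the real Superset type:
```typescript
import { ChangeEvent } from 'react';

// Illustrative shape only; the real Chart type lives under src/types
interface Chart {
  id: number;
  slice_name: string;
  description?: string | null;
}

export type ChartSummary = Pick<Chart, 'id' | 'slice_name'>;     // subset of properties
export type ChartDraft = Partial<Chart>;                          // every property optional
export type ChartWithoutId = Omit<Chart, 'id'>;                   // drop a property
export type ChartIndex = Record<number, ChartSummary>;            // keyed lookup
export type ChartDescription = NonNullable<Chart['description']>; // strip null/undefined

// Event type on a plain handler; the same types apply inside components
export function handleRename(event: ChangeEvent<HTMLInputElement>): ChartDraft {
  return { slice_name: event.target.value };
}
```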
---
**Remember:** Every type should add value and clarity. The goal is meaningful type safety that catches bugs and improves developer experience.

View File

@@ -1,199 +0,0 @@
# JS-to-TS Coordinator Workflow
**Role:** Strategic migration coordination - select leaf-node files, trigger agents, review results, handle integration, manage dependencies.
---
## 1. Core File Selection Strategy
**Target ONLY Core Files**: Coordinators identify core files (production code), agents handle related tests/mocks atomically.
**File Analysis Commands**:
```bash
# Find CORE files with no JS/JSX dependencies (exclude tests/mocks) - SIZE PRIORITIZED
find superset-frontend/src -name "*.js" -o -name "*.jsx" | grep -v "test\|spec\|mock" | xargs wc -l | sort -n | head -20
# Alternative: Get file sizes in lines with paths
find superset-frontend/src -name "*.js" -o -name "*.jsx" | grep -v "test\|spec\|mock" | while read file; do
lines=$(wc -l < "$file")
echo "$lines $file"
done | sort -n | head -20
# Check dependencies for core files only (start with smallest)
for file in <core-files-sorted-by-size>; do
echo "=== $file ($(wc -l < "$file") lines) ==="
grep -E "from '\.\./.*\.jsx?'|from '\./.*\.jsx?'|from 'src/.*\.jsx?'" "$file" || echo "✅ LEAF CANDIDATE"
done
# Identify heavily imported files (migrate last)
grep -r "from.*utils/common" superset-frontend/src/ | wc -l
# Quick leaf analysis with size priority
find superset-frontend/src -name "*.js" -o -name "*.jsx" | grep -v "test\|spec\|mock" | head -30 | while read file; do
deps=$(grep -E "from '\.\./.*\.jsx?'|from '\./.*\.jsx?'|from 'src/.*\.jsx?'" "$file" | wc -l)
lines=$(wc -l < "$file")
if [ "$deps" -eq 0 ]; then
echo "✅ LEAF: $lines lines - $file"
fi
done | sort -n
```
**Priority Order** (Smallest files first for easier wins):
1. **Small leaf files** (<50 lines) - No JS/JSX imports, quick TypeScript conversion
2. **Medium leaf files** (50-200 lines) - Self-contained utilities and helpers
3. **Small dependency files** (<100 lines) - Import only already-migrated files
4. **Larger components** (200+ lines) - Complex but well-contained functionality
5. **Core foundational files** (utils/common.js, controls.jsx) - migrate last regardless of size
**Size-First Benefits**:
- Faster completion builds momentum
- Earlier validation of migration patterns
- Easier rollback if issues arise
- Better success rate for agent learning
**Migration Unit**: Each agent call migrates:
- 1 core file (primary target)
- All related `*.test.js/jsx` files
- All related `*.mock.js` files
- All related `__mocks__/` files
---
## 2. Task Creation & Agent Control
### Task Triggering
When triggering the `/js-to-ts` command:
- **Task Title**: Use the core filename as the task title (e.g., "DebouncedMessageQueue.js migration", "hostNamesConfig.js migration")
- **Task Description**: Include the full relative path to help agent locate the file
- **Reference**: Point agent to [AGENT.md](./AGENT.md) for technical instructions
### Post-Processing Workflow
After each agent completes:
1. **Review Agent Report**: Always read and analyze the complete agent report
2. **Share Summary**: Provide user with key highlights from agent's work:
- Files migrated (core + tests/mocks)
- Types created or improved
- Any validation issues or coordinator actions needed
3. **Quality Assessment**: Evaluate agent's TypeScript implementation against criteria:
   - **Type Usage**: Proper types used, no `any` types
   - **Type Filing**: Types placed in correct hierarchy (component → feature → domain → global)
   - **Side Effects**: No unintended changes to other files
   - **Import Alignment**: Proper `.ts`/`.tsx` import extensions
4. **Integration Decision**:
- **COMMIT**: If agent work is complete and high quality
- **FIX & COMMIT**: If minor issues need coordinator fixes
- **ROLLBACK**: If major issues require complete rework
5. **Next Action**: Ask user preference - commit this work or trigger next migration
---
## 3. Integration Decision Framework
**Automatic Integration** ✅:
- `npm run type` passes without errors
- Agent created clean TypeScript with proper types
- Types appropriately filed in hierarchy
**Coordinator Integration** (Fix Side-Effects) 🔧:
- `npm run type` fails BUT agent's work is high quality
- Good type usage, proper patterns, well-organized
- Side-effects are manageable TypeScript compilation errors
- **Coordinator Action**: Integrate the change, then fix global compilation issues
**Rollback Only** ❌:
- Agent introduced `any` types or poor type choices
- Types poorly organized or conflicting with existing patterns
- Fundamental approach issues requiring complete rework
**Integration Process**:
1. **Review**: Agent already used `git mv` to preserve history
2. **Fix Side-Effects**: Update dependent files with proper import extensions
3. **Resolve Types**: Fix any cascading type issues across codebase
4. **Validate**: Ensure `npm run type` passes after fixes
---
## 4. Common Integration Patterns
**Common Side-Effects (Expect These)**:
- **Type import conflicts**: Multiple definitions of same type name
- **Mock object typing**: Tests need complete type satisfaction
- **Stub method references**: Use stub vars instead of original methods
**Coordinator Fixes (Standard Process)**:
1. **Import Resolution**:
```bash
# Find authoritative type source
grep -r "TypeName" src/*/types.ts
# Import from domain types (src/dashboard/types.ts) not component types
```
2. **Test Mock Completion**:
```typescript
// Use Partial<T> as T pattern for minimal mocking
const mockDashboard = {
  id: 123,
  json_metadata: '{}',
} as Partial<DashboardInfo> as DashboardInfo;
```
3. **Stub Reference Fixes**:
```typescript
// ✅ Use stub variable
expect(postStub.callCount).toBe(1);
// ❌ Don't use original method
expect(SupersetClient.post.callCount).toBe(1);
```
4. **Validation Commands**:
```bash
npm run type # TypeScript compilation
npm test -- filename # Test functionality
git status # Should show rename, not add/delete
```
---
## 5. File Categories for Planning
### Leaf Files (Start Here)
**Self-contained files with minimal JS/JSX dependencies**:
- Test files (80 files) - Usually only import the file being tested
- Utility files without internal dependencies
- Components importing only external libraries
### Heavily Imported Files (Migrate Last)
**Core files that many others depend on**:
- `utils/common.js` - Core utility functions
- `utils/reducerUtils.js` - Redux helpers
- `@superset-ui/core` equivalent files
- Major state management files (`explore/store.js`, `dashboard/actions/`)
### Complex Components (Middle Priority)
**Large files requiring careful type analysis**:
- `components/Datasource/DatasourceEditor.jsx` (1,809 lines)
- `explore/components/controls/AnnotationLayerControl/AnnotationLayer.jsx` (1,031 lines)
- `explore/components/ExploreViewContainer/index.jsx` (911 lines)
---
## 6. Success Metrics & Continuous Improvement
**Per-File Gates**:
- ✅ `npm run type` passes after each migration
- ✅ Zero `any` types introduced
- ✅ All imports properly typed
- ✅ Types filed in correct hierarchy
**Linear Scheduling**:
When agents report `DEPENDENCY_BLOCK`:
- Queue dependencies in linear order
- Process one file at a time to avoid conflicts
- Handle cascading type changes between files
**After Each Migration**:
1. **Update guides** with new patterns discovered
2. **Document coordinator fixes** that become common
3. **Enhance agent instructions** based on recurring issues
4. **Track success metrics** - automatic vs coordinator integration rates

View File

@@ -1,76 +0,0 @@
# JavaScript to TypeScript Migration Project
Progressive migration of 219 JS/JSX files to TypeScript in Apache Superset frontend.
## 📁 Project Documentation
- **[AGENT.md](./AGENT.md)** - Complete technical migration guide for agents (includes type reference, patterns, validation)
- **[COORDINATOR.md](./COORDINATOR.md)** - Strategic workflow for coordinators (file selection, task management, integration)
## 🎯 Quick Start
**For Agents:** Read [AGENT.md](./AGENT.md) for complete migration instructions
**For Coordinators:** Read [COORDINATOR.md](./COORDINATOR.md) for workflow and [AGENT.md](./AGENT.md) for supervision
**Command:** `/js-to-ts <filename>` - See [../../commands/js-to-ts.md](../../commands/js-to-ts.md)
## 📊 Migration Progress
**Scope**: 219 files total (112 JS + 107 JSX)
- Production files: 139 (63%)
- Test files: 80 (37%)
**Strategy**: Leaf-first migration with dependency-aware coordination
### Completed Migrations ✅
1. **roundDecimal** - `plugins/legacy-plugin-chart-map-box/src/utils/roundDecimal.js`
- Migrated core + test files
- Added proper TypeScript function signature with optional precision parameter
- All tests pass
2. **timeGrainSqlaAnimationOverrides** - `src/explore/controlPanels/timeGrainSqlaAnimationOverrides.js`
- Migrated to TypeScript with ControlPanelState and Dataset types
- Added TimeGrainOverrideState interface for return type
- Used type guards for safe property access
3. **DebouncedMessageQueue** - `src/utils/DebouncedMessageQueue.js`
- Migrated to TypeScript with proper generics
- Created DebouncedMessageQueueOptions interface
- **CREATED test file** with 4 comprehensive test cases
- Excellent class property typing with private/readonly modifiers
**Files Migrated**: 3/219 (1.4%)
**Tests Created**: 2 (roundDecimal had existing, DebouncedMessageQueue created)
### Next Candidates (Leaf Nodes) 🎯
**Identified leaf files with no JS/JSX dependencies:**
- `src/utils/hostNamesConfig.js` - Domain configuration utility
- `src/explore/controlPanels/Separator.js` - Control panel configuration
- `src/middleware/loggerMiddleware.js` - Logging middleware
**Migration Quality**: All completed migrations have:
- ✅ Zero `any` types
- ✅ Proper TypeScript compilation
- ✅ ESLint validation passed
- ✅ Test coverage (created where missing)
---
## 📈 Success Metrics
**Per-File Gates**:
- ✅ `npm run type` passes after each migration
- ✅ Zero `any` types introduced
- ✅ All imports properly typed
- ✅ Types filed in correct hierarchy
**Overall Progress**:
- **Automatic Integration Rate**: 100% (3/3 migrations required no coordinator fixes)
- **Test Coverage**: Improved (1 new test file created)
- **Type Safety**: Enhanced with proper interfaces and generics
---
*This is a claudette-managed progressive refactor. All documentation and coordination resources are organized under `.claude/projects/js-to-ts/`*

View File

@@ -1,15 +0,0 @@
{
"hooks": {
"PreToolUse": [
{
"matcher": "Bash",
"hooks": [
{
"type": "command",
"command": "jq -r '.tool_input.command // \"\"' | grep -qE '^git commit' && cd \"$CLAUDE_PROJECT_DIR\" && echo '🔍 Running pre-commit before commit...' && pre-commit run || true"
}
]
}
]
}
}

View File

@@ -1,36 +0,0 @@
# .coveragerc to control coverage.py
[run]
branch = True
source = superset
# omit = bad_file.py
[paths]
source =
superset/
*/site-packages/
[report]
# Regexes for lines to exclude from consideration
exclude_lines =
# Have to re-enable the standard pragma
pragma: no cover
# Don't complain about missing debug-only code:
def __repr__
if self\.debug
# Don't complain if tests don't hit defensive assertion code:
raise AssertionError
raise NotImplementedError
# Don't complain if non-runnable code isn't run:
if 0:
if __name__ == .__main__.:
# Ignore importlib backport
from importlib
if TYPE_CHECKING:
#fail_under = 100
show_missing = True

View File

@@ -1,125 +0,0 @@
---
description: Apache Superset development standards and guidelines for Cursor IDE
globs: ["**/*.py", "**/*.ts", "**/*.tsx", "**/*.js", "**/*.jsx", "**/*.sql", "**/*.md"]
alwaysApply: true
---
# Apache Superset Development Standards for Cursor IDE
Apache Superset is a data visualization platform with Flask/Python backend and React/TypeScript frontend.
## ⚠️ CRITICAL: Ongoing Refactors (What NOT to Do)
**These migrations are actively happening - avoid deprecated patterns:**
### Frontend Modernization
- **NO `any` types** - Use proper TypeScript types
- **NO JavaScript files** - Convert to TypeScript (.ts/.tsx)
- **NO Enzyme** - Use React Testing Library/Jest (Enzyme fully removed)
- **Use @superset-ui/core** - Don't import Ant Design directly
### Testing Strategy Migration
- **Prefer unit tests** over integration tests
- **Prefer integration tests** over Cypress end-to-end tests
- **Cypress is last resort** - Actively moving away from Cypress
- **Use Jest + React Testing Library** for component testing
### Backend Type Safety
- **Add type hints** - All new Python code needs proper typing
- **MyPy compliance** - Run `pre-commit run mypy` to validate
- **SQLAlchemy typing** - Use proper model annotations
## Code Standards
### TypeScript Frontend
- **NO `any` types** - Use proper TypeScript
- **Functional components** with hooks
- **@superset-ui/core** for UI components (not direct antd)
- **Jest** for testing (NO Enzyme)
- **Redux** for global state, hooks for local (a minimal component sketch follows this list)
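The sketch below is illustrative only: the component, its props, and the translated string are made up, and a real implementation would pull shared UI elements from `@superset-ui/core` rather than raw HTML.

```tsx
// RowCounter.tsx — illustrative only; the component and props are hypothetical.
import { useState } from 'react';
import { t } from '@superset-ui/core';

interface RowCounterProps {
  label: string;
  initialCount?: number;
}

export default function RowCounter({ label, initialCount = 0 }: RowCounterProps) {
  // Local UI state lives in a hook; app-wide state would live in Redux instead.
  const [count, setCount] = useState<number>(initialCount);
  return (
    <button type="button" onClick={() => setCount(count + 1)}>
      {label}: {count} {t('rows')}
    </button>
  );
}
```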
### Python Backend
- **Type hints required** for all new code
- **MyPy compliant** - run `pre-commit run mypy`
- **SQLAlchemy models** with proper typing
- **pytest** for testing
### Apache License Headers
- **New files require ASF license headers** - When creating new code files, include the standard Apache Software Foundation license header
- **LLM instruction files are excluded** - Files like LLMS.md, CLAUDE.md, etc. are in `.rat-excludes` to avoid header token overhead
## Key Directory Structure
```
superset/
├── superset/ # Python backend (Flask, SQLAlchemy)
│ ├── views/api/ # REST API endpoints
│ ├── models/ # Database models
│ └── connectors/ # Database connections
├── superset-frontend/src/ # React TypeScript frontend
│ ├── components/ # Reusable components
│ ├── explore/ # Chart builder
│ ├── dashboard/ # Dashboard interface
│ └── SqlLab/ # SQL editor
├── superset-frontend/packages/
│ └── superset-ui-core/ # UI component library (USE THIS)
├── tests/ # Python/integration tests
├── docs/ # Documentation (UPDATE FOR CHANGES)
└── UPDATING.md # Breaking changes log
```
## Architecture Patterns
### Dataset-Centric Approach
Charts are built from enriched datasets containing (see the sketch after this list):
- Dimension columns with labels/descriptions
- Predefined metrics as SQL expressions
- Self-service analytics within defined contexts
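As a rough illustration of that shape (the field names below are assumptions, not Superset's actual dataset model):

```typescript
// Illustrative only — these interfaces approximate the idea, not the real schema.
interface DimensionColumn {
  columnName: string;
  verboseName?: string; // human-friendly label surfaced to chart builders
  description?: string;
}

interface PredefinedMetric {
  metricName: string;
  expression: string; // SQL expression, e.g. "SUM(sales)"
  description?: string;
}

interface EnrichedDataset {
  name: string;
  columns: DimensionColumn[];
  metrics: PredefinedMetric[];
}

// Charts reference these fields instead of raw SQL, keeping analytics
// self-service within the context the dataset defines.
export const salesDataset: EnrichedDataset = {
  name: 'cleaned_sales_data',
  columns: [{ columnName: 'region', verboseName: 'Region' }],
  metrics: [{ metricName: 'total_sales', expression: 'SUM(sales)' }],
};
```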
### Security & Features
- **RBAC**: Role-based access via Flask-AppBuilder
- **Feature flags**: Control feature rollouts
- **Row-level security**: SQL-based data access control
## Test Utilities
### Python Test Helpers
- **`SupersetTestCase`** - Base class in `tests/integration_tests/base_tests.py`
- **`@with_config`** - Config mocking decorator
- **`@with_feature_flags`** - Feature flag testing
- **`login_as()`, `login_as_admin()`** - Authentication helpers
- **`create_dashboard()`, `create_slice()`** - Data setup utilities
### TypeScript Test Helpers
- **`superset-frontend/spec/helpers/testing-library.tsx`** - Custom render() with providers
- **`createWrapper()`** - Redux/Router/Theme wrapper
- **`selectOption()`** - Select component helper
- **React Testing Library** - NO Enzyme (removed); a minimal test sketch follows this list
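A minimal test sketch using these helpers; the component under test and the render options are assumptions based on the descriptions above, not a copy of an existing test:

```tsx
// MyPanel.test.tsx — sketch only; MyPanel and the render options are hypothetical.
import { render, screen } from 'spec/helpers/testing-library';
import userEvent from '@testing-library/user-event';
import MyPanel from './MyPanel';

test('shows saved filters after clicking refresh', async () => {
  // The custom render wraps the component with Redux/Router/Theme providers.
  render(<MyPanel />, { useRedux: true, useRouter: true });
  await userEvent.click(screen.getByRole('button', { name: 'Refresh' }));
  expect(await screen.findByText('Saved filters')).toBeInTheDocument();
});
```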
## Pre-commit Validation
**Use pre-commit hooks for quality validation:**
```bash
# Install hooks
pre-commit install
# Quick validation (faster than --all-files)
pre-commit run # Staged files only
pre-commit run mypy # Python type checking
pre-commit run prettier # Code formatting
pre-commit run eslint # Frontend linting
```
## Development Guidelines
- **Documentation**: Update docs/ for any user-facing changes
- **Breaking Changes**: Add to UPDATING.md
- **Docstrings**: Required for new functions/classes
- **Follow existing patterns**: Mimic code style, use existing libraries and utilities
- **Type Safety**: This codebase is actively modernizing toward full TypeScript and type safety
- **Always run `pre-commit run`** to validate changes before committing
---
**Note**: This codebase is actively modernizing toward full TypeScript and type safety. Always run `pre-commit run` to validate changes. Follow the ongoing refactors section to avoid deprecated patterns.

View File

@@ -1,20 +0,0 @@
# Keep this in sync with the base image in the main Dockerfile (ARG PY_VER)
FROM python:3.11.13-trixie AS base
# Install system dependencies that Superset needs
# This layer will be cached across Codespace sessions
RUN apt-get update && apt-get install -y \
libsasl2-dev \
libldap2-dev \
libpq-dev \
tmux \
gh \
&& rm -rf /var/lib/apt/lists/*
# Install uv for fast Python package management
# This will also be cached in the image
RUN curl -LsSf https://astral.sh/uv/install.sh | sh && \
echo 'export PATH="/root/.cargo/bin:$PATH"' >> /etc/bash.bashrc
# Set the cargo/bin directory in PATH for all users
ENV PATH="/root/.cargo/bin:${PATH}"

View File

@@ -1,5 +0,0 @@
# Superset Development with GitHub Codespaces
For complete documentation on using GitHub Codespaces with Apache Superset, please see:
**[Setting up a Development Environment - GitHub Codespaces](https://superset.apache.org/docs/contributing/development#github-codespaces-cloud-development)**

View File

@@ -1,62 +0,0 @@
# Superset Codespaces environment setup
# This file is appended to ~/.bashrc during Codespace setup
# Find the workspace directory (handles both 'superset' and 'superset-2' names)
WORKSPACE_DIR=$(find /workspaces -maxdepth 1 -name "superset*" -type d | head -1)
if [ -n "$WORKSPACE_DIR" ]; then
# Check if virtual environment exists
if [ -d "$WORKSPACE_DIR/.venv" ]; then
# Activate the virtual environment
source "$WORKSPACE_DIR/.venv/bin/activate"
echo "✅ Python virtual environment activated"
# Verify pre-commit is installed and set up
if command -v pre-commit &> /dev/null; then
echo "✅ pre-commit is available ($(pre-commit --version))"
# Install git hooks if not already installed
if [ -d "$WORKSPACE_DIR/.git" ] && [ ! -f "$WORKSPACE_DIR/.git/hooks/pre-commit" ]; then
echo "🪝 Installing pre-commit hooks..."
cd "$WORKSPACE_DIR" && pre-commit install
fi
else
echo "⚠️ pre-commit not found. Run: pip install pre-commit"
fi
else
echo "⚠️ Python virtual environment not found at $WORKSPACE_DIR/.venv"
echo " Run: cd $WORKSPACE_DIR && .devcontainer/setup-dev.sh"
fi
# Always cd to the workspace directory for convenience
cd "$WORKSPACE_DIR"
fi
# Add helpful aliases for Superset development
alias start-superset="$WORKSPACE_DIR/.devcontainer/start-superset.sh"
alias setup-dev="$WORKSPACE_DIR/.devcontainer/setup-dev.sh"
# Show helpful message on login
echo ""
echo "🚀 Superset Codespaces Environment"
echo "=================================="
# Check if Superset is running
if docker ps 2>/dev/null | grep -q "superset"; then
echo "✅ Superset is running!"
echo " - Check the 'Ports' tab for your live Superset URL"
echo " - Initial startup takes 10-20 minutes"
echo " - Login: admin/admin"
else
echo "⚠️ Superset is not running. Use: start-superset"
# Check if there's a startup log
if [ -f "/tmp/superset-startup.log" ]; then
echo " 📋 Startup log found: cat /tmp/superset-startup.log"
fi
fi
echo ""
echo "Quick commands:"
echo " start-superset - Start Superset with Docker Compose"
echo " setup-dev - Set up Python environment (if not already done)"
echo " pre-commit run - Run pre-commit checks on staged files"
echo ""

View File

@@ -1,20 +0,0 @@
#!/bin/bash
# Script to build and push the devcontainer image to GitHub Container Registry
# This allows caching the image between Codespace sessions
# You'll need to run this with appropriate GitHub permissions
# gh auth login --scopes write:packages
REGISTRY="ghcr.io"
OWNER="apache"
REPO="superset"
TAG="devcontainer-base"
echo "Building devcontainer image..."
docker build -t $REGISTRY/$OWNER/$REPO:$TAG .devcontainer/
echo "Pushing to GitHub Container Registry..."
docker push $REGISTRY/$OWNER/$REPO:$TAG
echo "Done! Update .devcontainer/devcontainer.json to use:"
echo " \"image\": \"$REGISTRY/$OWNER/$REPO:$TAG\""

View File

@@ -1,19 +0,0 @@
{
// Extend the base configuration
"extends": "../devcontainer-base.json",
"name": "Apache Superset Development (Default)",
// Forward ports for development
"forwardPorts": [9001],
"portsAttributes": {
"9001": {
"label": "Superset (via Webpack Dev Server)",
"onAutoForward": "notify",
"visibility": "public"
}
},
// Auto-start Superset on Codespace resume
"postStartCommand": ".devcontainer/start-superset.sh"
}

View File

@@ -1,39 +0,0 @@
{
"name": "Apache Superset Development",
// Keep this in sync with the base image in Dockerfile (ARG PY_VER)
// Using the same base as Dockerfile, but non-slim for dev tools
"image": "python:3.11.13-bookworm",
"features": {
"ghcr.io/devcontainers/features/docker-in-docker:2": {
"moby": true,
"dockerDashComposeVersion": "v2"
},
"ghcr.io/devcontainers/features/node:1": {
"version": "20"
},
"ghcr.io/devcontainers/features/git:1": {},
"ghcr.io/devcontainers/features/common-utils:2": {
"configureZshAsDefaultShell": true
},
"ghcr.io/devcontainers/features/sshd:1": {
"version": "latest"
}
},
// Run commands after container is created
"postCreateCommand": "chmod +x .devcontainer/setup-dev.sh && .devcontainer/setup-dev.sh",
// VS Code customizations
"customizations": {
"vscode": {
"extensions": [
"ms-python.python",
"ms-python.vscode-pylance",
"charliermarsh.ruff",
"dbaeumer.vscode-eslint",
"esbenp.prettier-vscode"
]
}
}
}

View File

@@ -1,66 +0,0 @@
{
"name": "Apache Superset Development",
// Option 1: Use pre-built image directly
// "image": "ghcr.io/apache/superset:devcontainer-base",
// Option 2: Build from Dockerfile with cache (current approach)
"build": {
"dockerfile": "Dockerfile",
"context": ".",
// Cache from the Apache registry image
"cacheFrom": ["ghcr.io/apache/superset:devcontainer-base"]
},
"features": {
"ghcr.io/devcontainers/features/docker-in-docker:2": {
"moby": false,
"dockerDashComposeVersion": "v2"
},
"ghcr.io/devcontainers/features/node:1": {
"version": "20"
},
"ghcr.io/devcontainers/features/git:1": {},
"ghcr.io/devcontainers/features/common-utils:2": {
"configureZshAsDefaultShell": true
},
"ghcr.io/devcontainers/features/sshd:1": {
"version": "latest"
}
},
// Forward ports for development
"forwardPorts": [9001],
"portsAttributes": {
"9001": {
"label": "Superset (via Webpack Dev Server)",
"onAutoForward": "notify",
"visibility": "public"
}
},
// Run commands after container is created
"postCreateCommand": "bash .devcontainer/setup-dev.sh || echo '⚠️ Setup had issues - run .devcontainer/setup-dev.sh manually'",
// Auto-start Superset after ensuring Docker is ready
// Run in foreground to see any errors, but don't block on failures
"postStartCommand": "bash -c 'echo \"Waiting 30s for services to initialize...\"; sleep 30; .devcontainer/start-superset.sh || echo \"⚠️ Auto-start failed - run start-superset manually\"'",
// Set environment variables
"remoteEnv": {
// Removed automatic venv activation to prevent startup issues
// The setup script will handle this
},
// VS Code customizations
"customizations": {
"vscode": {
"extensions": [
"ms-python.python",
"ms-python.vscode-pylance",
"charliermarsh.ruff",
"dbaeumer.vscode-eslint",
"esbenp.prettier-vscode"
]
}
}
}

View File

@@ -1,32 +0,0 @@
#!/bin/bash
# Setup script for Superset Codespaces development environment
echo "🔧 Setting up Superset development environment..."
# The universal image has most tools, just need Superset-specific libs
echo "📦 Installing Superset-specific dependencies..."
sudo apt-get update
sudo apt-get install -y \
libsasl2-dev \
libldap2-dev \
libpq-dev \
tmux \
gh
# Install uv for fast Python package management
echo "📦 Installing uv..."
curl -LsSf https://astral.sh/uv/install.sh | sh
# Add cargo/bin to PATH for uv
echo 'export PATH="$HOME/.cargo/bin:$PATH"' >> ~/.bashrc
echo 'export PATH="$HOME/.cargo/bin:$PATH"' >> ~/.zshrc
# Install Claude Code CLI via npm
echo "🤖 Installing Claude Code..."
npm install -g @anthropic-ai/claude-code
# Make the start script executable
chmod +x .devcontainer/start-superset.sh
echo "✅ Development environment setup complete!"
echo "🚀 Run '.devcontainer/start-superset.sh' to start Superset"

View File

@@ -1,69 +0,0 @@
#!/bin/bash
# Startup script for Superset in Codespaces
echo "🚀 Starting Superset in Codespaces..."
echo "🌐 Frontend will be available at port 9001"
# Check if MCP is enabled
if [ "$ENABLE_MCP" = "true" ]; then
echo "🤖 MCP Service will be available at port 5008"
fi
# Find the workspace directory (Codespaces clones as 'superset', not 'superset-2')
WORKSPACE_DIR=$(find /workspaces -maxdepth 1 -name "superset*" -type d | head -1)
if [ -n "$WORKSPACE_DIR" ]; then
cd "$WORKSPACE_DIR"
echo "📁 Working in: $WORKSPACE_DIR"
else
echo "📁 Using current directory: $(pwd)"
fi
# Check if docker is running
if ! docker info > /dev/null 2>&1; then
echo "⏳ Waiting for Docker to start..."
sleep 5
fi
# Clean up any existing containers
echo "🧹 Cleaning up existing containers..."
docker-compose -f docker-compose-light.yml --profile mcp down
# Start services
echo "🏗️ Building and starting services..."
echo ""
echo "📝 Once started, login with:"
echo " Username: admin"
echo " Password: admin"
echo ""
echo "📋 Running in foreground with live logs (Ctrl+C to stop)..."
# Run docker-compose and capture exit code
if [ "$ENABLE_MCP" = "true" ]; then
echo "🤖 Starting with MCP Service enabled..."
docker-compose -f docker-compose-light.yml --profile mcp up
else
docker-compose -f docker-compose-light.yml up
fi
EXIT_CODE=$?
# If it failed, provide helpful instructions
if [ $EXIT_CODE -ne 0 ] && [ $EXIT_CODE -ne 130 ]; then # 130 is Ctrl+C
echo ""
echo "❌ Superset startup failed (exit code: $EXIT_CODE)"
echo ""
echo "🔄 To restart Superset, run:"
echo " .devcontainer/start-superset.sh"
echo ""
echo "🔧 For troubleshooting:"
echo " # View logs:"
echo " docker-compose -f docker-compose-light.yml logs"
echo ""
echo " # Clean restart (removes volumes):"
echo " docker-compose -f docker-compose-light.yml down -v"
echo " .devcontainer/start-superset.sh"
echo ""
echo " # Common issues:"
echo " - Network timeouts: Just retry, often transient"
echo " - Port conflicts: Check 'docker ps'"
echo " - Database issues: Try clean restart with -v"
fi

View File

@@ -1,29 +0,0 @@
{
// Extend the base configuration
"extends": "../devcontainer-base.json",
"name": "Apache Superset Development with MCP",
// Forward ports for development
"forwardPorts": [9001, 5008],
"portsAttributes": {
"9001": {
"label": "Superset (via Webpack Dev Server)",
"onAutoForward": "notify",
"visibility": "public"
},
"5008": {
"label": "MCP Service (Model Context Protocol)",
"onAutoForward": "notify",
"visibility": "private"
}
},
// Auto-start Superset with MCP on Codespace resume
"postStartCommand": "ENABLE_MCP=true .devcontainer/start-superset.sh",
// Environment variables
"containerEnv": {
"ENABLE_MCP": "true"
}
}

View File

@@ -34,6 +34,7 @@
**/*.sqllite
**/*.swp
**/.terser-plugin-cache/
**/.storybook/
**/node_modules/
tests/
@@ -41,8 +42,6 @@ docs/
install/
superset-frontend/cypress-base/
superset-frontend/coverage/
superset-frontend/.temp_cache/
superset/static/assets/
superset-websocket/dist/
venv
.venv

View File

@@ -1,41 +0,0 @@
#
# Licensed to the Apache Software Foundation (ASF) under one or more
# contributor license agreements. See the NOTICE file distributed with
# this work for additional information regarding copyright ownership.
# The ASF licenses this file to You under the Apache License, Version 2.0
# (the "License"); you may not use this file except in compliance with
# the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
# Auto-configure Docker Compose for multi-instance support
# Requires direnv: https://direnv.net/
#
# Install: brew install direnv (or apt install direnv)
# Setup: Add 'eval "$(direnv hook bash)"' to ~/.bashrc (or ~/.zshrc)
# Allow: Run 'direnv allow' in this directory once
# Generate unique project name from directory
export COMPOSE_PROJECT_NAME=$(basename "$PWD" | tr '[:upper:]' '[:lower:]' | sed 's/[^a-z0-9]/-/g')
# Find available ports sequentially to avoid collisions
_is_free() { ! lsof -i ":$1" &>/dev/null 2>&1; }
_p=80; while ! _is_free $_p; do ((_p++)); done; export NGINX_PORT=$_p
_p=8088; while ! _is_free $_p; do ((_p++)); done; export SUPERSET_PORT=$_p
_p=9000; while ! _is_free $_p; do ((_p++)); done; export NODE_PORT=$_p
_p=8080; while ! _is_free $_p || [ $_p -eq $NGINX_PORT ]; do ((_p++)); done; export WEBSOCKET_PORT=$_p
_p=8081; while ! _is_free $_p || [ $_p -eq $WEBSOCKET_PORT ]; do ((_p++)); done; export CYPRESS_PORT=$_p
_p=5432; while ! _is_free $_p; do ((_p++)); done; export DATABASE_PORT=$_p
_p=6379; while ! _is_free $_p; do ((_p++)); done; export REDIS_PORT=$_p
unset _p _is_free
echo "🐳 Superset configured: http://localhost:$SUPERSET_PORT (dev: localhost:$NODE_PORT)"

View File

@@ -15,4 +15,4 @@
# limitations under the License.
#
FLASK_APP="superset.app:create_app()"
FLASK_DEBUG=true
FLASK_ENV="development"

.gitattributes vendored
View File

@@ -1,4 +1 @@
docker/**/*.sh text eol=lf
*.svg binary
*.ipynb binary
*.geojson binary

.github/CODEOWNERS vendored
View File

@@ -2,47 +2,18 @@
# https://github.com/apache/superset/issues/13351
/superset/migrations/ @mistercrunch @michael-s-molina @betodealmeida @eschutho @sadpandajoe
/superset/migrations/ @apache/superset-committers
# Notify some committers of changes in the components
# Notify Preset team when ephemeral env settings are changed
/superset-frontend/src/components/Select/ @michael-s-molina @geido @kgabryje
/superset-frontend/src/components/MetadataBar/ @michael-s-molina @geido @kgabryje
/superset-frontend/src/components/DropdownContainer/ @michael-s-molina @geido @kgabryje
.github/workflows/ecs-task-definition.json @robdiciuccio @craig-rueda @rusackas @eschutho @dpgaspar @nytai @mistercrunch
.github/workflows/docker-ephemeral-env.yml @robdiciuccio @craig-rueda @rusackas @eschutho @dpgaspar @nytai @mistercrunch
.github/workflows/ephemeral*.yml @robdiciuccio @craig-rueda @rusackas @eschutho @dpgaspar @nytai @mistercrunch
# Notify some committers of changes in the Select component
/superset-frontend/src/components/Select/ @michael-s-molina @geido @ktmud
# Notify Helm Chart maintainers about changes in it
/helm/superset/ @craig-rueda @dpgaspar @villebro @nytai @michael-s-molina @mistercrunch @rusackas @Antonio-RiveroMartnez
# Notify E2E test maintainers of changes
/superset-frontend/cypress-base/ @sadpandajoe @geido @eschutho @rusackas @betodealmeida @mistercrunch
# Notify PMC members of changes to GitHub Actions
/.github/ @villebro @geido @eschutho @rusackas @betodealmeida @nytai @mistercrunch @craig-rueda @kgabryje @dpgaspar @sadpandajoe @hainenber
# Notify PMC members of changes to CI-executed scripts (supply-chain risk:
# scripts/ files run directly in CI workflows and can execute arbitrary code)
/scripts/ @villebro @geido @eschutho @rusackas @betodealmeida @nytai @mistercrunch @craig-rueda @kgabryje @dpgaspar @sadpandajoe @hainenber
# Notify PMC members of changes to required GitHub Actions
/.asf.yaml @villebro @geido @eschutho @rusackas @betodealmeida @nytai @mistercrunch @craig-rueda @kgabryje @dpgaspar @Antonio-RiveroMartnez
# Maps are a finicky contribution process we care about
**/*.geojson @villebro @rusackas
/superset-frontend/plugins/legacy-plugin-chart-country-map/ @villebro @rusackas
# Notify PMC members of changes to extension-related files
/docs/developer_portal/extensions/ @michael-s-molina @villebro @rusackas
/superset-core/ @michael-s-molina @villebro @geido @eschutho @rusackas @kgabryje
/superset-extensions-cli/ @michael-s-molina @villebro @geido @eschutho @rusackas @kgabryje
/superset/core/ @michael-s-molina @villebro @geido @eschutho @rusackas @kgabryje
/superset/extensions/ @michael-s-molina @villebro @geido @eschutho @rusackas @kgabryje
/superset-frontend/src/packages/superset-core/ @michael-s-molina @villebro @geido @eschutho @rusackas @kgabryje
/superset-frontend/src/core/ @michael-s-molina @villebro @geido @eschutho @rusackas @kgabryje
/superset-frontend/src/extensions/ @michael-s-molina @villebro @geido @eschutho @rusackas @kgabryje
/helm/superset/ @craig-rueda @dpgaspar @villebro

View File

@@ -1,99 +0,0 @@
name: Bug report
description: Report a bug to improve Superset's stability
labels: ["bug"]
body:
- type: markdown
attributes:
value: |
Hello Superset Community member! Please keep things tidy by putting your post in the proper place:
🚨 Reporting a security issue: send an email to security@superset.apache.org. DO NOT USE GITHUB ISSUES TO REPORT SECURITY PROBLEMS.
🐛 Reporting a bug: use this form.
🙏 Asking a question or getting help: post in the [Superset Slack chat](http://bit.ly/join-superset-slack) or [GitHub Discussions](https://github.com/apache/superset/discussions) under "Q&A / Help".
💡 Requesting a new feature: Search [GitHub Discussions](https://github.com/apache/superset/discussions) to see if it exists already. If not, add a new post there under "Ideas".
- type: textarea
id: bug-description
attributes:
label: Bug description
description: A clear description of what the bug is, including reproduction steps and expected behavior.
placeholder: |
The bug is that...
1. Go to '...'
2. Click on '....'
3. Scroll down to '....'
4. See error
validations:
required: true
- type: textarea
id: screenshots-recordings
attributes:
label: Screenshots/recordings
description: If applicable, add screenshots or recordings to help explain your problem.
- type: markdown
attributes:
value: |
### Environment
Please specify your environment. If your environment does not match the alternatives, you need to upgrade your environment before submitting the issue as it may have already been fixed. For additional information about the releases, see [Release Process](https://github.com/apache/superset/wiki/Release-Process).
- type: dropdown
id: superset-version
attributes:
label: Superset version
options:
- master / latest-dev
- "6.0.0"
- "5.0.0"
validations:
required: true
- type: dropdown
id: python-version
attributes:
label: Python version
options:
- "3.9"
- "3.10"
- "3.11"
- Not applicable
- I don't know
validations:
required: true
- type: dropdown
id: node-version
attributes:
label: Node version
options:
- "16"
- "17"
- "18 or greater"
- Not applicable
- I don't know
validations:
required: true
- type: dropdown
id: browser
attributes:
label: Browser
options:
- Chrome
- Firefox
- Safari
- Not applicable
validations:
required: true
- type: textarea
id: additional-context
attributes:
label: Additional context
description: |
Add any other context about the problem here such as the feature flags that you have enabled, any customizations you have made, the data source you are querying, etc.
- type: checkboxes
id: checklist
attributes:
label: Checklist
description: Make sure to follow these steps before submitting your issue - thank you!
options:
- label: I have searched Superset docs and Slack and didn't find a solution to my problem.
- label: I have searched the GitHub issue tracker and didn't find a similar bug report.
- label: I have checked Superset's logs for errors and if I found a relevant Python stacktrace, I included it here as text in the "additional context" section.
validations:
required: true

.github/ISSUE_TEMPLATE/bug_report.md vendored Normal file
View File

@@ -0,0 +1,50 @@
---
name: Bug report
about: Create a report to help us improve
labels: "#bug"
---
A clear and concise description of what the bug is.
#### How to reproduce the bug
1. Go to '...'
2. Click on '....'
3. Scroll down to '....'
4. See error
### Expected results
What you expected to happen.
### Actual results
What actually happens.
#### Screenshots
If applicable, add screenshots to help explain your problem.
### Environment
(please complete the following information):
- browser type and version:
- superset version: `superset version`
- python version: `python --version`
- node.js version: `node -v`
- any feature flags active:
### Checklist
Make sure to follow these steps before submitting your issue - thank you!
- [ ] I have checked the superset logs for python stacktraces and included it here as text if there are any.
- [ ] I have reproduced the issue with at least the latest released version of superset.
- [ ] I have checked the issue tracker for the same issue and I haven't found one similar.
### Additional context
Add any other context about the problem here.

View File

@@ -1,12 +0,0 @@
---
blank_issues_enabled: false
contact_links:
- name: Feature Request
url: https://github.com/apache/superset/discussions/new?category=ideas
about: Propose a feature request to the Superset community
- name: Q&A
url: https://github.com/apache/superset/discussions/new?category=q-a-help
about: Open a community Q&A thread on GitHub Discussions
- name: Slack
url: https://bit.ly/join-superset-slack
about: Join the Superset Community on Slack for other discussions and assistance

View File

@@ -2,6 +2,7 @@
name: Cosmetic Issue
about: Describe a cosmetic issue with CSS, positioning, layout, labeling, or similar
labels: "cosmetic-issue"
---
## Screenshot

View File

@@ -0,0 +1,14 @@
---
name: Feature request
about: Suggest an idea for this project
labels: "#enhancement"
---
GitHub Discussions is our new home for discussing features and improvements!
https://github.com/apache/superset/discussions/categories/ideas
We'd like to keep GitHub Issues focused on bugs and SIPs (Superset Improvement Proposals)!
Please note that feature requests opened as GitHub Issues will be moved to Discussions.

View File

@@ -1,15 +1,14 @@
---
name: SIP
about: "Superset Improvement Proposal. See SIP-0 (https://github.com/apache/superset/issues/5602) for details. A SIP introduces any major change into Apache Superset's code or process."
labels: sip
title: "[SIP] Your Title Here (do not add SIP number)"
assignees: "apache/superset-committers"
about: Superset Improvement Proposal
labels: "#SIP"
---
*Please make sure you are familiar with the SIP process documented*
[here](https://github.com/apache/superset/issues/5602). The SIP will be numbered by a committer upon acceptance.
[here](https://github.com/apache/superset/issues/5602). The SIP number should be the next number after the latest SIP listed [here](https://github.com/apache/superset/issues?q=is%3Aissue+label%3Asip).
## [SIP] Proposal for ...<title>
## [SIP-\<number>] Proposal for <title>
### Motivation

.github/SECURITY.md vendored
View File

@@ -1,67 +0,0 @@
# Security Policy
This is a project of the [Apache Software Foundation](https://apache.org) and follows the
ASF [vulnerability handling process](https://apache.org/security/#vulnerability-handling).
## Reporting Vulnerabilities
**⚠️ Please do not file GitHub issues for security vulnerabilities as they are public! ⚠️**
The Apache Software Foundation takes a rigorous approach to eliminating security issues
in its software projects. Apache Superset is highly attentive and responsive to issues
pertaining to its features and functionality.
If you have any concern or believe you have found a vulnerability in Apache Superset,
please get in touch with the Apache Superset Security Team privately at
e-mail address [security@superset.apache.org](mailto:security@superset.apache.org).
More details can be found on the ASF website at
[ASF vulnerability reporting process](https://apache.org/security/#reporting-a-vulnerability)
**Submission Standards & AI Policy**
To ensure engineering focus remains on verified risks and to manage high reporting volumes, all reports must meet the following criteria:
- Plain Text Format: In accordance with Apache guidelines, please provide all details in plain text within the email body. Avoid sending PDFs, Word documents, or password-protected archives.
- Mandatory AI Disclosure: If you utilized Large Language Models (LLMs) or AI tools to identify a flaw or assist in writing a report, you must disclose this in your submission so our triage team can contextualize the findings.
- Human-Verified PoC: All submissions must include a manual, step-by-step Proof of Concept (PoC) performed on a supported release. Raw AI outputs, hypothetical chat transcripts, or unverified scanner logs will be closed as Invalid.
We kindly ask you to include the following information in your report to assist our developers in triaging and remediating issues efficiently:
- Version/Commit: The specific version of Apache Superset or the Git commit hash you are using.
- Configuration: A sanitized copy of your `superset_config.py` file or any config overrides.
- Environment: Your deployment method (e.g., Docker Compose, Helm, or source) and relevant OS/Browser details.
- Impacted Component: Identification of the affected area (e.g., Python backend, React frontend, or a specific database connector).
- Expected vs. Actual Behavior: A clear description of the intended system behavior versus the observed vulnerability.
- Detailed Reproduction Steps: Clear, manual steps to reproduce the vulnerability.
**Out of Scope Vulnerabilities**
To prioritize engineering efforts on genuine architectural risks, the following scenarios are explicitly out of scope and will not be issued a CVE:
- Attacks requiring Admin privileges: (e.g., CSS injection, template manipulation, dashboard ownership overrides, or modifying global system settings). Per the CVE vulnerability definition in CNA Operational Rules 4.1, a qualifying vulnerability must allow violation of a security policy. The Admin role is a fully trusted operational boundary defined by Apache Superset's security policy; actions within this boundary do not violate that policy and are therefore considered intended capabilities 'by design,' not vulnerabilities.
- Brute Force and Rate Limiting: Reports targeting a lack of resource exhaustion protections, generic rate-limiting, or volumetric Denial of Service (DoS) attempts.
- Theoretical attack vectors: Issues without a demonstrable, reproducible exploit path.
- Non-Exploitable Findings: Missing security headers, generic banner disclosures, or descriptive error messages that do not lead to a direct, documented exploit.
**Outcome of Reports**
Reports that are deemed out-of-scope for a CVE but represent valid security best practices or hardening opportunities may be converted into public GitHub issues. This allows the community to contribute to the general hardening of the platform even when a specific vulnerability threshold is not met.
Note that Apache Superset is not responsible for any third-party dependencies that may
have security issues. Any vulnerabilities found in third-party dependencies should be
reported to the maintainers of those projects. Results from security scans of Apache
Superset dependencies found on its official Docker image can be remediated at release time
by extending the image itself.
**Vulnerability Aggregation & CVE Attribution**
In accordance with MITRE CNA Operational Rules (4.1.10, 4.1.11, and 4.2.13), Apache Superset issues CVEs based on the underlying architectural root cause rather than the number of affected endpoints or exploit payloads.
- Aggregation: If multiple exploit vectors stem from the same programmatic failure or shared vulnerable code, they must be aggregated into a single, comprehensive report.
- Independent Fixes: Separate CVEs will only be assigned if the vulnerabilities reside in decoupled architectural modules and can be fixed independently of one another.
Reports that fail to aggregate related findings will be merged during triage to ensure an accurate and defensible CVE record.
**Your responsible disclosure and collaboration are invaluable.**
## Extra Information
- [Apache Superset documentation](https://superset.apache.org/docs/security)
- [Common Vulnerabilities and Exposures by release](https://superset.apache.org/docs/security/cves)
- [How Security Vulnerabilities are Reported & Handled in Apache Superset (Blog)](https://preset.io/blog/how-security-vulnerabilities-are-reported-and-handled-in-apache-superset/)

View File

@@ -1,34 +0,0 @@
name: Change Detector
description: Detects file changes for pull request and push events
inputs:
token:
description: GitHub token for authentication
required: true
outputs:
python:
description: Whether Python-related files were changed
value: ${{ steps.change-detector.outputs.python }}
frontend:
description: Whether frontend-related files were changed
value: ${{ steps.change-detector.outputs.frontend }}
docker:
description: Whether docker-related files were changed
value: ${{ steps.change-detector.outputs.docker }}
docs:
description: Whether docs-related files were changed
value: ${{ steps.change-detector.outputs.docs }}
superset-extensions-cli:
description: Whether superset-extensions-cli package-related files were changed
value: ${{ steps.change-detector.outputs.superset-extensions-cli }}
runs:
using: composite
steps:
- name: Detect file changes
id: change-detector
run: |
python --version
python scripts/change_detector.py
shell: bash
env:
GITHUB_TOKEN: ${{ inputs.token }}
GITHUB_OUTPUT: ${{ github.output }}

View File

@@ -1,23 +0,0 @@
name: Label Draft PRs
on:
pull_request:
types:
- opened
- converted_to_draft
jobs:
label-draft:
runs-on: ubuntu-latest
steps:
- name: Check if the PR is a draft
id: check-draft
uses: actions/github-script@v8
with:
script: |
const isDraft = context.payload.pull_request.draft;
core.setOutput('isDraft', isDraft);
- name: Add `review:draft` Label
if: steps.check-draft.outputs.isDraft == 'true'
uses: actions-ecosystem/action-add-labels@v1
with:
github_token: ${{ secrets.GITHUB_TOKEN }}
labels: "review:draft"

View File

@@ -1,58 +0,0 @@
name: 'Setup Python Environment'
description: 'Set up Python and install dependencies with optional configurations.'
inputs:
python-version:
description: 'Python version to set up. Accepts a version number, "current", or "next".'
required: true
default: 'current'
cache:
description: 'Cache dependencies. Options: pip'
required: false
default: 'pip'
requirements-type:
description: 'Type of requirements to install. Options: base, dev'
required: false
default: 'dev'
install-superset:
description: 'Whether to install Superset itself. If false, only python is installed'
required: false
default: 'true'
runs:
using: 'composite'
steps:
- name: Interpret Python Version
id: set-python-version
shell: bash
run: |
if [ "${{ inputs.python-version }}" = "current" ]; then
echo "PYTHON_VERSION=3.11" >> $GITHUB_ENV
elif [ "${{ inputs.python-version }}" = "next" ]; then
# currently disabled in GHA matrixes because of library compatibility issues
echo "PYTHON_VERSION=3.12" >> $GITHUB_ENV
elif [ "${{ inputs.python-version }}" = "previous" ]; then
echo "PYTHON_VERSION=3.10" >> $GITHUB_ENV
else
echo "PYTHON_VERSION=${{ inputs.python-version }}" >> $GITHUB_ENV
fi
- name: Set up Python ${{ env.PYTHON_VERSION }}
uses: actions/setup-python@v5
with:
python-version: ${{ env.PYTHON_VERSION }}
cache: ${{ inputs.cache }}
- name: Install dependencies
run: |
if [ "${{ inputs.install-superset }}" = "true" ]; then
sudo apt-get update && sudo apt-get -y install libldap2-dev libsasl2-dev
pip install --upgrade pip setuptools wheel uv
if [ "${{ inputs.requirements-type }}" = "dev" ]; then
uv pip install --system -r requirements/development.txt
elif [ "${{ inputs.requirements-type }}" = "base" ]; then
uv pip install --system -r requirements/base.txt
fi
uv pip install --system -e .
fi
shell: bash

View File

@@ -1,69 +0,0 @@
name: "Setup Docker Environment"
description: "Reusable steps for setting up QEMU, Docker Buildx, DockerHub login, Supersetbot, and optionally Docker Compose"
inputs:
build:
description: "Used for building?"
required: false
default: "false"
dockerhub-user:
description: "DockerHub username"
required: false
dockerhub-token:
description: "DockerHub token"
required: false
install-docker-compose:
description: "Flag to install Docker Compose"
required: false
default: "true"
login-to-dockerhub:
description: "Whether you want to log into dockerhub"
required: false
default: "true"
outputs: {}
runs:
using: "composite"
steps:
- name: Set up QEMU
if: ${{ inputs.build == 'true' }}
uses: docker/setup-qemu-action@29109295f81e9208d7d86ff1c6c12d2833863392 # v3.6.0
- name: Set up Docker Buildx
if: ${{ inputs.build == 'true' }}
uses: docker/setup-buildx-action@8d2750c68a42422c14e847fe6c8ac0403b4cbd6f # v3.12.0
- name: Try to login to DockerHub
if: ${{ inputs.login-to-dockerhub == 'true' }}
continue-on-error: true
uses: docker/login-action@c94ce9fb468520275223c153574b00df6fe4bcc9 # v3.7.0
with:
username: ${{ inputs.dockerhub-user }}
password: ${{ inputs.dockerhub-token }}
- name: Install Docker Compose
if: ${{ inputs.install-docker-compose == 'true' }}
shell: bash
run: |
sudo apt-get update
sudo apt-get install -y ca-certificates curl
sudo install -m 0755 -d /etc/apt/keyrings
# Download and save the Docker GPG key in the correct format
curl -fsSL https://download.docker.com/linux/ubuntu/gpg | sudo gpg --dearmor -o /etc/apt/keyrings/docker.gpg
# Ensure the key file is readable
sudo chmod a+r /etc/apt/keyrings/docker.gpg
# Add the Docker repository using the correct key
echo \
"deb [arch=$(dpkg --print-architecture) signed-by=/etc/apt/keyrings/docker.gpg] https://download.docker.com/linux/ubuntu \
$(. /etc/os-release && echo "$VERSION_CODENAME") stable" | \
sudo tee /etc/apt/sources.list.d/docker.list > /dev/null
# Update package lists and install Docker Compose plugin
sudo apt update
sudo apt install -y docker-compose-plugin
- name: Docker Version Info
shell: bash
run: docker info

View File

@@ -1,40 +0,0 @@
name: 'Setup supersetbot'
description: 'Sets up supersetbot npm lib from the repo or npm'
inputs:
from-npm:
description: 'Install from npm instead of local setup'
required: false
default: 'true' # Defaults to installing from npm
runs:
using: 'composite'
steps:
- name: Setup Node Env
uses: actions/setup-node@v4
with:
node-version: '20'
- name: Install supersetbot from npm
if: ${{ inputs.from-npm == 'true' }}
shell: bash
run: npm install -g supersetbot
- name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
if: ${{ inputs.from-npm == 'false' }}
uses: actions/checkout@v4
with:
repository: apache-superset/supersetbot
path: supersetbot
- name: Setup supersetbot from repo
if: ${{ inputs.from-npm == 'false' }}
shell: bash
working-directory: supersetbot
run: |
# simple trick to install globally with dependencies
npm pack
npm install -g ./supersetbot*.tgz
- name: echo supersetbot version
shell: bash
run: supersetbot version

View File

@@ -1 +0,0 @@
../AGENTS.md

.github/dependabot.yml vendored
View File

@@ -1,69 +1,17 @@
version: 2
enable-beta-ecosystems: true
updates:
- package-ecosystem: "github-actions"
directory: "/"
ignore:
# Ignore temporarily as release schedule is too mentally taxing for dep-handling maintainers
# Additionally, very few PRs are reviewed by this action.
- dependency-name: anthropics/claude-code-action
schedule:
interval: "daily"
- package-ecosystem: "npm"
ignore:
# TODO: remove below entries until React >= 18.0.0
- dependency-name: "storybook"
update-types: ["version-update:semver-major", "version-update:semver-minor"]
- dependency-name: "@storybook*"
update-types: ["version-update:semver-major", "version-update:semver-minor"]
- dependency-name: "eslint-plugin-storybook"
- dependency-name: "react-error-boundary"
- dependency-name: "@rjsf/*"
# remark-gfm v4+ requires react-markdown v9+, which needs React 18
- dependency-name: "remark-gfm"
- dependency-name: "react-markdown"
# TODO: remove below entries until React >= 19.0.0
- dependency-name: "react-icons"
# JSDOM v30 doesn't play well with Jest v30
# Source: https://jestjs.io/blog#known-issues
# GH thread: https://github.com/jsdom/jsdom/issues/3492
- dependency-name: "jest-environment-jsdom"
# `@swc/plugin-transform-imports` doesn't work with current Webpack-SWC hybrid setup
# See https://github.com/apache/superset/pull/37384#issuecomment-3793991389
# TODO: remove the plugin once Lodash usage has been migrated to a more readily tree-shakeable alternative
- dependency-name: "@swc/plugin-transform-imports"
# `just-handlebars-helpers` library in plugin-chart-handlebars requires `currencyformatter.js` to be < 2
- dependency-name: "currencyformatter.js"
update-types: ["version-update:semver-major"]
# TODO: remove below clause once https://github.com/pmmmwh/react-refresh-webpack-plugin/pull/940 lands onto a future release
# and confirm the issue https://github.com/apache/superset/issues/39600 is fixed
- dependency-name: "react-checkbox-tree"
update-types: ["version-update:semver-major"]
groups:
storybook:
applies-to: version-updates
patterns:
- "@storybook*"
- "storybook"
update-types:
- "patch"
directory: "/superset-frontend/"
schedule:
interval: "daily"
labels:
- npm
- dependabot
open-pull-requests-limit: 30
versioning-strategy: increase
- package-ecosystem: "pip"
directory: "/"
open-pull-requests-limit: 10
directory: "/requirements/"
schedule:
interval: "weekly"
interval: "daily"
labels:
- pip
- dependabot
@@ -72,294 +20,10 @@ updates:
directory: ".github/actions"
schedule:
interval: "daily"
open-pull-requests-limit: 10
versioning-strategy: increase
open-pull-requests-limit: 0
- package-ecosystem: "npm"
directory: "/docs/"
ignore:
# TODO: remove below entries until React >= 18.0.0 in superset-frontend
- dependency-name: "storybook"
update-types: ["version-update:semver-major", "version-update:semver-minor"]
- dependency-name: "@storybook*"
update-types: ["version-update:semver-major", "version-update:semver-minor"]
- dependency-name: "eslint-plugin-storybook"
- dependency-name: "react-error-boundary"
groups:
storybook:
applies-to: version-updates
patterns:
- "@storybook*"
- "storybook"
update-types:
- "patch"
schedule:
interval: "daily"
open-pull-requests-limit: 10
versioning-strategy: increase
- package-ecosystem: "npm"
directory: "/superset-websocket/"
schedule:
interval: "daily"
labels:
- npm
- dependabot
versioning-strategy: increase
- package-ecosystem: "npm"
directory: "/superset-websocket/utils/client-ws-app/"
schedule:
interval: "daily"
labels:
- npm
- dependabot
open-pull-requests-limit: 10
versioning-strategy: increase
# Now for all of our plugins and packages!
- package-ecosystem: "npm"
directory: "/superset-frontend/plugins/legacy-plugin-chart-calendar/"
schedule:
interval: "daily"
labels:
- npm
- dependabot
open-pull-requests-limit: 5
versioning-strategy: increase
- package-ecosystem: "npm"
directory: "/superset-frontend/plugins/legacy-plugin-chart-partition/"
schedule:
interval: "daily"
labels:
- npm
- dependabot
open-pull-requests-limit: 5
versioning-strategy: increase
- package-ecosystem: "npm"
directory: "/superset-frontend/plugins/legacy-plugin-chart-world-map/"
schedule:
interval: "daily"
labels:
- npm
- dependabot
open-pull-requests-limit: 5
versioning-strategy: increase
- package-ecosystem: "npm"
directory: "/superset-frontend/plugins/plugin-chart-pivot-table/"
ignore:
# TODO: remove below entries until React >= 19.0.0
- dependency-name: "react-icons"
schedule:
interval: "daily"
labels:
- npm
- dependabot
open-pull-requests-limit: 5
versioning-strategy: increase
- package-ecosystem: "npm"
directory: "/superset-frontend/plugins/legacy-plugin-chart-chord/"
schedule:
interval: "daily"
labels:
- npm
- dependabot
open-pull-requests-limit: 5
versioning-strategy: increase
- package-ecosystem: "npm"
directory: "/superset-frontend/plugins/legacy-plugin-chart-horizon/"
schedule:
interval: "daily"
labels:
- npm
- dependabot
open-pull-requests-limit: 5
versioning-strategy: increase
- package-ecosystem: "npm"
directory: "/superset-frontend/plugins/legacy-plugin-chart-rose/"
schedule:
interval: "daily"
labels:
- npm
- dependabot
open-pull-requests-limit: 5
versioning-strategy: increase
- package-ecosystem: "npm"
directory: "/superset-frontend/plugins/legacy-preset-chart-deckgl/"
schedule:
interval: "daily"
labels:
- npm
- dependabot
open-pull-requests-limit: 5
versioning-strategy: increase
- package-ecosystem: "npm"
directory: "/superset-frontend/plugins/plugin-chart-table/"
ignore:
# TODO: remove below entries until React >= 19.0.0
- dependency-name: "react-icons"
schedule:
interval: "daily"
labels:
- npm
- dependabot
open-pull-requests-limit: 5
versioning-strategy: increase
- package-ecosystem: "npm"
directory: "/superset-frontend/plugins/legacy-plugin-chart-country-map/"
schedule:
interval: "daily"
labels:
- npm
- dependabot
open-pull-requests-limit: 5
versioning-strategy: increase
- package-ecosystem: "npm"
directory: "/superset-frontend/plugins/legacy-plugin-chart-map-box/"
schedule:
interval: "daily"
labels:
- npm
- dependabot
open-pull-requests-limit: 5
versioning-strategy: increase
- package-ecosystem: "npm"
directory: "/superset-frontend/plugins/legacy-preset-chart-nvd3/"
schedule:
interval: "daily"
labels:
- npm
- dependabot
open-pull-requests-limit: 5
versioning-strategy: increase
- package-ecosystem: "npm"
directory: "/superset-frontend/plugins/plugin-chart-word-cloud/"
schedule:
interval: "daily"
labels:
- npm
- dependabot
open-pull-requests-limit: 5
versioning-strategy: increase
- package-ecosystem: "npm"
directory: "/superset-frontend/plugins/legacy-plugin-chart-paired-t-test/"
schedule:
interval: "daily"
labels:
- npm
- dependabot
open-pull-requests-limit: 5
versioning-strategy: increase
- package-ecosystem: "npm"
directory: "/superset-frontend/plugins/plugin-chart-echarts/"
schedule:
interval: "daily"
labels:
- npm
- dependabot
open-pull-requests-limit: 5
versioning-strategy: increase
- package-ecosystem: "npm"
directory: "/superset-frontend/plugins/plugin-chart-ag-grid-table/"
schedule:
interval: "daily"
labels:
- npm
- dependabot
open-pull-requests-limit: 5
versioning-strategy: increase
- package-ecosystem: "npm"
directory: "/superset-frontend/plugins/plugin-chart-cartodiagram/"
schedule:
interval: "daily"
labels:
- npm
- dependabot
open-pull-requests-limit: 5
versioning-strategy: increase
- package-ecosystem: "npm"
directory: "/superset-frontend/plugins/legacy-plugin-chart-parallel-coordinates/"
schedule:
interval: "daily"
labels:
- npm
- dependabot
open-pull-requests-limit: 5
versioning-strategy: increase
- package-ecosystem: "npm"
directory: "/superset-frontend/plugins/plugin-chart-handlebars/"
ignore:
# `just-handlebars-helpers` library in plugin-chart-handlebars requires `currencyformatter.js` to be < 2
- dependency-name: "currencyformatter.js"
update-types: ["version-update:semver-major"]
schedule:
interval: "daily"
labels:
- npm
- dependabot
open-pull-requests-limit: 5
versioning-strategy: increase
- package-ecosystem: "npm"
directory: "/superset-frontend/packages/generator-superset/"
schedule:
interval: "daily"
labels:
- npm
- dependabot
open-pull-requests-limit: 5
versioning-strategy: increase
- package-ecosystem: "npm"
directory: "/superset-frontend/packages/superset-ui-chart-controls/"
schedule:
interval: "daily"
labels:
- npm
- dependabot
open-pull-requests-limit: 5
versioning-strategy: increase
- package-ecosystem: "npm"
directory: "/superset-frontend/packages/superset-ui-core/"
ignore:
# not until React >= 18.0.0
- dependency-name: "react-markdown"
- dependency-name: "remark-gfm"
- dependency-name: "react-error-boundary"
schedule:
interval: "daily"
labels:
- npm
- dependabot
open-pull-requests-limit: 5
versioning-strategy: increase
- package-ecosystem: "npm"
directory: "/superset-frontend/packages/superset-ui-switchboard/"
schedule:
interval: "daily"
labels:
- npm
- dependabot
open-pull-requests-limit: 5
versioning-strategy: increase
open-pull-requests-limit: 0

.github/labeler.yml vendored
View File

@@ -1,173 +0,0 @@
# TODO (if we can)
# - Label PRs in need of codeowner review
# - viz:charts:xyz labels
# component/design system areas
# - storybook(s)
# - f/e and b/e test changes?
# - product areas (SQL Lab, Explore, Dashboard, etc.)
# - database areas (SQLAlchemy, labeling DBs by driver, etc.)
############################################
# General workflow warnings
# full list of labels is here: https://github.com/apache/superset/labels
############################################
"risk:db-migration":
- changed-files:
- any-glob-to-any-file:
- 'superset/migrations/**'
"risk:ci-script":
- changed-files:
- any-glob-to-any-file:
- 'scripts/**'
############################################
# Dependencies
############################################
"dependencies:python":
- changed-files:
- any-glob-to-any-file:
- 'superset/requirements/**'
- 'superset/translations/requirements.txt'
- 'RELEASING/requirements.txt'
"dependencies:npm":
- changed-files:
- any-glob-to-any-file:
- 'superset-frontend/package.json'
- 'superset-frontend/package-lock.json'
- 'superset-embedded-sdk/package.json'
- 'superset-embedded-sdk/package-lock.json'
- 'superset-websocket/package.json'
- 'superset-websocket/package-lock.json'
- 'superset-frontend/cypress-base/package.json'
- 'superset-frontend/cypress-base/package-lock.json'
- 'superset-frontend/packages/**/package.json'
- 'superset-frontend/plugins/**/package.json'
############################################
# Areas of the main codebase
############################################
"doc":
- changed-files:
- any-glob-to-any-file:
- 'docs/**'
"api":
- changed-files:
- any-glob-to-any-file:
- 'superset/**/api.py'
- 'superset/views/core.py'
"i18n":
- changed-files:
- any-glob-to-any-file:
- 'superset/translations/**'
"i18n:brazilian":
- changed-files:
- any-glob-to-any-file:
- 'superset/translations/pt_BR/**'
"i18n:chinese":
- changed-files:
- any-glob-to-any-file:
- 'superset/translations/zh/**'
"i18n:czech":
- changed-files:
- any-glob-to-any-file:
- 'superset/translations/cs/**'
"i18n:traditional-chinese":
- changed-files:
- any-glob-to-any-file:
- 'superset/translations/zh_TW/**'
"i18n:dutch":
- changed-files:
- any-glob-to-any-file:
- 'superset/translations/nl/**'
"i18n:french":
- changed-files:
- any-glob-to-any-file:
- 'superset/translations/fr/**'
"i18n:italian":
- changed-files:
- any-glob-to-any-file:
- 'superset/translations/it/**'
"i18n:japanese":
- changed-files:
- any-glob-to-any-file:
- 'superset/translations/ja/**'
"i18n:korean":
- changed-files:
- any-glob-to-any-file:
- 'superset/translations/ko/**'
"i18n:portuguese":
- changed-files:
- any-glob-to-any-file:
- 'superset/translations/pt/**'
"i18n:russian":
- changed-files:
- any-glob-to-any-file:
- 'superset/translations/ru/**'
"i18n:slovak":
- changed-files:
- any-glob-to-any-file:
- 'superset/translations/sk/**'
"i18n:latvian":
- changed-files:
- any-glob-to-any-file:
- 'superset/translations/lv/**'
"i18n:ukrainian":
- changed-files:
- any-glob-to-any-file:
- 'superset/translations/uk/**'
"i18n:spanish":
- changed-files:
- any-glob-to-any-file:
- 'superset/translations/es/**'
"i18n:persian":
- changed-files:
- any-glob-to-any-file:
- 'superset/translations/fa/**'
############################################
# Sub-projects and monorepo packages
############################################
"plugins":
- changed-files:
- any-glob-to-any-file:
- 'superset-frontend/plugins/**'
"packages":
- changed-files:
- any-glob-to-any-file:
- 'superset-frontend/packages/**'
"embedded":
- changed-files:
- any-glob-to-any-file:
- 'superset-embedded-sdk/**'
"github_actions":
- changed-files:
- any-glob-to-any-file:
- '.github/actions/**'
- '.github/workflows/**'

View File

@@ -31,6 +31,19 @@ say() {
fi
}
# default command to run when the `run` input is empty
default-setup-command() {
apt-get-install
pip-upgrade
}
apt-get-install() {
say "::group::apt-get install dependencies"
sudo apt-get update && sudo apt-get install --yes \
libsasl2-dev
say "::endgroup::"
}
pip-upgrade() {
say "::group::Upgrade pip"
pip install --upgrade pip
@@ -89,8 +102,6 @@ EOF
setup-mysql() {
say "::group::Initialize database"
mysql -h 127.0.0.1 -P 13306 -u root --password=root <<-EOF
SET GLOBAL transaction_isolation='READ-COMMITTED';
SET GLOBAL TRANSACTION ISOLATION LEVEL READ COMMITTED;
DROP DATABASE IF EXISTS superset;
CREATE DATABASE superset DEFAULT CHARACTER SET utf8 COLLATE utf8_unicode_ci;
DROP DATABASE IF EXISTS sqllab_test_db;
@@ -117,44 +128,9 @@ testdata() {
say "::endgroup::"
}
playwright_testdata() {
cd "$GITHUB_WORKSPACE"
say "::group::Load all examples for Playwright tests"
# must specify PYTHONPATH to make `tests.superset_test_config` importable
export PYTHONPATH="$GITHUB_WORKSPACE"
pip install -e .
superset db upgrade
superset load_test_users
superset load_examples
superset init
# Enable DML on the examples database so Playwright tests can create/drop
# temporary tables via SQL Lab without depending on external data sources.
superset shell <<'PYEOF'
import sys
from superset.extensions import db
from superset.models.core import Database
examples_db = db.session.query(Database).filter_by(database_name='examples').first()
if not examples_db:
sys.exit('ERROR: examples database not found. load_examples may have failed.')
examples_db.allow_dml = True
db.session.commit()
print('Enabled allow_dml on examples database')
PYEOF
say "::endgroup::"
}
celery-worker() {
cd "$GITHUB_WORKSPACE"
say "::group::Start Celery worker"
# must specify PYTHONPATH to make `tests.superset_test_config` importable
export PYTHONPATH="$GITHUB_WORKSPACE"
celery \
--app=superset.tasks.celery_app:app \
worker \
--concurrency=2 \
--detach \
--optimization=fair
codecov() {
say "::group::Upload code coverage"
bash ".github/workflows/codecov.sh" "$@"
say "::endgroup::"
}
@@ -170,132 +146,71 @@ cypress-install() {
cache-save cypress
}
cypress-run-all() {
local USE_DASHBOARD=$1
local APP_ROOT=$2
# Run Cypress and upload coverage reports
cypress-run() {
cd "$GITHUB_WORKSPACE/superset-frontend/cypress-base"
local page=$1
local group=${2:-Default}
local cypress="./node_modules/.bin/cypress run"
local browser=${CYPRESS_BROWSER:-chrome}
export TERM="xterm"
say "::group::Run Cypress for [$page]"
if [[ -z $CYPRESS_KEY ]]; then
$cypress --spec "cypress/integration/$page" --browser "$browser"
else
export CYPRESS_RECORD_KEY=$(echo $CYPRESS_KEY | base64 --decode)
# additional flags for Cypress dashboard recording
$cypress --spec "cypress/integration/$page" --browser "$browser" \
--record --group "$group" --tag "${GITHUB_REPOSITORY},${GITHUB_EVENT_NAME}" \
--parallel --ci-build-id "${GITHUB_SHA:0:8}-${NONCE}"
fi
# don't add quotes to $record because we do want word splitting
say "::endgroup::"
}
cypress-run-all() {
# Start Flask and run it in background
# --no-debugger means disable the interactive debugger on the 500 page
# so errors can print to stderr.
local flasklog="${HOME}/flask.log"
local port=8081
CYPRESS_BASE_URL="http://localhost:${port}"
if [ -n "$APP_ROOT" ]; then
export SUPERSET_APP_ROOT=$APP_ROOT
CYPRESS_BASE_URL=${CYPRESS_BASE_URL}${APP_ROOT}
fi
export CYPRESS_BASE_URL
export CYPRESS_BASE_URL="http://localhost:${port}"
nohup flask run --no-debugger -p $port >"$flasklog" 2>&1 </dev/null &
local flaskProcessId=$!
USE_DASHBOARD_FLAG=''
if [ "$USE_DASHBOARD" = "true" ]; then
USE_DASHBOARD_FLAG='--use-dashboard'
fi
# UNCOMMENT the next few commands to monitor memory usage
# monitor_memory & # Start memory monitoring in the background
# memoryMonitorPid=$!
python ../../scripts/cypress_run.py --parallelism $PARALLELISM --parallelism-id $PARALLEL_ID --group $PARALLEL_ID --retries 5 $USE_DASHBOARD_FLAG
# kill $memoryMonitorPid
cypress-run "*/**/!(*.applitools.test.ts)"
# After job is done, print out Flask log for debugging
echo "::group::Flask log for default run"
say "::group::Flask log for default run"
cat "$flasklog"
echo "::endgroup::"
# make sure the program exits
kill $flaskProcessId
}
playwright-install() {
cd "$GITHUB_WORKSPACE/superset-frontend"
say "::group::Install Playwright browsers"
npx playwright install --with-deps chromium
# Create output directories for test results and debugging
mkdir -p playwright-results
mkdir -p test-results
say "::endgroup::"
}
playwright-run() {
local APP_ROOT=$1
local TEST_PATH=$2
# Start Flask from the project root (same as Cypress)
cd "$GITHUB_WORKSPACE"
local flasklog="${HOME}/flask-playwright.log"
local port=8081
PLAYWRIGHT_BASE_URL="http://localhost:${port}"
if [ -n "$APP_ROOT" ]; then
export SUPERSET_APP_ROOT=$APP_ROOT
PLAYWRIGHT_BASE_URL=${PLAYWRIGHT_BASE_URL}${APP_ROOT}/
fi
export PLAYWRIGHT_BASE_URL
# Rerun SQL Lab tests with backend persist disabled
export SUPERSET_CONFIG=tests.integration_tests.superset_test_config_sqllab_backend_persist_off
# Restart Flask with new configs
kill $flaskProcessId
nohup flask run --no-debugger -p $port >"$flasklog" 2>&1 </dev/null &
local flaskProcessId=$!
# Ensure cleanup on exit
trap "kill $flaskProcessId 2>/dev/null || true" EXIT
cypress-run "sqllab/!(*.applitools.test.ts)" "Backend persist"
# Wait for server to be ready with health check
local timeout=60
say "Waiting for Flask server to start on port $port..."
while [ $timeout -gt 0 ]; do
if curl -f ${PLAYWRIGHT_BASE_URL}/health >/dev/null 2>&1; then
say "Flask server is ready"
break
fi
sleep 1
timeout=$((timeout - 1))
done
# Upload code coverage separately so each page can have separate flags
# -c will clean existing coverage reports, -F means add flags
# || true to prevent CI failure on codecov upload
codecov -c -F "cypress" || true
if [ $timeout -eq 0 ]; then
echo "::error::Flask server failed to start within 60 seconds"
echo "::group::Flask startup log"
cat "$flasklog"
echo "::endgroup::"
return 1
fi
# Change to frontend directory for Playwright execution
cd "$GITHUB_WORKSPACE/superset-frontend"
say "::group::Run Playwright tests"
echo "Running Playwright with baseURL: ${PLAYWRIGHT_BASE_URL}"
if [ -n "$TEST_PATH" ]; then
# Check if there are any test files in the specified path
if ! find "playwright/tests/${TEST_PATH}" -name "*.spec.ts" -type f 2>/dev/null | grep -q .; then
echo "No test files found in ${TEST_PATH} - skipping test run"
say "::endgroup::"
kill $flaskProcessId
return 0
fi
echo "Running tests: ${TEST_PATH}"
# Set INCLUDE_EXPERIMENTAL=true to allow experimental tests to run
export INCLUDE_EXPERIMENTAL=true
npx playwright test "${TEST_PATH}" --output=playwright-results
local status=$?
# Unset to prevent leaking into subsequent commands
unset INCLUDE_EXPERIMENTAL
else
echo "Running all required tests (experimental/ excluded via playwright.config.ts)"
npx playwright test --output=playwright-results
local status=$?
fi
say "::group::Flask log for backend persist"
cat "$flasklog"
say "::endgroup::"
# After job is done, print out Flask log for debugging
echo "::group::Flask log for Playwright run"
cat "$flasklog"
echo "::endgroup::"
# make sure the program exits
kill $flaskProcessId
return $status
}
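# Illustrative usage of the helper above (hypothetical arguments): serve Superset under an app
# root and run a single Playwright suite resolved under playwright/tests/, e.g.
#   playwright-run "/app" "dashboard"
# With an empty TEST_PATH, all required tests run and experimental/ stays excluded via playwright.config.ts.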
eyes-storybook-dependencies() {
@@ -304,17 +219,22 @@ eyes-storybook-dependencies() {
say "::endgroup::"
}
monitor_memory() {
# This is a small utility to monitor memory usage. Useful for debugging memory in GHA.
# To use it, wrap your command as follows
#
# monitor_memory & # Start memory monitoring in the background
# memoryMonitorPid=$!
# YOUR_COMMAND_HERE
# kill $memoryMonitorPid
while true; do
echo "$(date) - Top 5 memory-consuming processes:"
ps -eo pid,comm,%mem --sort=-%mem | head -n 6 # First line is the header, next 5 are top processes
sleep 2
done
}
cypress-run-applitools() {
local flasklog="${HOME}/flask.log"
local port=8081
export CYPRESS_BASE_URL="http://localhost:${port}"
nohup flask run --no-debugger -p $port >"$flasklog" 2>&1 </dev/null &
local flaskProcessId=$!
cypress-run "*/**/*.applitools.test.ts"
codecov -c -F "cypress" || true
say "::group::Flask log for default run"
cat "$flasklog"
say "::endgroup::"
# make sure the program exits
kill $flaskProcessId
}


@@ -1,81 +0,0 @@
name: Bump Python Package
on:
# Can be triggered manually
workflow_dispatch:
inputs:
package:
required: false
description: The python package to bump (all if empty)
group:
required: false
description: The optional dependency group to bump (as defined in pyproject.toml)
limit:
required: true
description: Max number of PRs to open (0 for no limit)
default: 5
extra-flags:
required: false
default: --only-base
description: Additional flags to pass to the bump-python command
#schedule:
# - cron: '0 0 * * *' # Runs daily at midnight UTC
jobs:
bump-python-package:
runs-on: ubuntu-24.04
permissions:
actions: write
contents: write
pull-requests: write
checks: write
steps:
- name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6
with:
persist-credentials: true
ref: master
- name: Setup supersetbot
uses: ./.github/actions/setup-supersetbot/
- name: Set up Python ${{ inputs.python-version }}
uses: actions/setup-python@a309ff8b426b58ec0e2a45f0f869d46889d02405 # v6
with:
python-version: "3.10"
- name: Install uv
run: pip install uv
- name: supersetbot bump-python -p "${{ github.event.inputs.package }}"
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
INPUT_PACKAGE: ${{ github.event.inputs.package }}
INPUT_GROUP: ${{ github.event.inputs.group }}
INPUT_EXTRA_FLAGS: ${{ github.event.inputs.extra-flags }}
INPUT_LIMIT: ${{ github.event.inputs.limit }}
run: |
git config --global user.email "action@github.com"
git config --global user.name "GitHub Action"
PACKAGE_OPT=""
if [ -n "${INPUT_PACKAGE}" ]; then
PACKAGE_OPT="-p ${INPUT_PACKAGE}"
fi
GROUP_OPT=""
if [ -n "${INPUT_GROUP}" ]; then
GROUP_OPT="-g ${INPUT_GROUP}"
fi
EXTRA_FLAGS="${INPUT_EXTRA_FLAGS}"
supersetbot bump-python \
--verbose \
--use-current-repo \
--include-subpackages \
--limit ${INPUT_LIMIT} \
$PACKAGE_OPT \
$GROUP_OPT \
$EXTRA_FLAGS
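# Worked example (hypothetical package name): with the default limit of 5, the default
# --only-base extra flag, package "flask" and no group, the assembled command is roughly
#   supersetbot bump-python --verbose --use-current-repo --include-subpackages --limit 5 -p flask --only-base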


@@ -25,8 +25,6 @@ const assetsConfig = {
path: [`${workspaceDirectory}/superset/static/assets`],
hashFiles: [
`${workspaceDirectory}/superset-frontend/src/**/*`,
`${workspaceDirectory}/superset-frontend/packages/**/*`,
`${workspaceDirectory}/superset-frontend/plugins/**/*`,
`${workspaceDirectory}/superset-frontend/*.js`,
`${workspaceDirectory}/superset-frontend/*.json`,
],


@@ -9,15 +9,12 @@ on:
jobs:
cancel-duplicate-runs:
name: Cancel duplicate workflow runs
runs-on: ubuntu-24.04
permissions:
actions: write
contents: read
runs-on: ubuntu-20.04
steps:
- name: Check number of queued tasks
id: check_queued
env:
GITHUB_TOKEN: ${{ github.token }}
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
GITHUB_REPO: ${{ github.repository }}
run: |
get_count() {
@@ -27,16 +24,16 @@ jobs:
}
count=$(( `get_count queued` + `get_count in_progress` ))
echo "Found $count unfinished jobs."
echo "count=$count" >> $GITHUB_OUTPUT
echo "::set-output name=count::$count"
- name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
if: steps.check_queued.outputs.count >= 20
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6
uses: actions/checkout@v2
- name: Cancel duplicate workflow runs
if: steps.check_queued.outputs.count >= 20
env:
GITHUB_TOKEN: ${{ github.token }}
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
GITHUB_REPOSITORY: ${{ github.repository }}
run: |
pip install click requests typing_extensions python-dateutil


@@ -1,59 +0,0 @@
name: Check python dependencies
on:
push:
branches:
- "master"
- "[0-9].[0-9]*"
pull_request:
types: [synchronize, opened, reopened, ready_for_review]
# cancel previous workflow jobs for PRs
concurrency:
group: ${{ github.workflow }}-${{ github.event.pull_request.number || github.run_id }}
cancel-in-progress: true
jobs:
check-python-deps:
runs-on: ubuntu-22.04
steps:
- name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6
with:
persist-credentials: false
submodules: recursive
fetch-depth: 1
- name: Check for file changes
id: check
uses: ./.github/actions/change-detector/
with:
token: ${{ secrets.GITHUB_TOKEN }}
- name: Setup Python
if: steps.check.outputs.python
uses: ./.github/actions/setup-backend/
- name: Run uv
if: steps.check.outputs.python
run: ./scripts/uv-pip-compile.sh
- name: Check for uncommitted changes
if: steps.check.outputs.python
run: |
echo "Full diff (for logging/debugging):"
git diff
echo "Filtered diff (excluding comments and whitespace):"
filtered_diff=$(git diff -U0 | grep '^[-+]' | grep -vE '^[-+]{3}' | grep -vE '^[-+][[:space:]]*#' | grep -vE '^[-+][[:space:]]*$' || true)
echo "$filtered_diff"
if [[ -n "$filtered_diff" ]]; then
echo
echo "ERROR: The pinned dependencies are not up-to-date."
echo "Please run './scripts/uv-pip-compile.sh' and commit the changes."
echo "More info: https://github.com/apache/superset/tree/master/requirements"
exit 1
else
echo "Pinned dependencies are up-to-date."
fi
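# Sketch of the filter above (not part of the workflow): previewing it locally, a hunk line such
# as "+# via uv" or a blank "+" is dropped, while a real pin change like "+numpy==1.26.4" survives
# and fails the job:
#   git diff -U0 | grep '^[-+]' | grep -vE '^[-+]{3}' | grep -vE '^[-+][[:space:]]*#' | grep -vE '^[-+][[:space:]]*$'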


@@ -3,39 +3,24 @@ on:
push:
paths:
- "superset/migrations/**"
branches:
- "master"
- "[0-9].[0-9]*"
pull_request:
paths:
- "superset/migrations/**"
types: [synchronize, opened, reopened, ready_for_review]
# cancel previous workflow jobs for PRs
concurrency:
group: ${{ github.workflow }}-${{ github.event.pull_request.number || github.run_id }}
cancel-in-progress: true
jobs:
check_db_migration_conflict:
name: Check DB migration conflict
runs-on: ubuntu-24.04
permissions:
contents: read
pull-requests: write
runs-on: ubuntu-20.04
steps:
- name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6
uses: actions/checkout@v2
- name: Check and notify
uses: actions/github-script@3a2844b7e9c422d3c10d287c895573f7108da1b3 # v9.0.0
uses: actions/github-script@v3
with:
github-token: ${{ github.token }}
github-token: ${{ secrets.GITHUB_TOKEN }}
script: |
// API reference: https://octokit.github.io/rest.js
const currentBranch = context.ref.replace('refs/heads/', '');
// Find all pull requests to current branch
const opts = github.rest.pulls.list.endpoint.merge({
const opts = github.pulls.list.endpoint.merge({
owner: context.repo.owner,
repo: context.repo.repo,
base: context.ref,
@@ -49,7 +34,7 @@ jobs:
}
for (const pull of pulls) {
const listFilesOpts = await github.rest.pulls.listFiles.endpoint.merge({
const listFilesOpts = await github.pulls.listFiles.endpoint.merge({
owner: context.repo.owner,
repo: context.repo.repo,
pull_number: pull.number,
@@ -59,7 +44,7 @@ jobs:
files.some(x => x.contents_url.includes('/contents/superset/migrations'))
) {
console.log(`PR #${pull.number} "${pull.title}" also added db migration`)
await github.rest.issues.createComment({
await github.issues.createComment({
issue_number: context.issue.number,
owner: context.repo.owner,
repo: context.repo.repo,
@@ -69,7 +54,7 @@ jobs:
`❗ @${pull.user.login} Your base branch \`${currentBranch}\` has ` +
'also updated `superset/migrations`.\n' +
'\n' +
'**Please consider rebasing your branch and [resolving potential db migration conflicts](https://superset.apache.org/docs/contributing/development#merging-db-migrations).**',
'**Please consider rebasing your branch and [resolving potential db migration conflicts](https://github.com/apache/superset/blob/master/CONTRIBUTING.md#merging-db-migrations).**',
});
}
}


@@ -1,82 +0,0 @@
name: Claude PR Assistant
on:
issue_comment:
types: [created]
pull_request_review_comment:
types: [created]
jobs:
check-permissions:
if: |
(github.event_name == 'issue_comment' && contains(github.event.comment.body, '@claude')) ||
(github.event_name == 'pull_request_review_comment' && contains(github.event.comment.body, '@claude'))
runs-on: ubuntu-latest
outputs:
allowed: ${{ steps.check.outputs.allowed }}
steps:
- name: Check if user is allowed
id: check
run: |
# List of allowed users
ALLOWED_USERS="mistercrunch,rusackas"
# Get the commenter's username
COMMENTER="${{ github.event.comment.user.login }}"
echo "Checking permissions for user: $COMMENTER"
# Check if user is in allowed list
if [[ ",$ALLOWED_USERS," == *",$COMMENTER,"* ]]; then
echo "allowed=true" >> $GITHUB_OUTPUT
echo "✅ User $COMMENTER is allowed to use Claude"
else
echo "allowed=false" >> $GITHUB_OUTPUT
echo "❌ User $COMMENTER is not allowed to use Claude"
fi
deny-access:
needs: check-permissions
if: needs.check-permissions.outputs.allowed == 'false'
runs-on: ubuntu-latest
permissions:
issues: write
pull-requests: write
steps:
- name: Comment access denied
uses: actions/github-script@3a2844b7e9c422d3c10d287c895573f7108da1b3 # v9.0.0
with:
script: |
const message = `👋 Hi @${{ github.event.comment.user.login || github.event.review.user.login || github.event.issue.user.login }}!
Thanks for trying to use Claude Code, but currently only certain team members have access to this feature.
If you believe you should have access, please contact a project maintainer.`;
await github.rest.issues.createComment({
owner: context.repo.owner,
repo: context.repo.repo,
issue_number: context.issue.number,
body: message
});
claude-code-action:
needs: check-permissions
if: needs.check-permissions.outputs.allowed == 'true'
runs-on: ubuntu-latest
permissions:
contents: write
pull-requests: write
issues: write
id-token: write
steps:
- name: Checkout repository
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6
with:
fetch-depth: 1
- name: Run Claude PR Action
uses: anthropics/claude-code-action@5fb899572b81d2bb648d4d187173a2f423a9677c # beta
with:
anthropic_api_key: ${{ secrets.ANTHROPIC_API_KEY }}
timeout_minutes: "60"

.github/workflows/codecov.sh vendored Executable file

File diff suppressed because it is too large


@@ -1,58 +0,0 @@
name: "CodeQL"
on:
push:
branches: ["master", "[0-9].[0-9]*"]
pull_request:
# The branches below must be a subset of the branches above
branches: ["master"]
schedule:
- cron: "0 4 * * *"
# cancel previous workflow jobs for PRs
concurrency:
group: ${{ github.workflow }}-${{ github.event.pull_request.number || github.run_id }}
cancel-in-progress: true
jobs:
analyze:
name: Analyze
runs-on: ubuntu-24.04
permissions:
actions: read
contents: read
security-events: write
strategy:
fail-fast: false
matrix:
language: ["python", "javascript"]
# Learn more about CodeQL language support at https://aka.ms/codeql-docs/language-support
steps:
- name: Checkout repository
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6
- name: Check for file changes
id: check
uses: ./.github/actions/change-detector/
with:
token: ${{ secrets.GITHUB_TOKEN }}
# Initializes the CodeQL tools for scanning.
- name: Initialize CodeQL
uses: github/codeql-action/init@v4
with:
languages: ${{ matrix.language }}
# If you wish to specify custom queries, you can do so here or in a config file.
# By default, queries listed here will override any specified in a config file.
# Prefix the list here with "+" to use these queries and those in the config file.
# For more details on CodeQL's query packs, refer to: https://docs.github.com/en/code-security/code-scanning/automatically-scanning-your-code-for-vulnerabilities-and-errors/configuring-code-scanning#using-queries-in-ql-packs
# queries: security-extended,security-and-quality
- name: Perform CodeQL Analysis
if: steps.check.outputs.python || steps.check.outputs.frontend
uses: github/codeql-action/analyze@v4
with:
category: "/language:${{matrix.language}}"


@@ -1,67 +0,0 @@
# Dependency Review Action
#
# This Action will scan dependency manifest files that change as part of a Pull Request, surfacing known-vulnerable versions of the packages declared or updated in the PR. Once installed, if the workflow run is marked as required, PRs introducing known-vulnerable packages will be blocked from merging.
#
# Source repository: https://github.com/actions/dependency-review-action
# Public documentation: https://docs.github.com/en/code-security/supply-chain-security/understanding-your-software-supply-chain/about-dependency-review#dependency-review-enforcement
name: "Dependency Review"
on:
push:
branches:
- "master"
- "[0-9].[0-9]*"
pull_request:
types: [synchronize, opened, reopened, ready_for_review]
# cancel previous workflow jobs for PRs
concurrency:
group: ${{ github.workflow }}-${{ github.event.pull_request.number || github.run_id }}
cancel-in-progress: true
permissions:
contents: read
jobs:
dependency-review:
if: github.event_name == 'pull_request'
runs-on: ubuntu-24.04
steps:
- name: "Checkout Repository"
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6
- name: "Dependency Review"
uses: actions/dependency-review-action@2031cfc080254a8a887f58cffee85186f0e49e48 # v4.9.0
continue-on-error: true
with:
fail-on-severity: critical
# compatible/incompatible licenses addressed here: https://www.apache.org/legal/resolved.html
# find SPDX identifiers here: https://spdx.org/licenses/
deny-licenses: MS-LPL, BUSL-1.1, QPL-1.0, Sleepycat, SSPL-1.0, CPOL-1.02, AGPL-3.0, GPL-1.0+, BSD-4-Clause-UC, NPL-1.0, NPL-1.1, JSON
# pkg:npm/store2@2.14.2
# adding an exception for an ambiguous license on store2, which has been resolved in
# the latest version. It's MIT: https://github.com/nbubna/store/blob/master/LICENSE-MIT
# pkg:npm/node-forge@1.3.1
# selecting BSD-3-Clause licensing terms for node-forge to ensure compatibility with Apache
allow-dependencies-licenses: pkg:npm/store2@2.14.2, pkg:npm/node-forge@1.3.1, pkg:npm/rgbcolor, pkg:npm/jszip@3.10.1
python-dependency-liccheck:
# NOTE: Configuration for liccheck lives in our pyproject.yml.
# You cannot use a liccheck.ini file in this workflow.
runs-on: ubuntu-22.04
steps:
- name: "Checkout Repository"
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6
- name: Setup Python
uses: ./.github/actions/setup-backend/
with:
requirements-type: base
- name: "Set up liccheck"
run: |
uv pip install --system liccheck
- name: "Run liccheck"
run: |
# run the checks
liccheck -R output.txt
# Print the report
cat output.txt


@@ -0,0 +1,78 @@
name: Push ephemeral env image
on:
workflow_run:
workflows: ["Docker"]
types:
- completed
jobs:
docker_ephemeral_env:
name: Push ephemeral env Docker image to ECR
if: github.event.workflow_run.event == 'pull_request' && github.event.workflow_run.conclusion == 'success'
runs-on: ubuntu-latest
steps:
- name: 'Download artifact'
uses: actions/github-script@v3.1.0
with:
script: |
const artifacts = await github.actions.listWorkflowRunArtifacts({
owner: context.repo.owner,
repo: context.repo.repo,
run_id: ${{ github.event.workflow_run.id }},
});
core.info('*** artifacts')
core.info(JSON.stringify(artifacts))
const matchArtifact = artifacts.data.artifacts.filter((artifact) => {
return artifact.name == "build"
})[0];
if(!matchArtifact) return core.setFailed("Build artifacts not found")
const download = await github.actions.downloadArtifact({
owner: context.repo.owner,
repo: context.repo.repo,
artifact_id: matchArtifact.id,
archive_format: 'zip',
});
var fs = require('fs');
fs.writeFileSync('${{github.workspace}}/build.zip', Buffer.from(download.data));
- run: unzip build.zip
- name: Display downloaded files (debug)
run: ls -la
- name: Get SHA
id: get-sha
run: echo "::set-output name=sha::$(cat ./SHA)"
- name: Get PR
id: get-pr
run: echo "::set-output name=num::$(cat ./PR-NUM)"
- name: Configure AWS credentials
uses: aws-actions/configure-aws-credentials@v1
with:
aws-access-key-id: ${{ secrets.AWS_ACCESS_KEY_ID }}
aws-secret-access-key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
aws-region: us-west-2
- name: Login to Amazon ECR
id: login-ecr
uses: aws-actions/amazon-ecr-login@v1
- name: Load, tag and push image to ECR
id: push-image
env:
ECR_REGISTRY: ${{ steps.login-ecr.outputs.registry }}
ECR_REPOSITORY: superset-ci
SHA: ${{ steps.get-sha.outputs.sha }}
IMAGE_TAG: pr-${{ steps.get-pr.outputs.num }}
run: |
docker load < $SHA.tar.gz
docker tag $SHA $ECR_REGISTRY/$ECR_REPOSITORY:$IMAGE_TAG
docker tag $SHA $ECR_REGISTRY/$ECR_REPOSITORY:$SHA
docker push -a $ECR_REGISTRY/$ECR_REPOSITORY

.github/workflows/docker-release.yml vendored Normal file

@@ -0,0 +1,22 @@
name: Docker
on:
release:
types: [published]
jobs:
docker-release:
name: docker-release
runs-on: ubuntu-latest
steps:
- name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
uses: actions/checkout@v2
with:
persist-credentials: false
submodules: recursive
ref: ${{ github.ref }}
- shell: bash
env:
DOCKERHUB_USER: ${{ secrets.DOCKERHUB_USER }}
DOCKERHUB_TOKEN: ${{ secrets.DOCKERHUB_TOKEN }}
run: |
.github/workflows/docker_build_push.sh


@@ -1,140 +1,42 @@
name: Build & publish docker images
name: Docker
on:
push:
branches:
- "master"
- "[0-9].[0-9]*"
- 'master'
pull_request:
branches:
- "master"
concurrency:
group: ${{ github.workflow }}-${{ github.event.pull_request.number || github.run_id }}
cancel-in-progress: true
types: [synchronize, opened, reopened, ready_for_review]
jobs:
setup_matrix:
runs-on: ubuntu-24.04
outputs:
matrix_config: ${{ steps.set_matrix.outputs.matrix_config }}
steps:
- id: set_matrix
run: |
MATRIX_CONFIG=$(if [ "${{ github.event_name }}" == "pull_request" ]; then echo '["dev", "lean"]'; else echo '["dev", "lean", "py310", "websocket", "dockerize", "py311", "py312"]'; fi)
echo "matrix_config=${MATRIX_CONFIG}" >> $GITHUB_OUTPUT
echo $GITHUB_OUTPUT
docker-build:
if: github.event.pull_request.draft == false
name: docker-build
needs: setup_matrix
runs-on: ubuntu-24.04
strategy:
matrix:
build_preset: ${{fromJson(needs.setup_matrix.outputs.matrix_config)}}
fail-fast: false
env:
DOCKERHUB_USER: ${{ secrets.DOCKERHUB_USER }}
DOCKERHUB_TOKEN: ${{ secrets.DOCKERHUB_TOKEN }}
IMAGE_TAG: apache/superset:GHA-${{ matrix.build_preset }}-${{ github.run_id }}
runs-on: ubuntu-latest
steps:
- name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6
uses: actions/checkout@v2
with:
persist-credentials: false
- name: Check for file changes
id: check
uses: ./.github/actions/change-detector/
with:
token: ${{ secrets.GITHUB_TOKEN }}
- name: Setup Docker Environment
if: steps.check.outputs.python || steps.check.outputs.frontend || steps.check.outputs.docker
uses: ./.github/actions/setup-docker
with:
dockerhub-user: ${{ secrets.DOCKERHUB_USER }}
dockerhub-token: ${{ secrets.DOCKERHUB_TOKEN }}
build: "true"
- name: Setup supersetbot
if: steps.check.outputs.python || steps.check.outputs.frontend || steps.check.outputs.docker
uses: ./.github/actions/setup-supersetbot/
- name: Build Docker Image
if: steps.check.outputs.python || steps.check.outputs.frontend || steps.check.outputs.docker
shell: bash
- shell: bash
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
DOCKERHUB_USER: ${{ secrets.DOCKERHUB_USER }}
DOCKERHUB_TOKEN: ${{ secrets.DOCKERHUB_TOKEN }}
run: |
# Single platform builds in pull_request context to speed things up
if [ "${{ github.event_name }}" = "push" ]; then
PLATFORM_ARG="--platform linux/arm64 --platform linux/amd64"
# can only --load images in single-platform builds
PUSH_OR_LOAD="--push"
elif [ "${{ github.event_name }}" = "pull_request" ]; then
PLATFORM_ARG="--platform linux/amd64"
PUSH_OR_LOAD="--load"
fi
.github/workflows/docker_build_push.sh
supersetbot docker \
$PUSH_OR_LOAD \
--preset ${{ matrix.build_preset }} \
--context "$EVENT" \
--context-ref "$RELEASE" $FORCE_LATEST \
--extra-flags "--build-arg INCLUDE_CHROMIUM=false --tag $IMAGE_TAG" \
$PLATFORM_ARG
# in the context of push (using multi-platform build), we need to pull the image locally
- name: Docker pull
if: github.event_name == 'push' && (steps.check.outputs.python || steps.check.outputs.frontend || steps.check.outputs.docker)
run: docker pull $IMAGE_TAG
- name: Print docker stats
if: steps.check.outputs.python || steps.check.outputs.frontend || steps.check.outputs.docker
- name: Build ephemeral env image
if: github.event_name == 'pull_request'
run: |
echo "SHA: ${{ github.sha }}"
echo "IMAGE: $IMAGE_TAG"
docker images $IMAGE_TAG
docker history $IMAGE_TAG
mkdir -p ./build
echo ${{ github.sha }} > ./build/SHA
echo ${{ github.event.pull_request.number }} > ./build/PR-NUM
docker build --target ci -t ${{ github.sha }} -t "pr-${{ github.event.pull_request.number }}" .
docker save ${{ github.sha }} | gzip > ./build/${{ github.sha }}.tar.gz
- name: docker-compose sanity check
if: (steps.check.outputs.python || steps.check.outputs.frontend || steps.check.outputs.docker) && matrix.build_preset == 'dev'
shell: bash
run: |
export SUPERSET_BUILD_TARGET=${{ matrix.build_preset }}
# This should reuse the CACHED image built in the previous steps
docker compose build superset-init --build-arg DEV_MODE=false --build-arg INCLUDE_CHROMIUM=false
docker compose up superset-init --exit-code-from superset-init
docker-compose-image-tag:
# Run this job only on pushes to master (not for PRs)
# goal is to check that building the latest image works, not required for all PR pushes
if: github.event_name == 'push' && github.ref == 'refs/heads/master'
runs-on: ubuntu-24.04
steps:
- name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6
- name: Upload build artifacts
if: github.event_name == 'pull_request'
uses: actions/upload-artifact@v2
with:
persist-credentials: false
- name: Check for file changes
id: check
uses: ./.github/actions/change-detector/
with:
token: ${{ secrets.GITHUB_TOKEN }}
- name: Setup Docker Environment
if: steps.check.outputs.docker
uses: ./.github/actions/setup-docker
with:
dockerhub-user: ${{ secrets.DOCKERHUB_USER }}
dockerhub-token: ${{ secrets.DOCKERHUB_TOKEN }}
build: "false"
install-docker-compose: "true"
- name: docker-compose sanity check
if: steps.check.outputs.docker
shell: bash
run: |
docker compose -f docker-compose-image-tag.yml up superset-init --exit-code-from superset-init
name: build
path: build/

.github/workflows/docker_build_push.sh vendored Executable file

@@ -0,0 +1,80 @@
#!/usr/bin/env bash
#
# Licensed to the Apache Software Foundation (ASF) under one or more
# contributor license agreements. See the NOTICE file distributed with
# this work for additional information regarding copyright ownership.
# The ASF licenses this file to You under the Apache License, Version 2.0
# (the "License"); you may not use this file except in compliance with
# the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
set -eo pipefail
SHA=$(git rev-parse HEAD)
REPO_NAME="apache/superset"
if [[ "${GITHUB_EVENT_NAME}" == "pull_request" ]]; then
REFSPEC=$(echo "${GITHUB_HEAD_REF}" | sed 's/[^a-zA-Z0-9]/-/g' | head -c 40)
PR_NUM=$(echo "${GITHUB_REF}" | sed 's:refs/pull/::' | sed 's:/merge::')
LATEST_TAG="pr-${PR_NUM}"
elif [[ "${GITHUB_EVENT_NAME}" == "release" ]]; then
REFSPEC=$(echo "${GITHUB_REF}" | sed 's:refs/tags/::' | head -c 40)
LATEST_TAG="${REFSPEC}"
else
REFSPEC=$(echo "${GITHUB_REF}" | sed 's:refs/heads/::' | sed 's/[^a-zA-Z0-9]/-/g' | head -c 40)
LATEST_TAG="${REFSPEC}"
fi
if [[ "${REFSPEC}" == "master" ]]; then
LATEST_TAG="latest"
fi
cat<<EOF
Rolling with tags:
- ${REPO_NAME}:${SHA}
- ${REPO_NAME}:${REFSPEC}
- ${REPO_NAME}:${LATEST_TAG}
EOF
#
# Build the "lean" image
#
docker build --target lean \
-t "${REPO_NAME}:${SHA}" \
-t "${REPO_NAME}:${REFSPEC}" \
-t "${REPO_NAME}:${LATEST_TAG}" \
--label "sha=${SHA}" \
--label "built_at=$(date)" \
--label "target=lean" \
--label "build_actor=${GITHUB_ACTOR}" \
.
#
# Build the dev image
#
docker build --target dev \
-t "${REPO_NAME}:${SHA}-dev" \
-t "${REPO_NAME}:${REFSPEC}-dev" \
-t "${REPO_NAME}:${LATEST_TAG}-dev" \
--label "sha=${SHA}" \
--label "built_at=$(date)" \
--label "target=dev" \
--label "build_actor=${GITHUB_ACTOR}" \
.
if [ -z "${DOCKERHUB_TOKEN}" ]; then
# Skip if secrets aren't populated -- they're only visible for actions running in the repo (not on forks)
echo "Skipping Docker push"
else
# Login and push
docker logout
docker login --username "${DOCKERHUB_USER}" --password "${DOCKERHUB_TOKEN}"
docker push --all-tags "${REPO_NAME}"
fi
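# Worked example (illustrative): for a push to master, REFSPEC resolves to "master" and
# LATEST_TAG to "latest", so the builds above are tagged
#   apache/superset:<sha>, apache/superset:master, apache/superset:latest
# plus the matching "-dev" variants for the dev target.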


@@ -23,14 +23,6 @@
{
"name": "SUPERSET_PORT",
"value": "8080"
},
{
"name": "SUPERSET_SECRET_KEY",
"value": "super-secret-for-ephemerals"
},
{
"name": "TALISMAN_ENABLED",
"value": "False"
}
],
"mountPoints": [],


@@ -3,37 +3,19 @@ name: Embedded SDK Release
on:
push:
branches:
- "master"
- "[0-9].[0-9]*"
- 'master'
jobs:
config:
runs-on: ubuntu-24.04
outputs:
has-secrets: ${{ steps.check.outputs.has-secrets }}
steps:
- name: "Check for secrets"
id: check
shell: bash
run: |
if [ -n "${NPM_TOKEN}" ]; then
echo "has-secrets=1" >> "$GITHUB_OUTPUT"
fi
env:
NPM_TOKEN: ${{ (secrets.NPM_TOKEN != '') || '' }}
build:
needs: config
if: needs.config.outputs.has-secrets
runs-on: ubuntu-24.04
runs-on: ubuntu-20.04
defaults:
run:
working-directory: superset-embedded-sdk
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6
- uses: actions/setup-node@48b55a011bda9f5d6aeb4c2d9c7362e8dae4041e # v6
- uses: actions/checkout@v2
- uses: actions/setup-node@v2
with:
node-version-file: './superset-embedded-sdk/.nvmrc'
node-version: "16"
registry-url: 'https://registry.npmjs.org'
- run: npm ci
- run: npm run ci:release


@@ -6,22 +6,18 @@ on:
- "superset-embedded-sdk/**"
types: [synchronize, opened, reopened, ready_for_review]
# cancel previous workflow jobs for PRs
concurrency:
group: ${{ github.workflow }}-${{ github.event.pull_request.number || github.run_id }}
cancel-in-progress: true
jobs:
embedded-sdk-test:
runs-on: ubuntu-24.04
if: github.event.pull_request.draft == false
runs-on: ubuntu-20.04
defaults:
run:
working-directory: superset-embedded-sdk
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6
- uses: actions/setup-node@48b55a011bda9f5d6aeb4c2d9c7362e8dae4041e # v6
- uses: actions/checkout@v2
- uses: actions/setup-node@v2
with:
node-version-file: './superset-embedded-sdk/.nvmrc'
node-version: "16"
registry-url: 'https://registry.npmjs.org'
- run: npm ci
- run: npm test


@@ -1,41 +1,16 @@
name: Cleanup ephemeral envs (PR close) [DEPRECATED]
# ⚠️ DEPRECATION NOTICE ⚠️
# This workflow is deprecated and will be removed in a future version.
# The new Superset Showtime workflow handles cleanup automatically.
# See .github/workflows/showtime.yml and showtime-cleanup.yml for replacements.
# Migration guide: https://github.com/mistercrunch/superset-showtime
name: Cleanup ephemeral envs (PR close)
on:
pull_request_target:
types: [closed]
jobs:
config:
runs-on: ubuntu-24.04
outputs:
has-secrets: ${{ steps.check.outputs.has-secrets }}
steps:
- name: "Check for secrets"
id: check
shell: bash
run: |
if [ -n "${AWS_ACCESS_KEY_ID}" ]; then
echo "has-secrets=1" >> "$GITHUB_OUTPUT"
fi
env:
AWS_ACCESS_KEY_ID: ${{ (secrets.AWS_ACCESS_KEY_ID != '' && secrets.AWS_SECRET_ACCESS_KEY != '') || '' }}
ephemeral-env-cleanup:
needs: config
if: needs.config.outputs.has-secrets
name: Cleanup ephemeral envs
runs-on: ubuntu-24.04
permissions:
pull-requests: write
runs-on: ubuntu-latest
steps:
- name: Configure AWS credentials
uses: aws-actions/configure-aws-credentials@8df5847569e6427dd6c4fb1cf565c83acfa8afa7 # v6
uses: aws-actions/configure-aws-credentials@v1
with:
aws-access-key-id: ${{ secrets.AWS_ACCESS_KEY_ID }}
aws-secret-access-key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
@@ -44,7 +19,7 @@ jobs:
- name: Describe ECS service
id: describe-services
run: |
echo "active=$(aws ecs describe-services --cluster superset-ci --services pr-${{ github.event.number }}-service | jq '.services[] | select(.status == "ACTIVE") | any')" >> $GITHUB_OUTPUT
echo "::set-output name=active::$(aws ecs describe-services --cluster superset-ci --services pr-${{ github.event.number }}-service | jq '.services[] | select(.status == "ACTIVE") | any')"
- name: Delete ECS service
if: steps.describe-services.outputs.active == 'true'
@@ -58,7 +33,7 @@ jobs:
- name: Login to Amazon ECR
if: steps.describe-services.outputs.active == 'true'
id: login-ecr
uses: aws-actions/amazon-ecr-login@19d944daaa35f0fa1d3f7f8af1d3f2e5de25c5b7 # v2
uses: aws-actions/amazon-ecr-login@v1
- name: Delete ECR image tag
if: steps.describe-services.outputs.active == 'true'
@@ -71,13 +46,13 @@ jobs:
- name: Comment (success)
if: steps.describe-services.outputs.active == 'true'
uses: actions/github-script@3a2844b7e9c422d3c10d287c895573f7108da1b3 # v9.0.0
uses: actions/github-script@v3
with:
github-token: ${{github.token}}
github-token: ${{secrets.GITHUB_TOKEN}}
script: |
github.rest.issues.createComment({
github.issues.createComment({
issue_number: ${{ github.event.number }},
owner: context.repo.owner,
repo: context.repo.repo,
body: '⚠️ **DEPRECATED WORKFLOW** - Ephemeral environment shutdown and build artifacts deleted. Please migrate to the new Superset Showtime system for future PRs.'
body: 'Ephemeral environment shutdown and build artifacts deleted.'
})


@@ -1,350 +1,194 @@
name: Ephemeral env workflow [DEPRECATED]
# ⚠️ DEPRECATION NOTICE ⚠️
# This workflow is deprecated and will be removed in a future version.
# Please use the new Superset Showtime workflow instead:
# - Use label "🎪 trigger-start" instead of "testenv-up"
# - Showtime provides better reliability and easier management
# - See .github/workflows/showtime.yml for the replacement
# - Migration guide: https://github.com/mistercrunch/superset-showtime
# Example manual trigger:
# gh workflow run ephemeral-env.yml --ref fix_ephemerals --field label_name="testenv-up" --field issue_number=666
name: Ephemeral env workflow
on:
pull_request_target:
types:
- labeled
workflow_dispatch:
inputs:
label_name:
description: 'Label name to simulate label-based /testenv trigger'
required: true
default: 'testenv-up'
issue_number:
description: 'Issue or PR number'
required: true
issue_comment:
types: [created]
jobs:
ephemeral-env-label:
concurrency:
group: ${{ github.workflow }}-${{ github.event.pull_request.number || github.run_id }}-label
cancel-in-progress: true
name: Evaluate ephemeral env label trigger
runs-on: ubuntu-24.04
permissions:
pull-requests: write
ephemeral_env_comment:
if: github.event.issue.pull_request
name: Evaluate ephemeral env comment trigger (/testenv)
runs-on: ubuntu-latest
outputs:
slash-command: ${{ steps.eval-label.outputs.result }}
slash-command: ${{ steps.eval-body.outputs.result }}
feature-flags: ${{ steps.eval-feature-flags.outputs.result }}
sha: ${{ steps.get-sha.outputs.sha }}
env:
DOCKERHUB_USER: ${{ secrets.DOCKERHUB_USER }}
DOCKERHUB_TOKEN: ${{ secrets.DOCKERHUB_TOKEN }}
steps:
- name: Check for the "testenv-up" label
id: eval-label
run: |
if [[ "${{ github.event_name }}" == "workflow_dispatch" ]]; then
LABEL_NAME="${INPUT_LABEL_NAME}"
else
LABEL_NAME="${{ github.event.label.name }}"
fi
- name: Debug
run: |
echo "Comment on PR #${{ github.event.issue.number }} by ${{ github.event.issue.user.login }}, ${{ github.event.comment.author_association }}"
echo "Evaluating label: $LABEL_NAME"
- name: Eval comment body for /testenv slash command
uses: actions/github-script@v3
id: eval-body
with:
result-encoding: string
script: |
const pattern = /^\/testenv (up|down)/
const result = pattern.exec(context.payload.comment.body)
return result === null ? 'noop' : result[1]
if [[ "$LABEL_NAME" == "testenv-up" ]]; then
echo "result=up" >> $GITHUB_OUTPUT
else
echo "result=noop" >> $GITHUB_OUTPUT
fi
- name: Eval comment body for feature flags
uses: actions/github-script@v3
id: eval-feature-flags
with:
script: |
const pattern = /FEATURE_(\w+)=(\w+)/g;
let results = [];
[...context.payload.comment.body.matchAll(pattern)].forEach(match => {
const config = {
name: `SUPERSET_FEATURE_${match[1]}`,
value: match[2],
};
results.push(config);
});
return results;
env:
INPUT_LABEL_NAME: ${{ github.event.inputs.label_name }}
- name: Get event SHA
id: get-sha
if: steps.eval-label.outputs.result == 'up'
uses: actions/github-script@3a2844b7e9c422d3c10d287c895573f7108da1b3 # v9.0.0
with:
github-token: ${{ secrets.GITHUB_TOKEN }}
script: |
let prSha;
- name: Limit to committers
if: >
steps.eval-body.outputs.result != 'noop' &&
github.event.comment.author_association != 'MEMBER' &&
github.event.comment.author_association != 'OWNER'
uses: actions/github-script@v3
with:
github-token: ${{secrets.GITHUB_TOKEN}}
script: |
const errMsg = '@${{ github.event.comment.user.login }} Ephemeral environment creation is currently limited to committers.'
github.issues.createComment({
issue_number: ${{ github.event.issue.number }},
owner: context.repo.owner,
repo: context.repo.repo,
body: errMsg
})
core.setFailed(errMsg)
// If event is workflow_dispatch, use the issue_number from inputs
if (context.eventName === "workflow_dispatch") {
const prNumber = "${{ github.event.inputs.issue_number }}";
if (!prNumber) {
console.log("No PR number found.");
return;
}
// Fetch PR details using the provided issue_number
const { data: pr } = await github.rest.pulls.get({
owner: context.repo.owner,
repo: context.repo.repo,
pull_number: prNumber
});
prSha = pr.head.sha;
} else {
// If it's not workflow_dispatch, use the PR head sha from the event
prSha = context.payload.pull_request.head.sha;
}
console.log(`PR SHA: ${prSha}`);
core.setOutput("sha", prSha);
- name: Looking for feature flags in PR description
uses: actions/github-script@3a2844b7e9c422d3c10d287c895573f7108da1b3 # v9.0.0
id: eval-feature-flags
if: steps.eval-label.outputs.result == 'up'
with:
script: |
const description = context.payload.pull_request
? context.payload.pull_request.body || ''
: context.payload.inputs.pr_description || '';
const pattern = /FEATURE_(\w+)=(\w+)/g;
let results = [];
[...description.matchAll(pattern)].forEach(match => {
const config = {
name: `SUPERSET_FEATURE_${match[1]}`,
value: match[2],
};
results.push(config);
});
return results;
- name: Reply with confirmation comment
uses: actions/github-script@3a2844b7e9c422d3c10d287c895573f7108da1b3 # v9.0.0
if: steps.eval-label.outputs.result == 'up'
with:
github-token: ${{ secrets.GITHUB_TOKEN }}
script: |
const action = '${{ steps.eval-label.outputs.result }}';
const user = context.actor;
const runId = context.runId;
const workflowUrl = `${context.serverUrl}/${context.repo.owner}/${context.repo.repo}/actions/runs/${runId}`;
const issueNumber = context.payload.pull_request
? context.payload.pull_request.number
: context.payload.inputs.issue_number;
if (!issueNumber) {
throw new Error("Issue number is not available.");
}
const body = `⚠️ **DEPRECATED WORKFLOW** ⚠️\n\n@${user} This workflow is deprecated! Please use the new **Superset Showtime** system instead:\n\n` +
`- Replace "testenv-up" label with "🎪 trigger-start"\n` +
`- Better reliability and easier management\n` +
`- See https://github.com/mistercrunch/superset-showtime for details\n\n` +
`Processing your ephemeral environment request [here](${workflowUrl}). Action: **${action}**.` +
` More information on [how to use or configure ephemeral environments]` +
`(https://superset.apache.org/docs/contributing/howtos/#github-ephemeral-environments)`;
await github.rest.issues.createComment({
owner: context.repo.owner,
repo: context.repo.repo,
issue_number: issueNumber,
body,
});
ephemeral-docker-build:
concurrency:
group: ${{ github.workflow }}-${{ github.event.pull_request.number || github.run_id }}-build
cancel-in-progress: true
needs: ephemeral-env-label
if: needs.ephemeral-env-label.outputs.slash-command == 'up'
name: ephemeral-docker-build
runs-on: ubuntu-24.04
steps:
- name: "Checkout ${{ github.ref }} ( ${{ needs.ephemeral-env-label.outputs.sha }} : ${{steps.get-sha.outputs.sha}} )"
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6
with:
ref: ${{ needs.ephemeral-env-label.outputs.sha }}
persist-credentials: false
- name: Setup Docker Environment
uses: ./.github/actions/setup-docker
with:
dockerhub-user: ${{ secrets.DOCKERHUB_USER }}
dockerhub-token: ${{ secrets.DOCKERHUB_TOKEN }}
build: "true"
install-docker-compose: "false"
- name: Setup supersetbot
uses: ./.github/actions/setup-supersetbot/
- name: Build ephemeral env image
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
run: |
supersetbot docker \
--push \
--load \
--preset ci \
--platform linux/amd64 \
--context-ref "$RELEASE" \
--extra-flags "--build-arg INCLUDE_CHROMIUM=false"
- name: Configure AWS credentials
uses: aws-actions/configure-aws-credentials@8df5847569e6427dd6c4fb1cf565c83acfa8afa7 # v6
with:
aws-access-key-id: ${{ secrets.AWS_ACCESS_KEY_ID }}
aws-secret-access-key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
aws-region: us-west-2
- name: Login to Amazon ECR
id: login-ecr
uses: aws-actions/amazon-ecr-login@19d944daaa35f0fa1d3f7f8af1d3f2e5de25c5b7 # v2
- name: Load, tag and push image to ECR
id: push-image
env:
ECR_REGISTRY: ${{ steps.login-ecr.outputs.registry }}
ECR_REPOSITORY: superset-ci
IMAGE_TAG: apache/superset:${{ needs.ephemeral-env-label.outputs.sha }}-ci
PR_NUMBER: ${{ github.event.inputs.issue_number || github.event.pull_request.number }}
run: |
docker tag $IMAGE_TAG $ECR_REGISTRY/$ECR_REPOSITORY:pr-$PR_NUMBER-ci
docker push -a $ECR_REGISTRY/$ECR_REPOSITORY
ephemeral-env-up:
needs: [ephemeral-env-label, ephemeral-docker-build]
if: needs.ephemeral-env-label.outputs.slash-command == 'up'
ephemeral_env_up:
needs: ephemeral_env_comment
if: needs.ephemeral_env_comment.outputs.slash-command == 'up'
name: Spin up an ephemeral environment
runs-on: ubuntu-24.04
permissions:
contents: read
pull-requests: write
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6
with:
persist-credentials: false
- uses: actions/checkout@v2
with:
persist-credentials: false
- name: Configure AWS credentials
uses: aws-actions/configure-aws-credentials@8df5847569e6427dd6c4fb1cf565c83acfa8afa7 # v6
with:
aws-access-key-id: ${{ secrets.AWS_ACCESS_KEY_ID }}
aws-secret-access-key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
aws-region: us-west-2
- name: Configure AWS credentials
uses: aws-actions/configure-aws-credentials@v1
with:
aws-access-key-id: ${{ secrets.AWS_ACCESS_KEY_ID }}
aws-secret-access-key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
aws-region: us-west-2
- name: Login to Amazon ECR
id: login-ecr
uses: aws-actions/amazon-ecr-login@19d944daaa35f0fa1d3f7f8af1d3f2e5de25c5b7 # v2
- name: Login to Amazon ECR
id: login-ecr
uses: aws-actions/amazon-ecr-login@v1
- name: Check target image exists in ECR
id: check-image
continue-on-error: true
env:
PR_NUMBER: ${{ github.event.inputs.issue_number || github.event.pull_request.number }}
run: |
aws ecr describe-images \
--registry-id $(echo "${{ steps.login-ecr.outputs.registry }}" | grep -Eo "^[0-9]+") \
--repository-name superset-ci \
--image-ids imageTag=pr-$PR_NUMBER-ci
- name: Check target image exists in ECR
id: check-image
continue-on-error: true
run: |
aws ecr describe-images \
--registry-id $(echo "${{ steps.login-ecr.outputs.registry }}" | grep -Eo "^[0-9]+") \
--repository-name superset-ci \
--image-ids imageTag=pr-${{ github.event.issue.number }}
- name: Fail on missing container image
if: steps.check-image.outcome == 'failure'
uses: actions/github-script@3a2844b7e9c422d3c10d287c895573f7108da1b3 # v9.0.0
with:
github-token: ${{ github.token }}
script: |
const errMsg = '@${{ github.event.comment.user.login }} Container image not yet published for this PR. Please try again when build is complete.';
github.rest.issues.createComment({
issue_number: ${{ github.event.inputs.issue_number || github.event.pull_request.number }},
owner: context.repo.owner,
repo: context.repo.repo,
body: errMsg
});
core.setFailed(errMsg);
- name: Fail on missing container image
if: steps.check-image.outcome == 'failure'
uses: actions/github-script@v3
with:
github-token: ${{secrets.GITHUB_TOKEN}}
script: |
const errMsg = '@${{ github.event.comment.user.login }} Container image not yet published for this PR. Please try again when build is complete.'
github.issues.createComment({
issue_number: ${{ github.event.issue.number }},
owner: context.repo.owner,
repo: context.repo.repo,
body: errMsg
})
core.setFailed(errMsg)
- name: Fill in the new image ID in the Amazon ECS task definition
id: task-def
uses: aws-actions/amazon-ecs-render-task-definition@77954e213ba1f9f9cb016b86a1d4f6fcdea0d57e # v1
with:
task-definition: .github/workflows/ecs-task-definition.json
container-name: superset-ci
image: ${{ steps.login-ecr.outputs.registry }}/superset-ci:pr-${{ github.event.inputs.issue_number || github.event.pull_request.number }}-ci
- name: Fill in the new image ID in the Amazon ECS task definition
id: task-def
uses: aws-actions/amazon-ecs-render-task-definition@v1
with:
task-definition: .github/workflows/ecs-task-definition.json
container-name: superset-ci
image: ${{ steps.login-ecr.outputs.registry }}/superset-ci:pr-${{ github.event.issue.number }}
- name: Update env vars in the Amazon ECS task definition
run: |
cat <<< "$(jq '.containerDefinitions[0].environment += ${{ needs.ephemeral-env-label.outputs.feature-flags }}' < ${{ steps.task-def.outputs.task-definition }})" > ${{ steps.task-def.outputs.task-definition }}
- name: Update env vars in the Amazon ECS task definition
run: |
cat <<< "$(jq '.containerDefinitions[0].environment += ${{ needs.ephemeral_env_comment.outputs.feature-flags }}' < ${{ steps.task-def.outputs.task-definition }})" > ${{ steps.task-def.outputs.task-definition }}
- name: Describe ECS service
id: describe-services
run: |
echo "active=$(aws ecs describe-services --cluster superset-ci --services pr-${INPUT_ISSUE_NUMBER}-service | jq '.services[] | select(.status == "ACTIVE") | any')" >> $GITHUB_OUTPUT
env:
INPUT_ISSUE_NUMBER: ${{ github.event.inputs.issue_number || github.event.pull_request.number }}
- name: Create ECS service
id: create-service
if: steps.describe-services.outputs.active != 'true'
env:
ECR_SUBNETS: subnet-0e15a5034b4121710,subnet-0e8efef4a72224974
ECR_SECURITY_GROUP: sg-092ff3a6ae0574d91
PR_NUMBER: ${{ github.event.inputs.issue_number || github.event.pull_request.number }}
run: |
aws ecs create-service \
--cluster superset-ci \
--service-name pr-$PR_NUMBER-service \
--task-definition superset-ci \
--launch-type FARGATE \
--desired-count 1 \
--platform-version LATEST \
--network-configuration "awsvpcConfiguration={subnets=[$ECR_SUBNETS],securityGroups=[$ECR_SECURITY_GROUP],assignPublicIp=ENABLED}" \
--tags key=pr,value=$PR_NUMBER key=github_user,value=${{ github.actor }}
- name: Deploy Amazon ECS task definition
id: deploy-task
uses: aws-actions/amazon-ecs-deploy-task-definition@fc8fc60f3a60ffd500fcb13b209c59d221ac8c8c # v2
with:
task-definition: ${{ steps.task-def.outputs.task-definition }}
service: pr-${{ github.event.inputs.issue_number || github.event.pull_request.number }}-service
cluster: superset-ci
wait-for-service-stability: true
wait-for-minutes: 10
- name: Describe ECS service
id: describe-services
run: |
echo "::set-output name=active::$(aws ecs describe-services --cluster superset-ci --services pr-${{ github.event.issue.number }}-service | jq '.services[] | select(.status == "ACTIVE") | any')"
- name: List tasks
id: list-tasks
run: |
echo "task=$(aws ecs list-tasks --cluster superset-ci --service-name pr-${INPUT_ISSUE_NUMBER}-service | jq '.taskArns | first')" >> $GITHUB_OUTPUT
env:
INPUT_ISSUE_NUMBER: ${{ github.event.inputs.issue_number || github.event.pull_request.number }}
- name: Get network interface
id: get-eni
run: |
echo "eni=$(aws ecs describe-tasks --cluster superset-ci --tasks ${{ steps.list-tasks.outputs.task }} | jq '.tasks[0].attachments[0].details | map(select(.name=="networkInterfaceId"))[0].value')" >> $GITHUB_OUTPUT
- name: Get public IP
id: get-ip
run: |
echo "ip=$(aws ec2 describe-network-interfaces --network-interface-ids ${{ steps.get-eni.outputs.eni }} | jq -r '.NetworkInterfaces | first | .Association.PublicIp')" >> $GITHUB_OUTPUT
- name: Comment (success)
if: ${{ success() }}
uses: actions/github-script@3a2844b7e9c422d3c10d287c895573f7108da1b3 # v9.0.0
with:
github-token: ${{github.token}}
script: |
const issue_number = context.payload.inputs?.issue_number || context.issue.number;
github.rest.issues.createComment({
issue_number: issue_number,
owner: context.repo.owner,
repo: context.repo.repo,
body: `@${{ github.actor }} Ephemeral environment spinning up at http://${{ steps.get-ip.outputs.ip }}:8080. Credentials are 'admin'/'admin'. Please allow several minutes for bootstrapping and startup.`
});
- name: Comment (failure)
if: ${{ failure() }}
uses: actions/github-script@3a2844b7e9c422d3c10d287c895573f7108da1b3 # v9.0.0
with:
github-token: ${{github.token}}
script: |
const issue_number = context.payload.inputs?.issue_number || context.issue.number;
github.rest.issues.createComment({
issue_number: issue_number,
owner: context.repo.owner,
repo: context.repo.repo,
body: '@${{ github.event.inputs.user_login || github.event.comment.user.login }} Ephemeral environment creation failed. Please check the Actions logs for details.'
})
- name: Create ECS service
if: steps.describe-services.outputs.active != 'true'
id: create-service
env:
ECR_SUBNETS: subnet-0e15a5034b4121710,subnet-0e8efef4a72224974
ECR_SECURITY_GROUP: sg-092ff3a6ae0574d91
run: |
aws ecs create-service \
--cluster superset-ci \
--service-name pr-${{ github.event.issue.number }}-service \
--task-definition superset-ci \
--launch-type FARGATE \
--desired-count 1 \
--platform-version LATEST \
--network-configuration "awsvpcConfiguration={subnets=[$ECR_SUBNETS],securityGroups=[$ECR_SECURITY_GROUP],assignPublicIp=ENABLED}" \
--tags key=pr,value=${{ github.event.issue.number }} key=github_user,value=${{ github.actor }}
- name: Deploy Amazon ECS task definition
id: deploy-task
uses: aws-actions/amazon-ecs-deploy-task-definition@v1
with:
task-definition: ${{ steps.task-def.outputs.task-definition }}
service: pr-${{ github.event.issue.number }}-service
cluster: superset-ci
wait-for-service-stability: true
wait-for-minutes: 10
- name: List tasks
id: list-tasks
run: |
echo "::set-output name=task::$(aws ecs list-tasks --cluster superset-ci --service-name pr-${{ github.event.issue.number }}-service | jq '.taskArns | first')"
- name: Get network interface
id: get-eni
run: |
echo "::set-output name=eni::$(aws ecs describe-tasks --cluster superset-ci --tasks ${{ steps.list-tasks.outputs.task }} | jq '.tasks | .[0] | .attachments | .[0] | .details | map(select(.name=="networkInterfaceId")) | .[0] | .value')"
- name: Get public IP
id: get-ip
run: |
echo "::set-output name=ip::$(aws ec2 describe-network-interfaces --network-interface-ids ${{ steps.get-eni.outputs.eni }} | jq -r '.NetworkInterfaces | first | .Association.PublicIp')"
- name: Comment (success)
if: ${{ success() }}
uses: actions/github-script@v3
with:
github-token: ${{secrets.GITHUB_TOKEN}}
script: |
github.issues.createComment({
issue_number: ${{ github.event.issue.number }},
owner: context.repo.owner,
repo: context.repo.repo,
body: '@${{ github.event.comment.user.login }} Ephemeral environment spinning up at http://${{ steps.get-ip.outputs.ip }}:8080. Credentials are `admin`/`admin`. Please allow several minutes for bootstrapping and startup.'
})
- name: Comment (failure)
if: ${{ failure() }}
uses: actions/github-script@v3
with:
github-token: ${{secrets.GITHUB_TOKEN}}
script: |
github.issues.createComment({
issue_number: ${{ github.event.issue.number }},
owner: context.repo.owner,
repo: context.repo.repo,
body: '@${{ github.event.comment.user.login }} Ephemeral environment creation failed. Please check the Actions logs for details.'
})
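# Feature-flag example (hypothetical flag name): a line such as FEATURE_EMBEDDED_SUPERSET=True in
# the PR description or /testenv comment matches the FEATURE_(\w+)=(\w+) pattern above and is
# appended to the ECS task definition as the environment variable SUPERSET_FEATURE_EMBEDDED_SUPERSET=True.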


@@ -1,67 +0,0 @@
name: Generate FOSSA report
on:
push:
branches:
- "master"
- "[0-9].[0-9]*"
jobs:
config:
runs-on: ubuntu-24.04
outputs:
has-secrets: ${{ steps.check.outputs.has-secrets }}
steps:
- name: "Check for secrets"
id: check
shell: bash
run: |
if [ -n "${FOSSA_API_KEY}" ]; then
echo "has-secrets=1" >> "$GITHUB_OUTPUT"
fi
env:
FOSSA_API_KEY: ${{ (secrets.FOSSA_API_KEY != '' ) || '' }}
license_check:
needs: config
if: needs.config.outputs.has-secrets
name: Generate Report
runs-on: ubuntu-24.04
steps:
- name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6
with:
persist-credentials: false
submodules: recursive
- name: Setup Java
uses: actions/setup-java@be666c2fcd27ec809703dec50e508c2fdc7f6654 # v5
with:
distribution: "temurin"
java-version: "11"
- name: Generate fossa report
env:
FOSSA_API_KEY: ${{ secrets.FOSSA_API_KEY }}
run: |
set -eo pipefail
if [[ "${{github.event_name}}" != "pull_request" ]]; then
./scripts/fossa.sh
exit 0
fi
URL="https://api.github.com/repos/${{ github.repository }}/pulls/${{ github.event.pull_request.number }}/files"
FILES=$(curl -s -X GET -G $URL | jq -r '.[] | .filename')
cat<<EOF
CHANGED FILES:
$FILES
EOF
if [[ "${FILES}" =~ (.*package*\.json|requirements\/[a-z_-]+\.txt|setup\.py) ]]; then
echo "Detected dependency changes... running fossa check"
./scripts/fossa.sh
else
echo "No dependency changes... skiping fossa check"
fi
shell: bash


@@ -1,28 +0,0 @@
#!/bin/bash
#
# Licensed to the Apache Software Foundation (ASF) under one or more
# contributor license agreements. See the NOTICE file distributed with
# this work for additional information regarding copyright ownership.
# The ASF licenses this file to You under the Apache License, Version 2.0
# (the "License"); you may not use this file except in compliance with
# the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
# Inspired from https://github.com/mpalmer/action-validator?tab=readme-ov-file#pre-commit-hook-example
echo "Running pre-commit hook for GitHub Actions: https://github.com/mpalmer/action-validator"
for action in $(git ls-files .github/ | grep -E '^\.github/(workflows|actions)/.*\.ya?ml$'); do
if action-validator "$action"; then
echo "$action"
else
echo "$action"
exit 1
fi
done


@@ -1,28 +0,0 @@
name: Validate All GitHub Actions
on:
push:
branches:
- "master"
- "[0-9].[0-9]*"
pull_request:
types: [synchronize, opened, reopened, ready_for_review]
jobs:
validate-all-ghas:
runs-on: ubuntu-24.04
steps:
- name: Checkout Repository
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6
- name: Set up Node.js
uses: actions/setup-node@48b55a011bda9f5d6aeb4c2d9c7362e8dae4041e # v6
with:
node-version: '20'
- name: Install Dependencies
run: npm install -g @action-validator/core @action-validator/cli --save-dev
- name: Run Script
run: bash .github/workflows/github-action-validator.sh


@@ -1,34 +0,0 @@
name: supersetbot orglabel based on author
on:
issues:
types: [created, edited]
pull_request:
types: [created, edited]
jobs:
superbot-orglabel:
runs-on: ubuntu-24.04
permissions:
contents: read
pull-requests: write
issues: write
steps:
- name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6
with:
persist-credentials: false
- name: Setup supersetbot
uses: ./.github/actions/setup-supersetbot/
- name: Execute supersetbot orglabel command
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
run: |
# Label the issue with the appropriate org using supersetbot
# - this requires the author to be publicly associated with their org
# - and for the org to be listed in `supersetbot/src/metadata.js`
supersetbot orglabel --issue ${{ github.event.number }} --repo ${{ github.repository }} || true


@@ -1,21 +0,0 @@
name: "Pull Request Labeler"
on:
- pull_request_target
jobs:
labeler:
permissions:
contents: read
pull-requests: write
runs-on: ubuntu-24.04
steps:
- uses: actions/labeler@v6
with:
sync-labels: true
# TODO: run scripts based on labels!
# - id: run-translation-scripts
# if: contains(steps.label-the-PR.outputs.all-labels, 'i18n')
# run: |
# echo "Running translation scripts"
# # Generate .pot -> .po -> .json files


@@ -6,13 +6,11 @@ on:
jobs:
latest-release:
name: Add/update tag to new release
runs-on: ubuntu-24.04
permissions:
contents: write
runs-on: ubuntu-latest
steps:
- name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6
uses: actions/checkout@v2
with:
persist-credentials: false
submodules: recursive
@@ -22,11 +20,6 @@ jobs:
run: |
source ./scripts/tag_latest_release.sh $(echo ${{ github.event.release.tag_name }}) --dry-run
- name: Configure Git
run: |
git config user.name "$GITHUB_ACTOR"
git config user.email "$GITHUB_ACTOR@users.noreply.github.com"
- name: Run latest-tag
uses: ./.github/actions/latest-tag
if: (! ${{ steps.latest-tag.outputs.SKIP_TAG }} )
@@ -34,4 +27,4 @@ jobs:
description: Superset latest release
tag-name: latest
env:
GITHUB_TOKEN: ${{ github.token }}
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}


@@ -1,28 +0,0 @@
name: License Template Check
on:
pull_request:
types: [synchronize, opened, reopened, ready_for_review]
# cancel previous workflow jobs for PRs
concurrency:
group: ${{ github.workflow }}-${{ github.event.pull_request.number || github.run_id }}
cancel-in-progress: true
jobs:
license_check:
name: License Check
runs-on: ubuntu-24.04
steps:
- name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6
with:
persist-credentials: false
submodules: recursive
- name: Setup Java
uses: actions/setup-java@be666c2fcd27ec809703dec50e508c2fdc7f6654 # v5
with:
distribution: 'temurin'
java-version: '11'
- name: Run license check
run: ./scripts/check_license.sh

.github/workflows/misc.yml vendored Normal file
View File

@@ -0,0 +1,99 @@
name: Miscellaneous
on:
push:
branches-ignore:
- "dependabot/**"
pull_request:
jobs:
license_check:
name: License Check
runs-on: ubuntu-20.04
steps:
- name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
uses: actions/checkout@v2
with:
persist-credentials: false
submodules: recursive
- name: Setup Java
uses: actions/setup-java@v1
with:
java-version: 8
- name: Generate fossa report
env:
FOSSA_API_KEY: ${{ secrets.FOSSA_API_KEY }}
run: |
set -eo pipefail
if [[ "${{github.event_name}}" != "pull_request" ]]; then
./scripts/fossa.sh
exit 0
fi
URL="https://api.github.com/repos/${{ github.repository }}/pulls/${{ github.event.pull_request.number }}/files"
FILES=$(curl -s -X GET -G $URL | jq -r '.[] | .filename')
cat<<EOF
CHANGED FILES:
$FILES
EOF
if [[ "${FILES}" =~ (.*package*\.json|requirements\/[a-z_-]+\.txt|setup\.py) ]]; then
echo "Detected dependency changes... running fossa check"
./scripts/fossa.sh
else
echo "No dependency changes... skiping fossa check"
fi
shell: bash
- name: Run license check
run: ./scripts/check_license.sh
prefer_typescript:
if: github.ref == 'refs/heads/master' && github.event_name == 'pull_request'
name: Prefer Typescript
runs-on: ubuntu-latest
steps:
- name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
uses: actions/checkout@v2
with:
persist-credentials: false
submodules: recursive
- name: Get changed files
id: changed
uses: ./.github/actions/file-changes-action
with:
githubToken: ${{ secrets.GITHUB_TOKEN }}
- name: Determine if a .js or .jsx file was added
id: check
run: |
js_files_added() {
jq -r '
map(
select(
endswith(".js") or endswith(".jsx")
)
) | join("\n")
' ${HOME}/files_added.json
}
echo ::set-output name=js_files_added::$(js_files_added)
- if: steps.check.outputs.js_files_added
name: Add Comment to PR
uses: ./.github/actions/comment-on-pr
continue-on-error: true
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
msg: |
### WARNING: Prefer TypeScript
Looks like your PR contains new `.js` or `.jsx` files:
```
${{steps.check.outputs.js_files_added}}
```
As decided in [SIP-36](https://github.com/apache/superset/issues/9101), all new frontend code should be written in TypeScript. Please convert the above files to TypeScript, then re-request review.
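For reference, a minimal sketch of what the jq filter in the `prefer_typescript` job above does, assuming `files_added.json` is a flat JSON array of added file paths (the sample path and contents below are only for illustration):

```
# Sample input in the shape the jq filter above expects: a JSON array of added file paths
cat > /tmp/files_added.json <<'EOF'
["superset-frontend/src/NewChart.jsx", "superset/models/core.py", "docs/notes.md"]
EOF

# Same filter as the workflow step: keep only .js/.jsx entries and join them with newlines
jq -r '
  map(
    select(
      endswith(".js") or endswith(".jsx")
    )
  ) | join("\n")
' /tmp/files_added.json
# Output: superset-frontend/src/NewChart.jsx
```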

View File

@@ -1,28 +0,0 @@
name: Hold Label Check
on:
pull_request:
types: [labeled, unlabeled, opened, reopened, synchronize]
permissions:
pull-requests: read
# cancel previous workflow jobs for PRs
concurrency:
group: ${{ github.workflow }}-${{ github.event.pull_request.number || github.run_id }}
cancel-in-progress: true
jobs:
check-hold-label:
runs-on: ubuntu-24.04
steps:
- name: Check for 'hold' label
uses: actions/github-script@3a2844b7e9c422d3c10d287c895573f7108da1b3 # v9.0.0
with:
github-token: ${{secrets.GITHUB_TOKEN}}
script: |
const payload = context.payload.pull_request
const holdLabelPresent = !!payload.labels.find(label => label.name.includes('hold'))
if (holdLabelPresent) {
core.setFailed('Hold label is present, merge is blocked.')
}

View File

@@ -9,14 +9,11 @@ on:
types: [opened, edited, reopened, synchronize]
jobs:
lint-check:
runs-on: ubuntu-24.04
permissions:
contents: read
pull-requests: write
check:
runs-on: ubuntu-latest
steps:
- name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6
uses: actions/checkout@v2
with:
persist-credentials: false
submodules: recursive
@@ -28,4 +25,4 @@ jobs:
on-failed-regex-create-review: false
on-failed-regex-comment:
"Please format your PR title to match: `%regex%`!"
repo-token: "${{ github.token }}"
repo-token: "${{ secrets.GITHUB_TOKEN }}"

View File

@@ -1,96 +0,0 @@
name: pre-commit checks
on:
push:
branches:
- "master"
- "[0-9].[0-9]*"
pull_request:
types: [synchronize, opened, reopened, ready_for_review]
permissions:
contents: read
# cancel previous workflow jobs for PRs
concurrency:
group: ${{ github.workflow }}-${{ github.event.pull_request.number || github.run_id }}
cancel-in-progress: true
jobs:
pre-commit:
runs-on: ubuntu-24.04
strategy:
matrix:
python-version: ["current", "previous", "next"]
steps:
- name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6
with:
persist-credentials: false
submodules: recursive
- name: Setup Python
uses: ./.github/actions/setup-backend/
with:
python-version: ${{ matrix.python-version }}
- name: Enable brew and helm-docs
# Add brew to the path - see https://github.com/actions/runner-images/issues/6283
run: |
echo "/home/linuxbrew/.linuxbrew/bin:/home/linuxbrew/.linuxbrew/sbin" >> $GITHUB_PATH
eval "$(/home/linuxbrew/.linuxbrew/bin/brew shellenv)"
echo "HOMEBREW_PREFIX=$HOMEBREW_PREFIX" >>"${GITHUB_ENV}"
echo "HOMEBREW_CELLAR=$HOMEBREW_CELLAR" >>"${GITHUB_ENV}"
echo "HOMEBREW_REPOSITORY=$HOMEBREW_REPOSITORY" >>"${GITHUB_ENV}"
brew install norwoodj/tap/helm-docs
- name: Setup Node.js
uses: actions/setup-node@48b55a011bda9f5d6aeb4c2d9c7362e8dae4041e # v6
with:
node-version: '20'
- name: Install Frontend Dependencies
run: |
cd superset-frontend
npm ci
- name: Install Docs Dependencies
run: |
cd docs
yarn install --immutable
- name: Cache pre-commit environments
uses: actions/cache@27d5ce7f107fe9357f9df03efb73ab90386fccae # v5
with:
path: ~/.cache/pre-commit
key: pre-commit-v2-${{ runner.os }}-py${{ matrix.python-version }}-${{ hashFiles('.pre-commit-config.yaml') }}
restore-keys: |
pre-commit-v2-${{ runner.os }}-py${{ matrix.python-version }}-
- name: Get changed files
id: changed_files
uses: ./.github/actions/file-changes-action
with:
output: ' '
- name: pre-commit
run: |
set +e # Don't exit immediately on failure
export SKIP=type-checking-frontend
pre-commit run --files ${{ steps.changed_files.outputs.files }}
PRE_COMMIT_EXIT_CODE=$?
git diff --quiet --exit-code
GIT_DIFF_EXIT_CODE=$?
if [ "${PRE_COMMIT_EXIT_CODE}" -ne 0 ] || [ "${GIT_DIFF_EXIT_CODE}" -ne 0 ]; then
if [ "${PRE_COMMIT_EXIT_CODE}" -ne 0 ]; then
echo "❌ Pre-commit check failed (exit code: ${PRE_COMMIT_EXIT_CODE})."
echo "🔍 Modified files:"
git diff --name-only
else
echo "❌ Git working directory is dirty."
echo "📌 This likely means that pre-commit made changes that were not committed."
echo "🔍 Modified files:"
git diff --name-only
fi
echo "🚒 To prevent/address this CI issue, please install/use pre-commit locally."
echo "📖 More details here: https://superset.apache.org/docs/contributing/development#git-hooks"
exit 1
fi
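As the failure message above suggests, contributors can run the same checks locally before pushing. A minimal sketch, assuming Python and pip are available:

```
# One-time setup
pip install pre-commit
pre-commit install          # registers the git hook so checks run on every commit

# Run the checks on demand, either across the whole repo or only on specific files
pre-commit run --all-files
pre-commit run --files superset/models/core.py
```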

View File

@@ -3,54 +3,36 @@ name: release-workflow
on:
push:
branches:
- "master"
- "[0-9].[0-9]*"
- 'master'
jobs:
config:
runs-on: ubuntu-24.04
outputs:
has-secrets: ${{ steps.check.outputs.has-secrets }}
steps:
- name: "Check for secrets"
id: check
shell: bash
run: |
if [ -n "${NPM_TOKEN}" ]; then
echo "has-secrets=1" >> "$GITHUB_OUTPUT"
fi
env:
NPM_TOKEN: ${{ (secrets.NPM_TOKEN != '' && secrets.GH_PERSONAL_ACCESS_TOKEN != '') || '' }}
build:
needs: config
if: needs.config.outputs.has-secrets
name: Bump version and publish package(s)
runs-on: ubuntu-24.04
runs-on: ubuntu-20.04
strategy:
matrix:
node-version: [16]
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6
- uses: actions/checkout@v2
with:
# pulls all commits (needed for lerna / semantic release to correctly version)
fetch-depth: 0
- name: Get tags and filter trigger tags
run: |
if ! git fetch --depth=1 origin "+refs/tags/*:refs/tags/*"; then
echo "::notice title=Workflow skipped::No tags present in repository"
exit
fi
echo "HAS_TAGS=1" >> $GITHUB_ENV"
git fetch --depth=1 origin "+refs/tags/*:refs/tags/*"
git fetch --prune --unshallow
git tag -d `git tag | grep -E '^trigger-'`
- name: Install Node.js
if: env.HAS_TAGS
uses: actions/setup-node@48b55a011bda9f5d6aeb4c2d9c7362e8dae4041e # v6
- name: Use Node.js ${{ matrix.node-version }}
uses: actions/setup-node@v1
with:
node-version-file: './superset-frontend/.nvmrc'
node-version: ${{ matrix.node-version }}
- name: Cache npm
if: env.HAS_TAGS
uses: actions/cache@27d5ce7f107fe9357f9df03efb73ab90386fccae # v5
uses: actions/cache@v1
with:
path: ~/.npm # npm cache files are stored in `~/.npm` on Linux/macOS
key: ${{ runner.OS }}-node-${{ hashFiles('**/package-lock.json') }}
@@ -59,12 +41,10 @@ jobs:
${{ runner.OS }}-
- name: Get npm cache directory path
if: env.HAS_TAGS
id: npm-cache-dir-path
run: echo "dir=$(npm config get cache)" >> $GITHUB_OUTPUT
run: echo "::set-output name=dir::$(npm config get cache)"
- name: Cache npm
if: env.HAS_TAGS
uses: actions/cache@27d5ce7f107fe9357f9df03efb73ab90386fccae # v5
uses: actions/cache@v1
id: npm-cache # use this to check for `cache-hit` (`steps.npm-cache.outputs.cache-hit != 'true'`)
with:
path: ${{ steps.npm-cache-dir-path.outputs.dir }}
@@ -73,20 +53,16 @@ jobs:
${{ runner.os }}-npm-
- name: Install dependencies
if: env.HAS_TAGS
working-directory: ./superset-frontend
run: npm ci
- name: Run unit tests
if: env.HAS_TAGS
working-directory: ./superset-frontend
run: npm run test -- plugins packages
- name: Build packages
if: env.HAS_TAGS
working-directory: ./superset-frontend
run: npm run plugins:build
- name: Configure npm and git
if: env.HAS_TAGS
run: |
echo "@superset-ui:registry=https://registry.npmjs.org/" > .npmrc
echo "registry=https://registry.npmjs.org/" >> .npmrc
@@ -94,17 +70,17 @@ jobs:
npm whoami
git config --local user.email "action@github.com"
git config --local user.name "GitHub Action"
git remote set-url origin "https://${GITHUB_TOKEN}@github.com/apache-superset/superset-ui.git" > /dev/null 2>&1
env:
NPM_TOKEN: ${{ secrets.NPM_TOKEN }}
GITHUB_TOKEN: ${{ github.token }}
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
- name: Bump version and publish package(s)
if: env.HAS_TAGS
working-directory: ./superset-frontend
run: |
git tag -d `git tag | grep -E '^trigger-'`
npm run plugins:release-from-tag
env:
NPM_TOKEN: ${{ secrets.NPM_TOKEN }}
GITHUB_TOKEN: ${{ github.token }}
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
GH_TOKEN: ${{ secrets.GH_PERSONAL_ACCESS_TOKEN }}
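Several hunks in this compare view (including the npm cache step above) replace GitHub's deprecated `::set-output` workflow command with the `$GITHUB_OUTPUT` file. The two forms, side by side, for the same step output:

```
# Deprecated: emit a workflow command on stdout
echo "::set-output name=dir::$(npm config get cache)"

# Current: append key=value to the file GitHub exposes as $GITHUB_OUTPUT
echo "dir=$(npm config get cache)" >> "$GITHUB_OUTPUT"
```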

View File

@@ -1,36 +0,0 @@
name: 🎪 Showtime Cleanup
# Scheduled cleanup of expired environments
on:
schedule:
- cron: '0 */6 * * *' # Every 6 hours
# Manual trigger for testing
workflow_dispatch:
# Common environment variables
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
AWS_ACCESS_KEY_ID: ${{ secrets.AWS_ACCESS_KEY_ID }}
AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
AWS_REGION: ${{ vars.AWS_REGION || 'us-west-2' }}
GITHUB_ORG: ${{ github.repository_owner }}
GITHUB_REPO: ${{ github.event.repository.name }}
jobs:
cleanup-expired:
name: Clean up expired showtime environments
runs-on: ubuntu-latest
permissions:
contents: read
pull-requests: write
steps:
- name: Install Superset Showtime
run: pip install superset-showtime
- name: Cleanup expired environments
run: |
echo "Cleaning up environments respecting TTL labels"
python -m showtime cleanup --respect-ttl

View File

@@ -1,183 +0,0 @@
name: 🎪 Superset Showtime
# Ultra-simple: just sync on any PR state change
on:
pull_request_target:
types: [labeled, unlabeled, synchronize, closed]
# Manual testing
workflow_dispatch:
inputs:
pr_number:
description: 'PR number to sync'
required: true
type: number
sha:
description: 'Specific SHA to deploy (optional, defaults to latest)'
required: false
type: string
# Common environment variables for all jobs (non-sensitive only)
env:
AWS_REGION: us-west-2
GITHUB_ORG: ${{ github.repository_owner }}
GITHUB_REPO: ${{ github.event.repository.name }}
GITHUB_ACTOR: ${{ github.actor }}
jobs:
sync:
name: 🎪 Sync PR to desired state
runs-on: ubuntu-latest
timeout-minutes: 90
permissions:
contents: read
pull-requests: write
steps:
- name: Security Check - Authorize Maintainers Only
id: auth
uses: actions/github-script@3a2844b7e9c422d3c10d287c895573f7108da1b3 # v9.0.0
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
script: |
const actor = context.actor;
console.log(`🔍 Checking authorization for ${actor}`);
// Early exit for workflow_dispatch - assume authorized since it's manually triggered
if (context.eventName === 'workflow_dispatch') {
console.log(`✅ Workflow dispatch event - assuming authorized for ${actor}`);
core.setOutput('authorized', 'true');
return;
}
const { data: permission } = await github.rest.repos.getCollaboratorPermissionLevel({
owner: context.repo.owner,
repo: context.repo.repo,
username: actor
});
console.log(`📊 Permission level for ${actor}: ${permission.permission}`);
const authorized = ['write', 'admin'].includes(permission.permission);
// If this is a synchronize event from unauthorized user, check if Showtime is active and set blocked label
if (!authorized && context.eventName === 'pull_request_target' && context.payload.action === 'synchronize') {
console.log(`🔒 Synchronize event detected - checking if Showtime is active`);
// Check if PR has any circus tent labels (Showtime is in use)
const { data: issue } = await github.rest.issues.get({
owner: context.repo.owner,
repo: context.repo.repo,
issue_number: context.payload.pull_request.number
});
const hasCircusLabels = issue.labels.some(label => label.name.startsWith('🎪 '));
if (hasCircusLabels) {
console.log(`🎪 Circus labels found - setting blocked label to prevent auto-deployment`);
await github.rest.issues.addLabels({
owner: context.repo.owner,
repo: context.repo.repo,
issue_number: context.payload.pull_request.number,
labels: ['🎪 🔒 showtime-blocked']
});
console.log(`✅ Blocked label set - Showtime will detect and skip operations`);
} else {
console.log(` No circus labels found - Showtime not in use, skipping block`);
}
}
if (!authorized) {
console.log(`🚨 Unauthorized user ${actor} - skipping all operations`);
core.setOutput('authorized', 'false');
return;
}
console.log(`✅ Authorized maintainer: ${actor}`);
core.setOutput('authorized', 'true');
- name: Install Superset Showtime
if: steps.auth.outputs.authorized == 'true'
run: |
echo "::notice::Maintainer ${{ github.actor }} triggered deploy for PR ${PULL_REQUEST_NUMBER}"
pip install --upgrade superset-showtime
showtime version
env:
PULL_REQUEST_NUMBER: ${{ github.event.pull_request.number || github.event.inputs.pr_number }}
- name: Check what actions are needed
if: steps.auth.outputs.authorized == 'true'
id: check
env:
AWS_ACCESS_KEY_ID: ${{ secrets.AWS_ACCESS_KEY_ID }}
AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
INPUT_PR_NUMBER: ${{ github.event.inputs.pr_number }}
INPUT_SHA: ${{ github.event.inputs.sha }}
run: |
# Bulletproof PR number extraction
if [[ -n "${{ github.event.pull_request.number }}" ]]; then
PR_NUM="${{ github.event.pull_request.number }}"
elif [[ -n "${INPUT_PR_NUMBER}" ]]; then
PR_NUM="${INPUT_PR_NUMBER}"
else
echo "❌ No PR number found in event or inputs"
exit 1
fi
echo "Using PR number: $PR_NUM"
# Run sync check-only with optional SHA override
if [[ -n "${INPUT_SHA}" ]]; then
OUTPUT=$(python -m showtime sync $PR_NUM --check-only --sha "${INPUT_SHA}")
else
OUTPUT=$(python -m showtime sync $PR_NUM --check-only)
fi
echo "$OUTPUT"
# Extract the outputs we need for conditional steps
BUILD=$(echo "$OUTPUT" | grep "build_needed=" | cut -d'=' -f2)
SYNC=$(echo "$OUTPUT" | grep "sync_needed=" | cut -d'=' -f2)
PR_NUM_OUT=$(echo "$OUTPUT" | grep "pr_number=" | cut -d'=' -f2)
TARGET_SHA=$(echo "$OUTPUT" | grep "target_sha=" | cut -d'=' -f2)
echo "build_needed=$BUILD" >> $GITHUB_OUTPUT
echo "sync_needed=$SYNC" >> $GITHUB_OUTPUT
echo "pr_number=$PR_NUM_OUT" >> $GITHUB_OUTPUT
echo "target_sha=$TARGET_SHA" >> $GITHUB_OUTPUT
- name: Checkout PR code (only if build needed)
if: steps.auth.outputs.authorized == 'true' && steps.check.outputs.build_needed == 'true'
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6
with:
ref: ${{ steps.check.outputs.target_sha }}
persist-credentials: false
- name: Setup Docker Environment (only if build needed)
if: steps.auth.outputs.authorized == 'true' && steps.check.outputs.build_needed == 'true'
uses: ./.github/actions/setup-docker
with:
dockerhub-user: ${{ secrets.DOCKERHUB_USER }}
dockerhub-token: ${{ secrets.DOCKERHUB_TOKEN }}
build: "true"
install-docker-compose: "false"
- name: Execute sync (handles everything)
if: steps.auth.outputs.authorized == 'true' && steps.check.outputs.sync_needed == 'true'
env:
AWS_ACCESS_KEY_ID: ${{ secrets.AWS_ACCESS_KEY_ID }}
AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
DOCKERHUB_USER: ${{ secrets.DOCKERHUB_USER }}
DOCKERHUB_TOKEN: ${{ secrets.DOCKERHUB_TOKEN }}
run: |
PR_NUM="${{ steps.check.outputs.pr_number }}"
TARGET_SHA="${{ steps.check.outputs.target_sha }}"
if [[ -n "$TARGET_SHA" ]]; then
python -m showtime sync $PR_NUM --sha "$TARGET_SHA"
else
python -m showtime sync $PR_NUM
fi
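For maintainers exercising the `workflow_dispatch` path above, something like the following should work, assuming the GitHub CLI is installed and the workflow file is named `superset-showtime.yml` (the filename is not shown in this diff, so treat it as a placeholder):

```
# Manually trigger a sync for a given PR, optionally pinning a specific SHA
gh workflow run superset-showtime.yml -f pr_number=12345
gh workflow run superset-showtime.yml -f pr_number=12345 -f sha=abc1234
```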

View File

@@ -1,67 +0,0 @@
name: Superset App CLI tests
on:
push:
branches:
- "master"
- "[0-9].[0-9]*"
pull_request:
types: [synchronize, opened, reopened, ready_for_review]
# cancel previous workflow jobs for PRs
concurrency:
group: ${{ github.workflow }}-${{ github.event.pull_request.number || github.run_id }}
cancel-in-progress: true
jobs:
test-load-examples:
runs-on: ubuntu-24.04
env:
PYTHONPATH: ${{ github.workspace }}
SUPERSET_CONFIG: tests.integration_tests.superset_test_config
REDIS_PORT: 16379
SUPERSET__SQLALCHEMY_DATABASE_URI: postgresql+psycopg2://superset:superset@127.0.0.1:15432/superset
services:
postgres:
image: postgres:17-alpine
env:
POSTGRES_USER: superset
POSTGRES_PASSWORD: superset
ports:
# Use custom ports for services to avoid accidentally connecting to
# GitHub action runner's default installations
- 15432:5432
redis:
image: redis:7-alpine
ports:
- 16379:6379
steps:
- name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6
with:
persist-credentials: false
submodules: recursive
- name: Check for file changes
id: check
uses: ./.github/actions/change-detector/
with:
token: ${{ secrets.GITHUB_TOKEN }}
- name: Setup Python
if: steps.check.outputs.python
uses: ./.github/actions/setup-backend/
- name: Setup Postgres
if: steps.check.outputs.python
uses: ./.github/actions/cached-dependencies
with:
run: setup-postgres
- name: superset init
if: steps.check.outputs.python
run: |
pip install -e .
superset db upgrade
superset load_test_users
- name: superset load_examples
if: steps.check.outputs.python
run: |
# load examples without test data
superset load_examples --load-big-data

View File

@@ -0,0 +1,88 @@
name: Applitools Cypress
on:
schedule:
- cron: "0 1 * * *"
jobs:
cypress-applitools:
runs-on: ubuntu-20.04
strategy:
fail-fast: false
matrix:
browser: ["chrome"]
node: [16]
env:
FLASK_ENV: development
SUPERSET_CONFIG: tests.integration_tests.superset_test_config
SUPERSET__SQLALCHEMY_DATABASE_URI: postgresql+psycopg2://superset:superset@127.0.0.1:15432/superset
PYTHONPATH: ${{ github.workspace }}
REDIS_PORT: 16379
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
APPLITOOLS_APP_NAME: Superset
APPLITOOLS_API_KEY: ${{ secrets.APPLITOOLS_API_KEY }}
APPLITOOLS_BATCH_ID: ${{ github.sha }}
APPLITOOLS_BATCH_NAME: Superset Cypress
services:
postgres:
image: postgres:14-alpine
env:
POSTGRES_USER: superset
POSTGRES_PASSWORD: superset
ports:
- 15432:5432
redis:
image: redis:5-alpine
ports:
- 16379:6379
steps:
- name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
uses: actions/checkout@v3
with:
persist-credentials: false
submodules: recursive
ref: master
- name: Setup Python
uses: actions/setup-python@v2
with:
python-version: "3.8"
- name: OS dependencies
uses: ./.github/actions/cached-dependencies
with:
run: apt-get-install
- name: Install python dependencies
uses: ./.github/actions/cached-dependencies
with:
run: |
pip-upgrade
pip install -r requirements/testing.txt
- name: Setup postgres
uses: ./.github/actions/cached-dependencies
with:
run: setup-postgres
- name: Import test data
uses: ./.github/actions/cached-dependencies
with:
run: testdata
- name: Setup Node.js
uses: actions/setup-node@v2
with:
node-version: ${{ matrix.node }}
- name: Install npm dependencies
uses: ./.github/actions/cached-dependencies
with:
run: npm-install
- name: Build javascript packages
uses: ./.github/actions/cached-dependencies
with:
run: build-instrumented-assets
- name: Install cypress
uses: ./.github/actions/cached-dependencies
with:
run: cypress-install
- name: Run Cypress
uses: ./.github/actions/cached-dependencies
env:
CYPRESS_BROWSER: ${{ matrix.browser }}
with:
run: cypress-run-applitools

View File

@@ -0,0 +1,40 @@
name: Applitools Storybook
on:
schedule:
- cron: "0 0 * * *"
env:
APPLITOOLS_APP_NAME: Superset
APPLITOOLS_API_KEY: ${{ secrets.APPLITOOLS_API_KEY }}
APPLITOOLS_BATCH_ID: ${{ github.sha }}
APPLITOOLS_BATCH_NAME: Superset Storybook
jobs:
cron:
runs-on: ubuntu-20.04
strategy:
matrix:
node: [16]
steps:
- name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
uses: actions/checkout@v3
with:
persist-credentials: false
submodules: recursive
ref: master
- name: Set up Node.js
uses: actions/setup-node@v3.1.1
with:
node-version: ${{ matrix.node }}
- name: Install eyes-storybook dependencies
uses: ./.github/actions/cached-dependencies
with:
run: eyes-storybook-dependencies
- name: Install NPM dependencies
uses: ./.github/actions/cached-dependencies
with:
run: npm-install
- name: Run Applitools Eyes-Storybook
working-directory: ./superset-frontend
run: npx eyes-storybook -u https://superset-storybook.netlify.app/

.github/workflows/superset-cli.yml (new file, 76 lines)
View File

@@ -0,0 +1,76 @@
name: Superset CLI tests
on:
push:
branches-ignore:
- "dependabot/npm_and_yarn/**"
pull_request:
types: [synchronize, opened, reopened, ready_for_review]
jobs:
test-load-examples:
if: github.event.pull_request.draft == false
runs-on: ubuntu-20.04
strategy:
matrix:
python-version: [3.9]
env:
PYTHONPATH: ${{ github.workspace }}
SUPERSET_CONFIG: tests.integration_tests.superset_test_config
REDIS_PORT: 16379
SUPERSET__SQLALCHEMY_DATABASE_URI: postgresql+psycopg2://superset:superset@127.0.0.1:15432/superset
services:
postgres:
image: postgres:14-alpine
env:
POSTGRES_USER: superset
POSTGRES_PASSWORD: superset
ports:
# Use custom ports for services to avoid accidentally connecting to
# GitHub action runner's default installations
- 15432:5432
redis:
image: redis:5-alpine
ports:
- 16379:6379
steps:
- name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
uses: actions/checkout@v2
with:
persist-credentials: false
submodules: recursive
- name: Check if python changes are present
id: check
env:
GITHUB_REPO: ${{ github.repository }}
PR_NUMBER: ${{ github.event.pull_request.number }}
continue-on-error: true
run: ./scripts/ci_check_no_file_changes.sh python
- name: Setup Python
if: steps.check.outcome == 'failure'
uses: actions/setup-python@v2
with:
python-version: ${{ matrix.python-version }}
cache: 'pip'
cache-dependency-path: 'requirements/testing.txt'
- name: Install dependencies
if: steps.check.outcome == 'failure'
uses: ./.github/actions/cached-dependencies
with:
run: |
apt-get-install
pip-upgrade
pip install wheel
pip install -r requirements/testing.txt
setup-postgres
- name: superset init
if: steps.check.outcome == 'failure'
run: |
pip install -e .
superset db upgrade
superset load_test_users
- name: superset load_examples
if: steps.check.outcome == 'failure'
run: |
# load examples without test data
superset load_examples --load-big-data

View File

@@ -1,114 +0,0 @@
name: Docs Deployment
on:
# Deploy after integration tests complete on master
workflow_run:
workflows: ["Python-Integration"]
types: [completed]
branches: [master]
# Also allow manual trigger and direct pushes to docs
push:
paths:
- "docs/**"
- "README.md"
branches:
- "master"
workflow_dispatch: {}
jobs:
config:
runs-on: ubuntu-24.04
outputs:
has-secrets: ${{ steps.check.outputs.has-secrets }}
steps:
- name: "Check for secrets"
id: check
shell: bash
run: |
if [ -n "${SUPERSET_SITE_BUILD}" ]; then
echo "has-secrets=1" >> "$GITHUB_OUTPUT"
fi
env:
SUPERSET_SITE_BUILD: ${{ (secrets.SUPERSET_SITE_BUILD != '' && secrets.SUPERSET_SITE_BUILD != '') || '' }}
build-deploy:
needs: config
if: needs.config.outputs.has-secrets
name: Build & Deploy
runs-on: ubuntu-24.04
steps:
- name: "Checkout ${{ github.event.workflow_run.head_sha || github.sha }}"
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6
with:
ref: ${{ github.event.workflow_run.head_sha || github.sha }}
persist-credentials: false
submodules: recursive
- name: Set up Node.js
uses: actions/setup-node@48b55a011bda9f5d6aeb4c2d9c7362e8dae4041e # v6
with:
node-version-file: './docs/.nvmrc'
- name: Setup Python
uses: ./.github/actions/setup-backend/
- uses: actions/setup-java@be666c2fcd27ec809703dec50e508c2fdc7f6654 # v5
with:
distribution: 'zulu'
java-version: '21'
- name: Install Graphviz
run: sudo apt-get install -y graphviz
- name: Compute Entity Relationship diagram (ERD)
env:
SUPERSET_SECRET_KEY: not-a-secret
run: |
python scripts/erd/erd.py
curl -L http://sourceforge.net/projects/plantuml/files/1.2023.7/plantuml.1.2023.7.jar/download > ~/plantuml.jar
java -jar ~/plantuml.jar -v -tsvg -r -o "${{ github.workspace }}/docs/static/img/" "${{ github.workspace }}/scripts/erd/erd.puml"
- name: yarn install
working-directory: docs
run: |
yarn install --check-cache
- name: Download database diagnostics (if triggered by integration tests)
if: github.event_name == 'workflow_run' && github.event.workflow_run.conclusion == 'success'
uses: dawidd6/action-download-artifact@b6e2e70617bc3265edd6dab6c906732b2f1ae151 # v21
continue-on-error: true
with:
workflow: superset-python-integrationtest.yml
run_id: ${{ github.event.workflow_run.id }}
name: database-diagnostics
path: docs/src/data/
- name: Try to download latest diagnostics (for push/dispatch triggers)
if: github.event_name != 'workflow_run'
uses: dawidd6/action-download-artifact@b6e2e70617bc3265edd6dab6c906732b2f1ae151 # v21
continue-on-error: true
with:
workflow: superset-python-integrationtest.yml
name: database-diagnostics
path: docs/src/data/
branch: master
search_artifacts: true
if_no_artifact_found: warn
- name: Use diagnostics artifact if available
working-directory: docs
run: |
if [ -f "src/data/databases-diagnostics.json" ]; then
echo "Using fresh diagnostics from integration tests"
mv src/data/databases-diagnostics.json src/data/databases.json
else
echo "Using committed databases.json (no artifact found)"
fi
- name: yarn build
working-directory: docs
run: |
yarn build
- name: deploy docs
uses: ./.github/actions/github-action-push-to-another-repository
env:
API_TOKEN_GITHUB: ${{ secrets.SUPERSET_SITE_BUILD }}
with:
source-directory: "./docs/build"
destination-github-username: "apache"
destination-repository-name: "superset-site"
target-branch: "asf-site"
commit-message: "deploying docs: ${{ github.event.head_commit.message || 'triggered by integration tests' }} (apache/superset@${{ github.event.workflow_run.head_sha || github.sha }})"
user-email: dev@superset.apache.org

View File

@@ -1,134 +0,0 @@
name: Docs Testing
on:
pull_request:
paths:
- "docs/**"
- "superset/db_engine_specs/**"
- ".github/workflows/superset-docs-verify.yml"
types: [synchronize, opened, reopened, ready_for_review]
workflow_run:
workflows: ["Python-Integration"]
types: [completed]
# cancel previous workflow jobs for PRs
concurrency:
group: ${{ github.workflow }}-${{ github.event.pull_request.number || github.event.workflow_run.head_sha || github.run_id }}
cancel-in-progress: true
jobs:
linkinator:
# See docs here: https://github.com/marketplace/actions/linkinator
# Only run on pull_request, not workflow_run
if: github.event_name == 'pull_request'
name: Link Checking
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6
# Do not bump this linkinator-action version without opening
# an ASF Infra ticket to allow the new version first!
- uses: JustinBeckwith/linkinator-action@af984b9f30f63e796ae2ea5be5e07cb587f1bbd9 # v2.3
continue-on-error: true # This will make the job advisory (non-blocking, no red X)
with:
paths: "**/*.md, **/*.mdx"
linksToSkip: >-
^https://github.com/apache/(superset|incubator-superset)/(pull|issues)/\d+,
^https://github.com/apache/(superset|incubator-superset)/commit/[a-f0-9]+,
superset-frontend/.*CHANGELOG\.md,
http://localhost:8088/,
http://127.0.0.1:3000/,
http://localhost:9001/,
https://charts.bitnami.com/bitnami,
https://www.li.me/,
https://www.fanatics.com/,
https://tails.com/gb/,
https://www.techaudit.info/,
https://avetilearning.com/,
https://www.udemy.com/,
https://trustmedis.com/,
http://theiconic.com.au/,
https://dev.mysql.com/doc/refman/5.7/en/innodb-limits.html,
^https://img\.shields\.io/.*,
https://vkusvill.ru/,
https://www.linkedin.com/in/mark-thomas-b16751158/,
https://theiconic.com.au/,
https://wattbewerb.de/,
https://timbr.ai/,
https://opensource.org/license/apache-2-0,
https://www.plaidcloud.com/
build-on-pr:
# Build docs when PR changes docs/** (uses committed databases.json)
if: github.event_name == 'pull_request'
name: Build (PR trigger)
runs-on: ubuntu-24.04
defaults:
run:
working-directory: docs
steps:
- name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6
with:
persist-credentials: false
submodules: recursive
- name: Set up Node.js
uses: actions/setup-node@48b55a011bda9f5d6aeb4c2d9c7362e8dae4041e # v6
with:
node-version-file: './docs/.nvmrc'
- name: yarn install
run: |
yarn install --check-cache
- name: yarn typecheck
run: |
yarn typecheck
- name: yarn build
run: |
yarn build
build-after-tests:
# Build docs after integration tests complete (uses fresh diagnostics)
# Only runs if integration tests succeeded
if: >
github.event_name == 'workflow_run' &&
github.event.workflow_run.conclusion == 'success'
name: Build (after integration tests)
runs-on: ubuntu-24.04
defaults:
run:
working-directory: docs
steps:
- name: "Checkout PR head: ${{ github.event.workflow_run.head_sha }}"
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6
with:
ref: ${{ github.event.workflow_run.head_sha }}
persist-credentials: false
submodules: recursive
- name: Set up Node.js
uses: actions/setup-node@48b55a011bda9f5d6aeb4c2d9c7362e8dae4041e # v6
with:
node-version-file: './docs/.nvmrc'
- name: yarn install
run: |
yarn install --check-cache
- name: Download database diagnostics from integration tests
uses: dawidd6/action-download-artifact@b6e2e70617bc3265edd6dab6c906732b2f1ae151 # v21
with:
workflow: superset-python-integrationtest.yml
run_id: ${{ github.event.workflow_run.id }}
name: database-diagnostics
path: docs/src/data/
if_no_artifact_found: 'warning'
- name: Use fresh diagnostics
run: |
if [ -f "src/data/databases-diagnostics.json" ]; then
echo "Using fresh diagnostics from integration tests"
mv src/data/databases-diagnostics.json src/data/databases.json
else
echo "Warning: No diagnostics artifact found, using committed data"
fi
- name: yarn typecheck
run: |
yarn typecheck
- name: yarn build
run: |
yarn build

.github/workflows/superset-docs.yml (new file, 41 lines)
View File

@@ -0,0 +1,41 @@
name: Docs
on:
push:
paths:
- "docs/**"
pull_request:
paths:
- "docs/**"
jobs:
build-deploy:
name: Build & Deploy
runs-on: ubuntu-20.04
defaults:
run:
working-directory: docs
steps:
- name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
uses: actions/checkout@v2
with:
persist-credentials: false
submodules: recursive
- name: yarn install
run: |
yarn install --check-cache
- name: yarn build
run: |
yarn build
- name: deploy docs
if: github.ref == 'refs/heads/master'
uses: ./.github/actions/github-action-push-to-another-repository
env:
API_TOKEN_GITHUB: ${{ secrets.SUPERSET_SITE_BUILD }}
with:
source-directory: './docs/build'
destination-github-username: 'apache'
destination-repository-name: 'superset-site'
target-branch: 'asf-site'
commit-message: "deploying docs: ${{ github.event.head_commit.message }} (apache/superset@${{ github.sha }})"
user-email: dev@superset.apache.org

View File

@@ -2,267 +2,123 @@ name: E2E
on:
push:
branches:
- "master"
- "[0-9].[0-9]*"
branches-ignore:
- "dependabot/**/docs/**"
paths-ignore:
- "docs/**"
pull_request:
types: [synchronize, opened, reopened, ready_for_review]
workflow_dispatch:
inputs:
use_dashboard:
description: 'Use Cypress Dashboard (true/false) [paid service - trigger manually when needed]. You MUST provide a branch and/or PR number below for this to work.'
required: false
default: 'false'
ref:
description: 'The branch or tag to checkout'
required: false
default: ''
pr_id:
description: 'The pull request ID to checkout'
required: false
default: ''
concurrency:
group: ${{ github.workflow }}-${{ github.event.pull_request.number || github.run_id }}
cancel-in-progress: true
jobs:
cypress-matrix:
# Somehow one test flakes on 24.04 for unknown reasons; this is the only GHA left on 22.04
runs-on: ubuntu-22.04
permissions:
contents: read
pull-requests: read
if: github.event.pull_request.draft == false
runs-on: ubuntu-20.04
strategy:
# when one test fails, DO NOT cancel the other
# parallel_id, because this will kill Cypress processes
# containers, because this will kill Cypress processes
# leaving the Dashboard hanging ...
# https://github.com/cypress-io/github-action/issues/48
fail-fast: false
matrix:
parallel_id: [0, 1, 2, 3, 4, 5]
containers: [1, 2, 3]
browser: ["chrome"]
app_root: ["", "/app/prefix"]
env:
SUPERSET_ENV: development
FLASK_ENV: development
SUPERSET_CONFIG: tests.integration_tests.superset_test_config
SUPERSET__SQLALCHEMY_DATABASE_URI: postgresql+psycopg2://superset:superset@127.0.0.1:15432/superset
PYTHONPATH: ${{ github.workspace }}
REDIS_PORT: 16379
GITHUB_TOKEN: ${{ github.token }}
# Only use dashboard when explicitly requested via workflow_dispatch
USE_DASHBOARD: ${{ github.event.inputs.use_dashboard == 'true' || 'false' }}
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
services:
postgres:
image: postgres:17-alpine
image: postgres:14-alpine
env:
POSTGRES_USER: superset
POSTGRES_PASSWORD: superset
ports:
- 15432:5432
redis:
image: redis:7-alpine
image: redis:5-alpine
ports:
- 16379:6379
steps:
# -------------------------------------------------------
# Conditional checkout based on context
- name: Checkout for push or pull_request event
if: github.event_name == 'push' || github.event_name == 'pull_request'
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6
- name: "Checkout (pull) ${{ github.ref }} ( ${{ github.sha }} )"
uses: actions/checkout@v2
if: github.event_name == 'push'
with:
persist-credentials: false
submodules: recursive
ref: ${{ github.event_name == 'pull_request' && github.event.pull_request.head.sha || github.sha }}
- name: Checkout using ref (workflow_dispatch)
if: github.event_name == 'workflow_dispatch' && github.event.inputs.ref != ''
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6
- name: "Checkout (pull_request) ${{ github.ref }} ( ${{ github.sha }} )"
uses: actions/checkout@v2
if: github.event_name == 'pull_request' || github.event_name == 'pull_request_target'
with:
ref: "refs/pull/${{ github.event.number }}/merge"
persist-credentials: false
ref: ${{ github.event.inputs.ref }}
submodules: recursive
- name: Checkout using PR ID (workflow_dispatch)
if: github.event_name == 'workflow_dispatch' && github.event.inputs.pr_id != ''
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6
with:
persist-credentials: false
ref: refs/pull/${{ github.event.inputs.pr_id }}/merge
submodules: recursive
# -------------------------------------------------------
- name: Check for file changes
- name: Check if python or frontend changes are present
id: check
uses: ./.github/actions/change-detector/
with:
token: ${{ secrets.GITHUB_TOKEN }}
env:
GITHUB_REPO: ${{ github.repository }}
PR_NUMBER: ${{ github.event.pull_request.number }}
continue-on-error: true
run: ./scripts/ci_check_no_file_changes.sh python frontend
- name: Setup Python
uses: ./.github/actions/setup-backend/
if: steps.check.outputs.python || steps.check.outputs.frontend
if: steps.check.outcome == 'failure'
uses: actions/setup-python@v2
with:
python-version: "3.8"
- name: OS dependencies
if: steps.check.outcome == 'failure'
uses: ./.github/actions/cached-dependencies
with:
run: apt-get-install
- name: Install python dependencies
if: steps.check.outcome == 'failure'
uses: ./.github/actions/cached-dependencies
with:
run: |
pip-upgrade
pip install -r requirements/testing.txt
- name: Setup postgres
if: steps.check.outputs.python || steps.check.outputs.frontend
if: steps.check.outcome == 'failure'
uses: ./.github/actions/cached-dependencies
with:
run: setup-postgres
- name: Import test data
if: steps.check.outputs.python || steps.check.outputs.frontend
if: steps.check.outcome == 'failure'
uses: ./.github/actions/cached-dependencies
with:
run: testdata
- name: Setup Node.js
if: steps.check.outputs.python || steps.check.outputs.frontend
uses: actions/setup-node@48b55a011bda9f5d6aeb4c2d9c7362e8dae4041e # v6
uses: actions/setup-node@v2
with:
node-version-file: './superset-frontend/.nvmrc'
node-version: "16"
- name: Install npm dependencies
if: steps.check.outputs.python || steps.check.outputs.frontend
if: steps.check.outcome == 'failure'
uses: ./.github/actions/cached-dependencies
with:
run: npm-install
- name: Build javascript packages
if: steps.check.outputs.python || steps.check.outputs.frontend
if: steps.check.outcome == 'failure'
uses: ./.github/actions/cached-dependencies
with:
run: build-instrumented-assets
- name: Install cypress
if: steps.check.outputs.python || steps.check.outputs.frontend
if: steps.check.outcome == 'failure'
uses: ./.github/actions/cached-dependencies
with:
run: cypress-install
- name: Run Cypress
if: steps.check.outputs.python || steps.check.outputs.frontend
if: steps.check.outcome == 'failure'
uses: ./.github/actions/cached-dependencies
env:
CYPRESS_BROWSER: ${{ matrix.browser }}
PARALLEL_ID: ${{ matrix.parallel_id }}
PARALLELISM: 6
CYPRESS_RECORD_KEY: ${{ secrets.CYPRESS_RECORD_KEY }}
NODE_OPTIONS: "--max-old-space-size=4096"
CYPRESS_KEY: YjljODE2MzAtODcwOC00NTA3LWE4NmMtMTU3YmFmMjIzOTRhCg==
with:
run: cypress-run-all ${{ env.USE_DASHBOARD }} ${{ matrix.app_root }}
- name: Set safe app root
if: failure()
id: set-safe-app-root
run: |
APP_ROOT="${{ matrix.app_root }}"
SAFE_APP_ROOT=${APP_ROOT//\//_}
echo "safe_app_root=$SAFE_APP_ROOT" >> $GITHUB_OUTPUT
run: cypress-run-all
- name: Upload Artifacts
uses: actions/upload-artifact@043fb46d1a93c77aae656e7c1c64a875d1fc6a0a # v7
uses: actions/upload-artifact@v2
if: failure()
with:
name: screenshots
path: ${{ github.workspace }}/superset-frontend/cypress-base/cypress/screenshots
name: cypress-artifact-${{ github.run_id }}-${{ github.job }}-${{ matrix.browser }}-${{ matrix.parallel_id }}--${{ steps.set-safe-app-root.outputs.safe_app_root }}
playwright-tests:
runs-on: ubuntu-22.04
permissions:
contents: read
pull-requests: read
strategy:
fail-fast: false
matrix:
browser: ["chromium"]
app_root: ["", "/app/prefix"]
env:
SUPERSET_ENV: development
SUPERSET_CONFIG: tests.integration_tests.superset_test_config
SUPERSET__SQLALCHEMY_DATABASE_URI: postgresql+psycopg2://superset:superset@127.0.0.1:15432/superset
PYTHONPATH: ${{ github.workspace }}
REDIS_PORT: 16379
GITHUB_TOKEN: ${{ github.token }}
services:
postgres:
image: postgres:17-alpine
env:
POSTGRES_USER: superset
POSTGRES_PASSWORD: superset
ports:
- 15432:5432
redis:
image: redis:7-alpine
ports:
- 16379:6379
steps:
# -------------------------------------------------------
# Conditional checkout based on context (same as Cypress workflow)
- name: Checkout for push or pull_request event
if: github.event_name == 'push' || github.event_name == 'pull_request'
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6
with:
persist-credentials: false
submodules: recursive
ref: ${{ github.event_name == 'pull_request' && github.event.pull_request.head.sha || github.sha }}
- name: Checkout using ref (workflow_dispatch)
if: github.event_name == 'workflow_dispatch' && github.event.inputs.ref != ''
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6
with:
persist-credentials: false
ref: ${{ github.event.inputs.ref }}
submodules: recursive
- name: Checkout using PR ID (workflow_dispatch)
if: github.event_name == 'workflow_dispatch' && github.event.inputs.pr_id != ''
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6
with:
persist-credentials: false
ref: refs/pull/${{ github.event.inputs.pr_id }}/merge
submodules: recursive
# -------------------------------------------------------
- name: Check for file changes
id: check
uses: ./.github/actions/change-detector/
with:
token: ${{ secrets.GITHUB_TOKEN }}
- name: Setup Python
uses: ./.github/actions/setup-backend/
if: steps.check.outputs.python || steps.check.outputs.frontend
- name: Setup postgres
if: steps.check.outputs.python || steps.check.outputs.frontend
uses: ./.github/actions/cached-dependencies
with:
run: setup-postgres
- name: Import test data
if: steps.check.outputs.python || steps.check.outputs.frontend
uses: ./.github/actions/cached-dependencies
with:
run: playwright_testdata
- name: Setup Node.js
if: steps.check.outputs.python || steps.check.outputs.frontend
uses: actions/setup-node@48b55a011bda9f5d6aeb4c2d9c7362e8dae4041e # v6
with:
node-version-file: './superset-frontend/.nvmrc'
- name: Install npm dependencies
if: steps.check.outputs.python || steps.check.outputs.frontend
uses: ./.github/actions/cached-dependencies
with:
run: npm-install
- name: Build javascript packages
if: steps.check.outputs.python || steps.check.outputs.frontend
uses: ./.github/actions/cached-dependencies
with:
run: build-instrumented-assets
- name: Install Playwright
if: steps.check.outputs.python || steps.check.outputs.frontend
uses: ./.github/actions/cached-dependencies
with:
run: playwright-install
- name: Run Playwright (Required Tests)
if: steps.check.outputs.python || steps.check.outputs.frontend
uses: ./.github/actions/cached-dependencies
env:
NODE_OPTIONS: "--max-old-space-size=4096"
with:
run: playwright-run "${{ matrix.app_root }}"
- name: Set safe app root
if: failure()
id: set-safe-app-root
run: |
APP_ROOT="${{ matrix.app_root }}"
SAFE_APP_ROOT=${APP_ROOT//\//_}
echo "safe_app_root=$SAFE_APP_ROOT" >> $GITHUB_OUTPUT
- name: Upload Playwright Artifacts
uses: actions/upload-artifact@043fb46d1a93c77aae656e7c1c64a875d1fc6a0a # v7
if: failure()
with:
path: |
${{ github.workspace }}/superset-frontend/playwright-results/
${{ github.workspace }}/superset-frontend/test-results/
name: playwright-artifact-${{ github.run_id }}-${{ github.job }}-${{ matrix.browser }}--${{ steps.set-safe-app-root.outputs.safe_app_root }}
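Both jobs above sanitize `matrix.app_root` before using it in an artifact name, presumably because forward slashes are not valid in artifact names. A quick sketch of the bash substitution involved:

```
# ${VAR//pattern/replacement} replaces every occurrence, so each "/" becomes "_"
APP_ROOT="/app/prefix"
SAFE_APP_ROOT=${APP_ROOT//\//_}
echo "$SAFE_APP_ROOT"   # prints: _app_prefix
```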

View File

@@ -1,64 +0,0 @@
name: Superset Extensions CLI Package Tests
on:
push:
branches:
- "master"
- "[0-9].[0-9]*"
pull_request:
types: [synchronize, opened, reopened, ready_for_review]
# cancel previous workflow jobs for PRs
concurrency:
group: ${{ github.workflow }}-${{ github.event.pull_request.number || github.run_id }}
cancel-in-progress: true
jobs:
test-superset-extensions-cli-package:
runs-on: ubuntu-24.04
strategy:
matrix:
python-version: ["previous", "current", "next"]
defaults:
run:
working-directory: superset-extensions-cli
steps:
- name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6
with:
persist-credentials: false
submodules: recursive
- name: Check for file changes
id: check
uses: ./.github/actions/change-detector/
with:
token: ${{ secrets.GITHUB_TOKEN }}
- name: Setup Python
if: steps.check.outputs.superset-extensions-cli
uses: ./.github/actions/setup-backend/
with:
python-version: ${{ matrix.python-version }}
requirements-type: dev
- name: Run pytest with coverage
if: steps.check.outputs.superset-extensions-cli
run: |
pytest --cov=superset_extensions_cli --cov-report=xml --cov-report=term-missing --cov-report=html -v --tb=short
- name: Upload coverage reports to Codecov
if: steps.check.outputs.superset-extensions-cli
uses: codecov/codecov-action@57e3a136b779b570ffcdbf80b3bdc90e7fab3de2 # v5
with:
file: ./coverage.xml
flags: superset-extensions-cli
name: superset-extensions-cli-coverage
fail_ci_if_error: false
- name: Upload HTML coverage report
if: steps.check.outputs.superset-extensions-cli
uses: actions/upload-artifact@043fb46d1a93c77aae656e7c1c64a875d1fc6a0a # v7
with:
name: superset-extensions-cli-coverage-html
path: htmlcov/

View File

@@ -1,200 +1,67 @@
name: "Frontend Build CI (unit tests, linting & sanity checks)"
name: Frontend
on:
push:
branches:
- "master"
- "[0-9].[0-9]*"
branches-ignore:
- "dependabot/**/docs/**"
- "dependabot/**/cypress-base/**"
pull_request:
types: [synchronize, opened, reopened, ready_for_review]
# cancel previous workflow jobs for PRs
concurrency:
group: ${{ github.workflow }}-${{ github.event.pull_request.number || github.run_id }}
cancel-in-progress: true
env:
TAG: apache/superset:GHA-${{ github.run_id }}
jobs:
frontend-build:
runs-on: ubuntu-24.04
outputs:
should-run: ${{ steps.check.outputs.frontend }}
if: github.event.pull_request.draft == false
runs-on: ubuntu-20.04
steps:
- name: Checkout Code
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6
- name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
uses: actions/checkout@v2
with:
persist-credentials: false
fetch-depth: 0
ref: ${{ github.event_name == 'pull_request' && github.event.pull_request.head.sha || github.sha }}
- name: Check for File Changes
submodules: recursive
- name: Check npm lock file version
run: ./scripts/ci_check_npm_lock_version.sh ./superset-frontend/package-lock.json
- name: Check if frontend changes are present
id: check
uses: ./.github/actions/change-detector/
with:
token: ${{ secrets.GITHUB_TOKEN }}
- name: Build Docker Image
if: steps.check.outputs.frontend
shell: bash
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
run: |
echo "git rev-parse --short HEAD"
git rev-parse --short HEAD
echo "git show -s --format=raw HEAD"
git show -s --format=raw HEAD
docker buildx build \
-t $TAG \
--cache-from=type=registry,ref=apache/superset-cache:3.10-slim-trixie \
--target superset-node-ci \
.
- name: Save Docker Image as Artifact
if: steps.check.outputs.frontend
run: |
docker save $TAG | zstd -3 --threads=0 > docker-image.tar.zst
- name: Upload Docker Image Artifact
if: steps.check.outputs.frontend
uses: actions/upload-artifact@043fb46d1a93c77aae656e7c1c64a875d1fc6a0a # v7
GITHUB_REPO: ${{ github.repository }}
PR_NUMBER: ${{ github.event.pull_request.number }}
continue-on-error: true
run: ./scripts/ci_check_no_file_changes.sh frontend
- name: Setup Node.js
if: steps.check.outcome == 'failure'
uses: actions/setup-node@v2
with:
name: docker-image
path: docker-image.tar.zst
sharded-jest-tests:
needs: frontend-build
if: needs.frontend-build.outputs.should-run == 'true'
strategy:
matrix:
shard: [1, 2, 3, 4, 5, 6, 7, 8]
fail-fast: false
runs-on: ubuntu-24.04
steps:
- name: Download Docker Image Artifact
uses: actions/download-artifact@3e5f45b2cfb9172054b4087a40e8e0b5a5461e7c # v8
node-version: "16"
- name: Install dependencies
if: steps.check.outcome == 'failure'
uses: ./.github/actions/cached-dependencies
with:
name: docker-image
- name: Load Docker Image
run: |
zstd -d < docker-image.tar.zst | docker load
- name: npm run test with coverage
run: |
mkdir -p ${{ github.workspace }}/superset-frontend/coverage
docker run \
-v ${{ github.workspace }}/superset-frontend/coverage:/app/superset-frontend/coverage \
--rm $TAG \
bash -c \
"npm run test -- --coverage --shard=${{ matrix.shard }}/8 --coverageReporters=json"
- name: Upload Coverage Artifact
uses: actions/upload-artifact@043fb46d1a93c77aae656e7c1c64a875d1fc6a0a # v7
with:
name: coverage-artifacts-${{ matrix.shard }}
path: superset-frontend/coverage
report-coverage:
needs: [sharded-jest-tests]
if: needs.frontend-build.outputs.should-run == 'true'
runs-on: ubuntu-24.04
permissions:
id-token: write
steps:
- name: Checkout Code
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6
with:
persist-credentials: false
fetch-depth: 0
ref: ${{ github.event_name == 'pull_request' && github.event.pull_request.head.sha || github.sha }}
- name: Download Coverage Artifacts
uses: actions/download-artifact@3e5f45b2cfb9172054b4087a40e8e0b5a5461e7c # v8
with:
pattern: coverage-artifacts-*
path: coverage/
- name: Reorganize test result reports
run: |
find coverage/
for i in {1..8}; do
mv coverage/coverage-artifacts-${i}/coverage-final.json coverage/coverage-shard-${i}.json
done
shell: bash
- name: Merge Code Coverage
run: npx nyc merge coverage/ merged-output/coverage-summary.json
- name: Upload Code Coverage
uses: codecov/codecov-action@57e3a136b779b570ffcdbf80b3bdc90e7fab3de2 # v5
with:
flags: javascript
use_oidc: true
verbose: true
disable_search: true
files: merged-output/coverage-summary.json
slug: apache/superset
lint-frontend:
needs: frontend-build
if: needs.frontend-build.outputs.should-run == 'true'
runs-on: ubuntu-24.04
steps:
- name: Download Docker Image Artifact
uses: actions/download-artifact@3e5f45b2cfb9172054b4087a40e8e0b5a5461e7c # v8
with:
name: docker-image
- name: Load Docker Image
run: |
zstd -d < docker-image.tar.zst | docker load
run: npm-install
- name: lint
if: steps.check.outcome == 'failure'
working-directory: ./superset-frontend
run: |
docker run --rm $TAG bash -c \
"npm i && npm run lint"
- name: tsc
npm run lint
npm run prettier-check
- name: Build plugins packages
if: steps.check.outcome == 'failure'
working-directory: ./superset-frontend
run: npm run plugins:build
- name: Build plugins Storybook
if: steps.check.outcome == 'failure'
working-directory: ./superset-frontend
run: npm run plugins:build-storybook
- name: unit tests
if: steps.check.outcome == 'failure'
working-directory: ./superset-frontend
run: |
docker run --rm $TAG bash -c \
"npm i && npm run plugins:build && npm run type"
validate-frontend:
needs: frontend-build
if: needs.frontend-build.outputs.should-run == 'true'
runs-on: ubuntu-24.04
steps:
- name: Download Docker Image Artifact
uses: actions/download-artifact@3e5f45b2cfb9172054b4087a40e8e0b5a5461e7c # v8
with:
name: docker-image
- name: Load Docker Image
run: |
zstd -d < docker-image.tar.zst | docker load
- name: Build Plugins Packages
run: |
docker run --rm $TAG bash -c \
"npm run plugins:build"
test-storybook:
needs: frontend-build
if: needs.frontend-build.outputs.should-run == 'true'
runs-on: ubuntu-24.04
steps:
- name: Download Docker Image Artifact
uses: actions/download-artifact@3e5f45b2cfb9172054b4087a40e8e0b5a5461e7c # v8
with:
name: docker-image
- name: Load Docker Image
run: |
zstd -d < docker-image.tar.zst | docker load
- name: Build Storybook and Run Tests
run: |
docker run --rm $TAG bash -c \
"npm run build-storybook && npx playwright install-deps && npx playwright install chromium && npm run test-storybook:ci"
npm run test -- --coverage
# todo: remove this step when fix generator as a project in root jest.config.js
- name: generator-superset unit tests
if: steps.check.outcome == 'failure'
working-directory: ./superset-frontend/packages/generator-superset
run: npx jest
- name: Upload code coverage
if: steps.check.outcome == 'failure'
working-directory: ./superset-frontend
run: ../.github/workflows/codecov.sh -c -F javascript

View File

@@ -1,36 +1,28 @@
name: "Helm: lint and test charts"
name: Lint and Test Charts
on:
pull_request:
types: [opened, edited, reopened, synchronize]
paths:
- "helm/**"
# cancel previous workflow jobs for PRs
concurrency:
group: ${{ github.workflow }}-${{ github.event.pull_request.number || github.run_id }}
cancel-in-progress: true
jobs:
lint-test:
runs-on: ubuntu-24.04
runs-on: ubuntu-latest
steps:
- name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6
uses: actions/checkout@v2
with:
persist-credentials: false
submodules: recursive
fetch-depth: 0
- name: Set up Helm
uses: azure/setup-helm@dda3372f752e03dde6b3237bc9431cdc2f7a02a2 # v5.0.0
uses: azure/setup-helm@v1
with:
version: v3.16.4
version: v3.5.4
- name: Setup Python
uses: ./.github/actions/setup-backend/
- uses: actions/setup-python@v2
with:
install-superset: 'false'
python-version: 3.8
- name: Set up chart-testing
uses: ./.github/actions/chart-testing-action
@@ -40,7 +32,7 @@ jobs:
run: |
changed=$(ct list-changed --print-config)
if [[ -n "$changed" ]]; then
echo "changed=true" >> $GITHUB_OUTPUT
echo "::set-output name=changed::true"
fi
env:
CT_CHART_DIRS: helm
@@ -52,4 +44,3 @@ jobs:
CT_CHART_DIRS: helm
CT_LINT_CONF: lintconf.yaml
CT_SINCE: HEAD
CT_CHART_REPOS: bitnami=https://charts.bitnami.com/bitnami

View File

@@ -1,38 +1,20 @@
# This workflow automates the release process for Helm charts.
# The workflow creates a new branch for the release and opens a pull request against the 'gh-pages' branch,
# allowing the changes to be reviewed and merged manually.
name: "Helm: release charts"
name: Release Charts
on:
push:
branches:
- "master"
- "[0-9].[0-9]*"
- 'master'
paths:
- "helm/**"
workflow_dispatch:
inputs:
ref:
description: "The branch, tag, or commit SHA to check out"
required: false
default: "master"
- 'helm/**'
jobs:
release:
runs-on: ubuntu-24.04
permissions:
contents: write
pull-requests: write
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
runs-on: ubuntu-latest
steps:
- name: Checkout code
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6
- name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
uses: actions/checkout@v2
with:
ref: ${{ inputs.ref || github.ref_name }}
persist-credentials: true
persist-credentials: false
submodules: recursive
fetch-depth: 0
@@ -42,84 +24,14 @@ jobs:
git config user.email "$GITHUB_ACTOR@users.noreply.github.com"
- name: Install Helm
uses: azure/setup-helm@dda3372f752e03dde6b3237bc9431cdc2f7a02a2 # v5.0.0
uses: azure/setup-helm@v1
with:
version: v3.5.4
- name: Add bitnami repo dependency
run: helm repo add bitnami https://charts.bitnami.com/bitnami
- name: Fetch/list all tags
run: |
# Debugging tags
git fetch --tags --force
git tag -d superset-helm-chart-0.13.4 || true
echo "DEBUG TAGS"
git show-ref --tags
- name: Create unique pages branch name
id: vars
run: echo "branch_name=helm-publish-${GITHUB_SHA:0:7}" >> $GITHUB_ENV
- name: Force recreate branch from gh-pages
run: |
# Ensure a clean working directory
git reset --hard
git clean -fdx
git checkout -b local_gha_temp
git submodule update
# Fetch the latest gh-pages branch
git fetch origin gh-pages
# Check out and reset the target branch based on gh-pages
git checkout -B ${{ env.branch_name }} origin/gh-pages
# Remove submodules from the branch
git submodule deinit -f --all
# Force push to the remote branch
git push origin ${{ env.branch_name }} --force
# Return to the original branch
git checkout local_gha_temp
- name: Fetch/list all tags
run: |
git submodule update
cat .github/actions/chart-releaser-action/action.yml
- name: Run chart-releaser
uses: ./.github/actions/chart-releaser-action
with:
version: v1.6.0
charts_dir: helm
mark_as_latest: false
pages_branch: ${{ env.branch_name }}
env:
CR_TOKEN: "${{ github.token }}"
CR_TOKEN: "${{ secrets.GITHUB_TOKEN }}"
CR_RELEASE_NAME_TEMPLATE: "superset-helm-chart-{{ .Version }}"
- name: Open Pull Request
uses: actions/github-script@3a2844b7e9c422d3c10d287c895573f7108da1b3 # v9.0.0
with:
script: |
const branchName = '${{ env.branch_name }}';
const [owner, repo] = process.env.GITHUB_REPOSITORY.split('/');
if (!branchName) {
throw new Error("Branch name is not defined.");
}
const pr = await github.rest.pulls.create({
owner,
repo,
title: `Helm chart release for ${branchName}`,
head: branchName,
base: "gh-pages", // Adjust if the target branch is different
body: `This PR releases Helm charts to the gh-pages branch.`,
});
core.info(`Pull request created: ${pr.data.html_url}`);
env:
BRANCH_NAME: ${{ env.branch_name }}

View File

@@ -1,142 +0,0 @@
name: Playwright Experimental Tests
on:
push:
branches:
- "master"
- "[0-9].[0-9]*"
pull_request:
types: [synchronize, opened, reopened, ready_for_review]
workflow_dispatch:
inputs:
ref:
description: 'The branch or tag to checkout'
required: false
default: ''
pr_id:
description: 'The pull request ID to checkout'
required: false
default: ''
concurrency:
group: ${{ github.workflow }}-${{ github.event.pull_request.number || github.run_id }}
cancel-in-progress: true
jobs:
# NOTE: Required Playwright tests are in superset-e2e.yml (E2E / playwright-tests)
# This workflow contains only experimental tests that run in shadow mode
playwright-tests-experimental:
runs-on: ubuntu-22.04
continue-on-error: true
permissions:
contents: read
pull-requests: read
strategy:
fail-fast: false
matrix:
browser: ["chromium"]
app_root: ["", "/app/prefix"]
env:
SUPERSET_ENV: development
SUPERSET_CONFIG: tests.integration_tests.superset_test_config
SUPERSET__SQLALCHEMY_DATABASE_URI: postgresql+psycopg2://superset:superset@127.0.0.1:15432/superset
PYTHONPATH: ${{ github.workspace }}
REDIS_PORT: 16379
GITHUB_TOKEN: ${{ github.token }}
services:
postgres:
image: postgres:17-alpine
env:
POSTGRES_USER: superset
POSTGRES_PASSWORD: superset
ports:
- 15432:5432
redis:
image: redis:7-alpine
ports:
- 16379:6379
steps:
# -------------------------------------------------------
# Conditional checkout based on context (same as Cypress workflow)
- name: Checkout for push or pull_request event
if: github.event_name == 'push' || github.event_name == 'pull_request'
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6
with:
persist-credentials: false
submodules: recursive
ref: ${{ github.event_name == 'pull_request' && github.event.pull_request.head.sha || github.sha }}
- name: Checkout using ref (workflow_dispatch)
if: github.event_name == 'workflow_dispatch' && github.event.inputs.ref != ''
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6
with:
persist-credentials: false
ref: ${{ github.event.inputs.ref }}
submodules: recursive
- name: Checkout using PR ID (workflow_dispatch)
if: github.event_name == 'workflow_dispatch' && github.event.inputs.pr_id != ''
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6
with:
persist-credentials: false
ref: refs/pull/${{ github.event.inputs.pr_id }}/merge
submodules: recursive
# -------------------------------------------------------
- name: Check for file changes
id: check
uses: ./.github/actions/change-detector/
with:
token: ${{ secrets.GITHUB_TOKEN }}
- name: Setup Python
uses: ./.github/actions/setup-backend/
if: steps.check.outputs.python || steps.check.outputs.frontend
- name: Setup postgres
if: steps.check.outputs.python || steps.check.outputs.frontend
uses: ./.github/actions/cached-dependencies
with:
run: setup-postgres
- name: Import test data
if: steps.check.outputs.python || steps.check.outputs.frontend
uses: ./.github/actions/cached-dependencies
with:
run: playwright_testdata
- name: Setup Node.js
if: steps.check.outputs.python || steps.check.outputs.frontend
uses: actions/setup-node@48b55a011bda9f5d6aeb4c2d9c7362e8dae4041e # v6
with:
node-version-file: './superset-frontend/.nvmrc'
- name: Install npm dependencies
if: steps.check.outputs.python || steps.check.outputs.frontend
uses: ./.github/actions/cached-dependencies
with:
run: npm-install
- name: Build javascript packages
if: steps.check.outputs.python || steps.check.outputs.frontend
uses: ./.github/actions/cached-dependencies
with:
run: build-instrumented-assets
- name: Install Playwright
if: steps.check.outputs.python || steps.check.outputs.frontend
uses: ./.github/actions/cached-dependencies
with:
run: playwright-install
- name: Run Playwright (Experimental Tests)
if: steps.check.outputs.python || steps.check.outputs.frontend
uses: ./.github/actions/cached-dependencies
env:
NODE_OPTIONS: "--max-old-space-size=4096"
with:
run: playwright-run "${{ matrix.app_root }}" experimental/
- name: Set safe app root
if: failure()
id: set-safe-app-root
run: |
APP_ROOT="${{ matrix.app_root }}"
SAFE_APP_ROOT=${APP_ROOT//\//_}
echo "safe_app_root=$SAFE_APP_ROOT" >> $GITHUB_OUTPUT
- name: Upload Playwright Artifacts
uses: actions/upload-artifact@043fb46d1a93c77aae656e7c1c64a875d1fc6a0a # v7
if: failure()
with:
path: |
${{ github.workspace }}/superset-frontend/playwright-results/
${{ github.workspace }}/superset-frontend/test-results/
name: playwright-experimental-artifact-${{ github.run_id }}-${{ github.job }}-${{ matrix.browser }}--${{ steps.set-safe-app-root.outputs.safe_app_root }}

@@ -3,21 +3,18 @@ name: Python-Integration
on:
push:
branches:
- "master"
- "[0-9].[0-9]*"
branches-ignore:
- "dependabot/npm_and_yarn/**"
pull_request:
types: [synchronize, opened, reopened, ready_for_review]
concurrency:
group: ${{ github.workflow }}-${{ github.event.pull_request.number || github.run_id }}
cancel-in-progress: true
jobs:
test-mysql:
runs-on: ubuntu-24.04
permissions:
id-token: write
if: github.event.pull_request.draft == false
runs-on: ubuntu-20.04
strategy:
matrix:
python-version: [3.8]
env:
PYTHONPATH: ${{ github.workspace }}
SUPERSET_CONFIG: tests.integration_tests.superset_test_config
@@ -26,93 +23,64 @@ jobs:
mysql+mysqldb://superset:superset@127.0.0.1:13306/superset?charset=utf8mb4&binary_prefix=true
services:
mysql:
image: mysql:8.0
image: mysql:5.7
env:
MYSQL_ROOT_PASSWORD: root
ports:
- 13306:3306
options: >-
--health-cmd="mysqladmin ping --silent"
--health-interval=10s
--health-timeout=5s
--health-retries=5
redis:
image: redis:7-alpine
image: redis:5-alpine
options: --entrypoint redis-server
ports:
- 16379:6379
steps:
- name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6
uses: actions/checkout@v2
with:
persist-credentials: false
submodules: recursive
- name: Check for file changes
- name: Check if python changes are present
id: check
uses: ./.github/actions/change-detector/
with:
token: ${{ secrets.GITHUB_TOKEN }}
env:
GITHUB_REPO: ${{ github.repository }}
PR_NUMBER: ${{ github.event.pull_request.number }}
continue-on-error: true
run: ./scripts/ci_check_no_file_changes.sh python
- name: Setup Python
uses: ./.github/actions/setup-backend/
if: steps.check.outputs.python
- name: Setup MySQL
if: steps.check.outputs.python
if: steps.check.outcome == 'failure'
uses: actions/setup-python@v2
with:
python-version: ${{ matrix.python-version }}
cache: 'pip'
cache-dependency-path: 'requirements/testing.txt'
- name: Install dependencies
if: steps.check.outcome == 'failure'
uses: ./.github/actions/cached-dependencies
with:
run: setup-mysql
- name: Start Celery worker
if: steps.check.outputs.python
uses: ./.github/actions/cached-dependencies
with:
run: celery-worker
run: |
apt-get-install
pip-upgrade
pip install wheel
pip install -r requirements/testing.txt
setup-mysql
- name: Run celery
if: steps.check.outcome == 'failure'
run: celery --app=superset.tasks.celery_app:app worker -Ofair -c 2 &
- name: Python integration tests (MySQL)
if: steps.check.outputs.python
if: steps.check.outcome == 'failure'
run: |
./scripts/python_tests.sh
- name: Upload code coverage
uses: codecov/codecov-action@57e3a136b779b570ffcdbf80b3bdc90e7fab3de2 # v5
with:
flags: python,mysql
verbose: true
use_oidc: true
slug: apache/superset
- name: Generate database diagnostics for docs
if: steps.check.outputs.python
env:
SUPERSET_CONFIG: tests.integration_tests.superset_test_config
SUPERSET__SQLALCHEMY_DATABASE_URI: |
mysql+mysqldb://superset:superset@127.0.0.1:13306/superset?charset=utf8mb4&binary_prefix=true
if: steps.check.outcome == 'failure'
run: |
python -c "
import json
from superset.app import create_app
from superset.db_engine_specs.lib import generate_yaml_docs
app = create_app()
with app.app_context():
docs = generate_yaml_docs()
# Wrap in the expected format
output = {
'generated': '$(date -Iseconds)',
'databases': docs
}
with open('databases-diagnostics.json', 'w') as f:
json.dump(output, f, indent=2, default=str)
print(f'Generated diagnostics for {len(docs)} databases')
"
- name: Upload database diagnostics artifact
if: steps.check.outputs.python
uses: actions/upload-artifact@043fb46d1a93c77aae656e7c1c64a875d1fc6a0a # v7
with:
name: database-diagnostics
path: databases-diagnostics.json
retention-days: 7
bash .github/workflows/codecov.sh -c -F python -F mysql
test-postgres:
runs-on: ubuntu-24.04
permissions:
id-token: write
if: github.event.pull_request.draft == false
runs-on: ubuntu-20.04
strategy:
matrix:
python-version: ["current", "previous", "next"]
python-version: [3.8, 3.9]
env:
PYTHONPATH: ${{ github.workspace }}
SUPERSET_CONFIG: tests.integration_tests.superset_test_config
@@ -120,7 +88,7 @@ jobs:
SUPERSET__SQLALCHEMY_DATABASE_URI: postgresql+psycopg2://superset:superset@127.0.0.1:15432/superset
services:
postgres:
image: postgres:17-alpine
image: postgres:14-alpine
env:
POSTGRES_USER: superset
POSTGRES_PASSWORD: superset
@@ -129,99 +97,106 @@ jobs:
# GitHub action runner's default installations
- 15432:5432
redis:
image: redis:7-alpine
image: redis:5-alpine
ports:
- 16379:6379
steps:
- name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6
uses: actions/checkout@v2
with:
persist-credentials: false
submodules: recursive
- name: Check for file changes
- name: Check if python changes are present
id: check
uses: ./.github/actions/change-detector/
with:
token: ${{ secrets.GITHUB_TOKEN }}
env:
GITHUB_REPO: ${{ github.repository }}
PR_NUMBER: ${{ github.event.pull_request.number }}
continue-on-error: true
run: ./scripts/ci_check_no_file_changes.sh python
- name: Setup Python
uses: ./.github/actions/setup-backend/
if: steps.check.outputs.python
if: steps.check.outcome == 'failure'
uses: actions/setup-python@v2
with:
python-version: ${{ matrix.python-version }}
- name: Setup Postgres
if: steps.check.outputs.python
cache: 'pip'
cache-dependency-path: 'requirements/testing.txt'
- name: Install dependencies
if: steps.check.outcome == 'failure'
uses: ./.github/actions/cached-dependencies
with:
run: |
apt-get-install
pip-upgrade
pip install wheel
pip install -r requirements/testing.txt
setup-postgres
- name: Start Celery worker
if: steps.check.outputs.python
uses: ./.github/actions/cached-dependencies
with:
run: celery-worker
- name: Run celery
if: steps.check.outcome == 'failure'
run: celery --app=superset.tasks.celery_app:app worker -Ofair -c 2 &
- name: Python integration tests (PostgreSQL)
if: steps.check.outputs.python
if: steps.check.outcome == 'failure'
run: |
./scripts/python_tests.sh
- name: Upload code coverage
uses: codecov/codecov-action@57e3a136b779b570ffcdbf80b3bdc90e7fab3de2 # v5
with:
flags: python,postgres
verbose: true
use_oidc: true
slug: apache/superset
if: steps.check.outcome == 'failure'
run: |
bash .github/workflows/codecov.sh -c -F python -F postgres
test-sqlite:
runs-on: ubuntu-24.04
permissions:
id-token: write
if: github.event.pull_request.draft == false
runs-on: ubuntu-20.04
strategy:
matrix:
python-version: [3.8]
env:
PYTHONPATH: ${{ github.workspace }}
SUPERSET_CONFIG: tests.integration_tests.superset_test_config
REDIS_PORT: 16379
SUPERSET__SQLALCHEMY_DATABASE_URI: |
sqlite:///${{ github.workspace }}/.temp/superset.db?check_same_thread=true
SUPERSET__SQLALCHEMY_EXAMPLES_URI: |
sqlite:///${{ github.workspace }}/.temp/examples.db?check_same_thread=true
sqlite:///${{ github.workspace }}/.temp/unittest.db
services:
redis:
image: redis:7-alpine
image: redis:5-alpine
ports:
- 16379:6379
steps:
- name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6
uses: actions/checkout@v2
with:
persist-credentials: false
submodules: recursive
- name: Check for file changes
- name: Check if python changes are present
id: check
uses: ./.github/actions/change-detector/
with:
token: ${{ secrets.GITHUB_TOKEN }}
env:
GITHUB_REPO: ${{ github.repository }}
PR_NUMBER: ${{ github.event.pull_request.number }}
continue-on-error: true
run: ./scripts/ci_check_no_file_changes.sh python
- name: Setup Python
uses: ./.github/actions/setup-backend/
if: steps.check.outputs.python
if: steps.check.outcome == 'failure'
uses: actions/setup-python@v2
with:
python-version: ${{ matrix.python-version }}
cache: 'pip'
cache-dependency-path: 'requirements/testing.txt'
- name: Install dependencies
if: steps.check.outputs.python
if: steps.check.outcome == 'failure'
uses: ./.github/actions/cached-dependencies
with:
run: |
# sqlite needs this working directory
apt-get-install
pip-upgrade
pip install wheel
pip install -r requirements/testing.txt
mkdir ${{ github.workspace }}/.temp
- name: Start Celery worker
if: steps.check.outputs.python
uses: ./.github/actions/cached-dependencies
with:
run: celery-worker
- name: Run celery
if: steps.check.outcome == 'failure'
run: celery --app=superset.tasks.celery_app:app worker -Ofair -c 2 &
- name: Python integration tests (SQLite)
if: steps.check.outputs.python
if: steps.check.outcome == 'failure'
run: |
./scripts/python_tests.sh
- name: Upload code coverage
uses: codecov/codecov-action@57e3a136b779b570ffcdbf80b3bdc90e7fab3de2 # v5
with:
flags: python,sqlite
verbose: true
use_oidc: true
slug: apache/superset
if: steps.check.outcome == 'failure'
run: |
bash .github/workflows/codecov.sh -c -F python -F sqlite

@@ -0,0 +1,111 @@
# Python Misc unit tests
name: Python Misc
on:
push:
branches-ignore:
- "dependabot/npm_and_yarn/**"
pull_request:
types: [synchronize, opened, reopened, ready_for_review]
jobs:
python-lint:
if: github.event.pull_request.draft == false
runs-on: ubuntu-20.04
strategy:
matrix:
python-version: [3.8]
steps:
- name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
uses: actions/checkout@v2
with:
persist-credentials: false
submodules: recursive
- name: Check if python changes are present
id: check
env:
GITHUB_REPO: ${{ github.repository }}
PR_NUMBER: ${{ github.event.pull_request.number }}
continue-on-error: true
run: ./scripts/ci_check_no_file_changes.sh python
- name: Setup Python
if: steps.check.outcome == 'failure'
uses: actions/setup-python@v2
with:
python-version: ${{ matrix.python-version }}
cache: 'pip'
cache-dependency-path: 'requirements/testing.txt'
- name: Install dependencies
if: steps.check.outcome == 'failure'
uses: ./.github/actions/cached-dependencies
with:
run: |
apt-get-install
pip-upgrade
pip install wheel
pip install -r requirements/testing.txt
- name: pylint
if: steps.check.outcome == 'failure'
# `-j 0` run Pylint in parallel
run: pylint -j 0 superset
pre-commit:
if: github.event.pull_request.draft == false
runs-on: ubuntu-20.04
strategy:
matrix:
python-version: [3.8]
steps:
- name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
uses: actions/checkout@v2
with:
persist-credentials: false
submodules: recursive
- name: Setup Python
uses: actions/setup-python@v2
with:
python-version: ${{ matrix.python-version }}
cache: 'pip'
cache-dependency-path: |
requirements/base.txt
requirements/integration.txt
- name: Install dependencies
uses: ./.github/actions/cached-dependencies
with:
run: |
apt-get-install
pip-upgrade
pip install wheel
pip install -r requirements/base.txt
pip install -r requirements/integration.txt
- name: pre-commit
run: pre-commit run --all-files
babel-extract:
if: github.event.pull_request.draft == false
runs-on: ubuntu-20.04
strategy:
matrix:
python-version: [3.8]
steps:
- name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
uses: actions/checkout@v2
with:
persist-credentials: false
submodules: recursive
- name: Setup Python
uses: actions/setup-python@v2
with:
python-version: ${{ matrix.python-version }}
cache: 'pip'
cache-dependency-path: 'requirements/base.txt'
- name: Install dependencies
uses: ./.github/actions/cached-dependencies
with:
run: |
apt-get-install
pip-upgrade
pip install wheel
pip install -r requirements/base.txt
- name: Test babel extraction
run: flask fab babel-extract --target superset/translations --output superset/translations/messages.pot --config superset/translations/babel.cfg -k _,__,t,tn,tct

@@ -3,22 +3,18 @@ name: Python Presto/Hive
on:
push:
branches:
- "master"
- "[0-9].[0-9]*"
branches-ignore:
- "dependabot/npm_and_yarn/**"
pull_request:
types: [synchronize, opened, reopened, ready_for_review]
# cancel previous workflow jobs for PRs
concurrency:
group: ${{ github.workflow }}-${{ github.event.pull_request.number || github.run_id }}
cancel-in-progress: true
jobs:
test-postgres-presto:
runs-on: ubuntu-24.04
permissions:
id-token: write
if: github.event.pull_request.draft == false
runs-on: ubuntu-20.04
strategy:
matrix:
python-version: [3.8]
env:
PYTHONPATH: ${{ github.workspace }}
SUPERSET_CONFIG: tests.integration_tests.superset_test_config
@@ -27,7 +23,7 @@ jobs:
SUPERSET__SQLALCHEMY_EXAMPLES_URI: presto://localhost:15433/memory/default
services:
postgres:
image: postgres:17-alpine
image: postgres:14-alpine
env:
POSTGRES_USER: superset
POSTGRES_PASSWORD: superset
@@ -45,51 +41,57 @@ jobs:
# GitHub action runner's default installations
- 15433:8080
redis:
image: redis:7-alpine
image: redis:5-alpine
ports:
- 16379:6379
steps:
- name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6
uses: actions/checkout@v2
with:
persist-credentials: false
submodules: recursive
- name: Check for file changes
- name: Check if python changes are present
id: check
uses: ./.github/actions/change-detector/
with:
token: ${{ secrets.GITHUB_TOKEN }}
env:
GITHUB_REPO: ${{ github.repository }}
PR_NUMBER: ${{ github.event.pull_request.number }}
continue-on-error: true
run: ./scripts/ci_check_no_file_changes.sh python
- name: Setup Python
uses: ./.github/actions/setup-backend/
if: steps.check.outputs.python == 'true'
- name: Setup Postgres
if: steps.check.outputs.python
if: steps.check.outcome == 'failure'
uses: actions/setup-python@v2
with:
python-version: ${{ matrix.python-version }}
cache: 'pip'
cache-dependency-path: 'requirements/testing.txt'
- name: Install dependencies
if: steps.check.outcome == 'failure'
uses: ./.github/actions/cached-dependencies
with:
run: |
echo "${{ steps.check.outputs.python }}"
apt-get-install
pip-upgrade
pip install wheel
pip install -r requirements/testing.txt
setup-postgres
- name: Start Celery worker
if: steps.check.outputs.python
uses: ./.github/actions/cached-dependencies
with:
run: celery-worker
- name: Run celery
if: steps.check.outcome == 'failure'
run: celery --app=superset.tasks.celery_app:app worker -Ofair -c 2 &
- name: Python unit tests (PostgreSQL)
if: steps.check.outputs.python
if: steps.check.outcome == 'failure'
run: |
./scripts/python_tests.sh -m 'chart_data_flow or sql_json_flow'
- name: Upload code coverage
uses: codecov/codecov-action@57e3a136b779b570ffcdbf80b3bdc90e7fab3de2 # v5
with:
flags: python,presto
verbose: true
use_oidc: true
slug: apache/superset
if: steps.check.outcome == 'failure'
run: |
bash .github/workflows/codecov.sh -c -F python -F presto
test-postgres-hive:
runs-on: ubuntu-24.04
permissions:
id-token: write
if: github.event.pull_request.draft == false
runs-on: ubuntu-20.04
strategy:
matrix:
python-version: [3.8]
env:
PYTHONPATH: ${{ github.workspace }}
SUPERSET_CONFIG: tests.integration_tests.superset_test_config
@@ -99,7 +101,7 @@ jobs:
UPLOAD_FOLDER: /tmp/.superset/uploads/
services:
postgres:
image: postgres:17-alpine
image: postgres:14-alpine
env:
POSTGRES_USER: superset
POSTGRES_PASSWORD: superset
@@ -108,51 +110,56 @@ jobs:
# GitHub action runner's default installations
- 15432:5432
redis:
image: redis:7-alpine
image: redis:5-alpine
ports:
- 16379:6379
steps:
- name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6
uses: actions/checkout@v2
with:
persist-credentials: false
submodules: recursive
- name: Check for file changes
- name: Check if python changes are present
id: check
uses: ./.github/actions/change-detector/
with:
token: ${{ secrets.GITHUB_TOKEN }}
env:
GITHUB_REPO: ${{ github.repository }}
PR_NUMBER: ${{ github.event.pull_request.number }}
continue-on-error: true
run: ./scripts/ci_check_no_file_changes.sh python
- name: Create csv upload directory
if: steps.check.outputs.python
if: steps.check.outcome == 'failure'
run: sudo mkdir -p /tmp/.superset/uploads
- name: Give write access to the csv upload directory
if: steps.check.outputs.python
if: steps.check.outcome == 'failure'
run: sudo chown -R $USER:$USER /tmp/.superset
- name: Start hadoop and hive
if: steps.check.outputs.python
run: docker compose -f scripts/databases/hive/docker-compose.yml up -d
if: steps.check.outcome == 'failure'
run: docker-compose -f scripts/databases/hive/docker-compose.yml up -d
- name: Setup Python
uses: ./.github/actions/setup-backend/
if: steps.check.outputs.python
- name: Setup Postgres
if: steps.check.outputs.python
if: steps.check.outcome == 'failure'
uses: actions/setup-python@v2
with:
python-version: ${{ matrix.python-version }}
cache: 'pip'
cache-dependency-path: 'requirements/testing.txt'
- name: Install dependencies
if: steps.check.outcome == 'failure'
uses: ./.github/actions/cached-dependencies
with:
run: setup-postgres
- name: Start Celery worker
if: steps.check.outputs.python
uses: ./.github/actions/cached-dependencies
with:
run: celery-worker
run: |
apt-get-install
pip-upgrade
pip install wheel
pip install -r requirements/testing.txt
setup-postgres
- name: Run celery
if: steps.check.outcome == 'failure'
run: celery --app=superset.tasks.celery_app:app worker -Ofair -c 2 &
- name: Python unit tests (PostgreSQL)
if: steps.check.outputs.python
if: steps.check.outcome == 'failure'
run: |
pip install -e .[hive]
./scripts/python_tests.sh -m 'chart_data_flow or sql_json_flow'
- name: Upload code coverage
uses: codecov/codecov-action@57e3a136b779b570ffcdbf80b3bdc90e7fab3de2 # v5
with:
flags: python,hive
verbose: true
use_oidc: true
slug: apache/superset
if: steps.check.outcome == 'failure'
run: |
bash .github/workflows/codecov.sh -c -F python -F hive

@@ -3,61 +3,56 @@ name: Python-Unit
on:
push:
branches:
- "master"
- "[0-9].[0-9]*"
branches-ignore:
- "dependabot/npm_and_yarn/**"
pull_request:
types: [synchronize, opened, reopened, ready_for_review]
# cancel previous workflow jobs for PRs
concurrency:
group: ${{ github.workflow }}-${{ github.event.pull_request.number || github.run_id }}
cancel-in-progress: true
jobs:
unit-tests:
runs-on: ubuntu-24.04
permissions:
id-token: write
if: github.event.pull_request.draft == false
runs-on: ubuntu-20.04
strategy:
matrix:
python-version: ["previous", "current", "next"]
python-version: [3.8, 3.9]
env:
PYTHONPATH: ${{ github.workspace }}
steps:
- name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6
uses: actions/checkout@v2
with:
persist-credentials: false
submodules: recursive
- name: Check for file changes
- name: Check if python changes are present
id: check
uses: ./.github/actions/change-detector/
with:
token: ${{ secrets.GITHUB_TOKEN }}
env:
GITHUB_REPO: ${{ github.repository }}
PR_NUMBER: ${{ github.event.pull_request.number }}
continue-on-error: true
run: ./scripts/ci_check_no_file_changes.sh python
- name: Setup Python
uses: ./.github/actions/setup-backend/
if: steps.check.outputs.python
if: steps.check.outcome == 'failure'
uses: actions/setup-python@v2
with:
python-version: ${{ matrix.python-version }}
- name: Python unit tests
if: steps.check.outputs.python
env:
SUPERSET_TESTENV: true
SUPERSET_SECRET_KEY: not-a-secret
run: |
pytest --durations-min=0.5 --cov-report= --cov=superset ./tests/common ./tests/unit_tests --cache-clear --maxfail=50
- name: Python 100% coverage unit tests
if: steps.check.outputs.python
env:
SUPERSET_TESTENV: true
SUPERSET_SECRET_KEY: not-a-secret
run: |
pytest --durations-min=0.5 --cov=superset/sql/ ./tests/unit_tests/sql/ --cache-clear --cov-fail-under=100
- name: Upload code coverage
uses: codecov/codecov-action@57e3a136b779b570ffcdbf80b3bdc90e7fab3de2 # v5
cache: 'pip'
cache-dependency-path: 'requirements/testing.txt'
# TODO: separated requirements.txt file just for unit tests
- name: Install dependencies
if: steps.check.outcome == 'failure'
uses: ./.github/actions/cached-dependencies
with:
flags: python,unit
verbose: true
use_oidc: true
slug: apache/superset
run: |
apt-get-install
pip-upgrade
pip install wheel
pip install -r requirements/testing.txt
mkdir ${{ github.workspace }}/.temp
- name: Python unit tests
if: steps.check.outcome == 'failure'
run: |
pytest --durations-min=0.5 --cov-report= --cov=superset ./tests/common ./tests/unit_tests --cache-clear
- name: Upload code coverage
if: steps.check.outcome == 'failure'
run: |
bash .github/workflows/codecov.sh -c -F python -F unit

@@ -2,70 +2,56 @@ name: Translations
on:
push:
branches:
- "master"
- "[0-9].[0-9]*"
branches-ignore:
- "dependabot/npm_and_yarn/**"
pull_request:
types: [synchronize, opened, reopened, ready_for_review]
# cancel previous workflow jobs for PRs
concurrency:
group: ${{ github.workflow }}-${{ github.event.pull_request.number || github.run_id }}
cancel-in-progress: true
jobs:
frontend-check-translations:
runs-on: ubuntu-24.04
frontend-check:
if: github.event.pull_request.draft == false
runs-on: ubuntu-20.04
steps:
- name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6
uses: actions/checkout@v2
with:
persist-credentials: false
submodules: recursive
- name: Check for file changes
id: check
uses: ./.github/actions/change-detector/
with:
token: ${{ secrets.GITHUB_TOKEN }}
- name: Setup Node.js
if: steps.check.outputs.frontend
uses: actions/setup-node@48b55a011bda9f5d6aeb4c2d9c7362e8dae4041e # v6
uses: actions/setup-node@v2
with:
node-version-file: './superset-frontend/.nvmrc'
node-version: '16'
- name: Install dependencies
if: steps.check.outputs.frontend
uses: ./.github/actions/cached-dependencies
with:
run: npm-install
- name: lint
if: steps.check.outputs.frontend
working-directory: ./superset-frontend
run: |
npm run build-translation
npm run check-translation
babel-extract:
runs-on: ubuntu-24.04
if: github.event.pull_request.draft == false
runs-on: ubuntu-20.04
strategy:
matrix:
python-version: [3.8]
steps:
- name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6
uses: actions/checkout@v2
with:
persist-credentials: false
submodules: recursive
- name: Check for file changes
id: check
uses: ./.github/actions/change-detector/
with:
token: ${{ secrets.GITHUB_TOKEN }}
- name: Setup Python
if: steps.check.outputs.python
uses: ./.github/actions/setup-backend/
- name: Install msgcat
run: sudo apt update && sudo apt install gettext
uses: actions/setup-python@v2
with:
python-version: ${{ matrix.python-version }}
- name: Install dependencies
uses: ./.github/actions/cached-dependencies
with:
run: |
apt-get-install
pip-upgrade
pip install -r requirements/base.txt
- name: Test babel extraction
if: steps.check.outputs.python
run: ./scripts/translations/babel_update.sh
run: ./scripts/babel_update.sh

@@ -1,38 +1,27 @@
name: WebSocket server
on:
push:
branches:
- "master"
- "[0-9].[0-9]*"
paths:
- "superset-websocket/**"
pull_request:
paths:
- "superset-websocket/**"
types: [synchronize, opened, reopened, ready_for_review]
# cancel previous workflow jobs for PRs
concurrency:
group: ${{ github.workflow }}-${{ github.event.pull_request.number || github.run_id }}
cancel-in-progress: true
jobs:
app-checks:
runs-on: ubuntu-24.04
if: github.event.pull_request.draft == false
runs-on: ubuntu-20.04
steps:
- name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6
uses: actions/checkout@v2
with:
persist-credentials: false
- name: Install dependencies
working-directory: ./superset-websocket
run: npm ci
- name: eslint
run: npm install
- name: lint
working-directory: ./superset-websocket
run: npm run eslint -- . --quiet
- name: typescript checks
working-directory: ./superset-websocket
run: npm run type
run: npm run lint
- name: prettier
working-directory: ./superset-websocket
run: npm run prettier-check

@@ -1,56 +0,0 @@
name: SupersetBot Workflow
on:
issue_comment:
types: [created, edited]
# Making the workflow testable since `issue_comment` only triggers on
# the default branch
workflow_dispatch:
inputs:
comment_body:
description: 'Comment Body'
required: true
type: string
jobs:
supersetbot:
runs-on: ubuntu-24.04
if: >
github.event_name == 'workflow_dispatch' ||
(github.event_name == 'issue_comment' && contains(github.event.comment.body, '@supersetbot'))
permissions:
contents: read
pull-requests: write
issues: write
steps:
- name: Quickly add thumbs up!
if: github.event_name == 'issue_comment' && contains(github.event.comment.body, '@supersetbot')
uses: actions/github-script@3a2844b7e9c422d3c10d287c895573f7108da1b3 # v9.0.0
with:
script: |
const [owner, repo] = process.env.GITHUB_REPOSITORY.split('/')
await github.rest.reactions.createForIssueComment({
owner,
repo,
comment_id: context.payload.comment.id,
content: '+1'
});
- name: "Checkout ( ${{ github.sha }} )"
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6
with:
persist-credentials: false
- name: Setup supersetbot
uses: ./.github/actions/setup-supersetbot/
- name: Execute custom Node.js script
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
GITHUB_ACTOR: ${{ github.actor }}
GITHUB_REPOSITORY: ${{ github.repository }}
GITHUB_ISSUE_NUMBER: ${{ github.event.issue.number }}
COMMENT_BODY: ${{ github.event.comment.body }}${{ github.event.inputs.comment_body }}
run: |
supersetbot run "$COMMENT_BODY"

@@ -1,140 +0,0 @@
name: Publish a Release
on:
release:
types: [published, edited]
# Can be triggered manually
workflow_dispatch:
inputs:
release:
required: true
description: The version to generate
git-ref:
required: true
description: The git reference to checkout prior to running the docker build
force-latest:
required: true
type: choice
default: 'false'
description: Whether to force a latest tag on the release
options:
- 'true'
- 'false'
jobs:
config:
runs-on: ubuntu-24.04
outputs:
has-secrets: ${{ steps.check.outputs.has-secrets }}
steps:
- name: "Check for secrets"
id: check
shell: bash
run: |
if [ -n "${DOCKERHUB_USER}" ]; then
echo "has-secrets=1" >> "$GITHUB_OUTPUT"
fi
env:
DOCKERHUB_USER: ${{ (secrets.DOCKERHUB_USER != '' && secrets.DOCKERHUB_TOKEN != '') || '' }}
docker-release:
needs: config
if: needs.config.outputs.has-secrets
name: docker-release
runs-on: ubuntu-24.04
strategy:
matrix:
build_preset: ["dev", "lean", "py310", "websocket", "dockerize", "py311", "py312"]
fail-fast: false
steps:
- name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6
with:
fetch-depth: 0
- name: Setup Docker Environment
uses: ./.github/actions/setup-docker
with:
dockerhub-user: ${{ secrets.DOCKERHUB_USER }}
dockerhub-token: ${{ secrets.DOCKERHUB_TOKEN }}
install-docker-compose: "false"
build: "true"
- name: Use Node.js 20
uses: actions/setup-node@48b55a011bda9f5d6aeb4c2d9c7362e8dae4041e # v6
with:
node-version: 20
- name: Setup supersetbot
uses: ./.github/actions/setup-supersetbot/
- name: Execute custom Node.js script
env:
DOCKERHUB_USER: ${{ secrets.DOCKERHUB_USER }}
DOCKERHUB_TOKEN: ${{ secrets.DOCKERHUB_TOKEN }}
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
INPUT_RELEASE: ${{ github.event.inputs.release }}
INPUT_FORCE_LATEST: ${{ github.event.inputs.force-latest }}
INPUT_GIT_REF: ${{ github.event.inputs.git-ref }}
run: |
RELEASE="${{ github.event.release.tag_name }}"
FORCE_LATEST=""
EVENT="${{github.event_name}}"
if [ "${{ github.event_name }}" = "workflow_dispatch" ]; then
# in the case of a manually-triggered run, read release from input
RELEASE="${INPUT_RELEASE}"
if [ "${INPUT_FORCE_LATEST}" = "true" ]; then
FORCE_LATEST="--force-latest"
fi
git checkout "${INPUT_GIT_REF}"
EVENT="release"
fi
supersetbot docker \
--push \
--preset ${{ matrix.build_preset }} \
--context "$EVENT" \
--context-ref "$RELEASE" $FORCE_LATEST \
--platform "linux/arm64" \
--platform "linux/amd64"
# Returning to master to support closing setup-supersetbot
git checkout master
update-prs-with-release-info:
needs: config
if: needs.config.outputs.has-secrets
runs-on: ubuntu-24.04
permissions:
contents: read
pull-requests: write
steps:
- name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6
with:
fetch-depth: 0
- name: Use Node.js 20
uses: actions/setup-node@48b55a011bda9f5d6aeb4c2d9c7362e8dae4041e # v6
with:
node-version: 20
- name: Setup supersetbot
uses: ./.github/actions/setup-supersetbot/
- name: Label the PRs with the right release-related labels
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
INPUT_RELEASE: ${{ github.event.inputs.release }}
run: |
export GITHUB_ACTOR=""
git fetch --all --tags
git checkout master
RELEASE="${{ github.event.release.tag_name }}"
if [ "${{ github.event_name }}" = "workflow_dispatch" ]; then
# in the case of a manually-triggered run, read release from input
RELEASE="${INPUT_RELEASE}"
fi
supersetbot release-label $RELEASE

@@ -1,52 +0,0 @@
name: Upload Technical Debt Metrics to Google Sheets
on:
push:
branches:
- master
- "[0-9].[0-9]*"
permissions:
contents: read
jobs:
config:
runs-on: ubuntu-24.04
outputs:
has-secrets: ${{ steps.check.outputs.has-secrets }}
steps:
- name: "Check for secrets"
id: check
shell: bash
run: |
if [ -n "${GSHEET_KEY}" ]; then
echo "has-secrets=1" >> "$GITHUB_OUTPUT"
fi
env:
GSHEET_KEY: ${{ (secrets.GSHEET_KEY != '' ) || '' }}
process-and-upload:
needs: config
if: needs.config.outputs.has-secrets
runs-on: ubuntu-24.04
name: Generate Reports
steps:
- name: Checkout Repository
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6
- name: Set up Node.js
uses: actions/setup-node@48b55a011bda9f5d6aeb4c2d9c7362e8dae4041e # v6
with:
node-version-file: './superset-frontend/.nvmrc'
- name: Install Dependencies
run: npm ci
working-directory: ./superset-frontend
- name: Run Script
env:
SPREADSHEET_ID: "1oABNnzxJYzwUrHjr_c9wfYEq9dFL1ScVof9LlaAdxvo"
SERVICE_ACCOUNT_KEY: ${{ secrets.GSHEET_KEY }}
run: npm run lint-stats
continue-on-error: true
working-directory: ./superset-frontend

@@ -6,17 +6,20 @@ on:
jobs:
welcome:
runs-on: ubuntu-24.04
runs-on: ubuntu-latest
permissions:
pull-requests: write
issues: write
steps:
- name: Welcome Message
uses: actions/first-interaction@v3
continue-on-error: true
uses: actions/first-interaction@v1.0.0
with:
repo-token: ${{ github.token }}
repo-token: ${{ secrets.GITHUB_TOKEN }}
pr-message: |-
Congrats on making your first PR and thank you for contributing to Superset! :tada: :heart:
We hope to see you in our [Slack](https://apache-superset.slack.com/) community too! Not signed up? Use our [Slack App](http://bit.ly/join-superset-slack) to self-register.
We hope to see you in our [Slack](https://apache-superset.slack.com/) community too!
- name: First Time Label
uses: andymckay/labeler@master
with:
add-labels: "new:contributor"
repo-token: ${{ secrets.GITHUB_TOKEN }}

.gitignore (vendored, 47 changes)
@@ -21,7 +21,6 @@
*.swp
__pycache__
.aider*
.local
.cache
.bento*
@@ -33,7 +32,6 @@ cover
.env
.envrc
.idea
.roo
.mypy_cache
.python-version
.tox
@@ -44,7 +42,7 @@ _modules
_static
build
app.db
*.egg-info/
apache_superset.egg-info/
changelog.sh
dist
dump.rdb
@@ -52,6 +50,7 @@ env
venv*
env_py3
envpy3
env36
local_config.py
/superset_config.py
/superset_text.yml
@@ -60,20 +59,17 @@ superset/bin/supersetc
tmp
rat-results.txt
superset/app/
superset-websocket/config.json
.direnv
*.log
# Node.js, webpack artifacts, storybook
*.entry.js
*.js.map
node_modules
npm-debug.log*
superset/static/*
superset/static/assets/*
!superset/static/assets/.gitkeep
superset/static/uploads/*
!superset/static/uploads/.gitkeep
superset/static/assets
superset/static/version_info.json
superset-frontend/**/esm/*
superset-frontend/**/lib/*
superset-frontend/**/storybook-static/*
yarn-error.log
*.map
*.min.js
@@ -91,7 +87,6 @@ scripts/*.zip
# IntelliJ
*.iml
venv
.venv
@eaDir/
# PyCharm
@@ -107,35 +102,11 @@ ghostdriver.log
testCSV.csv
.terser-plugin-cache/
apache-superset-*.tar.gz*
apache_superset-*.tar.gz*
release.json
# Translation-related files
# these json files are generated by ./scripts/po2json.sh
superset/translations/**/messages.json
# these mo binary files are generated by `pybabel compile`
superset/translations/**/messages.mo
# Translation binaries
messages.mo
docker/requirements-local.txt
cache/
docker/*local*
docker/superset-websocket/config.json
docker-compose.override.yml
.temp_cache
# Jest test report
test-report.html
superset/static/stats/statistics.html
# LLM-related
CLAUDE.local.md
PROJECT.md
.aider*
.claude_rc*
.claude/settings.local.json
.env.local
oxc-custom-build/
*.code-workspace
*.duckdb

@@ -1,4 +0,0 @@
{
"no-bare-urls": false,
"line-length": false
}

@@ -15,146 +15,45 @@
# limitations under the License.
#
repos:
- repo: https://github.com/MarcoGorelli/auto-walrus
rev: 0.3.4
- repo: https://github.com/PyCQA/isort
rev: 5.9.3
hooks:
- id: auto-walrus
- id: isort
- repo: https://github.com/pre-commit/mirrors-mypy
rev: v1.15.0
rev: v0.941
hooks:
- id: mypy
name: mypy (main)
args: [--check-untyped-defs]
exclude: ^superset-extensions-cli/
additional_dependencies: [
types-cachetools,
types-simplejson,
types-python-dateutil,
types-requests,
# types-redis 4.6.0.5 is failing mypy
# because of https://github.com/python/typeshed/pull/10531
types-redis==4.6.0.4,
types-pytz,
types-croniter,
types-PyYAML,
types-setuptools,
types-paramiko,
types-Markdown,
]
- id: mypy
name: mypy (superset-extensions-cli)
args: [--check-untyped-defs]
files: ^superset-extensions-cli/
additional_dependencies: [types-all]
- repo: https://github.com/peterdemin/pip-compile-multi
rev: v2.4.1
hooks:
- id: pip-compile-multi-verify
- repo: https://github.com/pre-commit/pre-commit-hooks
rev: v5.0.0
rev: v3.2.0
hooks:
- id: check-docstring-first
- id: check-added-large-files
exclude: ^.*\.(geojson)$|^docs/static/img/screenshots/.*|^superset-frontend/CHANGELOG\.md$|^superset/examples/.*/data\.parquet$
exclude: \.(geojson)$
- id: check-yaml
exclude: ^helm/superset/templates/
- id: debug-statements
- id: end-of-file-fixer
exclude: .*/lerna\.json$|^docs/static/img/logos/
- id: trailing-whitespace
exclude: ^.*\.(snap)
args: ["--markdown-linebreak-ext=md"]
- repo: local
- repo: https://github.com/psf/black
rev: 22.3.0
hooks:
- id: prettier-frontend
name: prettier (frontend)
entry: bash -c 'cd superset-frontend && for file in "$@"; do npx prettier --write "${file#superset-frontend/}"; done'
language: system
pass_filenames: true
files: ^superset-frontend/.*\.(js|jsx|ts|tsx|css|scss|sass|json)$
- repo: local
- id: black
language_version: python3
- repo: https://github.com/pre-commit/mirrors-prettier
rev: v2.4.1 # Use the sha or tag you want to point at
hooks:
- id: oxlint-frontend
name: oxlint (frontend)
entry: ./scripts/oxlint.sh
language: system
pass_filenames: true
files: ^superset-frontend/.*\.(js|jsx|ts|tsx)$
- id: custom-rules-frontend
name: custom rules (frontend)
entry: ./scripts/check-custom-rules.sh
language: system
pass_filenames: true
files: ^superset-frontend/.*\.(js|jsx|ts|tsx)$
- id: eslint-docs
name: eslint (docs)
entry: bash -c 'cd docs && FILES=$(printf "%s\n" "$@" | sed "s|^docs/||" | tr "\n" " ") && yarn eslint --fix --quiet $FILES'
language: system
pass_filenames: true
files: ^docs/.*\.(js|jsx|ts|tsx)$
- id: type-checking-frontend
name: Type-Checking (Frontend)
entry: ./scripts/check-type.js package=superset-frontend excludeDeclarationDir=cypress-base
language: system
files: ^superset-frontend\/.*\.(js|jsx|ts|tsx)$
exclude: ^superset-frontend/cypress-base\/
require_serial: true
- id: prettier
args: ['--ignore-path=./superset-frontend/.prettierignore']
files: 'superset-frontend'
# blacklist unsafe functions like make_url (see #19526)
- repo: https://github.com/skorokithakis/blacklist-pre-commit-hook
rev: e2f070289d8eddcaec0b580d3bde29437e7c8221
hooks:
- id: blacklist
args: ["--blacklisted-names=make_url", "--ignore=tests/"]
- repo: https://github.com/norwoodj/helm-docs
rev: v1.14.2
hooks:
- id: helm-docs
files: helm
verbose: false
args: ["--log-level", "error"]
# Using local hooks ensures ruff version matches requirements/development.txt
- repo: local
hooks:
- id: ruff-format
name: ruff-format
entry: ruff format
language: system
types: [python]
- id: ruff
name: ruff
entry: ruff check --fix --show-fixes
language: system
types: [python]
- repo: local
hooks:
- id: pylint
name: pylint with custom Superset plugins
entry: bash
language: system
types: [python]
exclude: ^(tests/|superset/migrations/|scripts/|RELEASING/|docker/)
args:
- -c
- |
TARGET_BRANCH=${GITHUB_BASE_REF:-master}
# Only fetch if we're not in CI (CI already has all refs)
if [ -z "$CI" ]; then
git fetch --no-recurse-submodules origin "$TARGET_BRANCH" 2>/dev/null || true
fi
BASE=$(git merge-base origin/"$TARGET_BRANCH" HEAD 2>/dev/null) || BASE="HEAD"
files=$(git diff --name-only --diff-filter=ACM "$BASE"..HEAD 2>/dev/null | grep '^superset/.*\.py$' || true)
if [ -n "$files" ]; then
pylint --rcfile=.pylintrc --load-plugins=superset.extensions.pylint --reports=no $files
else
echo "No Python files to lint."
fi
- id: db-engine-spec-metadata
name: database engine spec metadata validation
entry: python superset/db_engine_specs/lint_metadata.py --strict
language: system
files: ^superset/db_engine_specs/.*\.py$
exclude: ^superset/db_engine_specs/(base|lib|lint_metadata|__init__)\.py$
pass_filenames: false
- repo: local
hooks:
- id: feature-flags-sync
name: feature flags documentation sync
entry: bash -c 'python scripts/extract_feature_flags.py > docs/static/feature-flags.json.tmp && if ! diff -q docs/static/feature-flags.json docs/static/feature-flags.json.tmp > /dev/null 2>&1; then mv docs/static/feature-flags.json.tmp docs/static/feature-flags.json && echo "Updated docs/static/feature-flags.json" && exit 1; else rm docs/static/feature-flags.json.tmp; fi'
language: system
files: ^superset/config\.py$
pass_filenames: false

@@ -36,7 +36,7 @@ persistent=yes
# List of plugins (as comma separated values of python modules names) to load,
# usually to register additional checkers.
load-plugins=superset.extensions.pylint
load-plugins=
# Use multiple processes to speed up Pylint.
jobs=2
@@ -50,11 +50,44 @@ unsafe-load-any-extension=no
# run arbitrary code
extension-pkg-whitelist=pyarrow
# Allow optimization of some AST trees. This will activate a peephole AST
# optimizer, which will apply various small optimizations. For instance, it can
# be used to obtain the result of joining multiple strings with the addition
# operator. Joining a lot of strings can lead to a maximum recursion error in
# Pylint and this flag can prevent that. It has one side effect, the resulting
# AST will be different than the one from reality. This option is deprecated
# and it will be removed in Pylint 2.0.
optimize-ast=no
[MESSAGES CONTROL]
disable=all
enable=json-import,disallowed-sql-import,consider-using-transaction
# Only show warnings with the listed confidence levels. Leave empty to show
# all. Valid levels: HIGH, INFERENCE, INFERENCE_FAILURE, UNDEFINED
confidence=
# Enable the message, report, category or checker with the given id(s). You can
# either give multiple identifier separated by comma (,) or put this option
# multiple time (only on the command line, not in the configuration file where
# it should appear only once). See also the "--disable" option for examples.
enable=
useless-suppression,
# Disable the message, report, category or checker with the given id(s). You
# can either give multiple identifiers separated by comma (,) or put this
# option multiple times (only on the command line, not in the configuration
# file where it should appear only once).You can also use "--disable=all" to
# disable everything first and then reenable specific checks. For example, if
# you want to run only the similarities checker, you can use "--disable=all
# --enable=similarities". If you want to run only the classes checker, but have
# no Warning level messages displayed, use"--disable=all --enable=classes
# --disable=W"
disable=
missing-docstring,
duplicate-code,
unspecified-encoding,
# re-enable once this no longer raises false positives
too-many-instance-attributes
[REPORTS]
@@ -63,6 +96,12 @@ enable=json-import,disallowed-sql-import,consider-using-transaction
# mypackage.mymodule.MyReporterClass.
output-format=text
# Put messages in a separate file for each module / package specified on the
# command line instead of printing them on stdout. Reports (if any) will be
# written in a file name "pylint_global.[txt|html]". This option is deprecated
# and it will be removed in Pylint 2.0.
files-output=no
# Tells whether to display a full report or only the messages
reports=yes
@@ -84,7 +123,7 @@ evaluation=10.0 - ((float(5 * error + warning + refactor + convention) / stateme
good-names=_,df,ex,f,i,id,j,k,l,o,pk,Run,ts,v,x,y
# Bad variable names which should always be refused, separated by a comma
bad-names=bar,baz,db,fd,foo,sesh,session,tata,toto,tutu
bad-names=fd,foo,bar,baz,toto,tutu,tata
# Colon-delimited sets of names that determine each other's naming style when
# the name regexes allow several styles.
@@ -95,40 +134,68 @@ include-naming-hint=no
# List of decorators that produce properties, such as abc.abstractproperty. Add
# to this list to register other decorators that produce valid properties.
property-classes=
abc.abstractproperty,
sqlalchemy.ext.hybrid.hybrid_property
property-classes=abc.abstractproperty
# Regular expression matching correct argument names
argument-rgx=[a-z_][a-z0-9_]{2,30}$
# Naming hint for argument names
argument-name-hint=[a-z_][a-z0-9_]{2,30}$
# Regular expression matching correct method names
method-rgx=[a-z_][a-z0-9_]{2,30}$
# Naming hint for method names
method-name-hint=[a-z_][a-z0-9_]{2,30}$
# Regular expression matching correct variable names
variable-rgx=[a-z_][a-z0-9_]{1,30}$
# Naming hint for variable names
variable-name-hint=[a-z_][a-z0-9_]{2,30}$
# Regular expression matching correct inline iteration names
inlinevar-rgx=[A-Za-z_][A-Za-z0-9_]*$
# Naming hint for inline iteration names
inlinevar-name-hint=[A-Za-z_][A-Za-z0-9_]*$
# Regular expression matching correct constant names
const-rgx=(([A-Za-z_][A-Za-z0-9_]*)|(__.*__))$
# Naming hint for constant names
const-name-hint=(([A-Z_][A-Z0-9_]*)|(__.*__))$
# Regular expression matching correct class names
class-rgx=[A-Z_][a-zA-Z0-9]+$
# Naming hint for class names
class-name-hint=[A-Z_][a-zA-Z0-9]+$
# Regular expression matching correct class attribute names
class-attribute-rgx=([A-Za-z_][A-Za-z0-9_]{2,30}|(__.*__))$
# Naming hint for class attribute names
class-attribute-name-hint=([A-Za-z_][A-Za-z0-9_]{2,30}|(__.*__))$
# Regular expression matching correct module names
module-rgx=(([a-z_][a-z0-9_]*)|([A-Z][a-zA-Z0-9]+))$
# Naming hint for module names
module-name-hint=(([a-z_][a-z0-9_]*)|([A-Z][a-zA-Z0-9]+))$
# Regular expression matching correct attribute names
attr-rgx=[a-z_][a-z0-9_]{2,30}$
# Naming hint for attribute names
attr-name-hint=[a-z_][a-z0-9_]{2,30}$
# Regular expression matching correct function names
function-rgx=[a-z_][a-z0-9_]{2,30}$
# Naming hint for function names
function-name-hint=[a-z_][a-z0-9_]{2,30}$
# Regular expression which should only match function or class names that do
# not require a docstring.
no-docstring-rgx=^_
@@ -147,7 +214,7 @@ max-nested-blocks=5
[FORMAT]
# Maximum number of characters on a single line.
max-line-length=100
max-line-length=90
# Regexp for a line that is allowed to be longer than the limit.
ignore-long-lines=^\s*(# )?<?https?://\S+>?$
@@ -156,6 +223,12 @@ ignore-long-lines=^\s*(# )?<?https?://\S+>?$
# else.
single-line-if-stmt=no
# List of optional constructs for which whitespace checking is disabled. `dict-
# separator` is used to allow tabulation in dicts, etc.: {1 : 1,\n222: 2}.
# `trailing-comma` allows a space between comma and closing bracket: (a, ).
# `empty-line` allows space-only lines.
no-space-check=trailing-comma,dict-separator
# Maximum number of lines in a module
max-module-lines=1000
@@ -225,12 +298,12 @@ ignore-mixin-members=yes
# (useful for modules/projects where namespaces are manipulated during runtime
# and thus existing member attributes cannot be deduced by static analysis. It
# supports qualified module names, as well as Unix pattern matching.
ignored-modules=numpy,pandas,alembic.op,sqlalchemy,alembic.context,flask_appbuilder.security.sqla.PermissionView.role,flask_appbuilder.Model.metadata,flask_appbuilder.Base.metadata
ignored-modules=numpy,pandas,alembic.op,sqlalchemy,alembic.context,flask_appbuilder.security.sqla.PermissionView.role,flask_appbuilder.Model.metadata,flask_appbuilder.Base.metadata,distutils
# List of class names for which member attributes should not be checked (useful
# for classes with dynamically set attributes). This supports the use of
# qualified names.
ignored-classes=contextlib.closing,optparse.Values,thread._local,_thread._local
ignored-classes=contextlib.closing,optparse.Values,thread._local,_thread._local,sqlalchemy.orm.scoping.scoped_session
# List of members which are set dynamically and missed by pylint inference
# system, and so shouldn't trigger E1101 when accessed. Python regular
@@ -352,4 +425,4 @@ analyse-fallback-blocks=no
# Exceptions that will emit a warning when being caught. Defaults to
# "Exception"
overgeneral-exceptions=builtins.Exception
overgeneral-exceptions=Exception

Some files were not shown because too many files have changed in this diff Show More