Mirror of https://github.com/apache/superset.git, synced 2026-05-02 14:34:22 +00:00.

Compare commits: `playwright...fix-docker` (780 commits)
(Commit comparison table omitted: only bare SHA1 values survived extraction; the Author, Date, and commit-message cells were empty. The listing covers 780 commits, from `fa1546a3a5` at the top of the list to `d09421230b` at the bottom.)
```diff
@@ -83,6 +83,7 @@ github:
       - cypress-matrix (5, chrome)
       - dependency-review
       - frontend-build
+      - playwright-tests (chromium)
       - pre-commit (current)
       - pre-commit (previous)
       - test-mysql
```
`.claude/settings.json` (new file, 15 lines)

```json
{
  "hooks": {
    "PreToolUse": [
      {
        "matcher": "Bash",
        "hooks": [
          {
            "type": "command",
            "command": "jq -r '.tool_input.command // \"\"' | grep -qE '^git commit' && cd \"$CLAUDE_PROJECT_DIR\" && echo '🔍 Running pre-commit before commit...' && pre-commit run || true"
          }
        ]
      }
    ]
  }
}
```
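The hook command above reads the tool-call JSON that Claude Code passes on stdin, extracts `.tool_input.command` with `jq` (falling back to an empty string), and only fires when the command starts with `git commit`. A minimal standalone sketch of that filter, using a hypothetical sample payload in place of the real stdin:

```shell
# Hypothetical stand-in for the JSON Claude Code feeds the hook on stdin.
payload='{"tool_input":{"command":"git commit -m msg"}}'

# Same filter as the hook: pull out .tool_input.command (empty string if the
# key is absent) and test whether it begins with "git commit".
if printf '%s' "$payload" | jq -r '.tool_input.command // ""' | grep -qE '^git commit'; then
  echo "would run pre-commit"
else
  echo "pass through"
fi
```

The `// ""` alternative operator keeps the pipeline from printing `null` for tool calls that carry no command, so the `grep` simply fails to match and the hook stays quiet.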
```diff
@@ -3,14 +3,3 @@
 For complete documentation on using GitHub Codespaces with Apache Superset, please see:
 
 **[Setting up a Development Environment - GitHub Codespaces](https://superset.apache.org/docs/contributing/development#github-codespaces-cloud-development)**
-
-## Pre-installed Development Environment
-
-When you create a new Codespace from this repository, it automatically:
-
-1. **Creates a Python virtual environment** using `uv venv`
-2. **Installs all development dependencies** via `uv pip install -r requirements/development.txt`
-3. **Sets up pre-commit hooks** with `pre-commit install`
-4. **Activates the virtual environment** automatically in all terminals
-
-The virtual environment is located at `/workspaces/{repository-name}/.venv` and is automatically activated through environment variables set in the devcontainer configuration.
```
`.devcontainer/default/devcontainer.json` (new file, 19 lines)

```jsonc
{
  // Extend the base configuration
  "extends": "../devcontainer-base.json",

  "name": "Apache Superset Development (Default)",

  // Forward ports for development
  "forwardPorts": [9001],
  "portsAttributes": {
    "9001": {
      "label": "Superset (via Webpack Dev Server)",
      "onAutoForward": "notify",
      "visibility": "public"
    }
  },

  // Auto-start Superset on Codespace resume
  "postStartCommand": ".devcontainer/start-superset.sh"
}
```
`.devcontainer/devcontainer-base.json` (new file, 39 lines)

```jsonc
{
  "name": "Apache Superset Development",
  // Keep this in sync with the base image in Dockerfile (ARG PY_VER)
  // Using the same base as Dockerfile, but non-slim for dev tools
  "image": "python:3.11.13-bookworm",

  "features": {
    "ghcr.io/devcontainers/features/docker-in-docker:2": {
      "moby": true,
      "dockerDashComposeVersion": "v2"
    },
    "ghcr.io/devcontainers/features/node:1": {
      "version": "20"
    },
    "ghcr.io/devcontainers/features/git:1": {},
    "ghcr.io/devcontainers/features/common-utils:2": {
      "configureZshAsDefaultShell": true
    },
    "ghcr.io/devcontainers/features/sshd:1": {
      "version": "latest"
    }
  },

  // Run commands after container is created
  "postCreateCommand": "chmod +x .devcontainer/setup-dev.sh && .devcontainer/setup-dev.sh",

  // VS Code customizations
  "customizations": {
    "vscode": {
      "extensions": [
        "ms-python.python",
        "ms-python.vscode-pylance",
        "charliermarsh.ruff",
        "dbaeumer.vscode-eslint",
        "esbenp.prettier-vscode"
      ]
    }
  }
}
```
(Mixed hunk on the setup script; the `+`/`-` markers were not captured, so lines appear in scrape order.)

```
@@ -3,76 +3,30 @@

echo "🔧 Setting up Superset development environment..."

# System dependencies and uv are now pre-installed in the Docker image
# This speeds up Codespace creation significantly!
# The universal image has most tools, just need Superset-specific libs
echo "📦 Installing Superset-specific dependencies..."
sudo apt-get update
sudo apt-get install -y \
    libsasl2-dev \
    libldap2-dev \
    libpq-dev \
    tmux \
    gh

# Create virtual environment using uv
echo "🐍 Creating Python virtual environment..."
if ! uv venv; then
    echo "❌ Failed to create virtual environment"
    exit 1
fi
# Install uv for fast Python package management
echo "📦 Installing uv..."
curl -LsSf https://astral.sh/uv/install.sh | sh

# Install Python dependencies
echo "📦 Installing Python dependencies..."
if ! uv pip install -r requirements/development.txt; then
    echo "❌ Failed to install Python dependencies"
    echo "💡 You may need to run this manually after the Codespace starts"
    exit 1
fi

# Install pre-commit hooks
echo "🪝 Installing pre-commit hooks..."
if source .venv/bin/activate && pre-commit install; then
    echo "✅ Pre-commit hooks installed"
else
    echo "⚠️ Pre-commit hooks installation failed (non-critical)"
fi
# Add cargo/bin to PATH for uv
echo 'export PATH="$HOME/.cargo/bin:$PATH"' >> ~/.bashrc
echo 'export PATH="$HOME/.cargo/bin:$PATH"' >> ~/.zshrc

# Install Claude Code CLI via npm
echo "🤖 Installing Claude Code..."
if npm install -g @anthropic-ai/claude-code; then
    echo "✅ Claude Code installed"
else
    echo "⚠️ Claude Code installation failed (non-critical)"
fi
npm install -g @anthropic-ai/claude-code

# Make the start script executable
chmod +x .devcontainer/start-superset.sh

# Add bashrc additions for automatic venv activation
echo "🔧 Setting up automatic environment activation..."
if [ -f ~/.bashrc ]; then
    # Check if we've already added our additions
    if ! grep -q "Superset Codespaces environment setup" ~/.bashrc; then
        echo "" >> ~/.bashrc
        cat .devcontainer/bashrc-additions >> ~/.bashrc
        echo "✅ Added automatic venv activation to ~/.bashrc"
    else
        echo "✅ Bashrc additions already present"
    fi
else
    # Create bashrc if it doesn't exist
    cat .devcontainer/bashrc-additions > ~/.bashrc
    echo "✅ Created ~/.bashrc with automatic venv activation"
fi

# Also add to zshrc since that's the default shell
if [ -f ~/.zshrc ] || [ -n "$ZSH_VERSION" ]; then
    if ! grep -q "Superset Codespaces environment setup" ~/.zshrc; then
        echo "" >> ~/.zshrc
        cat .devcontainer/bashrc-additions >> ~/.zshrc
        echo "✅ Added automatic venv activation to ~/.zshrc"
    fi
fi

echo "✅ Development environment setup complete!"
echo ""
echo "📝 The virtual environment will be automatically activated in new terminals"
echo ""
echo "🔄 To activate in this terminal, run:"
echo "  source ~/.bashrc"
echo ""
echo "🚀 To start Superset:"
echo "  start-superset"
echo ""
echo "🚀 Run '.devcontainer/start-superset.sh' to start Superset"
```
@@ -1,14 +1,14 @@
|
||||
#!/bin/bash
|
||||
# Startup script for Superset in Codespaces
|
||||
|
||||
# Log to a file for debugging
|
||||
LOG_FILE="/tmp/superset-startup.log"
|
||||
echo "[$(date)] Starting Superset startup script" >> "$LOG_FILE"
|
||||
echo "[$(date)] User: $(whoami), PWD: $(pwd)" >> "$LOG_FILE"
|
||||
|
||||
echo "🚀 Starting Superset in Codespaces..."
|
||||
echo "🌐 Frontend will be available at port 9001"
|
||||
|
||||
# Check if MCP is enabled
|
||||
if [ "$ENABLE_MCP" = "true" ]; then
|
||||
echo "🤖 MCP Service will be available at port 5008"
|
||||
fi
|
||||
|
||||
# Find the workspace directory (Codespaces clones as 'superset', not 'superset-2')
|
||||
WORKSPACE_DIR=$(find /workspaces -maxdepth 1 -name "superset*" -type d | head -1)
|
||||
if [ -n "$WORKSPACE_DIR" ]; then
|
||||
@@ -18,71 +18,32 @@ else
|
||||
echo "📁 Using current directory: $(pwd)"
|
||||
fi
|
||||
|
||||
# Wait for Docker to be available
|
||||
echo "⏳ Waiting for Docker to start..."
|
||||
echo "[$(date)] Waiting for Docker..." >> "$LOG_FILE"
|
||||
max_attempts=30
|
||||
attempt=0
|
||||
while ! docker info > /dev/null 2>&1; do
|
||||
if [ $attempt -eq $max_attempts ]; then
|
||||
echo "❌ Docker failed to start after $max_attempts attempts"
|
||||
echo "[$(date)] Docker failed to start after $max_attempts attempts" >> "$LOG_FILE"
|
||||
echo "🔄 Please restart the Codespace or run this script manually later"
|
||||
exit 1
|
||||
fi
|
||||
echo " Attempt $((attempt + 1))/$max_attempts..."
|
||||
echo "[$(date)] Docker check attempt $((attempt + 1))/$max_attempts" >> "$LOG_FILE"
|
||||
sleep 2
|
||||
attempt=$((attempt + 1))
|
||||
done
|
||||
echo "✅ Docker is ready!"
|
||||
echo "[$(date)] Docker is ready" >> "$LOG_FILE"
|
||||
|
||||
# Check if Superset containers are already running
|
||||
if docker ps | grep -q "superset"; then
|
||||
echo "✅ Superset containers are already running!"
|
||||
echo ""
|
||||
echo "🌐 To access Superset:"
|
||||
echo " 1. Click the 'Ports' tab at the bottom of VS Code"
|
||||
echo " 2. Find port 9001 and click the globe icon to open"
|
||||
echo " 3. Wait 10-20 minutes for initial startup"
|
||||
echo ""
|
||||
echo "📝 Login credentials: admin/admin"
|
||||
exit 0
|
||||
# Check if docker is running
|
||||
if ! docker info > /dev/null 2>&1; then
|
||||
echo "⏳ Waiting for Docker to start..."
|
||||
sleep 5
|
||||
fi
|
||||
|
||||
# Clean up any existing containers
|
||||
echo "🧹 Cleaning up existing containers..."
|
||||
docker-compose -f docker-compose-light.yml down
|
||||
docker-compose -f docker-compose-light.yml --profile mcp down
|
||||
|
||||
# Start services
|
||||
echo "🏗️ Starting Superset in background (daemon mode)..."
|
||||
echo "🏗️ Building and starting services..."
|
||||
echo ""
|
||||
echo "📝 Once started, login with:"
|
||||
echo " Username: admin"
|
||||
echo " Password: admin"
|
||||
echo ""
|
||||
echo "📋 Running in foreground with live logs (Ctrl+C to stop)..."
|
||||
|
||||
# Start in detached mode
|
||||
docker-compose -f docker-compose-light.yml up -d
|
||||
|
||||
echo ""
|
||||
echo "✅ Docker Compose started successfully!"
|
||||
echo ""
|
||||
echo "📋 Important information:"
|
||||
echo "━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━"
|
||||
echo "⏱️ Initial startup takes 10-20 minutes"
|
||||
echo "🌐 Check the 'Ports' tab for your Superset URL (port 9001)"
|
||||
echo "👤 Login: admin / admin"
|
||||
echo ""
|
||||
echo "📊 Useful commands:"
|
||||
echo " docker-compose -f docker-compose-light.yml logs -f # Follow logs"
|
||||
echo " docker-compose -f docker-compose-light.yml ps # Check status"
|
||||
echo " docker-compose -f docker-compose-light.yml down # Stop services"
|
||||
echo "━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━"
|
||||
echo ""
|
||||
echo "💤 Keeping terminal open for 60 seconds to test persistence..."
|
||||
sleep 60
|
||||
echo "✅ Test complete - check if this terminal is still visible!"
|
||||
|
||||
# Show final status
|
||||
docker-compose -f docker-compose-light.yml ps
|
||||
# Run docker-compose and capture exit code
|
||||
if [ "$ENABLE_MCP" = "true" ]; then
|
||||
echo "🤖 Starting with MCP Service enabled..."
|
||||
docker-compose -f docker-compose-light.yml --profile mcp up
|
||||
else
|
||||
docker-compose -f docker-compose-light.yml up
|
||||
fi
|
||||
EXIT_CODE=$?
|
||||
|
||||
# If it failed, provide helpful instructions
|
||||
|
||||
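The startup script's Docker wait loop above polls `docker info` up to `max_attempts` times before giving up. A minimal sketch of the same bounded-retry pattern, with a hypothetical `check_ready` stub in place of the Docker probe (and no `sleep`) so it runs anywhere:

```shell
# Hypothetical readiness probe: pretend the service comes up on attempt 3.
check_ready() { [ "$1" -ge 3 ]; }

attempt=0
max_attempts=5
while ! check_ready "$attempt"; do
  attempt=$((attempt + 1))
  if [ "$attempt" -ge "$max_attempts" ]; then
    echo "gave up after $max_attempts attempts"
    exit 1
  fi
done
echo "ready after $attempt attempts"   # prints: ready after 3 attempts
```

The key property, shared with the real script, is that the loop always terminates: either the probe succeeds or the counter reaches the cap and the script exits nonzero instead of hanging forever.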
`.devcontainer/with-mcp/devcontainer.json` (new file, 29 lines)

```jsonc
{
  // Extend the base configuration
  "extends": "../devcontainer-base.json",

  "name": "Apache Superset Development with MCP",

  // Forward ports for development
  "forwardPorts": [9001, 5008],
  "portsAttributes": {
    "9001": {
      "label": "Superset (via Webpack Dev Server)",
      "onAutoForward": "notify",
      "visibility": "public"
    },
    "5008": {
      "label": "MCP Service (Model Context Protocol)",
      "onAutoForward": "notify",
      "visibility": "private"
    }
  },

  // Auto-start Superset with MCP on Codespace resume
  "postStartCommand": "ENABLE_MCP=true .devcontainer/start-superset.sh",

  // Environment variables
  "containerEnv": {
    "ENABLE_MCP": "true"
  }
}
```
`.envrc.example` (new file, 41 lines)

```shell
#
# Licensed to the Apache Software Foundation (ASF) under one or more
# contributor license agreements.  See the NOTICE file distributed with
# this work for additional information regarding copyright ownership.
# The ASF licenses this file to You under the Apache License, Version 2.0
# (the "License"); you may not use this file except in compliance with
# the License.  You may obtain a copy of the License at
#
#    http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#

# Auto-configure Docker Compose for multi-instance support
# Requires direnv: https://direnv.net/
#
# Install: brew install direnv (or apt install direnv)
# Setup:   Add 'eval "$(direnv hook bash)"' to ~/.bashrc (or ~/.zshrc)
# Allow:   Run 'direnv allow' in this directory once

# Generate unique project name from directory
export COMPOSE_PROJECT_NAME=$(basename "$PWD" | tr '[:upper:]' '[:lower:]' | sed 's/[^a-z0-9]/-/g')

# Find available ports sequentially to avoid collisions
_is_free() { ! lsof -i ":$1" &>/dev/null 2>&1; }

_p=80; while ! _is_free $_p; do ((_p++)); done; export NGINX_PORT=$_p
_p=8088; while ! _is_free $_p; do ((_p++)); done; export SUPERSET_PORT=$_p
_p=9000; while ! _is_free $_p; do ((_p++)); done; export NODE_PORT=$_p
_p=8080; while ! _is_free $_p || [ $_p -eq $NGINX_PORT ]; do ((_p++)); done; export WEBSOCKET_PORT=$_p
_p=8081; while ! _is_free $_p || [ $_p -eq $WEBSOCKET_PORT ]; do ((_p++)); done; export CYPRESS_PORT=$_p
_p=5432; while ! _is_free $_p; do ((_p++)); done; export DATABASE_PORT=$_p
_p=6379; while ! _is_free $_p; do ((_p++)); done; export REDIS_PORT=$_p

unset _p _is_free

echo "🐳 Superset configured: http://localhost:$SUPERSET_PORT (dev: localhost:$NODE_PORT)"
```
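The `.envrc` port logic above starts at a preferred port and walks upward until `lsof` reports it free, so several checkouts of the repo can run side by side without colliding. A standalone sketch of that walk-upward loop, with a hypothetical `_in_use` stub replacing the `lsof` probe so it runs without any services listening:

```shell
# Hypothetical stand-in for the lsof check: pretend ports 8088 and 8089
# are already taken and everything else is free.
_in_use() { [ "$1" -eq 8088 ] || [ "$1" -eq 8089 ]; }
_is_free() { ! _in_use "$1"; }

# Same pattern as .envrc.example: start at the preferred port, walk up.
_p=8088
while ! _is_free "$_p"; do _p=$((_p + 1)); done
export SUPERSET_PORT=$_p
echo "SUPERSET_PORT=$SUPERSET_PORT"   # prints: SUPERSET_PORT=8090
```

Note how the `WEBSOCKET_PORT` and `CYPRESS_PORT` loops in the real file add an extra `[ $_p -eq ... ]` clause: a port can be free on the host yet already claimed earlier in the same script, so the loop must also skip ports it handed out itself.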
`.github/CODEOWNERS` (vendored, 1 line added)

```
@@ -33,6 +33,7 @@

# Notify PMC members of changes to extension-related files

/docs/developer_portal/extensions/ @michael-s-molina @villebro @rusackas
/superset-core/ @michael-s-molina @villebro @geido @eschutho @rusackas @kgabryje
/superset-extensions-cli/ @michael-s-molina @villebro @geido @eschutho @rusackas @kgabryje
/superset/core/ @michael-s-molina @villebro @geido @eschutho @rusackas @kgabryje
```
`.github/ISSUE_TEMPLATE/bug-report.yml` (vendored, 2 lines changed)

```
@@ -41,8 +41,8 @@ body:
        label: Superset version
      options:
        - master / latest-dev
        - "6.0.0"
        - "5.0.0"
        - "4.1.3"
    validations:
      required: true
  - type: dropdown
```
```diff
@@ -10,7 +10,7 @@ jobs:
     steps:
       - name: Check if the PR is a draft
         id: check-draft
-        uses: actions/github-script@v6
+        uses: actions/github-script@v8
         with:
           script: |
             const isDraft = context.payload.pull_request.draft;
```
`.github/dependabot.yml` (vendored, 7 lines added)

```
@@ -12,10 +12,17 @@ updates:
      # not until React >= 18.0.0
      - dependency-name: "storybook"
      - dependency-name: "@storybook*"
      # remark-gfm v4+ requires react-markdown v9+, which needs React 18
      - dependency-name: "remark-gfm"
      - dependency-name: "react-markdown"
      # JSDOM v30 doesn't play well with Jest v30
      # Source: https://jestjs.io/blog#known-issues
      # GH thread: https://github.com/jsdom/jsdom/issues/3492
      - dependency-name: "jest-environment-jsdom"
      # `@swc/plugin-transform-imports` doesn't work with current Webpack-SWC hybrid setup
      # See https://github.com/apache/superset/pull/37384#issuecomment-3793991389
      # TODO: remove the plugin once Lodash usage has been migrated to a more readily tree-shakeable alternative
      - dependency-name: "@swc/plugin-transform-imports"
    directory: "/superset-frontend/"
    schedule:
      interval: "daily"
```
36  .github/workflows/bashlib.sh (vendored)
@@ -117,6 +117,19 @@ testdata() {
 say "::endgroup::"
 }
 
+playwright_testdata() {
+cd "$GITHUB_WORKSPACE"
+say "::group::Load all examples for Playwright tests"
+# must specify PYTHONPATH to make `tests.superset_test_config` importable
+export PYTHONPATH="$GITHUB_WORKSPACE"
+pip install -e .
+superset db upgrade
+superset load_test_users
+superset load_examples
+superset init
+say "::endgroup::"
+}
+
 celery-worker() {
 cd "$GITHUB_WORKSPACE"
 say "::group::Start Celery worker"
@@ -195,6 +208,7 @@ playwright-install() {
 
 playwright-run() {
 local APP_ROOT=$1
+local TEST_PATH=$2
 
 # Start Flask from the project root (same as Cypress)
 cd "$GITHUB_WORKSPACE"
@@ -238,8 +252,26 @@ playwright-run() {
 
 say "::group::Run Playwright tests"
 echo "Running Playwright with baseURL: ${PLAYWRIGHT_BASE_URL}"
-npx playwright test auth/login --reporter=github --output=playwright-results
-local status=$?
+if [ -n "$TEST_PATH" ]; then
+# Check if there are any test files in the specified path
+if ! find "playwright/tests/${TEST_PATH}" -name "*.spec.ts" -type f 2>/dev/null | grep -q .; then
+echo "No test files found in ${TEST_PATH} - skipping test run"
+say "::endgroup::"
+kill $flaskProcessId
+return 0
+fi
+echo "Running tests: ${TEST_PATH}"
+# Set INCLUDE_EXPERIMENTAL=true to allow experimental tests to run
+export INCLUDE_EXPERIMENTAL=true
+npx playwright test "${TEST_PATH}" --output=playwright-results
+local status=$?
+# Unset to prevent leaking into subsequent commands
+unset INCLUDE_EXPERIMENTAL
+else
+echo "Running all required tests (experimental/ excluded via playwright.config.ts)"
+npx playwright test --output=playwright-results
+local status=$?
+fi
 say "::endgroup::"
 
 # After job is done, print out Flask log for debugging
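The `find … | grep -q .` guard in `playwright-run` above is a common idiom for "does any file match?": `grep -q .` succeeds only if `find` emitted at least one line. A standalone sketch of the same check (directory and file names here are illustrative only):

```shell
#!/usr/bin/env bash
# Demonstrate the "any matching files?" guard used in playwright-run.
dir=$(mktemp -d)

# No spec files yet: the guard's negated form fires.
if ! find "$dir" -name "*.spec.ts" -type f 2>/dev/null | grep -q .; then
  echo "no spec files in $dir"
fi

# Create one spec file: the positive form now succeeds.
touch "$dir/example.spec.ts"
if find "$dir" -name "*.spec.ts" -type f 2>/dev/null | grep -q .; then
  echo "spec files found"
fi

rm -r "$dir"
```

Unlike `ls "$dir"/*.spec.ts`, this form recurses and never trips over "argument list too long" or unmatched globs.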
2  .github/workflows/bump-python-package.yml (vendored)
@@ -32,7 +32,7 @@ jobs:
 steps:
 
 - name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
-uses: actions/checkout@v5
+uses: actions/checkout@v6
 with:
 persist-credentials: true
 ref: master
2  .github/workflows/cancel_duplicates.yml (vendored)
@@ -31,7 +31,7 @@ jobs:
 
 - name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
 if: steps.check_queued.outputs.count >= 20
-uses: actions/checkout@v5
+uses: actions/checkout@v6
 
 - name: Cancel duplicate workflow runs
 if: steps.check_queued.outputs.count >= 20
2  .github/workflows/check-python-deps.yml (vendored)
@@ -18,7 +18,7 @@ jobs:
 runs-on: ubuntu-22.04
 steps:
 - name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
-uses: actions/checkout@v5
+uses: actions/checkout@v6
 with:
 persist-credentials: false
 submodules: recursive
@@ -25,7 +25,7 @@ jobs:
 pull-requests: write
 steps:
 - name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
-uses: actions/checkout@v5
+uses: actions/checkout@v6
 - name: Check and notify
 uses: actions/github-script@v8
 with:
@@ -69,7 +69,7 @@ jobs:
 `❗ @${pull.user.login} Your base branch \`${currentBranch}\` has ` +
 'also updated `superset/migrations`.\n' +
 '\n' +
-'**Please consider rebasing your branch and [resolving potential db migration conflicts](https://github.com/apache/superset/blob/master/CONTRIBUTING.md#merging-db-migrations).**',
+'**Please consider rebasing your branch and [resolving potential db migration conflicts](https://superset.apache.org/docs/contributing/development#merging-db-migrations).**',
 });
 }
 }
2  .github/workflows/claude.yml (vendored)
@@ -71,7 +71,7 @@ jobs:
 id-token: write
 steps:
 - name: Checkout repository
-uses: actions/checkout@v5
+uses: actions/checkout@v6
 with:
 fetch-depth: 1
2  .github/workflows/codeql-analysis.yml (vendored)
@@ -31,7 +31,7 @@ jobs:
 
 steps:
 - name: Checkout repository
-uses: actions/checkout@v5
+uses: actions/checkout@v6
 
 - name: Check for file changes
 id: check
4  .github/workflows/dependency-review.yml (vendored)
@@ -27,7 +27,7 @@ jobs:
 runs-on: ubuntu-24.04
 steps:
 - name: "Checkout Repository"
-uses: actions/checkout@v5
+uses: actions/checkout@v6
 - name: "Dependency Review"
 uses: actions/dependency-review-action@v4
 continue-on-error: true
@@ -53,7 +53,7 @@ jobs:
 runs-on: ubuntu-22.04
 steps:
 - name: "Checkout Repository"
-uses: actions/checkout@v5
+uses: actions/checkout@v6
 
 - name: Setup Python
 uses: ./.github/actions/setup-backend/
21  .github/workflows/docker.yml (vendored)
@@ -42,7 +42,7 @@ jobs:
 steps:
 
 - name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
-uses: actions/checkout@v5
+uses: actions/checkout@v6
 with:
 persist-credentials: false
 
@@ -101,6 +101,23 @@ jobs:
 docker images $IMAGE_TAG
 docker history $IMAGE_TAG
 
+# Scan for vulnerabilities in built container image after pushes to mainline branch.
+- name: Run Trivy container image vulnerability scan
+if: github.event_name == 'push' && github.ref == 'refs/heads/master' && (steps.check.outputs.python || steps.check.outputs.frontend || steps.check.outputs.docker) && matrix.build_preset == 'lean'
+uses: aquasecurity/trivy-action@b6643a29fecd7f34b3597bc6acb0a98b03d33ff8 # v0.33.1
+with:
+image-ref: ${{ env.IMAGE_TAG }}
+format: 'sarif'
+output: 'trivy-results.sarif'
+vuln-type: 'os'
+severity: 'CRITICAL,HIGH'
+ignore-unfixed: true
+- name: Upload Trivy scan results to GitHub Security tab
+if: github.event_name == 'push' && github.ref == 'refs/heads/master' && (steps.check.outputs.python || steps.check.outputs.frontend || steps.check.outputs.docker) && matrix.build_preset == 'lean'
+uses: github/codeql-action/upload-sarif@1b168cd39490f61582a9beae412bb7057a6b2c4e # v4.31.8
+with:
+sarif_file: 'trivy-results.sarif'
 
 - name: docker-compose sanity check
 if: (steps.check.outputs.python || steps.check.outputs.frontend || steps.check.outputs.docker) && matrix.build_preset == 'dev'
 shell: bash
@@ -117,7 +134,7 @@ jobs:
 runs-on: ubuntu-24.04
 steps:
 - name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
-uses: actions/checkout@v5
+uses: actions/checkout@v6
 with:
 persist-credentials: false
 - name: Check for file changes
4  .github/workflows/embedded-sdk-release.yml (vendored)
@@ -28,8 +28,8 @@ jobs:
 run:
 working-directory: superset-embedded-sdk
 steps:
-- uses: actions/checkout@v5
-- uses: actions/setup-node@v5
+- uses: actions/checkout@v6
+- uses: actions/setup-node@v6
 with:
 node-version-file: './superset-embedded-sdk/.nvmrc'
 registry-url: 'https://registry.npmjs.org'
4  .github/workflows/embedded-sdk-test.yml (vendored)
@@ -18,8 +18,8 @@ jobs:
 run:
 working-directory: superset-embedded-sdk
 steps:
-- uses: actions/checkout@v5
-- uses: actions/setup-node@v5
+- uses: actions/checkout@v6
+- uses: actions/setup-node@v6
 with:
 node-version-file: './superset-embedded-sdk/.nvmrc'
 registry-url: 'https://registry.npmjs.org'
4  .github/workflows/ephemeral-env.yml (vendored)
@@ -160,7 +160,7 @@ jobs:
 runs-on: ubuntu-24.04
 steps:
 - name: "Checkout ${{ github.ref }} ( ${{ needs.ephemeral-env-label.outputs.sha }} : ${{steps.get-sha.outputs.sha}} )"
-uses: actions/checkout@v5
+uses: actions/checkout@v6
 with:
 ref: ${{ needs.ephemeral-env-label.outputs.sha }}
 persist-credentials: false
@@ -220,7 +220,7 @@ jobs:
 pull-requests: write
 
 steps:
-- uses: actions/checkout@v5
+- uses: actions/checkout@v6
 with:
 persist-credentials: false
2  .github/workflows/generate-FOSSA-report.yml (vendored)
@@ -27,7 +27,7 @@ jobs:
 runs-on: ubuntu-24.04
 steps:
 - name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
-uses: actions/checkout@v5
+uses: actions/checkout@v6
 with:
 persist-credentials: false
 submodules: recursive
@@ -14,10 +14,10 @@ jobs:
 runs-on: ubuntu-24.04
 steps:
 - name: Checkout Repository
-uses: actions/checkout@v5
+uses: actions/checkout@v6
 
 - name: Set up Node.js
-uses: actions/setup-node@v5
+uses: actions/setup-node@v6
 with:
 node-version: '20'
2  .github/workflows/issue_creation.yml (vendored)
@@ -17,7 +17,7 @@ jobs:
 steps:
 
 - name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
-uses: actions/checkout@v5
+uses: actions/checkout@v6
 with:
 persist-credentials: false
2  .github/workflows/latest-release-tag.yml (vendored)
@@ -12,7 +12,7 @@ jobs:
 
 steps:
 - name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
-uses: actions/checkout@v5
+uses: actions/checkout@v6
 with:
 persist-credentials: false
 submodules: recursive
2  .github/workflows/license-check.yml (vendored)
@@ -15,7 +15,7 @@ jobs:
 runs-on: ubuntu-24.04
 steps:
 - name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
-uses: actions/checkout@v5
+uses: actions/checkout@v6
 with:
 persist-credentials: false
 submodules: recursive
2  .github/workflows/pr-lint.yml (vendored)
@@ -16,7 +16,7 @@ jobs:
 pull-requests: write
 steps:
 - name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
-uses: actions/checkout@v5
+uses: actions/checkout@v6
 with:
 persist-credentials: false
 submodules: recursive
10  .github/workflows/pre-commit.yml (vendored)
@@ -21,7 +21,7 @@ jobs:
 python-version: ["current", "previous", "next"]
 steps:
 - name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
-uses: actions/checkout@v5
+uses: actions/checkout@v6
 with:
 persist-credentials: false
 submodules: recursive
@@ -39,7 +39,7 @@ jobs:
 echo "HOMEBREW_REPOSITORY=$HOMEBREW_REPOSITORY" >>"${GITHUB_ENV}"
 brew install norwoodj/tap/helm-docs
 - name: Setup Node.js
-uses: actions/setup-node@v5
+uses: actions/setup-node@v6
 with:
 node-version: '20'
@@ -54,7 +54,7 @@ jobs:
 yarn install --immutable
 
 - name: Cache pre-commit environments
-uses: actions/cache@v4
+uses: actions/cache@v5
 with:
 path: ~/.cache/pre-commit
 key: pre-commit-v2-${{ runner.os }}-py${{ matrix.python-version }}-${{ hashFiles('.pre-commit-config.yaml') }}
@@ -71,7 +71,9 @@ jobs:
 GIT_DIFF_EXIT_CODE=$?
 if [ "${PRE_COMMIT_EXIT_CODE}" -ne 0 ] || [ "${GIT_DIFF_EXIT_CODE}" -ne 0 ]; then
+if [ "${PRE_COMMIT_EXIT_CODE}" -ne 0 ]; then
-echo "❌ Pre-commit check failed (exit code: ${EXIT_CODE})."
+echo "❌ Pre-commit check failed (exit code: ${PRE_COMMIT_EXIT_CODE})."
 echo "🔍 Modified files:"
 git diff --name-only
 else
 echo "❌ Git working directory is dirty."
 echo "📌 This likely means that pre-commit made changes that were not committed."
2  .github/workflows/prefer-typescript.yml (vendored)
@@ -27,7 +27,7 @@ jobs:
 pull-requests: write
 steps:
 - name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
-uses: actions/checkout@v5
+uses: actions/checkout@v6
 with:
 persist-credentials: false
 submodules: recursive
8  .github/workflows/release.yml (vendored)
@@ -26,7 +26,7 @@ jobs:
 name: Bump version and publish package(s)
 runs-on: ubuntu-24.04
 steps:
-- uses: actions/checkout@v5
+- uses: actions/checkout@v6
 with:
 # pulls all commits (needed for lerna / semantic release to correctly version)
 fetch-depth: 0
@@ -42,13 +42,13 @@ jobs:
 
 - name: Install Node.js
 if: env.HAS_TAGS
-uses: actions/setup-node@v5
+uses: actions/setup-node@v6
 with:
 node-version-file: './superset-frontend/.nvmrc'
 
 - name: Cache npm
 if: env.HAS_TAGS
-uses: actions/cache@v4
+uses: actions/cache@v5
 with:
 path: ~/.npm # npm cache files are stored in `~/.npm` on Linux/macOS
 key: ${{ runner.OS }}-node-${{ hashFiles('**/package-lock.json') }}
@@ -62,7 +62,7 @@ jobs:
 run: echo "dir=$(npm config get cache)" >> $GITHUB_OUTPUT
 - name: Cache npm
 if: env.HAS_TAGS
-uses: actions/cache@v4
+uses: actions/cache@v5
 id: npm-cache # use this to check for `cache-hit` (`steps.npm-cache.outputs.cache-hit != 'true'`)
 with:
 path: ${{ steps.npm-cache-dir-path.outputs.dir }}
18  .github/workflows/showtime-cleanup.yml (vendored)
@@ -7,12 +7,6 @@ on:
 
 # Manual trigger for testing
 workflow_dispatch:
-inputs:
-max_age_hours:
-description: 'Maximum age in hours before cleanup'
-required: false
-default: '48'
-type: string
 
 # Common environment variables
 env:
@@ -38,13 +32,5 @@ jobs:
 
 - name: Cleanup expired environments
 run: |
-MAX_AGE="${{ github.event.inputs.max_age_hours || '48' }}"
-
-# Validate max_age is numeric
-if [[ ! "$MAX_AGE" =~ ^[0-9]+$ ]]; then
-echo "❌ Invalid max_age_hours format: $MAX_AGE (must be numeric)"
-exit 1
-fi
-
-echo "Cleaning up environments older than ${MAX_AGE}h"
-python -m showtime cleanup --older-than "${MAX_AGE}h"
+echo "Cleaning up environments respecting TTL labels"
+python -m showtime cleanup --respect-ttl
2  .github/workflows/showtime-trigger.yml (vendored)
@@ -147,7 +147,7 @@ jobs:
 
 - name: Checkout PR code (only if build needed)
 if: steps.auth.outputs.authorized == 'true' && steps.check.outputs.build_needed == 'true'
-uses: actions/checkout@v5
+uses: actions/checkout@v6
 with:
 ref: ${{ steps.check.outputs.target_sha }}
 persist-credentials: false
2  .github/workflows/superset-app-cli.yml (vendored)
@@ -37,7 +37,7 @@ jobs:
 - 16379:6379
 steps:
 - name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
-uses: actions/checkout@v5
+uses: actions/checkout@v6
 with:
 persist-credentials: false
 submodules: recursive
@@ -51,7 +51,7 @@ jobs:
 - 16379:6379
 steps:
 - name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
-uses: actions/checkout@v5
+uses: actions/checkout@v6
 with:
 persist-credentials: false
 submodules: recursive
@@ -63,7 +63,7 @@ jobs:
 with:
 run: testdata
 - name: Setup Node.js
-uses: actions/setup-node@v5
+uses: actions/setup-node@v6
 with:
 node-version-file: './superset-frontend/.nvmrc'
 - name: Install npm dependencies
@@ -30,13 +30,13 @@ jobs:
 runs-on: ubuntu-24.04
 steps:
 - name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
-uses: actions/checkout@v5
+uses: actions/checkout@v6
 with:
 persist-credentials: false
 submodules: recursive
 ref: master
 - name: Set up Node.js
-uses: actions/setup-node@v5
+uses: actions/setup-node@v6
 with:
 node-version-file: './superset-frontend/.nvmrc'
 - name: Install eyes-storybook dependencies
45  .github/workflows/superset-docs-deploy.yml (vendored)
@@ -1,6 +1,13 @@
 name: Docs Deployment
 
 on:
+# Deploy after integration tests complete on master
+workflow_run:
+workflows: ["Python-Integration"]
+types: [completed]
+branches: [master]
+
+# Also allow manual trigger and direct pushes to docs
 push:
 paths:
 - "docs/**"
@@ -30,13 +37,14 @@ jobs:
 name: Build & Deploy
 runs-on: ubuntu-24.04
 steps:
-- name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
-uses: actions/checkout@v5
+- name: "Checkout ${{ github.event.workflow_run.head_sha || github.sha }}"
+uses: actions/checkout@v6
 with:
+ref: ${{ github.event.workflow_run.head_sha || github.sha }}
 persist-credentials: false
 submodules: recursive
 - name: Set up Node.js
-uses: actions/setup-node@v5
+uses: actions/setup-node@v6
 with:
 node-version-file: './docs/.nvmrc'
 - name: Setup Python
@@ -58,6 +66,35 @@ jobs:
 working-directory: docs
 run: |
 yarn install --check-cache
+- name: Download database diagnostics (if triggered by integration tests)
+if: github.event_name == 'workflow_run' && github.event.workflow_run.conclusion == 'success'
+uses: dawidd6/action-download-artifact@v12
+continue-on-error: true
+with:
+workflow: superset-python-integrationtest.yml
+run_id: ${{ github.event.workflow_run.id }}
+name: database-diagnostics
+path: docs/src/data/
+- name: Try to download latest diagnostics (for push/dispatch triggers)
+if: github.event_name != 'workflow_run'
+uses: dawidd6/action-download-artifact@v12
+continue-on-error: true
+with:
+workflow: superset-python-integrationtest.yml
+name: database-diagnostics
+path: docs/src/data/
+branch: master
+search_artifacts: true
+if_no_artifact_found: warn
+- name: Use diagnostics artifact if available
+working-directory: docs
+run: |
+if [ -f "src/data/databases-diagnostics.json" ]; then
+echo "Using fresh diagnostics from integration tests"
+mv src/data/databases-diagnostics.json src/data/databases.json
+else
+echo "Using committed databases.json (no artifact found)"
+fi
 - name: yarn build
 working-directory: docs
 run: |
@@ -71,5 +108,5 @@ jobs:
 destination-github-username: "apache"
 destination-repository-name: "superset-site"
 target-branch: "asf-site"
-commit-message: "deploying docs: ${{ github.event.head_commit.message }} (apache/superset@${{ github.sha }})"
+commit-message: "deploying docs: ${{ github.event.head_commit.message || 'triggered by integration tests' }} (apache/superset@${{ github.event.workflow_run.head_sha || github.sha }})"
 user-email: dev@superset.apache.org
70  .github/workflows/superset-docs-verify.yml (vendored)
@@ -4,24 +4,30 @@ on:
 pull_request:
 paths:
 - "docs/**"
+- "superset/db_engine_specs/**"
 - ".github/workflows/superset-docs-verify.yml"
 types: [synchronize, opened, reopened, ready_for_review]
+workflow_run:
+workflows: ["Python-Integration"]
+types: [completed]
 
 # cancel previous workflow jobs for PRs
 concurrency:
-group: ${{ github.workflow }}-${{ github.event.pull_request.number || github.run_id }}
+group: ${{ github.workflow }}-${{ github.event.pull_request.number || github.event.workflow_run.head_sha || github.run_id }}
 cancel-in-progress: true
 
 jobs:
 linkinator:
 # See docs here: https://github.com/marketplace/actions/linkinator
+# Only run on pull_request, not workflow_run
+if: github.event_name == 'pull_request'
 name: Link Checking
 runs-on: ubuntu-latest
 steps:
-- uses: actions/checkout@v5
+- uses: actions/checkout@v6
 # Do not bump this linkinator-action version without opening
 # an ASF Infra ticket to allow the new version first!
-- uses: JustinBeckwith/linkinator-action@3d5ba091319fa7b0ac14703761eebb7d100e6f6d # v1.11.0
+- uses: JustinBeckwith/linkinator-action@af984b9f30f63e796ae2ea5be5e07cb587f1bbd9 # v2.3
 continue-on-error: true # This will make the job advisory (non-blocking, no red X)
 with:
 paths: "**/*.md, **/*.mdx"
@@ -50,20 +56,23 @@ jobs:
 https://timbr.ai/,
 https://opensource.org/license/apache-2-0,
 https://www.plaidcloud.com/
-build-deploy:
-name: Build & Deploy
+
+build-on-pr:
+# Build docs when PR changes docs/** (uses committed databases.json)
+if: github.event_name == 'pull_request'
+name: Build (PR trigger)
 runs-on: ubuntu-24.04
 defaults:
 run:
 working-directory: docs
 steps:
 - name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
-uses: actions/checkout@v5
+uses: actions/checkout@v6
 with:
 persist-credentials: false
 submodules: recursive
 - name: Set up Node.js
-uses: actions/setup-node@v5
+uses: actions/setup-node@v6
 with:
 node-version-file: './docs/.nvmrc'
 - name: yarn install
@@ -75,3 +84,50 @@ jobs:
 - name: yarn build
 run: |
 yarn build
+
+build-after-tests:
+# Build docs after integration tests complete (uses fresh diagnostics)
+# Only runs if integration tests succeeded
+if: >
+github.event_name == 'workflow_run' &&
+github.event.workflow_run.conclusion == 'success'
+name: Build (after integration tests)
+runs-on: ubuntu-24.04
+defaults:
+run:
+working-directory: docs
+steps:
+- name: "Checkout PR head: ${{ github.event.workflow_run.head_sha }}"
+uses: actions/checkout@v6
+with:
+ref: ${{ github.event.workflow_run.head_sha }}
+persist-credentials: false
+submodules: recursive
+- name: Set up Node.js
+uses: actions/setup-node@v6
+with:
+node-version-file: './docs/.nvmrc'
+- name: yarn install
+run: |
+yarn install --check-cache
+- name: Download database diagnostics from integration tests
+uses: dawidd6/action-download-artifact@v12
+with:
+workflow: superset-python-integrationtest.yml
+run_id: ${{ github.event.workflow_run.id }}
+name: database-diagnostics
+path: docs/src/data/
+- name: Use fresh diagnostics
+run: |
+if [ -f "src/data/databases-diagnostics.json" ]; then
+echo "Using fresh diagnostics from integration tests"
+mv src/data/databases-diagnostics.json src/data/databases.json
+else
+echo "Warning: No diagnostics artifact found, using committed data"
+fi
+- name: yarn typecheck
+run: |
+yarn typecheck
+- name: yarn build
+run: |
+yarn build
125
.github/workflows/superset-e2e.yml
vendored
125
.github/workflows/superset-e2e.yml
vendored
@@ -69,21 +69,21 @@ jobs:
|
||||
# Conditional checkout based on context
|
||||
- name: Checkout for push or pull_request event
|
||||
if: github.event_name == 'push' || github.event_name == 'pull_request'
|
||||
uses: actions/checkout@v5
|
||||
uses: actions/checkout@v6
|
||||
with:
|
||||
persist-credentials: false
|
||||
submodules: recursive
|
||||
ref: ${{ github.event_name == 'pull_request' && github.event.pull_request.head.sha || github.sha }}
|
||||
- name: Checkout using ref (workflow_dispatch)
|
||||
if: github.event_name == 'workflow_dispatch' && github.event.inputs.ref != ''
|
||||
uses: actions/checkout@v5
|
||||
uses: actions/checkout@v6
|
||||
with:
|
||||
persist-credentials: false
|
||||
ref: ${{ github.event.inputs.ref }}
|
||||
submodules: recursive
|
||||
- name: Checkout using PR ID (workflow_dispatch)
|
||||
if: github.event_name == 'workflow_dispatch' && github.event.inputs.pr_id != ''
|
||||
uses: actions/checkout@v5
|
||||
uses: actions/checkout@v6
|
||||
with:
|
||||
persist-credentials: false
|
||||
ref: refs/pull/${{ github.event.inputs.pr_id }}/merge
|
||||
@@ -109,7 +109,7 @@ jobs:
|
||||
run: testdata
|
||||
- name: Setup Node.js
|
||||
if: steps.check.outputs.python || steps.check.outputs.frontend
|
||||
uses: actions/setup-node@v5
|
||||
uses: actions/setup-node@v6
|
||||
with:
|
||||
node-version-file: './superset-frontend/.nvmrc'
|
||||
- name: Install npm dependencies
|
||||
@@ -146,8 +146,123 @@ jobs:
|
||||
SAFE_APP_ROOT=${APP_ROOT//\//_}
|
||||
echo "safe_app_root=$SAFE_APP_ROOT" >> $GITHUB_OUTPUT
|
||||
- name: Upload Artifacts
|
||||
uses: actions/upload-artifact@v4
|
||||
uses: actions/upload-artifact@v6
|
||||
if: failure()
|
||||
with:
|
||||
path: ${{ github.workspace }}/superset-frontend/cypress-base/cypress/screenshots
|
||||
name: cypress-artifact-${{ github.run_id }}-${{ github.job }}-${{ matrix.browser }}-${{ matrix.parallel_id }}--${{ steps.set-safe-app-root.outputs.safe_app_root }}
|
||||
|
||||
playwright-tests:
|
||||
runs-on: ubuntu-22.04
|
||||
permissions:
|
||||
contents: read
|
||||
pull-requests: read
|
||||
strategy:
|
||||
fail-fast: false
|
||||
matrix:
|
||||
browser: ["chromium"]
|
||||
app_root: ["", "/app/prefix"]
|
||||
env:
|
||||
SUPERSET_ENV: development
|
||||
SUPERSET_CONFIG: tests.integration_tests.superset_test_config
|
||||
SUPERSET__SQLALCHEMY_DATABASE_URI: postgresql+psycopg2://superset:superset@127.0.0.1:15432/superset
|
||||
PYTHONPATH: ${{ github.workspace }}
|
||||
REDIS_PORT: 16379
|
||||
GITHUB_TOKEN: ${{ github.token }}
|
||||
services:
|
||||
postgres:
|
||||
image: postgres:16-alpine
|
||||
env:
|
||||
POSTGRES_USER: superset
|
||||
POSTGRES_PASSWORD: superset
|
||||
ports:
|
||||
- 15432:5432
|
||||
redis:
|
||||
image: redis:7-alpine
|
||||
ports:
|
||||
- 16379:6379
|
||||
steps:
|
||||
# -------------------------------------------------------
|
||||
# Conditional checkout based on context (same as Cypress workflow)
|
||||
- name: Checkout for push or pull_request event
|
||||
if: github.event_name == 'push' || github.event_name == 'pull_request'
|
||||
uses: actions/checkout@v6
|
||||
with:
|
||||
persist-credentials: false
|
||||
submodules: recursive
|
||||
ref: ${{ github.event_name == 'pull_request' && github.event.pull_request.head.sha || github.sha }}
|
||||
- name: Checkout using ref (workflow_dispatch)
|
||||
if: github.event_name == 'workflow_dispatch' && github.event.inputs.ref != ''
|
||||
uses: actions/checkout@v6
|
||||
with:
|
||||
persist-credentials: false
|
||||
ref: ${{ github.event.inputs.ref }}
|
||||
submodules: recursive
|
||||
- name: Checkout using PR ID (workflow_dispatch)
|
||||
if: github.event_name == 'workflow_dispatch' && github.event.inputs.pr_id != ''
|
||||
uses: actions/checkout@v6
|
||||
with:
|
||||
persist-credentials: false
|
||||
ref: refs/pull/${{ github.event.inputs.pr_id }}/merge
|
||||
submodules: recursive
|
||||
# -------------------------------------------------------
|
||||
- name: Check for file changes
|
||||
id: check
|
||||
uses: ./.github/actions/change-detector/
|
||||
with:
|
||||
token: ${{ secrets.GITHUB_TOKEN }}
|
||||
- name: Setup Python
|
||||
uses: ./.github/actions/setup-backend/
|
||||
        if: steps.check.outputs.python || steps.check.outputs.frontend
      - name: Setup postgres
        if: steps.check.outputs.python || steps.check.outputs.frontend
        uses: ./.github/actions/cached-dependencies
        with:
          run: setup-postgres
      - name: Import test data
        if: steps.check.outputs.python || steps.check.outputs.frontend
        uses: ./.github/actions/cached-dependencies
        with:
          run: playwright_testdata
      - name: Setup Node.js
        if: steps.check.outputs.python || steps.check.outputs.frontend
        uses: actions/setup-node@v6
        with:
          node-version-file: './superset-frontend/.nvmrc'
      - name: Install npm dependencies
        if: steps.check.outputs.python || steps.check.outputs.frontend
        uses: ./.github/actions/cached-dependencies
        with:
          run: npm-install
      - name: Build javascript packages
        if: steps.check.outputs.python || steps.check.outputs.frontend
        uses: ./.github/actions/cached-dependencies
        with:
          run: build-instrumented-assets
      - name: Install Playwright
        if: steps.check.outputs.python || steps.check.outputs.frontend
        uses: ./.github/actions/cached-dependencies
        with:
          run: playwright-install
      - name: Run Playwright (Required Tests)
        if: steps.check.outputs.python || steps.check.outputs.frontend
        uses: ./.github/actions/cached-dependencies
        env:
          NODE_OPTIONS: "--max-old-space-size=4096"
        with:
          run: playwright-run "${{ matrix.app_root }}"
      - name: Set safe app root
        if: failure()
        id: set-safe-app-root
        run: |
          APP_ROOT="${{ matrix.app_root }}"
          SAFE_APP_ROOT=${APP_ROOT//\//_}
          echo "safe_app_root=$SAFE_APP_ROOT" >> $GITHUB_OUTPUT
      - name: Upload Playwright Artifacts
        uses: actions/upload-artifact@v6
        if: failure()
        with:
          path: |
            ${{ github.workspace }}/superset-frontend/playwright-results/
            ${{ github.workspace }}/superset-frontend/test-results/
          name: playwright-artifact-${{ github.run_id }}-${{ github.job }}-${{ matrix.browser }}--${{ steps.set-safe-app-root.outputs.safe_app_root }}
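The "Set safe app root" step above leans on bash's global pattern substitution to turn a matrix value such as `apps/analytics` into an artifact-safe name, then hands it to later steps via `$GITHUB_OUTPUT`. A minimal sketch that runs outside Actions (the `APP_ROOT` value and the temp file standing in for `$GITHUB_OUTPUT` are hypothetical):

```shell
# ${var//pat/repl} replaces EVERY "/" (a single slash in the pattern is
# escaped as \/), so nested app roots become flat, artifact-safe names.
GITHUB_OUTPUT=$(mktemp)          # stand-in for the file Actions provides
APP_ROOT="apps/analytics"        # hypothetical matrix.app_root value
SAFE_APP_ROOT=${APP_ROOT//\//_}
# Steps publish outputs as key=value lines appended to $GITHUB_OUTPUT.
echo "safe_app_root=$SAFE_APP_ROOT" >> "$GITHUB_OUTPUT"
cat "$GITHUB_OUTPUT"             # → safe_app_root=apps_analytics
```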
@@ -24,7 +24,7 @@ jobs:
       working-directory: superset-extensions-cli
     steps:
       - name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
-        uses: actions/checkout@v5
+        uses: actions/checkout@v6
         with:
           persist-credentials: false
           submodules: recursive
@@ -58,7 +58,7 @@ jobs:

       - name: Upload HTML coverage report
         if: steps.check.outputs.superset-extensions-cli
-        uses: actions/upload-artifact@v4
+        uses: actions/upload-artifact@v6
         with:
           name: superset-extensions-cli-coverage-html
           path: htmlcov/
38
.github/workflows/superset-frontend.yml
vendored
@@ -23,7 +23,7 @@ jobs:
       should-run: ${{ steps.check.outputs.frontend }}
     steps:
       - name: Checkout Code
-        uses: actions/checkout@v5
+        uses: actions/checkout@v6
         with:
           persist-credentials: false
           fetch-depth: 0
@@ -58,7 +58,7 @@ jobs:

       - name: Upload Docker Image Artifact
         if: steps.check.outputs.frontend
-        uses: actions/upload-artifact@v4
+        uses: actions/upload-artifact@v6
         with:
           name: docker-image
           path: docker-image.tar.gz
@@ -73,7 +73,7 @@ jobs:
     runs-on: ubuntu-24.04
     steps:
       - name: Download Docker Image Artifact
-        uses: actions/download-artifact@v5
+        uses: actions/download-artifact@v7
         with:
           name: docker-image

@@ -90,7 +90,7 @@ jobs:
           "npm run test -- --coverage --shard=${{ matrix.shard }}/8 --coverageReporters=json-summary"

       - name: Upload Coverage Artifact
-        uses: actions/upload-artifact@v4
+        uses: actions/upload-artifact@v6
         with:
           name: coverage-artifacts-${{ matrix.shard }}
           path: superset-frontend/coverage
@@ -101,7 +101,7 @@ jobs:
     runs-on: ubuntu-24.04
     steps:
       - name: Download Coverage Artifacts
-        uses: actions/download-artifact@v5
+        uses: actions/download-artifact@v7
         with:
           pattern: coverage-artifacts-*
           path: coverage/
@@ -127,7 +127,7 @@ jobs:
     runs-on: ubuntu-24.04
     steps:
       - name: Download Docker Image Artifact
-        uses: actions/download-artifact@v5
+        uses: actions/download-artifact@v7
         with:
           name: docker-image

@@ -135,15 +135,15 @@ jobs:
         run: |
           docker load < docker-image.tar.gz

-      - name: eslint
+      - name: lint
         run: |
           docker run --rm $TAG bash -c \
-            "npm i && npm run eslint -- . --quiet"
+            "npm i && npm run lint"

       - name: tsc
         run: |
           docker run --rm $TAG bash -c \
-            "npm run plugins:build && npm run type"
+            "npm i && npm run plugins:build && npm run type"

   validate-frontend:
     needs: frontend-build
@@ -151,7 +151,7 @@ jobs:
     runs-on: ubuntu-24.04
     steps:
       - name: Download Docker Image Artifact
-        uses: actions/download-artifact@v5
+        uses: actions/download-artifact@v7
         with:
           name: docker-image

@@ -167,3 +167,21 @@ jobs:
         run: |
           docker run --rm $TAG bash -c \
             "npm run plugins:build-storybook"

+  test-storybook:
+    needs: frontend-build
+    if: needs.frontend-build.outputs.should-run == 'true'
+    runs-on: ubuntu-24.04
+    steps:
+      - name: Download Docker Image Artifact
+        uses: actions/download-artifact@v7
+        with:
+          name: docker-image
+
+      - name: Load Docker Image
+        run: docker load < docker-image.tar.gz
+
+      - name: Build Storybook and Run Tests
+        run: |
+          docker run --rm $TAG bash -c \
+            "npm run build-storybook && npx playwright install-deps && npx playwright install chromium && npm run test-storybook:ci"
2
.github/workflows/superset-helm-lint.yml
vendored
@@ -16,7 +16,7 @@ jobs:
     runs-on: ubuntu-24.04
     steps:
       - name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
-        uses: actions/checkout@v5
+        uses: actions/checkout@v6
         with:
           persist-credentials: false
           submodules: recursive
2
.github/workflows/superset-helm-release.yml
vendored
@@ -29,7 +29,7 @@ jobs:

     steps:
       - name: Checkout code
-        uses: actions/checkout@v5
+        uses: actions/checkout@v6
         with:
           ref: ${{ inputs.ref || github.ref_name }}
           persist-credentials: true
25
.github/workflows/superset-playwright.yml
vendored
@@ -1,4 +1,4 @@
-name: Playwright E2E Tests
+name: Playwright Experimental Tests

 on:
   push:
@@ -23,9 +23,10 @@ concurrency:
   cancel-in-progress: true

 jobs:
-  playwright-tests:
+  # NOTE: Required Playwright tests are in superset-e2e.yml (E2E / playwright-tests)
+  # This workflow contains only experimental tests that run in shadow mode
+  playwright-tests-experimental:
     runs-on: ubuntu-22.04
+    # Allow workflow to succeed even if tests fail during shadow mode
+    continue-on-error: true
     permissions:
       contents: read
@@ -59,21 +60,21 @@ jobs:
       # Conditional checkout based on context (same as Cypress workflow)
       - name: Checkout for push or pull_request event
         if: github.event_name == 'push' || github.event_name == 'pull_request'
-        uses: actions/checkout@v5
+        uses: actions/checkout@v6
         with:
           persist-credentials: false
           submodules: recursive
           ref: ${{ github.event_name == 'pull_request' && github.event.pull_request.head.sha || github.sha }}
       - name: Checkout using ref (workflow_dispatch)
         if: github.event_name == 'workflow_dispatch' && github.event.inputs.ref != ''
-        uses: actions/checkout@v5
+        uses: actions/checkout@v6
         with:
           persist-credentials: false
           ref: ${{ github.event.inputs.ref }}
           submodules: recursive
       - name: Checkout using PR ID (workflow_dispatch)
         if: github.event_name == 'workflow_dispatch' && github.event.inputs.pr_id != ''
-        uses: actions/checkout@v5
+        uses: actions/checkout@v6
         with:
           persist-credentials: false
           ref: refs/pull/${{ github.event.inputs.pr_id }}/merge
@@ -96,10 +97,10 @@ jobs:
         if: steps.check.outputs.python || steps.check.outputs.frontend
         uses: ./.github/actions/cached-dependencies
         with:
-          run: testdata
+          run: playwright_testdata
       - name: Setup Node.js
         if: steps.check.outputs.python || steps.check.outputs.frontend
-        uses: actions/setup-node@v5
+        uses: actions/setup-node@v6
         with:
           node-version-file: './superset-frontend/.nvmrc'
       - name: Install npm dependencies
@@ -117,13 +118,13 @@ jobs:
         uses: ./.github/actions/cached-dependencies
         with:
           run: playwright-install
-      - name: Run Playwright
+      - name: Run Playwright (Experimental Tests)
         if: steps.check.outputs.python || steps.check.outputs.frontend
         uses: ./.github/actions/cached-dependencies
         env:
           NODE_OPTIONS: "--max-old-space-size=4096"
         with:
-          run: playwright-run ${{ matrix.app_root }}
+          run: playwright-run "${{ matrix.app_root }}" experimental/
       - name: Set safe app root
         if: failure()
         id: set-safe-app-root
@@ -132,10 +133,10 @@ jobs:
           SAFE_APP_ROOT=${APP_ROOT//\//_}
           echo "safe_app_root=$SAFE_APP_ROOT" >> $GITHUB_OUTPUT
       - name: Upload Playwright Artifacts
-        uses: actions/upload-artifact@v4
+        uses: actions/upload-artifact@v6
         if: failure()
         with:
          path: |
            ${{ github.workspace }}/superset-frontend/playwright-results/
            ${{ github.workspace }}/superset-frontend/test-results/
-          name: playwright-artifact-${{ github.run_id }}-${{ github.job }}-${{ matrix.browser }}--${{ steps.set-safe-app-root.outputs.safe_app_root }}
+          name: playwright-experimental-artifact-${{ github.run_id }}-${{ github.job }}-${{ matrix.browser }}--${{ steps.set-safe-app-root.outputs.safe_app_root }}
@@ -41,7 +41,7 @@ jobs:
           - 16379:6379
     steps:
       - name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
-        uses: actions/checkout@v5
+        uses: actions/checkout@v6
         with:
           persist-credentials: false
           submodules: recursive
@@ -73,6 +73,36 @@ jobs:
           flags: python,mysql
           token: ${{ secrets.CODECOV_TOKEN }}
           verbose: true
+      - name: Generate database diagnostics for docs
+        if: steps.check.outputs.python
+        env:
+          SUPERSET_CONFIG: tests.integration_tests.superset_test_config
+          SUPERSET__SQLALCHEMY_DATABASE_URI: |
+            mysql+mysqldb://superset:superset@127.0.0.1:13306/superset?charset=utf8mb4&binary_prefix=true
+        run: |
+          python -c "
+          import json
+          from superset.app import create_app
+          from superset.db_engine_specs.lib import generate_yaml_docs
+          app = create_app()
+          with app.app_context():
+              docs = generate_yaml_docs()
+              # Wrap in the expected format
+              output = {
+                  'generated': '$(date -Iseconds)',
+                  'databases': docs
+              }
+              with open('databases-diagnostics.json', 'w') as f:
+                  json.dump(output, f, indent=2, default=str)
+              print(f'Generated diagnostics for {len(docs)} databases')
+          "
+      - name: Upload database diagnostics artifact
+        if: steps.check.outputs.python
+        uses: actions/upload-artifact@v6
+        with:
+          name: database-diagnostics
+          path: databases-diagnostics.json
+          retention-days: 7
   test-postgres:
     runs-on: ubuntu-24.04
     strategy:
@@ -99,7 +129,7 @@ jobs:
           - 16379:6379
     steps:
       - name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
-        uses: actions/checkout@v5
+        uses: actions/checkout@v6
         with:
           persist-credentials: false
           submodules: recursive
@@ -152,7 +182,7 @@ jobs:
           - 16379:6379
     steps:
       - name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
-        uses: actions/checkout@v5
+        uses: actions/checkout@v6
         with:
           persist-credentials: false
           submodules: recursive
@@ -48,7 +48,7 @@ jobs:
           - 16379:6379
     steps:
       - name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
-        uses: actions/checkout@v5
+        uses: actions/checkout@v6
         with:
           persist-credentials: false
           submodules: recursive
@@ -108,7 +108,7 @@ jobs:
           - 16379:6379
     steps:
       - name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
-        uses: actions/checkout@v5
+        uses: actions/checkout@v6
         with:
           persist-credentials: false
           submodules: recursive
@@ -24,7 +24,7 @@ jobs:
       PYTHONPATH: ${{ github.workspace }}
     steps:
       - name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
-        uses: actions/checkout@v5
+        uses: actions/checkout@v6
         with:
           persist-credentials: false
           submodules: recursive
6
.github/workflows/superset-translations.yml
vendored
@@ -18,7 +18,7 @@ jobs:
     runs-on: ubuntu-24.04
     steps:
       - name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
-        uses: actions/checkout@v5
+        uses: actions/checkout@v6
         with:
           persist-credentials: false
           submodules: recursive
@@ -31,7 +31,7 @@ jobs:

       - name: Setup Node.js
         if: steps.check.outputs.frontend
-        uses: actions/setup-node@v5
+        uses: actions/setup-node@v6
         with:
           node-version-file: './superset-frontend/.nvmrc'
       - name: Install dependencies
@@ -49,7 +49,7 @@ jobs:
     runs-on: ubuntu-24.04
     steps:
       - name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
-        uses: actions/checkout@v5
+        uses: actions/checkout@v6
         with:
           persist-credentials: false
           submodules: recursive
2
.github/workflows/superset-websocket.yml
vendored
@@ -21,7 +21,7 @@ jobs:
     runs-on: ubuntu-24.04
     steps:
       - name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
-        uses: actions/checkout@v5
+        uses: actions/checkout@v6
         with:
           persist-credentials: false
       - name: Install dependencies
2
.github/workflows/supersetbot.yml
vendored
@@ -38,7 +38,7 @@ jobs:
           });

       - name: "Checkout ( ${{ github.sha }} )"
-        uses: actions/checkout@v5
+        uses: actions/checkout@v6
         with:
           persist-credentials: false
8
.github/workflows/tag-release.yml
vendored
@@ -47,7 +47,7 @@ jobs:
     steps:

       - name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
-        uses: actions/checkout@v5
+        uses: actions/checkout@v6
         with:
           fetch-depth: 0

@@ -60,7 +60,7 @@ jobs:
           build: "true"

       - name: Use Node.js 20
-        uses: actions/setup-node@v5
+        uses: actions/setup-node@v6
         with:
           node-version: 20

@@ -107,12 +107,12 @@ jobs:
     steps:

       - name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
-        uses: actions/checkout@v5
+        uses: actions/checkout@v6
         with:
           fetch-depth: 0

       - name: Use Node.js 20
-        uses: actions/setup-node@v5
+        uses: actions/setup-node@v6
         with:
           node-version: 20
4
.github/workflows/tech-debt.yml
vendored
@@ -27,10 +27,10 @@ jobs:
     name: Generate Reports
     steps:
       - name: Checkout Repository
-        uses: actions/checkout@v5
+        uses: actions/checkout@v6

       - name: Set up Node.js
-        uses: actions/setup-node@v5
+        uses: actions/setup-node@v6
         with:
           node-version-file: './superset-frontend/.nvmrc'
11
.gitignore
vendored
@@ -33,6 +33,7 @@ cover
 .env
 .envrc
 .idea
+.roo
 .mypy_cache
 .python-version
 .tox
@@ -60,6 +61,7 @@ tmp
 rat-results.txt
 superset/app/
 superset-websocket/config.json
+.direnv

 # Node.js, webpack artifacts, storybook
 *.entry.js
@@ -71,10 +73,6 @@ superset/static/assets/*
 superset/static/uploads/*
 !superset/static/uploads/.gitkeep
 superset/static/version_info.json
-superset-frontend/**/esm/*
-superset-frontend/**/lib/*
-superset-frontend/**/storybook-static/*
-superset-frontend/migration-storybook.log
 yarn-error.log
 *.map
 *.min.js
@@ -121,6 +119,8 @@ docker/requirements-local.txt

 cache/
 docker/*local*
+docker/superset-websocket/config.json
+docker-compose.override.yml

 .temp_cache

@@ -134,3 +134,6 @@ PROJECT.md
 .aider*
 .claude_rc*
 .env.local
+oxc-custom-build/
+*.code-workspace
+*.duckdb
@@ -49,20 +49,34 @@ repos:
     hooks:
       - id: check-docstring-first
      - id: check-added-large-files
-        exclude: ^.*\.(geojson)$|^docs/static/img/screenshots/.*|^superset-frontend/CHANGELOG\.md$
+        exclude: ^.*\.(geojson)$|^docs/static/img/screenshots/.*|^superset-frontend/CHANGELOG\.md$|^superset/examples/.*/data\.parquet$
       - id: check-yaml
         exclude: ^helm/superset/templates/
       - id: debug-statements
       - id: end-of-file-fixer
-        exclude: .*/lerna\.json$
+        exclude: .*/lerna\.json$|^docs/static/img/logos/
       - id: trailing-whitespace
         exclude: ^.*\.(snap)
         args: ["--markdown-linebreak-ext=md"]
   - repo: local
     hooks:
       - id: eslint-frontend
         name: eslint (frontend)
         entry: ./scripts/eslint.sh
       - id: prettier-frontend
         name: prettier (frontend)
         entry: bash -c 'cd superset-frontend && for file in "$@"; do npx prettier --write "${file#superset-frontend/}"; done'
         language: system
         pass_filenames: true
         files: ^superset-frontend/.*\.(js|jsx|ts|tsx|css|scss|sass|json)$
   - repo: local
     hooks:
       - id: oxlint-frontend
         name: oxlint (frontend)
         entry: ./scripts/oxlint.sh
         language: system
         pass_filenames: true
         files: ^superset-frontend/.*\.(js|jsx|ts|tsx)$
       - id: custom-rules-frontend
         name: custom rules (frontend)
         entry: ./scripts/check-custom-rules.sh
         language: system
         pass_filenames: true
         files: ^superset-frontend/.*\.(js|jsx|ts|tsx)$
@@ -92,12 +106,19 @@ repos:
         files: helm
         verbose: false
         args: ["--log-level", "error"]
-  - repo: https://github.com/astral-sh/ruff-pre-commit
-    rev: v0.9.7
+  # Using local hooks ensures ruff version matches requirements/development.txt
+  - repo: local
     hooks:
       - id: ruff-format
         name: ruff-format
         entry: ruff format
         language: system
         types: [python]
       - id: ruff
         args: [--fix]
         name: ruff
         entry: ruff check --fix --show-fixes
         language: system
         types: [python]
   - repo: local
     hooks:
       - id: pylint
@@ -110,11 +131,29 @@ repos:
         - -c
         - |
           TARGET_BRANCH=${GITHUB_BASE_REF:-master}
-          git fetch origin "$TARGET_BRANCH"
-          BASE=$(git merge-base origin/"$TARGET_BRANCH" HEAD)
-          files=$(git diff --name-only --diff-filter=ACM "$BASE"..HEAD | grep '^superset/.*\.py$' || true)
+          # Only fetch if we're not in CI (CI already has all refs)
+          if [ -z "$CI" ]; then
+            git fetch --no-recurse-submodules origin "$TARGET_BRANCH" 2>/dev/null || true
+          fi
+          BASE=$(git merge-base origin/"$TARGET_BRANCH" HEAD 2>/dev/null) || BASE="HEAD"
+          files=$(git diff --name-only --diff-filter=ACM "$BASE"..HEAD 2>/dev/null | grep '^superset/.*\.py$' || true)
           if [ -n "$files" ]; then
             pylint --rcfile=.pylintrc --load-plugins=superset.extensions.pylint --reports=no $files
           else
             echo "No Python files to lint."
           fi
+      - id: db-engine-spec-metadata
+        name: database engine spec metadata validation
+        entry: python superset/db_engine_specs/lint_metadata.py --strict
+        language: system
+        files: ^superset/db_engine_specs/.*\.py$
+        exclude: ^superset/db_engine_specs/(base|lib|lint_metadata|__init__)\.py$
+        pass_filenames: false
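The rewritten pylint hook above selects only changed Python files by diffing against the merge-base with the target branch and filtering with `--diff-filter=ACM`. A standalone sketch of that selection against a throwaway repo (the `target` branch and file names are hypothetical stand-ins for `origin/$TARGET_BRANCH` and real sources):

```shell
set -e
tmp=$(mktemp -d) && cd "$tmp"
git init -q
git -c user.email=ci@example.com -c user.name=ci commit -q --allow-empty -m base
git branch -f target                # mark the fork point (stand-in for origin/$TARGET_BRANCH)
mkdir -p superset
echo "x = 1" > superset/models.py   # a changed Python file that should be linted
echo "notes" > README.md            # a changed non-Python file that should be skipped
git add -A
git -c user.email=ci@example.com -c user.name=ci commit -q -m change
# merge-base finds the common ancestor; ACM = Added, Copied, Modified (ignores deletions)
BASE=$(git merge-base target HEAD)
git diff --name-only --diff-filter=ACM "$BASE"..HEAD | grep '^superset/.*\.py$' || true
```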
   - repo: local
     hooks:
       - id: feature-flags-sync
         name: feature flags documentation sync
         entry: bash -c 'python scripts/extract_feature_flags.py > docs/static/feature-flags.json.tmp && if ! diff -q docs/static/feature-flags.json docs/static/feature-flags.json.tmp > /dev/null 2>&1; then mv docs/static/feature-flags.json.tmp docs/static/feature-flags.json && echo "Updated docs/static/feature-flags.json" && exit 1; else rm docs/static/feature-flags.json.tmp; fi'
         language: system
         files: ^superset/config\.py$
         pass_filenames: false
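The `feature-flags-sync` hook above follows a regenerate-and-compare pattern: rebuild the generated file into a `.tmp` sibling, and fail the hook (exit 1) only when the committed copy has drifted, leaving the refreshed file staged for the developer. A generic sketch with hypothetical file names:

```shell
cd "$(mktemp -d)"
printf '{"FLAG_A": true}\n' > flags.json        # committed copy
printf '{"FLAG_A": true}\n' > flags.json.tmp    # freshly regenerated copy
if ! diff -q flags.json flags.json.tmp > /dev/null 2>&1; then
  # Drift: replace the committed copy and fail so the change gets reviewed
  mv flags.json.tmp flags.json
  echo "updated flags.json"
  exit 1
else
  rm flags.json.tmp
  echo "in sync"
fi
```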
@@ -53,7 +53,7 @@ extension-pkg-whitelist=pyarrow

 [MESSAGES CONTROL]
 disable=all
-enable=disallowed-json-import,disallowed-sql-import,consider-using-transaction
+enable=json-import,disallowed-sql-import,consider-using-transaction

 [REPORTS]
@@ -11,6 +11,7 @@
 .nvmrc
 .prettierrc
 .rat-excludes
+.swcrc
 .*log
 .*pyc
 .*lock
@@ -66,22 +67,19 @@ temporary_superset_ui/*
 # skip license checks for auto-generated test snapshots
 .*snap

-# docs overrides for third party logos we don't have the rights to
-google-big-query.svg
-google-sheets.svg
-ibm-db2.svg
-postgresql.svg
-snowflake.svg
-ydb.svg
-loading.svg
+# docs third-party logos (database logos, org logos, etc.)
+databases/*
+logos/*

 # docs-related
 erd.puml
 erd.svg
 intro_header.txt
 TODO.md

 # for LLMs
 llm-context.md
 llms.txt
 AGENTS.md
+LLMS.md
+CLAUDE.md
45
AGENTS.md
@@ -2,6 +2,27 @@

 Apache Superset is a data visualization platform with Flask/Python backend and React/TypeScript frontend.

+## ⚠️ CRITICAL: Always Run Pre-commit Before Pushing
+
+**ALWAYS run `pre-commit run --all-files` before pushing commits.** CI will fail if pre-commit checks don't pass. This is non-negotiable.
+
+```bash
+# Stage your changes first
+git add .
+
+# Run pre-commit on all files
+pre-commit run --all-files
+
+# If there are auto-fixes, stage them and commit
+git add .
+git commit --amend  # or new commit
+```
+
+Common pre-commit failures:
+- **Formatting** - black, prettier, eslint will auto-fix
+- **Type errors** - mypy failures need manual fixes
+- **Linting** - ruff, pylint issues need manual fixes
+
 ## ⚠️ CRITICAL: Ongoing Refactors (What NOT to Do)

 **These migrations are actively happening - avoid deprecated patterns:**
@@ -80,6 +101,30 @@ superset/
 - **UPDATING.md**: Add breaking changes here
 - **Docstrings**: Required for new functions/classes

+## Developer Portal: Storybook-to-MDX Documentation
+
+The Developer Portal auto-generates MDX documentation from Storybook stories. **Stories are the single source of truth.**
+
+### Core Philosophy
+- **Fix issues in the STORY, not the generator** - When something doesn't render correctly, update the story file first
+- **Generator should be lightweight** - It extracts and passes through data; avoid special cases
+- **Stories define everything** - Props, controls, galleries, examples all come from story metadata
+
+### Story Requirements for Docs Generation
+- Use `export default { title: '...' }` (inline), not `const meta = ...; export default meta;`
+- Name interactive stories `Interactive${ComponentName}` (e.g., `InteractiveButton`)
+- Define `args` for default prop values
+- Define `argTypes` at the story level (not meta level) with control types and descriptions
+- Use `parameters.docs.gallery` for size×style variant grids
+- Use `parameters.docs.sampleChildren` for components that need children
+- Use `parameters.docs.liveExample` for custom live code blocks
+- Use `parameters.docs.staticProps` for complex object props that can't be parsed inline
+
+### Generator Location
+- Script: `docs/scripts/generate-superset-components.mjs`
+- Wrapper: `docs/src/components/StorybookWrapper.jsx`
+- Output: `docs/developer_portal/components/`
+
 ## Architecture Patterns

 ### Security & Features
@@ -49,3 +49,4 @@ under the License.
 - [4.1.3](./CHANGELOG/4.1.3.md)
 - [4.1.4](./CHANGELOG/4.1.4.md)
 - [5.0.0](./CHANGELOG/5.0.0.md)
+- [6.0.0](./CHANGELOG/6.0.0.md)
1062
CHANGELOG/6.0.0.md
Normal file
File diff suppressed because it is too large
26
Dockerfile
@@ -18,7 +18,7 @@
 ######################################################################
 # Node stage to deal with static asset construction
 ######################################################################
-ARG PY_VER=3.11.13-slim-trixie
+ARG PY_VER=3.11.14-slim-trixie

 # If BUILDPLATFORM is null, set it to 'amd64' (or leave as is otherwise).
 ARG BUILDPLATFORM=${BUILDPLATFORM:-amd64}
@@ -26,9 +26,6 @@ ARG BUILDPLATFORM=${BUILDPLATFORM:-amd64}
 # Include translations in the final build
 ARG BUILD_TRANSLATIONS="false"

-# Build arg to pre-populate examples DuckDB file
-ARG LOAD_EXAMPLES_DUCKDB="false"
-
 ######################################################################
 # superset-node-ci used as a base for building frontend assets and CI
 ######################################################################
@@ -146,9 +143,6 @@ RUN if [ "${BUILD_TRANSLATIONS}" = "true" ]; then \
 ######################################################################
 FROM python-base AS python-common

-# Re-declare build arg to receive it in this stage
-ARG LOAD_EXAMPLES_DUCKDB
-
 ENV SUPERSET_HOME="/app/superset_home" \
     HOME="/app/superset_home" \
     SUPERSET_ENV="production" \
@@ -160,7 +154,7 @@ ENV SUPERSET_HOME="/app/superset_home" \
 COPY --chmod=755 docker/entrypoints /app/docker/entrypoints

 WORKDIR /app
-# Set up necessary directories and user
+# Set up necessary directories
 RUN mkdir -p \
     ${PYTHONPATH} \
     superset/static \
@@ -171,6 +165,8 @@ RUN mkdir -p \
     && touch superset/static/version_info.json

 # Install Playwright and optionally setup headless browsers
+ENV PLAYWRIGHT_BROWSERS_PATH=/usr/local/share/playwright-browsers
+
 ARG INCLUDE_CHROMIUM="false"
 ARG INCLUDE_FIREFOX="false"
 RUN --mount=type=cache,target=${SUPERSET_HOME}/.cache/uv \
@@ -200,17 +196,9 @@ RUN /app/docker/apt-install.sh \
     libecpg-dev \
     libldap2-dev

-# Pre-load examples DuckDB file if requested
-RUN if [ "$LOAD_EXAMPLES_DUCKDB" = "true" ]; then \
-    mkdir -p /app/data && \
-    echo "Downloading pre-built examples.duckdb..." && \
-    curl -L -o /app/data/examples.duckdb \
-        "https://raw.githubusercontent.com/apache-superset/examples-data/master/examples.duckdb" && \
-    chown -R superset:superset /app/data; \
-    else \
-    mkdir -p /app/data && \
-    chown -R superset:superset /app/data; \
-    fi
+# Create data directory for DuckDB examples database
+# The database file will be created at runtime when examples are loaded from Parquet files
+RUN mkdir -p /app/data && chown -R superset:superset /app/data

 # Copy compiled things from previous stages
 COPY --from=superset-node /app/superset/static/assets superset/static/assets
20
INSTALL.md
@@ -16,8 +16,20 @@ KIND, either express or implied. See the License for the
 specific language governing permissions and limitations
 under the License.
 -->
-# INSTALL / BUILD instructions for Apache Superset
+# Installing Apache Superset

-At this time, the docker file at RELEASING/Dockerfile.from_local_tarball
-constitutes the recipe on how to get to a working release from a source
-release tarball.
+For comprehensive installation instructions, please see the Apache Superset documentation:
+
+**[📚 Installation Guide →](https://superset.apache.org/docs/installation/installation-methods)**
+
+The documentation covers:
+- [Docker Compose](https://superset.apache.org/docs/installation/docker-compose) (recommended for development)
+- [Kubernetes / Helm](https://superset.apache.org/docs/installation/kubernetes)
+- [PyPI](https://superset.apache.org/docs/installation/pypi)
+- [Docker Builds](https://superset.apache.org/docs/installation/docker-builds)
+- [Architecture Overview](https://superset.apache.org/docs/installation/architecture)
+
+## Building from Source
+
+For building from a source release tarball, see the Dockerfile at:
+`RELEASING/Dockerfile.from_local_tarball`
27
Makefile
@@ -18,7 +18,7 @@
 # Python version installed; we need 3.10-3.11
 PYTHON=`command -v python3.11 || command -v python3.10`

-.PHONY: install superset venv pre-commit
+.PHONY: install superset venv pre-commit up down logs ps nuke ports open

 install: superset pre-commit

@@ -112,3 +112,28 @@ report-celery-beat:

 admin-user:
 	superset fab create-admin

+# Docker Compose with auto-assigned ports (for running multiple instances)
+up:
+	./scripts/docker-compose-up.sh
+
+up-detached:
+	./scripts/docker-compose-up.sh -d
+
+down:
+	./scripts/docker-compose-up.sh down
+
+logs:
+	./scripts/docker-compose-up.sh logs -f
+
+ps:
+	./scripts/docker-compose-up.sh ps
+
+nuke:
+	./scripts/docker-compose-up.sh nuke
+
+ports:
+	./scripts/docker-compose-up.sh ports
+
+open:
+	./scripts/docker-compose-up.sh open
120
README.md
@@ -23,8 +23,12 @@ under the License.
|
||||
[](https://github.com/apache/superset/releases/latest)
|
||||
[](https://github.com/apache/superset/actions)
|
||||
[](https://badge.fury.io/py/apache_superset)
|
||||
[](https://codecov.io/github/apache/superset)
|
||||
[](https://pypi.python.org/pypi/apache_superset)
|
||||
[](https://github.com/apache/superset/stargazers)
|
||||
[](https://github.com/apache/superset/graphs/contributors)
|
||||
[](https://github.com/apache/superset/commits/master)
|
||||
[](https://github.com/apache/superset/issues)
|
||||
[](https://github.com/apache/superset/pulls)
|
||||
[](http://bit.ly/join-superset-slack)
|
||||
[](https://superset.apache.org)
|
||||
|
||||
@@ -51,7 +55,7 @@ A modern, enterprise-ready business intelligence web application.
|
||||
[**Get Involved**](#get-involved) |
|
||||
[**Contributor Guide**](#contributor-guide) |
|
||||
[**Resources**](#resources) |
|
||||
[**Organizations Using Superset**](https://github.com/apache/superset/blob/master/RESOURCES/INTHEWILD.md)
|
||||
[**Organizations Using Superset**](https://superset.apache.org/inTheWild)
|
||||
|
||||
## Why Superset?
|
||||
|
||||
@@ -85,7 +89,7 @@ Superset provides:
|
||||
|
||||
**Craft Beautiful, Dynamic Dashboards**
|
||||
|
||||
<kbd><img title="View Dashboards" src="https://superset.apache.org/img/screenshots/slack_dash.jpg"/></kbd><br/>
|
||||
<kbd><img title="View Dashboards" src="https://superset.apache.org/img/screenshots/dashboard.jpg"/></kbd><br/>
|
||||
|
||||
**No-Code Chart Builder**
|
||||
|
||||
@@ -97,51 +101,77 @@ Superset provides:
|
||||
|
||||
## Supported Databases
|
||||
|
||||
Superset can query data from any SQL-speaking datastore or data engine (Presto, Trino, Athena, [and more](https://superset.apache.org/docs/configuration/databases)) that has a Python DB-API driver and a SQLAlchemy dialect.
|
||||
Superset can query data from any SQL-speaking datastore or data engine (Presto, Trino, Athena, [and more](https://superset.apache.org/docs/databases)) that has a Python DB-API driver and a SQLAlchemy dialect.
|
||||
|
||||
Here are some of the major database solutions that are supported:
|
||||
|
||||
<!-- SUPPORTED_DATABASES_START -->
|
||||
<p align="center">
  <img src="https://superset.apache.org/img/databases/redshift.png" alt="redshift" border="0" width="200"/>
  <img src="https://superset.apache.org/img/databases/google-biquery.png" alt="google-bigquery" border="0" width="200"/>
  <img src="https://superset.apache.org/img/databases/snowflake.png" alt="snowflake" border="0" width="200"/>
  <img src="https://superset.apache.org/img/databases/trino.png" alt="trino" border="0" width="150" />
  <img src="https://superset.apache.org/img/databases/presto.png" alt="presto" border="0" width="200"/>
  <img src="https://superset.apache.org/img/databases/databricks.png" alt="databricks" border="0" width="160" />
  <img src="https://superset.apache.org/img/databases/druid.png" alt="druid" border="0" width="200" />
  <img src="https://superset.apache.org/img/databases/firebolt.png" alt="firebolt" border="0" width="200" />
  <img src="https://superset.apache.org/img/databases/timescale.png" alt="timescale" border="0" width="200" />
  <img src="https://superset.apache.org/img/databases/postgresql.png" alt="postgresql" border="0" width="200" />
  <img src="https://superset.apache.org/img/databases/mysql.png" alt="mysql" border="0" width="200" />
  <img src="https://superset.apache.org/img/databases/mssql-server.png" alt="mssql-server" border="0" width="200" />
  <img src="https://superset.apache.org/img/databases/ibm-db2.svg" alt="db2" border="0" width="220" />
  <img src="https://superset.apache.org/img/databases/sqlite.png" alt="sqlite" border="0" width="200" />
  <img src="https://superset.apache.org/img/databases/sybase.png" alt="sybase" border="0" width="200" />
  <img src="https://superset.apache.org/img/databases/mariadb.png" alt="mariadb" border="0" width="200" />
  <img src="https://superset.apache.org/img/databases/vertica.png" alt="vertica" border="0" width="200" />
  <img src="https://superset.apache.org/img/databases/oracle.png" alt="oracle" border="0" width="200" />
  <img src="https://superset.apache.org/img/databases/firebird.png" alt="firebird" border="0" width="200" />
  <img src="https://superset.apache.org/img/databases/greenplum.png" alt="greenplum" border="0" width="200" />
  <img src="https://superset.apache.org/img/databases/clickhouse.png" alt="clickhouse" border="0" width="200" />
  <img src="https://superset.apache.org/img/databases/exasol.png" alt="exasol" border="0" width="160" />
  <img src="https://superset.apache.org/img/databases/monet-db.png" alt="monet-db" border="0" width="200" />
  <img src="https://superset.apache.org/img/databases/apache-kylin.png" alt="apache-kylin" border="0" width="80"/>
  <img src="https://superset.apache.org/img/databases/hologres.png" alt="hologres" border="0" width="80"/>
  <img src="https://superset.apache.org/img/databases/netezza.png" alt="netezza" border="0" width="80"/>
  <img src="https://superset.apache.org/img/databases/pinot.png" alt="pinot" border="0" width="200" />
  <img src="https://superset.apache.org/img/databases/teradata.png" alt="teradata" border="0" width="200" />
  <img src="https://superset.apache.org/img/databases/yugabyte.png" alt="yugabyte" border="0" width="200" />
  <img src="https://superset.apache.org/img/databases/databend.png" alt="databend" border="0" width="200" />
  <img src="https://superset.apache.org/img/databases/starrocks.png" alt="starrocks" border="0" width="200" />
  <img src="https://superset.apache.org/img/databases/doris.png" alt="doris" border="0" width="200" />
  <img src="https://superset.apache.org/img/databases/oceanbase.svg" alt="oceanbase" border="0" width="220" />
  <img src="https://superset.apache.org/img/databases/sap-hana.png" alt="sap-hana" border="0" width="220" />
  <img src="https://superset.apache.org/img/databases/denodo.png" alt="denodo" border="0" width="200" />
  <img src="https://superset.apache.org/img/databases/ydb.svg" alt="ydb" border="0" width="200" />
  <img src="https://superset.apache.org/img/databases/tdengine.png" alt="TDengine" border="0" width="200" />
  <a href="https://superset.apache.org/docs/databases/supported/amazon-athena" title="Amazon Athena"><img src="docs/static/img/databases/amazon-athena.jpg" alt="Amazon Athena" width="76" height="40" /></a>
  <a href="https://superset.apache.org/docs/databases/supported/amazon-dynamodb" title="Amazon DynamoDB"><img src="docs/static/img/databases/aws.png" alt="Amazon DynamoDB" width="40" height="40" /></a>
  <a href="https://superset.apache.org/docs/databases/supported/amazon-redshift" title="Amazon Redshift"><img src="docs/static/img/databases/redshift.png" alt="Amazon Redshift" width="100" height="40" /></a>
  <a href="https://superset.apache.org/docs/databases/supported/apache-doris" title="Apache Doris"><img src="docs/static/img/databases/doris.png" alt="Apache Doris" width="103" height="40" /></a>
  <a href="https://superset.apache.org/docs/databases/supported/apache-drill" title="Apache Drill"><img src="docs/static/img/databases/apache-drill.png" alt="Apache Drill" width="81" height="40" /></a>
  <a href="https://superset.apache.org/docs/databases/supported/apache-druid" title="Apache Druid"><img src="docs/static/img/databases/druid.png" alt="Apache Druid" width="117" height="40" /></a>
  <a href="https://superset.apache.org/docs/databases/supported/apache-hive" title="Apache Hive"><img src="docs/static/img/databases/apache-hive.svg" alt="Apache Hive" width="44" height="40" /></a>
  <a href="https://superset.apache.org/docs/databases/supported/apache-impala" title="Apache Impala"><img src="docs/static/img/databases/apache-impala.png" alt="Apache Impala" width="21" height="40" /></a>
  <a href="https://superset.apache.org/docs/databases/supported/apache-kylin" title="Apache Kylin"><img src="docs/static/img/databases/apache-kylin.png" alt="Apache Kylin" width="44" height="40" /></a>
  <a href="https://superset.apache.org/docs/databases/supported/apache-pinot" title="Apache Pinot"><img src="docs/static/img/databases/apache-pinot.svg" alt="Apache Pinot" width="76" height="40" /></a>
  <a href="https://superset.apache.org/docs/databases/supported/apache-solr" title="Apache Solr"><img src="docs/static/img/databases/apache-solr.png" alt="Apache Solr" width="79" height="40" /></a>
  <a href="https://superset.apache.org/docs/databases/supported/apache-spark-sql" title="Apache Spark SQL"><img src="docs/static/img/databases/apache-spark.png" alt="Apache Spark SQL" width="75" height="40" /></a>
  <a href="https://superset.apache.org/docs/databases/supported/ascend" title="Ascend"><img src="docs/static/img/databases/ascend.webp" alt="Ascend" width="117" height="40" /></a>
  <a href="https://superset.apache.org/docs/databases/supported/aurora-mysql-data-api" title="Aurora MySQL (Data API)"><img src="docs/static/img/databases/mysql.png" alt="Aurora MySQL (Data API)" width="77" height="40" /></a>
  <a href="https://superset.apache.org/docs/databases/supported/aurora-postgresql-data-api" title="Aurora PostgreSQL (Data API)"><img src="docs/static/img/databases/postgresql.svg" alt="Aurora PostgreSQL (Data API)" width="76" height="40" /></a>
  <a href="https://superset.apache.org/docs/databases/supported/azure-data-explorer" title="Azure Data Explorer"><img src="docs/static/img/databases/kusto.png" alt="Azure Data Explorer" width="40" height="40" /></a>
  <a href="https://superset.apache.org/docs/databases/supported/azure-synapse" title="Azure Synapse"><img src="docs/static/img/databases/azure.svg" alt="Azure Synapse" width="40" height="40" /></a>
  <a href="https://superset.apache.org/docs/databases/supported/clickhouse" title="ClickHouse"><img src="docs/static/img/databases/clickhouse.png" alt="ClickHouse" width="150" height="37" /></a>
  <a href="https://superset.apache.org/docs/databases/supported/cloudflare-d1" title="Cloudflare D1"><img src="docs/static/img/databases/cloudflare.png" alt="Cloudflare D1" width="40" height="40" /></a>
  <a href="https://superset.apache.org/docs/databases/supported/cockroachdb" title="CockroachDB"><img src="docs/static/img/databases/cockroachdb.png" alt="CockroachDB" width="150" height="24" /></a>
  <a href="https://superset.apache.org/docs/databases/supported/couchbase" title="Couchbase"><img src="docs/static/img/databases/couchbase.svg" alt="Couchbase" width="150" height="35" /></a>
  <a href="https://superset.apache.org/docs/databases/supported/cratedb" title="CrateDB"><img src="docs/static/img/databases/cratedb.svg" alt="CrateDB" width="180" height="24" /></a>
  <a href="https://superset.apache.org/docs/databases/supported/databend" title="Databend"><img src="docs/static/img/databases/databend.png" alt="Databend" width="100" height="40" /></a>
  <a href="https://superset.apache.org/docs/databases/supported/databricks" title="Databricks"><img src="docs/static/img/databases/databricks.png" alt="Databricks" width="152" height="24" /></a>
  <a href="https://superset.apache.org/docs/databases/supported/denodo" title="Denodo"><img src="docs/static/img/databases/denodo.png" alt="Denodo" width="138" height="40" /></a>
  <a href="https://superset.apache.org/docs/databases/supported/dremio" title="Dremio"><img src="docs/static/img/databases/dremio.png" alt="Dremio" width="126" height="40" /></a>
  <a href="https://superset.apache.org/docs/databases/supported/duckdb" title="DuckDB"><img src="docs/static/img/databases/duckdb.png" alt="DuckDB" width="52" height="40" /></a>
  <a href="https://superset.apache.org/docs/databases/supported/elasticsearch" title="Elasticsearch"><img src="docs/static/img/databases/elasticsearch.png" alt="Elasticsearch" width="40" height="40" /></a>
  <a href="https://superset.apache.org/docs/databases/supported/exasol" title="Exasol"><img src="docs/static/img/databases/exasol.png" alt="Exasol" width="72" height="40" /></a>
  <a href="https://superset.apache.org/docs/databases/supported/firebird" title="Firebird"><img src="docs/static/img/databases/firebird.png" alt="Firebird" width="100" height="40" /></a>
  <a href="https://superset.apache.org/docs/databases/supported/firebolt" title="Firebolt"><img src="docs/static/img/databases/firebolt.png" alt="Firebolt" width="100" height="40" /></a>
  <a href="https://superset.apache.org/docs/databases/supported/google-bigquery" title="Google BigQuery"><img src="docs/static/img/databases/google-big-query.svg" alt="Google BigQuery" width="76" height="40" /></a>
  <a href="https://superset.apache.org/docs/databases/supported/google-sheets" title="Google Sheets"><img src="docs/static/img/databases/google-sheets.svg" alt="Google Sheets" width="76" height="40" /></a>
  <a href="https://superset.apache.org/docs/databases/supported/greenplum" title="Greenplum"><img src="docs/static/img/databases/greenplum.png" alt="Greenplum" width="124" height="40" /></a>
  <a href="https://superset.apache.org/docs/databases/supported/hologres" title="Hologres"><img src="docs/static/img/databases/hologres.png" alt="Hologres" width="44" height="40" /></a>
  <a href="https://superset.apache.org/docs/databases/supported/ibm-db2" title="IBM Db2"><img src="docs/static/img/databases/ibm-db2.svg" alt="IBM Db2" width="91" height="40" /></a>
  <a href="https://superset.apache.org/docs/databases/supported/ibm-netezza-performance-server" title="IBM Netezza Performance Server"><img src="docs/static/img/databases/netezza.png" alt="IBM Netezza Performance Server" width="40" height="40" /></a>
  <a href="https://superset.apache.org/docs/databases/supported/mariadb" title="MariaDB"><img src="docs/static/img/databases/mariadb.png" alt="MariaDB" width="150" height="37" /></a>
  <a href="https://superset.apache.org/docs/databases/supported/microsoft-sql-server" title="Microsoft SQL Server"><img src="docs/static/img/databases/msql.png" alt="Microsoft SQL Server" width="50" height="40" /></a>
  <a href="https://superset.apache.org/docs/databases/supported/monetdb" title="MonetDB"><img src="docs/static/img/databases/monet-db.png" alt="MonetDB" width="100" height="40" /></a>
  <a href="https://superset.apache.org/docs/databases/supported/mongodb" title="MongoDB"><img src="docs/static/img/databases/mongodb.png" alt="MongoDB" width="150" height="38" /></a>
  <a href="https://superset.apache.org/docs/databases/supported/motherduck" title="MotherDuck"><img src="docs/static/img/databases/motherduck.png" alt="MotherDuck" width="40" height="40" /></a>
  <a href="https://superset.apache.org/docs/databases/supported/oceanbase" title="OceanBase"><img src="docs/static/img/databases/oceanbase.svg" alt="OceanBase" width="175" height="24" /></a>
  <a href="https://superset.apache.org/docs/databases/supported/oracle" title="Oracle"><img src="docs/static/img/databases/oraclelogo.png" alt="Oracle" width="111" height="40" /></a>
  <a href="https://superset.apache.org/docs/databases/supported/presto" title="Presto"><img src="docs/static/img/databases/presto-og.png" alt="Presto" width="127" height="40" /></a>
  <a href="https://superset.apache.org/docs/databases/supported/risingwave" title="RisingWave"><img src="docs/static/img/databases/risingwave.svg" alt="RisingWave" width="147" height="40" /></a>
  <a href="https://superset.apache.org/docs/databases/supported/sap-hana" title="SAP HANA"><img src="docs/static/img/databases/sap-hana.png" alt="SAP HANA" width="137" height="40" /></a>
  <a href="https://superset.apache.org/docs/databases/supported/sap-sybase" title="SAP Sybase"><img src="docs/static/img/databases/sybase.png" alt="SAP Sybase" width="100" height="40" /></a>
  <a href="https://superset.apache.org/docs/databases/supported/shillelagh" title="Shillelagh"><img src="docs/static/img/databases/shillelagh.png" alt="Shillelagh" width="40" height="40" /></a>
  <a href="https://superset.apache.org/docs/databases/supported/singlestore" title="SingleStore"><img src="docs/static/img/databases/singlestore.png" alt="SingleStore" width="150" height="31" /></a>
  <a href="https://superset.apache.org/docs/databases/supported/snowflake" title="Snowflake"><img src="docs/static/img/databases/snowflake.svg" alt="Snowflake" width="76" height="40" /></a>
  <a href="https://superset.apache.org/docs/databases/supported/sqlite" title="SQLite"><img src="docs/static/img/databases/sqlite.png" alt="SQLite" width="84" height="40" /></a>
  <a href="https://superset.apache.org/docs/databases/supported/starrocks" title="StarRocks"><img src="docs/static/img/databases/starrocks.png" alt="StarRocks" width="149" height="40" /></a>
  <a href="https://superset.apache.org/docs/databases/supported/superset-meta-database" title="Superset meta database"><img src="docs/static/img/databases/superset.svg" alt="Superset meta database" width="150" height="39" /></a>
  <a href="https://superset.apache.org/docs/databases/supported/tdengine" title="TDengine"><img src="docs/static/img/databases/tdengine.png" alt="TDengine" width="140" height="40" /></a>
  <a href="https://superset.apache.org/docs/databases/supported/teradata" title="Teradata"><img src="docs/static/img/databases/teradata.png" alt="Teradata" width="124" height="40" /></a>
  <a href="https://superset.apache.org/docs/databases/supported/timescaledb" title="TimescaleDB"><img src="docs/static/img/databases/timescale.png" alt="TimescaleDB" width="150" height="36" /></a>
  <a href="https://superset.apache.org/docs/databases/supported/trino" title="Trino"><img src="docs/static/img/databases/trino.png" alt="Trino" width="89" height="40" /></a>
  <a href="https://superset.apache.org/docs/databases/supported/vertica" title="Vertica"><img src="docs/static/img/databases/vertica.png" alt="Vertica" width="128" height="40" /></a>
  <a href="https://superset.apache.org/docs/databases/supported/ydb" title="YDB"><img src="docs/static/img/databases/ydb.svg" alt="YDB" width="110" height="40" /></a>
  <a href="https://superset.apache.org/docs/databases/supported/yugabytedb" title="YugabyteDB"><img src="docs/static/img/databases/yugabyte.png" alt="YugabyteDB" width="150" height="26" /></a>
</p>
<!-- SUPPORTED_DATABASES_END -->

**A more comprehensive list of supported databases** along with the configuration instructions can be found [here](https://superset.apache.org/docs/configuration/databases).
**A more comprehensive list of supported databases** along with the configuration instructions can be found [here](https://superset.apache.org/docs/databases).

Want to add support for your datastore or data engine? Read more [here](https://superset.apache.org/docs/frequently-asked-questions#does-superset-work-with-insert-database-engine-here) about the technical requirements.
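The two requirements above — a Python DB-API driver and a SQLAlchemy dialect — can be sanity-checked before wiring a new engine into Superset. A minimal sketch, using the stdlib `sqlite3` module as a stand-in DB-API 2.0 driver (any conforming driver exposes the same surface):

```python
import sqlite3  # stdlib DB-API 2.0 driver; substitute your engine's driver module

# A conforming DB-API module advertises its interface level.
assert sqlite3.apilevel == "2.0"

# connect() / cursor() / execute() / fetchall() is the contract that
# SQLAlchemy dialects (and therefore Superset) build on top of.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("SELECT 1 + 1")
print(cur.fetchall())  # [(2,)]
conn.close()
```

If your driver passes this kind of smoke test, the remaining work is making sure a SQLAlchemy dialect for it is installed so that a `dialect+driver://...` URI resolves.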

@@ -161,14 +191,14 @@ Try out Superset's [quickstart](https://superset.apache.org/docs/quickstart/) gu
## Contributor Guide

Interested in contributing? Check out our
[CONTRIBUTING.md](https://github.com/apache/superset/blob/master/CONTRIBUTING.md)
[Developer Portal](https://superset.apache.org/developer_portal/)
to find resources around contributing along with a detailed guide on
how to set up a development environment.

## Resources

- [Superset "In the Wild"](https://github.com/apache/superset/blob/master/RESOURCES/INTHEWILD.md) - open a PR to add your org to the list!
- [Feature Flags](https://github.com/apache/superset/blob/master/RESOURCES/FEATURE_FLAGS.md) - the status of Superset's Feature Flags.
- [Superset "In the Wild"](https://superset.apache.org/inTheWild) - see who's using Superset, and [add your organization](https://github.com/apache/superset/edit/master/RESOURCES/INTHEWILD.yaml) to the list!
- [Feature Flags](https://superset.apache.org/docs/configuration/feature-flags) - the status of Superset's Feature Flags.
- [Standard Roles](https://github.com/apache/superset/blob/master/RESOURCES/STANDARD_ROLES.md) - How RBAC permissions map to roles.
- [Superset Wiki](https://github.com/apache/superset/wiki) - Tons of additional community resources: best practices, community content and other information.
- [Superset SIPs](https://github.com/orgs/apache/projects/170) - The status of Superset's SIPs (Superset Improvement Proposals) for both consensus and implementation status.

@@ -92,7 +92,7 @@ Some of the new features in this release are disabled by default. Each has a fea

| Feature | Feature Flag | Dependencies | Documentation |
| --- | --- | --- | --- |
| Global Async Queries | `GLOBAL_ASYNC_QUERIES: True` | Redis 5.0+, celery workers configured and running | [Extra documentation](https://github.com/apache/superset/blob/master/CONTRIBUTING.md#async-chart-queries) |
| Global Async Queries | `GLOBAL_ASYNC_QUERIES: True` | Redis 5.0+, celery workers configured and running | [Extra documentation](https://superset.apache.org/docs/contributing/misc#async-chart-queries) |
| Dashboard Native Filters | `DASHBOARD_NATIVE_FILTERS: True` | | |
| Alerts & Reporting | `ALERT_REPORTS: True` | [Celery workers configured & celery beat process](https://superset.apache.org/docs/installation/async-queries-celery) | |
| Homescreen Thumbnails | `THUMBNAILS: True, THUMBNAIL_CACHE_CONFIG: CacheConfig = { "CACHE_TYPE": "null", "CACHE_NO_NULL_WARNING": True}` | selenium, pillow 7, celery | |
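Each flag in the table is set through the `FEATURE_FLAGS` dictionary in `superset_config.py`. A minimal sketch using flag names from the table above — illustrative only, not a complete setup (enabling `GLOBAL_ASYNC_QUERIES` additionally requires Redis 5.0+ and running celery workers, per the Dependencies column):

```python
# superset_config.py -- illustrative sketch, not a complete configuration
FEATURE_FLAGS = {
    "GLOBAL_ASYNC_QUERIES": True,    # also needs Redis 5.0+ and celery workers
    "DASHBOARD_NATIVE_FILTERS": True,
    "ALERT_REPORTS": True,           # also needs celery workers + celery beat
}
```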

@@ -1,103 +0,0 @@
<!--
Licensed to the Apache Software Foundation (ASF) under one
or more contributor license agreements. See the NOTICE file
distributed with this work for additional information
regarding copyright ownership. The ASF licenses this file
to you under the Apache License, Version 2.0 (the
"License"); you may not use this file except in compliance
with the License. You may obtain a copy of the License at

http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing,
software distributed under the License is distributed on an
"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
KIND, either express or implied. See the License for the
specific language governing permissions and limitations
under the License.
-->

# Superset Feature Flags

This is a list of Superset's optional features. See `config.py` for default values. Each feature can be turned on or off by setting it to `True` or `False` in your `superset_config.py`.
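The override mechanics can be sketched as a plain dict merge: the defaults shipped in `config.py` are updated with whatever your `superset_config.py` sets. The default values below are illustrative stand-ins, not Superset's actual defaults:

```python
# Illustrative defaults standing in for the ones shipped in config.py.
DEFAULT_FEATURE_FLAGS = {"TAGGING_SYSTEM": False, "DRILL_BY": True}

# What a deployment's superset_config.py might set.
FEATURE_FLAGS = {"TAGGING_SYSTEM": True}

# Deployment values win over defaults.
effective = {**DEFAULT_FEATURE_FLAGS, **FEATURE_FLAGS}
print(effective)  # {'TAGGING_SYSTEM': True, 'DRILL_BY': True}
```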

## In Development

These features are considered **unfinished** and should only be used in development environments.

[//]: # "PLEASE KEEP THE LIST SORTED ALPHABETICALLY"

- ALERT_REPORT_TABS
- CHART_PLUGINS_EXPERIMENTAL
- DATE_RANGE_TIMESHIFTS_ENABLED
- ENABLE_ADVANCED_DATA_TYPES
- PRESTO_EXPAND_DATA
- SHARE_QUERIES_VIA_KV_STORE
- TAGGING_SYSTEM

## In Testing

These features are **finished** but currently being tested. They are usable, but may still contain some bugs.

[//]: # "PLEASE KEEP THE LIST SORTED ALPHABETICALLY"

- ALERT_REPORTS [(docs)](https://superset.apache.org/docs/configuration/alerts-reports)
- ALLOW_FULL_CSV_EXPORT
- CACHE_IMPERSONATION
- CONFIRM_DASHBOARD_DIFF
- DATE_FORMAT_IN_EMAIL_SUBJECT [(docs)](https://superset.apache.org/docs/configuration/alerts-reports#commons)
- DYNAMIC_PLUGINS
- ENABLE_SUPERSET_META_DB [(docs)](https://superset.apache.org/docs/configuration/databases/#querying-across-databases)
- ESTIMATE_QUERY_COST
- GLOBAL_ASYNC_QUERIES [(docs)](https://github.com/apache/superset/blob/master/CONTRIBUTING.md#async-chart-queries)
- IMPERSONATE_WITH_EMAIL_PREFIX
- PLAYWRIGHT_REPORTS_AND_THUMBNAILS
- RLS_IN_SQLLAB
- SSH_TUNNELING [(docs)](https://superset.apache.org/docs/configuration/setup-ssh-tunneling)
- USE_ANALAGOUS_COLORS

## Stable

These feature flags are **safe for production**. They have been tested and will be supported for at least the current major version cycle.

[//]: # "PLEASE KEEP THESE LISTS SORTED ALPHABETICALLY"

### Flags on the path to feature launch and flag deprecation/removal

- DASHBOARD_VIRTUALIZATION

### Flags retained for runtime configuration

Currently, some of our feature flags act as dynamic configurations that can be changed
on the fly. This contradicts the typical ephemeral feature flag use case,
where a flag is used to mature a feature and is eventually deprecated once the feature is
solid. Eventually we'll likely refactor these into a more formal "dynamic configurations" framework, managed
independently. That new framework will also allow for non-boolean configurations.

- ALERTS_ATTACH_REPORTS
- ALLOW_ADHOC_SUBQUERY
- DASHBOARD_RBAC [(docs)](https://superset.apache.org/docs/using-superset/creating-your-first-dashboard#manage-access-to-dashboards)
- DATAPANEL_CLOSED_BY_DEFAULT
- DRILL_BY
- DRUID_JOINS
- EMBEDDABLE_CHARTS
- EMBEDDED_SUPERSET
- ENABLE_TEMPLATE_PROCESSING
- ESCAPE_MARKDOWN_HTML
- LISTVIEWS_DEFAULT_CARD_VIEW
- SCHEDULED_QUERIES [(docs)](https://superset.apache.org/docs/configuration/alerts-reports)
- SLACK_ENABLE_AVATARS (see `superset/config.py` for more information)
- SQLLAB_BACKEND_PERSISTENCE
- SQL_VALIDATORS_BY_ENGINE [(docs)](https://superset.apache.org/docs/configuration/sql-templating)
- THUMBNAILS [(docs)](https://superset.apache.org/docs/configuration/cache)

## Deprecated Flags

These feature flags currently default to True and **will be removed in a future major release**. For the current release you can turn them off by setting your config to False, but it is advised to remove these flags from your local configuration or set them to **True** so that you do not experience any unexpected changes in a future release.

[//]: # "PLEASE KEEP THE LIST SORTED ALPHABETICALLY"

- AVOID_COLORS_COLLISION
- DRILL_TO_DETAIL
- ENABLE_JAVASCRIPT_CONTROLS
- KV_STORE

@@ -1,226 +0,0 @@

## Superset Users in the Wild

Here's a list of organizations, broken down into broad industry categories, that have taken the time to send a PR to let
the world know they are using Apache Superset. If you are a user and want to be recognized,
all you have to do is file a simple PR [like this one](https://github.com/apache/superset/pull/10122) — [just click here](https://github.com/apache/superset/edit/master/RESOURCES/INTHEWILD.md) to do so. If you think
the categorization is inaccurate, please file a PR with your correction as well.
Join our growing community!

### Sharing Economy

- [Airbnb](https://github.com/airbnb)
- [Faasos](https://faasos.com/) [@shashanksingh]
- [Free2Move](https://www.free2move.com/) [@PaoloTerzi]
- [Hostnfly](https://www.hostnfly.com/) [@alexisrosuel]
- [Lime](https://www.li.me/) [@cxmcc]
- [Lyft](https://www.lyft.com/)
- [Ontruck](https://www.ontruck.com/)

### Financial Services

- [Aktia Bank plc](https://www.aktia.com)
- [American Express](https://www.americanexpress.com) [@TheLastSultan]
- [bumper](https://www.bumper.co/) [@vasu-ram, @JamiePercival]
- [Cape Crypto](https://capecrypto.com)
- [Capital Service S.A.](https://capitalservice.pl) [@pkonarzewski]
- [Clark.de](https://clark.de/)
- [Cover Genius](https://covergenius.com/)
- [Europace](https://europace.de)
- [KarrotPay](https://www.daangnpay.com/)
- [Remita](https://remita.net) [@mujibishola]
- [Taveo](https://www.taveo.com) [@codek]
- [Unit](https://www.unit.co/about-us) [@amitmiran137]
- [Wise](https://wise.com) [@koszti]
- [Xendit](https://xendit.co/) [@LieAlbertTriAdrian]

### Gaming

- [Popoko VM Games Studio](https://popoko.live)

### E-Commerce

- [AiHello](https://www.aihello.com) [@ganeshkrishnan1]
- [Bazaar Technologies](https://www.bazaartech.com) [@umair-abro]
- [Dragonpass](https://www.dragonpass.com.cn/) [@zhxjdwh]
- [Dropit Shopping](https://www.dropit.shop/) [@dropit-dev]
- [Fanatics](https://www.fanatics.com/) [@coderfender]
- [Fordeal](https://www.fordeal.com) [@Renkai]
- [Fynd](https://www.fynd.com/) [@darpanjain07]
- [GFG - Global Fashion Group](https://global-fashion-group.com) [@ksaagariconic]
- [GoTo/Gojek](https://www.gojek.io/) [@gwthm-in]
- [HuiShouBao](https://www.huishoubao.com/) [@Yukinoshita-Yukino]
- [Now](https://www.now.vn/) [@davidkohcw]
- [Qunar](https://www.qunar.com/) [@flametest]
- [Rakuten Viki](https://www.viki.com)
- [Shopee](https://shopee.sg) [@xiaohanyu]
- [Shopkick](https://www.shopkick.com) [@LAlbertalli]
- [ShopUp](https://www.shopup.org/) [@gwthm-in]
- [Tails.com](https://tails.com/gb/) [@alanmcruickshank]
- [THE ICONIC](https://theiconic.com.au/) [@ksaagariconic]
- [Utair](https://www.utair.ru) [@utair-digital]
- [VkusVill](https://vkusvill.ru/) [@ETselikov]
- [Zalando](https://www.zalando.com) [@dmigo]
- [Zalora](https://www.zalora.com) [@ksaagariconic]
- [Zepto](https://www.zeptonow.com/) [@gwthm-in]

### Enterprise Technology

- [A3Data](https://a3data.com.br) [@neylsoncrepalde]
- [Analytics Aura](https://analyticsaura.com/) [@Analytics-Aura]
- [Apollo GraphQL](https://www.apollographql.com/) [@evans]
- [Astronomer](https://www.astronomer.io) [@ryw]
- [Avesta Technologies](https://avestatechnologies.com/) [@TheRum]
- [Caizin](https://caizin.com/) [@tejaskatariya]
- [Canonical](https://canonical.com)
- [Careem](https://www.careem.com/) [@samraHanif0340]
- [Cloudsmith](https://cloudsmith.io) [@alancarson]
- [Cyberhaven](https://www.cyberhaven.com/) [@toliver-ch]
- [Deepomatic](https://deepomatic.com/) [@Zanoellia]
- [Dial Once](https://www.dial-once.com/)
- [Dremio](https://dremio.com) [@narendrans]
- [EFinance](https://www.efinance.com.eg) [@habeeb556]
- [Elestio](https://elest.io/) [@kaiwalyakoparkar]
- [ELMO Cloud HR & Payroll](https://elmosoftware.com.au/)
- [Endress+Hauser](https://www.endress.com/) [@rumbin]
- [FBK - ICT center](https://ict.fbk.eu)
- [Formbricks](https://formbricks.com)
- [Gavagai](https://gavagai.io) [@gavagai-corp]
- [GfK Data Lab](https://www.gfk.com/home) [@mherr]
- [HPE](https://www.hpe.com/in/en/home.html) [@anmol-hpe]
- [Hydrolix](https://www.hydrolix.io/)
- [Intercom](https://www.intercom.com/) [@kate-gallo]
- [jampp](https://jampp.com/)
- [Konfío](https://konfio.mx) [@uis-rodriguez]
- [Mainstrat](https://mainstrat.com/)
- [mishmash io](https://mishmash.io/) [@mishmash-io]
- [Myra Labs](https://www.myralabs.com/) [@viksit]
- [Nielsen](https://www.nielsen.com/) [@amitNielsen]
- [Ona](https://ona.io) [@pld]
- [Orange](https://www.orange.com) [@icsu]
- [Oslandia](https://oslandia.com)
- [Oxylabs](https://oxylabs.io/) [@rytis-ulys]
- [Peak AI](https://www.peak.ai/) [@azhar22k]
- [PeopleDoc](https://www.people-doc.com) [@rodo]
- [PlaidCloud](https://www.plaidcloud.com)
- [Preset, Inc.](https://preset.io)
- [PubNub](https://pubnub.com) [@jzucker2]
- [ReadyTech](https://www.readytech.io)
- [Reward Gateway](https://www.rewardgateway.com)
- [RIADVICE](https://riadvice.tn) [@riadvice]
- [ScopeAI](https://www.getscopeai.com) [@iloveluce]
- [shipmnts](https://shipmnts.com)
- [Showmax](https://showmax.com) [@bobek]
- [SingleStore](https://www.singlestore.com/)
- [TechAudit](https://www.techaudit.info) [@ETselikov]
- [Tenable](https://www.tenable.com) [@dflionis]
- [Tentacle](https://www.linkedin.com/company/tentacle-cmi/) [@jdclarke5]
- [timbr.ai](https://timbr.ai/) [@semantiDan]
- [Tobii](https://www.tobii.com/) [@dwa]
- [Tooploox](https://www.tooploox.com/) [@jakubczaplicki]
- [Unvired](https://unvired.com) [@srinisubramanian]
- [Virtuoso QA](https://www.virtuosoqa.com)
- [Whale](https://whale.im)
- [Windsor.ai](https://www.windsor.ai/) [@octaviancorlade]
- [WinWin Network马上赢](https://brandct.cn/) [@wenbinye]
- [Zeta](https://www.zeta.tech/) [@shaikidris]

### Media & Entertainment

- [6play](https://www.6play.fr) [@CoryChaplin]
- [bilibili](https://www.bilibili.com) [@Moinheart]
- [BurdaForward](https://www.burda-forward.de/en/)
- [Douban](https://www.douban.com/) [@luchuan]
- [Kuaishou](https://www.kuaishou.com/) [@zhaoyu89730105]
- [Netflix](https://www.netflix.com/)
- [Prensa Iberica](https://www.prensaiberica.es/) [@zamar-roura]
- [TME QQMUSIC/WESING](https://www.tencentmusic.com/) [@shenyuanli,@marklaw]
- [Xite](https://xite.com/) [@shashankkoppar]
- [Zaihang](https://www.zaih.com/)

### Education

- [Aveti Learning](https://avetilearning.com/) [@TheShubhendra]
- [Brilliant.org](https://brilliant.org/)
- [Open edX](https://openedx.org/)
- [Platzi.com](https://platzi.com/)
- [Sunbird](https://www.sunbird.org/) [@eksteporg]
- [The GRAPH Network](https://thegraphnetwork.org/) [@fccoelho]
- [Udemy](https://www.udemy.com/) [@sungjuly]
- [VIPKID](https://www.vipkid.com.cn/) [@illpanda]
- [WikiMedia Foundation](https://wikimediafoundation.org) [@vg]

### Energy

- [Airboxlab](https://foobot.io) [@antoine-galataud]
- [DouroECI](https://www.douroeci.com/) [@nunohelibeires]
- [Safaricom](https://www.safaricom.co.ke/) [@mmutiso]
- [Scoot](https://scoot.co/) [@haaspt]
- [Wattbewerb](https://wattbewerb.de/) [@wattbewerb]

### Healthcare

- [2070Health](https://2070health.com/)
- [Amino](https://amino.com) [@shkr]
- [Bluesquare](https://www.bluesquarehub.com/) [@madewulf]
- [Care](https://www.getcare.io/) [@alandao2021]
- [Living Goods](https://www.livinggoods.org) [@chelule]
- [Maieutical Labs](https://maieuticallabs.it) [@xrmx]
- [Medic](https://medic.org) [@1yuv]
- [REDCap Cloud](https://www.redcapcloud.com/)
- [TrustMedis](https://trustmedis.com/) [@famasya]
- [WeSure](https://www.wesure.cn/)

### HR / Staffing
|
||||
|
||||
- [Swile](https://www.swile.co/) [@PaoloTerzi]
|
||||
- [Symmetrics](https://www.symmetrics.fyi)
|
||||
- [bluquist](https://bluquist.com/)
|
||||
|
||||
### Government
|
||||
|
||||
- [City of Ann Arbor, MI](https://www.a2gov.org/) [@sfirke]
|
||||
- [RIS3 Strategy of CZ, MIT CR](https://www.ris3.cz/) [@RIS3CZ]
|
||||
- [NRLM - Sarathi, India](https://pib.gov.in/PressReleasePage.aspx?PRID=1999586)
|
||||
|
||||
### Travel
|
||||
|
||||
- [Agoda](https://www.agoda.com/) [@lostseaway, @maiake, @obombayo]
|
||||
- [HomeToGo](https://hometogo.com/) [@pedromartinsteenstrup]
|
||||
- [Skyscanner](https://www.skyscanner.net/) [@cleslie, @stanhoucke]
|
||||
|
||||
### Others
|
||||
|
||||
- [10Web](https://10web.io/)
|
||||
- [AI inside](https://inside.ai/en/)
|
||||
- [Automattic](https://automattic.com/) [@Khrol, @Usiel]
|
||||
- [Dropbox](https://www.dropbox.com/) [@bkyryliuk]
|
||||
- [Flowbird](https://flowbird.com) [@EmmanuelCbd]
|
||||
- [GEOTAB](https://www.geotab.com) [@JZ6]
|
||||
- [Grassroot](https://www.grassrootinstitute.org/)
|
||||
- [Increff](https://www.increff.com/) [@ishansinghania]
|
||||
- [komoot](https://www.komoot.com/) [@christophlingg]
|
||||
- [Let's Roam](https://www.letsroam.com/)
|
||||
- [Machrent SA](https://www.machrent.com/)
|
||||
- [Onebeat](https://1beat.com/) [@GuyAttia]
|
||||
- [X](https://x.com/)
|
||||
- [VLMedia](https://www.vlmedia.com.tr/) [@ibotheperfect]
|
||||
- [Yahoo!](https://yahoo.com/)
|
||||
686
RESOURCES/INTHEWILD.yaml
Normal file
@@ -0,0 +1,686 @@
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership. The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License. You may obtain a copy of the License at
#
#   http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.

# Apache Superset Users in the Wild
#
# To add your organization:
#   1. Find the appropriate category (or add a new one)
#   2. Add an entry with your organization details
#   3. Optionally add a logo file to docs/static/img/logos/
#
# Required fields:
#   - name: Your organization name
#   - url: Link to your organization's website
#
# Optional fields:
#   - logo: Filename of logo in docs/static/img/logos/ (e.g., "mycompany.svg")
#   - contributors: List of GitHub usernames who contributed (e.g., ["@username"])

categories:
  Sharing Economy:
    - name: Airbnb
      url: https://github.com/airbnb

    - name: Faasos
      url: https://faasos.com/
      contributors: ["@shashanksingh"]

    - name: Free2Move
      url: https://www.free2move.com/
      contributors: ["@PaoloTerzi"]

    - name: Hostnfly
      url: https://www.hostnfly.com/
      contributors: ["@alexisrosuel"]

    - name: Lime
      url: https://www.li.me/
      contributors: ["@cxmcc"]

    - name: Lyft
      url: https://www.lyft.com/

    - name: Ontruck
      url: https://www.ontruck.com/

  Financial Services:
    - name: Aktia Bank plc
      url: https://www.aktia.com

    - name: American Express
      url: https://www.americanexpress.com
      contributors: ["@TheLastSultan"]

    - name: bumper
      url: https://www.bumper.co/
      contributors: ["@vasu-ram", "@JamiePercival"]

    - name: Cape Crypto
      url: https://capecrypto.com

    - name: Capital Service S.A.
      url: https://capitalservice.pl
      contributors: ["@pkonarzewski"]

    - name: Clark.de
      url: https://clark.de/

    - name: EnquiryLabs
      url: https://www.enquirylabs.co.uk

    - name: Europace
      url: https://europace.de

    - name: KarrotPay
      url: https://www.daangnpay.com/

    - name: Remita
      url: https://remita.net
      contributors: ["@mujibishola"]

    - name: Taveo
      url: https://www.taveo.com
      contributors: ["@codek"]

    - name: Unit
      url: https://www.unit.co/about-us
      contributors: ["@amitmiran137"]

    - name: Wise
      url: https://wise.com
      contributors: ["@koszti"]

    - name: Xendit
      url: https://xendit.co/
      contributors: ["@LieAlbertTriAdrian"]

    - name: Cover Genius
      url: https://covergenius.com/

  Gaming:
    - name: Popoko VM Games Studio
      url: https://popoko.live

  E-Commerce:
    - name: AiHello
      url: https://www.aihello.com
      contributors: ["@ganeshkrishnan1"]

    - name: Bazaar Technologies
      url: https://www.bazaartech.com
      contributors: ["@umair-abro"]

    - name: Blinkit
      url: https://www.blinkit.com/
      contributors: ["@amsharm2"]

    - name: Dragonpass
      url: https://www.dragonpass.com.cn/
      contributors: ["@zhxjdwh"]

    - name: Dropit Shopping
      url: https://www.dropit.shop/
      contributors: ["@dropit-dev"]

    - name: Fordeal
      url: https://www.fordeal.com
      contributors: ["@Renkai"]

    - name: Fynd
      url: https://www.fynd.com/
      contributors: ["@darpanjain07"]

    - name: GFG - Global Fashion Group
      url: https://global-fashion-group.com
      contributors: ["@ksaagariconic"]

    - name: GoTo/Gojek
      url: https://www.gojek.io/
      contributors: ["@gwthm-in"]

    - name: HuiShouBao
      url: https://www.huishoubao.com/
      contributors: ["@Yukinoshita-Yukino"]

    - name: Now
      url: https://www.now.vn/
      contributors: ["@davidkohcw"]

    - name: Qunar
      url: https://www.qunar.com/
      contributors: ["@flametest"]

    - name: Rakuten Viki
      url: https://www.viki.com

    - name: Shopee
      url: https://shopee.sg
      contributors: ["@xiaohanyu"]

    - name: Shopkick
      url: https://www.shopkick.com
      contributors: ["@LAlbertalli"]

    - name: ShopUp
      url: https://www.shopup.org/
      contributors: ["@gwthm-in"]

    - name: Tails.com
      url: https://tails.com/gb/
      contributors: ["@alanmcruickshank"]

    - name: THE ICONIC
      url: https://theiconic.com.au/
      contributors: ["@ksaagariconic"]

    - name: Utair
      url: https://www.utair.ru
      contributors: ["@utair-digital"]

    - name: VkusVill
      url: https://vkusvill.ru/
      contributors: ["@ETselikov"]

    - name: Zalando
      url: https://www.zalando.com
      contributors: ["@dmigo"]

    - name: Zalora
      url: https://www.zalora.com
      contributors: ["@ksaagariconic"]

    - name: Zepto
      url: https://www.zeptonow.com/
      contributors: ["@gwthm-in"]

  Enterprise Technology:
    - name: A3Data
      url: https://a3data.com.br
      contributors: ["@neylsoncrepalde"]

    - name: Analytics Aura
      url: https://analyticsaura.com/
      contributors: ["@Analytics-Aura"]

    - name: Apollo GraphQL
      url: https://www.apollographql.com/
      contributors: ["@evans"]

    - name: Astronomer
      url: https://www.astronomer.io
      contributors: ["@ryw"]

    - name: Avesta Technologies
      url: https://avestatechnologies.com/
      contributors: ["@TheRum"]

    - name: Caizin
      url: https://caizin.com/
      contributors: ["@tejaskatariya"]

    - name: Canonical
      url: https://canonical.com

    - name: Careem
      url: https://www.careem.com/
      contributors: ["@samraHanif0340"]

    - name: Cloudsmith
      url: https://cloudsmith.io
      contributors: ["@alancarson"]

    - name: Cyberhaven
      url: https://www.cyberhaven.com/
      contributors: ["@toliver-ch"]

    - name: Deepomatic
      url: https://deepomatic.com/
      contributors: ["@Zanoellia"]

    - name: Dial Once
      url: https://www.dial-once.com/

    - name: Dremio
      url: https://dremio.com
      contributors: ["@narendrans"]

    - name: EFinance
      url: https://www.efinance.com.eg
      contributors: ["@habeeb556"]

    - name: Elestio
      url: https://elest.io/
      contributors: ["@kaiwalyakoparkar"]

    - name: ELMO Cloud HR & Payroll
      url: https://elmosoftware.com.au/

    - name: Endress+Hauser
      url: https://www.endress.com/
      contributors: ["@rumbin"]

    - name: FBK - ICT center
      url: https://ict.fbk.eu

    - name: Formbricks
      url: https://formbricks.com

    - name: Gavagai
      url: https://gavagai.io
      contributors: ["@gavagai-corp"]

    - name: GfK Data Lab
      url: https://www.gfk.com/home
      contributors: ["@mherr"]

    # Logo approved by @anmol-hpe on behalf of HPE
    - name: HPE
      url: https://www.hpe.com/in/en/home.html
      logo: hpe.png
      contributors: ["@anmol-hpe"]

    - name: Hydrolix
      url: https://www.hydrolix.io/

    - name: Intercom
      url: https://www.intercom.com/
      contributors: ["@kate-gallo"]

    - name: jampp
      url: https://jampp.com/

    - name: Konfío
      url: https://konfio.mx
      contributors: ["@uis-rodriguez"]

    - name: Mainstrat
      url: https://mainstrat.com/

    - name: mishmash io
      url: https://mishmash.io/
      contributors: ["@mishmash-io"]

    - name: Myra Labs
      url: https://www.myralabs.com/
      contributors: ["@viksit"]

    - name: Nielsen
      url: https://www.nielsen.com/
      contributors: ["@amitNielsen"]

    - name: Ona
      url: https://ona.io
      contributors: ["@pld"]

    - name: Orange
      url: https://www.orange.com
      contributors: ["@icsu"]

    - name: Oslandia
      url: https://oslandia.com

    - name: Oxylabs
      url: https://oxylabs.io/
      contributors: ["@rytis-ulys"]

    - name: Peak AI
      url: https://www.peak.ai/
      contributors: ["@azhar22k"]

    - name: PeopleDoc
      url: https://www.people-doc.com
      contributors: ["@rodo"]

    - name: PlaidCloud
      url: https://plaidcloud.com
      logo: plaidcloud.svg
      contributors: ["@rad-pat"]

    - name: Preset, Inc.
      url: https://preset.io
      logo: preset.svg
      contributors: ["@mistercrunch", "@betodealmeida", "@dpgaspar", "@rusackas", "@sadpandajoe", "@Vitor-Avila", "@kgabryje", "@geido", "@eschutho", "@Antonio-RiveroMartnez", "@yousoph"]

    - name: PubNub
      url: https://pubnub.com
      contributors: ["@jzucker2"]

    - name: ReadyTech
      url: https://www.readytech.io

    - name: Reward Gateway
      url: https://www.rewardgateway.com

    - name: RIADVICE
      url: https://riadvice.tn
      contributors: ["@riadvice"]

    - name: ScopeAI
      url: https://www.getscopeai.com
      contributors: ["@iloveluce"]

    - name: shipmnts
      url: https://shipmnts.com

    - name: Showmax
      url: https://showmax.com
      contributors: ["@bobek"]

    - name: SingleStore
      url: https://www.singlestore.com/

    - name: TechAudit
      url: https://www.techaudit.info
      contributors: ["@ETselikov"]

    - name: Tenable
      url: https://www.tenable.com
      contributors: ["@dflionis"]

    - name: Tentacle
      url: https://www.linkedin.com/company/tentacle-cmi/
      contributors: ["@jdclarke5"]

    - name: timbr.ai
      url: https://timbr.ai/
      contributors: ["@semantiDan"]

    - name: Tobii
      url: https://www.tobii.com/
      contributors: ["@dwa"]

    - name: Tooploox
      url: https://www.tooploox.com/
      contributors: ["@jakubczaplicki"]

    - name: Unvired
      url: https://unvired.com
      contributors: ["@srinisubramanian"]

    - name: UserGuiding
      url: https://userguiding.com/
      logo: userguiding.svg
      contributors: ["@tzercin"]

    - name: Virtuoso QA
      url: https://www.virtuosoqa.com

    - name: Whale
      url: https://whale.im

    - name: Windsor.ai
      url: https://www.windsor.ai/
      contributors: ["@octaviancorlade"]

    - name: WinWin Network马上赢
      url: https://brandct.cn/
      contributors: ["@wenbinye"]

    - name: Zeta
      url: https://www.zeta.tech/
      contributors: ["@shaikidris"]

  Media & Entertainment:
    - name: 6play
      url: https://www.6play.fr
      contributors: ["@CoryChaplin"]

    - name: bilibili
      url: https://www.bilibili.com
      contributors: ["@Moinheart"]

    - name: BurdaForward
      url: https://www.burda-forward.de/en/

    - name: Douban
      url: https://www.douban.com/
      contributors: ["@luchuan"]

    - name: Kuaishou
      url: https://www.kuaishou.com/
      contributors: ["@zhaoyu89730105"]

    - name: Netflix
      url: https://www.netflix.com/

    - name: Prensa Iberica
      url: https://www.prensaiberica.es/
      contributors: ["@zamar-roura"]

    - name: TME QQMUSIC/WESING
      url: https://www.tencentmusic.com/
      contributors: ["@shenyuanli", "@marklaw"]

    - name: Xite
      url: https://xite.com/
      contributors: ["@shashankkoppar"]

    - name: Zaihang
      url: https://www.zaih.com/

  Education:
    - name: Aveti Learning
      url: https://avetilearning.com/
      contributors: ["@TheShubhendra"]

    - name: Brilliant.org
      url: https://brilliant.org/

    - name: Cirrus Assessment
      url: https://cirrusassessment.com/
      logo: cirrus.svg
      contributors: ["@jeroenhabets", "@ddmm-white", "@paulrocost"]

    - name: Open edX
      url: https://openedx.org/

    - name: Platzi.com
      url: https://platzi.com/

    - name: Sunbird
      url: https://www.sunbird.org/
      contributors: ["@eksteporg"]

    - name: The GRAPH Network
      url: https://thegraphnetwork.org/
      contributors: ["@fccoelho"]

    - name: Udemy
      url: https://www.udemy.com/
      contributors: ["@sungjuly"]

    - name: VIPKID
      url: https://www.vipkid.com.cn/
      contributors: ["@illpanda"]

    - name: WikiMedia Foundation
      url: https://wikimediafoundation.org
      contributors: ["@vg"]

  Energy:
    - name: Airboxlab
      url: https://foobot.io
      contributors: ["@antoine-galataud"]

    - name: DouroECI
      url: https://www.douroeci.com/
      contributors: ["@nunohelibeires"]

    - name: Safaricom
      url: https://www.safaricom.co.ke/
      contributors: ["@mmutiso"]

    - name: Scoot
      url: https://scoot.co/
      contributors: ["@haaspt"]

    - name: Wattbewerb
      url: https://wattbewerb.de/
      contributors: ["@wattbewerb"]

    - name: Rogow
      url: https://rogow.com.br/
      contributors: ["@nilmonto"]

  Healthcare:
    - name: Amino
      url: https://amino.com
      contributors: ["@shkr"]

    - name: Bluesquare
      url: https://www.bluesquarehub.com/
      contributors: ["@madewulf"]

    - name: Care
      url: https://www.getcare.io/
      contributors: ["@alandao2021"]

    - name: Living Goods
      url: https://www.livinggoods.org
      contributors: ["@chelule"]

    - name: Maieutical Labs
      url: https://maieuticallabs.it
      contributors: ["@xrmx"]

    - name: Medic
      url: https://medic.org
      contributors: ["@1yuv"]

    - name: REDCap Cloud
      url: https://www.redcapcloud.com/

    - name: TrustMedis
      url: https://trustmedis.com/
      contributors: ["@famasya"]

    - name: WeSure
      url: https://www.wesure.cn/

    - name: 2070Health
      url: https://2070health.com/

  HR / Staffing:
    - name: Swile
      url: https://www.swile.co/
      contributors: ["@PaoloTerzi"]

    - name: Symmetrics
      url: https://www.symmetrics.fyi

    - name: bluquist
      url: https://bluquist.com/

  Government:
    - name: City of Ann Arbor, MI
      url: https://www.a2gov.org/
      contributors: ["@sfirke"]

    - name: RIS3 Strategy of CZ, MIT CR
      url: https://www.ris3.cz/
      contributors: ["@RIS3CZ"]

    - name: NRLM - Sarathi, India
      url: https://pib.gov.in/PressReleasePage.aspx?PRID=1999586

  Mobile Software:
    - name: VLMedia
      url: https://www.vlmedia.com.tr
      logo: vlmedia.svg
      contributors: ["@iercan"]

  Travel:
    - name: Agoda
      url: https://www.agoda.com/
      contributors: ["@lostseaway", "@maiake", "@obombayo"]

    - name: HomeToGo
      url: https://hometogo.com/
      contributors: ["@pedromartinsteenstrup"]

    - name: Skyscanner
      url: https://www.skyscanner.net/
      contributors: ["@cleslie", "@stanhoucke"]

  Logistics:
    - name: Stockarea
      url: https://stockarea.io

  Sports:
    - name: Club 25 de Agosto (Femenino / Women's Team)
      url: https://www.instagram.com/25deagosto.basketfemenino/
      contributors: ["@lion90"]
      logo: club25deagosto.svg

    - name: Fanatics
      url: https://www.fanatics.com/
      contributors: ["@coderfender"]

    - name: komoot
      url: https://www.komoot.com/
      contributors: ["@christophlingg"]

  Others:
    - name: 10Web
      url: https://10web.io/

    - name: AI inside
      url: https://inside.ai/en/

    - name: Automattic
      url: https://automattic.com/
      contributors: ["@Khrol", "@Usiel"]

    - name: Dropbox
      url: https://www.dropbox.com/
      contributors: ["@bkyryliuk"]

    - name: Flowbird
      url: https://flowbird.com
      contributors: ["@EmmanuelCbd"]

    - name: GEOTAB
      url: https://www.geotab.com
      contributors: ["@JZ6"]

    - name: Grassroot
      url: https://www.grassrootinstitute.org/

    - name: HOLLYLAND猛玛
      url: https://www.hollyland.com
      logo: hollyland猛玛.svg
      contributors: ["@hlyda0601"]

    - name: Increff
      url: https://www.increff.com/
      contributors: ["@ishansinghania"]

    - name: Let's Roam
      url: https://www.letsroam.com/

    - name: Machrent SA
      url: https://www.machrent.com/

    - name: Onebeat
      url: https://1beat.com/
      contributors: ["@GuyAttia"]

    - name: X
      url: https://x.com/

    - name: Yahoo!
      url: https://yahoo.com/
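The comments at the top of `RESOURCES/INTHEWILD.yaml` define a small schema: each entry requires `name` and `url`, and may carry `logo` and `contributors`. A minimal sketch of a check for that schema is shown below; it operates on an already-parsed dict (in practice one would load the file with a YAML parser first), and the `validate_entries` helper name is hypothetical, not part of the repository.

```python
def validate_entries(categories):
    """Check each organization entry against the documented schema:
    required fields `name` and `url`; optional `logo` and
    `contributors` (a list of '@username' strings)."""
    errors = []
    for category, entries in categories.items():
        for entry in entries:
            where = f"{category} / {entry.get('name', '<missing name>')}"
            if not entry.get("name"):
                errors.append(f"{where}: missing required field 'name'")
            if not str(entry.get("url", "")).startswith(("http://", "https://")):
                errors.append(f"{where}: 'url' must be an http(s) link")
            for user in entry.get("contributors", []):
                if not str(user).startswith("@"):
                    errors.append(f"{where}: contributor {user!r} should start with '@'")
            unknown = set(entry) - {"name", "url", "logo", "contributors"}
            if unknown:
                errors.append(f"{where}: unknown fields {sorted(unknown)}")
    return errors

# Example mirroring the structure under the top-level `categories:` key:
sample = {
    "Sharing Economy": [
        {"name": "Airbnb", "url": "https://github.com/airbnb"},
        {"name": "Faasos", "url": "https://faasos.com/",
         "contributors": ["@shashanksingh"]},
    ]
}
print(validate_entries(sample))  # → []
```

A check like this could run in CI so malformed entries are caught before the docs site renders the list.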
@@ -17,192 +17,193 @@ specific language governing permissions and limitations
under the License.
-->
| |Admin|Alpha|Gamma|SQL_LAB|
|--------------------------------------------------|---|---|---|---|
| Permission/role description |Admins have all possible rights, including granting or revoking rights from other users and altering other people’s slices and dashboards.|Alpha users have access to all data sources, but they cannot grant or revoke access from other users. They are also limited to altering the objects that they own. Alpha users can add and alter data sources.|Gamma users have limited access. They can only consume data coming from data sources they have been given access to through another complementary role. They only have access to view the slices and dashboards made from data sources that they have access to. Currently Gamma users are not able to alter or add data sources. We assume that they are mostly content consumers, though they can create slices and dashboards.|The sql_lab role grants access to SQL Lab. Note that while Admin users have access to all databases by default, both Alpha and Gamma users need to be given access on a per database basis.|
| can read on SavedQuery |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|
| can write on SavedQuery |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|
| can read on CssTemplate |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can write on CssTemplate |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can read on ReportSchedule |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can write on ReportSchedule |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can read on Chart |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can write on Chart |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can read on Annotation |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can write on Annotation |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can read on Dataset |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can write on Dataset |:heavy_check_mark:|:heavy_check_mark:|O|O|
| can read on Log |:heavy_check_mark:|O|O|O|
| can write on Log |:heavy_check_mark:|O|O|O|
| can read on Dashboard |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can write on Dashboard |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can read on Database |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|
| can write on Database |:heavy_check_mark:|O|O|O|
| can read on Query |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|
| can this form get on ResetPasswordView |:heavy_check_mark:|O|O|O|
| can this form post on ResetPasswordView |:heavy_check_mark:|O|O|O|
| can this form get on ResetMyPasswordView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can this form post on ResetMyPasswordView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can this form get on UserInfoEditView |:heavy_check_mark:|O|O|O|
| can this form post on UserInfoEditView |:heavy_check_mark:|O|O|O|
| can show on UserDBModelView |:heavy_check_mark:|O|O|O|
| can edit on UserDBModelView |:heavy_check_mark:|O|O|O|
| can delete on UserDBModelView |:heavy_check_mark:|O|O|O|
| can add on UserDBModelView |:heavy_check_mark:|O|O|O|
| can list on UserDBModelView |:heavy_check_mark:|O|O|O|
| can userinfo on UserDBModelView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| resetmypassword on UserDBModelView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| resetpasswords on UserDBModelView |:heavy_check_mark:|O|O|O|
| userinfoedit on UserDBModelView |:heavy_check_mark:|O|O|O|
| can show on RoleModelView |:heavy_check_mark:|O|O|O|
| can edit on RoleModelView |:heavy_check_mark:|O|O|O|
| can delete on RoleModelView |:heavy_check_mark:|O|O|O|
| can add on RoleModelView |:heavy_check_mark:|O|O|O|
| can list on RoleModelView |:heavy_check_mark:|O|O|O|
| copyrole on RoleModelView |:heavy_check_mark:|O|O|O|
| can get on OpenApi |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can show on SwaggerView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can get on MenuApi |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can list on AsyncEventsRestApi |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can invalidate on CacheRestApi |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can csv upload on Database |:heavy_check_mark:|O|O|O|
| can excel upload on Database |:heavy_check_mark:|O|O|O|
| can query form data on Api |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can query on Api |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can time range on Api |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can external metadata on Datasource |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can save on Datasource |:heavy_check_mark:|:heavy_check_mark:|O|O|
| can get on Datasource |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can my queries on SqlLab |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|
| can log on Superset |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can import dashboards on Superset |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can schemas on Superset |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can sqllab history on Superset |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|
| can publish on Superset |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can csv on Superset |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|
| can slice on Superset |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can sync druid source on Superset |:heavy_check_mark:|O|O|O|
| can explore on Superset |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can approve on Superset |:heavy_check_mark:|O|O|O|
| can explore json on Superset |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can fetch datasource metadata on Superset |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can csrf token on Superset |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can sqllab on Superset |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|
| can select star on Superset |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can warm up cache on Superset |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can sqllab table viz on Superset |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|
| can available domains on Superset |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can request access on Superset |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can dashboard on Superset |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can post on TableSchemaView |:heavy_check_mark:|O|O|:heavy_check_mark:|
| can expanded on TableSchemaView |:heavy_check_mark:|O|O|:heavy_check_mark:|
| can delete on TableSchemaView |:heavy_check_mark:|O|O|:heavy_check_mark:|
| can get on TabStateView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|
| can post on TabStateView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|
| can delete query on TabStateView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|
| can migrate query on TabStateView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|
| can activate on TabStateView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|
| can delete on TabStateView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|
| can put on TabStateView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|
| can read on SecurityRestApi |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|
| menu access on Security |:heavy_check_mark:|O|O|O|
| menu access on List Users |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| menu access on List Roles |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| menu access on Action Log |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| menu access on Manage |:heavy_check_mark:|:heavy_check_mark:|O|O|
| menu access on Annotation Layers |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| menu access on CSS Templates |:heavy_check_mark:|:heavy_check_mark:|O|O|
| menu access on Import Dashboards |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| menu access on Data |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| menu access on Databases |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| menu access on Datasets |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| menu access on Charts |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| menu access on Dashboards |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| menu access on SQL Lab |:heavy_check_mark:|O|O|:heavy_check_mark:|
| menu access on SQL Editor |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|
| menu access on Saved Queries |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|
| menu access on Query Search |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|
| all datasource access on all_datasource_access |:heavy_check_mark:|:heavy_check_mark:|O|O|
| all database access on all_database_access |:heavy_check_mark:|:heavy_check_mark:|O|O|
| all query access on all_query_access |:heavy_check_mark:|O|O|O|
| can write on DynamicPlugin |:heavy_check_mark:|O|O|O|
| can edit on DynamicPlugin |:heavy_check_mark:|O|O|O|
| can list on DynamicPlugin |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can show on DynamicPlugin |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can download on DynamicPlugin |:heavy_check_mark:|O|O|O|
| can add on DynamicPlugin |:heavy_check_mark:|O|O|O|
| can delete on DynamicPlugin |:heavy_check_mark:|O|O|O|
| can external metadata by name on Datasource |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can get value on KV |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can store on KV |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can tagged objects on TagView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can suggestions on TagView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can get on TagView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can post on TagView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can delete on TagView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can edit on DashboardEmailScheduleView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
| can list on DashboardEmailScheduleView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
| can show on DashboardEmailScheduleView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
| can add on DashboardEmailScheduleView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
| can delete on DashboardEmailScheduleView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
| muldelete on DashboardEmailScheduleView |:heavy_check_mark:|:heavy_check_mark:|O|O|
|
||||
| can edit on SliceEmailScheduleView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
| can list on SliceEmailScheduleView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
| can show on SliceEmailScheduleView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
| can add on SliceEmailScheduleView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
| can delete on SliceEmailScheduleView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
| muldelete on SliceEmailScheduleView |:heavy_check_mark:|:heavy_check_mark:|O|O|
|
||||
| can edit on AlertModelView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
| can list on AlertModelView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
| can show on AlertModelView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
| can add on AlertModelView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
| can delete on AlertModelView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
| can list on AlertLogModelView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
| can show on AlertLogModelView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
| can list on AlertObservationModelView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
| can show on AlertObservationModelView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
| menu access on Row Level Security |:heavy_check_mark:|O|O|O|
|
||||
| menu access on Access requests |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
| menu access on Home |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
| menu access on Plugins |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
| menu access on Dashboard Email Schedules |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
| menu access on Chart Emails |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
| menu access on Alerts |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
| menu access on Alerts & Report |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
| menu access on Scan New Datasources |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
| can share dashboard on Superset |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
| can share chart on Superset |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
| can this form get on ColumnarToDatabaseView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
| can this form post on ColumnarToDatabaseView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
| can export on Chart |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
| can write on DashboardFilterStateRestApi |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
| can read on DashboardFilterStateRestApi |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
| can write on DashboardPermalinkRestApi |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
| can read on DashboardPermalinkRestApi |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
| can delete embedded on Dashboard |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
| can set embedded on Dashboard |:heavy_check_mark:|O|O|O|
|
||||
| can export on Dashboard |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
| can get embedded on Dashboard |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
| can export on Database |:heavy_check_mark:|O|O|O|
|
||||
| can export on Dataset |:heavy_check_mark:|:heavy_check_mark:|O|O|
|
||||
| can write on ExploreFormDataRestApi |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
| can read on ExploreFormDataRestApi |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
| can write on ExplorePermalinkRestApi |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
| can read on ExplorePermalinkRestApi |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
| can export on ImportExportRestApi |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
| can import on ImportExportRestApi |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
| can export on SavedQuery |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|
|
||||
| can dashboard permalink on Superset |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
| can grant guest token on SecurityRestApi |:heavy_check_mark:|O|O|O|
|
||||
| can read on AdvancedDataType |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
| can read on EmbeddedDashboard |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
| can duplicate on Dataset |:heavy_check_mark:|:heavy_check_mark:|O|O|
|
||||
| can read on Explore |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
| can samples on Datasource |:heavy_check_mark:|:heavy_check_mark:|O|O|
|
||||
| can read on AvailableDomains |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|
||||
| can get or create dataset on Dataset |:heavy_check_mark:|:heavy_check_mark:|O|O|
|
||||
| can get column values on Datasource |:heavy_check_mark:|:heavy_check_mark:|O|O|
|
||||
| can export csv on SQLLab |:heavy_check_mark:|O|O|:heavy_check_mark:|
|
||||
| can get results on SQLLab |:heavy_check_mark:|O|O|:heavy_check_mark:|
|
||||
| can execute sql query on SQLLab |:heavy_check_mark:|O|O|:heavy_check_mark:|
|
||||
| can recent activity on Log |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|

| |Admin|Alpha|Gamma|Public|SQL_LAB|
|--------------------------------------------------|---|---|---|---|---|
| Permission/role description |Admins have all possible rights, including granting or revoking rights from other users and altering other people's slices and dashboards.|Alpha users have access to all data sources, but they cannot grant or revoke access from other users. They are also limited to altering the objects that they own. Alpha users can add and alter data sources.|Gamma users have limited access. They can only consume data coming from data sources they have been given access to through another complementary role. They only have access to view the slices and dashboards made from data sources that they have access to. Currently Gamma users are not able to alter or add data sources. We assume that they are mostly content consumers, though they can create slices and dashboards.|Public is the most restrictive built-in role, designed for anonymous/unauthenticated users viewing public dashboards. It provides minimal read-only access for dashboard viewing with interactive filters. Use `PUBLIC_ROLE_LIKE = "Public"` to apply these permissions to anonymous users.|The sql_lab role grants access to SQL Lab. Note that while Admin users have access to all databases by default, both Alpha and Gamma users need to be given access on a per-database basis.|
| can read on SavedQuery |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|:heavy_check_mark:|
| can write on SavedQuery |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|:heavy_check_mark:|
| can read on CssTemplate |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can write on CssTemplate |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| can read on ReportSchedule |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| can write on ReportSchedule |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| can read on Chart |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can write on Chart |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| can read on Annotation |:heavy_check_mark:|:heavy_check_mark:|O|:heavy_check_mark:|O|
| can write on Annotation |:heavy_check_mark:|:heavy_check_mark:|O|O|O|
| can read on AnnotationLayerRestApi |:heavy_check_mark:|:heavy_check_mark:|O|:heavy_check_mark:|O|
| can read on Dataset |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| can write on Dataset |:heavy_check_mark:|:heavy_check_mark:|O|O|O|
| can read on Log |:heavy_check_mark:|O|O|O|O|
| can write on Log |:heavy_check_mark:|O|O|O|O|
| can read on Dashboard |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can write on Dashboard |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| can read on Database |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|:heavy_check_mark:|
| can write on Database |:heavy_check_mark:|O|O|O|O|
| can read on Query |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|:heavy_check_mark:|
| can this form get on ResetPasswordView |:heavy_check_mark:|O|O|O|O|
| can this form post on ResetPasswordView |:heavy_check_mark:|O|O|O|O|
| can this form get on ResetMyPasswordView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| can this form post on ResetMyPasswordView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| can this form get on UserInfoEditView |:heavy_check_mark:|O|O|O|O|
| can this form post on UserInfoEditView |:heavy_check_mark:|O|O|O|O|
| can show on UserDBModelView |:heavy_check_mark:|O|O|O|O|
| can edit on UserDBModelView |:heavy_check_mark:|O|O|O|O|
| can delete on UserDBModelView |:heavy_check_mark:|O|O|O|O|
| can add on UserDBModelView |:heavy_check_mark:|O|O|O|O|
| can list on UserDBModelView |:heavy_check_mark:|O|O|O|O|
| can userinfo on UserDBModelView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| resetmypassword on UserDBModelView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| resetpasswords on UserDBModelView |:heavy_check_mark:|O|O|O|O|
| userinfoedit on UserDBModelView |:heavy_check_mark:|O|O|O|O|
| can show on RoleModelView |:heavy_check_mark:|O|O|O|O|
| can edit on RoleModelView |:heavy_check_mark:|O|O|O|O|
| can delete on RoleModelView |:heavy_check_mark:|O|O|O|O|
| can add on RoleModelView |:heavy_check_mark:|O|O|O|O|
| can list on RoleModelView |:heavy_check_mark:|O|O|O|O|
| copyrole on RoleModelView |:heavy_check_mark:|O|O|O|O|
| can get on OpenApi |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| can show on SwaggerView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| can get on MenuApi |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| can list on AsyncEventsRestApi |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| can invalidate on CacheRestApi |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| can csv upload on Database |:heavy_check_mark:|O|O|O|O|
| can excel upload on Database |:heavy_check_mark:|O|O|O|O|
| can query form data on Api |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can query on Api |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can time range on Api |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can external metadata on Datasource |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can save on Datasource |:heavy_check_mark:|:heavy_check_mark:|O|O|O|
| can get on Datasource |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can my queries on SqlLab |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|:heavy_check_mark:|
| can log on Superset |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| can import dashboards on Superset |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| can schemas on Superset |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| can sqllab history on Superset |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|:heavy_check_mark:|
| can publish on Superset |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| can csv on Superset |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|:heavy_check_mark:|
| can slice on Superset |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can sync druid source on Superset |:heavy_check_mark:|O|O|O|O|
| can explore on Superset |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| can approve on Superset |:heavy_check_mark:|O|O|O|O|
| can explore json on Superset |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can fetch datasource metadata on Superset |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| can csrf token on Superset |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| can sqllab on Superset |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|:heavy_check_mark:|
| can select star on Superset |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| can warm up cache on Superset |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| can sqllab table viz on Superset |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|:heavy_check_mark:|
| can available domains on Superset |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| can request access on Superset |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| can dashboard on Superset |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can post on TableSchemaView |:heavy_check_mark:|O|O|O|:heavy_check_mark:|
| can expanded on TableSchemaView |:heavy_check_mark:|O|O|O|:heavy_check_mark:|
| can delete on TableSchemaView |:heavy_check_mark:|O|O|O|:heavy_check_mark:|
| can get on TabStateView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|:heavy_check_mark:|
| can post on TabStateView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|:heavy_check_mark:|
| can delete query on TabStateView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|:heavy_check_mark:|
| can migrate query on TabStateView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|:heavy_check_mark:|
| can activate on TabStateView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|:heavy_check_mark:|
| can delete on TabStateView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|:heavy_check_mark:|
| can put on TabStateView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|:heavy_check_mark:|
| can read on SecurityRestApi |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|:heavy_check_mark:|
| menu access on Security |:heavy_check_mark:|O|O|O|O|
| menu access on List Users |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| menu access on List Roles |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| menu access on Action Log |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| menu access on Manage |:heavy_check_mark:|:heavy_check_mark:|O|O|O|
| menu access on Annotation Layers |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| menu access on CSS Templates |:heavy_check_mark:|:heavy_check_mark:|O|O|O|
| menu access on Import Dashboards |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| menu access on Data |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| menu access on Databases |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| menu access on Datasets |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| menu access on Charts |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| menu access on Dashboards |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| menu access on SQL Lab |:heavy_check_mark:|O|O|O|:heavy_check_mark:|
| menu access on SQL Editor |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|:heavy_check_mark:|
| menu access on Saved Queries |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|:heavy_check_mark:|
| menu access on Query Search |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|:heavy_check_mark:|
| all datasource access on all_datasource_access |:heavy_check_mark:|:heavy_check_mark:|O|O|O|
| all database access on all_database_access |:heavy_check_mark:|:heavy_check_mark:|O|O|O|
| all query access on all_query_access |:heavy_check_mark:|O|O|O|O|
| can write on DynamicPlugin |:heavy_check_mark:|O|O|O|O|
| can edit on DynamicPlugin |:heavy_check_mark:|O|O|O|O|
| can list on DynamicPlugin |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| can show on DynamicPlugin |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| can download on DynamicPlugin |:heavy_check_mark:|O|O|O|O|
| can add on DynamicPlugin |:heavy_check_mark:|O|O|O|O|
| can delete on DynamicPlugin |:heavy_check_mark:|O|O|O|O|
| can external metadata by name on Datasource |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| can get value on KV |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| can store on KV |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| can tagged objects on TagView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| can suggestions on TagView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| can get on TagView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| can post on TagView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| can delete on TagView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| can edit on DashboardEmailScheduleView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| can list on DashboardEmailScheduleView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| can show on DashboardEmailScheduleView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| can add on DashboardEmailScheduleView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| can delete on DashboardEmailScheduleView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| muldelete on DashboardEmailScheduleView |:heavy_check_mark:|:heavy_check_mark:|O|O|O|
| can edit on SliceEmailScheduleView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| can list on SliceEmailScheduleView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| can show on SliceEmailScheduleView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| can add on SliceEmailScheduleView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| can delete on SliceEmailScheduleView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| muldelete on SliceEmailScheduleView |:heavy_check_mark:|:heavy_check_mark:|O|O|O|
| can edit on AlertModelView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| can list on AlertModelView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| can show on AlertModelView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| can add on AlertModelView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| can delete on AlertModelView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| can list on AlertLogModelView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| can show on AlertLogModelView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| can list on AlertObservationModelView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| can show on AlertObservationModelView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| menu access on Row Level Security |:heavy_check_mark:|O|O|O|O|
| menu access on Access requests |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| menu access on Home |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| menu access on Plugins |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| menu access on Dashboard Email Schedules |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| menu access on Chart Emails |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| menu access on Alerts |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| menu access on Alerts & Report |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| menu access on Scan New Datasources |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| can share dashboard on Superset |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| can share chart on Superset |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| can this form get on ColumnarToDatabaseView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| can this form post on ColumnarToDatabaseView |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| can export on Chart |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| can write on DashboardFilterStateRestApi |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can read on DashboardFilterStateRestApi |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can write on DashboardPermalinkRestApi |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| can read on DashboardPermalinkRestApi |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can delete embedded on Dashboard |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| can set embedded on Dashboard |:heavy_check_mark:|O|O|O|O|
| can export on Dashboard |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| can get embedded on Dashboard |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| can export on Database |:heavy_check_mark:|O|O|O|O|
| can export on Dataset |:heavy_check_mark:|:heavy_check_mark:|O|O|O|
| can write on ExploreFormDataRestApi |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| can read on ExploreFormDataRestApi |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| can write on ExplorePermalinkRestApi |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| can read on ExplorePermalinkRestApi |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can export on ImportExportRestApi |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| can import on ImportExportRestApi |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| can export on SavedQuery |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|:heavy_check_mark:|
| can dashboard permalink on Superset |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can grant guest token on SecurityRestApi |:heavy_check_mark:|O|O|O|O|
| can read on AdvancedDataType |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| can read on EmbeddedDashboard |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
| can duplicate on Dataset |:heavy_check_mark:|:heavy_check_mark:|O|O|O|
| can read on Explore |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| can samples on Datasource |:heavy_check_mark:|:heavy_check_mark:|O|O|O|
| can read on AvailableDomains |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
| can get or create dataset on Dataset |:heavy_check_mark:|:heavy_check_mark:|O|O|O|
| can get column values on Datasource |:heavy_check_mark:|:heavy_check_mark:|O|O|O|
| can export csv on SQLLab |:heavy_check_mark:|O|O|O|:heavy_check_mark:|
| can get results on SQLLab |:heavy_check_mark:|O|O|O|:heavy_check_mark:|
| can execute sql query on SQLLab |:heavy_check_mark:|O|O|O|:heavy_check_mark:|
| can recent activity on Log |:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|O|
UPDATING.md

This file documents any backwards-incompatible changes in Superset and assists people when migrating to a new version.

## Next

- [33055](https://github.com/apache/superset/pull/33055): Upgrades Flask-AppBuilder to 5.0.0. The `AUTH_OID` authentication type has been deprecated and is no longer available as an option in Flask-AppBuilder. OpenID (OID) is considered a deprecated authentication protocol; if you are using `AUTH_OID`, you will need to migrate to an alternative authentication method such as OAuth, LDAP, or database authentication before upgrading.
### Example Data Loading Improvements

#### New Directory Structure

Examples are now organized by name, with data and configs co-located:

```
superset/examples/
├── _shared/             # Shared database & metadata configs
├── birth_names/         # Each example is self-contained
│   ├── data.parquet     # Dataset (Parquet format)
│   ├── dataset.yaml     # Dataset metadata
│   ├── dashboard.yaml   # Dashboard config (optional)
│   └── charts/          # Chart configs (optional)
└── ...
```
#### Simplified Parquet-based Loading

- Auto-discovery: create `superset/examples/my_dataset/data.parquet` to add a new example
- Parquet is an Apache project format: compressed (~27% smaller), with a self-describing schema
- YAML configs define datasets, charts, and dashboards declaratively
- Removed Python-based data generation from individual example files
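The auto-discovery convention can be sketched with `pathlib`; `discover_examples` below is a hypothetical helper for illustration, not Superset's actual loader:

```python
from pathlib import Path


def discover_examples(examples_root: Path) -> dict[str, Path]:
    """Map example names to their Parquet data files.

    Any directory under ``examples_root`` that contains a ``data.parquet``
    is treated as a self-contained example; directories such as ``_shared``
    hold configs rather than data, so they are skipped automatically.
    """
    return {
        parquet.parent.name: parquet
        for parquet in sorted(examples_root.glob("*/data.parquet"))
    }
```

Under this convention, adding a new example is just a matter of dropping a `data.parquet` (plus optional YAML configs) into a new directory.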
#### Test Data Reorganization

- Moved `big_data.py` to `superset/cli/test_loaders.py`, which better reflects its purpose as a test utility
- Fixed inverted logic for the `--load-test-data` flag (it now correctly includes `.test.yaml` files when the flag is set)
- Clarified CLI flags:
  - `--force` / `-f`: Force reload even if tables exist
  - `--only-metadata` / `-m`: Create table metadata without loading data
  - `--load-test-data` / `-t`: Include test dashboards and `.test.yaml` configs
  - `--load-big-data` / `-b`: Generate synthetic stress-test data
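The corrected `--load-test-data` behavior can be expressed as a small predicate; `should_load_config` is a hypothetical name used for illustration, assuming test configs carry the `.test.yaml` suffix described above:

```python
def should_load_config(filename: str, load_test_data: bool) -> bool:
    """Decide whether an example YAML config should be loaded.

    Regular ``*.yaml`` configs always load; ``*.test.yaml`` configs load
    only when ``--load-test-data`` is passed (the previous, inverted
    logic skipped them exactly when the flag was set).
    """
    if filename.endswith(".test.yaml"):
        return load_test_data
    return filename.endswith(".yaml")
```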
#### Bug Fixes

- Fixed numpy array serialization for PostgreSQL (complex types are converted to JSON strings)
- Fixed a `KeyError` for the `allow_csv_upload` field in database configs (now optional, with a default)
- Fixed test data loading logic that was incorrectly filtering files
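The serialization fix can be approximated as follows. This is a sketch of the idea (coercing non-scalar values to JSON strings before rows reach the PostgreSQL driver), with `coerce_for_postgres` as a hypothetical name rather than Superset's actual code; the `tolist()` duck-typing covers numpy arrays without requiring a numpy import:

```python
import json


def coerce_for_postgres(value):
    """Convert values the DB driver cannot adapt into JSON strings.

    numpy arrays expose ``tolist()``; the resulting lists (and any plain
    lists or dicts) are serialized to JSON, while scalars pass through.
    """
    if hasattr(value, "tolist"):  # duck-typed numpy array or scalar wrapper
        value = value.tolist()
    if isinstance(value, (list, dict)):
        return json.dumps(value)
    return value
```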
### MCP Service

The MCP (Model Context Protocol) service enables AI assistants and automation tools to interact programmatically with Superset.

#### New Features

- MCP service infrastructure built on the FastMCP framework
- Tools for dashboards, charts, datasets, SQL Lab, and instance metadata
- Optional dependency: install with `pip install apache-superset[fastmcp]`
- Runs as a separate process from the Superset web server
- JWT-based authentication for production deployments
#### New Configuration Options

**Development** (single-user, local testing):

```python
# superset_config.py
MCP_DEV_USERNAME = "admin"  # User for MCP authentication
MCP_SERVICE_HOST = "localhost"
MCP_SERVICE_PORT = 5008
```
**Production** (JWT-based, multi-user):
|
||||
```python
|
||||
# superset_config.py
|
||||
MCP_AUTH_ENABLED = True
|
||||
MCP_JWT_ISSUER = "https://your-auth-provider.com"
|
||||
MCP_JWT_AUDIENCE = "superset-mcp"
|
||||
MCP_JWT_ALGORITHM = "RS256" # or "HS256" for shared secrets
|
||||
|
||||
# Option 1: Use JWKS endpoint (recommended for RS256)
|
||||
MCP_JWKS_URI = "https://auth.example.com/.well-known/jwks.json"
|
||||
|
||||
# Option 2: Use static public key (RS256)
|
||||
MCP_JWT_PUBLIC_KEY = "-----BEGIN PUBLIC KEY-----..."
|
||||
|
||||
# Option 3: Use shared secret (HS256)
|
||||
MCP_JWT_ALGORITHM = "HS256"
|
||||
MCP_JWT_SECRET = "your-shared-secret-key"
|
||||
|
||||
# Optional overrides
|
||||
MCP_SERVICE_HOST = "0.0.0.0"
|
||||
MCP_SERVICE_PORT = 5008
|
||||
MCP_SESSION_CONFIG = {
|
||||
"SESSION_COOKIE_SECURE": True,
|
||||
"SESSION_COOKIE_HTTPONLY": True,
|
||||
"SESSION_COOKIE_SAMESITE": "Strict",
|
||||
}
|
||||
```
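For the shared-secret option above (HS256), the service expects a standard JWT. The following stdlib-only sketch shows what a compatible token looks like on the wire; it is an illustration, not the service's documented contract. Real deployments should mint tokens with a library such as PyJWT, and the exact claim set validated by the MCP service is an assumption here.

```python
# Hedged sketch: mint an HS256 JWT shaped like the shared-secret option above.
# Stdlib only; the iss/aud/exp claims mirror MCP_JWT_ISSUER / MCP_JWT_AUDIENCE.
import base64
import hashlib
import hmac
import json
import time


def b64url(data: bytes) -> str:
    # Base64url without padding, as required by the JWT compact serialization
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()


def mint_hs256_token(secret: str, issuer: str, audience: str, ttl: int = 300) -> str:
    header = {"alg": "HS256", "typ": "JWT"}
    payload = {"iss": issuer, "aud": audience, "exp": int(time.time()) + ttl}
    # header.payload, each base64url-encoded compact JSON
    signing_input = ".".join(
        b64url(json.dumps(part, separators=(",", ":")).encode())
        for part in (header, payload)
    )
    sig = hmac.new(secret.encode(), signing_input.encode(), hashlib.sha256).digest()
    return f"{signing_input}.{b64url(sig)}"


token = mint_hs256_token(
    "your-shared-secret-key", "https://your-auth-provider.com", "superset-mcp"
)
```

The client would then present this token as a bearer credential; which claims are actually enforced is governed by the settings above and `superset/mcp_service/SECURITY.md`.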

#### Running the MCP Service

```bash
# Development
superset mcp run --port 5008 --debug

# Production
superset mcp run --port 5008

# With factory config
superset mcp run --port 5008 --use-factory-config
```

#### Deployment Considerations

The MCP service runs as a **separate process** from the Superset web server.

**Important**:
- Requires the same Python environment and configuration as Superset
- Shares database connections with the main Superset app
- Can be scaled independently from the web server
- Requires the `fastmcp` package (optional dependency)

**Installation**:
```bash
# Install with MCP support
pip install apache-superset[fastmcp]

# Or add to requirements.txt
apache-superset[fastmcp]>=X.Y.Z
```

**Process Management**:
Use systemd, supervisord, or Kubernetes to manage the MCP service process.
See `superset/mcp_service/PRODUCTION.md` for deployment guides.

**Security**:
- Development: uses `MCP_DEV_USERNAME` for single-user access
- Production: **MUST** configure JWT authentication
- See `superset/mcp_service/SECURITY.md` for details

#### Documentation

- Architecture: `superset/mcp_service/ARCHITECTURE.md`
- Security: `superset/mcp_service/SECURITY.md`
- Production: `superset/mcp_service/PRODUCTION.md`
- Developer Guide: `superset/mcp_service/CLAUDE.md`
- Quick Start: `superset/mcp_service/README.md`

---

- [35621](https://github.com/apache/superset/pull/35621): The default hash algorithm has changed from MD5 to SHA-256 for improved security and FedRAMP compliance. This affects cache keys for thumbnails, dashboard digests, chart digests, and filter option names. Existing cached data will be invalidated upon upgrade. To opt out of this change and maintain backward compatibility, set `HASH_ALGORITHM = "md5"` in your `superset_config.py`.
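  The cache invalidation follows mechanically from the digest change: the same key material hashes to different values under the two algorithms, so keys derived with SHA-256 never match entries stored under MD5. A small illustration (the key string below is a made-up placeholder, not Superset's actual cache-key format):

  ```python
  # Same input, different digests: any cache entry keyed on the old MD5
  # value will simply miss once keys are derived with SHA-256.
  import hashlib

  key_material = "thumbnail|dashboard:42|rev:7"  # placeholder, not the real key format
  md5_key = hashlib.md5(key_material.encode()).hexdigest()
  sha256_key = hashlib.sha256(key_material.encode()).hexdigest()

  print(len(md5_key), len(sha256_key))  # 32 64
  print(md5_key == sha256_key)          # False
  ```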

- [35062](https://github.com/apache/superset/pull/35062): Renamed `setupExtensions` to `setupCodeOverrides` and changed its function signature to take options as arguments.

### Breaking Changes
- [37370](https://github.com/apache/superset/pull/37370): The `APP_NAME` configuration variable no longer controls the browser window/tab title or other frontend branding. Application names should now be configured using the theme system with the `brandAppName` token. The `APP_NAME` config is still used for backend contexts (MCP service, logs, etc.) and serves as a fallback if `brandAppName` is not set.
  - **Migration:**
    ```python
    # Before (Superset 5.x)
    APP_NAME = "My Custom App"

    # After (Superset 6.x) - Option 1: Use theme system (recommended)
    THEME_DEFAULT = {
        "token": {
            "brandAppName": "My Custom App",  # Window titles
            "brandLogoAlt": "My Custom App",  # Logo alt text
            "brandLogoUrl": "/static/assets/images/custom_logo.png",
        }
    }

    # After (Superset 6.x) - Option 2: Temporary fallback
    # Keep APP_NAME for now (will be used as fallback for brandAppName)
    APP_NAME = "My Custom App"
    # But you should migrate to THEME_DEFAULT.token.brandAppName
    ```
  - **Note:** For dark mode, set the same tokens in the `THEME_DARK` configuration.

- [36317](https://github.com/apache/superset/pull/36317): The `CUSTOM_FONT_URLS` configuration option has been removed. Use the new per-theme `fontUrls` token in `THEME_DEFAULT` or database-managed themes instead.
  - **Before:**
    ```python
    CUSTOM_FONT_URLS = [
        "https://fonts.example.com/myfont.css",
    ]
    ```
  - **After:**
    ```python
    THEME_DEFAULT = {
        "token": {
            "fontUrls": [
                "https://fonts.example.com/myfont.css",
            ],
            # ... other tokens
        }
    }
    ```

## 6.0.0
- [33055](https://github.com/apache/superset/pull/33055): Upgrades Flask-AppBuilder to 5.0.0. The AUTH_OID authentication type has been deprecated and is no longer available as an option in Flask-AppBuilder. OpenID (OID) is considered a deprecated authentication protocol - if you are using AUTH_OID, you will need to migrate to an alternative authentication method such as OAuth, LDAP, or database authentication before upgrading.
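  For those migrating off AUTH_OID, a hedged sketch of an OAuth-based replacement using Flask-AppBuilder's `AUTH_OAUTH` follows. The provider block is illustrative only: the provider name, endpoints, scopes, and credentials are placeholders, and you should consult the Flask-AppBuilder security documentation for your identity provider's actual values.

  ```python
  # superset_config.py: hedged sketch of an OAuth replacement for AUTH_OID.
  # All values in the provider block below are placeholders.
  from flask_appbuilder.security.manager import AUTH_OAUTH

  AUTH_TYPE = AUTH_OAUTH
  OAUTH_PROVIDERS = [
      {
          "name": "my-idp",  # placeholder provider name
          "icon": "fa-key",
          "token_key": "access_token",
          "remote_app": {
              "client_id": "CLIENT_ID",
              "client_secret": "CLIENT_SECRET",
              "api_base_url": "https://idp.example.com/",
              "access_token_url": "https://idp.example.com/oauth/token",
              "authorize_url": "https://idp.example.com/oauth/authorize",
              "client_kwargs": {"scope": "openid email profile"},
          },
      }
  ]
  ```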
- [34871](https://github.com/apache/superset/pull/34871): Fixed a Jest test hanging issue from the Ant Design v5 upgrade. MessageChannel is now mocked in the test environment to prevent rc-overflow from causing Jest to hang. Test environment only - no production impact.
- [34782](https://github.com/apache/superset/pull/34782): Dataset exports now include the dataset ID in their file name (similar to charts and dashboards). If managing assets as code, make sure to rename existing dataset YAMLs to include the ID (and avoid duplicated files).
- [34536](https://github.com/apache/superset/pull/34536): The `ENVIRONMENT_TAG_CONFIG` color values have changed to support only Ant Design semantic colors. Update your `superset_config.py` accordingly.
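The PR's before/after snippet did not survive this capture. As an illustrative sketch only (the tag structure and values below are assumptions, not the PR's literal example), the change replaces raw CSS colors with Ant Design semantic color names:

```python
# Hypothetical sketch, not the PR's literal example.

# Before: arbitrary CSS/hex color values
ENVIRONMENT_TAG_CONFIG = {
    "variable": "SUPERSET_ENV",
    "values": {
        "development": {"color": "#00FF00", "text": "Development"},
        "production": {"color": "#FF0000", "text": "Production"},
    },
}

# After: Ant Design semantic color names only
ENVIRONMENT_TAG_CONFIG = {
    "variable": "SUPERSET_ENV",
    "values": {
        "development": {"color": "success", "text": "Development"},
        "production": {"color": "error", "text": "Production"},
    },
}
```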

@@ -77,7 +77,6 @@ x-common-build: &common-build
INCLUDE_CHROMIUM: ${INCLUDE_CHROMIUM:-false}
INCLUDE_FIREFOX: ${INCLUDE_FIREFOX:-false}
BUILD_TRANSLATIONS: ${BUILD_TRANSLATIONS:-false}
LOAD_EXAMPLES_DUCKDB: ${LOAD_EXAMPLES_DUCKDB:-true}

services:
db-light:
@@ -116,7 +115,6 @@ services:
DATABASE_HOST: db-light
DATABASE_DB: superset_light
POSTGRES_DB: superset_light
SUPERSET__SQLALCHEMY_EXAMPLES_URI: "duckdb:////app/data/examples.duckdb"
SUPERSET_CONFIG_PATH: /app/docker/pythonpath_dev/superset_config_docker_light.py
GITHUB_HEAD_REF: ${GITHUB_HEAD_REF:-}
GITHUB_SHA: ${GITHUB_SHA:-}
@@ -139,8 +137,11 @@ services:
DATABASE_HOST: db-light
DATABASE_DB: superset_light
POSTGRES_DB: superset_light
SUPERSET__SQLALCHEMY_EXAMPLES_URI: "duckdb:////app/data/examples.duckdb"
SUPERSET_CONFIG_PATH: /app/docker/pythonpath_dev/superset_config_docker_light.py
# Override examples host to use light DB service
EXAMPLES_HOST: db-light
# Skip example loading for faster startup
SUPERSET_LOAD_EXAMPLES: "no"
healthcheck:
disable: true

@@ -163,7 +164,7 @@ services:
# configuring the dev-server to use the host.docker.internal to connect to the backend
superset: "http://superset-light:8088"
# Webpack dev server configuration
WEBPACK_DEVSERVER_HOST: "${WEBPACK_DEVSERVER_HOST:-127.0.0.1}"
WEBPACK_DEVSERVER_HOST: "${WEBPACK_DEVSERVER_HOST:-0.0.0.0}"
WEBPACK_DEVSERVER_PORT: "${WEBPACK_DEVSERVER_PORT:-9000}"
ports:
- "${NODE_PORT:-9001}:9000" # Parameterized port, accessible on all interfaces
@@ -196,7 +197,6 @@ services:
DATABASE_DB: test
POSTGRES_DB: test
SUPERSET__SQLALCHEMY_DATABASE_URI: postgresql+psycopg2://superset:superset@db-light:5432/test
SUPERSET__SQLALCHEMY_EXAMPLES_URI: "duckdb:////app/data/examples.duckdb"
SUPERSET_CONFIG: superset_test_config_light
PYTHONPATH: /app/pythonpath:/app/docker/pythonpath_dev:/app

@@ -44,7 +44,6 @@ x-common-build: &common-build
INCLUDE_CHROMIUM: ${INCLUDE_CHROMIUM:-false}
INCLUDE_FIREFOX: ${INCLUDE_FIREFOX:-false}
BUILD_TRANSLATIONS: ${BUILD_TRANSLATIONS:-false}
LOAD_EXAMPLES_DUCKDB: ${LOAD_EXAMPLES_DUCKDB:-true}

services:
nginx:
@@ -54,10 +53,9 @@ services:
- path: docker/.env-local # optional override
required: false
image: nginx:latest
container_name: superset_nginx
restart: unless-stopped
ports:
- "80:80"
- "${NGINX_PORT:-80}:80"
extra_hosts:
- "host.docker.internal:host-gateway"
volumes:
@@ -66,10 +64,9 @@ services:

redis:
image: redis:7
container_name: superset_cache
restart: unless-stopped
ports:
- "127.0.0.1:6379:6379"
- "127.0.0.1:${REDIS_PORT:-6379}:6379"
volumes:
- redis:/data

@@ -80,10 +77,9 @@ services:
- path: docker/.env-local # optional override
required: false
image: postgres:16
container_name: superset_db
restart: unless-stopped
ports:
- "127.0.0.1:5432:5432"
- "127.0.0.1:${DATABASE_PORT:-5432}:5432"
volumes:
- db_home:/var/lib/postgresql/data
- ./docker/docker-entrypoint-initdb.d:/docker-entrypoint-initdb.d
@@ -96,13 +92,12 @@ services:
required: false
build:
<<: *common-build
container_name: superset_app
command: ["/app/docker/docker-bootstrap.sh", "app"]
restart: unless-stopped
ports:
- 8088:8088
- ${SUPERSET_PORT:-8088}:8088
# When in cypress-mode ->
- 8081:8081
- ${CYPRESS_PORT:-8081}:8081
extra_hosts:
- "host.docker.internal:host-gateway"
user: *superset-user
@@ -110,14 +105,11 @@ services:
superset-init:
condition: service_completed_successfully
volumes: *superset-volumes
environment:
SUPERSET__SQLALCHEMY_EXAMPLES_URI: "duckdb:////app/data/examples.duckdb"

superset-websocket:
container_name: superset_websocket
build: ./superset-websocket
ports:
- 8080:8080
- ${WEBSOCKET_PORT:-8080}:8080
extra_hosts:
- "host.docker.internal:host-gateway"
depends_on:
@@ -137,9 +129,9 @@ services:
- /home/superset-websocket/node_modules
- /home/superset-websocket/dist

# Mounting a config file that contains a dummy secret required to boot up.
# do not use this docker compose in production
- ./docker/superset-websocket/config.json:/home/superset-websocket/config.json
# Mount config file. Create your own docker/superset-websocket/config.json
# for custom settings, then point to it here. Do not use this example in production.
- ./docker/superset-websocket/config.example.json:/home/superset-websocket/config.json:ro
environment:
- PORT=8080
- REDIS_HOST=redis
@@ -149,7 +141,6 @@ services:
superset-init:
build:
<<: *common-build
container_name: superset_init
command: ["/app/docker/docker-init.sh"]
env_file:
- path: docker/.env # default
@@ -163,8 +154,6 @@ services:
condition: service_started
user: *superset-user
volumes: *superset-volumes
environment:
SUPERSET__SQLALCHEMY_EXAMPLES_URI: "duckdb:////app/data/examples.duckdb"
healthcheck:
disable: true

@@ -186,9 +175,10 @@ services:
SCARF_ANALYTICS: "${SCARF_ANALYTICS:-}"
# configuring the dev-server to use the host.docker.internal to connect to the backend
superset: "http://superset:8088"
# Bind to all interfaces so Docker port mapping works
WEBPACK_DEVSERVER_HOST: "0.0.0.0"
ports:
- "127.0.0.1:9000:9000" # exposing the dynamic webpack dev server
container_name: superset_node
- "127.0.0.1:${NODE_PORT:-9000}:9000" # exposing the dynamic webpack dev server
command: ["/app/docker/docker-frontend.sh"]
env_file:
- path: docker/.env # default
@@ -200,7 +190,6 @@ services:
superset-worker:
build:
<<: *common-build
container_name: superset_worker
command: ["/app/docker/docker-bootstrap.sh", "worker"]
env_file:
- path: docker/.env # default
@@ -226,7 +215,6 @@ services:
superset-worker-beat:
build:
<<: *common-build
container_name: superset_worker_beat
command: ["/app/docker/docker-bootstrap.sh", "beat"]
env_file:
- path: docker/.env # default
@@ -244,7 +232,6 @@ services:
superset-tests-worker:
build:
<<: *common-build
container_name: superset_tests_worker
command: ["/app/docker/docker-bootstrap.sh", "worker"]
env_file:
- path: docker/.env # default

@@ -21,6 +21,15 @@ PYTHONUNBUFFERED=1
COMPOSE_PROJECT_NAME=superset
DEV_MODE=true

# Port configuration (override in .env-local for multiple instances)
# NGINX_PORT=80
# SUPERSET_PORT=8088
# NODE_PORT=9000
# WEBSOCKET_PORT=8080
# CYPRESS_PORT=8081
# DATABASE_PORT=5432
# REDIS_PORT=6379

# database configurations (do not modify)
DATABASE_DB=superset
DATABASE_HOST=db

docker/.env-local.example (new file, 39 lines)
@@ -0,0 +1,39 @@
#
# Licensed to the Apache Software Foundation (ASF) under one or more
# contributor license agreements. See the NOTICE file distributed with
# this work for additional information regarding copyright ownership.
# The ASF licenses this file to You under the Apache License, Version 2.0
# (the "License"); you may not use this file except in compliance with
# the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#

# -----------------------------------------------------------------------
# Example .env-local file for running multiple Superset instances
# Copy this file to .env-local and customize for your setup
# -----------------------------------------------------------------------

# Unique project name prevents container/volume conflicts between clones
# Each clone should have a different name (e.g., superset-pr123, superset-feature-x)
COMPOSE_PROJECT_NAME=superset-dev2

# Port offsets for running multiple instances simultaneously
# Instance 1 (default): 80, 8088, 9000, 8080, 8081, 5432, 6379
# Instance 2 example: 81, 8089, 9001, 8082, 8083, 5433, 6380
NGINX_PORT=81
SUPERSET_PORT=8089
NODE_PORT=9001
WEBSOCKET_PORT=8082
CYPRESS_PORT=8083
DATABASE_PORT=5433
REDIS_PORT=6380

# For verbose logging during development:
# SUPERSET_LOG_LEVEL=debug
@@ -34,8 +34,24 @@ intended for use with local development.

### Local overrides

#### Environment Variables

To override environment variables locally, create a `./docker/.env-local` file (git-ignored). This file will be loaded after `.env` and can override any settings.

#### Python Configuration

In order to override configuration settings locally, simply make a copy of [`./docker/pythonpath_dev/superset_config_local.example`](./pythonpath_dev/superset_config_local.example)
into `./docker/pythonpath_dev/superset_config_docker.py` (git ignored) and fill in your overrides.
into `./docker/pythonpath_dev/superset_config_docker.py` (git-ignored) and fill in your overrides.

#### WebSocket Configuration

To customize the WebSocket server configuration, create `./docker/superset-websocket/config.json` (git-ignored) based on [`./docker/superset-websocket/config.example.json`](./superset-websocket/config.example.json).

Then update the `superset-websocket.volumes` config to mount it.

#### Docker Compose Overrides

For advanced Docker Compose customization, create a `docker-compose-override.yml` file (git-ignored) to override or extend services without modifying the main compose file.

### Local packages

@@ -61,6 +77,34 @@ To run the container, simply run: `docker compose up`
After waiting several minutes for Superset initialization to finish, you can open a browser and view [`http://localhost:8088`](http://localhost:8088)
to start your journey.

### Running Multiple Instances

If you need to run multiple Superset instances simultaneously (e.g., different branches or clones), use the make targets, which automatically find available ports:

```bash
make up
```

This automatically:
- Generates a unique project name from your directory
- Finds available ports (incrementing from defaults if in use)
- Displays the assigned URLs before starting

Available commands (run from repo root):

| Command | Description |
|---------|-------------|
| `make up` | Start services (foreground) |
| `make up-detached` | Start services (background) |
| `make down` | Stop all services |
| `make ps` | Show running containers |
| `make logs` | Follow container logs |
| `make nuke` | Stop, remove volumes & local images |

From a subdirectory, use: `make -C $(git rev-parse --show-toplevel) up`

**Important**: Always use these commands instead of plain `docker compose down`, which won't know the correct project name.

## Developing

While running, the container server will reload on modification of the Superset Python and JavaScript source code.

@@ -80,12 +80,16 @@ case "${1}" in
;;
app)
echo "Starting web app (using development server)..."
flask run -p $PORT --reload --debugger --without-threads --host=0.0.0.0
flask run -p $PORT --reload --debugger --without-threads --host=0.0.0.0 --exclude-patterns "*/node_modules/*:*/.venv/*:*/build/*:*/__pycache__/*"
;;
app-gunicorn)
echo "Starting web app..."
/usr/bin/run-server.sh
;;
mcp)
echo "Starting MCP service..."
superset mcp run --host 0.0.0.0 --port ${MCP_PORT:-5008} --debug
;;
*)
echo "Unknown Operation!!!"
;;

@@ -19,6 +19,7 @@

# Import all settings from the main config first
from flask_caching.backends.filesystemcache import FileSystemCache

from superset_config import *  # noqa: F403

# Override caching to use simple in-memory cache instead of Redis

docker/superset-websocket/config.example.json (new file, 22 lines)
@@ -0,0 +1,22 @@
{
  "port": 8080,
  "logLevel": "info",
  "logToFile": false,
  "logFilename": "app.log",
  "statsd": {
    "host": "127.0.0.1",
    "port": 8125,
    "globalTags": []
  },
  "redis": {
    "port": 6379,
    "host": "127.0.0.1",
    "password": "",
    "db": 0,
    "ssl": false
  },
  "redisStreamPrefix": "async-events-",
  "jwtAlgorithms": ["HS256"],
  "jwtSecret": "CHANGE-ME-IN-PRODUCTION-GOTTA-BE-LONG-AND-SECRET",
  "jwtCookieName": "async-token"
}
docs/.claude/instructions.md (new file, 115 lines)
@@ -0,0 +1,115 @@
# Developer Portal Documentation Instructions

## Core Principle: Stories Are the Single Source of Truth

When working on the Storybook-to-MDX documentation system:

**ALWAYS fix the story first. NEVER add workarounds to the generator.**

## Why This Matters

The generator (`scripts/generate-superset-components.mjs`) should be lightweight - it extracts data from stories and passes it through. When you add special cases to the generator:
- It becomes harder to maintain
- Stories diverge from their docs representation
- Future stories need to know about generator quirks

When you fix stories to match the expected patterns:
- Stories work identically in Storybook and Docs
- The generator stays simple and predictable
- Patterns are consistent and learnable

## Story Patterns for Docs Generation

### Required Structure
```tsx
// Use inline export default (NOT const meta = ...; export default meta)
export default {
  title: 'Components/MyComponent',
  component: MyComponent,
};

// Name interactive stories with Interactive prefix
export const InteractiveMyComponent: Story = {
  args: {
    // Default prop values
  },
  argTypes: {
    // Control definitions - MUST be at story level, not meta level
    propName: {
      control: { type: 'select' },
      options: ['a', 'b', 'c'],
      description: 'What this prop does',
    },
  },
};
```

### For Components with Variants (size × style grids)
```tsx
const sizes = ['small', 'medium', 'large'];
const variants = ['primary', 'secondary', 'danger'];

InteractiveButton.parameters = {
  docs: {
    gallery: {
      component: 'Button',
      sizes,
      styles: variants,
      sizeProp: 'size',
      styleProp: 'variant',
    },
  },
};
```

### For Components Requiring Children
```tsx
InteractiveIconTooltip.parameters = {
  docs: {
    // Component descriptors with dot notation for nested components
    sampleChildren: [{ component: 'Icons.InfoCircleOutlined', props: { iconSize: 'l' } }],
  },
};
```

### For Custom Live Code Examples
```tsx
InteractiveMyComponent.parameters = {
  docs: {
    liveExample: `function Demo() {
  return <MyComponent prop="value">Content</MyComponent>;
}`,
  },
};
```

### For Complex Props (objects, arrays)
```tsx
InteractiveMenu.parameters = {
  docs: {
    staticProps: {
      items: [
        { key: '1', label: 'Item 1' },
        { key: '2', label: 'Item 2' },
      ],
    },
  },
};
```

## Common Issues and How to Fix Them (in the Story)

| Issue | Wrong Approach | Right Approach |
|-------|---------------|----------------|
| Component not generated | Add pattern to generator | Change story to use inline `export default` |
| Control shows as text instead of select | Add special case in generator | Add `argTypes` with `control: { type: 'select' }` |
| Missing children/content | Modify StorybookWrapper | Add `parameters.docs.sampleChildren` |
| Gallery not showing | Add to generator output | Add `parameters.docs.gallery` config |
| Wrong live example | Hardcode in generator | Add `parameters.docs.liveExample` |

## Files

- **Generator**: `docs/scripts/generate-superset-components.mjs`
- **Wrapper**: `docs/src/components/StorybookWrapper.jsx`
- **Output**: `docs/developer_portal/components/`
- **Stories**: `superset-frontend/packages/superset-ui-core/src/components/*/`
docs/.gitignore (vendored, 21 lines added)
@@ -23,3 +23,24 @@ docs/.zshrc

# Gets copied from the root of the project at build time (yarn start / yarn build)
docs/intro.md

# Generated badge images (downloaded at build time by remark-localize-badges plugin)
static/badges/

# Generated database documentation MDX files (regenerated at build time)
# Source of truth is in superset/db_engine_specs/*.py metadata attributes
docs/databases/

# Generated API documentation (regenerated at build time from openapi.json)
# Source of truth is static/resources/openapi.json
docs/api/

# Generated component documentation MDX files (regenerated at build time)
# Source of truth is Storybook stories in superset-frontend/packages/superset-ui-core/src/components/
developer_portal/components/

# Generated extension component documentation (regenerated at build time)
developer_portal/extensions/components/

# Note: src/data/databases.json is COMMITTED (not ignored) to preserve feature diagnostics
# that require Flask context to generate. Update it locally with: npm run gen-db-docs

@@ -416,7 +416,7 @@ If versions don't appear in dropdown:

- [Docusaurus Documentation](https://docusaurus.io/docs)
- [MDX Documentation](https://mdxjs.com/)
- [Superset Contributing Guide](../CONTRIBUTING.md)
- [Superset Developer Portal](https://superset.apache.org/developer_portal/)
- [Main Superset Documentation](https://superset.apache.org/docs/intro)

## 📖 Real Examples and Patterns

@@ -18,9 +18,9 @@ under the License.
-->

This is the public documentation site for Superset, built using
[Docusaurus 3](https://docusaurus.io/). See
[CONTRIBUTING.md](../CONTRIBUTING.md#documentation) for documentation on
contributing to documentation.
[Docusaurus 3](https://docusaurus.io/). See the
[Developer Portal](https://superset.apache.org/developer_portal/contributing/development-setup#documentation)
for documentation on contributing to documentation.

## Version Management

@@ -19,5 +19,14 @@
*/

module.exports = {
presets: [require.resolve('@docusaurus/core/lib/babel/preset')],
presets: [
  [
    require.resolve('@docusaurus/core/lib/babel/preset'),
    {
      runtime: 'automatic',
      importSource: '@emotion/react',
    },
  ],
],
plugins: ['@emotion/babel-plugin'],
};

@@ -1,772 +0,0 @@
|
||||
---
|
||||
title: Frontend API Reference
|
||||
sidebar_position: 1
|
||||
---
|
||||
|
||||
<!--
|
||||
Licensed to the Apache Software Foundation (ASF) under one
|
||||
or more contributor license agreements. See the NOTICE file
|
||||
distributed with this work for additional information
|
||||
regarding copyright ownership. The ASF licenses this file
|
||||
to you under the Apache License, Version 2.0 (the
|
||||
"License"); you may not use this file except in compliance
|
||||
with the License. You may obtain a copy of the License at
|
||||
|
||||
http://www.apache.org/licenses/LICENSE-2.0
|
||||
|
||||
Unless required by applicable law or agreed to in writing,
|
||||
software distributed under the License is distributed on an
|
||||
"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
|
||||
KIND, either express or implied. See the License for the
|
||||
specific language governing permissions and limitations
|
||||
under the License.
|
||||
-->
|
||||
|
||||
# Frontend Extension API Reference
|
||||
|
||||
The `@apache-superset/core` package provides comprehensive APIs for frontend extensions to interact with Apache Superset. All APIs are versioned and follow semantic versioning principles.
|
||||
|
||||
## Core APIs
|
||||
|
||||
### Extension Context
|
||||
|
||||
Every extension receives a context object during activation that provides access to the extension system.
|
||||
|
||||
```typescript
|
||||
interface ExtensionContext {
|
||||
// Unique extension identifier
|
||||
extensionId: string;
|
||||
|
||||
// Extension metadata
|
||||
extensionPath: string;
|
||||
extensionUri: Uri;
|
||||
|
||||
// Storage paths
|
||||
globalStorageUri: Uri;
|
||||
workspaceStorageUri: Uri;
|
||||
|
||||
// Subscription management
|
||||
subscriptions: Disposable[];
|
||||
|
||||
// State management
|
||||
globalState: Memento;
|
||||
workspaceState: Memento;
|
||||
|
||||
// Extension-specific APIs
|
||||
registerView(viewId: string, component: React.Component): Disposable;
|
||||
registerCommand(commandId: string, handler: CommandHandler): Disposable;
|
||||
}
|
||||
```
|
||||
|
||||
### Lifecycle Methods
|
||||
|
||||
```typescript
|
||||
// Required: Called when extension is activated
|
||||
export function activate(context: ExtensionContext): void | Promise<void> {
|
||||
console.log('Extension activated');
|
||||
}
|
||||
|
||||
// Optional: Called when extension is deactivated
|
||||
export function deactivate(): void | Promise<void> {
|
||||
console.log('Extension deactivated');
|
||||
}
|
||||
```
|
||||
|
||||
## SQL Lab APIs
|
||||
|
||||
The `sqlLab` namespace provides APIs specific to SQL Lab functionality.
|
||||
|
||||
### Query Management
|
||||
|
||||
```typescript
|
||||
// Get current query editor content
|
||||
sqlLab.getCurrentQuery(): string | undefined
|
||||
|
||||
// Get active tab information
|
||||
sqlLab.getCurrentTab(): Tab | undefined
|
||||
|
||||
// Get all open tabs
|
||||
sqlLab.getTabs(): Tab[]
|
||||
|
||||
// Get available databases
|
||||
sqlLab.getDatabases(): Database[]
|
||||
|
||||
// Get schemas for a database
|
||||
sqlLab.getSchemas(databaseId: number): Promise<Schema[]>
|
||||
|
||||
// Get tables for a schema
|
||||
sqlLab.getTables(databaseId: number, schema: string): Promise<Table[]>
|
||||
|
||||
// Insert text at cursor position
|
||||
sqlLab.insertText(text: string): void
|
||||
|
||||
// Replace entire query
|
||||
sqlLab.replaceQuery(query: string): void
|
||||
|
||||
// Execute current query
|
||||
sqlLab.executeQuery(): Promise<QueryResult>
|
||||
|
||||
// Stop query execution
|
||||
sqlLab.stopQuery(queryId: string): Promise<void>
|
||||
```
|
||||
|
||||
### Event Subscriptions
|
||||
|
||||
```typescript
|
||||
// Query execution events
|
||||
sqlLab.onDidQueryRun(
|
||||
listener: (event: QueryRunEvent) => void
|
||||
): Disposable
|
||||
|
||||
sqlLab.onDidQueryComplete(
|
||||
listener: (event: QueryCompleteEvent) => void
|
||||
): Disposable
|
||||
|
||||
sqlLab.onDidQueryFail(
|
||||
listener: (event: QueryFailEvent) => void
|
||||
): Disposable
|
||||
|
||||
// Editor events
|
||||
sqlLab.onDidChangeEditorContent(
|
||||
listener: (content: string) => void
|
||||
): Disposable
|
||||
|
||||
sqlLab.onDidChangeActiveTab(
|
||||
listener: (tab: Tab) => void
|
||||
): Disposable
|
||||
|
||||
// Panel events
|
||||
sqlLab.onDidOpenPanel(
|
||||
listener: (panel: Panel) => void
|
||||
): Disposable
|
||||
|
||||
sqlLab.onDidClosePanel(
|
||||
listener: (panel: Panel) => void
|
||||
): Disposable
|
||||
```
|
||||
|
||||
### Types
|
||||
|
||||
```typescript
|
||||
interface Tab {
|
||||
id: string;
|
||||
title: string;
|
||||
query: string;
|
||||
database: Database;
|
||||
schema?: string;
|
||||
isActive: boolean;
|
||||
queryId?: string;
|
||||
status?: 'pending' | 'running' | 'success' | 'error';
|
||||
}
|
||||
|
||||
interface Database {
|
||||
id: number;
|
||||
name: string;
|
||||
backend: string;
|
||||
allows_subquery: boolean;
|
||||
allows_ctas: boolean;
|
||||
allows_cvas: boolean;
|
||||
}
|
||||
|
||||
interface QueryResult {
|
||||
queryId: string;
|
||||
status: 'success' | 'error';
|
||||
data?: any[];
|
||||
columns?: Column[];
|
||||
error?: string;
|
||||
startTime: number;
|
||||
endTime: number;
|
||||
rows: number;
|
||||
}
|
||||
```

## Commands API

Register and execute commands within Superset.

### Registration

```typescript
interface CommandHandler {
  execute(...args: any[]): any | Promise<any>;
  isEnabled?(): boolean;
  isVisible?(): boolean;
}

// Register a command
commands.registerCommand(
  commandId: string,
  handler: CommandHandler
): Disposable

// Register with metadata
commands.registerCommand(
  commandId: string,
  metadata: CommandMetadata,
  handler: (...args: any[]) => any
): Disposable

interface CommandMetadata {
  title: string;
  category?: string;
  icon?: string;
  enablement?: string;
  when?: string;
}
```

### Execution

```typescript
// Execute a command
commands.executeCommand<T>(
  commandId: string,
  ...args: any[]
): Promise<T>

// Get all registered commands
commands.getCommands(): Promise<string[]>

// Check if command exists
commands.hasCommand(commandId: string): boolean
```
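
The register/execute/check contract above can be mimicked with a plain `Map`. This is a minimal in-memory sketch, not the host implementation; `CommandRegistry` and the `demo.add` command are invented names for illustration:

```typescript
type Handler = (...args: any[]) => any;

interface Disposable {
  dispose(): void;
}

// Hypothetical stand-in for the host's command service: a Map of handlers
// with the register/execute/has semantics described above.
class CommandRegistry {
  private handlers = new Map<string, Handler>();

  registerCommand(commandId: string, handler: Handler): Disposable {
    if (this.handlers.has(commandId)) {
      throw new Error(`Command already registered: ${commandId}`);
    }
    this.handlers.set(commandId, handler);
    // Disposing the registration removes the handler again.
    return { dispose: () => this.handlers.delete(commandId) };
  }

  async executeCommand<T>(commandId: string, ...args: any[]): Promise<T> {
    const handler = this.handlers.get(commandId);
    if (!handler) {
      throw new Error(`Unknown command: ${commandId}`);
    }
    return handler(...args) as T;
  }

  hasCommand(commandId: string): boolean {
    return this.handlers.has(commandId);
  }
}

// demo.add is invented for illustration.
const registry = new CommandRegistry();
const registration = registry.registerCommand(
  'demo.add',
  (a: number, b: number) => a + b
);
```

Returning a `Disposable` from `registerCommand` mirrors the documented API: dropping the registration is how an extension cleans up after itself.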

### Built-in Commands

```typescript
// SQL Lab commands
'sqllab.executeQuery'
'sqllab.formatQuery'
'sqllab.saveQuery'
'sqllab.shareQuery'
'sqllab.downloadResults'

// Editor commands
'editor.action.formatDocument'
'editor.action.commentLine'
'editor.action.findReferences'

// Extension commands
'extensions.installExtension'
'extensions.uninstallExtension'
'extensions.enableExtension'
'extensions.disableExtension'
```

## UI Components

Pre-built components from `@apache-superset/core` for consistent UI.

### Basic Components

```typescript
import {
  Button,
  Input,
  Select,
  Checkbox,
  Radio,
  Switch,
  Slider,
  DatePicker,
  TimePicker,
  Tooltip,
  Popover,
  Modal,
  Drawer,
  Alert,
  Message,
  Notification,
  Spin,
  Progress
} from '@apache-superset/core';
```

### Data Display

```typescript
import {
  Table,
  List,
  Card,
  Collapse,
  Tabs,
  Tag,
  Badge,
  Statistic,
  Timeline,
  Tree,
  Empty,
  Result
} from '@apache-superset/core';
```

### Form Components

```typescript
import {
  Form,
  FormItem,
  FormList,
  InputNumber,
  TextArea,
  Upload,
  Rate,
  Cascader,
  AutoComplete,
  Mentions
} from '@apache-superset/core';
```

## Authentication API

Access authentication and user information.

```typescript
// Get current user
authentication.getCurrentUser(): User | undefined

interface User {
  id: number;
  username: string;
  email: string;
  firstName: string;
  lastName: string;
  roles: Role[];
  isActive: boolean;
  isAnonymous: boolean;
}

// Get CSRF token for API requests
authentication.getCSRFToken(): Promise<string>

// Check permissions
authentication.hasPermission(
  permission: string,
  resource?: string
): boolean

// Get user preferences
authentication.getPreferences(): UserPreferences

// Update preferences
authentication.setPreference(
  key: string,
  value: any
): Promise<void>
```

## Storage API

Persist data across sessions.

### Global Storage

```typescript
// Shared across all workspaces
const globalState = context.globalState;

// Get value
globalState.get<T>(key: string): T | undefined

// Set value
globalState.update(key: string, value: any): Promise<void>

// Get all keys
globalState.keys(): readonly string[]
```

### Workspace Storage

```typescript
// Specific to current workspace
const workspaceState = context.workspaceState;

// Same API as globalState
workspaceState.get<T>(key: string): T | undefined
workspaceState.update(key: string, value: any): Promise<void>
workspaceState.keys(): readonly string[]
```

### Secrets Storage

```typescript
// Secure storage for sensitive data
secrets.store(key: string, value: string): Promise<void>
secrets.get(key: string): Promise<string | undefined>
secrets.delete(key: string): Promise<void>
```

## Events API

Subscribe to and emit custom events.

```typescript
// Create an event emitter
const onDidChange = new EventEmitter<ChangeEvent>();

// Expose as event
export const onChange = onDidChange.event;

// Fire event
onDidChange.fire({
  type: 'update',
  data: newData
});

// Subscribe to event
const disposable = onChange((event) => {
  console.log('Changed:', event);
});

// Cleanup
disposable.dispose();
```
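
The emitter in the snippet above ships with `@apache-superset/core`; a minimal sketch of how such an emitter is typically built (a listener set, a `fire` loop, and a `Disposable` that unsubscribes) looks like this:

```typescript
interface Disposable {
  dispose(): void;
}

// Minimal sketch of the EventEmitter contract used above; the real class
// comes from the core package.
class EventEmitter<T> {
  private listeners = new Set<(e: T) => void>();

  // The subscribable side: calling it registers a listener and returns
  // a Disposable that removes it again.
  readonly event = (listener: (e: T) => void): Disposable => {
    this.listeners.add(listener);
    return { dispose: () => this.listeners.delete(listener) };
  };

  fire(e: T): void {
    for (const listener of this.listeners) {
      listener(e);
    }
  }
}

interface ChangeEvent {
  type: string;
  data: unknown;
}

const onDidChange = new EventEmitter<ChangeEvent>();
const seen: ChangeEvent[] = [];

const subscription = onDidChange.event(e => seen.push(e));
onDidChange.fire({ type: 'update', data: 1 });
subscription.dispose();
onDidChange.fire({ type: 'update', data: 2 }); // not delivered: already disposed
```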

## Window API

Interact with the UI window.

### Notifications

```typescript
// Show info message
window.showInformationMessage(
  message: string,
  ...items: string[]
): Promise<string | undefined>

// Show warning
window.showWarningMessage(
  message: string,
  ...items: string[]
): Promise<string | undefined>

// Show error
window.showErrorMessage(
  message: string,
  ...items: string[]
): Promise<string | undefined>

// Show with options
window.showInformationMessage(
  message: string,
  options: MessageOptions,
  ...items: MessageItem[]
): Promise<MessageItem | undefined>

interface MessageOptions {
  modal?: boolean;
  detail?: string;
}
```

### Input Dialogs

```typescript
// Show input box
window.showInputBox(
  options?: InputBoxOptions
): Promise<string | undefined>

interface InputBoxOptions {
  title?: string;
  prompt?: string;
  placeHolder?: string;
  value?: string;
  password?: boolean;
  validateInput?(value: string): string | null;
}

// Show quick pick
window.showQuickPick(
  items: string[] | QuickPickItem[],
  options?: QuickPickOptions
): Promise<string | QuickPickItem | undefined>

interface QuickPickOptions {
  title?: string;
  placeHolder?: string;
  canPickMany?: boolean;
  matchOnDescription?: boolean;
  matchOnDetail?: boolean;
}
```
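
`validateInput` returns an error message to reject the current value, or `null` to accept it. A sketch of such a validator (the naming rule enforced here is purely illustrative):

```typescript
// Sketch of a validateInput callback for showInputBox: return an error
// message to reject the value, or null to accept it. The rule itself
// (3-40 lowercase characters, starting with a letter) is made up.
function validateDatasetName(value: string): string | null {
  if (value.trim().length === 0) {
    return 'Name is required';
  }
  if (!/^[a-z][a-z0-9_]{2,39}$/.test(value)) {
    return 'Use 3-40 lowercase letters, digits, or underscores, starting with a letter';
  }
  return null;
}
```

Passing it as `validateInput: validateDatasetName` in `InputBoxOptions` would surface the message inline while the user types.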

### Progress

```typescript
// Show progress
window.withProgress<T>(
  options: ProgressOptions,
  task: (progress: Progress<{message?: string}>) => Promise<T>
): Promise<T>

interface ProgressOptions {
  location: ProgressLocation;
  title?: string;
  cancellable?: boolean;
}

// Example usage
await window.withProgress(
  {
    location: ProgressLocation.Notification,
    title: "Processing",
    cancellable: true
  },
  async (progress) => {
    progress.report({ message: 'Step 1...' });
    await step1();
    progress.report({ message: 'Step 2...' });
    await step2();
  }
);
```

## Workspace API

Access workspace information and configuration.

```typescript
// Get workspace folders
workspace.workspaceFolders: readonly WorkspaceFolder[]

// Get configuration
workspace.getConfiguration(
  section?: string
): WorkspaceConfiguration

// Update configuration
workspace.getConfiguration('myExtension')
  .update('setting', value, ConfigurationTarget.Workspace)

// Watch for configuration changes
workspace.onDidChangeConfiguration(
  listener: (e: ConfigurationChangeEvent) => void
): Disposable

// File system operations
workspace.fs.readFile(uri: Uri): Promise<Uint8Array>
workspace.fs.writeFile(uri: Uri, content: Uint8Array): Promise<void>
workspace.fs.delete(uri: Uri): Promise<void>
workspace.fs.rename(oldUri: Uri, newUri: Uri): Promise<void>
workspace.fs.copy(source: Uri, destination: Uri): Promise<void>
workspace.fs.createDirectory(uri: Uri): Promise<void>
workspace.fs.readDirectory(uri: Uri): Promise<[string, FileType][]>
workspace.fs.stat(uri: Uri): Promise<FileStat>
```

## HTTP Client API

Make HTTP requests from extensions.

```typescript
import { api } from '@apache-superset/core';

// GET request
const response = await api.get('/api/v1/chart/');

// POST request
const response = await api.post('/api/v1/chart/', {
  data: chartData
});

// PUT request
const response = await api.put('/api/v1/chart/123', {
  data: updatedData
});

// DELETE request
const response = await api.delete('/api/v1/chart/123');

// Custom headers
const response = await api.get('/api/v1/chart/', {
  headers: {
    'X-Custom-Header': 'value'
  }
});

// Query parameters
const response = await api.get('/api/v1/chart/', {
  params: {
    page: 1,
    page_size: 20
  }
});
```

## Theming API

Access and customize theme settings.

```typescript
// Get current theme
theme.getActiveTheme(): Theme

interface Theme {
  name: string;
  isDark: boolean;
  colors: ThemeColors;
  typography: Typography;
  spacing: Spacing;
}

// Listen for theme changes
theme.onDidChangeTheme(
  listener: (theme: Theme) => void
): Disposable

// Get theme colors
const colors = theme.colors;
colors.primary
colors.success
colors.warning
colors.error
colors.info
colors.text
colors.background
colors.border
```

## Disposable Pattern

Manage resource cleanup consistently.

```typescript
interface Disposable {
  dispose(): void;
}

// Create a disposable
class MyDisposable implements Disposable {
  dispose() {
    // Cleanup logic
  }
}

// Combine disposables
const composite = Disposable.from(
  disposable1,
  disposable2,
  disposable3
);

// Dispose all at once
composite.dispose();

// Use in extension
export function activate(context: ExtensionContext) {
  // All disposables added here are cleaned up on deactivation
  context.subscriptions.push(
    registerCommand(...),
    registerView(...),
    onDidChange(...)
  );
}
```
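
A composite like `Disposable.from` above can be sketched in a few lines; this stand-in (`combineDisposables` is an invented name) also guards against double disposal:

```typescript
interface Disposable {
  dispose(): void;
}

// Sketch of a Disposable.from-style combinator: disposes each child
// exactly once, even if dispose() is called repeatedly.
function combineDisposables(...disposables: Disposable[]): Disposable {
  let disposed = false;
  return {
    dispose() {
      if (disposed) return;
      disposed = true;
      for (const d of disposables) {
        d.dispose();
      }
    },
  };
}

let calls = 0;
const composite = combineDisposables(
  { dispose: () => { calls += 1; } },
  { dispose: () => { calls += 1; } }
);

composite.dispose();
composite.dispose(); // second call is a no-op
```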

## Type Definitions

Complete TypeScript definitions are available:

```typescript
import type {
  ExtensionContext,
  Disposable,
  Event,
  EventEmitter,
  Uri,
  Command,
  QuickPickItem,
  InputBoxOptions,
  Progress,
  CancellationToken
} from '@apache-superset/core';
```

## Version Compatibility

The API follows semantic versioning:

```typescript
// Check API version
const version = superset.version;

// Version components
version.major // Breaking changes
version.minor // New features
version.patch // Bug fixes

// Check minimum version
if (version.major < 1) {
  throw new Error('Requires Superset API v1.0.0 or higher');
}
```
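
The major-version check above generalizes to a full `major.minor.patch` comparison. An illustrative helper, assuming plain numeric components with no pre-release or build tags:

```typescript
// Version mirrors the major/minor/patch fields shown above.
interface Version {
  major: number;
  minor: number;
  patch: number;
}

function parseVersion(text: string): Version {
  const [major, minor, patch] = text.split('.').map(Number);
  return { major, minor, patch };
}

// Precedence check: compare component by component, most significant first.
function satisfiesMinimum(actual: Version, minimum: Version): boolean {
  if (actual.major !== minimum.major) return actual.major > minimum.major;
  if (actual.minor !== minimum.minor) return actual.minor > minimum.minor;
  return actual.patch >= minimum.patch;
}
```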

## Migration Guide

### From v0.x to v1.0

```typescript
// Before (v0.x)
sqlLab.runQuery(query);

// After (v1.0)
sqlLab.executeQuery();

// Before (v0.x)
core.registerPanel(id, component);

// After (v1.0)
context.registerView(id, component);
```

## Best Practices

### Error Handling

```typescript
export async function activate(context: ExtensionContext) {
  try {
    await initializeExtension();
  } catch (error) {
    console.error('Failed to initialize:', error);
    window.showErrorMessage(
      `Extension failed to activate: ${error.message}`
    );
  }
}
```

### Resource Management

```typescript
// Always use disposables
const disposables: Disposable[] = [];

disposables.push(
  commands.registerCommand(...),
  sqlLab.onDidQueryRun(...),
  workspace.onDidChangeConfiguration(...)
);

// Cleanup in deactivate
export function deactivate() {
  disposables.forEach(d => d.dispose());
}
```

### Type Safety

```typescript
// Use type guards
function isDatabase(obj: any): obj is Database {
  return obj && typeof obj.id === 'number' && typeof obj.name === 'string';
}

// Use generics
function getValue<T>(key: string, defaultValue: T): T {
  return context.globalState.get(key) ?? defaultValue;
}
```
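
A guard like `isDatabase` above also works as a `filter` predicate, narrowing an `unknown[]` to typed results. A self-contained sketch (the `Database` interface is trimmed to the two checked fields):

```typescript
interface Database {
  id: number;
  name: string;
}

// Same narrowing pattern as above: truthiness plus field-type checks.
function isDatabase(obj: any): obj is Database {
  return !!obj && typeof obj.id === 'number' && typeof obj.name === 'string';
}

const candidates: unknown[] = [
  { id: 1, name: 'examples' },
  { id: 'oops', name: 'broken' },
  null,
];

// filter() with a type-guard predicate yields Database[].
const databases = candidates.filter(isDatabase);
```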
---
title: Extension Architecture
sidebar_position: 1
---

<!--
Licensed to the Apache Software Foundation (ASF) under one
or more contributor license agreements. See the NOTICE file
distributed with this work for additional information
regarding copyright ownership. The ASF licenses this file
to you under the Apache License, Version 2.0 (the
"License"); you may not use this file except in compliance
with the License. You may obtain a copy of the License at

http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing,
software distributed under the License is distributed on an
"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
KIND, either express or implied. See the License for the
specific language governing permissions and limitations
under the License.
-->

# Superset Extension Architecture

Apache Superset's extension architecture enables developers to enhance and customize the platform without modifying the core codebase. Inspired by the successful VS Code Extensions model, this architecture provides well-defined, versioned APIs and clear contribution points that allow the community to build upon and extend Superset's functionality.

## Core Concepts

### Extensions vs Plugins

We use the term "extensions" rather than "plugins" to better convey the idea of enhancing and expanding Superset's core capabilities in a modular and integrated way. Extensions can add new features, modify existing behavior, and integrate deeply with the host application through well-defined APIs.

### Lean Core Philosophy

Superset's core remains minimal, with many features delegated to extensions. Built-in features are implemented using the same APIs available to external extension authors, ensuring consistency and validating the extension architecture through real-world usage.

## Architecture Overview

The extension architecture consists of several key components:

### Core Packages

#### @apache-superset/core (Frontend)
Provides essential building blocks for extensions:
- Shared UI components
- Utility functions
- Type definitions
- Frontend APIs for interacting with the host

#### apache-superset-core (Backend)
Exposes backend functionality:
- Database access APIs
- Security models
- REST API extensions
- SQLAlchemy models and utilities

### Extension CLI

The `apache-superset-extensions-cli` package provides commands for:
- Scaffolding new extension projects
- Building and bundling extensions
- Development workflows with hot-reload
- Packaging extensions for distribution

### Host Application

Superset acts as the host, providing:
- Extension registration and management
- Dynamic loading of extension assets
- API implementation for extensions
- Lifecycle management (activation/deactivation)

## Extension Points

Extensions can contribute to various parts of Superset:

### SQL Lab Extensions
- Custom panels (left, right, bottom)
- Editor enhancements
- Query processors
- Autocomplete providers
- Execution plan visualizers

### Dashboard Extensions (Future)
- Custom widget types
- Filter components
- Interaction handlers

### Chart Extensions (Future)
- New visualization types
- Data transformers
- Export formats

## Technical Foundation

### Module Federation

Frontend extensions leverage Webpack Module Federation for dynamic loading:
- Extensions are built independently
- Dependencies are shared with the host
- No rebuild of Superset required
- Runtime loading of extension assets

### API Versioning

All public APIs follow semantic versioning:
- Breaking changes require major version bumps
- Extensions declare compatibility requirements
- Backward compatibility maintained within major versions

### Security Model

- Extensions disabled by default (require `ENABLE_EXTENSIONS` flag)
- Built-in extensions follow same security standards as core
- External extensions run in same context as host (sandboxing planned)
- Administrators responsible for vetting third-party extensions

## Development Workflow

1. **Initialize**: Use CLI to scaffold new extension
2. **Develop**: Work with hot-reload in development mode
3. **Build**: Bundle frontend and backend assets
4. **Package**: Create `.supx` distribution file
5. **Deploy**: Upload through API or management UI

## Example: Dataset References Extension

A practical example demonstrating the architecture:

```typescript
// Frontend activation
export function activate(context) {
  // Register a new SQL Lab panel
  const panel = core.registerView('dataset_references.main',
    <DatasetReferencesPanel />
  );

  // Listen to query changes
  const listener = sqlLab.onDidQueryRun(editor => {
    // Analyze query and update panel
  });

  // Cleanup on deactivation
  context.subscriptions.push(panel, listener);
}
```

```python
# Backend API extension
from superset_core.api import rest_api
from .api import DatasetReferencesAPI

# Register custom REST endpoints
rest_api.add_extension_api(DatasetReferencesAPI)
```

## Best Practices

### Extension Design
- Keep extensions focused on specific functionality
- Use versioned APIs for stability
- Handle cleanup properly on deactivation
- Follow Superset's coding standards

### Performance
- Lazy load assets when possible
- Minimize bundle sizes
- Share dependencies with host
- Cache expensive operations

### Compatibility
- Declare API version requirements
- Test across Superset versions
- Provide migration guides for breaking changes
- Document compatibility clearly

## Future Roadmap

Planned enhancements include:
- JavaScript sandboxing for untrusted extensions
- Extension marketplace and registry
- Inter-extension communication
- Advanced theming capabilities
- Backend hot-reload without restart

## Getting Started

Ready to build your first extension? Check out:
- [Extension Project Structure](/developer_portal/extensions/extension-project-structure)
- [API Reference](/developer_portal/api/frontend)
- [CLI Documentation](/developer_portal/cli/overview)
- [Frontend Contribution Types](/developer_portal/extensions/frontend-contribution-types)
---
title: Common Plugin Capabilities
sidebar_position: 2
---

# Common Plugin Capabilities

🚧 **Coming Soon** 🚧

Explore the shared functionality and common patterns available to all Superset plugins.

## Topics to be covered:

- Plugin lifecycle hooks (initialization, activation, deactivation)
- Accessing Superset's core services and APIs
- State management and data persistence
- Event handling and plugin communication
- Internationalization (i18n) support
- Error handling and logging
- Plugin configuration management
- Accessing user context and permissions
- Working with datasets and queries
- Plugin metadata and manifests

## Core Services Available

- **API Client** - HTTP client for backend communication
- **State Store** - Redux store access
- **Theme Provider** - Access to current theme settings
- **User Context** - Current user information and permissions
- **Dataset Service** - Working with data sources
- **Chart Service** - Chart rendering utilities

---

*This documentation is under active development. Check back soon for updates!*
---
title: Extending the Workbench
sidebar_position: 4
---

# Extending the Workbench

🚧 **Coming Soon** 🚧

Discover how to extend Superset's main interface and workbench with custom components and functionality.

## Topics to be covered:

- Adding custom menu items and navigation
- Creating custom dashboard components
- Extending the SQL Lab interface
- Adding custom sidebar panels
- Creating floating panels and modals
- Integrating with the command palette
- Custom toolbar buttons and actions
- Workspace state management
- Plugin-specific keyboard shortcuts
- Context menu extensions

## Extension Points

- **Main navigation** - Top-level menu items
- **Dashboard builder** - Custom components and layouts
- **SQL Lab** - Query editor extensions
- **Chart explorer** - Visualization building tools
- **Settings panels** - Configuration interfaces
- **Data source explorer** - Database navigation

## UI Integration Patterns

- React component composition
- Portal-based rendering
- Event-driven UI updates
- Responsive layout adaptation

---

*This documentation is under active development. Check back soon for updates!*
---
title: Plugin Capabilities Overview
sidebar_position: 1
---

# Plugin Capabilities Overview

🚧 **Coming Soon** 🚧

This section provides a comprehensive overview of what Superset plugins can do and how they integrate with the core platform.

## Topics to be covered:

- Plugin architecture and lifecycle
- Available extension points
- Core APIs and services
- Plugin communication patterns
- Configuration and settings management
- Plugin permissions and security
- Performance considerations
- Best practices for plugin development

## Plugin Types

Superset supports several types of plugins:
- **Visualization plugins** - Custom chart types and data visualizations
- **Database connectors** - New data source integrations
- **UI extensions** - Custom dashboard components and interfaces
- **Theme plugins** - Custom styling and branding
- **Filter plugins** - Custom filter components

---

*This documentation is under active development. Check back soon for updates!*
---
title: Theming and Styling
sidebar_position: 3
---

# Theming and Styling

🚧 **Coming Soon** 🚧

Learn how to create custom themes and style your plugins to match Superset's design system.

## Topics to be covered:

- Understanding Superset's theme architecture
- Using the theme provider in plugins
- Creating custom color palettes
- Responsive design considerations
- Dark mode and light mode support
- Customizing chart colors and styling
- Brand customization and white-labeling
- CSS-in-JS best practices
- Working with Ant Design components
- Accessibility in custom themes

## Theme Structure

- **Color tokens** - Primary, secondary, and semantic colors
- **Typography** - Font families, sizes, and weights
- **Spacing** - Grid system and layout tokens
- **Component styles** - Default component appearances
- **Chart themes** - Color schemes for visualizations

## Supported Theming APIs

- Theme provider context
- CSS custom properties
- Emotion/styled-components integration
- Chart color palette API

---

*This documentation is under active development. Check back soon for updates!*
---
title: Extension CLI
sidebar_position: 1
---

# Superset Extension CLI

The `apache-superset-extensions-cli` package provides command-line tools for creating, developing, and packaging Apache Superset extensions. It streamlines the entire extension development workflow from initialization to deployment.

## Installation

Install the CLI globally using pip:

```bash
pip install apache-superset-extensions-cli
```

Or install it for the current user only:

```bash
pip install --user apache-superset-extensions-cli
```

Verify the installation:

```bash
superset-extensions --version
# Output: apache-superset-extensions-cli version 1.0.0
```

## Commands Overview

| Command | Description |
|---------|-------------|
| `init` | Create a new extension project |
| `dev` | Start development mode with hot reload |
| `build` | Build extension assets for production |
| `bundle` | Package extension into a `.supx` file |
| `validate` | Validate extension metadata and structure |
| `publish` | Publish extension to registry (future) |

## Command Reference

### init

Creates a new extension project with the standard structure and boilerplate code.

```bash
superset-extensions init [options] <extension-name>
```

#### Options

- `--type, -t <type>` - Extension type: `full` (default), `frontend-only`, `backend-only`
- `--template <template>` - Project template: `default`, `sql-lab`, `dashboard`, `chart`
- `--author <name>` - Extension author name
- `--description <desc>` - Extension description
- `--license <license>` - License identifier (default: Apache-2.0)
- `--superset-version <version>` - Minimum Superset version (default: 4.0.0)
- `--skip-install` - Skip installing dependencies
- `--use-typescript` - Use TypeScript for frontend (default: true)
- `--use-npm` - Use npm instead of yarn

#### Examples

```bash
# Create a basic extension
superset-extensions init my-extension

# Create a SQL Lab focused extension
superset-extensions init query-optimizer --template sql-lab

# Create a frontend-only extension
superset-extensions init custom-viz --type frontend-only

# Create with metadata
superset-extensions init data-quality \
|
||||
--author "Jane Doe" \
|
||||
--description "Data quality monitoring for SQL Lab"
|
||||
```
|
||||
|
||||
#### Generated Structure
|
||||
|
||||
```
|
||||
my-extension/
|
||||
├── extension.json # Extension metadata
|
||||
├── frontend/ # Frontend source code
|
||||
│ ├── src/
|
||||
│ │ ├── index.tsx # Main entry point
|
||||
│ │ └── components/ # React components
|
||||
│ ├── package.json
|
||||
│ ├── tsconfig.json
|
||||
│ └── webpack.config.js
|
||||
├── backend/ # Backend source code
|
||||
│ ├── src/
|
||||
│ │ └── my_extension/
|
||||
│ │ ├── __init__.py
|
||||
│ │ └── api.py
|
||||
│ ├── tests/
|
||||
│ └── requirements.txt
|
||||
├── README.md
|
||||
└── .gitignore
|
||||
```

### dev

Starts development mode with automatic rebuilding and hot reload.

```bash
superset-extensions dev [options]
```

#### Options

- `--port, -p <port>` - Development server port (default: 9001)
- `--host <host>` - Development server host (default: localhost)
- `--watch-backend` - Watch backend files (default: true)
- `--watch-frontend` - Watch frontend files (default: true)
- `--no-open` - Don't open the browser automatically
- `--superset-url <url>` - Superset instance URL (default: http://localhost:8088)
- `--verbose` - Enable verbose logging

#### Examples

```bash
# Start development mode
superset-extensions dev

# Use a custom port
superset-extensions dev --port 9002

# Connect to a remote Superset
superset-extensions dev --superset-url https://superset.example.com
```

#### Development Workflow

1. **Start the dev server:**

   ```bash
   superset-extensions dev
   ```

2. **Configure Superset** (`superset_config.py`):

   ```python
   LOCAL_EXTENSIONS = [
       "/path/to/your/extension"
   ]
   ENABLE_EXTENSIONS = True
   ```

3. **Start Superset:**

   ```bash
   superset run -p 8088 --with-threads --reload
   ```

The extension will automatically reload when you make changes.

### build

Builds extension assets for production deployment.

```bash
superset-extensions build [options]
```

#### Options

- `--mode <mode>` - Build mode: `production` (default), `development`
- `--analyze` - Generate a bundle analysis report
- `--source-maps` - Generate source maps
- `--minify` - Minify output (default: true in production)
- `--output, -o <dir>` - Output directory (default: dist)
- `--clean` - Clean the output directory before building
- `--parallel` - Build frontend and backend in parallel

#### Examples

```bash
# Production build
superset-extensions build

# Development build with source maps
superset-extensions build --mode development --source-maps

# Analyze bundle size
superset-extensions build --analyze

# Custom output directory
superset-extensions build --output build
```

#### Build Output

```
dist/
├── manifest.json           # Build manifest
├── frontend/
│   ├── remoteEntry.[hash].js
│   ├── [name].[hash].js
│   └── assets/
└── backend/
    └── my_extension/
        ├── __init__.py
        └── *.py
```

### bundle

Packages the built extension into a distributable `.supx` file.

```bash
superset-extensions bundle [options]
```

#### Options

- `--output, -o <file>` - Output filename (default: `{name}-{version}.supx`)
- `--sign` - Sign the bundle (requires configured keys)
- `--compression <level>` - Compression level 0-9 (default: 6)
- `--exclude <patterns>` - Files to exclude (comma-separated)
- `--include-dev-deps` - Include development dependencies

#### Examples

```bash
# Create a bundle
superset-extensions bundle

# Custom output name
superset-extensions bundle --output my-extension-latest.supx

# Signed bundle
superset-extensions bundle --sign

# Exclude test files
superset-extensions bundle --exclude "**/*.test.js,**/*.spec.ts"
```

#### Bundle Structure

The `.supx` file is a ZIP archive containing:

```
my-extension-1.0.0.supx
├── manifest.json
├── extension.json
├── frontend/
│   └── dist/
└── backend/
    └── src/
```
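
Because a `.supx` bundle is an ordinary ZIP archive, you can inspect one with Python's standard library. A minimal sketch — the archive name and manifest fields below are illustrative stand-ins, not output produced by the CLI:

```python
import json
import zipfile

def list_bundle(path: str) -> dict:
    """Return the parsed manifest and the member list of a .supx archive."""
    with zipfile.ZipFile(path) as supx:
        manifest = json.loads(supx.read("manifest.json"))
        return {"manifest": manifest, "files": supx.namelist()}

# Build a tiny stand-in bundle so the sketch is self-contained.
with zipfile.ZipFile("demo.supx", "w") as supx:
    supx.writestr("manifest.json", json.dumps({"name": "demo", "version": "1.0.0"}))
    supx.writestr("extension.json", "{}")

info = list_bundle("demo.supx")
print(info["manifest"]["name"])  # prints "demo"
```

The same approach (`unzip -l my-extension-1.0.0.supx`) works from the shell, since nothing about the format is ZIP-incompatible.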

### validate

Validates extension structure, metadata, and compatibility.

```bash
superset-extensions validate [options]
```

#### Options

- `--strict` - Enable strict validation
- `--fix` - Auto-fix correctable issues
- `--check-deps` - Validate dependencies
- `--check-security` - Run security checks

#### Examples

```bash
# Basic validation
superset-extensions validate

# Strict mode with auto-fix
superset-extensions validate --strict --fix

# Full validation
superset-extensions validate --check-deps --check-security
```

#### Validation Checks

- Extension metadata completeness
- File structure conformity
- API version compatibility
- Dependency security vulnerabilities
- Code quality standards
- Bundle size limits

## Configuration File

Create `.superset-extension.json` for project-specific settings:

```json
{
  "build": {
    "mode": "production",
    "sourceMaps": true,
    "analyze": false,
    "parallel": true
  },
  "dev": {
    "port": 9001,
    "host": "localhost",
    "autoOpen": true
  },
  "bundle": {
    "compression": 6,
    "sign": false,
    "exclude": [
      "**/*.test.*",
      "**/*.spec.*",
      "**/tests/**"
    ]
  },
  "validation": {
    "strict": true,
    "autoFix": true
  }
}
```

## Environment Variables

Configure CLI behavior using environment variables:

```bash
# Superset connection
export SUPERSET_URL=http://localhost:8088
export SUPERSET_USERNAME=admin
export SUPERSET_PASSWORD=admin

# Development settings
export EXTENSION_DEV_PORT=9001
export EXTENSION_DEV_HOST=localhost

# Build settings
export EXTENSION_BUILD_MODE=production
export EXTENSION_SOURCE_MAPS=true

# Registry settings (future)
export EXTENSION_REGISTRY_URL=https://registry.superset.apache.org
export EXTENSION_REGISTRY_TOKEN=your-token
```
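
Tools that consume variables like these typically layer them over built-in defaults, with the environment winning. A minimal sketch of that precedence — the variable names match the list above, but the defaults and the `setting` helper are illustrative:

```python
import os

# Illustrative defaults; the real CLI's defaults may differ.
DEFAULTS = {
    "SUPERSET_URL": "http://localhost:8088",
    "EXTENSION_DEV_PORT": "9001",
}

def setting(name: str) -> str:
    """An environment variable overrides the default; otherwise fall back."""
    return os.environ.get(name, DEFAULTS[name])

os.environ["EXTENSION_DEV_PORT"] = "9002"  # simulate `export EXTENSION_DEV_PORT=9002`
os.environ.pop("SUPERSET_URL", None)       # ensure the default applies

print(setting("EXTENSION_DEV_PORT"))  # prints "9002"
print(setting("SUPERSET_URL"))        # prints "http://localhost:8088"
```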

## Advanced Usage

### Custom Templates

Create custom project templates:

```bash
# Use a custom template
superset-extensions init my-ext --template https://github.com/user/template

# Use a local template
superset-extensions init my-ext --template ./my-template
```

### CI/CD Integration

#### GitHub Actions

```yaml
name: Build Extension

on: [push, pull_request]

jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2

      - name: Setup Python
        uses: actions/setup-python@v2
        with:
          python-version: '3.9'

      - name: Setup Node
        uses: actions/setup-node@v2
        with:
          node-version: '16'

      - name: Install CLI
        run: pip install apache-superset-extensions-cli

      - name: Install dependencies
        run: |
          npm install --prefix frontend
          pip install -r backend/requirements.txt

      - name: Validate
        run: superset-extensions validate --strict

      - name: Build
        run: superset-extensions build --mode production

      - name: Bundle
        run: superset-extensions bundle

      - name: Upload artifact
        uses: actions/upload-artifact@v2
        with:
          name: extension-bundle
          path: '*.supx'
```

### Automated Deployment

Deploy extensions automatically:

```bash
#!/bin/bash
# deploy.sh

# Build and bundle
superset-extensions build --mode production
superset-extensions bundle --sign

# Upload to the Superset instance
BUNDLE=$(ls *.supx | head -1)
curl -X POST "$SUPERSET_URL/api/v1/extensions/import/" \
  -H "Authorization: Bearer $SUPERSET_TOKEN" \
  -F "bundle=@$BUNDLE"

# Verify deployment
curl "$SUPERSET_URL/api/v1/extensions/" \
  -H "Authorization: Bearer $SUPERSET_TOKEN"
```

### Multi-Extension Projects

Manage multiple extensions in one repository:

```bash
# Initialize multiple extensions
superset-extensions init extensions/viz-plugin --type frontend-only
superset-extensions init extensions/sql-optimizer --template sql-lab
superset-extensions init extensions/auth-provider --type backend-only

# Build all extensions
for dir in extensions/*/; do
  (cd "$dir" && superset-extensions build)
done

# Bundle all extensions
for dir in extensions/*/; do
  (cd "$dir" && superset-extensions bundle)
done
```

## Troubleshooting

### Common Issues

#### Port already in use

```bash
# Error: Port 9001 is already in use

# Solution: Use a different port
superset-extensions dev --port 9002
```

#### Module not found

```bash
# Error: Cannot find module '@apache-superset/core'

# Solution: Ensure dependencies are installed
npm install --prefix frontend
```

#### Build failures

```bash
# Check Node and Python versions
node --version    # Should be 16+
python --version  # Should be 3.9+

# Clear caches and rebuild
rm -rf dist node_modules frontend/node_modules
npm install --prefix frontend
superset-extensions build --clean
```

#### Bundle too large

```bash
# Warning: Bundle size exceeds recommended limit

# Solution: Analyze and optimize
superset-extensions build --analyze

# Exclude unnecessary files
superset-extensions bundle --exclude "**/*.map,**/*.test.*"
```

### Debug Mode

Enable debug logging:

```bash
# Set the debug environment variable
export DEBUG=superset-extensions:*

# Or use the verbose flag
superset-extensions dev --verbose
superset-extensions build --verbose
```

### Getting Help

```bash
# General help
superset-extensions --help

# Command-specific help
superset-extensions init --help
superset-extensions dev --help

# Version information
superset-extensions --version
```

## Best Practices

### Development

1. **Use TypeScript** for type safety
2. **Follow the style guide** for consistency
3. **Write tests** for critical functionality
4. **Document your code** with JSDoc/docstrings
5. **Use development mode** for rapid iteration

### Building

1. **Optimize bundle size** - analyze and tree-shake
2. **Generate source maps** for debugging
3. **Validate before building** to catch issues early
4. **Use production mode** for final builds
5. **Clean the build directory** to avoid stale files

### Deployment

1. **Sign your bundles** for security
2. **Version properly** using semantic versioning
3. **Test in staging** before deploying to production
4. **Document breaking changes** in the CHANGELOG
5. **Provide migration guides** for major updates

## Resources

- [Extension Architecture](/developer_portal/architecture/overview)
- [API Reference](/developer_portal/api/frontend)
- [Frontend Contribution Types](/developer_portal/extensions/frontend-contribution-types)
- [GitHub Repository](https://github.com/apache/superset)
- [Community Forum](https://github.com/apache/superset/discussions)

@@ -327,9 +327,9 @@ stats.sort_stats('cumulative').print_stats(10)

## Resources

### Internal
- [Coding Guidelines](../guidelines/design-guidelines)
- [Testing Guide](../testing/overview)
- [Architecture Overview](../architecture/overview)
- [Extension Architecture](../extensions/architecture)

### External
- [Google's Code Review Guide](https://google.github.io/eng-practices/review/)

@@ -139,6 +139,41 @@ docker volume rm superset_db_home
docker-compose up
```

### Running multiple instances

If you need to run multiple Superset clones simultaneously (e.g., testing different branches),
use `make up` instead of `docker compose up`:

```bash
make up
```

This automatically:
- Generates a unique project name from your directory name
- Finds available ports (incrementing from 8088, 9000, etc. if already in use)
- Displays the assigned URLs before starting

Each clone gets isolated containers and volumes, so you can run them side-by-side without conflicts.
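
The port-finding step can be pictured with a small sketch: probe candidate ports upward from a base until one accepts a bind. This illustrates the idea only — it is not the code `make up` actually runs:

```python
import socket

def next_free_port(start: int, tries: int = 100) -> int:
    """Return the first port >= start that is free to bind on localhost."""
    for port in range(start, start + tries):
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
            try:
                sock.bind(("127.0.0.1", port))
                return port  # bind succeeded, so the port is free
            except OSError:
                continue  # port in use, try the next one
    raise RuntimeError(f"no free port in [{start}, {start + tries})")

port = next_free_port(8088)
print(port)  # 8088, or the next free port above it
```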

Available commands (run from the repo root):

| Command | Description |
|---------|-------------|
| `make up` | Start services (foreground) |
| `make up-detached` | Start services (background) |
| `make down` | Stop all services |
| `make ps` | Show running containers |
| `make logs` | Follow container logs |
| `make ports` | Show assigned URLs and ports |
| `make open` | Open browser to dev server |
| `make nuke` | Stop, remove volumes & local images |

From a subdirectory, use: `make -C $(git rev-parse --show-toplevel) up`

:::warning
Always use these commands instead of plain `docker compose down`, which won't know the correct project name for your instance.
:::

## GitHub Codespaces (Cloud Development)

GitHub Codespaces provides a complete, pre-configured development environment in the cloud. This is ideal for:

@@ -618,7 +653,7 @@ export enum FeatureFlag {

those specified under FEATURE_FLAGS in `superset_config.py`. For example, `DEFAULT_FEATURE_FLAGS = { 'FOO': True, 'BAR': False }` in `superset/config.py` and `FEATURE_FLAGS = { 'BAR': True, 'BAZ': True }` in `superset_config.py` will result
in combined feature flags of `{ 'FOO': True, 'BAR': True, 'BAZ': True }`.
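
The merge described above is a key-by-key override, with values from `superset_config.py` taking precedence over the shipped defaults. A minimal sketch of the same behavior:

```python
DEFAULT_FEATURE_FLAGS = {"FOO": True, "BAR": False}  # from superset/config.py
FEATURE_FLAGS = {"BAR": True, "BAZ": True}           # from superset_config.py

# Deployment-specific flags override the defaults key by key.
combined = {**DEFAULT_FEATURE_FLAGS, **FEATURE_FLAGS}
print(combined)  # {'FOO': True, 'BAR': True, 'BAZ': True}
```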

The current status of the usability of each flag (stable vs testing, etc.) can be found in the [Feature Flags](/docs/configuration/feature-flags) documentation.

## Git Hooks

@@ -258,19 +258,7 @@ For debugging the Flask backend:

### Storybook

See the dedicated [Storybook documentation](../testing/storybook) for information on running Storybook locally and adding new stories.

## Contributing Translations

@@ -342,26 +330,79 @@ ruff check --fix .

Pre-commit hooks run automatically on `git commit` if installed.

### TypeScript / JavaScript

We use a hybrid linting approach combining OXC (Oxidation Compiler) for standard rules and a custom AST-based checker for Superset-specific patterns.

#### Quick Commands

```bash
cd superset-frontend

# Run both OXC and custom rules
npm run lint:full

# Run the OXC linter only (faster for most checks)
npm run lint

# Fix auto-fixable issues with OXC
npm run lint-fix

# Run the custom rules checker only
npm run check:custom-rules

# Run tsc (TypeScript) checks
npm run type

# Format with Prettier
npm run prettier
```

#### Architecture

The linting system consists of two components:

1. **OXC Linter** (`oxlint`) - A Rust-based linter that's 50-100x faster than ESLint
   - Handles all standard JavaScript/TypeScript rules
   - Configured via `oxlint.json`
   - Runs via `npm run lint` or `npm run lint-fix`

2. **Custom Rules Checker** - A Node.js AST-based checker for Superset-specific patterns
   - Enforces no literal colors (use theme colors)
   - Prevents FontAwesome usage (use @superset-ui/core Icons)
   - Validates i18n template usage (no template variables)
   - Runs via `npm run check:custom-rules`
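
The real checker walks a Babel AST in Node.js; the idea behind the literal-color rule can be illustrated with a much simpler Python sketch that flags hard-coded hex colors in source text. This is a simplification for intuition, not the actual implementation:

```python
import re

# Matches 3- or 6-digit hex colors such as #fff or #1FA8C9.
HEX_COLOR = re.compile(r"#(?:[0-9a-fA-F]{3}|[0-9a-fA-F]{6})\b")

def find_literal_colors(source: str) -> list:
    """Return hard-coded hex colors that should come from the theme instead."""
    return HEX_COLOR.findall(source)

snippet = "const style = { color: '#1FA8C9', bg: theme.colors.grayscale.light5 };"
print(find_literal_colors(snippet))  # ['#1FA8C9']
```

An AST-based checker improves on this by ignoring matches inside comments and by pointing at the exact node, which is why the real implementation parses with `@babel/parser` rather than using regexes.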

#### Why This Approach?

- **50-100x faster linting** compared to ESLint for standard rules via OXC
- **Apache-compatible** - No custom binaries, ASF-friendly
- **Maintainable** - Custom rules in JavaScript, not Rust
- **Flexible** - Can evolve as OXC adds plugin support

#### Troubleshooting

**"Plugin 'basic-custom-plugin' not found" Error**

Ensure you're using the explicit config:

```bash
npx oxlint --config oxlint.json
```

**Custom Rules Not Running**

Verify the AST parsing dependencies are installed:

```bash
npm ls @babel/parser @babel/traverse glob
```

#### Adding New Custom Rules

1. Edit `scripts/check-custom-rules.js`
2. Add a new check function following the AST visitor pattern
3. Call the function in `processFile()`
4. Test with `npm run check:custom-rules`

## GitHub Ephemeral Environments

For every PR, an ephemeral environment is automatically deployed for testing.

@@ -1,5 +1,5 @@
---
title: Overview
sidebar_position: 1
---

@@ -22,7 +22,7 @@ specific language governing permissions and limitations
under the License.
-->

# Contributing

Superset is an [Apache Software Foundation](https://www.apache.org/theapacheway/index.html) project.
The core contributors (or committers) to Superset communicate primarily in the following channels (which can be joined by anyone):

@@ -35,7 +35,7 @@ Learn how to create and submit high-quality pull requests to Apache Superset.
- [ ] You've found or created an issue to work on

### PR Readiness Checklist
- [ ] Code follows [coding guidelines](../guidelines/design-guidelines)
- [ ] Tests are passing locally
- [ ] Linting passes (`pre-commit run --all-files`)
- [ ] Documentation is updated if needed

@@ -1,36 +0,0 @@
---
title: Architectural Principles
sidebar_position: 1
---

<!--
Licensed to the Apache Software Foundation (ASF) under one
or more contributor license agreements. See the NOTICE file
distributed with this work for additional information
regarding copyright ownership. The ASF licenses this file
to you under the Apache License, Version 2.0 (the
"License"); you may not use this file except in compliance
with the License. You may obtain a copy of the License at

  http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing,
software distributed under the License is distributed on an
"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
KIND, either express or implied. See the License for the
specific language governing permissions and limitations
under the License.
-->

# Architectural Principles

Realizing this vision requires a strong architectural foundation. To ensure the resulting system is robust, maintainable, and adaptable, we have defined a set of architectural principles that guide the design and implementation of the changes. These principles serve as the basis for all technical decisions and help create an environment where extensions can be developed safely and predictably, while minimizing technical debt and fragmentation.

The architectural principles guiding this proposal include:

1. **Lean core**: Superset's core should remain as minimal as possible, with many features and capabilities delegated to extensions. Wherever possible, built-in features should be implemented using the same APIs and extension mechanisms available to external extension authors. This approach reduces maintenance burden and complexity in the core application, encourages modularity, and allows the community to innovate and iterate on features independently of the main codebase.
2. **Explicit contribution points**: All extension points must be clearly defined and documented, so extension authors know exactly where and how they can interact with the host system. Each extension must also declare its capabilities in a metadata file, enabling the host to manage the extension lifecycle and provide a consistent user experience.
3. **Versioned and stable APIs**: Public interfaces for extensions should be versioned and follow semantic versioning, allowing for safe evolution and backward compatibility.
4. **Lazy loading and activation**: Extensions should be loaded and activated only when needed, minimizing performance overhead and resource consumption.
5. **Composability and reuse**: The architecture should encourage the reuse of extension points and patterns across different modules, promoting consistency and reducing duplication.
6. **Community-driven evolution**: The system should be designed to evolve based on real-world feedback and contributions, allowing new extension points and capabilities to be added as needs emerge.
|
||||
239
docs/developer_portal/extensions/architecture.md
Normal file
239
docs/developer_portal/extensions/architecture.md
Normal file
@@ -0,0 +1,239 @@
|
||||
---
|
||||
title: Architecture
|
||||
sidebar_position: 3
|
||||
---
|
||||
|
||||
<!--
|
||||
Licensed to the Apache Software Foundation (ASF) under one
|
||||
or more contributor license agreements. See the NOTICE file
|
||||
distributed with this work for additional information
|
||||
regarding copyright ownership. The ASF licenses this file
|
||||
to you under the Apache License, Version 2.0 (the
|
||||
"License"); you may not use this file except in compliance
|
||||
with the License. You may obtain a copy of the License at
|
||||
|
||||
http://www.apache.org/licenses/LICENSE-2.0
|
||||
|
||||
Unless required by applicable law or agreed to in writing,
|
||||
software distributed under the License is distributed on an
|
||||
"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
|
||||
KIND, either express or implied. See the License for the
|
||||
specific language governing permissions and limitations
|
||||
under the License.
|
||||
-->
|
||||
|
||||
# Architecture
|
||||
|
||||
Apache Superset's extension system is designed to enable powerful customization while maintaining stability, security, and performance. This page explains the architectural principles, system design, and technical mechanisms that make the extension ecosystem possible.
|
||||
|
||||
## Architectural Principles
|
||||
|
||||
The extension architecture is built on six core principles that guide all technical decisions and ensure extensions can be developed safely and predictably:
|
||||
|
||||
### 1. Lean Core
|
||||
|
||||
Superset's core should remain minimal, with many features delegated to extensions. Built-in features use the same APIs and extension mechanisms available to external developers. This approach:
|
||||
- Reduces maintenance burden and complexity
|
||||
- Encourages modularity
|
||||
- Allows the community to innovate independently of the main codebase
|
||||
|
||||
### 2. Explicit Contribution Points
|
||||
|
||||
All extension points are clearly defined and documented. Extension authors know exactly where and how they can interact with the host system. Each extension declares its capabilities in a metadata file, enabling the host to:
|
||||
- Manage the extension lifecycle
|
||||
- Provide a consistent user experience
|
||||
- Validate extension compatibility
|
||||
|
||||
### 3. Versioned and Stable APIs
|
||||
|
||||
Public interfaces for extensions follow semantic versioning, allowing for:
|
||||
- Safe evolution of the platform
|
||||
- Backward compatibility
|
||||
- Clear upgrade paths for extension authors
|
||||
|
||||
### 4. Lazy Loading and Activation
|
||||
|
||||
Extensions are loaded and activated only when needed, which:
|
||||
- Minimizes performance overhead
|
||||
- Reduces resource consumption
|
||||
- Improves startup time
|
||||
|
||||
### 5. Composability and Reuse
|
||||
|
||||
The architecture encourages reusing extension points and patterns across different modules, promoting:
|
||||
- Consistency across extensions
|
||||
- Reduced duplication
|
||||
- Shared best practices
|
||||
|
||||
### 6. Community-Driven Evolution
|
||||
|
||||
The system evolves based on real-world feedback and contributions. New extension points and capabilities are added as needs emerge, ensuring the platform remains relevant and flexible.
|
||||
|
||||
## System Overview
|
||||
|
||||
The extension architecture is built around three main components that work together to create a flexible, maintainable ecosystem:
|
||||
|
||||
### Core Packages
|
||||
|
||||
Two core packages provide the foundation for extension development:
|
||||
|
||||
**Frontend: `@apache-superset/core`**
|
||||
|
||||
This package provides essential building blocks for frontend extensions and the host application:
|
||||
- Shared UI components
|
||||
- Utility functions
|
||||
- APIs and hooks
|
||||
- Type definitions
|
||||
|
||||
By centralizing these resources, both extensions and built-in features use the same APIs, ensuring consistency, type safety, and a seamless user experience. The package is versioned to support safe platform evolution while maintaining compatibility.
|
||||
|
||||
**Backend: `apache-superset-core`**
|
||||
|
||||
This package exposes key classes and APIs for backend extensions:
|
||||
- Database connectors
|
||||
- API extensions
|
||||
- Security manager customization
|
||||
- Core utilities and models
|
||||
|
||||
It includes dependencies on critical libraries like Flask-AppBuilder and SQLAlchemy, and follows semantic versioning for compatibility and stability.
|
||||
|
||||
### Developer Tools
|
||||
|
||||
**`apache-superset-extensions-cli`**
|
||||
|
||||
The CLI provides comprehensive commands for extension development:
|
||||
- Project scaffolding
|
||||
- Code generation
|
||||
- Building and bundling
|
||||
- Packaging for distribution
|
||||
|
||||
By standardizing these processes, the CLI ensures extensions are built consistently, remain compatible with evolving versions of Superset, and follow best practices.
|
||||
|
||||
### Host Application
|
||||
|
||||
The Superset host application serves as the runtime environment for extensions:
|
||||
|
||||
**Extension Management**
|
||||
- Exposes `/api/v1/extensions` endpoint for registration and management
|
||||
- Provides a dedicated UI for managing extensions
|
||||
- Stores extension metadata in the `extensions` database table
|
||||
|
||||
**Extension Storage**
|
||||
|
||||
The extensions table contains:
|
||||
- Extension name, version, and author
|
||||
- Contributed features and exposed modules
|
||||
- Metadata and configuration
|
||||
- Built frontend and/or backend code

### Architecture Diagram

The following diagram illustrates how these components work together:

<img width="955" height="586" alt="Extension System Architecture" src="https://github.com/user-attachments/assets/cc2a41df-55a4-48c8-b056-35f7a1e567c6" />

The diagram shows:

1. **Extension projects** depend on core packages for development
2. **Core packages** provide APIs and type definitions
3. **The host application** implements the APIs and manages extensions
4. **Extensions** integrate with the host through well-defined interfaces

## Dynamic Module Loading

One of the most sophisticated aspects of the extension architecture is how frontend code is dynamically loaded at runtime using Webpack's Module Federation.

### Module Federation

The architecture leverages Webpack's Module Federation to enable dynamic loading of frontend assets. This allows extensions to be built and deployed independently of Superset.

### How It Works

**Extension Configuration**

Extensions configure Webpack to expose their entry points:

```typescript
new ModuleFederationPlugin({
  name: 'my_extension',
  filename: 'remoteEntry.[contenthash].js',
  exposes: {
    './index': './src/index.tsx',
  },
  externalsType: 'window',
  externals: {
    '@apache-superset/core': 'superset',
  },
  shared: {
    react: { singleton: true },
    'react-dom': { singleton: true },
    'antd-v5': { singleton: true },
  },
})
```

This configuration does several important things:

**`exposes`** - Declares which modules are available to the host application. The extension makes `./index` available as its entry point.

**`externals` and `externalsType`** - Tell Webpack that when the extension imports `@apache-superset/core`, it should use `window.superset` at runtime instead of bundling its own copy. This ensures extensions use the host's implementation of shared packages.

**`shared`** - Prevents duplication of common libraries like React and Ant Design. The `singleton: true` setting ensures only one instance of each library exists, avoiding version conflicts and reducing bundle size.
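
The `externals` mapping amounts to a small runtime lookup: when the bundle asks for `@apache-superset/core`, the generated code reads the named global instead of a bundled module. The resolver below is a sketch of that behavior, not Webpack's actual generated code; the module-to-global mapping comes from the config above:

```typescript
// Illustrative resolver mimicking externalsType: 'window'.
// Webpack emits equivalent lookup code; this is not its actual output.
const externals: Record<string, string> = {
  '@apache-superset/core': 'superset',
};

function resolveExternal(
  specifier: string,
  globalScope: Record<string, unknown>,
): unknown {
  const globalName = externals[specifier];
  if (!globalName) {
    throw new Error(`Not an external: ${specifier}`);
  }
  // The import resolves to a property on the window object at runtime.
  return globalScope[globalName];
}

// In a browser the scope would be `window`; a stub works for demonstration.
const fakeWindow = { superset: { core: {}, commands: {} } };
const api = resolveExternal('@apache-superset/core', fakeWindow);
```

Because every extension resolves the same specifier to the same global, they all share the host's single copy of the core APIs.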

### Runtime Resolution

The following diagram illustrates the module loading process:

<img width="913" height="558" alt="Module Federation Flow" src="https://github.com/user-attachments/assets/e5e4d2ae-e8b5-4d17-a2a1-3667c65f25ca" />

Here's what happens at runtime:

1. **Extension Registration**: When an extension is registered, Superset stores its remote entry URL
2. **Dynamic Loading**: When the extension is activated, the host fetches the remote entry file
3. **Module Resolution**: The extension imports `@apache-superset/core`, which resolves to `window.superset`
4. **Execution**: The extension code runs with access to the host's APIs and shared dependencies
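
In Module Federation terms, the fetched remote entry registers a container on the global scope; the host then initializes the container's share scope and requests the exposed module. The sketch below shows that handshake against the standard container contract (`init`/`get`), using a stub container in place of a real fetched remote:

```typescript
// The contract a Module Federation remote container exposes.
interface FederationContainer {
  init(shareScope: Record<string, unknown>): Promise<void> | void;
  get(module: string): Promise<() => unknown>;
}

// Host-side loading logic: initialize the share scope, then resolve
// the exposed module through its factory.
async function loadExposedModule(
  container: FederationContainer,
  module: string,
): Promise<unknown> {
  await container.init({}); // a real host passes shared React, antd, etc.
  const factory = await container.get(module);
  return factory();
}

// Stub standing in for a fetched remoteEntry.[contenthash].js.
const stubContainer: FederationContainer = {
  init: async () => {},
  get: async (module) =>
    () => ({ module, activate: () => 'extension activated' }),
};
```

A real host would first inject a `<script>` tag for the remote entry URL stored at registration time, then read the container from the global scope by its `name`.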

### Host API Setup

On the Superset side, the APIs are mapped to `window.superset` during application bootstrap:

```typescript
import * as supersetCore from '@apache-superset/core';
import {
  authentication,
  core,
  commands,
  extensions,
  sqlLab,
} from 'src/extensions';

export default function setupExtensionsAPI() {
  window.superset = {
    ...supersetCore,
    authentication,
    core,
    commands,
    extensions,
    sqlLab,
  };
}
```

This function runs before any extensions are loaded, ensuring the APIs are available when extensions import from `@apache-superset/core`.
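
Since `window.superset` is only assigned at bootstrap, code that reads it can guard against running too early. This is an illustrative sketch; the `SupersetAPI` type and the guard function are assumptions, not part of the published package:

```typescript
// Illustrative guard; the real @apache-superset/core typings differ.
type SupersetAPI = Record<string, unknown>;

function getSupersetAPI(scope: { superset?: SupersetAPI }): SupersetAPI {
  if (!scope.superset) {
    throw new Error(
      'window.superset is not set: setupExtensionsAPI() has not run',
    );
  }
  return scope.superset;
}
```

In the browser this would be called as `getSupersetAPI(window)`; failing fast with a clear message is easier to debug than a late `undefined` property access.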

### Benefits

This architecture provides several key benefits:

- **Independent development**: Extensions can be built separately from Superset's codebase
- **Version isolation**: Each extension can follow its own release cycle
- **Shared dependencies**: Common libraries are shared, reducing memory usage and bundle size
- **Type safety**: TypeScript types flow from the core package to extensions

## Next Steps

Now that you understand the architecture, explore:

- **[Dependencies](./dependencies)** - Managing dependencies and understanding API stability
- **[Quick Start](./quick-start)** - Build your first extension
- **[Contribution Types](./contribution-types)** - What kinds of extensions you can build
- **[Development](./development)** - Project structure, APIs, and development workflow
---
title: What This Means for Superset's Built-in Features
sidebar_position: 13
---

<!--
Licensed to the Apache Software Foundation (ASF) under one
or more contributor license agreements. See the NOTICE file
distributed with this work for additional information
regarding copyright ownership. The ASF licenses this file
to you under the Apache License, Version 2.0 (the
"License"); you may not use this file except in compliance
with the License. You may obtain a copy of the License at

  http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing,
software distributed under the License is distributed on an
"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
KIND, either express or implied. See the License for the
specific language governing permissions and limitations
under the License.
-->

# What This Means for Superset's Built-in Features

Transitioning to a well-defined, versioned API model has significant implications for Superset's built-in features. By exposing stable, public APIs for core functionality, we enable both extensions and internal modules to interact with the application in a consistent and predictable way. This approach brings several key benefits:

- **Unified API Surface**: All important features of Superset will be accessible through documented, versioned APIs. This not only empowers extension authors but also ensures that built-in features use the same mechanisms, validating the APIs through real-world usage and making it easier to replace or enhance individual features over time.
- **Dogfooding and Replaceability**: By building Superset's own features using the same APIs available to extensions, we ensure that these APIs are robust, flexible, and well-tested. This also means that any built-in feature can potentially be replaced or extended by a third-party extension, increasing modularity and adaptability.
- **Versioned and Stable Contracts**: Public APIs will be versioned and follow semantic versioning, providing stability for both internal and external consumers. This stability is critical for long-term maintainability, but it also means that extra care must be taken to avoid breaking changes and to provide clear migration paths when changes are necessary.
- **Improved Inter-Module Communication**: With clearly defined APIs and a command-based architecture, modules and extensions can communicate through explicit interfaces rather than relying on direct Redux store access or tightly coupled state management. This decouples modules, reduces the risk of unintended side effects, and makes the codebase easier to reason about and maintain.
- **Facilitated Refactoring and Evolution**: As the application evolves, having a stable API layer allows for internal refactoring and optimization without breaking consumers. This makes it easier to modernize or optimize internal implementations while preserving compatibility.
- **Clearer Documentation and Onboarding**: A public, versioned API surface makes it easier to document and onboard new contributors, both for core development and for extension authors.
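
The command-based communication pattern can be illustrated with a minimal registry: modules register named commands, and other modules invoke them by identifier instead of reaching into shared state. This sketch is illustrative only and does not mirror Superset's actual `commands` API; the command id is a made-up example:

```typescript
// Minimal command registry illustrating command-based inter-module
// communication. Names and signatures are illustrative only.
type CommandHandler = (...args: unknown[]) => unknown;

class CommandRegistry {
  private handlers = new Map<string, CommandHandler>();

  registerCommand(id: string, handler: CommandHandler): void {
    if (this.handlers.has(id)) {
      throw new Error(`Command already registered: ${id}`);
    }
    this.handlers.set(id, handler);
  }

  executeCommand(id: string, ...args: unknown[]): unknown {
    const handler = this.handlers.get(id);
    if (!handler) {
      throw new Error(`Unknown command: ${id}`);
    }
    return handler(...args);
  }
}

// One module contributes a command; another invokes it by id,
// with no knowledge of where the handler lives.
const registry = new CommandRegistry();
registry.registerCommand('sqlLab.formatQuery', (sql) =>
  String(sql).trim().toUpperCase(),
);
const result = registry.executeCommand('sqlLab.formatQuery', ' select 1 ');
```

The indirection through string ids is what decouples the caller from the implementer: either side can be replaced, including by a third-party extension, without touching the other.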

Overall, this shift represents a move toward a more modular, maintainable, and extensible architecture, where both built-in features and extensions are first-class citizens, and where the boundaries between core and community-driven innovation are minimized.