Compare commits

252 Commits

Author SHA1 Message Date
Evan Rusackas
c9dd05e4bc fix(dashboard): remove unused ReportObject import
Removes the unused `ReportObject` type import from the Header component
to fix the TS6133 lint-frontend CI failure.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-26 09:31:42 -08:00
Evan Rusackas
9da50c0cc3 fix(tests): fix Playwright mobile test locator failures
Fix two Playwright test failures:
1. mobile-navigation.spec.ts: Add .first() to .or() locator to avoid
   strict mode violation when both 'Recents' and 'Dashboards' text
   elements are found on the welcome page.
2. mobile-dashboard.spec.ts: Update hamburger menu selectors from
   incorrect [data-test="more-horiz"] and [aria-label="More actions"]
   to the actual attributes used by PageHeaderWithActions component:
   [data-test="actions-trigger"] and [aria-label="Menu actions trigger"].

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-25 19:35:58 -08:00
Evan Rusackas
80fe26ef92 fix(tests): remove unused imports in mobile test files
- Remove unused 'screen' and 'render' imports from MobileRouteGuard.test.tsx
- Remove unused 'renderAtPath' helper function from MobileRouteGuard.test.tsx
- Remove unused 'waitFor' import from MobileUnsupported.test.tsx
- Remove unused 'iconContainer' variable from MobileUnsupported.test.tsx
- Apply prettier formatting to playwright mobile-dashboard spec

These changes fix TypeScript errors flagged by CI:
- TS6133: 'screen' is declared but its value is never read
- TS6133: 'renderAtPath' is declared but its value is never read
- TS6133: 'waitFor' is declared but its value is never read
- TS6133: 'iconContainer' is declared but its value is never read

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-02-25 18:40:27 -08:00
Evan Rusackas
eb8946ba3b test(mobile): add comprehensive mobile support test suite
This commit adds a full test suite for mobile support to prevent regressions:

Unit Tests:
- MobileRouteGuard: Tests for route checking, bypass flag, storage errors
- MobileUnsupported: Tests for page rendering, navigation, storage handling
- ListView: Tests for forceViewMode, mobile drawer, filter controls
- DashboardBuilder: Tests for mobile-specific rendering
- Home (mobile): Tests for hidden panels on mobile viewports

E2E Tests (Playwright):
- mobile-navigation.spec.ts: Route guard behavior, bypass functionality
- mobile-dashboard.spec.ts: Dashboard viewing, filter drawer, interactions

Test Utilities:
- mobileTestUtils.ts: Shared breakpoint mocks, viewport constants

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-02-25 18:40:27 -08:00
Evan Rusackas
3970a53fe9 docs(tests): explain why antd mock is used for Grid.useBreakpoint
Added comment explaining that we mock 'antd' directly rather than
'@superset-ui/core/components' because the latter causes circular
dependency issues with ActionButton during jest.requireActual
evaluation. Since Grid is re-exported from antd, mocking antd works.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-02-25 18:40:27 -08:00
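The mocking approach described in the commit above can be sketched as a jest setup fragment. This is a hypothetical illustration (the actual file paths and breakpoint keys used in the repo may differ): `antd` is mocked directly, with `Grid` spread from the real module and only `useBreakpoint` replaced, so tests see desktop breakpoints and never hit the circular dependency triggered by `jest.requireActual` on `@superset-ui/core/components`.

```typescript
// Hypothetical jest setup sketch — not the exact file from the repo.
// Mock 'antd' directly rather than '@superset-ui/core/components', which
// circularly imports ActionButton during jest.requireActual evaluation.
// Grid is re-exported from antd, so mocking antd is sufficient.
jest.mock('antd', () => {
  const actual = jest.requireActual('antd');
  return {
    ...actual,
    Grid: {
      ...actual.Grid,
      // Force desktop breakpoints so mobile-only branches never render
      // and existing desktop assertions stay valid.
      useBreakpoint: () => ({
        xs: false,
        sm: true,
        md: true,
        lg: true,
        xl: true,
      }),
    },
  };
});
```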
Evan Rusackas
61aa514c21 fix(i18n): use interpolation for mobile dashboard menu labels
Changed string concatenation to use proper i18n interpolation syntax
for the Owner and Modified labels in the mobile dashboard menu.

Before: `${t('Modified')} ${date} ${t('by')} ${user}`
After: `t('Modified %(date)s by %(user)s', { date, user })`

This allows translators to reorder the placeholders as needed for
different language structures.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-02-25 18:40:27 -08:00
Evan Rusackas
70274f9d24 fix(tests): use data-test attribute for SubMenu mobile tests
The testing library is configured to use data-test (not data-testid)
as the testIdAttribute. Updated the mobile support test elements to
use the correct attribute.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-02-25 18:40:27 -08:00
Evan Rusackas
1b5a01aab1 fix: restore clean service-worker.js from master
Removes dev build artifacts (eval, HMR, source maps) that were
accidentally committed. Service workers should only contain
production-safe code.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-02-25 18:40:27 -08:00
Evan Rusackas
1b127432ff fix: address PR review feedback
- Wrap sessionStorage access in try/catch to handle disabled storage
- Fix misleading comment about redirect behavior in MobileUnsupported
- Fix data-test vs data-testid mismatch in SubMenu tests
- Add default value for useBreakpoint to prevent flash on initial render
- Add aria-label for mobile search button accessibility

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-02-25 18:40:27 -08:00
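The sessionStorage hardening from the first bullet above follows a standard pattern: browsers with storage disabled throw on any `sessionStorage` access, so reads should fall back to `null` and writes should fail silently. A minimal sketch (helper names are hypothetical, not from the repo):

```typescript
// Structural type so the helpers work with window.sessionStorage or a fake.
type StorageLike = {
  getItem(key: string): string | null;
  setItem(key: string, value: string): void;
};

// Read a key, treating disabled storage as "key unset".
function safeGetItem(storage: StorageLike, key: string): string | null {
  try {
    return storage.getItem(key);
  } catch {
    return null; // storage disabled: behave as if nothing was stored
  }
}

// Write a key, reporting whether the write actually happened.
function safeSetItem(
  storage: StorageLike,
  key: string,
  value: string,
): boolean {
  try {
    storage.setItem(key, value);
    return true;
  } catch {
    return false; // storage disabled or quota exceeded: drop the write
  }
}
```

Callers then treat a `false`/`null` result as "feature unavailable" rather than crashing the page.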
Evan Rusackas
4c2973fe8a fix(test): add antd Grid mock to dashboard tests
Add useBreakpoint mock to DashboardBuilder, Header, and DashboardList
tests to prevent mobile rendering from affecting test assertions.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-02-25 18:40:27 -08:00
Evan Rusackas
c098053785 fix(test): mock antd Grid.useBreakpoint directly
The Grid component is re-exported from antd through @superset-ui/core/components.
Mock antd directly to ensure useBreakpoint returns desktop breakpoints.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-02-25 18:40:26 -08:00
Evan Rusackas
164faa8810 fix(test): correct Grid useBreakpoint mock path
Mock @superset-ui/core/components/Grid directly instead of the whole
components module to avoid breaking other component imports.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-02-25 18:40:26 -08:00
Evan Rusackas
6f38727041 test: mock useBreakpoint in Menu tests
Add mock for Grid.useBreakpoint() to return desktop breakpoints in
Menu tests that were failing due to mobile responsive behavior.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-02-25 18:40:26 -08:00
Evan Rusackas
1d21516b77 test: mock useBreakpoint in RightMenu and Home tests
Add mock for Grid.useBreakpoint() to return desktop breakpoints in tests
that were failing due to mobile responsive behavior being triggered.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-02-25 18:40:26 -08:00
Evan Rusackas
94c448b4b1 feat(mobile): add route guard for unsupported mobile pages
Implements a mobile route guard system that redirects users on mobile
devices to a friendly "This view isn't available on mobile" page when
accessing non-mobile-friendly routes (Charts, SQL Lab, Explore, etc.).

Users can:
- Navigate to Dashboards (primary action)
- Go to Welcome Page (secondary action)
- Click "Continue anyway" to bypass and access the page

The bypass preference is stored in sessionStorage for the session.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-02-25 18:40:26 -08:00
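The redirect decision described in this commit can be sketched as a pure function. All names here are hypothetical (the real guard lives in a React component and reads the bypass flag from sessionStorage): mobile users hitting a non-mobile-friendly route are sent to the unsupported page unless they previously chose "Continue anyway".

```typescript
// Hypothetical allow-list of route prefixes that work on mobile.
const MOBILE_FRIENDLY_PREFIXES = ['/dashboard', '/superset/welcome'];

// Decide whether to redirect to the "not available on mobile" page.
// bypassFlag is the raw sessionStorage value, or null when unset/unreadable.
function shouldRedirectToUnsupported(
  pathname: string,
  isMobile: boolean,
  bypassFlag: string | null,
): boolean {
  if (!isMobile) return false; // desktop: never guard
  if (bypassFlag === 'true') return false; // "Continue anyway" was clicked
  return !MOBILE_FRIENDLY_PREFIXES.some(prefix =>
    pathname.startsWith(prefix),
  );
}
```

Keeping the decision pure makes the unit tests listed in the later `test(mobile)` commit straightforward: each case is a plain input/output assertion with no DOM involved.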
Evan Rusackas
4f5690a7fa fix(mobile): resolve merge conflicts and TypeScript errors from rebase
- Fix Chart.tsx: remove redundant mobile width JS logic (CSS handles it)
- Fix ListView.test.tsx: use proper MemoryRouter/ReactRouter5Adapter pattern
- Fix SubMenu.test.tsx: add type="button" to test buttons
- Fix DashboardBuilder.tsx: remove unused Button import
- Fix DashboardBuilder.test.tsx: add hasFilters to useNativeFilters mocks
- Fix Header/index.tsx: remove unused menuTriggerStyles import, add proper types
- Fix useHeaderActionsDropdownMenu.tsx: handle undefined isStarred

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-02-25 18:40:26 -08:00
Evan Rusackas
4941cfe7fd fix(mobile): Address code review feedback
- Hide ViewModeToggle when forceViewMode is set
- Add ref to FilterControls in mobile drawer for clear functionality
- Add viewMode check for CardSortSelect in mobile drawer

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-02-25 18:40:02 -08:00
Evan Rusackas
2ff2dea3cb test(mobile): Add tests for mobile support props
- Add tests for forceViewMode prop in ListView
- Add tests for mobile filter drawer props (mobileFiltersOpen, setMobileFiltersOpen)
- Add tests for leftIcon/rightIcon props in SubMenu

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-02-25 18:40:01 -08:00
Evan Rusackas
a820356c5b feat(mobile): Add filter drawer to Dashboard List page
- Add mobile filter drawer that slides in from the left with search/sort options
- Extend SubMenu with leftIcon/rightIcon props for mobile header actions
- Add mobileFiltersOpen/setMobileFiltersOpen props to ListView component
- Increase card grid-gap on mobile for better spacing
- Hide kebab menu on cards in mobile consumption mode
- Change mobile filters button style to link for consistency

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-02-25 18:40:01 -08:00
Evan Rusackas
6ffc954b71 feat(mobile): Full-width cards on ListView (dashboard list)
- Cards span full width on mobile (< 768px)
- Reduced horizontal padding for better space usage

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-02-25 18:40:01 -08:00
Evan Rusackas
c38652bcd4 fix(mobile): Force card view on dashboard list with forceViewMode prop
- Add forceViewMode prop to ListView that overrides URL-persisted viewMode
- useEffect updates viewMode when forceViewMode changes (screen resize)
- DashboardList uses forceViewMode='card' on mobile

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-02-25 18:40:01 -08:00
Evan Rusackas
1f265dd399 feat(mobile): Force card view and hide toggle on dashboard list
- Hide view mode toggle (card/list) on mobile
- Force card view as default on mobile devices
- Table view still works on desktop

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-02-25 18:40:01 -08:00
Evan Rusackas
2d3577683f feat(mobile): Full-width cards on welcome page mobile view
- Cards now span full width on mobile (< 768px)
- Reduced horizontal padding for better use of space

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-02-25 18:40:01 -08:00
Evan Rusackas
711da4d681 fix(mobile): Keep menu horizontal on mobile for compact tabs
- Remove inline menu mode switch on mobile
- Keep horizontal mode so tabs display side-by-side
- CSS makes tabs compact with smaller padding

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-02-25 18:40:01 -08:00
Evan Rusackas
e82249d663 fix(mobile): Hide + button and compact filter tabs on mobile
- Fix CSS selector to use superset-button-secondary class
- Make Favorite/Mine/All tabs display horizontally in compact style
- Reduce padding and font size for space efficiency

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-02-25 18:40:01 -08:00
Evan Rusackas
2d91c138c4 feat(mobile): Simplify welcome page for mobile consumption mode
- Hide Charts and Saved queries sections on mobile (< 768px)
- Only show Recents and Dashboards for dashboard-focused experience
- Hide + (add) buttons on mobile to prevent creation actions
- Keep View All links and filter tabs accessible

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-02-25 18:40:01 -08:00
Evan Rusackas
7dddbb0f4e feat(mobile): Add filter drawer and chart consumption mode for mobile dashboards
- Add left-side filter drawer with vertical filter layout on mobile
- Hide Actions header and show Apply/Clear buttons side by side
- Add filter button to dashboard header (only when filters exist)
- Support leftPanelItems prop in PageHeaderWithActions
- Hide chart kebab menu and disable title links on mobile
- Show full chart titles without truncation on mobile
- Center dashboard title on mobile with filter icon on left

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-02-25 18:40:00 -08:00
Evan Rusackas
960fa46bb9 fix(mobile): remove negative margin on mobile dashboard
On mobile, the filter bar is hidden but the -32px margin-left was still
being applied, causing the dashboard title and Filters button to be cut
off on the left edge.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-02-25 18:38:59 -08:00
Evan Rusackas
32c1bf0f00 feat(mobile): clean up dashboard header for mobile
- Hide star icon on mobile (moved to overflow menu)
- Add favorite toggle, status, owner, modified info to overflow menu
- Hide Edit dashboard button on mobile
- Hide Enter fullscreen and Manage email report menu items on mobile
- Center logo on mobile with 3-column layout for future left icon

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-02-25 18:38:58 -08:00
Evan Rusackas
1d1fc7a9ec feat(mobile): improve mobile nav drawer UX
- Hide main nav items on mobile, show hamburger menu in header
- Simplify drawer: remove header, full-width items, no right border
- Show only consumption items (Dashboards, Theme, User/Logout, About)
- Hide create actions and admin settings on mobile
- Pass menu prop to RightMenu for Dashboards link

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Opus 4 <noreply@anthropic.com>
2026-02-25 18:37:28 -08:00
Evan Rusackas
594cf060b8 feat(mobile): add drawer menus for nav and filters on mobile
- Add hamburger menu in global nav that opens a Drawer with menu items
- Add "Filters" button on mobile that opens a bottom Drawer with FilterBar
- Replace horizontal menus with mobile-friendly drawer pattern

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-02-25 18:37:28 -08:00
Evan Rusackas
665c283989 fix(dashboard): make charts full-width on mobile
- Override ResizableContainer width/max-width/min-width on mobile
- Add CSS overrides in Chart styles for nested containers
- Pass '100%' width to ChartRenderer on mobile viewport

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-02-25 18:37:27 -08:00
Evan Rusackas
247bd9c3c3 feat(dashboard): add mobile-friendly dashboard consumption mode
- Filter global nav to show only Dashboards on mobile (<768px)
- Stack dashboard charts vertically instead of row layout on mobile
- Make dashboard tabs sticky for easier navigation on mobile
- Hide native filters on mobile for simplified viewing

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-02-25 18:37:27 -08:00
Enzo Martellucci
26053a8b5d fix(alert-modal): show the add filter button on firefox (#38093) 2026-02-25 23:42:05 +01:00
Amin Ghadersohi
abf0b7cf4b fix(mcp): use broad Exception in outermost tool-level handlers (#38254) 2026-02-25 22:08:56 +01:00
Amin Ghadersohi
eef4d95c22 fix(mcp): add dataset validation for chart tools (#37185)
Co-authored-by: Claude Opus 4.5 <noreply@anthropic.com>
2026-02-25 18:54:47 +01:00
Amin Ghadersohi
cc1128a404 feat(mcp): add response size guard to prevent oversized responses (#37200) 2026-02-25 09:43:14 -08:00
Amin Ghadersohi
c54b21ef98 fix(mcp): add eager loading to get_info tools to prevent N+1 query timeouts (#38129)
Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-25 17:28:58 +01:00
dependabot[bot]
438a927420 chore(deps-dev): bump oxlint from 1.49.0 to 1.50.0 in /superset-frontend (#38240)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-02-25 23:10:37 +07:00
dependabot[bot]
37a4637018 chore(deps-dev): bump typescript-eslint from 8.56.0 to 8.56.1 in /superset-websocket (#38203)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-02-25 23:09:41 +07:00
dependabot[bot]
79b2647481 chore(deps): bump @swc/core from 1.15.11 to 1.15.13 in /docs (#38207)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-02-25 23:00:21 +07:00
dependabot[bot]
1b605c4dda chore(deps): bump fs-extra from 11.3.2 to 11.3.3 in /superset-frontend (#38234)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-02-25 22:59:44 +07:00
dependabot[bot]
b543358d2f chore(deps-dev): bump @swc/core from 1.15.11 to 1.15.13 in /superset-frontend (#38237)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-02-25 22:59:14 +07:00
Amin Ghadersohi
a1312a86e8 fix(mcp): normalize column names to fix time series filter prompt issue (#37187)
Co-authored-by: Claude Opus 4.5 <noreply@anthropic.com>
2026-02-25 15:27:53 +01:00
Amin Ghadersohi
3084907931 feat(mcp): support unsaved state in Explore and Dashboard tools (#37183)
Co-authored-by: Claude Opus 4.5 <noreply@anthropic.com>
2026-02-25 15:25:23 +01:00
Amin Ghadersohi
1cd35bb102 feat(mcp): dynamic feature availability via menus and feature flags (#37964) 2026-02-25 12:01:44 +01:00
Joe Li
5eb35a4795 fix(reports): validate database field on PUT report schedule (#38084)
Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-24 16:58:19 -08:00
dependabot[bot]
01c1b2eb8f chore(deps-dev): bump @types/lodash from 4.17.23 to 4.17.24 in /superset-frontend (#38224)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-02-25 07:52:22 +07:00
dependabot[bot]
9e4a88dfa2 chore(deps): bump antd from 6.3.0 to 6.3.1 in /docs (#38221)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-02-25 07:51:39 +07:00
dependabot[bot]
4809903bb8 chore(deps): bump markdown-to-jsx from 9.7.4 to 9.7.6 in /superset-frontend (#38225)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-02-25 07:51:11 +07:00
Đỗ Trọng Hải
76a2559b2b fix(ci): revert "chore(deps): bump JustinBeckwith/linkinator-action from 2.3 to 2.4" (#38164) 2026-02-24 13:22:29 -08:00
Mehmet Salih Yavuz
e4a7cd30c3 fix(GAQ): don't use async queries when cache timeout is -1 (#38089) 2026-02-24 23:21:37 +03:00
dependabot[bot]
aa475734ef chore(deps-dev): bump eslint from 9.39.2 to 10.0.2 in /superset-websocket (#38204)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-02-24 10:30:17 -08:00
dependabot[bot]
97b8585fe5 chore(deps-dev): bump typescript-eslint from 8.56.0 to 8.56.1 in /docs (#38209)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-02-24 10:30:05 -08:00
Đỗ Trọng Hải
0d66accc37 chore(build): prevent opening Dependabot PRs for @rjsf/* deps due to React 18 constraint (#37976)
Signed-off-by: hainenber <dotronghai96@gmail.com>
2026-02-24 10:28:05 -08:00
Ville Brofeldt
35c135852e feat(extensions): add mandatory publisher field to extension metadata (#38200) 2026-02-24 09:42:17 -08:00
Evan Rusackas
7b04d251d6 fix(build): restore automatic .d.ts generation in dev mode (#38202)
Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-24 08:27:25 -08:00
Michael S. Molina
974bee14c3 fix(extensions): make LOCAL_EXTENSIONS loading resilient to individual failures (#38217) 2026-02-24 13:17:27 -03:00
Richard Fogaca Nienkotter
fca8a49561 feat: auto refresh dashboard (#37459)
Co-authored-by: Richard <richard@ip-192-168-1-32.sa-east-1.compute.internal>
Co-authored-by: richard <richard@richards-MacBook-Pro-2.local>
Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
Co-authored-by: Evan Rusackas <evan@preset.io>
2026-02-24 11:37:28 -03:00
Vitor Avila
f60432e34c fix: Allow non-owners to fave/unfave charts (#38095) 2026-02-24 11:28:32 -03:00
dependabot[bot]
b8459c15b8 chore(deps-dev): bump @typescript-eslint/parser from 8.56.0 to 8.56.1 in /docs (#38211)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-02-24 20:06:14 +07:00
Evan Rusackas
8eb3046888 fix(docs): guard window reference in logging.ts for SSR compatibility (#38201) 2026-02-23 18:41:49 -08:00
Evan Rusackas
615f13419c fix(jest): ignore storybook-static and package __mocks__ directories (#37946)
Co-authored-by: Claude Opus 4.5 <noreply@anthropic.com>
2026-02-23 16:18:14 -08:00
Evan Rusackas
8a74424545 fix(types): add explicit types for extendedDayjs plugin methods (#37923) 2026-02-24 06:58:46 +07:00
madhushreeag
8f070169a5 perf(datasource): add pagination to datasource editor tables to prevent browser freeze (#37555)
Co-authored-by: madhushree agarwal <madhushree_agarwal@apple.com>
2026-02-23 15:19:33 -08:00
Richard Fogaca Nienkotter
e06427d1ef feat(embedded): add feature flag to disable logout button in embedded contexts (#37537)
Co-authored-by: richard <richard@richards-MacBook-Pro-2.local>
2026-02-23 17:56:02 -03:00
Evan Rusackas
c4eb7de6de fix(excel): remove unwanted index column from Excel exports (#38176)
Co-authored-by: Claude Opus 4.5 <noreply@anthropic.com>
2026-02-23 08:28:40 -08:00
Vitor Avila
228b598409 feat: Labels for encrypted fields (#38075) 2026-02-23 13:23:33 -03:00
Ville Brofeldt
40f609fdce fix(extensions): enforce correct naming conventions (#38167) 2026-02-23 08:21:35 -08:00
Amin Ghadersohi
6e94a6c21a fix(mcp): fix dashboard chart placement with proper COLUMN layout and tab support (#37970)
Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-23 16:41:10 +01:00
Evan Rusackas
50cc1b93d2 fix(security): fix Guest Token API 422 error by disabling JWT sub claim verification (#38177)
Co-authored-by: hainenber <dotronghai96@gmail.com>
Co-authored-by: Claude Opus 4.5 <noreply@anthropic.com>
2026-02-23 21:58:01 +07:00
Evan Rusackas
131a97b657 fix(handlebars): add missing currencyformatter.js dependency (#38173)
Co-authored-by: Claude Opus 4.5 <noreply@anthropic.com>
2026-02-23 21:56:50 +07:00
dependabot[bot]
6f3a200c19 chore(deps-dev): bump @types/lodash from 4.17.23 to 4.17.24 in /superset-websocket (#38179)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-02-23 21:54:18 +07:00
Michael S. Molina
cbb80f0462 refactor(extensions): simplify registerEditorProvider API (#38127) 2026-02-23 09:04:31 -03:00
Amin Ghadersohi
2a3567d2f1 fix(mcp): Remove unsupported thumbnail/preview URLs and internal fields from MCP schemas (#38109) 2026-02-23 12:44:12 +01:00
Evan Rusackas
3f64ad3da5 fix(i18n): wrap untranslated frontend strings and add i18n lint rule (#37776)
Co-authored-by: Claude Opus 4.5 <noreply@anthropic.com>
2026-02-22 21:27:37 -08:00
Evan Rusackas
672a380587 chore(frontend): enable additional oxlint rules for better code hygiene (#38145) 2026-02-23 10:36:24 +07:00
Rohan Santhosh
a87a006aae ci: declare explicit permissions in maintenance workflows (#38159)
Co-authored-by: rohan436 <rohan.santhoshkumar@googlemail.com>
2026-02-22 12:05:58 +07:00
dependabot[bot]
159fb5d6f4 chore(deps-dev): bump ajv from 6.12.6 to 6.14.0 in /superset-frontend/cypress-base (#38131)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-02-21 17:34:27 -08:00
dependabot[bot]
6424194c87 chore(deps): bump underscore from 1.13.7 to 1.13.8 in /superset-frontend (#38142)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-02-21 13:25:26 +07:00
dependabot[bot]
5bee32ea93 chore(deps): bump aquasecurity/trivy-action from 0.34.0 to 0.34.1 (#38138)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-02-21 11:45:40 +07:00
dependabot[bot]
82fce8d7de chore(deps-dev): bump @types/node from 25.2.3 to 25.3.0 in /superset-frontend (#38143)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-02-21 11:45:06 +07:00
dependabot[bot]
5e6524954c chore(deps): pin react-icons to 5.4.0 in /superset-frontend (#38144)
Signed-off-by: dependabot[bot] <support@github.com>
Signed-off-by: hainenber <dotronghai96@gmail.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: hainenber <dotronghai96@gmail.com>
2026-02-21 11:44:46 +07:00
dependabot[bot]
987b6a6f04 chore(deps): bump swagger-ui-react from 5.31.1 to 5.31.2 in /docs (#38140)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-02-21 07:43:21 +07:00
Đỗ Trọng Hải
3d6644864d build(deps): migrate to lighter and modern react-icons (#38125)
Signed-off-by: hainenber <dotronghai96@gmail.com>
2026-02-20 16:19:01 -08:00
dependabot[bot]
577b965a60 chore(deps-dev): bump ajv from 6.12.6 to 6.14.0 in /superset-frontend (#38132)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-02-21 06:51:28 +07:00
Enzo Martellucci
b565128fe7 fix(charts): improve error display for failed charts in dashboards (#37939) 2026-02-20 15:14:48 -08:00
madhushreeag
b290f71245 fix(explore): prevent theme object from being passed to ReactAce in TextAreaControl (#38117)
Co-authored-by: madhushree agarwal <madhushree_agarwal@apple.com>
2026-02-20 14:16:07 -08:00
dependabot[bot]
cff854b06e chore(deps-dev): bump oxlint from 1.48.0 to 1.49.0 in /superset-frontend (#38115)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: hainenber <dotronghai96@gmail.com>
2026-02-20 14:13:37 -08:00
Manoj S
44d6b6a513 fix(table): preserve line breaks in cell content modal (#37036) 2026-02-20 14:12:14 -08:00
Ujjwaljain16
2d44f52ad1 fix(encryption): resolve SECRET_KEY lazily to fix silent re-encrypt-secrets failures (#37982) 2026-02-20 14:10:09 -08:00
wuqicyber
6f34ba7d4a fix(table-chart): support orderby adhoc columns with server-side pagination (#37521) 2026-02-21 00:29:34 +03:00
Damian Pendrak
1a77e17179 fix(chart-customizations): support migration of dynamic group by (#37176) 2026-02-20 13:11:07 -08:00
Gabriel Torres Ruiz
6fdaa8e9b3 fix(crud): reorder table actions + improve react memoization + improve hooks (#37897) 2026-02-20 08:58:28 -08:00
Kamil Gabryjelski
e30a9caba5 fix(dataset-modal): fix folders tab scrollbar by establishing proper flex chain (#38123)
Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-20 16:54:23 +01:00
Kamil Gabryjelski
7937246575 fix(button): use colorLink token for link-style buttons (#38121)
Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-20 16:54:05 +01:00
Amin Ghadersohi
9f8b212ccc feat(mcp): add LIKE, ILIKE, IN, NOT IN filter operators to MCP chart tools (#38071) 2026-02-20 11:56:40 +01:00
Amin Ghadersohi
1ecff6fe5c fix(thumbnails): stabilize digest by sorting datasources and charts (#38079)
Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-20 09:51:35 +01:00
dependabot[bot]
69653dfd08 chore(deps-dev): bump baseline-browser-mapping from 2.9.19 to 2.10.0 in /superset-frontend (#38116)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-02-20 15:35:18 +07:00
dependabot[bot]
58d8aa01f8 chore(deps): bump react-intersection-observer from 10.0.2 to 10.0.3 in /superset-frontend (#38114)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-02-20 15:34:15 +07:00
dependabot[bot]
88f0e322e3 chore(deps): bump baseline-browser-mapping from 2.9.19 to 2.10.0 in /docs (#38113)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-02-20 15:33:56 +07:00
Vanessa Giannoni
f4acce5727 fix(table): preserve time grain aggregation when temporal column casing changes (#37893) 2026-02-19 16:46:39 -08:00
Richard Fogaca Nienkotter
5278deaf63 fix(metrics): normalize legacy currency strings (#37455) 2026-02-19 21:25:44 -03:00
Mehmet Salih Yavuz
3868821dc8 fix(webpack): skip building service worker in dev (#38106) 2026-02-20 00:26:16 +03:00
Joe Li
6a61baf5be fix(alerts): show friendly filter names in report edit modal (#38054)
Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-19 10:33:33 -08:00
dependabot[bot]
5cc8ae5427 chore(deps): bump ol from 7.5.2 to 10.8.0 in /superset-frontend (#37961)
Signed-off-by: dependabot[bot] <support@github.com>
Signed-off-by: hainenber <dotronghai96@gmail.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: hainenber <dotronghai96@gmail.com>
2026-02-19 10:32:32 -08:00
Michael S. Molina
1f76944c2b fix: Add editors to ContributionConfig and additional properties to EditorKeyword (#38098) 2026-02-19 15:00:21 -03:00
Kamil Gabryjelski
f049d3e34a fix: Search in folders editor with verbose names (#38101) 2026-02-19 18:45:22 +01:00
Kamil Gabryjelski
86c8fa5cd7 fix: Badge count in folders editor (#38100) 2026-02-19 18:45:04 +01:00
Kamil Gabryjelski
e12140beb6 fix: Warning toast copy in folders editor (#38099) 2026-02-19 18:22:22 +01:00
Kamil Gabryjelski
b7a3224f04 feat: Larger folder drag area in folders editor (#38102) 2026-02-19 18:22:04 +01:00
Kamil Gabryjelski
f5a5a804e2 perf(dashboard): skip thumbnail_url computing on single dashboard endpoint (#38015)
Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-19 18:15:20 +01:00
Đỗ Trọng Hải
0b77ace110 chore: fix lint issue with no-unsafe-optional-chaining rule (#38103)
Signed-off-by: hainenber <dotronghai96@gmail.com>
2026-02-19 23:54:37 +07:00
Levis Mbote
c175346808 fix(table-charts): Prevent time grain from altering Raw Records in Tables + Interactive Tables (#37561) 2026-02-19 10:24:09 +01:00
Evan Rusackas
6b80135aa2 chore(lint): enforce more strict eslint/oxlint rules (batch 2) (#37884)
Co-authored-by: Claude Opus 4.5 <noreply@anthropic.com>
2026-02-18 19:27:27 -08:00
RealGreenDragon
de079a7b19 feat(deps)!: bump postgresql from 16 to 17 (#37782) 2026-02-18 17:12:48 -08:00
dependabot[bot]
f54bbdc06b chore(deps): bump dawidd6/action-download-artifact from 14 to 15 (#38060)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-02-18 17:11:41 -08:00
SBIN2010
33441ccf3d feat: add formatting column and formatting object to conditional formatting table (#35897) 2026-02-19 02:07:15 +03:00
Vitor Avila
9ec56f5f02 fix: Include app_root in next param (#37942) 2026-02-18 19:52:06 -03:00
dependabot[bot]
11a36ff488 chore(deps-dev): bump the storybook group across 1 directory with 11 updates (#38068) 2026-02-18 23:48:16 +07:00
Đỗ Trọng Hải
af3e088233 build(deps): resolve GHSA-36jr-mh4h-2g58 by upgrading d3-color to 3.1.0 (#37981)
Signed-off-by: hainenber <dotronghai96@gmail.com>
2026-02-18 21:12:39 +07:00
dependabot[bot]
29f499528f chore(deps-dev): bump eslint-plugin-testing-library from 7.15.4 to 7.16.0 in /superset-frontend (#38066)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-02-18 21:01:04 +07:00
dependabot[bot]
21481eef4f chore(deps): bump the storybook group in /docs with 9 updates (#38067)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-02-18 21:00:01 +07:00
dependabot[bot]
0d2c8fd373 chore(deps): bump @storybook/core from 8.6.15 to 8.6.16 in /docs (#38046)
Signed-off-by: dependabot[bot] <support@github.com>
Signed-off-by: hainenber <dotronghai96@gmail.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: hainenber <dotronghai96@gmail.com>
2026-02-18 20:22:21 +07:00
Đỗ Trọng Hải
7b56fc1714 fix(docs): correct DB module filename for editing + update DB metadata file (#37990)
Signed-off-by: hainenber <dotronghai96@gmail.com>
2026-02-18 20:08:50 +07:00
Đỗ Trọng Hải
9131739f98 fix(home): null check for possibly undefined filtered other table data due to insufficient permission (#37983) 2026-02-18 17:33:51 +07:00
Đỗ Trọng Hải
a30492f55e fix(plugin/cal-heatmap): properly color tooltip's text for both dark/light theme (#38010) 2026-02-18 17:25:41 +07:00
dependabot[bot]
090eab099c chore(deps): bump storybook from 8.6.15 to 8.6.16 in /docs (#38043) 2026-02-18 16:23:26 +07:00
dependabot[bot]
cd4cd53726 chore(deps-dev): bump css-loader from 7.1.3 to 7.1.4 in /superset-frontend (#38050) 2026-02-18 16:21:39 +07:00
dependabot[bot]
65c460c9d2 chore(deps-dev): bump @swc/plugin-emotion from 14.5.0 to 14.6.0 in /superset-frontend (#38053) 2026-02-18 16:20:49 +07:00
dependabot[bot]
868e719c60 chore(deps-dev): bump oxlint from 1.47.0 to 1.48.0 in /superset-frontend (#38055) 2026-02-18 16:20:16 +07:00
dependabot[bot]
5efc7ea5a5 chore(deps-dev): bump typescript-eslint from 8.55.0 to 8.56.0 in /docs (#38024)
Signed-off-by: dependabot[bot] <support@github.com>
Signed-off-by: hainenber <dotronghai96@gmail.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: hainenber <dotronghai96@gmail.com>
2026-02-18 12:10:50 +07:00
dependabot[bot]
b0f9a73f63 chore(deps-dev): bump typescript-eslint from 8.54.0 to 8.56.0 in /superset-websocket (#38020)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-02-18 11:49:11 +07:00
dependabot[bot]
746e266e90 chore(deps): bump swagger-ui-react from 5.31.0 to 5.31.1 in /docs (#38023)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-02-18 11:37:51 +07:00
Damian Pendrak
5a777c0f45 feat(matrixify): add single metric constraint (#37169) 2026-02-17 09:12:24 -08:00
Amin Ghadersohi
aec1f6edce fix(mcp): use last data-bearing statement in execute_sql response (#37968)
Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-17 13:13:55 +01:00
Amin Ghadersohi
f7218e7a19 feat(mcp): expose current user identity in get_instance_info and add created_by_fk filter (#37967) 2026-02-17 13:11:34 +01:00
Amin Ghadersohi
5cd829f13c fix(mcp): handle more chart types in get_chart_data fallback query construction (#37969)
Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-17 13:02:42 +01:00
dependabot[bot]
9566e8a9c6 chore(deps-dev): bump eslint-plugin-react-you-might-not-need-an-effect from 0.8.5 to 0.9.1 in /superset-frontend (#38000)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-02-17 12:03:13 +07:00
dependabot[bot]
604d49f557 chore(deps): bump datamaps from 0.5.9 to 0.5.10 in /superset-frontend (#37913)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-02-16 14:51:03 -08:00
SBIN2010
84f1ee4409 feat: added conditional formatting enhancements string to pivot table (#35863) 2026-02-17 01:08:41 +03:00
Kamil Gabryjelski
3e3c9686de perf(dashboard): Batch RLS filter lookups for dashboard digest computation (#37941) 2026-02-16 21:35:55 +01:00
Mehmet Salih Yavuz
7b21979fa3 fix(charts): Force refresh uses async mode when GAQ is enabled (#37845) 2026-02-16 21:45:10 +03:00
Đỗ Trọng Hải
8853ff19d4 chore(websocket): migrate external uuid usage with Node's native UUID generator (#37101)
Signed-off-by: hainenber <dotronghai96@gmail.com>
2026-02-16 18:05:10 +07:00
Damian Pendrak
22ac5e02b6 fix(deckgl): remove dataset field from Deck.gl Layer Visibility Display controls (#37611) 2026-02-16 11:58:23 +01:00
dependabot[bot]
2c9f0c1c2a chore(deps-dev): bump wait-on from 9.0.3 to 9.0.4 in /superset-frontend (#37999)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-02-16 17:18:23 +07:00
dependabot[bot]
d47a7105df chore(deps): bump caniuse-lite from 1.0.30001769 to 1.0.30001770 in /docs (#37994)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-02-16 15:42:53 +07:00
dependabot[bot]
c873225308 chore(deps-dev): bump jsdom from 28.0.0 to 28.1.0 in /superset-frontend (#37997)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-02-16 15:42:28 +07:00
dependabot[bot]
982e2c1ef7 chore(deps-dev): bump webpack from 5.105.0 to 5.105.2 in /superset-frontend (#38003)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-02-16 15:36:17 +07:00
dependabot[bot]
eee3af5775 chore(deps-dev): bump oxlint from 1.46.0 to 1.47.0 in /superset-frontend (#38005)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-02-16 15:35:29 +07:00
dependabot[bot]
232b34d944 chore(deps-dev): bump webpack-sources from 3.3.3 to 3.3.4 in /superset-frontend (#38004)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-02-16 15:35:03 +07:00
dependabot[bot]
d748ed19ce chore(deps): bump hot-shots from 13.2.0 to 14.0.0 in /superset-websocket (#37993) 2026-02-16 15:16:31 +07:00
dependabot[bot]
5300f65a74 chore(deps): bump qs from 6.14.1 to 6.14.2 in /superset-frontend (#37936)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-02-16 13:39:06 +07:00
Türker Ziya Ercin
440602ef34 fix(utils): datetime_to_epoch function is fixed to timezone aware epoch (#37979) 2026-02-15 22:36:18 +07:00
dependabot[bot]
cbf153845e chore(deps): bump qs from 6.14.1 to 6.14.2 in /superset-websocket/utils/client-ws-app (#37933)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-02-14 22:18:14 +07:00
dependabot[bot]
097f474f24 chore(deps): bump pillow from 11.3.0 to 12.1.1 (#37935)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-02-13 16:00:47 -08:00
Joe Li
73adff55ee chore(deps): Relax sqlalchemy-utils lower bound for pydoris compatibility (#37949) 2026-02-13 14:55:54 -08:00
dependabot[bot]
a65f73a532 chore(deps): bump qs from 6.14.1 to 6.14.2 in /docs (#37937)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-02-14 01:01:42 +07:00
dependabot[bot]
475615e118 chore(deps): bump ioredis from 5.9.2 to 5.9.3 in /superset-websocket (#37951)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-02-13 23:40:50 +07:00
dependabot[bot]
79f51e2ae7 chore(deps-dev): bump webpack from 5.105.1 to 5.105.2 in /docs (#37953)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-02-13 23:39:56 +07:00
dependabot[bot]
75d6a95ac3 chore(deps): bump aquasecurity/trivy-action from 0.33.1 to 0.34.0 (#37958)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-02-13 23:39:30 +07:00
dependabot[bot]
ffd7f10320 chore(deps): bump markdown-to-jsx from 9.7.3 to 9.7.4 in /superset-frontend (#37959)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-02-13 23:09:31 +07:00
Michael S. Molina
e3e2bece6b feat(owners): display email in owner selectors (#37906) 2026-02-13 09:01:05 -03:00
Jean Massucatto
0c0d915391 fix(echarts-timeseries-combined-labels): combine annotation labels for events at same timestamp (#37164) 2026-02-13 12:39:28 +03:00
Jamile Celento
080f629ea2 fix(echarts): formula annotations not rendering with dataset-level columns label (#37522) 2026-02-13 12:37:19 +03:00
Joe Li
142b2cc425 test(e2e): add Playwright E2E tests for Chart List page (#37866)
Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-12 14:16:11 -08:00
Joe Li
6328e51620 test(examples): add tests for UUID threading and security bypass (#37557)
Co-authored-by: Claude Opus 4.5 <noreply@anthropic.com>
2026-02-12 14:12:12 -08:00
Joe Li
0d5ddb3674 feat(themes): add enhanced validation and error handling with fallback mechanisms (#37378)
Co-authored-by: Rafael Benitez <rebenitez1802@gmail.com>
Co-authored-by: Claude <noreply@anthropic.com>
2026-02-12 14:06:58 -08:00
Pat Buxton
58d245c6b0 chore(deps): Update sqlachemy-utils to 0.42.0 (#36240) 2026-02-12 12:39:06 -08:00
Jean Massucatto
dbf5e1f131 feat(theme): use IBM Plex Mono for code and numerical displays (#37366)
Co-authored-by: Mehmet Salih Yavuz <salih.yavuz@proton.me>
Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-12 09:32:41 -08:00
Jonathan Alberth Quispe Fuentes
88ce1425e2 fix(roles): optimize user fetching and resolve N+1 query issue (#37235) 2026-02-12 09:32:19 -08:00
Amin Ghadersohi
4dfece9ee5 feat(mcp): add event_logger instrumentation to MCP tools (#37859) 2026-02-12 16:50:20 +01:00
Amin Ghadersohi
3f64c25712 fix(mcp): Add database_name as valid filter column for list_datasets (#37865)
Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-12 16:47:46 +01:00
dependabot[bot]
afacca350f chore(deps-dev): bump oxlint from 1.42.0 to 1.46.0 in /superset-frontend (#37917)
Signed-off-by: dependabot[bot] <support@github.com>
Signed-off-by: hainenber <dotronghai96@gmail.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: hainenber <dotronghai96@gmail.com>
2026-02-12 21:45:26 +07:00
dependabot[bot]
30ccbb2e05 chore(deps): update @types/geojson requirement from ^7946.0.10 to ^7946.0.16 in /superset-frontend/plugins/plugin-chart-cartodiagram (#37908)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-02-12 20:59:28 +07:00
Michael S. Molina
19ec7b48a0 fix: Conditional formatting painting empty cells (#37894) 2026-02-12 10:22:00 -03:00
Vanessa Giannoni
77148277b9 feat(charts): improve negative stacked bar label positioning and accessibility (#37405) 2026-02-11 17:46:10 -08:00
Evan Rusackas
981b370fe9 chore(storybook): consolidate storybook and enhance plugin stories (#37771)
Co-authored-by: Claude Opus 4.5 <noreply@anthropic.com>
2026-02-11 16:06:23 -08:00
Enzo Martellucci
b012b63e5b fix(native-filters): align refresh icon with default value field (#37802) 2026-02-11 21:26:26 +01:00
Kamil Gabryjelski
b0be47a4ac fix: Unreachable drop zones within tabs in dashbboard editor (#37904) 2026-02-11 21:08:19 +01:00
Ville Brofeldt
00d02cb2ea perf(gtf): improve task base filter (#37900) 2026-02-11 10:40:07 -08:00
Kamil Gabryjelski
26a2e12779 perf: fix N+1 query in Slice.datasource property (#37899) 2026-02-11 18:57:28 +01:00
Luis Sánchez
5f0001affc feat(timeseries): remove stream style for bar charts (#37532) 2026-02-11 09:25:03 -08:00
Ville Brofeldt
255a0ada81 fix(gtf): add missing user_id to task commands (#37867) 2026-02-11 09:04:27 -08:00
Evan Rusackas
9089f30045 chore(lint): upgrade array creation, effect, and TypeScript rules (#37885)
Co-authored-by: Claude Opus 4.5 <noreply@anthropic.com>
2026-02-11 08:45:21 -08:00
Kamil Gabryjelski
98ca599eef perf: fix N+1 query in chart list API when thumbnail_url is requested (#37895) 2026-02-11 17:19:48 +01:00
Evan Rusackas
d640fe42c9 chore: remove Applitools visual testing integration (#37873)
Co-authored-by: Claude Opus 4.5 <noreply@anthropic.com>
2026-02-11 08:07:19 -08:00
Evan Rusackas
534fa48f1f chore(lint): enforce stricter eslint/oxlint rules (#37883)
Co-authored-by: Claude Opus 4.5 <noreply@anthropic.com>
2026-02-11 08:07:02 -08:00
Evan Rusackas
c28729f944 chore(lint): add jest/expect-expect rule for test assertions (#37887)
Co-authored-by: Claude Opus 4.5 <noreply@anthropic.com>
2026-02-11 08:06:34 -08:00
Luis Sánchez
88a14f2ba0 fix(FiltersBadge): world map wont show filter icon after refresh page (#37260) 2026-02-11 16:33:32 +03:00
Ville Brofeldt
74e1607010 fix(extensions): broken test (#37871) 2026-02-11 08:33:45 -03:00
Mehmet Salih Yavuz
69c679be20 fix(explore): Don't show unsaved changes modal on new charts (#37714) 2026-02-11 13:05:42 +03:00
Evan Rusackas
9a79dbf445 fix(docs): make page size selector work in database table (#37863)
Co-authored-by: Claude Opus 4.5 <noreply@anthropic.com>
2026-02-10 22:15:55 -08:00
dependabot[bot]
7e5ca83220 chore(deps-dev): bump @types/node from 25.2.2 to 25.2.3 in /superset-frontend (#37851)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-02-10 14:24:55 -08:00
dependabot[bot]
7d4a7f113c chore(deps-dev): bump webpack from 5.105.0 to 5.105.1 in /docs (#37849)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-02-10 14:24:13 -08:00
dependabot[bot]
4eb8fc814a chore(deps-dev): bump @types/node from 25.2.2 to 25.2.3 in /superset-websocket (#37846)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-02-10 14:23:11 -08:00
Tadas Barzdžius
39ac96817a fix(helm): Add default initContainer resources (#37637) 2026-02-10 11:51:37 -08:00
Levis Mbote
1388a62823 fix(filters): fix filter / customization name not updating in sidebar in real time (#37358) 2026-02-10 20:41:47 +01:00
Michael S. Molina
6a6b9b5386 chore: Bump core packages (0.0.1rc11, 0.0.1rc4) (#37860) 2026-02-10 16:37:07 -03:00
Michael S. Molina
b98b34a60f refactor: Make extensions contribution schema consistent (#37856) 2026-02-10 15:55:39 -03:00
Kamil Gabryjelski
7ec5f1d7ec fix(native-filters): Filters with select first value not restored correctly from url (#37855) 2026-02-10 18:54:42 +01:00
Đỗ Trọng Hải
76aa91f5ea fix(deps): pin react-error-boundary to 6.0.0 for React 17 peer dep constraint (#37706)
Signed-off-by: hainenber <dotronghai96@gmail.com>
2026-02-10 23:22:14 +07:00
Michael S. Molina
c41942a38a chore(deps): Upgrade sqlglot from 27.15.2 to 28.10.0 (#37841) 2026-02-10 13:13:11 -03:00
Alexandru Soare
ae8d671fea fix(sql): fix sql suggestions (#37699) 2026-02-10 17:30:17 +02:00
Enzo Martellucci
c59d0a73d4 fix: Prevent table rows from overlapping pagination in table view (#37174)
Co-authored-by: Diego Pucci <diegopucci.me@gmail.com>
2026-02-10 16:01:39 +01:00
Ville Brofeldt
0f1278fa61 fix(gtf): set dedup_key on atomic sql (#37820) 2026-02-10 06:56:14 -08:00
dependabot[bot]
948b1d613b chore(deps-dev): bump typescript-eslint from 8.54.0 to 8.55.0 in /docs (#37825)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-02-10 21:52:30 +07:00
dependabot[bot]
3af795af36 chore(deps-dev): bump @typescript-eslint/eslint-plugin from 8.54.0 to 8.55.0 in /superset-websocket (#37822)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-02-10 21:36:12 +07:00
dependabot[bot]
1cba53a043 chore(deps-dev): bump @typescript-eslint/parser from 8.54.0 to 8.55.0 in /superset-websocket (#37823)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-02-10 01:09:19 -08:00
dependabot[bot]
8c6bc3eaea chore(deps): bump antd from 6.2.3 to 6.3.0 in /docs (#37824)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-02-10 01:08:58 -08:00
dependabot[bot]
4d8ff84587 chore(deps-dev): bump @playwright/test from 1.58.1 to 1.58.2 in /superset-frontend (#37826)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-02-10 01:07:55 -08:00
dependabot[bot]
f370da5a87 chore(deps-dev): bump @typescript-eslint/parser from 8.54.0 to 8.55.0 in /docs (#37827)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-02-10 01:07:36 -08:00
dependabot[bot]
2df60f9caf chore(deps): bump immer from 11.1.3 to 11.1.4 in /superset-frontend (#37830)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-02-10 01:07:08 -08:00
dependabot[bot]
d078f18ff8 chore(deps-dev): bump @types/node from 25.2.1 to 25.2.2 in /superset-websocket (#37796)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-02-10 01:06:17 -08:00
dependabot[bot]
6ca028dee9 chore(deps): bump axios from 1.12.2 to 1.13.5 in /docs (#37814)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-02-10 01:05:54 -08:00
Tu Shaokun
76351ff12c fix(i18n): ensure language pack loads before React renders (#36893) 2026-02-10 00:29:04 -08:00
Joe Li
f6f96ecc49 test(chart-list): migrate Chart List tests from Cypress to RTL (#37813)
Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-09 16:25:49 -08:00
Ville Brofeldt
59dd2fa385 feat: add global task framework (#36368) 2026-02-09 10:45:56 -08:00
Michael S. Molina
6984e93171 fix: SQL Lab improvements and bug fixes (#37760) 2026-02-09 14:29:08 -03:00
Kamil Gabryjelski
f25d95be41 fix: Vertical lines in the middle of Treemap categories (#37808) 2026-02-09 17:44:59 +01:00
Đỗ Trọng Hải
5125a67002 build(dev-deps): remove npm from @apache-superset/core (#37774)
Signed-off-by: hainenber <dotronghai96@gmail.com>
2026-02-09 13:00:40 -03:00
dependabot[bot]
059b57d784 chore(deps-dev): bump @types/node from 25.2.1 to 25.2.2 in /superset-frontend (#37801)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-02-09 22:47:12 +07:00
Đỗ Trọng Hải
a1d65c7529 feat(deps): significant npm audit fix to trim off inadvertently runtime dep from upstream libraries (#37220)
Signed-off-by: hainenber <dotronghai96@gmail.com>
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2026-02-09 22:13:57 +07:00
Amin Ghadersohi
15b3c96f8e fix(security): Add table blocklist and fix MCP SQL validation bypass (#37411) 2026-02-09 14:12:06 +01:00
Alexandru Soare
2b411b32ba fix(scatter): Fix ad-hoc metric for pointsize (#37669) 2026-02-09 11:13:06 +02:00
Nikhil
cebdb9e0b7 fix(ListView): add tooltip for layout toggle buttons (#37581)
Co-authored-by: root <root@DESKTOP-LUKSKD7.localdomain>
Co-authored-by: codeant-ai-for-open-source[bot] <244253245+codeant-ai-for-open-source[bot]@users.noreply.github.com>
2026-02-09 02:37:53 -05:00
dependabot[bot]
ce872ddaf0 chore(deps-dev): bump @swc/core from 1.14.0 to 1.15.11 in /superset-frontend (#37511)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Evan Rusackas <evan@rusackas.com>
Co-authored-by: Claude Opus 4.5 <noreply@anthropic.com>
2026-02-09 02:27:45 -05:00
dependabot[bot]
29aa69b779 chore(deps): update @luma.gl/engine requirement from ~9.2.5 to ~9.2.6 in /superset-frontend/plugins/legacy-preset-chart-deckgl (#37762)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-02-09 01:28:59 -05:00
Evan Rusackas
ebee9bb3f9 refactor(types): consolidate shared table types and fix Funnel enum typo (#37768)
Co-authored-by: Claude Opus 4.5 <noreply@anthropic.com>
2026-02-09 01:14:34 -05:00
Vinícius Borges Alencar
82d6076804 refactor(charts): filter saved metrics by key and label (#37136) 2026-02-09 07:29:32 +03:00
Đỗ Trọng Hải
3b75af9ac3 docs(dev_portal/test): remove refs of testing tools not used in project (#37786)
Signed-off-by: hainenber <dotronghai96@gmail.com>
2026-02-08 16:46:31 +07:00
Đỗ Trọng Hải
563d9f1a3f chore(lint): migrate Jest lint rules from eslint to oxlint (#37787)
Signed-off-by: hainenber <dotronghai96@gmail.com>
2026-02-08 16:44:42 +07:00
Đỗ Trọng Hải
c4d2d42b3b build(dev-deps): move Webpack-dedicated js-yaml-loader to dev deps section (#37788)
Signed-off-by: hainenber <dotronghai96@gmail.com>
2026-02-08 01:37:12 -08:00
dependabot[bot]
7580bd1401 chore(deps-dev): bump timezone-mock from 1.3.6 to 1.4.0 in /superset-frontend (#37333)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Evan Rusackas <evan@rusackas.com>
Co-authored-by: Claude Opus 4.5 <noreply@anthropic.com>
2026-02-07 23:07:29 -08:00
SBIN2010
c4e7c3b03b refactor: consolidating ColorSchemeEnum settings into one place (#37591) 2026-02-07 23:04:20 -08:00
dependabot[bot]
3521f191b2 chore(deps): bump webpack from 5.96.1 to 5.105.0 in /superset-frontend/cypress-base (#37775)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-02-07 10:18:47 -08:00
Rini Misini
f4708a5648 fix(db): prevent long database error messages from overflowing UI (#37709)
Co-authored-by: RiniMisini12 <misinirini@gmail.com>
2026-02-07 21:13:09 +07:00
dependabot[bot]
b9ab03994a chore(deps-dev): bump jsdom from 27.4.0 to 28.0.0 in /superset-frontend (#37688)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-02-07 15:32:38 +07:00
dependabot[bot]
df253f6aa4 chore(deps-dev): bump @babel/plugin-transform-runtime from 7.28.5 to 7.29.0 in /superset-frontend (#37631)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-02-07 00:12:35 -08:00
dependabot[bot]
5cea4fb7fe chore(deps): update @luma.gl/shadertools requirement from ~9.2.5 to ~9.2.6 in /superset-frontend/plugins/legacy-preset-chart-deckgl (#37763)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-02-06 23:55:22 -08:00
dependabot[bot]
76a27d5360 chore(deps): bump d3-format from 1.4.5 to 3.1.2 in /superset-frontend/packages/superset-ui-core (#37442)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Evan Rusackas <evan@rusackas.com>
Co-authored-by: Claude Opus 4.5 <noreply@anthropic.com>
2026-02-06 23:42:39 -08:00
dependabot[bot]
174e3c26d3 chore(deps): update @luma.gl/webgl requirement from ~9.2.5 to ~9.2.6 in /superset-frontend/plugins/legacy-preset-chart-deckgl (#37764)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-02-06 23:32:37 -08:00
Alexandru Soare
9ea5ded988 fix(dashboard): Prevent fatal error when database connection is unavailable (#37576) 2026-02-06 20:52:17 -08:00
Đỗ Trọng Hải
9086ae8e6c feat(ci): only bump patch version for Storybook-related deps until React 18 (#37749)
Signed-off-by: hainenber <dotronghai96@gmail.com>
2026-02-06 13:29:32 -08:00
Evan Rusackas
fc5506e466 chore(frontend): comprehensive TypeScript quality improvements (#37625)
Co-authored-by: Claude Opus 4.5 <noreply@anthropic.com>
2026-02-06 13:16:57 -08:00
Abhishek Mishra
e9ae212c1c fix(alerts): show screenshot width field for PDF reports (#37037) 2026-02-06 11:19:18 -08:00
Evan Rusackas
46bca32677 docs(seo): add structured data, OpenGraph tags, and sitemap improvements (#37404)
Co-authored-by: Claude Opus 4.5 <noreply@anthropic.com>
2026-02-06 11:09:19 -08:00
JUST.in DO IT
a04571fa20 fix(world-map): reset hover highlight on mouse out (#37716)
Co-authored-by: Arunodoy18 <arunodoy630@gmail.com>
2026-02-06 10:27:57 -08:00
Evan Rusackas
fc26dbfebf chore(deps): upgrade deck.gl and luma.gl packages to ~9.2.6 (#37718)
Co-authored-by: Claude Opus 4.5 <noreply@anthropic.com>
2026-02-06 10:20:50 -08:00
Michael S. Molina
0415118544 chore: Bump @apache-superset/core (0.0.1-rc10) (#37759) 2026-02-06 14:18:22 -03:00
Michael S. Molina
935bbe6061 docs: Updates extensions docs (#37704) 2026-02-06 13:18:25 -03:00
Daniel Vaz Gaspar
ec6eaf4898 fix(deps): bump elasticsearch-dbapi to 0.2.12 for urllib3 2.x compatibility (#37758)
Co-authored-by: Claude Opus 4.5 <noreply@anthropic.com>
2026-02-06 16:03:04 +00:00
1473 changed files with 140084 additions and 72933 deletions

.github/dependabot.yml vendored

@@ -9,12 +9,19 @@ updates:
- package-ecosystem: "npm"
ignore:
# not until React >= 18.0.0
# TODO: remove below entries until React >= 18.0.0
- dependency-name: "storybook"
update-types: ["version-update:semver-major", "version-update:semver-minor"]
- dependency-name: "@storybook*"
update-types: ["version-update:semver-major", "version-update:semver-minor"]
- dependency-name: "eslint-plugin-storybook"
- dependency-name: "react-error-boundary"
- dependency-name: "@rjsf/*"
# remark-gfm v4+ requires react-markdown v9+, which needs React 18
- dependency-name: "remark-gfm"
- dependency-name: "react-markdown"
# TODO: remove below entries until React >= 19.0.0
- dependency-name: "react-icons"
# JSDOM v30 doesn't play well with Jest v30
# Source: https://jestjs.io/blog#known-issues
# GH thread: https://github.com/jsdom/jsdom/issues/3492
@@ -23,6 +30,14 @@ updates:
# See https://github.com/apache/superset/pull/37384#issuecomment-3793991389
# TODO: remove the plugin once Lodash usage has been migrated to a more readily tree-shakeable alternative
- dependency-name: "@swc/plugin-transform-imports"
groups:
storybook:
applies-to: version-updates
patterns:
- "@storybook*"
- "storybook"
update-types:
- "patch"
directory: "/superset-frontend/"
schedule:
interval: "daily"
@@ -53,6 +68,22 @@ updates:
- package-ecosystem: "npm"
directory: "/docs/"
ignore:
# TODO: remove below entries until React >= 18.0.0 in superset-frontend
- dependency-name: "storybook"
update-types: ["version-update:semver-major", "version-update:semver-minor"]
- dependency-name: "@storybook*"
update-types: ["version-update:semver-major", "version-update:semver-minor"]
- dependency-name: "eslint-plugin-storybook"
- dependency-name: "react-error-boundary"
groups:
storybook:
applies-to: version-updates
patterns:
- "@storybook*"
- "storybook"
update-types:
- "patch"
schedule:
interval: "daily"
open-pull-requests-limit: 10
@@ -89,16 +120,6 @@ updates:
open-pull-requests-limit: 5
versioning-strategy: increase
- package-ecosystem: "npm"
directory: "/superset-frontend/plugins/legacy-plugin-chart-histogram/"
schedule:
interval: "daily"
labels:
- npm
- dependabot
open-pull-requests-limit: 5
versioning-strategy: increase
- package-ecosystem: "npm"
directory: "/superset-frontend/plugins/legacy-plugin-chart-partition/"
schedule:
@@ -121,6 +142,9 @@ updates:
- package-ecosystem: "npm"
directory: "/superset-frontend/plugins/plugin-chart-pivot-table/"
ignore:
# TODO: remove below entries until React >= 19.0.0
- dependency-name: "react-icons"
schedule:
interval: "daily"
labels:
@@ -171,6 +195,9 @@ updates:
- package-ecosystem: "npm"
directory: "/superset-frontend/plugins/plugin-chart-table/"
ignore:
# TODO: remove below entries until React >= 19.0.0
- dependency-name: "react-icons"
schedule:
interval: "daily"
labels:
@@ -199,16 +226,6 @@ updates:
open-pull-requests-limit: 5
versioning-strategy: increase
- package-ecosystem: "npm"
directory: "/superset-frontend/plugins/legacy-plugin-chart-sankey/"
schedule:
interval: "daily"
labels:
- npm
- dependabot
open-pull-requests-limit: 5
versioning-strategy: increase
- package-ecosystem: "npm"
directory: "/superset-frontend/plugins/legacy-preset-chart-nvd3/"
schedule:
@@ -229,16 +246,6 @@ updates:
open-pull-requests-limit: 5
versioning-strategy: increase
- package-ecosystem: "npm"
directory: "/superset-frontend/plugins/legacy-plugin-chart-event-flow/"
schedule:
interval: "daily"
labels:
- npm
- dependabot
open-pull-requests-limit: 5
versioning-strategy: increase
- package-ecosystem: "npm"
directory: "/superset-frontend/plugins/legacy-plugin-chart-paired-t-test/"
schedule:
@@ -249,16 +256,6 @@ updates:
open-pull-requests-limit: 5
versioning-strategy: increase
- package-ecosystem: "npm"
directory: "/superset-frontend/plugins/legacy-plugin-chart-sankey-loop/"
schedule:
interval: "daily"
labels:
- npm
- dependabot
open-pull-requests-limit: 5
versioning-strategy: increase
- package-ecosystem: "npm"
directory: "/superset-frontend/plugins/plugin-chart-echarts/"
schedule:
@@ -270,7 +267,7 @@ updates:
versioning-strategy: increase
- package-ecosystem: "npm"
- directory: "/superset-frontend/plugins/preset-chart-xy/"
+ directory: "/superset-frontend/plugins/plugin-chart-ag-grid-table/"
schedule:
interval: "daily"
labels:
@@ -280,7 +277,7 @@ updates:
versioning-strategy: increase
- package-ecosystem: "npm"
- directory: "/superset-frontend/plugins/legacy-plugin-chart-heatmap/"
+ directory: "/superset-frontend/plugins/plugin-chart-cartodiagram/"
schedule:
interval: "daily"
labels:
@@ -299,16 +296,6 @@ updates:
open-pull-requests-limit: 5
versioning-strategy: increase
- package-ecosystem: "npm"
directory: "/superset-frontend/plugins/legacy-plugin-chart-sunburst/"
schedule:
interval: "daily"
labels:
- npm
- dependabot
open-pull-requests-limit: 5
versioning-strategy: increase
- package-ecosystem: "npm"
directory: "/superset-frontend/plugins/plugin-chart-handlebars/"
schedule:
@@ -345,16 +332,7 @@ updates:
# not until React >= 18.0.0
- dependency-name: "react-markdown"
- dependency-name: "remark-gfm"
schedule:
interval: "daily"
labels:
- npm
- dependabot
open-pull-requests-limit: 5
versioning-strategy: increase
- package-ecosystem: "npm"
directory: "/superset-frontend/packages/superset-ui-demo/"
- dependency-name: "react-error-boundary"
schedule:
interval: "daily"
labels:


@@ -304,26 +304,3 @@ monitor_memory() {
sleep 2
done
}
cypress-run-applitools() {
cd "$GITHUB_WORKSPACE/superset-frontend/cypress-base"
local flasklog="${HOME}/flask.log"
local port=8081
local cypress="./node_modules/.bin/cypress run"
local browser=${CYPRESS_BROWSER:-chrome}
export CYPRESS_BASE_URL="http://localhost:${port}"
nohup flask run --no-debugger -p $port >"$flasklog" 2>&1 </dev/null &
local flaskProcessId=$!
$cypress --spec "cypress/applitools/**/*" --browser "$browser" --headless
say "::group::Flask log for default run"
cat "$flasklog"
say "::endgroup::"
# make sure the program exits
kill $flaskProcessId
}


@@ -39,13 +39,9 @@ jobs:
# pkg:npm/store2@2.14.2
# adding an exception for an ambigious license on store2, which has been resolved in
# the latest version. It's MIT: https://github.com/nbubna/store/blob/master/LICENSE-MIT
# pkg:npm/applitools/*
# adding exception for all applitools modules (eyes-cypress and its dependencies),
# which has an explicit OSS license approved by ASF
# license: https://applitools.com/legal/open-source-terms-of-use/
# pkg:npm/node-forge@1.3.1
# selecting BSD-3-Clause licensing terms for node-forge to ensure compatibility with Apache
allow-dependencies-licenses: pkg:npm/store2@2.14.2, pkg:npm/applitools/core, pkg:npm/applitools/core-base, pkg:npm/applitools/css-tree, pkg:npm/applitools/ec-client, pkg:npm/applitools/eg-socks5-proxy-server, pkg:npm/applitools/eyes, pkg:npm/applitools/eyes-cypress, pkg:npm/applitools/nml-client, pkg:npm/applitools/tunnel-client, pkg:npm/applitools/utils, pkg:npm/node-forge@1.3.1, pkg:npm/rgbcolor, pkg:npm/jszip@3.10.1
allow-dependencies-licenses: pkg:npm/store2@2.14.2, pkg:npm/node-forge@1.3.1, pkg:npm/rgbcolor, pkg:npm/jszip@3.10.1
python-dependency-liccheck:
# NOTE: Configuration for liccheck lives in our pyproject.yml.

View File

@@ -104,7 +104,7 @@ jobs:
# Scan for vulnerabilities in built container image after pushes to mainline branch.
- name: Run Trivy container image vulnerability scan
if: github.event_name == 'push' && github.ref == 'refs/heads/master' && (steps.check.outputs.python || steps.check.outputs.frontend || steps.check.outputs.docker) && matrix.build_preset == 'lean'
uses: aquasecurity/trivy-action@b6643a29fecd7f34b3597bc6acb0a98b03d33ff8 # v0.33.1
uses: aquasecurity/trivy-action@e368e328979b113139d6f9068e03accaed98a518 # v0.34.1
with:
image-ref: ${{ env.IMAGE_TAG }}
format: 'sarif'

View File

@@ -4,6 +4,9 @@ on:
pull_request:
types: [labeled, unlabeled, opened, reopened, synchronize]
permissions:
pull-requests: read
# cancel previous workflow jobs for PRs
concurrency:
group: ${{ github.workflow }}-${{ github.event.pull_request.number || github.run_id }}

View File

@@ -8,6 +8,9 @@ on:
pull_request:
types: [synchronize, opened, reopened, ready_for_review]
permissions:
contents: read
# cancel previous workflow jobs for PRs
concurrency:
group: ${{ github.workflow }}-${{ github.event.pull_request.number || github.run_id }}

View File

@@ -1,70 +0,0 @@
name: Prefer TypeScript
on:
push:
branches:
- "master"
- "[0-9].[0-9]*"
paths:
- "superset-frontend/src/**"
pull_request:
types: [synchronize, opened, reopened, ready_for_review]
paths:
- "superset-frontend/src/**"
# cancel previous workflow jobs for PRs
concurrency:
group: ${{ github.workflow }}-${{ github.event.pull_request.number || github.run_id }}
cancel-in-progress: true
jobs:
prefer_typescript:
if: github.ref == 'ref/heads/master' && github.event_name == 'pull_request'
name: Prefer TypeScript
runs-on: ubuntu-24.04
permissions:
contents: read
pull-requests: write
steps:
- name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
uses: actions/checkout@v6
with:
persist-credentials: false
submodules: recursive
- name: Get changed files
id: changed
uses: ./.github/actions/file-changes-action
with:
githubToken: ${{ github.token }}
- name: Determine if a .js or .jsx file was added
id: check
run: |
js_files_added() {
jq -r '
map(
select(
endswith(".js") or endswith(".jsx")
)
) | join("\n")
' ${HOME}/files_added.json
}
echo "js_files_added=$(js_files_added)" >> $GITHUB_OUTPUT
- if: steps.check.outputs.js_files_added
name: Add Comment to PR
uses: ./.github/actions/comment-on-pr
continue-on-error: true
env:
GITHUB_TOKEN: ${{ github.token }}
with:
msg: |
### WARNING: Prefer TypeScript
Looks like your PR contains new `.js` or `.jsx` files:
```
${{steps.check.outputs.js_files_added}}
```
As decided in [SIP-36](https://github.com/apache/superset/issues/9101), all new frontend code should be written in TypeScript. Please convert above files to TypeScript then re-request review.

View File

@@ -23,7 +23,7 @@ jobs:
SUPERSET__SQLALCHEMY_DATABASE_URI: postgresql+psycopg2://superset:superset@127.0.0.1:15432/superset
services:
postgres:
image: postgres:16-alpine
image: postgres:17-alpine
env:
POSTGRES_USER: superset
POSTGRES_PASSWORD: superset

View File

@@ -1,91 +0,0 @@
name: Applitools Cypress
on:
schedule:
- cron: "0 1 * * *"
jobs:
config:
runs-on: ubuntu-24.04
outputs:
has-secrets: ${{ steps.check.outputs.has-secrets }}
steps:
- name: "Check for secrets"
id: check
shell: bash
run: |
if [ -n "${{ (secrets.APPLITOOLS_API_KEY != '' && secrets.APPLITOOLS_API_KEY != '') || '' }}" ]; then
echo "has-secrets=1" >> "$GITHUB_OUTPUT"
fi
cypress-applitools:
needs: config
if: needs.config.outputs.has-secrets
runs-on: ubuntu-24.04
strategy:
fail-fast: false
matrix:
browser: ["chrome"]
env:
SUPERSET_ENV: development
SUPERSET_CONFIG: tests.integration_tests.superset_test_config
SUPERSET__SQLALCHEMY_DATABASE_URI: postgresql+psycopg2://superset:superset@127.0.0.1:15432/superset
PYTHONPATH: ${{ github.workspace }}
REDIS_PORT: 16379
GITHUB_TOKEN: ${{ github.token }}
APPLITOOLS_APP_NAME: Superset
APPLITOOLS_API_KEY: ${{ secrets.APPLITOOLS_API_KEY }}
APPLITOOLS_BATCH_ID: ${{ github.sha }}
APPLITOOLS_BATCH_NAME: Superset Cypress
services:
postgres:
image: postgres:16-alpine
env:
POSTGRES_USER: superset
POSTGRES_PASSWORD: superset
ports:
- 15432:5432
redis:
image: redis:7-alpine
ports:
- 16379:6379
steps:
- name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
uses: actions/checkout@v6
with:
persist-credentials: false
submodules: recursive
ref: master
- name: Setup Python
uses: ./.github/actions/setup-backend/
- name: Import test data
uses: ./.github/actions/cached-dependencies
with:
run: testdata
- name: Setup Node.js
uses: actions/setup-node@v6
with:
node-version-file: './superset-frontend/.nvmrc'
- name: Install npm dependencies
uses: ./.github/actions/cached-dependencies
with:
run: npm-install
- name: Build javascript packages
uses: ./.github/actions/cached-dependencies
with:
run: build-instrumented-assets
- name: Setup Postgres
if: steps.check.outcome == 'failure'
uses: ./.github/actions/cached-dependencies
with:
run: setup-postgres
- name: Install cypress
uses: ./.github/actions/cached-dependencies
with:
run: cypress-install
- name: Run Cypress
uses: ./.github/actions/cached-dependencies
env:
CYPRESS_BROWSER: ${{ matrix.browser }}
with:
run: cypress-run-applitools

View File

@@ -1,52 +0,0 @@
name: Applitools Storybook
on:
schedule:
- cron: "0 0 * * *"
env:
APPLITOOLS_APP_NAME: Superset
APPLITOOLS_API_KEY: ${{ secrets.APPLITOOLS_API_KEY }}
APPLITOOLS_BATCH_ID: ${{ github.sha }}
APPLITOOLS_BATCH_NAME: Superset Storybook
jobs:
config:
runs-on: ubuntu-24.04
outputs:
has-secrets: ${{ steps.check.outputs.has-secrets }}
steps:
- name: "Check for secrets"
id: check
shell: bash
run: |
if [ -n "${{ (secrets.APPLITOOLS_API_KEY != '' && secrets.APPLITOOLS_API_KEY != '') || '' }}" ]; then
echo "has-secrets=1" >> "$GITHUB_OUTPUT"
fi
cron:
needs: config
if: needs.config.outputs.has-secrets
runs-on: ubuntu-24.04
steps:
- name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )"
uses: actions/checkout@v6
with:
persist-credentials: false
submodules: recursive
ref: master
- name: Set up Node.js
uses: actions/setup-node@v6
with:
node-version-file: './superset-frontend/.nvmrc'
- name: Install eyes-storybook dependencies
uses: ./.github/actions/cached-dependencies
with:
run: eyes-storybook-dependencies
- name: Install NPM dependencies
uses: ./.github/actions/cached-dependencies
with:
run: npm-install
- name: Run Applitools Eyes-Storybook
working-directory: ./superset-frontend
run: npx eyes-storybook -u https://superset-storybook.netlify.app/

View File

@@ -68,7 +68,7 @@ jobs:
yarn install --check-cache
- name: Download database diagnostics (if triggered by integration tests)
if: github.event_name == 'workflow_run' && github.event.workflow_run.conclusion == 'success'
uses: dawidd6/action-download-artifact@v14
uses: dawidd6/action-download-artifact@v15
continue-on-error: true
with:
workflow: superset-python-integrationtest.yml
@@ -77,7 +77,7 @@ jobs:
path: docs/src/data/
- name: Try to download latest diagnostics (for push/dispatch triggers)
if: github.event_name != 'workflow_run'
uses: dawidd6/action-download-artifact@v14
uses: dawidd6/action-download-artifact@v15
continue-on-error: true
with:
workflow: superset-python-integrationtest.yml

View File

@@ -27,7 +27,7 @@ jobs:
- uses: actions/checkout@v6
# Do not bump this linkinator-action version without opening
# an ASF Infra ticket to allow the new version first!
- uses: JustinBeckwith/linkinator-action@f62ba0c110a76effb2ee6022cc6ce4ab161085e3 # v2.4
- uses: JustinBeckwith/linkinator-action@af984b9f30f63e796ae2ea5be5e07cb587f1bbd9 # v2.3
continue-on-error: true # This will make the job advisory (non-blocking, no red X)
with:
paths: "**/*.md, **/*.mdx"
@@ -111,7 +111,7 @@ jobs:
run: |
yarn install --check-cache
- name: Download database diagnostics from integration tests
uses: dawidd6/action-download-artifact@v14
uses: dawidd6/action-download-artifact@v15
with:
workflow: superset-python-integrationtest.yml
run_id: ${{ github.event.workflow_run.id }}

View File

@@ -54,7 +54,7 @@ jobs:
USE_DASHBOARD: ${{ github.event.inputs.use_dashboard == 'true' || 'false' }}
services:
postgres:
image: postgres:16-alpine
image: postgres:17-alpine
env:
POSTGRES_USER: superset
POSTGRES_PASSWORD: superset
@@ -171,7 +171,7 @@ jobs:
GITHUB_TOKEN: ${{ github.token }}
services:
postgres:
image: postgres:16-alpine
image: postgres:17-alpine
env:
POSTGRES_USER: superset
POSTGRES_PASSWORD: superset

View File

@@ -163,11 +163,6 @@ jobs:
docker run --rm $TAG bash -c \
"npm run plugins:build"
- name: Build Plugins Storybook
run: |
docker run --rm $TAG bash -c \
"npm run plugins:build-storybook"
test-storybook:
needs: frontend-build
if: needs.frontend-build.outputs.should-run == 'true'

View File

@@ -45,7 +45,7 @@ jobs:
GITHUB_TOKEN: ${{ github.token }}
services:
postgres:
image: postgres:16-alpine
image: postgres:17-alpine
env:
POSTGRES_USER: superset
POSTGRES_PASSWORD: superset

View File

@@ -115,7 +115,7 @@ jobs:
SUPERSET__SQLALCHEMY_DATABASE_URI: postgresql+psycopg2://superset:superset@127.0.0.1:15432/superset
services:
postgres:
image: postgres:16-alpine
image: postgres:17-alpine
env:
POSTGRES_USER: superset
POSTGRES_PASSWORD: superset

View File

@@ -25,7 +25,7 @@ jobs:
SUPERSET__SQLALCHEMY_EXAMPLES_URI: presto://localhost:15433/memory/default
services:
postgres:
image: postgres:16-alpine
image: postgres:17-alpine
env:
POSTGRES_USER: superset
POSTGRES_PASSWORD: superset
@@ -94,7 +94,7 @@ jobs:
UPLOAD_FOLDER: /tmp/.superset/uploads/
services:
postgres:
image: postgres:16-alpine
image: postgres:17-alpine
env:
POSTGRES_USER: superset
POSTGRES_PASSWORD: superset

View File

@@ -6,6 +6,9 @@ on:
- master
- "[0-9].[0-9]*"
permissions:
contents: read
jobs:
config:
runs-on: ubuntu-24.04

View File

@@ -24,6 +24,64 @@ assists people when migrating to a new version.
## Next
### MCP Tool Observability
MCP (Model Context Protocol) tools now include enhanced observability instrumentation for monitoring and debugging:
**Two-layer instrumentation:**
1. **Middleware layer** (`LoggingMiddleware`): Automatically logs all MCP tool calls with `duration_ms` and `success` status in the audit log (Action Log UI, logs table)
2. **Sub-operation tracking**: All 19 MCP tools include granular `event_logger.log_context()` blocks for tracking individual operations like validation, database writes, and query execution
**Action naming convention:**
- Tool-level logs: `mcp_tool_call` (via middleware)
- Sub-operation logs: `mcp.{tool_name}.{operation}` (e.g., `mcp.generate_chart.validation`, `mcp.execute_sql.query_execution`)
**Querying MCP logs:**
```sql
-- Top slowest MCP operations
SELECT action, COUNT(*) as calls, AVG(duration_ms) as avg_ms
FROM logs
WHERE action LIKE 'mcp.%'
GROUP BY action
ORDER BY avg_ms DESC
LIMIT 20;
-- MCP tool success rate
SELECT
json_extract(curated_payload, '$.tool') as tool,
COUNT(*) as total_calls,
SUM(CASE WHEN json_extract(curated_payload, '$.success') = 'true' THEN 1 ELSE 0 END) as successful,
ROUND(100.0 * SUM(CASE WHEN json_extract(curated_payload, '$.success') = 'true' THEN 1 ELSE 0 END) / COUNT(*), 2) as success_rate
FROM logs
WHERE action = 'mcp_tool_call'
GROUP BY tool
ORDER BY total_calls DESC;
```
**Security note:** Sensitive parameters (passwords, API keys, tokens) are automatically redacted in logs as `[REDACTED]`.
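The two-layer instrumentation above can be sketched in miniature. `log_context` below is a self-contained stand-in for Superset's `event_logger.log_context()` — the real API's signature may differ — but it shows the `duration_ms`/`success` bookkeeping the middleware performs:

```python
# Self-contained sketch of the duration_ms/success bookkeeping described
# above. `log_context` is a stand-in, not Superset's actual event_logger API.
import time
from contextlib import contextmanager

LOGS: list[dict] = []

@contextmanager
def log_context(action: str):
    """Record duration_ms and success for a named sub-operation."""
    start = time.monotonic()
    entry = {"action": action, "success": True}
    try:
        yield entry
    except Exception:
        entry["success"] = False
        raise
    finally:
        entry["duration_ms"] = round((time.monotonic() - start) * 1000, 2)
        LOGS.append(entry)

# A tool wraps each phase, producing actions that follow the
# mcp.{tool_name}.{operation} convention:
with log_context("mcp.generate_chart.validation"):
    pass  # validation logic would run here
```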
### Signal Cache Backend
A new `SIGNAL_CACHE_CONFIG` configuration provides a unified Redis-based backend for real-time coordination features in Superset. This backend enables:
- **Pub/sub messaging** for real-time event notifications between workers
- **Atomic distributed locking** using Redis SET NX EX (more performant than database-backed locks)
- **Event-based coordination** for background task management
The signal cache is used by the Global Task Framework (GTF) for abort notifications and task completion signaling, and will eventually replace `GLOBAL_ASYNC_QUERIES_CACHE_BACKEND` as the standard signaling backend. Configuring this is recommended for Redis-enabled production deployments.
Example configuration in `superset_config.py`:
```python
SIGNAL_CACHE_CONFIG = {
"CACHE_TYPE": "RedisCache",
"CACHE_KEY_PREFIX": "signal_",
"CACHE_REDIS_URL": "redis://localhost:6379/1",
"CACHE_DEFAULT_TIMEOUT": 300,
}
```
See `superset/config.py` for complete configuration options.
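As an illustration of the `SET NX EX` locking pattern the signal cache relies on, here is a minimal sketch. `FakeRedis` is a dict-backed stand-in so the example runs without a Redis server; the helper names and the `signal_lock:` key prefix are illustrative, not Superset APIs:

```python
# Sketch of atomic distributed locking via Redis SET NX EX semantics.
# FakeRedis emulates just enough of redis-py's set/get/delete for the demo.
import time
import uuid

class FakeRedis:
    def __init__(self):
        self._store = {}  # key -> (value, expires_at)

    def set(self, key, value, nx=False, ex=None):
        now = time.monotonic()
        current = self._store.get(key)
        if nx and current and current[1] > now:
            return None  # key exists and is unexpired: NX refuses to set
        self._store[key] = (value, now + (ex or float("inf")))
        return True

    def get(self, key):
        current = self._store.get(key)
        if current and current[1] > time.monotonic():
            return current[0]
        return None

    def delete(self, key):
        self._store.pop(key, None)

def acquire_lock(r, name, ttl=30):
    # SET key token NX EX ttl: atomic "create only if absent, with expiry"
    token = str(uuid.uuid4())
    if r.set(f"signal_lock:{name}", token, nx=True, ex=ttl):
        return token
    return None

def release_lock(r, name, token):
    # Only the holder (matching token) may release the lock.
    key = f"signal_lock:{name}"
    if r.get(key) == token:
        r.delete(key)

r = FakeRedis()
token = acquire_lock(r, "gtf:task:123")
assert token is not None
assert acquire_lock(r, "gtf:task:123") is None  # a second worker is blocked
release_lock(r, "gtf:task:123", token)
```

The TTL (`EX`) guarantees that a crashed worker cannot hold the lock forever, which is why this pattern is more robust than database-backed locks for background task coordination.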
### WebSocket config for GAQ with Docker
[35896](https://github.com/apache/superset/pull/35896) and [37624](https://github.com/apache/superset/pull/37624) updated documentation on how to run and configure Superset with Docker. Specifically for the WebSocket configuration, a new `docker/superset-websocket/config.example.json` was added to the repo, so that users could copy it to create a `docker/superset-websocket/config.json` file. The existing `docker/superset-websocket/config.json` was removed and git-ignored, so if you're using GAQ / WebSocket make sure to:

View File

@@ -45,7 +45,7 @@ services:
required: true
- path: docker/.env-local # optional override
required: false
image: postgres:16
image: postgres:17
container_name: superset_db
restart: unless-stopped
volumes:

View File

@@ -85,7 +85,7 @@ services:
required: true
- path: docker/.env-local # optional override
required: false
image: postgres:16
image: postgres:17
restart: unless-stopped
volumes:
- db_home_light:/var/lib/postgresql/data

View File

@@ -49,7 +49,7 @@ services:
required: true
- path: docker/.env-local # optional override
required: false
image: postgres:16
image: postgres:17
container_name: superset_db
restart: unless-stopped
volumes:

View File

@@ -76,7 +76,7 @@ services:
required: true
- path: docker/.env-local # optional override
required: false
image: postgres:16
image: postgres:17
restart: unless-stopped
ports:
- "127.0.0.1:${DATABASE_PORT:-5432}:5432"

View File

@@ -1 +1 @@
v20.18.3
v20.20.0

View File

@@ -788,7 +788,7 @@ pytest ./link_to_test.py
### Frontend Testing
We use [Jest](https://jestjs.io/) and [Enzyme](https://airbnb.io/enzyme/) to test TypeScript/JavaScript. Tests can be run with:
We use [Jest](https://jestjs.io/) and [React Testing Library](https://testing-library.com/docs/react-testing-library/intro/) to test TypeScript. Tests can be run with:
```bash
cd superset-frontend

View File

@@ -100,7 +100,7 @@ npm link superset-plugin-chart-hello-world
```
7. **Import and register in Superset**:
Edit `superset-frontend/src/visualizations/presets/MainPreset.js` to include your plugin.
Edit `superset-frontend/src/visualizations/presets/MainPreset.ts` to include your plugin.
## Testing

View File

@@ -38,12 +38,14 @@ Extensions can add new views or panels to the host application, such as custom S
"frontend": {
"contributions": {
"views": {
"sqllab.panels": [
{
"id": "my_extension.main",
"name": "My Panel Name"
}
]
"sqllab": {
"panels": [
{
"id": "my_extension.main",
"name": "My Panel Name"
}
]
}
}
}
}
@@ -76,25 +78,27 @@ Extensions can contribute new menu items or context menus to the host applicatio
"frontend": {
"contributions": {
"menus": {
"sqllab.editor": {
"primary": [
{
"view": "builtin.editor",
"command": "my_extension.copy_query"
}
],
"secondary": [
{
"view": "builtin.editor",
"command": "my_extension.prettify"
}
],
"context": [
{
"view": "builtin.editor",
"command": "my_extension.clear"
}
]
"sqllab": {
"editor": {
"primary": [
{
"view": "builtin.editor",
"command": "my_extension.copy_query"
}
],
"secondary": [
{
"view": "builtin.editor",
"command": "my_extension.prettify"
}
],
"context": [
{
"view": "builtin.editor",
"command": "my_extension.clear"
}
]
}
}
}
}

View File

@@ -40,10 +40,10 @@ superset-extensions bundle: Packages the extension into a .supx file.
superset-extensions dev: Automatically rebuilds the extension as files change.
```
When creating a new extension with `superset-extensions init <extension-name>`, the CLI generates a standardized folder structure:
When creating a new extension with `superset-extensions init`, the CLI generates a standardized folder structure:
```
dataset_references/
dataset-references/
├── extension.json
├── frontend/
│ ├── src/
@@ -52,25 +52,33 @@ dataset_references/
│ └── package.json
├── backend/
│ ├── src/
└── dataset_references/
└── superset_extensions/
│ │ └── dataset_references/
│ ├── tests/
│ ├── pyproject.toml
│ └── requirements.txt
├── dist/
│ ├── manifest.json
│ ├── frontend
└── dist/
├── remoteEntry.d7a9225d042e4ccb6354.js
└── 900.038b20cdff6d49cfa8d9.js
└── dist/
├── remoteEntry.d7a9225d042e4ccb6354.js
└── 900.038b20cdff6d49cfa8d9.js
│ └── backend
│ └── dataset_references/
├── __init__.py
├── api.py
└── entrypoint.py
├── dataset_references-1.0.0.supx
│ └── superset_extensions/
└── dataset_references/
├── __init__.py
├── api.py
│ └── entrypoint.py
├── dataset-references-1.0.0.supx
└── README.md
```
**Note**: The extension ID (`dataset-references`) serves as the basis for all technical names:
- Directory name: `dataset-references` (kebab-case)
- Backend Python package: `dataset_references` (snake_case)
- Frontend package name: `dataset-references` (kebab-case)
- Module Federation name: `datasetReferences` (camelCase)
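These derivations can be sketched as a small helper; `derive_names` is hypothetical, for illustration only, and not part of the `superset-extensions` CLI:

```python
# Hypothetical helper showing how a single extension ID maps to each
# technical name; not part of the superset-extensions CLI.
def derive_names(ext_id: str) -> dict:
    parts = ext_id.split("-")
    return {
        "directory": ext_id,                          # kebab-case
        "backend_package": ext_id.replace("-", "_"),  # snake_case
        "frontend_package": ext_id,                   # kebab-case
        # camelCase: first segment as-is, remaining segments capitalized
        "module_federation": parts[0] + "".join(p.title() for p in parts[1:]),
    }

names = derive_names("dataset-references")
assert names["backend_package"] == "dataset_references"
assert names["module_federation"] == "datasetReferences"
```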
The `extension.json` file serves as the declared metadata for the extension, containing the extension's name, version, author, description, and a list of capabilities. This file is essential for the host application to understand how to load and manage the extension.
The `frontend` directory contains the source code for the frontend components of the extension, including React components, styles, and assets. The `webpack.config.js` file is used to configure Webpack for building the frontend code, while the `tsconfig.json` file defines the TypeScript configuration for the project. The `package.json` file specifies the dependencies and scripts for building and testing the frontend code.
@@ -87,26 +95,30 @@ The `extension.json` file contains all metadata necessary for the host applicati
```json
{
"name": "dataset_references",
"id": "dataset-references",
"name": "Dataset References",
"version": "1.0.0",
"frontend": {
"contributions": {
"views": {
"sqllab.panels": [
{
"id": "dataset_references.main",
"name": "Dataset references"
}
]
"sqllab": {
"panels": [
{
"id": "dataset-references.main",
"name": "Dataset References"
}
]
}
}
},
"moduleFederation": {
"exposes": ["./index"]
"exposes": ["./index"],
"name": "datasetReferences"
}
},
"backend": {
"entryPoints": ["dataset_references.entrypoint"],
"files": ["backend/src/dataset_references/**/*.py"]
"entryPoints": ["superset_extensions.dataset_references.entrypoint"],
"files": ["backend/src/superset_extensions/dataset_references/**/*.py"]
}
}
```
@@ -134,9 +146,9 @@ export const onDidChangeActivePanel: Event<Panel>;
export const onDidChangeTabTitle: Event<string>;
export const onDidQueryRun: Event<Editor>;
export const onDidQueryRun: Event<QueryContext>;
export const onDidQueryStop: Event<Editor>;
export const onDidQueryStop: Event<QueryContext>;
```
The following code demonstrates more examples of the existing frontend APIs:
@@ -150,16 +162,16 @@ export function activate(context) {
const panelDisposable = core.registerView('my_extension.panel', <MyPanel><Button/></MyPanel>);
// Register a custom command
const commandDisposable = commands.registerCommand('my_extension.copy_query', {
title: 'Copy Query',
execute: () => {
const commandDisposable = commands.registerCommand(
'my_extension.copy_query',
() => {
// Command logic here
},
});
);
// Listen for query run events in SQL Lab
const eventDisposable = sqlLab.onDidQueryRun(editor => {
// Handle query execution event
const eventDisposable = sqlLab.onDidQueryRun(queryContext => {
console.log('Query started on database:', queryContext.tab.databaseId);
});
// Access a CSRF token for secure API requests

View File

@@ -172,13 +172,9 @@ import { editors } from '@apache-superset/core';
import MonacoSQLEditor from './MonacoSQLEditor';
export function activate(context) {
// Register the Monaco editor for SQL
// Register the Monaco editor for SQL using the contribution ID from extension.json
const disposable = editors.registerEditorProvider(
{
id: 'monaco-sql-editor.sql',
name: 'Monaco SQL Editor',
languages: ['sql'],
},
'monaco-sql-editor.sql',
MonacoSQLEditor,
);

View File

@@ -24,7 +24,7 @@ under the License.
# SQL Lab Extension Points
SQL Lab provides 5 extension points where extensions can contribute custom UI components. Each area serves a specific purpose and can be customized to add new functionality.
SQL Lab provides 4 extension points where extensions can contribute custom UI components. Each area serves a specific purpose and supports different types of customizations. These areas will evolve over time as new features are added to SQL Lab.
## Layout Overview
@@ -41,42 +41,44 @@ SQL Lab provides 5 extension points where extensions can contribute custom UI co
│ │ │ │
│ │ │ │
│ │ │ │
──────────┴─────────────────────────────────────────┴─────────────
│ Status Bar │
└──────────────────────────────────────────────────────────────────┘
──────────┴─────────────────────────────────────────┴─────────────
```
| Extension Point | ID | Description |
| ----------------- | --------------------- | ---------------------------------------------------------- |
| **Left Sidebar** | `sqllab.leftSidebar` | Navigation and browsing (database explorer, saved queries) |
| **Editor** | `sqllab.editor` | SQL query editor workspace |
| **Right Sidebar** | `sqllab.rightSidebar` | Contextual tools (AI assistants, query analysis) |
| **Panels** | `sqllab.panels` | Results and related views (visualizations, data profiling) |
| **Status Bar** | `sqllab.statusBar` | Connection status and query metrics |
| Extension Point | ID | Views | Menus | Description |
| ----------------- | --------------------- | ----- | ----- | ---------------------------------------------- |
| **Left Sidebar** | `sqllab.leftSidebar` | — | ✓ | Menu actions for the database explorer |
| **Editor** | `sqllab.editor` | ✓\* | ✓ | Custom editors + toolbar actions |
| **Right Sidebar** | `sqllab.rightSidebar` | ✓ | — | Custom panels (AI assistants, query analysis) |
| **Panels** | `sqllab.panels` | ✓ | ✓ | Custom tabs + toolbar actions (data profiling) |
## Area Customizations
\*Editor views are contributed via [Editor Contributions](./editors), not standard view contributions.
Each extension point area supports three types of action customizations:
## Customization Types
### Views
Extensions can add custom views (React components) to **Right Sidebar** and **Panels**. Views appear as new panels or tabs in their respective areas.
### Menus
Extensions can add toolbar actions to **Left Sidebar**, **Editor**, and **Panels**. Menu contributions support:
```
┌───────────────────────────────────────────────────────────────┐
Area Title [Button] [Button] [•••] │
│ [Button] [Button] [•••]
├───────────────────────────────────────────────────────────────┤
│ │
│ │
│ Area Content │
│ │
│ (right-click for context menu) │
│ │
│ │
└───────────────────────────────────────────────────────────────┘
```
| Action Type | Location | Use Case |
| --------------------- | ----------------- | ----------------------------------------------------- |
| **Primary Actions** | Top-right buttons | Frequently used actions (e.g., run, refresh, add new) |
| **Secondary Actions** | 3-dot menu (•••) | Less common actions (e.g., export, settings) |
| **Context Actions** | Right-click menu | Context-sensitive actions on content |
| Action Type | Location | Use Case |
| --------------------- | ---------------- | ----------------------------------------------------- |
| **Primary Actions** | Toolbar buttons | Frequently used actions (e.g., run, refresh, add new) |
| **Secondary Actions** | 3-dot menu (•••) | Less common actions (e.g., export, settings) |
### Custom Editors
Extensions can replace the default SQL editor with custom implementations (Monaco, CodeMirror, etc.). See [Editor Contributions](./editors) for details.
## Examples
@@ -91,12 +93,14 @@ This example adds a "Data Profiler" panel to SQL Lab:
"frontend": {
"contributions": {
"views": {
"sqllab.panels": [
{
"id": "data_profiler.main",
"name": "Data Profiler"
}
]
"sqllab": {
"panels": [
{
"id": "data_profiler.main",
"name": "Data Profiler"
}
]
}
}
}
}
@@ -140,25 +144,27 @@ This example adds primary, secondary, and context actions to the editor:
}
],
"menus": {
"sqllab.editor": {
"primary": [
{
"view": "builtin.editor",
"command": "query_tools.format"
}
],
"secondary": [
{
"view": "builtin.editor",
"command": "query_tools.explain"
}
],
"context": [
{
"view": "builtin.editor",
"command": "query_tools.copy_as_cte"
}
]
"sqllab": {
"editor": {
"primary": [
{
"view": "builtin.editor",
"command": "query_tools.format"
}
],
"secondary": [
{
"view": "builtin.editor",
"command": "query_tools.explain"
}
],
"context": [
{
"view": "builtin.editor",
"command": "query_tools.copy_as_cte"
}
]
}
}
}
}
@@ -171,32 +177,38 @@ import { commands, sqlLab } from '@apache-superset/core';
export function activate(context) {
// Register the commands declared in extension.json
const formatCommand = commands.registerCommand('query_tools.format', {
execute: () => {
const formatCommand = commands.registerCommand(
'query_tools.format',
async () => {
const tab = sqlLab.getCurrentTab();
if (tab?.editor) {
if (tab) {
const editor = await tab.getEditor();
// Format the SQL query
}
},
});
);
const explainCommand = commands.registerCommand('query_tools.explain', {
execute: () => {
const explainCommand = commands.registerCommand(
'query_tools.explain',
async () => {
const tab = sqlLab.getCurrentTab();
if (tab?.editor) {
if (tab) {
const editor = await tab.getEditor();
// Show query explanation
}
},
});
);
const copyAsCteCommand = commands.registerCommand('query_tools.copy_as_cte', {
execute: () => {
const copyAsCteCommand = commands.registerCommand(
'query_tools.copy_as_cte',
async () => {
const tab = sqlLab.getCurrentTab();
if (tab?.editor) {
if (tab) {
const editor = await tab.getEditor();
// Copy selected text as CTE
}
},
});
);
context.subscriptions.push(formatCommand, explainCommand, copyAsCteCommand);
}

View File

@@ -51,4 +51,5 @@ Extensions can provide:
- **[Deployment](./deployment)** - Packaging and deploying extensions
- **[MCP Integration](./mcp)** - Adding AI agent capabilities using extensions
- **[Security](./security)** - Security considerations and best practices
- **[Tasks](./tasks)** - Framework for creating and managing long-running tasks
- **[Community Extensions](./registry)** - Browse extensions shared by the community

View File

@@ -51,27 +51,39 @@ Use the CLI to scaffold a new extension project. Extensions can include frontend
superset-extensions init
```
The CLI will prompt you for information:
The CLI will prompt you for information using a three-step publisher workflow:
```
Extension ID (unique identifier, alphanumeric only): hello_world
Extension name (human-readable display name): Hello World
Extension display name: Hello World
Extension name (hello-world): hello-world
Publisher (e.g., my-org): my-org
Initial version [0.1.0]: 0.1.0
License [Apache-2.0]: Apache-2.0
Include frontend? [Y/n]: Y
Include backend? [Y/n]: Y
```
**Publisher Namespaces**: Extensions use organizational namespaces similar to VS Code extensions, providing collision-safe naming across organizations:
- **NPM package**: `@my-org/hello-world` (scoped package for frontend distribution)
- **Module Federation name**: `myOrg_helloWorld` (collision-safe JavaScript identifier)
- **Backend package**: `my_org-hello_world` (collision-safe Python distribution name)
- **Python namespace**: `superset_extensions.my_org.hello_world`
This approach ensures that extensions from different organizations cannot conflict, even if they use the same technical name (e.g., both `acme.dashboard-widgets` and `corp.dashboard-widgets` can coexist).
This creates a complete project structure:
```
hello_world/
my-org.hello-world/
├── extension.json # Extension metadata and configuration
├── backend/ # Backend Python code
│ ├── src/
│ │ └── hello_world/
│ │ ├── __init__.py
│ │ └── entrypoint.py # Backend registration
│ │ └── superset_extensions/
│ │ └── my_org/
│ │ ├── __init__.py
│ │ └── hello_world/
│ │ ├── __init__.py
│ │ └── entrypoint.py # Backend registration
│ └── pyproject.toml
└── frontend/ # Frontend TypeScript/React code
├── src/
@@ -87,43 +99,52 @@ The generated `extension.json` contains basic metadata. Update it to register yo
```json
{
"id": "hello_world",
"name": "Hello World",
"publisher": "my-org",
"name": "hello-world",
"displayName": "Hello World",
"version": "0.1.0",
"license": "Apache-2.0",
"frontend": {
"contributions": {
"views": {
"sqllab.panels": [
{
"id": "hello_world.main",
"name": "Hello World"
}
]
"sqllab": {
"panels": [
{
"id": "my-org.hello-world.main",
"name": "Hello World"
}
]
}
}
},
"moduleFederation": {
"exposes": ["./index"]
"exposes": ["./index"],
"name": "myOrg_helloWorld"
}
},
"backend": {
"entryPoints": ["hello_world.entrypoint"],
"files": ["backend/src/hello_world/**/*.py"]
"entryPoints": ["superset_extensions.my_org.hello_world.entrypoint"],
"files": ["backend/src/superset_extensions/my_org/hello_world/**/*.py"]
},
"permissions": ["can_read"]
}
```
**Note**: The `moduleFederation.name` uses collision-safe naming (`myOrg_helloWorld`), and backend entry points use the full nested Python namespace (`superset_extensions.my_org.hello_world`).
**Key fields:**
- `publisher`: Organizational namespace for the extension
- `name`: Technical identifier (kebab-case)
- `displayName`: Human-readable name shown to users
- `frontend.contributions.views.sqllab.panels`: Registers your panel in SQL Lab
- `backend.entryPoints`: Python modules to load eagerly when extension starts
## Step 4: Create Backend API
The CLI generated a basic `backend/src/hello_world/entrypoint.py`. We'll create an API endpoint.
The CLI generated a basic `backend/src/superset_extensions/my_org/hello_world/entrypoint.py`. We'll create an API endpoint.
**Create `backend/src/hello_world/api.py`**
**Create `backend/src/superset_extensions/my_org/hello_world/api.py`**
```python
from flask import Response
@@ -172,10 +193,10 @@ class HelloWorldAPI(RestApi):
- Extends `RestApi` from `superset_core.api.types.rest_api`
- Uses Flask-AppBuilder decorators (`@expose`, `@protect`, `@safe`)
- Returns responses using `self.response(status_code, result=data)`
- The endpoint will be accessible at `/extensions/hello_world/message`
- The endpoint will be accessible at `/extensions/my-org/hello-world/message`
- OpenAPI docstrings are crucial - Flask-AppBuilder uses them to automatically generate interactive API documentation at `/swagger/v1`, allowing developers to explore endpoints, understand schemas, and test the API directly from the browser
**Update `backend/src/hello_world/entrypoint.py`**
**Update `backend/src/superset_extensions/my_org/hello_world/entrypoint.py`**
Replace the generated print statement with API registration:
@@ -199,7 +220,7 @@ The `@apache-superset/core` package must be listed in both `peerDependencies` (t
```json
{
  "name": "@my-org/hello-world",
  "version": "0.1.0",
  "private": true,
  "license": "Apache-2.0",
@@ -250,7 +271,7 @@ module.exports = (env, argv) => {
      chunkFilename: "[name].[contenthash].js",
      clean: true,
      path: path.resolve(__dirname, "dist"),
      publicPath: `/api/v1/extensions/my-org/hello-world/`,
    },
    resolve: {
      extensions: [".ts", ".tsx", ".js", ".jsx"],
@@ -271,7 +292,7 @@ module.exports = (env, argv) => {
    },
    plugins: [
      new ModuleFederationPlugin({
        name: "myOrg_helloWorld",
        filename: "remoteEntry.[contenthash].js",
        exposes: {
          "./index": "./src/index.tsx",
@@ -328,7 +349,7 @@ const HelloWorldPanel: React.FC = () => {
  const fetchMessage = async () => {
    try {
      const csrfToken = await authentication.getCSRFToken();
      const response = await fetch('/extensions/my-org/hello-world/message', {
        method: 'GET',
        headers: {
          'Content-Type': 'application/json',
@@ -401,7 +422,7 @@ import HelloWorldPanel from './HelloWorldPanel';
export const activate = (context: core.ExtensionContext) => {
  context.disposables.push(
    core.registerViewProvider('my-org.hello-world.main', () => <HelloWorldPanel />),
  );
};
@@ -411,9 +432,9 @@ export const deactivate = () => {};
**Key patterns:**
- `activate` function is called when the extension loads
- `core.registerViewProvider` registers the component with ID `my-org.hello-world.main` (matching `extension.json`)
- `authentication.getCSRFToken()` retrieves the CSRF token for API calls
- Fetch calls to `/extensions/{publisher}/{name}/{endpoint}` reach your backend API
- `context.disposables.push()` ensures proper cleanup
## Step 6: Install Dependencies
@@ -442,7 +463,7 @@ This command automatically:
- `manifest.json` - Build metadata and asset references
- `frontend/dist/` - Built frontend assets (remoteEntry.js, chunks)
- `backend/` - Python source files
- Packages everything into `my-org.hello-world-0.1.0.supx` - a zip archive with the specific structure required by Superset
## Step 8: Deploy to Superset
@@ -467,7 +488,7 @@ EXTENSIONS_PATH = "/path/to/extensions/folder"
Copy your `.supx` file to the configured extensions path:
```bash
cp my-org.hello-world-0.1.0.supx /path/to/extensions/folder/
```
**Restart Superset**
@@ -498,7 +519,7 @@ Here's what happens when your extension loads:
4. **Module Federation**: Webpack loads your extension code and resolves `@apache-superset/core` to `window.superset`
5. **Activation**: `activate()` is called, registering your view provider
6. **Rendering**: When the user opens your panel, React renders `<HelloWorldPanel />`
7. **API call**: Component fetches data from `/extensions/my-org/hello-world/message`
8. **Backend response**: Your Flask API returns the hello world message
9. **Display**: Component shows the message to the user

View File

@@ -1,6 +1,6 @@
---
title: Community Extensions
sidebar_position: 11
---
<!--

View File

@@ -0,0 +1,440 @@
---
title: Tasks
sidebar_position: 10
---
<!--
Licensed to the Apache Software Foundation (ASF) under one
or more contributor license agreements. See the NOTICE file
distributed with this work for additional information
regarding copyright ownership. The ASF licenses this file
to you under the Apache License, Version 2.0 (the
"License"); you may not use this file except in compliance
with the License. You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing,
software distributed under the License is distributed on an
"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
KIND, either express or implied. See the License for the
specific language governing permissions and limitations
under the License.
-->
# Global Task Framework
The Global Task Framework (GTF) provides a unified way to manage background tasks. It handles task execution, progress tracking, cancellation, and deduplication for both synchronous and asynchronous execution. The framework uses distributed locking internally to ensure race-free operations—you don't need to worry about concurrent task creation or cancellation conflicts.
## Enabling GTF
GTF is disabled by default and must be enabled via the `GLOBAL_TASK_FRAMEWORK` feature flag in your `superset_config.py`:
```python
FEATURE_FLAGS = {
    "GLOBAL_TASK_FRAMEWORK": True,
}
```
When GTF is disabled:
- The Task List UI menu item is hidden
- The `/api/v1/task/*` endpoints return 404
- Calling or scheduling a `@task`-decorated function raises `GlobalTaskFrameworkDisabledError`
:::note Future Migration
When GTF is considered stable, it will replace legacy Celery tasks for built-in features like thumbnails and alerts & reports. Enabling this flag prepares your deployment for that migration.
:::
## Quick Start
### Define a Task
```python
from superset_core.api.tasks import task, get_context

@task
def process_data(dataset_id: int) -> None:
    ctx = get_context()

    @ctx.on_cleanup
    def cleanup():
        logger.info("Processing complete")

    data = fetch_dataset(dataset_id)
    process_and_cache(data)
```
### Execute a Task
```python
# Async execution - schedules on Celery worker
task = process_data.schedule(dataset_id=123)
print(task.status) # "pending"
# Sync execution - runs inline in current process
task = process_data(dataset_id=123)
# ... blocks until complete
print(task.status) # "success"
```
### Async vs Sync Execution
| Method | When to Use |
|--------|-------------|
| `.schedule()` | Long-running operations, background processing, when you need to return immediately |
| Direct call | Short operations, when deduplication matters, when you need the result before responding |
Both execution modes provide the same task features: deduplication, progress tracking, cancellation, and visibility in the Task List UI. The difference is whether execution happens in a Celery worker (async) or inline (sync).
## Task Lifecycle
```
PENDING ──→ IN_PROGRESS ────→ SUCCESS
   │             │
   │             ├──────────→ FAILURE
   │             ↓                ↑
   │         ABORTING ───────────┘
   │             │
   │             ├──────────→ TIMED_OUT (timeout)
   │             │
   └─────────────┴──────────→ ABORTED (user cancel)
```
| Status | Description |
|--------|-------------|
| `PENDING` | Queued, awaiting execution |
| `IN_PROGRESS` | Executing |
| `ABORTING` | Abort/timeout triggered, abort handlers running |
| `SUCCESS` | Completed successfully |
| `FAILURE` | Failed with error or abort/cleanup handler exception |
| `ABORTED` | Cancelled by user/admin |
| `TIMED_OUT` | Exceeded configured timeout |
## Context API
Access task context via `get_context()` from within any `@task` function. The context provides methods for updating task metadata and registering handlers.
### Updating Task Metadata
Use `update_task()` to report progress and store custom payload data:
```python
@task
def my_task(items: list[int]) -> None:
    ctx = get_context()
    for i, item in enumerate(items):
        result = process(item)
        ctx.update_task(
            progress=(i + 1, len(items)),
            payload={"last_result": result}
        )
```
:::tip
Call `update_task()` once per iteration for best performance. Frequent DB writes are throttled to limit metastore load, so batching progress and payload updates together in a single call ensures both are persisted at the same time.
:::
#### Progress Formats
The `progress` parameter accepts three formats:
| Format | Example | Display |
|--------|---------|---------|
| `tuple[int, int]` | `progress=(3, 100)` | 3 of 100 (3%) with ETA |
| `float` (0.0-1.0) | `progress=0.5` | 50% with ETA |
| `int` | `progress=42` | 42 processed |
:::tip
Use the tuple format `(current, total)` whenever possible. It provides the richest information to users, showing both the count and percentage while still computing the ETA automatically.
:::
#### Payload
The `payload` parameter stores custom metadata that can help users understand what the task is doing. Each call to `update_task()` replaces the previous payload completely.
In the Task List UI, when a payload is defined, an info icon appears in the **Details** column. Users can hover over it to see the JSON content.
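Because the payload is replaced rather than merged, a task that accumulates metadata should keep its own local dict and resend it in full on every update. The sketch below illustrates just the replace semantics with a hypothetical `FakeContext` stand-in; it is not the real task context.

```python
# Illustration only: FakeContext mimics the "replace, not merge" payload
# behavior described above. It is a hypothetical stand-in, not superset_core API.
class FakeContext:
    def __init__(self):
        self.payload = None

    def update_task(self, progress=None, payload=None):
        if payload is not None:
            self.payload = payload  # each call replaces the previous payload

ctx = FakeContext()
ctx.update_task(payload={"rows_read": 10})
ctx.update_task(payload={"rows_written": 5})
# "rows_read" is gone - the second call replaced the payload entirely
assert ctx.payload == {"rows_written": 5}

# To keep earlier keys, accumulate locally and send the whole dict each time:
state = {"rows_read": 10}
state["rows_written"] = 5
ctx.update_task(payload=dict(state))
assert ctx.payload == {"rows_read": 10, "rows_written": 5}
```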
### Handlers
Register handlers to run cleanup logic or respond to abort requests:
| Handler | When it runs | Use case |
|---------|--------------|----------|
| `on_cleanup` | Always (success, failure, abort) | Release resources, close connections |
| `on_abort` | When task is aborted | Set stop flag, cancel external operations |
```python
@task
def my_task() -> None:
    ctx = get_context()

    @ctx.on_cleanup
    def cleanup():
        logger.info("Task ended, cleaning up")

    @ctx.on_abort
    def handle_abort():
        logger.info("Abort requested")

    # ... task logic
```
Multiple handlers of the same type execute in LIFO order (last registered runs first). Abort handlers run first when abort is detected, then cleanup handlers run when the task ends.
#### Best-Effort Execution
**All registered handlers will always be attempted, even if one fails.** This ensures that a failure in one handler doesn't prevent other handlers from running their cleanup logic.
For example, if you have three cleanup handlers and the second one throws an exception:
1. Handler 3 runs ✓
2. Handler 2 throws an exception ✗ (logged, but execution continues)
3. Handler 1 runs ✓
If any handler fails, the task is marked as `FAILURE` with combined error details showing all handler failures.
:::tip
Write handlers to be independent and self-contained. Don't assume previous handlers succeeded, and don't rely on shared state between handlers.
:::
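The LIFO, best-effort rule can be sketched as follows. `run_handlers` is an illustrative stand-in for the framework's internal logic, not its actual implementation:

```python
# Sketch of best-effort LIFO handler execution: every handler is attempted,
# failures are collected rather than stopping the loop.
def run_handlers(handlers):
    errors = []
    for handler in reversed(handlers):  # last registered runs first
        try:
            handler()
        except Exception as exc:  # framework logs this and keeps going
            errors.append(exc)
    return errors

def failing_handler():
    raise RuntimeError("handler 2 failed")

calls = []
handlers = [
    lambda: calls.append("handler 1"),
    failing_handler,
    lambda: calls.append("handler 3"),
]
errors = run_handlers(handlers)
assert calls == ["handler 3", "handler 1"]  # handler 2's failure didn't stop the rest
assert len(errors) == 1  # the task would end as FAILURE with combined details
```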
## Making Tasks Abortable
When users click **Cancel** in the Task List, the system decides whether to **abort** (stop) the task or **unsubscribe** (remove the user from a shared task). Abort occurs when:
- It's a private or system task
- It's a shared task and the user is the last subscriber
- An admin checks **Force abort** to stop the task for all subscribers
Pending tasks can always be aborted: they simply won't start. In-progress tasks require an abort handler to be abortable:
```python
@task
def abortable_task(items: list[str]) -> None:
    ctx = get_context()
    should_stop = False

    @ctx.on_abort
    def handle_abort():
        nonlocal should_stop
        should_stop = True
        logger.info("Abort signal received")

    @ctx.on_cleanup
    def cleanup():
        logger.info("Task ended, cleaning up")

    for item in items:
        if should_stop:
            return  # Exit gracefully
        process(item)
```
**Key points:**
- Registering `on_abort` marks the task as abortable and starts the abort listener
- The abort handler fires automatically when abort is triggered
- Use a flag pattern to gracefully stop processing at safe points
- Without an abort handler, in-progress tasks cannot be aborted: the Cancel button in the Task List UI will be disabled
The framework automatically skips execution if a task was aborted while pending: no manual check needed at task start.
:::tip
Always implement an abort handler for long-running tasks. This allows users to cancel unneeded tasks and free up worker capacity for other operations.
:::
## Timeouts
Set a timeout to automatically abort tasks that run too long:
```python
from superset_core.api.tasks import task, get_context, TaskOptions

# Set default timeout in decorator
@task(timeout=300)  # 5 minutes
def process_data(dataset_id: int) -> None:
    ctx = get_context()
    should_stop = False

    @ctx.on_abort
    def handle_abort():
        nonlocal should_stop
        should_stop = True

    for chunk in fetch_large_dataset(dataset_id):
        if should_stop:
            return
        process(chunk)

# Override timeout at call time
task = process_data.schedule(
    dataset_id=123,
    options=TaskOptions(timeout=600)  # Override to 10 minutes
)
```
### How Timeouts Work
The timeout timer starts when the task begins executing (status changes to `IN_PROGRESS`). When the timeout expires:
1. **With an abort handler registered:** The task transitions to `ABORTING`, abort handlers run, then cleanup handlers run. The final status depends on handler execution:
- If handlers complete successfully → `TIMED_OUT` status
- If handlers throw an exception → `FAILURE` status
2. **Without an abort handler:** The framework cannot forcibly terminate the task. A warning is logged, and the task continues running. The Task List UI shows a warning indicator (⚠️) in the Details column to alert users that the timeout cannot be enforced.
### Timeout Precedence
| Source | Priority | Example |
|--------|----------|---------|
| `TaskOptions.timeout` | Highest | `options=TaskOptions(timeout=600)` |
| `@task(timeout=...)` | Default | `@task(timeout=300)` |
| Not set | No timeout | Task runs indefinitely |
Call-time options always override decorator defaults, allowing tasks to have sensible defaults while permitting callers to extend or shorten the timeout for specific use cases.
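The precedence rule reduces to a few lines. `resolve_timeout` below is illustrative only, not a framework function:

```python
# Sketch of timeout precedence: call-time TaskOptions beats the decorator
# default, which beats "no timeout at all".
def resolve_timeout(decorator_timeout=None, options_timeout=None):
    if options_timeout is not None:
        return options_timeout  # highest priority: TaskOptions(timeout=...)
    return decorator_timeout  # decorator default, or None (runs indefinitely)

assert resolve_timeout(decorator_timeout=300, options_timeout=600) == 600
assert resolve_timeout(decorator_timeout=300) == 300
assert resolve_timeout() is None
```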
:::warning
Timeouts require an abort handler to be effective. Without one, the timeout triggers only a warning and the task continues running. Always implement an abort handler when using timeouts.
:::
## Deduplication
Use `task_key` to prevent duplicate task execution:
```python
from superset_core.api.tasks import TaskOptions
# Without key - creates new task each time (random UUID)
task1 = my_task.schedule(x=1)
task2 = my_task.schedule(x=1) # Different task
# With key - joins existing task if active
task1 = my_task.schedule(x=1, options=TaskOptions(task_key="report_123"))
task2 = my_task.schedule(x=1, options=TaskOptions(task_key="report_123")) # Returns same task
```
When a task with matching key already exists, the user is added as a subscriber and the existing task is returned. This behavior is consistent across all scopes—private tasks naturally have only one subscriber since their deduplication key includes the user ID.
Deduplication only applies to active tasks (pending/in-progress). Once a task completes, a new task with the same key can be created.
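The per-scope behavior can be pictured as key construction: private tasks fold the user ID into the effective key, so two users never collide. The names below (`effective_key`) are hypothetical, not the framework's internals:

```python
# Hypothetical sketch of deduplication keys per the rules above.
import uuid

def effective_key(task_key, scope, user_id):
    if task_key is None:
        return str(uuid.uuid4())  # no key: every schedule is a new task
    if scope == "private":
        return f"{user_id}:{task_key}"  # user ID folded in: one task per user
    return task_key  # shared/system: all callers dedupe to the same task

# Two users with the same key on a shared task join the same task...
assert effective_key("report_123", "shared", 1) == effective_key("report_123", "shared", 2)
# ...but on a private task they get distinct tasks.
assert effective_key("report_123", "private", 1) != effective_key("report_123", "private", 2)
```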
### Sync Join-and-Wait
When a sync call joins an existing task, it blocks until the task completes:
```python
# Schedule async task
task = my_task.schedule(options=TaskOptions(task_key="report_123"))
# Later sync call with same key blocks until completion of the active task
task2 = my_task(options=TaskOptions(task_key="report_123"))
assert task.uuid == task2.uuid # True
print(task2.status) # "success" (terminal status)
```
## Task Scopes
```python
from superset_core.api.tasks import task, TaskScope
@task # Private by default
def private_task(): ...
@task(scope=TaskScope.SHARED) # Multiple users can subscribe
def shared_task(): ...
@task(scope=TaskScope.SYSTEM) # Admin-only visibility
def system_task(): ...
```
| Scope | Visibility | Cancel Behavior |
|-------|------------|-----------------|
| `PRIVATE` | Creator only | Cancels immediately |
| `SHARED` | All subscribers | Last subscriber cancels; others unsubscribe |
| `SYSTEM` | Admins only | Admin cancels |
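The cancel behavior in the table, combined with the abort rules from the previous section, amounts to a small decision function. `decide_cancel` is a sketch of those rules, not Superset code:

```python
# Sketch of the cancel decision: abort the task, or just unsubscribe the user.
def decide_cancel(scope, subscriber_count, force_abort=False):
    if force_abort:
        return "abort"  # admin stops the task for all subscribers
    if scope in ("private", "system"):
        return "abort"
    # shared: the last subscriber aborts; earlier cancels just unsubscribe
    return "abort" if subscriber_count <= 1 else "unsubscribe"

assert decide_cancel("private", 1) == "abort"
assert decide_cancel("shared", 3) == "unsubscribe"
assert decide_cancel("shared", 1) == "abort"
assert decide_cancel("shared", 3, force_abort=True) == "abort"
```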
## Task Cleanup
Completed tasks accumulate in the database over time. Configure a scheduled prune job to automatically remove old tasks:
```python
# In your superset_config.py, add to your Celery beat schedule:
CELERY_CONFIG.beat_schedule["prune_tasks"] = {
    "task": "prune_tasks",
    "schedule": crontab(minute=0, hour=0),  # Run daily at midnight
    "kwargs": {
        "retention_period_days": 90,  # Keep tasks for 90 days
        "max_rows_per_run": 10000,  # Limit deletions per run
    },
}
```
The prune job only removes tasks in terminal states (`SUCCESS`, `FAILURE`, `ABORTED`, `TIMED_OUT`). Active tasks (`PENDING`, `IN_PROGRESS`, `ABORTING`) are never pruned.
See `superset/config.py` for a complete example configuration.
:::tip Signal Cache for Faster Notifications
By default, abort detection and sync join-and-wait use database polling. Configure `SIGNAL_CACHE_CONFIG` to enable Redis pub/sub for real-time notifications. See [Signal Cache Backend](/docs/configuration/cache#signal-cache-backend) for configuration details.
:::
## API Reference
### @task Decorator
```python
@task(
    name: str | None = None,
    scope: TaskScope = TaskScope.PRIVATE,
    timeout: int | None = None
)
```
- `name`: Task identifier (defaults to function name)
- `scope`: `PRIVATE`, `SHARED`, or `SYSTEM`
- `timeout`: Default timeout in seconds (can be overridden via `TaskOptions`)
### TaskContext Methods
| Method | Description |
|--------|-------------|
| `update_task(progress, payload)` | Update progress and/or custom payload |
| `on_cleanup(handler)` | Register cleanup handler |
| `on_abort(handler)` | Register abort handler (makes task abortable) |
### TaskOptions
```python
TaskOptions(
    task_key: str | None = None,
    task_name: str | None = None,
    timeout: int | None = None
)
```
- `task_key`: Deduplication key (also used as display name if `task_name` is not set)
- `task_name`: Human-readable display name for the Task List UI
- `timeout`: Timeout in seconds (overrides decorator default)
:::tip
Provide a descriptive `task_name` for better readability in the Task List UI. While `task_key` is used for deduplication and may be technical (e.g., `chart_export_123`), `task_name` can be user-friendly (e.g., `"Export Sales Chart 123"`).
:::
## Error Handling
Let exceptions propagate: the framework captures them automatically and sets task status to `FAILURE`:
```python
@task
def risky_task() -> None:
    # No try/catch needed - framework handles it
    result = operation_that_might_fail()
```
On failure, the framework records:
- `error_message`: Exception message
- `exception_type`: Exception class name
- `stack_trace`: Full traceback (visible when `SHOW_STACKTRACE=True`)
In the Task List UI, failed tasks show error details when hovering over the status. When stack traces are enabled, a separate bug icon appears in the **Details** column for viewing the full traceback.
Cleanup handlers still run after an exception, so resources can be properly released as necessary.
:::tip
Use descriptive exception messages. In environments where stack traces are hidden (`SHOW_STACKTRACE=False`), users see only the error message and exception type when hovering over failed tasks. Clear messages help users troubleshoot issues without administrator assistance.
:::

View File

@@ -47,5 +47,5 @@ This is a list of statements that describe how we do frontend development in Sup
- We do not debate code formatting style in PRs, instead relying on automated tooling to enforce it.
- If there's not a linting rule, we don't have a rule!
- See: [Linting How-Tos](../contributing/howtos#typescript--javascript)
- We use [React Storybook](https://storybook.js.org/) to help preview/test and stabilize our components
- A public Storybook with components from the `master` branch is available [here](https://apache-superset.github.io/superset-ui/?path=/story/*)

View File

@@ -53,6 +53,7 @@ module.exports = {
'extensions/deployment',
'extensions/mcp',
'extensions/security',
'extensions/tasks',
'extensions/registry',
],
},

View File

@@ -60,7 +60,6 @@ Superset embraces a testing pyramid approach:
- **pytest**: Python testing framework with powerful fixtures and plugins
- **SQLAlchemy Test Utilities**: Database testing and transaction management
- **Flask Test Client**: API endpoint testing and request simulation
- **Factory Boy**: Test data generation and model factories
## Best Practices
@@ -157,7 +156,6 @@ npm run test:coverage
- **React Testing Library** - Component testing utilities
- **Playwright** - End-to-end testing (replacing Cypress)
- **Storybook** - Component development and testing
- **MSW** - API mocking for testing
---

View File

@@ -7,6 +7,12 @@ version: 1
# Caching
:::note
When a cache backend is configured, Superset expects it to remain available. Operations will
fail if the configured backend becomes unavailable rather than silently degrading. This
fail-fast behavior ensures operators are immediately aware of infrastructure issues.
:::
Superset uses [Flask-Caching](https://flask-caching.readthedocs.io/) for caching purposes.
Flask-Caching supports various caching backends, including Redis (recommended), Memcached,
SimpleCache (in-memory), or the local filesystem.
@@ -153,6 +159,84 @@ Then on configuration:
WEBDRIVER_AUTH_FUNC = auth_driver
```
## Signal Cache Backend
Superset supports an optional signal cache (`SIGNAL_CACHE_CONFIG`) for
high-performance distributed operations. This configuration enables:
- **Distributed locking**: Moves lock operations from the metadata database to Redis, improving
performance and reducing metastore load
- **Real-time event notifications**: Enables instant pub/sub messaging for task abort signals and
completion notifications instead of polling-based approaches
:::note
This requires Redis or Valkey specifically—it uses Redis-specific features (pub/sub, `SET NX EX`)
that are not available in general Flask-Caching backends.
:::
### Configuration
The signal cache uses Flask-Caching style configuration for consistency with other cache
backends. Configure `SIGNAL_CACHE_CONFIG` in `superset_config.py`:
```python
SIGNAL_CACHE_CONFIG = {
    "CACHE_TYPE": "RedisCache",
    "CACHE_REDIS_HOST": "localhost",
    "CACHE_REDIS_PORT": 6379,
    "CACHE_REDIS_DB": 0,
    "CACHE_REDIS_PASSWORD": "",  # Optional
}
```
For Redis Sentinel deployments:
```python
SIGNAL_CACHE_CONFIG = {
    "CACHE_TYPE": "RedisSentinelCache",
    "CACHE_REDIS_SENTINELS": [("sentinel1", 26379), ("sentinel2", 26379)],
    "CACHE_REDIS_SENTINEL_MASTER": "mymaster",
    "CACHE_REDIS_SENTINEL_PASSWORD": None,  # Sentinel password (if different)
    "CACHE_REDIS_PASSWORD": "",  # Redis password
    "CACHE_REDIS_DB": 0,
}
```
For SSL/TLS connections:
```python
SIGNAL_CACHE_CONFIG = {
    "CACHE_TYPE": "RedisCache",
    "CACHE_REDIS_HOST": "redis.example.com",
    "CACHE_REDIS_PORT": 6380,
    "CACHE_REDIS_SSL": True,
    "CACHE_REDIS_SSL_CERTFILE": "/path/to/client.crt",
    "CACHE_REDIS_SSL_KEYFILE": "/path/to/client.key",
    "CACHE_REDIS_SSL_CA_CERTS": "/path/to/ca.crt",
}
```
### Distributed Lock TTL
You can configure the default lock TTL (time-to-live) in seconds. Locks automatically expire after
this duration to prevent deadlocks from crashed processes:
```python
DISTRIBUTED_LOCK_DEFAULT_TTL = 30 # Default: 30 seconds
```
Individual lock acquisitions can override this value when needed.
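The TTL semantics mirror Redis `SET NX EX`: an acquire succeeds only if the key is absent or its previous holder's TTL has lapsed. The in-memory `TTLLock` below is an illustration of those semantics only; Superset's actual locking lives behind `SIGNAL_CACHE_CONFIG` or the KeyValue table.

```python
# Illustration of TTL-based locking: expiry prevents deadlock when a lock
# holder crashes without releasing. Hypothetical class, not Superset code.
import time

class TTLLock:
    def __init__(self, default_ttl=30):
        self.default_ttl = default_ttl
        self._locks = {}  # key -> expiry timestamp (monotonic clock)

    def acquire(self, key, ttl=None):
        now = time.monotonic()
        expiry = self._locks.get(key)
        if expiry is not None and expiry > now:
            return False  # held and not yet expired
        self._locks[key] = now + (ttl or self.default_ttl)
        return True

locks = TTLLock(default_ttl=30)
assert locks.acquire("refresh:chart:42") is True
assert locks.acquire("refresh:chart:42") is False  # still held

# A crashed holder's lock becomes re-acquirable once its TTL lapses:
locks2 = TTLLock()
assert locks2.acquire("k", ttl=0.01) is True
time.sleep(0.02)
assert locks2.acquire("k") is True
```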
### Database-Only Mode
When `SIGNAL_CACHE_CONFIG` is not configured, Superset uses database-backed operations:
- **Locking**: Uses the KeyValue table with periodic cleanup of expired entries
- **Event notifications**: Uses database polling instead of pub/sub
While database-backed operations work reliably, the Redis backend is recommended for production
deployments where low latency and reduced database load are important.
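In database-only mode, a waiter (for example, a sync call joined to an active task) repeatedly reads status until it reaches a terminal state. This polling loop is a sketch of that fallback; `wait_for_task` and `get_status` are illustrative stand-ins, not Superset functions:

```python
# Sketch of the database-polling fallback used when pub/sub is unavailable.
import time

TERMINAL = {"success", "failure", "aborted", "timed_out"}

def wait_for_task(get_status, poll_interval=0.01, timeout=1.0):
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        status = get_status()  # each poll is a database read
        if status in TERMINAL:
            return status
        time.sleep(poll_interval)
    raise TimeoutError("task did not finish in time")

statuses = iter(["pending", "in_progress", "in_progress", "success"])
assert wait_for_task(lambda: next(statuses)) == "success"
```

Each poll costs a metastore round trip, which is why the Redis pub/sub backend is preferred when latency and database load matter.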
:::resources
- [Blog: The Data Engineer's Guide to Lightning-Fast Superset Dashboards](https://preset.io/blog/the-data-engineers-guide-to-lightning-fast-apache-superset-dashboards/)
- [Blog: Accelerating Dashboards with Materialized Views](https://preset.io/blog/accelerating-apache-superset-dashboards-with-materialized-views/)
:::

View File

@@ -141,10 +141,10 @@ database engine on a separate host or container.
Superset supports the following database engines/versions:
| Database Engine | Supported Versions |
| ----------------------------------------- | ---------------------------------------------- |
| [PostgreSQL](https://www.postgresql.org/) | 10.X, 11.X, 12.X, 13.X, 14.X, 15.X, 16.X, 17.X |
| [MySQL](https://www.mysql.com/) | 5.7, 8.X |
Use the following database drivers and connection strings:

View File

@@ -96,6 +96,24 @@ To enable this entry, add the following line to the `.env` file:
SUPERSET_FEATURE_EMBEDDED_SUPERSET=true
```
### Hiding the Logout Button in Embedded Contexts
When Superset is embedded in an application that manages authentication via SSO (OAuth2, SAML, or JWT), the logout button should be hidden since session management is handled by the parent application.
To hide the logout button in embedded contexts, add to `superset_config.py`:
```python
FEATURE_FLAGS = {
    "DISABLE_EMBEDDED_SUPERSET_LOGOUT": True,
}
```
This flag only hides the logout button when Superset detects it is running inside an iframe. Users accessing Superset directly (not embedded) will still see the logout button regardless of this setting.
:::note
When embedding with SSO, also set `SESSION_COOKIE_SAMESITE = 'None'` and `SESSION_COOKIE_SECURE = True`. See [Security documentation](/docs/security/securing_superset) for details.
:::
## CSRF settings
Similarly, [flask-wtf](https://flask-wtf.readthedocs.io/en/0.15.x/config/) is used to manage

View File

@@ -1,7 +1,63 @@
---
sidebar_position: 9
title: Frequently Asked Questions
description: Common questions about Apache Superset including performance, database support, visualizations, and configuration.
keywords: [superset faq, superset questions, superset help, data visualization faq]
---
import FAQSchema from '@site/src/components/FAQSchema';
<FAQSchema faqs={[
  {
    question: "How big of a dataset can Superset handle?",
    answer: "Superset can work with even gigantic databases. Superset acts as a thin layer above your underlying databases or data engines, which do all the processing. Superset simply visualizes the results of the query. The key to achieving acceptable performance is whether your database can execute queries and return results at acceptable speed."
  },
  {
    question: "What are the computing specifications required to run Superset?",
    answer: "The specs depend on how many users you have and their activity, not on the size of your data. Community members have reported 8GB RAM, 2vCPUs as adequate for a moderately-sized instance. Monitor your resource usage and adjust as needed."
  },
  {
    question: "Can I join or query multiple tables at one time?",
    answer: "Not in the Explore or Visualization UI directly. A Superset SQLAlchemy datasource can only be a single table or a view. You can create a view that joins tables, or use SQL Lab where you can write SQL queries to join multiple tables."
  },
  {
    question: "How do I create my own visualization?",
    answer: "Read the instructions in the Creating Visualization Plugins documentation to learn how to build custom visualizations for Superset."
  },
  {
    question: "Can I upload and visualize CSV data?",
    answer: "Yes! Superset supports CSV upload functionality. Read the Exploring Data documentation to learn how to enable and use CSV upload."
  },
  {
    question: "Why are my queries timing out?",
    answer: "There are many possible causes. For SQL Lab, Superset allows queries to run up to 6 hours by default (configurable via SQLLAB_ASYNC_TIME_LIMIT_SEC). For dashboard timeouts, check your gateway/proxy timeout settings and adjust SUPERSET_WEBSERVER_TIMEOUT in superset_config.py."
  },
  {
    question: "Why is the map not visible in the geospatial visualization?",
    answer: "You need to register a free account at Mapbox.com, obtain an API key, and add it to your .env file at the key MAPBOX_API_KEY."
  },
  {
    question: "What database engine can I use as a backend for Superset?",
    answer: "Superset is tested using MySQL, PostgreSQL, and SQLite backends for storing its internal metadata. While Superset supports many databases as data sources, only these are recommended for the metadata store in production."
  },
  {
    question: "Does Superset work with my database?",
    answer: "Superset supports any database with a Python SQLAlchemy dialect and DBAPI driver. Check the Connecting to Databases documentation for the full list of supported databases."
  },
  {
    question: "Does Superset offer a public API?",
    answer: "Yes, Superset has a public REST API documented using Swagger. Enable FAB_API_SWAGGER_UI in superset_config.py to access interactive API documentation at /swagger/v1."
  },
  {
    question: "Does Superset collect any telemetry data?",
    answer: "Superset uses Scarf by default to collect basic telemetry data to help maintainers understand version usage. Users can opt out by setting the SCARF_ANALYTICS environment variable to false."
  },
  {
    question: "Does Superset have a trash bin to recover deleted assets?",
    answer: "No, there is no built-in way to recover deleted dashboards, charts, or datasets. It is recommended to take periodic backups of the metadata database and use export functionality for recovery."
  }
]} />
# FAQ
## How big of a dataset can Superset handle?

View File

@@ -23,6 +23,7 @@ import type * as OpenApiPlugin from 'docusaurus-plugin-openapi-docs';
import { themes } from 'prism-react-renderer';
import remarkImportPartial from 'remark-import-partial';
import remarkLocalizeBadges from './plugins/remark-localize-badges.mjs';
import remarkTechArticleSchema from './plugins/remark-tech-article-schema.mjs';
import * as fs from 'fs';
import * as path from 'path';
@@ -46,7 +47,7 @@ if (!versionsConfig.components.disabled) {
sidebarPath: require.resolve('./sidebarComponents.js'),
editUrl:
'https://github.com/apache/superset/edit/master/docs/components',
remarkPlugins: [remarkImportPartial, remarkLocalizeBadges, remarkTechArticleSchema],
admonitions: {
keywords: ['note', 'tip', 'info', 'warning', 'danger', 'resources'],
extendDefaults: true,
@@ -74,7 +75,7 @@ if (!versionsConfig.developer_portal.disabled) {
sidebarPath: require.resolve('./sidebarTutorials.js'),
editUrl:
'https://github.com/apache/superset/edit/master/docs/developer_portal',
remarkPlugins: [remarkImportPartial, remarkLocalizeBadges, remarkTechArticleSchema],
admonitions: {
keywords: ['note', 'tip', 'info', 'warning', 'danger', 'resources'],
extendDefaults: true,
@@ -180,6 +181,83 @@ const config: Config = {
favicon: '/img/favicon.ico',
organizationName: 'apache',
projectName: 'superset',
  // SEO: Structured data (Organization, Software, WebSite with SearchAction)
  headTags: [
    // SoftwareApplication schema
    {
      tagName: 'script',
      attributes: {
        type: 'application/ld+json',
      },
      innerHTML: JSON.stringify({
        '@context': 'https://schema.org',
        '@type': 'SoftwareApplication',
        name: 'Apache Superset',
        applicationCategory: 'BusinessApplication',
        operatingSystem: 'Cross-platform',
        description: 'Apache Superset is a modern, enterprise-ready business intelligence web application for data exploration and visualization.',
        url: 'https://superset.apache.org',
        license: 'https://www.apache.org/licenses/LICENSE-2.0',
        author: {
          '@type': 'Organization',
          name: 'Apache Software Foundation',
          url: 'https://www.apache.org/',
          logo: 'https://www.apache.org/foundation/press/kit/asf_logo.png',
        },
        offers: {
          '@type': 'Offer',
          price: '0',
          priceCurrency: 'USD',
        },
        featureList: [
          'Interactive dashboards',
          'SQL IDE',
          '40+ visualization types',
          'Semantic layer',
          'Role-based access control',
          'REST API',
        ],
      }),
    },
    // WebSite schema with SearchAction (enables sitelinks search box in Google)
    {
      tagName: 'script',
      attributes: {
        type: 'application/ld+json',
      },
      innerHTML: JSON.stringify({
        '@context': 'https://schema.org',
        '@type': 'WebSite',
        name: 'Apache Superset',
        url: 'https://superset.apache.org',
        potentialAction: {
          '@type': 'SearchAction',
          target: {
            '@type': 'EntryPoint',
            urlTemplate: 'https://superset.apache.org/search?q={search_term_string}',
          },
          'query-input': 'required name=search_term_string',
        },
      }),
    },
    // Preconnect hints for faster external resource loading
    {
      tagName: 'link',
      attributes: {
        rel: 'preconnect',
        href: 'https://WR5FASX5ED-dsn.algolia.net',
        crossorigin: 'anonymous',
      },
    },
    {
      tagName: 'link',
      attributes: {
        rel: 'preconnect',
        href: 'https://analytics.apache.org',
      },
    },
  ],
themes: [
'@saucelabs/theme-github-codeblock',
'@docusaurus/theme-mermaid',
@@ -212,6 +290,19 @@ const config: Config = {
},
},
],
// SEO: Generate robots.txt during build
[
require.resolve('./plugins/robots-txt-plugin.js'),
{
policies: [
{
userAgent: '*',
allow: '/',
disallow: ['/api/v1/', '/_next/', '/static/js/*.map'],
},
],
},
],
[
'@docusaurus/plugin-client-redirects',
{
@@ -373,7 +464,7 @@ const config: Config = {
}
return `https://github.com/apache/superset/edit/master/docs/${versionDocsDirPath}/${docPath}`;
},
remarkPlugins: [remarkImportPartial, remarkLocalizeBadges],
remarkPlugins: [remarkImportPartial, remarkLocalizeBadges, remarkTechArticleSchema],
admonitions: {
keywords: ['note', 'tip', 'info', 'warning', 'danger', 'resources'],
extendDefaults: true,
@@ -396,11 +487,57 @@ const config: Config = {
theme: {
customCss: require.resolve('./src/styles/custom.css'),
},
// SEO: Sitemap configuration with priorities
sitemap: {
lastmod: 'date',
changefreq: 'weekly',
priority: 0.5,
ignorePatterns: ['/tags/**'],
filename: 'sitemap.xml',
createSitemapItems: async (params) => {
const { defaultCreateSitemapItems, ...rest } = params;
const items = await defaultCreateSitemapItems(rest);
return items.map((item) => {
// Boost priority for key pages
if (item.url.includes('/docs/intro')) {
return { ...item, priority: 1.0, changefreq: 'daily' };
}
if (item.url.includes('/docs/quickstart')) {
return { ...item, priority: 0.9, changefreq: 'weekly' };
}
if (item.url.includes('/docs/installation/')) {
return { ...item, priority: 0.8, changefreq: 'weekly' };
}
if (item.url.includes('/docs/databases')) {
return { ...item, priority: 0.8, changefreq: 'weekly' };
}
if (item.url.includes('/docs/faq')) {
return { ...item, priority: 0.7, changefreq: 'monthly' };
}
if (item.url === 'https://superset.apache.org/') {
return { ...item, priority: 1.0, changefreq: 'daily' };
}
return item;
});
},
},
} satisfies Options,
],
],
themeConfig: {
// SEO: OpenGraph and Twitter meta tags
metadata: [
{ name: 'keywords', content: 'data visualization, business intelligence, BI, dashboards, SQL, analytics, open source, Apache, charts, reporting' },
{ property: 'og:type', content: 'website' },
{ property: 'og:site_name', content: 'Apache Superset' },
{ property: 'og:image', content: 'https://superset.apache.org/img/superset-og-image.png' },
{ property: 'og:image:width', content: '1200' },
{ property: 'og:image:height', content: '630' },
{ name: 'twitter:card', content: 'summary_large_image' },
{ name: 'twitter:image', content: 'https://superset.apache.org/img/superset-og-image.png' },
{ name: 'twitter:site', content: '@ApacheSuperset' },
],
colorMode: {
defaultMode: 'dark',
disableSwitch: false,
@@ -499,7 +636,6 @@ const config: Config = {
copyright: `
<div class="footer__ci-services">
<span>CI powered by</span>
<a href="https://applitools.com/" target="_blank" rel="nofollow noopener noreferrer"><img src="/img/applitools.png" alt="Applitools" title="Applitools - Visual Testing" /></a>
<a href="https://www.netlify.com/" target="_blank" rel="nofollow noopener noreferrer"><img src="/img/netlify.png" alt="Netlify" title="Netlify - Deploy Previews" /></a>
</div>
<p>Copyright © ${new Date().getFullYear()},

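The `createSitemapItems` hook in the config above boosts per-page sitemap priority by URL pattern. A minimal sketch of that mapping, using an illustrative item object in the shape `@docusaurus/plugin-sitemap` passes (`{ url, priority, changefreq }`):

```javascript
// Sketch of the priority-boost rules from createSitemapItems above.
// Item values here are illustrative, not taken from a real build.
function boostPriority(item) {
  if (item.url.includes('/docs/intro')) {
    return { ...item, priority: 1.0, changefreq: 'daily' };
  }
  if (item.url.includes('/docs/quickstart')) {
    return { ...item, priority: 0.9, changefreq: 'weekly' };
  }
  if (item.url === 'https://superset.apache.org/') {
    return { ...item, priority: 1.0, changefreq: 'daily' };
  }
  return item; // everything else keeps the configured default (0.5)
}

const boosted = boostPriority({
  url: 'https://superset.apache.org/docs/quickstart',
  priority: 0.5,
  changefreq: 'weekly',
});
```

Because unmatched items are returned untouched, pages outside the boosted set inherit the plugin-level `priority: 0.5` default.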
View File

@@ -1,6 +1,6 @@
{
"copyright": {
"message": "\n <div class=\"footer__ci-services\">\n <span>CI powered by</span>\n <a href=\"https://applitools.com/\" target=\"_blank\" rel=\"nofollow noopener noreferrer\"><img src=\"/img/applitools.png\" alt=\"Applitools\" title=\"Applitools - Visual Testing\" /></a>\n <a href=\"https://www.netlify.com/\" target=\"_blank\" rel=\"nofollow noopener noreferrer\"><img src=\"/img/netlify.png\" alt=\"Netlify\" title=\"Netlify - Deploy Previews\" /></a>\n </div>\n <p>Copyright © 2026,\n The <a href=\"https://www.apache.org/\" target=\"_blank\" rel=\"noreferrer\">Apache Software Foundation</a>,\n Licensed under the Apache <a href=\"https://apache.org/licenses/LICENSE-2.0\" target=\"_blank\" rel=\"noreferrer\">License</a>.</p>\n <p><small>Apache Superset, Apache, Superset, the Superset logo, and the Apache feather logo are either registered trademarks or trademarks of The Apache Software Foundation. All other products or name brands are trademarks of their respective holders, including The Apache Software Foundation.\n <a href=\"https://www.apache.org/\" target=\"_blank\">Apache Software Foundation</a> resources</small></p>\n <img class=\"footer__divider\" src=\"/img/community/line.png\" alt=\"Divider\" />\n <p>\n <small>\n <a href=\"/docs/security/\" target=\"_blank\" rel=\"noreferrer\">Security</a>&nbsp;|&nbsp;\n <a href=\"https://www.apache.org/foundation/sponsorship.html\" target=\"_blank\" rel=\"noreferrer\">Donate</a>&nbsp;|&nbsp;\n <a href=\"https://www.apache.org/foundation/thanks.html\" target=\"_blank\" rel=\"noreferrer\">Thanks</a>&nbsp;|&nbsp;\n <a href=\"https://apache.org/events/current-event\" target=\"_blank\" rel=\"noreferrer\">Events</a>&nbsp;|&nbsp;\n <a href=\"https://apache.org/licenses/\" target=\"_blank\" rel=\"noreferrer\">License</a>&nbsp;|&nbsp;\n <a href=\"https://privacy.apache.org/policies/privacy-policy-public.html\" target=\"_blank\" rel=\"noreferrer\">Privacy</a>\n </small>\n </p>\n <!-- telemetry/analytics pixel: -->\n <img 
referrerPolicy=\"no-referrer-when-downgrade\" src=\"https://static.scarf.sh/a.png?x-pxid=39ae6855-95fc-4566-86e5-360d542b0a68\" />\n ",
"message": "\n <div class=\"footer__ci-services\">\n <span>CI powered by</span>\n <a href=\"https://www.netlify.com/\" target=\"_blank\" rel=\"nofollow noopener noreferrer\"><img src=\"/img/netlify.png\" alt=\"Netlify\" title=\"Netlify - Deploy Previews\" /></a>\n </div>\n <p>Copyright © 2026,\n The <a href=\"https://www.apache.org/\" target=\"_blank\" rel=\"noreferrer\">Apache Software Foundation</a>,\n Licensed under the Apache <a href=\"https://apache.org/licenses/LICENSE-2.0\" target=\"_blank\" rel=\"noreferrer\">License</a>.</p>\n <p><small>Apache Superset, Apache, Superset, the Superset logo, and the Apache feather logo are either registered trademarks or trademarks of The Apache Software Foundation. All other products or name brands are trademarks of their respective holders, including The Apache Software Foundation.\n <a href=\"https://www.apache.org/\" target=\"_blank\">Apache Software Foundation</a> resources</small></p>\n <img class=\"footer__divider\" src=\"/img/community/line.png\" alt=\"Divider\" />\n <p>\n <small>\n <a href=\"/docs/security/\" target=\"_blank\" rel=\"noreferrer\">Security</a>&nbsp;|&nbsp;\n <a href=\"https://www.apache.org/foundation/sponsorship.html\" target=\"_blank\" rel=\"noreferrer\">Donate</a>&nbsp;|&nbsp;\n <a href=\"https://www.apache.org/foundation/thanks.html\" target=\"_blank\" rel=\"noreferrer\">Thanks</a>&nbsp;|&nbsp;\n <a href=\"https://apache.org/events/current-event\" target=\"_blank\" rel=\"noreferrer\">Events</a>&nbsp;|&nbsp;\n <a href=\"https://apache.org/licenses/\" target=\"_blank\" rel=\"noreferrer\">License</a>&nbsp;|&nbsp;\n <a href=\"https://privacy.apache.org/policies/privacy-policy-public.html\" target=\"_blank\" rel=\"noreferrer\">Privacy</a>\n </small>\n </p>\n <!-- telemetry/analytics pixel: -->\n <img referrerPolicy=\"no-referrer-when-downgrade\" src=\"https://static.scarf.sh/a.png?x-pxid=39ae6855-95fc-4566-86e5-360d542b0a68\" />\n ",
"description": "The footer copyright"
}
}

View File

@@ -48,25 +48,26 @@
"@emotion/react": "^11.13.3",
"@emotion/styled": "^11.14.1",
"@fontsource/fira-code": "^5.2.7",
"@fontsource/ibm-plex-mono": "^5.2.7",
"@fontsource/inter": "^5.2.8",
"@mdx-js/react": "^3.1.1",
"@saucelabs/theme-github-codeblock": "^0.3.0",
"@storybook/addon-docs": "^8.6.15",
"@storybook/addon-docs": "^8.6.17",
"@storybook/blocks": "^8.6.15",
"@storybook/channels": "^8.6.15",
"@storybook/client-logger": "^8.6.15",
"@storybook/components": "^8.6.15",
"@storybook/core": "^8.6.15",
"@storybook/core-events": "^8.6.15",
"@storybook/channels": "^8.6.17",
"@storybook/client-logger": "^8.6.17",
"@storybook/components": "^8.6.17",
"@storybook/core": "^8.6.17",
"@storybook/core-events": "^8.6.17",
"@storybook/csf": "^0.1.13",
"@storybook/docs-tools": "^8.6.15",
"@storybook/preview-api": "^8.6.15",
"@storybook/docs-tools": "^8.6.17",
"@storybook/preview-api": "^8.6.17",
"@storybook/theming": "^8.6.15",
"@superset-ui/core": "^0.20.4",
"@swc/core": "^1.15.11",
"antd": "^6.2.3",
"baseline-browser-mapping": "^2.9.19",
"caniuse-lite": "^1.0.30001769",
"@swc/core": "^1.15.13",
"antd": "^6.3.1",
"baseline-browser-mapping": "^2.10.0",
"caniuse-lite": "^1.0.30001770",
"docusaurus-plugin-openapi-docs": "^4.6.0",
"docusaurus-theme-openapi-docs": "^4.6.0",
"js-yaml": "^4.1.1",
@@ -81,8 +82,8 @@
"react-table": "^7.8.0",
"remark-import-partial": "^0.0.2",
"reselect": "^5.1.1",
"storybook": "^8.6.15",
"swagger-ui-react": "^5.31.0",
"storybook": "^8.6.17",
"swagger-ui-react": "^5.31.2",
"swc-loader": "^0.2.7",
"tinycolor2": "^1.4.2",
"unist-util-visit": "^5.1.0"
@@ -94,7 +95,7 @@
"@types/js-yaml": "^4.0.9",
"@types/react": "^19.1.8",
"@typescript-eslint/eslint-plugin": "^8.52.0",
"@typescript-eslint/parser": "^8.52.0",
"@typescript-eslint/parser": "^8.56.1",
"eslint": "^9.39.2",
"eslint-config-prettier": "^10.1.8",
"eslint-plugin-prettier": "^5.5.5",
@@ -102,8 +103,8 @@
"globals": "^17.3.0",
"prettier": "^3.8.1",
"typescript": "~5.9.3",
"typescript-eslint": "^8.54.0",
"webpack": "^5.105.0"
"typescript-eslint": "^8.56.1",
"webpack": "^5.105.2"
},
"browserslist": {
"production": [

View File

@@ -0,0 +1,153 @@
/**
* Licensed to the Apache Software Foundation (ASF) under one
* or more contributor license agreements. See the NOTICE file
* distributed with this work for additional information
* regarding copyright ownership. The ASF licenses this file
* to you under the Apache License, Version 2.0 (the
* "License"); you may not use this file except in compliance
* with the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the License is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/
// Note: visit from unist-util-visit is available if needed for tree traversal
/**
* Remark plugin that automatically injects TechArticle schema import and component
* into documentation MDX files based on frontmatter.
*
* This enables rich snippets for technical documentation in search results.
*
* Frontmatter options:
* - title: (required) Article headline
* - description: (required) Article description
* - keywords: (optional) Array of keywords
* - seo_proficiency: (optional) 'Beginner' or 'Expert', defaults to 'Beginner'
* - seo_schema: (optional) Set to false to disable schema injection
*/
export default function remarkTechArticleSchema() {
return (tree, file) => {
const frontmatter = file.data.frontMatter || {};
// Skip if explicitly disabled or missing required fields
if (frontmatter.seo_schema === false) {
return;
}
// Only add schema if we have title and description
if (!frontmatter.title || !frontmatter.description) {
return;
}
const title = frontmatter.title;
const description = frontmatter.description;
const keywords = Array.isArray(frontmatter.keywords) ? frontmatter.keywords : [];
const proficiencyLevel = frontmatter.seo_proficiency || 'Beginner';
// Create the import statement
const importNode = {
type: 'mdxjsEsm',
value: `import TechArticleSchema from '@site/src/components/TechArticleSchema';`,
data: {
estree: {
type: 'Program',
sourceType: 'module',
body: [
{
type: 'ImportDeclaration',
specifiers: [
{
type: 'ImportDefaultSpecifier',
local: { type: 'Identifier', name: 'TechArticleSchema' },
},
],
source: {
type: 'Literal',
value: '@site/src/components/TechArticleSchema',
},
},
],
},
},
};
// Create the component node for MDX
const componentNode = {
type: 'mdxJsxFlowElement',
name: 'TechArticleSchema',
attributes: [
{
type: 'mdxJsxAttribute',
name: 'title',
value: title,
},
{
type: 'mdxJsxAttribute',
name: 'description',
value: description,
},
...(keywords.length > 0
? [
{
type: 'mdxJsxAttribute',
name: 'keywords',
value: {
type: 'mdxJsxAttributeValueExpression',
value: JSON.stringify(keywords),
data: {
estree: {
type: 'Program',
sourceType: 'module',
body: [
{
type: 'ExpressionStatement',
expression: {
type: 'ArrayExpression',
elements: keywords.map((k) => ({
type: 'Literal',
value: k,
})),
},
},
],
},
},
},
},
]
: []),
...(proficiencyLevel !== 'Beginner'
? [
{
type: 'mdxJsxAttribute',
name: 'proficiencyLevel',
value: proficiencyLevel,
},
]
: []),
],
children: [],
};
// Insert import at the beginning
tree.children.unshift(importNode);
// Find the first heading and insert component after it
let insertIndex = 1; // Default: after import
for (let i = 1; i < tree.children.length; i++) {
if (tree.children[i].type === 'heading') {
insertIndex = i + 1;
break;
}
}
tree.children.splice(insertIndex, 0, componentNode);
};
}
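The insertion logic at the end of the plugin above is easy to get wrong off by one. A self-contained sketch of just that part (fake node objects standing in for real mdast nodes): the import is unshifted to index 0, then the component lands after the first heading, or directly after the import when the document has no heading.

```javascript
// Sketch of the node-insertion logic in remarkTechArticleSchema.
// Node objects are minimal stand-ins for real mdast/MDX nodes.
function insertSchemaNodes(children, importNode, componentNode) {
  children.unshift(importNode);
  let insertIndex = 1; // default: right after the import
  for (let i = 1; i < children.length; i++) {
    if (children[i].type === 'heading') {
      insertIndex = i + 1;
      break;
    }
  }
  children.splice(insertIndex, 0, componentNode);
  return children;
}

const tree = [{ type: 'heading' }, { type: 'paragraph' }];
insertSchemaNodes(tree, { type: 'mdxjsEsm' }, { type: 'mdxJsxFlowElement' });
// tree is now: import, heading, component, paragraph
```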

View File

@@ -0,0 +1,83 @@
/**
* Licensed to the Apache Software Foundation (ASF) under one
* or more contributor license agreements. See the NOTICE file
* distributed with this work for additional information
* regarding copyright ownership. The ASF licenses this file
* to you under the Apache License, Version 2.0 (the
* "License"); you may not use this file except in compliance
* with the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the License is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/
/* eslint-disable @typescript-eslint/no-require-imports */
const fs = require('fs');
const path = require('path');
/* eslint-enable @typescript-eslint/no-require-imports */
/**
* Docusaurus plugin to generate robots.txt during build
* Configuration is passed via plugin options
*/
module.exports = function robotsTxtPlugin(context, options = {}) {
const { siteConfig } = context;
const {
policies = [{ userAgent: '*', allow: '/' }],
additionalSitemaps = [],
} = options;
return {
name: 'robots-txt-plugin',
async postBuild({ outDir }) {
const sitemapUrl = `${siteConfig.url}/sitemap.xml`;
// Build robots.txt content
const lines = [];
// Add policies
for (const policy of policies) {
lines.push(`User-agent: ${policy.userAgent}`);
if (policy.allow) {
const allows = Array.isArray(policy.allow) ? policy.allow : [policy.allow];
for (const allow of allows) {
lines.push(`Allow: ${allow}`);
}
}
if (policy.disallow) {
const disallows = Array.isArray(policy.disallow) ? policy.disallow : [policy.disallow];
for (const disallow of disallows) {
lines.push(`Disallow: ${disallow}`);
}
}
if (policy.crawlDelay) {
lines.push(`Crawl-delay: ${policy.crawlDelay}`);
}
lines.push(''); // Empty line between policies
}
// Add sitemaps
lines.push(`Sitemap: ${sitemapUrl}`);
for (const sitemap of additionalSitemaps) {
lines.push(`Sitemap: ${sitemap}`);
}
// Write robots.txt
const robotsPath = path.join(outDir, 'robots.txt');
fs.writeFileSync(robotsPath, lines.join('\n'));
console.log('Generated robots.txt');
},
};
};
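The text-assembly portion of `postBuild` above can be sketched as a pure function, fed the policy configured in `docusaurus.config.ts` (the site URL here is the real one from the config; the function name is illustrative):

```javascript
// Sketch of the robots.txt assembly performed in postBuild above.
function buildRobotsTxt(siteUrl, policies) {
  const lines = [];
  for (const policy of policies) {
    lines.push(`User-agent: ${policy.userAgent}`);
    // [].concat normalizes a string-or-array option into an array
    for (const allow of [].concat(policy.allow || [])) {
      lines.push(`Allow: ${allow}`);
    }
    for (const disallow of [].concat(policy.disallow || [])) {
      lines.push(`Disallow: ${disallow}`);
    }
    lines.push(''); // blank line between policies
  }
  lines.push(`Sitemap: ${siteUrl}/sitemap.xml`);
  return lines.join('\n');
}

const robots = buildRobotsTxt('https://superset.apache.org', [
  { userAgent: '*', allow: '/', disallow: ['/api/v1/', '/_next/'] },
]);
```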

View File

@@ -97,6 +97,7 @@ const sidebars = {
'extensions/deployment',
'extensions/mcp',
'extensions/security',
'extensions/tasks',
'extensions/registry',
],
},

View File

@@ -0,0 +1,66 @@
/**
* Licensed to the Apache Software Foundation (ASF) under one
* or more contributor license agreements. See the NOTICE file
* distributed with this work for additional information
* regarding copyright ownership. The ASF licenses this file
* to you under the Apache License, Version 2.0 (the
* "License"); you may not use this file except in compliance
* with the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the License is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/
import type { JSX } from 'react';
import Head from '@docusaurus/Head';
interface FAQItem {
question: string;
answer: string;
}
interface FAQSchemaProps {
faqs: FAQItem[];
}
/**
* Component that injects FAQPage JSON-LD structured data
* Use this on FAQ pages to enable rich snippets in search results
*
* @example
* <FAQSchema faqs={[
* { question: "What is Superset?", answer: "Apache Superset is..." },
* { question: "How do I install it?", answer: "You can install via..." }
* ]} />
*/
export default function FAQSchema({ faqs }: FAQSchemaProps): JSX.Element | null {
// FAQPage schema requires a non-empty mainEntity array per schema.org specs
if (!faqs || faqs.length === 0) {
return null;
}
const schema = {
'@context': 'https://schema.org',
'@type': 'FAQPage',
mainEntity: faqs.map((faq) => ({
'@type': 'Question',
name: faq.question,
acceptedAnswer: {
'@type': 'Answer',
text: faq.answer,
},
})),
};
return (
<Head>
<script type="application/ld+json">{JSON.stringify(schema)}</script>
</Head>
);
}
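Stripped of the React/Head wrapper, the component above reduces to building one schema.org object (or `null` for an empty list). A plain-JS sketch with an illustrative question/answer pair:

```javascript
// Sketch of the FAQPage JSON-LD object FAQSchema serializes into
// the <script type="application/ld+json"> tag above.
function buildFaqSchema(faqs) {
  // FAQPage requires a non-empty mainEntity array per schema.org
  if (!faqs || faqs.length === 0) return null;
  return {
    '@context': 'https://schema.org',
    '@type': 'FAQPage',
    mainEntity: faqs.map((faq) => ({
      '@type': 'Question',
      name: faq.question,
      acceptedAnswer: { '@type': 'Answer', text: faq.answer },
    })),
  };
}

const faqSchema = buildFaqSchema([
  { question: 'What is Superset?', answer: 'A BI web application.' },
]);
```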

View File

@@ -0,0 +1,91 @@
/**
* Licensed to the Apache Software Foundation (ASF) under one
* or more contributor license agreements. See the NOTICE file
* distributed with this work for additional information
* regarding copyright ownership. The ASF licenses this file
* to you under the Apache License, Version 2.0 (the
* "License"); you may not use this file except in compliance
* with the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the License is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/
import type { JSX } from 'react';
import Head from '@docusaurus/Head';
import { useLocation } from '@docusaurus/router';
interface TechArticleSchemaProps {
title: string;
description: string;
datePublished?: string;
dateModified?: string;
keywords?: string[];
proficiencyLevel?: 'Beginner' | 'Expert';
}
/**
* Component that injects TechArticle JSON-LD structured data for documentation pages.
* This helps search engines understand technical documentation content.
*
* @example
* <TechArticleSchema
* title="Installing Superset with Docker"
* description="Learn how to install Apache Superset using Docker Compose"
* keywords={['docker', 'installation', 'superset']}
* proficiencyLevel="Beginner"
* />
*/
export default function TechArticleSchema({
title,
description,
datePublished,
dateModified,
keywords = [],
proficiencyLevel = 'Beginner',
}: TechArticleSchemaProps): JSX.Element {
const location = useLocation();
const url = `https://superset.apache.org${location.pathname}`;
const schema = {
'@context': 'https://schema.org',
'@type': 'TechArticle',
headline: title,
description,
url,
proficiencyLevel,
author: {
'@type': 'Organization',
name: 'Apache Superset Contributors',
url: 'https://github.com/apache/superset/graphs/contributors',
},
publisher: {
'@type': 'Organization',
name: 'Apache Software Foundation',
url: 'https://www.apache.org/',
logo: {
'@type': 'ImageObject',
url: 'https://www.apache.org/foundation/press/kit/asf_logo.png',
},
},
mainEntityOfPage: {
'@type': 'WebPage',
'@id': url,
},
...(datePublished && { datePublished }),
...(dateModified && { dateModified }),
...(keywords.length > 0 && { keywords: keywords.join(', ') }),
};
return (
<Head>
<script type="application/ld+json">{JSON.stringify(schema)}</script>
</Head>
);
}
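The `...(datePublished && { datePublished })` spreads near the end of the schema above rely on the fact that spreading a falsy value into an object literal adds nothing, so optional fields are omitted from the JSON-LD rather than emitted as `undefined`. A small sketch of that pattern (function name and sample values are illustrative):

```javascript
// Sketch of the conditional-spread pattern used by TechArticleSchema
// for its optional fields.
function buildOptionalFields(datePublished, keywords = []) {
  return {
    '@type': 'TechArticle',
    // spreading undefined/false is a no-op, so absent values
    // leave no key behind in the serialized JSON-LD
    ...(datePublished && { datePublished }),
    ...(keywords.length > 0 && { keywords: keywords.join(', ') }),
  };
}

const withExtras = buildOptionalFields('2026-01-01', ['docker', 'install']);
const bare = buildOptionalFields(undefined, []);
```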

View File

@@ -579,7 +579,7 @@ const DatabaseIndex: React.FC<DatabaseIndexProps> = ({ data }) => {
columns={columns}
rowKey={(record) => record.isCompatible ? `${record.compatibleWith}-${record.name}` : record.name}
pagination={{
pageSize: 20,
defaultPageSize: 20,
showSizeChanger: true,
showTotal: (total) => `${total} databases`,
}}

View File

@@ -104,6 +104,10 @@ const DatabasePage: React.FC<DatabasePageProps> = ({ database, name }) => {
</div>
);
// Ensure db filename can be obtained regardless of how db doc gets generated
// by either Flask app (superset.db_engine_specs.postgres) or fallback mode (postgres)
const databaseModuleFilename = `${database.module?.split('.').pop()}.py`;
// Render driver information
const renderDrivers = () => {
if (!docs?.drivers?.length) return null;
@@ -770,11 +774,11 @@ const DatabasePage: React.FC<DatabasePageProps> = ({ database, name }) => {
Help improve this documentation by editing the engine spec:
</Text>
<a
href={`https://github.com/apache/superset/edit/master/superset/db_engine_specs/${database.module}.py`}
href={`https://github.com/apache/superset/edit/master/superset/db_engine_specs/${databaseModuleFilename}`}
target="_blank"
rel="noreferrer"
>
<EditOutlined /> Edit {database.module}.py
<EditOutlined /> Edit {databaseModuleFilename}
</a>
</Space>
</Card>
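The fix above works because `split('.').pop()` always yields the last dotted segment, so both ways the docs generator can report a module resolve to the same engine-spec filename:

```javascript
// Sketch of the databaseModuleFilename derivation from the diff above.
function engineSpecFilename(module) {
  // keep only the last dotted segment, then append .py
  return `${module.split('.').pop()}.py`;
}

// Flask-app mode reports the fully qualified module,
// fallback mode reports only the bare name:
const fromFlask = engineSpecFilename('superset.db_engine_specs.postgres');
const fromFallback = engineSpecFilename('postgres');
// both → 'postgres.py'
```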

View File

@@ -1,16 +1,16 @@
{
"generated": "2026-01-31T10:47:01.730Z",
"generated": "2026-02-16T04:47:37.257Z",
"statistics": {
"totalDatabases": 70,
"withDocumentation": 70,
"withConnectionString": 70,
"totalDatabases": 72,
"withDocumentation": 72,
"withConnectionString": 72,
"withDrivers": 36,
"withAuthMethods": 4,
"supportsJoins": 66,
"supportsSubqueries": 67,
"supportsJoins": 68,
"supportsSubqueries": 69,
"supportsDynamicSchema": 15,
"supportsCatalog": 9,
"averageScore": 32,
"averageScore": 31,
"maxScore": 201,
"byCategory": {
"Other Databases": [
@@ -74,6 +74,7 @@
"Apache Kylin",
"Azure Synapse",
"Ocient",
"Apache Phoenix",
"Amazon Redshift",
"RisingWave",
"SingleStore",
@@ -151,12 +152,14 @@
"Greenplum",
"Apache Hive",
"Apache Impala",
"Apache IoTDB",
"Apache Kylin",
"MariaDB",
"MonetDB",
"MySQL",
"OceanBase",
"Parseable",
"Apache Phoenix",
"Apache Pinot",
"PostgreSQL",
"Presto",
@@ -187,6 +190,7 @@
"Time Series Databases": [
"CrateDB",
"Apache Druid",
"Apache IoTDB",
"Apache Pinot",
"TDengine"
],
@@ -197,7 +201,9 @@
"Apache Druid",
"Apache Hive",
"Apache Impala",
"Apache IoTDB",
"Apache Kylin",
"Apache Phoenix",
"Apache Pinot",
"Apache Solr",
"Apache Spark SQL"
@@ -2890,6 +2896,47 @@
"query_cost_estimation": false,
"sql_validation": false
},
"Apache IoTDB": {
"engine": "apache_iotdb",
"engine_name": "Apache IoTDB",
"module": "iotdb",
"documentation": {
"description": "Apache IoTDB is a time series database designed for IoT data, with efficient storage and query capabilities for massive time series data.",
"logo": "apache-iotdb.svg",
"homepage_url": "https://iotdb.apache.org/",
"categories": [
"APACHE_PROJECTS",
"TIME_SERIES",
"OPEN_SOURCE"
],
"pypi_packages": [
"apache-iotdb"
],
"connection_string": "iotdb://{username}:{password}@{hostname}:{port}",
"default_port": 6667,
"parameters": {
"username": "Database username (default: root)",
"password": "Database password (default: root)",
"hostname": "IP address or hostname",
"port": "Default 6667"
},
"notes": "The IoTDB SQLAlchemy dialect was written to integrate with Apache Superset. IoTDB uses a hierarchical data model, which is reorganized into a relational model for SQL queries."
},
"time_grains": {},
"score": 0,
"max_score": 0,
"joins": true,
"subqueries": true,
"supports_dynamic_schema": false,
"supports_catalog": false,
"supports_dynamic_catalog": false,
"ssh_tunneling": false,
"query_cancelation": false,
"supports_file_upload": false,
"user_impersonation": false,
"query_cost_estimation": false,
"sql_validation": false
},
"Azure Data Explorer": {
"engine": "azure_data_explorer",
"engine_name": "Azure Data Explorer",
@@ -4039,6 +4086,41 @@
"query_cost_estimation": false,
"sql_validation": false
},
"Apache Phoenix": {
"engine": "apache_phoenix",
"engine_name": "Apache Phoenix",
"module": "phoenix",
"documentation": {
"description": "Apache Phoenix is a relational database layer over Apache HBase, providing low-latency SQL queries over HBase data.",
"logo": "apache-phoenix.png",
"homepage_url": "https://phoenix.apache.org/",
"categories": [
"APACHE_PROJECTS",
"ANALYTICAL_DATABASES",
"OPEN_SOURCE"
],
"pypi_packages": [
"phoenixdb"
],
"connection_string": "phoenix://{hostname}:{port}/",
"default_port": 8765,
"notes": "Phoenix provides a SQL interface to Apache HBase. The phoenixdb driver connects via the Phoenix Query Server and supports a subset of SQLAlchemy."
},
"time_grains": {},
"score": 0,
"max_score": 0,
"joins": true,
"subqueries": true,
"supports_dynamic_schema": false,
"supports_catalog": false,
"supports_dynamic_catalog": false,
"ssh_tunneling": false,
"query_cancelation": false,
"supports_file_upload": false,
"user_impersonation": false,
"query_cost_estimation": false,
"sql_validation": false
},
"Apache Pinot": {
"engine": "apache_pinot",
"engine_name": "Apache Pinot",
@@ -4207,6 +4289,80 @@
"OPEN_SOURCE"
]
},
{
"name": "Supabase",
"description": "Open-source Firebase alternative built on top of PostgreSQL, providing a full backend-as-a-service with a hosted Postgres database.",
"logo": "supabase.svg",
"homepage_url": "https://supabase.com/",
"pypi_packages": [
"psycopg2"
],
"connection_string": "postgresql://{username}:{password}@{host}:{port}/{database}",
"connection_examples": [
{
"description": "Supabase project (connection pooler)",
"connection_string": "postgresql://{username}.{project_ref}:{password}@aws-0-{region}.pooler.supabase.com:6543/{database}"
}
],
"parameters": {
"username": "Database user (default: postgres)",
"password": "Database password",
"host": "Supabase project host (from project settings)",
"port": "Default 5432 (direct) or 6543 (pooler)",
"database": "Database name (default: postgres)",
"project_ref": "Supabase project reference (from project settings)",
"region": "Supabase project region (e.g., us-east-1)"
},
"notes": "Find connection details in your Supabase project dashboard under Settings > Database. Use the connection pooler (port 6543) for better connection management.",
"docs_url": "https://supabase.com/docs/guides/database/connecting-to-postgres",
"categories": [
"HOSTED_OPEN_SOURCE"
]
},
{
"name": "Google AlloyDB",
"description": "Google Cloud's PostgreSQL-compatible database service for demanding transactional and analytical workloads.",
"logo": "alloydb.png",
"homepage_url": "https://cloud.google.com/alloydb",
"pypi_packages": [
"psycopg2"
],
"connection_string": "postgresql://{username}:{password}@{host}:{port}/{database}",
"parameters": {
"username": "Database user (default: postgres)",
"password": "Database password",
"host": "AlloyDB instance IP or Auth Proxy address",
"port": "Default 5432",
"database": "Database name"
},
"notes": "For public IP connections, use the AlloyDB Auth Proxy for secure access. Private IP connections can connect directly.",
"docs_url": "https://cloud.google.com/alloydb/docs",
"categories": [
"CLOUD_GCP",
"HOSTED_OPEN_SOURCE"
]
},
{
"name": "Neon",
"description": "Serverless PostgreSQL with branching, scale-to-zero, and bottomless storage.",
"logo": "neon.png",
"homepage_url": "https://neon.tech/",
"pypi_packages": [
"psycopg2"
],
"connection_string": "postgresql://{username}:{password}@{host}/{database}?sslmode=require",
"parameters": {
"username": "Neon role name",
"password": "Neon role password",
"host": "Neon hostname (e.g., ep-cool-name-123456.us-east-2.aws.neon.tech)",
"database": "Database name (default: neondb)"
},
"notes": "SSL is required for all connections. Find connection details in the Neon console under Connection Details.",
"docs_url": "https://neon.tech/docs/connect/connect-from-any-app",
"categories": [
"HOSTED_OPEN_SOURCE"
]
},
{
"name": "Amazon Aurora PostgreSQL",
"description": "Amazon Aurora PostgreSQL is a fully managed, PostgreSQL-compatible relational database with up to 5x the throughput of standard PostgreSQL.",

View File

@@ -261,6 +261,14 @@
"description": "Data panel closed by default in chart builder",
"category": "runtime_config"
},
{
"name": "DISABLE_EMBEDDED_SUPERSET_LOGOUT",
"default": false,
"lifecycle": "stable",
"description": "Hide the logout button in embedded contexts (e.g., when using SSO in iframes)",
"docs": "https://superset.apache.org/docs/configuration/networking-settings#hiding-the-logout-button-in-embedded-contexts",
"category": "runtime_config"
},
{
"name": "DRILL_BY",
"default": true,

BIN docs/static/img/superset-og-image.png (binary file, not shown: 36 KiB before, 88 KiB after)

File diff suppressed because it is too large.

View File

@@ -1,9 +1,9 @@
dependencies:
- name: postgresql
repository: oci://registry-1.docker.io/bitnamicharts
version: 13.4.4
version: 16.7.27
- name: redis
repository: oci://registry-1.docker.io/bitnamicharts
version: 17.9.4
digest: sha256:c6290bb7e8ce9c694c06b3f5e9b9d01401943b0943c515d3a7a3a8dc1e6492ea
generated: "2025-03-16T00:52:41.47139769+09:00"
digest: sha256:fcae507ca24a20b9cc08b8bf0fcb0eba8ffa33126ab6f71cc3a6e1d5e997e9e3
generated: "2026-02-08T14:11:58.8058368+01:00"

View File

@@ -29,10 +29,10 @@ maintainers:
- name: craig-rueda
email: craig@craigrueda.com
url: https://github.com/craig-rueda
version: 0.15.2 # See [README](https://github.com/apache/superset/blob/master/helm/superset/README.md#versioning) for version details.
version: 0.15.4 # See [README](https://github.com/apache/superset/blob/master/helm/superset/README.md#versioning) for version details.
dependencies:
- name: postgresql
version: 13.4.4
version: 16.7.27
repository: oci://registry-1.docker.io/bitnamicharts
condition: postgresql.enabled
- name: redis

View File

@@ -23,7 +23,7 @@ NOTE: This file is generated by helm-docs: https://github.com/norwoodj/helm-docs
# superset
![Version: 0.15.2](https://img.shields.io/badge/Version-0.15.2-informational?style=flat-square)
![Version: 0.15.4](https://img.shields.io/badge/Version-0.15.4-informational?style=flat-square)
Apache Superset is a modern, enterprise-ready business intelligence web application
@@ -50,7 +50,7 @@ On helm this can be set on `extraSecretEnv.SUPERSET_SECRET_KEY` or `configOverri
| Repository | Name | Version |
|------------|------|---------|
| oci://registry-1.docker.io/bitnamicharts | postgresql | 13.4.4 |
| oci://registry-1.docker.io/bitnamicharts | postgresql | 16.7.27 |
| oci://registry-1.docker.io/bitnamicharts | redis | 17.9.4 |
## Values

View File

@@ -312,6 +312,12 @@ supersetNode:
- /bin/sh
- -c
- dockerize -wait "tcp://$DB_HOST:$DB_PORT" -timeout 120s
resources:
limits:
memory: "256Mi"
requests:
cpu: "250m"
memory: "128Mi"
# -- Launch additional containers into supersetNode pod
extraContainers: []
@@ -410,6 +416,12 @@ supersetWorker:
- /bin/sh
- -c
- dockerize -wait "tcp://$DB_HOST:$DB_PORT" -wait "tcp://$REDIS_HOST:$REDIS_PORT" -timeout 120s
resources:
limits:
memory: "256Mi"
requests:
cpu: "250m"
memory: "128Mi"
# -- Launch additional containers into supersetWorker pod
extraContainers: []
# -- Annotations to be added to supersetWorker deployment
@@ -492,6 +504,12 @@ supersetCeleryBeat:
- /bin/sh
- -c
- dockerize -wait "tcp://$DB_HOST:$DB_PORT" -wait "tcp://$REDIS_HOST:$REDIS_PORT" -timeout 120s
resources:
limits:
memory: "256Mi"
requests:
cpu: "250m"
memory: "128Mi"
# -- Launch additional containers into supersetCeleryBeat pods
extraContainers: []
# -- Annotations to be added to supersetCeleryBeat deployment
@@ -585,6 +603,12 @@ supersetCeleryFlower:
- /bin/sh
- -c
- dockerize -wait "tcp://$DB_HOST:$DB_PORT" -wait "tcp://$REDIS_HOST:$REDIS_PORT" -timeout 120s
resources:
limits:
memory: "256Mi"
requests:
cpu: "250m"
memory: "128Mi"
# -- Launch additional containers into supersetCeleryFlower pods
extraContainers: []
# -- Annotations to be added to supersetCeleryFlower deployment
@@ -749,6 +773,12 @@ init:
- /bin/sh
- -c
- dockerize -wait "tcp://$DB_HOST:$DB_PORT" -timeout 120s
resources:
limits:
memory: "256Mi"
requests:
cpu: "250m"
memory: "128Mi"
# -- A Superset init script
# @default -- a script to create admin user and initialize roles
initscript: |-

View File

@@ -82,7 +82,7 @@ dependencies = [
"parsedatetime",
"paramiko>=3.4.0",
"pgsanity",
"Pillow>=11.0.0, <12",
"Pillow>=11.0.0, <13",
"polyline>=2.0.0, <3.0",
"pydantic>=2.8.0",
"pyparsing>=3.0.6, <4",
@@ -99,8 +99,8 @@ dependencies = [
"simplejson>=3.15.0",
"slack_sdk>=3.19.0, <4",
"sqlalchemy>=1.4, <2",
"sqlalchemy-utils>=0.38.3, <0.39",
"sqlglot>=27.15.2, <28",
"sqlalchemy-utils>=0.38.0, <0.43", # expanding lowerbound to work with pydoris
"sqlglot>=28.10.0, <29",
# newer pandas needs 0.9+
"tabulate>=0.9.0, <1.0",
"typing-extensions>=4, <5",
@@ -141,7 +141,7 @@ druid = ["pydruid>=0.6.5,<0.7"]
duckdb = ["duckdb>=1.4.2,<2", "duckdb-engine>=0.17.0"]
dynamodb = ["pydynamodb>=0.4.2"]
solr = ["sqlalchemy-solr >= 0.2.0"]
elasticsearch = ["elasticsearch-dbapi>=0.2.9, <0.3.0"]
elasticsearch = ["elasticsearch-dbapi>=0.2.12, <0.3.0"]
exasol = ["sqlalchemy-exasol >= 2.4.0, <3.0"]
excel = ["xlrd>=1.2.0, <1.3"]
fastmcp = ["fastmcp==2.14.3"]

View File

@@ -399,12 +399,12 @@ sqlalchemy==1.4.54
# marshmallow-sqlalchemy
# shillelagh
# sqlalchemy-utils
sqlalchemy-utils==0.38.3
sqlalchemy-utils==0.42.0
# via
# apache-superset (pyproject.toml)
# apache-superset-core
# flask-appbuilder
sqlglot==27.15.2
sqlglot==28.10.0
# via
# apache-superset (pyproject.toml)
# apache-superset-core

View File

@@ -990,13 +990,13 @@ sqlalchemy==1.4.54
# sqlalchemy-utils
sqlalchemy-bigquery==1.15.0
# via apache-superset
sqlalchemy-utils==0.38.3
sqlalchemy-utils==0.42.0
# via
# -c requirements/base-constraint.txt
# apache-superset
# apache-superset-core
# flask-appbuilder
sqlglot==27.15.2
sqlglot==28.10.0
# via
# -c requirements/base-constraint.txt
# apache-superset

View File

@@ -18,7 +18,7 @@
[project]
name = "apache-superset-core"
version = "0.0.1rc3"
version = "0.0.1rc4"
description = "Core Python package for building Apache Superset backend extensions and integrations"
readme = "README.md"
authors = [
@@ -45,8 +45,8 @@ dependencies = [
"flask-appbuilder>=5.0.2,<6",
"pydantic>=2.8.0",
"sqlalchemy>=1.4.0,<2.0",
"sqlalchemy-utils>=0.38.0",
"sqlglot>=27.15.2, <28",
"sqlalchemy-utils>=0.38.0, <0.43", # expanding lowerbound to work with pydoris
"sqlglot>=28.10.0, <29",
"typing-extensions>=4.0.0",
]

View File

@@ -46,6 +46,7 @@ from superset_core.api.models import (
Query,
SavedQuery,
Tag,
Task,
User,
)
@@ -248,6 +249,48 @@ class KeyValueDAO(BaseDAO[KeyValue]):
id_column_name = "id"
class TaskDAO(BaseDAO[Task]):
"""
Abstract Task DAO interface.
Host implementations will replace this class during initialization
with a concrete implementation providing actual functionality.
"""
# Class variables that will be set by host implementation
model_cls = None
base_filter = None
id_column_name = "id"
uuid_column_name = "uuid"
@classmethod
@abstractmethod
def find_by_task_key(
cls,
task_type: str,
task_key: str,
scope: str = "private",
user_id: int | None = None,
) -> Task | None:
"""
Find active task by type, key, scope, and user.
Uses dedup_key internally for efficient querying with a unique index.
Only returns tasks that are active (pending or in progress).
Uniqueness logic by scope:
- private: scope + task_type + task_key + user_id
- shared/system: scope + task_type + task_key (user-agnostic)
:param task_type: Task type to filter by
:param task_key: Task identifier for deduplication
:param scope: Task scope (private/shared/system)
:param user_id: User ID (required for private tasks)
:returns: Task instance or None if not found or not active
"""
...
__all__ = [
"BaseDAO",
"DatasetDAO",
@@ -259,4 +302,5 @@ __all__ = [
"SavedQueryDAO",
"TagDAO",
"KeyValueDAO",
"TaskDAO",
]

View File

@@ -40,6 +40,7 @@ from flask_appbuilder import Model
from sqlalchemy.orm import scoped_session
if TYPE_CHECKING:
from superset_core.api.tasks import TaskProperties
from superset_core.api.types import (
AsyncQueryHandle,
QueryOptions,
@@ -361,6 +362,132 @@ class KeyValue(CoreModel):
changed_by_fk: int | None
class Task(CoreModel):
"""
Abstract Task model interface.
Host implementations will replace this class during initialization
with concrete implementation providing actual functionality.
This model represents async tasks in the Global Task Framework (GTF).
Non-filterable fields (progress, error info, execution config) are stored
in a `properties` JSON blob for schema flexibility.
"""
__abstract__ = True
# Type hints for expected column attributes
id: int
uuid: UUID
task_key: str # For deduplication
task_type: str # e.g., 'sql_execution'
task_name: str | None # Human readable name
scope: str # private/shared/system
status: str
dedup_key: str # Computed deduplication key
# Timestamps (from AuditMixinNullable)
created_on: datetime | None
changed_on: datetime | None
started_at: datetime | None
ended_at: datetime | None
# User context
created_by_fk: int | None
user_id: int | None
# Task output data
payload: str # JSON serialized task output data
def get_payload(self) -> dict[str, Any]:
"""
Get payload as parsed JSON.
Payload contains task-specific output data set by task code.
Host implementations will replace this method during initialization
with concrete implementation providing actual functionality.
:returns: Dictionary containing payload data
"""
raise NotImplementedError("Method will be replaced during initialization")
def set_payload(self, data: dict[str, Any]) -> None:
"""
Update payload with new data (merges with existing).
Host implementations will replace this method during initialization
with concrete implementation providing actual functionality.
:param data: Dictionary of data to merge into payload
"""
raise NotImplementedError("Method will be replaced during initialization")
@property
def properties(self) -> Any:
"""
Get typed properties (runtime state and execution config).
Properties contain:
- is_abortable: bool | None - has abort handler registered
- progress_percent: float | None - progress 0.0-1.0
- progress_current: int | None - current iteration count
- progress_total: int | None - total iterations
- error_message: str | None - human-readable error message
- exception_type: str | None - exception class name
- stack_trace: str | None - full formatted traceback
- timeout: int | None - timeout in seconds
Host implementations will replace this property during initialization.
:returns: TaskProperties TypedDict instance
"""
raise NotImplementedError("Property will be replaced during initialization")
def update_properties(self, updates: "TaskProperties") -> None:
"""
Update specific properties fields (merge semantics).
Only updates fields present in the updates dict.
Host implementations will replace this method during initialization.
:param updates: TaskProperties dict with fields to update
Example:
task.update_properties({"is_abortable": True})
"""
raise NotImplementedError("Method will be replaced during initialization")
class TaskSubscriber(CoreModel):
"""
Abstract TaskSubscriber model interface.
Host implementations will replace this class during initialization
with concrete implementation providing actual functionality.
This model tracks task subscriptions for multi-user shared tasks. When a user
schedules a shared task with the same parameters as an existing task,
they are subscribed to that task instead of creating a duplicate.
"""
__abstract__ = True
# Type hints for expected attributes (no actual field definitions)
id: int
task_id: int
user_id: int
subscribed_at: datetime
# Audit fields from AuditMixinNullable
created_on: datetime | None
changed_on: datetime | None
created_by_fk: int | None
changed_by_fk: int | None
def get_session() -> scoped_session:
"""
Retrieve the SQLAlchemy session to directly interface with the
@@ -384,6 +511,8 @@ __all__ = [
"SavedQuery",
"Tag",
"KeyValue",
"Task",
"TaskSubscriber",
"CoreModel",
"get_session",
]
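
The merge-not-replace semantics documented for `set_payload` can be sketched with a hypothetical stand-in (the concrete behavior is provided by the host implementation at initialization time):

```python
import json

# Hypothetical stand-in for the documented merge semantics of set_payload;
# the payload column is a JSON-serialized string on the Task model.
def merge_payload(payload_json: str, data: dict) -> str:
    merged = {**json.loads(payload_json), **data}
    return json.dumps(merged)

payload = json.dumps({"step": "start", "rows": 0})
payload = merge_payload(payload, {"rows": 10})  # merges with existing keys
assert json.loads(payload) == {"step": "start", "rows": 10}
```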

View File

@@ -0,0 +1,361 @@
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership. The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.
from __future__ import annotations
from abc import ABC, abstractmethod
from dataclasses import dataclass
from enum import Enum
from typing import Any, Callable, Generic, Literal, ParamSpec, TypedDict, TypeVar
from superset_core.api.models import Task
P = ParamSpec("P")
R = TypeVar("R")
class TaskStatus(str, Enum):
"""
Status of task execution.
"""
PENDING = "pending"
IN_PROGRESS = "in_progress"
SUCCESS = "success"
FAILURE = "failure"
ABORTING = "aborting" # Abort/timeout requested, handlers running
ABORTED = "aborted" # User/admin cancelled
TIMED_OUT = "timed_out" # Timeout expired
class TaskScope(str, Enum):
"""
Scope of task visibility and access control.
"""
PRIVATE = "private" # User-specific tasks (default)
SHARED = "shared" # Multi-user collaborative tasks
SYSTEM = "system" # Admin-only background tasks
class TaskProperties(TypedDict, total=False):
"""
TypedDict for task runtime state and execution config.
Stored as JSON in the database, accessed as a dict throughout the codebase.
All fields are optional (total=False) - only set keys are present in the dict.
Usage:
# Reading - always use .get() since keys may not be present
if task.properties.get("is_abortable"):
...
# Writing/updating - only include keys you want to set
task.update_properties({"is_abortable": True, "progress_percent": 0.5})
Notes:
- Sparse dict: only keys that are explicitly set are present
- Unknown keys from JSON are preserved (forward compatibility)
- Always use .get() for reads since keys may be absent
"""
# Execution config - set at task creation
execution_mode: Literal["async", "sync"]
timeout: int
# Runtime state - set by framework during execution
is_abortable: bool
progress_percent: float
progress_current: int
progress_total: int
# Error info - set when task fails
error_message: str
exception_type: str
stack_trace: str
@dataclass(frozen=True)
class TaskOptions:
"""
Execution metadata for tasks.
NOTE: This is intentionally minimal for the initial implementation.
Additional options (queue, priority, run_at, delay_s,
max_retries, retry_backoff_s, tags, etc.) can be added later when needed.
Future enhancements will include:
- Validation (e.g., run_at vs delay_s mutual exclusion)
- Queue routing and priority management
- Retry policies and backoff strategies
Example:
from superset_core.api.tasks import TaskOptions, TaskScope
# Private task (default)
task = my_task.schedule(arg1)
# Custom task with deduplication
task = my_task.schedule(
arg1,
options=TaskOptions(
task_key="custom_key",
task_name="Custom Task Name"
)
)
# Task with custom name
task = admin_task.schedule(
options=TaskOptions(task_name="Admin Operation")
)
# Task with timeout (overrides decorator default)
task = long_task.schedule(
options=TaskOptions(timeout=600) # 10 minute timeout
)
"""
task_key: str | None = None
task_name: str | None = None
timeout: int | None = None # Timeout in seconds
class TaskContext(ABC):
"""
Abstract task context for write-only task state updates.
Tasks use this context to update their state (progress, payload) and
check for cancellation. Tasks should not need to read their own state -
they are the source of state, not consumers of it.
Host implementations will replace this abstract class during initialization
with a concrete implementation providing actual functionality.
"""
@abstractmethod
def update_task(
self,
progress: float | int | tuple[int, int] | None = None,
payload: dict[str, Any] | None = None,
) -> None:
"""
Update task progress and/or payload atomically.
All parameters are optional. Payload is merged with existing data,
not replaced. All updates occur in a single database transaction.
Progress can be specified in three ways:
- float (0.0-1.0): Percentage only, e.g., 0.5 means 50%
- int: Count only (total unknown), e.g., 42 means "42 items processed"
- tuple[int, int]: Count and total, e.g., (3, 100) means "3 of 100"
The percentage is automatically computed from count/total.
:param progress: Progress value, or None to leave unchanged
:param payload: Payload data to merge (dict), or None to leave unchanged
Examples:
# Percentage only - displays as "In progress: 50 %"
ctx.update_task(progress=0.5)
# Count only (total unknown) - displays as "In progress: 42"
ctx.update_task(progress=42)
# Count and total - displays as "In progress: 3 of 100 (3 %)"
ctx.update_task(progress=(3, 100))
# Update payload only
ctx.update_task(payload={"step": "processing"})
# Update both atomically
ctx.update_task(
progress=(80, 100),
payload={"processed": 80, "total": 100}
)
"""
...
@abstractmethod
def on_cleanup(self, handler: Callable[[], None]) -> Callable[[], None]:
"""
Register a cleanup handler that runs when the task ends.
Cleanup handlers are called when the task completes (success),
fails with an error, or is cancelled. Multiple handlers can be
registered and will execute in LIFO order (last registered runs first).
Can be used as a decorator:
@ctx.on_cleanup
def cleanup():
logger.info("Task ended")
Or called directly:
ctx.on_cleanup(lambda: logger.info("Task ended"))
:param handler: Cleanup function to register
:returns: The handler (for decorator compatibility)
"""
...
@abstractmethod
def on_abort(self, handler: Callable[[], None]) -> Callable[[], None]:
"""
Register handler that runs when task is aborted.
When the first handler is registered, background polling starts
automatically. The handler will be called when an abort is detected.
The handler executes in a background thread and the task code
continues running unless the handler takes action to stop it.
:param handler: Callback function to execute when abort is detected
:returns: The handler (for decorator compatibility)
Example:
@ctx.on_abort
def handle_abort():
logger.info("Task was aborted!")
cleanup_partial_work()
"""
...
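
The LIFO ordering described for `on_cleanup` can be sketched with a minimal hypothetical context (not the host implementation):

```python
from typing import Callable

class _SketchContext:
    """Hypothetical minimal context illustrating LIFO cleanup order."""

    def __init__(self) -> None:
        self._cleanup_handlers: list[Callable[[], None]] = []

    def on_cleanup(self, handler: Callable[[], None]) -> Callable[[], None]:
        self._cleanup_handlers.append(handler)
        return handler  # decorator compatibility

    def _run_cleanup(self) -> None:
        # Last registered runs first.
        for handler in reversed(self._cleanup_handlers):
            handler()

calls: list[str] = []
ctx = _SketchContext()
ctx.on_cleanup(lambda: calls.append("first"))
ctx.on_cleanup(lambda: calls.append("second"))
ctx._run_cleanup()
assert calls == ["second", "first"]
```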
def task(
name: str | None = None,
scope: TaskScope = TaskScope.PRIVATE,
timeout: int | None = None,
) -> Callable[[Callable[P, R]], "TaskWrapper[P]"]:
"""
Decorator to register a task.
Host implementations will replace this function during initialization
with a concrete implementation providing actual functionality.
:param name: Optional unique task name (e.g., "superset.generate_thumbnail").
If not provided, uses the function name as the task name.
:param scope: Task scope (TaskScope.PRIVATE, SHARED, or SYSTEM).
Defaults to TaskScope.PRIVATE.
:param timeout: Optional timeout in seconds. When the timeout is reached,
abort handlers are triggered if registered. Can be overridden
at call time via TaskOptions(timeout=...).
:returns: TaskWrapper with .schedule() method
Note:
Both direct calls and .schedule() return Task, regardless of the
original function's return type. The decorated function's return value
is discarded; only side effects and context updates matter.
Example:
from superset_core.api.tasks import task, get_context, TaskScope
# Private task (default scope)
@task
def generate_thumbnail(chart_id: int) -> None:
ctx = get_context()
# ... task implementation
# Named task with shared scope
@task(name="generate_report", scope=TaskScope.SHARED)
def generate_chart_thumbnail(chart_id: int) -> None:
ctx = get_context()
# Update progress and payload atomically
ctx.update_task(
progress=0.5,
payload={"chart_id": chart_id, "status": "processing"}
)
# ... task implementation
ctx.update_task(progress=1.0)
# System task (admin-only)
@task(scope=TaskScope.SYSTEM)
def cleanup_old_data() -> None:
ctx = get_context()
# ... cleanup implementation
# Task with timeout
@task(timeout=300) # 5-minute timeout
def long_running_task() -> None:
ctx = get_context()
@ctx.on_abort
def handle_abort():
# Called on timeout or manual abort
pass
# Schedule async execution
task = generate_chart_thumbnail.schedule(chart_id=123) # Returns Task
# Direct call for sync execution (blocks until task is complete)
task = generate_chart_thumbnail(chart_id=123) # Also returns Task
"""
raise NotImplementedError("Function will be replaced during initialization")
class TaskWrapper(Generic[P]):
"""
Type stub for task wrapper returned by @task decorator.
Both __call__ and .schedule() return Task.
"""
def __call__(self, *args: P.args, **kwargs: P.kwargs) -> Task:
"""Execute the task synchronously."""
raise NotImplementedError("Will be replaced during initialization")
def schedule(self, *args: P.args, **kwargs: P.kwargs) -> Task:
"""Schedule the task for async execution."""
raise NotImplementedError("Will be replaced during initialization")
def get_context() -> TaskContext:
"""
Get the current task context from ambient context.
Host implementations will replace this function during initialization
with a concrete implementation providing actual functionality.
This function provides ambient access to the task context without
requiring it to be passed as a parameter. It can only be called
from within an async task execution.
:returns: The current TaskContext
:raises RuntimeError: If called outside a task execution context
Example:
@task("thumbnail_generation")
def generate_chart_thumbnail(chart_id: int):
ctx = get_context() # Access ambient context
# Update task state - no need to fetch task object
ctx.update_task(
progress=0.5,
payload={"chart_id": chart_id}
)
"""
raise NotImplementedError("Function will be replaced during initialization")
__all__ = [
"TaskStatus",
"TaskScope",
"TaskProperties",
"TaskContext",
"TaskOptions",
"task",
"get_context",
]

View File

@@ -0,0 +1,35 @@
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership. The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.
"""
Constants for extension validation and naming.
"""
# Publisher validation pattern: lowercase letters, numbers, hyphens; must start with
# letter; no consecutive hyphens or trailing hyphens
PUBLISHER_PATTERN = r"^[a-z]([a-z0-9]*(-[a-z0-9]+)*)?$"
# Technical name validation pattern: lowercase letters, numbers, hyphens; must start
# with letter; no consecutive hyphens or trailing hyphens
TECHNICAL_NAME_PATTERN = r"^[a-z]([a-z0-9]*(-[a-z0-9]+)*)?$"
# Display name validation pattern: must start with letter, can contain letters,
# numbers, spaces, hyphens, underscores, dots
DISPLAY_NAME_PATTERN = r"^[a-zA-Z][a-zA-Z0-9\s\-_\.]*$"
# Version pattern for semantic versioning
VERSION_PATTERN = r"^\d+\.\d+\.\d+$"
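
The rules stated in the comments above translate into concrete accept/reject cases; a minimal self-check (patterns copied verbatim from this file):

```python
import re

# Patterns copied from superset_core.extensions.constants so the check
# is self-contained.
PUBLISHER_PATTERN = r"^[a-z]([a-z0-9]*(-[a-z0-9]+)*)?$"
VERSION_PATTERN = r"^\d+\.\d+\.\d+$"

assert re.match(PUBLISHER_PATTERN, "my-org")
assert not re.match(PUBLISHER_PATTERN, "My-Org")    # uppercase rejected
assert not re.match(PUBLISHER_PATTERN, "my--org")   # consecutive hyphens
assert not re.match(PUBLISHER_PATTERN, "my-org-")   # trailing hyphen
assert not re.match(PUBLISHER_PATTERN, "1org")      # must start with letter
assert re.match(VERSION_PATTERN, "0.1.0")
assert not re.match(VERSION_PATTERN, "0.1.0rc1")    # strict semver triplet
```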

View File

@@ -29,6 +29,13 @@ from typing import Any
from pydantic import BaseModel, Field # noqa: I001
from superset_core.extensions.constants import (
DISPLAY_NAME_PATTERN,
PUBLISHER_PATTERN,
TECHNICAL_NAME_PATTERN,
VERSION_PATTERN,
)
# =============================================================================
# Shared components
# =============================================================================
@@ -37,6 +44,11 @@ from pydantic import BaseModel, Field # noqa: I001
class ModuleFederationConfig(BaseModel):
"""Configuration for Webpack Module Federation."""
name: str | None = Field(
default=None,
description="Module Federation container name "
"(must be valid JavaScript identifier)",
)
exposes: list[str] = Field(
default_factory=list,
description="Modules exposed by this extension",
@@ -56,39 +68,69 @@ class ModuleFederationConfig(BaseModel):
class ContributionConfig(BaseModel):
"""Configuration for frontend UI contributions."""
"""Configuration for frontend UI contributions.
Views and menus use a nested structure: type -> scope -> location -> contributions.
Example:
{
"views": {
"sqllab": {
"panels": [{"id": "my-ext.panel", "name": "My Panel"}],
"leftSidebar": [{"id": "my-ext.sidebar", "name": "Sidebar"}]
}
},
"menus": {
"sqllab": {
"editor": {"primary": [...], "secondary": [...]}
}
}
}
"""
commands: list[dict[str, Any]] = Field(
default_factory=list,
description="Command contributions",
)
views: dict[str, list[dict[str, Any]]] = Field(
views: dict[str, dict[str, list[dict[str, Any]]]] = Field(
default_factory=dict,
description="View contributions by location",
description="View contributions by scope and location",
)
menus: dict[str, Any] = Field(
menus: dict[str, dict[str, Any]] = Field(
default_factory=dict,
description="Menu contributions",
description="Menu contributions by scope and location",
)
editors: list[dict[str, Any]] = Field(
default_factory=list,
description="Editor contributions",
)
class BaseExtension(BaseModel):
"""Base fields shared by ExtensionConfig and Manifest."""
id: str = Field(
publisher: str = Field(
...,
description="Unique extension identifier",
description="Publisher/organization namespace",
min_length=1,
pattern=PUBLISHER_PATTERN,
)
name: str = Field(
...,
description="Technical extension identifier",
min_length=1,
pattern=TECHNICAL_NAME_PATTERN,
)
displayName: str = Field( # noqa: N815
...,
description="Human-readable extension name",
min_length=1,
pattern=DISPLAY_NAME_PATTERN,
)
version: str = Field(
default="0.0.0",
description="Semantic version string",
pattern=r"^\d+\.\d+\.\d+$",
pattern=VERSION_PATTERN,
)
license: str | None = Field(
default=None,
@@ -194,6 +236,11 @@ class Manifest(BaseExtension):
This file is generated by the build tool from extension.json.
"""
id: str = Field(
...,
description="Composite extension ID (publisher.name)",
min_length=1,
)
frontend: ManifestFrontend | None = Field(
default=None,
description="Frontend manifest",

View File

@@ -38,7 +38,18 @@ from watchdog.events import FileSystemEventHandler
from watchdog.observers import Observer
from superset_extensions_cli.constants import MIN_NPM_VERSION
from superset_extensions_cli.utils import read_json, read_toml
from superset_extensions_cli.exceptions import ExtensionNameError
from superset_extensions_cli.types import ExtensionNames
from superset_extensions_cli.utils import (
generate_extension_names,
kebab_to_snake_case,
read_json,
read_toml,
suggest_technical_name,
validate_display_name,
validate_publisher,
validate_technical_name,
)
REMOTE_ENTRY_REGEX = re.compile(r"^remoteEntry\..+\.js$")
FRONTEND_DIST_REGEX = re.compile(r"/frontend/dist")
@@ -137,6 +148,9 @@ def build_manifest(cwd: Path, remote_entry: str | None) -> Manifest:
extension = ExtensionConfig.model_validate(extension_data)
# Generate composite ID from publisher and name
composite_id = f"{extension.publisher}.{extension.name}"
frontend: ManifestFrontend | None = None
if extension.frontend and remote_entry:
frontend = ManifestFrontend(
@@ -150,8 +164,10 @@ def build_manifest(cwd: Path, remote_entry: str | None) -> Manifest:
backend = ManifestBackend(entryPoints=extension.backend.entryPoints)
return Manifest(
id=extension.id,
id=composite_id,
publisher=extension.publisher,
name=extension.name,
displayName=extension.displayName,
version=extension.version,
permissions=extension.permissions,
dependencies=extension.dependencies,
@@ -403,14 +419,111 @@ def dev(ctx: click.Context) -> None:
click.secho("❌ No directories to watch. Exiting.", fg="red")
def prompt_for_extension_info(
display_name_opt: str | None,
publisher_opt: str | None,
technical_name_opt: str | None,
) -> ExtensionNames:
"""
Prompt for extension info with graceful validation and re-prompting.
Args:
display_name_opt: Display name provided via CLI option (if any)
publisher_opt: Publisher provided via CLI option (if any)
technical_name_opt: Technical name provided via CLI option (if any)
Returns:
ExtensionNames: Validated extension name variants
"""
# Step 1: Get display name
if display_name_opt:
display_name = display_name_opt
try:
display_name = validate_display_name(display_name)
except ExtensionNameError as e:
click.secho(f"{e}", fg="red")
sys.exit(1)
else:
while True:
display_name = click.prompt("Extension display name", type=str)
try:
display_name = validate_display_name(display_name)
break
except ExtensionNameError as e:
click.secho(f"{e}", fg="red")
# Step 2: Get technical name (with suggestion from display name)
if technical_name_opt:
technical_name = technical_name_opt
try:
validate_technical_name(technical_name)
except ExtensionNameError as e:
click.secho(f"{e}", fg="red")
sys.exit(1)
else:
# Suggest technical name from display name
try:
suggested_technical = suggest_technical_name(display_name)
except ExtensionNameError:
suggested_technical = "extension"
while True:
technical_name = click.prompt(
f"Extension name ({suggested_technical})",
default=suggested_technical,
type=str,
)
try:
validate_technical_name(technical_name)
break
except ExtensionNameError as e:
click.secho(f"{e}", fg="red")
# Step 3: Get publisher
if publisher_opt:
publisher = publisher_opt
try:
validate_publisher(publisher)
except ExtensionNameError as e:
click.secho(f"{e}", fg="red")
sys.exit(1)
else:
while True:
publisher = click.prompt("Publisher (e.g., my-org)", type=str)
try:
validate_publisher(publisher)
break
except ExtensionNameError as e:
click.secho(f"{e}", fg="red")
# Generate all name variants
try:
return generate_extension_names(display_name, publisher, technical_name)
except ExtensionNameError as e:
click.secho(f"{e}", fg="red")
sys.exit(1)
@app.command()
@click.option(
"--id",
"id_opt",
"--publisher",
"publisher_opt",
default=None,
help="Extension ID (alphanumeric and underscores only)",
help="Publisher namespace (kebab-case, e.g. my-org)",
)
@click.option(
"--name",
"name_opt",
default=None,
help="Technical extension name (kebab-case, e.g. dashboard-widgets)",
)
@click.option(
"--display-name",
"display_name_opt",
default=None,
help="Extension display name (e.g. Dashboard Widgets)",
)
@click.option("--name", "name_opt", default=None, help="Extension display name")
@click.option(
"--version", "version_opt", default=None, help="Initial version (default: 0.1.0)"
)
@@ -424,25 +537,17 @@ def dev(ctx: click.Context) -> None:
"--backend/--no-backend", "backend_opt", default=None, help="Include backend"
)
def init(
id_opt: str | None,
publisher_opt: str | None,
name_opt: str | None,
display_name_opt: str | None,
version_opt: str | None,
license_opt: str | None,
frontend_opt: bool | None,
backend_opt: bool | None,
) -> None:
id_ = id_opt or click.prompt(
"Extension ID (unique identifier, alphanumeric only)", type=str
)
if not re.match(r"^[a-zA-Z0-9_]+$", id_):
click.secho(
"❌ ID must be alphanumeric (letters, digits, underscore).", fg="red"
)
sys.exit(1)
# Get extension names with graceful validation
names = prompt_for_extension_info(display_name_opt, publisher_opt, name_opt)
name = name_opt or click.prompt(
"Extension name (human-readable display name)", type=str
)
version = version_opt or click.prompt("Initial version", default="0.1.0")
license_ = license_opt or click.prompt("License", default="Apache-2.0")
include_frontend = (
@@ -456,7 +561,7 @@ def init(
else click.confirm("Include backend?", default=True)
)
target_dir = Path.cwd() / id_
target_dir = Path.cwd() / names["id"]
if target_dir.exists():
click.secho(f"❌ Directory {target_dir} already exists.", fg="red")
sys.exit(1)
@@ -465,8 +570,7 @@ def init(
templates_dir = Path(__file__).parent / "templates"
env = Environment(loader=FileSystemLoader(templates_dir)) # noqa: S701
ctx = {
"id": id_,
"name": name,
**names, # Include all name variants
"include_frontend": include_frontend,
"include_backend": include_backend,
"license": license_,
@@ -502,29 +606,48 @@ def init(
(frontend_src_dir / "index.tsx").write_text(index_tsx)
click.secho("✅ Created frontend folder structure", fg="green")
# Initialize backend files
# Initialize backend files with superset_extensions.publisher.name structure
if include_backend:
backend_dir = target_dir / "backend"
backend_dir.mkdir()
backend_src_dir = backend_dir / "src"
backend_src_dir.mkdir()
backend_src_package_dir = backend_src_dir / id_
backend_src_package_dir.mkdir()
# Create superset_extensions namespace directory
namespace_dir = backend_src_dir / "superset_extensions"
namespace_dir.mkdir()
# Create publisher directory (e.g., superset_extensions/my_org)
publisher_snake = kebab_to_snake_case(names["publisher"])
publisher_dir = namespace_dir / publisher_snake
publisher_dir.mkdir()
# Create extension package directory (e.g., superset_extensions/my_org/dashboard_widgets)
name_snake = kebab_to_snake_case(names["name"])
extension_package_dir = publisher_dir / name_snake
extension_package_dir.mkdir()
# backend files
pyproject_toml = env.get_template("backend/pyproject.toml.j2").render(ctx)
(backend_dir / "pyproject.toml").write_text(pyproject_toml)
# Namespace package __init__.py (empty for namespace)
(namespace_dir / "__init__.py").write_text("")
(publisher_dir / "__init__.py").write_text("")
# Extension package files
init_py = env.get_template("backend/src/package/__init__.py.j2").render(ctx)
(backend_src_package_dir / "__init__.py").write_text(init_py)
(extension_package_dir / "__init__.py").write_text(init_py)
entrypoint_py = env.get_template("backend/src/package/entrypoint.py.j2").render(
ctx
)
(backend_src_package_dir / "entrypoint.py").write_text(entrypoint_py)
(extension_package_dir / "entrypoint.py").write_text(entrypoint_py)
click.secho("✅ Created backend folder structure", fg="green")
click.secho(
f"🎉 Extension {name} (ID: {id_}) initialized at {target_dir}", fg="cyan"
f"🎉 Extension {names['display_name']} (ID: {names['id']}) initialized at {target_dir}",
fg="cyan",
)
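
The namespace layout created above (`superset_extensions/<publisher>/<name>`) can be sketched with a hypothetical stand-in for `kebab_to_snake_case` (the real helper lives in `superset_extensions_cli.utils`):

```python
# Hypothetical stand-in for the CLI's kebab_to_snake_case helper.
def kebab_to_snake_case(name: str) -> str:
    return name.replace("-", "_")

publisher_snake = kebab_to_snake_case("my-org")           # "my_org"
name_snake = kebab_to_snake_case("dashboard-widgets")     # "dashboard_widgets"

# Backend package path as created by the init command above.
assert f"superset_extensions/{publisher_snake}/{name_snake}" == \
       "superset_extensions/my_org/dashboard_widgets"
```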

View File

@@ -0,0 +1,22 @@
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership. The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.
class ExtensionNameError(Exception):
"""Raised when extension name validation fails."""
pass

View File

@@ -1,4 +1,4 @@
[project]
name = "{{ id }}"
name = "{{ backend_package }}"
version = "{{ version }}"
license = "{{ license }}"

View File

@@ -1 +1 @@
print("{{ name }} extension registered")
print("{{ display_name }} extension registered")

View File

@@ -1,6 +1,7 @@
{
"id": "{{ id }}",
"publisher": "{{ publisher }}",
"name": "{{ name }}",
"displayName": "{{ display_name }}",
"version": "{{ version }}",
"license": "{{ license }}",
{% if include_frontend -%}
@@ -8,17 +9,19 @@
"contributions": {
"commands": [],
"views": {},
"menus": {}
"menus": {},
"editors": []
},
"moduleFederation": {
"name": "{{ mf_name }}",
"exposes": ["./index"]
}
},
{% endif -%}
{% if include_backend -%}
"backend": {
"entryPoints": ["{{ id }}.entrypoint"],
"files": ["backend/src/{{ id }}/**/*.py"]
"entryPoints": ["{{ backend_entry }}"],
"files": ["backend/src/{{ backend_path|replace('.', '/') }}/**/*.py"]
},
{% endif -%}
"permissions": []


@@ -1,5 +1,5 @@
{
"name": "{{ id }}",
"name": "{{ npm_name }}",
"version": "{{ version }}",
"main": "dist/main.js",
"types": "dist/publicAPI.d.ts",
@@ -23,8 +23,6 @@
"@babel/preset-typescript": "^7.26.0",
"@types/react": "^19.0.10",
"copy-webpack-plugin": "^13.0.0",
"install": "^0.13.0",
"npm": "^11.1.0",
"ts-loader": "^9.5.2",
"typescript": "^5.8.2",
"webpack": "^5.98.0",


@@ -19,7 +19,7 @@ module.exports = (env, argv) => {
filename: isProd ? undefined : "[name].[contenthash].js",
chunkFilename: "[name].[contenthash].js",
path: path.resolve(__dirname, "dist"),
publicPath: `/api/v1/extensions/${packageConfig.name}/`,
publicPath: `/api/v1/extensions/{{ publisher }}/{{ name }}/`,
},
resolve: {
extensions: [".ts", ".tsx", ".js", ".jsx"],
@@ -39,7 +39,7 @@ module.exports = (env, argv) => {
},
plugins: [
new ModuleFederationPlugin({
name: "{{ id }}",
name: "{{ mf_name }}",
filename: "remoteEntry.[contenthash].js",
exposes: {
"./index": "./src/index.tsx",


@@ -0,0 +1,49 @@
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership. The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.
from typing import TypedDict
class ExtensionNames(TypedDict):
"""Type definition for extension name variants following platform conventions."""
# Publisher namespace (e.g., "my-org")
publisher: str
# Technical extension name (e.g., "dashboard-widgets")
name: str
# Human-readable display name (e.g., "Dashboard Widgets")
display_name: str
# Composite extension ID - publisher.name (e.g., "my-org.dashboard-widgets")
id: str
# NPM package name - @publisher/name (e.g., "@my-org/dashboard-widgets")
npm_name: str
# Module Federation library - publisherCamel_nameCamel (e.g., "myOrg_dashboardWidgets")
mf_name: str
# Backend package name with hyphens for distribution (e.g., "my_org-dashboard_widgets")
backend_package: str
# Full backend import path (e.g., "superset_extensions.my_org.dashboard_widgets")
backend_path: str
# Backend entry point (e.g., "superset_extensions.my_org.dashboard_widgets.entrypoint")
backend_entry: str

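For a concrete sense of how the `ExtensionNames` fields relate, here is a hypothetical instance for a publisher `my-org` with technical name `dashboard-widgets`, with each value taken from the commented examples above (a sketch of the data shape only, not produced by calling the CLI):

```python
# Hypothetical ExtensionNames instance; values mirror the field comments above.
names = {
    "publisher": "my-org",
    "name": "dashboard-widgets",
    "display_name": "Dashboard Widgets",
    "id": "my-org.dashboard-widgets",               # publisher.name
    "npm_name": "@my-org/dashboard-widgets",        # @publisher/name
    "mf_name": "myOrg_dashboardWidgets",            # camelCase halves joined by "_"
    "backend_package": "my_org-dashboard_widgets",  # snake_case halves joined by "-"
    "backend_path": "superset_extensions.my_org.dashboard_widgets",
    "backend_entry": "superset_extensions.my_org.dashboard_widgets.entrypoint",
}
# Every derived field is a pure function of publisher + name.
assert names["id"] == f"{names['publisher']}.{names['name']}"
assert names["backend_entry"] == names["backend_path"] + ".entrypoint"
```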

@@ -16,15 +16,82 @@
# under the License.
import json # noqa: TID251
import re
import sys
from pathlib import Path
from typing import Any
from superset_core.extensions.constants import (
DISPLAY_NAME_PATTERN,
PUBLISHER_PATTERN,
TECHNICAL_NAME_PATTERN,
)
from superset_extensions_cli.exceptions import ExtensionNameError
from superset_extensions_cli.types import ExtensionNames
if sys.version_info >= (3, 11):
import tomllib
else:
import tomli as tomllib
# Python reserved keywords to avoid in package names
PYTHON_KEYWORDS = {
"and",
"as",
"assert",
"break",
"class",
"continue",
"def",
"del",
"elif",
"else",
"except",
"exec",
"finally",
"for",
"from",
"global",
"if",
"import",
"in",
"is",
"lambda",
"not",
"or",
"pass",
"print",
"raise",
"return",
"try",
"while",
"with",
"yield",
"False",
"None",
"True",
}
# npm reserved names to avoid
NPM_RESERVED = {
"node_modules",
"favicon.ico",
"www",
"http",
"https",
"ftp",
"localhost",
"package.json",
"npm",
"yarn",
"bower_components",
}
# Compiled patterns for publisher/name validation
PUBLISHER_REGEX = re.compile(PUBLISHER_PATTERN)
TECHNICAL_NAME_REGEX = re.compile(TECHNICAL_NAME_PATTERN)
DISPLAY_NAME_REGEX = re.compile(DISPLAY_NAME_PATTERN)
def read_toml(path: Path) -> dict[str, Any] | None:
if not path.is_file():
@@ -40,3 +107,276 @@ def read_json(path: Path) -> dict[str, Any] | None:
return None
return json.loads(path.read_text())
def _normalize_for_identifiers(name: str) -> str:
"""
Normalize display name to clean lowercase words.
Args:
name: Raw display name (e.g., "Hello World!")
Returns:
Normalized string (e.g., "hello world")
"""
# Convert to lowercase
normalized = name.lower().strip()
# Convert underscores and existing hyphens to spaces for consistent processing
normalized = normalized.replace("_", " ").replace("-", " ")
# Remove any non-alphanumeric characters except spaces
normalized = re.sub(r"[^a-z0-9\s]", "", normalized)
# Normalize whitespace (collapse multiple spaces, strip)
normalized = " ".join(normalized.split())
return normalized
def _normalized_to_kebab(normalized: str) -> str:
"""Convert normalized string to kebab-case."""
return normalized.replace(" ", "-")
def _normalized_to_snake(normalized: str) -> str:
"""Convert normalized string to snake_case."""
return normalized.replace(" ", "_")
def _normalized_to_camel(normalized: str) -> str:
"""Convert normalized string to camelCase."""
parts = normalized.split()
if not parts:
return ""
# First part lowercase, subsequent parts capitalized
return parts[0] + "".join(word.capitalize() for word in parts[1:])
def kebab_to_camel_case(kebab_name: str) -> str:
"""Convert kebab-case to camelCase (e.g., 'hello-world' -> 'helloWorld')."""
parts = kebab_name.split("-")
if not parts:
return ""
# First part lowercase, subsequent parts capitalized
return parts[0] + "".join(word.capitalize() for word in parts[1:])
def kebab_to_snake_case(kebab_name: str) -> str:
"""Convert kebab-case to snake_case (e.g., 'hello-world' -> 'hello_world')."""
return kebab_name.replace("-", "_")
def name_to_kebab_case(name: str) -> str:
"""Convert display name directly to kebab-case (e.g., 'Hello World' -> 'hello-world')."""
normalized = _normalize_for_identifiers(name)
return _normalized_to_kebab(normalized)
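The helpers above compose into a single pipeline: normalize first, then pick a casing. A minimal re-implementation (assuming only the behavior shown in this diff) illustrates the transformations:

```python
import re

def normalize(name: str) -> str:
    # Lowercase, treat "_" and "-" as spaces, drop other punctuation,
    # then collapse runs of whitespace — as in _normalize_for_identifiers.
    s = name.lower().strip().replace("_", " ").replace("-", " ")
    s = re.sub(r"[^a-z0-9\s]", "", s)
    return " ".join(s.split())

def to_camel(s: str) -> str:
    # First word lowercase, subsequent words capitalized.
    parts = s.split()
    if not parts:
        return ""
    return parts[0] + "".join(w.capitalize() for w in parts[1:])

normalized = normalize("Hello, World!")   # "hello world"
kebab = normalized.replace(" ", "-")      # "hello-world"
snake = normalized.replace(" ", "_")      # "hello_world"
camel = to_camel(normalized)              # "helloWorld"
```

The same normalized string feeds all three casings, which is what keeps the kebab, snake, and camel variants consistent with each other.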
def validate_python_package_name(name: str) -> None:
"""
Validate Python package name (snake_case format).
Raises:
ExtensionNameError: If name is invalid
"""
# Check if it starts with a number (invalid for Python identifiers)
if name[0].isdigit():
raise ExtensionNameError(f"Package name '{name}' cannot start with a number")
# Check if the first part (before any underscore) is a Python keyword
if (first_part := name.split("_")[0]) in PYTHON_KEYWORDS:
raise ExtensionNameError(
f"Package name cannot start with Python keyword '{first_part}'"
)
# Check if it's a valid Python identifier
if not name.replace("_", "a").isalnum():
raise ExtensionNameError(f"'{name}' is not a valid Python package name")
def validate_npm_package_name(name: str) -> None:
"""
Validate npm package name (kebab-case format).
Raises:
ExtensionNameError: If name is invalid
"""
if name.lower() in NPM_RESERVED:
raise ExtensionNameError(f"'{name}' is a reserved npm package name")
def validate_publisher(publisher: str) -> None:
"""
Validate publisher namespace format.
Args:
publisher: Publisher namespace (e.g., 'my-org')
Raises:
ExtensionNameError: If publisher is invalid
"""
if not publisher:
raise ExtensionNameError("Publisher cannot be empty")
if not PUBLISHER_REGEX.match(publisher):
raise ExtensionNameError(
"Publisher must start with a letter and contain only lowercase letters, numbers, and hyphens (e.g., 'my-org')"
)
def validate_technical_name(name: str) -> None:
"""
Validate technical extension name format.
Args:
name: Technical extension name (e.g., 'dashboard-widgets')
Raises:
ExtensionNameError: If name is invalid
"""
if not name:
raise ExtensionNameError("Extension name cannot be empty")
if not TECHNICAL_NAME_REGEX.match(name):
raise ExtensionNameError(
"Extension name must start with a letter and contain only lowercase letters, numbers, and hyphens (e.g., 'dashboard-widgets')"
)
def validate_display_name(display_name: str) -> str:
"""
Validate and normalize display name format.
Args:
display_name: Human-readable extension name
Returns:
Cleaned display name
Raises:
ExtensionNameError: If display name is invalid
"""
if not display_name or not display_name.strip():
raise ExtensionNameError("Display name cannot be empty")
# Normalize whitespace: strip and collapse multiple spaces
normalized = " ".join(display_name.strip().split())
if not DISPLAY_NAME_REGEX.match(normalized):
raise ExtensionNameError(
"Display name must start with a letter and can contain letters, numbers, spaces, hyphens, underscores, and dots (e.g., 'Dashboard Widgets')"
)
# Check for only whitespace/special chars after normalization
if not any(c.isalnum() for c in normalized):
raise ExtensionNameError(
"Display name must contain at least one letter or number"
)
return normalized
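The validators raise `ExtensionNameError` rather than return a status. A short sketch (re-implementing just the two rejection rules from `validate_python_package_name` above, with `ValueError` standing in for the CLI's exception and a subset keyword list for illustration) shows the failure modes:

```python
PYTHON_KEYWORDS = {"class", "import", "def"}  # illustrative subset

def check_package_name(name: str) -> None:
    # Rule 1: a Python identifier cannot start with a digit.
    if name[0].isdigit():
        raise ValueError(f"Package name '{name}' cannot start with a number")
    # Rule 2: the first underscore-separated part must not be a keyword.
    if (first := name.split("_")[0]) in PYTHON_KEYWORDS:
        raise ValueError(f"Package name cannot start with Python keyword '{first}'")

check_package_name("dashboard_widgets")  # valid: passes silently
for bad in ("2fast", "class_tools"):
    try:
        check_package_name(bad)
    except ValueError as exc:
        print(exc)
```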
def suggest_technical_name(display_name: str) -> str:
"""
Suggest technical name from display name.
Args:
display_name: Human-readable name (e.g., "Dashboard Widgets!")
Returns:
Technical name suggestion (e.g., "dashboard-widgets")
"""
# Normalize for identifiers
normalized = _normalize_for_identifiers(display_name)
# Convert to kebab-case
technical_name = _normalized_to_kebab(normalized)
# Remove any leading/trailing hyphens that might result from edge cases
technical_name = technical_name.strip("-")
# Ensure we have something left
if not technical_name:
raise ExtensionNameError(
"Display name must contain at least one letter or number"
)
return technical_name
def get_module_federation_name(publisher: str, name: str) -> str:
"""
Generate Module Federation container name.
Args:
publisher: Publisher namespace (e.g., 'my-org')
name: Technical name (e.g., 'dashboard-widgets')
Returns:
Module Federation name (e.g., 'myOrg_dashboardWidgets')
"""
publisher_camel = kebab_to_camel_case(publisher)
name_camel = kebab_to_camel_case(name)
return f"{publisher_camel}_{name_camel}"
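The Module Federation container name is just the two camelCased halves joined by an underscore. A self-contained sketch mirroring `kebab_to_camel_case` and `get_module_federation_name` from the diff:

```python
def kebab_to_camel(kebab: str) -> str:
    # "my-org" -> "myOrg": first segment lowercase, rest capitalized.
    parts = kebab.split("-")
    return parts[0] + "".join(w.capitalize() for w in parts[1:])

def mf_name(publisher: str, name: str) -> str:
    # Join the camelCased publisher and name with "_".
    return f"{kebab_to_camel(publisher)}_{kebab_to_camel(name)}"

assert mf_name("my-org", "dashboard-widgets") == "myOrg_dashboardWidgets"
```

The underscore separator keeps the result a valid JavaScript identifier, which a dotted or hyphenated composite ID would not be.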
def generate_extension_names(
display_name: str, publisher: str, technical_name: str | None = None
) -> ExtensionNames:
"""
Generate all extension name variants from input.
Args:
display_name: Human-readable name (e.g., "Dashboard Widgets")
publisher: Publisher namespace (e.g., "my-org")
technical_name: Technical name override, or None to auto-generate
Returns:
ExtensionNames: Dictionary with all name variants
Raises:
ExtensionNameError: If any name is invalid
"""
# Validate and normalize inputs
display_name = validate_display_name(display_name)
validate_publisher(publisher)
# Use provided technical name or generate from display name
if technical_name is None:
technical_name = suggest_technical_name(display_name)
else:
validate_technical_name(technical_name)
# Generate composite ID
composite_id = f"{publisher}.{technical_name}"
# Generate NPM package name
npm_name = f"@{publisher}/{technical_name}"
# Generate Module Federation name
mf_name = get_module_federation_name(publisher, technical_name)
# Generate backend names with collision protection
publisher_snake = kebab_to_snake_case(publisher)
name_snake = kebab_to_snake_case(technical_name)
backend_package = f"{publisher_snake}-{name_snake}"
backend_path = f"superset_extensions.{publisher_snake}.{name_snake}"
backend_entry = f"{backend_path}.entrypoint"
# Validate the generated names
validate_python_package_name(publisher_snake)
validate_python_package_name(name_snake)
validate_npm_package_name(technical_name)
return ExtensionNames(
publisher=publisher,
name=technical_name,
display_name=display_name,
id=composite_id,
npm_name=npm_name,
mf_name=mf_name,
backend_package=backend_package,
backend_path=backend_path,
backend_entry=backend_entry,
)

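End to end, every backend-facing name returned by `generate_extension_names` derives from `publisher` and the technical name by plain string composition. A compact sketch of those derivations (re-implemented from the function above; `kebab_to_snake_case` is a simple hyphen-to-underscore replace):

```python
publisher, name = "my-org", "dashboard-widgets"
pub_snake = publisher.replace("-", "_")    # "my_org"
name_snake = name.replace("-", "_")        # "dashboard_widgets"

composite_id = f"{publisher}.{name}"       # "my-org.dashboard-widgets"
npm_name = f"@{publisher}/{name}"          # "@my-org/dashboard-widgets"
backend_package = f"{pub_snake}-{name_snake}"
backend_path = f"superset_extensions.{pub_snake}.{name_snake}"
backend_entry = f"{backend_path}.entrypoint"

assert backend_entry == "superset_extensions.my_org.dashboard_widgets.entrypoint"
```

Nesting the backend path under `superset_extensions.<publisher_snake>` is what gives the collision protection the comment refers to: two publishers can both ship a `dashboard_widgets` package without their import paths clashing.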

@@ -46,8 +46,9 @@ def isolated_filesystem(tmp_path):
def extension_params():
"""Default parameters for extension creation."""
return {
"id": "test_extension",
"name": "Test Extension",
"publisher": "test-org",
"name": "test-extension",
"displayName": "Test Extension",
"version": "0.1.0",
"license": "Apache-2.0",
"include_frontend": True,
@@ -58,25 +59,25 @@ def extension_params():
@pytest.fixture
def cli_input_both():
"""CLI input for creating extension with both frontend and backend."""
return "test_extension\nTest Extension\n0.1.0\nApache-2.0\ny\ny\n"
return "Test Extension\n\ntest-org\n0.1.0\nApache-2.0\ny\ny\n"
@pytest.fixture
def cli_input_frontend_only():
"""CLI input for creating extension with frontend only."""
return "test_extension\nTest Extension\n0.1.0\nApache-2.0\ny\nn\n"
return "Test Extension\n\ntest-org\n0.1.0\nApache-2.0\ny\nn\n"
@pytest.fixture
def cli_input_backend_only():
"""CLI input for creating extension with backend only."""
return "test_extension\nTest Extension\n0.1.0\nApache-2.0\nn\ny\n"
return "Test Extension\n\ntest-org\n0.1.0\nApache-2.0\nn\ny\n"
@pytest.fixture
def cli_input_neither():
"""CLI input for creating extension with neither frontend nor backend."""
return "test_extension\nTest Extension\n0.1.0\nApache-2.0\nn\nn\n"
return "Test Extension\n\ntest-org\n0.1.0\nApache-2.0\nn\nn\n"
@pytest.fixture
@@ -86,10 +87,11 @@ def extension_setup_for_dev():
def _setup(base_path: Path) -> None:
import json
# Create extension.json
# Create extension.json with new structure
extension_json = {
"id": "test_extension",
"name": "Test Extension",
"publisher": "test-org",
"name": "test-extension",
"displayName": "Test Extension",
"version": "1.0.0",
"permissions": [],
}
@@ -113,10 +115,12 @@ def extension_setup_for_bundling():
dist_dir = base_path / "dist"
dist_dir.mkdir(parents=True)
# Create manifest.json
# Create manifest.json with composite ID
manifest = {
"id": "test_extension",
"name": "Test Extension",
"id": "test-org.test-extension",
"publisher": "test-org",
"name": "test-extension",
"displayName": "Test Extension",
"version": "1.0.0",
"permissions": [],
}
@@ -128,8 +132,15 @@ def extension_setup_for_bundling():
(frontend_dir / "remoteEntry.abc123.js").write_text("// remote entry")
(frontend_dir / "main.js").write_text("// main js")
# Create some backend files
backend_dir = dist_dir / "backend" / "src" / "test_extension"
# Create some backend files - updated path structure
backend_dir = (
dist_dir
/ "backend"
/ "src"
/ "superset_extensions"
/ "test_org"
/ "test_extension"
)
backend_dir.mkdir(parents=True)
(backend_dir / "__init__.py").write_text("# init")


@@ -52,20 +52,33 @@ def extension_with_build_structure():
# Create extension.json
extension_json = {
"id": "test_extension",
"name": "Test Extension",
"publisher": "test-org",
"name": "test-extension",
"displayName": "Test Extension",
"version": "1.0.0",
"permissions": [],
}
if include_frontend:
extension_json["frontend"] = {
"contributions": {"commands": []},
"moduleFederation": {"exposes": ["./index"]},
"contributions": {
"commands": [],
"views": {},
"menus": {},
"editors": [],
},
"moduleFederation": {
"exposes": ["./index"],
"name": "testOrg_testExtension",
},
}
if include_backend:
extension_json["backend"] = {"entryPoints": ["test_extension.entrypoint"]}
extension_json["backend"] = {
"entryPoints": [
"superset_extensions.test_org.test_extension.entrypoint"
]
}
(base_path / "extension.json").write_text(json.dumps(extension_json))
@@ -230,16 +243,27 @@ def test_build_manifest_creates_correct_manifest_structure(isolated_filesystem):
"""Test build_manifest creates correct manifest from extension.json."""
# Create extension.json
extension_data = {
"id": "test_extension",
"name": "Test Extension",
"publisher": "test-org",
"name": "test-extension",
"displayName": "Test Extension",
"version": "1.0.0",
"permissions": ["read_data"],
"dependencies": ["some_dep"],
"frontend": {
"contributions": {"commands": [{"id": "test_command", "title": "Test"}]},
"moduleFederation": {"exposes": ["./index"]},
"contributions": {
"commands": [{"id": "test_command", "title": "Test"}],
"views": {},
"menus": {},
"editors": [],
},
"moduleFederation": {
"exposes": ["./index"],
"name": "testOrg_testExtension",
},
},
"backend": {"entryPoints": ["test_extension.entrypoint"]},
"backend": {
"entryPoints": ["superset_extensions.test_org.test_extension.entrypoint"]
},
}
extension_json = isolated_filesystem / "extension.json"
extension_json.write_text(json.dumps(extension_data))
@@ -247,8 +271,10 @@ def test_build_manifest_creates_correct_manifest_structure(isolated_filesystem):
manifest = build_manifest(isolated_filesystem, "remoteEntry.abc123.js")
# Verify manifest structure
assert manifest.id == "test_extension"
assert manifest.name == "Test Extension"
assert manifest.id == "test-org.test-extension" # Composite ID
assert manifest.publisher == "test-org"
assert manifest.name == "test-extension"
assert manifest.displayName == "Test Extension"
assert manifest.version == "1.0.0"
assert manifest.permissions == ["read_data"]
assert manifest.dependencies == ["some_dep"]
@@ -263,15 +289,18 @@ def test_build_manifest_creates_correct_manifest_structure(isolated_filesystem):
# Verify backend section
assert manifest.backend is not None
assert manifest.backend.entryPoints == ["test_extension.entrypoint"]
assert manifest.backend.entryPoints == [
"superset_extensions.test_org.test_extension.entrypoint"
]
@pytest.mark.unit
def test_build_manifest_handles_minimal_extension(isolated_filesystem):
"""Test build_manifest with minimal extension.json (no frontend/backend)."""
extension_data = {
"id": "minimal_extension",
"name": "Minimal Extension",
"publisher": "minimal-org",
"name": "minimal-extension",
"displayName": "Minimal Extension",
"version": "0.1.0",
"permissions": [],
}
@@ -280,8 +309,10 @@ def test_build_manifest_handles_minimal_extension(isolated_filesystem):
manifest = build_manifest(isolated_filesystem, None)
assert manifest.id == "minimal_extension"
assert manifest.name == "Minimal Extension"
assert manifest.id == "minimal-org.minimal-extension" # Composite ID
assert manifest.publisher == "minimal-org"
assert manifest.name == "minimal-extension"
assert manifest.displayName == "Minimal Extension"
assert manifest.version == "0.1.0"
assert manifest.permissions == []
assert manifest.dependencies == [] # Default empty list
@@ -393,8 +424,9 @@ def test_rebuild_backend_calls_copy_and_shows_message(isolated_filesystem):
# Create extension.json
extension_json = {
"id": "test",
"name": "Test Extension",
"publisher": "test-org",
"name": "test-extension",
"displayName": "Test Extension",
"version": "1.0.0",
"permissions": [],
}
@@ -420,8 +452,9 @@ def test_copy_backend_files_skips_non_files(isolated_filesystem):
# Create extension.json with backend file patterns
extension_data = {
"id": "test_ext",
"name": "Test Extension",
"publisher": "test-org",
"name": "test-ext",
"displayName": "Test Extension",
"version": "1.0.0",
"permissions": [],
"backend": {
@@ -457,8 +490,9 @@ def test_copy_backend_files_copies_matched_files(isolated_filesystem):
# Create extension.json with backend file patterns
extension_data = {
"id": "test_ext",
"name": "Test Extension",
"publisher": "test-org",
"name": "test-ext",
"displayName": "Test Extension",
"version": "1.0.0",
"permissions": [],
"backend": {"files": ["backend/src/test_ext/**/*.py"]},
@@ -480,8 +514,9 @@ def test_copy_backend_files_copies_matched_files(isolated_filesystem):
def test_copy_backend_files_handles_no_backend_config(isolated_filesystem):
"""Test copy_backend_files handles extension.json without backend config."""
extension_data = {
"id": "frontend_only",
"name": "Frontend Only Extension",
"publisher": "frontend-org",
"name": "frontend-only",
"displayName": "Frontend Only Extension",
"version": "1.0.0",
"permissions": [],
}


@@ -43,10 +43,10 @@ def test_bundle_command_creates_zip_with_default_name(
result = cli_runner.invoke(app, ["bundle"])
assert result.exit_code == 0
assert "✅ Bundle created: test_extension-1.0.0.supx" in result.output
assert "✅ Bundle created: test-org.test-extension-1.0.0.supx" in result.output
# Verify zip file was created
zip_path = isolated_filesystem / "test_extension-1.0.0.supx"
zip_path = isolated_filesystem / "test-org.test-extension-1.0.0.supx"
assert_file_exists(zip_path)
# Verify zip contents
@@ -55,7 +55,10 @@ def test_bundle_command_creates_zip_with_default_name(
assert "manifest.json" in file_list
assert "frontend/dist/remoteEntry.abc123.js" in file_list
assert "frontend/dist/main.js" in file_list
assert "backend/src/test_extension/__init__.py" in file_list
assert (
"backend/src/superset_extensions/test_org/test_extension/__init__.py"
in file_list
)
@pytest.mark.cli
@@ -100,7 +103,7 @@ def test_bundle_command_with_output_directory(
assert result.exit_code == 0
# Verify zip file was created in output directory
expected_path = output_dir / "test_extension-1.0.0.supx"
expected_path = output_dir / "test-org.test-extension-1.0.0.supx"
assert_file_exists(expected_path)
assert f"✅ Bundle created: {expected_path}" in result.output
@@ -159,8 +162,10 @@ def test_bundle_includes_all_files_recursively(
# Manifest
manifest = {
"id": "complex_extension",
"name": "Complex Extension",
"id": "complex-org.complex-extension",
"publisher": "complex-org",
"name": "complex-extension",
"displayName": "Complex Extension",
"version": "2.1.0",
"permissions": [],
}
@@ -191,7 +196,7 @@ def test_bundle_includes_all_files_recursively(
assert result.exit_code == 0
# Verify zip file and contents
zip_path = isolated_filesystem / "complex_extension-2.1.0.supx"
zip_path = isolated_filesystem / "complex-org.complex-extension-2.1.0.supx"
assert_file_exists(zip_path)
with zipfile.ZipFile(zip_path, "r") as zipf:


@@ -49,7 +49,13 @@ def test_dev_command_starts_watchers(
"""Test dev command starts file watchers."""
# Setup mocks
mock_rebuild_frontend.return_value = "remoteEntry.abc123.js"
mock_build_manifest.return_value = Manifest(id="test", name="test", version="1.0.0")
mock_build_manifest.return_value = Manifest(
id="test-org.test-extension",
publisher="test-org",
name="test-extension",
displayName="Test Extension",
version="1.0.0",
)
mock_observer = Mock()
mock_observer_class.return_value = mock_observer
@@ -101,7 +107,13 @@ def test_dev_command_initial_build(
"""Test dev command performs initial build setup."""
# Setup mocks
mock_rebuild_frontend.return_value = "remoteEntry.abc123.js"
mock_build_manifest.return_value = Manifest(id="test", name="test", version="1.0.0")
mock_build_manifest.return_value = Manifest(
id="test-org.test-extension",
publisher="test-org",
name="test-extension",
displayName="Test Extension",
version="1.0.0",
)
extension_setup_for_dev(isolated_filesystem)
@@ -178,8 +190,9 @@ def test_frontend_watcher_function_coverage(isolated_filesystem):
"""Test frontend watcher function for coverage."""
# Create extension.json
extension_json = {
"id": "test_extension",
"name": "Test Extension",
"publisher": "test-org",
"name": "test-extension",
"displayName": "Test Extension",
"version": "1.0.0",
"permissions": [],
}
@@ -189,7 +202,13 @@ def test_frontend_watcher_function_coverage(isolated_filesystem):
dist_dir = isolated_filesystem / "dist"
dist_dir.mkdir()
mock_manifest = Manifest(id="test", name="test", version="1.0.0")
mock_manifest = Manifest(
id="test-org.test-extension",
publisher="test-org",
name="test-extension",
displayName="Test Extension",
version="1.0.0",
)
with patch("superset_extensions_cli.cli.rebuild_frontend") as mock_rebuild:
with patch("superset_extensions_cli.cli.build_manifest") as mock_build:
with patch("superset_extensions_cli.cli.write_manifest") as mock_write:


@@ -43,16 +43,17 @@ def test_init_creates_extension_with_both_frontend_and_backend(
assert result.exit_code == 0, f"Command failed with output: {result.output}"
assert (
"🎉 Extension Test Extension (ID: test_extension) initialized" in result.output
"🎉 Extension Test Extension (ID: test-org.test-extension) initialized"
in result.output
)
# Verify directory structure
extension_path = isolated_filesystem / "test_extension"
extension_path = isolated_filesystem / "test-org.test-extension"
assert_directory_exists(extension_path, "main extension directory")
expected_structure = create_test_extension_structure(
isolated_filesystem,
"test_extension",
"test-org.test-extension",
include_frontend=True,
include_backend=True,
)
@@ -73,7 +74,7 @@ def test_init_creates_extension_with_frontend_only(
assert result.exit_code == 0, f"Command failed with output: {result.output}"
extension_path = isolated_filesystem / "test_extension"
extension_path = isolated_filesystem / "test-org.test-extension"
assert_directory_exists(extension_path)
# Should have frontend directory and package.json
@@ -96,7 +97,7 @@ def test_init_creates_extension_with_backend_only(
assert result.exit_code == 0, f"Command failed with output: {result.output}"
extension_path = isolated_filesystem / "test_extension"
extension_path = isolated_filesystem / "test-org.test-extension"
assert_directory_exists(extension_path)
# Should have backend directory and pyproject.toml
@@ -119,7 +120,7 @@ def test_init_creates_extension_with_neither_frontend_nor_backend(
assert result.exit_code == 0, f"Command failed with output: {result.output}"
extension_path = isolated_filesystem / "test_extension"
extension_path = isolated_filesystem / "test-org.test-extension"
assert_directory_exists(extension_path)
# Should only have extension.json
@@ -130,54 +131,54 @@ def test_init_creates_extension_with_neither_frontend_nor_backend(
assert not (extension_path / "backend").exists()
@pytest.mark.cli
def test_init_accepts_valid_display_name(cli_runner, isolated_filesystem):
"""Test that init accepts valid display names and generates proper ID."""
cli_input = "My Awesome Extension\n\ntest-org\n0.1.0\nApache-2.0\ny\ny\n"
result = cli_runner.invoke(app, ["init"], input=cli_input)
assert result.exit_code == 0, f"Should accept display name: {result.output}"
assert Path("test-org.my-awesome-extension").exists(), (
"Directory for generated composite ID should be created"
)
@pytest.mark.cli
def test_init_accepts_mixed_alphanumeric_name(cli_runner, isolated_filesystem):
"""Test that init accepts mixed alphanumeric display names."""
cli_input = "Tool 123\n\ntest-org\n0.1.0\nApache-2.0\ny\ny\n"
result = cli_runner.invoke(app, ["init"], input=cli_input)
assert result.exit_code == 0, (
f"Mixed alphanumeric display name should be valid: {result.output}"
)
assert Path("test-org.tool-123").exists(), (
"Directory for 'test-org.tool-123' should be created"
)
@pytest.mark.cli
@pytest.mark.parametrize(
"invalid_name,expected_error",
"display_name,expected_id",
[
("test-extension", "must be alphanumeric"),
("test extension", "must be alphanumeric"),
("test.extension", "must be alphanumeric"),
("test@extension", "must be alphanumeric"),
("", "must be alphanumeric"),
("Test Extension", "test-org.test-extension"),
("My Tool v2", "test-org.my-tool-v2"),
("Dashboard Helper", "test-org.dashboard-helper"),
("Chart Builder Pro", "test-org.chart-builder-pro"),
],
)
def test_init_validates_extension_name(
cli_runner, isolated_filesystem, invalid_name, expected_error
):
"""Test that init validates extension names according to regex pattern."""
cli_input = f"{invalid_name}\n0.1.0\nApache-2.0\ny\ny\n"
result = cli_runner.invoke(app, ["init"], input=cli_input)
assert result.exit_code == 1, (
f"Expected command to fail for invalid name '{invalid_name}'"
)
assert expected_error in result.output
@pytest.mark.cli
def test_init_accepts_numeric_extension_name(cli_runner, isolated_filesystem):
"""Test that init accepts numeric extension ids like '123'."""
cli_input = "123\n123\n0.1.0\nApache-2.0\ny\ny\n"
result = cli_runner.invoke(app, ["init"], input=cli_input)
assert result.exit_code == 0, f"Numeric id '123' should be valid: {result.output}"
assert Path("123").exists(), "Directory for '123' should be created"
@pytest.mark.cli
@pytest.mark.parametrize(
"valid_id", ["test123", "TestExtension", "test_extension_123", "MyExt_1"]
)
def test_init_with_valid_alphanumeric_names(cli_runner, valid_id):
"""Test that init accepts various valid alphanumeric names."""
def test_init_with_various_display_names(cli_runner, display_name, expected_id):
"""Test that init accepts various display names and generates proper IDs."""
with cli_runner.isolated_filesystem():
cli_input = f"{valid_id}\nTest Extension\n0.1.0\nApache-2.0\ny\ny\n"
cli_input = f"{display_name}\n\ntest-org\n0.1.0\nApache-2.0\ny\ny\n"
result = cli_runner.invoke(app, ["init"], input=cli_input)
assert result.exit_code == 0, (
f"Valid name '{valid_id}' was rejected: {result.output}"
f"Valid display name '{display_name}' was rejected: {result.output}"
)
assert Path(expected_id).exists(), (
f"Directory for '{expected_id}' was not created"
)
assert Path(valid_id).exists(), f"Directory for '{valid_id}' was not created"
@pytest.mark.cli
@@ -186,7 +187,7 @@ def test_init_fails_when_directory_already_exists(
):
"""Test that init fails gracefully when target directory already exists."""
# Create the directory first
existing_dir = isolated_filesystem / "test_extension"
existing_dir = isolated_filesystem / "test-org.test-extension"
existing_dir.mkdir()
result = cli_runner.invoke(app, ["init"], input=cli_input_both)
@@ -203,15 +204,16 @@ def test_extension_json_content_is_correct(
result = cli_runner.invoke(app, ["init"], input=cli_input_both)
assert result.exit_code == 0
extension_path = isolated_filesystem / "test_extension"
extension_path = isolated_filesystem / "test-org.test-extension"
extension_json_path = extension_path / "extension.json"
# Verify the JSON structure and values
assert_json_content(
extension_json_path,
{
"id": "test_extension",
"name": "Test Extension",
"publisher": "test-org",
"name": "test-extension",
"displayName": "Test Extension",
"version": "0.1.0",
"license": "Apache-2.0",
"permissions": [],
@@ -226,16 +228,28 @@ def test_extension_json_content_is_correct(
frontend = content["frontend"]
assert "contributions" in frontend
assert "moduleFederation" in frontend
assert frontend["contributions"] == {"commands": [], "views": {}, "menus": {}}
assert frontend["moduleFederation"] == {"exposes": ["./index"]}
assert frontend["contributions"] == {
"commands": [],
"views": {},
"menus": {},
"editors": [],
}
assert frontend["moduleFederation"] == {
"exposes": ["./index"],
"name": "testOrg_testExtension",
}
# Verify backend section exists and has correct structure
assert "backend" in content
backend = content["backend"]
assert "entryPoints" in backend
assert "files" in backend
assert backend["entryPoints"] == ["test_extension.entrypoint"]
assert backend["files"] == ["backend/src/test_extension/**/*.py"]
assert backend["entryPoints"] == [
"superset_extensions.test_org.test_extension.entrypoint"
]
assert backend["files"] == [
"backend/src/superset_extensions/test_org/test_extension/**/*.py"
]
@pytest.mark.cli
@@ -246,14 +260,14 @@ def test_frontend_package_json_content_is_correct(
result = cli_runner.invoke(app, ["init"], input=cli_input_both)
assert result.exit_code == 0
extension_path = isolated_filesystem / "test_extension"
extension_path = isolated_filesystem / "test-org.test-extension"
package_json_path = extension_path / "frontend" / "package.json"
# Verify the package.json structure and values
assert_json_content(
package_json_path,
{
"name": "test_extension",
"name": "@test-org/test-extension",
"version": "0.1.0",
"license": "Apache-2.0",
},
@@ -275,14 +289,16 @@ def test_backend_pyproject_toml_is_created(
result = cli_runner.invoke(app, ["init"], input=cli_input_both)
assert result.exit_code == 0
extension_path = isolated_filesystem / "test_extension"
extension_path = isolated_filesystem / "test-org.test-extension"
pyproject_path = extension_path / "backend" / "pyproject.toml"
assert_file_exists(pyproject_path, "backend pyproject.toml")
# Basic content verification (without parsing TOML for now)
content = pyproject_path.read_text()
assert "test_extension" in content
assert (
"test_org-test_extension" in content
) # Package name uses collision-safe naming
assert "0.1.0" in content
assert "Apache-2.0" in content
@@ -300,7 +316,9 @@ def test_init_command_output_messages(cli_runner, isolated_filesystem, cli_input
assert "Created .gitignore" in output
assert "Created frontend folder structure" in output
assert "Created backend folder structure" in output
assert "Extension Test Extension (ID: test_extension) initialized" in output
assert (
"Extension Test Extension (ID: test-org.test-extension) initialized" in output
)
@pytest.mark.cli
@@ -309,7 +327,7 @@ def test_gitignore_content_is_correct(cli_runner, isolated_filesystem, cli_input
result = cli_runner.invoke(app, ["init"], input=cli_input_both)
assert result.exit_code == 0
extension_path = isolated_filesystem / "test_extension"
extension_path = isolated_filesystem / "test-org.test-extension"
gitignore_path = extension_path / ".gitignore"
assert_file_exists(gitignore_path, ".gitignore")
@@ -329,19 +347,20 @@ def test_gitignore_content_is_correct(cli_runner, isolated_filesystem, cli_input
@pytest.mark.cli
def test_init_with_custom_version_and_license(cli_runner, isolated_filesystem):
"""Test init with custom version and license parameters."""
cli_input = "my_extension\nMy Extension\n2.1.0\nMIT\ny\nn\n"
cli_input = "My Extension\n\ntest-org\n2.1.0\nMIT\ny\nn\n"
result = cli_runner.invoke(app, ["init"], input=cli_input)
assert result.exit_code == 0
extension_path = isolated_filesystem / "my_extension"
extension_path = isolated_filesystem / "test-org.my-extension"
extension_json_path = extension_path / "extension.json"
assert_json_content(
extension_json_path,
{
"id": "my_extension",
"name": "My Extension",
"publisher": "test-org",
"name": "my-extension",
"displayName": "My Extension",
"version": "2.1.0",
"license": "MIT",
},
@@ -353,17 +372,17 @@ def test_init_with_custom_version_and_license(cli_runner, isolated_filesystem):
def test_full_init_workflow_integration(cli_runner, isolated_filesystem):
"""Integration test for the complete init workflow."""
# Test the complete flow with realistic user input
cli_input = "awesome_charts\nAwesome Charts\n1.0.0\nApache-2.0\ny\ny\n"
cli_input = "Awesome Charts\n\nawesome-org\n1.0.0\nApache-2.0\ny\ny\n"
result = cli_runner.invoke(app, ["init"], input=cli_input)
# Verify success
assert result.exit_code == 0
# Verify complete directory structure
extension_path = isolated_filesystem / "awesome_charts"
extension_path = isolated_filesystem / "awesome-org.awesome-charts"
expected_structure = create_test_extension_structure(
isolated_filesystem,
"awesome_charts",
"awesome-org.awesome-charts",
include_frontend=True,
include_backend=True,
)
@@ -374,16 +393,19 @@ def test_full_init_workflow_integration(cli_runner, isolated_filesystem):
# Verify all generated files have correct content
extension_json = load_json_file(extension_path / "extension.json")
assert extension_json["id"] == "awesome_charts"
assert extension_json["name"] == "Awesome Charts"
assert extension_json["publisher"] == "awesome-org"
assert extension_json["name"] == "awesome-charts"
assert extension_json["displayName"] == "Awesome Charts"
assert extension_json["version"] == "1.0.0"
assert extension_json["license"] == "Apache-2.0"
package_json = load_json_file(extension_path / "frontend" / "package.json")
assert package_json["name"] == "awesome_charts"
assert package_json["name"] == "@awesome-org/awesome-charts"
pyproject_content = (extension_path / "backend" / "pyproject.toml").read_text()
assert "awesome_charts" in pyproject_content
assert (
"awesome_org-awesome_charts" in pyproject_content
) # Package name uses collision-safe naming
# Non-interactive mode tests
@@ -394,9 +416,11 @@ def test_init_non_interactive_with_all_options(cli_runner, isolated_filesystem):
app,
[
"init",
"--id",
"my_ext",
"--publisher",
"my-org",
"--name",
"my-ext",
"--display-name",
"My Extension",
"--version",
"1.0.0",
@@ -408,16 +432,17 @@ def test_init_non_interactive_with_all_options(cli_runner, isolated_filesystem):
)
assert result.exit_code == 0, f"Command failed with output: {result.output}"
assert "🎉 Extension My Extension (ID: my_ext) initialized" in result.output
assert "🎉 Extension My Extension (ID: my-org.my-ext) initialized" in result.output
extension_path = isolated_filesystem / "my_ext"
extension_path = isolated_filesystem / "my-org.my-ext"
assert_directory_exists(extension_path)
assert_directory_exists(extension_path / "frontend")
assert_directory_exists(extension_path / "backend")
extension_json = load_json_file(extension_path / "extension.json")
assert extension_json["id"] == "my_ext"
assert extension_json["name"] == "My Extension"
assert extension_json["publisher"] == "my-org"
assert extension_json["name"] == "my-ext"
assert extension_json["displayName"] == "My Extension"
assert extension_json["version"] == "1.0.0"
assert extension_json["license"] == "MIT"
@@ -429,9 +454,11 @@ def test_init_frontend_only_with_cli_options(cli_runner, isolated_filesystem):
app,
[
"init",
"--id",
"frontend_ext",
"--publisher",
"frontend-org",
"--name",
"frontend-ext",
"--display-name",
"Frontend Extension",
"--version",
"1.0.0",
@@ -444,7 +471,7 @@ def test_init_frontend_only_with_cli_options(cli_runner, isolated_filesystem):
assert result.exit_code == 0, f"Command failed with output: {result.output}"
extension_path = isolated_filesystem / "frontend_ext"
extension_path = isolated_filesystem / "frontend-org.frontend-ext"
assert_directory_exists(extension_path / "frontend")
assert not (extension_path / "backend").exists()
@@ -456,9 +483,11 @@ def test_init_backend_only_with_cli_options(cli_runner, isolated_filesystem):
app,
[
"init",
"--id",
"backend_ext",
"--publisher",
"backend-org",
"--name",
"backend-ext",
"--display-name",
"Backend Extension",
"--version",
"1.0.0",
@@ -471,7 +500,7 @@ def test_init_backend_only_with_cli_options(cli_runner, isolated_filesystem):
assert result.exit_code == 0, f"Command failed with output: {result.output}"
extension_path = isolated_filesystem / "backend_ext"
extension_path = isolated_filesystem / "backend-org.backend-ext"
assert not (extension_path / "frontend").exists()
assert_directory_exists(extension_path / "backend")
@@ -479,14 +508,16 @@ def test_init_backend_only_with_cli_options(cli_runner, isolated_filesystem):
@pytest.mark.cli
def test_init_prompts_for_missing_options(cli_runner, isolated_filesystem):
"""Test that init prompts for options not provided via CLI and uses defaults."""
# Provide id and name via CLI, but version/license will be prompted (accept defaults)
# Provide publisher, name, and display-name via CLI, but version/license will be prompted (accept defaults)
result = cli_runner.invoke(
app,
[
"init",
"--id",
"default_ext",
"--publisher",
"default-org",
"--name",
"default-ext",
"--display-name",
"Default Extension",
"--frontend",
"--backend",
@@ -496,22 +527,24 @@ def test_init_prompts_for_missing_options(cli_runner, isolated_filesystem):
assert result.exit_code == 0, f"Command failed with output: {result.output}"
extension_path = isolated_filesystem / "default_ext"
extension_path = isolated_filesystem / "default-org.default-ext"
extension_json = load_json_file(extension_path / "extension.json")
assert extension_json["version"] == "0.1.0"
assert extension_json["license"] == "Apache-2.0"
@pytest.mark.cli
def test_init_non_interactive_validates_id(cli_runner, isolated_filesystem):
"""Test that non-interactive mode validates extension ID."""
def test_init_non_interactive_validates_technical_name(cli_runner, isolated_filesystem):
"""Test that non-interactive mode validates technical name."""
result = cli_runner.invoke(
app,
[
"init",
"--id",
"invalid-id",
"--publisher",
"test-org",
"--name",
"invalid_name",
"--display-name",
"Invalid Extension",
"--frontend",
"--backend",
@@ -519,4 +552,4 @@ def test_init_non_interactive_validates_id(cli_runner, isolated_filesystem):
)
assert result.exit_code == 1
assert "must be alphanumeric" in result.output
assert "must start with a letter" in result.output.lower()
@@ -0,0 +1,502 @@
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership. The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.
import pytest
from superset_extensions_cli.exceptions import ExtensionNameError
from superset_extensions_cli.utils import (
generate_extension_names,
get_module_federation_name,
kebab_to_camel_case,
kebab_to_snake_case,
name_to_kebab_case,
suggest_technical_name,
validate_display_name,
validate_npm_package_name,
validate_publisher,
validate_python_package_name,
validate_technical_name,
)
# Name transformation tests
@pytest.mark.parametrize(
("display_name", "expected"),
[
("Hello World", "hello-world"),
("Data Explorer", "data-explorer"),
("My Extension", "my-extension"),
("hello-world", "hello-world"), # Already normalized
("Hello@World!", "helloworld"), # Special chars removed
(
"Data_Explorer",
"data-explorer",
), # Underscores become spaces then hyphens
("My Extension", "my-extension"), # Multiple spaces normalized
(" Hello World ", "hello-world"), # Trimmed
("API v2 Client", "api-v2-client"), # Numbers preserved
("Simple", "simple"), # Single word
],
)
def test_name_to_kebab_case(display_name, expected):
"""Test direct kebab case conversion from display names."""
assert name_to_kebab_case(display_name) == expected
@pytest.mark.parametrize(
("kebab_name", "expected"),
[
("hello-world", "helloWorld"),
("data-explorer", "dataExplorer"),
("my-extension", "myExtension"),
("api-v2-client", "apiV2Client"),
("simple", "simple"), # Single word
("chart-tool", "chartTool"),
("dashboard-helper", "dashboardHelper"),
],
)
def test_kebab_to_camel_case(kebab_name, expected):
"""Test kebab-case to camelCase conversion."""
assert kebab_to_camel_case(kebab_name) == expected
@pytest.mark.parametrize(
("kebab_name", "expected"),
[
("hello-world", "hello_world"),
("data-explorer", "data_explorer"),
("my-extension", "my_extension"),
("api-v2-client", "api_v2_client"),
("simple", "simple"), # Single word
("chart-tool", "chart_tool"),
("dashboard-helper", "dashboard_helper"),
],
)
def test_kebab_to_snake_case(kebab_name, expected):
"""Test kebab-case to snake_case conversion."""
assert kebab_to_snake_case(kebab_name) == expected
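The three conversions exercised above can be sketched as plain string transforms. This is a hypothetical reconstruction consistent with the parametrized cases, not necessarily the CLI's actual `utils` implementation:

```python
import re


def name_to_kebab_case(display_name: str) -> str:
    # Drop special characters, collapse space/underscore runs into hyphens,
    # then lowercase — e.g. "Data_Explorer" -> "data-explorer".
    cleaned = re.sub(r"[^a-zA-Z0-9 _-]", "", display_name.strip())
    cleaned = re.sub(r"[ _]+", "-", cleaned)
    return re.sub(r"-+", "-", cleaned).lower()


def kebab_to_snake_case(kebab: str) -> str:
    return kebab.replace("-", "_")


def kebab_to_camel_case(kebab: str) -> str:
    head, *rest = kebab.split("-")
    return head + "".join(part.capitalize() for part in rest)
```

These sketches reproduce the expected outputs in the test tables above, including the edge cases ("Hello@World!" → "helloworld", "api-v2-client" → "apiV2Client").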
# Display name validation tests
@pytest.mark.parametrize(
("valid_display", "expected_normalized"),
[
("Hello World", "Hello World"),
("Data Explorer", "Data Explorer"),
("My Extension", "My Extension"),
("Simple", "Simple"),
(" Extra Spaces ", "Extra Spaces"), # Gets normalized
("Dashboard Widgets", "Dashboard Widgets"),
("Chart Builder Pro", "Chart Builder Pro"),
("API Client v2.0", "API Client v2.0"),
("Tool_123", "Tool_123"), # Underscores allowed
("My-Extension", "My-Extension"), # Hyphens allowed
],
)
def test_validate_display_name_valid(valid_display, expected_normalized):
"""Test valid display names return correctly normalized output."""
result = validate_display_name(valid_display)
assert result == expected_normalized
@pytest.mark.parametrize(
("invalid_display", "error_match"),
[
("", "cannot be empty"),
(" ", "cannot be empty"),
("@#$%", "must start with a letter"),
("123 Tool", "must start with a letter"),
("-My Extension", "must start with a letter"),
],
)
def test_validate_display_name_invalid(invalid_display, error_match):
"""Test invalid display names."""
with pytest.raises(ExtensionNameError, match=error_match):
validate_display_name(invalid_display)
# Python package name validation tests
@pytest.mark.parametrize(
("valid_package",),
[
("hello_world",),
("data_explorer",),
("myext",),
("test123",),
("package_with_many_parts",),
],
)
def test_validate_python_package_name_valid(valid_package):
"""Test valid Python package names."""
# Should not raise exceptions
validate_python_package_name(valid_package)
@pytest.mark.parametrize(
("keyword",),
[
("class",),
("import",),
("def",),
("return",),
("if",),
("else",),
("for",),
("while",),
("try",),
("except",),
("finally",),
("with",),
("as",),
("lambda",),
("yield",),
("False",),
("None",),
("True",),
],
)
def test_validate_python_package_name_keywords(keyword):
"""Test that Python reserved keywords are rejected."""
with pytest.raises(
ExtensionNameError, match="Package name cannot start with Python keyword"
):
validate_python_package_name(keyword)
@pytest.mark.parametrize(
("invalid_package",),
[
("hello-world",), # Hyphens not allowed in Python identifiers
],
)
def test_validate_python_package_name_invalid(invalid_package):
"""Test invalid Python package names."""
with pytest.raises(ExtensionNameError, match="not a valid Python package"):
validate_python_package_name(invalid_package)
# NPM package validation tests
@pytest.mark.parametrize(
("valid_npm",),
[
("hello-world",),
("data-explorer",),
("myext",),
("package-with-many-parts",),
],
)
def test_validate_npm_package_name_valid(valid_npm):
"""Test valid npm package names."""
# Should not raise exceptions
validate_npm_package_name(valid_npm)
@pytest.mark.parametrize(
("reserved_name",),
[
("node_modules",),
("npm",),
("yarn",),
("package.json",),
("localhost",),
("favicon.ico",),
],
)
def test_validate_npm_package_name_reserved(reserved_name):
"""Test that npm reserved names are rejected."""
with pytest.raises(ExtensionNameError, match="reserved npm package name"):
validate_npm_package_name(reserved_name)
# Publisher validation tests
@pytest.mark.parametrize(
("valid_publisher",),
[
("my-org",),
("acme",),
("apache-superset",),
("test123",),
("a",), # Single character
("publisher-with-many-parts",),
],
)
def test_validate_publisher_valid(valid_publisher):
"""Test valid publisher namespaces."""
# Should not raise exceptions
validate_publisher(valid_publisher)
@pytest.mark.parametrize(
("invalid_publisher", "error_match"),
[
("", "cannot be empty"),
("My-Org", "must start with a letter and contain only lowercase letters"),
("-publisher", "must start with a letter and contain only lowercase letters"),
("publisher-", "must start with a letter and contain only lowercase letters"),
("pub--lisher", "must start with a letter and contain only lowercase letters"),
],
)
def test_validate_publisher_invalid(invalid_publisher, error_match):
"""Test invalid publisher namespaces."""
with pytest.raises(ExtensionNameError, match=error_match):
validate_publisher(invalid_publisher)
# Technical name validation tests
@pytest.mark.parametrize(
("valid_name",),
[
("dashboard-widgets",),
("chart-builder",),
("simple",),
("api-client-v2",),
("tool123",),
],
)
def test_validate_technical_name_valid(valid_name):
"""Test valid technical names."""
# Should not raise exceptions
validate_technical_name(valid_name)
@pytest.mark.parametrize(
("invalid_name", "error_match"),
[
("", "cannot be empty"),
(
"Dashboard-Widgets",
"must start with a letter and contain only lowercase letters",
),
("-name", "must start with a letter and contain only lowercase letters"),
("name-", "must start with a letter and contain only lowercase letters"),
("na--me", "must start with a letter and contain only lowercase letters"),
],
)
def test_validate_technical_name_invalid(invalid_name, error_match):
"""Test invalid technical names."""
with pytest.raises(ExtensionNameError, match=error_match):
validate_technical_name(invalid_name)
# Name suggestion tests
@pytest.mark.parametrize(
("display_name", "expected_technical"),
[
("Dashboard Widgets", "dashboard-widgets"),
("Chart Builder Pro!", "chart-builder-pro"),
("My@Tool#123", "mytool123"),
(" Spaced Out ", "spaced-out"),
("API v2 Client", "api-v2-client"),
],
)
def test_suggest_technical_name(display_name, expected_technical):
"""Test technical name suggestion from display names."""
result = suggest_technical_name(display_name)
assert result == expected_technical
@pytest.mark.parametrize(
("publisher", "name", "expected_mf"),
[
("my-org", "dashboard-widgets", "myOrg_dashboardWidgets"),
("acme", "chart-builder", "acme_chartBuilder"),
("test-company", "simple", "testCompany_simple"),
],
)
def test_get_module_federation_name(publisher, name, expected_mf):
"""Test Module Federation name generation."""
result = get_module_federation_name(publisher, name)
assert result == expected_mf
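Per the cases above, the Module Federation name joins the camelCased publisher and technical name with an underscore. A minimal sketch (assuming a `kebab_to_camel_case` helper like the one tested earlier; the real function may differ):

```python
def kebab_to_camel_case(kebab: str) -> str:
    head, *rest = kebab.split("-")
    return head + "".join(part.capitalize() for part in rest)


def get_module_federation_name(publisher: str, name: str) -> str:
    # "my-org" + "dashboard-widgets" -> "myOrg_dashboardWidgets"
    return f"{kebab_to_camel_case(publisher)}_{kebab_to_camel_case(name)}"
```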
# Complete name generation tests
@pytest.mark.parametrize(
("display_name", "expected_kebab", "expected_snake", "expected_camel"),
[
("Hello World", "hello-world", "hello_world", "helloWorld"),
("Data Explorer", "data-explorer", "data_explorer", "dataExplorer"),
("My Extension v2", "my-extension-v2", "my_extension_v2", "myExtensionV2"),
("Chart Tool", "chart-tool", "chart_tool", "chartTool"),
("Simple", "simple", "simple", "simple"),
("API v2 Client", "api-v2-client", "api_v2_client", "apiV2Client"),
(
"Dashboard Helper",
"dashboard-helper",
"dashboard_helper",
"dashboardHelper",
),
],
)
def test_generate_extension_names_complete_flow(
display_name, expected_kebab, expected_snake, expected_camel
):
"""Test complete name generation flow with publisher concept."""
publisher = "test-org"
names = generate_extension_names(display_name, publisher, expected_kebab)
# Test all transformations with publisher concept
assert names["display_name"] == display_name
assert names["publisher"] == publisher
assert names["name"] == expected_kebab # Technical name
assert names["id"] == f"{publisher}.{expected_kebab}" # Composite ID
assert names["npm_name"] == f"@{publisher}/{expected_kebab}" # NPM scoped
assert (
names["mf_name"] == f"testOrg_{expected_camel}"
) # Module Federation with publisher prefix
assert (
names["backend_package"] == f"{publisher.replace('-', '_')}-{expected_snake}"
) # Collision-safe
assert (
names["backend_path"]
== f"superset_extensions.{publisher.replace('-', '_')}.{expected_snake}"
)
assert (
names["backend_entry"]
== f"superset_extensions.{publisher.replace('-', '_')}.{expected_snake}.entrypoint"
)
@pytest.mark.parametrize(
("invalid_display",),
[
("Class Helper",), # Would create 'class_helper' - reserved keyword
("Import Tool",), # Would create 'import_tool' - reserved keyword
("@#$%",), # All special chars - becomes empty
("123 Tool",), # Starts with number after kebab conversion
],
)
def test_generate_extension_names_invalid(invalid_display):
"""Test invalid name generation scenarios."""
with pytest.raises(ExtensionNameError):
generate_extension_names(invalid_display, "test-org")
def test_generate_extension_names_unicode():
"""Test handling of unicode characters."""
    # Use a simpler approach: display name validation now requires starting with a letter
names = generate_extension_names("Cafe Extension", "test-org", "cafe-extension")
assert names["id"] == "test-org.cafe-extension"
assert names["display_name"] == "Cafe Extension" # Original preserved
def test_generate_extension_names_special_chars():
"""Test name generation with special characters."""
# Use manual technical name since display validation is stricter
names = generate_extension_names("My Extension", "test-org", "my-extension")
assert names["display_name"] == "My Extension"
assert names["id"] == "test-org.my-extension"
assert names["backend_package"] == "test_org-my_extension"
def test_generate_extension_names_case_preservation():
"""Test that display name case is preserved."""
names = generate_extension_names("CamelCase Extension", "test-org")
assert names["display_name"] == "CamelCase Extension"
assert names["id"] == "test-org.camelcase-extension"
# Edge case tests
@pytest.mark.parametrize(
("edge_case",),
[
("",), # Empty string
(" ",), # Only spaces
("---",), # Only hyphens
("___",), # Only underscores
],
)
def test_empty_or_invalid_inputs(edge_case):
"""Test inputs that become empty or invalid after processing."""
with pytest.raises(ExtensionNameError):
generate_extension_names(edge_case, "test-org")
def test_minimal_valid_input():
"""Test minimal valid input."""
names = generate_extension_names("A Extension", "test-org")
assert names["id"] == "test-org.a-extension"
assert names["backend_package"] == "test_org-a_extension"
def test_numbers_handling():
"""Test handling of numbers in names."""
names = generate_extension_names("Tool 123 v2", "test-org")
assert names["id"] == "test-org.tool-123-v2"
assert names["backend_package"] == "test_org-tool_123_v2"
def test_manual_technical_name_override():
"""Test using manual technical name instead of auto-generated."""
display_name = "My Awesome Chart Builder Pro"
publisher = "acme"
technical_name = "chart-builder" # Much shorter than display name
# Create names using manual technical name
names = generate_extension_names(display_name, publisher, technical_name)
# Verify technical names come from provided technical name, not display name
assert (
names["display_name"] == "My Awesome Chart Builder Pro"
) # Display name preserved
assert names["publisher"] == "acme"
assert names["name"] == "chart-builder" # Technical name used
assert names["id"] == "acme.chart-builder" # Composite ID
assert names["mf_name"] == "acme_chartBuilder" # Module Federation format
assert names["backend_package"] == "acme-chart_builder" # Collision-safe
assert names["backend_path"] == "superset_extensions.acme.chart_builder"
assert names["backend_entry"] == "superset_extensions.acme.chart_builder.entrypoint"
def test_generate_names_uses_suggested_technical_names():
"""Test that generate_extension_names can auto-suggest technical names."""
display_name = "Hello World"
publisher = "test-org"
# Generated names should use suggested technical name generation
names = generate_extension_names(display_name, publisher)
# Verify the technical name was suggested from display name
assert names["name"] == "hello-world"
assert names["id"] == "test-org.hello-world"
# Verify other names were generated from the technical name and publisher
assert names["mf_name"] == get_module_federation_name(
"test-org", "hello-world"
) # "testOrg_helloWorld"
assert names["backend_package"] == "test_org-hello_world"
# Module Federation name should use underscore format with camelCase
assert names["mf_name"] == "testOrg_helloWorld"
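Putting the pieces together, the derived-name dictionary these tests assert on can be sketched as follows. This is a hypothetical recreation for illustration (validation of publisher format, Python keywords, and npm reserved names is omitted; `derive_names` is not the CLI's actual function name):

```python
def _camel(kebab: str) -> str:
    head, *rest = kebab.split("-")
    return head + "".join(part.capitalize() for part in rest)


def derive_names(display_name: str, publisher: str, name: str) -> dict:
    # Shows only how the derived fields fit together, mirroring the
    # assertions in the tests above.
    pub_snake = publisher.replace("-", "_")
    snake = name.replace("-", "_")
    path = f"superset_extensions.{pub_snake}.{snake}"
    return {
        "display_name": display_name,
        "publisher": publisher,
        "name": name,
        "id": f"{publisher}.{name}",                  # composite ID
        "npm_name": f"@{publisher}/{name}",           # npm scoped package
        "mf_name": f"{_camel(publisher)}_{_camel(name)}",
        "backend_package": f"{pub_snake}-{snake}",    # collision-safe naming
        "backend_path": path,
        "backend_entry": f"{path}.entrypoint",
    }
```

For example, `derive_names("My Awesome Chart Builder Pro", "acme", "chart-builder")` yields the same `acme.chart-builder` / `acme_chartBuilder` / `superset_extensions.acme.chart_builder.entrypoint` values the override test checks.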
@@ -42,8 +42,15 @@ def jinja_env(templates_dir):
def template_context():
"""Default template context for testing."""
return {
"id": "test_extension",
"name": "Test Extension",
"publisher": "test-org",
"name": "test-extension",
"display_name": "Test Extension",
"id": "test-org.test-extension",
"npm_name": "@test-org/test-extension",
"mf_name": "testOrg_testExtension",
"backend_package": "test_org-test_extension",
"backend_path": "superset_extensions.test_org.test_extension",
"backend_entry": "superset_extensions.test_org.test_extension.entrypoint",
"version": "0.1.0",
"license": "Apache-2.0",
"include_frontend": True,
@@ -64,8 +71,9 @@ def test_extension_json_template_renders_with_both_frontend_and_backend(
parsed = json.loads(rendered)
# Verify basic fields
assert parsed["id"] == "test_extension"
assert parsed["name"] == "Test Extension"
assert parsed["publisher"] == "test-org"
assert parsed["name"] == "test-extension"
assert parsed["displayName"] == "Test Extension"
assert parsed["version"] == "0.1.0"
assert parsed["license"] == "Apache-2.0"
assert parsed["permissions"] == []
@@ -75,14 +83,26 @@ def test_extension_json_template_renders_with_both_frontend_and_backend(
frontend = parsed["frontend"]
assert "contributions" in frontend
assert "moduleFederation" in frontend
assert frontend["contributions"] == {"commands": [], "views": {}, "menus": {}}
assert frontend["moduleFederation"] == {"exposes": ["./index"]}
assert frontend["contributions"] == {
"commands": [],
"views": {},
"menus": {},
"editors": [],
}
assert frontend["moduleFederation"] == {
"exposes": ["./index"],
"name": "testOrg_testExtension",
}
# Verify backend section exists
assert "backend" in parsed
backend = parsed["backend"]
assert backend["entryPoints"] == ["test_extension.entrypoint"]
assert backend["files"] == ["backend/src/test_extension/**/*.py"]
assert backend["entryPoints"] == [
"superset_extensions.test_org.test_extension.entrypoint"
]
assert backend["files"] == [
"backend/src/superset_extensions/test_org/test_extension/**/*.py"
]
@pytest.mark.unit
@@ -127,7 +147,7 @@ def test_frontend_package_json_template_renders_correctly(jinja_env, template_co
parsed = json.loads(rendered)
# Verify basic package info
assert parsed["name"] == "test_extension"
assert parsed["name"] == "@test-org/test-extension"
assert parsed["version"] == "0.1.0"
assert parsed["license"] == "Apache-2.0"
assert parsed["private"] is True
@@ -161,7 +181,7 @@ def test_backend_pyproject_toml_template_renders_correctly(jinja_env, template_c
rendered = template.render(template_context)
# Basic content verification (without full TOML parsing)
assert "test_extension" in rendered
assert "test_org-test_extension" in rendered
assert "0.1.0" in rendered
assert "Apache-2.0" in rendered
@@ -169,19 +189,36 @@ def test_backend_pyproject_toml_template_renders_correctly(jinja_env, template_c
# Template Rendering with Different Parameters Tests
@pytest.mark.unit
@pytest.mark.parametrize(
"id_,name",
"publisher,technical_name,display_name",
[
("simple_extension", "Simple Extension"),
("MyExtension123", "My Extension 123"),
("complex_extension_name_123", "Complex Extension Name 123"),
("ext", "Ext"),
("test-org", "simple-extension", "Simple Extension"),
("acme", "my-extension-123", "My Extension 123"),
("company", "complex-extension-name-123", "Complex Extension Name 123"),
("pub", "ext", "Ext"),
],
)
def test_template_rendering_with_different_ids(jinja_env, id_, name):
"""Test templates render correctly with various extension ids/names."""
def test_template_rendering_with_different_ids(
jinja_env, publisher, technical_name, display_name
):
"""Test templates render correctly with various publisher/name combinations."""
from superset_extensions_cli.utils import (
get_module_federation_name,
kebab_to_snake_case,
)
publisher_snake = kebab_to_snake_case(publisher)
name_snake = kebab_to_snake_case(technical_name)
context = {
"id": id_,
"name": name,
"publisher": publisher,
"name": technical_name,
"display_name": display_name,
"id": f"{publisher}.{technical_name}",
"npm_name": f"@{publisher}/{technical_name}",
"mf_name": get_module_federation_name(publisher, technical_name),
"backend_package": f"{publisher_snake}-{name_snake}",
"backend_path": f"superset_extensions.{publisher_snake}.{name_snake}",
"backend_entry": f"superset_extensions.{publisher_snake}.{name_snake}.entrypoint",
"version": "1.0.0",
"license": "MIT",
"include_frontend": True,
@@ -193,23 +230,28 @@ def test_template_rendering_with_different_ids(jinja_env, id_, name):
rendered = template.render(context)
parsed = json.loads(rendered)
assert parsed["id"] == id_
assert parsed["name"] == name
assert parsed["backend"]["entryPoints"] == [f"{id_}.entrypoint"]
assert parsed["backend"]["files"] == [f"backend/src/{id_}/**/*.py"]
assert parsed["publisher"] == publisher
assert parsed["name"] == technical_name
assert parsed["displayName"] == display_name
assert parsed["backend"]["entryPoints"] == [
f"superset_extensions.{publisher_snake}.{name_snake}.entrypoint"
]
assert parsed["backend"]["files"] == [
f"backend/src/superset_extensions/{publisher_snake}/{name_snake}/**/*.py"
]
# Test package.json template
template = jinja_env.get_template("frontend/package.json.j2")
rendered = template.render(context)
parsed = json.loads(rendered)
assert parsed["name"] == id_
assert parsed["name"] == f"@{publisher}/{technical_name}"
# Test pyproject.toml template
template = jinja_env.get_template("backend/pyproject.toml.j2")
rendered = template.render(context)
assert id_ in rendered
assert f"{publisher_snake}-{name_snake}" in rendered
@pytest.mark.unit
@@ -217,8 +259,12 @@ def test_template_rendering_with_different_ids(jinja_env, id_, name):
def test_template_rendering_with_different_versions(jinja_env, version):
"""Test templates render correctly with various version formats."""
context = {
"id": "test_ext",
"name": "Test Extension",
"publisher": "test-pub",
"name": "test-ext",
"display_name": "Test Extension",
"id": "test-pub.test-ext",
"npm_name": "@test-pub/test-ext",
"mf_name": "testPub_testExt",
"version": version,
"license": "Apache-2.0",
"include_frontend": True,
@@ -246,8 +292,15 @@ def test_template_rendering_with_different_versions(jinja_env, version):
def test_template_rendering_with_different_licenses(jinja_env, license_type):
"""Test templates render correctly with various license types."""
context = {
"id": "test_ext",
"name": "Test Extension",
"publisher": "test-pub",
"name": "test-ext",
"display_name": "Test Extension",
"id": "test-pub.test-ext",
"npm_name": "@test-pub/test-ext",
"mf_name": "testPub_testExt",
"backend_package": "test_pub-test_ext",
"backend_path": "superset_extensions.test_pub.test_ext",
"backend_entry": "superset_extensions.test_pub.test_ext.entrypoint",
"version": "1.0.0",
"license": license_type,
"include_frontend": True,
@@ -312,8 +365,15 @@ def test_template_context_edge_cases(jinja_env):
"""Test template rendering with edge case contexts."""
# Test with minimal context
minimal_context = {
"id": "minimal",
"name": "Minimal",
"publisher": "min",
"name": "minimal",
"display_name": "Minimal",
"id": "min.minimal",
"npm_name": "@min/minimal",
"mf_name": "min_minimal",
"backend_package": "min-minimal",
"backend_path": "superset_extensions.min.minimal",
"backend_entry": "superset_extensions.min.minimal.entrypoint",
"version": "1.0.0",
"license": "MIT",
"include_frontend": False,
@@ -325,7 +385,8 @@ def test_template_context_edge_cases(jinja_env):
parsed = json.loads(rendered)
# Should still be valid JSON with basic fields
assert parsed["id"] == "minimal"
assert parsed["name"] == "Minimal"
assert parsed["publisher"] == "min"
assert parsed["name"] == "minimal"
assert parsed["displayName"] == "Minimal"
assert "frontend" not in parsed
assert "backend" not in parsed
@@ -16,6 +16,10 @@
* specific language governing permissions and limitations
* under the License.
*/
// Register TypeScript require hook so ESLint can load .ts plugin files
require('tsx/cjs');
const packageConfig = require('./package.json');
const importCoreModules = [];
@@ -135,7 +139,9 @@ module.exports = {
'icons',
'i18n-strings',
'react-prefer-function-component',
'react-you-might-not-need-an-effect',
'prettier',
'react-you-might-not-need-an-effect',
],
rules: {
// === Essential Superset customizations ===
@@ -146,7 +152,7 @@ module.exports = {
// Custom Superset rules
'theme-colors/no-literal-colors': 'error',
'icons/no-fa-icons-usage': 'error',
'i18n-strings/no-template-vars': ['error', true],
'i18n-strings/no-template-vars': 'error',
// Core ESLint overrides for Superset
'no-console': 'warn',
@@ -193,7 +199,7 @@ module.exports = {
'**/jest.setup.js',
'**/webpack.config.js',
'**/webpack.config.*.js',
'**/.eslintrc.js',
'**/.eslintrc*.js',
],
optionalDependencies: false,
},
@@ -235,12 +241,32 @@ module.exports = {
'jsx-a11y/mouse-events-have-key-events': 0,
'jsx-a11y/no-static-element-interactions': 0,
// React effect best practices
'react-you-might-not-need-an-effect/no-empty-effect': 'error',
'react-you-might-not-need-an-effect/no-pass-live-state-to-parent': 'error',
'react-you-might-not-need-an-effect/no-initialize-state': 'error',
// Lodash
'lodash/import-scope': [2, 'member'],
// React effect best practices
'react-you-might-not-need-an-effect/no-reset-all-state-on-prop-change':
'error',
'react-you-might-not-need-an-effect/no-chain-state-updates': 'error',
'react-you-might-not-need-an-effect/no-event-handler': 'error',
'react-you-might-not-need-an-effect/no-derived-state': 'error',
// Storybook
'storybook/prefer-pascal-case': 'error',
// File progress
'file-progress/activate': 1,
// React effect rules
'react-you-might-not-need-an-effect/no-adjust-state-on-prop-change':
'error',
'react-you-might-not-need-an-effect/no-pass-data-to-parent': 'error',
// Restricted imports
'no-restricted-imports': [
'error',
@@ -273,6 +299,52 @@ module.exports = {
],
},
overrides: [
// Ban JavaScript files in src/ - all new code must be TypeScript
{
files: ['src/**/*.js', 'src/**/*.jsx'],
rules: {
'no-restricted-syntax': [
'error',
{
selector: 'Program',
message:
'JavaScript files are not allowed in src/. Please use TypeScript (.ts/.tsx) instead.',
},
],
},
},
// Ban JavaScript files in plugins/ - all plugin source code must be TypeScript
{
files: ['plugins/**/src/**/*.js', 'plugins/**/src/**/*.jsx'],
rules: {
'no-restricted-syntax': [
'error',
{
selector: 'Program',
message:
'JavaScript files are not allowed in plugins/. Please use TypeScript (.ts/.tsx) instead.',
},
],
},
},
// Ban JavaScript files in packages/ - with exceptions for config files and generators
{
files: ['packages/**/src/**/*.js', 'packages/**/src/**/*.jsx'],
excludedFiles: [
'packages/generator-superset/**/*', // Yeoman generator templates run via Node
'packages/**/__mocks__/**/*', // Test mocks
],
rules: {
'no-restricted-syntax': [
'error',
{
selector: 'Program',
message:
'JavaScript files are not allowed in packages/. Please use TypeScript (.ts/.tsx) instead.',
},
],
},
},
{
files: ['*.ts', '*.tsx'],
parser: '@typescript-eslint/parser',
@@ -303,7 +375,7 @@ module.exports = {
],
'@typescript-eslint/no-empty-function': 0,
'@typescript-eslint/no-explicit-any': 0,
'@typescript-eslint/no-use-before-define': 1,
'@typescript-eslint/no-use-before-define': 'error',
'@typescript-eslint/no-non-null-assertion': 0,
'@typescript-eslint/explicit-function-return-type': 0,
'@typescript-eslint/explicit-module-boundary-types': 0,
@@ -399,27 +471,13 @@ module.exports = {
'**/spec/**/*',
],
excludedFiles: 'cypress-base/cypress/**/*',
plugins: ['jest', 'jest-dom', 'no-only-tests', 'testing-library'],
env: {
'jest/globals': true,
},
settings: {
jest: {
version: 'detect',
},
},
extends: [
'plugin:jest/recommended',
'plugin:jest-dom/recommended',
'plugin:testing-library/react',
],
plugins: ['jest-dom', 'no-only-tests', 'testing-library'],
extends: ['plugin:jest-dom/recommended', 'plugin:testing-library/react'],
rules: {
'import/no-extraneous-dependencies': [
'error',
{ devDependencies: true },
],
'jest/consistent-test-it': 'error',
'no-only-tests/no-only-tests': 'error',
'prefer-promise-reject-errors': 0,
'max-classes-per-file': 0,


@@ -17,6 +17,9 @@
* under the License.
*/
// Register TypeScript require hook so ESLint can load .ts plugin files
require('tsx/cjs');
/**
* MINIMAL ESLint config - ONLY for rules OXC doesn't support
* This config is designed to be run alongside OXC linter
@@ -66,7 +69,7 @@ module.exports = {
// Custom Superset plugins
'theme-colors/no-literal-colors': 'error',
'icons/no-fa-icons-usage': 'error',
'i18n-strings/no-template-vars': ['error', true],
'i18n-strings/no-template-vars': 'error',
'file-progress/activate': 1,
// Explicitly turn off all other rules to avoid conflicts


@@ -23,8 +23,13 @@ const customConfig = require('../webpack.config.js');
// Filter out plugins that shouldn't be included in Storybook's static build
// ReactRefreshWebpackPlugin adds Fast Refresh code that requires a dev server runtime,
// which isn't available when serving the static storybook build
// ForkTsCheckerWebpackPlugin causes TypeScript project reference errors in Storybook context
const pluginsToExclude = [
'ReactRefreshWebpackPlugin',
'ForkTsCheckerWebpackPlugin',
];
const filteredPlugins = customConfig.plugins.filter(
plugin => plugin.constructor.name !== 'ReactRefreshWebpackPlugin',
plugin => !pluginsToExclude.includes(plugin.constructor.name),
);
// Deep clone and modify rules to disable React Fast Refresh and dev mode in SWC loader
@@ -73,9 +78,9 @@ const disableDevModeInRules = rules =>
module.exports = {
stories: [
'../src/@(components|common|filters|explore|views|dashboard|features)/**/*.stories.@(tsx|jsx)',
'../packages/superset-ui-demo/storybook/stories/**/*.*.@(tsx|jsx)',
'../packages/superset-ui-core/src/components/**/*.stories.@(tsx|jsx)',
'../src/**/*.stories.tsx',
'../packages/superset-ui-core/src/**/*.stories.tsx',
'../plugins/*/src/**/*.stories.tsx',
],
addons: [
@@ -102,13 +107,15 @@ module.exports = {
...customConfig.resolve?.alias,
// Fix for Storybook 8.6.x with React 17 - resolve ESM module paths
'react-dom/test-utils': require.resolve('react-dom/test-utils'),
// Shared storybook utilities
'@storybook-shared': join(__dirname, 'shared'),
},
},
plugins: [...config.plugins, ...filteredPlugins],
}),
typescript: {
reactDocgen: 'react-docgen-typescript',
reactDocgen: getAbsolutePath('react-docgen-typescript'),
},
framework: {


@@ -28,6 +28,27 @@ import { App, Layout, Space, Content } from 'antd';
import 'src/theme.ts';
import './storybook.css';
// Set up bootstrap data for components that check HTML_SANITIZATION config
// (e.g., HandlebarsViewer). This allows <style> tags in Handlebars templates.
if (typeof document !== 'undefined') {
let appEl = document.getElementById('app');
if (!appEl) {
appEl = document.createElement('div');
appEl.id = 'app';
document.body.appendChild(appEl);
}
appEl.setAttribute(
'data-bootstrap',
JSON.stringify({
common: {
conf: {
HTML_SANITIZATION: false,
},
},
}),
);
}
export const GlobalStylesOverrides = () => (
<Global
styles={css`


@@ -61,10 +61,7 @@ export default function ResizableChartDemo({
);
}
export const withResizableChartDemo: Decorator<{
width: number;
height: number;
}> = (storyFn, context) => {
export const withResizableChartDemo: Decorator = (Story, context) => {
const {
parameters: { initialSize, panelPadding },
} = context;
@@ -73,7 +70,14 @@ export const withResizableChartDemo: Decorator<{
initialSize={initialSize as Size | undefined}
panelPadding={panelPadding}
>
{innerSize => storyFn({ ...context, ...context.args, ...innerSize })}
{innerSize => (
<Story
args={{
...context.args,
...innerSize,
}}
/>
)}
</ResizableChartDemo>
);
};


@@ -23,9 +23,25 @@ import {
ResizableBoxProps,
ResizeCallbackData,
} from 'react-resizable';
import { styled } from '@apache-superset/core/ui';
import 'react-resizable/css/styles.css';
const StyledResizableBox = styled(ResizableBox)`
&.panel {
overflow: hidden;
background: ${({ theme }) => theme.colorBgContainer};
border: 1px solid ${({ theme }) => theme.colorBorder};
border-radius: ${({ theme }) => theme.borderRadius}px;
}
.panel-body {
overflow: hidden;
width: 100%;
height: 100%;
}
`;
export type Size = ResizeCallbackData['size'];
export default function ResizablePanel({
@@ -41,7 +57,7 @@ export default function ResizablePanel({
}) {
const { width, height } = initialSize;
return (
<ResizableBox
<StyledResizableBox
className="panel"
width={width}
height={height}
@@ -60,6 +76,6 @@ export default function ResizablePanel({
{heading ? <div className="panel-heading">{heading}</div> : null}
<div className="panel-body">{children}</div>
</>
</ResizableBox>
</StyledResizableBox>
);
}


@@ -32,7 +32,7 @@ export default function createQueryStory({
[key: string]: {
chartType: string;
formData: {
[key: string]: any;
[key: string]: unknown;
};
};
};
@@ -43,7 +43,7 @@ export default function createQueryStory({
mode: string | number,
width: number,
height: number,
formData: any,
formData: string,
) => {
const { chartType } = choices[mode];


@@ -17,21 +17,18 @@
* under the License.
*/
import { SuperChart, VizType } from '@superset-ui/core';
import dummyDatasource from '../../../../../shared/dummyDatasource';
import data from '../data';
export const basic = () => (
<SuperChart
chartType="box-plot"
width={800}
height={600}
datasource={dummyDatasource}
queriesData={[{ data }]}
formData={{
colorScheme: 'd3Category10',
vizType: VizType.BoxPlot,
whiskerOptions: 'Min/max (no outliers)',
}}
/>
);
export { default as ErrorMessage } from './ErrorMessage';
export { default as Expandable } from './Expandable';
export { default as ResizablePanel, type Size } from './ResizablePanel';
export {
default as ResizableChartDemo,
SupersetBody,
withResizableChartDemo,
} from './ResizableChartDemo';
export {
default as VerifyCORS,
renderError,
type Props as VerifyCORSProps,
} from './VerifyCORS';
export { default as createQueryStory } from './createQueryStory';
export { default as dummyDatasource } from './dummyDatasource';


@@ -1,64 +0,0 @@
{
"$schema": "https://json.schemastore.org/swcrc",
"jsc": {
"parser": {
"syntax": "typescript",
"tsx": true,
"decorators": false,
"dynamicImport": true
},
"transform": {
"react": {
"runtime": "automatic",
"importSource": "@emotion/react",
"throwIfNamespace": true
},
"optimizer": {
"globals": {
"vars": {
"process.env.NODE_ENV": "production"
}
}
}
},
"target": "es2015",
"loose": true,
"externalHelpers": false,
"preserveAllComments": false,
"experimental": {
"plugins": [
[
"@swc/plugin-emotion",
{
"sourceMap": true,
"autoLabel": "dev-only",
"labelFormat": "[local]"
}
],
[
"@swc/plugin-transform-imports",
{
"lodash": {
"transform": "lodash/{{member}}",
"preventFullImport": true,
"skipDefaultConversion": false
},
"lodash-es": {
"transform": "lodash-es/{{member}}",
"preventFullImport": true,
"skipDefaultConversion": false
}
}
]
]
}
},
"module": {
"type": "es6",
"strict": false,
"strictMode": false,
"lazy": false,
"noInterop": false
},
"minify": false
}


@@ -52,8 +52,6 @@ module.exports = {
['@babel/plugin-transform-private-methods', { loose: true }],
['@babel/plugin-transform-nullish-coalescing-operator', { loose: true }],
['@babel/plugin-transform-runtime', { corejs: 3 }],
// only used in packages/superset-ui-core/src/chart/components/reactify.tsx
['babel-plugin-typescript-to-proptypes', { loose: true }],
[
'@emotion/babel-plugin',
{


@@ -1,29 +0,0 @@
/**
* Licensed to the Apache Software Foundation (ASF) under one
* or more contributor license agreements. See the NOTICE file
* distributed with this work for additional information
* regarding copyright ownership. The ASF licenses this file
* to you under the Apache License, Version 2.0 (the
* "License"); you may not use this file except in compliance
* with the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the License is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/
module.exports = {
apiKey: process.env.APPLITOOLS_API_KEY,
batchId: process.env.APPLITOOLS_BATCH_ID,
batchName: process.env.APPLITOOLS_BATCH_NAME,
browser: [{ width: 1920, height: 1080, name: 'chrome' }],
failCypressOnDiff: false,
isDisabled: false,
showLogs: false,
testConcurrency: 10,
ignoreCaret: true,
};

Some files were not shown because too many files have changed in this diff.