Compare commits

...

20 Commits

Author SHA1 Message Date
dependabot[bot]
d4c23d2d57 chore(deps-dev): update sqlalchemy-vertica-python requirement
Updates the requirements on [sqlalchemy-vertica-python](https://github.com/bluelabsio/sqlalchemy-vertica-python) to permit the latest version.
- [Release notes](https://github.com/bluelabsio/sqlalchemy-vertica-python/releases)
- [Commits](https://github.com/bluelabsio/sqlalchemy-vertica-python/compare/v0.5.9...v0.6.3)

---
updated-dependencies:
- dependency-name: sqlalchemy-vertica-python
  dependency-version: 0.6.3
  dependency-type: direct:development
...

Signed-off-by: dependabot[bot] <support@github.com>
2026-05-02 07:03:02 +00:00
dependabot[bot]
ad73395c89 chore(deps-dev): bump yeoman-test from 11.3.1 to 11.4.2 in /superset-frontend (#39816)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-05-02 13:46:47 +07:00
Evan Rusackas
867e173427 chore(deps): drop stale legacy-plugin-chart-map-box lockfile entry (#39825)
Co-authored-by: Superset Dev <dev@superset.apache.org>
Co-authored-by: Claude Opus 4.7 <noreply@anthropic.com>
2026-05-02 13:35:25 +07:00
dependabot[bot]
c90c8612ad chore(deps): bump @docusaurus/faster from 3.10.0 to 3.10.1 in /docs (#39804)
Signed-off-by: dependabot[bot] <support@github.com>
Signed-off-by: hainenber <dotronghai96@gmail.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: hainenber <dotronghai96@gmail.com>
2026-05-02 13:32:37 +07:00
Abdul Rehman
b14cca15f6 fix(table): preserve decimals in totals row when Time Comparison is enabled (#39747) 2026-05-02 13:31:54 +07:00
dependabot[bot]
9d4384e49e chore(deps-dev): bump @babel/preset-env from 7.29.2 to 7.29.3 in /superset-frontend (#39822)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-05-02 12:38:54 +07:00
jesperct
d8dd2d99b3 fix(time-comparison): use chart row_limit instead of instance config in offset queries (#39490) 2026-05-01 16:24:59 -07:00
dependabot[bot]
dbe26d81ce chore(deps-dev): bump baseline-browser-mapping from 2.10.21 to 2.10.24 in /superset-frontend (#39759)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-05-01 16:24:23 -07:00
Elizabeth Thompson
98eaaaa6d6 fix(mcp): clear stale thread-local DB session in sync tool wrapper (#39798)
Co-authored-by: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-05-01 09:24:48 -07:00
Jay Masiwal
cb74438865 fix(viz): correct table chart drill-to-detail temporal boundaries and null handling (#39668)
Co-authored-by: Samuelinto <samuel.mantilla@mail.utoronto.ca>
Co-authored-by: Amin Ghadersohi <amin.ghadersohi@gmail.com>
Co-authored-by: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-05-01 11:46:18 -04:00
Danylo Korostil
e77fb5e3fc feat(i18n): updated Ukrainian translation (#39720) 2026-05-01 11:12:05 -04:00
dependabot[bot]
1ac113fd44 chore(deps): bump aws-actions/amazon-ecs-render-task-definition from 1.8.4 to 1.8.5 (#39809)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-05-01 06:31:48 -07:00
dependabot[bot]
6bfdee98cd chore(deps-dev): bump @docusaurus/tsconfig from 3.10.0 to 3.10.1 in /docs (#39811)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-05-01 09:31:29 -04:00
dependabot[bot]
de45f3a928 chore(deps): bump aws-actions/amazon-ecs-deploy-task-definition from 2.6.1 to 2.6.2 (#39806)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-05-01 09:30:49 -04:00
dependabot[bot]
2ec53c0694 chore(deps): bump mapbox-gl from 3.22.0 to 3.23.0 in /superset-frontend (#39769)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-05-01 09:30:21 -04:00
Michael S. Molina
d23b0cad92 chore: Bump core packages to 0.1.0 RC3 (#39823) 2026-05-01 09:54:39 -03:00
Evan Rusackas
e585406fff chore(codeowners): notify @sfirke on translation changes (#39794)
Co-authored-by: Claude Code <noreply@anthropic.com>
2026-04-30 23:07:29 -04:00
Amin Ghadersohi
957b298ae1 fix(mcp): add default request parameter to list_charts and list_dashboards (#39730) 2026-04-30 18:04:39 -04:00
Amin Ghadersohi
f29d82b3b1 feat(mcp): add query_dataset tool to query datasets using semantic layer (#39727) 2026-04-30 18:03:41 -04:00
Vitor Avila
3f550f166f fix(GSheets OAuth2): Re-add UnauthenticatedError (#39785) 2026-04-30 18:57:00 -03:00
33 changed files with 7609 additions and 5991 deletions

.github/CODEOWNERS vendored

@@ -36,6 +36,10 @@
**/*.geojson @villebro @rusackas
/superset-frontend/plugins/legacy-plugin-chart-country-map/ @villebro @rusackas
+# Notify translation maintainers of changes to translations
+/superset/translations/ @sfirke
+# Notify PMC members of changes to extension-related files
+/docs/developer_portal/extensions/ @michael-s-molina @villebro @rusackas


@@ -265,7 +265,7 @@ jobs:
      - name: Fill in the new image ID in the Amazon ECS task definition
        id: task-def
-       uses: aws-actions/amazon-ecs-render-task-definition@77954e213ba1f9f9cb016b86a1d4f6fcdea0d57e # v1
+       uses: aws-actions/amazon-ecs-render-task-definition@6853cfae8c3a7d978fbf68b5a55453395541dfbb # v1
        with:
          task-definition: .github/workflows/ecs-task-definition.json
          container-name: superset-ci
@@ -300,7 +300,7 @@
            --tags key=pr,value=$PR_NUMBER key=github_user,value=${{ github.actor }}
      - name: Deploy Amazon ECS task definition
        id: deploy-task
-       uses: aws-actions/amazon-ecs-deploy-task-definition@fc8fc60f3a60ffd500fcb13b209c59d221ac8c8c # v2
+       uses: aws-actions/amazon-ecs-deploy-task-definition@a310a830f5c14e583e35d84e4e1ec7dd177c3c9c # v2
        with:
          task-definition: ${{ steps.task-def.outputs.task-definition }}
          service: pr-${{ github.event.inputs.issue_number || github.event.pull_request.number }}-service


@@ -41,12 +41,12 @@
  },
  "dependencies": {
    "@ant-design/icons": "^6.2.2",
-   "@docusaurus/core": "^3.10.0",
-   "@docusaurus/faster": "^3.10.0",
-   "@docusaurus/plugin-client-redirects": "^3.10.0",
-   "@docusaurus/preset-classic": "3.10.0",
-   "@docusaurus/theme-live-codeblock": "^3.10.0",
-   "@docusaurus/theme-mermaid": "^3.10.0",
+   "@docusaurus/core": "^3.10.1",
+   "@docusaurus/faster": "^3.10.1",
+   "@docusaurus/plugin-client-redirects": "^3.10.1",
+   "@docusaurus/preset-classic": "3.10.1",
+   "@docusaurus/theme-live-codeblock": "^3.10.1",
+   "@docusaurus/theme-mermaid": "^3.10.1",
    "@emotion/core": "^11.0.0",
    "@emotion/react": "^11.13.3",
    "@emotion/styled": "^11.14.1",
@@ -92,8 +92,8 @@
    "unist-util-visit": "^5.1.0"
  },
  "devDependencies": {
-   "@docusaurus/module-type-aliases": "^3.10.0",
-   "@docusaurus/tsconfig": "^3.10.0",
+   "@docusaurus/module-type-aliases": "^3.10.1",
+   "@docusaurus/tsconfig": "^3.10.1",
    "@eslint/js": "^9.39.2",
    "@types/js-yaml": "^4.0.9",
    "@types/react": "^19.1.8",
@@ -124,8 +124,7 @@
  "resolutions": {
    "react-redux": "^9.2.0",
    "@reduxjs/toolkit": "^2.5.0",
-   "baseline-browser-mapping": "^2.9.19",
-   "webpackbar": "^7.0.0"
+   "baseline-browser-mapping": "^2.9.19"
  },
  "packageManager": "yarn@1.22.22+sha1.ac34549e6aa8e7ead463a7407e1c7390f61a6610"
}


@@ -1570,10 +1570,10 @@
    "@docsearch/core" "4.6.2"
    "@docsearch/css" "4.6.2"
-"@docusaurus/babel@3.10.0":
-  version "3.10.0"
-  resolved "https://registry.yarnpkg.com/@docusaurus/babel/-/babel-3.10.0.tgz#819819f107233dfcf50b59cd51158f23fb04878a"
-  integrity sha512-mqCJhCZNZUDg0zgDEaPTM4DnRsisa24HdqTy/qn/MQlbwhTb4WVaZg6ZyX6yIVKqTz8fS1hBMgM+98z+BeJJDg==
+"@docusaurus/babel@3.10.1":
+  version "3.10.1"
+  resolved "https://registry.yarnpkg.com/@docusaurus/babel/-/babel-3.10.1.tgz#2f714f682117658ba43d308e9b35b6a73a105227"
+  integrity sha512-DZzFO1K3v/GoEt1fx1DiYHF4en+PuhtQf1AkQJa5zu3CoeKSpr5cpQRUlz3jr0m44wyzmSXu9bVpfir+N4+8bg==
  dependencies:
    "@babel/core" "^7.25.9"
    "@babel/generator" "^7.25.9"
@@ -1584,23 +1584,23 @@
    "@babel/preset-typescript" "^7.25.9"
    "@babel/runtime" "^7.25.9"
    "@babel/traverse" "^7.25.9"
-   "@docusaurus/logger" "3.10.0"
-   "@docusaurus/utils" "3.10.0"
+   "@docusaurus/logger" "3.10.1"
+   "@docusaurus/utils" "3.10.1"
    babel-plugin-dynamic-import-node "^2.3.3"
    fs-extra "^11.1.1"
    tslib "^2.6.0"
-"@docusaurus/bundler@3.10.0":
-  version "3.10.0"
-  resolved "https://registry.yarnpkg.com/@docusaurus/bundler/-/bundler-3.10.0.tgz#878c4c46bfa3434671ea37a43da184238a6aae26"
-  integrity sha512-iONUGZGgp+lAkw/cJZH6irONcF4p8+278IsdRlq8lYhxGjkoNUs0w7F4gVXBYSNChq5KG5/JleTSsdJySShxow==
+"@docusaurus/bundler@3.10.1":
+  version "3.10.1"
+  resolved "https://registry.yarnpkg.com/@docusaurus/bundler/-/bundler-3.10.1.tgz#82fa5079f3787a67502e25f82d37d05ec5de0cc3"
+  integrity sha512-HIqQPvbqnnQRe4NsBd1774KRarjXqS6wHsWELtyuSs1gCfvixJO2jUGH/OEBtr1Gvzpw+ze5CjGMvSJ8UE1KUw==
  dependencies:
    "@babel/core" "^7.25.9"
-   "@docusaurus/babel" "3.10.0"
-   "@docusaurus/cssnano-preset" "3.10.0"
-   "@docusaurus/logger" "3.10.0"
-   "@docusaurus/types" "3.10.0"
-   "@docusaurus/utils" "3.10.0"
+   "@docusaurus/babel" "3.10.1"
+   "@docusaurus/cssnano-preset" "3.10.1"
+   "@docusaurus/logger" "3.10.1"
+   "@docusaurus/types" "3.10.1"
+   "@docusaurus/utils" "3.10.1"
    babel-loader "^9.2.1"
    clean-css "^5.3.3"
    copy-webpack-plugin "^11.0.0"
@@ -1618,20 +1618,20 @@
    tslib "^2.6.0"
    url-loader "^4.1.1"
    webpack "^5.95.0"
-   webpackbar "^6.0.1"
+   webpackbar "^7.0.0"
-"@docusaurus/core@3.10.0", "@docusaurus/core@^3.10.0":
-  version "3.10.0"
-  resolved "https://registry.yarnpkg.com/@docusaurus/core/-/core-3.10.0.tgz#642e71a0209d62c3f5ef275ed9d74a881f40df39"
-  integrity sha512-mgLdQsO8xppnQZc3LPi+Mf+PkPeyxJeIx11AXAq/14fsaMefInQiMEZUUmrc7J+956G/f7MwE7tn8KZgi3iRcA==
+"@docusaurus/core@3.10.1", "@docusaurus/core@^3.10.1":
+  version "3.10.1"
+  resolved "https://registry.yarnpkg.com/@docusaurus/core/-/core-3.10.1.tgz#3f8bdb97451b4df14f2a3b39ab0186366fbf8fbe"
+  integrity sha512-3pf2fXXw0eVk8WnC3T4LIigRDupcpvngpKo9Vy7mYyBhuddc0klDUuZAIfzMoK6z05pdlk6EFC/vBSX43+1O5w==
  dependencies:
-   "@docusaurus/babel" "3.10.0"
-   "@docusaurus/bundler" "3.10.0"
-   "@docusaurus/logger" "3.10.0"
-   "@docusaurus/mdx-loader" "3.10.0"
-   "@docusaurus/utils" "3.10.0"
-   "@docusaurus/utils-common" "3.10.0"
-   "@docusaurus/utils-validation" "3.10.0"
+   "@docusaurus/babel" "3.10.1"
+   "@docusaurus/bundler" "3.10.1"
+   "@docusaurus/logger" "3.10.1"
+   "@docusaurus/mdx-loader" "3.10.1"
+   "@docusaurus/utils" "3.10.1"
+   "@docusaurus/utils-common" "3.10.1"
+   "@docusaurus/utils-validation" "3.10.1"
    boxen "^6.2.1"
    chalk "^4.1.2"
    chokidar "^3.5.3"
@@ -1668,22 +1668,22 @@
    webpack-dev-server "^5.2.2"
    webpack-merge "^6.0.1"
-"@docusaurus/cssnano-preset@3.10.0":
-  version "3.10.0"
-  resolved "https://registry.yarnpkg.com/@docusaurus/cssnano-preset/-/cssnano-preset-3.10.0.tgz#be1b435c33df09d743473d3fadda67b4568dfae3"
-  integrity sha512-qzSshTO1DB3TYW+dPUal5KHM7XPc5YQfzF3Kdb2NDACJUyGbNcFtw3tGkCJlYwhNCRKbZcmwraKUS1i5dcHdGg==
+"@docusaurus/cssnano-preset@3.10.1":
+  version "3.10.1"
+  resolved "https://registry.yarnpkg.com/@docusaurus/cssnano-preset/-/cssnano-preset-3.10.1.tgz#4b6bafeca8bb9423364d2fd6683c28e2f85a4665"
+  integrity sha512-eNfHGcTKCSq6xmcavAkX3RRclHaE2xRCMParlDXLdXVP01/a2e/jKXMj/0ULnLFQSNwwuI62L0Ge8J+nZsR7UQ==
  dependencies:
    cssnano-preset-advanced "^6.1.2"
    postcss "^8.5.4"
    postcss-sort-media-queries "^5.2.0"
    tslib "^2.6.0"
-"@docusaurus/faster@^3.10.0":
-  version "3.10.0"
-  resolved "https://registry.yarnpkg.com/@docusaurus/faster/-/faster-3.10.0.tgz#0758a93196f685537aa7700bde62faf926e6c817"
-  integrity sha512-GNPtVH14ISjHfSwnHu3KiFGf86ICmJSQDeSv/QaanpBgiZGOtgZaslnC5q8WiguxM1EVkwcGxPuD8BXF4eggKw==
+"@docusaurus/faster@^3.10.1":
+  version "3.10.1"
+  resolved "https://registry.yarnpkg.com/@docusaurus/faster/-/faster-3.10.1.tgz#a63d89ae980c98e1eeab3ff15ee083f7c20ed353"
+  integrity sha512-XTZhE5C1gZ/DaYYMlSk02dwP5vhpQON5QHVz1s3892mSESAywgWanURpXEDAvt4GvGuq7s+XP8rTWHZvfaJmdQ==
  dependencies:
-   "@docusaurus/types" "3.10.0"
+   "@docusaurus/types" "3.10.1"
    "@rspack/core" "^1.7.10"
    "@swc/core" "^1.7.39"
    "@swc/html" "^1.13.5"
@@ -1694,22 +1694,22 @@
    tslib "^2.6.0"
    webpack "^5.95.0"
-"@docusaurus/logger@3.10.0":
-  version "3.10.0"
-  resolved "https://registry.yarnpkg.com/@docusaurus/logger/-/logger-3.10.0.tgz#2bacbd004dd78e3da926dbe8f6fa9a930856575d"
-  integrity sha512-9jrZzFuBH1LDRlZ7cznAhCLmAZ3HSDqgwdrSSZdGHq9SPUOQgXXu8mnxe2ZRB9NS1PCpMTIOVUqDtZPIhMafZg==
+"@docusaurus/logger@3.10.1":
+  version "3.10.1"
+  resolved "https://registry.yarnpkg.com/@docusaurus/logger/-/logger-3.10.1.tgz#34c964e32e18f120e30f80171a38cfefe72cfb4b"
+  integrity sha512-oPjNFnfJsRCkePVjkGrxWGq4MvJKRQT0r9jOP0eRBTZ7Wr9FAbzdP/Gjs0I2Ss6YRkPoEgygKG112OkE6skvJw==
  dependencies:
    chalk "^4.1.2"
    tslib "^2.6.0"
-"@docusaurus/mdx-loader@3.10.0":
-  version "3.10.0"
-  resolved "https://registry.yarnpkg.com/@docusaurus/mdx-loader/-/mdx-loader-3.10.0.tgz#1d4b050d751389ecf38dee48bcb61e53df8ffb82"
-  integrity sha512-mQQV97080AH4PYNs087l202NMDqRopZA4mg5W76ZZyTFrmWhJ3mHg+8A+drJVENxw5/Q+wHMHLgsx+9z1nEs0A==
+"@docusaurus/mdx-loader@3.10.1":
+  version "3.10.1"
+  resolved "https://registry.yarnpkg.com/@docusaurus/mdx-loader/-/mdx-loader-3.10.1.tgz#050ae9bc614158a4ec07a628aa75fa9ae90d7e82"
+  integrity sha512-GRmeb/wQ+iXRrFwcHBfgQhrJxGElgCsoTWZYDhccjsZVne1p8MK/EpQVIloXttz76TCe78kKD5AEG9n1xc1oxQ==
  dependencies:
-   "@docusaurus/logger" "3.10.0"
-   "@docusaurus/utils" "3.10.0"
-   "@docusaurus/utils-validation" "3.10.0"
+   "@docusaurus/logger" "3.10.1"
+   "@docusaurus/utils" "3.10.1"
+   "@docusaurus/utils-validation" "3.10.1"
    "@mdx-js/mdx" "^3.0.0"
    "@slorber/remark-comment" "^1.0.0"
    escape-html "^1.0.3"
@@ -1732,12 +1732,12 @@
    vfile "^6.0.1"
    webpack "^5.88.1"
-"@docusaurus/module-type-aliases@3.10.0", "@docusaurus/module-type-aliases@^3.10.0":
-  version "3.10.0"
-  resolved "https://registry.yarnpkg.com/@docusaurus/module-type-aliases/-/module-type-aliases-3.10.0.tgz#749928f104d563f11f046bf0c9ab6489a470c7c8"
-  integrity sha512-/1O0Zg8w3DFrYX/I6Fbss7OJrtZw1QoyjDhegiFNHVi9A9Y0gQ3jUAytVxF6ywpAWpLyLxch8nN8H/V3XfzdJQ==
+"@docusaurus/module-type-aliases@3.10.1", "@docusaurus/module-type-aliases@^3.10.1":
+  version "3.10.1"
+  resolved "https://registry.yarnpkg.com/@docusaurus/module-type-aliases/-/module-type-aliases-3.10.1.tgz#22d39177c296786eb6e0d940699cd590cc93ca77"
+  integrity sha512-YoOZKUdGlp8xSYhuAkGdSo5Ydkbq4V4eK3sD8v0a2hloxCWdQbNBhkc+Ko9QyjpESc0BYcIGM5iHVAy5hdFV6w==
  dependencies:
-   "@docusaurus/types" "3.10.0"
+   "@docusaurus/types" "3.10.1"
    "@types/history" "^4.7.11"
    "@types/react" "*"
    "@types/react-router-config" "*"
@@ -1745,34 +1745,34 @@
    react-helmet-async "npm:@slorber/react-helmet-async@1.3.0"
    react-loadable "npm:@docusaurus/react-loadable@6.0.0"
-"@docusaurus/plugin-client-redirects@^3.10.0":
-  version "3.10.0"
-  resolved "https://registry.yarnpkg.com/@docusaurus/plugin-client-redirects/-/plugin-client-redirects-3.10.0.tgz#4dd4619817fd69462d1e6d986580343aeb911111"
-  integrity sha512-P+VLoLoZTc74so8+IbsaPZ33/mkf2BWL1CYXQpPRkl0v1QVCN2CgfsZY/8QtbYjQnx2upXUnv45abDhNcSggNw==
+"@docusaurus/plugin-client-redirects@^3.10.1":
+  version "3.10.1"
+  resolved "https://registry.yarnpkg.com/@docusaurus/plugin-client-redirects/-/plugin-client-redirects-3.10.1.tgz#e22ed20e5837b7c3a28258e3d1816c4239c82b36"
+  integrity sha512-LHgd+YDvkhfOHMAE6XtUng3DQNzVM765RqVRrMJgHtzAvfopQhY6ieprqjxDVBdv21cLma6I0jHr+YCZH8fL9A==
  dependencies:
-   "@docusaurus/core" "3.10.0"
-   "@docusaurus/logger" "3.10.0"
-   "@docusaurus/utils" "3.10.0"
-   "@docusaurus/utils-common" "3.10.0"
-   "@docusaurus/utils-validation" "3.10.0"
+   "@docusaurus/core" "3.10.1"
+   "@docusaurus/logger" "3.10.1"
+   "@docusaurus/utils" "3.10.1"
+   "@docusaurus/utils-common" "3.10.1"
+   "@docusaurus/utils-validation" "3.10.1"
    eta "^2.2.0"
    fs-extra "^11.1.1"
    lodash "^4.17.21"
    tslib "^2.6.0"
-"@docusaurus/plugin-content-blog@3.10.0":
-  version "3.10.0"
-  resolved "https://registry.yarnpkg.com/@docusaurus/plugin-content-blog/-/plugin-content-blog-3.10.0.tgz#10095291b637440847854ecb2c8afcd8746debd7"
-  integrity sha512-RuTz68DhB7CL96QO5UsFbciD7GPYq6QV+YMfF9V0+N4ZgLhJIBgpVAr8GobrKF6NRe5cyWWETU5z5T834piG9g==
+"@docusaurus/plugin-content-blog@3.10.1":
+  version "3.10.1"
+  resolved "https://registry.yarnpkg.com/@docusaurus/plugin-content-blog/-/plugin-content-blog-3.10.1.tgz#0bd8de700ccbd8e95d920df2613304ef59abe72b"
+  integrity sha512-mmkgE6Q2+K74tnkou7tXlpDLvoCU/qkSa2GSQ3XUiHWvcebCoDQzS670RR3tO8PmaWlIyWWISYWzZLuMfxunRA==
  dependencies:
-   "@docusaurus/core" "3.10.0"
-   "@docusaurus/logger" "3.10.0"
-   "@docusaurus/mdx-loader" "3.10.0"
-   "@docusaurus/theme-common" "3.10.0"
-   "@docusaurus/types" "3.10.0"
-   "@docusaurus/utils" "3.10.0"
-   "@docusaurus/utils-common" "3.10.0"
-   "@docusaurus/utils-validation" "3.10.0"
+   "@docusaurus/core" "3.10.1"
+   "@docusaurus/logger" "3.10.1"
+   "@docusaurus/mdx-loader" "3.10.1"
+   "@docusaurus/theme-common" "3.10.1"
+   "@docusaurus/types" "3.10.1"
+   "@docusaurus/utils" "3.10.1"
+   "@docusaurus/utils-common" "3.10.1"
+   "@docusaurus/utils-validation" "3.10.1"
    cheerio "1.0.0-rc.12"
    combine-promises "^1.1.0"
    feed "^4.2.2"
@@ -1785,20 +1785,20 @@
    utility-types "^3.10.0"
    webpack "^5.88.1"
-"@docusaurus/plugin-content-docs@3.10.0":
-  version "3.10.0"
-  resolved "https://registry.yarnpkg.com/@docusaurus/plugin-content-docs/-/plugin-content-docs-3.10.0.tgz#9c4ea1d5a405340f28c281d2e4586c695a7c65a5"
-  integrity sha512-9BjHhf15ct8Z7TThTC0xRndKDVvMKmVsAGAN7W9FpNRzfMdScOGcXtLmcCWtJGvAezjOJIm6CxOYCy3Io5+RnQ==
+"@docusaurus/plugin-content-docs@3.10.1":
+  version "3.10.1"
+  resolved "https://registry.yarnpkg.com/@docusaurus/plugin-content-docs/-/plugin-content-docs-3.10.1.tgz#261e0e982e4a937c05b462e3c5729374f433b752"
+  integrity sha512-2jRVrtzjf8LClGTHQlwlwuD3wQXRx3WEoF7XUarJ8Ou+0onV+SLtejsyfY9JLpfUh9hPhXM4pbBGkyAY4Bi3HQ==
  dependencies:
-   "@docusaurus/core" "3.10.0"
-   "@docusaurus/logger" "3.10.0"
-   "@docusaurus/mdx-loader" "3.10.0"
-   "@docusaurus/module-type-aliases" "3.10.0"
-   "@docusaurus/theme-common" "3.10.0"
-   "@docusaurus/types" "3.10.0"
-   "@docusaurus/utils" "3.10.0"
-   "@docusaurus/utils-common" "3.10.0"
-   "@docusaurus/utils-validation" "3.10.0"
+   "@docusaurus/core" "3.10.1"
+   "@docusaurus/logger" "3.10.1"
+   "@docusaurus/mdx-loader" "3.10.1"
+   "@docusaurus/module-type-aliases" "3.10.1"
+   "@docusaurus/theme-common" "3.10.1"
+   "@docusaurus/types" "3.10.1"
+   "@docusaurus/utils" "3.10.1"
+   "@docusaurus/utils-common" "3.10.1"
+   "@docusaurus/utils-validation" "3.10.1"
    "@types/react-router-config" "^5.0.7"
    combine-promises "^1.1.0"
    fs-extra "^11.1.1"
@@ -1809,142 +1809,142 @@
    utility-types "^3.10.0"
    webpack "^5.88.1"
-"@docusaurus/plugin-content-pages@3.10.0":
-  version "3.10.0"
-  resolved "https://registry.yarnpkg.com/@docusaurus/plugin-content-pages/-/plugin-content-pages-3.10.0.tgz#7670cbb3c849f434949f542bfdfded1580a13165"
-  integrity sha512-5amX8kEJI+nIGtuLVjYk59Y5utEJ3CHETFOPEE4cooIRLA4xM4iBsA6zFgu4ljcopeYwvBzFEWf5g2I6Yb9SkA==
+"@docusaurus/plugin-content-pages@3.10.1":
+  version "3.10.1"
+  resolved "https://registry.yarnpkg.com/@docusaurus/plugin-content-pages/-/plugin-content-pages-3.10.1.tgz#8c6ffc2079ed0262548ecc4df1dea6add6aa9673"
+  integrity sha512-huJpaRPMl42nsFwuCXvV8bVDj2MazuwRJIUylI/RSlmZeJssVoZXeCjVf1y+1Drtpa9SKcdGn8yoJ76IRJijtw==
  dependencies:
-   "@docusaurus/core" "3.10.0"
-   "@docusaurus/mdx-loader" "3.10.0"
-   "@docusaurus/types" "3.10.0"
-   "@docusaurus/utils" "3.10.0"
-   "@docusaurus/utils-validation" "3.10.0"
+   "@docusaurus/core" "3.10.1"
+   "@docusaurus/mdx-loader" "3.10.1"
+   "@docusaurus/types" "3.10.1"
+   "@docusaurus/utils" "3.10.1"
+   "@docusaurus/utils-validation" "3.10.1"
    fs-extra "^11.1.1"
    tslib "^2.6.0"
    webpack "^5.88.1"
-"@docusaurus/plugin-css-cascade-layers@3.10.0":
-  version "3.10.0"
-  resolved "https://registry.yarnpkg.com/@docusaurus/plugin-css-cascade-layers/-/plugin-css-cascade-layers-3.10.0.tgz#71e318d842be95f92be6c3dca00ceea4971d0edb"
-  integrity sha512-6q1vtt5FJcg5osgkHeM1euErECNqEZ5Z1j69yiNx2luEBIso+nxCkS9nqj8w+MK5X7rvKEToGhFfOFWncs51pQ==
+"@docusaurus/plugin-css-cascade-layers@3.10.1":
+  version "3.10.1"
+  resolved "https://registry.yarnpkg.com/@docusaurus/plugin-css-cascade-layers/-/plugin-css-cascade-layers-3.10.1.tgz#440578d95cbe1a6120936fa83df868d2626cd1d8"
+  integrity sha512-r//fn+MNHkE1wCof8T29VAQezt1enGCpsFxoziBbvLgBM4JfXN2P3rxrBaavHmvLvm7lYkpJeitcDthwnmWCTw==
  dependencies:
-   "@docusaurus/core" "3.10.0"
-   "@docusaurus/types" "3.10.0"
-   "@docusaurus/utils" "3.10.0"
-   "@docusaurus/utils-validation" "3.10.0"
+   "@docusaurus/core" "3.10.1"
+   "@docusaurus/types" "3.10.1"
+   "@docusaurus/utils" "3.10.1"
+   "@docusaurus/utils-validation" "3.10.1"
    tslib "^2.6.0"
-"@docusaurus/plugin-debug@3.10.0":
-  version "3.10.0"
-  resolved "https://registry.yarnpkg.com/@docusaurus/plugin-debug/-/plugin-debug-3.10.0.tgz#e77f924604e1e09d5d90fe0bdf23a3be8ea3307e"
-  integrity sha512-XcljKN+G+nmmK69uQA1d9BlYU3ZftG3T3zpK8/7Hf/wrOlV7TA4Ampdrdwkg0jElKdKAoSnPhCO0/U3bQGsVQQ==
+"@docusaurus/plugin-debug@3.10.1":
+  version "3.10.1"
+  resolved "https://registry.yarnpkg.com/@docusaurus/plugin-debug/-/plugin-debug-3.10.1.tgz#b8b7b24d9a7d185fd8a56a030f90145d3bfd8239"
+  integrity sha512-9KqOpKNfAyqGZykRb9LhIT/vyRF6sm/ykhjj/39JvaJahDS+jZJE0Z1Wfz9q3DUNDTMNN0Q7u/kk4rKKU+IJuA==
  dependencies:
-   "@docusaurus/core" "3.10.0"
-   "@docusaurus/types" "3.10.0"
-   "@docusaurus/utils" "3.10.0"
+   "@docusaurus/core" "3.10.1"
+   "@docusaurus/types" "3.10.1"
+   "@docusaurus/utils" "3.10.1"
    fs-extra "^11.1.1"
    react-json-view-lite "^2.3.0"
    tslib "^2.6.0"
-"@docusaurus/plugin-google-analytics@3.10.0":
-  version "3.10.0"
-  resolved "https://registry.yarnpkg.com/@docusaurus/plugin-google-analytics/-/plugin-google-analytics-3.10.0.tgz#22c7e976fe4d970c7cd1c73c9723d9a5786c6e37"
-  integrity sha512-hTEoodatpBZnUat5nFExbuTGA1lhWGy7vZGuTew5Q3QDtGKFpSJLYmZJhdTjvCFwv1+qQ67hgAVlKdJOB8TXow==
+"@docusaurus/plugin-google-analytics@3.10.1":
+  version "3.10.1"
+  resolved "https://registry.yarnpkg.com/@docusaurus/plugin-google-analytics/-/plugin-google-analytics-3.10.1.tgz#ac15afc77386e0352edb8a1698d993aa5de36ffc"
+  integrity sha512-8o0P1KtmgdYQHH+oInitPpRWI0Of5XednAX4+DMhQNSmGSRNrsEEHg1ebv35m9AgRClfAytCJ5jA9KvcASTyuA==
  dependencies:
-   "@docusaurus/core" "3.10.0"
-   "@docusaurus/types" "3.10.0"
-   "@docusaurus/utils-validation" "3.10.0"
+   "@docusaurus/core" "3.10.1"
+   "@docusaurus/types" "3.10.1"
+   "@docusaurus/utils-validation" "3.10.1"
    tslib "^2.6.0"
-"@docusaurus/plugin-google-gtag@3.10.0":
-  version "3.10.0"
-  resolved "https://registry.yarnpkg.com/@docusaurus/plugin-google-gtag/-/plugin-google-gtag-3.10.0.tgz#c38a2ba638257851cc845b934506b80c08d47f96"
-  integrity sha512-iB/Zzjv/eelJRbdULZqzWCbgMgJ7ht4ONVjXtN3+BI/muil6S87gQ1OJyPwlXD+ELdKkitC7bWv5eJdYOZLhrQ==
+"@docusaurus/plugin-google-gtag@3.10.1":
+  version "3.10.1"
+  resolved "https://registry.yarnpkg.com/@docusaurus/plugin-google-gtag/-/plugin-google-gtag-3.10.1.tgz#0482b83b9bc411aa99a432be2b39d2e53a00e2e0"
+  integrity sha512-pu3xIUo5o/zCMLfUY9BO5KOwSH0zIsAGyFRPvXHayFSA5XIhCU/SFuB0g0ZNjFn9niZLCaNvoeAuOGFJZq0fdw==
  dependencies:
-   "@docusaurus/core" "3.10.0"
-   "@docusaurus/types" "3.10.0"
-   "@docusaurus/utils-validation" "3.10.0"
+   "@docusaurus/core" "3.10.1"
+   "@docusaurus/types" "3.10.1"
+   "@docusaurus/utils-validation" "3.10.1"
    "@types/gtag.js" "^0.0.20"
    tslib "^2.6.0"
-"@docusaurus/plugin-google-tag-manager@3.10.0":
-  version "3.10.0"
-  resolved "https://registry.yarnpkg.com/@docusaurus/plugin-google-tag-manager/-/plugin-google-tag-manager-3.10.0.tgz#5469c923cc1ad4608399d0b17e5fcacd8e030d56"
-  integrity sha512-FEjZxqKgLHa+Wez/EgKxRwvArNCWIScfyEQD95rot7jkxp6nonjI5XIbGfO/iYhM5Qinwe8aIEQHP2KZtpqVuA==
+"@docusaurus/plugin-google-tag-manager@3.10.1":
+  version "3.10.1"
+  resolved "https://registry.yarnpkg.com/@docusaurus/plugin-google-tag-manager/-/plugin-google-tag-manager-3.10.1.tgz#eaf5765d6f82b4fb661d92a793d1883f9d1ec106"
+  integrity sha512-f6fyGHiCm7kJHBtAisGQS5oNBnpnMTYQZxDXeVrnw/3zWU+LMA22pr6UHGYkBKDbN+qPC5QHG3NuOfzQLq3+Lw==
  dependencies:
-   "@docusaurus/core" "3.10.0"
-   "@docusaurus/types" "3.10.0"
-   "@docusaurus/utils-validation" "3.10.0"
+   "@docusaurus/core" "3.10.1"
+   "@docusaurus/types" "3.10.1"
+   "@docusaurus/utils-validation" "3.10.1"
    tslib "^2.6.0"
-"@docusaurus/plugin-sitemap@3.10.0":
-  version "3.10.0"
-  resolved "https://registry.yarnpkg.com/@docusaurus/plugin-sitemap/-/plugin-sitemap-3.10.0.tgz#35d59d46803f279f22aa64fc1bd18c048f12662b"
-  integrity sha512-DVTSLjB97hIjmayGnGcBfognCeI7ZuUKgEnU7Oz81JYqXtVg94mVTthDjq3QHTylYNeCUbkaW8VF0FDLcc8pPw==
+"@docusaurus/plugin-sitemap@3.10.1":
+  version "3.10.1"
+  resolved "https://registry.yarnpkg.com/@docusaurus/plugin-sitemap/-/plugin-sitemap-3.10.1.tgz#66a6974bb2fd1b9d8f5cb0f3c5ecd2201c118565"
+  integrity sha512-C26MbmmqgdjkDq1htaZ3aD7LzEDKFWXfpyQpt0EOUThuq5nV77zDaedV20yHcVo9p+3ey9aZ4pbHA0D3QcZTzg==
  dependencies:
-   "@docusaurus/core" "3.10.0"
-   "@docusaurus/logger" "3.10.0"
-   "@docusaurus/types" "3.10.0"
-   "@docusaurus/utils" "3.10.0"
-   "@docusaurus/utils-common" "3.10.0"
-   "@docusaurus/utils-validation" "3.10.0"
+   "@docusaurus/core" "3.10.1"
+   "@docusaurus/logger" "3.10.1"
+   "@docusaurus/types" "3.10.1"
+   "@docusaurus/utils" "3.10.1"
+   "@docusaurus/utils-common" "3.10.1"
+   "@docusaurus/utils-validation" "3.10.1"
    fs-extra "^11.1.1"
    sitemap "^7.1.1"
    tslib "^2.6.0"
-"@docusaurus/plugin-svgr@3.10.0":
-  version "3.10.0"
-  resolved "https://registry.yarnpkg.com/@docusaurus/plugin-svgr/-/plugin-svgr-3.10.0.tgz#8ada2e6dd8318d20206a9b044fc091a5794ba3f0"
-  integrity sha512-lNljBESaETZqVBMPqkrGchr+UPT1eZzEPLmJhz8I76BxbjqgsUnRvrq6lQJ9sYjgmgX52KB7kkgczqd2yzoswQ==
+"@docusaurus/plugin-svgr@3.10.1":
+  version "3.10.1"
+  resolved "https://registry.yarnpkg.com/@docusaurus/plugin-svgr/-/plugin-svgr-3.10.1.tgz#c217c24d6d23fd2bc6f54d44c040635b49d6b36e"
+  integrity sha512-6SFxsmjWFkVLDmBUvFK6i72QjUwqyQFe4Ovz+SUJophJjOyVG3ZZG5IQpBC/kX/Gfv1yWeU9nWauH6F6Q7QX/Q==
  dependencies:
-   "@docusaurus/core" "3.10.0"
-   "@docusaurus/types" "3.10.0"
-   "@docusaurus/utils" "3.10.0"
-   "@docusaurus/utils-validation" "3.10.0"
+   "@docusaurus/core" "3.10.1"
+   "@docusaurus/types" "3.10.1"
+   "@docusaurus/utils" "3.10.1"
+   "@docusaurus/utils-validation" "3.10.1"
    "@svgr/core" "8.1.0"
    "@svgr/webpack" "^8.1.0"
    tslib "^2.6.0"
    webpack "^5.88.1"
-"@docusaurus/preset-classic@3.10.0":
-  version "3.10.0"
-  resolved "https://registry.yarnpkg.com/@docusaurus/preset-classic/-/preset-classic-3.10.0.tgz#74b6facdaf568bcd41ec90cae9aebb7ca0ac8619"
-  integrity sha512-kw/Ye02Hc6xP1OdTswy8yxQEHg0fdPpyWAQRxr5b2x3h7LlG2Zgbb5BDFROnXDDMpUxB7YejlocJIE5HIEfpNA==
+"@docusaurus/preset-classic@3.10.1":
+  version "3.10.1"
+  resolved "https://registry.yarnpkg.com/@docusaurus/preset-classic/-/preset-classic-3.10.1.tgz#faf330d96aedc9083a59bec09d966ae4dfc8b2fb"
+  integrity sha512-YO/FL8v1zmbxoTso6mjMz/RDjhaTJxb1UpFFTDdY5847LLDCeyYiYlrhyTbgN1RIN3xnkLKZ9Lj1x8hUzI4JOg==
  dependencies:
-   "@docusaurus/core" "3.10.0"
-   "@docusaurus/plugin-content-blog" "3.10.0"
-   "@docusaurus/plugin-content-docs" "3.10.0"
-   "@docusaurus/plugin-content-pages" "3.10.0"
-   "@docusaurus/plugin-css-cascade-layers" "3.10.0"
-   "@docusaurus/plugin-debug" "3.10.0"
-   "@docusaurus/plugin-google-analytics" "3.10.0"
-   "@docusaurus/plugin-google-gtag" "3.10.0"
-   "@docusaurus/plugin-google-tag-manager" "3.10.0"
-   "@docusaurus/plugin-sitemap" "3.10.0"
-   "@docusaurus/plugin-svgr" "3.10.0"
-   "@docusaurus/theme-classic" "3.10.0"
-   "@docusaurus/theme-common" "3.10.0"
-   "@docusaurus/theme-search-algolia" "3.10.0"
-   "@docusaurus/types" "3.10.0"
+   "@docusaurus/core" "3.10.1"
+   "@docusaurus/plugin-content-blog" "3.10.1"
+   "@docusaurus/plugin-content-docs" "3.10.1"
+   "@docusaurus/plugin-content-pages" "3.10.1"
+   "@docusaurus/plugin-css-cascade-layers" "3.10.1"
+   "@docusaurus/plugin-debug" "3.10.1"
+   "@docusaurus/plugin-google-analytics" "3.10.1"
+   "@docusaurus/plugin-google-gtag" "3.10.1"
+   "@docusaurus/plugin-google-tag-manager" "3.10.1"
+   "@docusaurus/plugin-sitemap" "3.10.1"
+   "@docusaurus/plugin-svgr" "3.10.1"
+   "@docusaurus/theme-classic" "3.10.1"
+   "@docusaurus/theme-common" "3.10.1"
+   "@docusaurus/theme-search-algolia" "3.10.1"
+   "@docusaurus/types" "3.10.1"
-"@docusaurus/theme-classic@3.10.0":
-  version "3.10.0"
-  resolved "https://registry.yarnpkg.com/@docusaurus/theme-classic/-/theme-classic-3.10.0.tgz#d937915c691189f27ced649c822994d839ea565b"
-  integrity sha512-9msCAsRdN+UG+RwPwCFb0uKy4tGoPh5YfBozXeGUtIeAgsMdn6f3G/oY861luZ3t8S2ET8S9Y/1GnpJAGWytww==
+"@docusaurus/theme-classic@3.10.1":
+  version "3.10.1"
+  resolved "https://registry.yarnpkg.com/@docusaurus/theme-classic/-/theme-classic-3.10.1.tgz#deed8cf73cc0f56113e53775cbb3b168c3c61566"
+  integrity sha512-VU1RK0qb2pab0si4r7HFK37cYco8VzqLj3u1PspVipSr/z/GPVKHO4/HXbnePqHoWDk8urjyGSeatH0NIMBM1A==
  dependencies:
-   "@docusaurus/core" "3.10.0"
-   "@docusaurus/logger" "3.10.0"
-   "@docusaurus/mdx-loader" "3.10.0"
-   "@docusaurus/module-type-aliases" "3.10.0"
-   "@docusaurus/plugin-content-blog" "3.10.0"
-   "@docusaurus/plugin-content-docs" "3.10.0"
-   "@docusaurus/plugin-content-pages" "3.10.0"
-   "@docusaurus/theme-common" "3.10.0"
-   "@docusaurus/theme-translations" "3.10.0"
-   "@docusaurus/types" "3.10.0"
-   "@docusaurus/utils" "3.10.0"
-   "@docusaurus/utils-common" "3.10.0"
-   "@docusaurus/utils-validation" "3.10.0"
+   "@docusaurus/core" "3.10.1"
+   "@docusaurus/logger" "3.10.1"
+   "@docusaurus/mdx-loader" "3.10.1"
+   "@docusaurus/module-type-aliases" "3.10.1"
+   "@docusaurus/plugin-content-blog" "3.10.1"
+   "@docusaurus/plugin-content-docs" "3.10.1"
+   "@docusaurus/plugin-content-pages" "3.10.1"
+   "@docusaurus/theme-common" "3.10.1"
+   "@docusaurus/theme-translations" "3.10.1"
+   "@docusaurus/types" "3.10.1"
+   "@docusaurus/utils" "3.10.1"
+   "@docusaurus/utils-common" "3.10.1"
+   "@docusaurus/utils-validation" "3.10.1"
    "@mdx-js/react" "^3.0.0"
    clsx "^2.0.0"
    copy-text-to-clipboard "^3.2.0"
@@ -1959,15 +1959,15 @@
    tslib "^2.6.0"
    utility-types "^3.10.0"
-"@docusaurus/theme-common@3.10.0":
-  version "3.10.0"
-  resolved "https://registry.yarnpkg.com/@docusaurus/theme-common/-/theme-common-3.10.0.tgz#70b419ccfdf62f092299354a72d1692e81be597d"
-  integrity sha512-Dkp1YXKn16ByCJAdIjbDIOpVb4Z66MsVD694/ilX1vAAHaVEMrVsf/NPd9VgreyFx08rJ9GqV1MtzsbTcU73Kg==
+"@docusaurus/theme-common@3.10.1":
+  version "3.10.1"
+  resolved "https://registry.yarnpkg.com/@docusaurus/theme-common/-/theme-common-3.10.1.tgz#cbfec82b1b107be5c229811ed9caae14a501361c"
+  integrity sha512-0YtmIeoNo1fIw65LO8+/1dPgmDV86UmhMkow37gzjytuiCSQm9xob6PJy0L4kuQEMTLfUOGvkXvZr7GPrHquMA==
  dependencies:
-   "@docusaurus/mdx-loader" "3.10.0"
-   "@docusaurus/module-type-aliases" "3.10.0"
-   "@docusaurus/utils" "3.10.0"
-   "@docusaurus/utils-common" "3.10.0"
+   "@docusaurus/mdx-loader" "3.10.1"
+   "@docusaurus/module-type-aliases" "3.10.1"
+   "@docusaurus/utils" "3.10.1"
+   "@docusaurus/utils-common" "3.10.1"
    "@types/history" "^4.7.11"
    "@types/react" "*"
    "@types/react-router-config" "*"
@@ -1977,48 +1977,48 @@
    tslib "^2.6.0"
    utility-types "^3.10.0"
-"@docusaurus/theme-live-codeblock@^3.10.0":
-  version "3.10.0"
-  resolved "https://registry.yarnpkg.com/@docusaurus/theme-live-codeblock/-/theme-live-codeblock-3.10.0.tgz#05a38c6bfac479fd698f18f27ca06ebb126633d9"
-  integrity sha512-1Ycxu0dBAhEXzXPQ1dQW01aY1MNi7TCTUOBtIF0GcNrQBFj74XxhDqv/T6GxYBsaN+6QnIDs1T+D43iV2/r2hQ==
+"@docusaurus/theme-live-codeblock@^3.10.1":
+  version "3.10.1"
+  resolved "https://registry.yarnpkg.com/@docusaurus/theme-live-codeblock/-/theme-live-codeblock-3.10.1.tgz#29e6ddee467d816205ad611fd7bf10f00db5bdef"
+  integrity sha512-MKG/0zreelS6YlupQAoKmS5nCw9RRKwDHihJg2FinsU1+rqbrOYNYVq//eQy+m649k9b8XCazEw9VUMTFhpCTg==
  dependencies:
-   "@docusaurus/core" "3.10.0"
-   "@docusaurus/theme-common" "3.10.0"
-   "@docusaurus/theme-translations" "3.10.0"
-   "@docusaurus/utils-validation" "3.10.0"
+   "@docusaurus/core" "3.10.1"
+   "@docusaurus/theme-common" "3.10.1"
+   "@docusaurus/theme-translations" "3.10.1"
+   "@docusaurus/utils-validation" "3.10.1"
    "@philpl/buble" "^0.19.7"
    clsx "^2.0.0"
    fs-extra "^11.1.1"
    react-live "^4.1.6"
    tslib "^2.6.0"
-"@docusaurus/theme-mermaid@^3.10.0":
-  version "3.10.0"
-  resolved "https://registry.yarnpkg.com/@docusaurus/theme-mermaid/-/theme-mermaid-3.10.0.tgz#6581ccf16d27e4c02fe8c7cf15488862f27be9c8"
-  integrity sha512-Y2xrlwhIJ80oOZIO3PXL6A7J869splfcMI87E3NKpYsy3zJxOyV+BP1QMtGi59ajKgU868HPuyyn6J+6BZGOBg==
+"@docusaurus/theme-mermaid@^3.10.1":
+  version "3.10.1"
+  resolved "https://registry.yarnpkg.com/@docusaurus/theme-mermaid/-/theme-mermaid-3.10.1.tgz#dada9c50c780524d246906234ace8a35446f26fc"
+  integrity sha512-2gxpmln8Pc4EN1oWzshQEx2HTs67jk14v7MmgqGs8ZU7Nm8oihg+fTouof2u4vN8DtB3Fln4cDJu4UprSX1S3Q==
  dependencies:
-   "@docusaurus/core" "3.10.0"
-   "@docusaurus/module-type-aliases" "3.10.0"
-   "@docusaurus/theme-common" "3.10.0"
-   "@docusaurus/types" "3.10.0"
-   "@docusaurus/utils-validation" "3.10.0"
+   "@docusaurus/core" "3.10.1"
+   "@docusaurus/module-type-aliases" "3.10.1"
+   "@docusaurus/theme-common" "3.10.1"
+   "@docusaurus/types" "3.10.1"
+   "@docusaurus/utils-validation" "3.10.1"
    mermaid ">=11.6.0"
    tslib "^2.6.0"
-"@docusaurus/theme-search-algolia@3.10.0":
-  version "3.10.0"
-  resolved "https://registry.yarnpkg.com/@docusaurus/theme-search-algolia/-/theme-search-algolia-3.10.0.tgz#0ff57fe58db6abde8f5ad2877e459cd2fa6e7464"
-  integrity sha512-f5FPKI08e3JRG63vR/o4qeuUVHUHzFzM0nnF+AkB67soAZgNsKJRf2qmUZvlQkGwlV+QFkKe4D0ANMh1jToU3g==
+"@docusaurus/theme-search-algolia@3.10.1":
+  version "3.10.1"
+  resolved "https://registry.yarnpkg.com/@docusaurus/theme-search-algolia/-/theme-search-algolia-3.10.1.tgz#6f422058711629ce8d7c2f17e1e54efa075c626e"
+  integrity sha512-OTaARARVZj2GvkJQjB+1jOIxntRaXea+G+fMsNqrZBAU1O1vJKDW22R7kECOHW27oJCLFN9HKaZeRrfAUyviug==
  dependencies:
    "@algolia/autocomplete-core" "^1.19.2"
    "@docsearch/react" "^3.9.0 || ^4.3.2"
-   "@docusaurus/core" "3.10.0"
-   "@docusaurus/logger" "3.10.0"
-   "@docusaurus/plugin-content-docs" "3.10.0"
-   "@docusaurus/theme-common" "3.10.0"
-   "@docusaurus/theme-translations" "3.10.0"
-   "@docusaurus/utils" "3.10.0"
-   "@docusaurus/utils-validation" "3.10.0"
+   "@docusaurus/core" "3.10.1"
+   "@docusaurus/logger" "3.10.1"
+   "@docusaurus/plugin-content-docs" "3.10.1"
+   "@docusaurus/theme-common" "3.10.1"
+   "@docusaurus/theme-translations" "3.10.1"
+   "@docusaurus/utils" "3.10.1"
+   "@docusaurus/utils-validation" "3.10.1"
    algoliasearch "^5.37.0"
    algoliasearch-helper "^3.26.0"
    clsx "^2.0.0"
@@ -2028,23 +2028,23 @@
tslib "^2.6.0"
utility-types "^3.10.0"
"@docusaurus/theme-translations@3.10.0":
version "3.10.0"
resolved "https://registry.yarnpkg.com/@docusaurus/theme-translations/-/theme-translations-3.10.0.tgz#8fdc23d29bd7f907db49c36cf65e2123d96be300"
integrity sha512-L9IbFLwTc5+XdgH45iQYufLn0SVZd6BUNelDbKIFlH+E4hhjuj/XHWAFMX/w2K59rfy8wak9McOaei7BSUfRPA==
"@docusaurus/theme-translations@3.10.1":
version "3.10.1"
resolved "https://registry.yarnpkg.com/@docusaurus/theme-translations/-/theme-translations-3.10.1.tgz#c3119a015652290eea560ca45ac775963d6eb75b"
integrity sha512-cLMyaKivjBVWKMJuWqyFVVgtqe8DPJNPkog0bn8W1MDVAKcPdxRFycBfC1We1RaNp7Rdk513bmtW78RR6OBxBw==
dependencies:
fs-extra "^11.1.1"
tslib "^2.6.0"
"@docusaurus/tsconfig@^3.10.0":
version "3.10.0"
resolved "https://registry.yarnpkg.com/@docusaurus/tsconfig/-/tsconfig-3.10.0.tgz#f40a57248828f0503a5f355cf30aa59941c9baaa"
integrity sha512-TXdC3WXuPrdQAexLvjUJfnYf3YKEgEqAs5nK0Q88pRBCW7t7oN4ILvWYb3A5Z1wlSXyXGWW/mCUmLEhdWsjnDQ==
"@docusaurus/tsconfig@^3.10.1":
version "3.10.1"
resolved "https://registry.yarnpkg.com/@docusaurus/tsconfig/-/tsconfig-3.10.1.tgz#1db31b4a4a5c914bdffa80070a35b6365d34f2e8"
integrity sha512-rYvB7yqkdqWIpAbDzQljGfM4cDBkLTbhmagZBEcsyj6oPUsz47lmW2pYdN1j+7sGFgltbAmQH62xfbrij4Eh6Q==
"@docusaurus/types@3.10.0":
version "3.10.0"
resolved "https://registry.yarnpkg.com/@docusaurus/types/-/types-3.10.0.tgz#a69232bba74b738fcf4671fd5f0f079366dd3d13"
integrity sha512-F0dOt3FOoO20rRaFK7whGFQZ3ggyrWEdQc/c8/UiRuzhtg4y1w9FspXH5zpCT07uMnJKBPGh+qNazbNlCQqvSw==
"@docusaurus/types@3.10.1":
version "3.10.1"
resolved "https://registry.yarnpkg.com/@docusaurus/types/-/types-3.10.1.tgz#d42837938ae43ca2be0ca47e63e00476b5eb94be"
integrity sha512-XYMK8k1szDCFMw2V+Xyen0g7Kee1sP3dtFnl7vkGkZOkeAJ/oPDQPL8iz4HBKOo/cwU8QeV6onVjMqtP+tFzsw==
dependencies:
"@mdx-js/mdx" "^3.0.0"
"@types/history" "^4.7.11"
@@ -2057,36 +2057,36 @@
webpack "^5.95.0"
webpack-merge "^5.9.0"
"@docusaurus/utils-common@3.10.0":
version "3.10.0"
resolved "https://registry.yarnpkg.com/@docusaurus/utils-common/-/utils-common-3.10.0.tgz#2a6dc76b312664fca7234d33607c085318ff1ae3"
integrity sha512-JyL7sb9QVDgYvudIS81Dv0lsWm7le0vGZSDwsztxWam1SPBqrnkvBy9UYL/amh6pbybkyYTd3CMTkO24oMlCSw==
"@docusaurus/utils-common@3.10.1":
version "3.10.1"
resolved "https://registry.yarnpkg.com/@docusaurus/utils-common/-/utils-common-3.10.1.tgz#6350b4898691e765de750f90eade0e0fa7902d99"
integrity sha512-5mFSgEADtnFxFH7RLw02QA5MpU5JVUCj0MPeIvi/aF4Fi45tQRIuTwXoXDqJ+1VfQJuYJGz3SI63wmGz4HvXzA==
dependencies:
"@docusaurus/types" "3.10.0"
"@docusaurus/types" "3.10.1"
tslib "^2.6.0"
"@docusaurus/utils-validation@3.10.0":
version "3.10.0"
resolved "https://registry.yarnpkg.com/@docusaurus/utils-validation/-/utils-validation-3.10.0.tgz#a2418d7f31980d991fd3a1f39c8aad8820b36812"
integrity sha512-c+6n2+ZPOJtWWc8Bb/EYdpSDfjYEScdCu9fB/SNjOmSCf1IdVnGf2T53o0tsz0gDRtCL90tifTL0JE/oMuP1Mw==
"@docusaurus/utils-validation@3.10.1":
version "3.10.1"
resolved "https://registry.yarnpkg.com/@docusaurus/utils-validation/-/utils-validation-3.10.1.tgz#ddbcce997a5506424cdd16abf6845cc51692acae"
integrity sha512-cRv1X69jwaWv47waglllgZVWzeBFLhl53XT/XED/83BerVBTC5FTP8WTcVl8Z6sZOegDSwitu/wpCSPCDOT6lg==
dependencies:
"@docusaurus/logger" "3.10.0"
"@docusaurus/utils" "3.10.0"
"@docusaurus/utils-common" "3.10.0"
"@docusaurus/logger" "3.10.1"
"@docusaurus/utils" "3.10.1"
"@docusaurus/utils-common" "3.10.1"
fs-extra "^11.2.0"
joi "^17.9.2"
js-yaml "^4.1.0"
lodash "^4.17.21"
tslib "^2.6.0"
"@docusaurus/utils@3.10.0":
version "3.10.0"
resolved "https://registry.yarnpkg.com/@docusaurus/utils/-/utils-3.10.0.tgz#ea7d7b0d325b60f728decc00bb3908d00ef86faf"
integrity sha512-T3B0WTigsIthe0D4LQa2k+7bJY+c3WS+Wq2JhcznOSpn1lSN64yNtHQXboCj3QnUs1EuAZszQG1SHKu5w5ZrlA==
"@docusaurus/utils@3.10.1":
version "3.10.1"
resolved "https://registry.yarnpkg.com/@docusaurus/utils/-/utils-3.10.1.tgz#535968caa2c9bff69f997a081b98b95b3c5d3785"
integrity sha512-3ojeJry9xBYdJO6qoyyzqeJFSJBVx2mXhyDzSdjwL2+URFQMf+h25gG38iswGImicK0ELjTd1EL2xzk8hf3QPw==
dependencies:
"@docusaurus/logger" "3.10.0"
"@docusaurus/types" "3.10.0"
"@docusaurus/utils-common" "3.10.0"
"@docusaurus/logger" "3.10.1"
"@docusaurus/types" "3.10.1"
"@docusaurus/utils-common" "3.10.1"
escape-string-regexp "^4.0.0"
execa "^5.1.1"
file-loader "^6.2.0"
@@ -13328,7 +13328,7 @@ renderkid@^3.0.0:
repeat-string@^1.5.2:
version "1.6.1"
resolved "https://registry.npmjs.org/repeat-string/-/repeat-string-1.6.1.tgz"
resolved "https://registry.yarnpkg.com/repeat-string/-/repeat-string-1.6.1.tgz#8dcae470e1c88abc2d600fff4a776286da75e637"
integrity sha512-PV0dzCYDNfRi1jCDbJzpW7jNNDRuCOG/jI5ctQcGKt/clZD+YcPS3yIlWuTJMmESC8aevCFmWJy5wjAFgNqN6w==
require-directory@^2.1.1:
@@ -15368,7 +15368,7 @@ webpack@^5.106.2, webpack@^5.88.1, webpack@^5.95.0:
watchpack "^2.5.1"
webpack-sources "^3.3.4"
webpackbar@^6.0.1, webpackbar@^7.0.0:
webpackbar@^7.0.0:
version "7.0.0"
resolved "https://registry.yarnpkg.com/webpackbar/-/webpackbar-7.0.0.tgz#7228d32881af2392381b6514499ddea73cdf218a"
integrity sha512-aS9soqSO2iCHgqHoCrj4LbfGQUboDCYJPSFOAchEK+9psIjNrfSWW4Y0YEz67MKURNvMmfo0ycOg9d/+OOf9/Q==

View File

@@ -197,7 +197,7 @@ tdengine = [
]
teradata = ["teradatasql>=16.20.0.23"]
thumbnails = [] # deprecated, will be removed in 7.0
vertica = ["sqlalchemy-vertica-python>=0.5.9, < 0.6"]
vertica = ["sqlalchemy-vertica-python>= 0.5.9, < 0.7"]
netezza = ["nzalchemy>=11.0.2"]
starrocks = ["starrocks>=1.0.0"]
doris = ["pydoris>=1.0.0, <2.0.0"]

View File

@@ -18,7 +18,7 @@
[project]
name = "apache-superset-core"
version = "0.1.0rc2"
version = "0.1.0rc3"
description = "Core Python package for building Apache Superset backend extensions and integrations"
readme = "README.md"
authors = [

View File

@@ -17,7 +17,7 @@
[project]
name = "apache-superset-extensions-cli"
version = "0.1.0rc2"
version = "0.1.0rc3"
description = "Official command-line interface for building, bundling, and managing Apache Superset extensions"
readme = "README.md"
authors = [

View File

@@ -102,7 +102,7 @@
"json-bigint": "^1.0.0",
"json-stringify-pretty-compact": "^2.0.0",
"lodash": "^4.18.1",
"mapbox-gl": "^3.22.0",
"mapbox-gl": "^3.23.0",
"markdown-to-jsx": "^9.7.16",
"match-sorter": "^8.3.0",
"memoize-one": "^5.2.1",
@@ -163,7 +163,7 @@
"@babel/plugin-transform-export-namespace-from": "^7.27.1",
"@babel/plugin-transform-modules-commonjs": "^7.28.6",
"@babel/plugin-transform-runtime": "^7.29.0",
"@babel/preset-env": "^7.29.2",
"@babel/preset-env": "^7.29.3",
"@babel/preset-react": "^7.28.5",
"@babel/preset-typescript": "^7.28.5",
"@babel/register": "^7.23.7",
@@ -225,7 +225,7 @@
"babel-plugin-dynamic-import-node": "^2.3.3",
"babel-plugin-jsx-remove-data-test-id": "^3.0.0",
"babel-plugin-lodash": "^3.3.4",
"baseline-browser-mapping": "^2.10.21",
"baseline-browser-mapping": "^2.10.24",
"cheerio": "1.2.0",
"concurrently": "^9.2.1",
"copy-webpack-plugin": "^14.0.0",
@@ -578,9 +578,9 @@
}
},
"node_modules/@babel/compat-data": {
"version": "7.29.0",
"resolved": "https://registry.npmjs.org/@babel/compat-data/-/compat-data-7.29.0.tgz",
"integrity": "sha512-T1NCJqT/j9+cn8fvkt7jtwbLBfLC/1y1c7NtCeXFRgzGTsafi68MRv8yzkYSapBnFA6L3U2VSc02ciDzoAJhJg==",
"version": "7.29.3",
"resolved": "https://registry.npmjs.org/@babel/compat-data/-/compat-data-7.29.3.tgz",
"integrity": "sha512-LIVqM46zQWZhj17qA8wb4nW/ixr2y1Nw+r1etiAWgRM6U1IqP+LNhL1yg440jYZR72jCWcWbLWzIosH+uP1fqg==",
"dev": true,
"license": "MIT",
"engines": {
@@ -1062,6 +1062,23 @@
"@babel/core": "^7.0.0"
}
},
"node_modules/@babel/plugin-bugfix-safari-rest-destructuring-rhs-array": {
"version": "7.29.3",
"resolved": "https://registry.npmjs.org/@babel/plugin-bugfix-safari-rest-destructuring-rhs-array/-/plugin-bugfix-safari-rest-destructuring-rhs-array-7.29.3.tgz",
"integrity": "sha512-SRS46DFR4HqzUzCVgi90/xMoL+zeBDBvWdKYXSEzh79kXswNFEglUpMKxR04//dPqwYXWUBJ3mpUd933ru9Kmg==",
"dev": true,
"license": "MIT",
"dependencies": {
"@babel/helper-plugin-utils": "^7.28.6",
"@babel/helper-skip-transparent-expression-wrappers": "^7.27.1"
},
"engines": {
"node": ">=6.9.0"
},
"peerDependencies": {
"@babel/core": "^7.0.0"
}
},
"node_modules/@babel/plugin-bugfix-v8-spread-parameters-in-optional-chaining": {
"version": "7.27.1",
"resolved": "https://registry.npmjs.org/@babel/plugin-bugfix-v8-spread-parameters-in-optional-chaining/-/plugin-bugfix-v8-spread-parameters-in-optional-chaining-7.27.1.tgz",
@@ -2388,19 +2405,20 @@
}
},
"node_modules/@babel/preset-env": {
"version": "7.29.2",
"resolved": "https://registry.npmjs.org/@babel/preset-env/-/preset-env-7.29.2.tgz",
"integrity": "sha512-DYD23veRYGvBFhcTY1iUvJnDNpuqNd/BzBwCvzOTKUnJjKg5kpUBh3/u9585Agdkgj+QuygG7jLfOPWMa2KVNw==",
"version": "7.29.3",
"resolved": "https://registry.npmjs.org/@babel/preset-env/-/preset-env-7.29.3.tgz",
"integrity": "sha512-ySZypNLAIH1ClygLDQzVMoGQRViATnkHkYYV6TcNDz+8+jwZCdsguGvsb3EY5d9wyWyhmF1iSuFM0Yh5XPnqSA==",
"dev": true,
"license": "MIT",
"dependencies": {
"@babel/compat-data": "^7.29.0",
"@babel/compat-data": "^7.29.3",
"@babel/helper-compilation-targets": "^7.28.6",
"@babel/helper-plugin-utils": "^7.28.6",
"@babel/helper-validator-option": "^7.27.1",
"@babel/plugin-bugfix-firefox-class-in-computed-class-key": "^7.28.5",
"@babel/plugin-bugfix-safari-class-field-initializer-scope": "^7.27.1",
"@babel/plugin-bugfix-safari-id-destructuring-collision-in-function-expression": "^7.27.1",
"@babel/plugin-bugfix-safari-rest-destructuring-rhs-array": "^7.29.3",
"@babel/plugin-bugfix-v8-spread-parameters-in-optional-chaining": "^7.27.1",
"@babel/plugin-bugfix-v8-static-class-fields-redefine-readonly": "^7.28.6",
"@babel/plugin-proposal-private-property-in-object": "7.21.0-placeholder-for-preset-env.2",
@@ -18689,9 +18707,9 @@
"license": "MIT"
},
"node_modules/baseline-browser-mapping": {
"version": "2.10.21",
"resolved": "https://registry.npmjs.org/baseline-browser-mapping/-/baseline-browser-mapping-2.10.21.tgz",
"integrity": "sha512-Q+rUQ7Uz8AHM7DEaNdwvfFCTq7a43lNTzuS94eiWqwyxfV/wJv+oUivef51T91mmRY4d4A1u9rcSvkeufCVXlA==",
"version": "2.10.24",
"resolved": "https://registry.npmjs.org/baseline-browser-mapping/-/baseline-browser-mapping-2.10.24.tgz",
"integrity": "sha512-I2NkZOOrj2XuguvWCK6OVh9GavsNjZjK908Rq3mIBK25+GD8vPX5w2WdxVqnQ7xx3SrZJiCiZFu+/Oz50oSYSA==",
"dev": true,
"license": "Apache-2.0",
"bin": {
@@ -35939,9 +35957,9 @@
"license": "MIT"
},
"node_modules/mapbox-gl": {
"version": "3.22.0",
"resolved": "https://registry.npmjs.org/mapbox-gl/-/mapbox-gl-3.22.0.tgz",
"integrity": "sha512-ZIpF+oAMcQoDlvABmiRkHoydyBR9zI6CyDeVRa2/iyua0/B2+rPuIzoCV/CgN14P5F0HVk53GIZw220WSqJPyA==",
"version": "3.23.0",
"resolved": "https://registry.npmjs.org/mapbox-gl/-/mapbox-gl-3.23.0.tgz",
"integrity": "sha512-zzjNAaMNvXnAVEUrYpOWmRVEBCIWgDAMLRPvSOoKY3smKvrINFVrRK/1jEpUDbEa7Ppf5Q/nwC6E07tz/i7IKw==",
"license": "SEE LICENSE IN LICENSE.txt",
"workspaces": [
"src/style-spec",
@@ -51021,7 +51039,7 @@
"dependencies": {
"chalk": "^5.6.2",
"lodash-es": "^4.18.1",
"yeoman-generator": "^8.1.2",
"yeoman-generator": "^8.2.2",
"yosay": "^3.0.0"
},
"devDependencies": {
@@ -51395,12 +51413,12 @@
},
"packages/superset-core": {
"name": "@apache-superset/core",
"version": "0.1.0-rc2",
"version": "0.1.0-rc3",
"license": "Apache-2.0",
"devDependencies": {
"@babel/cli": "^7.28.6",
"@babel/core": "^7.29.0",
"@babel/preset-env": "^7.29.2",
"@babel/preset-env": "^7.29.3",
"@babel/preset-react": "^7.28.5",
"@babel/preset-typescript": "^7.28.5",
"@emotion/styled": "^11.14.1",
@@ -52666,25 +52684,6 @@
"integrity": "sha512-lDB5YccMydFBtasVtxnZ3MRBHuaoE8GKsppq+EchKL2U4nK/DmEpPHNH8MZe5HkMtpSiTSOZwfN0tzYjO/lJEw==",
"license": "ISC"
},
"plugins/legacy-plugin-chart-map-box": {
"name": "@superset-ui/legacy-plugin-chart-map-box",
"version": "0.20.3",
"extraneous": true,
"license": "Apache-2.0",
"dependencies": {
"@math.gl/web-mercator": "^4.1.0",
"prop-types": "^15.8.1",
"react-map-gl": "^6.1.19",
"supercluster": "^8.0.1"
},
"peerDependencies": {
"@apache-superset/core": "*",
"@superset-ui/chart-controls": "*",
"@superset-ui/core": "*",
"mapbox-gl": "*",
"react": "^17.0.2"
}
},
"plugins/legacy-plugin-chart-paired-t-test": {
"name": "@superset-ui/legacy-plugin-chart-paired-t-test",
"version": "0.20.3",
@@ -53062,7 +53061,7 @@
"license": "Apache-2.0",
"dependencies": {
"@math.gl/web-mercator": "^4.1.0",
"mapbox-gl": "^3.22.0",
"mapbox-gl": "^3.23.0",
"maplibre-gl": "^5.24.0",
"react-map-gl": "^8.1.0",
"supercluster": "^8.0.1"
@@ -53473,103 +53472,6 @@
"version": "1.0.0",
"extraneous": true,
"license": "Apache-2.0"
},
"node_modules/mem-fs-editor/node_modules/array-differ": {
"version": "4.0.0",
"resolved": "https://registry.npmjs.org/array-differ/-/array-differ-4.0.0.tgz",
"integrity": "sha512-Q6VPTLMsmXZ47ENG3V+wQyZS1ZxXMxFyYzA+Z/GMrJ6yIutAIEf9wTyroTzmGjNfox9/h3GdGBCVh43GVFx4Uw==",
"license": "MIT",
"optional": true,
"peer": true,
"engines": {
"node": "^12.20.0 || ^14.13.1 || >=16.0.0"
},
"funding": {
"url": "https://github.com/sponsors/sindresorhus"
}
},
"node_modules/mem-fs-editor/node_modules/array-union": {
"version": "3.0.1",
"resolved": "https://registry.npmjs.org/array-union/-/array-union-3.0.1.tgz",
"integrity": "sha512-1OvF9IbWwaeiM9VhzYXVQacMibxpXOMYVNIvMtKRyX9SImBXpKcFr8XvFDeEslCyuH/t6KRt7HEO94AlP8Iatw==",
"license": "MIT",
"optional": true,
"peer": true,
"engines": {
"node": ">=12"
},
"funding": {
"url": "https://github.com/sponsors/sindresorhus"
}
},
"node_modules/mem-fs-editor/node_modules/globby": {
"version": "14.0.2",
"resolved": "https://registry.npmjs.org/globby/-/globby-14.0.2.tgz",
"integrity": "sha512-s3Fq41ZVh7vbbe2PN3nrW7yC7U7MFVc5c98/iTl9c2GawNMKx/J648KQRW6WKkuU8GIbbh2IXfIRQjOZnXcTnw==",
"license": "MIT",
"optional": true,
"peer": true,
"dependencies": {
"@sindresorhus/merge-streams": "^2.1.0",
"fast-glob": "^3.3.2",
"ignore": "^5.2.4",
"path-type": "^5.0.0",
"slash": "^5.1.0",
"unicorn-magic": "^0.1.0"
},
"engines": {
"node": ">=18"
},
"funding": {
"url": "https://github.com/sponsors/sindresorhus"
}
},
"node_modules/mem-fs-editor/node_modules/multimatch": {
"version": "7.0.0",
"resolved": "https://registry.npmjs.org/multimatch/-/multimatch-7.0.0.tgz",
"integrity": "sha512-SYU3HBAdF4psHEL/+jXDKHO95/m5P2RvboHT2Y0WtTttvJLP4H/2WS9WlQPFvF6C8d6SpLw8vjCnQOnVIVOSJQ==",
"license": "MIT",
"optional": true,
"peer": true,
"dependencies": {
"array-differ": "^4.0.0",
"array-union": "^3.0.1",
"minimatch": "^9.0.3"
},
"engines": {
"node": ">=18"
},
"funding": {
"url": "https://github.com/sponsors/sindresorhus"
}
},
"node_modules/mem-fs-editor/node_modules/path-type": {
"version": "5.0.0",
"resolved": "https://registry.npmjs.org/path-type/-/path-type-5.0.0.tgz",
"integrity": "sha512-5HviZNaZcfqP95rwpv+1HDgUamezbqdSYTyzjTvwtJSnIH+3vnbmWsItli8OFEndS984VT55M3jduxZbX351gg==",
"license": "MIT",
"optional": true,
"peer": true,
"engines": {
"node": ">=12"
},
"funding": {
"url": "https://github.com/sponsors/sindresorhus"
}
},
"node_modules/mem-fs-editor/node_modules/slash": {
"version": "5.1.0",
"resolved": "https://registry.npmjs.org/slash/-/slash-5.1.0.tgz",
"integrity": "sha512-ZA6oR3T/pEyuqwMgAKT0/hAv8oAXckzbkmR0UkUosQ+Mc4RxGoJkRmwHgHufaenlyAgE1Mxgpdcrf75y6XcnDg==",
"license": "MIT",
"optional": true,
"peer": true,
"engines": {
"node": ">=14.16"
},
"funding": {
"url": "https://github.com/sponsors/sindresorhus"
}
}
}
}

View File

@@ -183,7 +183,7 @@
"json-bigint": "^1.0.0",
"json-stringify-pretty-compact": "^2.0.0",
"lodash": "^4.18.1",
"mapbox-gl": "^3.22.0",
"mapbox-gl": "^3.23.0",
"markdown-to-jsx": "^9.7.16",
"match-sorter": "^8.3.0",
"memoize-one": "^5.2.1",
@@ -244,7 +244,7 @@
"@babel/plugin-transform-export-namespace-from": "^7.27.1",
"@babel/plugin-transform-modules-commonjs": "^7.28.6",
"@babel/plugin-transform-runtime": "^7.29.0",
"@babel/preset-env": "^7.29.2",
"@babel/preset-env": "^7.29.3",
"@babel/preset-react": "^7.28.5",
"@babel/preset-typescript": "^7.28.5",
"@babel/register": "^7.23.7",
@@ -306,7 +306,7 @@
"babel-plugin-dynamic-import-node": "^2.3.3",
"babel-plugin-jsx-remove-data-test-id": "^3.0.0",
"babel-plugin-lodash": "^3.3.4",
"baseline-browser-mapping": "^2.10.21",
"baseline-browser-mapping": "^2.10.24",
"cheerio": "1.2.0",
"concurrently": "^9.2.1",
"copy-webpack-plugin": "^14.0.0",

View File

@@ -37,7 +37,7 @@
"cross-env": "^10.1.0",
"fs-extra": "^11.3.4",
"jest": "^30.3.0",
"yeoman-test": "^11.3.1"
"yeoman-test": "^11.4.2"
},
"engines": {
"npm": ">= 4.0.0",

View File

@@ -1,6 +1,6 @@
{
"name": "@apache-superset/core",
"version": "0.1.0-rc2",
"version": "0.1.0-rc3",
"description": "This package contains UI elements, APIs, and utility functions used by Superset.",
"sideEffects": false,
"main": "lib/index.js",
@@ -75,7 +75,7 @@
"devDependencies": {
"@babel/cli": "^7.28.6",
"@babel/core": "^7.29.0",
"@babel/preset-env": "^7.29.2",
"@babel/preset-env": "^7.29.3",
"@babel/preset-react": "^7.28.5",
"@babel/preset-typescript": "^7.28.5",
"typescript": "^5.0.0",

View File

@@ -27,7 +27,7 @@
],
"dependencies": {
"@math.gl/web-mercator": "^4.1.0",
"mapbox-gl": "^3.22.0",
"mapbox-gl": "^3.23.0",
"maplibre-gl": "^5.24.0",
"react-map-gl": "^8.1.0",
"supercluster": "^8.0.1"

View File

@@ -50,6 +50,7 @@ import {
getTimeFormatterForGranularity,
BinaryQueryObjectFilterClause,
extractTextFromHTML,
TimeGranularity,
} from '@superset-ui/core';
import {
styled,
@@ -309,6 +310,67 @@ function SelectPageSize({
const getNoResultsMessage = (filter: string) =>
filter ? t('No matching records found') : t('No records found');
/**
 * Calculates the half-open temporal range for a bucket, following the
 * standard SQL range pattern: [start, end).
*/
function getTimeRangeFromGranularity(
startTime: Date,
granularity: TimeGranularity,
): [Date, Date] {
const time = startTime.getTime();
const date = startTime.getUTCDate();
const month = startTime.getUTCMonth();
const year = startTime.getUTCFullYear();
// Millisecond conversion constants
const MS_IN_SECOND = 1000;
const MS_IN_MINUTE = 60 * MS_IN_SECOND;
const MS_IN_HOUR = 60 * MS_IN_MINUTE;
switch (granularity) {
case TimeGranularity.SECOND:
return [startTime, new Date(time + MS_IN_SECOND)];
case TimeGranularity.MINUTE:
return [startTime, new Date(time + MS_IN_MINUTE)];
case TimeGranularity.FIVE_MINUTES:
return [startTime, new Date(time + MS_IN_MINUTE * 5)];
case TimeGranularity.TEN_MINUTES:
return [startTime, new Date(time + MS_IN_MINUTE * 10)];
case TimeGranularity.FIFTEEN_MINUTES:
return [startTime, new Date(time + MS_IN_MINUTE * 15)];
case TimeGranularity.THIRTY_MINUTES:
return [startTime, new Date(time + MS_IN_MINUTE * 30)];
case TimeGranularity.HOUR:
return [startTime, new Date(time + MS_IN_HOUR)];
case TimeGranularity.DAY:
case TimeGranularity.DATE:
return [startTime, new Date(Date.UTC(year, month, date + 1))];
case TimeGranularity.WEEK:
case TimeGranularity.WEEK_STARTING_SUNDAY:
case TimeGranularity.WEEK_STARTING_MONDAY:
return [startTime, new Date(Date.UTC(year, month, date + 7))];
case TimeGranularity.WEEK_ENDING_SATURDAY:
case TimeGranularity.WEEK_ENDING_SUNDAY:
// Week-ending buckets are labeled by the bucket's final day.
return [
new Date(Date.UTC(year, month, date - 6)),
new Date(Date.UTC(year, month, date + 1)),
];
case TimeGranularity.MONTH:
return [startTime, new Date(Date.UTC(year, month + 1, 1))];
case TimeGranularity.QUARTER:
return [
startTime,
new Date(Date.UTC(year, Math.floor(month / 3) * 3 + 3, 1)),
];
case TimeGranularity.YEAR:
return [startTime, new Date(Date.UTC(year + 1, 0, 1))];
default:
// Unknown grains fall back to a one-day bucket.
return [startTime, new Date(Date.UTC(year, month, date + 1))];
}
}
export default function TableChart<D extends DataRecord = DataRecord>(
props: TableChartTransformedProps<D> & {
sticky?: DataTableProps<D>['sticky'];
@@ -471,7 +533,7 @@ export default function TableChart<D extends DataRecord = DataRecord>(
// so that cross-filters work on the receiving chart
const resolvedCol = columnLabelToNameMap[col] ?? col;
const val = ensureIsArray(updatedFilters?.[col]);
if (!val.length)
if (
!val.length ||
val[0] === null ||
(val[0] instanceof DateWithFormatter && val[0].input === null)
)
return {
col: resolvedCol,
op: 'IS NULL' as const,
@@ -578,15 +640,49 @@ export default function TableChart<D extends DataRecord = DataRecord>(
const drillToDetailFilters: BinaryQueryObjectFilterClause[] = [];
filteredColumnsMeta.forEach(col => {
if (!col.isMetric) {
let dataRecordValue = value[col.key];
dataRecordValue = extractTextFromHTML(dataRecordValue);
const dataRecordValue = value[col.key];
drillToDetailFilters.push({
col: col.key,
op: '==',
val: dataRecordValue as string | number | boolean,
formattedVal: formatColumnValue(col, dataRecordValue)[1],
});
// Explicitly handle NULL values for temporal and non-temporal columns.
// DateWithFormatter objects wrap nulls, so both the raw value and the
// wrapped input must be checked.
if (
dataRecordValue == null ||
(dataRecordValue instanceof DateWithFormatter && dataRecordValue.input == null)
) {
drillToDetailFilters.push({
col: col.key,
op: 'IS NULL' as const,
val: null,
});
} else if (col.dataType === GenericDataType.Temporal && timeGrain) {
const startTime =
dataRecordValue instanceof Date
? dataRecordValue
: new Date(dataRecordValue as string | number);
const [rangeStartTime, rangeEndTime] = getTimeRangeFromGranularity(
startTime,
timeGrain,
);
const timeRangeValue = `${rangeStartTime.toISOString()} : ${rangeEndTime.toISOString()}`;
drillToDetailFilters.push({
col: col.key,
op: 'TEMPORAL_RANGE',
val: timeRangeValue,
grain: timeGrain,
formattedVal: formatColumnValue(col, dataRecordValue)[1],
});
} else {
// Non-temporal columns use exact match
const sanitizedValue = extractTextFromHTML(dataRecordValue);
drillToDetailFilters.push({
col: col.key,
op: '==',
val: sanitizedValue as string | number | boolean,
formattedVal: formatColumnValue(col, sanitizedValue)[1],
});
}
}
});
onContextMenu(clientX, clientY, {
@@ -600,7 +696,7 @@ export default function TableChart<D extends DataRecord = DataRecord>(
filters: [
{
col: cellPoint.key,
op: '==',
op: (cellPoint.value == null ||
(cellPoint.value instanceof DateWithFormatter &&
cellPoint.value.input == null)
? 'IS NULL'
: '==') as any,
val: extractTextFromHTML(cellPoint.value),
},
],
@@ -615,6 +711,7 @@ export default function TableChart<D extends DataRecord = DataRecord>(
isRawRecords,
filteredColumnsMeta,
getCrossFilterDataMask,
timeGrain,
]);
const getHeaderColumns = useCallback(

View File

@@ -130,13 +130,12 @@ const processComparisonTotals = (
Object.keys(totalRecord).forEach(key => {
if (totalRecord[key] !== undefined && !key.includes(comparisonSuffix)) {
transformedTotals[`Main ${key}`] =
parseInt(transformedTotals[`Main ${key}`]?.toString() || '0', 10) +
parseInt(totalRecord[key]?.toString() || '0', 10);
parseFloat(transformedTotals[`Main ${key}`]?.toString() || '0') +
parseFloat(totalRecord[key]?.toString() || '0');
transformedTotals[`# ${key}`] =
parseInt(transformedTotals[`# ${key}`]?.toString() || '0', 10) +
parseInt(
parseFloat(transformedTotals[`# ${key}`]?.toString() || '0') +
parseFloat(
totalRecord[`${key}__${comparisonSuffix}`]?.toString() || '0',
10,
);
const { valueDifference, percentDifferenceNum } = calculateDifferences(
transformedTotals[`Main ${key}`] as number,

View File

@@ -2360,3 +2360,76 @@ describe('plugin-chart-table', () => {
});
});
});
/**
* DRILL-TO-DETAIL FIX VERIFICATION (#23847)
*/
describe('Drill-to-Detail Temporal Range Logic', () => {
const renderChartAndOpenContextMenu = (
timeGrain?: TimeGranularity,
timestampValue?: string | number | null,
) => {
const onContextMenu = jest.fn();
const data = cloneDeep(testData.basic);
if (timestampValue !== undefined) {
data.queriesData[0].data[0].__timestamp = timestampValue;
}
const props = transformProps({
...data,
rawFormData: {
...data.rawFormData,
...(timeGrain ? { time_grain_sqla: timeGrain } : {}),
},
hooks: { onAddFilter: jest.fn(), onContextMenu, setDataMask: jest.fn() },
});
render(<TableChart {...props} sticky={false} />);
const tbody = screen.getAllByRole('rowgroup')[1];
fireEvent.contextMenu(tbody.querySelectorAll('td')[0]);
const [, , { drillToDetail }] = onContextMenu.mock.calls[0];
return drillToDetail.find((f: any) => f.col === '__timestamp');
};
test('uses TEMPORAL_RANGE for monthly grain', () => {
const filter = renderChartAndOpenContextMenu(TimeGranularity.MONTH);
expect(filter.op).toBe('TEMPORAL_RANGE');
expect(filter.val).toContain(
'2020-01-01T12:34:56.000Z : 2020-02-01T00:00:00.000Z',
);
});
test('uses the full bucket for week ending sunday grain', () => {
const filter = renderChartAndOpenContextMenu(
TimeGranularity.WEEK_ENDING_SUNDAY,
'2020-01-05T00:00:00',
);
expect(filter.op).toBe('TEMPORAL_RANGE');
expect(filter.val).toBe(
'2019-12-30T00:00:00.000Z : 2020-01-06T00:00:00.000Z',
);
});
test('uses the full bucket for week ending saturday grain', () => {
const filter = renderChartAndOpenContextMenu(
TimeGranularity.WEEK_ENDING_SATURDAY,
'2020-01-04T00:00:00',
);
expect(filter.op).toBe('TEMPORAL_RANGE');
expect(filter.val).toBe(
'2019-12-29T00:00:00.000Z : 2020-01-05T00:00:00.000Z',
);
});
test('emits IS NULL for NULL values instead of an epoch (1970) timestamp', () => {
const filter = renderChartAndOpenContextMenu(TimeGranularity.MONTH, null);
expect(filter.op).toBe('IS NULL');
expect(filter.val).toBeNull();
});
});

View File

@@ -590,7 +590,9 @@ class BaseEngineSpec: # pylint: disable=too-many-public-methods
# Driver-specific params to be included in the `get_oauth2_token` request body
oauth2_additional_token_request_params: dict[str, Any] = {}
# Driver-specific exception that should be mapped to OAuth2RedirectError
oauth2_exception = OAuth2RedirectError
oauth2_exception: type[Exception] | tuple[type[Exception], ...] = (
OAuth2RedirectError
)
# Is the query id related to the connection?
# The default value is True, which means that the query id is determined when

View File

@@ -31,6 +31,7 @@ from marshmallow import fields, Schema
from marshmallow.exceptions import ValidationError
from requests import Session
from shillelagh.adapters.api.gsheets.lib import SCOPES
from shillelagh.exceptions import UnauthenticatedError
from sqlalchemy.engine import create_engine
from sqlalchemy.engine.reflection import Inspector
from sqlalchemy.engine.url import URL
@@ -40,7 +41,7 @@ from superset.databases.schemas import encrypted_field_properties, EncryptedStri
from superset.db_engine_specs.base import DatabaseCategory
from superset.db_engine_specs.shillelagh import ShillelaghEngineSpec
from superset.errors import ErrorLevel, SupersetError, SupersetErrorType
from superset.exceptions import SupersetException
from superset.exceptions import OAuth2TokenRefreshError, SupersetException
from superset.utils import json
from superset.utils.oauth2 import get_oauth2_access_token
@@ -151,6 +152,7 @@ class GSheetsEngineSpec(ShillelaghEngineSpec):
"https://accounts.google.com/o/oauth2/v2/auth"
)
oauth2_token_request_uri = "https://oauth2.googleapis.com/token" # noqa: S105
oauth2_exception = (UnauthenticatedError, OAuth2TokenRefreshError)
@classmethod
def get_oauth2_authorization_uri(

View File

@@ -62,6 +62,7 @@ Dataset Management:
- list_datasets: List datasets with advanced filters (1-based pagination)
- get_dataset_info: Get detailed dataset information by ID (includes columns/metrics)
- create_virtual_dataset: Save a SQL query as a virtual dataset for charting
- query_dataset: Query a dataset using its semantic layer (saved metrics, dimensions, filters) without needing a saved chart
Chart Management:
- list_charts: List charts with advanced filters (1-based pagination)
@@ -164,6 +165,17 @@ Use created_by_me for authorship, owned_by_me for edit ownership, or both
together for the union. All flags can be combined with 'filters' but not
with 'search'.
To query a dataset's semantic layer (metrics, dimensions):
1. list_datasets(request={{}}) -> find a dataset
2. get_dataset_info(request={{"identifier": <id>}}) -> examine columns AND metrics
3. query_dataset(request={{
"dataset_id": <id>,
"metrics": ["count", "avg_revenue"],
"columns": ["category"],
"time_range": "Last 7 days",
"row_limit": 100
}}) -> returns tabular data using saved metrics and dimensions
To explore data with SQL:
1. list_datasets(request={{}}) -> find a dataset and note its database_id
2. execute_sql(request={{"database_id": <id>, "sql": "SELECT ..."}})
@@ -520,6 +532,7 @@ from superset.mcp_service.dataset.tool import ( # noqa: F401, E402
create_virtual_dataset,
get_dataset_info,
list_datasets,
query_dataset,
)
from superset.mcp_service.explore.tool import ( # noqa: F401, E402
generate_explore_link,

View File

@@ -632,6 +632,15 @@ def mcp_auth_hook(tool_func: F) -> F: # noqa: C901
@functools.wraps(tool_func)
def sync_wrapper(*args: Any, **kwargs: Any) -> Any:
with _get_app_context_manager():
# Clear any stale thread-local SQLAlchemy session before user lookup.
# Thread pool workers reuse threads across requests; db.session is
# scoped by thread (not ContextVar), so a prior request's session may
# still be bound to a different tenant's DB engine. Removing it here
# ensures the next DB access creates a fresh session bound to the
# correct engine for the current request.
from superset.extensions import db
db.session.remove()
user = _setup_user_context()
# No Flask context - this is a FastMCP internal operation

View File

@@ -70,6 +70,8 @@ SORTABLE_CHART_COLUMNS = [
"created_on",
]
_DEFAULT_LIST_CHARTS_REQUEST = ListChartsRequest()
@tool(
tags=["core"],
@@ -81,7 +83,8 @@ SORTABLE_CHART_COLUMNS = [
),
)
async def list_charts(
request: ListChartsRequest, ctx: Context
request: ListChartsRequest | None = None,
ctx: Context | None = None,
) -> ChartList | ChartError:
"""List charts with filtering and search.
@@ -91,6 +94,7 @@ async def list_charts(
Sortable columns for order_column: id, slice_name, viz_type, description,
changed_on, created_on
"""
request = request or _DEFAULT_LIST_CHARTS_REQUEST.model_copy(deep=True)
await ctx.info(
"Listing charts: page=%s, page_size=%s, search=%s"
% (

View File

@@ -65,6 +65,8 @@ SORTABLE_DASHBOARD_COLUMNS = [
"created_on",
]
_DEFAULT_LIST_DASHBOARDS_REQUEST = ListDashboardsRequest()
@tool(
tags=["core"],
@@ -76,7 +78,8 @@ SORTABLE_DASHBOARD_COLUMNS = [
),
)
async def list_dashboards(
request: ListDashboardsRequest, ctx: Context
request: ListDashboardsRequest | None = None,
ctx: Context | None = None,
) -> DashboardList:
"""List dashboards with filtering and search. Returns dashboard metadata
including title, slug, URL, and last modified time. Use select_columns to
@@ -85,6 +88,7 @@ async def list_dashboards(
Sortable columns for order_column: id, dashboard_title, slug, published,
changed_on, created_on
"""
request = request or _DEFAULT_LIST_DASHBOARDS_REQUEST.model_copy(deep=True)
await ctx.info(
"Listing dashboards: page=%s, page_size=%s, search=%s"
% (


@@ -36,10 +36,13 @@ from pydantic import (
)
from superset.daos.base import ColumnOperator, ColumnOperatorEnum
from superset.mcp_service.chart.schemas import DataColumn, PerformanceMetadata
from superset.mcp_service.common.cache_schemas import (
CacheStatus,
CreatedByMeMixin,
MetadataCacheControl,
OwnedByMeMixin,
QueryCacheControl,
)
from superset.mcp_service.constants import DEFAULT_PAGE_SIZE, MAX_PAGE_SIZE
from superset.mcp_service.privacy import filter_user_directory_fields
@@ -393,6 +396,146 @@ class CreateVirtualDatasetResponse(BaseModel):
)
VALID_FILTER_OPS = Literal[
"==",
"!=",
">",
"<",
">=",
"<=",
"LIKE",
"NOT LIKE",
"ILIKE",
"NOT ILIKE",
"IN",
"NOT IN",
"IS NULL",
"IS NOT NULL",
"IS TRUE",
"IS FALSE",
"TEMPORAL_RANGE",
]
class QueryDatasetFilter(BaseModel):
"""A single filter condition for dataset queries."""
col: str = Field(..., description="Column name to filter on")
op: VALID_FILTER_OPS = Field(
...,
description=(
'Filter operator. Use "==" for equals, "!=" for not equals, '
'"IN" / "NOT IN" for membership, "IS NULL" / "IS NOT NULL", '
'"LIKE" for pattern matching, "TEMPORAL_RANGE" for time filters.'
),
)
val: Any = Field(
default=None,
description="Filter value (omit for IS NULL/IS NOT NULL)",
)
class QueryDatasetRequest(QueryCacheControl):
"""Request schema for query_dataset tool."""
dataset_id: int | str = Field(
...,
description="Dataset identifier — numeric ID or UUID string.",
)
metrics: List[str] = Field(
default_factory=list,
description=(
"Saved metric names to compute (e.g. ['count', 'avg_revenue']). "
"Use get_dataset_info to discover available metrics."
),
)
columns: List[str] = Field(
default_factory=list,
description=(
"Column/dimension names for GROUP BY or SELECT "
"(e.g. ['category', 'region']). "
"Use get_dataset_info to discover available columns."
),
)
filters: List[QueryDatasetFilter] = Field(
default_factory=list,
description=(
'Filter conditions (e.g. [{"col": "status", "op": "==", "val": "active"}]).'
),
)
time_range: str | None = Field(
default=None,
description=(
"Time range filter (e.g. 'Last 7 days', 'Last month', "
"'2024-01-01 : 2024-12-31'). Requires a temporal column "
"on the dataset."
),
)
time_column: str | None = Field(
default=None,
description=(
"Temporal column to apply time_range to. "
"Defaults to the dataset's main datetime column."
),
)
order_by: List[str] | None = Field(
default=None,
description="Column or metric names to sort results by.",
)
order_desc: bool = Field(
default=True,
description="Sort descending (True) or ascending (False).",
)
row_limit: int = Field(
default=1000,
ge=1,
le=50000,
description="Maximum number of rows to return (default 1000, max 50000).",
)
@model_validator(mode="after")
def validate_metrics_or_columns(self) -> "QueryDatasetRequest":
"""At least one of metrics or columns must be provided."""
if not self.metrics and not self.columns:
raise ValueError(
"At least one of 'metrics' or 'columns' must be provided. "
"Use get_dataset_info to discover available metrics and columns."
)
return self
class QueryDatasetResponse(BaseModel):
"""Response schema for query_dataset tool."""
model_config = ConfigDict(ser_json_timedelta="iso8601")
dataset_id: int = Field(..., description="Dataset ID")
dataset_name: str = Field(..., description="Dataset name")
columns: List[DataColumn] = Field(
default_factory=list, description="Column metadata for returned data"
)
data: List[Dict[str, Any]] = Field(
default_factory=list, description="Query result rows"
)
row_count: int = Field(0, description="Number of rows returned")
total_rows: int | None = Field(
None, description="Total row count from the query engine"
)
summary: str = Field("", description="Human-readable summary of the results")
performance: PerformanceMetadata | None = Field(
None, description="Query performance metadata"
)
cache_status: CacheStatus | None = Field(
None, description="Cache hit/miss information"
)
applied_filters: List[QueryDatasetFilter] = Field(
default_factory=list, description="Filters that were applied to the query"
)
warnings: List[str] = Field(
default_factory=list, description="Any warnings encountered during execution"
)
def _parse_json_field(obj: Any, field_name: str) -> Dict[str, Any] | None:
"""Parse a field that may be stored as a JSON string into a dict."""
value = getattr(obj, field_name, None)

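The `validate_metrics_or_columns` validator above rejects empty queries at parse time; a trimmed, runnable version of just that cross-field rule:

```python
from typing import List
from pydantic import BaseModel, Field, model_validator

class QueryDatasetRequest(BaseModel):  # trimmed to the cross-field rule above
    metrics: List[str] = Field(default_factory=list)
    columns: List[str] = Field(default_factory=list)

    @model_validator(mode="after")
    def validate_metrics_or_columns(self) -> "QueryDatasetRequest":
        if not self.metrics and not self.columns:
            raise ValueError(
                "At least one of 'metrics' or 'columns' must be provided."
            )
        return self

ok = QueryDatasetRequest(metrics=["count"])
try:
    QueryDatasetRequest()           # neither field set
    rejected = False
except ValueError:                  # pydantic's ValidationError subclasses ValueError
    rejected = True
assert rejected
```

Running the rule in an `after`-mode validator means both fields are already parsed, so the check sees final values rather than raw input.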

@@ -18,9 +18,11 @@
from .create_virtual_dataset import create_virtual_dataset
from .get_dataset_info import get_dataset_info
from .list_datasets import list_datasets
from .query_dataset import query_dataset
__all__ = [
"create_virtual_dataset",
"list_datasets",
"get_dataset_info",
"query_dataset",
]


@@ -0,0 +1,489 @@
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership. The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.
"""
MCP tool: query_dataset
Query a dataset using its semantic layer (saved metrics, calculated columns,
dimensions) without requiring a saved chart.
"""
import difflib
import logging
import time
from typing import Any
from fastmcp import Context
from sqlalchemy.exc import SQLAlchemyError
from sqlalchemy.orm import joinedload, subqueryload
from superset_core.mcp.decorators import tool, ToolAnnotations
from superset.commands.exceptions import CommandException
from superset.exceptions import OAuth2Error, OAuth2RedirectError, SupersetException
from superset.extensions import event_logger
from superset.mcp_service.chart.schemas import DataColumn, PerformanceMetadata
from superset.mcp_service.dataset.schemas import (
DatasetError,
QueryDatasetFilter,
QueryDatasetRequest,
QueryDatasetResponse,
)
from superset.mcp_service.privacy import (
DATA_MODEL_METADATA_ERROR_TYPE,
requires_data_model_metadata_access,
user_can_view_data_model_metadata,
)
from superset.mcp_service.utils import _is_uuid
from superset.mcp_service.utils.cache_utils import get_cache_status_from_result
from superset.mcp_service.utils.oauth2_utils import build_oauth2_redirect_message
logger = logging.getLogger(__name__)
def _resolve_dataset(identifier: int | str, eager_options: list[Any]) -> Any | None:
"""Resolve a dataset by int ID or UUID string.
Replicates the identifier resolution logic from ModelGetInfoCore._find_object().
"""
from superset.daos.dataset import DatasetDAO
opts = eager_options or None
if isinstance(identifier, int):
return DatasetDAO.find_by_id(identifier, query_options=opts)
# Try parsing as int
try:
id_val = int(identifier)
return DatasetDAO.find_by_id(id_val, query_options=opts)
except (ValueError, TypeError):
pass
# Try UUID
if _is_uuid(str(identifier)):
return DatasetDAO.find_by_id(identifier, id_column="uuid", query_options=opts)
return None
def _validate_names(
requested: list[str],
valid: set[str],
kind: str,
) -> list[str]:
"""Return list of error messages for names not found in *valid*.
Includes close-match suggestions when available.
"""
errors: list[str] = []
for name in requested:
if name not in valid:
suggestions = difflib.get_close_matches(name, valid, n=3, cutoff=0.6)
msg = f"Unknown {kind}: '{name}'"
if suggestions:
msg += f". Did you mean: {', '.join(suggestions)}?"
errors.append(msg)
return errors
@requires_data_model_metadata_access
@tool(
tags=["data"],
class_permission_name="Dataset",
annotations=ToolAnnotations(
title="Query dataset",
readOnlyHint=True,
destructiveHint=False,
),
)
async def query_dataset( # noqa: C901
request: QueryDatasetRequest, ctx: Context
) -> QueryDatasetResponse | DatasetError:
"""Query a dataset using its semantic layer (saved metrics, dimensions, filters).
Returns tabular data without requiring a saved chart. Use this when you want
to compute saved metrics, group by dimensions, or apply filters directly
against a dataset's curated semantic layer.
Workflow:
1. list_datasets -> find a dataset
2. get_dataset_info -> discover available columns and metrics
3. query_dataset -> query using metric names and column names
Example:
```json
{
"dataset_id": 123,
"metrics": ["count", "avg_revenue"],
"columns": ["product_category"],
"time_range": "Last 7 days",
"row_limit": 100
}
```
"""
await ctx.info(
"Starting dataset query: dataset_id=%s, metrics=%s, columns=%s, "
"row_limit=%s"
% (
request.dataset_id,
request.metrics,
request.columns,
request.row_limit,
)
)
try:
from superset.commands.chart.data.get_data_command import ChartDataCommand
from superset.common.query_context_factory import QueryContextFactory
from superset.connectors.sqla.models import SqlaTable
# ------------------------------------------------------------------
# Step 1: Check data-model metadata access BEFORE the dataset lookup.
# Doing this first prevents leaking dataset existence — restricted
# users always receive DataModelMetadataRestricted, never NotFound.
# The decorator hides this tool from search; this check also guards
# against direct calls that bypass tool discovery.
# ------------------------------------------------------------------
if not user_can_view_data_model_metadata():
await ctx.warning("Dataset metadata access blocked by privacy controls")
return DatasetError.create(
error=(
"You don't have permission to access dataset details for your role."
),
error_type=DATA_MODEL_METADATA_ERROR_TYPE,
)
# ------------------------------------------------------------------
# Step 2: Resolve dataset
# ------------------------------------------------------------------
await ctx.report_progress(1, 5, "Looking up dataset")
eager_options = [
subqueryload(SqlaTable.columns),
subqueryload(SqlaTable.metrics),
joinedload(SqlaTable.database),
]
with event_logger.log_context(action="mcp.query_dataset.lookup"):
dataset = _resolve_dataset(request.dataset_id, eager_options)
if dataset is None:
await ctx.error("Dataset not found: identifier=%s" % (request.dataset_id,))
return DatasetError.create(
error=f"No dataset found with identifier: {request.dataset_id}",
error_type="NotFound",
)
dataset_name = getattr(dataset, "table_name", None) or f"Dataset {dataset.id}"
await ctx.info(
"Dataset found: id=%s, name=%s, columns=%s, metrics=%s"
% (
dataset.id,
dataset_name,
len(dataset.columns),
len(dataset.metrics),
)
)
# ------------------------------------------------------------------
# Step 3: Validate requested columns and metrics
# ------------------------------------------------------------------
await ctx.report_progress(2, 5, "Validating columns and metrics")
valid_columns = {c.column_name for c in dataset.columns}
valid_metrics = {m.metric_name for m in dataset.metrics}
validation_errors: list[str] = []
validation_errors.extend(
_validate_names(request.columns, valid_columns, "column")
)
validation_errors.extend(
_validate_names(request.metrics, valid_metrics, "metric")
)
# Validate filter column names against dataset columns
filter_cols = [f.col for f in request.filters]
validation_errors.extend(
_validate_names(filter_cols, valid_columns, "filter column")
)
# Validate order_by names against columns + metrics
if request.order_by:
valid_orderby = valid_columns | valid_metrics
validation_errors.extend(
_validate_names(request.order_by, valid_orderby, "order_by")
)
if validation_errors:
error_msg = "; ".join(validation_errors)
await ctx.error("Validation failed: %s" % (error_msg,))
return DatasetError.create(
error=error_msg,
error_type="ValidationError",
)
# ------------------------------------------------------------------
# Step 4: Build filters and time range
# ------------------------------------------------------------------
warnings: list[str] = []
query_filters: list[dict[str, Any]] = [
{"col": f.col, "op": f.op, "val": f.val} for f in request.filters
]
# Track all applied filters (including synthesized ones) for the response.
effective_filters: list[QueryDatasetFilter] = list(request.filters)
granularity: str | None = None
if request.time_range:
temporal_col = request.time_column or getattr(
dataset, "main_dttm_col", None
)
if not temporal_col:
await ctx.error("time_range provided but no temporal column available")
return DatasetError.create(
error=(
"time_range was provided but no temporal column is available. "
"Either set time_column explicitly or ensure the dataset has "
"a main datetime column configured."
),
error_type="ValidationError",
)
# Validate that the temporal column actually exists on the dataset
if temporal_col not in valid_columns:
await ctx.error("time_column '%s' not found on dataset" % temporal_col)
return DatasetError.create(
error=(
f"time_column '{temporal_col}' does not exist on this dataset."
),
error_type="ValidationError",
)
# Warn if the chosen temporal column isn't marked as datetime
dttm_cols = {c.column_name for c in dataset.columns if c.is_dttm}
if temporal_col not in dttm_cols:
warnings.append(
f"Column '{temporal_col}' is not marked as a datetime "
f"column on this dataset. Time filtering may not work "
f"as expected."
)
query_filters.append(
{
"col": temporal_col,
"op": "TEMPORAL_RANGE",
"val": request.time_range,
}
)
effective_filters.append(
QueryDatasetFilter(
col=temporal_col,
op="TEMPORAL_RANGE",
val=request.time_range,
)
)
granularity = temporal_col
await ctx.debug(
"Time filter: column=%s, range=%s" % (temporal_col, request.time_range)
)
# ------------------------------------------------------------------
# Step 5: Build query dict
# ------------------------------------------------------------------
await ctx.report_progress(3, 5, "Building query")
query_dict: dict[str, Any] = {
"filters": query_filters,
"columns": request.columns,
"metrics": request.metrics,
"row_limit": request.row_limit,
"order_desc": request.order_desc,
}
if granularity:
query_dict["granularity"] = granularity
if request.order_by:
# OrderBy = tuple[Metric | Column, bool] where bool is ascending
query_dict["orderby"] = [
(col, not request.order_desc) for col in request.order_by
]
await ctx.debug("Query dict keys: %s" % (sorted(query_dict.keys()),))
# ------------------------------------------------------------------
# Step 6: Create QueryContext and execute
# ------------------------------------------------------------------
await ctx.report_progress(4, 5, "Executing query")
start_time = time.time()
with event_logger.log_context(action="mcp.query_dataset.execute"):
factory = QueryContextFactory()
# datasource_type is "table" because this tool queries SqlaTable
# datasets (Superset's built-in semantic layer). External semantic
# layers (dbt, Snowflake Cortex, etc.) use "semantic_view" and have
# a different query path — see SemanticView + mapper.py.
query_context = factory.create(
datasource={"id": dataset.id, "type": "table"},
queries=[query_dict],
form_data={},
force=not request.use_cache or request.force_refresh,
custom_cache_timeout=request.cache_timeout,
)
command = ChartDataCommand(query_context)
command.validate()
result = command.run()
query_duration_ms = int((time.time() - start_time) * 1000)
if not result or "queries" not in result or len(result["queries"]) == 0:
await ctx.warning("Query returned no results for dataset %s" % dataset.id)
return DatasetError.create(
error="Query returned no results.",
error_type="EmptyQuery",
)
# ------------------------------------------------------------------
# Step 7: Format response
# ------------------------------------------------------------------
await ctx.report_progress(5, 5, "Formatting results")
query_result = result["queries"][0]
data = query_result.get("data", [])
raw_columns = query_result.get("colnames", [])
if not data:
return QueryDatasetResponse(
dataset_id=dataset.id,
dataset_name=dataset_name,
columns=[],
data=[],
row_count=0,
total_rows=0,
summary=f"Query on '{dataset_name}' returned no data.",
performance=PerformanceMetadata(
query_duration_ms=query_duration_ms,
cache_status="no_data",
),
cache_status=get_cache_status_from_result(
query_result, force_refresh=request.force_refresh
),
applied_filters=effective_filters,
warnings=warnings,
)
# Build column metadata in a single pass per column.
# Cap stats computation at STATS_SAMPLE rows to avoid O(rows*cols)
# overhead on large result sets (row_limit allows up to 50k).
stats_sample_size = 5000
stats_rows = data[:stats_sample_size]
columns_meta: list[DataColumn] = []
for col_name in raw_columns:
sample_values = [
row.get(col_name) for row in data[:3] if row.get(col_name) is not None
]
data_type = "string"
if sample_values:
if all(isinstance(v, bool) for v in sample_values):
data_type = "boolean"
elif all(isinstance(v, (int, float)) for v in sample_values):
data_type = "numeric"
# Compute null_count and unique non-null values in one pass
null_count = 0
unique_vals: set[str] = set()
for row in stats_rows:
val = row.get(col_name)
if val is None:
null_count += 1
else:
unique_vals.add(str(val))
columns_meta.append(
DataColumn(
name=col_name,
display_name=col_name.replace("_", " ").title(),
data_type=data_type,
sample_values=sample_values[:3],
null_count=null_count,
unique_count=len(unique_vals),
)
)
cache_status = get_cache_status_from_result(
query_result, force_refresh=request.force_refresh
)
cache_label = "cached" if cache_status and cache_status.cache_hit else "fresh"
summary = (
f"Dataset '{dataset_name}': {len(data)} rows, "
f"{len(raw_columns)} columns ({cache_label})."
)
await ctx.info(
"Query complete: rows=%s, columns=%s, duration=%sms"
% (len(data), len(raw_columns), query_duration_ms)
)
return QueryDatasetResponse(
dataset_id=dataset.id,
dataset_name=dataset_name,
columns=columns_meta,
data=data,
row_count=len(data),
total_rows=query_result.get("rowcount"),
summary=summary,
performance=PerformanceMetadata(
query_duration_ms=query_duration_ms,
cache_status=cache_label,
),
cache_status=cache_status,
applied_filters=effective_filters,
warnings=warnings,
)
except OAuth2RedirectError as exc:
redirect_msg = build_oauth2_redirect_message(exc)
await ctx.error("OAuth2 redirect required: %s" % (redirect_msg,))
return DatasetError.create(
error=redirect_msg,
error_type="OAuth2Redirect",
)
except OAuth2Error as exc:
await ctx.error("OAuth2 error: %s" % (str(exc),))
return DatasetError.create(
error=f"OAuth2 authentication error: {exc}",
error_type="OAuth2Error",
)
except (CommandException, SupersetException) as exc:
await ctx.error("Query failed: %s" % (str(exc),))
return DatasetError.create(
error=f"Query execution failed: {exc}",
error_type="QueryError",
)
except SQLAlchemyError as exc:
await ctx.error("Database error: %s" % (str(exc),))
return DatasetError.create(
error=f"Database error: {exc}",
error_type="DatabaseError",
)
except Exception as exc:
logger.exception(
"Unexpected error while querying dataset: %s: %s",
type(exc).__name__,
str(exc),
)
await ctx.error("Unexpected error: %s: %s" % (type(exc).__name__, str(exc)))
return DatasetError.create(
error="An unexpected error occurred while querying the dataset.",
error_type="UnexpectedError",
)

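The response-formatting step computes a type guess plus null/unique counts in a single pass per column; distilled from the loop above, with made-up sample data:

```python
# Sample result rows, as returned by the query engine (illustrative data).
data = [
    {"category": "a", "revenue": 10.0},
    {"category": "b", "revenue": None},
    {"category": "a", "revenue": 3.5},
]
col = "revenue"

# Type inference uses only the first few non-null values.
sample = [row.get(col) for row in data[:3] if row.get(col) is not None]
data_type = "string"
if sample:
    if all(isinstance(v, bool) for v in sample):
        data_type = "boolean"
    elif all(isinstance(v, (int, float)) for v in sample):
        data_type = "numeric"

# null_count and unique values are gathered in one pass over the (capped) rows.
null_count = 0
unique_vals: set[str] = set()
for row in data:
    val = row.get(col)
    if val is None:
        null_count += 1
    else:
        unique_vals.add(str(val))

assert (data_type, null_count, len(unique_vals)) == ("numeric", 1, 2)
```

Capping the stats pass (5000 rows in the tool) keeps the per-column work bounded even when `row_limit` allows 50k rows.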

@@ -1585,11 +1585,17 @@ class ExploreMixin: # pylint: disable=too-many-public-methods
for metric in metric_names
}
# When the original query has limit or offset we won't apply those
# to the subquery, so we prevent data inconsistency due to missing records
# in the dataframes when performing the join
# The subquery drops row_offset (the offset period's own row ordering
# differs from the main query's, so applying the same offset would skew
# the join). It must still fetch enough rows to cover the main query's
# window, hence row_limit + row_offset when a chart limit is set.
if query_object.row_limit or query_object.row_offset:
query_object_clone_dct["row_limit"] = app.config["ROW_LIMIT"]
if query_object.row_limit:
query_object_clone_dct["row_limit"] = (
query_object.row_limit + query_object.row_offset
)
else:
query_object_clone_dct["row_limit"] = app.config["ROW_LIMIT"]
query_object_clone_dct["row_offset"] = 0
# Call the unified query method on the datasource

File diff suppressed because it is too large


@@ -23,7 +23,6 @@ from unittest.mock import Mock, patch
import numpy as np
import pandas as pd
import pytest
from flask import current_app
from pandas import DateOffset
from superset import db
@@ -43,6 +42,7 @@ from superset.utils.core import (
QueryStatus,
)
from superset.utils.pandas_postprocessing.utils import FLAT_COLUMN_SEPARATOR
from tests.conftest import with_config
from tests.integration_tests.base_tests import SupersetTestCase
from tests.integration_tests.conftest import (
only_postgresql,
@@ -68,6 +68,130 @@ def get_sql_text(payload: dict[str, Any]) -> str:
return response["query"]
def _time_comparison_offset_queries_payload() -> dict[str, Any]:
"""Birth-names chart payload with a time-comparison x-axis suitable for these tests."""
payload = get_query_context("birth_names")
payload["queries"][0]["columns"] = [
{
"timeGrain": "P1D",
"columnType": "BASE_AXIS",
"sqlExpression": "ds",
"label": "ds",
"expressionType": "SQL",
}
]
payload["queries"][0]["metrics"] = ["sum__num"]
payload["queries"][0]["groupby"] = ["name"]
payload["queries"][0]["is_timeseries"] = True
payload["queries"][0]["time_range"] = "1990 : 1991"
return payload
@pytest.mark.usefixtures("load_birth_names_dashboard_with_slices")
@patch("superset.common.query_context.QueryContext.get_query_result")
def test_time_offset_comparison_queries_use_chart_row_limit(
query_result_mock: Mock,
) -> None:
"""Comparison SQL covers the main query's window (row_limit + row_offset)."""
payload = _time_comparison_offset_queries_payload()
payload["queries"][0]["row_limit"] = 100
payload["queries"][0]["row_offset"] = 10
initial_df = pd.DataFrame(
{
"__timestamp": ["1990-01-01", "1990-01-01"],
"name": ["zban", "ahwb"],
"sum__num": [43571, 27225],
}
)
mock_query_result = Mock()
mock_query_result.df = initial_df
query_result_mock.side_effect = [mock_query_result]
query_context = ChartDataQueryContextSchema().load(payload)
query_object = query_context.queries[0]
df = query_context.get_query_result(query_object).df
payload["queries"][0]["time_offsets"] = ["1 year ago", "1 year later"]
query_context = ChartDataQueryContextSchema().load(payload)
query_object = query_context.queries[0]
def cache_key_fn(qo: QueryObject, time_offset: str, time_grain: Any) -> str | None:
return query_context._processor.query_cache_key(
qo, time_offset=time_offset, time_grain=time_grain
)
def cache_timeout_fn() -> int:
return query_context._processor.get_cache_timeout()
time_offsets_obj = query_context.datasource.processing_time_offsets(
df, query_object, cache_key_fn, cache_timeout_fn, query_context.force
)
sqls = time_offsets_obj["queries"]
assert len(sqls) == 2
assert re.search(r"1989-01-01.+1990-01-01", sqls[0], re.S)
assert re.search(r"LIMIT 110", sqls[0], re.S)
assert not re.search(r"OFFSET 10", sqls[0], re.S)
assert re.search(r"1991-01-01.+1992-01-01", sqls[1], re.S)
assert re.search(r"LIMIT 110", sqls[1], re.S)
assert not re.search(r"OFFSET 10", sqls[1], re.S)
@pytest.mark.usefixtures("load_birth_names_dashboard_with_slices")
@with_config({"ROW_LIMIT": 4242})
@patch("superset.common.query_context.QueryContext.get_query_result")
def test_time_offset_comparison_queries_use_config_row_limit_without_chart_limit(
query_result_mock: Mock,
) -> None:
"""Chart with row_offset only: subquery widens the config ROW_LIMIT by row_offset.
The schema fills `row_limit` with `app.config["ROW_LIMIT"]` when the payload
omits it, so the query_object arrives with row_limit=4242. The subquery then
covers the window via row_limit + row_offset = 4252.
"""
payload = _time_comparison_offset_queries_payload()
del payload["queries"][0]["row_limit"]
payload["queries"][0]["row_offset"] = 10
initial_df = pd.DataFrame(
{
"__timestamp": ["1990-01-01", "1990-01-01"],
"name": ["zban", "ahwb"],
"sum__num": [43571, 27225],
}
)
mock_query_result = Mock()
mock_query_result.df = initial_df
query_result_mock.side_effect = [mock_query_result]
query_context = ChartDataQueryContextSchema().load(payload)
query_object = query_context.queries[0]
df = query_context.get_query_result(query_object).df
payload["queries"][0]["time_offsets"] = ["1 year ago", "1 year later"]
query_context = ChartDataQueryContextSchema().load(payload)
query_object = query_context.queries[0]
def cache_key_fn(qo: QueryObject, time_offset: str, time_grain: Any) -> str | None:
return query_context._processor.query_cache_key(
qo, time_offset=time_offset, time_grain=time_grain
)
def cache_timeout_fn() -> int:
return query_context._processor.get_cache_timeout()
time_offsets_obj = query_context.datasource.processing_time_offsets(
df, query_object, cache_key_fn, cache_timeout_fn, query_context.force
)
sqls = time_offsets_obj["queries"]
limit_pattern = re.compile(r"LIMIT\s+4252\b")
assert len(sqls) == 2
assert limit_pattern.search(sqls[0])
assert not re.search(r"OFFSET 10", sqls[0], re.S)
assert limit_pattern.search(sqls[1])
assert not re.search(r"OFFSET 10", sqls[1], re.S)
@pytest.mark.skip(
reason=(
"TODO: Fix test class to work with DuckDB example data format. "
@@ -794,28 +918,17 @@ class TestQueryContext(SupersetTestCase):
@pytest.mark.usefixtures("load_birth_names_dashboard_with_slices")
@patch("superset.common.query_context.QueryContext.get_query_result")
def test_time_offsets_in_query_object_no_limit(self, query_result_mock):
def test_time_offsets_in_query_object_uses_chart_row_limit(self, query_result_mock):
"""
Ensure that time_offsets can generate the correct queries and
it doesn't use the row_limit or row_offset from the original
query object
Subquery honors the chart's row_limit (widened by row_offset so the
LEFT JOIN covers the main query's paginated window) and drops
row_offset. Before this fix, row_limit was replaced with
app.config["ROW_LIMIT"], which caused the main query and offset
subquery to fetch different row counts.
"""
payload = get_query_context("birth_names")
payload["queries"][0]["columns"] = [
{
"timeGrain": "P1D",
"columnType": "BASE_AXIS",
"sqlExpression": "ds",
"label": "ds",
"expressionType": "SQL",
}
]
payload["queries"][0]["metrics"] = ["sum__num"]
payload["queries"][0]["groupby"] = ["name"]
payload["queries"][0]["is_timeseries"] = True
payload = _time_comparison_offset_queries_payload()
payload["queries"][0]["row_limit"] = 100
payload["queries"][0]["row_offset"] = 10
payload["queries"][0]["time_range"] = "1990 : 1991"
initial_data = {
"__timestamp": ["1990-01-01", "1990-01-01"],
@@ -839,33 +952,86 @@ class TestQueryContext(SupersetTestCase):
query_context = ChartDataQueryContextSchema().load(payload)
query_object = query_context.queries[0]
def cache_key_fn(qo, time_offset, time_grain):
def cache_key_fn(
qo: QueryObject, time_offset: str, time_grain: Any
) -> str | None:
return query_context._processor.query_cache_key(
qo, time_offset=time_offset, time_grain=time_grain
)
def cache_timeout_fn():
def cache_timeout_fn() -> int:
return query_context._processor.get_cache_timeout()
time_offsets_obj = query_context.datasource.processing_time_offsets(
df, query_object, cache_key_fn, cache_timeout_fn, query_context.force
)
sqls = time_offsets_obj["queries"]
row_limit_value = current_app.config["ROW_LIMIT"]
row_limit_pattern_with_config_value = r"LIMIT " + re.escape(
str(row_limit_value)
)
assert len(sqls) == 2
# 1 year ago
# 1 year ago — subquery widens row_limit to cover main window (100 + 10)
assert re.search(r"1989-01-01.+1990-01-01", sqls[0], re.S)
assert not re.search(r"LIMIT 100", sqls[0], re.S)
assert re.search(r"LIMIT 110", sqls[0], re.S)
assert not re.search(r"OFFSET 10", sqls[0], re.S)
assert re.search(row_limit_pattern_with_config_value, sqls[0], re.S)
# 1 year later
assert re.search(r"1991-01-01.+1992-01-01", sqls[1], re.S)
assert not re.search(r"LIMIT 100", sqls[1], re.S)
assert re.search(r"LIMIT 110", sqls[1], re.S)
assert not re.search(r"OFFSET 10", sqls[1], re.S)
@pytest.mark.usefixtures("load_birth_names_dashboard_with_slices")
@with_config({"ROW_LIMIT": 4242})
@patch("superset.common.query_context.QueryContext.get_query_result")
def test_time_offsets_use_config_row_limit_when_chart_has_offset_only(
self, query_result_mock
):
"""
Chart with row_offset only: subquery widens the config ROW_LIMIT by row_offset.
The schema fills row_limit with app.config["ROW_LIMIT"] (4242) when the
payload omits it, so the subquery covers the window via 4242 + 10 = 4252.
"""
payload = _time_comparison_offset_queries_payload()
del payload["queries"][0]["row_limit"]
payload["queries"][0]["row_offset"] = 10
initial_data = {
"__timestamp": ["1990-01-01", "1990-01-01"],
"name": ["zban", "ahwb"],
"sum__num": [43571, 27225],
}
initial_df = pd.DataFrame(initial_data)
mock_query_result = Mock()
mock_query_result.df = initial_df
query_result_mock.side_effect = [mock_query_result]
query_context = ChartDataQueryContextSchema().load(payload)
query_object = query_context.queries[0]
query_result = query_context.get_query_result(query_object)
df = query_result.df
payload["queries"][0]["time_offsets"] = ["1 year ago", "1 year later"]
query_context = ChartDataQueryContextSchema().load(payload)
query_object = query_context.queries[0]
def cache_key_fn(
qo: QueryObject, time_offset: str, time_grain: Any
) -> str | None:
return query_context._processor.query_cache_key(
qo, time_offset=time_offset, time_grain=time_grain
)
def cache_timeout_fn() -> int:
return query_context._processor.get_cache_timeout()
time_offsets_obj = query_context.datasource.processing_time_offsets(
df, query_object, cache_key_fn, cache_timeout_fn, query_context.force
)
sqls = time_offsets_obj["queries"]
limit_pattern = re.compile(r"LIMIT\s+4252\b")
assert len(sqls) == 2
assert limit_pattern.search(sqls[0])
assert not re.search(r"OFFSET 10", sqls[0], re.S)
assert limit_pattern.search(sqls[1])
assert not re.search(r"OFFSET 10", sqls[1], re.S)
assert re.search(row_limit_pattern_with_config_value, sqls[1], re.S)
def test_get_label_map(app_context, virtual_dataset_comma_in_column_value):

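The `re.S` flag in the assertions above matters because the generated SQL spans multiple lines; a minimal illustration with made-up SQL text:

```python
import re

# Multi-line SQL like the generated comparison queries asserted on above.
sql = (
    "SELECT ds, name, SUM(num) AS sum__num\n"
    "FROM birth_names\n"
    "WHERE ds >= '1989-01-01'\n"
    "  AND ds < '1990-01-01'\n"
    "LIMIT 110"
)
pattern = r"1989-01-01.+1990-01-01"
assert not re.search(pattern, sql)    # '.' stops at newlines by default
assert re.search(pattern, sql, re.S)  # re.S lets '.' span the line break
assert re.search(r"LIMIT\s+110\b", sql)
assert not re.search(r"OFFSET 10", sql)
```

The `\b` after the limit value guards against matching a longer number such as `1100`.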

@@ -700,6 +700,233 @@ def test_processing_time_offsets_date_range_enabled(processor):
assert isinstance(result["cache_keys"], list)
def test_processing_time_offsets_uses_chart_row_limit(processor):
"""Offset subquery inherits the chart's row_limit when one is set."""
from superset.common.query_object import QueryObject
from superset.models.helpers import ExploreMixin
processor._qc_datasource.processing_time_offsets = (
ExploreMixin.processing_time_offsets.__get__(processor._qc_datasource)
)
df = pd.DataFrame({"__timestamp": ["1990-01-01"], "sum__num": [100]})
query_object = QueryObject(
datasource=MagicMock(),
granularity="ds",
columns=[],
metrics=["sum__num"],
is_timeseries=True,
row_limit=100,
row_offset=0,
time_offsets=["1 year ago"],
filters=[
{
"col": "ds",
"op": "TEMPORAL_RANGE",
"val": "1990-01-01 : 1991-01-01",
}
],
)
captured: list[dict[str, Any]] = []
def fake_query(dct: dict[str, Any]) -> MagicMock:
captured.append(dct)
result = MagicMock()
result.df = pd.DataFrame()
result.query = "SELECT 1"
return result
processor._qc_datasource.query = fake_query
processor._qc_datasource.normalize_df = MagicMock(return_value=pd.DataFrame())
with (
patch(
"superset.models.helpers.get_since_until_from_query_object",
return_value=(pd.Timestamp("1990-01-01"), pd.Timestamp("1991-01-01")),
),
patch(
"superset.common.utils.query_cache_manager.QueryCacheManager"
) as mock_cache_manager,
patch.object(
processor._qc_datasource,
"get_time_grain",
return_value=None,
),
patch.object(
processor._qc_datasource,
"join_offset_dfs",
return_value=df,
),
):
mock_cache = MagicMock()
mock_cache.is_loaded = False
mock_cache_manager.get.return_value = mock_cache
processor._qc_datasource.processing_time_offsets(
df, query_object, None, None, False
)
assert len(captured) == 1
assert captured[0]["row_limit"] == 100
assert captured[0]["row_offset"] == 0
def test_processing_time_offsets_row_offset_extends_window(processor):
"""Offset subquery limit covers the main query's window (row_limit + row_offset).
When the chart has pagination (row_offset > 0), fetching only row_limit rows
in the offset period would likely miss the dimensions present in the main
query's page, yielding null comparison values. The subquery instead drops
row_offset and widens row_limit to cover the full window.
"""
from superset.common.query_object import QueryObject
from superset.models.helpers import ExploreMixin
processor._qc_datasource.processing_time_offsets = (
ExploreMixin.processing_time_offsets.__get__(processor._qc_datasource)
)
df = pd.DataFrame({"__timestamp": ["1990-01-01"], "sum__num": [100]})
query_object = QueryObject(
datasource=MagicMock(),
granularity="ds",
columns=[],
metrics=["sum__num"],
is_timeseries=True,
row_limit=100,
row_offset=10,
time_offsets=["1 year ago"],
filters=[
{
"col": "ds",
"op": "TEMPORAL_RANGE",
"val": "1990-01-01 : 1991-01-01",
}
],
)
captured: list[dict[str, Any]] = []
def fake_query(dct: dict[str, Any]) -> MagicMock:
captured.append(dct)
result = MagicMock()
result.df = pd.DataFrame()
result.query = "SELECT 1"
return result
processor._qc_datasource.query = fake_query
processor._qc_datasource.normalize_df = MagicMock(return_value=pd.DataFrame())
with (
patch(
"superset.models.helpers.get_since_until_from_query_object",
return_value=(pd.Timestamp("1990-01-01"), pd.Timestamp("1991-01-01")),
),
patch(
"superset.common.utils.query_cache_manager.QueryCacheManager"
) as mock_cache_manager,
patch.object(
processor._qc_datasource,
"get_time_grain",
return_value=None,
),
patch.object(
processor._qc_datasource,
"join_offset_dfs",
return_value=df,
),
):
mock_cache = MagicMock()
mock_cache.is_loaded = False
mock_cache_manager.get.return_value = mock_cache
processor._qc_datasource.processing_time_offsets(
df, query_object, None, None, False
)
assert len(captured) == 1
assert captured[0]["row_limit"] == 110
assert captured[0]["row_offset"] == 0
def test_processing_time_offsets_falls_back_to_config_row_limit(processor):
"""Offset subquery uses app config ROW_LIMIT when chart has offset but no limit."""
from superset.common.query_object import QueryObject
from superset.models.helpers import ExploreMixin
processor._qc_datasource.processing_time_offsets = (
ExploreMixin.processing_time_offsets.__get__(processor._qc_datasource)
)
df = pd.DataFrame({"__timestamp": ["1990-01-01"], "sum__num": [100]})
query_object = QueryObject(
datasource=MagicMock(),
granularity="ds",
columns=[],
metrics=["sum__num"],
is_timeseries=True,
row_limit=None,
row_offset=10,
time_offsets=["1 year ago"],
filters=[
{
"col": "ds",
"op": "TEMPORAL_RANGE",
"val": "1990-01-01 : 1991-01-01",
}
],
)
captured: list[dict[str, Any]] = []
def fake_query(dct: dict[str, Any]) -> MagicMock:
captured.append(dct)
result = MagicMock()
result.df = pd.DataFrame()
result.query = "SELECT 1"
return result
processor._qc_datasource.query = fake_query
processor._qc_datasource.normalize_df = MagicMock(return_value=pd.DataFrame())
with (
patch(
"superset.models.helpers.get_since_until_from_query_object",
return_value=(pd.Timestamp("1990-01-01"), pd.Timestamp("1991-01-01")),
),
patch(
"superset.common.utils.query_cache_manager.QueryCacheManager"
) as mock_cache_manager,
patch.object(
processor._qc_datasource,
"get_time_grain",
return_value=None,
),
patch.object(
processor._qc_datasource,
"join_offset_dfs",
return_value=df,
),
patch("superset.models.helpers.app") as mock_app,
):
mock_app.config = {"ROW_LIMIT": 4242}
mock_cache = MagicMock()
mock_cache.is_loaded = False
mock_cache_manager.get.return_value = mock_cache
processor._qc_datasource.processing_time_offsets(
df, query_object, None, None, False
)
assert len(captured) == 1
assert captured[0]["row_limit"] == 4242
assert captured[0]["row_offset"] == 0
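The three tests above pin down the window rule for the offset subquery. A minimal sketch of that rule (an illustrative helper, not the production `processing_time_offsets` code): when the chart has a `row_limit`, the subquery widens it by `row_offset` and drops the offset; when there is no chart limit, it falls back to the app config `ROW_LIMIT` as-is.

```python
def offset_subquery_window(row_limit, row_offset, config_row_limit):
    """Return the (row_limit, row_offset) pair for the time-offset subquery.

    Illustrative only: mirrors the behavior asserted in the tests above.
    """
    if row_limit is None:
        # No chart limit: fall back to the app config ROW_LIMIT.
        return config_row_limit, 0
    # Widen the limit by the main query's offset so the rows on the
    # requested page still find comparison values in the offset period.
    return row_limit + row_offset, 0
```

With `row_limit=100, row_offset=10` this yields `(110, 0)`, matching the "extends window" test.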
def test_ensure_totals_available_updates_cache_values():
"""
Test that ensure_totals_available() updates the query objects AND


@@ -24,6 +24,7 @@ import pandas as pd
import pytest
from pytest_mock import MockerFixture
from requests.exceptions import HTTPError
from shillelagh.exceptions import UnauthenticatedError
from sqlalchemy.engine.url import make_url
from superset.errors import ErrorLevel, SupersetError, SupersetErrorType
@@ -789,6 +790,36 @@ def test_needs_oauth2_with_other_error(mocker: MockerFixture) -> None:
assert GSheetsEngineSpec.needs_oauth2(ex) is False
def test_needs_oauth2_with_shillelagh_unauthenticated_error(
mocker: MockerFixture,
) -> None:
"""
Test that needs_oauth2 returns True when UnauthenticatedError is raised.
"""
from superset.db_engine_specs.gsheets import GSheetsEngineSpec
g = mocker.patch("superset.db_engine_specs.gsheets.g")
g.user = mocker.MagicMock()
ex = UnauthenticatedError("Token has been revoked")
assert GSheetsEngineSpec.needs_oauth2(ex) is True
def test_needs_oauth2_with_unrelated_exception_type(
mocker: MockerFixture,
) -> None:
"""
Test that an unrelated exception type (with no matching message) returns
False.
"""
from superset.db_engine_specs.gsheets import GSheetsEngineSpec
g = mocker.patch("superset.db_engine_specs.gsheets.g")
g.user = mocker.MagicMock()
assert GSheetsEngineSpec.needs_oauth2(ValueError("unrelated")) is False
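The dispatch these two tests exercise can be sketched as a type check: a shillelagh `UnauthenticatedError` should trigger the OAuth2 flow, while unrelated exception types should not. The names below are hypothetical stand-ins, not the actual `GSheetsEngineSpec.needs_oauth2` implementation, which also inspects other error conditions.

```python
class UnauthenticatedError(Exception):
    """Stand-in for shillelagh.exceptions.UnauthenticatedError."""


def needs_oauth2_sketch(ex: Exception) -> bool:
    # Assumed core of the check: an unauthenticated shillelagh call
    # means the stored token is missing or revoked.
    return isinstance(ex, UnauthenticatedError)
```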
def test_get_oauth2_fresh_token_success(
mocker: MockerFixture,
oauth2_config: OAuth2ClientConfig,


@@ -320,66 +320,13 @@ class TestChartDataModelMetadataPrivacy:
assert data["error_type"] == DATA_MODEL_METADATA_ERROR_TYPE
class TestListChartsCreatedByMe:
"""Tests for the created_by_me flag on ListChartsRequest."""
def test_created_by_me_default_is_false(self):
request = ListChartsRequest()
assert request.created_by_me is False
def test_created_by_me_true_accepted(self):
request = ListChartsRequest(created_by_me=True)
assert request.created_by_me is True
def test_created_by_me_combined_with_filters(self):
request = ListChartsRequest(
created_by_me=True,
filters=[ChartFilter(col="slice_name", opr="sw", value="My")],
)
assert request.created_by_me is True
assert len(request.filters) == 1
def test_created_by_me_with_search_raises(self):
from pydantic import ValidationError
with pytest.raises(ValidationError, match="created_by_me"):
ListChartsRequest(created_by_me=True, search="My charts")
def test_chart_filter_rejects_created_by_fk(self):
"""created_by_fk is not a public filter column; use created_by_me instead."""
from pydantic import ValidationError
with pytest.raises(ValidationError):
ChartFilter(col="created_by_fk", opr="eq", value=1)
class TestListChartsOwnedByMe:
"""Tests for the owned_by_me flag on ListChartsRequest."""
def test_owned_by_me_default_is_false(self):
request = ListChartsRequest()
assert request.owned_by_me is False
def test_owned_by_me_true_accepted(self):
request = ListChartsRequest(owned_by_me=True)
assert request.owned_by_me is True
def test_owned_by_me_combined_with_filters(self):
request = ListChartsRequest(
owned_by_me=True,
filters=[ChartFilter(col="slice_name", opr="sw", value="My")],
)
assert request.owned_by_me is True
assert len(request.filters) == 1
def test_owned_by_me_with_search_raises(self):
from pydantic import ValidationError
with pytest.raises(ValidationError, match="owned_by_me"):
ListChartsRequest(owned_by_me=True, search="My charts")
def test_owned_by_me_and_created_by_me_allowed(self):
"""Both flags together are valid (OR logic — creator or owner)."""
request = ListChartsRequest(owned_by_me=True, created_by_me=True)
assert request.owned_by_me is True
assert request.created_by_me is True
@patch("superset.daos.chart.ChartDAO.list")
@pytest.mark.asyncio
async def test_list_charts_no_arguments(mock_list, mcp_server):
"""Regression test: list_charts must accept zero arguments without raising
pydantic_core.ValidationError: Missing required argument: request."""
mock_list.return_value = ([], 0)
async with Client(mcp_server) as client:
result = await client.call_tool("list_charts", {})
data = json.loads(result.content[0].text)
assert "charts" in data
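The validation rules the test classes above encode can be summarized in a small sketch (assumed names, not the actual pydantic `ListChartsRequest` model): `created_by_me` and `owned_by_me` default to False, may be combined with each other and with column filters, but are mutually exclusive with free-text `search`.

```python
def validate_list_request(created_by_me=False, owned_by_me=False, search=None):
    """Illustrative stand-in for the ListChartsRequest validator."""
    if search is not None and (created_by_me or owned_by_me):
        # Mirrors the ValidationError the tests expect for flag + search.
        raise ValueError("created_by_me/owned_by_me cannot be combined with search")
    return {
        "created_by_me": created_by_me,
        "owned_by_me": owned_by_me,
        "search": search,
    }
```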


@@ -30,7 +30,6 @@ from flask import g
from superset.mcp_service.app import mcp
from superset.mcp_service.dashboard.schemas import (
DashboardFilter,
ListDashboardsRequest,
)
from superset.mcp_service.dashboard.tool.get_dashboard_info import (
@@ -1355,66 +1354,13 @@ class TestDashboardSortableColumns:
assert col in list_dashboards.__doc__
class TestListDashboardsCreatedByMe:
"""Tests for the created_by_me flag on ListDashboardsRequest."""
def test_created_by_me_default_is_false(self):
request = ListDashboardsRequest()
assert request.created_by_me is False
def test_created_by_me_true_accepted(self):
request = ListDashboardsRequest(created_by_me=True)
assert request.created_by_me is True
def test_created_by_me_combined_with_filters(self):
request = ListDashboardsRequest(
created_by_me=True,
filters=[DashboardFilter(col="published", opr="eq", value=True)],
)
assert request.created_by_me is True
assert len(request.filters) == 1
def test_created_by_me_with_search_raises(self):
from pydantic import ValidationError
with pytest.raises(ValidationError, match="created_by_me"):
ListDashboardsRequest(created_by_me=True, search="My dashboards")
def test_dashboard_filter_rejects_created_by_fk(self):
"""created_by_fk is not a public filter column; use created_by_me instead."""
from pydantic import ValidationError
with pytest.raises(ValidationError):
DashboardFilter(col="created_by_fk", opr="eq", value=1)
class TestListDashboardsOwnedByMe:
"""Tests for the owned_by_me flag on ListDashboardsRequest."""
def test_owned_by_me_default_is_false(self):
request = ListDashboardsRequest()
assert request.owned_by_me is False
def test_owned_by_me_true_accepted(self):
request = ListDashboardsRequest(owned_by_me=True)
assert request.owned_by_me is True
def test_owned_by_me_combined_with_filters(self):
request = ListDashboardsRequest(
owned_by_me=True,
filters=[DashboardFilter(col="published", opr="eq", value=True)],
)
assert request.owned_by_me is True
assert len(request.filters) == 1
def test_owned_by_me_with_search_raises(self):
from pydantic import ValidationError
with pytest.raises(ValidationError, match="owned_by_me"):
ListDashboardsRequest(owned_by_me=True, search="My dashboards")
def test_owned_by_me_and_created_by_me_allowed(self):
"""Both flags together are valid (OR logic — creator or owner)."""
request = ListDashboardsRequest(owned_by_me=True, created_by_me=True)
assert request.owned_by_me is True
assert request.created_by_me is True
@patch("superset.daos.dashboard.DashboardDAO.list")
@pytest.mark.asyncio
async def test_list_dashboards_no_arguments(mock_list, mcp_server):
"""Regression test: list_dashboards must accept zero arguments without raising
pydantic_core.ValidationError: Missing required argument: request."""
mock_list.return_value = ([], 0)
async with Client(mcp_server) as client:
result = await client.call_tool("list_dashboards", {})
data = json.loads(result.content[0].text)
assert "dashboards" in data


@@ -0,0 +1,831 @@
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership. The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.
"""Tests for the query_dataset MCP tool."""
from __future__ import annotations
import importlib
from collections.abc import Generator
from typing import Any
from unittest.mock import MagicMock, Mock, patch
import pytest
from fastmcp import Client, FastMCP
from superset.mcp_service.app import mcp
from superset.utils import json
query_dataset_module = importlib.import_module(
"superset.mcp_service.dataset.tool.query_dataset"
)
@pytest.fixture
def mcp_server() -> FastMCP:
return mcp
@pytest.fixture(autouse=True)
def mock_auth() -> Generator[MagicMock, None, None]:
"""Mock authentication and metadata access for all tests."""
with (
patch("superset.mcp_service.auth.get_user_from_request") as mock_get_user,
patch.object(
query_dataset_module,
"user_can_view_data_model_metadata",
return_value=True,
),
):
mock_user = Mock()
mock_user.id = 1
mock_user.username = "admin"
mock_get_user.return_value = mock_user
yield mock_get_user
def _make_column(name: str, is_dttm: bool = False) -> MagicMock:
"""Build a mock SqlaTable column with the given name and datetime flag."""
col = MagicMock()
col.column_name = name
col.is_dttm = is_dttm
col.verbose_name = None
col.type = "VARCHAR"
col.groupby = True
col.filterable = True
col.description = None
return col
def _make_metric(name: str, expression: str = "COUNT(*)") -> MagicMock:
"""Build a mock SqlMetric with the given name and SQL expression."""
metric = MagicMock()
metric.metric_name = name
metric.verbose_name = None
metric.expression = expression
metric.description = None
metric.d3format = None
return metric
def _make_dataset(
dataset_id: int = 1,
table_name: str = "orders",
columns: list[Any] | None = None,
metrics: list[Any] | None = None,
main_dttm_col: str | None = None,
) -> MagicMock:
"""Build a mock SqlaTable dataset with default columns and metrics."""
ds = MagicMock()
ds.id = dataset_id
ds.table_name = table_name
ds.uuid = f"test-uuid-{dataset_id}"
ds.main_dttm_col = main_dttm_col
ds.database = MagicMock()
ds.database.database_name = "examples"
ds.columns = columns or [
_make_column("category"),
_make_column("region"),
_make_column("order_date", is_dttm=True),
]
ds.metrics = metrics or [
_make_metric("count", "COUNT(*)"),
_make_metric("total_revenue", "SUM(revenue)"),
]
return ds
def _mock_command_result(
data: list[dict[str, Any]] | None = None,
colnames: list[str] | None = None,
) -> dict[str, Any]:
"""Build the result dict that ChartDataCommand.run() returns."""
data = data or [
{"category": "Electronics", "count": 42},
{"category": "Clothing", "count": 17},
]
colnames = colnames or ["category", "count"]
return {
"queries": [
{
"data": data,
"colnames": colnames,
"rowcount": len(data),
"cache_key": "abc123",
"is_cached": False,
"cached_dttm": None,
"cache_timeout": 300,
}
]
}
@pytest.mark.asyncio
async def test_query_dataset_success(mcp_server: FastMCP) -> None:
"""Happy path: metrics + columns returns data."""
dataset = _make_dataset()
result_data = _mock_command_result()
with (
patch.object(
query_dataset_module,
"_resolve_dataset",
return_value=dataset,
),
patch(
"superset.commands.chart.data.get_data_command.ChartDataCommand.validate",
),
patch(
"superset.commands.chart.data.get_data_command.ChartDataCommand.run",
return_value=result_data,
),
patch(
"superset.common.query_context_factory.QueryContextFactory.create",
return_value=MagicMock(),
),
):
async with Client(mcp_server) as client:
result = await client.call_tool(
"query_dataset",
{
"request": {
"dataset_id": 1,
"metrics": ["count"],
"columns": ["category"],
}
},
)
data = json.loads(result.content[0].text)
assert data["dataset_id"] == 1
assert data["dataset_name"] == "orders"
assert data["row_count"] == 2
assert len(data["data"]) == 2
assert data["data"][0]["category"] == "Electronics"
@pytest.mark.asyncio
async def test_query_dataset_not_found(mcp_server: FastMCP) -> None:
"""Dataset ID that doesn't exist returns error."""
with patch.object(
query_dataset_module,
"_resolve_dataset",
return_value=None,
):
async with Client(mcp_server) as client:
result = await client.call_tool(
"query_dataset",
{
"request": {
"dataset_id": 999,
"metrics": ["count"],
}
},
)
data = json.loads(result.content[0].text)
assert data["error_type"] == "NotFound"
assert "999" in data["error"]
@pytest.mark.asyncio
async def test_query_dataset_invalid_metric(mcp_server: FastMCP) -> None:
"""Unknown metric name returns validation error with suggestions."""
dataset = _make_dataset()
with patch.object(
query_dataset_module,
"_resolve_dataset",
return_value=dataset,
):
async with Client(mcp_server) as client:
result = await client.call_tool(
"query_dataset",
{
"request": {
"dataset_id": 1,
"metrics": ["countt"], # typo
}
},
)
data = json.loads(result.content[0].text)
assert data["error_type"] == "ValidationError"
assert "countt" in data["error"]
# Should suggest "count" as a close match
assert "count" in data["error"]
@pytest.mark.asyncio
async def test_query_dataset_invalid_column(mcp_server: FastMCP) -> None:
"""Unknown column name returns validation error."""
dataset = _make_dataset()
with patch.object(
query_dataset_module,
"_resolve_dataset",
return_value=dataset,
):
async with Client(mcp_server) as client:
result = await client.call_tool(
"query_dataset",
{
"request": {
"dataset_id": 1,
"columns": ["nonexistent_col"],
"metrics": ["count"],
}
},
)
data = json.loads(result.content[0].text)
assert data["error_type"] == "ValidationError"
assert "nonexistent_col" in data["error"]
@pytest.mark.asyncio
async def test_query_dataset_no_metrics_no_columns(mcp_server: FastMCP) -> None:
"""Providing neither metrics nor columns raises validation error."""
from fastmcp.exceptions import ToolError
async with Client(mcp_server) as client:
with pytest.raises(ToolError, match="metrics.*columns"):
await client.call_tool(
"query_dataset",
{
"request": {
"dataset_id": 1,
"metrics": [],
"columns": [],
}
},
)
@pytest.mark.asyncio
async def test_query_dataset_with_time_range(mcp_server: FastMCP) -> None:
"""time_range is converted to TEMPORAL_RANGE filter + granularity."""
dataset = _make_dataset(main_dttm_col="order_date")
result_data = _mock_command_result()
captured_queries: list[dict[str, Any]] = []
def capture_create(**kwargs):
captured_queries.extend(kwargs.get("queries", []))
return MagicMock()
with (
patch.object(
query_dataset_module,
"_resolve_dataset",
return_value=dataset,
),
patch(
"superset.commands.chart.data.get_data_command.ChartDataCommand.validate",
),
patch(
"superset.commands.chart.data.get_data_command.ChartDataCommand.run",
return_value=result_data,
),
patch(
"superset.common.query_context_factory.QueryContextFactory.create",
side_effect=capture_create,
),
):
async with Client(mcp_server) as client:
result = await client.call_tool(
"query_dataset",
{
"request": {
"dataset_id": 1,
"metrics": ["count"],
"time_range": "Last 7 days",
}
},
)
assert len(captured_queries) == 1
query_dict = captured_queries[0]
# Should have TEMPORAL_RANGE filter
temporal_filters = [f for f in query_dict["filters"] if f["op"] == "TEMPORAL_RANGE"]
assert len(temporal_filters) == 1
assert temporal_filters[0]["col"] == "order_date"
assert temporal_filters[0]["val"] == "Last 7 days"
# Should set granularity
assert query_dict["granularity"] == "order_date"
# applied_filters in response must include the synthesized TEMPORAL_RANGE filter
data = json.loads(result.content[0].text)
resp_filters = data["applied_filters"]
temporal_resp = [f for f in resp_filters if f["op"] == "TEMPORAL_RANGE"]
assert len(temporal_resp) == 1
assert temporal_resp[0]["col"] == "order_date"
assert temporal_resp[0]["val"] == "Last 7 days"
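The synthesis this test checks can be sketched as follows (a hypothetical helper, not the tool's actual code): a `time_range` plus a resolved temporal column become one `TEMPORAL_RANGE` adhoc filter and the query's `granularity`.

```python
def build_temporal_query_parts(time_column, time_range):
    """Return (filter dict, granularity) for a time_range request."""
    temporal_filter = {
        "col": time_column,
        "op": "TEMPORAL_RANGE",
        "val": time_range,
    }
    # The same column doubles as the query granularity.
    return temporal_filter, time_column
```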
@pytest.mark.asyncio
async def test_query_dataset_time_range_no_temporal_column(mcp_server: FastMCP) -> None:
"""time_range without a temporal column returns error."""
dataset = _make_dataset(main_dttm_col=None)
with patch.object(
query_dataset_module,
"_resolve_dataset",
return_value=dataset,
):
async with Client(mcp_server) as client:
result = await client.call_tool(
"query_dataset",
{
"request": {
"dataset_id": 1,
"metrics": ["count"],
"time_range": "Last 7 days",
}
},
)
data = json.loads(result.content[0].text)
assert data["error_type"] == "ValidationError"
assert "temporal column" in data["error"].lower()
@pytest.mark.asyncio
async def test_query_dataset_with_filters(mcp_server: FastMCP) -> None:
"""User-provided filters are passed through to the query."""
dataset = _make_dataset()
result_data = _mock_command_result()
captured_queries: list[dict[str, Any]] = []
def capture_create(**kwargs):
captured_queries.extend(kwargs.get("queries", []))
return MagicMock()
with (
patch.object(
query_dataset_module,
"_resolve_dataset",
return_value=dataset,
),
patch(
"superset.commands.chart.data.get_data_command.ChartDataCommand.validate",
),
patch(
"superset.commands.chart.data.get_data_command.ChartDataCommand.run",
return_value=result_data,
),
patch(
"superset.common.query_context_factory.QueryContextFactory.create",
side_effect=capture_create,
),
):
async with Client(mcp_server) as client:
await client.call_tool(
"query_dataset",
{
"request": {
"dataset_id": 1,
"metrics": ["count"],
"filters": [
{"col": "category", "op": "==", "val": "Electronics"}
],
}
},
)
assert len(captured_queries) == 1
filters = captured_queries[0]["filters"]
assert len(filters) == 1
assert filters[0]["col"] == "category"
assert filters[0]["op"] == "=="
assert filters[0]["val"] == "Electronics"
@pytest.mark.asyncio
async def test_query_dataset_empty_results(mcp_server: FastMCP) -> None:
"""Query that returns no data gives a response with row_count=0."""
dataset = _make_dataset()
empty_result = {
"queries": [
{
"data": [],
"colnames": [],
"rowcount": 0,
"is_cached": False,
"cached_dttm": None,
"cache_timeout": 300,
}
]
}
with (
patch.object(
query_dataset_module,
"_resolve_dataset",
return_value=dataset,
),
patch(
"superset.commands.chart.data.get_data_command.ChartDataCommand.validate",
),
patch(
"superset.commands.chart.data.get_data_command.ChartDataCommand.run",
return_value=empty_result,
),
patch(
"superset.common.query_context_factory.QueryContextFactory.create",
return_value=MagicMock(),
),
):
async with Client(mcp_server) as client:
result = await client.call_tool(
"query_dataset",
{
"request": {
"dataset_id": 1,
"metrics": ["count"],
}
},
)
data = json.loads(result.content[0].text)
assert data["row_count"] == 0
assert data["data"] == []
assert "no data" in data["summary"].lower()
@pytest.mark.asyncio
async def test_query_dataset_by_uuid(mcp_server: FastMCP) -> None:
"""UUID-based lookup works."""
dataset = _make_dataset()
result_data = _mock_command_result()
with (
patch.object(
query_dataset_module,
"_resolve_dataset",
return_value=dataset,
) as mock_resolve,
patch(
"superset.commands.chart.data.get_data_command.ChartDataCommand.validate",
),
patch(
"superset.commands.chart.data.get_data_command.ChartDataCommand.run",
return_value=result_data,
),
patch(
"superset.common.query_context_factory.QueryContextFactory.create",
return_value=MagicMock(),
),
):
async with Client(mcp_server) as client:
result = await client.call_tool(
"query_dataset",
{
"request": {
"dataset_id": "a1b2c3d4-5678-90ab-cdef-1234567890ab",
"metrics": ["count"],
}
},
)
# Verify the resolve function was called with the UUID
mock_resolve.assert_called_once()
call_args = mock_resolve.call_args
assert call_args[0][0] == "a1b2c3d4-5678-90ab-cdef-1234567890ab"
data = json.loads(result.content[0].text)
assert data["dataset_id"] == 1
@pytest.mark.asyncio
async def test_query_dataset_permission_denied(mcp_server: FastMCP) -> None:
"""Permission denied from ChartDataCommand.validate() returns error."""
from superset.errors import ErrorLevel, SupersetError, SupersetErrorType
from superset.exceptions import SupersetSecurityException
dataset = _make_dataset()
with (
patch.object(
query_dataset_module,
"_resolve_dataset",
return_value=dataset,
),
patch(
"superset.common.query_context_factory.QueryContextFactory.create",
return_value=MagicMock(),
),
patch(
"superset.commands.chart.data.get_data_command.ChartDataCommand.validate",
side_effect=SupersetSecurityException(
SupersetError(
message="Access denied",
error_type=SupersetErrorType.DATASOURCE_SECURITY_ACCESS_ERROR,
level=ErrorLevel.WARNING,
)
),
),
):
async with Client(mcp_server) as client:
result = await client.call_tool(
"query_dataset",
{
"request": {
"dataset_id": 1,
"metrics": ["count"],
}
},
)
data = json.loads(result.content[0].text)
assert data["error_type"] == "QueryError"
@pytest.mark.asyncio
async def test_query_dataset_order_by_valid(mcp_server: FastMCP) -> None:
"""order_by with valid column/metric names passes through."""
dataset = _make_dataset()
result_data = _mock_command_result()
captured_queries: list[dict[str, Any]] = []
def capture_create(**kwargs):
captured_queries.extend(kwargs.get("queries", []))
return MagicMock()
with (
patch.object(
query_dataset_module,
"_resolve_dataset",
return_value=dataset,
),
patch(
"superset.commands.chart.data.get_data_command.ChartDataCommand.validate",
),
patch(
"superset.commands.chart.data.get_data_command.ChartDataCommand.run",
return_value=result_data,
),
patch(
"superset.common.query_context_factory.QueryContextFactory.create",
side_effect=capture_create,
),
):
async with Client(mcp_server) as client:
await client.call_tool(
"query_dataset",
{
"request": {
"dataset_id": 1,
"metrics": ["count"],
"columns": ["category"],
"order_by": ["count"],
"order_desc": True,
}
},
)
assert len(captured_queries) == 1
orderby = captured_queries[0].get("orderby", [])
assert len(orderby) == 1
assert orderby[0][0] == "count"
# order_desc=True -> ascending=False
assert orderby[0][1] is False
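The convention asserted above, Superset query objects carry `orderby` as `(expression, ascending)` pairs, means `order_desc=True` maps to `ascending=False`. A minimal sketch of that mapping:

```python
def to_orderby(order_by, order_desc):
    """Map a list of names plus an order_desc flag to (name, ascending) pairs."""
    return [(name, not order_desc) for name in order_by]
```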
@pytest.mark.asyncio
async def test_query_dataset_order_by_invalid(mcp_server: FastMCP) -> None:
"""order_by with an unknown name returns validation error."""
dataset = _make_dataset()
with patch.object(
query_dataset_module,
"_resolve_dataset",
return_value=dataset,
):
async with Client(mcp_server) as client:
result = await client.call_tool(
"query_dataset",
{
"request": {
"dataset_id": 1,
"metrics": ["count"],
"order_by": ["nonexistent"],
}
},
)
data = json.loads(result.content[0].text)
assert data["error_type"] == "ValidationError"
assert "nonexistent" in data["error"]
@pytest.mark.asyncio
async def test_query_dataset_time_column_override(mcp_server: FastMCP) -> None:
"""Explicit time_column overrides dataset main_dttm_col."""
dataset = _make_dataset(main_dttm_col="order_date")
result_data = _mock_command_result()
captured_queries: list[dict[str, Any]] = []
def capture_create(**kwargs):
captured_queries.extend(kwargs.get("queries", []))
return MagicMock()
with (
patch.object(
query_dataset_module,
"_resolve_dataset",
return_value=dataset,
),
patch(
"superset.commands.chart.data.get_data_command.ChartDataCommand.validate",
),
patch(
"superset.commands.chart.data.get_data_command.ChartDataCommand.run",
return_value=result_data,
),
patch(
"superset.common.query_context_factory.QueryContextFactory.create",
side_effect=capture_create,
),
):
async with Client(mcp_server) as client:
await client.call_tool(
"query_dataset",
{
"request": {
"dataset_id": 1,
"metrics": ["count"],
"time_range": "Last 30 days",
"time_column": "order_date",
}
},
)
assert len(captured_queries) == 1
query_dict = captured_queries[0]
assert query_dict["granularity"] == "order_date"
temporal_filters = [f for f in query_dict["filters"] if f["op"] == "TEMPORAL_RANGE"]
assert temporal_filters[0]["col"] == "order_date"
@pytest.mark.asyncio
async def test_query_dataset_non_dttm_time_column_warns(mcp_server: FastMCP) -> None:
"""Using a non-datetime column for time_range produces a warning."""
dataset = _make_dataset(main_dttm_col=None)
result_data = _mock_command_result()
with (
patch.object(
query_dataset_module,
"_resolve_dataset",
return_value=dataset,
),
patch(
"superset.commands.chart.data.get_data_command.ChartDataCommand.validate",
),
patch(
"superset.commands.chart.data.get_data_command.ChartDataCommand.run",
return_value=result_data,
),
patch(
"superset.common.query_context_factory.QueryContextFactory.create",
return_value=MagicMock(),
),
):
async with Client(mcp_server) as client:
result = await client.call_tool(
"query_dataset",
{
"request": {
"dataset_id": 1,
"metrics": ["count"],
"time_range": "Last 7 days",
"time_column": "category",
}
},
)
data = json.loads(result.content[0].text)
assert len(data["warnings"]) > 0
assert "not marked as a datetime" in data["warnings"][0]
@pytest.mark.asyncio
async def test_query_dataset_invalid_filter_column(mcp_server: FastMCP) -> None:
"""Filter on a column that doesn't exist returns validation error."""
dataset = _make_dataset()
with patch.object(
query_dataset_module,
"_resolve_dataset",
return_value=dataset,
):
async with Client(mcp_server) as client:
result = await client.call_tool(
"query_dataset",
{
"request": {
"dataset_id": 1,
"metrics": ["count"],
"filters": [
{
"col": "nonexistent",
"op": "==",
"val": "test",
}
],
}
},
)
data = json.loads(result.content[0].text)
assert data["error_type"] == "ValidationError"
assert "nonexistent" in data["error"]
@pytest.mark.asyncio
async def test_query_dataset_metadata_access_denied_no_suggestions(
mcp_server: FastMCP,
) -> None:
"""Users without data-model metadata access cannot probe column/metric names.
The privacy gate must fire before the validation step that returns close-match
suggestions, so restricted users cannot enumerate schema details via typos.
"""
dataset = _make_dataset()
with (
patch.object(
query_dataset_module,
"_resolve_dataset",
return_value=dataset,
),
patch.object(
query_dataset_module,
"user_can_view_data_model_metadata",
return_value=False,
),
):
async with Client(mcp_server) as client:
result = await client.call_tool(
"query_dataset",
{
"request": {
"dataset_id": 1,
# Typo that would normally trigger close-match suggestions
"metrics": ["countt"],
}
},
)
data = json.loads(result.content[0].text)
# Must be denied before returning any schema suggestions
assert data["error_type"] == "DataModelMetadataRestricted"
# Must NOT contain column/metric name suggestions
assert "countt" not in data.get("error", "")
assert "count" not in data.get("error", "")
@pytest.mark.asyncio
async def test_query_dataset_metadata_access_denied_nonexistent_dataset(
mcp_server: FastMCP,
) -> None:
"""Metadata-restricted users must not be able to probe dataset existence.
The privacy gate fires before the DAO lookup, so a restricted caller
always receives DataModelMetadataRestricted — never NotFound — regardless
of whether the requested dataset ID exists.
"""
with patch.object(
query_dataset_module,
"user_can_view_data_model_metadata",
return_value=False,
):
async with Client(mcp_server) as client:
result = await client.call_tool(
"query_dataset",
{
"request": {
# Use a dataset_id that does not exist
"dataset_id": 999999,
"metrics": ["count"],
}
},
)
data = json.loads(result.content[0].text)
# Must receive restricted error, not a NotFound that leaks existence
assert data["error_type"] == "DataModelMetadataRestricted"
assert data["error_type"] != "NotFound"
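Both privacy tests rely on the same gate ordering: the metadata permission check runs before the dataset lookup and before any close-match suggestions, so a restricted user can probe neither schema names nor dataset existence. A sketch of that ordering (illustrative names, not the tool's implementation):

```python
def query_dataset_gate(user_can_view_metadata, lookup_dataset, dataset_id):
    """Illustrate the check order: permission gate first, lookup second."""
    if not user_can_view_metadata:
        # Fires before the DAO lookup, so existence never leaks.
        return {"error_type": "DataModelMetadataRestricted"}
    dataset = lookup_dataset(dataset_id)
    if dataset is None:
        return {"error_type": "NotFound"}
    return {"dataset": dataset}
```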


@@ -372,6 +372,43 @@ def test_mcp_auth_hook_preserves_g_user_in_request_context(app) -> None:
assert result == "middleware_user"
def test_mcp_auth_hook_removes_stale_db_session_in_sync_wrapper(app) -> None:
"""sync_wrapper calls db.session.remove() BEFORE get_user_from_request().
Thread pool workers reuse threads across requests; db.session is
thread-local and may be bound to a different tenant's DB engine from a
prior request. Removing it before user lookup ensures a fresh session is
created for the current request.
The ordering is critical: if remove() were called after user lookup,
the stale session binding would already have caused a mismatch error.
"""
fresh_user = _make_mock_user("fresh")
def dummy_tool():
"""Dummy tool."""
return g.user.username
wrapped = mcp_auth_hook(dummy_tool)
with app.test_request_context():
g.user = fresh_user
with patch("superset.extensions.db") as mock_db:
def _assert_remove_already_called() -> MagicMock:
"""Verify remove() was called before user resolution runs."""
mock_db.session.remove.assert_called_once_with()
return fresh_user
with patch(
"superset.mcp_service.auth.get_user_from_request",
side_effect=_assert_remove_already_called,
):
result = wrapped()
assert result == "fresh"
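The ordering this test pins down can be sketched in isolation (a minimal stand-in for `sync_wrapper`, not the real auth hook): discard the thread-local DB session first, then resolve the user, so the lookup runs on a fresh session instead of one bound to a prior request's engine.

```python
def sync_wrapper_sketch(remove_session, get_user, tool):
    """Illustrate the critical call order: remove() before user lookup."""
    remove_session()        # 1. drop any stale thread-local session
    user = get_user()       # 2. user resolution gets a fresh session
    return tool(user)       # 3. run the wrapped tool as that user
```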
# -- default_user_resolver --