Mirror of https://github.com/apache/superset.git
Synced 2026-04-30 21:44:40 +00:00

Compare commits: 9 commits (`docs/db-ca…` subdirectory)
| Author | SHA1 | Date |
|---|---|---|
| | `86fe4fc8b2` | |
| | `5fe3a1c2cd` | |
| | `90f8fafbb4` | |
| | `7774ec7e3c` | |
| | `6da04fa51d` | |
| | `7c4b2b137c` | |
| | `9ccd37de1c` | |
| | `b791f4c2cd` | |
| | `44d1f50b7c` | |
**.github/dependabot.yml** (vendored): 4 changes
```diff
@@ -37,6 +37,10 @@ updates:
       # `just-handlerbars-helpers` library in plugin-chart-handlebars requires `currencyformatter` to be < 2
       - dependency-name: "currencyformatter.js"
         update-types: ["version-update:semver-major"]
+      # TODO: remove below clause once https://github.com/pmmmwh/react-refresh-webpack-plugin/pull/940 lands onto a future release
+      # and confirm the issue https://github.com/apache/superset/issues/39600 is fixed
+      - dependency-name: "react-checkbox-tree"
+        update-types: ["version-update:semver-major"]
     groups:
       storybook:
         applies-to: version-updates
```
```diff
@@ -29,7 +29,7 @@ ARG BUILD_TRANSLATIONS="false"
 ######################################################################
 # superset-node-ci used as a base for building frontend assets and CI
 ######################################################################
-FROM --platform=${BUILDPLATFORM} node:20-trixie-slim AS superset-node-ci
+FROM --platform=${BUILDPLATFORM} node:22-trixie-slim AS superset-node-ci
 ARG BUILD_TRANSLATIONS
 ENV BUILD_TRANSLATIONS=${BUILD_TRANSLATIONS}
 ARG DEV_MODE="false" # Skip frontend build in dev mode
```
```diff
@@ -239,26 +239,143 @@ based on the roles and permissions that were attributed.
 ### Row Level Security

 Using Row Level Security filters (under the **Security** menu) you can create filters
-that are assigned to a particular table, as well as a set of roles.
+that are assigned to a particular dataset, as well as a set of roles.
 If you want members of the Finance team to only have access to
 rows where `department = "finance"`, you could:

 - Create a Row Level Security filter with that clause (`department = "finance"`)
-- Then assign the clause to the **Finance** role and the table it applies to
+- Then assign the clause to the **Finance** role and the dataset it applies to

 The **clause** field, which can contain arbitrary text, is then added to the generated
-SQL statement’s WHERE clause. So you could even do something like create a filter
+SQL statement's WHERE clause. So you could even do something like create a filter
 for the last 30 days and apply it to a specific role, with a clause
 like `date_field > DATE_SUB(NOW(), INTERVAL 30 DAY)`. It can also support
 multiple conditions: `client_id = 6` AND `advertiser="foo"`, etc.

 All relevant Row level security filters will be combined together (under the hood,
 the different SQL clauses are combined using AND statements). This means it's
 possible to create a situation where two roles conflict in such a way as to limit a table subset to empty.
+RLS clauses also support **Jinja templating** when `ENABLE_TEMPLATE_PROCESSING` is enabled, so you can write dynamic filters such as
+`user_id = '{{ current_username() }}'` to restrict rows based on the logged-in user.
```
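To make the rendering step concrete, here is a minimal sketch of what template processing produces for such a clause. The `render_rls_clause` helper is illustrative only, not Superset's implementation; Superset renders the clause with its own Jinja template processor.

```python
# Illustrative stand-in for Superset's Jinja processing of an RLS clause.
# render_rls_clause is a hypothetical helper, not a Superset API.

def render_rls_clause(clause: str, username: str) -> str:
    """Substitute the {{ current_username() }} macro with a concrete username."""
    return clause.replace("{{ current_username() }}", username)

clause = "user_id = '{{ current_username() }}'"
print(render_rls_clause(clause, "alice"))  # user_id = 'alice'
```

The rendered predicate is what ends up in the generated WHERE clause, so each user's queries are filtered to their own rows.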
````diff
-
-For example, the filters `client_id=4` and `client_id=5`, applied to a role,
-will result in users of that role having `client_id=4` AND `client_id=5`
-added to their query, which can never be true.
+
+#### Filter Types
+
+There are two types of RLS filters:
+
+- **Regular** — The filter clause is applied when the querying user belongs to one of the
+  roles assigned to the filter. Use this to restrict what specific roles can see.
+- **Base** — The filter clause is applied to **all** users _except_ those in the assigned
+  roles. Use this to define a default restriction that privileged roles (e.g. Admin) are
+  exempt from. For example, a Base filter with clause `1 = 0` and the Admin role would
+  hide all rows from everyone except Admin — useful as a deny-by-default baseline.
+
+#### Group Keys and Filter Combination
+
+All applicable RLS filters are combined before being added to the query. The combination
+rules are:
+
+- Filters that share the **same group key** are combined with **OR** (any match within
+  the group is sufficient).
+- Different filter groups (different group keys, or no group key) are combined with
+  **AND** (all groups must match).
+- Filters with **no group key** are each treated as their own group and are always AND'd.
+
+For example, if a dataset has three filters:
+
+| Filter | Clause | Group Key |
+|--------|--------|-----------|
+| F1 | `department = 'Finance'` | `department` |
+| F2 | `department = 'Marketing'` | `department` |
+| F3 | `region = 'Europe'` | `region` |
+
+The resulting WHERE clause would be:
+
+```sql
+(department = 'Finance' OR department = 'Marketing') AND (region = 'Europe')
+```
````
````diff
+
+:::caution Conflicting filters
+It is possible to create filters that conflict and produce an empty result set. For
+example, the filters `client_id = 4` and `client_id = 5` **without a shared group key**
+will be AND'd together, producing `client_id = 4 AND client_id = 5`, which can never
+be true.
+
+If you intend for these to be alternatives, assign them the **same group key** so they
+are OR'd instead.
+:::
+
+#### RLS and Virtual (SQL-Based) Datasets
+
+RLS filters are assigned to **datasets**, not to underlying database tables directly. This
+has important implications when working with virtual (SQL-based) datasets:
+
+- **Physical datasets** (backed directly by a table or view) — RLS filters assigned to
+  the dataset are added as WHERE clauses to the query.
+- **Virtual datasets** (defined by a custom SQL query) — RLS filters assigned directly to
+  the virtual dataset are applied to the _outer_ query that wraps the dataset's SQL.
+  Additionally, RLS filters on the **underlying physical datasets** referenced by the
+  virtual dataset's SQL are injected into the inner subquery for each referenced table.
+
+For example, if you have:
+
+1. A physical dataset `orders` with RLS filter `region = 'US'`
+2. A virtual dataset defined as `SELECT * FROM orders WHERE status = 'active'`
+
+A user affected by the RLS filter will effectively see:
+
+```sql
+SELECT * FROM (
+  SELECT * FROM orders WHERE (region = 'US') AND status = 'active'
+) ...
+```
````
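A rough sketch of the injection described above, assuming a trivially simple inner query. Real implementations rewrite SQL with a parser rather than string surgery, and `wrap_virtual_dataset` is a hypothetical helper, not Superset's query rewriter.

```python
# Illustrative only: inject an RLS predicate into a simple
# "SELECT ... WHERE ..." inner query, then wrap it in an outer query,
# mirroring the orders example above.

def wrap_virtual_dataset(virtual_sql: str, rls_clause: str) -> str:
    """Prepend rls_clause to the WHERE condition of a one-table inner query."""
    select_part, where_part = virtual_sql.split(" WHERE ", 1)
    inner = f"{select_part} WHERE ({rls_clause}) AND {where_part}"
    return f"SELECT * FROM (\n  {inner}\n) AS virtual_ds"

print(wrap_virtual_dataset(
    "SELECT * FROM orders WHERE status = 'active'",
    "region = 'US'",
))
```

The inner query ends up as `SELECT * FROM orders WHERE (region = 'US') AND status = 'active'`, matching the effective query shown above.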
````diff
+
+**Key considerations for virtual datasets:**
+
+- You generally do **not** need to duplicate RLS filters on both the physical and virtual
+  dataset — filters on the physical dataset are applied automatically at query time.
+- If you assign an RLS filter directly to a virtual dataset, the clause must reference
+  columns available in the virtual dataset's _output_, not necessarily the underlying
+  table's columns.
+- In **SQL Lab**, RLS is enforced only when the `RLS_IN_SQLLAB` feature flag is enabled:
+  queries run against tables that have associated datasets with RLS filters will then have
+  the appropriate predicates injected automatically.
+
+#### Checking RLS Filters via the API
+
+You can use the RLS REST API to audit which filters are configured and which datasets
+they affect. This requires the `can_read` permission on the `Row Level Security` resource.
+
+**List all RLS rules:**
+
+```
+GET /api/v1/rowlevelsecurity/
+```
+
+**Filter RLS rules for a specific dataset** (using [Rison](https://github.com/Nanonid/rison) query syntax):
+
+```
+GET /api/v1/rowlevelsecurity/?q=(filters:!((col:tables,opr:rel_m_m,value:<dataset_id>)))
+```
+
+**Filter RLS rules by role:**
+
+```
+GET /api/v1/rowlevelsecurity/?q=(filters:!((col:roles,opr:rel_m_m,value:<role_id>)))
+```
+
+**View details of a specific rule** (including clause, assigned datasets, and roles):
+
+```
+GET /api/v1/rowlevelsecurity/<id>
+```
+
+The response includes the filter's `name`, `filter_type` (Regular or Base), `clause`,
+`group_key`, assigned `tables` (with id, schema, and table\_name), and assigned `roles`
+(with id and name).
````
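When scripting audits, the Rison filter string can be built programmatically. The helper below is hypothetical and simply mirrors the query syntax shown in the examples above; the column names (`tables`, `roles`) and operator (`rel_m_m`) come from those examples.

```python
# Sketch: build the Rison filter string for the RLS list endpoint.
# rison_rel_filter is a hypothetical helper, not part of any Superset SDK.

def rison_rel_filter(col: str, value: int) -> str:
    """Return a many-to-many relation filter in Rison syntax."""
    return f"(filters:!((col:{col},opr:rel_m_m,value:{value})))"

# Query string for all RLS rules attached to dataset 42:
q = rison_rel_filter("tables", 42)
print(f"/api/v1/rowlevelsecurity/?q={q}")
# /api/v1/rowlevelsecurity/?q=(filters:!((col:tables,opr:rel_m_m,value:42)))
```

The same helper with `col="roles"` produces the role-based query shown above.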
```diff
+
+:::tip Auditing RLS for virtual datasets
+To find all RLS rules that could affect a particular virtual dataset, query the list
+endpoint filtered by that dataset's ID for any directly-assigned rules. Then also check
+the physical datasets referenced in the virtual dataset's SQL, since their RLS filters
+are applied at query time too.
+:::

 ### User Sessions
```
```diff
@@ -69,7 +69,7 @@
     "@superset-ui/core": "^0.20.4",
     "@swc/core": "^1.15.30",
     "antd": "^6.3.7",
-    "baseline-browser-mapping": "^2.10.21",
+    "baseline-browser-mapping": "^2.10.23",
     "caniuse-lite": "^1.0.30001791",
     "docusaurus-plugin-openapi-docs": "^5.0.1",
     "docusaurus-theme-openapi-docs": "^5.0.1",
```

```diff
@@ -5794,10 +5794,10 @@ base64-js@^1.3.1, base64-js@^1.5.1:
   resolved "https://registry.npmjs.org/base64-js/-/base64-js-1.5.1.tgz"
   integrity sha512-AKpaYlHn8t4SVbOHCy+b5+KKgvR4vrsD8vbvrbiQJps7fKDTkjkDry6ji0rUJjC0kzbNePLwzxq8iypo41qeWA==

-baseline-browser-mapping@^2.10.21, baseline-browser-mapping@^2.9.0, baseline-browser-mapping@^2.9.19:
-  version "2.10.21"
-  resolved "https://registry.yarnpkg.com/baseline-browser-mapping/-/baseline-browser-mapping-2.10.21.tgz#136f9f181ee0d7ca6e3edbf42d9559763d2c1141"
-  integrity sha512-Q+rUQ7Uz8AHM7DEaNdwvfFCTq7a43lNTzuS94eiWqwyxfV/wJv+oUivef51T91mmRY4d4A1u9rcSvkeufCVXlA==
+baseline-browser-mapping@^2.10.23, baseline-browser-mapping@^2.9.0, baseline-browser-mapping@^2.9.19:
+  version "2.10.23"
+  resolved "https://registry.yarnpkg.com/baseline-browser-mapping/-/baseline-browser-mapping-2.10.23.tgz#3a1a55d1a691a8c8d74688af7f1fd17eac23c184"
+  integrity sha512-xwVXGqevyKPsiuQdLj+dZMVjidjJV508TBqexND5HrF89cGdCYCJFB3qhcxRHSeMctdCfbR1jrxBajhDy7o29g==

 batch@0.6.1:
   version "0.6.1"
```
**superset-frontend/package-lock.json** (generated): 44 changes
```diff
@@ -115,7 +115,7 @@
         "re-resizable": "^6.11.2",
         "react": "^17.0.2",
         "react-arborist": "^3.5.0",
-        "react-checkbox-tree": "^2.0.1",
+        "react-checkbox-tree": "^1.8.0",
         "react-diff-viewer-continued": "^4.2.2",
         "react-dnd": "^11.1.3",
         "react-dnd-html5-backend": "^11.1.3",
```

```diff
@@ -25521,15 +25521,6 @@
       "dev": true,
       "license": "Apache-2.0"
     },
-    "node_modules/fast-equals": {
-      "version": "6.0.0",
-      "resolved": "https://registry.npmjs.org/fast-equals/-/fast-equals-6.0.0.tgz",
-      "integrity": "sha512-PFhhIGgdM79r5Uztdj9Zb6Tt1zKafqVfdMGwVca1z5z6fbX7DmsySSuJd8HiP6I1j505DCS83cLxo5rmSNeVEA==",
-      "license": "MIT",
-      "engines": {
-        "node": ">=6.0.0"
-      }
-    },
     "node_modules/fast-fifo": {
       "version": "1.3.2",
       "resolved": "https://registry.npmjs.org/fast-fifo/-/fast-fifo-1.3.2.tgz",
```

```diff
@@ -35611,6 +35602,7 @@
       "version": "4.1.2",
       "resolved": "https://registry.npmjs.org/lodash.memoize/-/lodash.memoize-4.1.2.tgz",
       "integrity": "sha512-t7j+NzmgnQzTAYXcsHYLgimltOV1MXHtlOWf6GjL9Kj8GK5FInw5JotxvbOs+IvV1/Dzo04/fCGfLVs7aXb4Ag==",
+      "dev": true,
       "license": "MIT"
     },
     "node_modules/lodash.merge": {
```

```diff
@@ -41848,18 +41840,36 @@
       }
     },
     "node_modules/react-checkbox-tree": {
-      "version": "2.0.1",
-      "resolved": "https://registry.npmjs.org/react-checkbox-tree/-/react-checkbox-tree-2.0.1.tgz",
-      "integrity": "sha512-ZUh9strXP3a+RpXEGPSq5qWC0HSo3pjjGQEwNWYdmo1OfSNq0L61boy4ANIN2O+ybo/n80hadQYNAeMgwdQqRQ==",
+      "version": "1.8.0",
+      "resolved": "https://registry.npmjs.org/react-checkbox-tree/-/react-checkbox-tree-1.8.0.tgz",
+      "integrity": "sha512-ufC4aorihOvjLpvY1beab2hjVLGZbDTFRzw62foG0+th+KX7e/sdmWu/nD1ZS/U5Yr0rWGwedGH5GOtR0IkUXw==",
       "license": "MIT",
       "dependencies": {
         "classnames": "^2.2.5",
-        "fast-equals": "^6.0.0",
-        "lodash.memoize": "^4.1.2",
+        "lodash": "^4.17.10",
+        "nanoid": "^3.0.0",
         "prop-types": "^15.5.8"
       },
       "peerDependencies": {
-        "react": "^16.8.0 || ^17.0.0 || ^18.0.0 || ^19.0.0"
+        "react": "^15.3.0 || ^16.0.0 || ^17.0.0 || ^18.0.0"
       }
     },
+    "node_modules/react-checkbox-tree/node_modules/nanoid": {
+      "version": "3.3.11",
+      "resolved": "https://registry.npmjs.org/nanoid/-/nanoid-3.3.11.tgz",
+      "integrity": "sha512-N8SpfPUnUp1bK+PMYW8qSWdl9U+wwNWI4QKxOYDy9JAro3WMX7p2OeVRF9v+347pnakNevPmiHhNmZ2HbFA76w==",
+      "funding": [
+        {
+          "type": "github",
+          "url": "https://github.com/sponsors/ai"
+        }
+      ],
+      "license": "MIT",
+      "bin": {
+        "nanoid": "bin/nanoid.cjs"
+      },
+      "engines": {
+        "node": "^10 || ^12 || ^13.7 || ^14 || >=15.0.1"
+      }
+    },
     "node_modules/react-diff-viewer-continued": {
```

```diff
@@ -53369,7 +53379,7 @@
       "license": "Apache-2.0",
       "dependencies": {
         "@types/d3-scale": "^4.0.9",
-        "d3-cloud": "^1.2.9",
+        "d3-cloud": "^1.2.8",
         "d3-scale": "^4.0.2"
       },
       "devDependencies": {
```

```diff
@@ -196,7 +196,7 @@
     "re-resizable": "^6.11.2",
     "react": "^17.0.2",
     "react-arborist": "^3.5.0",
-    "react-checkbox-tree": "^2.0.1",
+    "react-checkbox-tree": "^1.8.0",
     "react-diff-viewer-continued": "^4.2.2",
     "react-dnd": "^11.1.3",
     "react-dnd-html5-backend": "^11.1.3",
```

```diff
@@ -60,7 +60,7 @@ describe('useJsonValidation', () => {
     expect(result.current[0]).toMatchObject({
       type: 'error',
       row: 0,
-      column: 0,
+      column: 1,
       text: expect.stringContaining('Invalid JSON'),
     });
   });
```
```
@@ -55,132 +55,138 @@ export const StyledModal = styled(BaseModal)<StyledModalProps>`
height,
draggable,
hideFooter,
}) => css`
${responsive &&
css`
max-width: ${maxWidth ?? '900px'};
padding-left: ${theme.sizeUnit * 3}px;
padding-right: ${theme.sizeUnit * 3}px;
padding-bottom: 0;
top: 0;
`}
}) => {
const closeButtonWidth = theme.sizeUnit * 14;
.ant-modal-content {
background-color: ${theme.colorBgContainer};
display: flex;
flex-direction: column;
max-height: calc(100vh - ${theme.sizeUnit * 8}px);
margin-bottom: ${theme.sizeUnit * 4}px;
margin-top: ${theme.sizeUnit * 4}px;
padding: 0;
}
return css`
${responsive &&
css`
max-width: ${maxWidth ?? '900px'};
padding-left: ${theme.sizeUnit * 3}px;
padding-right: ${theme.sizeUnit * 3}px;
padding-bottom: 0;
top: 0;
`}
.ant-modal-header {
flex: 0 0 auto;
border-radius: ${theme.borderRadius}px ${theme.borderRadius}px 0 0;
padding: ${theme.sizeUnit * 4}px ${theme.sizeUnit * 4}px;
.ant-modal-title {
font-weight: ${theme.fontWeightStrong};
}
.ant-modal-title h4 {
.ant-modal-content {
background-color: ${theme.colorBgContainer};
display: flex;
margin: 0;
align-items: center;
flex-direction: column;
max-height: calc(100vh - ${theme.sizeUnit * 8}px);
margin-bottom: ${theme.sizeUnit * 4}px;
margin-top: ${theme.sizeUnit * 4}px;
padding: 0;
}
}
.ant-modal-close {
width: ${theme.sizeUnit * 14}px;
height: ${theme.sizeUnit * 14}px;
padding: ${theme.sizeUnit * 6}px ${theme.sizeUnit * 4}px
${theme.sizeUnit * 4}px;
top: 0;
right: 0;
display: flex;
justify-content: center;
}
.ant-modal-header {
flex: 0 0 auto;
border-radius: ${theme.borderRadius}px ${theme.borderRadius}px 0 0;
padding: ${theme.sizeUnit * 4}px ${closeButtonWidth}px
${theme.sizeUnit * 4}px ${theme.sizeUnit * 4}px;
.ant-modal-close:hover {
background: transparent;
}
.ant-modal-title {
font-weight: ${theme.fontWeightStrong};
}
.ant-modal-close-x {
display: flex;
align-items: center;
[data-test='close-modal-btn'] {
.ant-modal-title h4 {
display: flex;
margin: 0;
align-items: center;
}
}
.ant-modal-close {
width: ${closeButtonWidth}px;
height: ${theme.sizeUnit * 14}px;
padding: ${theme.sizeUnit * 6}px ${theme.sizeUnit * 4}px
${theme.sizeUnit * 4}px;
top: 0;
right: 0;
display: flex;
justify-content: center;
}
.close {
flex: 1 1 auto;
margin-bottom: ${theme.sizeUnit}px;
color: ${theme.colorPrimaryText};
font-weight: ${theme.fontWeightLight};
}
}
.ant-modal-body {
flex: 0 1 auto;
padding: ${theme.sizeUnit * 4}px ${theme.sizeUnit * 6}px;
overflow: auto;
${!resizable && height && `height: ${height};`}
}
.ant-modal-footer {
flex: 0 0 1;
border-top: ${theme.sizeUnit / 4}px solid ${theme.colorSplit};
padding: ${theme.sizeUnit * 4}px;
margin-top: 0;
.btn {
font-size: 12px;
.ant-modal-close:hover {
background: transparent;
}
.btn + .btn {
margin-left: ${theme.sizeUnit * 2}px;
.ant-modal-close-x {
display: flex;
align-items: center;
[data-test='close-modal-btn'] {
justify-content: center;
}
.close {
flex: 1 1 auto;
margin-bottom: ${theme.sizeUnit}px;
color: ${theme.colorPrimaryText};
font-weight: ${theme.fontWeightLight};
}
}
}
&.no-content-padding .ant-modal-body {
padding: 0;
}
.ant-modal-body {
flex: 0 1 auto;
padding: ${theme.sizeUnit * 4}px ${theme.sizeUnit * 6}px;
${draggable &&
css`
.ant-modal-header {
overflow: auto;
${!resizable && height && `height: ${height};`}
}
.ant-modal-footer {
flex: 0 0 1;
border-top: ${theme.sizeUnit / 4}px solid ${theme.colorSplit};
padding: ${theme.sizeUnit * 4}px;
margin-top: 0;
.btn {
font-size: 12px;
}
.btn + .btn {
margin-left: ${theme.sizeUnit * 2}px;
}
}
&.no-content-padding .ant-modal-body {
padding: 0;
.draggable-trigger {
cursor: move;
padding: ${theme.sizeUnit * 4}px;
width: 100%;
}
}
`}
${resizable &&
css`
.resizable {
pointer-events: all;
${draggable &&
css`
.ant-modal-header {
padding: 0;
.resizable-wrapper {
height: 100%;
}
.ant-modal-content {
height: 100%;
.ant-modal-body {
height: ${hideFooter
? `calc(100% - ${MODAL_HEADER_HEIGHT}px)`
: `calc(100% - ${MODAL_HEADER_HEIGHT}px - ${MODAL_FOOTER_HEIGHT}px)`};
.draggable-trigger {
cursor: move;
padding: ${theme.sizeUnit * 4}px ${closeButtonWidth}px
${theme.sizeUnit * 4}px ${theme.sizeUnit * 4}px;
width: 100%;
}
}
}
`}
`}
`}
${resizable &&
css`
.resizable {
pointer-events: all;
.resizable-wrapper {
height: 100%;
}
.ant-modal-content {
height: 100%;
.ant-modal-body {
height: ${hideFooter
? `calc(100% - ${MODAL_HEADER_HEIGHT}px)`
: `calc(100% - ${MODAL_HEADER_HEIGHT}px - ${MODAL_FOOTER_HEIGHT}px)`};
}
}
}
`}
`;
}}
`;
const defaultResizableConfig = (hideFooter: boolean | undefined) => ({
```
```diff
@@ -20,6 +20,10 @@ import { t } from '@apache-superset/core/translation';
 import { Icons, Modal, Typography, Button } from '@superset-ui/core/components';
 import type { FC, ReactElement } from 'react';

+// Ant Design's default modal zIndex is 1000. Using a higher value ensures
+// this dialog always renders above other open modals (e.g. a draggable View SQL modal).
+const UNSAVED_CHANGES_MODAL_Z_INDEX = 1100;
+
 export type UnsavedChangesModalProps = {
   showModal: boolean;
   onHide: () => void;
@@ -27,6 +31,7 @@ export type UnsavedChangesModalProps = {
   onConfirmNavigation: () => void;
   title?: string;
   body?: string;
+  zIndex?: number;
 };

 export const UnsavedChangesModal: FC<UnsavedChangesModalProps> = ({
@@ -36,6 +41,7 @@ export const UnsavedChangesModal: FC<UnsavedChangesModalProps> = ({
   onConfirmNavigation,
   title = 'Unsaved Changes',
   body = "If you don't save, changes will be lost.",
+  zIndex = UNSAVED_CHANGES_MODAL_Z_INDEX,
 }: UnsavedChangesModalProps): ReactElement => (
   <Modal
     centered
@@ -43,6 +49,7 @@ export const UnsavedChangesModal: FC<UnsavedChangesModalProps> = ({
     onHide={onHide}
     show={showModal}
     width="444px"
+    zIndex={zIndex}
     title={
       <>
         <Icons.WarningOutlined iconSize="m" style={{ marginRight: 8 }} />
```
```diff
@@ -30,7 +30,7 @@
   },
   "dependencies": {
     "@types/d3-scale": "^4.0.9",
-    "d3-cloud": "^1.2.9",
+    "d3-cloud": "^1.2.8",
     "d3-scale": "^4.0.2"
   },
   "peerDependencies": {
```
```diff
@@ -50,7 +50,6 @@ import Table, {
 import { RootState } from 'src/dashboard/types';
 import { usePermissions } from 'src/hooks/usePermissions';
 import { useToasts } from 'src/components/MessageToasts/withToasts';
-import { ensureAppRoot } from 'src/utils/pathUtils';
 import { safeStringify } from 'src/utils/safeStringify';
 import HeaderWithRadioGroup from '@superset-ui/core/components/Table/header-renderers/HeaderWithRadioGroup';
 import { useDatasetMetadataBar } from 'src/features/datasets/metadataBar/useDatasetMetadataBar';
@@ -248,7 +247,7 @@ export default function DrillDetailPane({
     if (dashboardId) {
       payload.form_data = { dashboardId };
     }
-    SupersetClient.postForm(ensureAppRoot('/api/v1/chart/data'), {
+    SupersetClient.postForm('/api/v1/chart/data', {
       form_data: safeStringify(payload),
     }).catch(error => {
       addDangerToast(
```
```diff
@@ -48,7 +48,6 @@ import { Logger, LOG_ACTIONS_LOAD_CHART } from 'src/logger/LogUtils';
 import { allowCrossDomain as domainShardingEnabled } from 'src/utils/hostNamesConfig';
 import { updateDataMask } from 'src/dataMask/actions';
 import { waitForAsyncData } from 'src/middleware/asyncEvent';
-import { ensureAppRoot } from 'src/utils/pathUtils';
 import { safeStringify } from 'src/utils/safeStringify';
 import { extendedDayjs } from '@superset-ui/core/utils/dates';
 import type { Dispatch, Action, AnyAction } from 'redux';
@@ -934,7 +933,7 @@ export function redirectSQLLab(
       requestedQuery: payload,
     });
   } else {
-    SupersetClient.postForm(ensureAppRoot(redirectUrl), {
+    SupersetClient.postForm(redirectUrl, {
       form_data: safeStringify(payload),
     });
   }
```
```diff
@@ -171,7 +171,7 @@ function getCheckboxIcon(element: HTMLElement): Element {
  * checkbox state change is the fill color of the SVG icon.
  */
 function getCheckboxState(name: string): CheckboxState {
-  const element = screen.getByRole('button', { name });
+  const element = screen.getByRole('link', { name });
   const svgPath = getCheckboxIcon(element).children[1].children[0].children[0];
   const fill = svgPath.getAttribute('fill');
   return fill === supersetTheme.colorPrimary
@@ -183,7 +183,7 @@ function getCheckboxState(name: string): CheckboxState {

 // Replace the original clickCheckbox function with the async version
 async function clickCheckbox(name: string) {
-  const element = screen.getByRole('button', { name });
+  const element = screen.getByRole('link', { name });
   const checkboxLabel = getCheckboxIcon(element);
   await userEvent.click(checkboxLabel);
 }
@@ -204,11 +204,11 @@ test('renders with empty filters', () => {

 test('renders with filters values', () => {
   render(<FilterScopeSelector {...createProps()} />, { useRedux: true });
-  expect(screen.getByRole('button', { name: FILTER_A })).toBeInTheDocument();
-  expect(screen.getByRole('button', { name: FILTER_B })).toBeInTheDocument();
-  expect(screen.getByRole('button', { name: FILTER_C })).toBeInTheDocument();
-  expect(screen.getByRole('button', { name: TAB_A })).toBeInTheDocument();
-  expect(screen.getByRole('button', { name: TAB_B })).toBeInTheDocument();
+  expect(screen.getByRole('link', { name: FILTER_A })).toBeInTheDocument();
+  expect(screen.getByRole('link', { name: FILTER_B })).toBeInTheDocument();
+  expect(screen.getByRole('link', { name: FILTER_C })).toBeInTheDocument();
+  expect(screen.getByRole('link', { name: TAB_A })).toBeInTheDocument();
+  expect(screen.getByRole('link', { name: TAB_B })).toBeInTheDocument();
   expect(screen.queryByText(CHART_A)).not.toBeInTheDocument();
   expect(screen.queryByText(CHART_B)).not.toBeInTheDocument();
   expect(screen.queryByText(CHART_C)).not.toBeInTheDocument();
@@ -222,21 +222,21 @@ test('collapses/expands all filters', () => {
     useRedux: true,
   });
   userEvent.click(screen.getAllByRole('button', { name: COLLAPSE_ALL })[0]);
-  expect(screen.getByRole('button', { name: ALL_FILTERS })).toBeInTheDocument();
+  expect(screen.getByRole('link', { name: ALL_FILTERS })).toBeInTheDocument();
   expect(
-    screen.queryByRole('button', { name: FILTER_A }),
+    screen.queryByRole('link', { name: FILTER_A }),
   ).not.toBeInTheDocument();
   expect(
-    screen.queryByRole('button', { name: FILTER_B }),
+    screen.queryByRole('link', { name: FILTER_B }),
   ).not.toBeInTheDocument();
   expect(
-    screen.queryByRole('button', { name: FILTER_C }),
+    screen.queryByRole('link', { name: FILTER_C }),
   ).not.toBeInTheDocument();
   userEvent.click(screen.getAllByRole('button', { name: EXPAND_ALL })[0]);
-  expect(screen.getByRole('button', { name: ALL_FILTERS })).toBeInTheDocument();
-  expect(screen.getByRole('button', { name: FILTER_A })).toBeInTheDocument();
-  expect(screen.getByRole('button', { name: FILTER_B })).toBeInTheDocument();
-  expect(screen.getByRole('button', { name: FILTER_C })).toBeInTheDocument();
+  expect(screen.getByRole('link', { name: ALL_FILTERS })).toBeInTheDocument();
+  expect(screen.getByRole('link', { name: FILTER_A })).toBeInTheDocument();
+  expect(screen.getByRole('link', { name: FILTER_B })).toBeInTheDocument();
+  expect(screen.getByRole('link', { name: FILTER_C })).toBeInTheDocument();
 });

 test('collapses/expands all charts', () => {
@@ -251,10 +251,10 @@ test('collapses/expands all charts', () => {
   expect(screen.queryByText(CHART_D)).not.toBeInTheDocument();
   userEvent.click(screen.getAllByRole('button', { name: EXPAND_ALL })[1]);
   expect(screen.getByText(ALL_CHARTS)).toBeInTheDocument();
-  expect(screen.getByRole('button', { name: CHART_A })).toBeInTheDocument();
-  expect(screen.getByRole('button', { name: CHART_B })).toBeInTheDocument();
-  expect(screen.getByRole('button', { name: CHART_C })).toBeInTheDocument();
-  expect(screen.getByRole('button', { name: CHART_D })).toBeInTheDocument();
+  expect(screen.getByRole('link', { name: CHART_A })).toBeInTheDocument();
+  expect(screen.getByRole('link', { name: CHART_B })).toBeInTheDocument();
+  expect(screen.getByRole('link', { name: CHART_C })).toBeInTheDocument();
+  expect(screen.getByRole('link', { name: CHART_D })).toBeInTheDocument();
 });

 test('searches for a chart', () => {
@@ -262,13 +262,9 @@ test('searches for a chart', () => {
     useRedux: true,
   });
   userEvent.type(screen.getByPlaceholderText('Search...'), CHART_C);
-  expect(
-    screen.queryByRole('button', { name: CHART_A }),
-  ).not.toBeInTheDocument();
-  expect(
-    screen.queryByRole('button', { name: CHART_B }),
-  ).not.toBeInTheDocument();
-  expect(screen.getByRole('button', { name: CHART_C })).toBeInTheDocument();
+  expect(screen.queryByRole('link', { name: CHART_A })).not.toBeInTheDocument();
+  expect(screen.queryByRole('link', { name: CHART_B })).not.toBeInTheDocument();
+  expect(screen.getByRole('link', { name: CHART_C })).toBeInTheDocument();
 });

 // Update all tests that use clickCheckbox to be async and await the function call
```
```diff
@@ -58,7 +58,9 @@ beforeEach(() => {
   });
 });

-// Tests for exportChart URL prefix handling in streaming export
+// Tests for exportChart URL prefix handling in streaming export.
+// Streaming uses native fetch (not SupersetClient), so exportChart must apply
+// ensureAppRoot before passing the URL to onStartStreamingExport.
 test('exportChart v1 API passes prefixed URL to onStartStreamingExport when app root is configured', async () => {
   const appRoot = '/superset';
   ensureAppRoot.mockImplementation((path: string) => `${appRoot}${path}`);
@@ -111,6 +113,24 @@ test('exportChart v1 API passes nested prefix for deeply nested deployments', as
   expect(callArgs.exportType).toBe('xlsx');
 });

+// Regression test for the double-prefix bug: SupersetClient.postForm adds appRoot
+// internally via getUrl(), so the URL passed must NOT already be prefixed.
+test('exportChart v1 API calls postForm with unprefixed URL when app root is configured', async () => {
+  const { SupersetClient } = jest.requireMock('@superset-ui/core');
+  const appRoot = '/analytics';
+  ensureAppRoot.mockImplementation((path: string) => `${appRoot}${path}`);
+
+  await exportChart({
+    formData: baseFormData,
+    resultFormat: 'csv',
+  });
+
+  expect(SupersetClient.postForm).toHaveBeenCalledTimes(1);
+  const [url] = SupersetClient.postForm.mock.calls[0];
+  expect(url).toBe('/api/v1/chart/data');
+  expect(url).not.toContain(appRoot);
+});
+
 test('exportChart passes csv exportType for CSV exports', async () => {
   const onStartStreamingExport = jest.fn();
@@ -143,7 +163,7 @@ test('exportChart passes xlsx exportType for Excel exports', async () => {
   );
 });

-test('exportChart legacy API (useLegacyApi=true) passes prefixed URL with app root configured', async () => {
+test('exportChart legacy API (useLegacyApi=true) passes prefixed URL to onStartStreamingExport when app root is configured', async () => {
   const appRoot = '/superset';
   ensureAppRoot.mockImplementation((path: string) => `${appRoot}${path}`);
@@ -165,6 +185,8 @@ test('exportChart legacy API (useLegacyApi=true) passes prefixed URL with app ro

   expect(onStartStreamingExport).toHaveBeenCalledTimes(1);
   const callArgs = onStartStreamingExport.mock.calls[0][0];
+  // The legacy blueprint path is /superset/explore_json/; with appRoot=/superset the
+  // full streaming URL is /superset/superset/explore_json/ (appRoot + blueprint prefix).
   expect(callArgs.url).toBe('/superset/superset/explore_json/?csv=true');
   expect(callArgs.exportType).toBe('csv');
 });
```
@@ -76,6 +76,7 @@ interface GetExploreUrlParams {
  allowDomainSharding?: boolean;
  method?: 'GET' | 'POST';
  relative?: boolean;
  includeAppRoot?: boolean;
}

interface BuildV1ChartDataPayloadParams {

@@ -223,6 +224,7 @@ export function getExploreUrl({
  allowDomainSharding = false,
  method = 'POST',
  relative = false,
  includeAppRoot = true,
}: GetExploreUrlParams): string | null {
  if (!formData.datasource) {
    return null;

@@ -242,7 +244,7 @@
    uri = URI(URI(curUrl).search());
  }

  const directory = getURIDirectory(endpointType);
  const directory = getURIDirectory(endpointType, includeAppRoot);

  // Building the querystring (search) part of the URI
  const search = uri.search(true) as Record<string, string>;

@@ -370,10 +372,11 @@ export const exportChart = async ({
      force,
      allowDomainSharding: false,
      relative: true,
      includeAppRoot: false,
    });
    payload = formData;
  } else {
    url = ensureAppRoot('/api/v1/chart/data');
    url = '/api/v1/chart/data';
    payload = await buildV1ChartDataPayload({
      formData,
      force,

@@ -385,14 +388,16 @@

  // Check if streaming export handler is provided (from dashboard Chart.jsx)
  if (onStartStreamingExport) {
    // Streaming is handled by the caller - pass URL, payload, and export type
    // Streaming uses native fetch — apply appRoot prefix here since useStreamingExport
    // does not go through SupersetClient (which would add it automatically).
    onStartStreamingExport({
      url,
      url: url ? ensureAppRoot(url) : url,
      payload,
      exportType: resultFormat,
    });
  } else {
    // Fallback to original behavior for non-streaming exports
    // SupersetClient.postForm calls getUrl({ endpoint }) internally, which prepends
    // appRoot — so the URL must NOT be pre-prefixed here.
    SupersetClient.postForm(url as string, {
      form_data: safeStringify(payload),
    });
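The prefixing rule enforced by the hunks above can be shown with a minimal sketch. The helpers here (`ensure_app_root`, `post_form`) are simplified stand-ins for Superset's `ensureAppRoot` and `SupersetClient.postForm`, not the real implementations; the point is that exactly one layer may add the app root.

```python
APP_ROOT = "/superset"  # hypothetical deployment prefix


def ensure_app_root(path: str) -> str:
    # Simplified: plain concatenation, mirroring the test mock above
    return APP_ROOT + path


def post_form(url: str) -> str:
    # Stand-in for SupersetClient.postForm, which prefixes internally via getUrl()
    return ensure_app_root(url)


# Correct client path: pass the bare URL; postForm prefixes exactly once.
assert post_form("/api/v1/chart/data") == "/superset/api/v1/chart/data"

# The double-prefix bug the regression test guards against: pre-prefixing
# before handing the URL to postForm stacks the prefix twice.
buggy = post_form(ensure_app_root("/api/v1/chart/data"))
assert buggy == "/superset/superset/api/v1/chart/data"  # wrong URL

# Streaming path uses native fetch with no implicit prefixing, so the
# caller must apply the prefix itself, exactly once.
streaming_url = ensure_app_root("/api/v1/chart/data")
assert streaming_url == "/superset/api/v1/chart/data"
```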
@@ -127,6 +127,27 @@ def _resolve_metrics_and_groupby(
    return _resolve_metrics(form_data, viz_type), _resolve_groupby(form_data)


def _resolve_engine(
    datasource_id: Any,
    datasource_type: str,
) -> str:
    """Return the DB engine name for *datasource_id*, or ``"base"`` on any error."""
    if not isinstance(datasource_id, (int, str)):
        return "base"
    try:
        from superset.daos.datasource import DatasourceDAO
        from superset.utils.core import DatasourceType

        ds = DatasourceDAO.get_datasource(
            datasource_type=DatasourceType(datasource_type),
            database_id_or_uuid=datasource_id,
        )
        return ds.database.db_engine_spec.engine
    except Exception:  # noqa: BLE001
        logger.debug("Could not resolve engine for datasource %s", datasource_id)
        return "base"


def _build_query_context_from_form_data(
    form_data: dict[str, Any],
    chart: "Slice | None" = None,

@@ -159,14 +180,35 @@ def _build_query_context_from_form_data(

    metrics, groupby = _resolve_metrics_and_groupby(form_data, chart)

    # Build a minimal query object; let QueryContextFactory handle temporal
    # fields (time_range, granularity_sqla), adhoc_filters, WHERE/HAVING
    # clauses, etc. from form_data — same approach as get_chart_data.
    # Preprocess adhoc_filters into where/having/filters on form_data so
    # that the QueryObject receives concrete filter clauses. This mirrors
    # the view-layer call in viz.py:process_query_filters.
    from superset.utils.core import (
        merge_extra_filters,
        split_adhoc_filters_into_base_filters,
    )

    resolved_type_str: str = (
        datasource_type if isinstance(datasource_type, str) else "table"
    )
    engine = _resolve_engine(datasource_id, resolved_type_str)
    merge_extra_filters(form_data)
    split_adhoc_filters_into_base_filters(form_data, engine)

    # Build query dict with temporal and filter fields.
    # QueryObjectFactory.create() accepts time_range as a top-level kwarg
    # and converts it to from_dttm/to_dttm for the QueryObject.
    query_dict: dict[str, Any] = {
        "columns": groupby,
        "metrics": metrics,
    }

    if time_range := form_data.get("time_range"):
        query_dict["time_range"] = time_range

    if filters := form_data.get("filters"):
        query_dict["filters"] = filters

    if (row_limit := form_data.get("row_limit")) is not None:
        query_dict["row_limit"] = row_limit

@@ -179,12 +221,9 @@ def _build_query_context_from_form_data(
        "'datasource_id' or 'datasource'."
    )
    resolved_id: int | str = datasource_id
    resolved_type: str = (
        datasource_type if isinstance(datasource_type, str) else "table"
    )

    return factory.create(
        datasource={"id": resolved_id, "type": resolved_type},
        datasource={"id": resolved_id, "type": resolved_type_str},
        queries=[query_dict],
        form_data=form_data,
        result_type=ChartDataResultType.QUERY,
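The preprocessing step above relies on Superset's `split_adhoc_filters_into_base_filters`. An illustrative sketch of what that conversion does conceptually (this is a simplified, assumed version written for this note, not Superset's implementation; the adhoc filter keys shown are the usual form-data shape):

```python
def split_adhoc_filters(form_data: dict) -> None:
    """Turn adhoc_filters into structured filters / where / having clauses."""
    filters, where, having = [], [], []
    for flt in form_data.get("adhoc_filters", []):
        if flt.get("expressionType") == "SIMPLE":
            # Structured column/operator/value filter
            filters.append(
                {"col": flt["subject"], "op": flt["operator"], "val": flt["comparator"]}
            )
        elif flt.get("clause") == "HAVING":
            having.append(flt["sqlExpression"])
        else:  # free-form WHERE clause
            where.append(flt["sqlExpression"])
    form_data["filters"] = filters
    form_data["where"] = " AND ".join(where)
    form_data["having"] = " AND ".join(having)


form_data = {
    "adhoc_filters": [
        {"expressionType": "SIMPLE", "subject": "status",
         "operator": "==", "comparator": "active"},
        {"expressionType": "SQL", "clause": "WHERE",
         "sqlExpression": "region = 'US'"},
    ]
}
split_adhoc_filters(form_data)
assert form_data["filters"] == [{"col": "status", "op": "==", "val": "active"}]
assert form_data["where"] == "region = 'US'"
```

This mirrors the behavior the updated tests below assert: the simple `status == active` adhoc filter ends up in `filters`, while free-form SQL lands in `where`/`having`.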
@@ -270,6 +309,54 @@ def _sql_from_saved_query_context(
    return None


def _resolve_datasource_name(
    form_data: dict[str, Any],
    chart: "Slice | None",
) -> str | None:
    """Resolve datasource name from form_data or chart.

    For unsaved charts (chart=None), looks up the datasource by ID
    from form_data so that the response includes a meaningful name.
    """
    if chart:
        return getattr(chart, "datasource_name", None)

    # Unsaved chart — resolve from form_data
    datasource_id = form_data.get("datasource_id")
    datasource_type = form_data.get("datasource_type", "table")

    if not datasource_id and (combined := form_data.get("datasource")):
        if isinstance(combined, str) and "__" in combined:
            parts = combined.split("__", 1)
            datasource_id = int(parts[0]) if parts[0].isdigit() else parts[0]
            datasource_type = parts[1] if len(parts) > 1 else "table"

    if not datasource_id:
        return None

    try:
        from superset.daos.datasource import DatasourceDAO
        from superset.daos.exceptions import (
            DatasourceNotFound,
            DatasourceTypeNotSupportedError,
            DatasourceValueIsIncorrect,
        )
        from superset.utils.core import DatasourceType

        datasource = DatasourceDAO.get_datasource(
            datasource_type=DatasourceType(datasource_type),
            database_id_or_uuid=datasource_id,
        )
        return getattr(datasource, "name", None)
    except (
        ValueError,
        DatasourceNotFound,
        DatasourceTypeNotSupportedError,
        DatasourceValueIsIncorrect,
    ):
        return None


def _sql_from_form_data(
    form_data: dict[str, Any],
    chart: "Slice | None",

@@ -286,7 +373,7 @@ def _sql_from_form_data(
        result,
        chart_id=getattr(chart, "id", None),
        chart_name=getattr(chart, "slice_name", None),
        datasource_name=getattr(chart, "datasource_name", None),
        datasource_name=_resolve_datasource_name(form_data, chart),
    )
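The combined `"datasource"` form-data field packs id and type as `"<id>__<type>"`. The parsing step from `_resolve_datasource_name` can be isolated as a small sketch (the helper name here is invented for illustration; the logic follows the diff):

```python
def parse_combined_datasource(combined: str):
    """Split a combined datasource string like '123__table' into (id, type)."""
    parts = combined.split("__", 1)
    # Numeric ids become ints; UUIDs and other strings are kept as-is
    ds_id = int(parts[0]) if parts[0].isdigit() else parts[0]
    ds_type = parts[1] if len(parts) > 1 else "table"
    return ds_id, ds_type


assert parse_combined_datasource("123__table") == (123, "table")
assert parse_combined_datasource("abc-uuid__query") == ("abc-uuid", "query")
```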
@@ -59,10 +59,14 @@ class DatabaseFilter(ColumnOperator):
        "database_name",
        "expose_in_sqllab",
        "allow_file_upload",
        "created_by_fk",
        "changed_by_fk",
    ] = Field(
        ...,
        description="Column to filter on. Use get_schema(model_type='database') for "
        "available filter columns.",
        "available filter columns. Use created_by_fk with the user "
        "ID from get_instance_info's current_user to find "
        "databases created by a specific user.",
    )
    opr: ColumnOperatorEnum = Field(
        ...,
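The `DatabaseFilter` schema above constrains `col` with a Pydantic `Literal[...]`, so unknown columns fail validation at construction time. A plain-stdlib analogue of that mechanism (hypothetical, not the Pydantic model itself):

```python
from typing import Literal, get_args

# Same allowed set as the DatabaseFilter col field after this change
FilterCol = Literal[
    "database_name",
    "expose_in_sqllab",
    "allow_file_upload",
    "created_by_fk",
    "changed_by_fk",
]


def validate_filter_col(col: str) -> str:
    """Reject any column outside the Literal set, like Pydantic would."""
    if col not in get_args(FilterCol):
        raise ValueError(f"invalid filter column: {col}")
    return col


assert validate_filter_col("created_by_fk") == "created_by_fk"
try:
    validate_filter_col("created_by_name")  # string user-directory field: rejected
except ValueError as exc:
    assert "created_by_name" in str(exc)
```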
@@ -34,6 +34,7 @@ from superset.mcp_service.chart.tool.get_chart_sql import (
    _build_query_context_from_form_data,
    _extract_sql_from_result,
    _find_chart_by_identifier,
    _resolve_datasource_name,
    _resolve_effective_form_data,
    _resolve_groupby,
    _resolve_metrics,

@@ -356,9 +357,14 @@ class TestBuildQueryContextFromFormData:
    """

    @patch("superset.common.query_context_factory.QueryContextFactory")
    def test_temporal_fields_passed_to_factory(self, mock_factory_cls):
        """time_range, granularity_sqla, adhoc_filters from form_data are
        forwarded via form_data= to the factory, not dropped."""
    @patch("superset.daos.datasource.DatasourceDAO.get_datasource")
    def test_temporal_fields_passed_to_factory(self, mock_get_ds, mock_factory_cls):
        """time_range, adhoc_filters from form_data are processed and
        forwarded to the factory — not dropped."""
        mock_ds = Mock()
        mock_ds.database.db_engine_spec.engine = "postgresql"
        mock_get_ds.return_value = mock_ds

        mock_factory = Mock()
        mock_factory.create.return_value = Mock()
        mock_factory_cls.return_value = mock_factory

@@ -390,17 +396,22 @@ class TestBuildQueryContextFromFormData:
        mock_result_type.QUERY = "QUERY"
        _build_query_context_from_form_data(form_data, chart=None)

        # Verify factory.create was called with form_data containing all fields
        call_kwargs = mock_factory.create.call_args[1]
        assert call_kwargs["form_data"] is form_data
        assert call_kwargs["form_data"]["time_range"] == "Last 7 days"
        assert call_kwargs["form_data"]["granularity_sqla"] == "created_at"
        assert call_kwargs["form_data"]["adhoc_filters"] is not None
        assert call_kwargs["form_data"]["where"] == "region = 'US'"
        assert call_kwargs["form_data"]["having"] == "count > 10"
        assert call_kwargs["form_data"]["filters"] == [
            {"col": "city", "op": "==", "val": "NYC"}
        ]

        # adhoc_filters are preprocessed by split_adhoc_filters_into_base_filters
        # into form_data["filters"]/["where"]/["having"], then included
        # in the query dict as concrete filter clauses.
        queries = call_kwargs["queries"]
        assert len(queries) == 1
        assert queries[0]["time_range"] == "Last 7 days"
        assert "adhoc_filters" not in queries[0]
        # The simple adhoc WHERE filter (status == active) should be
        # merged into filters by split_adhoc_filters_into_base_filters.
        filters = queries[0].get("filters", [])
        assert {"col": "status", "op": "==", "val": "active"} in filters

    @patch("superset.common.query_context_factory.QueryContextFactory")
    def test_metrics_and_groupby_in_queries(self, mock_factory_cls):

@@ -429,6 +440,76 @@ class TestBuildQueryContextFromFormData:
        assert queries[0]["columns"] == ["product"]


class TestResolveDatasourceName:
    """Tests for _resolve_datasource_name helper."""

    def test_returns_chart_datasource_name_when_chart_exists(self):
        """When a chart object is provided, use its datasource_name."""
        chart = Mock()
        chart.datasource_name = "my_dataset"
        result = _resolve_datasource_name({"datasource_id": 1}, chart)
        assert result == "my_dataset"

    @patch(
        "superset.mcp_service.chart.tool.get_chart_sql.DatasourceDAO",
        create=True,
    )
    def test_resolves_from_form_data_when_chart_is_none(self, mock_dao):
        """Unsaved charts resolve datasource name via DAO lookup."""
        mock_ds = Mock()
        mock_ds.name = "resolved_dataset"

        with patch(
            "superset.daos.datasource.DatasourceDAO.get_datasource",
            return_value=mock_ds,
        ):
            result = _resolve_datasource_name(
                {"datasource_id": 42, "datasource_type": "table"}, chart=None
            )
        assert result == "resolved_dataset"

    def test_returns_none_when_no_datasource_id(self):
        """Returns None when form_data has no datasource info."""
        result = _resolve_datasource_name({}, chart=None)
        assert result is None

    def test_returns_none_when_datasource_not_found(self):
        """Returns None when DAO raises DatasourceNotFound."""
        from superset.daos.exceptions import DatasourceNotFound

        with patch(
            "superset.daos.datasource.DatasourceDAO.get_datasource",
            side_effect=DatasourceNotFound(),
        ):
            result = _resolve_datasource_name(
                {"datasource_id": 999, "datasource_type": "table"}, chart=None
            )
        assert result is None

    def test_returns_none_on_unsupported_type(self):
        """Returns None when DAO raises DatasourceTypeNotSupportedError."""
        from superset.daos.exceptions import DatasourceTypeNotSupportedError

        with patch(
            "superset.daos.datasource.DatasourceDAO.get_datasource",
            side_effect=DatasourceTypeNotSupportedError(),
        ):
            result = _resolve_datasource_name(
                {"datasource_id": 1, "datasource_type": "invalid"}, chart=None
            )
        assert result is None

    @patch("superset.daos.datasource.DatasourceDAO.get_datasource")
    def test_resolves_combined_datasource_field(self, mock_get_ds):
        """Handles combined 'datasource' field like '123__table'."""
        mock_ds = Mock()
        mock_ds.name = "combined_dataset"
        mock_get_ds.return_value = mock_ds

        result = _resolve_datasource_name({"datasource": "123__table"}, chart=None)
        assert result == "combined_dataset"


class TestGetChartSqlTool:
    """Integration-style tests for the get_chart_sql MCP tool via Client."""
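A note on the stacked `@patch` decorators in the updated test above: `unittest.mock` applies decorators bottom-up, so the decorator closest to the function supplies the first mock argument. That is why the signature reads `(self, mock_get_ds, mock_factory_cls)` even though `QueryContextFactory` is patched first in source order. A minimal demonstration with stdlib targets:

```python
import json
import os
from unittest.mock import patch


@patch("os.getcwd")   # outer decorator -> second mock argument
@patch("json.dumps")  # inner decorator -> first mock argument
def demo(mock_dumps, mock_getcwd):
    # Inside the patched scope, the module attributes ARE the mocks
    assert mock_dumps is json.dumps
    assert mock_getcwd is os.getcwd
    return True


assert demo() is True
```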
@@ -23,9 +23,10 @@ from unittest.mock import MagicMock, patch
import pytest
from fastmcp import Client
from fastmcp.exceptions import ToolError
from pydantic import ValidationError

from superset.mcp_service.app import mcp
from superset.mcp_service.database.schemas import ListDatabasesRequest
from superset.mcp_service.database.schemas import DatabaseFilter, ListDatabasesRequest
from superset.mcp_service.privacy import DATA_MODEL_METADATA_ERROR_TYPE
from superset.utils import json

@@ -39,6 +40,25 @@ get_database_info_module = importlib.import_module(
)


class TestDatabaseFilterSchema:
    """Tests for DatabaseFilter schema — filterable columns."""

    def test_created_by_fk_is_valid_filter_column(self):
        """created_by_fk must be accepted as a filter column."""
        f = DatabaseFilter(col="created_by_fk", opr="eq", value=1)
        assert f.col == "created_by_fk"

    def test_changed_by_fk_is_valid_filter_column(self):
        """changed_by_fk must be accepted as a filter column."""
        f = DatabaseFilter(col="changed_by_fk", opr="eq", value=1)
        assert f.col == "changed_by_fk"

    def test_invalid_filter_column_rejected(self):
        """Columns not in the Literal set must be rejected."""
        with pytest.raises(ValidationError):
            DatabaseFilter(col="not_a_real_column", opr="eq", value=1)


def create_mock_database(
    database_id: int = 1,
    database_name: str = "examples",

@@ -249,10 +269,15 @@ async def test_list_databases_does_not_expose_user_directory_fields(


def test_database_filter_rejects_user_directory_fields() -> None:
    """Test user directory fields cannot be used for database filters."""
    with pytest.raises(ValueError, match="created_by_fk"):
    """Test user directory string fields cannot be used for database filters.

    created_by_fk / changed_by_fk are integer FK IDs and ARE valid filter
    columns. The user-directory *string* fields (created_by, created_by_name,
    etc.) must still be rejected.
    """
    with pytest.raises(ValidationError, match="created_by_name"):
        ListDatabasesRequest(
            filters=[{"col": "created_by_fk", "opr": "eq", "value": 1}],
            filters=[{"col": "created_by_name", "opr": "eq", "value": "admin"}],
        )