Compare commits

..

24 Commits

Author SHA1 Message Date
Maxime Beauchemin
375fe42a68 pointing link to master 2025-07-29 12:01:05 -07:00
Maxime Beauchemin
e6e0c3c47e docs 2025-07-29 11:19:58 -07:00
Maxime Beauchemin
1d6617d809 improve startup script 2025-07-29 11:19:58 -07:00
Maxime Beauchemin
4ff2a85b11 gh 2025-07-29 11:19:58 -07:00
Maxime Beauchemin
f1a3bdd878 tweak utilities 2025-07-29 11:19:58 -07:00
Maxime Beauchemin
4b5dbf3dcf public port 2025-07-29 11:19:58 -07:00
Maxime Beauchemin
458db68929 tmux 2025-07-29 11:19:58 -07:00
Maxime Beauchemin
d4463078ad only 9001 2025-07-29 11:19:57 -07:00
Maxime Beauchemin
7ad10ac1a9 ssh 2025-07-29 11:19:57 -07:00
Maxime Beauchemin
f580f6159e ok 2025-07-29 11:19:57 -07:00
Maxime Beauchemin
a26e0ea0fe fix: Use Python 3.11 Bookworm image to match current standard
- Switch to pre-built Python 3.11 image (no compilation)
- Bookworm base matches Superset Docker images
- Python 3.11 is the current tested standard
- Faster startup, no building from source
2025-07-29 11:19:57 -07:00
Maxime Beauchemin
4eef7a65c1 fix: Remove Python feature to avoid building from source
- Ubuntu 24.04 already includes Python 3.12
- No need to build Python from source (saves ~10min)
- System Python is sufficient for host environment
- Actual Superset Python runs in Docker containers
2025-07-29 11:19:57 -07:00
Maxime Beauchemin
ba3388bf94 feat: Add Claude Code CLI to devcontainer setup
- Install Claude Code for AI-assisted development
- Perfect for using 'claude --yes' safely in Codespaces
- No risk to local machine when running automated commands
2025-07-29 11:19:57 -07:00
Maxime Beauchemin
ca57bbc1e2 feat: Add uv package installer to devcontainer setup
- Install uv via official installer script
- Provides 10-100x faster Python package operations
- Matches what CI uses for package installation
2025-07-29 11:19:56 -07:00
Maxime Beauchemin
19f414b217 fix: Update Node version to 20 to match package.json requirements
- package.json specifies Node ^20.18.1
- Update devcontainer to use Node 20 instead of 18
2025-07-29 11:19:56 -07:00
Maxime Beauchemin
bc604d54e4 fix: Use Ubuntu 24.04 base to match CI with Python 3.11
- Switch to ubuntu-24.04 to match CI environment
- Add Python 3.11 explicitly
- Keep lean setup with only needed features
2025-07-29 11:19:56 -07:00
Maxime Beauchemin
e922e51e6b fix: Use lean Python base image instead of bloated universal
- Switch from 10GB universal to ~2GB Python base
- Add only needed features: Docker, Node, Git
- Much faster Codespace startup
- Same functionality, less bloat
2025-07-29 11:19:56 -07:00
Maxime Beauchemin
8bf2e4ea3a fix: Simplify devcontainer to avoid docker-compose conflicts
- Remove all features (universal image has everything)
- Simplified config to just image + scripts
- No dockerComposeFile reference
- Plain container that runs docker-compose internally
2025-07-29 11:19:56 -07:00
Maxime Beauchemin
cf8183b67e fix: Force rebuild with clean devcontainer config 2025-07-29 11:19:56 -07:00
Maxime Beauchemin
02f90f4321 feat: Use devcontainers/universal image for better tooling
- Switch to universal:2 image which includes vim, curl, jq, tmux, etc.
- Remove redundant features (already in universal image)
- Simplify setup script - only install Superset-specific libs
- Keeps SSH feature for remote access
2025-07-29 11:19:55 -07:00
Maxime Beauchemin
a007b3020d fix: Refactor devcontainer to use base Ubuntu with Docker-in-Docker
- Switch from docker-compose service to base Ubuntu container
- Add Docker-in-Docker to run docker-compose inside Codespace
- This provides git access and full dev environment
- Superset services run via docker-compose from within the container
2025-07-29 11:19:55 -07:00
Maxime Beauchemin
26e5e637f9 feat: Add SSH support to Codespaces configuration 2025-07-29 11:19:55 -07:00
Maxime Beauchemin
8de420ec8e fix: Correct workspace paths for Codespaces
- Use /workspaces instead of /app for Codespaces compatibility
- Fix postCreateCommand and postStartCommand paths
- Make startup script more flexible with directory detection
2025-07-29 11:19:55 -07:00
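The "directory detection" this commit mentions can be sketched roughly as follows (a hypothetical reconstruction — the function name and candidate paths are assumptions, not the actual contents of `start-superset.sh`):

```shell
# Hypothetical sketch of flexible directory detection for the startup script.
# Try the Codespaces mount point first, then the old Docker path, then $PWD.
find_superset_root() {
  for d in /workspaces/superset /app/superset "$PWD"; do
    if [ -f "$d/docker-compose-light.yml" ]; then
      echo "$d"
      return 0
    fi
  done
  return 1
}
```

Probing for a known repo file rather than hardcoding `/app` is what makes the script work both in Codespaces (`/workspaces/...`) and in other environments.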
Maxime Beauchemin
fd51cc65a2 feat: Add GitHub Codespaces support with docker-compose-light
## Summary

Adds full GitHub Codespaces development environment configuration leveraging the new `docker-compose-light.yml` for efficient cloud development.

## Key Features

- **Lightweight Setup**: Uses `docker-compose-light.yml` which removes Redis/nginx for faster startup and lower resource usage
- **Multi-Instance Support**: Each Codespace gets isolated database volumes, perfect for testing multiple branches
- **Auto-Configuration**: Includes VS Code extensions, Python/TypeScript settings, and auto-start script
- **Developer Friendly**: Comprehensive README with SSH, VS Code, and browser connection instructions

## Implementation Details

### Files Added
- `.devcontainer/devcontainer.json` - Main configuration with:
  - Docker-in-Docker support for compose
  - Optimized VS Code extensions for Superset development
  - Smart port forwarding (9001 for frontend, 8088 for API)
  - 4-core/8GB recommended resources

- `.devcontainer/start-superset.sh` - Auto-start script that:
  - Uses unique project names per Codespace
  - Handles Docker daemon startup
  - Shows clear status and credentials

- `.devcontainer/README.md` - Developer guide covering:
  - Multiple connection methods (SSH, VS Code, browser)
  - Port forwarding instructions
  - Cost optimization tips
  - Integration with `claude --yes` workflows

## Benefits

1. **Isolated Development**: No risk to local machine when using `claude --yes`
2. **Resource Efficiency**: Laptop stays cool, Codespaces handles the load
3. **Parallel Testing**: Spin up multiple instances for different features
4. **Quick Pause/Resume**: Auto-stops when idle, resumes in ~30 seconds

## Testing

Push to a fork and create a Codespace to test. The environment auto-starts Superset and forwards port 9001 with HTTPS.

🤖 Generated with [Claude Code](https://claude.ai/code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-07-29 11:19:55 -07:00
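The per-Codespace isolation and port handling described in this commit can be sketched as follows (a hypothetical reconstruction under stated assumptions — `CODESPACE_NAME` is the variable GitHub Codespaces sets, but the function names and exact flags are illustrative, not the real `.devcontainer/start-superset.sh`):

```shell
# Hypothetical sketch of the auto-start behavior: a unique compose project
# name per Codespace gives each instance isolated volumes and networks.
project_name() {
  # CODESPACE_NAME is set by GitHub Codespaces; fall back to "local".
  echo "superset-${CODESPACE_NAME:-local}"
}

start_superset() {
  local port="${NODE_PORT:-9001}"  # frontend dev server port, 9001 by default
  NODE_PORT="$port" docker compose \
    -p "$(project_name)" \
    -f docker-compose-light.yml up -d
  echo "Superset starting; frontend will be forwarded on port ${port}"
}
```

Because `docker compose -p` namespaces containers, volumes, and networks by project, two Codespaces (or two local shells with different `NODE_PORT` values) can run the light stack side by side without colliding.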
120 changed files with 8555 additions and 12843 deletions

View File

@@ -55,7 +55,6 @@ esm/*
tsconfig.tsbuildinfo
.*ipynb
.*yml
.*yaml
.*iml
.esprintrc
.prettierignore

View File

@@ -74,7 +74,7 @@ RUN --mount=type=bind,source=./superset-frontend/package.json,target=./package.j
COPY superset-frontend /app/superset-frontend
######################################################################
# superset-node is used for compiling frontend assets
# superset-node is used to compile frontend assets
######################################################################
FROM superset-node-ci AS superset-node
@@ -90,7 +90,7 @@ RUN --mount=type=cache,target=/root/.npm \
# Copy translation files
COPY superset/translations /app/superset/translations
# Build translations if enabled, then cleanup localization files
# Build the frontend if not in dev mode
RUN if [ "$BUILD_TRANSLATIONS" = "true" ]; then \
npm run build-translation; \
fi; \

View File

@@ -23,8 +23,6 @@ This file documents any backwards-incompatible changes in Superset and
assists people when migrating to a new version.
## Next
- [34346](https://github.com/apache/superset/pull/34346) The examples system has been migrated from Python-based scripts to YAML configuration files. The CLI command `superset load-examples` has been deprecated in favor of `superset examples load`. The old command still works but will show a deprecation warning. Additional example management commands are available under `superset examples` including `clear-old` and `reload`. If you have old Python-based examples loaded, the new YAML-based examples will not load automatically to preserve your existing data. To migrate to the new examples, run `superset examples clear-old --confirm` followed by `superset examples load`.
**Note**: This change affects Cypress tests that rely on specific chart names from the old examples (e.g., "Num Births Trend", "Daily Totals"). These charts may not exist in the new YAML examples, causing test failures. Consider updating your Cypress tests or creating test-specific fixtures.
- [33084](https://github.com/apache/superset/pull/33084) The DISALLOWED_SQL_FUNCTIONS configuration now includes additional potentially sensitive database functions across PostgreSQL, MySQL, SQLite, MS SQL Server, and ClickHouse. Existing queries using these functions may now be blocked. Review your SQL Lab queries and dashboards if you encounter "disallowed function" errors after upgrading.
- [34235](https://github.com/apache/superset/pull/34235) CSV exports now use `utf-8-sig` encoding by default to include a UTF-8 BOM, improving compatibility with Excel.
- [34258](https://github.com/apache/superset/pull/34258) The default in the Dockerfile changed to INCLUDE_CHROMIUM="false" (previously "true"). This ensures the `lean` layer is lean by default, and people can opt in to the `chromium` layer by setting the build arg `INCLUDE_CHROMIUM=true`. This is a breaking change for anyone using the `lean` layer, as it will no longer include Chromium by default.

View File

@@ -20,6 +20,9 @@
# If you choose to use this type of deployment make sure to
# create your own docker environment file (docker/.env) with your own
# unique random secure passwords and SECRET_KEY.
#
# For verbose logging during development:
# - Set SUPERSET_LOG_LEVEL=debug in docker/.env-local for detailed Superset logs
# -----------------------------------------------------------------------
x-superset-image: &superset-image apachesuperset.docker.scarf.sh/apache/superset:${TAG:-latest-dev}
x-superset-volumes:

View File

@@ -20,6 +20,9 @@
# If you choose to use this type of deployment make sure to
# create your own docker environment file (docker/.env) with your own
# unique random secure passwords and SECRET_KEY.
#
# For verbose logging during development:
# - Set SUPERSET_LOG_LEVEL=debug in docker/.env-local for detailed Superset logs
# -----------------------------------------------------------------------
x-superset-volumes:
&superset-volumes # /app/pythonpath_docker will be appended to the PYTHONPATH in the final container

View File

@@ -20,6 +20,9 @@
# If you choose to use this type of deployment make sure to
# create your own docker environment file (docker/.env) with your own
# unique random secure passwords and SECRET_KEY.
#
# For verbose logging during development:
# - Set SUPERSET_LOG_LEVEL=debug in docker/.env-local for detailed Superset logs
# -----------------------------------------------------------------------
x-superset-user: &superset-user root
x-superset-volumes: &superset-volumes

View File

@@ -53,7 +53,12 @@ PYTHONPATH=/app/pythonpath:/app/docker/pythonpath_dev
REDIS_HOST=redis
REDIS_PORT=6379
# Development and logging configuration
# FLASK_DEBUG: Enables Flask dev features (auto-reload, better error pages) - keep 'true' for development
FLASK_DEBUG=true
# SUPERSET_LOG_LEVEL: Controls Superset application logging verbosity (debug, info, warning, error, critical)
SUPERSET_LOG_LEVEL=info
SUPERSET_APP_ROOT="/"
SUPERSET_ENV=development
SUPERSET_LOAD_EXAMPLES=yes
@@ -66,4 +71,3 @@ SUPERSET_SECRET_KEY=TEST_NON_DEV_SECRET
ENABLE_PLAYWRIGHT=false
PUPPETEER_SKIP_CHROMIUM_DOWNLOAD=true
BUILD_SUPERSET_FRONTEND_IN_DOCKER=true
SUPERSET_LOG_LEVEL=info

View File

@@ -20,4 +20,5 @@
# DON'T ignore the .gitignore
!.gitignore
!superset_config.py
!superset_config_docker_light.py
!superset_config_local.example

View File

@@ -14,7 +14,6 @@
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.
# mypy: disable-error-code="assignment,misc"
#
# This file is included in the final Docker image and SHOULD be overridden when
# deploying the image to prod. Settings configured here are intended for use in local

View File

@@ -137,7 +137,7 @@ contributing to Apache Superset more accessible to developers worldwide.
1. **Create a Codespace**: Use this pre-configured link that sets up everything you need:
[**Launch Superset Codespace →**](https://github.com/codespaces/new?skip_quickstart=true&machine=standardLinux32gb&repo=39464018&ref=codespaces&geo=UsWest&devcontainer_path=.devcontainer%2Fdevcontainer.json)
[**Launch Superset Codespace →**](https://github.com/codespaces/new?skip_quickstart=true&machine=standardLinux32gb&repo=39464018&ref=master&geo=UsWest&devcontainer_path=.devcontainer%2Fdevcontainer.json)
:::caution
**Important**: You must select at least the **4 CPU / 16GB RAM** machine type (pre-selected in the link above).
@@ -348,7 +348,7 @@ superset init
# Load some data to play with.
# Note: you MUST have previously created an admin user with the username `admin` for this command to work.
superset examples load
superset load-examples
# Start the Flask dev web server from inside your virtualenv.
# Note that your page may not have CSS at this point.

View File

@@ -26,11 +26,14 @@ Superset locally is using Docker Compose on a Linux or Mac OSX
computer. Superset does not have official support for Windows. It's also the easiest
way to launch a fully functioning **development environment** quickly.
Note that there are 3 major ways we support to run `docker compose`:
Note that there are 4 major ways we support to run `docker compose`:
1. **docker-compose.yml:** for interactive development, where we mount your local folder with the
frontend/backend files that you can edit and experience the changes you
make in the app in real time
1. **docker-compose-light.yml:** a lightweight configuration with minimal services (database,
Superset app, and frontend dev server) for development. Uses in-memory caching instead of Redis
and is designed for running multiple instances simultaneously
1. **docker-compose-non-dev.yml** where we just build a more immutable image based on the
local branch and get all the required images running. Changes in the local branch
at the time you fire this up will be reflected, but changes to the code
@@ -44,7 +47,7 @@ Note that there are 3 major ways we support to run `docker compose`:
The `dev` builds include the `psycopg2-binary` required to connect
to the Postgres database launched as part of the `docker compose` builds.
More on these two approaches after setting up the requirements for either.
More on these approaches after setting up the requirements.
## Requirements
@@ -103,13 +106,36 @@ and help you start fresh. In the context of `docker compose` setting
from within docker. This will slow down the startup, but will fix various npm-related issues.
:::
### Option #2 - build a set of immutable images from the local branch
### Option #2 - lightweight development with multiple instances
For a lighter development setup that uses fewer resources and supports running multiple instances:
```bash
# Single lightweight instance (default port 9001)
docker compose -f docker-compose-light.yml up
# Multiple instances with different ports
NODE_PORT=9001 docker compose -p superset-1 -f docker-compose-light.yml up
NODE_PORT=9002 docker compose -p superset-2 -f docker-compose-light.yml up
NODE_PORT=9003 docker compose -p superset-3 -f docker-compose-light.yml up
```
This configuration includes:
- PostgreSQL database (internal network only)
- Superset application server
- Frontend development server with webpack hot reloading
- In-memory caching (no Redis)
- Isolated volumes and networks per instance
Access each instance at `http://localhost:{NODE_PORT}` (e.g., `http://localhost:9001`).
### Option #3 - build a set of immutable images from the local branch
```bash
docker compose -f docker-compose-non-dev.yml up
```
### Option #3 - boot up an official release
### Option #4 - boot up an official release
```bash
# Set the version you want to run

View File

@@ -151,7 +151,7 @@ Finish installing by running through the following commands:
superset fab create-admin
# Load some data to play with
superset examples load
superset load_examples
# Create default roles and permissions
superset init

View File

@@ -33,4 +33,4 @@ superset load-test-users
echo "Running tests"
pytest --durations-min=2 --cov-report= --cov=superset ./tests/integration_tests "$@"
pytest --durations-min=2 --maxfail=1 --cov-report= --cov=superset ./tests/integration_tests "$@"

View File

@@ -41,53 +41,6 @@ import {
import { checkColumnType } from '../utils/checkColumnType';
import { isSortable } from '../utils/isSortable';
// Aggregation choices with computation methods for plugins and controls
export const aggregationChoices = {
raw: {
label: 'Overall value',
compute: (data: number[]) => {
if (!data.length) return null;
return data[0];
},
},
LAST_VALUE: {
label: 'Last Value',
compute: (data: number[]) => {
if (!data.length) return null;
return data[0];
},
},
sum: {
label: 'Total (Sum)',
compute: (data: number[]) =>
data.length ? data.reduce((a, b) => a + b, 0) : null,
},
mean: {
label: 'Average (Mean)',
compute: (data: number[]) =>
data.length ? data.reduce((a, b) => a + b, 0) / data.length : null,
},
min: {
label: 'Minimum',
compute: (data: number[]) => (data.length ? Math.min(...data) : null),
},
max: {
label: 'Maximum',
compute: (data: number[]) => (data.length ? Math.max(...data) : null),
},
median: {
label: 'Median',
compute: (data: number[]) => {
if (!data.length) return null;
const sorted = [...data].sort((a, b) => a - b);
const mid = Math.floor(sorted.length / 2);
return sorted.length % 2 === 0
? (sorted[mid - 1] + sorted[mid]) / 2
: sorted[mid];
},
},
} as const;
export const contributionModeControl = {
name: 'contributionMode',
config: {
@@ -116,12 +69,17 @@ export const aggregationControl = {
default: 'LAST_VALUE',
clearable: false,
renderTrigger: false,
choices: Object.entries(aggregationChoices).map(([value, { label }]) => [
value,
t(label),
]),
choices: [
['raw', t('None')],
['LAST_VALUE', t('Last Value')],
['sum', t('Total (Sum)')],
['mean', t('Average (Mean)')],
['min', t('Minimum')],
['max', t('Maximum')],
['median', t('Median')],
],
description: t(
'Method to compute the displayed value. "Overall value" calculates a single metric across the entire filtered time period, ideal for non-additive metrics like ratios, averages, or distinct counts. Other methods operate over the time series data points.',
'Aggregation method used to compute the Big Number from the Trendline. For non-additive metrics like ratios, averages, distinct counts, etc., use NONE.',
),
provideFormDataToProps: true,
mapStateToProps: ({ form_data }: ControlPanelState) => ({

View File

@@ -16,8 +16,14 @@
* specific language governing permissions and limitations
* under the License.
*/
import { t } from '@superset-ui/core';
import { Icons, Modal, Typography, Button } from '@superset-ui/core/components';
import { t, css, useTheme } from '@superset-ui/core';
import {
Icons,
Modal,
Typography,
Button,
Flex,
} from '@superset-ui/core/components';
import type { FC, ReactElement } from 'react';
export type UnsavedChangesModalProps = {
@@ -36,30 +42,66 @@ export const UnsavedChangesModal: FC<UnsavedChangesModalProps> = ({
onConfirmNavigation,
title = 'Unsaved Changes',
body = "If you don't save, changes will be lost.",
}: UnsavedChangesModalProps): ReactElement => (
<Modal
centered
responsive
onHide={onHide}
show={showModal}
width="444px"
title={
<>
<Icons.WarningOutlined iconSize="m" style={{ marginRight: 8 }} />
{title}
</>
}
footer={
<>
<Button buttonStyle="secondary" onClick={onConfirmNavigation}>
{t('Discard')}
</Button>
<Button buttonStyle="primary" onClick={handleSave}>
{t('Save')}
</Button>
</>
}
>
<Typography.Text>{body}</Typography.Text>
</Modal>
);
}): ReactElement => {
const theme = useTheme();
return (
<Modal
name={title}
centered
responsive
onHide={onHide}
show={showModal}
width="444px"
title={
<Flex>
<Icons.WarningOutlined
iconColor={theme.colorWarning}
css={css`
margin-right: ${theme.sizeUnit * 2}px;
`}
iconSize="l"
/>
<Typography.Title
css={css`
&& {
margin: 0;
margin-bottom: 0;
}
`}
level={5}
>
{title}
</Typography.Title>
</Flex>
}
footer={
<Flex
justify="flex-end"
css={css`
width: 100%;
`}
>
<Button
htmlType="button"
buttonSize="small"
buttonStyle="secondary"
onClick={onConfirmNavigation}
>
{t('Discard')}
</Button>
<Button
htmlType="button"
buttonSize="small"
buttonStyle="primary"
onClick={handleSave}
>
{t('Save')}
</Button>
</Flex>
}
>
<Typography.Text>{body}</Typography.Text>
</Modal>
);
};

View File

@@ -1385,7 +1385,7 @@ export default function (config) {
p[0] = p[0] - __.margin.left;
p[1] = p[1] - __.margin.top;
((dims = dimensionsForPoint(p)),
(dims = dimensionsForPoint(p)),
(strum = {
p1: p,
dims: dims,
@@ -1393,7 +1393,7 @@ export default function (config) {
maxX: xscale(dims.right),
minY: 0,
maxY: h(),
}));
});
strums[dims.i] = strum;
strums.active = dims.i;
@@ -1942,7 +1942,7 @@ export default function (config) {
p[0] = p[0] - __.margin.left;
p[1] = p[1] - __.margin.top;
((dims = dimensionsForPoint(p)),
(dims = dimensionsForPoint(p)),
(arc = {
p1: p,
dims: dims,
@@ -1953,7 +1953,7 @@ export default function (config) {
startAngle: undefined,
endAngle: undefined,
arc: d3.svg.arc().innerRadius(0),
}));
});
arcs[dims.i] = arc;
arcs.active = dims.i;

View File

@@ -49,53 +49,38 @@ describe('BigNumberWithTrendline buildQuery', () => {
aggregation: null,
};
it('creates raw metric query when aggregation is "raw"', () => {
const queryContext = buildQuery({ ...baseFormData, aggregation: 'raw' });
it('creates raw metric query when aggregation is null', () => {
const queryContext = buildQuery({ ...baseFormData });
const bigNumberQuery = queryContext.queries[1];
expect(bigNumberQuery.post_processing).toEqual([]);
expect(bigNumberQuery.is_timeseries).toBe(false);
expect(bigNumberQuery.columns).toEqual([]);
expect(bigNumberQuery.post_processing).toEqual([{ operation: 'pivot' }]);
expect(bigNumberQuery.is_timeseries).toBe(true);
});
it('returns single query for aggregation methods that can be computed client-side', () => {
it('adds aggregation operator when aggregation is "sum"', () => {
const queryContext = buildQuery({ ...baseFormData, aggregation: 'sum' });
const bigNumberQuery = queryContext.queries[1];
expect(queryContext.queries.length).toBe(1);
expect(queryContext.queries[0].post_processing).toEqual([
expect(bigNumberQuery.post_processing).toEqual([
{ operation: 'pivot' },
{ operation: 'rolling' },
{ operation: 'resample' },
{ operation: 'flatten' },
{ operation: 'aggregation', options: { operator: 'sum' } },
]);
expect(bigNumberQuery.is_timeseries).toBe(true);
});
it('returns single query for LAST_VALUE aggregation', () => {
it('skips aggregation when aggregation is LAST_VALUE', () => {
const queryContext = buildQuery({
...baseFormData,
aggregation: 'LAST_VALUE',
});
const bigNumberQuery = queryContext.queries[1];
expect(queryContext.queries.length).toBe(1);
expect(queryContext.queries[0].post_processing).toEqual([
{ operation: 'pivot' },
{ operation: 'rolling' },
{ operation: 'resample' },
{ operation: 'flatten' },
]);
expect(bigNumberQuery.post_processing).toEqual([{ operation: 'pivot' }]);
expect(bigNumberQuery.is_timeseries).toBe(true);
});
it('returns two queries only for raw aggregation', () => {
const queryContext = buildQuery({ ...baseFormData, aggregation: 'raw' });
it('always returns two queries', () => {
const queryContext = buildQuery({ ...baseFormData });
expect(queryContext.queries.length).toBe(2);
const queryContextLastValue = buildQuery({
...baseFormData,
aggregation: 'LAST_VALUE',
});
expect(queryContextLastValue.queries.length).toBe(1);
const queryContextSum = buildQuery({ ...baseFormData, aggregation: 'sum' });
expect(queryContextSum.queries.length).toBe(1);
});
});

View File

@@ -39,37 +39,28 @@ export default function buildQuery(formData: QueryFormData) {
? ensureIsArray(getXAxisColumn(formData))
: [];
return buildQueryContext(formData, baseQueryObject => {
const queries = [
{
...baseQueryObject,
columns: [...timeColumn],
...(timeColumn.length ? {} : { is_timeseries: true }),
post_processing: [
pivotOperator(formData, baseQueryObject),
rollingWindowOperator(formData, baseQueryObject),
resampleOperator(formData, baseQueryObject),
flattenOperator(formData, baseQueryObject),
].filter(Boolean),
},
];
// Only add second query for raw metrics which need different query structure
// All other aggregations (sum, mean, min, max, median, LAST_VALUE) can be computed client-side from trendline data
if (formData.aggregation === 'raw') {
queries.push({
...baseQueryObject,
columns: [...(isRawMetric ? [] : timeColumn)],
is_timeseries: !isRawMetric,
post_processing: isRawMetric
? []
: ([
pivotOperator(formData, baseQueryObject),
aggregationOperator(formData, baseQueryObject),
].filter(Boolean) as any[]),
});
}
return queries;
});
return buildQueryContext(formData, baseQueryObject => [
{
...baseQueryObject,
columns: [...timeColumn],
...(timeColumn.length ? {} : { is_timeseries: true }),
post_processing: [
pivotOperator(formData, baseQueryObject),
rollingWindowOperator(formData, baseQueryObject),
resampleOperator(formData, baseQueryObject),
flattenOperator(formData, baseQueryObject),
],
},
{
...baseQueryObject,
columns: [...(isRawMetric ? [] : timeColumn)],
is_timeseries: !isRawMetric,
post_processing: isRawMetric
? []
: [
pivotOperator(formData, baseQueryObject),
aggregationOperator(formData, baseQueryObject),
],
},
]);
}

View File

@@ -20,41 +20,6 @@ import { GenericDataType } from '@superset-ui/core';
import transformProps from './transformProps';
import { BigNumberWithTrendlineChartProps, BigNumberDatum } from '../types';
// Mock chart-controls to avoid styled-components issues in Jest
jest.mock('@superset-ui/chart-controls', () => ({
aggregationChoices: {
raw: {
label: 'Force server-side aggregation',
compute: (data: number[]) => data[0] ?? null,
},
LAST_VALUE: {
label: 'Last Value',
compute: (data: number[]) => data[0] ?? null,
},
sum: {
label: 'Total (Sum)',
compute: (data: number[]) => data.reduce((a, b) => a + b, 0),
},
mean: {
label: 'Average (Mean)',
compute: (data: number[]) =>
data.reduce((a, b) => a + b, 0) / data.length,
},
min: { label: 'Minimum', compute: (data: number[]) => Math.min(...data) },
max: { label: 'Maximum', compute: (data: number[]) => Math.max(...data) },
median: {
label: 'Median',
compute: (data: number[]) => {
const sorted = [...data].sort((a, b) => a - b);
const mid = Math.floor(sorted.length / 2);
return sorted.length % 2 === 0
? (sorted[mid - 1] + sorted[mid]) / 2
: sorted[mid];
},
},
},
}));
jest.mock('@superset-ui/core', () => ({
GenericDataType: { Temporal: 2, String: 1 },
extractTimegrain: jest.fn(() => 'P1D'),
@@ -253,7 +218,7 @@ describe('BigNumberWithTrendline transformProps', () => {
coltypes: ['NUMERIC'],
},
],
formData: { ...baseFormData, aggregation: 'sum' },
formData: { ...baseFormData, aggregation: 'SUM' },
rawFormData: baseRawFormData,
hooks: baseHooks,
datasource: baseDatasource,

View File

@@ -29,7 +29,6 @@ import {
tooltipHtml,
} from '@superset-ui/core';
import { EChartsCoreOption, graphic } from 'echarts/core';
import { aggregationChoices } from '@superset-ui/chart-controls';
import {
BigNumberVizProps,
BigNumberDatum,
@@ -44,31 +43,6 @@ const formatPercentChange = getNumberFormatter(
NumberFormats.PERCENT_SIGNED_1_POINT,
);
// Client-side aggregation function using shared aggregationChoices
function computeClientSideAggregation(
data: [number | null, number | null][],
aggregation: string | undefined | null,
): number | null {
if (!data.length) return null;
// Find the aggregation method, handling case variations
const methodKey = Object.keys(aggregationChoices).find(
key => key.toLowerCase() === (aggregation || '').toLowerCase(),
);
// Use the compute method from aggregationChoices, fallback to LAST_VALUE
const selectedMethod = methodKey
? aggregationChoices[methodKey as keyof typeof aggregationChoices]
: aggregationChoices.LAST_VALUE;
// Extract values from tuple array and filter out nulls
const values = data
.map(([, value]) => value)
.filter((v): v is number => v !== null);
return selectedMethod.compute(values);
}
export default function transformProps(
chartProps: BigNumberWithTrendlineChartProps,
): BigNumberVizProps {
@@ -152,33 +126,27 @@ export default function transformProps(
// sort in time descending order
.sort((a, b) => (a[0] !== null && b[0] !== null ? b[0] - a[0] : 0));
}
if (sortedData.length > 0) {
timestamp = sortedData[0][0];
// Raw aggregation uses server-side data, all others use client-side
if (aggregation === 'raw' && hasAggregatedData && aggregatedData) {
// Use server-side aggregation for raw
if (
aggregatedData[metricName] !== null &&
aggregatedData[metricName] !== undefined
) {
bigNumber = aggregatedData[metricName];
} else {
const metricKeys = Object.keys(aggregatedData).filter(
key =>
key !== xAxisLabel &&
aggregatedData[key] !== null &&
typeof aggregatedData[key] === 'number',
);
bigNumber =
metricKeys.length > 0 ? aggregatedData[metricKeys[0]] : null;
}
if (hasAggregatedData && aggregatedData) {
if (
aggregatedData[metricName] !== null &&
aggregatedData[metricName] !== undefined
) {
bigNumber = aggregatedData[metricName];
} else {
// Use client-side aggregation for all other methods
bigNumber = computeClientSideAggregation(sortedData, aggregation);
const metricKeys = Object.keys(aggregatedData).filter(
key =>
key !== xAxisLabel &&
aggregatedData[key] !== null &&
typeof aggregatedData[key] === 'number',
);
bigNumber = metricKeys.length > 0 ? aggregatedData[metricKeys[0]] : null;
}
// Handle null bigNumber case
timestamp = sortedData.length > 0 ? sortedData[0][0] : null;
} else if (sortedData.length > 0) {
bigNumber = sortedData[0][1];
timestamp = sortedData[0][0];
if (bigNumber === null) {
bigNumberFallback = sortedData.find(d => d[1] !== null);
bigNumber = bigNumberFallback ? bigNumberFallback[1] : null;

View File

@@ -128,10 +128,9 @@ describe('BigNumberWithTrendline', () => {
expect(lastDatum?.[0]).toStrictEqual(100);
expect(lastDatum?.[1]).toBeNull();
// should get the last non-null value
// should note this is a fallback
expect(transformed.bigNumber).toStrictEqual(1.2345);
// bigNumberFallback is only set when bigNumber is null after aggregation
expect(transformed.bigNumberFallback).toBeNull();
expect(transformed.bigNumberFallback).not.toBeNull();
// should successfully formatTime by granularity
// @ts-ignore

View File

@@ -164,7 +164,7 @@ const v1ChartDataRequest = async (
ownState,
parseMethod,
) => {
const payload = await buildV1ChartDataPayload({
const payload = buildV1ChartDataPayload({
formData,
resultType,
resultFormat,
@@ -255,7 +255,7 @@ export function runAnnotationQuery({
isDashboardRequest = false,
force = false,
}) {
return async function (dispatch, getState) {
return function (dispatch, getState) {
const { charts, common } = getState();
const sliceKey = key || Object.keys(charts)[0];
const queryTimeout = timeout || common.conf.SUPERSET_WEBSERVER_TIMEOUT;
@@ -310,19 +310,17 @@ export function runAnnotationQuery({
fd.annotation_layers[annotationIndex].overrides = sliceFormData;
}
const payload = await buildV1ChartDataPayload({
formData: fd,
force,
resultFormat: 'json',
resultType: 'full',
});
return SupersetClient.post({
url,
signal,
timeout: queryTimeout * 1000,
headers: { 'Content-Type': 'application/json' },
jsonPayload: payload,
jsonPayload: buildV1ChartDataPayload({
formData: fd,
force,
resultFormat: 'json',
resultType: 'full',
}),
})
.then(({ json }) => {
const data = json?.result?.[0]?.annotation_data?.[annotation.name];
@@ -422,8 +420,6 @@ export function exploreJSON(
const setDataMask = dataMask => {
dispatch(updateDataMask(formData.slice_id, dataMask));
};
dispatch(chartUpdateStarted(controller, formData, key));
const chartDataRequest = getChartDataRequest({
setDataMask,
formData,
@@ -435,6 +431,8 @@ export function exploreJSON(
ownState,
});
dispatch(chartUpdateStarted(controller, formData, key));
const [useLegacyApi] = getQuerySettings(formData);
const chartDataRequestCaught = chartDataRequest
.then(({ response, json }) =>

View File

@@ -64,7 +64,6 @@ describe('chart actions', () => {
let dispatch;
let getExploreUrlStub;
let getChartDataUriStub;
let buildV1ChartDataPayloadStub;
let waitForAsyncDataStub;
let fakeMetadata;
@@ -86,13 +85,6 @@ describe('chart actions', () => {
getChartDataUriStub = sinon
.stub(exploreUtils, 'getChartDataUri')
.callsFake(({ qs }) => URI(MOCK_URL).query(qs));
buildV1ChartDataPayloadStub = sinon
.stub(exploreUtils, 'buildV1ChartDataPayload')
.resolves({
some_param: 'fake query!',
result_type: 'full',
result_format: 'json',
});
fakeMetadata = { useLegacyApi: true };
getChartMetadataRegistry.mockImplementation(() => ({
get: () => fakeMetadata,
@@ -112,7 +104,6 @@ describe('chart actions', () => {
afterEach(() => {
getExploreUrlStub.restore();
getChartDataUriStub.restore();
buildV1ChartDataPayloadStub.restore();
fetchMock.resetHistory();
waitForAsyncDataStub.restore();
@@ -371,7 +362,7 @@ describe('chart actions timeout', () => {
jest.clearAllMocks();
});
it('should use the timeout from arguments when given', async () => {
it('should use the timeout from arguments when given', () => {
const postSpy = jest.spyOn(SupersetClient, 'post');
postSpy.mockImplementation(() => Promise.resolve({ json: { result: [] } }));
const timeout = 10; // Set the timeout value here
@@ -379,7 +370,7 @@ describe('chart actions timeout', () => {
const key = 'chartKey'; // Set the chart key here
const store = mockStore(initialState);
await store.dispatch(
store.dispatch(
actions.runAnnotationQuery({
annotation: {
value: 'annotationValue',
@@ -403,14 +394,14 @@ describe('chart actions timeout', () => {
expect(postSpy).toHaveBeenCalledWith(expectedPayload);
});
it('should use the timeout from common.conf when not passed as an argument', async () => {
it('should use the timeout from common.conf when not passed as an argument', () => {
const postSpy = jest.spyOn(SupersetClient, 'post');
postSpy.mockImplementation(() => Promise.resolve({ json: { result: [] } }));
const formData = { datasource: 'table__1' }; // Set the formData here
const key = 'chartKey'; // Set the chart key here
const store = mockStore(initialState);
await store.dispatch(
store.dispatch(
actions.runAnnotationQuery({
annotation: {
value: 'annotationValue',

View File

@@ -91,7 +91,7 @@ afterEach(() => {
});
const getFormatSwitch = () =>
screen.getByRole('switch', { name: 'formatted original' });
screen.getByRole('switch', { name: 'Show original SQL' });
test('renders the component with Formatted SQL and buttons', async () => {
const { container } = setup(mockProps);

View File

@@ -26,17 +26,11 @@ import {
} from 'react';
import { useSelector } from 'react-redux';
import rison from 'rison';
import { styled, SupersetClient, t, useTheme } from '@superset-ui/core';
import {
Icons,
Switch,
Button,
Skeleton,
Card,
Space,
} from '@superset-ui/core/components';
import { styled, SupersetClient, t } from '@superset-ui/core';
import { Icons, Switch, Button, Skeleton } from '@superset-ui/core/components';
import { CopyToClipboard } from 'src/components';
import { RootState } from 'src/dashboard/types';
import { CopyButton } from 'src/explore/components/DataTableControl';
import { findPermission } from 'src/utils/findPermission';
import CodeSyntaxHighlighter, {
SupportedLanguage,
@@ -44,6 +38,14 @@ import CodeSyntaxHighlighter, {
} from '@superset-ui/core/components/CodeSyntaxHighlighter';
import { useHistory } from 'react-router-dom';
const CopyButtonViewQuery = styled(CopyButton)`
${({ theme }) => `
&& {
margin: 0 0 ${theme.sizeUnit}px;
}
`}
`;
export interface ViewQueryProps {
sql: string;
datasource: string;
@@ -56,14 +58,26 @@ const StyledSyntaxContainer = styled.div`
flex-direction: column;
`;
const StyledHeaderMenuContainer = styled.div`
display: flex;
flex-direction: row;
justify-content: space-between;
margin-top: ${({ theme }) => -theme.sizeUnit * 4}px;
align-items: flex-end;
`;
const StyledHeaderActionContainer = styled.div`
display: flex;
flex-direction: row;
column-gap: ${({ theme }) => theme.sizeUnit * 2}px;
`;
const StyledThemedSyntaxHighlighter = styled(CodeSyntaxHighlighter)`
flex: 1;
`;
const StyledFooter = styled.div`
display: flex;
justify-content: space-between;
align-items: center;
const StyledLabel = styled.label`
font-size: ${({ theme }) => theme.fontSize}px;
`;
const DATASET_BACKEND_QUERY = {
@@ -73,7 +87,6 @@ const DATASET_BACKEND_QUERY = {
const ViewQuery: FC<ViewQueryProps> = props => {
const { sql, language = 'sql', datasource } = props;
const theme = useTheme();
const datasetId = datasource.split('__')[0];
const [formattedSQL, setFormattedSQL] = useState<string>();
const [showFormatSQL, setShowFormatSQL] = useState(true);
@@ -140,57 +153,46 @@ const ViewQuery: FC<ViewQueryProps> = props => {
}, [sql]);
return (
<Card bodyStyle={{ padding: theme.sizeUnit * 4 }}>
<StyledSyntaxContainer key={sql}>
{!formattedSQL && <Skeleton active />}
{formattedSQL && (
<StyledThemedSyntaxHighlighter
language={language}
customStyle={{ flex: 1, marginBottom: theme.sizeUnit * 3 }}
>
{currentSQL}
</StyledThemedSyntaxHighlighter>
)}
<StyledFooter>
<Space size={theme.sizeUnit * 2}>
<CopyToClipboard
text={currentSQL}
shouldShowText={false}
copyNode={
<Button
buttonStyle="secondary"
buttonSize="small"
icon={<Icons.CopyOutlined />}
>
{t('Copy')}
</Button>
}
/>
{canAccessSQLLab && (
<Button
buttonStyle="secondary"
<StyledSyntaxContainer key={sql}>
<StyledHeaderMenuContainer>
<StyledHeaderActionContainer>
<CopyToClipboard
text={currentSQL}
shouldShowText={false}
copyNode={
<CopyButtonViewQuery
buttonSize="small"
onClick={navToSQLLab}
icon={<Icons.CopyOutlined />}
>
{t('View in SQL Lab')}
</Button>
)}
</Space>
<Space size={theme.sizeUnit * 2} align="center">
<Icons.ConsoleSqlOutlined />
<Switch
id="formatSwitch"
checked={showFormatSQL}
onChange={formatCurrentQuery}
checkedChildren={t('formatted')}
unCheckedChildren={t('original')}
/>
</Space>
</StyledFooter>
</StyledSyntaxContainer>
</Card>
{t('Copy')}
</CopyButtonViewQuery>
}
/>
{canAccessSQLLab && (
<Button onClick={navToSQLLab}>{t('View in SQL Lab')}</Button>
)}
</StyledHeaderActionContainer>
<StyledHeaderActionContainer>
<Switch
id="formatSwitch"
checked={!showFormatSQL}
onChange={formatCurrentQuery}
/>
<StyledLabel htmlFor="formatSwitch">
{t('Show original SQL')}
</StyledLabel>
</StyledHeaderActionContainer>
</StyledHeaderMenuContainer>
{!formattedSQL && <Skeleton active />}
{formattedSQL && (
<StyledThemedSyntaxHighlighter
language={language}
customStyle={{ flex: 1 }}
>
{currentSQL}
</StyledThemedSyntaxHighlighter>
)}
</StyledSyntaxContainer>
);
};

View File

@@ -42,7 +42,6 @@ const ViewQueryModalContainer = styled.div`
height: 100%;
display: flex;
flex-direction: column;
gap: ${({ theme }) => theme.sizeUnit * 4}px;
`;
const ViewQueryModal: FC<Props> = ({ latestQueryFormData }) => {
@@ -87,10 +86,9 @@ const ViewQueryModal: FC<Props> = ({ latestQueryFormData }) => {
return (
<ViewQueryModalContainer>
{result.map((item, index) =>
{result.map(item =>
item.query ? (
<ViewQuery
key={`query-${index}`}
datasource={latestQueryFormData.datasource}
sql={item.query}
language="sql"

View File

@@ -191,8 +191,8 @@ describe('exploreUtils', () => {
});
describe('buildV1ChartDataPayload', () => {
it('generate valid request payload despite no registered buildQuery', async () => {
const v1RequestPayload = await buildV1ChartDataPayload({
it('generate valid request payload despite no registered buildQuery', () => {
const v1RequestPayload = buildV1ChartDataPayload({
formData: { ...formData, viz_type: 'my_custom_viz' },
});
expect(v1RequestPayload.hasOwnProperty('queries')).toBeTruthy();

View File

@@ -207,7 +207,7 @@ export const getQuerySettings = formData => {
];
};
export const buildV1ChartDataPayload = async ({
export const buildV1ChartDataPayload = ({
formData,
force,
resultFormat,
@@ -242,7 +242,7 @@ export const buildV1ChartDataPayload = async ({
export const getLegacyEndpointType = ({ resultType, resultFormat }) =>
resultFormat === 'csv' ? resultFormat : resultType;
export const exportChart = async ({
export const exportChart = ({
formData,
resultFormat = 'json',
resultType = 'full',
@@ -262,7 +262,7 @@ export const exportChart = async ({
payload = formData;
} else {
url = ensureAppRoot('/api/v1/chart/data');
payload = await buildV1ChartDataPayload({
payload = buildV1ChartDataPayload({
formData,
force,
resultFormat,
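The hunks in this file drop `async`/`await` from `buildV1ChartDataPayload` and `exportChart`: once a function awaits nothing, the `async` keyword only forces an extra event-loop hop on every caller. A minimal Python sketch of the same refactor (function and key names are illustrative, not Superset's API):

```python
import asyncio

def build_payload(form_data, result_format="json", result_type="full"):
    # Plain function: nothing inside awaits, so callers need no event loop.
    return {
        "form_data": form_data,
        "result_format": result_format,
        "result_type": result_type,
    }

async def build_payload_async(form_data, **kwargs):
    # The pre-refactor shape: async with no awaits forces every caller
    # to `await` (or schedule a task) for a value that is already ready.
    return build_payload(form_data, **kwargs)

sync_payload = build_payload({"viz_type": "table"})
async_payload = asyncio.run(build_payload_async({"viz_type": "table"}))
assert sync_payload == async_payload
```

Call sites then simplify exactly as in the diff: `payload = build_payload(...)` with no `await`, and test callbacks lose their `async` qualifier.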

View File

@@ -16,12 +16,7 @@
* specific language governing permissions and limitations
* under the License.
*/
import {
render,
screen,
waitFor,
userEvent,
} from 'spec/helpers/testing-library';
import { render, screen } from 'spec/helpers/testing-library';
import Footer from 'src/features/datasets/AddDataset/Footer';
const mockHistoryPush = jest.fn();
@@ -32,14 +27,6 @@ jest.mock('react-router-dom', () => ({
}),
}));
// Mock the API call
const mockCreateResource = jest.fn();
jest.mock('src/views/CRUD/hooks', () => ({
useSingleViewResource: () => ({
createResource: mockCreateResource,
}),
}));
const mockedProps = {
url: 'realwebsite.com',
};
@@ -47,7 +34,7 @@ const mockedProps = {
const mockPropsWithDataset = {
url: 'realwebsite.com',
datasetObject: {
db: {
database: {
id: '1',
database_name: 'examples',
},
@@ -60,10 +47,6 @@ const mockPropsWithDataset = {
};
describe('Footer', () => {
beforeEach(() => {
jest.clearAllMocks();
});
test('renders a Footer with a cancel button and a disabled create button', () => {
render(<Footer {...mockedProps} />, { useRedux: true });
@@ -72,28 +55,21 @@ describe('Footer', () => {
});
const createButton = screen.getByRole('button', {
name: /Create dataset and create chart/i,
name: /Create/i,
});
expect(saveButton).toBeVisible();
expect(createButton).toBeDisabled();
});
test('renders a Create Dataset dropdown button when a table is selected', () => {
test('renders a Create Dataset button when a table is selected', () => {
render(<Footer {...mockPropsWithDataset} />, { useRedux: true });
const createButton = screen.getByRole('button', {
name: /Create dataset and create chart/i,
name: /Create/i,
});
expect(createButton).toBeEnabled();
// Check that it's a dropdown button with the correct text
expect(createButton).toHaveTextContent('Create dataset and create chart');
// Check for the dropdown arrow
const dropdownArrow = screen.getByRole('img', { hidden: true });
expect(dropdownArrow).toBeInTheDocument();
});
test('create button becomes disabled when table already has a dataset', () => {
@@ -102,119 +78,9 @@ describe('Footer', () => {
});
const createButton = screen.getByRole('button', {
name: /Create dataset and create chart/i,
name: /Create/i,
});
expect(createButton).toBeDisabled();
});
test('shows dropdown menu when dropdown arrow is clicked', async () => {
render(<Footer {...mockPropsWithDataset} />, { useRedux: true });
// Find and click the dropdown trigger (the arrow part)
const dropdownTrigger = screen.getByRole('button', { name: 'down' });
userEvent.click(dropdownTrigger);
// Check that the dropdown menu option is visible
await waitFor(() => {
expect(screen.getByText('Create dataset only')).toBeVisible();
});
});
test('navigates to chart creation when main button is clicked', async () => {
mockCreateResource.mockResolvedValue(123); // Mock successful dataset creation
render(<Footer {...mockPropsWithDataset} />, { useRedux: true });
const createButton = screen.getByRole('button', {
name: /Create dataset and create chart/i,
});
userEvent.click(createButton);
await waitFor(() => {
expect(mockCreateResource).toHaveBeenCalledWith({
database: '1',
catalog: undefined,
schema: 'public',
table_name: 'real_info',
});
expect(mockHistoryPush).toHaveBeenCalledWith(
'/chart/add/?dataset=real_info',
);
});
});
test('navigates to dataset list when "Create dataset only" menu option is clicked', async () => {
mockCreateResource.mockResolvedValue(123);
render(<Footer {...mockPropsWithDataset} />, { useRedux: true });
// Open dropdown menu
const dropdownTrigger = screen.getByRole('button', { name: 'down' });
userEvent.click(dropdownTrigger);
// Click the "Create dataset only" option
await waitFor(() => {
const datasetOnlyOption = screen.getByText('Create dataset only');
userEvent.click(datasetOnlyOption);
});
await waitFor(() => {
expect(mockCreateResource).toHaveBeenCalledWith({
database: '1',
catalog: undefined,
schema: 'public',
table_name: 'real_info',
});
expect(mockHistoryPush).toHaveBeenCalledWith('/tablemodelview/list/');
});
});
test('handles dataset creation failure gracefully', async () => {
mockCreateResource.mockResolvedValue(null); // Mock failed dataset creation
render(<Footer {...mockPropsWithDataset} />, { useRedux: true });
const createButton = screen.getByRole('button', {
name: /Create dataset and create chart/i,
});
userEvent.click(createButton);
await waitFor(() => {
expect(mockCreateResource).toHaveBeenCalled();
// Should not navigate if creation failed
expect(mockHistoryPush).not.toHaveBeenCalled();
});
});
test('passes correct data to createResource with catalog', async () => {
const mockPropsWithCatalog = {
...mockPropsWithDataset,
datasetObject: {
...mockPropsWithDataset.datasetObject,
catalog: 'test_catalog',
},
};
mockCreateResource.mockResolvedValue(456);
render(<Footer {...mockPropsWithCatalog} />, { useRedux: true });
const createButton = screen.getByRole('button', {
name: /Create dataset and create chart/i,
});
userEvent.click(createButton);
await waitFor(() => {
expect(mockCreateResource).toHaveBeenCalledWith({
database: '1',
catalog: 'test_catalog',
schema: 'public',
table_name: 'real_info',
});
});
});
});

View File

@@ -17,14 +17,8 @@
* under the License.
*/
import { useHistory } from 'react-router-dom';
import {
Button,
DropdownButton,
Menu,
Flex,
} from '@superset-ui/core/components';
import { t, useTheme } from '@superset-ui/core';
import { Icons } from '@superset-ui/core/components/Icons';
import { Button } from '@superset-ui/core/components';
import { t } from '@superset-ui/core';
import { useSingleViewResource } from 'src/views/CRUD/hooks';
import { logEvent } from 'src/logger/actions';
import withToasts from 'src/components/MessageToasts/withToasts';
@@ -61,7 +55,6 @@ function Footer({
datasets,
}: FooterProps) {
const history = useHistory();
const theme = useTheme();
const { createResource } = useSingleViewResource<Partial<DatasetObject>>(
'dataset',
t('dataset'),
@@ -92,7 +85,7 @@ function Footer({
const tooltipText = t('Select a database table.');
const onSave = (createChart: boolean = true) => {
const onSave = () => {
if (datasetObject) {
const data = {
database: datasetObject.db?.id,
@@ -107,57 +100,32 @@ function Footer({
if (typeof response === 'number') {
logEvent(LOG_ACTIONS_DATASET_CREATION_SUCCESS, datasetObject);
// When a dataset is created the response we get is its ID number
if (createChart) {
history.push(`/chart/add/?dataset=${datasetObject.table_name}`);
} else {
history.push('/tablemodelview/list/');
}
history.push(`/chart/add/?dataset=${datasetObject.table_name}`);
}
});
}
};
const onSaveOnly = () => {
onSave(false);
};
const CREATE_DATASET_TEXT = t('Create dataset and create chart');
const CREATE_DATASET_ONLY_TEXT = t('Create dataset only');
const disabledCheck =
!datasetObject?.table_name ||
!hasColumns ||
datasets?.includes(datasetObject?.table_name);
const dropdownMenu = (
<Menu>
<Menu.Item key="create-only" onClick={onSaveOnly}>
{CREATE_DATASET_ONLY_TEXT}
</Menu.Item>
</Menu>
);
return (
<Flex align="center" justify="flex-end" gap="8px">
<>
<Button buttonStyle="secondary" onClick={cancelButtonOnClick}>
{t('Cancel')}
</Button>
<DropdownButton
type="primary"
<Button
buttonStyle="primary"
disabled={disabledCheck}
tooltip={!datasetObject?.table_name ? tooltipText : undefined}
onClick={() => onSave(true)}
popupRender={() => dropdownMenu}
icon={
<Icons.DownOutlined
iconSize="xs"
iconColor={theme.colors.grayscale.light5}
/>
}
trigger={['click']}
onClick={onSave}
>
{CREATE_DATASET_TEXT}
</DropdownButton>
</Flex>
</Button>
</>
);
}

View File

@@ -30,7 +30,6 @@ def load_examples_run(
load_big_data: bool = False,
only_metadata: bool = False,
force: bool = False,
cleanup: bool = False,
) -> None:
if only_metadata:
logger.info("Loading examples metadata")
@@ -41,41 +40,51 @@ def load_examples_run(
# pylint: disable=import-outside-toplevel
import superset.examples.data_loading as examples
# Clear old examples if requested
if cleanup:
clear_old_examples()
examples.load_css_templates()
if load_test_data:
# Import test fixtures from tests directory
from tests.fixtures.examples.energy import load_energy
from tests.fixtures.examples.supported_charts_dashboard import (
load_supported_charts_dashboard,
)
from tests.fixtures.examples.tabbed_dashboard import load_tabbed_dashboard
logger.info("Loading energy related dataset")
load_energy(only_metadata, force)
examples.load_energy(only_metadata, force)
logger.info("Loading [World Bank's Health Nutrition and Population Stats]")
examples.load_world_bank_health_n_pop(only_metadata, force)
logger.info("Loading [Birth names]")
examples.load_birth_names(only_metadata, force)
if load_test_data:
logger.info("Loading [Tabbed dashboard]")
load_tabbed_dashboard(only_metadata)
examples.load_tabbed_dashboard(only_metadata)
logger.info("Loading [Supported Charts Dashboard]")
load_supported_charts_dashboard()
examples.load_supported_charts_dashboard()
else:
logger.info("Loading [Random long/lat data]")
examples.load_long_lat_data(only_metadata, force)
logger.info("Loading [Country Map data]")
examples.load_country_map_data(only_metadata, force)
if load_big_data:
# Import test fixture from tests directory
from tests.fixtures.examples.big_data import load_big_data as load_big_data_func
logger.info("Loading [San Francisco population polygons]")
examples.load_sf_population_polygons(only_metadata, force)
logger.info("Loading [Flights data]")
examples.load_flights(only_metadata, force)
logger.info("Loading [BART lines]")
examples.load_bart_lines(only_metadata, force)
logger.info("Loading [Misc Charts] dashboard")
examples.load_misc_dashboard()
logger.info("Loading DECK.gl demo")
examples.load_deck_dash()
if load_big_data:
logger.info("Loading big synthetic data for tests")
load_big_data_func()
examples.load_big_data()
# load examples that are stored as YAML config files
logger.info("Loading examples from YAML configuration files")
examples.load_examples_from_configs(force, load_test_data)
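The `from tests.fixtures...` imports above are deliberately deferred inside the function body (note the `pylint: disable=import-outside-toplevel`) so production installs never need the `tests/` package. A hedged stdlib sketch of that deferred-import pattern (module path and fallback behavior are illustrative):

```python
import importlib

def load_examples(load_test_data: bool = False) -> list[str]:
    loaded = ["energy", "birth_names"]
    if load_test_data:
        # Deferred import: the tests package only has to exist when the
        # flag is set, keeping production installs free of test fixtures.
        try:
            fixtures = importlib.import_module("tests.fixtures.examples")
        except ModuleNotFoundError:
            # Illustrative fallback for this sketch; the real loader would
            # fail loudly here because --load-test-data implies a dev checkout.
            return loaded
        loaded.append(fixtures.__name__)
    return loaded

assert load_examples() == ["energy", "birth_names"]
```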
@@ -103,222 +112,4 @@ def load_examples(
force: bool = False,
) -> None:
"""Loads a set of Slices and Dashboards and a supporting dataset"""
# Show deprecation warning
click.echo(
click.style(
"\nWARNING: 'superset load-examples' is deprecated. "
"Please use 'superset examples load' instead.\n",
fg="yellow",
),
err=True,
)
load_examples_run(load_test_data, load_big_data, only_metadata, force)
# New CLI structure
@click.group(name="examples", help="Manage example data")
def examples_cli() -> None:
"""Group for example-related commands."""
pass
@examples_cli.command(name="load", help="Load example data into the database")
@with_appcontext
@transaction()
@click.option("--load-test-data", "-t", is_flag=True, help="Load additional test data")
@click.option("--load-big-data", "-b", is_flag=True, help="Load additional big data")
@click.option(
"--only-metadata",
"-m",
is_flag=True,
help="Only load metadata, skip actual data",
)
@click.option(
"--force",
"-f",
is_flag=True,
help="Force load data even if table already exists",
)
def load(
load_test_data: bool = False,
load_big_data: bool = False,
only_metadata: bool = False,
force: bool = False,
) -> None:
"""Load example datasets, charts, and dashboards."""
load_examples_run(
load_test_data, load_big_data, only_metadata, force, cleanup=False
)
def clear_old_examples() -> bool:
"""
Clear old Python-generated examples.
Returns True if clear was performed, False otherwise.
"""
from superset import db
from superset.connectors.sqla.models import SqlaTable
from superset.examples.utils import _has_old_examples
from superset.models.core import Database
from superset.models.dashboard import Dashboard, dashboard_slices
from superset.models.slice import Slice
# Check if old examples exist
if not _has_old_examples():
logger.info("No old examples found to clear")
return False
# Find the examples database
examples_db = db.session.query(Database).filter_by(database_name="examples").first()
if not examples_db:
return False
logger.info("Found examples database (id=%s)", examples_db.id)
logger.info("Clearing old examples...")
# 1. Get all datasets from examples database
example_datasets = (
db.session.query(SqlaTable).filter_by(database_id=examples_db.id).all()
)
dataset_ids = [ds.id for ds in example_datasets]
logger.info("Found %d example datasets", len(example_datasets))
# 2. Find all charts using these datasets
example_charts = []
if dataset_ids:
example_charts = (
db.session.query(Slice)
.filter(
Slice.datasource_id.in_(dataset_ids),
Slice.datasource_type == "table",
)
.all()
)
logger.info("Found %d example charts", len(example_charts))
chart_ids = [chart.id for chart in example_charts]
# 3. Find dashboards that contain these charts
example_dashboards = []
if chart_ids:
# Get dashboards that have relationships with our example charts
example_dashboards = (
db.session.query(Dashboard)
.join(dashboard_slices)
.filter(dashboard_slices.c.slice_id.in_(chart_ids))
.distinct()
.all()
)
logger.info("Found %d example dashboards", len(example_dashboards))
# Remove dashboard-slice relationships first
db.session.execute(
dashboard_slices.delete().where(dashboard_slices.c.slice_id.in_(chart_ids))
)
logger.info(
"Removed dashboard-slice relationships for %d charts",
len(chart_ids),
)
# 4. Delete dashboards that are now empty (contain only example charts)
for dashboard in example_dashboards:
# Since we already deleted the relationships, check if dashboard is empty
remaining_charts = (
db.session.query(dashboard_slices.c.slice_id)
.filter(dashboard_slices.c.dashboard_id == dashboard.id)
.count()
)
if remaining_charts == 0:
db.session.delete(dashboard)
logger.info(
"Deleted dashboard: %s (slug: %s)",
dashboard.dashboard_title,
dashboard.slug,
)
else:
logger.info(
"Keeping dashboard %s as it contains non-example charts",
dashboard.dashboard_title,
)
# 5. Delete charts
for chart in example_charts:
db.session.delete(chart)
logger.info("Deleted %d example charts", len(example_charts))
# 6. Delete the database - this will cascade delete all datasets,
# columns, and metrics thanks to the cascade="all, delete-orphan"
db.session.delete(examples_db)
logger.info("Examples database and all related objects removed successfully")
return True
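The deletion order in `clear_old_examples` matters: association rows first, then dashboards left empty, then the charts themselves, and finally the database record whose cascade removes datasets, columns, and metrics. A stdlib `sqlite3` sketch of the same ordering (schema and IDs invented for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dashboards (id INTEGER PRIMARY KEY, title TEXT);
    CREATE TABLE slices (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE dashboard_slices (
        dashboard_id INTEGER REFERENCES dashboards(id),
        slice_id INTEGER REFERENCES slices(id)
    );
    INSERT INTO dashboards VALUES (1, 'Births');
    INSERT INTO slices VALUES (10, 'Trend');
    INSERT INTO dashboard_slices VALUES (1, 10);
""")

example_chart_ids = [10]
placeholders = ",".join("?" * len(example_chart_ids))

# 1. Remove association rows first so nothing references the charts.
conn.execute(
    f"DELETE FROM dashboard_slices WHERE slice_id IN ({placeholders})",
    example_chart_ids,
)
# 2. Delete only the dashboards left with no charts at all.
conn.execute(
    "DELETE FROM dashboards "
    "WHERE id NOT IN (SELECT dashboard_id FROM dashboard_slices)"
)
# 3. Finally delete the charts themselves.
conn.execute(
    f"DELETE FROM slices WHERE id IN ({placeholders})", example_chart_ids
)

assert conn.execute("SELECT COUNT(*) FROM dashboards").fetchone()[0] == 0
```

Dashboards that still reference non-example charts survive step 2, which is exactly the "Keeping dashboard ... as it contains non-example charts" branch above.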
@examples_cli.command(name="clear-old", help="Clear old Python-based example data")
@with_appcontext
@transaction()
@click.option(
"--confirm",
is_flag=True,
help="Skip confirmation prompt",
)
def clear_old(confirm: bool = False) -> None:
"""Clear old Python-generated example datasets, charts, and dashboards."""
if not confirm:
click.confirm(
"This will delete old Python-based example data. Are you sure?",
abort=True,
)
try:
if clear_old_examples():
logger.info("Old examples cleared successfully")
else:
logger.info("No old examples found to clear")
except Exception as e:
logger.error(f"Failed to clear old examples: {e}")
raise
@examples_cli.command(name="clear", help="Clear all example data (NOT YET IMPLEMENTED)")
@with_appcontext
def clear() -> None:
"""Clear all example data including YAML-based examples."""
click.echo(
click.style(
"Clearing YAML-based examples is NOT YET IMPLEMENTED.\n"
"Use 'superset examples clear-old' to remove old Python-based examples.",
fg="yellow",
)
)
@examples_cli.command(name="reload", help="Clear and reload example data")
@with_appcontext
@transaction()
@click.option("--load-test-data", "-t", is_flag=True, help="Load additional test data")
@click.option("--load-big-data", "-b", is_flag=True, help="Load additional big data")
@click.option(
"--only-metadata",
"-m",
is_flag=True,
help="Only load metadata, skip actual data",
)
@click.option(
"--force",
"-f",
is_flag=True,
help="Force load data even if table already exists",
)
def reload(
load_test_data: bool = False,
load_big_data: bool = False,
only_metadata: bool = False,
force: bool = False,
) -> None:
"""Clear existing examples and load fresh ones."""
# This is essentially the old --cleanup behavior
load_examples_run(load_test_data, load_big_data, only_metadata, force, cleanup=True)
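The new `examples` group nests `load`, `clear-old`, `clear`, and `reload` under one namespace while the legacy `load-examples` entry point only warns. Click may not be available outside a Superset install; the same shape with stdlib `argparse` subcommands (flag names mirror the options above, everything else is illustrative):

```python
import argparse

def build_parser() -> argparse.ArgumentParser:
    parser = argparse.ArgumentParser(prog="superset")
    groups = parser.add_subparsers(dest="group", required=True)

    # "examples" group, analogous to @click.group(name="examples")
    examples = groups.add_parser("examples", help="Manage example data")
    commands = examples.add_subparsers(dest="command", required=True)

    load = commands.add_parser("load", help="Load example data")
    load.add_argument("--load-test-data", "-t", action="store_true")
    load.add_argument("--load-big-data", "-b", action="store_true")
    load.add_argument("--only-metadata", "-m", action="store_true")
    load.add_argument("--force", "-f", action="store_true")

    commands.add_parser("clear-old", help="Clear old Python-based examples")
    return parser

args = build_parser().parse_args(["examples", "load", "--force"])
assert args.group == "examples" and args.command == "load"
```

The nested-group layout is what lets `superset examples reload` reuse `load_examples_run(..., cleanup=True)` without reintroducing a `--cleanup` flag on the deprecated top-level command.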

View File

@@ -16,7 +16,6 @@
# under the License.
import copy
import logging
from inspect import isclass
from typing import Any
@@ -28,8 +27,6 @@ from superset.models.slice import Slice
from superset.utils import json
from superset.utils.core import AnnotationType, get_user
logger = logging.getLogger(__name__)
def filter_chart_annotations(chart_config: dict[str, Any]) -> None:
"""
@@ -66,13 +63,10 @@ def import_chart(
if not overwrite or not can_write:
return existing
config["id"] = existing.id
logger.info(f"Updating existing chart: {config.get('slice_name')}")
elif not can_write:
raise ImportFailedError(
"Chart doesn't exist and user doesn't have permission to create charts"
)
else:
logger.info(f"Creating new chart: {config.get('slice_name')}")
filter_chart_annotations(config)

View File

@@ -123,9 +123,6 @@ class ExportDashboardsCommand(ExportModelsCommand):
include_defaults=True,
export_uuids=True,
)
# Remove theme_id from export to make dashboards theme-free
payload.pop("theme_id", None)
# TODO (betodealmeida): move this logic to export_to_dict once this
# becomes the default export endpoint
for key, new_name in JSON_KEYS.items():

View File

@@ -166,14 +166,11 @@ def import_dashboard( # noqa: C901
elif not overwrite or not can_write:
return existing
config["id"] = existing.id
logger.info(f"Updating existing dashboard: {config.get('dashboard_title')}")
elif not can_write:
raise ImportFailedError(
"Dashboard doesn't exist and user doesn't "
"have permission to create dashboards"
)
else:
logger.info(f"Creating new dashboard: {config.get('dashboard_title')}")
# TODO (betodealmeida): move this logic to import_from_dict
config = config.copy()

View File

@@ -46,13 +46,10 @@ def import_database(
if not overwrite or not can_write:
return existing
config["id"] = existing.id
logger.info(f"Updating existing database: {config.get('database_name')}")
elif not can_write:
raise ImportFailedError(
"Database doesn't exist and user doesn't have permission to create databases" # noqa: E501
)
else:
logger.info(f"Creating new database: {config.get('database_name')}")
# Check if this URI is allowed
if app.config["PREVENT_UNSAFE_DB_CONNECTIONS"]:
try:

View File

@@ -124,13 +124,11 @@ def import_dataset( # noqa: C901
if not overwrite or not can_write:
return existing
config["id"] = existing.id
logger.info(f"Updating existing dataset: {config.get('table_name')}")
elif not can_write:
raise ImportFailedError(
"Dataset doesn't exist and user doesn't have permission to create datasets"
)
else:
logger.info(f"Creating new dataset: {config.get('table_name')}")
# TODO (betodealmeida): move this logic to import_from_dict
config = config.copy()
@@ -211,12 +209,7 @@ def load_data(data_uri: str, dataset: SqlaTable, database: Database) -> None:
data = request.urlopen(data_uri) # pylint: disable=consider-using-with # noqa: S310
if data_uri.endswith(".gz"):
data = gzip.open(data)
# Determine file format based on URI
if ".json" in data_uri:
df = pd.read_json(data, encoding="utf-8")
else:
df = pd.read_csv(data, encoding="utf-8")
df = pd.read_csv(data, encoding="utf-8")
dtype = get_dtype(df, dataset)
# convert temporal columns

View File

@@ -195,5 +195,4 @@ class ImportExamplesCommand(ImportModelsCommand):
{"dashboard_id": dashboard_id, "slice_id": chart_id}
for (dashboard_id, chart_id) in dashboard_chart_ids
]
if values:
db.session.execute(dashboard_slices.insert(), values)
db.session.execute(dashboard_slices.insert(), values)

View File

@@ -1155,7 +1155,7 @@ class CeleryConfig: # pylint: disable=too-few-public-methods
}
CELERY_CONFIG: type[CeleryConfig] = CeleryConfig
CELERY_CONFIG: type[CeleryConfig] | None = CeleryConfig
# Set celery config to None to disable all the above configuration
# CELERY_CONFIG = None
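Widening the annotation to `type[CeleryConfig] | None` is what makes the commented-out `CELERY_CONFIG = None` opt-out type-check. A small sketch of code relying on that (names illustrative; `Optional[...]` spelled out for broader Python compatibility):

```python
from typing import Optional

class CeleryConfig:
    # Minimal stand-in for Superset's nested config class.
    broker_url = "redis://localhost:6379/0"

# The widened annotation: None now signals "no task queue configured".
CELERY_CONFIG: Optional[type[CeleryConfig]] = CeleryConfig

def is_async_enabled(config: Optional[type[CeleryConfig]]) -> bool:
    # Downstream code must branch on None instead of assuming a class.
    return config is not None

assert is_async_enabled(CELERY_CONFIG)
assert not is_async_enabled(None)
```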

View File

@@ -0,0 +1,71 @@
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership. The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.
import logging
import polyline
from sqlalchemy import inspect, String, Text
from superset import db
from superset.sql.parse import Table
from superset.utils import json
from ..utils.database import get_example_database # noqa: TID252
from .helpers import get_table_connector_registry, read_example_data
logger = logging.getLogger(__name__)
def load_bart_lines(only_metadata: bool = False, force: bool = False) -> None:
tbl_name = "bart_lines"
database = get_example_database()
with database.get_sqla_engine() as engine:
schema = inspect(engine).default_schema_name
table_exists = database.has_table(Table(tbl_name, schema))
if not only_metadata and (not table_exists or force):
df = read_example_data(
"examples://bart-lines.json.gz", encoding="latin-1", compression="gzip"
)
df["path_json"] = df.path.map(json.dumps)
df["polyline"] = df.path.map(polyline.encode)
del df["path"]
df.to_sql(
tbl_name,
engine,
schema=schema,
if_exists="replace",
chunksize=500,
dtype={
"color": String(255),
"name": String(255),
"polyline": Text,
"path_json": Text,
},
index=False,
)
logger.debug(f"Creating table {tbl_name} reference")
table = get_table_connector_registry()
tbl = db.session.query(table).filter_by(table_name=tbl_name).first()
if not tbl:
tbl = table(table_name=tbl_name, schema=schema)
db.session.add(tbl)
tbl.description = "BART lines"
tbl.database = database
tbl.filter_select_enabled = True
tbl.fetch_metadata()
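The loader above stores each BART path twice: as JSON text (`path_json`) and as an encoded polyline (`polyline.encode`), then drops the raw `path` column. Without pandas or the third-party `polyline` package, the JSON half of that transform reduces to (route data invented for illustration):

```python
import json

# Each line is a sequence of (lat, lon) points; the loader keeps a JSON
# serialization alongside the polyline encoding (omitted here).
paths = {
    "Red": [(37.7749, -122.4194), (37.8044, -122.2712)],
}

path_json = {name: json.dumps(points) for name, points in paths.items()}

# JSON round-trips tuples back as lists, which is what the Text column stores.
assert json.loads(path_json["Red"]) == [[37.7749, -122.4194], [37.8044, -122.2712]]
```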

View File

@@ -0,0 +1,869 @@
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership. The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.
import logging
import textwrap
from typing import Union
import pandas as pd
from sqlalchemy import DateTime, inspect, String
from sqlalchemy.sql import column
from superset import app, db, security_manager
from superset.connectors.sqla.models import SqlaTable, SqlMetric, TableColumn
from superset.models.core import Database
from superset.models.dashboard import Dashboard
from superset.models.slice import Slice
from superset.sql.parse import Table
from superset.utils import json
from superset.utils.core import DatasourceType
from ..utils.database import get_example_database # noqa: TID252
from .helpers import (
get_slice_json,
get_table_connector_registry,
merge_slice,
misc_dash_slices,
read_example_data,
update_slice_ids,
)
logger = logging.getLogger(__name__)
def gen_filter(
subject: str, comparator: str, operator: str = "=="
) -> dict[str, Union[bool, str]]:
return {
"clause": "WHERE",
"comparator": comparator,
"expressionType": "SIMPLE",
"operator": operator,
"subject": subject,
    }


def load_data(tbl_name: str, database: Database, sample: bool = False) -> None:
pdf = read_example_data("examples://birth_names2.json.gz", compression="gzip")
# TODO(bkyryliuk): move load examples data into the pytest fixture
if database.backend == "presto":
pdf.ds = pd.to_datetime(pdf.ds, unit="ms")
        pdf.ds = pdf.ds.dt.strftime("%Y-%m-%d %H:%M:%S")
else:
pdf.ds = pd.to_datetime(pdf.ds, unit="ms")
pdf = pdf.head(100) if sample else pdf
with database.get_sqla_engine() as engine:
schema = inspect(engine).default_schema_name
pdf.to_sql(
tbl_name,
engine,
schema=schema,
if_exists="replace",
chunksize=500,
dtype={
# TODO(bkyryliuk): use TIMESTAMP type for presto
"ds": DateTime if database.backend != "presto" else String(255),
"gender": String(16),
"state": String(10),
"name": String(255),
},
method="multi",
index=False,
)
logger.debug("Done loading table!")
    logger.debug("-" * 80)


def load_birth_names(
only_metadata: bool = False, force: bool = False, sample: bool = False
) -> None:
    """Load the birth names dataset from a gzipped JSON file in the repo"""
database = get_example_database()
with database.get_sqla_engine() as engine:
schema = inspect(engine).default_schema_name
tbl_name = "birth_names"
table_exists = database.has_table(Table(tbl_name, schema))
if not only_metadata and (not table_exists or force):
load_data(tbl_name, database, sample=sample)
table = get_table_connector_registry()
obj = db.session.query(table).filter_by(table_name=tbl_name, schema=schema).first()
if not obj:
logger.debug(f"Creating table [{tbl_name}] reference")
obj = table(table_name=tbl_name, schema=schema)
db.session.add(obj)
_set_table_metadata(obj, database)
_add_table_metrics(obj)
slices, _ = create_slices(obj)
    create_dashboard(slices)


def _set_table_metadata(datasource: SqlaTable, database: "Database") -> None:
datasource.main_dttm_col = "ds"
datasource.database = database
datasource.filter_select_enabled = True
    datasource.fetch_metadata()


def _add_table_metrics(datasource: SqlaTable) -> None:
# By accessing the attribute first, we make sure `datasource.columns` and
# `datasource.metrics` are already loaded. Otherwise accessing them later
# may trigger an unnecessary and unexpected `after_update` event.
columns, metrics = datasource.columns, datasource.metrics
if not any(col.column_name == "num_california" for col in columns):
col_state = str(column("state").compile(db.engine))
col_num = str(column("num").compile(db.engine))
columns.append(
TableColumn(
column_name="num_california",
expression=f"CASE WHEN {col_state} = 'CA' THEN {col_num} ELSE 0 END",
)
)
if not any(col.metric_name == "sum__num" for col in metrics):
col = str(column("num").compile(db.engine))
metrics.append(SqlMetric(metric_name="sum__num", expression=f"SUM({col})"))
for col in columns:
if col.column_name == "ds": # type: ignore
col.is_dttm = True # type: ignore
break
datasource.columns = columns
    datasource.metrics = metrics


def create_slices(tbl: SqlaTable) -> tuple[list[Slice], list[Slice]]:
owner = security_manager.get_user_by_id(1)
metrics = [
{
"expressionType": "SIMPLE",
"column": {"column_name": "num", "type": "BIGINT"},
"aggregate": "SUM",
"label": "Births",
"optionName": "metric_11",
}
]
metric = "sum__num"
defaults = {
"compare_lag": "10",
"compare_suffix": "o10Y",
"limit": "25",
"granularity_sqla": "ds",
"groupby": [],
"row_limit": app.config["ROW_LIMIT"],
"time_range": "100 years ago : now",
"viz_type": "table",
"markup_type": "markdown",
}
default_query_context = {
"result_format": "json",
"result_type": "full",
"datasource": {
"id": tbl.id,
"type": "table",
},
"queries": [
{
"columns": [],
"metrics": [],
},
],
}
slice_kwargs = {
"datasource_id": tbl.id,
"datasource_type": DatasourceType.TABLE,
}
logger.debug("Creating some slices")
slices = [
Slice(
**slice_kwargs,
slice_name="Participants",
viz_type="big_number",
params=get_slice_json(
defaults,
viz_type="big_number",
granularity_sqla="ds",
compare_lag="5",
compare_suffix="over 5Y",
metric=metric,
),
owners=[],
),
Slice(
**slice_kwargs,
slice_name="Genders",
viz_type="pie",
params=get_slice_json(
defaults, viz_type="pie", groupby=["gender"], metric=metric
),
owners=[],
),
Slice(
**slice_kwargs,
slice_name="Trends",
viz_type="echarts_timeseries_line",
params=get_slice_json(
defaults,
viz_type="echarts_timeseries_line",
groupby=["name"],
granularity_sqla="ds",
rich_tooltip=True,
show_legend=True,
metrics=metrics,
),
owners=[],
),
Slice(
**slice_kwargs,
slice_name="Genders by State",
viz_type="echarts_timeseries_bar",
params=get_slice_json(
defaults,
adhoc_filters=[
{
"clause": "WHERE",
"expressionType": "SIMPLE",
"filterOptionName": "2745eae5",
"comparator": ["other"],
"operator": "NOT IN",
"subject": "state",
}
],
viz_type="echarts_timeseries_bar",
metrics=[
{
"expressionType": "SIMPLE",
"column": {"column_name": "num_boys", "type": "BIGINT(20)"},
"aggregate": "SUM",
"label": "Boys",
"optionName": "metric_11",
},
{
"expressionType": "SIMPLE",
"column": {"column_name": "num_girls", "type": "BIGINT(20)"},
"aggregate": "SUM",
"label": "Girls",
"optionName": "metric_12",
},
],
groupby=["state"],
),
owners=[],
),
Slice(
**slice_kwargs,
slice_name="Girls",
viz_type="table",
params=get_slice_json(
defaults,
groupby=["name"],
adhoc_filters=[gen_filter("gender", "girl")],
row_limit=50,
timeseries_limit_metric=metric,
metrics=[metric],
),
owners=[],
),
Slice(
**slice_kwargs,
slice_name="Girl Name Cloud",
viz_type="word_cloud",
params=get_slice_json(
defaults,
viz_type="word_cloud",
size_from="10",
series="name",
size_to="70",
rotation="square",
limit="100",
adhoc_filters=[gen_filter("gender", "girl")],
metric=metric,
),
owners=[],
),
Slice(
**slice_kwargs,
slice_name="Boys",
viz_type="table",
params=get_slice_json(
defaults,
groupby=["name"],
adhoc_filters=[gen_filter("gender", "boy")],
row_limit=50,
timeseries_limit_metric=metric,
metrics=[metric],
),
owners=[],
),
Slice(
**slice_kwargs,
slice_name="Boy Name Cloud",
viz_type="word_cloud",
params=get_slice_json(
defaults,
viz_type="word_cloud",
size_from="10",
series="name",
size_to="70",
rotation="square",
limit="100",
adhoc_filters=[gen_filter("gender", "boy")],
metric=metric,
),
owners=[],
),
Slice(
**slice_kwargs,
slice_name="Top 10 Girl Name Share",
viz_type="echarts_area",
params=get_slice_json(
defaults,
adhoc_filters=[gen_filter("gender", "girl")],
comparison_type="values",
groupby=["name"],
limit=10,
stacked_style="expand",
time_grain_sqla="P1D",
viz_type="echarts_area",
                x_axis_format="smart_date",
metrics=metrics,
),
owners=[],
),
Slice(
**slice_kwargs,
slice_name="Top 10 Boy Name Share",
viz_type="echarts_area",
params=get_slice_json(
defaults,
adhoc_filters=[gen_filter("gender", "boy")],
comparison_type="values",
groupby=["name"],
limit=10,
stacked_style="expand",
time_grain_sqla="P1D",
viz_type="echarts_area",
                x_axis_format="smart_date",
metrics=metrics,
),
owners=[],
),
Slice(
**slice_kwargs,
slice_name="Pivot Table v2",
viz_type="pivot_table_v2",
params=get_slice_json(
defaults,
viz_type="pivot_table_v2",
groupbyRows=["name"],
groupbyColumns=["state"],
metrics=[metric],
),
query_context=get_slice_json(
default_query_context,
queries=[
{
"columns": ["name", "state"],
"metrics": [metric],
}
],
),
owners=[],
),
]
misc_slices = [
Slice(
**slice_kwargs,
slice_name="Average and Sum Trends",
viz_type="mixed_timeseries",
params=get_slice_json(
defaults,
viz_type="mixed_timeseries",
metrics=[
{
"expressionType": "SIMPLE",
"column": {"column_name": "num", "type": "BIGINT(20)"},
"aggregate": "AVG",
"label": "AVG(num)",
"optionName": "metric_vgops097wej_g8uff99zhk7",
}
],
metrics_b=["sum__num"],
granularity_sqla="ds",
yAxisIndex=0,
yAxisIndexB=1,
),
owners=[],
),
Slice(
**slice_kwargs,
slice_name="Num Births Trend",
viz_type="echarts_timeseries_line",
params=get_slice_json(
defaults, viz_type="echarts_timeseries_line", metrics=metrics
),
owners=[],
),
Slice(
**slice_kwargs,
slice_name="Daily Totals",
viz_type="table",
params=get_slice_json(
defaults,
groupby=["ds"],
time_range="1983 : 2023",
viz_type="table",
metrics=metrics,
),
query_context=get_slice_json(
default_query_context,
queries=[
{
"columns": ["ds"],
"metrics": metrics,
"time_range": "1983 : 2023",
}
],
),
owners=[],
),
Slice(
**slice_kwargs,
slice_name="Number of California Births",
viz_type="big_number_total",
params=get_slice_json(
defaults,
metric={
"expressionType": "SIMPLE",
"column": {
"column_name": "num_california",
"expression": "CASE WHEN state = 'CA' THEN num ELSE 0 END",
},
"aggregate": "SUM",
"label": "SUM(num_california)",
},
viz_type="big_number_total",
granularity_sqla="ds",
),
owners=[],
),
Slice(
**slice_kwargs,
slice_name="Top 10 California Names Timeseries",
viz_type="echarts_timeseries_line",
params=get_slice_json(
defaults,
metrics=[
{
"expressionType": "SIMPLE",
"column": {
"column_name": "num_california",
"expression": "CASE WHEN state = 'CA' THEN num ELSE 0 END",
},
"aggregate": "SUM",
"label": "SUM(num_california)",
}
],
viz_type="echarts_timeseries_line",
granularity_sqla="ds",
groupby=["name"],
timeseries_limit_metric={
"expressionType": "SIMPLE",
"column": {
"column_name": "num_california",
"expression": "CASE WHEN state = 'CA' THEN num ELSE 0 END",
},
"aggregate": "SUM",
"label": "SUM(num_california)",
},
limit="10",
),
owners=[owner] if owner else [],
),
Slice(
**slice_kwargs,
slice_name="Names Sorted by Num in California",
viz_type="table",
params=get_slice_json(
defaults,
metrics=metrics,
groupby=["name"],
row_limit=50,
timeseries_limit_metric={
"expressionType": "SIMPLE",
"column": {
"column_name": "num_california",
"expression": "CASE WHEN state = 'CA' THEN num ELSE 0 END",
},
"aggregate": "SUM",
"label": "SUM(num_california)",
},
),
owners=[],
),
Slice(
**slice_kwargs,
slice_name="Number of Girls",
viz_type="big_number_total",
params=get_slice_json(
defaults,
metric=metric,
viz_type="big_number_total",
granularity_sqla="ds",
adhoc_filters=[gen_filter("gender", "girl")],
subheader="total female participants",
),
owners=[],
),
Slice(
**slice_kwargs,
slice_name="Pivot Table",
viz_type="pivot_table_v2",
params=get_slice_json(
defaults,
viz_type="pivot_table_v2",
groupbyRows=["name"],
groupbyColumns=["state"],
metrics=metrics,
),
owners=[],
),
]
for slc in slices:
merge_slice(slc)
for slc in misc_slices:
merge_slice(slc)
misc_dash_slices.add(slc.slice_name)
    return slices, misc_slices


def create_dashboard(slices: list[Slice]) -> Dashboard:
logger.debug("Creating a dashboard")
dash = db.session.query(Dashboard).filter_by(slug="births").first()
if not dash:
dash = Dashboard()
db.session.add(dash)
dash.published = True
dash.json_metadata = textwrap.dedent(
"""\
{
"label_colors": {
"Girls": "#FF69B4",
"Boys": "#ADD8E6",
"girl": "#FF69B4",
"boy": "#ADD8E6"
}
}"""
)
# pylint: disable=line-too-long
pos = json.loads( # noqa: TID251
textwrap.dedent(
"""\
{
"CHART-6GdlekVise": {
"children": [],
"id": "CHART-6GdlekVise",
"meta": {
"chartId": 5547,
"height": 50,
"sliceName": "Top 10 Girl Name Share",
"width": 5
},
"parents": [
"ROOT_ID",
"GRID_ID",
"ROW-eh0w37bWbR"
],
"type": "CHART"
},
"CHART-6n9jxb30JG": {
"children": [],
"id": "CHART-6n9jxb30JG",
"meta": {
"chartId": 5540,
"height": 36,
"sliceName": "Genders by State",
"width": 5
},
"parents": [
"ROOT_ID",
"GRID_ID",
"ROW--EyBZQlDi"
],
"type": "CHART"
},
"CHART-Jj9qh1ol-N": {
"children": [],
"id": "CHART-Jj9qh1ol-N",
"meta": {
"chartId": 5545,
"height": 50,
"sliceName": "Boy Name Cloud",
"width": 4
},
"parents": [
"ROOT_ID",
"GRID_ID",
"ROW-kzWtcvo8R1"
],
"type": "CHART"
},
"CHART-ODvantb_bF": {
"children": [],
"id": "CHART-ODvantb_bF",
"meta": {
"chartId": 5548,
"height": 50,
"sliceName": "Top 10 Boy Name Share",
"width": 5
},
"parents": [
"ROOT_ID",
"GRID_ID",
"ROW-kzWtcvo8R1"
],
"type": "CHART"
},
"CHART-PAXUUqwmX9": {
"children": [],
"id": "CHART-PAXUUqwmX9",
"meta": {
"chartId": 5538,
"height": 34,
"sliceName": "Genders",
"width": 3
},
"parents": [
"ROOT_ID",
"GRID_ID",
"ROW-2n0XgiHDgs"
],
"type": "CHART"
},
"CHART-_T6n_K9iQN": {
"children": [],
"id": "CHART-_T6n_K9iQN",
"meta": {
"chartId": 5539,
"height": 36,
"sliceName": "Trends",
"width": 7
},
"parents": [
"ROOT_ID",
"GRID_ID",
"ROW--EyBZQlDi"
],
"type": "CHART"
},
"CHART-eNY0tcE_ic": {
"children": [],
"id": "CHART-eNY0tcE_ic",
"meta": {
"chartId": 5537,
"height": 34,
"sliceName": "Participants",
"width": 3
},
"parents": [
"ROOT_ID",
"GRID_ID",
"ROW-2n0XgiHDgs"
],
"type": "CHART"
},
"CHART-g075mMgyYb": {
"children": [],
"id": "CHART-g075mMgyYb",
"meta": {
"chartId": 5541,
"height": 50,
"sliceName": "Girls",
"width": 3
},
"parents": [
"ROOT_ID",
"GRID_ID",
"ROW-eh0w37bWbR"
],
"type": "CHART"
},
"CHART-n-zGGE6S1y": {
"children": [],
"id": "CHART-n-zGGE6S1y",
"meta": {
"chartId": 5542,
"height": 50,
"sliceName": "Girl Name Cloud",
"width": 4
},
"parents": [
"ROOT_ID",
"GRID_ID",
"ROW-eh0w37bWbR"
],
"type": "CHART"
},
"CHART-vJIPjmcbD3": {
"children": [],
"id": "CHART-vJIPjmcbD3",
"meta": {
"chartId": 5543,
"height": 50,
"sliceName": "Boys",
"width": 3
},
"parents": [
"ROOT_ID",
"GRID_ID",
"ROW-kzWtcvo8R1"
],
"type": "CHART"
},
"DASHBOARD_VERSION_KEY": "v2",
"GRID_ID": {
"children": [
"ROW-2n0XgiHDgs",
"ROW--EyBZQlDi",
"ROW-eh0w37bWbR",
"ROW-kzWtcvo8R1"
],
"id": "GRID_ID",
"parents": [
"ROOT_ID"
],
"type": "GRID"
},
"HEADER_ID": {
"id": "HEADER_ID",
"meta": {
"text": "Births"
},
"type": "HEADER"
},
"MARKDOWN-zaflB60tbC": {
"children": [],
"id": "MARKDOWN-zaflB60tbC",
"meta": {
"code": "<div style=\\"text-align:center\\"> <h1>Birth Names Dashboard</h1> <img src=\\"/static/assets/images/babies.png\\" style=\\"width:50%;\\"></div>",
"height": 34,
"width": 6
},
"parents": [
"ROOT_ID",
"GRID_ID",
"ROW-2n0XgiHDgs"
],
"type": "MARKDOWN"
},
"ROOT_ID": {
"children": [
"GRID_ID"
],
"id": "ROOT_ID",
"type": "ROOT"
},
"ROW--EyBZQlDi": {
"children": [
"CHART-_T6n_K9iQN",
"CHART-6n9jxb30JG"
],
"id": "ROW--EyBZQlDi",
"meta": {
"background": "BACKGROUND_TRANSPARENT"
},
"parents": [
"ROOT_ID",
"GRID_ID"
],
"type": "ROW"
},
"ROW-2n0XgiHDgs": {
"children": [
"CHART-eNY0tcE_ic",
"MARKDOWN-zaflB60tbC",
"CHART-PAXUUqwmX9"
],
"id": "ROW-2n0XgiHDgs",
"meta": {
"background": "BACKGROUND_TRANSPARENT"
},
"parents": [
"ROOT_ID",
"GRID_ID"
],
"type": "ROW"
},
"ROW-eh0w37bWbR": {
"children": [
"CHART-g075mMgyYb",
"CHART-n-zGGE6S1y",
"CHART-6GdlekVise"
],
"id": "ROW-eh0w37bWbR",
"meta": {
"background": "BACKGROUND_TRANSPARENT"
},
"parents": [
"ROOT_ID",
"GRID_ID"
],
"type": "ROW"
},
"ROW-kzWtcvo8R1": {
"children": [
"CHART-vJIPjmcbD3",
"CHART-Jj9qh1ol-N",
"CHART-ODvantb_bF"
],
"id": "ROW-kzWtcvo8R1",
"meta": {
"background": "BACKGROUND_TRANSPARENT"
},
"parents": [
"ROOT_ID",
"GRID_ID"
],
"type": "ROW"
}
}
""" # noqa: E501
)
)
# pylint: enable=line-too-long
    # dashboard v2 doesn't allow adding markup slices
dash.slices = [slc for slc in slices if slc.viz_type != "markup"]
update_slice_ids(pos)
dash.dashboard_title = "USA Births Names"
dash.position_json = json.dumps(pos, indent=4) # noqa: TID251
dash.slug = "births"
return dash
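The `gen_filter` helper defined near the top of this file builds the SIMPLE adhoc-filter dicts that `create_slices` passes as `adhoc_filters`. A standalone copy (verbatim, no Superset imports) shows the exact shape it produces:

```python
from typing import Union


def gen_filter(
    subject: str, comparator: str, operator: str = "=="
) -> dict[str, Union[bool, str]]:
    # Verbatim copy of the helper above, for illustration only.
    return {
        "clause": "WHERE",
        "comparator": comparator,
        "expressionType": "SIMPLE",
        "operator": operator,
        "subject": subject,
    }


f = gen_filter("gender", "girl")
print(f)
# → {'clause': 'WHERE', 'comparator': 'girl', 'expressionType': 'SIMPLE', 'operator': '==', 'subject': 'gender'}
```

This matches the `adhoc_filters` entries visible in the YAML fixtures below (e.g. the `comparator: girl` / `operator: ==` blocks); filters with non-equality operators, like the `NOT IN` state filter, are spelled out inline rather than built through this helper.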

View File

@@ -1,26 +0,0 @@
slice_name: Birth in France by department in 2016
description: null
certified_by: null
certification_details: null
viz_type: country_map
params:
entity: DEPT_ID
granularity_sqla: ''
metric:
aggregate: AVG
column:
column_name: '2004'
type: INT
expressionType: SIMPLE
label: Boys
optionName: metric_112342
row_limit: 500000
select_country: france
since: ''
until: ''
viz_type: country_map
query_context: null
cache_timeout: null
uuid: 6bd584f1-0ef5-44fc-8a05-61400f83bb62
version: 1.0.0
dataset_uuid: c21dd48d-9a4b-4a08-a926-47c3601c2a8d

View File

@@ -1,21 +0,0 @@
slice_name: OSM Long/Lat
description: null
certified_by: null
certification_details: null
viz_type: osm
params:
all_columns:
- occupancy
all_columns_x: LON
all_columns_y: LAT
granularity_sqla: day
mapbox_style: https://tile.openstreetmap.org/{z}/{x}/{y}.png
row_limit: 500000
since: '2014-01-01'
until: now
viz_type: mapbox
query_context: null
cache_timeout: null
uuid: a4e90860-c8f5-4c50-8c04-06b2e144809c
version: 1.0.0
dataset_uuid: 605eaec7-ebf1-4fea-ac4b-07652fcb46e7

View File

@@ -1,31 +0,0 @@
slice_name: Parallel Coordinates
description: null
certified_by: null
certification_details: null
viz_type: para
params:
compare_lag: '10'
compare_suffix: o10Y
country_fieldtype: cca3
entity: country_code
granularity_sqla: year
groupby: []
limit: 100
markup_type: markdown
metrics:
- sum__SP_POP_TOTL
- sum__SP_RUR_TOTL_ZS
- sum__SH_DYN_AIDS
row_limit: 50000
secondary_metric: sum__SP_POP_TOTL
series: country_name
show_bubbles: true
since: '2011-01-01'
time_range: '2014-01-01 : 2014-01-02'
until: '2012-01-01'
viz_type: para
query_context: null
cache_timeout: null
uuid: 041377c4-0ca9-4a40-8abd-befcd137c0dc
version: 1.0.0
dataset_uuid: 3b851597-e0e9-42a1-83e4-55547811742e

View File

@@ -1,30 +0,0 @@
slice_name: Pivot Table v2
description: null
certified_by: null
certification_details: null
viz_type: pivot_table_v2
params:
compare_lag: '10'
compare_suffix: o10Y
granularity_sqla: ds
groupby: []
groupbyColumns:
- state
groupbyRows:
- name
limit: '25'
markup_type: markdown
metrics:
- sum__num
row_limit: 50000
time_range: '100 years ago : now'
viz_type: pivot_table_v2
query_context: "{\n \"datasource\": {\n \"id\": 2,\n \"type\": \"\
table\"\n },\n \"queries\": [\n {\n \"columns\": [\n \
\ \"name\",\n \"state\"\n ],\n \
\ \"metrics\": [\n \"sum__num\"\n ]\n }\n ],\n\
\ \"result_format\": \"json\",\n \"result_type\": \"full\"\n}"
cache_timeout: null
uuid: 86778b63-19d8-4278-a79f-c90a1b31e162
version: 1.0.0
dataset_uuid: 4ec507ac-bece-4d2b-8dc3-cfb7c3515e76

View File

@@ -1,32 +0,0 @@
slice_name: Average and Sum Trends
description: null
certified_by: null
certification_details: null
viz_type: mixed_timeseries
params:
compare_lag: '10'
compare_suffix: o10Y
granularity_sqla: ds
groupby: []
limit: '25'
markup_type: markdown
metrics:
- aggregate: AVG
column:
column_name: num
type: BIGINT(20)
expressionType: SIMPLE
label: AVG(num)
optionName: metric_vgops097wej_g8uff99zhk7
metrics_b:
- sum__num
row_limit: 50000
time_range: '100 years ago : now'
viz_type: mixed_timeseries
yAxisIndex: 0
yAxisIndexB: 1
query_context: null
cache_timeout: null
uuid: 9c690f97-9196-5e01-bec9-8f4975ea5108
version: 1.0.0
dataset_uuid: 4ec507ac-bece-4d2b-8dc3-cfb7c3515e76

View File

@@ -1,31 +0,0 @@
slice_name: Boy Name Cloud
description: null
certified_by: null
certification_details: null
viz_type: word_cloud
params:
adhoc_filters:
- clause: WHERE
comparator: boy
expressionType: SIMPLE
operator: ==
subject: gender
compare_lag: '10'
compare_suffix: o10Y
granularity_sqla: ds
groupby: []
limit: '100'
markup_type: markdown
metric: sum__num
rotation: square
row_limit: 50000
series: name
size_from: '10'
size_to: '70'
time_range: '100 years ago : now'
viz_type: word_cloud
query_context: null
cache_timeout: null
uuid: 6994ec83-0cf2-4a26-97e2-1e30b0002aa0
version: 1.0.0
dataset_uuid: 4ec507ac-bece-4d2b-8dc3-cfb7c3515e76

View File

@@ -1,30 +0,0 @@
slice_name: Boys
description: null
certified_by: null
certification_details: null
viz_type: table
params:
adhoc_filters:
- clause: WHERE
comparator: boy
expressionType: SIMPLE
operator: ==
subject: gender
compare_lag: '10'
compare_suffix: o10Y
granularity_sqla: ds
groupby:
- name
limit: '25'
markup_type: markdown
metrics:
- sum__num
row_limit: 50
time_range: '100 years ago : now'
timeseries_limit_metric: sum__num
viz_type: table
query_context: null
cache_timeout: null
uuid: 0af97164-82f0-42bb-a611-7093e5c56596
version: 1.0.0
dataset_uuid: 4ec507ac-bece-4d2b-8dc3-cfb7c3515e76

View File

@@ -1,28 +0,0 @@
slice_name: Daily Totals
description: null
certified_by: null
certification_details: null
viz_type: table
params:
granularity_sqla: ds
groupby:
- ds
limit: '25'
markup_type: markdown
metrics:
- aggregate: SUM
column:
column_name: num
type: BIGINT
expressionType: SIMPLE
label: Births
optionName: metric_11
row_limit: 50
time_range: '1983 : 2023'
timeseries_limit_metric: sum__num
viz_type: table
query_context: null
cache_timeout: null
uuid: a3d4f2e1-8c9b-4d2a-9e7f-1b6c8d5e2f4a
version: 1.0.0
dataset_uuid: 4ec507ac-bece-4d2b-8dc3-cfb7c3515e76

View File

@@ -1,22 +0,0 @@
slice_name: Genders
description: null
certified_by: null
certification_details: null
viz_type: pie
params:
compare_lag: '10'
compare_suffix: o10Y
granularity_sqla: ds
groupby:
- gender
limit: '25'
markup_type: markdown
metric: sum__num
row_limit: 50000
time_range: '100 years ago : now'
viz_type: pie
query_context: null
cache_timeout: null
uuid: fb05dca0-bd3e-4953-a0a5-94b51de3a653
version: 1.0.0
dataset_uuid: 4ec507ac-bece-4d2b-8dc3-cfb7c3515e76

View File

@@ -1,44 +0,0 @@
slice_name: Genders by State
description: null
certified_by: null
certification_details: null
viz_type: echarts_timeseries_bar
params:
adhoc_filters:
- clause: WHERE
comparator:
- other
expressionType: SIMPLE
filterOptionName: 2745eae5
operator: NOT IN
subject: state
compare_lag: '10'
compare_suffix: o10Y
granularity_sqla: ds
groupby:
- state
limit: '25'
markup_type: markdown
metrics:
- aggregate: SUM
column:
column_name: num_boys
type: BIGINT(20)
expressionType: SIMPLE
label: Boys
optionName: metric_11
- aggregate: SUM
column:
column_name: num_girls
type: BIGINT(20)
expressionType: SIMPLE
label: Girls
optionName: metric_12
row_limit: 50000
time_range: '100 years ago : now'
viz_type: echarts_timeseries_bar
query_context: null
cache_timeout: null
uuid: 2cc25185-3d8c-494c-aa3c-14f081ac7e54
version: 1.0.0
dataset_uuid: 4ec507ac-bece-4d2b-8dc3-cfb7c3515e76

View File

@@ -1,31 +0,0 @@
slice_name: Girl Name Cloud
description: null
certified_by: null
certification_details: null
viz_type: word_cloud
params:
adhoc_filters:
- clause: WHERE
comparator: girl
expressionType: SIMPLE
operator: ==
subject: gender
compare_lag: '10'
compare_suffix: o10Y
granularity_sqla: ds
groupby: []
limit: '100'
markup_type: markdown
metric: sum__num
rotation: square
row_limit: 50000
series: name
size_from: '10'
size_to: '70'
time_range: '100 years ago : now'
viz_type: word_cloud
query_context: null
cache_timeout: null
uuid: ba6574fe-a6c0-41ef-9499-1ea6ff36bd2d
version: 1.0.0
dataset_uuid: 4ec507ac-bece-4d2b-8dc3-cfb7c3515e76

View File

@@ -1,30 +0,0 @@
slice_name: Girls
description: null
certified_by: null
certification_details: null
viz_type: table
params:
adhoc_filters:
- clause: WHERE
comparator: girl
expressionType: SIMPLE
operator: ==
subject: gender
compare_lag: '10'
compare_suffix: o10Y
granularity_sqla: ds
groupby:
- name
limit: '25'
markup_type: markdown
metrics:
- sum__num
row_limit: 50
time_range: '100 years ago : now'
timeseries_limit_metric: sum__num
viz_type: table
query_context: null
cache_timeout: null
uuid: 44cfa30e-af8e-4176-8612-4df0c0609516
version: 1.0.0
dataset_uuid: 4ec507ac-bece-4d2b-8dc3-cfb7c3515e76

View File

@@ -1,36 +0,0 @@
slice_name: Names Sorted by Num in California
description: null
certified_by: null
certification_details: null
viz_type: table
params:
compare_lag: '10'
compare_suffix: o10Y
granularity_sqla: ds
groupby:
- name
limit: '25'
markup_type: markdown
metrics:
- aggregate: SUM
column:
column_name: num
type: BIGINT
expressionType: SIMPLE
label: Births
optionName: metric_11
row_limit: 50
time_range: '100 years ago : now'
timeseries_limit_metric:
aggregate: SUM
column:
column_name: num_california
expression: CASE WHEN state = 'CA' THEN num ELSE 0 END
expressionType: SIMPLE
label: SUM(num_california)
viz_type: table
query_context: null
cache_timeout: null
uuid: e49ed2c4-b8a3-5736-bafe-4658790b113a
version: 1.0.0
dataset_uuid: 4ec507ac-bece-4d2b-8dc3-cfb7c3515e76

View File

@@ -1,31 +0,0 @@
slice_name: Num Births Trend
description: null
certified_by: null
certification_details: null
viz_type: echarts_timeseries_line
params:
compare_lag: '10'
compare_suffix: o10Y
granularity_sqla: ds
groupby:
- name
limit: '25'
markup_type: markdown
metrics:
- aggregate: SUM
column:
column_name: num
type: BIGINT
expressionType: SIMPLE
label: Births
optionName: metric_11
rich_tooltip: true
row_limit: 50000
show_legend: true
time_range: '100 years ago : now'
viz_type: echarts_timeseries_line
query_context: null
cache_timeout: null
uuid: 5b8c76e5-0e5e-45c1-b07e-3b2cb9b9c7e8
version: 1.0.0
dataset_uuid: 4ec507ac-bece-4d2b-8dc3-cfb7c3515e76

View File

@@ -1,27 +0,0 @@
slice_name: Number of California Births
description: null
certified_by: null
certification_details: null
viz_type: big_number_total
params:
compare_lag: '10'
compare_suffix: o10Y
granularity_sqla: ds
groupby: []
limit: '25'
markup_type: markdown
metric:
aggregate: SUM
column:
column_name: num_california
expression: CASE WHEN state = 'CA' THEN num ELSE 0 END
expressionType: SIMPLE
label: SUM(num_california)
row_limit: 50000
time_range: '100 years ago : now'
viz_type: big_number_total
query_context: null
cache_timeout: null
uuid: 400ee69f-eda4-5fe8-bc30-299184e08048
version: 1.0.0
dataset_uuid: 4ec507ac-bece-4d2b-8dc3-cfb7c3515e76

View File

@@ -1,28 +0,0 @@
slice_name: Number of Girls
description: null
certified_by: null
certification_details: null
viz_type: big_number_total
params:
adhoc_filters:
- clause: WHERE
comparator: girl
expressionType: SIMPLE
operator: ==
subject: gender
compare_lag: '10'
compare_suffix: o10Y
granularity_sqla: ds
groupby: []
limit: '25'
markup_type: markdown
metric: sum__num
row_limit: 50000
subheader: total female participants
time_range: '100 years ago : now'
viz_type: big_number_total
query_context: null
cache_timeout: null
uuid: 2f1a8720-7ea6-5b0f-b419-b75163f6bf17
version: 1.0.0
dataset_uuid: 4ec507ac-bece-4d2b-8dc3-cfb7c3515e76

View File

@@ -1,21 +0,0 @@
slice_name: Participants
description: null
certified_by: null
certification_details: null
viz_type: big_number
params:
compare_lag: '5'
compare_suffix: over 5Y
granularity_sqla: ds
groupby: []
limit: '25'
markup_type: markdown
metric: sum__num
row_limit: 50000
time_range: '100 years ago : now'
viz_type: big_number
query_context: null
cache_timeout: null
uuid: 89ae3c32-eafa-4466-82cf-8c4328420782
version: 1.0.0
dataset_uuid: 4ec507ac-bece-4d2b-8dc3-cfb7c3515e76

View File

@@ -1,32 +0,0 @@
slice_name: Pivot Table
description: null
certified_by: null
certification_details: null
viz_type: pivot_table_v2
params:
compare_lag: '10'
compare_suffix: o10Y
granularity_sqla: ds
groupby: []
groupbyColumns:
- state
groupbyRows:
- name
limit: '25'
markup_type: markdown
metrics:
- aggregate: SUM
column:
column_name: num
type: BIGINT
expressionType: SIMPLE
label: Births
optionName: metric_11
row_limit: 50000
time_range: '100 years ago : now'
viz_type: pivot_table_v2
query_context: null
cache_timeout: null
uuid: b9038f33-aea3-52de-840b-0a32f4c0eb41
version: 1.0.0
dataset_uuid: 4ec507ac-bece-4d2b-8dc3-cfb7c3515e76

View File

@@ -1,39 +0,0 @@
slice_name: Top 10 Boy Name Share
description: null
certified_by: null
certification_details: null
viz_type: echarts_area
params:
adhoc_filters:
- clause: WHERE
comparator: boy
expressionType: SIMPLE
operator: ==
subject: gender
compare_lag: '10'
compare_suffix: o10Y
comparison_type: values
granularity_sqla: ds
groupby:
- name
limit: 10
markup_type: markdown
metrics:
- aggregate: SUM
column:
column_name: num
type: BIGINT
expressionType: SIMPLE
label: Births
optionName: metric_11
row_limit: 50000
stacked_style: expand
time_grain_sqla: P1D
time_range: '100 years ago : now'
viz_type: echarts_area
  x_axis_format: smart_date
query_context: null
cache_timeout: null
uuid: f35cca46-bb11-440e-8ba1-7f021bfe52a7
version: 1.0.0
dataset_uuid: 4ec507ac-bece-4d2b-8dc3-cfb7c3515e76

View File

@@ -1,35 +0,0 @@
slice_name: Top 10 California Names Timeseries
description: null
certified_by: null
certification_details: null
viz_type: echarts_timeseries_line
params:
compare_lag: '10'
compare_suffix: o10Y
granularity_sqla: ds
groupby:
- name
limit: '10'
markup_type: markdown
metrics:
- aggregate: SUM
column:
column_name: num_california
expression: CASE WHEN state = 'CA' THEN num ELSE 0 END
expressionType: SIMPLE
label: SUM(num_california)
row_limit: 50000
time_range: '100 years ago : now'
timeseries_limit_metric:
aggregate: SUM
column:
column_name: num_california
expression: CASE WHEN state = 'CA' THEN num ELSE 0 END
expressionType: SIMPLE
label: SUM(num_california)
viz_type: echarts_timeseries_line
query_context: null
cache_timeout: null
uuid: 6a587b9e-e28b-5c2a-abb9-c6c1f4fd56b5
version: 1.0.0
dataset_uuid: 4ec507ac-bece-4d2b-8dc3-cfb7c3515e76

View File

@@ -1,39 +0,0 @@
slice_name: Top 10 Girl Name Share
description: null
certified_by: null
certification_details: null
viz_type: echarts_area
params:
adhoc_filters:
- clause: WHERE
comparator: girl
expressionType: SIMPLE
operator: ==
subject: gender
compare_lag: '10'
compare_suffix: o10Y
comparison_type: values
granularity_sqla: ds
groupby:
- name
limit: 10
markup_type: markdown
metrics:
- aggregate: SUM
column:
column_name: num
type: BIGINT
expressionType: SIMPLE
label: Births
optionName: metric_11
row_limit: 50000
stacked_style: expand
time_grain_sqla: P1D
time_range: '100 years ago : now'
viz_type: echarts_area
  x_axis_format: smart_date
query_context: null
cache_timeout: null
uuid: da76899a-d75c-467b-b0ce-cfa4819ed1b1
version: 1.0.0
dataset_uuid: 4ec507ac-bece-4d2b-8dc3-cfb7c3515e76

View File

@@ -1,31 +0,0 @@
slice_name: Trends
description: null
certified_by: null
certification_details: null
viz_type: echarts_timeseries_line
params:
compare_lag: '10'
compare_suffix: o10Y
granularity_sqla: ds
groupby:
- name
limit: '25'
markup_type: markdown
metrics:
- aggregate: SUM
column:
column_name: num
type: BIGINT
expressionType: SIMPLE
label: Births
optionName: metric_11
rich_tooltip: true
row_limit: 50000
show_legend: true
time_range: '100 years ago : now'
viz_type: echarts_timeseries_line
query_context: null
cache_timeout: null
uuid: c6024db9-1695-4aa6-b846-42d9c96bfcbf
version: 1.0.0
dataset_uuid: 4ec507ac-bece-4d2b-8dc3-cfb7c3515e76

View File

@@ -1,120 +0,0 @@
slice_name: Rise & Fall of Video Game Consoles
description: null
certified_by: null
certification_details: null
viz_type: echarts_area
params:
adhoc_filters: []
annotation_layers: []
bottom_margin: auto
color_scheme: supersetColors
comparison_type: values
contribution: false
datasource: 21__table
granularity_sqla: year
groupby:
- platform
label_colors:
'0': '#1FA8C9'
'1': '#454E7C'
'2600': '#666666'
3DO: '#B2B2B2'
3DS: '#D1C6BC'
Action: '#1FA8C9'
Adventure: '#454E7C'
DC: '#A38F79'
DS: '#8FD3E4'
Europe: '#5AC189'
Fighting: '#5AC189'
GB: '#FDE380'
GBA: '#ACE1C4'
GC: '#5AC189'
GEN: '#3CCCCB'
GG: '#EFA1AA'
Japan: '#FF7F44'
Microsoft Game Studios: '#D1C6BC'
Misc: '#FF7F44'
N64: '#1FA8C9'
NES: '#9EE5E5'
NG: '#A1A6BD'
Nintendo: '#D3B3DA'
North America: '#666666'
Other: '#E04355'
PC: '#EFA1AA'
PCFX: '#FDE380'
PS: '#A1A6BD'
PS2: '#FCC700'
PS3: '#3CCCCB'
PS4: '#B2B2B2'
PSP: '#FEC0A1'
PSV: '#FCC700'
Platform: '#666666'
Puzzle: '#E04355'
Racing: '#FCC700'
Role-Playing: '#A868B7'
SAT: '#A868B7'
SCD: '#8FD3E4'
SNES: '#454E7C'
Shooter: '#3CCCCB'
Simulation: '#A38F79'
Sports: '#8FD3E4'
Strategy: '#A1A6BD'
TG16: '#FEC0A1'
Take-Two Interactive: '#9EE5E5'
WS: '#ACE1C4'
Wii: '#A38F79'
WiiU: '#E04355'
X360: '#A868B7'
XB: '#D3B3DA'
XOne: '#FF7F44'
line_interpolation: linear
metrics:
- aggregate: SUM
column:
column_name: global_sales
description: null
expression: null
filterable: true
groupby: true
id: 887
is_dttm: false
optionName: _col_Global_Sales
python_date_format: null
type: DOUBLE PRECISION
verbose_name: null
expressionType: SIMPLE
hasCustomLabel: false
isNew: false
label: SUM(Global_Sales)
optionName: metric_ufl75addr8c_oqqhdumirpn
sqlExpression: null
order_desc: true
queryFields:
groupby: groupby
metrics: metrics
rich_tooltip: true
rolling_type: None
row_limit: null
show_brush: auto
show_legend: false
slice_id: 659
stacked_style: stream
time_grain_sqla: null
time_range: No filter
url_params:
preselect_filters: '{"1389": {"platform": ["PS", "PS2", "PS3", "PS4"], "genre":
null, "__time_range": "No filter"}}'
viz_type: echarts_area
x_axis_format: smart_date
x_axis_label: Year Published
x_axis_showminmax: true
x_ticks_layout: auto
y_axis_bounds:
- null
- null
y_axis_format: SMART_NUMBER
query_context: null
cache_timeout: null
uuid: 3d926244-6e32-5e42-8ade-7302b83a65d7
version: 1.0.0
dataset_uuid: 53d47c0c-c03d-47f0-b9ac-81225f808283

@@ -1,30 +0,0 @@
slice_name: Box plot
description: null
certified_by: null
certification_details: null
viz_type: box_plot
params:
compare_lag: '10'
compare_suffix: o10Y
country_fieldtype: cca3
entity: country_code
granularity_sqla: year
groupby:
- region
limit: '25'
markup_type: markdown
metrics:
- sum__SP_POP_TOTL
row_limit: 50000
show_bubbles: true
since: '1960-01-01'
time_range: '2014-01-01 : 2014-01-02'
until: now
viz_type: box_plot
whisker_options: Min/max (no outliers)
x_ticks_layout: staggered
query_context: null
cache_timeout: null
uuid: d31ba9c7-798b-4f84-87ef-ab31721680a8
version: 1.0.0
dataset_uuid: 3b851597-e0e9-42a1-83e4-55547811742e

@@ -1,29 +0,0 @@
slice_name: Growth Rate
description: null
certified_by: null
certification_details: null
viz_type: echarts_timeseries_line
params:
compare_lag: '10'
compare_suffix: o10Y
country_fieldtype: cca3
entity: country_code
granularity_sqla: year
groupby:
- country_name
limit: '25'
markup_type: markdown
metrics:
- sum__SP_POP_TOTL
num_period_compare: '10'
row_limit: 50000
show_bubbles: true
since: '1960-01-01'
time_range: '2014-01-01 : 2014-01-02'
until: '2014-01-02'
viz_type: echarts_timeseries_line
query_context: null
cache_timeout: null
uuid: cfcd7c5e-4759-4b28-bb7c-e2200508e978
version: 1.0.0
dataset_uuid: 3b851597-e0e9-42a1-83e4-55547811742e

@@ -1,51 +0,0 @@
slice_name: Life Expectancy VS Rural %
description: null
certified_by: null
certification_details: null
viz_type: bubble
params:
adhoc_filters:
- clause: WHERE
comparator:
- TCA
- MNP
- DMA
- MHL
- MCO
- SXM
- CYM
- TUV
- IMY
- KNA
- ASM
- ADO
- AMA
- PLW
expressionType: SIMPLE
filterOptionName: 2745eae5
operator: NOT IN
subject: country_code
compare_lag: '10'
compare_suffix: o10Y
country_fieldtype: cca3
entity: country_name
granularity_sqla: year
groupby: []
limit: 0
markup_type: markdown
max_bubble_size: '50'
row_limit: 50000
series: region
show_bubbles: true
since: '2011-01-01'
size: sum__SP_POP_TOTL
time_range: '2014-01-01 : 2014-01-02'
until: '2011-01-02'
viz_type: bubble
x: sum__SP_RUR_TOTL_ZS
y: sum__SP_DYN_LE00_IN
query_context: null
cache_timeout: null
uuid: fa927236-7b66-4d03-ae6c-463d2d394123
version: 1.0.0
dataset_uuid: 3b851597-e0e9-42a1-83e4-55547811742e

@@ -1,28 +0,0 @@
slice_name: Most Populated Countries
description: null
certified_by: null
certification_details: null
viz_type: table
params:
compare_lag: '10'
compare_suffix: o10Y
country_fieldtype: cca3
entity: country_code
granularity_sqla: year
groupby:
- country_name
limit: '25'
markup_type: markdown
metrics:
- sum__SP_POP_TOTL
row_limit: 50000
show_bubbles: true
since: '2014-01-01'
time_range: '2014-01-01 : 2014-01-02'
until: '2014-01-02'
viz_type: table
query_context: null
cache_timeout: null
uuid: 4183745e-1cc4-4f88-9ae6-973c69845ce4
version: 1.0.0
dataset_uuid: 3b851597-e0e9-42a1-83e4-55547811742e

@@ -1,36 +0,0 @@
slice_name: '% Rural'
description: null
certified_by: null
certification_details: null
viz_type: world_map
params:
compare_lag: '10'
compare_suffix: o10Y
country_fieldtype: cca3
entity: country_code
granularity_sqla: year
groupby: []
limit: '25'
markup_type: markdown
metric: sum__SP_RUR_TOTL_ZS
num_period_compare: '10'
row_limit: 50000
secondary_metric:
aggregate: SUM
column:
column_name: SP_RUR_TOTL
optionName: _col_SP_RUR_TOTL
type: DOUBLE
expressionType: SIMPLE
hasCustomLabel: true
label: Rural Population
show_bubbles: true
since: '2014-01-01'
time_range: '2014-01-01 : 2014-01-02'
until: '2014-01-02'
viz_type: world_map
query_context: null
cache_timeout: null
uuid: 8d889488-edb5-40cb-a69c-e2c14f009e2b
version: 1.0.0
dataset_uuid: 3b851597-e0e9-42a1-83e4-55547811742e

@@ -1,38 +0,0 @@
slice_name: Rural Breakdown
description: null
certified_by: null
certification_details: null
viz_type: sunburst_v2
params:
columns:
- region
- country_name
compare_lag: '10'
compare_suffix: o10Y
country_fieldtype: cca3
entity: country_code
granularity_sqla: year
groupby: []
limit: '25'
markup_type: markdown
metric: sum__SP_POP_TOTL
row_limit: 50000
secondary_metric:
aggregate: SUM
column:
column_name: SP_RUR_TOTL
optionName: _col_SP_RUR_TOTL
type: DOUBLE
expressionType: SIMPLE
hasCustomLabel: true
label: Rural Population
show_bubbles: true
since: '2011-01-01'
time_range: '2014-01-01 : 2014-01-02'
until: '2011-01-02'
viz_type: sunburst_v2
query_context: null
cache_timeout: null
uuid: 70a2e07b-0f45-4532-96ae-0c6db52d2e7c
version: 1.0.0
dataset_uuid: 3b851597-e0e9-42a1-83e4-55547811742e

@@ -1,28 +0,0 @@
slice_name: Treemap
description: null
certified_by: null
certification_details: null
viz_type: treemap_v2
params:
compare_lag: '10'
compare_suffix: o10Y
country_fieldtype: cca3
entity: country_code
granularity_sqla: year
groupby:
- region
- country_code
limit: '25'
markup_type: markdown
metric: sum__SP_POP_TOTL
row_limit: 50000
show_bubbles: true
since: '1960-01-01'
time_range: '2014-01-01 : 2014-01-02'
until: now
viz_type: treemap_v2
query_context: null
cache_timeout: null
uuid: fc941a12-88a0-42e7-ac48-c1ec4ed84640
version: 1.0.0
dataset_uuid: 3b851597-e0e9-42a1-83e4-55547811742e

@@ -1,28 +0,0 @@
slice_name: World's Pop Growth
description: null
certified_by: null
certification_details: null
viz_type: echarts_area
params:
compare_lag: '10'
compare_suffix: o10Y
country_fieldtype: cca3
entity: country_code
granularity_sqla: year
groupby:
- region
limit: '25'
markup_type: markdown
metrics:
- sum__SP_POP_TOTL
row_limit: 50000
show_bubbles: true
since: '1960-01-01'
time_range: '2014-01-01 : 2014-01-02'
until: now
viz_type: echarts_area
query_context: null
cache_timeout: null
uuid: e18b5d28-3a3d-43ea-8a20-b198b44b08e3
version: 1.0.0
dataset_uuid: 3b851597-e0e9-42a1-83e4-55547811742e

@@ -1,26 +0,0 @@
slice_name: World's Population
description: null
certified_by: null
certification_details: null
viz_type: big_number
params:
compare_lag: '10'
compare_suffix: over 10Y
country_fieldtype: cca3
entity: country_code
granularity_sqla: year
groupby: []
limit: '25'
markup_type: markdown
metric: sum__SP_POP_TOTL
row_limit: 50000
show_bubbles: true
since: '2000'
time_range: '2014-01-01 : 2014-01-02'
until: '2014-01-02'
viz_type: big_number
query_context: null
cache_timeout: null
uuid: c50fc6e3-96fc-4e72-877b-2ea1a5e25c7a
version: 1.0.0
dataset_uuid: 3b851597-e0e9-42a1-83e4-55547811742e

@@ -1,48 +0,0 @@
slice_name: Deck.gl Arcs
description: null
certified_by: null
certification_details: null
viz_type: deck_arc
params:
color_picker:
a: 1
b: 135
g: 122
r: 0
datasource: 10__table
end_spatial:
latCol: LATITUDE_DEST
lonCol: LONGITUDE_DEST
type: latlong
granularity_sqla: null
mapbox_style: https://tile.openstreetmap.org/{z}/{x}/{y}.png
row_limit: 5000
slice_id: 42
start_spatial:
latCol: LATITUDE
lonCol: LONGITUDE
type: latlong
stroke_width: 1
time_grain_sqla: null
time_range: ' : '
viewport:
altitude: 1.5
bearing: 8.546256357301871
height: 642
latitude: 44.596651438714254
longitude: -91.84340711201104
maxLatitude: 85.05113
maxPitch: 60
maxZoom: 20
minLatitude: -85.05113
minPitch: 0
minZoom: 0
pitch: 60
width: 997
zoom: 2.929837070560775
viz_type: deck_arc
query_context: null
cache_timeout: null
uuid: 51a68f80-d538-4094-bb9e-346aad49b306
version: 1.0.0
dataset_uuid: 92980b06-cbec-4f34-9c2e-7308edc8c2b9

@@ -1,43 +0,0 @@
slice_name: Deck.gl Grid
description: null
certified_by: null
certification_details: null
viz_type: deck_grid
params:
autozoom: false
color_picker:
a: 1
b: 0
g: 255
r: 14
datasource: 5__table
extruded: true
granularity_sqla: null
grid_size: 120
groupby: []
mapbox_style: https://tile.openstreetmap.org/{z}/{x}/{y}.png
point_radius: Auto
point_radius_fixed:
type: fix
value: 2000
point_radius_unit: Pixels
row_limit: 5000
size: count
spatial:
latCol: LAT
lonCol: LON
type: latlong
time_grain_sqla: null
time_range: No filter
viewport:
bearing: 155.80099696026355
latitude: 37.7942314882596
longitude: -122.42066918995666
pitch: 53.470800300695146
zoom: 12.699690845482069
viz_type: deck_grid
query_context: null
cache_timeout: null
uuid: a1b96ab6-3c0b-4cbc-b13a-a70749e84068
version: 1.0.0
dataset_uuid: 605eaec7-ebf1-4fea-ac4b-07652fcb46e7

@@ -1,42 +0,0 @@
slice_name: Deck.gl Hexagons
description: null
certified_by: null
certification_details: null
viz_type: deck_hex
params:
color_picker:
a: 1
b: 0
g: 255
r: 14
datasource: 5__table
extruded: true
granularity_sqla: null
grid_size: 40
groupby: []
mapbox_style: https://tile.openstreetmap.org/{z}/{x}/{y}.png
point_radius: Auto
point_radius_fixed:
type: fix
value: 2000
point_radius_unit: Pixels
row_limit: 5000
size: count
spatial:
latCol: LAT
lonCol: LON
type: latlong
time_grain_sqla: null
time_range: No filter
viewport:
bearing: -2.3984797349335167
latitude: 37.789795085160335
longitude: -122.40632230075536
pitch: 54.08961642447763
zoom: 13.835465702403654
viz_type: deck_hex
query_context: null
cache_timeout: null
uuid: bdfdce5d-c44d-4c63-8a45-0b2a1a29715b
version: 1.0.0
dataset_uuid: 605eaec7-ebf1-4fea-ac4b-07652fcb46e7

@@ -1,48 +0,0 @@
slice_name: Deck.gl Path
description: null
certified_by: null
certification_details: null
viz_type: deck_path
params:
color_picker:
a: 1
b: 135
g: 122
r: 0
datasource: 12__table
js_columns:
- color
js_data_mutator: "data => data.map(d => ({\n ...d,\n color: colors.hexToRGB(d.extraProps.color)\n\
}));"
js_onclick_href: ''
js_tooltip: ''
line_column: path_json
line_type: json
line_width: 150
mapbox_style: https://tile.openstreetmap.org/{z}/{x}/{y}.png
reverse_long_lat: false
row_limit: 5000
slice_id: 43
time_grain_sqla: null
time_range: ' : '
viewport:
altitude: 1.5
bearing: 0
height: 1094
latitude: 37.73671752604488
longitude: -122.18885402582598
maxLatitude: 85.05113
maxPitch: 60
maxZoom: 20
minLatitude: -85.05113
minPitch: 0
minZoom: 0
pitch: 0
width: 669
zoom: 9.51847667620428
viz_type: deck_path
query_context: null
cache_timeout: null
uuid: 6332daf6-e442-469d-b66c-a6a38423d4c7
version: 1.0.0
dataset_uuid: 151c283f-c076-437a-8e2f-1cf65fe6db0d

@@ -1,88 +0,0 @@
slice_name: Deck.gl Polygons
description: null
certified_by: null
certification_details: null
viz_type: deck_polygon
params:
datasource: 11__table
extruded: true
fill_color_picker:
a: 1
b: 73
g: 65
r: 3
filled: true
granularity_sqla: null
js_columns: []
js_data_mutator: ''
js_onclick_href: ''
js_tooltip: ''
legend_format: .1s
legend_position: tr
line_column: contour
line_type: json
line_width: 10
line_width_unit: meters
linear_color_scheme: oranges
mapbox_style: https://tile.openstreetmap.org/{z}/{x}/{y}.png
metric:
aggregate: SUM
column:
column_name: population
description: null
expression: null
filterable: true
groupby: true
id: 1332
is_dttm: false
optionName: _col_population
python_date_format: null
type: BIGINT
verbose_name: null
expressionType: SIMPLE
hasCustomLabel: true
label: Population
optionName: metric_t2v4qbfiz1_w6qgpx4h2p
sqlExpression: null
multiplier: 0.1
point_radius_fixed:
type: metric
value:
aggregate: null
column: null
expressionType: SQL
hasCustomLabel: null
label: Density
optionName: metric_c5rvwrzoo86_293h6yrv2ic
sqlExpression: SUM(population)/SUM(area)
reverse_long_lat: false
slice_id: 41
stroke_color_picker:
a: 1
b: 135
g: 122
r: 0
stroked: false
time_grain_sqla: null
time_range: ' : '
viewport:
altitude: 1.5
bearing: 37.89506450385642
height: 906
latitude: 37.752020331384834
longitude: -122.43388541747726
maxLatitude: 85.05113
maxPitch: 60
maxZoom: 20
minLatitude: -85.05113
minPitch: 0
minZoom: 0
pitch: 60
width: 667
zoom: 11.133995608594631
viz_type: deck_polygon
query_context: null
cache_timeout: null
uuid: f3236785-149e-4cab-9408-f2cc69afd977
version: 1.0.0
dataset_uuid: a480e881-e90d-4dc8-818e-f9338c3ca839

@@ -1,42 +0,0 @@
slice_name: Deck.gl Scatterplot
description: null
certified_by: null
certification_details: null
viz_type: deck_scatter
params:
color_picker:
a: 0.82
b: 3
g: 0
r: 205
datasource: 5__table
granularity_sqla: null
groupby: []
mapbox_style: https://tile.openstreetmap.org/{z}/{x}/{y}.png
max_radius: 250
min_radius: 1
multiplier: 10
point_radius_fixed:
type: metric
value: count
point_unit: square_m
row_limit: 5000
size: count
spatial:
latCol: LAT
lonCol: LON
type: latlong
time_grain_sqla: null
time_range: ' : '
viewport:
bearing: -4.952916738791771
latitude: 37.78926922909199
longitude: -122.42613341901688
pitch: 4.750411100577438
zoom: 12.729132798697304
viz_type: deck_scatter
query_context: null
cache_timeout: null
uuid: cc75c4d5-8f79-4ffd-8e75-06162d4a867f
version: 1.0.0
dataset_uuid: 605eaec7-ebf1-4fea-ac4b-07652fcb46e7

@@ -1,41 +0,0 @@
slice_name: Deck.gl Screen grid
description: null
certified_by: null
certification_details: null
viz_type: deck_screengrid
params:
color_picker:
a: 1
b: 0
g: 255
r: 14
datasource: 5__table
granularity_sqla: null
grid_size: 20
groupby: []
mapbox_style: https://tile.openstreetmap.org/{z}/{x}/{y}.png
point_radius: Auto
point_radius_fixed:
type: fix
value: 2000
point_unit: square_m
row_limit: 5000
size: count
spatial:
latCol: LAT
lonCol: LON
type: latlong
time_grain_sqla: null
time_range: No filter
viewport:
bearing: -4.952916738791771
latitude: 37.76024135844065
longitude: -122.41827069521386
pitch: 4.750411100577438
zoom: 14.161641703941438
viz_type: deck_screengrid
query_context: null
cache_timeout: null
uuid: 966c802c-4733-489f-b65b-385083c85d90
version: 1.0.0
dataset_uuid: 605eaec7-ebf1-4fea-ac4b-07652fcb46e7

@@ -1,83 +0,0 @@
dashboard_title: Misc Charts
description: null
css: null
slug: misc_charts
certified_by: null
certification_details: null
published: false
uuid: 55a4fe9f-2682-4b0d-84c7-49ded4be11db
position:
CHART-HJOYVMV0E7:
children: []
id: CHART-HJOYVMV0E7
meta:
chartId: 30
height: 69
sliceName: OSM Long/Lat
uuid: a4e90860-c8f5-4c50-8c04-06b2e144809c
width: 4
parents:
- ROOT_ID
- GRID_ID
- ROW-S1MK4M4A4X
- COLUMN-ByUFVf40EQ
type: CHART
CHART-S1WYNz4AVX:
children: []
id: CHART-S1WYNz4AVX
meta:
chartId: 10
height: 69
sliceName: Parallel Coordinates
uuid: 041377c4-0ca9-4a40-8abd-befcd137c0dc
width: 4
parents:
- ROOT_ID
- GRID_ID
- ROW-SytNzNA4X
type: CHART
CHART-rkgF4G4A4X:
children: []
id: CHART-rkgF4G4A4X
meta:
chartId: 31
height: 69
sliceName: Birth in France by department in 2016
uuid: 6bd584f1-0ef5-44fc-8a05-61400f83bb62
width: 4
parents:
- ROOT_ID
- GRID_ID
- ROW-SytNzNA4X
type: CHART
DASHBOARD_VERSION_KEY: v2
GRID_ID:
children:
- ROW-SytNzNA4X
id: GRID_ID
parents:
- ROOT_ID
type: GRID
HEADER_ID:
id: HEADER_ID
meta:
text: Misc Charts
type: HEADER
ROOT_ID:
children:
- GRID_ID
id: ROOT_ID
type: ROOT
ROW-SytNzNA4X:
children:
- CHART-rkgF4G4A4X
- CHART-S1WYNz4AVX
- CHART-HJOYVMV0E7
id: ROW-SytNzNA4X
meta:
background: BACKGROUND_TRANSPARENT
parents:
- ROOT_ID
- GRID_ID
type: ROW
version: 1.0.0

@@ -1,263 +0,0 @@
dashboard_title: USA Births Names
description: null
css: null
slug: births
certified_by: null
certification_details: null
published: true
uuid: fb7d30bc-b160-4371-861c-235d19bf6e25
position:
CHART-6GdlekVise:
children: []
id: CHART-6GdlekVise
meta:
chartId: 19
height: 50
sliceName: Top 10 Girl Name Share
width: 5
uuid: da76899a-d75c-467b-b0ce-cfa4819ed1b1
parents:
- ROOT_ID
- GRID_ID
- ROW-eh0w37bWbR
type: CHART
CHART-6n9jxb30JG:
children: []
id: CHART-6n9jxb30JG
meta:
chartId: 14
height: 36
sliceName: Genders by State
width: 5
uuid: 2cc25185-3d8c-494c-aa3c-14f081ac7e54
parents:
- ROOT_ID
- GRID_ID
- ROW--EyBZQlDi
type: CHART
CHART-Jj9qh1ol-N:
children: []
id: CHART-Jj9qh1ol-N
meta:
chartId: 18
height: 50
sliceName: Boy Name Cloud
width: 4
uuid: 6994ec83-0cf2-4a26-97e2-1e30b0002aa0
parents:
- ROOT_ID
- GRID_ID
- ROW-kzWtcvo8R1
type: CHART
CHART-ODvantb_bF:
children: []
id: CHART-ODvantb_bF
meta:
chartId: 20
height: 50
sliceName: Top 10 Boy Name Share
width: 5
uuid: f35cca46-bb11-440e-8ba1-7f021bfe52a7
parents:
- ROOT_ID
- GRID_ID
- ROW-kzWtcvo8R1
type: CHART
CHART-PAXUUqwmX9:
children: []
id: CHART-PAXUUqwmX9
meta:
chartId: 12
height: 34
sliceName: Genders
width: 3
uuid: fb05dca0-bd3e-4953-a0a5-94b51de3a653
parents:
- ROOT_ID
- GRID_ID
- ROW-2n0XgiHDgs
type: CHART
CHART-_T6n_K9iQN:
children: []
id: CHART-_T6n_K9iQN
meta:
chartId: 13
height: 36
sliceName: Trends
width: 7
uuid: c6024db9-1695-4aa6-b846-42d9c96bfcbf
parents:
- ROOT_ID
- GRID_ID
- ROW--EyBZQlDi
type: CHART
CHART-eNY0tcE_ic:
children: []
id: CHART-eNY0tcE_ic
meta:
chartId: 11
height: 34
sliceName: Participants
width: 3
uuid: 89ae3c32-eafa-4466-82cf-8c4328420782
parents:
- ROOT_ID
- GRID_ID
- ROW-2n0XgiHDgs
type: CHART
CHART-g075mMgyYb:
children: []
id: CHART-g075mMgyYb
meta:
chartId: 15
height: 50
sliceName: Girls
width: 3
uuid: 44cfa30e-af8e-4176-8612-4df0c0609516
parents:
- ROOT_ID
- GRID_ID
- ROW-eh0w37bWbR
type: CHART
CHART-n-zGGE6S1y:
children: []
id: CHART-n-zGGE6S1y
meta:
chartId: 16
height: 50
sliceName: Girl Name Cloud
width: 4
uuid: ba6574fe-a6c0-41ef-9499-1ea6ff36bd2d
parents:
- ROOT_ID
- GRID_ID
- ROW-eh0w37bWbR
type: CHART
CHART-vJIPjmcbD3:
children: []
id: CHART-vJIPjmcbD3
meta:
chartId: 17
height: 50
sliceName: Boys
width: 3
uuid: 0af97164-82f0-42bb-a611-7093e5c56596
parents:
- ROOT_ID
- GRID_ID
- ROW-kzWtcvo8R1
type: CHART
DASHBOARD_VERSION_KEY: v2
GRID_ID:
children:
- ROW-2n0XgiHDgs
- ROW--EyBZQlDi
- ROW-eh0w37bWbR
- ROW-kzWtcvo8R1
- ROW-N-0P6H2KVI
id: GRID_ID
parents:
- ROOT_ID
type: GRID
HEADER_ID:
id: HEADER_ID
meta:
text: Births
type: HEADER
MARKDOWN-zaflB60tbC:
children: []
id: MARKDOWN-zaflB60tbC
meta:
code: <div style="text-align:center"> <h1>Birth Names Dashboard</h1> <img
src="/static/assets/images/babies.png" style="width:50%;"></div>
height: 34
width: 6
parents:
- ROOT_ID
- GRID_ID
- ROW-2n0XgiHDgs
type: MARKDOWN
ROOT_ID:
children:
- GRID_ID
id: ROOT_ID
type: ROOT
ROW--EyBZQlDi:
children:
- CHART-_T6n_K9iQN
- CHART-6n9jxb30JG
id: ROW--EyBZQlDi
meta:
background: BACKGROUND_TRANSPARENT
parents:
- ROOT_ID
- GRID_ID
type: ROW
ROW-2n0XgiHDgs:
children:
- CHART-eNY0tcE_ic
- MARKDOWN-zaflB60tbC
- CHART-PAXUUqwmX9
id: ROW-2n0XgiHDgs
meta:
background: BACKGROUND_TRANSPARENT
parents:
- ROOT_ID
- GRID_ID
type: ROW
ROW-eh0w37bWbR:
children:
- CHART-g075mMgyYb
- CHART-n-zGGE6S1y
- CHART-6GdlekVise
id: ROW-eh0w37bWbR
meta:
background: BACKGROUND_TRANSPARENT
parents:
- ROOT_ID
- GRID_ID
type: ROW
ROW-kzWtcvo8R1:
children:
- CHART-vJIPjmcbD3
- CHART-Jj9qh1ol-N
- CHART-ODvantb_bF
id: ROW-kzWtcvo8R1
meta:
background: BACKGROUND_TRANSPARENT
parents:
- ROOT_ID
- GRID_ID
type: ROW
ROW-N-0P6H2KVI:
children:
- CHART-A62J4Z7R
id: ROW-N-0P6H2KVI
meta:
'0': ROOT_ID
background: BACKGROUND_TRANSPARENT
type: ROW
parents:
- ROOT_ID
- GRID_ID
CHART-A62J4Z7R:
children: []
id: CHART-A62J4Z7R
meta:
chartId: 21
height: 50
sliceName: Pivot Table v2
uuid: 86778b63-19d8-4278-a79f-c90a1b31e162
width: 4
type: CHART
parents:
- ROOT_ID
- GRID_ID
- ROW-N-0P6H2KVI
metadata:
label_colors:
Girls: '#FF69B4'
Boys: '#ADD8E6'
girl: '#FF69B4'
boy: '#ADD8E6'
version: 1.0.0

@@ -1,175 +0,0 @@
dashboard_title: World Bank's Data
description: null
css: null
slug: world_health
certified_by: null
certification_details: null
published: true
uuid: d37232b3-43b9-486a-a132-e387dc0ff8de
position:
CHART-37982887:
children: []
id: CHART-37982887
meta:
chartId: 1
height: 52
sliceName: World's Population
width: 2
uuid: c50fc6e3-96fc-4e72-877b-2ea1a5e25c7a
type: CHART
CHART-17e0f8d8:
children: []
id: CHART-17e0f8d8
meta:
chartId: 2
height: 92
sliceName: Most Populated Countries
width: 3
uuid: 4183745e-1cc4-4f88-9ae6-973c69845ce4
type: CHART
CHART-2ee52f30:
children: []
id: CHART-2ee52f30
meta:
chartId: 3
height: 38
sliceName: Growth Rate
width: 6
uuid: cfcd7c5e-4759-4b28-bb7c-e2200508e978
type: CHART
CHART-2d5b6871:
children: []
id: CHART-2d5b6871
meta:
chartId: 4
height: 52
sliceName: '% Rural'
width: 7
uuid: 8d889488-edb5-40cb-a69c-e2c14f009e2b
type: CHART
CHART-0fd0d252:
children: []
id: CHART-0fd0d252
meta:
chartId: 5
height: 50
sliceName: Life Expectancy VS Rural %
width: 8
uuid: fa927236-7b66-4d03-ae6c-463d2d394123
type: CHART
CHART-97f4cb48:
children: []
id: CHART-97f4cb48
meta:
chartId: 6
height: 38
sliceName: Rural Breakdown
width: 3
uuid: 70a2e07b-0f45-4532-96ae-0c6db52d2e7c
type: CHART
CHART-b5e05d6f:
children: []
id: CHART-b5e05d6f
meta:
chartId: 7
height: 50
sliceName: World's Pop Growth
width: 4
uuid: e18b5d28-3a3d-43ea-8a20-b198b44b08e3
type: CHART
CHART-e76e9f5f:
children: []
id: CHART-e76e9f5f
meta:
chartId: 8
height: 50
sliceName: Box plot
width: 4
uuid: d31ba9c7-798b-4f84-87ef-ab31721680a8
type: CHART
CHART-a4808bba:
children: []
id: CHART-a4808bba
meta:
chartId: 9
height: 50
sliceName: Treemap
width: 8
uuid: fc941a12-88a0-42e7-ac48-c1ec4ed84640
type: CHART
COLUMN-071bbbad:
children:
- ROW-1e064e3c
- ROW-afdefba9
id: COLUMN-071bbbad
meta:
background: BACKGROUND_TRANSPARENT
width: 9
type: COLUMN
COLUMN-fe3914b8:
children:
- CHART-37982887
id: COLUMN-fe3914b8
meta:
background: BACKGROUND_TRANSPARENT
width: 2
type: COLUMN
GRID_ID:
children:
- ROW-46632bc2
- ROW-3fa26c5d
- ROW-812b3f13
id: GRID_ID
type: GRID
HEADER_ID:
id: HEADER_ID
meta:
text: World's Bank Data
type: HEADER
ROOT_ID:
children:
- GRID_ID
id: ROOT_ID
type: ROOT
ROW-1e064e3c:
children:
- COLUMN-fe3914b8
- CHART-2d5b6871
id: ROW-1e064e3c
meta:
background: BACKGROUND_TRANSPARENT
type: ROW
ROW-3fa26c5d:
children:
- CHART-b5e05d6f
- CHART-0fd0d252
id: ROW-3fa26c5d
meta:
background: BACKGROUND_TRANSPARENT
type: ROW
ROW-46632bc2:
children:
- COLUMN-071bbbad
- CHART-17e0f8d8
id: ROW-46632bc2
meta:
background: BACKGROUND_TRANSPARENT
type: ROW
ROW-812b3f13:
children:
- CHART-a4808bba
- CHART-e76e9f5f
id: ROW-812b3f13
meta:
background: BACKGROUND_TRANSPARENT
type: ROW
ROW-afdefba9:
children:
- CHART-2ee52f30
- CHART-97f4cb48
id: ROW-afdefba9
meta:
background: BACKGROUND_TRANSPARENT
type: ROW
DASHBOARD_VERSION_KEY: v2
version: 1.0.0

@@ -1,130 +0,0 @@
dashboard_title: deck.gl Demo
description: null
css: null
slug: deck
certified_by: null
certification_details: null
published: true
uuid: b78795f1-0b33-41a9-a6c7-186f38a526ad
position:
CHART-3afd9d70:
meta:
chartId: 32
sliceName: Deck.gl Scatterplot
width: 6
height: 50
uuid: cc75c4d5-8f79-4ffd-8e75-06162d4a867f
type: CHART
id: CHART-3afd9d70
children: []
CHART-2ee7fa5e:
meta:
chartId: 33
sliceName: Deck.gl Screen grid
width: 6
height: 50
uuid: 966c802c-4733-489f-b65b-385083c85d90
type: CHART
id: CHART-2ee7fa5e
children: []
CHART-201f7715:
meta:
chartId: 34
sliceName: Deck.gl Hexagons
width: 6
height: 50
uuid: bdfdce5d-c44d-4c63-8a45-0b2a1a29715b
type: CHART
id: CHART-201f7715
children: []
CHART-d02f6c40:
meta:
chartId: 35
sliceName: Deck.gl Grid
width: 6
height: 50
uuid: a1b96ab6-3c0b-4cbc-b13a-a70749e84068
type: CHART
id: CHART-d02f6c40
children: []
CHART-2673431d:
meta:
chartId: 36
sliceName: Deck.gl Polygons
width: 6
height: 50
uuid: f3236785-149e-4cab-9408-f2cc69afd977
type: CHART
id: CHART-2673431d
children: []
CHART-85265a60:
meta:
chartId: 37
sliceName: Deck.gl Arcs
width: 6
height: 50
uuid: 51a68f80-d538-4094-bb9e-346aad49b306
type: CHART
id: CHART-85265a60
children: []
CHART-2b87513c:
meta:
chartId: 38
sliceName: Deck.gl Path
width: 6
height: 50
uuid: 6332daf6-e442-469d-b66c-a6a38423d4c7
type: CHART
id: CHART-2b87513c
children: []
GRID_ID:
type: GRID
id: GRID_ID
children:
- ROW-a7b16cb5
- ROW-72c218a5
- ROW-957ba55b
- ROW-af041bdd
HEADER_ID:
meta:
text: deck.gl Demo
type: HEADER
id: HEADER_ID
ROOT_ID:
type: ROOT
id: ROOT_ID
children:
- GRID_ID
ROW-72c218a5:
meta:
background: BACKGROUND_TRANSPARENT
type: ROW
id: ROW-72c218a5
children:
- CHART-d02f6c40
- CHART-201f7715
ROW-957ba55b:
meta:
background: BACKGROUND_TRANSPARENT
type: ROW
id: ROW-957ba55b
children:
- CHART-2673431d
- CHART-85265a60
ROW-a7b16cb5:
meta:
background: BACKGROUND_TRANSPARENT
type: ROW
id: ROW-a7b16cb5
children:
- CHART-3afd9d70
- CHART-2ee7fa5e
ROW-af041bdd:
meta:
background: BACKGROUND_TRANSPARENT
type: ROW
id: ROW-af041bdd
children:
- CHART-2b87513c
DASHBOARD_VERSION_KEY: v2
version: 1.0.0

@@ -1,80 +0,0 @@
table_name: bart_lines
main_dttm_col: null
description: BART lines
default_endpoint: null
offset: 0
cache_timeout: null
catalog: null
schema: public
sql: null
params: null
template_params: null
filter_select_enabled: true
fetch_values_predicate: null
extra: null
normalize_columns: false
always_filter_main_dttm: false
folders: null
uuid: 151c283f-c076-437a-8e2f-1cf65fe6db0d
metrics:
- metric_name: count
verbose_name: COUNT(*)
metric_type: count
expression: COUNT(*)
description: null
d3format: null
currency: null
extra: null
warning_text: null
columns:
- column_name: name
verbose_name: null
is_dttm: false
is_active: true
type: VARCHAR(255)
advanced_data_type: null
groupby: true
filterable: true
expression: null
description: null
python_date_format: null
extra: null
- column_name: color
verbose_name: null
is_dttm: false
is_active: true
type: VARCHAR(255)
advanced_data_type: null
groupby: true
filterable: true
expression: null
description: null
python_date_format: null
extra: null
- column_name: path_json
verbose_name: null
is_dttm: false
is_active: true
type: TEXT
advanced_data_type: null
groupby: true
filterable: true
expression: null
description: null
python_date_format: null
extra: null
- column_name: polyline
verbose_name: null
is_dttm: false
is_active: true
type: TEXT
advanced_data_type: null
groupby: true
filterable: true
expression: null
description: null
python_date_format: null
extra: null
version: 1.0.0
database_uuid: a2dc77af-e654-49bb-b321-40f6b559a1ee
data: https://cdn.jsdelivr.net/gh/apache-superset/examples-data@master/bart-lines.json.gz

@@ -1,209 +0,0 @@
table_name: birth_france_by_region
main_dttm_col: dttm
description: null
default_endpoint: null
offset: 0
cache_timeout: null
catalog: null
schema: public
sql: null
params: null
template_params: null
filter_select_enabled: true
fetch_values_predicate: null
extra: null
normalize_columns: false
always_filter_main_dttm: false
folders: null
uuid: c21dd48d-9a4b-4a08-a926-47c3601c2a8d
metrics:
- metric_name: avg__2004
verbose_name: null
metric_type: null
expression: AVG("2004")
description: null
d3format: null
currency: null
extra: null
warning_text: null
- metric_name: count
verbose_name: COUNT(*)
metric_type: count
expression: COUNT(*)
description: null
d3format: null
currency: null
extra: null
warning_text: null
columns:
- column_name: DEPT_ID
verbose_name: null
is_dttm: false
is_active: true
type: VARCHAR(10)
advanced_data_type: null
groupby: true
filterable: true
expression: null
description: null
python_date_format: null
extra: null
- column_name: '2010'
verbose_name: null
is_dttm: false
is_active: true
type: BIGINT
advanced_data_type: null
groupby: true
filterable: true
expression: null
description: null
python_date_format: null
extra: null
- column_name: '2003'
verbose_name: null
is_dttm: false
is_active: true
type: BIGINT
advanced_data_type: null
groupby: true
filterable: true
expression: null
description: null
python_date_format: null
extra: null
- column_name: '2004'
verbose_name: null
is_dttm: false
is_active: true
type: BIGINT
advanced_data_type: null
groupby: true
filterable: true
expression: null
description: null
python_date_format: null
extra: null
- column_name: '2005'
verbose_name: null
is_dttm: false
is_active: true
type: BIGINT
advanced_data_type: null
groupby: true
filterable: true
expression: null
description: null
python_date_format: null
extra: null
- column_name: '2006'
verbose_name: null
is_dttm: false
is_active: true
type: BIGINT
advanced_data_type: null
groupby: true
filterable: true
expression: null
description: null
python_date_format: null
extra: null
- column_name: '2007'
verbose_name: null
is_dttm: false
is_active: true
type: BIGINT
advanced_data_type: null
groupby: true
filterable: true
expression: null
description: null
python_date_format: null
extra: null
- column_name: '2008'
verbose_name: null
is_dttm: false
is_active: true
type: BIGINT
advanced_data_type: null
groupby: true
filterable: true
expression: null
description: null
python_date_format: null
extra: null
- column_name: '2009'
verbose_name: null
is_dttm: false
is_active: true
type: BIGINT
advanced_data_type: null
groupby: true
filterable: true
expression: null
description: null
python_date_format: null
extra: null
- column_name: '2011'
verbose_name: null
is_dttm: false
is_active: true
type: BIGINT
advanced_data_type: null
groupby: true
filterable: true
expression: null
description: null
python_date_format: null
extra: null
- column_name: '2012'
verbose_name: null
is_dttm: false
is_active: true
type: BIGINT
advanced_data_type: null
groupby: true
filterable: true
expression: null
description: null
python_date_format: null
extra: null
- column_name: '2013'
verbose_name: null
is_dttm: false
is_active: true
type: BIGINT
advanced_data_type: null
groupby: true
filterable: true
expression: null
description: null
python_date_format: null
extra: null
- column_name: '2014'
verbose_name: null
is_dttm: false
is_active: true
type: BIGINT
advanced_data_type: null
groupby: true
filterable: true
expression: null
description: null
python_date_format: null
extra: null
- column_name: dttm
verbose_name: null
is_dttm: true
is_active: true
type: DATE
advanced_data_type: null
groupby: true
filterable: true
expression: null
description: null
python_date_format: null
extra: null
version: 1.0.0
database_uuid: a2dc77af-e654-49bb-b321-40f6b559a1ee
data: https://cdn.jsdelivr.net/gh/apache-superset/examples-data@master/paris_iris.json.gz

@@ -1,137 +0,0 @@
table_name: birth_names
main_dttm_col: ds
description: null
default_endpoint: null
offset: 0
cache_timeout: null
catalog: null
schema: public
sql: null
params: null
template_params: null
filter_select_enabled: true
fetch_values_predicate: null
extra: null
normalize_columns: false
always_filter_main_dttm: false
folders: null
uuid: 4ec507ac-bece-4d2b-8dc3-cfb7c3515e76
metrics:
- metric_name: count
verbose_name: COUNT(*)
metric_type: count
expression: COUNT(*)
description: null
d3format: null
currency: null
extra: null
warning_text: null
- metric_name: sum__num
verbose_name: null
metric_type: null
expression: SUM(num)
description: null
d3format: null
currency: null
extra: null
warning_text: null
columns:
- column_name: num_california
verbose_name: null
is_dttm: false
is_active: true
type: null
advanced_data_type: null
groupby: true
filterable: true
expression: CASE WHEN state = 'CA' THEN num ELSE 0 END
description: null
python_date_format: null
extra: null
- column_name: ds
verbose_name: null
is_dttm: true
is_active: true
type: TIMESTAMP WITHOUT TIME ZONE
advanced_data_type: null
groupby: true
filterable: true
expression: null
description: null
python_date_format: null
extra: null
- column_name: state
verbose_name: null
is_dttm: false
is_active: true
type: VARCHAR(10)
advanced_data_type: null
groupby: true
filterable: true
expression: null
description: null
python_date_format: null
extra: null
- column_name: gender
verbose_name: null
is_dttm: false
is_active: true
type: VARCHAR(16)
advanced_data_type: null
groupby: true
filterable: true
expression: null
description: null
python_date_format: null
extra: null
- column_name: name
verbose_name: null
is_dttm: false
is_active: true
type: VARCHAR(255)
advanced_data_type: null
groupby: true
filterable: true
expression: null
description: null
python_date_format: null
extra: null
- column_name: num_boys
verbose_name: null
is_dttm: false
is_active: true
type: BIGINT
advanced_data_type: null
groupby: true
filterable: true
expression: null
description: null
python_date_format: null
extra: null
- column_name: num_girls
verbose_name: null
is_dttm: false
is_active: true
type: BIGINT
advanced_data_type: null
groupby: true
filterable: true
expression: null
description: null
python_date_format: null
extra: null
- column_name: num
verbose_name: null
is_dttm: false
is_active: true
type: BIGINT
advanced_data_type: null
groupby: true
filterable: true
expression: null
description: null
python_date_format: null
extra: null
version: 1.0.0
database_uuid: a2dc77af-e654-49bb-b321-40f6b559a1ee
data: https://cdn.jsdelivr.net/gh/apache-superset/examples-data@master/birth_names2.json.gz
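The export above follows Superset's v1.0.0 dataset YAML schema. As a quick illustration (a hypothetical helper, not Superset's actual import validator, which does full schema validation), a parsed export can be sanity-checked with a few structural rules: unique column names, a `main_dttm_col` that actually exists, and metrics with expressions.

```python
# Minimal sanity check for a parsed dataset export (hypothetical helper;
# Superset's real import path performs full schema validation).
def check_dataset_config(cfg: dict) -> list[str]:
    problems = []
    if cfg.get("version") != "1.0.0":
        problems.append("unexpected version")
    names = [c["column_name"] for c in cfg.get("columns", [])]
    if len(names) != len(set(names)):
        problems.append("duplicate column names")
    dttm = cfg.get("main_dttm_col")
    if dttm and dttm not in names:
        problems.append(f"main_dttm_col {dttm!r} not among columns")
    for m in cfg.get("metrics", []):
        if not m.get("expression"):
            problems.append(f"metric {m['metric_name']!r} has no expression")
    return problems

# A trimmed stand-in for the birth_names export above.
cfg = {
    "version": "1.0.0",
    "main_dttm_col": "ds",
    "columns": [{"column_name": "ds"}, {"column_name": "num"}],
    "metrics": [{"metric_name": "sum__num", "expression": "SUM(num)"}],
}
print(check_dataset_config(cfg))  # []
```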


@@ -1,560 +0,0 @@
table_name: flights
main_dttm_col: ds
description: Random set of flights in the US
default_endpoint: null
offset: 0
cache_timeout: null
catalog: null
schema: public
sql: null
params: null
template_params: null
filter_select_enabled: true
fetch_values_predicate: null
extra: null
normalize_columns: false
always_filter_main_dttm: false
folders: null
uuid: 92980b06-cbec-4f34-9c2e-7308edc8c2b9
metrics:
- metric_name: count
verbose_name: COUNT(*)
metric_type: count
expression: COUNT(*)
description: null
d3format: null
currency: null
extra: null
warning_text: null
columns:
- column_name: ds
verbose_name: null
is_dttm: true
is_active: true
type: TIMESTAMP WITHOUT TIME ZONE
advanced_data_type: null
groupby: true
filterable: true
expression: null
description: null
python_date_format: null
extra: null
- column_name: LATE_AIRCRAFT_DELAY
verbose_name: null
is_dttm: false
is_active: true
type: DOUBLE PRECISION
advanced_data_type: null
groupby: true
filterable: true
expression: null
description: null
python_date_format: null
extra: null
- column_name: ARRIVAL_DELAY
verbose_name: null
is_dttm: false
is_active: true
type: DOUBLE PRECISION
advanced_data_type: null
groupby: true
filterable: true
expression: null
description: null
python_date_format: null
extra: null
- column_name: DEPARTURE_DELAY
verbose_name: null
is_dttm: false
is_active: true
type: DOUBLE PRECISION
advanced_data_type: null
groupby: true
filterable: true
expression: null
description: null
python_date_format: null
extra: null
- column_name: WEATHER_DELAY
verbose_name: null
is_dttm: false
is_active: true
type: DOUBLE PRECISION
advanced_data_type: null
groupby: true
filterable: true
expression: null
description: null
python_date_format: null
extra: null
- column_name: AIRLINE_DELAY
verbose_name: null
is_dttm: false
is_active: true
type: DOUBLE PRECISION
advanced_data_type: null
groupby: true
filterable: true
expression: null
description: null
python_date_format: null
extra: null
- column_name: AIR_SYSTEM_DELAY
verbose_name: null
is_dttm: false
is_active: true
type: DOUBLE PRECISION
advanced_data_type: null
groupby: true
filterable: true
expression: null
description: null
python_date_format: null
extra: null
- column_name: ARRIVAL_TIME
verbose_name: null
is_dttm: false
is_active: true
type: DOUBLE PRECISION
advanced_data_type: null
groupby: true
filterable: true
expression: null
description: null
python_date_format: null
extra: null
- column_name: SECURITY_DELAY
verbose_name: null
is_dttm: false
is_active: true
type: DOUBLE PRECISION
advanced_data_type: null
groupby: true
filterable: true
expression: null
description: null
python_date_format: null
extra: null
- column_name: LATITUDE_DEST
verbose_name: null
is_dttm: false
is_active: true
type: DOUBLE PRECISION
advanced_data_type: null
groupby: true
filterable: true
expression: null
description: null
python_date_format: null
extra: null
- column_name: ELAPSED_TIME
verbose_name: null
is_dttm: false
is_active: true
type: DOUBLE PRECISION
advanced_data_type: null
groupby: true
filterable: true
expression: null
description: null
python_date_format: null
extra: null
- column_name: DEPARTURE_TIME
verbose_name: null
is_dttm: false
is_active: true
type: DOUBLE PRECISION
advanced_data_type: null
groupby: true
filterable: true
expression: null
description: null
python_date_format: null
extra: null
- column_name: LATITUDE
verbose_name: null
is_dttm: false
is_active: true
type: DOUBLE PRECISION
advanced_data_type: null
groupby: true
filterable: true
expression: null
description: null
python_date_format: null
extra: null
- column_name: AIR_TIME
verbose_name: null
is_dttm: false
is_active: true
type: DOUBLE PRECISION
advanced_data_type: null
groupby: true
filterable: true
expression: null
description: null
python_date_format: null
extra: null
- column_name: TAXI_IN
verbose_name: null
is_dttm: false
is_active: true
type: DOUBLE PRECISION
advanced_data_type: null
groupby: true
filterable: true
expression: null
description: null
python_date_format: null
extra: null
- column_name: TAXI_OUT
verbose_name: null
is_dttm: false
is_active: true
type: DOUBLE PRECISION
advanced_data_type: null
groupby: true
filterable: true
expression: null
description: null
python_date_format: null
extra: null
- column_name: LONGITUDE_DEST
verbose_name: null
is_dttm: false
is_active: true
type: DOUBLE PRECISION
advanced_data_type: null
groupby: true
filterable: true
expression: null
description: null
python_date_format: null
extra: null
- column_name: LONGITUDE
verbose_name: null
is_dttm: false
is_active: true
type: DOUBLE PRECISION
advanced_data_type: null
groupby: true
filterable: true
expression: null
description: null
python_date_format: null
extra: null
- column_name: WHEELS_OFF
verbose_name: null
is_dttm: false
is_active: true
type: DOUBLE PRECISION
advanced_data_type: null
groupby: true
filterable: true
expression: null
description: null
python_date_format: null
extra: null
- column_name: WHEELS_ON
verbose_name: null
is_dttm: false
is_active: true
type: DOUBLE PRECISION
advanced_data_type: null
groupby: true
filterable: true
expression: null
description: null
python_date_format: null
extra: null
- column_name: CANCELLATION_REASON
verbose_name: null
is_dttm: false
is_active: true
type: TEXT
advanced_data_type: null
groupby: true
filterable: true
expression: null
description: null
python_date_format: null
extra: null
- column_name: SCHEDULED_ARRIVAL
verbose_name: null
is_dttm: false
is_active: true
type: BIGINT
advanced_data_type: null
groupby: true
filterable: true
expression: null
description: null
python_date_format: null
extra: null
- column_name: DESTINATION_AIRPORT
verbose_name: null
is_dttm: false
is_active: true
type: TEXT
advanced_data_type: null
groupby: true
filterable: true
expression: null
description: null
python_date_format: null
extra: null
- column_name: CANCELLED
verbose_name: null
is_dttm: false
is_active: true
type: BIGINT
advanced_data_type: null
groupby: true
filterable: true
expression: null
description: null
python_date_format: null
extra: null
- column_name: SCHEDULED_DEPARTURE
verbose_name: null
is_dttm: false
is_active: true
type: BIGINT
advanced_data_type: null
groupby: true
filterable: true
expression: null
description: null
python_date_format: null
extra: null
- column_name: DISTANCE
verbose_name: null
is_dttm: false
is_active: true
type: BIGINT
advanced_data_type: null
groupby: true
filterable: true
expression: null
description: null
python_date_format: null
extra: null
- column_name: DAY_OF_WEEK
verbose_name: null
is_dttm: false
is_active: true
type: BIGINT
advanced_data_type: null
groupby: true
filterable: true
expression: null
description: null
python_date_format: null
extra: null
- column_name: DAY
verbose_name: null
is_dttm: false
is_active: true
type: BIGINT
advanced_data_type: null
groupby: true
filterable: true
expression: null
description: null
python_date_format: null
extra: null
- column_name: TAIL_NUMBER
verbose_name: null
is_dttm: false
is_active: true
type: TEXT
advanced_data_type: null
groupby: true
filterable: true
expression: null
description: null
python_date_format: null
extra: null
- column_name: YEAR
verbose_name: null
is_dttm: false
is_active: true
type: BIGINT
advanced_data_type: null
groupby: true
filterable: true
expression: null
description: null
python_date_format: null
extra: null
- column_name: STATE_DEST
verbose_name: null
is_dttm: false
is_active: true
type: TEXT
advanced_data_type: null
groupby: true
filterable: true
expression: null
description: null
python_date_format: null
extra: null
- column_name: AIRPORT_DEST
verbose_name: null
is_dttm: false
is_active: true
type: TEXT
advanced_data_type: null
groupby: true
filterable: true
expression: null
description: null
python_date_format: null
extra: null
- column_name: AIRLINE
verbose_name: null
is_dttm: false
is_active: true
type: TEXT
advanced_data_type: null
groupby: true
filterable: true
expression: null
description: null
python_date_format: null
extra: null
- column_name: STATE
verbose_name: null
is_dttm: false
is_active: true
type: TEXT
advanced_data_type: null
groupby: true
filterable: true
expression: null
description: null
python_date_format: null
extra: null
- column_name: ORIGIN_AIRPORT
verbose_name: null
is_dttm: false
is_active: true
type: TEXT
advanced_data_type: null
groupby: true
filterable: true
expression: null
description: null
python_date_format: null
extra: null
- column_name: AIRPORT
verbose_name: null
is_dttm: false
is_active: true
type: TEXT
advanced_data_type: null
groupby: true
filterable: true
expression: null
description: null
python_date_format: null
extra: null
- column_name: FLIGHT_NUMBER
verbose_name: null
is_dttm: false
is_active: true
type: BIGINT
advanced_data_type: null
groupby: true
filterable: true
expression: null
description: null
python_date_format: null
extra: null
- column_name: SCHEDULED_TIME
verbose_name: null
is_dttm: false
is_active: true
type: BIGINT
advanced_data_type: null
groupby: true
filterable: true
expression: null
description: null
python_date_format: null
extra: null
- column_name: DIVERTED
verbose_name: null
is_dttm: false
is_active: true
type: BIGINT
advanced_data_type: null
groupby: true
filterable: true
expression: null
description: null
python_date_format: null
extra: null
- column_name: MONTH
verbose_name: null
is_dttm: false
is_active: true
type: BIGINT
advanced_data_type: null
groupby: true
filterable: true
expression: null
description: null
python_date_format: null
extra: null
- column_name: CITY_DEST
verbose_name: null
is_dttm: false
is_active: true
type: TEXT
advanced_data_type: null
groupby: true
filterable: true
expression: null
description: null
python_date_format: null
extra: null
- column_name: COUNTRY_DEST
verbose_name: null
is_dttm: false
is_active: true
type: TEXT
advanced_data_type: null
groupby: true
filterable: true
expression: null
description: null
python_date_format: null
extra: null
- column_name: CITY
verbose_name: null
is_dttm: false
is_active: true
type: TEXT
advanced_data_type: null
groupby: true
filterable: true
expression: null
description: null
python_date_format: null
extra: null
- column_name: COUNTRY
verbose_name: null
is_dttm: false
is_active: true
type: TEXT
advanced_data_type: null
groupby: true
filterable: true
expression: null
description: null
python_date_format: null
extra: null
version: 1.0.0
database_uuid: a2dc77af-e654-49bb-b321-40f6b559a1ee
data: https://cdn.jsdelivr.net/gh/apache-superset/examples-data@master/flight_data.csv.gz
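The `data:` URLs in these exports point at gzipped CSV/JSON files on a CDN, which the example loaders decompress and parse (with `encoding="latin-1"` in the flights case). A stdlib-only sketch of that round trip, using an in-memory gzip buffer instead of a real download:

```python
import csv
import gzip
import io

# Build an in-memory stand-in for a file like flight_data.csv.gz.
raw = "YEAR,MONTH,DAY,AIRLINE\n2015,1,1,AA\n2015,1,2,DL\n"
buf = io.BytesIO()
with gzip.GzipFile(fileobj=buf, mode="wb") as gz:
    gz.write(raw.encode("latin-1"))  # the loader reads with encoding="latin-1"

# Decompress and parse — a pandas-free sketch of what read_example_data does.
buf.seek(0)
with gzip.GzipFile(fileobj=buf, mode="rb") as gz:
    rows = list(csv.DictReader(io.TextIOWrapper(gz, encoding="latin-1")))

print(rows[0]["AIRLINE"])  # AA
```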


@@ -1,212 +0,0 @@
table_name: long_lat
main_dttm_col: datetime
description: null
default_endpoint: null
offset: 0
cache_timeout: null
catalog: null
schema: public
sql: null
params: null
template_params: null
filter_select_enabled: true
fetch_values_predicate: null
extra: null
normalize_columns: false
always_filter_main_dttm: false
folders: null
uuid: 605eaec7-ebf1-4fea-ac4b-07652fcb46e7
metrics:
- metric_name: count
verbose_name: COUNT(*)
metric_type: count
expression: COUNT(*)
description: null
d3format: null
currency: null
extra: null
warning_text: null
columns:
- column_name: datetime
verbose_name: null
is_dttm: true
is_active: true
type: TIMESTAMP WITHOUT TIME ZONE
advanced_data_type: null
groupby: true
filterable: true
expression: null
description: null
python_date_format: null
extra: null
- column_name: LAT
verbose_name: null
is_dttm: false
is_active: true
type: DOUBLE PRECISION
advanced_data_type: null
groupby: true
filterable: true
expression: null
description: null
python_date_format: null
extra: null
- column_name: DISTRICT
verbose_name: null
is_dttm: false
is_active: true
type: DOUBLE PRECISION
advanced_data_type: null
groupby: true
filterable: true
expression: null
description: null
python_date_format: null
extra: null
- column_name: CITY
verbose_name: null
is_dttm: false
is_active: true
type: DOUBLE PRECISION
advanced_data_type: null
groupby: true
filterable: true
expression: null
description: null
python_date_format: null
extra: null
- column_name: ID
verbose_name: null
is_dttm: false
is_active: true
type: DOUBLE PRECISION
advanced_data_type: null
groupby: true
filterable: true
expression: null
description: null
python_date_format: null
extra: null
- column_name: REGION
verbose_name: null
is_dttm: false
is_active: true
type: DOUBLE PRECISION
advanced_data_type: null
groupby: true
filterable: true
expression: null
description: null
python_date_format: null
extra: null
- column_name: LON
verbose_name: null
is_dttm: false
is_active: true
type: DOUBLE PRECISION
advanced_data_type: null
groupby: true
filterable: true
expression: null
description: null
python_date_format: null
extra: null
- column_name: radius_miles
verbose_name: null
is_dttm: false
is_active: true
type: DOUBLE PRECISION
advanced_data_type: null
groupby: true
filterable: true
expression: null
description: null
python_date_format: null
extra: null
- column_name: occupancy
verbose_name: null
is_dttm: false
is_active: true
type: DOUBLE PRECISION
advanced_data_type: null
groupby: true
filterable: true
expression: null
description: null
python_date_format: null
extra: null
- column_name: delimited
verbose_name: null
is_dttm: false
is_active: true
type: VARCHAR(60)
advanced_data_type: null
groupby: true
filterable: true
expression: null
description: null
python_date_format: null
extra: null
- column_name: geohash
verbose_name: null
is_dttm: false
is_active: true
type: VARCHAR(12)
advanced_data_type: null
groupby: true
filterable: true
expression: null
description: null
python_date_format: null
extra: null
- column_name: POSTCODE
verbose_name: null
is_dttm: false
is_active: true
type: BIGINT
advanced_data_type: null
groupby: true
filterable: true
expression: null
description: null
python_date_format: null
extra: null
- column_name: NUMBER
verbose_name: null
is_dttm: false
is_active: true
type: TEXT
advanced_data_type: null
groupby: true
filterable: true
expression: null
description: null
python_date_format: null
extra: null
- column_name: STREET
verbose_name: null
is_dttm: false
is_active: true
type: TEXT
advanced_data_type: null
groupby: true
filterable: true
expression: null
description: null
python_date_format: null
extra: null
- column_name: UNIT
verbose_name: null
is_dttm: false
is_active: true
type: TEXT
advanced_data_type: null
groupby: true
filterable: true
expression: null
description: null
python_date_format: null
extra: null
version: 1.0.0
database_uuid: a2dc77af-e654-49bb-b321-40f6b559a1ee
data: https://cdn.jsdelivr.net/gh/apache-superset/examples-data@master/san_francisco.csv.gz
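The `long_lat` table carries a `geohash` VARCHAR(12) column alongside raw `LAT`/`LON`, one of the spatial encodings the deck.gl charts accept. For reference, geohashing is a standard interleaved base-32 bisection of the lat/lon ranges; a from-scratch sketch (generic algorithm, not Superset code):

```python
_BASE32 = "0123456789bcdefghjkmnpqrstuvwxyz"

def geohash_encode(lat: float, lon: float, precision: int = 9) -> str:
    lat_lo, lat_hi = -90.0, 90.0
    lon_lo, lon_hi = -180.0, 180.0
    bits, bit_count, even, out = 0, 0, True, []
    while len(out) < precision:
        if even:  # even bits refine longitude
            mid = (lon_lo + lon_hi) / 2
            if lon >= mid:
                bits, lon_lo = bits * 2 + 1, mid
            else:
                bits, lon_hi = bits * 2, mid
        else:  # odd bits refine latitude
            mid = (lat_lo + lat_hi) / 2
            if lat >= mid:
                bits, lat_lo = bits * 2 + 1, mid
            else:
                bits, lat_hi = bits * 2, mid
        even = not even
        bit_count += 1
        if bit_count == 5:  # five bits per base-32 character
            out.append(_BASE32[bits])
            bits, bit_count = 0, 0
    return "".join(out)

def geohash_decode(gh: str) -> tuple[float, float]:
    lat_lo, lat_hi = -90.0, 90.0
    lon_lo, lon_hi = -180.0, 180.0
    even = True
    for ch in gh:
        val = _BASE32.index(ch)
        for shift in range(4, -1, -1):
            bit = (val >> shift) & 1
            if even:
                mid = (lon_lo + lon_hi) / 2
                lon_lo, lon_hi = (mid, lon_hi) if bit else (lon_lo, mid)
            else:
                mid = (lat_lo + lat_hi) / 2
                lat_lo, lat_hi = (mid, lat_hi) if bit else (lat_lo, mid)
            even = not even
    # Return the cell midpoint.
    return (lat_lo + lat_hi) / 2, (lon_lo + lon_hi) / 2

# Round-trip: a 9-character geohash pins a point to within ~5m.
lat, lon = geohash_decode(geohash_encode(37.7749, -122.4194, 9))
```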


@@ -1,80 +0,0 @@
table_name: sf_population_polygons
main_dttm_col: null
description: Population density of San Francisco
default_endpoint: null
offset: 0
cache_timeout: null
catalog: null
schema: public
sql: null
params: null
template_params: null
filter_select_enabled: true
fetch_values_predicate: null
extra: null
normalize_columns: false
always_filter_main_dttm: false
folders: null
uuid: a480e881-e90d-4dc8-818e-f9338c3ca839
metrics:
- metric_name: count
verbose_name: COUNT(*)
metric_type: count
expression: COUNT(*)
description: null
d3format: null
currency: null
extra: null
warning_text: null
columns:
- column_name: area
verbose_name: null
is_dttm: false
is_active: true
type: DOUBLE PRECISION
advanced_data_type: null
groupby: true
filterable: true
expression: null
description: null
python_date_format: null
extra: null
- column_name: population
verbose_name: null
is_dttm: false
is_active: true
type: BIGINT
advanced_data_type: null
groupby: true
filterable: true
expression: null
description: null
python_date_format: null
extra: null
- column_name: zipcode
verbose_name: null
is_dttm: false
is_active: true
type: BIGINT
advanced_data_type: null
groupby: true
filterable: true
expression: null
description: null
python_date_format: null
extra: null
- column_name: contour
verbose_name: null
is_dttm: false
is_active: true
type: TEXT
advanced_data_type: null
groupby: true
filterable: true
expression: null
description: null
python_date_format: null
extra: null
version: 1.0.0
database_uuid: a2dc77af-e654-49bb-b321-40f6b559a1ee
data: https://cdn.jsdelivr.net/gh/apache-superset/examples-data@master/sf_population.json.gz
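The polygon chart built on this table sizes its fill by the metric `SUM(population)/SUM(area)`, and the `contour` TEXT column stores polygon vertices as JSON. A small sketch of the underlying geometry: decoding a contour and computing a planar area with the shoelace formula (an illustration only, not how Superset computes the SQL metric):

```python
import json

def shoelace_area(vertices):
    """Planar polygon area via the shoelace formula (absolute value)."""
    n = len(vertices)
    acc = 0.0
    for i in range(n):
        x1, y1 = vertices[i]
        x2, y2 = vertices[(i + 1) % n]
        acc += x1 * y2 - x2 * y1
    return abs(acc) / 2.0

# A toy "contour" value: a unit square encoded as JSON, like the TEXT column.
contour = json.loads("[[0, 0], [1, 0], [1, 1], [0, 1]]")
area = shoelace_area(contour)
density = 40000 / area  # population / area, as in the dashboard metric
print(area)  # 1.0
```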


@@ -14,13 +14,42 @@
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.
from .bart_lines import load_bart_lines
from .big_data import load_big_data
from .birth_names import load_birth_names
from .country_map import load_country_map_data
from .css_templates import load_css_templates
from .utils import cleanup_old_examples, load_examples_from_configs
from .deck import load_deck_dash
from .energy import load_energy
from .flights import load_flights
from .long_lat import load_long_lat_data
from .misc_dashboard import load_misc_dashboard
from .multiformat_time_series import load_multiformat_time_series
from .paris import load_paris_iris_geojson
from .random_time_series import load_random_time_series_data
from .sf_population_polygons import load_sf_population_polygons
from .supported_charts_dashboard import load_supported_charts_dashboard
from .tabbed_dashboard import load_tabbed_dashboard
from .utils import load_examples_from_configs
from .world_bank import load_world_bank_health_n_pop
__all__ = [
"cleanup_old_examples",
"load_bart_lines",
"load_big_data",
"load_birth_names",
"load_country_map_data",
"load_css_templates",
"load_deck_dash",
"load_energy",
"load_flights",
"load_long_lat_data",
"load_misc_dashboard",
"load_multiformat_time_series",
"load_paris_iris_geojson",
"load_random_time_series_data",
"load_sf_population_polygons",
"load_supported_charts_dashboard",
"load_tabbed_dashboard",
"load_examples_from_configs",
"load_world_bank_health_n_pop",
]
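The new explicit `__all__` pins down the package's re-export surface. A common companion guard (sketched here against a stand-in module rather than `superset.examples` itself) asserts that every listed name actually exists:

```python
import types

# Stand-in module (hypothetical; a real check would import superset.examples).
mod = types.ModuleType("examples_stub")
mod.load_flights = lambda: None
mod.load_deck_dash = lambda: None
mod.__all__ = ["load_flights", "load_deck_dash"]

missing = [name for name in mod.__all__ if not hasattr(mod, name)]
assert not missing, f"__all__ lists undefined names: {missing}"
```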

547 superset/examples/deck.py Normal file

@@ -0,0 +1,547 @@
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership. The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.
import logging
from superset import db
from superset.models.dashboard import Dashboard
from superset.models.slice import Slice
from superset.utils import json
from superset.utils.core import DatasourceType
from .helpers import (
get_slice_json,
get_table_connector_registry,
merge_slice,
update_slice_ids,
)
logger = logging.getLogger(__name__)
COLOR_RED = {"r": 205, "g": 0, "b": 3, "a": 0.82}
POSITION_JSON = """\
{
"CHART-3afd9d70": {
"meta": {
"chartId": 66,
"sliceName": "Deck.gl Scatterplot",
"width": 6,
"height": 50
},
"type": "CHART",
"id": "CHART-3afd9d70",
"children": []
},
"CHART-2ee7fa5e": {
"meta": {
"chartId": 67,
"sliceName": "Deck.gl Screen grid",
"width": 6,
"height": 50
},
"type": "CHART",
"id": "CHART-2ee7fa5e",
"children": []
},
"CHART-201f7715": {
"meta": {
"chartId": 68,
"sliceName": "Deck.gl Hexagons",
"width": 6,
"height": 50
},
"type": "CHART",
"id": "CHART-201f7715",
"children": []
},
"CHART-d02f6c40": {
"meta": {
"chartId": 69,
"sliceName": "Deck.gl Grid",
"width": 6,
"height": 50
},
"type": "CHART",
"id": "CHART-d02f6c40",
"children": []
},
"CHART-2673431d": {
"meta": {
"chartId": 70,
"sliceName": "Deck.gl Polygons",
"width": 6,
"height": 50
},
"type": "CHART",
"id": "CHART-2673431d",
"children": []
},
"CHART-85265a60": {
"meta": {
"chartId": 71,
"sliceName": "Deck.gl Arcs",
"width": 6,
"height": 50
},
"type": "CHART",
"id": "CHART-85265a60",
"children": []
},
"CHART-2b87513c": {
"meta": {
"chartId": 72,
"sliceName": "Deck.gl Path",
"width": 6,
"height": 50
},
"type": "CHART",
"id": "CHART-2b87513c",
"children": []
},
"GRID_ID": {
"type": "GRID",
"id": "GRID_ID",
"children": [
"ROW-a7b16cb5",
"ROW-72c218a5",
"ROW-957ba55b",
"ROW-af041bdd"
]
},
"HEADER_ID": {
"meta": {
"text": "deck.gl Demo"
},
"type": "HEADER",
"id": "HEADER_ID"
},
"ROOT_ID": {
"type": "ROOT",
"id": "ROOT_ID",
"children": [
"GRID_ID"
]
},
"ROW-72c218a5": {
"meta": {
"background": "BACKGROUND_TRANSPARENT"
},
"type": "ROW",
"id": "ROW-72c218a5",
"children": [
"CHART-d02f6c40",
"CHART-201f7715"
]
},
"ROW-957ba55b": {
"meta": {
"background": "BACKGROUND_TRANSPARENT"
},
"type": "ROW",
"id": "ROW-957ba55b",
"children": [
"CHART-2673431d",
"CHART-85265a60"
]
},
"ROW-a7b16cb5": {
"meta": {
"background": "BACKGROUND_TRANSPARENT"
},
"type": "ROW",
"id": "ROW-a7b16cb5",
"children": [
"CHART-3afd9d70",
"CHART-2ee7fa5e"
]
},
"ROW-af041bdd": {
"meta": {
"background": "BACKGROUND_TRANSPARENT"
},
"type": "ROW",
"id": "ROW-af041bdd",
"children": [
"CHART-2b87513c"
]
},
"DASHBOARD_VERSION_KEY": "v2"
}"""
def load_deck_dash() -> None: # pylint: disable=too-many-statements
logger.debug("Loading deck.gl dashboard")
slices = []
table = get_table_connector_registry()
tbl = db.session.query(table).filter_by(table_name="long_lat").first()
slice_data = {
"spatial": {"type": "latlong", "lonCol": "LON", "latCol": "LAT"},
"color_picker": COLOR_RED,
"datasource": "5__table",
"granularity_sqla": None,
"groupby": [],
"mapbox_style": "https://tile.openstreetmap.org/{z}/{x}/{y}.png",
"multiplier": 10,
"point_radius_fixed": {"type": "metric", "value": "count"},
"point_unit": "square_m",
"min_radius": 1,
"max_radius": 250,
"row_limit": 5000,
"time_range": " : ",
"size": "count",
"time_grain_sqla": None,
"viewport": {
"bearing": -4.952916738791771,
"latitude": 37.78926922909199,
"longitude": -122.42613341901688,
"pitch": 4.750411100577438,
"zoom": 12.729132798697304,
},
"viz_type": "deck_scatter",
}
logger.debug("Creating Scatterplot slice")
slc = Slice(
slice_name="Deck.gl Scatterplot",
viz_type="deck_scatter",
datasource_type=DatasourceType.TABLE,
datasource_id=tbl.id,
params=get_slice_json(slice_data),
)
merge_slice(slc)
slices.append(slc)
slice_data = {
"point_unit": "square_m",
"row_limit": 5000,
"spatial": {"type": "latlong", "lonCol": "LON", "latCol": "LAT"},
"mapbox_style": "https://tile.openstreetmap.org/{z}/{x}/{y}.png",
"granularity_sqla": None,
"size": "count",
"viz_type": "deck_screengrid",
"time_range": "No filter",
"point_radius": "Auto",
"color_picker": {"a": 1, "r": 14, "b": 0, "g": 255},
"grid_size": 20,
"viewport": {
"zoom": 14.161641703941438,
"longitude": -122.41827069521386,
"bearing": -4.952916738791771,
"latitude": 37.76024135844065,
"pitch": 4.750411100577438,
},
"point_radius_fixed": {"type": "fix", "value": 2000},
"datasource": "5__table",
"time_grain_sqla": None,
"groupby": [],
}
logger.debug("Creating Screen Grid slice")
slc = Slice(
slice_name="Deck.gl Screen grid",
viz_type="deck_screengrid",
datasource_type=DatasourceType.TABLE,
datasource_id=tbl.id,
params=get_slice_json(slice_data),
)
merge_slice(slc)
slices.append(slc)
slice_data = {
"spatial": {"type": "latlong", "lonCol": "LON", "latCol": "LAT"},
"row_limit": 5000,
"mapbox_style": "https://tile.openstreetmap.org/{z}/{x}/{y}.png",
"granularity_sqla": None,
"size": "count",
"viz_type": "deck_hex",
"time_range": "No filter",
"point_radius_unit": "Pixels",
"point_radius": "Auto",
"color_picker": {"a": 1, "r": 14, "b": 0, "g": 255},
"grid_size": 40,
"extruded": True,
"viewport": {
"latitude": 37.789795085160335,
"pitch": 54.08961642447763,
"zoom": 13.835465702403654,
"longitude": -122.40632230075536,
"bearing": -2.3984797349335167,
},
"point_radius_fixed": {"type": "fix", "value": 2000},
"datasource": "5__table",
"time_grain_sqla": None,
"groupby": [],
}
logger.debug("Creating Hex slice")
slc = Slice(
slice_name="Deck.gl Hexagons",
viz_type="deck_hex",
datasource_type=DatasourceType.TABLE,
datasource_id=tbl.id,
params=get_slice_json(slice_data),
)
merge_slice(slc)
slices.append(slc)
slice_data = {
"autozoom": False,
"spatial": {"type": "latlong", "lonCol": "LON", "latCol": "LAT"},
"row_limit": 5000,
"mapbox_style": "https://tile.openstreetmap.org/{z}/{x}/{y}.png",
"granularity_sqla": None,
"size": "count",
"viz_type": "deck_grid",
"point_radius_unit": "Pixels",
"point_radius": "Auto",
"time_range": "No filter",
"color_picker": {"a": 1, "r": 14, "b": 0, "g": 255},
"grid_size": 120,
"extruded": True,
"viewport": {
"longitude": -122.42066918995666,
"bearing": 155.80099696026355,
"zoom": 12.699690845482069,
"latitude": 37.7942314882596,
"pitch": 53.470800300695146,
},
"point_radius_fixed": {"type": "fix", "value": 2000},
"datasource": "5__table",
"time_grain_sqla": None,
"groupby": [],
}
logger.debug("Creating Grid slice")
slc = Slice(
slice_name="Deck.gl Grid",
viz_type="deck_grid",
datasource_type=DatasourceType.TABLE,
datasource_id=tbl.id,
params=get_slice_json(slice_data),
)
merge_slice(slc)
slices.append(slc)
polygon_tbl = (
db.session.query(table).filter_by(table_name="sf_population_polygons").first()
)
slice_data = {
"datasource": "11__table",
"viz_type": "deck_polygon",
"slice_id": 41,
"granularity_sqla": None,
"time_grain_sqla": None,
"time_range": " : ",
"line_column": "contour",
"metric": {
"aggregate": "SUM",
"column": {
"column_name": "population",
"description": None,
"expression": None,
"filterable": True,
"groupby": True,
"id": 1332,
"is_dttm": False,
"optionName": "_col_population",
"python_date_format": None,
"type": "BIGINT",
"verbose_name": None,
},
"expressionType": "SIMPLE",
"hasCustomLabel": True,
"label": "Population",
"optionName": "metric_t2v4qbfiz1_w6qgpx4h2p",
"sqlExpression": None,
},
"line_type": "json",
"linear_color_scheme": "oranges",
"mapbox_style": "https://tile.openstreetmap.org/{z}/{x}/{y}.png",
"viewport": {
"longitude": -122.43388541747726,
"latitude": 37.752020331384834,
"zoom": 11.133995608594631,
"bearing": 37.89506450385642,
"pitch": 60,
"width": 667,
"height": 906,
"altitude": 1.5,
"maxZoom": 20,
"minZoom": 0,
"maxPitch": 60,
"minPitch": 0,
"maxLatitude": 85.05113,
"minLatitude": -85.05113,
},
"reverse_long_lat": False,
"fill_color_picker": {"r": 3, "g": 65, "b": 73, "a": 1},
"stroke_color_picker": {"r": 0, "g": 122, "b": 135, "a": 1},
"filled": True,
"stroked": False,
"extruded": True,
"multiplier": 0.1,
"line_width": 10,
"line_width_unit": "meters",
"point_radius_fixed": {
"type": "metric",
"value": {
"aggregate": None,
"column": None,
"expressionType": "SQL",
"hasCustomLabel": None,
"label": "Density",
"optionName": "metric_c5rvwrzoo86_293h6yrv2ic",
"sqlExpression": "SUM(population)/SUM(area)",
},
},
"js_columns": [],
"js_data_mutator": "",
"js_tooltip": "",
"js_onclick_href": "",
"legend_format": ".1s",
"legend_position": "tr",
}
logger.debug("Creating Polygon slice")
slc = Slice(
slice_name="Deck.gl Polygons",
viz_type="deck_polygon",
datasource_type=DatasourceType.TABLE,
datasource_id=polygon_tbl.id,
params=get_slice_json(slice_data),
)
merge_slice(slc)
slices.append(slc)
slice_data = {
"datasource": "10__table",
"viz_type": "deck_arc",
"slice_id": 42,
"granularity_sqla": None,
"time_grain_sqla": None,
"time_range": " : ",
"start_spatial": {
"type": "latlong",
"latCol": "LATITUDE",
"lonCol": "LONGITUDE",
},
"end_spatial": {
"type": "latlong",
"latCol": "LATITUDE_DEST",
"lonCol": "LONGITUDE_DEST",
},
"row_limit": 5000,
"mapbox_style": "https://tile.openstreetmap.org/{z}/{x}/{y}.png",
"viewport": {
"altitude": 1.5,
"bearing": 8.546256357301871,
"height": 642,
"latitude": 44.596651438714254,
"longitude": -91.84340711201104,
"maxLatitude": 85.05113,
"maxPitch": 60,
"maxZoom": 20,
"minLatitude": -85.05113,
"minPitch": 0,
"minZoom": 0,
"pitch": 60,
"width": 997,
"zoom": 2.929837070560775,
},
"color_picker": {"r": 0, "g": 122, "b": 135, "a": 1},
"stroke_width": 1,
}
logger.debug("Creating Arc slice")
slc = Slice(
slice_name="Deck.gl Arcs",
viz_type="deck_arc",
datasource_type=DatasourceType.TABLE,
datasource_id=db.session.query(table)
.filter_by(table_name="flights")
.first()
.id,
params=get_slice_json(slice_data),
)
merge_slice(slc)
slices.append(slc)
slice_data = {
"datasource": "12__table",
"slice_id": 43,
"viz_type": "deck_path",
"time_grain_sqla": None,
"time_range": " : ",
"line_column": "path_json",
"line_type": "json",
"row_limit": 5000,
"mapbox_style": "https://tile.openstreetmap.org/{z}/{x}/{y}.png",
"viewport": {
"longitude": -122.18885402582598,
"latitude": 37.73671752604488,
"zoom": 9.51847667620428,
"bearing": 0,
"pitch": 0,
"width": 669,
"height": 1094,
"altitude": 1.5,
"maxZoom": 20,
"minZoom": 0,
"maxPitch": 60,
"minPitch": 0,
"maxLatitude": 85.05113,
"minLatitude": -85.05113,
},
"color_picker": {"r": 0, "g": 122, "b": 135, "a": 1},
"line_width": 150,
"reverse_long_lat": False,
"js_columns": ["color"],
"js_data_mutator": "data => data.map(d => ({\n"
" ...d,\n"
" color: colors.hexToRGB(d.extraProps.color)\n"
"}));",
"js_tooltip": "",
"js_onclick_href": "",
}
logger.debug("Creating Path slice")
slc = Slice(
slice_name="Deck.gl Path",
viz_type="deck_path",
datasource_type=DatasourceType.TABLE,
datasource_id=db.session.query(table)
.filter_by(table_name="bart_lines")
.first()
.id,
params=get_slice_json(slice_data),
)
merge_slice(slc)
slices.append(slc)
slug = "deck"
logger.debug("Creating a dashboard")
title = "deck.gl Demo"
dash = db.session.query(Dashboard).filter_by(slug=slug).first()
if not dash:
dash = Dashboard()
db.session.add(dash)
dash.published = True
js = POSITION_JSON
pos = json.loads(js)
slices = update_slice_ids(pos)
dash.position_json = json.dumps(pos, indent=4)
dash.dashboard_title = title
dash.slug = slug
dash.slices = slices
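The `POSITION_JSON` this file loads is a flat map of layout components keyed by id, linked into a tree via `children` starting at `ROOT_ID`. A sketch that walks a trimmed version of that layout and collects the chart ids, which is essentially how the layout resolves to slices:

```python
import json

POSITION = json.loads("""
{
  "ROOT_ID": {"type": "ROOT", "id": "ROOT_ID", "children": ["GRID_ID"]},
  "GRID_ID": {"type": "GRID", "id": "GRID_ID", "children": ["ROW-a7b16cb5"]},
  "ROW-a7b16cb5": {"type": "ROW", "id": "ROW-a7b16cb5",
                   "children": ["CHART-3afd9d70", "CHART-2ee7fa5e"]},
  "CHART-3afd9d70": {"type": "CHART", "id": "CHART-3afd9d70",
                     "meta": {"chartId": 66}, "children": []},
  "CHART-2ee7fa5e": {"type": "CHART", "id": "CHART-2ee7fa5e",
                     "meta": {"chartId": 67}, "children": []}
}
""")

def chart_ids(pos, node_id="ROOT_ID"):
    """Depth-first walk from ROOT_ID, collecting chartId of CHART nodes."""
    node = pos[node_id]
    found = [node["meta"]["chartId"]] if node["type"] == "CHART" else []
    for child in node.get("children", []):
        found += chart_ids(pos, child)
    return found

print(chart_ids(POSITION))  # [66, 67]
```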

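The dashboard assembly above hands `position_json` to `update_slice_ids` before persisting. A simplified sketch of what a helper like that plausibly does (hypothetical, not the actual Superset implementation): match each CHART component to a saved slice by name and rewrite its `chartId` to the database id.

```python
# Hypothetical sketch; FakeSlice and sync_chart_ids are illustration names,
# not Superset APIs.
class FakeSlice:
    def __init__(self, id, slice_name):
        self.id = id
        self.slice_name = slice_name

def sync_chart_ids(pos, slices):
    by_name = {s.slice_name: s for s in slices}
    found = []
    for component in pos.values():
        # Skip non-dict entries such as "DASHBOARD_VERSION_KEY": "v2".
        if isinstance(component, dict) and component.get("type") == "CHART":
            slc = by_name.get(component["meta"]["sliceName"])
            if slc:
                component["meta"]["chartId"] = slc.id
                found.append(slc)
    return found

pos = {"CHART-1": {"type": "CHART",
                   "meta": {"chartId": 0, "sliceName": "Deck.gl Arcs"}},
       "DASHBOARD_VERSION_KEY": "v2"}
found = sync_chart_ids(pos, [FakeSlice(71, "Deck.gl Arcs")])
print(pos["CHART-1"]["meta"]["chartId"])  # 71
```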

@@ -23,16 +23,17 @@ from sqlalchemy.sql import column
import superset.utils.database as database_utils
from superset import db
from superset.connectors.sqla.models import SqlMetric
from superset.examples.helpers import (
from superset.models.slice import Slice
from superset.sql.parse import Table
from superset.utils.core import DatasourceType
from .helpers import (
get_slice_json,
get_table_connector_registry,
merge_slice,
misc_dash_slices,
read_example_data,
)
from superset.models.slice import Slice
from superset.sql.parse import Table
from superset.utils.core import DatasourceType
logger = logging.getLogger(__name__)


@@ -0,0 +1,76 @@
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership. The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.
import logging
import pandas as pd
from sqlalchemy import DateTime, inspect
import superset.utils.database as database_utils
from superset import db
from superset.sql.parse import Table
from .helpers import get_table_connector_registry, read_example_data
logger = logging.getLogger(__name__)
def load_flights(only_metadata: bool = False, force: bool = False) -> None:
"""Loading random time series data from a zip file in the repo"""
tbl_name = "flights"
database = database_utils.get_example_database()
with database.get_sqla_engine() as engine:
schema = inspect(engine).default_schema_name
table_exists = database.has_table(Table(tbl_name, schema))
if not only_metadata and (not table_exists or force):
pdf = read_example_data(
"examples://flight_data.csv.gz", encoding="latin-1", compression="gzip"
)
# Loading airports info to join and get lat/long
airports = read_example_data(
"examples://airports.csv.gz", encoding="latin-1", compression="gzip"
)
airports = airports.set_index("IATA_CODE")
pdf["ds"] = (
pdf.YEAR.map(str) + "-0" + pdf.MONTH.map(str) + "-0" + pdf.DAY.map(str)
)
pdf.ds = pd.to_datetime(pdf.ds)
            # drop() returns a copy; the result must be assigned back
            pdf = pdf.drop(columns=["DAY", "MONTH", "YEAR"])
pdf = pdf.join(airports, on="ORIGIN_AIRPORT", rsuffix="_ORIG")
pdf = pdf.join(airports, on="DESTINATION_AIRPORT", rsuffix="_DEST")
pdf.to_sql(
tbl_name,
engine,
schema=schema,
if_exists="replace",
chunksize=500,
dtype={"ds": DateTime},
index=False,
)
table = get_table_connector_registry()
tbl = db.session.query(table).filter_by(table_name=tbl_name).first()
if not tbl:
tbl = table(table_name=tbl_name, schema=schema)
db.session.add(tbl)
tbl.description = "Random set of flights in the US"
tbl.database = database
tbl.filter_select_enabled = True
tbl.fetch_metadata()
logger.debug("Done loading table!")


@@ -49,7 +49,7 @@ from urllib.error import HTTPError
import pandas as pd
from superset import app, db
from superset.connectors.sqla.models import SqlaTable
from superset.models.slice import Slice
from superset.utils import json
@@ -78,6 +78,11 @@ def get_table_connector_registry() -> Any:
return SqlaTable
def get_examples_folder() -> str:
"""Return local path to the examples folder (when vendored)."""
return os.path.join(app.config["BASE_DIR"], "examples")
def update_slice_ids(pos: dict[Any, Any]) -> list[Slice]:
"""Update slice ids in ``position_json`` and return the slices found."""
slice_components = [


@@ -0,0 +1,127 @@
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership. The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.
import datetime
import logging
import random
import geohash
from sqlalchemy import DateTime, Float, inspect, String
import superset.utils.database as database_utils
from superset import db
from superset.models.slice import Slice
from superset.sql.parse import Table
from superset.utils.core import DatasourceType
from .helpers import (
get_slice_json,
get_table_connector_registry,
merge_slice,
misc_dash_slices,
read_example_data,
)
logger = logging.getLogger(__name__)
def load_long_lat_data(only_metadata: bool = False, force: bool = False) -> None:
"""Loading lat/long data from a csv file in the repo"""
tbl_name = "long_lat"
database = database_utils.get_example_database()
with database.get_sqla_engine() as engine:
schema = inspect(engine).default_schema_name
table_exists = database.has_table(Table(tbl_name, schema))
if not only_metadata and (not table_exists or force):
pdf = read_example_data(
"examples://san_francisco.csv.gz", encoding="utf-8", compression="gzip"
)
start = datetime.datetime.now().replace(
hour=0, minute=0, second=0, microsecond=0
)
pdf["datetime"] = [
start + datetime.timedelta(hours=i * 24 / (len(pdf) - 1))
for i in range(len(pdf))
]
pdf["occupancy"] = [random.randint(1, 6) for _ in range(len(pdf))] # noqa: S311
pdf["radius_miles"] = [random.uniform(1, 3) for _ in range(len(pdf))] # noqa: S311
pdf["geohash"] = pdf[["LAT", "LON"]].apply(
lambda x: geohash.encode(*x), axis=1
)
pdf["delimited"] = pdf["LAT"].map(str).str.cat(pdf["LON"].map(str), sep=",")
pdf.to_sql(
tbl_name,
engine,
schema=schema,
if_exists="replace",
chunksize=500,
dtype={
"longitude": Float(),
"latitude": Float(),
"number": Float(),
"street": String(100),
"unit": String(10),
"city": String(50),
"district": String(50),
"region": String(50),
"postcode": Float(),
"id": String(100),
"datetime": DateTime(),
"occupancy": Float(),
"radius_miles": Float(),
"geohash": String(12),
"delimited": String(60),
},
index=False,
)
logger.debug("Done loading table!")
logger.debug("-" * 80)
logger.debug("Creating table reference")
table = get_table_connector_registry()
obj = db.session.query(table).filter_by(table_name=tbl_name).first()
if not obj:
obj = table(table_name=tbl_name, schema=schema)
db.session.add(obj)
obj.main_dttm_col = "datetime"
obj.database = database
obj.filter_select_enabled = True
obj.fetch_metadata()
tbl = obj
slice_data = {
"granularity_sqla": "day",
"since": "2014-01-01",
"until": "now",
"viz_type": "mapbox",
"all_columns_x": "LON",
"all_columns_y": "LAT",
"mapbox_style": "https://tile.openstreetmap.org/{z}/{x}/{y}.png",
"all_columns": ["occupancy"],
"row_limit": 500000,
}
logger.debug("Creating a slice")
slc = Slice(
slice_name="OSM Long/Lat",
viz_type="osm",
datasource_type=DatasourceType.TABLE,
datasource_id=tbl.id,
params=get_slice_json(slice_data),
)
misc_dash_slices.add(slc.slice_name)
merge_slice(slc)
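The `datetime` column above spreads the rows evenly across a 24-hour window starting at midnight. The same spacing logic, sketched with the stdlib (`n` is a hypothetical row count, not the real dataset size):

```python
import datetime

def spread_over_day(start: datetime.datetime, n: int) -> list[datetime.datetime]:
    """Return n timestamps evenly spaced from start to start + 24h, inclusive."""
    return [
        start + datetime.timedelta(hours=i * 24 / (n - 1))
        for i in range(n)
    ]

start = datetime.datetime(2014, 1, 1)
stamps = spread_over_day(start, 3)
# First at midnight, middle at noon, last at midnight the next day.
```

Because the divisor is `n - 1`, the first and last timestamps land exactly on the window's endpoints, matching the list comprehension in the loader.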


@@ -0,0 +1,145 @@
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership. The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.
import logging
import textwrap
from superset import db
from superset.models.dashboard import Dashboard
from superset.utils import json
from .helpers import update_slice_ids
logger = logging.getLogger(__name__)
DASH_SLUG = "misc_charts"
def load_misc_dashboard() -> None:
"""Loading a dashboard featuring misc charts"""
logger.debug("Creating the dashboard")
db.session.expunge_all()
dash = db.session.query(Dashboard).filter_by(slug=DASH_SLUG).first()
if not dash:
dash = Dashboard()
db.session.add(dash)
js = textwrap.dedent(
"""\
{
"CHART-HJOYVMV0E7": {
"children": [],
"id": "CHART-HJOYVMV0E7",
"meta": {
"chartId": 3969,
"height": 69,
"sliceName": "OSM Long/Lat",
"uuid": "164efe31-295b-4408-aaa6-2f4bfb58a212",
"width": 4
},
"parents": [
"ROOT_ID",
"GRID_ID",
"ROW-S1MK4M4A4X",
"COLUMN-ByUFVf40EQ"
],
"type": "CHART"
},
"CHART-S1WYNz4AVX": {
"children": [],
"id": "CHART-S1WYNz4AVX",
"meta": {
"chartId": 3989,
"height": 69,
"sliceName": "Parallel Coordinates",
"uuid": "e84f7e74-031a-47bb-9f80-ae0694dcca48",
"width": 4
},
"parents": [
"ROOT_ID",
"GRID_ID",
"ROW-SytNzNA4X"
],
"type": "CHART"
},
"CHART-rkgF4G4A4X": {
"children": [],
"id": "CHART-rkgF4G4A4X",
"meta": {
"chartId": 3970,
"height": 69,
"sliceName": "Birth in France by department in 2016",
"uuid": "54583ae9-c99a-42b5-a906-7ee2adfe1fb1",
"width": 4
},
"parents": [
"ROOT_ID",
"GRID_ID",
"ROW-SytNzNA4X"
],
"type": "CHART"
},
"DASHBOARD_VERSION_KEY": "v2",
"GRID_ID": {
"children": [
"ROW-SytNzNA4X"
],
"id": "GRID_ID",
"parents": [
"ROOT_ID"
],
"type": "GRID"
},
"HEADER_ID": {
"id": "HEADER_ID",
"meta": {
"text": "Misc Charts"
},
"type": "HEADER"
},
"ROOT_ID": {
"children": [
"GRID_ID"
],
"id": "ROOT_ID",
"type": "ROOT"
},
"ROW-SytNzNA4X": {
"children": [
"CHART-rkgF4G4A4X",
"CHART-S1WYNz4AVX",
"CHART-HJOYVMV0E7"
],
"id": "ROW-SytNzNA4X",
"meta": {
"background": "BACKGROUND_TRANSPARENT"
},
"parents": [
"ROOT_ID",
"GRID_ID"
],
"type": "ROW"
}
}
"""
)
pos = json.loads(js)
slices = update_slice_ids(pos)
dash.dashboard_title = "Misc Charts"
dash.position_json = json.dumps(pos, indent=4)
dash.slug = DASH_SLUG
dash.slices = slices
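`update_slice_ids` (imported from `.helpers`; its body is not shown in this diff) walks the `position_json` tree to find chart components. A minimal, hypothetical sketch of scanning such a position dict, with the function name and behavior being this sketch's own assumptions:

```python
import json

def chart_slice_names(position: dict) -> list[str]:
    """Collect sliceName from every CHART component in a position dict."""
    return [
        comp["meta"]["sliceName"]
        for comp in position.values()
        # Some top-level values (e.g. DASHBOARD_VERSION_KEY) are plain
        # strings, so guard with isinstance before reading keys.
        if isinstance(comp, dict) and comp.get("type") == "CHART"
    ]

pos = json.loads('''{
  "DASHBOARD_VERSION_KEY": "v2",
  "CHART-A": {"type": "CHART", "meta": {"sliceName": "OSM Long/Lat"}},
  "ROOT_ID": {"type": "ROOT", "children": ["GRID_ID"]}
}''')
print(chart_slice_names(pos))  # ['OSM Long/Lat']
```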


@@ -0,0 +1,134 @@
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership. The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.
import logging
from typing import Optional
import pandas as pd
from sqlalchemy import BigInteger, Date, DateTime, inspect, String
from superset import app, db
from superset.models.slice import Slice
from superset.sql.parse import Table
from superset.utils.core import DatasourceType
from ..utils.database import get_example_database # noqa: TID252
from .helpers import (
get_slice_json,
get_table_connector_registry,
merge_slice,
misc_dash_slices,
read_example_data,
)
logger = logging.getLogger(__name__)
def load_multiformat_time_series( # pylint: disable=too-many-locals
only_metadata: bool = False, force: bool = False
) -> None:
"""Loading time series data from a zip file in the repo"""
tbl_name = "multiformat_time_series"
database = get_example_database()
with database.get_sqla_engine() as engine:
schema = inspect(engine).default_schema_name
table_exists = database.has_table(Table(tbl_name, schema))
if not only_metadata and (not table_exists or force):
pdf = read_example_data(
"examples://multiformat_time_series.json.gz", compression="gzip"
)
# TODO(bkyryliuk): move load examples data into the pytest fixture
if database.backend == "presto":
pdf.ds = pd.to_datetime(pdf.ds, unit="s")
pdf.ds = pdf.ds.dt.strftime("%Y-%m-%d")
pdf.ds2 = pd.to_datetime(pdf.ds2, unit="s")
                # note: format is %H:%M:%S (a stray "%" here breaks strftime)
                pdf.ds2 = pdf.ds2.dt.strftime("%Y-%m-%d %H:%M:%S")
else:
pdf.ds = pd.to_datetime(pdf.ds, unit="s")
pdf.ds2 = pd.to_datetime(pdf.ds2, unit="s")
pdf.to_sql(
tbl_name,
engine,
schema=schema,
if_exists="replace",
chunksize=500,
dtype={
"ds": String(255) if database.backend == "presto" else Date,
"ds2": String(255) if database.backend == "presto" else DateTime,
"epoch_s": BigInteger,
"epoch_ms": BigInteger,
"string0": String(100),
"string1": String(100),
"string2": String(100),
"string3": String(100),
},
index=False,
)
logger.debug("Done loading table!")
logger.debug("-" * 80)
logger.debug(f"Creating table [{tbl_name}] reference")
table = get_table_connector_registry()
obj = db.session.query(table).filter_by(table_name=tbl_name).first()
if not obj:
obj = table(table_name=tbl_name, schema=schema)
db.session.add(obj)
obj.main_dttm_col = "ds"
obj.database = database
obj.filter_select_enabled = True
dttm_and_expr_dict: dict[str, tuple[Optional[str], None]] = {
"ds": (None, None),
"ds2": (None, None),
"epoch_s": ("epoch_s", None),
"epoch_ms": ("epoch_ms", None),
"string2": ("%Y%m%d-%H%M%S", None),
"string1": ("%Y-%m-%d^%H:%M:%S", None),
"string0": ("%Y-%m-%d %H:%M:%S.%f", None),
"string3": ("%Y/%m/%d%H:%M:%S.%f", None),
}
for col in obj.columns:
dttm_and_expr = dttm_and_expr_dict[col.column_name]
col.python_date_format = dttm_and_expr[0]
col.database_expression = dttm_and_expr[1]
col.is_dttm = True
obj.fetch_metadata()
tbl = obj
logger.debug("Creating Heatmap charts")
for i, col in enumerate(tbl.columns):
slice_data = {
"metrics": ["count"],
"granularity_sqla": col.column_name,
"row_limit": app.config["ROW_LIMIT"],
"since": "2015",
"until": "2016",
"viz_type": "cal_heatmap",
"domain_granularity": "month",
"subdomain_granularity": "day",
}
slc = Slice(
slice_name=f"Calendar Heatmap multiformat {i}",
viz_type="cal_heatmap",
datasource_type=DatasourceType.TABLE,
datasource_id=tbl.id,
params=get_slice_json(slice_data),
)
merge_slice(slc)
misc_dash_slices.add("Calendar Heatmap multiformat 0")
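Each `python_date_format` entry above maps a column to a strptime pattern. A quick stdlib check of how a couple of those patterns parse (the sample strings are hypothetical, constructed to match the formats):

```python
from datetime import datetime

# Sample raw strings, one per pattern from dttm_and_expr_dict above.
samples = {
    "%Y%m%d-%H%M%S": "20150101-123000",                 # string2
    "%Y-%m-%d^%H:%M:%S": "2015-01-01^12:30:00",         # string1
    "%Y/%m/%d%H:%M:%S.%f": "2015/01/0112:30:00.500000", # string3
}
for fmt, raw in samples.items():
    parsed = datetime.strptime(raw, fmt)
    print(fmt, "->", parsed.isoformat())
```

Note that strptime directives match at most the directive's natural width, so even the `%d%H` run with no separator (string3) parses unambiguously.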


@@ -0,0 +1,67 @@
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership. The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.
import logging
from sqlalchemy import inspect, String, Text
import superset.utils.database as database_utils
from superset import db
from superset.sql.parse import Table
from superset.utils import json
from .helpers import get_table_connector_registry, read_example_data
logger = logging.getLogger(__name__)
def load_paris_iris_geojson(only_metadata: bool = False, force: bool = False) -> None:
tbl_name = "paris_iris_mapping"
database = database_utils.get_example_database()
with database.get_sqla_engine() as engine:
schema = inspect(engine).default_schema_name
table_exists = database.has_table(Table(tbl_name, schema))
if not only_metadata and (not table_exists or force):
df = read_example_data("examples://paris_iris.json.gz", compression="gzip")
df["features"] = df.features.map(json.dumps)
df.to_sql(
tbl_name,
engine,
schema=schema,
if_exists="replace",
chunksize=500,
dtype={
"color": String(255),
"name": String(255),
"features": Text,
"type": Text,
},
index=False,
)
logger.debug(f"Creating table {tbl_name} reference")
table = get_table_connector_registry()
tbl = db.session.query(table).filter_by(table_name=tbl_name).first()
if not tbl:
tbl = table(table_name=tbl_name, schema=schema)
db.session.add(tbl)
tbl.description = "Map of Paris"
tbl.database = database
tbl.filter_select_enabled = True
tbl.fetch_metadata()
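The loader serializes each `features` value with `json.dumps` so the nested GeoJSON fits a `Text` column; consumers recover the structure with `json.loads`. A stdlib sketch of the round-trip, using a hypothetical feature standing in for a `paris_iris` row:

```python
import json

# Hypothetical GeoJSON-like feature, for illustration only.
feature = {
    "type": "Feature",
    "geometry": {"type": "Polygon", "coordinates": [[[2.35, 48.85]]]},
}

stored = json.dumps(feature)   # what goes into the TEXT column
restored = json.loads(stored)  # what a reader gets back
assert restored == feature
```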

Some files were not shown because too many files have changed in this diff.