Compare commits

...

22 Commits

Author SHA1 Message Date
Hugh A. Miles II
40c692dd5d Update viz.py 2018-01-22 21:02:57 -08:00
Maxime Beauchemin
712212fc51 Use the query_obj as the basis for the cache key
When we recently moved from hashing form_data to define the cache_key
towards using the rendered query instead, it made it such that
non-deterministic form control values, like relative times specified in
the "from" and "until" time bounds, caused those charts to miss the
cache 100% of the time.

Here we move away from using the rendered query and use the query_obj
instead.
2018-01-22 21:01:42 -08:00
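
For reference, a minimal sketch of the idea, assuming a `query_obj` dict and the slice's `form_data` (names simplified and `default=str` added so arbitrary values serialize; the actual implementation is in the viz.py diff further down):

import copy
import hashlib
import json

def cache_key(query_obj, form_data):
    # Hash the query_obj rather than the rendered SQL.
    cache_dict = copy.deepcopy(query_obj)
    # Drop the resolved absolute bounds, which change on every request...
    for k in ('from_dttm', 'to_dttm'):
        cache_dict.pop(k, None)
    # ...and key on the user-provided inputs instead ("7 days ago", "now").
    for k in ('since', 'until', 'datasource'):
        cache_dict[k] = form_data.get(k)
    json_data = json.dumps(cache_dict, sort_keys=True, default=str)
    return hashlib.md5(json_data.encode('utf-8')).hexdigest()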
Maxime Beauchemin
36956a5d24 Using a NullPool for external connections by default
Currently, even though `get_sqla_engine` calls get memoized, engines are
still short-lived since they are attached to a models.Database ORM
object. All engines created through this method have the scope of a web
request.

Knowing that the SQLAlchemy objects are short-lived means that
a related connection pool would also be short-lived and mostly useless.
I think it's pretty rare that connections get reused within the context
of a view or Celery worker task.

We've noticed on Redshift that Superset was leaving many connections
open (hundreds). This is probably due to a combination of the current
process not garbage collecting connections properly, and perhaps the
absence of a connection timeout on the Redshift side of things. This
could also be related to the fact that we experience web request timeouts
(enforced by gunicorn) and that process-killing may not allow SQLAlchemy
to clean up connections as they occur (which this PR may not help
fix...)

For all these reasons, it seems like the right thing to do is to use
NullPool for external connections (but not for our connection to the
metadata db!).

Opening the PR for conversation. Putting this change into our staging
environment today to run some tests.
2018-01-22 16:59:18 -08:00
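
For context, a minimal sketch of opting into SQLAlchemy's NullPool (hypothetical URI; the actual change lands in `get_sqla_engine`, see the models.py diff below):

from sqlalchemy import create_engine
from sqlalchemy.pool import NullPool

# NullPool opens a connection per checkout and closes it on release,
# so nothing is left open once the short-lived engine is gone.
engine = create_engine(
    'postgresql+psycopg2://user:pass@redshift-host:5439/db',  # hypothetical
    poolclass=NullPool,
)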
Hugh A. Miles II
1c76d583b3 Added DeckGL.Polygon Layer w/ JS controls (#4227)
* Working polygon layer for deckGL

* add js controls

* add thumbnail

* better description

* refactor to leverage line_column controls

* templates: open code and documentation on a new tab (#4217)

As they are external resources.

* Fix tutorial doesn't match the current interface #4138 (#4215)

* [bugfix] markup and iframe viz raise 'Empty query' (#4225)

closes https://github.com/apache/incubator-superset/issues/4222

Related to: https://github.com/apache/incubator-superset/pull/4016

* [bugfix] time_pivot entry got missing in merge conflict (#4221)

PR here https://github.com/apache/incubator-superset/pull/3518 missed a
line of code while resolving merge conflicts with the time_pivot viz

* Improve deck.gl GeoJSON visualization (#4220)

* Improve geoJSON

* Addressing comments

* lint

* refactor to leverage line_column controls

* refactor to use DeckPathViz

* oops
2018-01-18 13:47:04 -08:00
Hugh A. Miles II
ee77f11b27 remove setting spatial in DeckPathViz class (#4235) 2018-01-18 13:47:04 -08:00
michellethomas
3b40e90b40 Don't cache if there's no cache key (#4229) 2018-01-18 13:47:04 -08:00
Peter Lubell-Doughtie
6cd83c3025 add Ona as a user (#4234) 2018-01-18 13:47:03 -08:00
Maxime Beauchemin
af941736a4 Improve deck.gl GeoJSON visualization (#4220)
* Improve geoJSON

* Addressing comments

* lint
2018-01-18 13:47:03 -08:00
Maxime Beauchemin
e502c22c70 [bugfix] time_pivot entry got missing in merge conflict (#4221)
PR here https://github.com/apache/incubator-superset/pull/3518 missed a
line of code while resolving merge conflicts with the time_pivot viz
2018-01-18 13:47:03 -08:00
Maxime Beauchemin
46411bc4ad [bugfix] markup and iframe viz raise 'Empty query' (#4225)
closes https://github.com/apache/incubator-superset/issues/4222

Related to: https://github.com/apache/incubator-superset/pull/4016
2018-01-18 13:47:03 -08:00
Yongjie Zhao
32e06616d9 Fix tutorial doesn't match the current interface #4138 (#4215) 2018-01-18 13:47:03 -08:00
Riccardo Magliocchetti
b3bc1429ac templates: open code and documentation on a new tab (#4217)
As they are external resources.
2018-01-18 13:47:03 -08:00
michellethomas
9b3eef893a Adding limit to time_table viz to get druid query to work (#4207) 2018-01-18 13:47:03 -08:00
Maxime Beauchemin
151657ba3e [line chart] fix time shift color (#4202) 2018-01-18 13:47:03 -08:00
Hugh Miles
23cc83f300 fix mergeconflicts 2018-01-12 13:45:17 -08:00
Hugh Miles
6e820b8355 rm merge arrows 2018-01-09 16:04:24 -08:00
Hugh Miles
8db14c47e7 fixed permissions 2018-01-09 15:44:00 -08:00
Hugh Miles
58ff72776d updated cherry-pick for lyft-endpoints 2018-01-09 14:12:22 -08:00
Maxime Beauchemin
b72bf98f68 Using JS to customize spatial viz and tooltips
(cherry picked from commit df22f29aa49f8e5991e19430aeed816ab08d2dd3)
2018-01-09 09:01:59 -08:00
Hugh Miles
32b466184e Moved lyft specific endpoints into its own file
(cherry picked from commit a2630b41c8d7da859a39fe5f9f5e51e66e9e97b9)
2018-01-09 08:57:21 -08:00
Maxime Beauchemin
bfdfd66160 Simplify login form for oauth
(cherry picked from commit 89ba06d9a6)
2018-01-09 08:57:10 -08:00
Maxime Beauchemin
a4c1d6d5c0 0.23.0rc1 2018-01-09 08:51:11 -08:00
23 changed files with 449 additions and 94 deletions

View File

@@ -165,6 +165,7 @@ the world know they are using Superset. Join our growing community!
- [Konfío](http://konfio.mx)
- [Lyft](https://www.lyft.com/)
- [Maieutical Labs](https://cloudschooling.it)
- [Ona](https://ona.io)
- [Pronto Tools](http://www.prontotools.io)
- [Qunar](https://www.qunar.com/)
- [Shopee](https://shopee.sg)

View File

@@ -23,7 +23,7 @@ Under the **Sources** menu, select the *Databases* option:
.. image:: _static/img/tutorial/tutorial_01_sources_database.png
:scale: 70%
On the resulting page, click on the green plus sign, near the top left:
On the resulting page, click on the green plus sign, near the top right:
.. image:: _static/img/tutorial/tutorial_02_add_database.png
:scale: 70%

Binary file not shown.

Before: 19 KiB  →  After: 24 KiB

Binary file not shown.

After: 433 KiB

View File

@@ -571,6 +571,16 @@ export const controls = {
}),
},
polygon: {
type: 'SelectControl',
label: t('Polygon Column'),
validators: [v.nonEmpty],
description: t('Select the polygon column. Each row should contain JSON.array(N) of [longitude, latitude] points'),
mapStateToProps: state => ({
choices: (state.datasource) ? state.datasource.all_cols : [],
}),
},
point_radius_scale: {
type: 'SelectControl',
freeForm: true,

View File

@@ -517,6 +517,46 @@ export const visTypes = {
],
},
deck_polygon: {
label: t('Deck.gl - Polygon'),
requiresTime: true,
controlPanelSections: [
{
label: t('Query'),
expanded: true,
controlSetRows: [
['line_column', 'line_type'],
['row_limit', null],
],
},
{
label: t('Map'),
controlSetRows: [
['mapbox_style', 'viewport'],
['reverse_long_lat', null],
],
},
{
label: t('Polygon Settings'),
controlSetRows: [
['fill_color_picker', 'stroke_color_picker'],
['filled', 'stroked'],
['extruded', null],
['point_radius_scale', null],
],
},
{
label: t('Advanced'),
controlSetRows: [
['js_columns'],
['js_datapoint_mutator'],
['js_tooltip'],
['js_onclick_href'],
],
},
],
},
deck_arc: {
label: t('Deck.gl - Arc'),
requiresTime: true,
@@ -684,6 +724,7 @@ export const visTypes = {
expanded: true,
controlSetRows: [
['groupby', 'metrics'],
['limit'],
['column_collection'],
['url'],
],

View File

@@ -1,6 +1,6 @@
{
"name": "superset",
"version": "0.23.0dev",
"version": "0.23.0rc1",
"description": "Superset is a data exploration platform designed to be visual, intuitive, and interactive.",
"license": "Apache-2.0",
"directories": {
@@ -66,6 +66,7 @@
"jquery": "3.1.1",
"lodash.throttle": "^4.1.1",
"luma.gl": "^5.0.1",
"mapbox-gl": "^0.43.0",
"mathjs": "^3.16.3",
"moment": "2.18.1",
"mustache": "^2.2.1",

View File

@@ -2,6 +2,7 @@ import React from 'react';
import PropTypes from 'prop-types';
import MapGL from 'react-map-gl';
import DeckGL from 'deck.gl';
import 'mapbox-gl/dist/mapbox-gl.css';
const propTypes = {
viewport: PropTypes.object.isRequired,

View File

@@ -14,40 +14,74 @@ const propertyMap = {
'stroke-width': 'strokeWidth',
};
const convertGeoJsonColorProps = (p, colors) => {
const obj = Object.assign(...Object.keys(p).map(k => ({
[(propertyMap[k]) ? propertyMap[k] : k]: p[k] })));
const alterProps = (props, propOverrides) => {
const newProps = {};
Object.keys(props).forEach((k) => {
if (k in propertyMap) {
newProps[propertyMap[k]] = props[k];
} else {
newProps[k] = props[k];
}
});
if (typeof props.fillColor === 'string') {
newProps.fillColor = hexToRGB(props.fillColor);
}
if (typeof props.strokeColor === 'string') {
newProps.strokeColor = hexToRGB(props.strokeColor);
}
return {
...obj,
fillColor: (colors.fillColor[3] !== 0) ? colors.fillColor : hexToRGB(obj.fillColor),
strokeColor: (colors.strokeColor[3] !== 0) ? colors.strokeColor : hexToRGB(obj.strokeColor),
...newProps,
...propOverrides,
};
};
let features;
const recurseGeoJson = (node, propOverrides, jsFnMutator, extraProps) => {
if (node && node.features) {
node.features.forEach((obj) => {
recurseGeoJson(obj, propOverrides, jsFnMutator, node.extraProps || extraProps);
});
}
if (node && node.geometry) {
const newNode = {
...node,
properties: alterProps(node.properties, propOverrides),
};
if (jsFnMutator) {
jsFnMutator(newNode);
}
if (!newNode.extraProps) {
newNode.extraProps = extraProps;
}
features.push(newNode);
}
};
export default function geoJsonLayer(formData, payload, slice) {
const fd = formData;
const fc = fd.fill_color_picker;
const sc = fd.stroke_color_picker;
let data = payload.data.geojson.features.map(d => ({
...d,
properties: convertGeoJsonColorProps(
d.properties, {
fillColor: [fc.r, fc.g, fc.b, 255 * fc.a],
strokeColor: [sc.r, sc.g, sc.b, 255 * sc.a],
}),
}));
if (fd.js_datapoint_mutator) {
// Applying user defined data mutator if defined
const jsFnMutator = sandboxedEval(fd.js_datapoint_mutator);
data = data.map(jsFnMutator);
const fillColor = [fc.r, fc.g, fc.b, 255 * fc.a];
const strokeColor = [sc.r, sc.g, sc.b, 255 * sc.a];
const propOverrides = {};
if (fillColor[3] > 0) {
propOverrides.fillColor = fillColor;
}
if (strokeColor[3] > 0) {
propOverrides.strokeColor = strokeColor;
}
let jsFnMutator;
if (fd.js_datapoint_mutator) {
// Applying user defined data mutator if defined
jsFnMutator = sandboxedEval(fd.js_datapoint_mutator);
}
features = [];
recurseGeoJson(payload.data, propOverrides, jsFnMutator);
return new GeoJsonLayer({
id: `path-layer-${fd.slice_id}`,
data,
id: `geojson-layer-${fd.slice_id}`,
filled: fd.filled,
data: features,
stroked: fd.stroked,
extruded: fd.extruded,
pointRadiusScale: fd.point_radius_scale,

View File

@@ -7,7 +7,6 @@ export default function getLayer(formData, payload) {
...d,
color: [c.r, c.g, c.b, 255 * c.a],
}));
return new GridLayer({
id: `grid-layer-${fd.slice_id}`,
data,

View File

@@ -6,6 +6,7 @@ import deck_hex from './hex';
import deck_scatter from './scatter';
import deck_geojson from './geojson';
import deck_arc from './arc';
import deck_polygon from './polygon';
const layerGenerators = {
deck_grid,
@@ -15,5 +16,6 @@ const layerGenerators = {
deck_scatter,
deck_geojson,
deck_arc,
deck_polygon,
};
export default layerGenerators;

View File

@@ -0,0 +1,28 @@
import { PolygonLayer } from 'deck.gl';
import * as common from './common';
import sandboxedEval from '../../../javascripts/modules/sandbox';
export default function polygonLayer(formData, payload, slice) {
const fd = formData;
const fc = fd.fill_color_picker;
let data = payload.data.features.map(d => ({
...d,
fillColor: [fc.r, fc.g, fc.b, 255 * fc.a],
}));
if (fd.js_datapoint_mutator) {
// Applying user defined data mutator if defined
const jsFnMutator = sandboxedEval(fd.js_datapoint_mutator);
data = data.map(jsFnMutator);
}
return new PolygonLayer({
id: `path-layer-${fd.slice_id}`,
data,
filled: fd.filled,
stroked: fd.stroked,
extruded: fd.extruded,
...common.commonLayerProps(fd, slice),
});
}

View File

@@ -31,6 +31,7 @@ export const VIZ_TYPES = {
sunburst: 'sunburst',
table: 'table',
time_table: 'time_table',
time_pivot: 'time_pivot',
treemap: 'treemap',
country_map: 'country_map',
word_cloud: 'word_cloud',
@@ -47,6 +48,7 @@ export const VIZ_TYPES = {
deck_geojson: 'deck_geojson',
deck_multi: 'deck_multi',
deck_arc: 'deck_arc',
deck_polygon: 'deck_polygon',
};
const vizMap = {
@@ -94,6 +96,7 @@ const vizMap = {
[VIZ_TYPES.deck_path]: deckglFactory,
[VIZ_TYPES.deck_geojson]: deckglFactory,
[VIZ_TYPES.deck_arc]: deckglFactory,
[VIZ_TYPES.deck_polygon]: deckglFactory,
[VIZ_TYPES.deck_multi]: require('./deckgl/multi.jsx'),
};
export default vizMap;

View File

@@ -1552,6 +1552,36 @@ def load_paris_iris_geojson():
tbl.fetch_metadata()
def load_sf_population_polygons():
tbl_name = 'sf_population_polygons'
with gzip.open(os.path.join(DATA_FOLDER, 'sf_population.json.gz')) as f:
df = pd.read_json(f)
df['contour'] = df.contour.map(json.dumps)
df.to_sql(
tbl_name,
db.engine,
if_exists='replace',
chunksize=500,
dtype={
'zipcode': BigInteger,
'population': BigInteger,
'contour': Text,
'area': BigInteger,
},
index=False)
print("Creating table {} reference".format(tbl_name))
tbl = db.session.query(TBL).filter_by(table_name=tbl_name).first()
if not tbl:
tbl = TBL(table_name=tbl_name)
tbl.description = "Population density of San Francisco"
tbl.database = get_or_create_main_db()
db.session.merge(tbl)
db.session.commit()
tbl.fetch_metadata()
def load_bart_lines():
tbl_name = 'bart_lines'
with gzip.open(os.path.join(DATA_FOLDER, 'bart-lines.json.gz')) as f:

Binary file not shown.

View File

@@ -639,7 +639,7 @@ class Database(Model, AuditMixinNullable, ImportMixin):
@utils.memoized(
watch=('impersonate_user', 'sqlalchemy_uri_decrypted', 'extra'))
def get_sqla_engine(self, schema=None, nullpool=False, user_name=None):
def get_sqla_engine(self, schema=None, nullpool=True, user_name=None):
extra = self.get_extra()
url = make_url(self.sqlalchemy_uri_decrypted)
url = self.db_engine_spec.adjust_database_uri(url, schema)

View File

@@ -0,0 +1,15 @@
{% extends "appbuilder/base.html" %}
{% block content %}
<div class="container">
<div id="loginbox" style="margin-top:50px;" class="mainbox col-md-6 col-md-offset-3 col-sm-8 col-sm-offset-2">
<center>
<a href="/login/google">
<img width="300" src="https://developers.google.com/accounts/images/sign-in-with-google.png">
</a>
</center>
</div>
</div>
{% endblock %}

View File

@@ -34,12 +34,12 @@
</a>
</li>
<li>
<a href="https://github.com/apache/incubator-superset" title="Superset's Github">
<a href="https://github.com/apache/incubator-superset" title="Superset's Github" target="_blank">
<i class="fa fa-github"></i> &nbsp;
</a>
</li>
<li>
<a href="https://superset.incubator.apache.org" title="Documentation">
<a href="https://superset.incubator.apache.org" title="Documentation" target="_blank">
<i class="fa fa-book"></i> &nbsp;
</a>
</li>

View File

@@ -2,3 +2,4 @@ from . import base # noqa
from . import core # noqa
from . import sql_lab # noqa
from . import annotations # noqa
from . import lyft # noqa

View File

@@ -1035,8 +1035,7 @@ class Superset(BaseSupersetView):
return self.get_query_string_response(viz_obj)
try:
payload = viz_obj.get_payload(
force=force)
payload = viz_obj.get_payload(force=force)
except Exception as e:
logging.exception(e)
return json_error_response(utils.error_msg_from_exception(e))

superset/views/lyft.py (new file, 157 lines)
View File

@@ -0,0 +1,157 @@
from __future__ import absolute_import
from __future__ import division
from __future__ import print_function
from __future__ import unicode_literals
import json
import logging
import traceback
from flask import (
g, request, Response,
)
from flask_appbuilder import expose
from flask_babel import gettext as __
from superset import (
app, appbuilder, db, utils,
)
import superset.models.core as models
from superset.views.core import Superset
from superset.utils import QueryStatus
from .base import (
json_error_response, generate_download_headers, CsvResponse,
)
config = app.config
stats_logger = config.get('STATS_LOGGER')
log_this = models.Log.log_this
can_access = utils.can_access
DAR = models.DatasourceAccessRequest
ALL_DATASOURCE_ACCESS_ERR = __(
'This endpoint requires the `all_datasource_access` permission')
DATASOURCE_MISSING_ERR = __('The datasource seems to have been deleted')
ACCESS_REQUEST_MISSING_ERR = __(
'The access requests seem to have been deleted')
USER_MISSING_ERR = __('The user seems to have been deleted')
DATASOURCE_ACCESS_ERR = __("You don't have access to this datasource")
def get_database_access_error_msg(database_name):
return __('This view requires the database %(name)s or '
'`all_datasource_access` permission', name=database_name)
def get_datasource_access_error_msg(datasource_name):
return __('This endpoint requires the datasource %(name)s, database or '
'`all_datasource_access` permission', name=datasource_name)
def json_success(json_msg, status=200):
return Response(json_msg, status=status, mimetype='application/json')
class Lyft(Superset):
@log_this
@expose('/lyft_explore_json/<datasource_type>/<datasource_id>/')
def lyft_explore_json(self, datasource_type, datasource_id):
try:
csv = request.args.get('csv') == 'true'
query = request.args.get('query') == 'true'
force = request.args.get('force') == 'true'
form_data = self.get_form_data()
except Exception as e:
return json_error_response(
utils.error_msg_from_exception(e),
stacktrace=traceback.format_exc())
return self.generate_json(datasource_type=datasource_type,
datasource_id=datasource_id,
form_data=form_data,
csv=csv,
query=query,
force=force)
@log_this
@expose('/lyft_dashboard_json/<dashboard_id>/')
def lyft_dashboard_json(self, dashboard_id):
"""Server side rendering for a dashboard"""
session = db.session()
qry = session.query(models.Dashboard)
if dashboard_id.isdigit():
qry = qry.filter_by(id=int(dashboard_id))
else:
qry = qry.filter_by(slug=dashboard_id)
dash = qry.one()
datasources = set()
for slc in dash.slices:
datasource = slc.datasource
if datasource:
datasources.add(datasource)
# Hack to log the dashboard_id properly, even when getting a slug
@log_this
def dashboard(**kwargs): # noqa
pass
dashboard(dashboard_id=dash.id)
standalone_mode = request.args.get('standalone') == 'true'
dashboard_data = dash.data
dashboard_data.update({
'standalone_mode': standalone_mode,
'dash_save_perm': False,
'dash_edit_perm': False,
})
bootstrap_data = {
'user_id': g.user.get_id(),
'dashboard_data': dashboard_data,
'datasources': {ds.uid: ds.data for ds in datasources},
'common': self.common_bootsrap_payload(),
}
return json_success(json.dumps(bootstrap_data))
def generate_json(self, datasource_type, datasource_id, form_data,
csv=False, query=False, force=False):
try:
viz_obj = self.get_viz(
datasource_type=datasource_type,
datasource_id=datasource_id,
form_data=form_data)
except Exception as e:
logging.exception(e)
return json_error_response(
utils.error_msg_from_exception(e),
stacktrace=traceback.format_exc())
if csv:
return CsvResponse(
viz_obj.get_csv(),
status=200,
headers=generate_download_headers('csv'),
mimetype='application/csv')
if query:
return self.get_query_string_response(viz_obj)
try:
payload = viz_obj.get_payload(
force=force)
except Exception as e:
logging.exception(e)
return json_error_response(utils.error_msg_from_exception(e))
status = 200
if payload.get('status') == QueryStatus.FAILED:
status = 400
return json_success(viz_obj.json_dumps(payload), status=status)
appbuilder.add_view_no_menu(Lyft)

View File

@@ -229,26 +229,32 @@ class BaseViz(object):
def cache_key(self, query_obj):
"""
The cache key is the datasource/query string tuple associated with the
object which needs to be fully deterministic.
"""
The cache key is made out of the key/values in `query_obj`
return hashlib.md5(
json.dumps((
self.datasource.id,
self.datasource.get_query_str(query_obj),
)).encode('utf-8'),
).hexdigest()
We remove datetime bounds that are hard values,
and replace them with the user-provided inputs to bounds, which
may be time-relative (as in "5 days ago" or "now").
"""
cache_dict = copy.deepcopy(query_obj)
for k in ['from_dttm', 'to_dttm']:
del cache_dict[k]
for k in ['since', 'until', 'datasource']:
cache_dict[k] = self.form_data.get(k)
json_data = self.json_dumps(cache_dict, sort_keys=True)
return hashlib.md5(json_data.encode('utf-8')).hexdigest()
def get_payload(self, force=False):
"""Handles caching around the json payload retrieval"""
query_obj = self.query_obj()
cache_key = self.cache_key(query_obj)
cache_key = self.cache_key(query_obj) if query_obj else None
cached_dttm = None
data = None
stacktrace = None
rowcount = None
if not force and cache:
if cache_key and cache and not force:
cache_value = cache.get(cache_key)
if cache_value:
stats_logger.incr('loaded_from_cache')
@@ -282,7 +288,11 @@ class BaseViz(object):
data = None
stacktrace = traceback.format_exc()
if data and cache and self.status != utils.QueryStatus.FAILED:
if (
data and
cache_key and
cache and
self.status != utils.QueryStatus.FAILED):
cached_dttm = datetime.utcnow().isoformat().split('.')[0]
try:
cache_value = json.dumps({
@@ -316,8 +326,13 @@ class BaseViz(object):
'rowcount': rowcount,
}
def json_dumps(self, obj):
return json.dumps(obj, default=utils.json_int_dttm_ser, ignore_nan=True)
def json_dumps(self, obj, sort_keys=False):
return json.dumps(
obj,
default=utils.json_int_dttm_ser,
ignore_nan=True,
sort_keys=sort_keys,
)
@property
def data(self):
@@ -427,9 +442,10 @@ class TableViz(BaseViz):
columns=list(df.columns),
)
def json_dumps(self, obj):
def json_dumps(self, obj, sort_keys=False):
if self.form_data.get('all_columns'):
return json.dumps(obj, default=utils.json_iso_dttm_ser)
return json.dumps(
obj, default=utils.json_iso_dttm_ser, sort_keys=sort_keys)
else:
return super(TableViz, self).json_dumps(obj)
@@ -536,7 +552,10 @@ class MarkupViz(BaseViz):
verbose_name = _('Markup')
is_timeseries = False
def get_df(self):
def query_obj(self):
return None
def get_df(self, query_obj=None):
return None
def get_data(self, df):
@@ -946,7 +965,7 @@ class NVD3TimeSeriesViz(NVD3Viz):
if isinstance(series_title, string_types):
series_title += title_suffix
elif title_suffix and isinstance(series_title, (list, tuple)):
series_title = series_title + (title_suffix,)
series_title = text_type(series_title[-1]) + title_suffix
values = []
for ds in df.index:
@@ -1573,7 +1592,10 @@ class IFrameViz(BaseViz):
credits = 'a <a href="https://github.com/airbnb/superset">Superset</a> original'
is_timeseries = False
def get_df(self):
def query_obj(self):
return None
def get_df(self, query_obj=None):
return None
@@ -1819,14 +1841,6 @@ class BaseDeckGLViz(BaseViz):
self.metric = self.form_data.get('size')
return [self.metric] if self.metric else []
def get_properties(self, d):
return {
'weight': d.get(self.metric) or 1,
}
def get_position(self, d):
raise Exception('Not implemented in child class!')
def process_spatial_query_obj(self, key, group_by):
spatial = self.form_data.get(key)
if spatial is None:
@@ -1892,16 +1906,20 @@ class BaseDeckGLViz(BaseViz):
features = []
for d in df.to_dict(orient='records'):
feature = dict(
position=self.get_position(d),
props=self.get_js_columns(d),
**self.get_properties(d))
feature = self.get_properties(d)
extra_props = self.get_js_columns(d)
if extra_props:
feature['extraProps'] = extra_props
features.append(feature)
return {
'features': features,
'mapboxApiKey': config.get('MAPBOX_API_KEY'),
}
def get_properties(self, d):
raise NotImplementedError()
class DeckScatterViz(BaseDeckGLViz):
@@ -1917,9 +1935,6 @@ class DeckScatterViz(BaseDeckGLViz):
fd.get('point_radius_fixed') or {'type': 'fix', 'value': 500})
return super(DeckScatterViz, self).query_obj()
def get_position(self, d):
return d['spatial']
def get_metrics(self):
self.metric = None
if self.point_radius_fixed.get('type') == 'metric':
@@ -1931,6 +1946,7 @@ class DeckScatterViz(BaseDeckGLViz):
return {
'radius': self.fixed_value if self.fixed_value else d.get(self.metric),
'cat_color': d.get(self.dim) if self.dim else None,
'position': d.get('spatial'),
}
def get_data(self, df):
@@ -1951,8 +1967,11 @@ class DeckScreengrid(BaseDeckGLViz):
verbose_name = _('Deck.gl - Screen Grid')
spatial_control_keys = ['spatial']
def get_position(self, d):
return d['spatial']
def get_properties(self, d):
return {
'position': d.get('spatial'),
'weight': d.get(self.metric) or 1,
}
class DeckGrid(BaseDeckGLViz):
@@ -1963,8 +1982,11 @@ class DeckGrid(BaseDeckGLViz):
verbose_name = _('Deck.gl - 3D Grid')
spatial_control_keys = ['spatial']
def get_position(self, d):
return d['spatial']
def get_properties(self, d):
return {
'position': d.get('spatial'),
'weight': d.get(self.metric) or 1,
}
class DeckPathViz(BaseDeckGLViz):
@@ -1973,14 +1995,11 @@ class DeckPathViz(BaseDeckGLViz):
viz_type = 'deck_path'
verbose_name = _('Deck.gl - Paths')
deck_viz_key = 'path'
deser_map = {
'json': json.loads,
'polyline': polyline.decode,
}
spatial_control_keys = ['spatial']
def get_position(self, d):
return d['spatial']
def query_obj(self):
d = super(DeckPathViz, self).query_obj()
@@ -1998,10 +2017,19 @@ class DeckPathViz(BaseDeckGLViz):
if fd.get('reverse_long_lat'):
path = (path[1], path[0])
return {
'path': path,
self.deck_viz_key: path,
}
class DeckPolygon(DeckPathViz):
"""deck.gl's Polygon Layer"""
viz_type = 'deck_polygon'
deck_viz_key = 'polygon'
verbose_name = _('Deck.gl - Polygon')
class DeckHex(BaseDeckGLViz):
"""deck.gl's DeckLayer"""
@@ -2010,8 +2038,11 @@ class DeckHex(BaseDeckGLViz):
verbose_name = _('Deck.gl - 3D HEX')
spatial_control_keys = ['spatial']
def get_position(self, d):
return d['spatial']
def get_properties(self, d):
return {
'position': d.get('spatial'),
'weight': d.get(self.metric) or 1,
}
class DeckGeoJson(BaseDeckGLViz):
@@ -2023,22 +2054,14 @@ class DeckGeoJson(BaseDeckGLViz):
def query_obj(self):
d = super(DeckGeoJson, self).query_obj()
d['columns'] = [self.form_data.get('geojson')]
d['columns'] += [self.form_data.get('geojson')]
d['metrics'] = []
d['groupby'] = []
return d
def get_data(self, df):
fd = self.form_data
geojson = {
'type': 'FeatureCollection',
'features': [json.loads(item) for item in df[fd.get('geojson')]],
}
return {
'geojson': geojson,
'mapboxApiKey': config.get('MAPBOX_API_KEY'),
}
def get_properties(self, d):
geojson = d.get(self.form_data.get('geojson'))
return json.loads(geojson)
class DeckArc(BaseDeckGLViz):
@@ -2049,20 +2072,18 @@ class DeckArc(BaseDeckGLViz):
verbose_name = _('Deck.gl - Arc')
spatial_control_keys = ['start_spatial', 'end_spatial']
def get_position(self, d):
deck_map = {
'start_spatial': 'sourcePosition',
'end_spatial': 'targetPosition',
def get_properties(self, d):
return {
'sourcePosition': d.get('start_spatial'),
'targetPosition': d.get('end_spatial'),
}
return {deck_map[key]: d[key] for key in self.spatial_control_keys}
def get_data(self, df):
d = super(DeckArc, self).get_data(df)
arcs = d['features']
return {
'arcs': [arc['position'] for arc in arcs],
'arcs': arcs,
'mapboxApiKey': config.get('MAPBOX_API_KEY'),
}

View File

@@ -83,6 +83,18 @@ class CoreTests(SupersetTestCase):
'/superset/slice/{}/?standalone=true'.format(slc.id))
assert 'List Roles' not in resp
def test_cache_key(self):
self.login(username='admin')
slc = self.get_slice('Girls', db.session)
viz = slc.viz
qobj = viz.query_obj()
cache_key = viz.cache_key(qobj)
self.assertEqual(cache_key, viz.cache_key(qobj))
qobj['groupby'] = []
self.assertNotEqual(cache_key, viz.cache_key(qobj))
def test_slice_json_endpoint(self):
self.login(username='admin')
slc = self.get_slice('Girls', db.session)