Mirror of https://github.com/apache/superset.git
Synced 2026-05-03 23:14:29 +00:00

Compare commits: standardiz...flask_conf (16 commits)
Commits:

- b7b0f1c795
- ed73ac4737
- 59fa496221
- 590d39abeb
- c67143592b
- 6e469eb922
- ffe1a0c9ee
- c2a05ea919
- 54f17134b6
- ad8d0bb2fb
- a21a1824e3
- 30e731a15b
- faef33d6ba
- 92bf3b9d4e
- 29b4c480f3
- 1a9da0ff78
.github/workflows/superset-docs-deploy.yml (vendored) — 4 changes

@@ -47,10 +47,12 @@ jobs:
           java-version: '21'
       - name: Install Graphviz
         run: sudo apt-get install -y graphviz
-      - name: Compute Entity Relationship diagram (ERD)
+      - name: Generate documentation artifacts
         env:
           SUPERSET_SECRET_KEY: not-a-secret
+          CI: true
         run: |
+          # Generate ERD
           python scripts/erd/erd.py
           curl -L http://sourceforge.net/projects/plantuml/files/1.2023.7/plantuml.1.2023.7.jar/download > ~/plantuml.jar
           java -jar ~/plantuml.jar -v -tsvg -r -o "${{ github.workspace }}/docs/static/img/" "${{ github.workspace }}/scripts/erd/erd.puml"
.github/workflows/superset-docs-verify.yml (vendored) — 5 changes

@@ -64,6 +64,11 @@ jobs:
         uses: actions/setup-node@v4
         with:
           node-version-file: './docs/.nvmrc'
+      - name: Setup Python Backend
+        uses: ./.github/actions/setup-backend
+        with:
+          python-version: 'current'
+          requirements-type: 'base'
       - name: yarn install
         run: |
           yarn install --check-cache
.gitignore (vendored) — 3 changes

@@ -131,3 +131,6 @@ superset/static/stats/statistics.html
 # LLM-related
 CLAUDE.local.md
 .aider*
+
+# Temporary scratchpad for development
+.scratchpad/
docs/docs/configuration/configuration-reference.mdx — new file, 86 lines

@@ -0,0 +1,86 @@
---
title: Configuration Reference
hide_title: true
sidebar_position: 1
version: 1
---

import ConfigurationTable from '@site/src/components/ConfigurationTable';

# Configuration Reference

This page provides a comprehensive reference for all Superset configuration options. These settings can be configured in your `superset_config.py` file or through environment variables.

## How to Use This Reference

- **Search**: Use the search box to find specific configuration settings
- **Filter by Category**: Use the dropdown to filter by configuration category
- **Environment Variables**: All configurations can be set via environment variables with the `SUPERSET__` prefix
- **Impact Level**: Each setting shows its impact level (low, medium, high)
- **Restart Required**: Settings marked with "RESTART" require a server restart to take effect

## Configuration Settings

<ConfigurationTable showEnvironmentVariables={true} />

## Setting Configuration Values

### In superset_config.py

```python
# Example configuration in superset_config.py
SECRET_KEY = 'your-secret-key-here'
SQLALCHEMY_DATABASE_URI = 'postgresql://user:pass@localhost/superset'
CACHE_DEFAULT_TIMEOUT = 300
```

### Via Environment Variables

```bash
# All configuration keys can be set via environment variables
export SUPERSET__SECRET_KEY="your-secret-key-here"
export SUPERSET__SQLALCHEMY_DATABASE_URI="postgresql://user:pass@localhost/superset"
export SUPERSET__CACHE_DEFAULT_TIMEOUT=300
```

### Configuration Precedence

Configuration values are loaded in the following order (later values override earlier ones):

1. **Default values** from `superset/config_defaults.py`
2. **Base configuration** from `superset/config.py`
3. **Custom configuration file** (if specified via `SUPERSET_CONFIG_PATH`)
4. **superset_config module** (if available in PYTHONPATH)
5. **Environment variables** with `SUPERSET__` prefix
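The layered precedence above can be sketched as a simple ordered merge. This is an illustration only; the layer names and values below are made up, and Superset's actual loader is more involved.

```python
# Illustrative sketch: each later layer overrides earlier ones, mirroring
# the five-step precedence order described above. Not Superset's real loader.
def resolve_config(*layers: dict) -> dict:
    resolved: dict = {}
    for layer in layers:
        resolved.update(layer)  # later layers win on key collisions
    return resolved


defaults = {"ROW_LIMIT": 50000, "SECRET_KEY": "CHANGE_ME"}      # config_defaults.py
user_config = {"ROW_LIMIT": 100000}                             # superset_config.py
env_overrides = {"SECRET_KEY": "from-env"}                      # SUPERSET__* variables

config = resolve_config(defaults, user_config, env_overrides)
print(config)  # {'ROW_LIMIT': 100000, 'SECRET_KEY': 'from-env'}
```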
## Configuration Categories

The configuration settings are organized into the following categories:

- **Security**: Authentication, authorization, and security-related settings
- **Database**: Database connection and SQL-related configurations
- **Performance**: Caching, timeouts, and performance optimization settings
- **Features**: Feature flags and optional functionality toggles
- **UI**: User interface and theming configurations
- **Logging**: Logging and monitoring configurations
- **Email**: Email and notification settings
- **Async**: Asynchronous processing and Celery settings
- **General**: Miscellaneous configuration options

## Important Security Notes

- Always set a strong `SECRET_KEY` in production
- Use environment variables for sensitive configuration values
- Never commit sensitive configuration values to version control
- Regularly rotate secrets and passwords
- Review security-related configurations before deploying

## Need Help?

For detailed information about specific configuration topics, see:

- [Configuring Superset](./configuring-superset.mdx) - General configuration guide
- [Security](../security/security.mdx) - Security configuration
- [Database Configuration](./databases.mdx) - Database-specific settings
- [Cache Configuration](./cache.mdx) - Caching setup
- [Async Queries](./async-queries-celery.mdx) - Celery configuration
docs/docs/configuration/configuring-superset.mdx

@@ -1,19 +1,36 @@
 ---
 title: Configuring Superset
 hide_title: true
-sidebar_position: 1
+sidebar_position: 2
 version: 1
 ---

 # Configuring Superset

-## superset_config.py
+## Configuration Overview
+
+Superset provides a flexible, multi-layered configuration system that supports:
+
+1. **File-based configuration** - Traditional Python configuration files
+2. **Environment variables** - For containerized deployments and CI/CD
+3. **Database-backed settings** - Runtime configuration changes (coming soon)
+4. **Structured metadata** - Rich documentation and validation schemas
+
+### Configuration Priority
+
+Configuration values are loaded in the following order (later values override earlier ones):
+
+1. **Default configuration** - Built-in defaults from `superset/config_defaults.py`
+2. **Environment variables** - Values prefixed with `SUPERSET__`
+3. **User configuration file** - Your custom `superset_config.py` file
+
+### superset_config.py

 Superset exposes hundreds of configurable parameters through its
-[config.py module](https://github.com/apache/superset/blob/master/superset/config.py). The
+[config_defaults.py module](https://github.com/apache/superset/blob/master/superset/config_defaults.py). The
 variables and objects exposed act as a public interface of the bulk of what you may want
 to configure, alter and interface with. In this python module, you'll find all these
-parameters, sensible defaults, as well as rich documentation in the form of comments
+parameters, sensible defaults, as well as structured metadata documentation.

 To configure your application, you need to create your own configuration module, which
 will allow you to override few or many of these parameters. Instead of altering the core module,

@@ -77,12 +94,12 @@ MAPBOX_API_KEY = ''

 :::tip
 Note that it is typical to copy and paste [only] the portions of the
-core [superset/config.py](https://github.com/apache/superset/blob/master/superset/config.py) that
+core [superset/config_defaults.py](https://github.com/apache/superset/blob/master/superset/config_defaults.py) that
 you want to alter, along with the related comments into your own `superset_config.py` file.
 :::

 All the parameters and default values defined
-in [superset/config.py](https://github.com/apache/superset/blob/master/superset/config.py)
+in [superset/config_defaults.py](https://github.com/apache/superset/blob/master/superset/config_defaults.py)
 can be altered in your local `superset_config.py`. Administrators will want to read through the file
 to understand what can be configured locally as well as the default values in place.
@@ -97,6 +114,102 @@ for more information on how to configure it.

 At the very least, you'll want to change `SECRET_KEY` and `SQLALCHEMY_DATABASE_URI`. Continue reading for more about each of these.

+## Environment Variables Configuration
+
+For containerized deployments and CI/CD pipelines, Superset supports configuration through environment variables. This is particularly useful for:
+
+- **Docker deployments** - Configure containers without rebuilding images
+- **Kubernetes environments** - Use ConfigMaps and Secrets
+- **CI/CD pipelines** - Set configuration dynamically based on environment
+- **Development workflows** - Override settings locally without changing files
+
+### Environment Variable Format
+
+All Superset environment variables must use the `SUPERSET__` prefix (note the double underscore):
+
+```bash
+# Basic settings
+export SUPERSET__ROW_LIMIT=100000
+export SUPERSET__SQLLAB_TIMEOUT=60
+
+# Database configuration
+export SUPERSET__SQLALCHEMY_DATABASE_URI="postgresql://user:pass@localhost/superset"
+
+# Secret key
+export SUPERSET__SECRET_KEY="your-secret-key-here"
+```
+
+### JSON and Complex Values
+
+Environment variables automatically parse JSON values for complex configuration:
+
+```bash
+# Feature flags as JSON
+export SUPERSET__FEATURE_FLAGS='{"ENABLE_TEMPLATE_PROCESSING": true, "ENABLE_EXPLORE_DRAG_AND_DROP": true}'
+
+# Database configuration
+export SUPERSET__DATABASE_CONFIG='{"timeout": 60, "pool_size": 10}'
+```
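A minimal sketch of JSON-with-string-fallback parsing, as described above. The helper name and exact behavior are assumptions for illustration, not Superset's actual environment loader.

```python
import json


# Parse an environment variable value as JSON when possible, falling back
# to the raw string otherwise (illustrative; not Superset's real code).
def parse_env_value(raw: str):
    try:
        return json.loads(raw)  # handles numbers, booleans, objects, arrays
    except json.JSONDecodeError:
        return raw  # plain strings pass through unchanged


flags = parse_env_value('{"ENABLE_TEMPLATE_PROCESSING": true}')
secret = parse_env_value("your-secret-key-here")
print(flags)   # {'ENABLE_TEMPLATE_PROCESSING': True}
print(secret)  # your-secret-key-here
```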
+### Nested Configuration
+
+For nested configuration objects, use double underscores (`__`) to separate levels:
+
+```bash
+# This sets FEATURE_FLAGS["ENABLE_TEMPLATE_PROCESSING"] = true
+export SUPERSET__FEATURE_FLAGS__ENABLE_TEMPLATE_PROCESSING=true
+
+# This sets THEME_DEFAULT["token"]["colorPrimary"] = "#ff0000"
+export SUPERSET__THEME_DEFAULT__token__colorPrimary="#ff0000"
+```
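The expansion of a nested variable name into nested keys can be sketched as follows, assuming `__` separates levels after the prefix. This mirrors the examples above but is not Superset's actual implementation.

```python
# Expand names like SUPERSET__A__B=value into nested dict keys
# (illustrative sketch only).
def expand_env(environ: dict, prefix: str = "SUPERSET__") -> dict:
    config: dict = {}
    for name, value in environ.items():
        if not name.startswith(prefix):
            continue  # ignore unrelated variables
        parts = name[len(prefix):].split("__")
        node = config
        for part in parts[:-1]:
            node = node.setdefault(part, {})  # descend, creating levels
        node[parts[-1]] = value
    return config


env = {"SUPERSET__FEATURE_FLAGS__ENABLE_TEMPLATE_PROCESSING": "true"}
print(expand_env(env))  # {'FEATURE_FLAGS': {'ENABLE_TEMPLATE_PROCESSING': 'true'}}
```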
+### Environment Variable Examples
+
+You can view examples of environment variable configuration:
+
+```bash
+# Show all available environment variable examples
+superset config env-examples
+
+# Show current configuration and sources
+superset config show --verbose
+
+# Get a specific configuration value
+superset config get ROW_LIMIT
+```
+
+### Docker Environment Variables
+
+When using Docker, you can set environment variables in your `docker-compose.yml`:
+
+```yaml
+services:
+  superset:
+    image: apache/superset:latest
+    environment:
+      - SUPERSET__ROW_LIMIT=100000
+      - SUPERSET__SQLLAB_TIMEOUT=60
+      - SUPERSET__FEATURE_FLAGS__ENABLE_TEMPLATE_PROCESSING=true
+      # ... other configuration
+```
+
+Or use an environment file:
+
+```bash
+# .env file
+SUPERSET__ROW_LIMIT=100000
+SUPERSET__SQLLAB_TIMEOUT=60
+SUPERSET__SECRET_KEY=your-secret-key-here
+```
+
+```yaml
+services:
+  superset:
+    image: apache/superset:latest
+    env_file:
+      - .env
+```
+
 ## Specifying a SECRET_KEY

 ### Adding an initial SECRET_KEY
@@ -546,3 +659,93 @@ FEATURE_FLAGS = {
 ```

 A current list of feature flags can be found in [RESOURCES/FEATURE_FLAGS.md](https://github.com/apache/superset/blob/master/RESOURCES/FEATURE_FLAGS.md).
+
+## Configuration Introspection
+
+Superset provides CLI commands to inspect and understand your current configuration:
+
+### View Current Configuration
+
+```bash
+# Show all configuration as YAML
+superset config show
+
+# Filter configuration by pattern
+superset config show --filter "ROW_LIMIT"
+
+# Show configuration with sources (where each value comes from)
+superset config show --verbose
+```
+
+### Get Specific Configuration Values
+
+```bash
+# Get a specific configuration value with source information
+superset config get ROW_LIMIT
+
+# Output shows both the value and where it came from:
+# ROW_LIMIT:
+#   source: environment (SUPERSET__ROW_LIMIT)
+#   value: 100000
+```
+
+### Environment Variable Examples
+
+```bash
+# Show examples of environment variables for all documented settings
+superset config env-examples
+
+# This shows:
+# - Basic environment variable syntax
+# - JSON formatting examples
+# - Nested configuration examples
+# - All documented settings with their metadata
+```
+
+### Configuration Sources
+
+The CLI will show you where each configuration value comes from:
+
+- **`environment (SUPERSET__KEY)`** - Value set via environment variable
+- **`superset_config.py`** - Value set in your custom configuration file
+- **`config_defaults.py`** - Default value from Superset's built-in configuration
+
+This helps you understand the configuration precedence and troubleshoot configuration issues.
+
+## Configuration Reference
+
+The following table shows all documented configuration settings with their metadata:
+
+import ConfigurationTable from '@site/src/components/ConfigurationTable';
+
+<ConfigurationTable showEnvironmentVariables={true} />
+
+## Environment Variables Examples
+
+Here are ready-to-use environment variable examples:
+
+import EnvironmentVariablesExample from '@site/src/components/EnvironmentVariablesExample';
+
+<EnvironmentVariablesExample />
+
+## Configuration Metadata and Documentation
+
+Superset's configuration system includes rich metadata for many settings, providing:
+
+- **Type information** - Whether a setting expects an integer, string, boolean, or object
+- **Validation rules** - Minimum/maximum values, allowed options
+- **Documentation** - Detailed descriptions of what each setting does
+- **Impact levels** - How significant changes to this setting are
+- **Restart requirements** - Whether changing this setting requires a restart
+
+This metadata is used for:
+- **CLI documentation** - The `superset config env-examples` command shows this information
+- **Future admin UI** - Settings management interface (coming soon)
+- **Validation** - Ensuring configuration values are valid
+- **API documentation** - Automatic generation of configuration schemas
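As an illustration of what one such metadata record might carry, here is a hypothetical entry. The field names are inferred from the columns the documentation table consumes (key, description, type, default, category, group, env_var, external); both the names and the example values are assumptions, not a published schema.

```python
# Hypothetical shape of a single exported metadata entry (illustration only).
row_limit_metadata = {
    "key": "ROW_LIMIT",                      # the configuration key
    "description": "Example description",    # placeholder text
    "type": "int",                           # expected value type
    "default": 50000,                        # built-in default
    "category": "Performance",               # table category
    "group": None,                           # optional grouping
    "env_var": "SUPERSET__ROW_LIMIT",        # matching environment variable
    "external": False,                       # whether it belongs to a dependency
}

# A settings export would then be a list of such entries:
all_settings = [row_limit_metadata]
print(all_settings[0]["env_var"])  # SUPERSET__ROW_LIMIT
```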
+
+You can also access this information via CLI:
+
+```bash
+superset config env-examples
+```
docs/package.json

@@ -6,8 +6,9 @@
   "scripts": {
     "docusaurus": "docusaurus",
     "_init": "cat src/intro_header.txt ../README.md > docs/intro.md",
-    "start": "yarn run _init && docusaurus start",
-    "build": "yarn run _init && DEBUG=docusaurus:* docusaurus build",
+    "_update-config": "bash scripts/generate_docs.sh",
+    "start": "yarn run _init && yarn run _update-config && docusaurus start",
+    "build": "yarn run _init && yarn run _update-config && DEBUG=docusaurus:* docusaurus build",
     "swizzle": "docusaurus swizzle",
     "deploy": "docusaurus deploy",
     "clear": "docusaurus clear",
@@ -26,6 +27,8 @@
     "@emotion/styled": "^10.0.27",
     "@saucelabs/theme-github-codeblock": "^0.3.0",
     "@superset-ui/style": "^0.14.23",
+    "ag-grid-community": "^34.1.0",
+    "ag-grid-react": "^34.1.0",
     "antd": "^5.26.3",
     "docusaurus-plugin-less": "^2.0.2",
     "less": "^4.3.0",
@@ -34,6 +37,7 @@
     "react": "^18.3.1",
     "react-dom": "^18.3.1",
     "react-github-btn": "^1.4.0",
+    "react-markdown": "^10.1.0",
     "react-svg-pan-zoom": "^3.13.1",
     "swagger-ui-react": "^5.26.0"
   },
docs/scripts/export_config_metadata.py — new file, 116 lines

@@ -0,0 +1,116 @@
#!/usr/bin/env python3
"""
Export configuration metadata to JSON for documentation generation.

This script loads configuration metadata from the Python metadata module
and exports it in JSON format for the documentation React components.

This script is called by docs/scripts/generate_docs.sh as part of the
unified documentation generation process.
"""

import json
import sys
from pathlib import Path
from typing import Any

# Add the superset directory to the Python path
superset_root = Path(__file__).parent.parent.parent
sys.path.insert(0, str(superset_root))


def infer_impact(key: str) -> str:
    """Infer the impact level based on the configuration key name."""
    name_lower = key.lower()

    # High impact - security, database, core functionality
    if any(
        term in name_lower
        for term in [
            "secret",
            "key",
            "password",
            "database",
            "uri",
            "url",
            "security",
            "auth",
        ]
    ):
        return "high"

    # Medium impact - performance, features, UI
    if any(
        term in name_lower
        for term in ["limit", "timeout", "cache", "feature", "flag", "theme"]
    ):
        return "medium"

    # Low impact - logging, debugging, minor settings
    return "low"


def infer_requires_restart(key: str) -> bool:
    """Infer if the configuration requires a restart based on the key name."""
    name_lower = key.lower()

    # These typically require a restart
    if any(
        term in name_lower
        for term in [
            "secret",
            "key",
            "database",
            "uri",
            "url",
            "security",
            "auth",
            "ssl",
            "tls",
        ]
    ):
        return True

    # These typically don't require a restart
    if any(
        term in name_lower for term in ["limit", "timeout", "cache", "log", "debug"]
    ):
        return False

    # Default to requiring a restart for safety
    return True


def export_config_metadata() -> list[dict[str, Any]]:
    """Export configuration metadata as JSON."""
    try:
        # Import from the Python metadata module
        from superset.config_metadata import export_for_documentation

        # Get metadata from the Python source
        metadata_export = export_for_documentation()

        # Export as JSON for the documentation components
        output_dir = Path(__file__).parent.parent / "src" / "resources"
        output_dir.mkdir(exist_ok=True)

        # Write the full export (includes categories, etc.)
        output_file = output_dir / "config_metadata.json"
        with open(output_file, "w") as f:
            json.dump(metadata_export, f, indent=2)

        print(
            f"Exported {len(metadata_export['all_settings'])} configuration "
            f"settings to {output_file}"
        )

        return metadata_export["all_settings"]

    except ImportError as e:
        print(f"Error importing config_metadata: {e}")
        print("Please ensure superset/config_metadata.py exists")
        return []


if __name__ == "__main__":
    export_config_metadata()
docs/scripts/generate_docs.sh — new executable file, 101 lines

@@ -0,0 +1,101 @@
#!/bin/bash

# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership. The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License. You may obtain a copy of the License at
#
#   http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.

# Unified documentation generation script
# This script generates all dynamic documentation artifacts needed for the docs build

set -e

echo "🚀 Generating documentation artifacts..."

# Navigate to the docs directory
cd "$(dirname "$0")/.."

# Track any failures
FAILED_TASKS=()

# 1. Extract configuration schema and export metadata
echo "📊 Extracting configuration schema and exporting metadata..."
if python ../scripts/extract_config_schema.py && python scripts/export_config_metadata.py; then
  echo "✅ Configuration metadata exported successfully"
else
  echo "⚠️  Warning: Failed to export configuration metadata"
  echo "   The documentation build will continue with existing metadata"
  FAILED_TASKS+=("config_metadata")
fi

# 2. Generate OpenAPI documentation
echo "🔌 Generating OpenAPI documentation..."
if python -c "
import sys
sys.path.insert(0, '..')
from superset.app import create_app
from superset.cli.update import update_api_docs
import os

# Set required environment variables
os.environ['SUPERSET_SECRET_KEY'] = 'not-a-secret'

app = create_app()
with app.app_context():
    update_api_docs()
"; then
  echo "✅ OpenAPI documentation generated successfully"
else
  echo "⚠️  Warning: Failed to generate OpenAPI documentation"
  echo "   The documentation build will continue with existing OpenAPI spec"
  FAILED_TASKS+=("openapi")
fi

# 3. Generate ERD (Entity Relationship Diagram) if in CI environment
if [ -n "$CI" ] && [ -f "../scripts/erd/erd.py" ]; then
  echo "🗂️  Generating Entity Relationship Diagram..."
  if python ../scripts/erd/erd.py; then
    echo "✅ ERD generated successfully"
  else
    echo "⚠️  Warning: Failed to generate ERD"
    echo "   The documentation build will continue without updated ERD"
    FAILED_TASKS+=("erd")
  fi
fi

# Summary
echo ""
echo "📝 Documentation generation summary:"
if [[ " ${FAILED_TASKS[*]} " == *"config_metadata"* ]]; then
  echo "  - Configuration metadata: ❌ Failed"
else
  echo "  - Configuration metadata: ✅ Success"
fi
if [[ " ${FAILED_TASKS[*]} " == *"openapi"* ]]; then
  echo "  - OpenAPI documentation: ❌ Failed"
else
  echo "  - OpenAPI documentation: ✅ Success"
fi
if [ -n "$CI" ]; then
  if [[ " ${FAILED_TASKS[*]} " == *"erd"* ]]; then
    echo "  - ERD generation: ❌ Failed"
  else
    echo "  - ERD generation: ✅ Success"
  fi
fi

if [ ${#FAILED_TASKS[@]} -eq 0 ]; then
  echo ""
  echo "🎉 All documentation artifacts generated successfully!"
else
  echo ""
  echo "⚠️  Some tasks failed but documentation build can continue"
  echo "   Failed tasks: ${FAILED_TASKS[*]}"
  echo "   To fix missing dependencies, run: pip install -e ."
fi

echo ""
echo "📁 Generated files:"
[ -f "src/resources/config_metadata.json" ] && echo "  - src/resources/config_metadata.json"
[ -f "static/resources/openapi.json" ] && echo "  - static/resources/openapi.json"
[ -f "static/img/erd.svg" ] && echo "  - static/img/erd.svg"
docs/src/components/ConfigurationTable.tsx — new file, 329 lines

@@ -0,0 +1,329 @@
/**
 * Licensed to the Apache Software Foundation (ASF) under one
 * or more contributor license agreements. See the NOTICE file
 * distributed with this work for additional information
 * regarding copyright ownership. The ASF licenses this file
 * to you under the Apache License, Version 2.0 (the
 * "License"); you may not use this file except in compliance
 * with the License. You may obtain a copy of the License at
 *
 *   http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing,
 * software distributed under the License is distributed on an
 * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
 * KIND, either express or implied. See the License for the
 * specific language governing permissions and limitations
 * under the License.
 */

import React, { useState, useMemo, useCallback } from 'react';
import { AgGridReact } from 'ag-grid-react';
import {
  ColDef,
  GridReadyEvent,
  GridApi,
  ModuleRegistry,
  AllCommunityModule,
} from 'ag-grid-community';
import 'ag-grid-community/styles/ag-grid.css';
import 'ag-grid-community/styles/ag-theme-material.css';
import configMetadata from '../resources/config_metadata.json';

// Register AG Grid modules
ModuleRegistry.registerModules([AllCommunityModule]);

// ConfigSetting interface is defined for type safety but not directly used
// as AG Grid uses dynamic property access
// interface ConfigSetting {
//   key: string;
//   title: string;
//   description: string;
//   details: string;
//   type: string;
//   category: string;
//   group: string;
//   default: any;
//   env_var: string;
//   external: boolean;
//   source: string;
//   supports_callable: boolean;
// }

interface ConfigurationTableProps {
  category?: string;
  showEnvironmentVariables?: boolean;
}

// Custom cell renderers

const KeyCellRenderer = (props: { value: string }) => {
  return <span style={{ fontWeight: 'bold' }}>{props.value}</span>;
};

const TypeCellRenderer = (props: { value: string }) => {
  return <code>{props.value}</code>;
};

const DefaultCellRenderer = (props: { value: unknown }) => {
  const formatDefault = (value: unknown): string => {
    if (value === null || value === undefined || value === 'None') return 'None';
    if (typeof value === 'object') {
      try {
        return JSON.stringify(value, null, 2);
      } catch {
        return String(value);
      }
    }
    return String(value);
  };

  const formatted = formatDefault(props.value);
  const isLong = formatted.length > 50;

  return (
    <code
      style={{
        whiteSpace: isLong ? 'pre-wrap' : 'nowrap',
        wordBreak: isLong ? 'break-all' : 'normal',
      }}
      title={isLong ? formatted : undefined}
    >
      {isLong ? formatted.substring(0, 50) + '...' : formatted}
    </code>
  );
};

const BooleanCellRenderer = (props: { value: boolean }) => {
  return props.value ? '✅ Yes' : '❌ No';
};

const GroupCellRenderer = (props: { value: string | null }) => {
  if (!props.value) return null;
  return (
    <span
      style={{
        backgroundColor: '#f0f0f0',
        padding: '2px 8px',
        borderRadius: '4px',
        fontSize: '0.9em',
      }}
    >
      {props.value}
    </span>
  );
};

const DescriptionCellRenderer = (props: { value: string; data: { details?: string } }) => {
  const hasDetails = props.data.details && props.data.details.trim() !== '';

  return (
    <div style={{ display: 'flex', alignItems: 'center', gap: '6px' }}>
      <span>{props.value || 'No description available'}</span>
      {hasDetails && (
        <span
          style={{
            display: 'inline-flex',
            alignItems: 'center',
            justifyContent: 'center',
            width: '16px',
            height: '16px',
            backgroundColor: '#e8e8e8',
            color: '#666',
            borderRadius: '50%',
            fontSize: '0.8em',
            fontWeight: 'bold',
            cursor: 'help',
            flexShrink: 0,
            border: '1px solid #d0d0d0',
          }}
          title={props.data.details}
        >
          i
        </span>
      )}
    </div>
  );
};

const ConfigurationTable: React.FC<ConfigurationTableProps> = ({
  category, // eslint-disable-line @typescript-eslint/no-unused-vars
  showEnvironmentVariables = false,
}) => {
  const [gridApi, setGridApi] = useState<GridApi | null>(null);
  const [searchText, setSearchText] = useState('');

  // Process data to include only enriched configs
  const rowData = useMemo(() => {
    return configMetadata.all_settings;
  }, []);

  // Column definitions
  const columnDefs = useMemo<ColDef[]>(() => {
    const columns: ColDef[] = [
      {
        field: 'key',
        headerName: 'Configuration Key',
        cellRenderer: KeyCellRenderer,
        width: 280,
        pinned: 'left',
        filter: 'agTextColumnFilter',
        floatingFilter: true,
      },
      {
        field: 'description',
        headerName: 'Description',
        cellRenderer: DescriptionCellRenderer,
        flex: 2,
        minWidth: 300,
        wrapText: true,
        autoHeight: true,
        filter: 'agTextColumnFilter',
        floatingFilter: true,
      },
      {
        field: 'type',
        headerName: 'Type',
        cellRenderer: TypeCellRenderer,
        width: 120,
        filter: 'agTextColumnFilter',
      },
      {
        field: 'default',
        headerName: 'Default',
        cellRenderer: DefaultCellRenderer,
        width: 200,
        filter: 'agTextColumnFilter',
      },
      {
        field: 'category',
        headerName: 'Category',
        width: 120,
        filter: 'agTextColumnFilter',
        floatingFilter: true,
      },
      {
        field: 'group',
        headerName: 'Group',
        cellRenderer: GroupCellRenderer,
        width: 180,
        filter: 'agTextColumnFilter',
        floatingFilter: true,
      },
    ];

    if (showEnvironmentVariables) {
      columns.push({
        field: 'env_var',
        headerName: 'Environment Variable',
        width: 250,
        filter: 'agTextColumnFilter',
        cellRenderer: (props: { value: string }) => (
          <code>{props.value}</code>
        ),
      });
    }

    columns.push({
      field: 'external',
      headerName: 'External',
      cellRenderer: BooleanCellRenderer,
      width: 100,
      filter: true,
    });

    return columns;
  }, [showEnvironmentVariables]);

  const defaultColDef = useMemo<ColDef>(() => ({
    sortable: true,
    resizable: true,
  }), []);

  const onGridReady = useCallback((params: GridReadyEvent) => {
    setGridApi(params.api);
  }, []);

  // Takes the filter value as an argument: reading `searchText` from state
  // inside the change handler would still see the previous keystroke,
  // because React state updates are asynchronous.
  const onFilterTextBoxChanged = useCallback(
    (value: string) => {
      if (gridApi) {
        gridApi.setGridOption('quickFilterText', value);
      }
    },
    [gridApi],
  );

  const exportToCsv = useCallback(() => {
    if (gridApi) {
      gridApi.exportDataAsCsv({
        fileName: 'superset_configuration.csv',
      });
    }
  }, [gridApi]);

  return (
    <div style={{ width: '100%', height: '800px' }}>
      {/* Controls */}
      <div style={{ marginBottom: '20px', display: 'flex', gap: '15px', alignItems: 'center' }}>
        <div style={{ flex: 1 }}>
          <input
            type="text"
            placeholder="Quick filter across all columns..."
            value={searchText}
            onChange={(e) => {
              setSearchText(e.target.value);
              onFilterTextBoxChanged(e.target.value);
            }}
            style={{
              padding: '8px 12px',
              border: '1px solid #ddd',
              borderRadius: '4px',
              width: '100%',
              maxWidth: '400px',
            }}
          />
        </div>

        <button
          onClick={exportToCsv}
          style={{
            padding: '8px 16px',
            backgroundColor: '#1890ff',
            color: 'white',
            border: 'none',
            borderRadius: '4px',
            cursor: 'pointer',
            fontSize: '14px',
          }}
        >
          Export to CSV
        </button>

        <div style={{ color: '#666' }}>
          {rowData.length} configurations
|
||||
</div>
|
||||
</div>
|
||||
|
||||
{/* AG Grid */}
|
||||
<div className="ag-theme-material" style={{ height: '100%', width: '100%' }}>
|
||||
<AgGridReact
|
||||
rowData={rowData}
|
||||
columnDefs={columnDefs}
|
||||
defaultColDef={defaultColDef}
|
||||
onGridReady={onGridReady}
|
||||
animateRows={true}
|
||||
enableCellTextSelection={true}
|
||||
ensureDomOrder={true}
|
||||
tooltipShowDelay={500}
|
||||
pagination={true}
|
||||
paginationPageSize={50}
|
||||
paginationPageSizeSelector={[20, 50, 100, 200]}
|
||||
/>
|
||||
</div>
|
||||
|
||||
{/* Help text */}
|
||||
<div style={{ marginTop: '15px', color: '#666' }}>
|
||||
<p>
|
||||
<strong>Tips:</strong> Click column headers to sort. Use the filter row below headers for column-specific filtering.
|
||||
Hold Shift to sort by multiple columns. Right-click headers for more options.
|
||||
</p>
|
||||
</div>
|
||||
</div>
|
||||
);
|
||||
};
|
||||
|
||||
|
||||
export default ConfigurationTable;
|
||||
181 docs/src/components/EnvironmentVariablesExample.tsx Normal file
@@ -0,0 +1,181 @@
/**
 * Licensed to the Apache Software Foundation (ASF) under one
 * or more contributor license agreements. See the NOTICE file
 * distributed with this work for additional information
 * regarding copyright ownership. The ASF licenses this file
 * to you under the Apache License, Version 2.0 (the
 * "License"); you may not use this file except in compliance
 * with the License. You may obtain a copy of the License at
 *
 * http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing,
 * software distributed under the License is distributed on an
 * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
 * KIND, either express or implied. See the License for the
 * specific language governing permissions and limitations
 * under the License.
 */

import React, { useState } from 'react';
import configMetadata from '../resources/config_metadata.json';

interface EnvironmentVariablesExampleProps {
  category?: string;
}

const EnvironmentVariablesExample: React.FC<
  EnvironmentVariablesExampleProps
> = ({ category }) => {
  const [showAll, setShowAll] = useState(false);

  // Get settings based on category
  const getSettings = () => {
    if (category && configMetadata.by_category[category]) {
      return configMetadata.by_category[category];
    }
    return configMetadata.all_settings;
  };

  const settings = getSettings();
  const displaySettings = showAll ? settings : settings.slice(0, 5);

  const formatDefaultForEnv = (value: unknown): string => {
    if (value === null || value === undefined) return '""';
    if (typeof value === 'object') {
      return `'${JSON.stringify(value)}'`;
    }
    if (typeof value === 'string') {
      return `"${value}"`;
    }
    return String(value);
  };

  const copyToClipboard = (text: string) => {
    navigator.clipboard.writeText(text);
  };

  const generateEnvExample = (setting: { default: unknown; env_var: string }): string => {
    const example = formatDefaultForEnv(setting.default);
    return `export ${setting.env_var}=${example}`;
  };

  const generateAllEnvVars = (): string => {
    return [
      '# Superset Configuration Environment Variables',
      '# Generated from configuration metadata',
      '',
      ...displaySettings.map(setting =>
        [
          `# ${setting.title}`,
          `# ${setting.description}`,
          `# Type: ${setting.type}`,
          `# Impact: ${setting.impact}${
            setting.requires_restart ? ' (requires restart)' : ''
          }`,
          generateEnvExample(setting),
          '',
        ].join('\n'),
      ),
    ].join('\n');
  };

  return (
    <div style={{ margin: '20px 0' }}>
      <div
        style={{
          backgroundColor: '#f6f8fa',
          border: '1px solid #e1e4e8',
          borderRadius: '6px',
          padding: '16px',
          position: 'relative',
        }}
      >
        <div
          style={{
            display: 'flex',
            justifyContent: 'space-between',
            alignItems: 'center',
            marginBottom: '10px',
          }}
        >
          <h4 style={{ margin: 0, color: '#24292e' }}>
            Environment Variables {category && `(${category})`}
          </h4>
          <button
            onClick={() => copyToClipboard(generateAllEnvVars())}
            style={{
              backgroundColor: '#0366d6',
              color: 'white',
              border: 'none',
              padding: '6px 12px',
              borderRadius: '4px',
              cursor: 'pointer',
              fontSize: '12px',
            }}
            title="Copy all environment variables"
          >
            📋 Copy All
          </button>
        </div>

        <pre
          style={{
            backgroundColor: '#f6f8fa',
            border: 'none',
            padding: '0',
            margin: '0',
            fontFamily:
              'SFMono-Regular, Consolas, "Liberation Mono", Menlo, monospace',
            fontSize: '12px',
            lineHeight: '1.45',
            overflow: 'auto',
            maxHeight: '400px',
          }}
        >
          <code>{generateAllEnvVars()}</code>
        </pre>

        {!showAll && settings.length > 5 && (
          <div
            style={{
              textAlign: 'center',
              marginTop: '10px',
              borderTop: '1px solid #e1e4e8',
              paddingTop: '10px',
            }}
          >
            <button
              onClick={() => setShowAll(true)}
              style={{
                backgroundColor: 'transparent',
                border: '1px solid #0366d6',
                color: '#0366d6',
                padding: '6px 12px',
                borderRadius: '4px',
                cursor: 'pointer',
                fontSize: '12px',
              }}
            >
              Show all {settings.length} settings
            </button>
          </div>
        )}
      </div>

      <div
        style={{
          marginTop: '10px',
          fontSize: '14px',
          color: '#586069',
        }}
      >
        <strong>Usage:</strong> Save to a <code>.env</code> file or export
        directly in your shell.
        {category && ` Showing ${settings.length} ${category} settings.`}
      </div>
    </div>
  );
};

export default EnvironmentVariablesExample;
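The component above builds shell `export` lines by quoting each default according to its type: null becomes `""`, objects are JSON-encoded inside single quotes, and strings are double-quoted. A minimal Python sketch of the same quoting rules (the function name `format_default_for_env` is illustrative, mirroring the TSX helper rather than reproducing PR code):

```python
import json


def format_default_for_env(value) -> str:
    """Quote a config default for use in a shell `export` line."""
    if value is None:
        return '""'                        # null/undefined -> empty string
    if isinstance(value, (dict, list)):
        return f"'{json.dumps(value)}'"    # JSON payloads go in single quotes
    if isinstance(value, str):
        return f'"{value}"'                # plain strings are double-quoted
    return str(value)                      # numbers pass through as-is


# Example: building an export line for a hypothetical setting
print(f"export ROW_LIMIT={format_default_for_env(50000)}")
# → export ROW_LIMIT=50000
print(f"export EXTRA_OPTS={format_default_for_env({'a': 1})}")
```

Note the single-quote wrapping for JSON values keeps the inner double quotes intact when the line is sourced by a shell.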
4938 docs/src/resources/config_metadata.json Normal file
File diff suppressed because it is too large
2893 docs/static/resources/openapi.json vendored
File diff suppressed because it is too large
@@ -3993,6 +3993,26 @@ address@^1.0.1:
   resolved "https://registry.yarnpkg.com/address/-/address-1.2.2.tgz#2b5248dac5485a6390532c6a517fda2e3faac89e"
   integrity sha512-4B/qKCfeE/ODUaAUpSwfzazo5x29WD4r3vXiWsB7I2mSDAihwEqKO+g8GELZUQSSAo5e1XTYh3ZVfLyxBc12nA==

 ag-charts-types@12.1.0:
   version "12.1.0"
   resolved "https://registry.yarnpkg.com/ag-charts-types/-/ag-charts-types-12.1.0.tgz#75104b90e5f6ae01b7248ec3f6a8dabc65c3cbb6"
   integrity sha512-qeODwJ1EqKjpwEbp0mQ2wQ0arRNYaZo2BafdAGfcuOwjOBlagSwJvUg5MCvAYZ/W/mg2uEmt7jKMNfDy4ul4+Q==

 ag-grid-community@34.1.0, ag-grid-community@^34.1.0:
   version "34.1.0"
   resolved "https://registry.yarnpkg.com/ag-grid-community/-/ag-grid-community-34.1.0.tgz#6356562b3a544a50bbab6a3d0929029567bbd7bc"
   integrity sha512-3rZiOyyCGqSNqqTsrWafDVj1WfK43jfb53Ka5sqzdOG/yu6ySUFmdc0h/OuGLnkzwW5PC29coQwbS2rkb4c9dA==
   dependencies:
     ag-charts-types "12.1.0"

 ag-grid-react@^34.1.0:
   version "34.1.0"
   resolved "https://registry.yarnpkg.com/ag-grid-react/-/ag-grid-react-34.1.0.tgz#9d89a75f5994a5187cbdf1f44132eb06c2741623"
   integrity sha512-CY1p4/JnvcwOt2HipmsqME9CWz7M21nb3OB1DhJGOvNUaxo1wF6Hb/pKpa20F3F/E93wQCNmCz+gfrSLPuJrQw==
   dependencies:
     ag-grid-community "34.1.0"
     prop-types "^15.8.1"

 aggregate-error@^3.0.0:
   version "3.1.0"
   resolved "https://registry.yarnpkg.com/aggregate-error/-/aggregate-error-3.1.0.tgz#92670ff50f5359bdb7a3e0d40d0ec30c5737687a"
@@ -7220,6 +7240,11 @@ html-tags@^3.3.1:
   resolved "https://registry.yarnpkg.com/html-tags/-/html-tags-3.3.1.tgz#a04026a18c882e4bba8a01a3d39cfe465d40b5ce"
   integrity sha512-ztqyC3kLto0e9WbNp0aeP+M3kTt+nbaIveGmUxAtZa+8iFgKLUOD4YKM5j+f3QD89bra7UeumolZHKuOXnTmeQ==

 html-url-attributes@^3.0.0:
   version "3.0.1"
   resolved "https://registry.yarnpkg.com/html-url-attributes/-/html-url-attributes-3.0.1.tgz#83b052cd5e437071b756cd74ae70f708870c2d87"
   integrity sha512-ol6UPyBWqsrO6EJySPz2O7ZSr856WDrEzM5zMqp+FJJLGMW35cLYmmZnl0vztAZxRUoNZJFTCohfjuIJ8I4QBQ==

 html-void-elements@^3.0.0:
   version "3.0.0"
   resolved "https://registry.yarnpkg.com/html-void-elements/-/html-void-elements-3.0.0.tgz#fc9dbd84af9e747249034d4d62602def6517f1d7"
@@ -10887,6 +10912,23 @@ react-loadable-ssr-addon-v5-slorber@^1.0.1:
   dependencies:
     "@types/react" "*"

 react-markdown@^10.1.0:
   version "10.1.0"
   resolved "https://registry.yarnpkg.com/react-markdown/-/react-markdown-10.1.0.tgz#e22bc20faddbc07605c15284255653c0f3bad5ca"
   integrity sha512-qKxVopLT/TyA6BX3Ue5NwabOsAzm0Q7kAPwq6L+wWDwisYs7R8vZ0nRXqq6rkueboxpkjvLGU9fWifiX/ZZFxQ==
   dependencies:
     "@types/hast" "^3.0.0"
     "@types/mdast" "^4.0.0"
     devlop "^1.0.0"
     hast-util-to-jsx-runtime "^2.0.0"
     html-url-attributes "^3.0.0"
     mdast-util-to-hast "^13.0.0"
     remark-parse "^11.0.0"
     remark-rehype "^11.0.0"
     unified "^11.0.0"
     unist-util-visit "^5.0.0"
     vfile "^6.0.0"

 react-redux@^9.2.0:
   version "9.2.0"
   resolved "https://registry.yarnpkg.com/react-redux/-/react-redux-9.2.0.tgz#96c3ab23fb9a3af2cb4654be4b51c989e32366f5"
291 scripts/extract_config_schema.py Executable file
@@ -0,0 +1,291 @@
#!/usr/bin/env python3
"""
Extract configuration schema from config_defaults.py.

This script parses the existing config_defaults.py file and extracts:
- All configuration keys and their default values
- Comments above each key as descriptions
- Types inferred from the default values

The output is a comprehensive JSON schema that can be used for:
- Documentation generation
- Configuration validation
- IDE autocomplete
"""

import ast
import json
import sys
from pathlib import Path
from typing import Any, Dict, List

# Import the complex object handlers
sys.path.append(str(Path(__file__).parent.parent))
try:
    from superset.config_objects import (
        get_default_for_complex_object,
        get_fully_qualified_type,
        get_object_import_info,
        is_complex_object,
    )
except ImportError:
    # Fallback if import fails
    def get_default_for_complex_object(key: str) -> tuple[Any, str]:
        return f"<Complex object: {key}>", "unknown"

    def is_complex_object(key: str) -> bool:
        return False

    def get_fully_qualified_type(obj: Any) -> str:
        return type(obj).__name__

    def get_object_import_info(obj: Any) -> dict[str, Any]:
        return {
            "module": None,
            "name": str(type(obj).__name__),
            "import_statement": None,
        }


def infer_type(value: Any) -> str:
    """Infer the configuration type from the default value."""
    if value is None:
        return "null"
    elif isinstance(value, bool):
        return "boolean"
    elif isinstance(value, int):
        return "integer"
    elif isinstance(value, float):
        return "number"
    elif isinstance(value, str):
        return "string"
    elif isinstance(value, (list, tuple)):
        return "array"
    elif isinstance(value, dict):
        return "object"
    else:
        return "unknown"


def extract_comments_before_line(lines: List[str], line_num: int) -> List[str]:
    """Extract comments immediately before a configuration line."""
    comments: List[str] = []
    current_line = line_num - 2  # line_num is 1-based, so -2 to get previous line

    # Look backwards for comments, but only go back a few lines to avoid
    # picking up unrelated comments
    max_lookback = min(5, current_line + 1)

    for i in range(max_lookback):
        if current_line - i < 0:
            break

        line = lines[current_line - i].strip()
        if line.startswith("#"):
            # Remove the '#' and clean up the comment
            comment = line[1:].strip()
            if comment:  # Only add non-empty comments
                comments.insert(0, comment)
        elif line == "":
            # Empty line - continue looking
            continue
        else:
            # Non-comment, non-empty line - stop looking
            break

    return comments


def safe_eval(node: ast.AST) -> Any:  # noqa: C901
    """Safely evaluate an AST node to get its value."""
    try:
        # Handle basic constant values
        if isinstance(node, ast.Constant):
            return node.value
        elif isinstance(node, ast.Num):  # Python < 3.8
            return node.n
        elif isinstance(node, ast.Str):  # Python < 3.8
            return node.s
        elif isinstance(node, ast.List):
            return [safe_eval(item) for item in node.elts]
        elif isinstance(node, ast.Dict):
            return {
                safe_eval(k): safe_eval(v)
                for k, v in zip(node.keys, node.values, strict=False)
                if k is not None
            }
        elif isinstance(node, ast.Name):
            # Handle common constants
            if node.id in ("True", "False", "None"):
                return {"True": True, "False": False, "None": None}[node.id]
            else:
                return f"<{node.id}>"  # Placeholder for variables
        elif isinstance(node, ast.Call):
            # Handle function calls - try to identify the function being called
            if isinstance(node.func, ast.Name):
                func_name = node.func.id
                if func_name in ("int", "float", "str", "bool"):
                    # Handle type constructors
                    if node.args:
                        arg_val = safe_eval(node.args[0])
                        if isinstance(arg_val, (int, float, str, bool)):
                            try:
                                return eval(func_name)(arg_val)  # noqa: S307
                            except Exception:
                                return f"<{func_name}()>"
                    return f"<{func_name}()>"
                elif func_name == "timedelta":
                    # Handle timedelta calls
                    return "<timedelta()>"
                else:
                    return f"<{func_name}()>"
            elif isinstance(node.func, ast.Attribute):
                # Handle method calls like obj.method()
                method_name = (
                    ast.unparse(node.func) if hasattr(ast, "unparse") else "method_call"
                )
                return f"<{method_name}()>"
            else:
                return "<function_call>"
        elif isinstance(node, ast.Attribute):
            # Handle attribute access like obj.attr
            try:
                attr_str = ast.unparse(node) if hasattr(ast, "unparse") else "attribute"
                return f"<{attr_str}>"
            except Exception:
                return "<attribute>"
        else:
            # For everything else, just return a descriptive placeholder
            return f"<{type(node).__name__}>"
    except Exception:
        return "<unknown>"


def extract_config_schema(config_file: Path) -> Dict[str, Any]:
    """Extract configuration schema from config_defaults.py."""
    with open(config_file, "r") as f:
        content = f.read()
    lines = content.splitlines()

    # Parse the Python file
    tree = ast.parse(content)

    schema = {}

    for node in ast.walk(tree):
        if isinstance(node, ast.Assign):
            # Check if this is a simple assignment to a variable
            if len(node.targets) == 1 and isinstance(node.targets[0], ast.Name):
                var_name = node.targets[0].id

                # Only include uppercase variables (configuration convention)
                if var_name.isupper():
                    # Get the default value
                    default_value = safe_eval(node.value)

                    # Check if this is a complex object
                    if is_complex_object(var_name):
                        # Get the proper default value and type for complex objects
                        default_value, type_name = get_default_for_complex_object(
                            var_name
                        )
                        config_type = type_name
                    else:
                        # Infer type from default value
                        config_type = infer_type(default_value)

                    # Get comments before this line
                    comments = extract_comments_before_line(lines, node.lineno)
                    description = " ".join(comments) if comments else ""

                    # Determine category based on variable name patterns
                    category = categorize_config(var_name)

                    schema[var_name] = {
                        "type": config_type,
                        "default": default_value,
                        "description": description,
                        "category": category,
                    }

                    # Add additional metadata for complex objects
                    if is_complex_object(var_name):
                        schema[var_name]["is_complex_object"] = True

    return schema


def categorize_config(var_name: str) -> str:
    """Categorize configuration variables based on their names."""
    name_lower = var_name.lower()

    if any(term in name_lower for term in ["limit", "timeout", "cache", "pool"]):
        return "performance"
    elif any(term in name_lower for term in ["feature", "flag", "enable", "disable"]):
        return "features"
    elif any(term in name_lower for term in ["theme", "color", "style", "ui"]):
        return "ui"
    elif any(term in name_lower for term in ["db", "database", "sql", "query"]):
        return "database"
    elif any(term in name_lower for term in ["auth", "security", "login", "oauth"]):
        return "security"
    elif any(term in name_lower for term in ["log", "debug", "stats"]):
        return "logging"
    elif any(term in name_lower for term in ["mail", "smtp", "email"]):
        return "email"
    elif any(term in name_lower for term in ["celery", "async", "worker"]):
        return "async"
    else:
        return "general"


def main() -> None:
    """Extract configuration schema and save to JSON."""
    superset_root = Path(__file__).parent.parent
    config_file = superset_root / "superset" / "config_defaults.py"

    if not config_file.exists():
        print(f"Error: {config_file} not found")
        sys.exit(1)

    print("Extracting configuration schema...")
    schema = extract_config_schema(config_file)

    # Create output structure
    output = {
        "metadata": {
            "generated_from": str(config_file),
            "total_configs": len(schema),
            "description": (
                "Superset configuration schema extracted from config_defaults.py"
            ),
        },
        "configs": schema,
        "by_category": {},
    }

    # Group by category
    for key, config in schema.items():
        category = config["category"]
        if category not in output["by_category"]:
            output["by_category"][category] = {}
        output["by_category"][category][key] = config

    # Save to JSON
    output_file = superset_root / "superset" / "config_schema.json"
    with open(output_file, "w") as f:
        json.dump(output, f, indent=2, default=str)

    print("✅ Schema extracted successfully!")
    print(f"📊 Total configurations: {len(schema)}")
    print(f"📂 Categories: {list(output['by_category'].keys())}")
    print(f"💾 Saved to: {output_file}")

    # Show some stats
    print("\n📈 Category breakdown:")
    for category, configs in output["by_category"].items():
        print(f"  {category}: {len(configs)} configs")


if __name__ == "__main__":
    main()
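The core pattern in the script above is a static AST walk: find simple `NAME = value` assignments where the target is uppercase (Superset's configuration convention) and evaluate the right-hand side without importing the module. A minimal standalone sketch of that pattern, using stdlib `ast` only (the `SOURCE` snippet and function name are illustrative, not part of the PR):

```python
import ast

SOURCE = '''
# Maximum number of rows returned in a single query
ROW_LIMIT = 50000

helper = "lowercase names are skipped by convention"
'''


def extract_upper_assignments(source: str) -> dict:
    """Collect UPPERCASE = <constant> assignments via a static AST walk."""
    tree = ast.parse(source)
    found = {}
    for node in ast.walk(tree):
        if (
            isinstance(node, ast.Assign)
            and len(node.targets) == 1
            and isinstance(node.targets[0], ast.Name)
            and node.targets[0].id.isupper()
        ):
            # literal_eval handles constants safely; non-literal values would
            # need a placeholder strategy like the script's safe_eval()
            found[node.targets[0].id] = ast.literal_eval(node.value)
    return found


print(extract_upper_assignments(SOURCE))  # → {'ROW_LIMIT': 50000}
```

Unlike the script's `safe_eval`, `ast.literal_eval` raises on function calls and names, which is why the PR falls back to `<placeholder>` strings for those cases.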
324 scripts/extract_config_types.py Normal file
@@ -0,0 +1,324 @@
|
||||
#!/usr/bin/env python3
|
||||
# Licensed to the Apache Software Foundation (ASF) under one
|
||||
# or more contributor license agreements. See the NOTICE file
|
||||
# distributed with this work for additional information
|
||||
# regarding copyright ownership. The ASF licenses this file
|
||||
# to you under the Apache License, Version 2.0 (the
|
||||
# "License"); you may not use this file except in compliance
|
||||
# with the License. You may obtain a copy of the License at
|
||||
#
|
||||
# http://www.apache.org/licenses/LICENSE-2.0
|
||||
#
|
||||
# Unless required by applicable law or agreed to in writing,
|
||||
# software distributed under the License is distributed on an
|
||||
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
|
||||
# KIND, either express or implied. See the License for the
|
||||
# specific language governing permissions and limitations
|
||||
# under the License.
|
||||
"""Extract configuration types from runtime inspection of config.py.
|
||||
|
||||
This script imports the actual config module and extracts type information
|
||||
through runtime introspection, providing more accurate type data than
|
||||
static analysis.
|
||||
"""
|
||||
|
||||
import ast
|
||||
import inspect
|
||||
import sys
|
||||
from pathlib import Path
|
||||
from typing import Any, Dict, List, Optional
|
||||
|
||||
# Add superset to path
|
||||
sys.path.insert(0, str(Path(__file__).parent.parent))
|
||||
|
||||
|
||||
def get_source_comment(module_path: str, var_name: str) -> Optional[str]:
|
||||
"""Extract comment from source code for a variable."""
|
||||
try:
|
||||
with open(module_path, "r") as f:
|
||||
content = f.read()
|
||||
|
||||
tree = ast.parse(content)
|
||||
lines = content.splitlines()
|
||||
|
||||
for node in ast.walk(tree):
|
||||
if isinstance(node, ast.Assign):
|
||||
if len(node.targets) == 1 and isinstance(node.targets[0], ast.Name):
|
||||
if node.targets[0].id == var_name:
|
||||
# Look for comments above this line
|
||||
line_num = node.lineno - 1 # Convert to 0-based
|
||||
comments = []
|
||||
|
||||
# Look backwards for comments
|
||||
for i in range(min(5, line_num)):
|
||||
check_line = line_num - i - 1
|
||||
if check_line < 0:
|
||||
break
|
||||
|
||||
line = lines[check_line].strip()
|
||||
if line.startswith("#"):
|
||||
comment = line[1:].strip()
|
||||
if comment:
|
||||
comments.insert(0, comment)
|
||||
elif line and not line.startswith("#"):
|
||||
break
|
||||
|
||||
return " ".join(comments) if comments else None
|
||||
|
||||
return None
|
||||
except Exception:
|
||||
return None
|
||||
|
||||
|
||||
def analyze_value(value: Any) -> Dict[str, Any]:
|
||||
"""Analyze a configuration value to extract type information."""
|
||||
analysis = {
|
||||
"python_type": type(value),
|
||||
"type_name": type(value).__name__,
|
||||
"module": getattr(type(value), "__module__", None),
|
||||
"is_callable": callable(value),
|
||||
"is_none": value is None,
|
||||
}
|
||||
|
||||
# Basic type categorization
|
||||
if value is None:
|
||||
analysis["category"] = "null"
|
||||
elif isinstance(value, bool):
|
||||
analysis["category"] = "boolean"
|
||||
elif isinstance(value, int):
|
||||
analysis["category"] = "integer"
|
||||
elif isinstance(value, float):
|
||||
analysis["category"] = "number"
|
||||
elif isinstance(value, str):
|
||||
analysis["category"] = "string"
|
||||
elif isinstance(value, (list, tuple)):
|
||||
analysis["category"] = "array"
|
||||
# Sample item types
|
||||
if value:
|
||||
item_types = list(set(type(item).__name__ for item in value[:5]))
|
||||
analysis["item_types"] = item_types
|
||||
elif isinstance(value, dict):
|
||||
analysis["category"] = "object"
|
||||
# Sample key/value types
|
||||
if value:
|
||||
keys = list(value.keys())[:5]
|
||||
key_types = list(set(type(k).__name__ for k in keys))
|
||||
val_types = list(set(type(value[k]).__name__ for k in keys))
|
||||
analysis["key_types"] = key_types
|
||||
analysis["value_types"] = val_types
|
||||
elif callable(value):
|
||||
analysis["category"] = "function"
|
||||
try:
|
||||
analysis["signature"] = str(inspect.signature(value))
|
||||
except Exception:
|
||||
pass
|
||||
else:
|
||||
analysis["category"] = "object"
|
||||
analysis["class_name"] = f"{type(value).__module__}.{type(value).__name__}"
|
||||
|
||||
# Serialization check
|
||||
try:
|
||||
import json
|
||||
|
||||
json.dumps(value)
|
||||
analysis["serializable"] = True
|
||||
except Exception:
|
||||
analysis["serializable"] = False
|
||||
|
||||
return analysis
|
||||
|
||||
|
||||
def categorize_config_key(key: str) -> str:
|
||||
"""Categorize a configuration key based on its name."""
|
||||
key_lower = key.lower()
|
||||
|
||||
if any(
|
||||
term in key_lower
|
||||
for term in ["secret", "key", "password", "auth", "oauth", "login"]
|
||||
):
|
||||
return "security"
|
||||
elif any(
|
||||
term in key_lower for term in ["db", "database", "sql", "query", "engine"]
|
||||
):
|
||||
return "database"
|
||||
elif any(
|
||||
term in key_lower for term in ["limit", "timeout", "cache", "pool", "async"]
|
||||
):
|
||||
return "performance"
|
||||
elif any(term in key_lower for term in ["feature", "flag", "enable", "disable"]):
|
||||
return "features"
|
||||
elif any(
|
||||
term in key_lower for term in ["theme", "color", "style", "ui", "frontend"]
|
||||
):
|
||||
return "ui"
|
||||
elif any(term in key_lower for term in ["log", "debug", "stats", "event"]):
|
||||
return "logging"
|
||||
elif any(term in key_lower for term in ["mail", "smtp", "email"]):
|
||||
return "email"
|
||||
elif any(term in key_lower for term in ["celery", "worker", "beat", "task"]):
|
||||
return "async"
|
||||
else:
|
||||
return "general"
|
||||
|
||||
|
||||
def extract_config_types() -> Dict[str, Any]:
|
||||
"""Extract type information from the config module."""
|
||||
try:
|
||||
# Import the config module
|
||||
from superset import config
|
||||
|
||||
# Get module path for comment extraction
|
||||
config_path = inspect.getfile(config)
|
||||
|
||||
results = {}
|
||||
|
||||
# Get all uppercase attributes (configuration convention)
|
||||
for name in dir(config):
|
||||
if name.isupper() and not name.startswith("_"):
|
||||
value = getattr(config, name)
|
||||
|
||||
# Analyze the value
|
||||
analysis = analyze_value(value)
|
||||
|
||||
# Get source comment
|
||||
comment = get_source_comment(config_path, name)
|
||||
|
||||
# Categorize
|
||||
category = categorize_config_key(name)
|
||||
|
||||
results[name] = {
|
||||
"key": name,
|
||||
"value_analysis": analysis,
|
||||
"description": comment,
|
||||
"category": category,
|
||||
"current_value": value
|
||||
if analysis.get("serializable")
|
||||
else f"<{analysis['type_name']} instance>",
|
||||
}
|
||||
|
||||
return results
|
||||
|
||||
except ImportError as e:
|
||||
print(f"Error importing config: {e}")
|
||||
return {}
|
||||
|
||||
|
||||
def compare_with_metadata() -> Dict[str, Any]:
|
||||
"""Compare runtime config with defined metadata."""
|
||||
from superset.config_metadata import CONFIG_METADATA
|
||||
|
||||
runtime_configs = extract_config_types()
|
||||
|
||||
comparison = {
|
||||
"in_metadata_only": [],
|
||||
"in_runtime_only": [],
|
||||
"type_mismatches": [],
|
||||
"matching": [],
|
||||
}
|
||||
|
||||
metadata_keys = set(CONFIG_METADATA.keys())
|
||||
runtime_keys = set(runtime_configs.keys())
|
||||
|
||||
# Keys only in metadata
|
||||
comparison["in_metadata_only"] = sorted(metadata_keys - runtime_keys)
|
||||
|
||||
# Keys only in runtime
|
||||
comparison["in_runtime_only"] = sorted(runtime_keys - metadata_keys)
|
||||
    # Check for type mismatches
    for key in metadata_keys & runtime_keys:
        metadata_type = CONFIG_METADATA[key].type
        runtime_type = runtime_configs[key]["value_analysis"]["python_type"]

        if metadata_type != runtime_type:
            comparison["type_mismatches"].append(
                {
                    "key": key,
                    "metadata_type": str(metadata_type),
                    "runtime_type": str(runtime_type),
                }
            )
        else:
            comparison["matching"].append(key)

    return comparison


def suggest_metadata_entries() -> List[str]:
    """Suggest metadata entries for configs not yet documented."""
    runtime_configs = extract_config_types()
    from superset.config_metadata import CONFIG_METADATA

    suggestions = []

    for key, info in runtime_configs.items():
        if key not in CONFIG_METADATA:
            analysis = info["value_analysis"]

            # Build suggested metadata entry
            suggestion = f"""    "{key}": ConfigMetadata(
        key="{key}",
        type={analysis["type_name"]},
        default={repr(info["current_value"]) if analysis["serializable"] else f"{analysis['type_name']}()"},
        description="{info.get("description", "TODO: Add description")}",
        category="{info["category"]}",
        impact="medium",
        requires_restart={"True" if info["category"] in ["security", "database"] else "False"},"""

            if analysis["category"] == "integer":
                suggestion += "\n        min_value=1,"

            if not analysis["serializable"]:
                suggestion += f'\n        serializable=False,\n        doc_default="<{analysis["type_name"]} instance>",'

            suggestion += "\n    ),"

            suggestions.append(suggestion)

    return suggestions


def main():
    """Main function to run type extraction."""
    print("Extracting configuration types from runtime...")

    # Extract types
    runtime_configs = extract_config_types()
    print(f"Found {len(runtime_configs)} configuration variables")

    # Compare with metadata
    print("\nComparing with defined metadata...")
    comparison = compare_with_metadata()

    print(f"  - Matching: {len(comparison['matching'])}")
    print(f"  - Only in metadata: {len(comparison['in_metadata_only'])}")
    print(f"  - Only in runtime: {len(comparison['in_runtime_only'])}")
    print(f"  - Type mismatches: {len(comparison['type_mismatches'])}")

    if comparison["in_runtime_only"]:
        print(f"\nConfigs missing metadata: {len(comparison['in_runtime_only'])}")
        print("Generating suggestions...")

        suggestions = suggest_metadata_entries()

        # Save suggestions to file
        output_file = Path(__file__).parent / "suggested_metadata.py"
        with open(output_file, "w") as f:
            f.write("# Suggested metadata entries for undocumented configs\n\n")
            f.write("\n\n".join(suggestions))

        print(f"Suggestions saved to: {output_file}")

    # Show type distribution
    type_dist = {}
    for config in runtime_configs.values():
        cat = config["value_analysis"]["category"]
        type_dist[cat] = type_dist.get(cat, 0) + 1

    print("\nType distribution:")
    for cat, count in sorted(type_dist.items()):
        print(f"  {cat}: {count}")


if __name__ == "__main__":
    main()
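The type-distribution tally at the end of `main()` above counts categories by hand with a dict; the same tally can be written with `collections.Counter`. A standalone sketch, using a made-up `runtime_configs` mapping (the keys and categories here are illustrative, not values extracted from Superset):

```python
from collections import Counter

# Hypothetical output of extract_config_types(), trimmed to the field we need
runtime_configs = {
    "ROW_LIMIT": {"value_analysis": {"category": "integer"}},
    "SECRET_KEY": {"value_analysis": {"category": "string"}},
    "FEATURE_FLAGS": {"value_analysis": {"category": "dict"}},
    "SQLLAB_TIMEOUT": {"value_analysis": {"category": "integer"}},
}

# Counter replaces the manual type_dist.get(cat, 0) + 1 bookkeeping
type_dist = Counter(
    config["value_analysis"]["category"] for config in runtime_configs.values()
)
for cat, count in sorted(type_dist.items()):
    print(f"  {cat}: {count}")
```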
3025
scripts/suggested_metadata.py
Normal file
File diff suppressed because it is too large
Load Diff
171
superset/cli/config.py
Normal file
@@ -0,0 +1,171 @@
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership. The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.
"""Configuration introspection CLI commands."""

import re
from typing import Any

import click
import yaml
from flask.cli import with_appcontext

from superset import app


def serialize_config_value(value: Any) -> Any:
    """Serialize config values for YAML output, handling callables and objects."""
    # Check for classes before callables: classes are callable, so the
    # callable branch would otherwise shadow this one
    if isinstance(value, type):
        return f"<class: {value.__module__}.{value.__name__}>"
    elif callable(value):
        name = value.__name__ if hasattr(value, "__name__") else repr(value)
        return f"<callable: {name}>"
    elif hasattr(value, "__module__") and hasattr(value, "__name__"):
        return f"<object: {value.__module__}.{value.__name__}>"
    else:
        try:
            # Try to serialize with yaml to check if it's serializable
            yaml.safe_dump(value)
            return value
        except yaml.YAMLError:
            return repr(value)


def get_config_source(key: str) -> str:
    """Determine where a config value comes from."""
    import os

    # Check if it's from environment variables (with double underscore prefix)
    env_key = f"SUPERSET__{key}"
    if env_key in os.environ:
        return f"environment ({env_key})"

    # Check if it's a superset_config.py user override
    try:
        import superset_config

        if hasattr(superset_config, key):
            return "superset_config.py"
    except ImportError:
        pass

    # Otherwise it's from defaults
    return "config_defaults.py"


@click.group()
def config() -> None:
    """Configuration introspection commands."""


@config.command()
@with_appcontext
@click.option("--filter", "-f", help="Filter config keys (regex pattern)")
@click.option(
    "--verbose", "-v", is_flag=True, help="Show source information for each key"
)
def show(filter: str, verbose: bool) -> None:
    """Show effective configuration as YAML."""
    config_dict = {}

    # Iterate over actual config keys (not Flask Config methods)
    # Use app.config.keys() to get only the configuration values
    for key in app.config.keys():
        # Apply filter if provided
        if filter and not re.search(filter, key, re.IGNORECASE):
            continue

        value = app.config[key]
        serialized_value = serialize_config_value(value)

        if verbose:
            source = get_config_source(key)
            config_dict[key] = {"value": serialized_value, "source": source}
        else:
            config_dict[key] = serialized_value

    # Output as YAML
    print(yaml.dump(config_dict, default_flow_style=False, sort_keys=True))


@config.command()
@with_appcontext
@click.argument("key")
def get(key: str) -> None:
    """Get a specific configuration value."""
    if key not in app.config:
        click.echo(f"Configuration key '{key}' not found", err=True)
        return

    value = app.config[key]
    serialized_value = serialize_config_value(value)
    source = get_config_source(key)

    result = {key: {"value": serialized_value, "source": source}}

    print(yaml.dump(result, default_flow_style=False))


@config.command()
@with_appcontext
def env_examples() -> None:
    """Show example environment variables for configuration."""
    from superset.config_extensions import SupersetConfig

    examples = [
        "# Superset configuration via environment variables",
        "# All environment variables must start with the SUPERSET__ prefix "
        "(note the double underscore)",
        "",
        "# Basic settings",
        "export SUPERSET__ROW_LIMIT=100000",
        "export SUPERSET__SAMPLES_ROW_LIMIT=10000",
        "export SUPERSET__SQLLAB_TIMEOUT=60",
        "",
        "# Feature flags (JSON format)",
        'export SUPERSET__FEATURE_FLAGS=\'{"ENABLE_TEMPLATE_PROCESSING": true, '
        '"ENABLE_EXPLORE_DRAG_AND_DROP": true}\'',
        "",
        "# Or use triple underscore for nested values",
        "export SUPERSET__FEATURE_FLAGS__ENABLE_TEMPLATE_PROCESSING=true",
        "export SUPERSET__FEATURE_FLAGS__ENABLE_EXPLORE_DRAG_AND_DROP=true",
        "",
        "# Theme configuration",
        'export SUPERSET__THEME_DEFAULT=\'{"colors": '
        '{"primary": {"base": "#1985a1"}}}\'',
        "",
        "# Lists and complex types",
        'export SUPERSET__FAB_ROLES=\'["Admin", "Alpha", "Gamma"]\'',
        "",
    ]

    for line in examples:
        click.echo(line)

    # Show documented settings if using SupersetConfig
    if isinstance(app.config, SupersetConfig):
        click.echo("\n# Documented settings with metadata:")
        for key, schema in app.config.DATABASE_SETTINGS_SCHEMA.items():
            click.echo(f"\n# {schema.get('title', key)}")
            click.echo(f"# {schema.get('description', '')}")
            click.echo(f"# Type: {schema.get('type', 'unknown')}")
            if "minimum" in schema or "maximum" in schema:
                click.echo(
                    f"# Range: {schema.get('minimum', 'N/A')} - "
                    f"{schema.get('maximum', 'N/A')}"
                )
            click.echo(f"export SUPERSET__{key}={schema.get('default', '...')}")
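The `serialize_config_value` helper in `superset/cli/config.py` above turns non-dumpable values into readable placeholders. A standalone sketch of the same contract that runs without Superset or Flask, using `json` in place of `yaml` for the serializability probe (an assumption for self-containedness; the CLI uses `yaml.safe_dump`):

```python
import json
from typing import Any


def serialize_config_value(value: Any) -> Any:
    """Mirror of the CLI helper: render a config value as something dumpable."""
    if isinstance(value, type):
        # Classes get a qualified placeholder
        return f"<class: {value.__module__}.{value.__name__}>"
    if callable(value):
        # Functions and other callables get their name
        name = value.__name__ if hasattr(value, "__name__") else repr(value)
        return f"<callable: {name}>"
    try:
        json.dumps(value)  # probe serializability (stand-in for yaml.safe_dump)
        return value
    except TypeError:
        return repr(value)


print(serialize_config_value(42))    # plain values pass through -> 42
print(serialize_config_value(len))   # -> <callable: len>
print(serialize_config_value(dict))  # -> <class: builtins.dict>
```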
16
superset/commands/settings/__init__.py
Normal file
@@ -0,0 +1,16 @@
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership. The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.
47
superset/commands/settings/exceptions.py
Normal file
@@ -0,0 +1,47 @@
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership. The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.
from superset.commands.exceptions import CommandException


class SettingsCommandException(CommandException):
    """Base exception for settings commands."""


class SettingNotFoundError(SettingsCommandException):
    """Exception raised when a setting is not found."""


class SettingNotAllowedInDatabaseError(SettingsCommandException):
    """Exception raised when a setting cannot be stored in the database."""


class SettingValidationError(SettingsCommandException):
    """Exception raised when a setting value is invalid."""


class SettingAlreadyExistsError(SettingsCommandException):
    """Exception raised when trying to create a setting that already exists."""
1782
superset/config_defaults.py
Normal file
File diff suppressed because it is too large
Load Diff
184
superset/config_extensions.py
Normal file
@@ -0,0 +1,184 @@
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership. The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.
"""Enhanced configuration system for Superset.

This module provides the SupersetConfig class and supporting infrastructure
for the new configuration system.
"""

import logging
from typing import Any, Dict, Optional

from flask import Config

logger = logging.getLogger(__name__)


class SupersetConfig(Config):
    """Enhanced configuration class for Superset.

    This class extends Flask's Config class to provide additional features:
    - Rich metadata system for configuration values
    - Database-backed settings support
    - Environment variable integration
    - JSON schema generation for UI forms
    """

    # Metadata is now stored in config_metadata.py
    # This provides a reference to the metadata module

    def __init__(
        self, root_path: Optional[str] = None, defaults: Optional[Dict[str, Any]] = None
    ):
        """Initialize SupersetConfig with enhanced features."""
        super().__init__(root_path, defaults)

    def get_setting_metadata(self, key: str) -> Optional[Dict[str, Any]]:
        """Get metadata for a configuration setting."""
        try:
            from superset.config_metadata import get_metadata

            metadata = get_metadata(key)
            if metadata:
                return metadata.to_doc_dict()
            return None
        except ImportError:
            return None

    def get_settings_by_category(self, category: str) -> Dict[str, Any]:
        """Get all settings for a specific category."""
        try:
            from superset.config_metadata import CONFIG_METADATA

            return {
                key: metadata.to_doc_dict()
                for key, metadata in CONFIG_METADATA.items()
                if metadata.category == category
            }
        except ImportError:
            return {}

    def validate_setting(self, key: str, value: Any) -> bool:
        """Validate a setting value against its schema."""
        try:
            from superset.config_metadata import validate_config_value

            return validate_config_value(key, value)
        except ImportError:
            return True  # No validation if metadata not available

    def to_json_schema(self) -> Dict[str, Any]:
        """Generate JSON schema for all database settings.

        This can be used to generate forms in the frontend.
        """
        try:
            from superset.config_metadata import CONFIG_METADATA

            properties = {}
            required = []

            for key, metadata in CONFIG_METADATA.items():
                if metadata.deprecated:
                    continue

                property_schema = {
                    "type": metadata._type_to_string().lower(),
                    "title": key.replace("_", " ").title(),
                    "description": metadata.description,
                    "default": metadata.doc_default
                    if metadata.doc_default is not None
                    else metadata._serialize_default(),
                }

                if isinstance(metadata.type, type) and issubclass(
                    metadata.type, (int, float)
                ):
                    if metadata.min_value is not None:
                        property_schema["minimum"] = metadata.min_value
                    if metadata.max_value is not None:
                        property_schema["maximum"] = metadata.max_value

                if metadata.choices is not None:
                    property_schema["enum"] = metadata.choices

                properties[key] = property_schema

            return {
                "type": "object",
                "properties": properties,
                "required": required,
            }
        except ImportError:
            return {"type": "object", "properties": {}, "required": []}

    def get_database_setting(self, key: str, default: Any = None) -> Any:
        """Get a setting value from the database (future implementation)."""
        # This would integrate with the SettingsDAO
        # For now, return the regular config value
        return self.get(key, default)

    def set_database_setting(self, key: str, value: Any) -> bool:
        """Set a setting value in the database (future implementation)."""
        # This would integrate with the SettingsDAO
        # For now, just validate the value
        if not self.validate_setting(key, value):
            return False

        # Would save to database here
        # settings_dao.set_value(key, value)
        return True

    def from_database(self) -> None:
        """Load settings from database (future implementation)."""
        # This would load database-backed settings
        # For now, this is a placeholder

    def load_from_environment(self, prefix: str = "SUPERSET_") -> bool:
        """Load configuration from environment variables.

        Uses Flask's built-in from_prefixed_env method to load environment
        variables with the SUPERSET__ prefix (note the double underscore).
        This provides automatic JSON parsing and nested dictionary support.

        The double underscore clearly separates the system prefix from the
        configuration key name.

        Examples:
            SUPERSET__ROW_LIMIT=100000
            SUPERSET__SQLLAB_TIMEOUT=60
            SUPERSET__FEATURE_FLAGS='{"ENABLE_TEMPLATE_PROCESSING": true}'
            SUPERSET__FEATURE_FLAGS__ENABLE_TEMPLATE_PROCESSING=true

        Args:
            prefix: The environment variable prefix (default: "SUPERSET_")

        Returns:
            bool: True if any values were loaded
        """
        # Use Flask's built-in method which handles JSON parsing automatically
        # Note: Flask will add one more underscore, so SUPERSET_ becomes SUPERSET__
        return self.from_prefixed_env(prefix)

    def export_settings(self) -> Dict[str, Any]:
        """Export current settings with metadata."""
        result = {}
        for key, schema in self.DATABASE_SETTINGS_SCHEMA.items():
            if key in self:
                result[key] = {"value": self[key], "metadata": schema}
        return result
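The `load_from_environment` docstring above leans on Flask's `Config.from_prefixed_env` to JSON-parse values and descend into nested dicts on extra double underscores. A minimal sketch of that behavior that runs without Flask; `load_prefixed` is an illustration written for this note, not Flask's implementation, and it simplifies some edge cases (e.g. collisions between a JSON value and a nested override):

```python
import json
from typing import Any, Dict


def load_prefixed(env: Dict[str, str], prefix: str = "SUPERSET_") -> Dict[str, Any]:
    """Illustrate SUPERSET__-style env loading: JSON parsing plus nesting."""
    config: Dict[str, Any] = {}
    prefix = f"{prefix}_"  # Flask appends one underscore: SUPERSET_ -> SUPERSET__
    for raw_key, raw_value in env.items():
        if not raw_key.startswith(prefix):
            continue
        key = raw_key[len(prefix):]
        try:
            value = json.loads(raw_value)  # "60" -> 60, "true" -> True, JSON dicts
        except json.JSONDecodeError:
            value = raw_value  # fall back to the raw string
        if "__" not in key:
            config[key] = value
            continue
        # Triple-underscore form: descend into nested dictionaries
        head, *rest = key.split("__")
        target = config.setdefault(head, {})
        for part in rest[:-1]:
            target = target.setdefault(part, {})
        target[rest[-1]] = value
    return config


env = {
    "SUPERSET__SQLLAB_TIMEOUT": "60",
    "SUPERSET__FEATURE_FLAGS__ENABLE_TEMPLATE_PROCESSING": "true",
    "PATH": "/usr/bin",  # unprefixed variables are ignored
}
print(load_prefixed(env))
```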
371
superset/config_metadata.py
Normal file
@@ -0,0 +1,371 @@
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership. The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.
"""Configuration metadata for Apache Superset.

This module contains the authoritative metadata for all configuration
settings in Superset. It serves as the single source of truth for:

1. Configuration documentation
2. Type information
3. Validation rules
4. Default values
5. Environment variable mappings

The metadata can reference actual Python objects (non-serializable)
and provides an export mechanism for documentation generation.
"""

from datetime import timedelta
from typing import Any, Callable, Dict, List, Optional, Type, Union

from superset.stats_logger import DummyStatsLogger
from superset.utils.log import DBEventLogger
from superset.utils.logging_configurator import DefaultLoggingConfigurator


class ConfigMetadata:
    """Metadata for a configuration setting."""

    def __init__(
        self,
        key: str,
        type: Type,
        default: Any,
        description: str,
        category: str = "general",
        impact: str = "medium",
        requires_restart: bool = False,
        min_value: Optional[Union[int, float]] = None,
        max_value: Optional[Union[int, float]] = None,
        choices: Optional[List[Any]] = None,
        example: Optional[str] = None,
        documentation_url: Optional[str] = None,
        deprecated: bool = False,
        deprecated_message: Optional[str] = None,
        validator: Optional[Callable[[Any], bool]] = None,
        converter: Optional[Callable[[Any], Any]] = None,
        serializable: bool = True,
        doc_default: Optional[Any] = None,
    ):
        """Initialize configuration metadata.

        Args:
            key: Configuration key name
            type: Python type of the configuration value
            default: Default value
            description: Human-readable description
            category: Category for grouping (performance, security, etc.)
            impact: Impact level (low, medium, high)
            requires_restart: Whether changing the value requires a restart
            min_value: Minimum value for numeric types
            max_value: Maximum value for numeric types
            choices: List of valid choices
            example: Example usage
            documentation_url: Link to detailed docs
            deprecated: Whether this setting is deprecated
            deprecated_message: Message for deprecated settings
            validator: Custom validation function
            converter: Function to convert from env var or other format
            serializable: Whether the value can be JSON serialized
            doc_default: Alternative default for documentation (if not serializable)
        """
        self.key = key
        self.type = type
        self.default = default
        self.description = description
        self.category = category
        self.impact = impact
        self.requires_restart = requires_restart
        self.min_value = min_value
        self.max_value = max_value
        self.choices = choices
        self.example = example
        self.documentation_url = documentation_url
        self.deprecated = deprecated
        self.deprecated_message = deprecated_message
        self.validator = validator
        self.converter = converter
        self.serializable = serializable
        self.doc_default = doc_default

    def to_doc_dict(self) -> Dict[str, Any]:
        """Convert to dictionary for documentation export."""
        return {
            "key": self.key,
            "type": self._type_to_string(),
            "default": self.doc_default
            if self.doc_default is not None
            else self._serialize_default(),
            "description": self.description,
            "category": self.category,
            "impact": self.impact,
            "requires_restart": self.requires_restart,
            "min_value": self.min_value,
            "max_value": self.max_value,
            "choices": self.choices,
            "example": self.example,
            "documentation_url": self.documentation_url,
            "deprecated": self.deprecated,
            "deprecated_message": self.deprecated_message,
            "env_var": f"SUPERSET__{self.key}",
        }

    def _type_to_string(self) -> str:
        """Convert Python type to string representation."""
        if hasattr(self.type, "__name__"):
            return self.type.__name__
        return str(self.type)

    def _serialize_default(self) -> Any:
        """Serialize default value for documentation."""
        if not self.serializable:
            return f"<{self._type_to_string()} instance>"
        if callable(self.default):
            return "<function>"
        if self.default is None:
            return None
        if isinstance(self.default, (str, int, float, bool, list, dict)):
            return self.default
        return str(self.default)


# Configuration metadata registry
CONFIG_METADATA: Dict[str, ConfigMetadata] = {
    # Performance settings
    "ROW_LIMIT": ConfigMetadata(
        key="ROW_LIMIT",
        type=int,
        default=50000,
        description="Maximum number of rows returned for any query or data request. "
        "This is a hard limit to prevent memory issues and ensure reasonable "
        "response times.",
        category="performance",
        impact="medium",
        requires_restart=False,
        min_value=1,
        max_value=1000000,
        example="export SUPERSET__ROW_LIMIT=100000",
    ),
    "SAMPLES_ROW_LIMIT": ConfigMetadata(
        key="SAMPLES_ROW_LIMIT",
        type=int,
        default=1000,
        description="Default row limit when requesting samples from a datasource. "
        "Used in dataset exploration and SQL Lab table preview.",
        category="performance",
        impact="low",
        requires_restart=False,
        min_value=1,
        max_value=10000,
    ),
    "SQLLAB_TIMEOUT": ConfigMetadata(
        key="SQLLAB_TIMEOUT",
        type=int,
        default=30,
        description="Timeout duration for SQL Lab synchronous queries in seconds. "
        "Queries taking longer will be killed. For async queries, see "
        "SQLLAB_ASYNC_TIME_LIMIT_SEC.",
        category="performance",
        impact="high",
        requires_restart=False,
        min_value=1,
        max_value=3600,
        converter=lambda x: int(x) if isinstance(x, str) else x,
    ),
    "SQLLAB_ASYNC_TIME_LIMIT_SEC": ConfigMetadata(
        key="SQLLAB_ASYNC_TIME_LIMIT_SEC",
        type=int,
        default=int(timedelta(hours=6).total_seconds()),
        description="Maximum duration for SQL Lab async queries in seconds. "
        "This is enforced by Celery workers.",
        category="performance",
        impact="high",
        requires_restart=True,
        min_value=60,
        max_value=86400,  # 24 hours
        doc_default=21600,  # 6 hours
    ),
    # Security settings
    "SECRET_KEY": ConfigMetadata(
        key="SECRET_KEY",
        type=str,
        default="CHANGE_ME_SECRET_KEY",
        description="**CRITICAL**: Secret key for signing cookies and CSRF tokens. "
        "**Must be changed** from the default in production. Generate with: "
        "`openssl rand -base64 42`",
        category="security",
        impact="high",
        requires_restart=True,
        example='export SUPERSET__SECRET_KEY="$(openssl rand -base64 42)"',
        documentation_url="https://superset.apache.org/docs/configuration/security",
    ),
    "WTF_CSRF_ENABLED": ConfigMetadata(
        key="WTF_CSRF_ENABLED",
        type=bool,
        default=True,
        description="Enable CSRF protection. Should always be True in production. "
        "Only disable for testing or if you have your own CSRF protection.",
        category="security",
        impact="high",
        requires_restart=True,
    ),
    # Feature flags
    "FEATURE_FLAGS": ConfigMetadata(
        key="FEATURE_FLAGS",
        type=dict,
        default={},
        description="Feature flags to enable/disable functionality. "
        "Can be set as JSON in the environment or as nested values.",
        category="features",
        impact="high",
        requires_restart=True,
        example="export SUPERSET__FEATURE_FLAGS='{\"ENABLE_TEMPLATE_PROCESSING\": true}'",
    ),
    # UI/Theme settings
    "THEME_DEFAULT": ConfigMetadata(
        key="THEME_DEFAULT",
        type=dict,
        default={},
        description="Default theme configuration in Ant Design format. "
        "Customize colors, fonts, and other design tokens.",
        category="ui",
        impact="medium",
        requires_restart=False,
        example='export SUPERSET__THEME_DEFAULT__token__colorPrimary="#1890ff"',
        documentation_url="https://ant.design/docs/react/customize-theme",
    ),
    # Logging
    "STATS_LOGGER": ConfigMetadata(
        key="STATS_LOGGER",
        type=DummyStatsLogger,
        default=DummyStatsLogger(),
        description="Statistics logger instance for metrics collection. "
        "Use StatsdStatsLogger for production metrics.",
        category="logging",
        impact="low",
        requires_restart=True,
        serializable=False,
        doc_default="<DummyStatsLogger instance>",
    ),
    "EVENT_LOGGER": ConfigMetadata(
        key="EVENT_LOGGER",
        type=DBEventLogger,
        default=DBEventLogger(),
        description="Event logger for audit trails and user activity tracking. "
        "DBEventLogger stores events in the database; StdOutEventLogger is for "
        "debugging.",
        category="logging",
        impact="medium",
        requires_restart=True,
        serializable=False,
        doc_default="<DBEventLogger instance>",
    ),
    "LOGGING_CONFIGURATOR": ConfigMetadata(
        key="LOGGING_CONFIGURATOR",
        type=DefaultLoggingConfigurator,
        default=DefaultLoggingConfigurator(),
        description="Logging configuration handler. Customize to integrate with "
        "your logging infrastructure.",
        category="logging",
        impact="medium",
        requires_restart=True,
        serializable=False,
        doc_default="<DefaultLoggingConfigurator instance>",
    ),
    # Add more configuration metadata as needed...
}


def export_for_documentation() -> Dict[str, Any]:
    """Export metadata in a JSON-serializable format for documentation.

    Returns:
        Dictionary with all settings metadata suitable for JSON export
    """
    all_settings = []
    by_category = {}

    for key, metadata in CONFIG_METADATA.items():
        doc_dict = metadata.to_doc_dict()
        all_settings.append(doc_dict)

        category = metadata.category
        if category not in by_category:
            by_category[category] = []
        by_category[category].append(doc_dict)

    # Sort settings within each category
    for category in by_category:
        by_category[category].sort(key=lambda x: x["key"])

    all_settings.sort(key=lambda x: x["key"])

    return {
        "all_settings": all_settings,
        "by_category": by_category,
        "categories": sorted(by_category.keys()),
        "metadata": {
            "total_configs": len(all_settings),
            "description": "Superset configuration metadata",
            "generated_from": "config_metadata.py",
        },
    }


def get_metadata(key: str) -> Optional[ConfigMetadata]:
    """Get metadata for a specific configuration key.

    Args:
        key: Configuration key name

    Returns:
        ConfigMetadata instance or None if not found
    """
    return CONFIG_METADATA.get(key)


def validate_config_value(key: str, value: Any) -> bool:
    """Validate a configuration value against its metadata.

    Args:
        key: Configuration key name
        value: Value to validate

    Returns:
        True if valid, False otherwise
    """
    metadata = get_metadata(key)
    if not metadata:
        return True  # No metadata means any value is valid

    # Type check
    if not isinstance(value, metadata.type):
        return False

    # Range check for numeric types
    if isinstance(value, (int, float)):
        if metadata.min_value is not None and value < metadata.min_value:
            return False
        if metadata.max_value is not None and value > metadata.max_value:
            return False

    # Choice validation
    if metadata.choices is not None and value not in metadata.choices:
        return False

    # Custom validator
    if metadata.validator is not None:
        return metadata.validator(value)

    return True
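The `validate_config_value` checks above (type, numeric range, choices) can be exercised in isolation. A standalone sketch with a pared-down stand-in for `ConfigMetadata` so it runs without Superset imports; `Meta` and `is_valid` are names invented for this illustration:

```python
from dataclasses import dataclass
from typing import Any, List, Optional


@dataclass
class Meta:
    """Minimal stand-in for ConfigMetadata: just the validation fields."""
    type: type
    min_value: Optional[int] = None
    max_value: Optional[int] = None
    choices: Optional[List[Any]] = None


def is_valid(meta: Meta, value: Any) -> bool:
    """Same rule order as validate_config_value: type, then range, then choices."""
    if not isinstance(value, meta.type):
        return False
    if isinstance(value, (int, float)):
        if meta.min_value is not None and value < meta.min_value:
            return False
        if meta.max_value is not None and value > meta.max_value:
            return False
    if meta.choices is not None and value not in meta.choices:
        return False
    return True


# Mirrors the ROW_LIMIT metadata entry (int, bounded 1..1_000_000)
row_limit = Meta(type=int, min_value=1, max_value=1_000_000)
print(is_valid(row_limit, 50_000))   # True
print(is_valid(row_limit, 0))        # False: below min_value
print(is_valid(row_limit, "50000"))  # False: wrong type
```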
336
superset/config_objects.py
Normal file
@@ -0,0 +1,336 @@
"""
Configuration objects and complex types that can't be JSON serialized.

This module contains all the complex Python objects used in configuration
that need to be imported and used as actual Python objects rather than
simple JSON-serializable values.
"""

from collections import OrderedDict
from datetime import timedelta
from typing import Any

from superset.advanced_data_type.plugins.internet_address import internet_address
from superset.advanced_data_type.plugins.internet_port import internet_port
from superset.constants import CHANGE_ME_SECRET_KEY, NO_TIME_RANGE
from superset.db_engine_specs.utils import SQLAlchemyUtilsAdapter
from superset.extensions import feature_flag_manager
from superset.key_value.types import JsonKeyValueCodec
from superset.stats_logger import DummyStatsLogger
from superset.utils.core import parse_boolean_string
from superset.utils.log import DBEventLogger
from superset.utils.logging_configurator import DefaultLoggingConfigurator

# Define complex objects that can't be JSON serialized
COMPLEX_CONFIG_OBJECTS = {
    "STATS_LOGGER": DummyStatsLogger(),
    "EVENT_LOGGER": DBEventLogger(),
    "NO_TIME_RANGE": NO_TIME_RANGE,
    "CHANGE_ME_SECRET_KEY": CHANGE_ME_SECRET_KEY,
    "LOGGING_CONFIGURATOR": DefaultLoggingConfigurator(),
    "JSON_KEY_VALUE_CODEC": JsonKeyValueCodec(),
    "SQLALCHEMY_UTILS_ADAPTER": SQLAlchemyUtilsAdapter,
    "ADVANCED_DATA_TYPES": [internet_address, internet_port],
    "PARSE_BOOLEAN_STRING": parse_boolean_string,
    "FEATURE_FLAG_MANAGER": feature_flag_manager,
}

# Default collections and dictionaries
DEFAULT_COLLECTIONS = {
    "EXCEL_EXTENSIONS": {"xlsx", "xls"},
    "CSV_EXTENSIONS": {"csv", "tsv", "txt"},
    "COLUMNAR_EXTENSIONS": {"parquet", "zip"},
    "CURRENCIES": ["USD", "EUR", "GBP", "INR", "MXN", "JPY", "CNY"],
    "BLUEPRINTS": [],
    "ADDITIONAL_MIDDLEWARE": [],
    "CUSTOM_TEMPLATE_PROCESSORS": {},
    "ALLOWED_EXTRA_AUTHENTICATIONS": {},
    "DBS_AVAILABLE_DENYLIST": {},
    "QUERY_COST_FORMATTERS_BY_ENGINE": {},
    "SQLGLOT_DIALECTS_EXTENSIONS": {},
    "DB_POLL_INTERVAL_SECONDS": {},
    "DEFAULT_HTTP_HEADERS": {},
    "OVERRIDE_HTTP_HEADERS": {},
    "HTTP_HEADERS": {},
    "JINJA_CONTEXT_ADDONS": {},
    "WTF_CSRF_EXEMPT_LIST": [
        "superset.views.core.log",
        "superset.views.core.explore_json",
        "superset.charts.data.api.data",
        "superset.dashboards.api.cache_dashboard_screenshot",
    ],
    "ROBOT_PERMISSION_ROLES": ["Public", "Gamma", "Alpha", "Admin", "sql_lab"],
    "VIZ_TYPE_DENYLIST": [],
    "PREFERRED_DATABASES": [
        "PostgreSQL",
        "Presto",
        "MySQL",
        "SQLite",
    ],
    "DASHBOARD_AUTO_REFRESH_INTERVALS": [
        [0, "Don't refresh"],
        [10, "10 seconds"],
        [30, "30 seconds"],
        [60, "1 minute"],
        [300, "5 minutes"],
        [1800, "30 minutes"],
        [3600, "1 hour"],
        [21600, "6 hours"],
        [43200, "12 hours"],
        [86400, "24 hours"],
    ],
    "LANGUAGES": {
        "en": {"flag": "us", "name": "English"},
        "es": {"flag": "es", "name": "Spanish"},
        "it": {"flag": "it", "name": "Italian"},
        "fr": {"flag": "fr", "name": "French"},
        "zh": {"flag": "cn", "name": "Chinese"},
        "zh_TW": {"flag": "tw", "name": "Traditional Chinese"},
        "ja": {"flag": "jp", "name": "Japanese"},
        "pt": {"flag": "pt", "name": "Portuguese"},
        "pt_BR": {"flag": "br", "name": "Brazilian Portuguese"},
        "ru": {"flag": "ru", "name": "Russian"},
        "ko": {"flag": "kr", "name": "Korean"},
        "sk": {"flag": "sk", "name": "Slovak"},
        "sl": {"flag": "si", "name": "Slovenian"},
        "nl": {"flag": "nl", "name": "Dutch"},
        "uk": {"flag": "uk", "name": "Ukrainian"},
    },
    "FAVICONS": [{"href": "/static/assets/images/favicon.png"}],
    "PROXY_FIX_CONFIG": {
        "x_for": 1,
        "x_proto": 1,
        "x_host": 1,
        "x_port": 1,
        "x_prefix": 1,
    },
    "WEBDRIVER_WINDOW": {
        "dashboard": (1600, 2000),
        "slice": (3000, 1200),
        "pixel_density": 1,
    },
    "WEBDRIVER_CONFIGURATION": {
        "options": {"capabilities": {}, "preferences": {}, "binary_location": ""},
        "service": {
            "log_output": "/dev/null",
            "service_args": [],
            "port": 0,
            "env": {},
        },
    },
    "WEBDRIVER_OPTION_ARGS": ["--headless"],
    "CSV_EXPORT": {"encoding": "utf-8"},
    "EXCEL_EXPORT": {},
    "GLOBAL_ASYNC_QUERIES_CACHE_BACKEND": {
        "CACHE_TYPE": "RedisCache",
        "CACHE_REDIS_HOST": "localhost",
        "CACHE_REDIS_PORT": 6379,
        "CACHE_REDIS_USER": "",
        "CACHE_REDIS_PASSWORD": "",
        "CACHE_REDIS_DB": 0,
        "CACHE_DEFAULT_TIMEOUT": 300,
    },
    "TALISMAN_CONFIG": {
        "content_security_policy": {
            "base-uri": ["'self'"],
            "default-src": ["'self'"],
            "img-src": ["'self'", "blob:", "data:"],
            "worker-src": ["'self'", "blob:"],
            "connect-src": ["'self'"],
            "object-src": "'none'",
            "style-src": ["'self'", "'unsafe-inline'"],
            "script-src": ["'self'", "'strict-dynamic'"],
        },
        "content_security_policy_nonce_in": ["script-src"],
        "force_https": False,
        "session_cookie_secure": False,
    },
    "TALISMAN_DEV_CONFIG": {
        "content_security_policy": {
            "base-uri": ["'self'"],
            "default-src": ["'self'"],
            "img-src": ["'self'", "blob:", "data:"],
            "worker-src": ["'self'", "blob:"],
            "connect-src": ["'self'"],
            "object-src": "'none'",
            "style-src": ["'self'", "'unsafe-inline'"],
            "script-src": ["'self'", "'unsafe-inline'", "'unsafe-eval'"],
        },
        "content_security_policy_nonce_in": ["script-src"],
        "force_https": False,
        "session_cookie_secure": False,
    },
}

# Cache configurations
CACHE_CONFIGS = {
    "CACHE_CONFIG": {"CACHE_TYPE": "NullCache"},
    "DATA_CACHE_CONFIG": {"CACHE_TYPE": "NullCache"},
    "THUMBNAIL_CACHE_CONFIG": {
        "CACHE_TYPE": "NullCache",
        "CACHE_DEFAULT_TIMEOUT": int(timedelta(days=7).total_seconds()),
        "CACHE_NO_NULL_WARNING": True,
    },
    "EXPLORE_FORM_DATA_CACHE_CONFIG": {
        "CACHE_TYPE": "NullCache",
        "CACHE_DEFAULT_TIMEOUT": int(timedelta(days=7).total_seconds()),
        "REFRESH_TIMEOUT_ON_RETRIEVAL": True,
        "CODEC": JsonKeyValueCodec(),
    },
}

# Computed values that depend on other variables
COMPUTED_VALUES = {
    "ALLOWED_EXTENSIONS": lambda: {
        *DEFAULT_COLLECTIONS["EXCEL_EXTENSIONS"],
        *DEFAULT_COLLECTIONS["CSV_EXTENSIONS"],
        *DEFAULT_COLLECTIONS["COLUMNAR_EXTENSIONS"],
    },
    "DEFAULT_MODULE_DS_MAP": lambda: OrderedDict(
        [
            ("superset.connectors.sqla.models", ["SqlaTable"]),
        ]
    ),
}

# Lambda functions and callables
LAMBDA_FUNCTIONS = {
    "TRACKING_URL_TRANSFORMER": lambda url: url,
    "SQLA_TABLE_MUTATOR": lambda table: table,
    "SQL_QUERY_MUTATOR": lambda sql, **kwargs: sql,
    "EMAIL_HEADER_MUTATOR": lambda msg, **kwargs: msg,
}

# Timeout values using timedelta
TIMEOUT_VALUES = {
    "SUPERSET_WEBSERVER_TIMEOUT": int(timedelta(minutes=1).total_seconds()),
    "CELERY_BEAT_SCHEDULER_EXPIRES": timedelta(weeks=1),
    "SQLLAB_TIMEOUT": int(timedelta(seconds=30).total_seconds()),
    "SQLLAB_VALIDATION_TIMEOUT": int(timedelta(seconds=10).total_seconds()),
    "SQLLAB_ASYNC_TIME_LIMIT_SEC": int(timedelta(hours=6).total_seconds()),
    "SQLLAB_QUERY_COST_ESTIMATE_TIMEOUT": int(timedelta(seconds=10).total_seconds()),
    "CACHE_DEFAULT_TIMEOUT": int(timedelta(days=1).total_seconds()),
    "THUMBNAIL_ERROR_CACHE_TTL": int(timedelta(days=1).total_seconds()),
    "SCREENSHOT_LOCATE_WAIT": int(timedelta(seconds=10).total_seconds()),
    "SCREENSHOT_LOAD_WAIT": int(timedelta(minutes=1).total_seconds()),
    "SCREENSHOT_PLAYWRIGHT_DEFAULT_TIMEOUT": int(
        timedelta(seconds=60).total_seconds() * 1000
    ),
    "WTF_CSRF_TIME_LIMIT": int(timedelta(weeks=1).total_seconds()),
    "EMAIL_PAGE_RENDER_WAIT": int(timedelta(seconds=30).total_seconds()),
    "TEST_DATABASE_CONNECTION_TIMEOUT": timedelta(seconds=30),
    "DATABASE_OAUTH2_TIMEOUT": timedelta(seconds=30),
    "SEND_FILE_MAX_AGE_DEFAULT": int(timedelta(days=365).total_seconds()),
    "GLOBAL_ASYNC_QUERIES_POLLING_DELAY": int(
        timedelta(milliseconds=500).total_seconds() * 1000
    ),
    "ALERT_REPORTS_WORKING_TIME_OUT_LAG": int(timedelta(seconds=10).total_seconds()),
    "ALERT_REPORTS_WORKING_SOFT_TIME_OUT_LAG": int(
        timedelta(seconds=1).total_seconds()
    ),
    "ALERT_MINIMUM_INTERVAL": int(timedelta(minutes=0).total_seconds()),
    "REPORT_MINIMUM_INTERVAL": int(timedelta(minutes=0).total_seconds()),
    "SLACK_CACHE_TIMEOUT": int(timedelta(days=1).total_seconds()),
}


def get_object_type_description(obj: Any) -> str:
    """Get a human-readable description of a Python object type."""
    if callable(obj):
        if hasattr(obj, "__name__"):
            # Try to get the full module path
            module = getattr(obj, "__module__", None)
            if module:
                return f"Function: {module}.{obj.__name__}"
            else:
                return f"Function: {obj.__name__}"
        else:
            return "Callable function"
    elif hasattr(obj, "__class__"):
        # Get fully qualified class name
        cls = obj.__class__
        module = getattr(cls, "__module__", None)
        if module and module != "builtins":
            return f"Instance of {module}.{cls.__name__}"
        else:
            return f"Instance of {cls.__name__}"
    else:
        # Get type information
        obj_type = type(obj)
        module = getattr(obj_type, "__module__", None)
        if module and module != "builtins":
            return f"Object of type {module}.{obj_type.__name__}"
        else:
            return f"Object of type {obj_type.__name__}"


def get_fully_qualified_type(obj: Any) -> str:
    """Get the fully qualified type name of an object."""
    if obj is None:
        return "NoneType"

    obj_type = type(obj)
    module = getattr(obj_type, "__module__", None)

    if module and module != "builtins":
        return f"{module}.{obj_type.__name__}"
    else:
        return obj_type.__name__


def get_default_for_complex_object(key: str) -> tuple[Any, str]:
    """Get a string representation and type of complex objects for documentation."""
    if key in COMPLEX_CONFIG_OBJECTS:
        obj = COMPLEX_CONFIG_OBJECTS[key]
        return f"<{get_object_type_description(obj)}>", get_fully_qualified_type(obj)
    elif key in DEFAULT_COLLECTIONS:
        obj = DEFAULT_COLLECTIONS[key]
        return obj, get_fully_qualified_type(obj)
    elif key in CACHE_CONFIGS:
        obj = CACHE_CONFIGS[key]
        return obj, get_fully_qualified_type(obj)
    elif key in COMPUTED_VALUES:
        # For computed values, try to call them to get the actual value
        try:
            func = COMPUTED_VALUES[key]
            obj = func()  # type: ignore
            return obj, get_fully_qualified_type(obj)
        except Exception:
            return f"<Computed: {key}>", "callable"
    elif key in LAMBDA_FUNCTIONS:
        obj = LAMBDA_FUNCTIONS[key]
        return "<Lambda function>", get_fully_qualified_type(obj)
    elif key in TIMEOUT_VALUES:
        obj = TIMEOUT_VALUES[key]
        return obj, get_fully_qualified_type(obj)
    else:
        return f"<Complex object: {key}>", "unknown"


def is_complex_object(key: str) -> bool:
    """Check if a configuration key represents a complex object."""
    return (
        key in COMPLEX_CONFIG_OBJECTS
        or key in COMPUTED_VALUES
        or key in LAMBDA_FUNCTIONS
    )


def get_object_import_info(obj: Any) -> dict[str, Any]:
    """Get import information for an object."""
    if hasattr(obj, "__module__") and hasattr(obj, "__name__"):
        return {
            "module": obj.__module__,
            "name": obj.__name__,
            "import_statement": f"from {obj.__module__} import {obj.__name__}",
        }
    elif hasattr(obj, "__class__"):
        cls = obj.__class__
        if hasattr(cls, "__module__"):
            return {
                "module": cls.__module__,
                "name": cls.__name__,
                "import_statement": f"from {cls.__module__} import {cls.__name__}",
            }

    return {"module": None, "name": str(type(obj).__name__), "import_statement": None}
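The `TIMEOUT_VALUES` block above leans on one idiom: declare a duration with `timedelta`, then collapse it to the integer unit the consumer expects. A standalone sketch of that pattern, with the values copied from the block above (the millisecond multiplier is for consumers such as Playwright and the async-queries poller, which take milliseconds):

```python
from datetime import timedelta

# Express durations declaratively, then convert to the unit the consumer expects.
SQLLAB_ASYNC_TIME_LIMIT_SEC = int(timedelta(hours=6).total_seconds())
SEND_FILE_MAX_AGE_DEFAULT = int(timedelta(days=365).total_seconds())

# Millisecond-based consumers get an extra * 1000.
SCREENSHOT_PLAYWRIGHT_DEFAULT_TIMEOUT = int(
    timedelta(seconds=60).total_seconds() * 1000
)

print(SQLLAB_ASYNC_TIME_LIMIT_SEC)            # 21600
print(SEND_FILE_MAX_AGE_DEFAULT)              # 31536000
print(SCREENSHOT_PLAYWRIGHT_DEFAULT_TIMEOUT)  # 60000
```

Writing `timedelta(days=365)` instead of a bare `31536000` makes the intended duration auditable at a glance; the `int(...)` wrapper matters because `total_seconds()` returns a float.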
3079 superset/config_schema.json (Normal file): file diff suppressed because it is too large
159 superset/daos/settings.py (Normal file)
@@ -0,0 +1,159 @@
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements.  See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership.  The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License.  You may obtain a copy of the License at
#
#   http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied.  See the License for the
# specific language governing permissions and limitations
# under the License.
from __future__ import annotations

from typing import Any, Dict, Optional

from superset.daos.base import BaseDAO
from superset.models.core import Settings
from superset.utils import json


class SettingsDAO(BaseDAO[Settings]):
    """
    Data Access Object for Settings model.

    Provides methods to manage configuration settings stored in the database.
    """

    id_column_name = "key"  # Settings uses 'key' as primary key, not 'id'

    @classmethod
    def find_by_key(cls, key: str) -> Optional[Settings]:
        """Find a setting by its key."""
        return cls.find_by_id(key)

    @classmethod
    def get_value(cls, key: str, default: Any = None) -> Any:
        """
        Get the parsed value for a setting key.

        Args:
            key: The setting key
            default: Default value if setting not found

        Returns:
            The parsed JSON value or default
        """
        if setting := cls.find_by_key(key):
            try:
                return json.loads(setting.json_data)
            except json.JSONDecodeError:
                return setting.json_data
        return default

    @classmethod
    def set_value(
        cls,
        key: str,
        value: Any,
        namespace: Optional[str] = None,
        is_sensitive: bool = False,
    ) -> Settings:
        """
        Set a setting value.

        Args:
            key: The setting key
            value: The value to set (will be JSON serialized)
            namespace: Optional namespace for organizing settings
            is_sensitive: Whether this setting contains sensitive data

        Returns:
            The Settings object
        """
        json_value = json.dumps(value)

        if setting := cls.find_by_key(key):
            # Update existing setting
            setting.json_data = json_value
            if namespace is not None:
                setting.namespace = namespace
            setting.is_sensitive = is_sensitive
            return cls.update(setting)
        else:
            # Create new setting
            return cls.create(
                attributes={
                    "key": key,
                    "json_data": json_value,
                    "namespace": namespace,
                    "is_sensitive": is_sensitive,
                }
            )

    @classmethod
    def get_all_as_dict(cls) -> Dict[str, Any]:
        """
        Get all settings as a dictionary with parsed values.

        Returns:
            Dictionary mapping setting keys to their parsed values
        """
        settings = cls.find_all()
        result = {}

        for setting in settings:
            try:
                result[setting.key] = json.loads(setting.json_data)
            except json.JSONDecodeError:
                result[setting.key] = setting.json_data

        return result

    @classmethod
    def get_by_namespace(cls, namespace: str) -> Dict[str, Any]:
        """
        Get all settings in a specific namespace.

        Args:
            namespace: The namespace to filter by

        Returns:
            Dictionary mapping setting keys to their parsed values
        """
        from superset.extensions import db

        settings = (
            db.session.query(Settings).filter(Settings.namespace == namespace).all()
        )
        result = {}

        for setting in settings:
            try:
                result[setting.key] = json.loads(setting.json_data)
            except json.JSONDecodeError:
                result[setting.key] = setting.json_data

        return result

    @classmethod
    def delete_by_key(cls, key: str) -> bool:
        """
        Delete a setting by key.

        Args:
            key: The setting key to delete

        Returns:
            True if setting was deleted, False if not found
        """
        if setting := cls.find_by_key(key):
            cls.delete([setting])
            return True
        return False
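The core of `SettingsDAO` is a JSON round trip: values are serialized exactly once on write and parsed on read, with a raw-string fallback for rows that are not valid JSON. A minimal sketch of that logic, using a plain dict in place of the SQLAlchemy-backed table and the stdlib `json` module in place of `superset.utils.json` (both substitutions are assumptions for self-containment):

```python
import json
from typing import Any, Dict

# In-memory stand-in for the settings table.
_store: Dict[str, str] = {}


def set_value(key: str, value: Any) -> None:
    # JSON-serialize on the way in, exactly once.
    _store[key] = json.dumps(value)


def get_value(key: str, default: Any = None) -> Any:
    raw = _store.get(key)
    if raw is None:
        return default
    try:
        return json.loads(raw)
    except json.JSONDecodeError:
        # Fall back to the raw string for non-JSON rows.
        return raw


# Illustrative keys only; any JSON-serializable value works.
set_value("ROW_LIMIT", 50000)
set_value("INTERVALS", [[0, "Don't refresh"], [10, "10 seconds"]])
print(get_value("ROW_LIMIT"))       # 50000
print(get_value("MISSING", "n/a"))  # n/a
```

One consequence of the round trip: tuples and sets come back as lists, which is why the storable subset is restricted to JSON-serializable values in the first place.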
@@ -21,7 +21,7 @@ from datetime import datetime
 from typing import Any, Optional
 from urllib import parse

-from flask import current_app as app
+from flask import current_app
 from sqlalchemy import types
 from sqlalchemy.engine import URL

@@ -498,8 +498,8 @@ class SingleStoreSpec(BasicParametersMixin, BaseEngineSpec):
             "conn_attrs",
             {
                 "_connector_name": "SingleStore Superset Database Engine",
-                "_connector_version": app.config.get("VERSION_STRING", "dev"),
-                "_product_version": app.config.get("VERSION_STRING", "dev"),
+                "_connector_version": current_app.config.get("VERSION_STRING", "dev"),
+                "_product_version": current_app.config.get("VERSION_STRING", "dev"),
             },
         )
         return uri, connect_args
@@ -160,6 +160,7 @@ class SupersetAppInitializer:  # pylint: disable=too-many-public-methods
             SecurityRestApi,
             UserRegistrationsRestAPI,
         )
+        from superset.settings.api import SettingsRestApi
         from superset.sqllab.api import SqlLabRestApi
         from superset.sqllab.permalink.api import SqlLabPermalinkRestApi
         from superset.tags.api import TagRestApi

@@ -242,6 +243,7 @@ class SupersetAppInitializer:  # pylint: disable=too-many-public-methods
         appbuilder.add_api(ReportExecutionLogRestApi)
         appbuilder.add_api(RLSRestApi)
         appbuilder.add_api(SavedQueryRestApi)
+        appbuilder.add_api(SettingsRestApi)
         appbuilder.add_api(TagRestApi)
         appbuilder.add_api(SqlLabRestApi)
         appbuilder.add_api(SqlLabPermalinkRestApi)
@@ -0,0 +1,66 @@
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements.  See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership.  The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License.  You may obtain a copy of the License at
#
#   http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied.  See the License for the
# specific language governing permissions and limitations
# under the License.
"""settings table

Revision ID: 02b237514781
Revises: 363a9b1e8992
Create Date: 2025-07-17 00:30:13.162356

"""

import sqlalchemy as sa
import sqlalchemy_utils
from alembic import op
from sqlalchemy.dialects import mysql

# revision identifiers, used by Alembic.
revision = "02b237514781"
down_revision = "363a9b1e8992"


def upgrade():
    op.create_table(
        "settings",
        sa.Column("uuid", sqlalchemy_utils.types.uuid.UUIDType(), nullable=True),
        sa.Column("created_on", sa.DateTime(), nullable=True),
        sa.Column("changed_on", sa.DateTime(), nullable=True),
        sa.Column("key", sa.String(length=255), nullable=False),
        sa.Column(
            "json_data",
            sa.Text().with_variant(mysql.MEDIUMTEXT(), "mysql"),
            nullable=False,
        ),
        sa.Column("namespace", sa.String(length=100), nullable=True),
        sa.Column("schema_version", sa.Integer(), nullable=False),
        sa.Column("is_sensitive", sa.Boolean(), nullable=False),
        sa.Column("created_by_fk", sa.Integer(), nullable=True),
        sa.Column("changed_by_fk", sa.Integer(), nullable=True),
        sa.ForeignKeyConstraint(
            ["changed_by_fk"],
            ["ab_user.id"],
        ),
        sa.ForeignKeyConstraint(
            ["created_by_fk"],
            ["ab_user.id"],
        ),
        sa.PrimaryKeyConstraint("key"),
        sa.UniqueConstraint("uuid"),
    )


def downgrade():
    op.drop_table("settings")
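The shape of the table this migration creates can be sketched with stdlib `sqlite3`. This is a translated approximation, not the migration itself: `MEDIUMTEXT` and `UUIDType` are MySQL/SQLAlchemy-specific, and the `ab_user` foreign keys and audit columns are omitted here for self-containment.

```python
import sqlite3

# SQLite approximation of the "settings" table created by the Alembic migration.
conn = sqlite3.connect(":memory:")
conn.execute(
    """
    CREATE TABLE settings (
        uuid TEXT UNIQUE,
        created_on TIMESTAMP,
        changed_on TIMESTAMP,
        key VARCHAR(255) NOT NULL PRIMARY KEY,
        json_data TEXT NOT NULL,
        namespace VARCHAR(100),
        schema_version INTEGER NOT NULL,
        is_sensitive BOOLEAN NOT NULL
    )
    """
)
# "FEATURE_X" is an illustrative key; json_data holds a JSON-encoded value.
conn.execute(
    "INSERT INTO settings (key, json_data, namespace, schema_version, is_sensitive) "
    "VALUES (?, ?, ?, ?, ?)",
    ("FEATURE_X", "true", "features", 1, 0),
)
row = conn.execute("SELECT key, namespace FROM settings").fetchone()
print(row)  # ('FEATURE_X', 'features')
```

Note that `key` itself is the primary key (no surrogate `id`), which is why the DAO above sets `id_column_name = "key"`.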
@@ -1249,3 +1249,32 @@ class FavStar(UUIDMixin, Model):
     class_name = Column(String(50))
     obj_id = Column(Integer)
     dttm = Column(DateTime, default=datetime.utcnow)
+
+
+class Settings(Model, AuditMixinNullable, UUIDMixin):
+    """
+    Database-backed configuration settings for Superset.
+
+    This model extends the Flask configuration system by storing runtime-configurable
+    settings in the database. These settings can be modified without restarting the
+    application and take precedence over static configuration files.
+
+    The settings system supports:
+    - JSON-serializable values stored in json_data field
+    - Import/export across Superset instances via UUID
+    - Audit trail via AuditMixinNullable
+    - Namespace organization for logical grouping
+    """
+
+    __tablename__ = "settings"
+
+    key = Column(String(255), primary_key=True)
+    json_data = Column(utils.MediumText(), nullable=False)  # JSON-serialized value
+    namespace = Column(
+        String(100), nullable=True
+    )  # e.g., 'features', 'ui', 'performance'
+    schema_version = Column(Integer, nullable=False, default=1)  # For migrations
+    is_sensitive = Column(Boolean, nullable=False, default=False)  # Flag for encryption
+
+    def __repr__(self) -> str:
+        return f"<Settings(key='{self.key}', namespace='{self.namespace}')>"
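The `Settings` docstring says database values take precedence over static configuration files. The actual merge lives elsewhere (in `SupersetConfig`, which this diff only references), but the lookup order it describes can be illustrated with a `collections.ChainMap`, where the first mapping wins; the keys and values below are illustrative:

```python
from collections import ChainMap

static_config = {"ROW_LIMIT": 50000, "SQLLAB_TIMEOUT": 30}  # e.g. superset_config.py
db_settings = {"ROW_LIMIT": 100000}                         # rows in the settings table

# First mapping wins on key collisions, so database values shadow static ones.
effective = ChainMap(db_settings, static_config)
print(effective["ROW_LIMIT"])       # 100000 (database wins)
print(effective["SQLLAB_TIMEOUT"])  # 30 (falls through to static config)
```

A `ChainMap` also keeps the layers distinct, so deleting a database override naturally re-exposes the static default, which matches the "runtime-configurable, no restart" intent.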
16 superset/settings/__init__.py (Normal file)
@@ -0,0 +1,16 @@
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements.  See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership.  The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License.  You may obtain a copy of the License at
#
#   http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied.  See the License for the
# specific language governing permissions and limitations
# under the License.
379 superset/settings/api.py (Normal file)
@@ -0,0 +1,379 @@
|
||||
# Licensed to the Apache Software Foundation (ASF) under one
|
||||
# or more contributor license agreements. See the NOTICE file
|
||||
# distributed with this work for additional information
|
||||
# regarding copyright ownership. The ASF licenses this file
|
||||
# to you under the Apache License, Version 2.0 (the
|
||||
# "License"); you may not use this file except in compliance
|
||||
# with the License. You may obtain a copy of the License at
|
||||
#
|
||||
# http://www.apache.org/licenses/LICENSE-2.0
|
||||
#
|
||||
# Unless required by applicable law or agreed to in writing,
|
||||
# software distributed under the License is distributed on an
|
||||
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
|
||||
# KIND, either express or implied. See the License for the
|
||||
# specific language governing permissions and limitations
|
||||
# under the License.
|
||||
import logging
|
||||
from typing import Any, Dict, Optional
|
||||
|
||||
from flask import current_app, request
|
||||
from flask_appbuilder.api import expose, protect, safe
|
||||
from flask_appbuilder.models.sqla.interface import SQLAInterface
|
||||
from marshmallow import ValidationError
|
||||
|
||||
from superset.config_extensions import SupersetConfig
|
||||
from superset.constants import MODEL_API_RW_METHOD_PERMISSION_MAP
|
||||
from superset.daos.settings import SettingsDAO
|
||||
from superset.models.core import Settings
|
||||
from superset.settings.schemas import (
|
||||
SettingCreateSchema,
|
||||
SettingListResponseSchema,
|
||||
SettingResponseSchema,
|
||||
SettingUpdateSchema,
|
||||
)
|
||||
from superset.views.base_api import BaseSupersetModelRestApi
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
|
||||
class SettingsRestApi(BaseSupersetModelRestApi):
|
||||
"""REST API for configuration settings management."""
|
||||
|
||||
datamodel = SQLAInterface(Settings)
|
||||
resource_name = "settings"
|
||||
allow_browser_login = True
|
||||
|
||||
# Only allow specific methods
|
||||
include_route_methods = {
|
||||
"get_list",
|
||||
"get",
|
||||
"post",
|
||||
"put",
|
||||
"delete",
|
||||
"info",
|
||||
}
|
||||
|
||||
# Schemas for serialization/deserialization
|
||||
list_schema = SettingListResponseSchema()
|
||||
show_schema = SettingResponseSchema()
|
||||
add_schema = SettingCreateSchema()
|
||||
edit_schema = SettingUpdateSchema()
|
||||
|
||||
# Permissions - only admins can read/write settings
|
||||
method_permission_name = MODEL_API_RW_METHOD_PERMISSION_MAP.copy()
|
||||
method_permission_name.update(
|
||||
{
|
||||
"get_list": "can_read",
|
||||
"get": "can_read",
|
||||
"post": "can_write",
|
||||
"put": "can_write",
|
||||
"delete": "can_write",
|
||||
"validate": "can_read",
|
||||
"metadata": "can_read",
|
||||
"effective_config": "can_read",
|
||||
}
|
||||
)
|
||||
|
||||
# Only Admin role can access settings
|
||||
class_permission_name = "Settings"
|
||||
|
||||
def _get_setting_metadata(self, key: str) -> Optional[Dict[str, Any]]:
|
||||
"""Get metadata for a configuration setting."""
|
||||
if isinstance(current_app.config, SupersetConfig):
|
||||
return current_app.config.get_setting_metadata(key)
|
||||
return None
|
||||
|
||||
def _get_setting_source(self, key: str) -> str:
|
||||
"""Determine where a setting value comes from."""
|
||||
import os
|
||||
|
||||
# Check if it's from environment variables
|
||||
env_key = f"SUPERSET__{key}"
|
||||
if env_key in os.environ:
|
||||
return f"environment ({env_key})"
|
||||
|
||||
# Check if it's from superset_config.py
|
||||
try:
|
||||
import superset_config
|
||||
|
||||
if hasattr(superset_config, key):
|
||||
return "superset_config.py"
|
||||
except ImportError:
|
||||
pass
|
||||
|
||||
# Check if it's in database
|
||||
if SettingsDAO.find_by_key(key):
|
||||
return "database"
|
||||
|
||||
# Otherwise it's from defaults
|
||||
return "config_defaults.py"
|
||||
|
||||
def _is_setting_allowed_in_database(self, key: str) -> bool:
|
||||
"""Check if a setting is allowed to be stored in the database."""
|
||||
metadata = self._get_setting_metadata(key)
|
||||
if not metadata:
|
||||
return False
|
||||
|
||||
# Only allow settings that don't require restart and aren't readonly
|
||||
return not metadata.get("requires_restart", True) and not metadata.get(
|
||||
"readonly", False
|
||||
)
|
||||
|
||||
def _validate_setting_value(self, key: str, value: Any) -> tuple[bool, list[str]]:
|
||||
"""Validate a setting value against its metadata."""
|
||||
metadata = self._get_setting_metadata(key)
|
||||
if not metadata:
|
||||
return True, []
|
||||
|
||||
if isinstance(current_app.config, SupersetConfig):
|
||||
if current_app.config.validate_setting(key, value):
|
||||
return True, []
|
||||
else:
|
||||
return False, [
|
||||
f"Value does not match expected type or constraints for {key}"
|
||||
]
|
||||
|
||||
return True, []
|
||||
|
||||
@expose("/", methods=["GET"])
|
||||
@protect()
|
||||
@safe
|
||||
def get_list(self) -> str:
|
||||
"""Get list of all database settings."""
|
||||
# Parse query parameters
|
||||
args = request.args
|
||||
namespace = args.get("namespace")
|
||||
category = args.get("category")
|
||||
include_metadata = args.get("include_metadata", "false").lower() == "true"
|
||||
include_source = args.get("include_source", "false").lower() == "true"
|
||||
|
||||
# Get settings from database
|
||||
if namespace:
|
||||
settings = SettingsDAO.get_by_namespace(namespace)
|
||||
else:
|
||||
settings = SettingsDAO.get_all_as_dict()
|
||||
|
||||
# Build response
|
||||
result = []
|
||||
for key, value in settings.items():
|
||||
setting_data = {
|
||||
"key": key,
|
||||
"value": value,
|
||||
"namespace": namespace, # This would need to be fetched from the model
|
||||
}
|
||||
|
||||
if include_metadata:
|
||||
setting_data["metadata"] = self._get_setting_metadata(key)
|
||||
|
||||
if include_source:
|
||||
setting_data["source"] = self._get_setting_source(key)
|
||||
|
||||
# Filter by category if specified
|
||||
if category:
|
||||
metadata = self._get_setting_metadata(key)
|
||||
if not metadata or metadata.get("category") != category:
|
||||
continue
|
||||
|
||||
result.append(setting_data)
|
||||
|
||||
return self.response(200, result=result, count=len(result))
|
||||
|
||||
@expose("/<pk>", methods=["GET"])
|
||||
@protect()
|
||||
@safe
|
||||
def get(self, pk: str) -> str:
|
||||
"""Get a specific setting by key."""
|
||||
# Parse query parameters
|
||||
args = request.args
|
||||
include_metadata = args.get("include_metadata", "false").lower() == "true"
|
||||
include_source = args.get("include_source", "false").lower() == "true"
|
||||
|
||||
# Get setting value
|
||||
value = SettingsDAO.get_value(pk)
|
||||
if value is None:
|
||||
return self.response_404()
|
||||
|
||||
# Build response
|
||||
setting_data = {
|
||||
"key": pk,
|
||||
"value": value,
|
||||
}
|
||||
|
||||
if include_metadata:
|
||||
setting_data["metadata"] = self._get_setting_metadata(pk)
|
||||
|
||||
if include_source:
|
||||
setting_data["source"] = self._get_setting_source(pk)
|
||||
|
||||
return self.response(200, **setting_data)
|
||||
|
||||
    @expose("/", methods=["POST"])
    @protect()
    @safe
    def post(self) -> str:
        """Create a new setting."""
        try:
            item = self.add_schema.load(request.json)
        except ValidationError as error:
            return self.response_422(message=error.messages)

        key = item["key"]
        value = item["value"]
        namespace = item.get("namespace")

        # Check if setting is allowed in database
        if not self._is_setting_allowed_in_database(key):
            return self.response_422(
                message=f"Setting '{key}' cannot be stored in database. "
                f"It may require a restart or be read-only."
            )

        # Validate value
        is_valid, errors = self._validate_setting_value(key, value)
        if not is_valid:
            return self.response_422(message={"validation_errors": errors})

        # Check if setting already exists
        if SettingsDAO.find_by_key(key):
            return self.response_422(
                message=f"Setting '{key}' already exists. Use PUT to update."
            )

        # Create setting
        try:
            SettingsDAO.set_value(key, value, namespace)
            return self.response(201, key=key, value=value)
        except Exception as ex:
            logger.exception("Error creating setting")
            return self.response_422(message=str(ex))

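The create flow in the POST handler (reject duplicates, then persist) can be exercised without a running server. The sketch below uses an in-memory stand-in for `SettingsDAO` — `InMemorySettingsDAO` and `create_setting` are placeholder names for illustration, not part of the actual diff:

```python
class InMemorySettingsDAO:
    """Stand-in for SettingsDAO, backed by a plain dict."""

    def __init__(self):
        self._store = {}

    def find_by_key(self, key):
        return key in self._store

    def set_value(self, key, value, namespace=None):
        self._store[key] = (value, namespace)


def create_setting(dao, key, value, namespace=None):
    """Mirror the POST handler's core logic: reject duplicates, else persist."""
    if dao.find_by_key(key):
        return 422, f"Setting '{key}' already exists. Use PUT to update."
    dao.set_value(key, value, namespace)
    return 201, value


dao = InMemorySettingsDAO()
print(create_setting(dao, "ROW_LIMIT", 50000)[0])  # 201: created
print(create_setting(dao, "ROW_LIMIT", 10000)[0])  # 422: duplicate keys must use PUT
```

This mirrors the handler's split between POST (create only) and PUT (update only), so clients get an explicit 422 rather than a silent overwrite.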
    @expose("/<pk>", methods=["PUT"])
    @protect()
    @safe
    def put(self, pk: str) -> str:
        """Update an existing setting."""
        try:
            item = self.edit_schema.load(request.json)
        except ValidationError as error:
            return self.response_422(message=error.messages)

        value = item["value"]
        namespace = item.get("namespace")

        # Check if setting is allowed in database
        if not self._is_setting_allowed_in_database(pk):
            return self.response_422(
                message=f"Setting '{pk}' cannot be stored in database. "
                f"It may require a restart or be read-only."
            )

        # Validate value
        is_valid, errors = self._validate_setting_value(pk, value)
        if not is_valid:
            return self.response_422(message={"validation_errors": errors})

        # Update setting
        try:
            SettingsDAO.set_value(pk, value, namespace)
            return self.response(200, key=pk, value=value)
        except Exception as ex:
            logger.exception("Error updating setting")
            return self.response_422(message=str(ex))

    @expose("/<pk>", methods=["DELETE"])
    @protect()
    @safe
    def delete(self, pk: str) -> str:
        """Delete a setting."""
        # Check if setting exists
        if not SettingsDAO.find_by_key(pk):
            return self.response_404()

        # Delete setting
        try:
            SettingsDAO.delete_by_key(pk)
            return self.response(200, message=f"Setting '{pk}' deleted")
        except Exception as ex:
            logger.exception("Error deleting setting")
            return self.response_422(message=str(ex))

    @expose("/validate", methods=["POST"])
    @protect()
    @safe
    def validate(self) -> str:
        """Validate a setting value without saving it."""
        if not request.json:
            return self.response_400()

        key = request.json.get("key")
        value = request.json.get("value")

        if not key:
            return self.response_422(message="Key is required")

        # Get metadata
        metadata = self._get_setting_metadata(key)

        # Validate value
        is_valid, errors = self._validate_setting_value(key, value)

        # Check if allowed in database
        allowed_in_db = self._is_setting_allowed_in_database(key)

        response_data = {
            "key": key,
            "value": value,
            "valid": is_valid,
            "errors": errors,
            "metadata": metadata,
            "allowed_in_database": allowed_in_db,
        }

        return self.response(200, **response_data)

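The `_validate_setting_value` helper the handlers call is not shown in this diff. A plausible shape, checking a candidate value against the `type`/`minimum`/`maximum` fields of a metadata entry, might look like the following — the function name, type map, and error wording are all assumptions, not the real implementation:

```python
def validate_setting_value(value, metadata):
    """Sketch of what a helper like _validate_setting_value might do:
    check a value against a metadata entry's type, minimum, and maximum."""
    errors = []
    type_map = {"integer": int, "boolean": bool, "string": str}
    expected = type_map.get(metadata.get("type"))
    if expected and not isinstance(value, expected):
        errors.append(f"expected {metadata['type']}, got {type(value).__name__}")
    # Range checks only make sense for real integers (bool is an int subclass)
    if isinstance(value, int) and not isinstance(value, bool):
        minimum = metadata.get("minimum")
        maximum = metadata.get("maximum")
        if minimum is not None and value < minimum:
            errors.append(f"value {value} below minimum {minimum}")
        if maximum is not None and value > maximum:
            errors.append(f"value {value} above maximum {maximum}")
    return (not errors), errors


meta = {"type": "integer", "minimum": 1, "maximum": 100000}
print(validate_setting_value(50000, meta))  # (True, [])
print(validate_setting_value(-5, meta))     # (False, ['value -5 below minimum 1'])
```

Returning a `(is_valid, errors)` tuple matches how the handlers above unpack the result before deciding between a 2xx response and a 422.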
    @expose("/metadata", methods=["GET"])
    @protect()
    @safe
    def metadata(self) -> str:
        """Get metadata for all documented settings."""
        if not isinstance(current_app.config, SupersetConfig):
            return self.response(200, metadata={})

        metadata = current_app.config.DATABASE_SETTINGS_SCHEMA

        # Filter by category if specified
        if category := request.args.get("category"):
            metadata = current_app.config.get_settings_by_category(category)

        return self.response(200, metadata=metadata)

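`get_settings_by_category` is presumably a straightforward filter over the `DATABASE_SETTINGS_SCHEMA` dict. A minimal sketch — the schema entries here are invented for illustration, not real Superset settings metadata:

```python
def get_settings_by_category(schema, category):
    """Filter a settings-metadata dict down to entries in one category."""
    return {
        key: meta
        for key, meta in schema.items()
        if meta.get("category") == category
    }


schema = {
    "ROW_LIMIT": {"type": "integer", "category": "performance"},
    "APP_NAME": {"type": "string", "category": "branding"},
}
print(get_settings_by_category(schema, "performance"))
# {'ROW_LIMIT': {'type': 'integer', 'category': 'performance'}}
```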
    @expose("/effective_config", methods=["GET"])
    @protect()
    @safe
    def effective_config(self) -> str:
        """Get effective configuration (database + env + defaults)."""
        # Get current config values
        config_dict = {}

        # Get documented settings
        if isinstance(current_app.config, SupersetConfig):
            for key in current_app.config.DATABASE_SETTINGS_SCHEMA:
                if key in current_app.config:
                    config_dict[key] = {
                        "value": current_app.config[key],
                        "source": self._get_setting_source(key),
                        "metadata": self._get_setting_metadata(key),
                    }

        # Add database settings
        db_settings = SettingsDAO.get_all_as_dict()
        for key, value in db_settings.items():
            if key not in config_dict:
                config_dict[key] = {
                    "value": value,
                    "source": "database",
                    "metadata": self._get_setting_metadata(key),
                }

        return self.response(200, config=config_dict)

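The merge order in `effective_config` — documented settings first, then database rows only for keys not already present — amounts to the precedence sketched below. The dicts and the `effective_config` free function are illustrative stand-ins for the handler's logic:

```python
def effective_config(documented, db_settings, source_of):
    """Mirror the handler's merge: documented settings are listed first,
    and database rows are added only for keys not already present."""
    config = {
        key: {"value": value, "source": source_of(key)}
        for key, value in documented.items()
    }
    for key, value in db_settings.items():
        if key not in config:
            config[key] = {"value": value, "source": "database"}
    return config


documented = {"ROW_LIMIT": 50000}
db_settings = {"ROW_LIMIT": 10000, "CUSTOM_FLAG": True}
merged = effective_config(documented, db_settings, lambda key: "default")
print(merged["ROW_LIMIT"])   # documented entry wins for shared keys
print(merged["CUSTOM_FLAG"]) # database-only keys are tagged 'database'
```

Note that for keys known to the config, the live config value wins in this merged view; the `source` field is what tells the client whether that value ultimately came from defaults, environment, or the database.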
110
superset/settings/schemas.py
Normal file
@@ -0,0 +1,110 @@
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership. The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.
from marshmallow import fields, Schema, validate

class SettingCreateSchema(Schema):
    """Schema for creating a new setting."""

    key = fields.String(required=True, validate=validate.Length(min=1, max=255))
    value = fields.Raw(required=True)
    namespace = fields.String(allow_none=True, validate=validate.Length(max=100))


class SettingUpdateSchema(Schema):
    """Schema for updating an existing setting."""

    value = fields.Raw(required=True)
    namespace = fields.String(allow_none=True, validate=validate.Length(max=100))

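The length constraints on `SettingCreateSchema` can be exercised without marshmallow installed; the hand-rolled checker below mirrors what the schema enforces (key 1–255 characters, value required, namespace at most 100 characters). The function name and error wording are invented here, and marshmallow's actual error messages differ:

```python
def check_create_payload(payload):
    """Hand-rolled mirror of SettingCreateSchema's constraints."""
    errors = {}
    key = payload.get("key")
    if not isinstance(key, str) or not (1 <= len(key) <= 255):
        errors["key"] = ["key must be a string of 1-255 characters"]
    if "value" not in payload:
        errors["value"] = ["value is required"]
    namespace = payload.get("namespace")
    if namespace is not None and (
        not isinstance(namespace, str) or len(namespace) > 100
    ):
        errors["namespace"] = ["namespace must be a string of at most 100 characters"]
    return errors


print(check_create_payload({"key": "ROW_LIMIT", "value": 50000}))  # {}
print(check_create_payload({"key": "", "value": 1}))  # key length error
```

In the real API these constraints are applied by `self.add_schema.load(request.json)` in the POST handler, which raises `ValidationError` and yields a 422 with the field-to-messages mapping.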
class SettingResponseSchema(Schema):
    """Schema for setting response."""

    key = fields.String()
    value = fields.Raw()
    namespace = fields.String(allow_none=True)
    created_on = fields.DateTime()
    changed_on = fields.DateTime()
    created_by = fields.Nested("UserSchema", only=["id", "first_name", "last_name"])
    changed_by = fields.Nested("UserSchema", only=["id", "first_name", "last_name"])
    metadata = fields.Dict(allow_none=True)  # Configuration metadata if available
    source = fields.String(
        allow_none=True
    )  # Source of the setting (database, env, defaults)

class SettingListResponseSchema(Schema):
    """Schema for listing settings."""

    result = fields.List(fields.Nested(SettingResponseSchema))
    count = fields.Integer()

class SettingMetadataSchema(Schema):
    """Schema for configuration metadata."""

    title = fields.String()
    description = fields.String()
    type = fields.String()
    category = fields.String()
    impact = fields.String()
    requires_restart = fields.Boolean()
    default = fields.Raw()
    minimum = fields.Integer(allow_none=True)
    maximum = fields.Integer(allow_none=True)
    readonly = fields.Boolean()
    documentation_url = fields.String(allow_none=True)

class SettingValidationSchema(Schema):
    """Schema for setting validation."""

    key = fields.String(required=True)
    value = fields.Raw(required=True)
    valid = fields.Boolean()
    errors = fields.List(fields.String())
    metadata = fields.Nested(SettingMetadataSchema, allow_none=True)

# Query schemas for API endpoints
setting_get_schema = {
    "type": "object",
    "properties": {
        "include_metadata": {"type": "boolean"},
        "include_source": {"type": "boolean"},
    },
}

setting_list_schema = {
    "type": "object",
    "properties": {
        "namespace": {"type": "string"},
        "category": {"type": "string"},
        "include_metadata": {"type": "boolean"},
        "include_source": {"type": "boolean"},
    },
}

setting_validate_schema = {
    "type": "object",
    "properties": {
        "key": {"type": "string"},
        "value": {},  # Any type
    },
    "required": ["key", "value"],
}