
Development How-tos

This guide contains specific instructions for common development tasks in Superset.

Contributing to Documentation

The documentation site is built using Docusaurus. All documentation lives in the docs folder, written in Markdown format.

Local Development

To set up your local environment for documentation development:

cd docs
npm install
npm run start

The site will be available at http://localhost:3000

Build

To create a production build:

npm run build
npm run serve # Test the build locally

Deployment

Documentation is automatically deployed when changes are merged to master.

Creating Visualization Plugins

Visualization plugins allow you to add custom chart types to Superset. They are built as npm packages that integrate with the Superset frontend.

Prerequisites

  • Node.js 18+
  • npm or yarn
  • A local Superset development environment

Creating a simple Hello World viz plugin

  1. Install the Superset Yeoman generator:

     npm install -g @superset-ui/generator-superset

  2. Create a new plugin:

     mkdir superset-plugin-chart-hello-world
     cd superset-plugin-chart-hello-world
     yo @superset-ui/superset

  3. Follow the prompts:
     • Package name: superset-plugin-chart-hello-world
     • Chart type: Choose your preferred type
     • Include storybook: Yes (recommended for development)

  4. Develop your plugin: the generator creates a complete plugin structure with TypeScript, React components, and build configuration.

  5. Test your plugin locally:

     npm run dev

  6. Link to your local Superset:

     npm link
     # In your Superset frontend directory:
     npm link superset-plugin-chart-hello-world

  7. Import and register in Superset: edit superset-frontend/src/visualizations/presets/MainPreset.js to include your plugin.

Testing

Python Testing

Run Python tests using pytest:

# Run all tests
pytest

# Run specific test file
pytest tests/unit_tests/test_specific.py

# Run with coverage
pytest --cov=superset

# Run only unit tests
pytest tests/unit_tests

# Run only integration tests
pytest tests/integration_tests
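
For example, a minimal unit test file that pytest will collect (the file path and the `slugify` helper are hypothetical stand-ins for real Superset code):

```python
# tests/unit_tests/test_slugify.py (hypothetical path)

def slugify(title: str) -> str:
    """Toy helper under test; stands in for real Superset code."""
    return title.strip().lower().replace(" ", "-")

def test_slugify_basic():
    assert slugify("My Dashboard") == "my-dashboard"

def test_slugify_strips_whitespace():
    assert slugify("  Sales  ") == "sales"
```

pytest discovers `test_*` functions automatically; `pytest tests/unit_tests/test_slugify.py -k basic` would run a single case.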

Testing with local Presto connections

To test against Presto:

# Start Presto locally using Docker
docker run -p 8080:8080 \
  --name presto \
  -d prestodb/presto

# Connect from Superset by adding a new database with this SQLAlchemy URI
# (note: SQLALCHEMY_DATABASE_URI in superset_config.py is Superset's metadata
# database and should not point at Presto):
presto://localhost:8080/hive/default

Frontend Testing

Run frontend tests using Jest:

cd superset-frontend

# Run all tests
npm run test

# Run with coverage
npm run test -- --coverage

# Run in watch mode
npm run test -- --watch

# Run specific test file
npm run test -- MyComponent.test.tsx

E2E Integration Testing

We support both Playwright (recommended) and Cypress for end-to-end testing.

Playwright is our new E2E testing framework, gradually replacing Cypress.

# Navigate to frontend directory
cd superset-frontend

# Run all Playwright tests
npm run playwright:test
# or: npx playwright test

# Run with interactive UI for debugging
npm run playwright:ui
# or: npx playwright test --ui

# Run in headed mode (see browser)
npm run playwright:headed
# or: npx playwright test --headed

# Run specific test file
npx playwright test tests/auth/login.spec.ts

# Run with debug mode (step through tests)
npm run playwright:debug tests/auth/login.spec.ts
# or: npx playwright test --debug tests/auth/login.spec.ts

# Generate test report
npm run playwright:report

Cypress (DEPRECATED - will be removed)

Cypress is being phased out in favor of Playwright but is still available:

# Set base URL for Cypress
export CYPRESS_BASE_URL='http://localhost:8088'
export CYPRESS_DATABASE=test
export CYPRESS_USERNAME=admin
export CYPRESS_PASSWORD=admin

# Navigate to Cypress directory
cd superset-frontend/cypress-base

# Run interactively
npm run cypress-debug

# Run headless (like CI)
npm run cypress-run-chrome

# Run specific file
npm run cypress-run-chrome -- --spec "cypress/e2e/dashboard/dashboard.test.ts"

Debugging Server App

For debugging the Flask backend:

Using PyCharm/IntelliJ

  1. Create a new Python configuration
  2. Set script path to superset/app.py
  3. Set environment variables:
    • FLASK_ENV=development
    • SUPERSET_CONFIG_PATH=/path/to/superset_config.py
  4. Set breakpoints and run in debug mode

Using VS Code

  1. Add to .vscode/launch.json:

     {
       "version": "0.2.0",
       "configurations": [
         {
           "name": "Python: Flask",
           "type": "python",
           "request": "launch",
           "module": "flask",
           "env": {
             "FLASK_APP": "superset/app.py",
             "FLASK_ENV": "development"
           },
           "args": ["run", "--no-debugger", "--no-reload"],
           "jinja": true
         }
       ]
     }

  2. Set breakpoints and press F5 to debug

Storybook

Storybook is used for developing and testing UI components in isolation:

cd superset-frontend

# Start Storybook
npm run storybook

# Build static Storybook
npm run build-storybook

Access Storybook at http://localhost:6006

Contributing Translations

Superset uses Flask-Babel for internationalization.

Enabling language selection

Edit superset_config.py:

LANGUAGES = {
    'en': {'flag': 'us', 'name': 'English'},
    'fr': {'flag': 'fr', 'name': 'French'},
    'zh': {'flag': 'cn', 'name': 'Chinese'},
}

Creating a new language dictionary

# Initialize a new language
pybabel init -i superset/translations/messages.pot -d superset/translations -l de

Extracting new strings for translation

# Extract Python strings
pybabel extract -F babel.cfg -o superset/translations/messages.pot -k lazy_gettext superset

# Extract JavaScript strings
npm run build-translation

Updating language files

# Update all language files with new strings
pybabel update -i superset/translations/messages.pot -d superset/translations

Applying translations

# Frontend
cd superset-frontend
npm run build-translation

# Backend
pybabel compile -d superset/translations

Linting

Python

We use Ruff for Python linting and formatting:

# Auto-format using ruff
ruff format .

# Lint check with ruff
ruff check .

# Lint fix with ruff
ruff check --fix .

Pre-commit hooks run automatically on git commit if installed.

TypeScript

We use ESLint and Prettier for TypeScript:

cd superset-frontend

# Run eslint checks
npm run lint

# Run tsc (typescript) checks
npm run type

# Fix lint issues
npm run lint-fix

# Format with Prettier
npm run prettier

GitHub Ephemeral Environments

For every PR, an ephemeral environment is automatically deployed for testing.

Access pattern: https://pr-{PR_NUMBER}.superset.apache.org

Features:

  • Automatically deployed on PR creation/update
  • Includes sample data
  • Destroyed when PR is closed
  • Useful for UI/UX review

Tips and Tricks

Using Docker for Development

# Rebuild specific service
docker compose build superset

# View logs
docker compose logs -f superset

# Execute commands in container
docker compose exec superset bash

# Reset database
docker compose down -v
docker compose up

Hot Reloading

Frontend: Webpack dev server provides hot module replacement automatically.

Backend: Use Flask debug mode:

FLASK_ENV=development superset run -p 8088 --with-threads --reload

Performance Profiling

For Python profiling:

# In superset_config.py
PROFILING = True
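
Independent of Superset's PROFILING flag, a quick way to profile a hot code path is the stdlib cProfile module; this sketch uses a toy stand-in function:

```python
import cProfile
import io
import pstats

def slow_query_post_processing(rows):
    # Stand-in for an expensive transformation worth profiling.
    return sorted(rows, key=lambda r: r[1])

profiler = cProfile.Profile()
profiler.enable()
result = slow_query_post_processing([(i, -i) for i in range(10000)])
profiler.disable()

# Print the five most expensive calls by cumulative time.
stream = io.StringIO()
pstats.Stats(profiler, stream=stream).sort_stats("cumulative").print_stats(5)
print("function calls" in stream.getvalue())
```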

For React profiling:

  • Use React DevTools Profiler
  • Enable performance marks in Chrome DevTools

Database Migrations

# Create a new migration
superset db migrate -m "Description of changes"

# Apply migrations
superset db upgrade

# Downgrade
superset db downgrade
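
A generated migration file looks roughly like this sketch (the revision ids, table, and column are placeholders, not a real Superset migration):

```python
# Hypothetical Alembic migration produced by `superset db migrate`.
import sqlalchemy as sa
from alembic import op

revision = "abc123"        # placeholder revision id
down_revision = "def456"   # placeholder parent revision

def upgrade():
    # Add a nullable column so the migration is safe on existing rows.
    op.add_column("dashboards", sa.Column("description", sa.Text(), nullable=True))

def downgrade():
    # Reverse of upgrade(), run by `superset db downgrade`.
    op.drop_column("dashboards", "description")
```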

Useful Aliases

Add to your shell profile:

alias sdev='FLASK_ENV=development superset run -p 8088 --with-threads --reload'
alias stest='pytest tests/unit_tests'
alias slint='pre-commit run --all-files'
alias sfront='cd superset-frontend && npm run dev-server'

Common Issues and Solutions

Node/npm Issues

# Clear npm cache
npm cache clean --force

# Reinstall dependencies
rm -rf node_modules package-lock.json
npm install

Python Environment Issues

# Recreate virtual environment
deactivate
rm -rf venv
python3 -m venv venv
source venv/bin/activate
pip install -r requirements/development.txt
pip install -e .

Database Issues

# Reset local database
superset db downgrade -r base
superset db upgrade
superset init

Port Already in Use

# Find process using port
lsof -i :8088
# Kill process
kill -9 [PID]
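
When `lsof` isn't available, a stdlib check for whether the dev port is already bound can be sketched like this (host and port are the usual local defaults):

```python
import socket

def port_in_use(port: int, host: str = "127.0.0.1") -> bool:
    """Return True if something is already listening on host:port."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
        # connect_ex returns 0 on a successful TCP connection.
        return sock.connect_ex((host, port)) == 0

print(port_in_use(8088))
```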

Reporting Security Vulnerabilities

Please report security vulnerabilities to private@superset.apache.org.

If a community member discovers a security flaw in Superset, it is important to follow the Apache Security Guidelines so a fix can be released before public disclosure. Do not report security vulnerabilities through the usual GitHub Issues channel, as doing so publicizes the flaw before a fix can be applied.

SQL Lab Async Configuration

It's possible to configure a local database to operate in async mode, which is useful when working on async-related features.

To do this, you'll need to:

  • Add an additional database entry. We recommend copying the connection string from the database labeled main, then enabling SQL Lab and the features you want to use. Don't forget to check the Async box.

  • Configure a results backend. Here's a local FileSystemCache example; it's not recommended for production, but it's perfect for testing (the cache is stored in /tmp):

    from flask_caching.backends.filesystemcache import FileSystemCache
    RESULTS_BACKEND = FileSystemCache('/tmp/sqllab')
  • Start up a celery worker

    celery --app=superset.tasks.celery_app:app worker -O fair

Note that:

  • For changes that affect the worker logic, you'll have to restart the Celery worker process for the changes to take effect.
  • The message queue used is a SQLite database via the experimental SQLAlchemy broker. This is fine for testing, but not recommended for production.
  • In some cases, you may want to create a context more closely aligned with your production environment, using a similar broker and results backend configuration.
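
For a setup closer to production, a superset_config.py sketch using Redis as both the Celery broker and the results backend might look like this (the Redis location and key prefix are assumptions for a local instance):

```python
# superset_config.py (sketch; Redis at 127.0.0.1:6379 is an assumption)
from flask_caching.backends.rediscache import RedisCache

class CeleryConfig:
    broker_url = "redis://127.0.0.1:6379/0"
    imports = ("superset.sql_lab",)
    result_backend = "redis://127.0.0.1:6379/0"

CELERY_CONFIG = CeleryConfig

# Store SQL Lab query results in Redis instead of FileSystemCache.
RESULTS_BACKEND = RedisCache(
    host="127.0.0.1", port=6379, key_prefix="superset_results"
)
```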

Async Chart Queries

It's possible to configure database queries for charts to operate in async mode. This is especially useful for dashboards with many charts that may otherwise be affected by browser connection limits. To enable async queries for dashboards and Explore, the following dependencies are required:

  • Redis 5.0+ (the feature utilizes Redis Streams)
  • Cache backends enabled via the CACHE_CONFIG and DATA_CACHE_CONFIG config settings
  • Celery workers configured and running to process async tasks
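
A minimal superset_config.py sketch wiring these pieces together might look like the following (the Redis host/port, database numbers, and JWT secret are assumptions for a local setup):

```python
# superset_config.py (sketch; Redis at 127.0.0.1:6379 is an assumption)
FEATURE_FLAGS = {"GLOBAL_ASYNC_QUERIES": True}

# Redis Streams connection used to push async query events to the browser.
GLOBAL_ASYNC_QUERIES_REDIS_CONFIG = {
    "host": "127.0.0.1",
    "port": 6379,
    "db": 0,
}
# Must be a long random string in any real deployment.
GLOBAL_ASYNC_QUERIES_JWT_SECRET = "change-me-to-a-long-random-string"

# Cache backends required by the feature.
CACHE_CONFIG = {
    "CACHE_TYPE": "RedisCache",
    "CACHE_REDIS_URL": "redis://127.0.0.1:6379/1",
}
DATA_CACHE_CONFIG = {
    "CACHE_TYPE": "RedisCache",
    "CACHE_REDIS_URL": "redis://127.0.0.1:6379/2",
}
```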

Need Help?