# Code Review Process

Understand how code reviews work in Apache Superset and how to participate effectively.

## Overview

Code review is a critical part of maintaining code quality and sharing knowledge across the team. Every change to Superset goes through peer review before merging.
## For Authors

### Preparing for Review

#### Before Requesting Review

- Self-review your changes
- Ensure CI checks pass
- Add comprehensive tests
- Update documentation
- Fill out the PR template completely
- Add screenshots for UI changes
#### Self-Review Checklist

```bash
# View your changes
git diff upstream/master

# Check for common issues:
# - Commented-out code
# - Debug statements (console.log, print)
# - TODO comments that should be addressed
# - Hardcoded values that should be configurable
# - Missing error handling
# - Performance implications
```
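Part of that manual scan can be automated. Below is a minimal sketch, not a Superset tool: the patterns and the `find_debug_lines` helper are assumptions for illustration, meant to be run over `git diff upstream/master` output.

```python
import re

# Patterns that usually indicate leftover debug code
# (an assumed, non-exhaustive list -- tune for your project)
DEBUG_PATTERNS = [
    re.compile(r"\bconsole\.log\("),
    re.compile(r"\bprint\("),
    re.compile(r"\bdebugger\b"),
    re.compile(r"\bTODO\b"),
]

def find_debug_lines(diff_text: str) -> list[str]:
    """Return added lines ('+' in a unified diff) that match a debug pattern."""
    hits = []
    for line in diff_text.splitlines():
        # Only inspect lines the diff adds, skipping the '+++ file' header
        if line.startswith("+") and not line.startswith("+++"):
            if any(p.search(line) for p in DEBUG_PATTERNS):
                hits.append(line[1:].strip())
    return hits

diff = """\
+++ b/app.py
+print("debugging")
+result = compute(x)
"""
print(find_debug_lines(diff))  # the print() line is flagged
```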
### Requesting Review

#### Auto-Assignment

GitHub automatically requests reviews based on the `CODEOWNERS` file.

#### Manual Assignment

For specific expertise, request additional reviewers:

- Frontend changes: Tag frontend experts
- Backend changes: Tag backend experts
- Security changes: Tag the security team
- Database changes: Tag database experts
#### Review Request Message

```text
@reviewer This PR implements [feature]. Could you please review:

1. The approach taken in [file]
2. Performance implications of [change]
3. Security considerations for [feature]

Thanks!
```
### Responding to Feedback

#### Best Practices

- Be receptive: Reviews improve code quality
- Ask questions: Clarify if feedback is unclear
- Explain decisions: Share context for your choices
- Update promptly: Address feedback in a timely manner
#### Comment Responses

```text
# Acknowledging
"Good catch! Fixed in [commit hash]"

# Explaining
"I chose this approach because [reason]. Would you prefer [alternative]?"

# Questioning
"Could you elaborate on [concern]? I'm not sure I understand the issue."

# Disagreeing respectfully
"I see your point, but I think [current approach] works better because [reason]. What do you think?"
```
## For Reviewers

### Review Responsibilities

#### What to Review

- Correctness: Does the code do what it claims?
- Design: Is the approach appropriate?
- Clarity: Is the code readable and maintainable?
- Testing: Are the tests comprehensive?
- Performance: Any performance concerns?
- Security: Any security issues?
- Documentation: Is it well documented?
### Review Checklist

#### Functionality

- Feature works as described
- Edge cases are handled
- Error handling is appropriate
- Backwards compatibility maintained

#### Code Quality

- Follows project conventions
- No code duplication
- Clear variable/function names
- Appropriate abstraction levels
- SOLID principles followed

#### Testing

- Unit tests for business logic
- Integration tests for APIs
- E2E tests for critical paths
- Tests are maintainable
- Good test coverage
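The "unit tests for business logic" item can be very small and still valuable. A hedged sketch in pytest style (Superset's test runner); `apply_discount` is a hypothetical helper, not Superset code:

```python
import pytest

def apply_discount(price: float, percent: float) -> float:
    """Hypothetical business-logic helper: apply a percentage discount."""
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(price * (1 - percent / 100), 2)

# Cover the happy path, a boundary, and the error case
def test_apply_discount_basic():
    assert apply_discount(200.0, 25) == 150.0

def test_apply_discount_boundary():
    assert apply_discount(99.99, 0) == 99.99

def test_apply_discount_rejects_bad_percent():
    with pytest.raises(ValueError):
        apply_discount(100.0, 150)
```

Three short tests like these already hit the edge-case and error-handling points from the checklist above.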
#### Security

- Input validation
- SQL injection prevention
- XSS prevention
- CSRF protection
- Authentication/authorization checks
- No sensitive data in logs
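The SQL-injection item boils down to never interpolating user input into SQL strings. A dependency-free `sqlite3` sketch of what to look for in review (the table and payload are illustrative):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT)")
conn.execute("INSERT INTO users VALUES ('alice')")

user_input = "alice' OR '1'='1"  # a classic injection payload

# ❌ Vulnerable: the payload is spliced into the SQL string,
# so the injected OR clause matches every row
vulnerable = conn.execute(
    f"SELECT name FROM users WHERE name = '{user_input}'"
).fetchall()

# ✅ Safe: the driver binds the payload as a parameter,
# so it is compared as a literal string and matches nothing
safe = conn.execute(
    "SELECT name FROM users WHERE name = ?", (user_input,)
).fetchall()

print(vulnerable)  # [('alice',)] -- the injection succeeded
print(safe)        # [] -- the payload matched no row
```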
#### Performance

- Database queries optimized
- No N+1 queries
- Appropriate caching
- Frontend bundle size impact
- Memory usage considerations
### Providing Feedback

#### Effective Comments

```python
# ✅ Good: Specific and actionable
"This query could cause N+1 problems. Consider using
`select_related('user')` to fetch users in a single query."

# ❌ Bad: Vague
"This doesn't look right."
```

```javascript
// ✅ Good: Suggests improvement
"Consider using useMemo here to prevent unnecessary
re-renders when dependencies haven't changed."

// ❌ Bad: Just criticism
"This is inefficient."
```
#### Comment Types

Use GitHub's review types:

- Comment: General feedback or questions
- Approve: Changes look good
- Request Changes: Must be addressed before merge

Prefix conventions:

| Prefix | Meaning |
| --- | --- |
| `nit:` | Minor issue (non-blocking) |
| `suggestion:` | Recommended improvement |
| `question:` | Seeking clarification |
| `blocker:` | Must be fixed |
| `praise:` | Highlighting good work |
#### Examples

```text
nit: Consider renaming `getData` to `fetchUserData` for clarity

suggestion: This could be simplified using Array.reduce()

question: Is this intentionally not handling the error case?

blocker: This SQL is vulnerable to injection. Please use parameterized queries.

praise: Excellent test coverage! 👍
```
## Review Process

### Timeline

#### Expected Response Times

- Initial review: Within 2-3 business days
- Follow-up review: Within 1-2 business days
- Critical fixes: ASAP (tag in Slack)

#### Escalation

If there is no response after 3 days:

- Ping the reviewer in the PR comments
- Ask in the #development Slack channel
- Tag @apache/superset-committers
### Approval Requirements

#### Minimum Requirements

- 1 approval from a committer for minor changes
- 2 approvals for significant features
- 3 approvals for breaking changes

#### Special Cases

- Security changes: Require security team review
- API changes: Require API team review
- Database migrations: Require database expert review
- UI/UX changes: Require design review
### Merge Process

#### Who Can Merge

- Committers with write access
- After all requirements are met
- CI checks must pass

#### Merge Methods

- Squash and merge: The default for feature PRs
- Rebase and merge: For preserving a clean multi-commit history
- Create a merge commit: Rarely used

#### Merge Checklist

- All CI checks green
- Required approvals obtained
- No unresolved conversations
- PR title follows conventions
- Milestone set (if applicable)
## Review Etiquette

### Do's

- ✅ Be kind and constructive
- ✅ Acknowledge time and effort
- ✅ Provide specific examples
- ✅ Suggest solutions
- ✅ Praise good work
- ✅ Consider cultural differences
- ✅ Focus on the code, not the person

### Don'ts

- ❌ Use harsh or dismissive language
- ❌ Bikeshed on minor preferences
- ❌ Review when tired or frustrated
- ❌ Make personal attacks
- ❌ Ignore the PR description
- ❌ Demand perfection
## Becoming a Reviewer

### Path to Reviewer

- Contribute regularly: Submit quality PRs
- Participate in discussions: Share knowledge
- Review others' code: Start with comments
- Build expertise: Focus on specific areas
- Get nominated: By existing committers

### Reviewer Expectations

- Review PRs in your area of expertise
- Respond within a reasonable time
- Mentor new contributors
- Maintain high standards
- Stay current with best practices
## Advanced Topics

### Reviewing Large PRs

#### Strategy

- Request splitting: Ask the author to break the change into smaller PRs
- Review in phases:
  - Architecture/approach first
  - Implementation details second
  - Tests and docs last
- Use draft reviews: Save comments and submit them together

### Cross-Team Reviews

#### When Needed

- Changes affecting multiple teams
- Shared components/libraries
- API contract changes
- Database schema changes
### Performance Reviews

#### Tools

```python
# Backend performance: profile a hot code path with cProfile
import cProfile
import pstats

# Profile the code and write the stats to a file
cProfile.run('function_to_profile()', 'stats.prof')

# Print the 10 most expensive calls by cumulative time
stats = pstats.Stats('stats.prof')
stats.sort_stats('cumulative').print_stats(10)
```

```javascript
// Frontend performance
// - React DevTools Profiler
// - Chrome DevTools Performance tab
// - Lighthouse audits
```
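A fully runnable variant of the cProfile recipe above, keeping the stats in memory instead of a file; `slow_sum` is a stand-in workload, not Superset code:

```python
import cProfile
import io
import pstats

def slow_sum(n: int) -> int:
    """Stand-in workload to profile."""
    return sum(i * i for i in range(n))

# Collect profile data around just the call we care about
profiler = cProfile.Profile()
profiler.enable()
slow_sum(100_000)
profiler.disable()

# Sort by cumulative time and print the 10 most expensive calls
stream = io.StringIO()
pstats.Stats(profiler, stream=stream).sort_stats("cumulative").print_stats(10)
report = stream.getvalue()
print(report)
```

Reviewing the `cumtime` column of the report is usually enough to confirm or refute a "this looks slow" hunch before asking for changes.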