feat: add unified code quality dashboard with multiple validators
- Add validator_type field to scans and violations (architecture, security, performance)
- Create security validator with SEC-xxx rules
- Create performance validator with PERF-xxx rules
- Add base validator class for shared functionality
- Add validate_all.py script to run all validators
- Update code quality service with validator type filtering
- Add validator type tabs to dashboard UI
- Add validator type filter to violations list
- Update stats response with per-validator breakdown
- Add security and performance rules documentation
- Add chat-bubble icons to icon library

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
.performance-rules/async.yaml (new file, 142 lines)

@@ -0,0 +1,142 @@
# Async & Concurrency Performance Rules
# =====================================

async_rules:
  - id: "PERF-036"
    name: "Async for I/O operations"
    severity: info
    description: |
      Use async for I/O-bound operations:
      - Database queries (with async driver)
      - External API calls
      - File operations
      - Network requests
    file_pattern: "**/api/**/*.py|**/service*.py"
    suggested_patterns:
      - "async def|await|asyncio"
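    # Illustrative sketch, not part of the original rule: fetch_user is a
    # hypothetical async-driver helper.
    example_bad: |
      def get_profile(user_id):
          user = db.fetch_user(user_id)  # blocking call ties up the worker
    example_good: |
      async def get_profile(user_id):
          user = await db.fetch_user(user_id)  # yields to the event loop while waiting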

  - id: "PERF-037"
    name: "Parallel independent operations"
    severity: warning
    description: |
      Multiple independent async operations should run in parallel.
      Use asyncio.gather() instead of sequential awaits.
    file_pattern: "**/*.py"
    anti_patterns:
      - 'await\\s+\\w+\\([^)]*\\)\\s*\\n\\s*await\\s+\\w+\\([^)]*\\)\\s*\\n\\s*await\\s+\\w+\\('
    suggested_patterns:
      - "asyncio\\.gather|asyncio\\.create_task"
    example_bad: |
      user = await get_user(user_id)
      orders = await get_orders(user_id)
      preferences = await get_preferences(user_id)
    example_good: |
      user, orders, preferences = await asyncio.gather(
          get_user(user_id),
          get_orders(user_id),
          get_preferences(user_id)
      )

  - id: "PERF-038"
    name: "Background tasks for slow operations"
    severity: warning
    description: |
      Operations taking > 500ms should run in the background:
      - Email sending
      - Report generation
      - External API syncs
      - File processing
    file_pattern: "**/api/**/*.py"
    suggested_patterns:
      - "BackgroundTasks|background_task|celery|rq|dramatiq"
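    # Sketch using FastAPI's BackgroundTasks (one of the patterns this rule
    # suggests); send_welcome_email is a hypothetical helper.
    example_bad: |
      @app.post("/signup")
      async def signup(user: User):
          send_welcome_email(user.email)  # response waits for the full send
    example_good: |
      @app.post("/signup")
      async def signup(user: User, background_tasks: BackgroundTasks):
          background_tasks.add_task(send_welcome_email, user.email)  # runs after response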

  - id: "PERF-039"
    name: "Connection pooling for HTTP clients"
    severity: warning
    description: |
      HTTP clients should reuse connections.
      Create the client once, not per request.
    file_pattern: "**/*client*.py|**/service*.py"
    anti_patterns:
      - 'def\\s+\\w+\\([^)]*\\):\\s*\\n[^}]*requests\\.get\\('
      - 'httpx\\.get\\('
      - 'aiohttp\\.request\\('
    suggested_patterns:
      - "httpx\\.AsyncClient|aiohttp\\.ClientSession|requests\\.Session"
    example_bad: |
      def fetch_data(url):
          response = requests.get(url)  # New connection each time
    example_good: |
      # Use a session (connection pool)
      async with httpx.AsyncClient() as client:
          response = await client.get(url)

  - id: "PERF-040"
    name: "Timeout configuration"
    severity: error
    description: |
      All external calls must have timeouts.
      Without timeouts, requests can hang indefinitely.
    file_pattern: "**/*client*.py|**/service*.py"
    context_patterns:
      - "requests|httpx|aiohttp|urllib"
    required_patterns:
      - "timeout"
    example_bad: |
      response = requests.get(url)
    example_good: |
      response = requests.get(url, timeout=30)

  - id: "PERF-041"
    name: "Connection pool limits"
    severity: info
    description: |
      Configure appropriate connection pool limits:
      - max_connections: Total connections
      - max_keepalive_connections: Idle connections
      - keepalive_expiry: Time before closing idle connections
    file_pattern: "**/*client*.py"
    suggested_patterns:
      - "max_connections|pool_connections|pool_maxsize"
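    # Sketch using httpx.Limits; the numbers are illustrative, not recommendations.
    example_good: |
      limits = httpx.Limits(
          max_connections=100,
          max_keepalive_connections=20,
          keepalive_expiry=5.0,
      )
      client = httpx.AsyncClient(limits=limits)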

  - id: "PERF-042"
    name: "Retry with backoff"
    severity: info
    description: |
      External calls should retry with exponential backoff.
      Prevents cascade failures and respects rate limits.
    file_pattern: "**/*client*.py|**/service*.py"
    suggested_patterns:
      - "retry|backoff|tenacity|Retry"
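    # Sketch using tenacity (one of the suggested libraries); the wait/stop
    # values are illustrative.
    example_good: |
      @retry(wait=wait_exponential(multiplier=1, max=30), stop=stop_after_attempt(5))
      def call_external_api(url):
          return requests.get(url, timeout=10)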

  - id: "PERF-043"
    name: "Circuit breaker pattern"
    severity: info
    description: |
      Use a circuit breaker for unreliable external services.
      Prevents repeated failures from slowing down the system.
    file_pattern: "**/*client*.py"
    suggested_patterns:
      - "circuit_breaker|CircuitBreaker|pybreaker"
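    # Sketch using pybreaker (listed in suggested_patterns); the thresholds
    # are illustrative.
    example_good: |
      breaker = pybreaker.CircuitBreaker(fail_max=5, reset_timeout=60)

      @breaker
      def call_flaky_service(url):
          return requests.get(url, timeout=10)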

  - id: "PERF-044"
    name: "Task queues for heavy processing"
    severity: info
    description: |
      Heavy processing should use task queues:
      - Celery
      - RQ (Redis Queue)
      - Dramatiq
      - Huey
    file_pattern: "**/tasks/**/*.py"
    suggested_patterns:
      - "celery|rq|dramatiq|huey|@task"
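    # Minimal Celery sketch; generate_report is a hypothetical task.
    example_good: |
      @app.task
      def generate_report(report_id):
          ...  # heavy work runs on a worker, not in the request handler

      generate_report.delay(report_id)  # enqueue instead of calling inline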

  - id: "PERF-045"
    name: "Worker pool sizing"
    severity: info
    description: |
      Size worker pools appropriately:
      - CPU-bound: Number of cores
      - I/O-bound: Higher multiplier (2-4x cores)
      - Memory-constrained: Based on available RAM
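    # Sketch of the heuristic above; os.cpu_count() can return None, hence
    # the fallback.
    example_good: |
      cores = os.cpu_count() or 1
      cpu_workers = cores       # CPU-bound: one worker per core
      io_workers = cores * 4    # I/O-bound: 2-4x cores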