# Running Tests

This guide covers everything you need to know about running tests in the Letzshop Import project.

## Prerequisites

Ensure you have the test dependencies installed:

```bash
pip install -r tests/requirements-test.txt
```

Required packages:

- `pytest>=7.4.0` - Test framework
- `pytest-cov>=4.1.0` - Coverage reporting
- `pytest-asyncio>=0.21.0` - Async test support
- `pytest-mock>=3.11.0` - Mocking utilities
- `httpx>=0.24.0` - HTTP client for API tests
- `faker>=19.0.0` - Test data generation

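To check that the setup works, a minimal pytest-style module can be run with a bare `pytest` invocation. The helper and test names below are hypothetical, not part of the project:

```python
# test_example.py - a minimal pytest-style module (hypothetical names)

def slugify(name: str) -> str:
    """Toy helper under test: lowercase and hyphenate a product name."""
    return "-".join(name.lower().split())

def test_slugify_basic():
    # pytest discovers test_* functions and runs plain assert statements
    assert slugify("Red Chair") == "red-chair"

class TestSlugify:
    # Classes named Test* group related cases (see python_classes in pytest.ini)
    def test_handles_extra_spaces(self):
        assert slugify("  Red   Chair ") == "red-chair"
```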
## Basic Test Commands

### Run All Tests

```bash
# Run the entire test suite
pytest

# Run with verbose output
pytest -v

# Run with very verbose output (show individual test results)
pytest -vv
```

### Run by Test Type

```bash
# Run only unit tests (fast)
pytest -m unit

# Run only integration tests
pytest -m integration

# Run unit and integration tests
pytest -m "unit or integration"

# Run everything except slow tests
pytest -m "not slow"

# Run performance tests only
pytest -m performance
```
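The `-m` selectors match markers applied in test code; with `--strict-markers` enabled (as in our pytest.ini), each marker must also be declared in the `markers` section. A sketch of how tests might be marked (test names hypothetical):

```python
import pytest

@pytest.mark.unit
def test_parse_price():
    # A fast, dependency-free check: selected by `pytest -m unit`
    assert int("42") == 42

@pytest.mark.integration
@pytest.mark.slow
def test_full_import_roundtrip():
    # Stacked markers: selected by -m integration, excluded by -m "not slow"
    pass

# An entire module can also be marked at once:
pytestmark = pytest.mark.unit
```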

### Run by Directory

```bash
# Run all unit tests
pytest tests/unit/

# Run all integration tests
pytest tests/integration/

# Run API endpoint tests
pytest tests/integration/api/

# Run service layer tests
pytest tests/unit/services/
```

### Run Specific Files

```bash
# Run a specific test file
pytest tests/unit/services/test_product_service.py

# Run multiple specific files
pytest tests/unit/services/test_product_service.py tests/unit/utils/test_data_processing.py
```

### Run Specific Tests

```bash
# Run a specific test class
pytest tests/unit/services/test_product_service.py::TestProductService

# Run a specific test method
pytest tests/unit/services/test_product_service.py::TestProductService::test_create_product_success

# Run tests matching a pattern
pytest -k "product and create"
pytest -k "test_create_product"
```
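`-k` matches substrings of test, class, and module names and supports `and`/`or`/`not`. Given hypothetical tests like the ones below, `-k "product and create"` selects only the first two:

```python
class TestProductService:
    def test_create_product_success(self):    # matched: name contains "product" and "create"
        assert True

    def test_create_product_duplicate(self):  # matched
        assert True

    def test_delete_product(self):            # not matched: no "create"
        assert True

def test_create_order():                      # not matched: no "product"
    assert True
```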

## Advanced Test Options

### Coverage Reporting

```bash
# Run with coverage report
pytest --cov=app

# Coverage with missing lines
pytest --cov=app --cov-report=term-missing

# Generate HTML coverage report
pytest --cov=app --cov-report=html

# Coverage for specific modules
pytest --cov=app.services --cov=app.api

# Fail if coverage is below the threshold
pytest --cov=app --cov-fail-under=80
```

### Output and Debugging

```bash
# Show local variables on failure
pytest --tb=short --showlocals

# Stop on first failure
pytest -x

# Stop after N failures
pytest --maxfail=3

# Show print statements (disable output capturing)
pytest -s

# Turn warnings into errors
pytest -W error

# Capture output at the Python sys level (the default is file-descriptor capture)
pytest --capture=sys
```

### Test Selection and Filtering

```bash
# Run only the tests that failed in the last run
pytest --lf

# Run failed tests first, then the rest
pytest --ff

# Run tests that match a keyword expression
pytest -k "user and not admin"

# Run only tests affected by recent code changes (requires pytest-testmon)
pytest --testmon

# Collect tests without running them
pytest --collect-only
```

### Performance and Parallel Execution

```bash
# Show the 10 slowest tests
pytest --durations=10

# Show all test durations
pytest --durations=0

# Run tests in parallel (requires pytest-xdist)
pytest -n auto
pytest -n 4  # Use 4 workers
```

## Test Environment Setup

### Database Tests

```bash
# Run database tests (uses the test database)
pytest -m database

# Run the database integration suite with short tracebacks
pytest --tb=short tests/integration/database/
```
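Database markers typically rely on a fixture in `tests/conftest.py` that builds a throwaway schema for each test. A minimal stdlib-only sketch of that pattern, using `sqlite3` and hypothetical table/fixture names (the real project wires this through its own engine in conftest):

```python
import sqlite3

def make_test_db() -> sqlite3.Connection:
    """Create a fresh in-memory database with a minimal schema."""
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE products (id INTEGER PRIMARY KEY, name TEXT NOT NULL)")
    return conn

# In tests/conftest.py this would be wrapped as a fixture:
#
# @pytest.fixture
# def db_conn():
#     conn = make_test_db()
#     yield conn       # the test runs here
#     conn.close()     # teardown: every test starts from a clean database
```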

### API Tests

```bash
# Run API endpoint tests
pytest -m api

# Run with the test client, verbose
pytest tests/integration/api/ -v
```

### Authentication Tests

```bash
# Run auth-related tests
pytest -m auth

# Run security tests (uses user fixtures)
pytest tests/integration/security/ -v
```

## Configuration Options

### pytest.ini Settings

Our pytest.ini is configured with:

```ini
[pytest]
# Test discovery
testpaths = tests
python_files = test_*.py
python_classes = Test*
python_functions = test_*

# Default options
addopts =
    -v
    --tb=short
    --strict-markers
    --color=yes
    --durations=10
    --cov=app
    --cov-report=term-missing
    --cov-report=html:htmlcov
    --cov-fail-under=80

# Custom markers
markers =
    unit: Unit tests
    integration: Integration tests
    # ... other markers
```

### Environment Variables

```bash
# Set the test environment
export TESTING=true

# Database URL for tests (in-memory SQLite by default)
export TEST_DATABASE_URL="sqlite:///:memory:"

# Disable external API calls in tests
export MOCK_EXTERNAL_APIS=true
```
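Test code can branch on these variables and replace outbound calls with canned responses. A stdlib-only sketch of the `MOCK_EXTERNAL_APIS` pattern using `unittest.mock.patch.object`; `PriceClient` and its payload are hypothetical:

```python
import os
from unittest.mock import patch

class PriceClient:
    """Hypothetical client for an external pricing API."""
    def fetch(self) -> dict:
        raise RuntimeError("network call attempted during tests")

def get_prices(client: PriceClient) -> dict:
    return client.fetch()

def test_get_prices_respects_mock_flag():
    os.environ["MOCK_EXTERNAL_APIS"] = "true"  # normally exported in the shell
    client = PriceClient()
    if os.environ.get("MOCK_EXTERNAL_APIS") == "true":
        # Replace the outbound call with a canned payload
        with patch.object(client, "fetch", return_value={"chair": 10.0}):
            assert get_prices(client) == {"chair": 10.0}
```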

## Common Test Scenarios

### Development Workflow

```bash
# Quick smoke test (unit tests only)
pytest -m unit --maxfail=1

# Full test run before commit
pytest -m "unit or integration"

# Pre-push comprehensive test
pytest --cov=app --cov-fail-under=80
```

### Debugging Failed Tests

```bash
# Re-run failed tests with detailed output
pytest --lf -vv --tb=long --showlocals

# Drop into the debugger on failure (uses the built-in pdb; pass --pdbcls for ipdb)
pytest --pdb

# Run a specific failing test in isolation
pytest tests/path/to/test.py::TestClass::test_method -vv -s
```

### Performance Testing

```bash
# Run performance tests only
pytest -m performance --durations=0

# Run with performance profiling (requires pytest-profiling)
pytest -m performance --profile

# Load-test specific endpoints
pytest tests/performance/test_api_performance.py -v
```

### CI/CD Pipeline Tests

```bash
# Minimal test run (fast feedback)
pytest -m "unit and not slow" --maxfail=5

# Full CI test run
pytest --cov=app --cov-report=xml --junitxml=test-results.xml

# Security and integration tests
pytest -m "security or integration" --tb=short
```

## Test Reports and Output

### Coverage Reports

```bash
# Terminal coverage report
pytest --cov=app --cov-report=term

# HTML coverage report
pytest --cov=app --cov-report=html
open htmlcov/index.html  # macOS; use xdg-open on Linux

# XML coverage for CI
pytest --cov=app --cov-report=xml
```

### JUnit XML Reports

```bash
# Generate JUnit XML (for CI integration)
pytest --junitxml=test-results.xml

# With coverage and JUnit
pytest --cov=app --cov-report=xml --junitxml=test-results.xml
```

## Troubleshooting

### Common Issues

#### Import Errors

```bash
# If you get import errors, ensure PYTHONPATH is set
PYTHONPATH=. pytest

# Or install the package in development mode
pip install -e .
```

#### Database Connection Issues

```bash
# Check that the test database is accessible
python -c "from tests.conftest import engine; engine.connect()"

# Run against an in-memory database
TEST_DATABASE_URL="sqlite:///:memory:" pytest
```

#### Fixture Not Found

```bash
# Ensure conftest.py is in the right location
ls tests/conftest.py

# List the fixtures visible to a test directory
pytest --fixtures tests/unit/
```
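A fixture is only visible to a test if it is defined in the test module itself or in a `conftest.py` on the path from the rootdir down to that test file. A minimal sketch of the pattern (fixture and file names hypothetical):

```python
# tests/conftest.py
import pytest

@pytest.fixture
def sample_product() -> dict:
    """Shared test data, visible to every test under tests/."""
    return {"id": 1, "name": "chair"}

# tests/unit/test_example.py
def test_uses_fixture(sample_product):
    # pytest injects the fixture by matching the argument name
    assert sample_product["name"] == "chair"
```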

#### Permission Issues

```bash
# Ensure test files are readable (they do not need the executable bit)
chmod -R u+rw tests/

# Clear the pytest cache
rm -rf .pytest_cache/
```

### Performance Issues

```bash
# Identify slow tests (only report those over 1 second)
pytest --durations=10 --durations-min=1.0

# Profile test execution (requires pytest-profiling)
pytest --profile-svg

# Run a subset of tests for faster feedback
pytest -m "unit and not slow"
```

## Integration with IDEs

### VS Code

Add to .vscode/settings.json:

```json
{
    "python.testing.pytestEnabled": true,
    "python.testing.pytestArgs": [
        "tests",
        "-v"
    ],
    "python.testing.cwd": "${workspaceFolder}"
}
```

### PyCharm

1. Go to Settings → Tools → Python Integrated Tools
2. Set Testing → Default test runner to "pytest"
3. Set pytest options: `-v --tb=short`

## Best Practices

### During Development

1. **Run unit tests frequently** - fast feedback loop
2. **Run integration tests before commits** - catch interaction issues
3. **Check coverage regularly** - ensure good test coverage
4. **Use descriptive test names** - failures are easy to understand

### Before Code Review

1. **Run the full test suite** - `pytest`
2. **Check coverage meets the threshold** - `pytest --cov=app --cov-fail-under=80`
3. **Ensure no warnings** - `pytest -W error`
4. **Test with a fresh environment** - new terminal, clean cache

### In CI/CD

1. **Fail fast on unit tests** - `pytest -m unit --maxfail=1`
2. **Generate reports** - coverage and JUnit XML
3. **Run performance tests on a schedule** - not on every commit
4. **Archive test results** - for debugging and trend analysis

Need help with a specific testing scenario? Check our Testing FAQ or open a GitHub issue!