llbbl commented on Aug 23, 2025

Set Up Python Testing Infrastructure

Summary

This PR establishes a comprehensive testing infrastructure for the Python project using Poetry as the package manager and pytest as the testing framework. The setup provides a ready-to-use environment where developers can immediately start writing unit and integration tests.

Changes Made

Package Management

  • Poetry Configuration: Created pyproject.toml with Poetry as the package manager
  • Dependency Migration: Migrated existing dependencies from requirements.txt files
  • Development Dependencies: Added pytest, pytest-cov, and pytest-mock as dev dependencies

Testing Configuration

  • pytest Settings: Configured test discovery patterns, output formatting, and strict markers
  • Coverage Configuration: Set up coverage reporting with HTML/XML outputs
  • Custom Markers: Added unit, integration, and slow markers for test categorization
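
To illustrate how these markers are applied in test code (the marker names match those registered in the pytest configuration; the test names below are hypothetical):

import pytest

@pytest.mark.unit
def test_fast_isolated_behavior():
    # Unit-marked tests are selected with: poetry run pytest -m unit
    assert 1 + 1 == 2

@pytest.mark.integration
@pytest.mark.slow
def test_end_to_end_pipeline():
    # Markers can be combined, e.g. excluded via: poetry run pytest -m "not slow"
    ...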

Directory Structure

tests/
├── __init__.py
├── conftest.py                 # Shared fixtures
├── test_infrastructure_validation.py  # Validation tests
├── unit/
│   └── __init__.py
└── integration/
    └── __init__.py

Shared Fixtures (conftest.py)

  • temp_dir: Temporary directory with automatic cleanup (see the sketch after this list)
  • temp_file: Temporary file creation
  • mock_config: Mock configuration dictionary
  • mock_model: Mock ML model object
  • sample_image_data: Sample image arrays for testing
  • mock_tensorflow_session: Mock TensorFlow session
  • capture_stdout: Stdout capture utility
  • mock_pillow_image: Mock PIL Image object
  • test_data_paths: Test data directory structure
  • reset_modules: Automatic module reset for test isolation
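
A minimal sketch of how two of these fixtures might be written (illustrative only; the actual conftest.py in this PR may differ in detail):

import tempfile
from pathlib import Path

import pytest

@pytest.fixture
def temp_dir():
    # Temporary directory that is removed automatically after each test.
    with tempfile.TemporaryDirectory() as tmp:
        yield Path(tmp)

@pytest.fixture
def mock_config():
    # Minimal configuration dictionary; the keys shown here are placeholders.
    return {"model_path": "models/example.h5", "batch_size": 32}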

Additional Updates

  • Updated .gitignore: Added testing artifacts, coverage reports, and Claude settings
  • Poetry Scripts: Configured both poetry run test and poetry run tests commands
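
For context, a Poetry script maps a command name to a Python callable. A sketch of the kind of entry point these commands could invoke is shown below; the module path and function name are assumptions, not necessarily what this PR uses.

# e.g. referenced from pyproject.toml as: test = "scripts.run_tests:main"
import sys

import pytest

def main() -> None:
    # Forward any extra command-line arguments to pytest and exit with its status code.
    sys.exit(pytest.main(sys.argv[1:]))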

How to Use

Install Dependencies

poetry install

Run Tests

# Run all tests
poetry run test

# Alternative command (both work)
poetry run tests

# Run specific test markers
poetry run pytest -m unit
poetry run pytest -m integration
poetry run pytest -m "not slow"

# Run with specific options
poetry run pytest -v --tb=short

Coverage Reports

After running tests, coverage reports are generated in:

  • Terminal: Displayed with missing lines
  • HTML: htmlcov/index.html
  • XML: coverage.xml

Validation

The infrastructure has been validated with 17 tests that verify:

  • All testing dependencies are properly installed
  • Project structure is correctly set up
  • Fixtures are working as expected
  • Test markers are properly configured
  • Coverage configuration is correct
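
A sketch of the style of check these validation tests perform (not the actual contents of test_infrastructure_validation.py):

import importlib

import pytest

@pytest.mark.unit
def test_testing_dependencies_importable():
    # Each dev dependency should be importable inside the Poetry environment.
    for module_name in ("pytest", "pytest_cov", "pytest_mock"):
        assert importlib.import_module(module_name) is not None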

Notes

  • The coverage threshold is not currently enforced (--cov-fail-under=80 was removed) so the infrastructure can be adopted before the codebase reaches the target coverage
  • The existing code has some syntax errors that prevent coverage from parsing certain files (visible in the validation output); this does not affect the testing infrastructure itself
  • All standard pytest options remain available for flexible test execution

Next Steps

Developers can now:

  1. Write unit tests in tests/unit/ (see the example below)
  2. Write integration tests in tests/integration/
  3. Use the provided fixtures for common testing scenarios
  4. Add custom fixtures to conftest.py as needed
  5. Monitor code coverage and work towards the 80% threshold goal
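
For example, a first unit test under tests/unit/ using the shared temp_dir fixture might look like this (illustrative only, assuming temp_dir yields a pathlib.Path):

import pytest

@pytest.mark.unit
def test_writes_output_file(temp_dir):
    # temp_dir is provided by conftest.py and cleaned up after the test.
    output_path = temp_dir / "result.txt"
    output_path.write_text("ok")
    assert output_path.read_text() == "ok"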

- Configure Poetry as package manager with pyproject.toml
- Add pytest, pytest-cov, and pytest-mock as dev dependencies
- Set up pytest configuration with markers and coverage settings
- Create testing directory structure with unit/integration folders
- Add comprehensive shared fixtures in conftest.py
- Update .gitignore with testing and Claude-related entries
- Create validation tests to verify infrastructure setup
- Configure test commands (poetry run test/tests)