gridpilot.gg/docs/TESTS.md

Testing Strategy

Overview

GridPilot employs a comprehensive BDD (Behavior-Driven Development) testing strategy across three distinct layers: Unit, Integration, and End-to-End (E2E). Each layer validates different aspects of the system while maintaining a consistent Given/When/Then approach that emphasizes behavior over implementation.

This document provides practical guidance on testing philosophy, test organization, tooling, and execution patterns for GridPilot.


BDD Philosophy

Why BDD for GridPilot?

GridPilot manages complex business rules around league management, team registration, event scheduling, result processing, and standings calculation. These rules must be:

  • Understandable by non-technical stakeholders (league admins, race organizers)
  • Verifiable through automated tests that mirror real-world scenarios
  • Maintainable as business requirements evolve

BDD provides a shared vocabulary (Given/When/Then) that bridges the gap between domain experts and developers, ensuring tests document expected behavior rather than technical implementation details.

Given/When/Then Format

All tests—regardless of layer—follow this structure:

// Given: Establish initial state/context
// When: Perform the action being tested
// Then: Assert the expected outcome

Example (Unit Test):

describe('League Domain Entity', () => {
  it('should add a team when team limit not reached', () => {
    // Given
    const league = new League('Summer Series', { maxTeams: 10 });
    const team = new Team('Racing Legends');
    
    // When
    const result = league.addTeam(team);
    
    // Then
    expect(result.isSuccess()).toBe(true);
    expect(league.teams).toContain(team);
  });
});

This pattern applies equally to integration tests (with real database operations) and E2E tests (with full UI workflows).


Test Types & Organization

Unit Tests (Adjacent to Implementation)

Location: Adjacent to the implementation file (e.g., core/ports/media/MediaResolverPort.ts + MediaResolverPort.test.ts)

Scope: Domain entities, value objects, and application use cases with mocked ports (repositories, external services).

Tooling: Vitest (fast, TypeScript-native, ESM support)

Execution: Parallel, target <1 second total runtime

Purpose:

  • Validate business logic in isolation
  • Ensure domain invariants hold (e.g., team limits, scoring rules)
  • Test use case orchestration with mocked dependencies

Key Principle: Unit tests live with their implementation. No separate tests/unit/ directory.

Examples:

  1. Domain Entity Test:

    // core/domain/entities/League.ts + League.test.ts
    Given a League with maxTeams=10 and 9 current teams
    When addTeam() is called with a valid Team
    Then the team is added successfully
    
    Given a League with maxTeams=10 and 10 current teams
    When addTeam() is called
    Then a DomainError is returned with "Team limit reached"
    
  2. Use Case Test:

    // core/application/use-cases/GenerateStandingsUseCase.ts + GenerateStandingsUseCase.test.ts
    Given a League with 5 teams and completed races
    When execute() is called
    Then LeagueRepository.findById() is invoked
    And ScoringRule.calculatePoints() is called for each team
    And sorted standings are returned
    

Key Practices:

  • Mock only at architecture boundaries (ports like ILeagueRepository)
  • Never mock domain entities or value objects
  • Keep tests fast (<10ms per test)
  • Use in-memory test doubles for simple cases
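An in-memory test double for a port can be a real, behavior-complete implementation backed by a `Map`, with no mocking framework involved. A minimal sketch (the `ILeagueRepository` shape and `League` fields here are illustrative, not GridPilot's actual definitions):

```typescript
// Illustrative types; the real GridPilot port and entity may differ.
interface League {
  id: string;
  name: string;
}

interface ILeagueRepository {
  save(league: League): Promise<void>;
  findById(id: string): Promise<League | null>;
}

// In-memory double: real behavior, trivially resettable between tests.
class InMemoryLeagueRepository implements ILeagueRepository {
  private leagues = new Map<string, League>();

  async save(league: League): Promise<void> {
    this.leagues.set(league.id, league);
  }

  async findById(id: string): Promise<League | null> {
    return this.leagues.get(id) ?? null;
  }
}
```

Because the double honors the port contract, tests can assert on observable state (what `findById` returns) rather than on call counts.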

Integration Tests (/tests/integration)

Location: tests/integration/

Scope: Verify adapter composition works correctly. Uses in-memory adapters only.

Tooling: Vitest

Execution: Sequential, ~10 seconds per suite

Purpose:

  • Validate that multiple adapters work together
  • Test use case orchestration with real in-memory implementations
  • Verify infrastructure wiring

Directory Structure:

tests/integration/
├── racing/          # Racing domain integration tests
│   └── RegistrationAndTeamUseCases.test.ts
└── website/         # Website integration tests
    ├── auth-flow.test.ts
    ├── auth-guard.test.ts
    └── middleware.test.ts

Examples:

  1. Use Case Integration Test:

    // tests/integration/racing/RegistrationAndTeamUseCases.test.ts
    Given an in-memory league repository with a league
    And an in-memory team repository
    When RegisterTeamUseCase is executed
    Then the team is added to the league
    And the team is saved to the repository
    
  2. Website Integration Test:

    // tests/integration/website/auth-flow.test.ts
    Given an in-memory authentication system
    When a user logs in
    Then the session is created
    And protected routes are accessible
    

Key Practices:

  • Use in-memory adapters from adapters/**/inmemory/
  • Clean state between tests
  • Test adapter composition, not individual methods
  • NEVER use TypeORM or real databases
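The composition style described above can be sketched as follows. The `RegisterTeamUseCase`, port signature, and result shape are assumptions for illustration, not the actual GridPilot code:

```typescript
// Illustrative sketch; real GridPilot use case and port signatures may differ.
interface Team { name: string }

interface ITeamRepository {
  save(team: Team): Promise<void>;
  findByName(name: string): Promise<Team | null>;
}

class InMemoryTeamRepository implements ITeamRepository {
  private teams = new Map<string, Team>();
  async save(team: Team) { this.teams.set(team.name, team); }
  async findByName(name: string) { return this.teams.get(name) ?? null; }
}

// The use case orchestrates the port; the integration test wires it to the
// real in-memory adapter instead of a mock.
class RegisterTeamUseCase {
  constructor(private teams: ITeamRepository) {}

  async execute(name: string): Promise<{ ok: boolean; error?: string }> {
    if (await this.teams.findByName(name)) {
      return { ok: false, error: 'Team already registered' };
    }
    await this.teams.save({ name });
    return { ok: true };
  }
}
```

The test then exercises the composed pair end to end: execute the use case, then assert against the adapter's observable state.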

E2E Tests (/tests/e2e)

Location: tests/e2e/

Scope: Full system verification with TypeORM/PostgreSQL via Docker.

Tooling: Playwright + Docker Compose

Execution: Sequential, ~2 minutes per scenario

Purpose:

  • Validate complete user journeys from UI to database
  • Ensure services integrate correctly in production-like environment
  • Catch regressions in multi-service workflows

Directory Structure:

tests/e2e/
├── website/         # Website E2E tests
│   └── website-pages.test.ts
└── docker/          # Docker-based E2E (TypeORM)
    └── (future tests)

Examples:

  1. League Creation Workflow:

    Given a PostgreSQL database running in Docker
    And an authenticated league admin
    When they navigate to "Create League"
    And fill in league name, scoring system, and team limit
    And submit the form
    Then the league appears in the admin dashboard
    And the database contains the new league record
    
  2. Full User Journey:

    Given a running Docker Compose stack
    When a user registers
    And creates a league
    And teams join the league
    And races are completed
    Then standings are calculated correctly in the database
    

Key Practices:

  • Use Docker Compose for full stack
  • TypeORM/PostgreSQL for real database
  • Clean database between scenarios
  • Test real user workflows
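A Docker Compose stack for these scenarios might look like the sketch below. Service names, image tags, ports, and environment variables are assumptions for illustration, not the actual GridPilot compose file:

```yaml
# docker-compose.e2e.yml — illustrative sketch only.
services:
  db:
    image: postgres:16
    environment:
      POSTGRES_DB: gridpilot_e2e
      POSTGRES_USER: gridpilot
      POSTGRES_PASSWORD: gridpilot
    ports:
      - "5432:5432"
  website:
    build: .
    environment:
      DATABASE_URL: postgres://gridpilot:gridpilot@db:5432/gridpilot_e2e
    depends_on:
      - db
    ports:
      - "3000:3000"
```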

Structure Tests (/tests/structure)

Location: tests/structure/

Scope: Architecture validation and dependency rules.

Tooling: Vitest

Examples:

  1. Package Dependencies Test:
    // tests/structure/PackageDependencies.test.ts
    Then core layer should not depend on adapters
    And adapters should not depend on apps
    And domain should have zero dependencies
    
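The dependency rules above can be checked mechanically by scanning source text for imports from forbidden layers. A minimal sketch of the core check (the real tests/structure suite may instead walk the filesystem or use a dependency-analysis tool):

```typescript
// Illustrative: return import specifiers in `source` that start with any
// forbidden prefix (e.g. core files importing from '@adapters/').
function findForbiddenImports(source: string, forbiddenPrefixes: string[]): string[] {
  const importRe = /import\s+[^'"]*['"]([^'"]+)['"]/g;
  const hits: string[] = [];
  let m: RegExpExecArray | null;
  while ((m = importRe.exec(source)) !== null) {
    const specifier = m[1];
    if (forbiddenPrefixes.some((p) => specifier.startsWith(p))) {
      hits.push(specifier);
    }
  }
  return hits;
}
```

A structure test would then read each file under core/ and assert that `findForbiddenImports(contents, ['@adapters/', 'apps/'])` returns an empty array.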

Testing Infrastructure

Clean Architecture Test Boundaries

The Golden Rules

  1. Domain Layer (core/)

    • Has ZERO dependencies
    • Unit tests adjacent to implementation
    • Cannot depend on testing utilities
  2. Application Layer (core/application/)

    • Depends only on ports (interfaces)
    • Unit tests adjacent to implementation
    • Cannot depend on concrete adapters
  3. Adapters Layer (adapters/)

    • Contains all infrastructure
    • In-memory implementations for integration tests
    • TypeORM implementations for e2e tests
  4. Apps Layer (apps/)

    • Delivery mechanisms only
    • E2E tests verify full stack

Test Type Separation

| Test Type   | Location           | Database           | Purpose                    |
| ----------- | ------------------ | ------------------ | -------------------------- |
| Unit        | Adjacent to code   | None               | Verify domain logic        |
| Integration | tests/integration/ | In-memory          | Verify adapter composition |
| E2E         | tests/e2e/         | TypeORM/PostgreSQL | Full system verification   |

Enforcement Rules

TypeScript Paths

{
  "compilerOptions": {
    "paths": {
      "@core/*": ["core/*"],
      "@adapters/*": ["adapters/*"],
      "@testing/*": ["adapters/testing/*"]
    }
  }
}

ESLint Rules

{
  "rules": {
    "no-restricted-imports": ["error", {
      "patterns": [
        {
          "group": ["@testing/*"],
          "message": "Testing utilities should only be used in test files"
        },
        {
          "group": ["@core/testing/*"],
          "message": "Core cannot depend on testing utilities"
        }
      ]
    }]
  }
}

Quick Reference

Moving Tests to Clean Architecture

  1. Unit tests → Move adjacent to implementation
  2. Integration tests → Move to tests/integration/
  3. E2E tests → Move to tests/e2e/
  4. Testing utilities → Move to adapters/testing/ (then delete if unused)

What Goes Where

  • core/ + *.test.ts → Unit tests
  • tests/integration/ → Integration tests (in-memory)
  • tests/e2e/ → E2E tests (TypeORM)
  • adapters/testing/ → Testing utilities (minimal, only what's used)

Test Data Strategy

Fixtures & Seeding

Unit Tests:

  • Use in-memory domain objects (no database)
  • Factory functions for common test entities:
    function createTestLeague(overrides?: Partial<LeagueProps>): League {
      return new League('Test League', { maxTeams: 10, ...overrides });
    }
    

Integration Tests:

  • Use in-memory adapters only
  • Clean state between tests

E2E Tests:

  • Pre-seed database via migrations before Docker Compose starts
  • Use API endpoints to create test data when possible
  • Database cleanup between scenarios:
    // tests/e2e/support/database.ts
    export async function cleanDatabase() {
      await sql`TRUNCATE TABLE event_results CASCADE`;
      await sql`TRUNCATE TABLE events CASCADE`;
      await sql`TRUNCATE TABLE teams CASCADE`;
      await sql`TRUNCATE TABLE leagues CASCADE`;
    }
    

Continuous Testing

Watch Mode (Development)

# Auto-run unit tests on file changes
npm run test:watch

# Auto-run integration tests (slower, but useful for DB work)
npm run test:integration:watch

CI/CD Pipeline

graph LR
  A[Code Push] --> B[Unit Tests]
  B --> C[Integration Tests]
  C --> D[E2E Tests]
  D --> E[Deploy to Staging]

Execution Order:

  1. Unit Tests (parallel, <1 second) — fail fast on logic errors
  2. Integration Tests (sequential, ~10 seconds) — catch infrastructure issues
  3. E2E Tests (sequential, ~2 minutes) — validate full workflows
  4. Deploy — only if all tests pass

Parallelization:

  • Unit tests run in parallel (Vitest default)
  • Integration tests run sequentially (avoid database conflicts)
  • E2E tests run sequentially (UI interactions are stateful)
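This parallel/sequential split can be encoded in Vitest configuration. The sketch below assumes a recent Vitest version where `fileParallelism` is available; verify the option name against your installed version:

```typescript
// vitest.integration.config.ts — illustrative; integration suites run
// one file at a time to avoid shared-state conflicts, while unit tests
// keep Vitest's parallel default in their own config.
import { defineConfig } from 'vitest/config';

export default defineConfig({
  test: {
    include: ['tests/integration/**/*.test.ts'],
    fileParallelism: false,
  },
});
```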

Testing Best Practices

1. Test Behavior, Not Implementation

Bad (overfitted to implementation):

it('should call repository.save() once', () => {
  const repo = mock<ILeagueRepository>();
  const useCase = new CreateLeagueUseCase(repo);
  useCase.execute({ name: 'Test' });
  expect(repo.save).toHaveBeenCalledTimes(1);
});

Good (tests observable behavior):

it('should persist the league to the repository', async () => {
  const repo = new InMemoryLeagueRepository();
  const useCase = new CreateLeagueUseCase(repo);
  
  const result = await useCase.execute({ name: 'Test' });
  
  expect(result.isSuccess()).toBe(true);
  const league = await repo.findById(result.value.id);
  expect(league?.name).toBe('Test');
});

2. Mock Only at Architecture Boundaries

Ports (interfaces) should be mocked in use case tests:

// `vi` is Vitest's mock API; `mock()` here is from a helper such as vitest-mock-extended
const mockRepo = mock<ILeagueRepository>({
  save: vi.fn().mockResolvedValue(undefined),
});

Domain entities should NEVER be mocked:

// ❌ Don't do this
const mockLeague = mock<League>();

// ✅ Do this
const league = new League('Test League', { maxTeams: 10 });

3. Keep Tests Readable and Maintainable

Arrange-Act-Assert Pattern:

it('should calculate standings correctly', () => {
  // Arrange: Set up test data
  const league = createTestLeague();
  const teams = [createTestTeam('Team A'), createTestTeam('Team B')];
  const results = [createTestResult(teams[0], { position: 1 })];
  
  // Act: Perform the action
  const standings = league.calculateStandings(results);
  
  // Assert: Verify the outcome
  expect(standings[0].team).toBe(teams[0]);
  expect(standings[0].points).toBe(25);
});

4. Test Error Scenarios

Don't just test the happy path:

describe('League.addTeam()', () => {
  it('should add team successfully', () => { /* ... */ });
  
  it('should fail when team limit reached', () => {
    const league = createTestLeague({ maxTeams: 1 });
    league.addTeam(createTestTeam('Team A'));
    
    const result = league.addTeam(createTestTeam('Team B'));
    
    expect(result.isFailure()).toBe(true);
    expect(result.error.message).toBe('Team limit reached');
  });
  
  it('should fail when adding duplicate team', () => { /* ... */ });
});

Common Patterns

Setting Up Test Fixtures

Factory Functions:

// Define factories inline in test files (the shared core/testing/factories.ts was removed):
function createTestLeague(overrides?: Partial<LeagueProps>): League {
  return new League('Test League', {
    maxTeams: 10,
    scoringSystem: 'F1',
    ...overrides,
  });
}

function createTestTeam(name: string): Team {
  return new Team(name, { drivers: ['Driver 1', 'Driver 2'] });
}

Mocking Ports in Use Case Tests

// core/application/use-cases/CreateLeagueUseCase.test.ts
import { beforeEach, describe, expect, it, vi, type Mocked } from 'vitest';

describe('CreateLeagueUseCase', () => {
  let mockRepo: Mocked<ILeagueRepository>;
  let useCase: CreateLeagueUseCase;
  
  beforeEach(() => {
    mockRepo = {
      save: vi.fn().mockResolvedValue(undefined),
      findById: vi.fn().mockResolvedValue(null),
      findByName: vi.fn().mockResolvedValue(null),
    };
    useCase = new CreateLeagueUseCase(mockRepo);
  });
  
  it('should create a league when name is unique', async () => {
    const result = await useCase.execute({ name: 'New League' });
    
    expect(result.isSuccess()).toBe(true);
    expect(mockRepo.save).toHaveBeenCalledWith(
      expect.objectContaining({ name: 'New League' })
    );
  });
});

Summary

GridPilot's testing strategy ensures:

  • Business logic is correct (unit tests for domain/application layers)
  • Infrastructure works reliably (integration tests for repositories/adapters)
  • User workflows function end-to-end (E2E tests for full stack)
  • Browser automation works correctly (Docker E2E tests with real fixtures)

Current Test Structure (Clean Architecture)

Unit Tests: 376 files adjacent to implementation

core/
  └── [domain]/[layer]/[entity]/
      ├── Entity.ts
      └── Entity.test.ts  ✅

Integration Tests: In-memory only

tests/integration/
├── racing/RegistrationAndTeamUseCases.test.ts
└── website/
    ├── auth-flow.test.ts
    ├── auth-guard.test.ts
    └── middleware.test.ts

E2E Tests: TypeORM/PostgreSQL via Docker

tests/e2e/
└── website/website-pages.test.ts

Testing Infrastructure: Minimal, in adapters/testing/ (only what's actually used)

Key Principles

  1. Domain layer has ZERO dependencies
  2. Unit tests live adjacent to code
  3. Integration tests use in-memory only
  4. E2E tests use TypeORM/PostgreSQL
  5. No loose functions: everything is classes or inline factories
  6. Enforced via TypeScript paths and ESLint rules

This structure ensures clean architecture compliance while maintaining comprehensive test coverage.