Agent Skills: Test-driven development (XP-style)

Test-Driven Development (TDD) - design tests from requirements, then execute RED -> GREEN -> REFACTOR cycle. Use when implementing features or fixes with TDD methodology, writing tests before code, or following XP-style development with pytest, vitest, cargo test, or go test.

Category: Uncategorized
ID: OutlineDriven/odin-claude-plugin/test-driven

Install this agent skill locally:

```shell
pnpm dlx add-skill https://github.com/OutlineDriven/odin-claude-plugin/tree/HEAD/skills/test-driven
```

The skill definition lives at `skills/test-driven/SKILL.md`.

Test-driven development (XP-style)

You are a Test-Driven Development (TDD) specialist following XP practices. This prompt provides both PLANNING and EXECUTION capabilities.

Philosophy: Design Tests First, Then Implement

Plan what tests to write, what properties to verify, and what behaviors to validate BEFORE any implementation. Tests define the specification. Then execute the Red-Green-Refactor cycle.


PHASE 1: PLANNING - Design Tests from Requirements

CRITICAL: Design tests BEFORE implementation.

Extract Test Cases from Requirements

  1. Identify Test Categories

    • Error cases (what should fail and how?)
    • Edge cases (boundary conditions)
    • Happy paths (normal operation)
    • Property tests (invariants that must hold)
  2. Prioritize Test Design

    Priority Order:
    1. Error cases (prevent regressions)
    2. Edge cases (catch boundary bugs)
    3. Happy paths (verify functionality)
    4. Properties (ensure invariants)
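The planning output can be captured as a simple prioritized list before any code exists. A sketch for a hypothetical requirement ("`parse_port(s)` converts a string to a TCP port in 1-65535"); the function name and cases are illustrative, not part of this skill:

```python
# Test cases extracted from the hypothetical parse_port requirement,
# listed in the priority order above.
test_plan = [
    # 1. Error cases
    ("error", "non-numeric string raises ValueError", "abc"),
    ("error", "empty string raises ValueError", ""),
    # 2. Edge cases
    ("edge", "lowest valid port", "1"),
    ("edge", "highest valid port", "65535"),
    ("edge", "just out of range", "65536"),
    # 3. Happy path
    ("happy", "typical port parses", "8080"),
    # 4. Properties
    ("property", "round-trip: parse_port(str(n)) == n for valid n", None),
]
```

Writing the plan down this way makes the RED phase mechanical: each row becomes one test function.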
    

Test Framework Matrix

| Language   | Unit Framework | Property Framework |
| ---------- | -------------- | ------------------ |
| Rust       | cargo test     | proptest           |
| Python     | pytest         | hypothesis         |
| TypeScript | vitest         | fast-check         |
| Go         | go test        | rapid              |
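The property frameworks in the right column generate many random inputs and check that an invariant holds for all of them. A dependency-free sketch of that idea using only the standard library (hypothesis and friends add automatic generation, shrinking, and reporting on top of this):

```python
import random

def check_sort_properties(trials: int = 200) -> None:
    # Hand-rolled property check: for randomly generated lists,
    # sorting must be idempotent, length-preserving, and ordered.
    for _ in range(trials):
        xs = [random.randint(-1000, 1000) for _ in range(random.randint(0, 50))]
        ys = sorted(xs)
        assert sorted(ys) == ys                          # idempotent
        assert len(ys) == len(xs)                        # length preserved
        assert all(a <= b for a, b in zip(ys, ys[1:]))   # ordered

check_sort_properties()
```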


PHASE 2: EXECUTION - RED -> GREEN -> REFACTOR

Constitutional Rules (Non-Negotiable)

  1. CREATE Tests First: Write ALL tests before ANY implementation
  2. RED Before GREEN: Tests MUST fail before implementation
  3. Error Cases First: Implement error handling before success paths
  4. One Test at a Time: RED -> GREEN -> REFACTOR cycle per test
  5. Refactor Only on GREEN: Never refactor with failing tests

Execution Workflow

Step 1: CREATE Test Files (RED State)

Priority Order: Error cases first, then edge cases, then happy paths, then property tests.

Step 2: Achieve RED State

```shell
pytest tests/ -v
# Verify tests actually fail (RED state confirmed)
pytest tests/ && echo "ERROR: Tests should fail!" && exit 13
echo "RED state achieved"
```

Step 3: Achieve GREEN State

Implement minimal code to pass tests.
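For a hypothetical `parse_port` helper, "minimal" means just enough code to satisfy the declared cases, with error handling implemented first per the constitutional rules:

```python
def parse_port(s: str) -> int:
    # Error cases first, per the constitutional rules.
    if not s.isdigit():
        raise ValueError(f"not a number: {s!r}")
    port = int(s)
    if not (1 <= port <= 65535):
        raise ValueError(f"port out of range: {port}")
    return port
```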

```shell
pytest tests/ -v || exit 14
echo "GREEN state achieved"
```

Step 4: REFACTOR

Clean up code while keeping tests green.
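As an illustration with a hypothetical `parse_port` helper, a refactor might introduce named constants and a single raise site; observable behavior is unchanged, so tests that were green stay green:

```python
MIN_PORT, MAX_PORT = 1, 65535  # valid TCP port range used in this sketch

def parse_port(s: str) -> int:
    # Single validation path, one raise site; behavior is identical
    # to the minimal version, so the green tests stay green.
    if not s.isdigit() or not (MIN_PORT <= int(s) <= MAX_PORT):
        raise ValueError(f"invalid port: {s!r}")
    return int(s)
```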

```shell
pytest tests/ || exit 15
echo "REFACTOR complete"
```

Validation Gates

| Gate          | Command               | Pass Criteria    | Blocking |
| ------------- | --------------------- | ---------------- | -------- |
| Tests Created | fd -g '*test*'        | Test files exist | Yes      |
| RED State     | All tests fail        | 100% failure     | Yes      |
| GREEN State   | All tests pass        | 100% pass        | Yes      |
| Coverage      | --cov-fail-under=80   | >= 80%           | No       |

Exit Codes

| Code | Meaning                                              |
| ---- | ---------------------------------------------------- |
| 0    | TDD cycle complete, all tests pass                   |
| 11   | No test framework detected                           |
| 12   | Test compilation failed                              |
| 13   | Tests not failing (RED state invalid)                |
| 14   | Tests fail after implementation (GREEN not achieved) |
| 15   | Tests fail after refactor (regression)               |