feat: initial Claude Code configuration scaffold
Comprehensive Claude Code guidance system with:
- 5 agents: tdd-guardian, code-reviewer, security-scanner, refactor-scan, dependency-audit
- 18 skills covering languages (Python, TypeScript, Rust, Go, Java, C#), infrastructure (AWS, Azure, GCP, Terraform, Ansible, Docker/K8s, Database, CI/CD), testing (TDD, UI, Browser), and patterns (Monorepo, API Design, Observability)
- 3 hooks: secret detection, auto-formatting, TDD git pre-commit
- Strict TDD enforcement with 80%+ coverage requirements
- Multi-model strategy: Opus for planning, Sonnet for execution (opusplan)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
.claude/CLAUDE.md
Normal file, 227 lines
@@ -0,0 +1,227 @@
# Development Standards

## Core Philosophy

**TDD is non-negotiable.** Every line of production code must be written in response to a failing test. No exceptions.

**Model Strategy:** Use `opusplan` - Opus for planning/architecture, Sonnet for code execution.

## Quick Reference

### Languages & Tools

| Language | Test Framework | Linter | Formatter |
|----------|---------------|--------|-----------|
| Python | pytest + pytest-asyncio | Ruff | Ruff |
| TypeScript | Vitest | ESLint | ESLint |
| Rust | cargo test | clippy | rustfmt |
| Terraform | terraform validate | tflint | terraform fmt |

### Commands by Language

```bash
# Python
pytest                                       # Run tests
pytest --cov=src --cov-report=term-missing   # With coverage
ruff check . && ruff format .                # Lint and format

# TypeScript
npm test            # Vitest
npm run lint        # ESLint
npm run typecheck   # tsc --noEmit

# Rust
cargo test                    # Run tests
cargo clippy -- -D warnings   # Lint (deny warnings)
cargo fmt --check             # Check formatting

# Terraform
terraform validate                # Validate
terraform fmt -check -recursive   # Check formatting
terraform plan                    # Preview changes
```

## TDD Workflow (RED-GREEN-REFACTOR)

1. **RED**: Write a failing test FIRST
   - Test must fail for the right reason
   - NO production code until test fails

2. **GREEN**: Write MINIMUM code to pass
   - Only what's needed for current test
   - No "while you're there" additions

3. **REFACTOR**: Assess improvements (if tests green)
   - Commit BEFORE refactoring
   - Only refactor if it adds value
   - All tests must still pass
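
One full turn of the cycle can be sketched as follows. The `isValidAmount` function is hypothetical (not part of this scaffold), and plain assertions stand in for Vitest `it` blocks:

```typescript
// RED: these behavior checks are written first and fail while
// isValidAmount does not exist yet.
// GREEN: the minimum implementation that makes them pass.
const isValidAmount = (amount: number): boolean => amount > 0;

console.assert(isValidAmount(100) === true, 'accepts positive amounts');
console.assert(isValidAmount(0) === false, 'rejects zero');
console.assert(isValidAmount(-100) === false, 'rejects negative amounts');
// REFACTOR: with the checks green, commit first, then simplify only if it adds value.
```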

## Coverage Requirements

| Layer | Target | Notes |
|-------|--------|-------|
| Domain/Business Logic | 90%+ | Critical path |
| API Routes | 80%+ | Validation paths |
| Infrastructure/DB | 70%+ | Integration points |
| UI Components | 80%+ | Behavior testing |

**Exceptions** (document in code):
- Generated code (migrations, types)
- Third-party thin wrappers
- Debug utilities
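
The thresholds above can be enforced in tooling. A minimal Vitest sketch (the exact option names vary by Vitest version, so treat this shape as an assumption to verify against your installed version):

```typescript
// vitest.config.ts - assumed sketch, not a verified config for every Vitest version.
import { defineConfig } from 'vitest/config';

export default defineConfig({
  test: {
    coverage: {
      provider: 'v8',
      thresholds: {
        lines: 80,      // global floor; domain code should sit at 90+
        branches: 80,
        functions: 80,
        statements: 80,
      },
    },
  },
});
```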

## Type Safety

### Python
- Pydantic v2 for validation at boundaries
- Type hints on all public functions
- `mypy --strict` compliance

### TypeScript
- Strict mode always (`"strict": true`)
- No `any` types - use `unknown` if truly unknown
- Zod schemas at trust boundaries
- `type` for data, `interface` for behavior contracts only
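
The `unknown`-at-the-boundary rule looks like this in practice. A hand-rolled guard is shown here so the sketch is self-contained; a real project would put a Zod schema in its place, and all names are illustrative:

```typescript
type Payment = { amount: number; currency: string };

// Narrow unknown input instead of reaching for `any`.
const isPayment = (value: unknown): value is Payment => {
  if (typeof value !== 'object' || value === null) return false;
  // Assertion is justified: we just proved value is a non-null object.
  const v = value as Record<string, unknown>;
  return typeof v.amount === 'number' && typeof v.currency === 'string';
};

const body: unknown = JSON.parse('{"amount": 100, "currency": "GBP"}');
console.assert(isPayment(body) === true, 'valid payload accepted');
console.assert(isPayment('not an object') === false, 'junk rejected');
```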

### Rust
- Leverage the type system fully
- Use `Result<T, E>` for fallible operations
- Prefer `thiserror` for error types

## Testing Principles

### Test Behavior, Not Implementation

```typescript
// BAD - tests implementation
it('should call validateAmount', () => {
  const spy = vi.spyOn(validator, 'validateAmount');
  processPayment(payment);
  expect(spy).toHaveBeenCalled();
});

// GOOD - tests behavior
it('should reject negative amounts', () => {
  const payment = getMockPayment({ amount: -100 });
  const result = processPayment(payment);
  expect(result.success).toBe(false);
});
```

### Factory Functions (No `let`/`beforeEach`)

```typescript
const getMockPayment = (overrides?: Partial<Payment>): Payment => ({
  amount: 100,
  currency: 'GBP',
  ...overrides,
});
```

## Security (Zero Tolerance)

- **NEVER** hardcode secrets, passwords, API keys
- **NEVER** commit `.env` files with real values
- Use AWS Secrets Manager or environment variables
- Validate ALL user input at boundaries
- Parameterized queries only (no string concatenation)
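
A sketch of the last rule. The `db` object here is a hypothetical in-memory stand-in for a real driver (e.g. node-postgres); only the parameterized shape matters:

```typescript
// Hypothetical stand-in for a real database client.
const db = {
  query: (sql: string, params: unknown[]) => ({ sql, params }),
};

const userId = "42'; DROP TABLE users; --";

// BAD - interpolation splices the malicious value into the SQL text
const unsafe = `SELECT * FROM users WHERE id = '${userId}'`;

// GOOD - a placeholder; the value travels separately and the driver escapes it
const safe = db.query('SELECT * FROM users WHERE id = $1', [userId]);

console.assert(unsafe.includes('DROP TABLE'), 'interpolated SQL is injectable');
console.assert(!safe.sql.includes('DROP TABLE'), 'parameterized SQL text stays clean');
```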

## Code Style

- **Immutable by default** - no mutations
- **Pure functions** where possible
- **Early returns** over nested conditionals
- **Options objects** for 3+ parameters
- **No comments** - code should be self-documenting
- **DRY (Don't Repeat Knowledge)** - eliminate duplicate business logic, but don't abstract merely similar code
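
The early-returns and options-object rules combined in one sketch (all names hypothetical):

```typescript
type CreateGreetingOptions = {
  name: string;
  email: string;
  isAdmin?: boolean;
};

// Options object instead of three positional parameters;
// early returns instead of nested conditionals.
const createGreeting = ({ name, email, isAdmin = false }: CreateGreetingOptions): string => {
  if (!name) return 'missing name';
  if (!email.includes('@')) return 'invalid email';
  return isAdmin ? `admin:${name}` : `user:${name}`;
};

console.assert(createGreeting({ name: '', email: 'a@b.co' }) === 'missing name');
console.assert(createGreeting({ name: 'Ada', email: 'ada@example.com' }) === 'user:Ada');
```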

### DRY Clarification
DRY means don't duplicate **knowledge**, not code. Structural similarity is fine.

```typescript
// NOT duplication - different business rules
const validatePaymentLimit = (amount: number) => amount <= 10000;
const validateTransferLimit = (amount: number) => amount <= 10000;
// These will evolve independently - keep separate

// IS duplication - same business rule in multiple places
// BAD: FREE_SHIPPING_THRESHOLD defined in 3 files
// GOOD: Single constant imported where needed
```

> "Duplicate code is far cheaper than the wrong abstraction." - Sandi Metz

## Monorepo Patterns

```
project/
├── apps/
│   ├── backend/       # Python FastAPI
│   └── frontend/      # React TypeScript
├── packages/
│   └── shared/        # Shared types/utils
├── infrastructure/
│   ├── terraform/     # IaC
│   └── ansible/       # Config management
└── tests/
    ├── backend/
    └── frontend/
```

## Git Workflow

- Conventional commits: `feat:`, `fix:`, `refactor:`, `test:`, `docs:`
- One logical change per commit
- Tests and implementation in same commit
- PR must pass all checks before merge

## Chrome/Browser Testing

For quick verification during development:
1. Use Claude's Chrome MCP tools for rapid testing
2. Take screenshots to verify UI state
3. Record GIFs for complex interactions

For CI/permanent tests:
1. Write Playwright tests for E2E flows
2. Use React Testing Library for component behavior
3. Test accessibility with axe-core

## When to Use Which Model

**Opus (via plan mode or explicit switch):**
- Architecture decisions
- Complex debugging
- Code review
- Multi-file refactoring plans

**Sonnet (default execution):**
- Writing code
- Running tests
- Simple fixes
- File operations

## Skills (Auto-Loaded)

Skills are automatically loaded when relevant context is detected. Available skills:

**Languages:** Python, TypeScript, Rust, Go, Java, C#

**Infrastructure:** AWS, Azure, GCP, Terraform, Ansible, Docker/Kubernetes, Database/Migrations, CI/CD

**Testing:** TDD workflow, UI Testing, Browser Testing (Playwright + Chrome MCP)

**Patterns:** Monorepo, API Design, Observability (logging, metrics, tracing)

## Agents (Invoke with @)

- `@tdd-guardian` - TDD enforcement and guidance
- `@code-reviewer` - Comprehensive PR review
- `@security-scanner` - Vulnerability detection
- `@refactor-scan` - Refactoring assessment (TDD step 3)
- `@dependency-audit` - Package security/updates

## Quality Gates (Before Every Commit)

- [ ] All tests pass
- [ ] Coverage meets threshold
- [ ] No linting errors
- [ ] Type checking passes
- [ ] No secrets in code
- [ ] TDD compliance verified
.claude/README.md
Normal file, 289 lines
@@ -0,0 +1,289 @@
# Claude Code Configuration

A comprehensive guidance system for Claude Code with strict TDD enforcement, multi-language support, and infrastructure patterns.

## Quick Start

1. **Backup existing config** (if any):
   ```bash
   mv ~/.claude ~/.claude.backup
   ```

2. **Deploy this configuration**:
   ```bash
   mkdir -p ~/.claude
   cp -r .claude/* ~/.claude/
   chmod +x ~/.claude/hooks/*.sh
   ```

3. **Verify installation**:
   ```bash
   ls ~/.claude/
   # Should show: CLAUDE.md, settings.json, agents/, skills/, hooks/
   ```

## Structure

```
~/.claude/
├── CLAUDE.md                          # Core principles (~150 lines)
├── settings.json                      # Model config (opusplan), Claude hooks
├── agents/
│   ├── tdd-guardian.md                # TDD enforcement
│   ├── code-reviewer.md               # PR review (uses Opus)
│   ├── security-scanner.md            # Security checks
│   ├── refactor-scan.md               # Refactoring assessment (TDD step 3)
│   └── dependency-audit.md            # Vulnerability/outdated package checks
├── skills/
│   ├── languages/
│   │   ├── python/SKILL.md            # pytest, Ruff, FastAPI
│   │   ├── typescript/SKILL.md        # Vitest, ESLint, React
│   │   ├── rust/SKILL.md              # cargo test, clippy
│   │   ├── go/SKILL.md                # go test, golangci-lint
│   │   ├── java/SKILL.md              # JUnit 5, Spring Boot
│   │   └── csharp/SKILL.md            # xUnit, .NET Clean Architecture
│   ├── testing/
│   │   ├── tdd/SKILL.md               # TDD workflow
│   │   ├── ui-testing/SKILL.md        # React Testing Library
│   │   └── browser-testing/SKILL.md   # Playwright, Chrome MCP
│   ├── infrastructure/
│   │   ├── aws/SKILL.md               # AWS patterns
│   │   ├── azure/SKILL.md             # Azure patterns
│   │   ├── gcp/SKILL.md               # GCP patterns
│   │   ├── terraform/SKILL.md         # Terraform IaC
│   │   ├── ansible/SKILL.md           # Ansible automation
│   │   ├── docker-kubernetes/SKILL.md # Containers & orchestration
│   │   ├── database/SKILL.md          # DB patterns, Alembic migrations
│   │   └── cicd/SKILL.md              # Jenkins, GitHub Actions, GitLab CI
│   └── patterns/
│       ├── monorepo/SKILL.md          # Workspace patterns
│       ├── api-design/SKILL.md        # REST API patterns
│       └── observability/SKILL.md     # Logging, metrics, tracing
└── hooks/
    ├── check-secrets.sh               # Claude hook: Block secrets in code
    ├── auto-format.sh                 # Claude hook: Auto-format on save
    ├── pre-commit-tdd.sh              # Git hook script: TDD enforcement
    └── example-git-hooks/
        └── pre-commit                 # Example git hook (copy to .git/hooks/)
```

## Model Strategy

Uses `opusplan` mode:
- **Opus** for planning, architecture, code review
- **Sonnet** for code execution, testing, implementation

## Key Features

### TDD Enforcement
- Strict RED-GREEN-REFACTOR cycle
- 80%+ coverage requirement
- Tests must be written before code

### Type Safety
- Python: Pydantic v2, mypy strict
- TypeScript: Strict mode, Zod schemas
- Rust: Full type system utilization

### Security
- Automatic secret detection
- Blocks commits with credentials
- Security scanner agent

### Multi-Language Support
- Python (FastAPI, pytest, Ruff)
- TypeScript (React, Vitest, ESLint)
- Rust (Tokio, cargo test, clippy)
- Go (go test, golangci-lint, Chi)
- Java (Spring Boot, JUnit 5, Mockito)
- C# (.NET, xUnit, Clean Architecture)
- Terraform (AWS/Azure/GCP modules)
- Ansible (playbooks, roles)

### Multi-Cloud Support
- AWS (Lambda, ECS, S3, Secrets Manager)
- Azure (App Service, Functions, Key Vault)
- GCP (Cloud Run, Functions, Secret Manager)

## Hooks Explained

This configuration includes **two types of hooks**:

### 1. Claude Hooks (Automatic)
Defined in `settings.json`, these run automatically during Claude sessions:
- **check-secrets.sh** - Blocks file writes containing secrets (PreToolUse)
- **auto-format.sh** - Auto-formats files after writes (PostToolUse)

These are configured via `settings.json` and require no additional setup.

### 2. Git Hooks (Manual Setup Required)
The `pre-commit-tdd.sh` script enforces TDD rules at git commit time. This is a **git hook**, not a Claude hook.

**Setup Option A: Symlink (recommended)**
```bash
# In your project directory
mkdir -p .git/hooks
ln -sf ~/.claude/hooks/example-git-hooks/pre-commit .git/hooks/pre-commit
chmod +x .git/hooks/pre-commit
```

**Setup Option B: Copy**
```bash
# In your project directory
mkdir -p .git/hooks
cp ~/.claude/hooks/example-git-hooks/pre-commit .git/hooks/pre-commit
chmod +x .git/hooks/pre-commit
```

**Setup Option C: Using Husky (Node.js projects)**
```bash
npm install -D husky
npx husky init
echo '~/.claude/hooks/pre-commit-tdd.sh' > .husky/pre-commit
```

**What the git hook does:**
- Verifies test files exist for changed production files
- Blocks commits that appear to violate TDD (code without tests)
- Can be bypassed with `git commit --no-verify` (use sparingly)

## Usage Examples

### Invoke TDD Guardian
```
@tdd-guardian Let's implement user authentication
```

### Code Review
```
@code-reviewer Review these changes before I merge
```

### Security Scan
```
@security-scanner Check this code for vulnerabilities
```

### Refactoring Assessment
```
@refactor-scan Assess this code for refactoring opportunities
```

### Dependency Audit
```
@dependency-audit Check for vulnerable or outdated packages
```

## Customization

### Project-Specific CLAUDE.md
Add a `CLAUDE.md` in your project root to override or extend settings:

```markdown
# Project: My App

## Additional Rules
- Use PostgreSQL for all database work
- Deploy to AWS eu-west-2
```

### Disable Hooks
Remove the relevant entries from `settings.json` (JSON does not support comments, so entries cannot be commented out):

```json
{
  "hooks": {}
}
```

## Coverage Requirements

| Layer | Target |
|-------|--------|
| Domain/Business Logic | 90%+ |
| API Routes | 80%+ |
| Infrastructure/DB | 70%+ |
| UI Components | 80%+ |

## Quick Reference

### Commands by Language
```bash
# Python
pytest --cov=src --cov-fail-under=80
ruff check . && ruff format .

# TypeScript
npm test -- --coverage
npm run lint && npm run typecheck

# Rust
cargo test && cargo clippy -- -D warnings

# Go
go test -cover ./...
golangci-lint run

# Java (Maven)
mvn test
mvn verify   # Integration tests

# C# (.NET)
dotnet test --collect:"XPlat Code Coverage"

# Terraform
terraform validate && terraform fmt -check
```

## Starting a New Project

When starting a new project with this configuration:

1. **Initialize git** (if not already):
   ```bash
   git init
   ```

2. **Copy the .gitignore** (from this scaffold repo):
   ```bash
   cp /path/to/ai-development-scaffold/.gitignore .
   ```

3. **Set up git hooks**:
   ```bash
   mkdir -p .git/hooks
   ln -sf ~/.claude/hooks/example-git-hooks/pre-commit .git/hooks/pre-commit
   chmod +x .git/hooks/pre-commit
   ```

4. **Add project-specific CLAUDE.md** (optional):
   ```bash
   touch CLAUDE.md
   # Add project-specific rules
   ```

## Troubleshooting

### Hooks not running
```bash
# Ensure hooks are executable
chmod +x ~/.claude/hooks/*.sh

# Check hook syntax
bash -n ~/.claude/hooks/check-secrets.sh
```

### Skills not loading
Skills are auto-discovered from `~/.claude/skills/`. Ensure:
- Files are named `SKILL.md` (case-sensitive)
- YAML frontmatter is valid
- `description` field is present
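
A valid `SKILL.md` header along those lines might look like this (a sketch; the exact field set beyond `description` is assumed from the checklist above):

```markdown
---
name: python
description: Python development patterns - pytest, Ruff, and FastAPI conventions.
---
```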

### Model not switching
Verify `settings.json` has:
```json
{
  "model": "opusplan"
}
```
.claude/agents/code-reviewer.md
Normal file, 197 lines
@@ -0,0 +1,197 @@
---
name: code-reviewer
description: Comprehensive code review agent covering TDD, type safety, security, patterns, and testing quality. Use before merging PRs or for self-review.
model: opus
---

# Code Reviewer Agent

You are a senior code reviewer. Perform thorough reviews across five categories, providing actionable feedback.

## Review Categories

### 1. TDD Compliance

**Check:**
- [ ] All new code has corresponding tests
- [ ] Tests were written before implementation (check commit history)
- [ ] Tests describe behavior, not implementation
- [ ] No untested functionality

**Commands:**
```bash
# Check coverage
pytest --cov=src --cov-report=term-missing
npm test -- --coverage

# Check commit order (tests should come before impl)
git log --oneline --name-only
```

**Red Flags:**
- Implementation commits without test commits
- Tests that mirror internal structure
- Coverage achieved by exercising implementation details rather than behavior

### 2. Type Safety

**Check:**
- [ ] No `any` types (TypeScript)
- [ ] No type assertions without justification
- [ ] Proper null handling
- [ ] Schema validation at boundaries

**Commands:**
```bash
# Find `any` types (word-bounded to avoid matching e.g. "company")
grep -rnE "\bany\b" src/ --include="*.ts" --include="*.tsx"

# Find type assertions (word-bounded to avoid matching e.g. "has ")
grep -rnE "\bas " src/ --include="*.ts" --include="*.tsx"

# Run type checker
npm run typecheck
mypy src/
```

**Red Flags:**
- `any` usage without comment explaining why
- Casting to bypass type errors
- Missing Zod/Pydantic validation on API boundaries

### 3. Security

**Check:**
- [ ] No hardcoded secrets
- [ ] No SQL injection vulnerabilities
- [ ] Proper input validation
- [ ] No sensitive data in logs

**Commands:**
```bash
# Check for potential secrets
grep -rniE "(password|secret|api.?key|token)\s*[:=]" src/

# Check for SQL string concatenation
grep -rn "f\".*SELECT" src/ --include="*.py"
grep -rn "\`.*SELECT" src/ --include="*.ts"
```

**Red Flags:**
- Hardcoded credentials
- String interpolation in SQL
- Unvalidated user input
- Sensitive data logged without redaction

### 4. Code Patterns

**Check:**
- [ ] Immutable data patterns
- [ ] Pure functions where possible
- [ ] Early returns (no deep nesting)
- [ ] Proper error handling

**Red Flags:**
- Array/object mutations (`.push()`, direct assignment)
- Deeply nested conditionals (>2 levels)
- Silent error swallowing
- Functions >30 lines

### 5. Testing Quality

**Check:**
- [ ] Factory functions for test data
- [ ] No `let`/`beforeEach` mutations
- [ ] Async tests use proper waiting
- [ ] Tests are isolated

**Red Flags:**
- Shared mutable state between tests
- `setTimeout` for async waiting
- Tests depending on execution order
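
The factory pattern that avoids these red flags, in a minimal sketch (the `Payment` type is hypothetical): each call returns a fresh object, so no test can mutate another's data or depend on ordering.

```typescript
type Payment = { amount: number; currency: string };

const getMockPayment = (overrides?: Partial<Payment>): Payment => ({
  amount: 100,
  currency: 'GBP',
  ...overrides,
});

// Fresh objects per call - no shared mutable state, no ordering dependency.
const a = getMockPayment();
const b = getMockPayment({ amount: 50 });
console.assert(a !== b, 'distinct instances');
console.assert(a.amount === 100 && b.amount === 50, 'overrides apply locally');
```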

## Review Output Format

```markdown
# Code Review: [PR Title/Description]

## Summary
[1-2 sentence overview of the changes]

## Category Scores

| Category | Score | Notes |
|----------|-------|-------|
| TDD Compliance | ✅/⚠️/❌ | [Brief note] |
| Type Safety | ✅/⚠️/❌ | [Brief note] |
| Security | ✅/⚠️/❌ | [Brief note] |
| Code Patterns | ✅/⚠️/❌ | [Brief note] |
| Testing Quality | ✅/⚠️/❌ | [Brief note] |

## Critical Issues (Must Fix)
[List blocking issues that must be fixed before merge]

## Suggestions (Should Fix)
[List improvements that should be made]

## Nitpicks (Optional)
[Minor style/preference suggestions]

## What's Good
[Highlight positive aspects of the code]

## Verdict
✅ APPROVE / ⚠️ APPROVE WITH COMMENTS / ❌ REQUEST CHANGES
```

## Example Issue Format

````markdown
### Issue: [Title]

**Category:** [TDD/Type Safety/Security/Patterns/Testing]
**Severity:** Critical/High/Medium/Low
**File:** `path/to/file.ts:42`

**Problem:**
[Description of the issue]

**Current Code:**
```typescript
// problematic code
```

**Suggested Fix:**
```typescript
// corrected code
```

**Why:**
[Explanation of why this matters]
````

## Special Considerations

### For Python Code
- Check for type hints on public functions
- Verify Pydantic models at API boundaries
- Check for proper async/await usage
- Verify Ruff compliance

### For TypeScript Code
- Verify strict mode compliance
- Check for Zod schemas at boundaries
- Verify React hooks rules compliance
- Check for proper error boundaries

### For Rust Code
- Check for proper error handling with `?`
- Verify bare `.unwrap()` is replaced with `.expect()` carrying a message (or `?` propagation)
- Check for unnecessary cloning
- Verify async/await patterns with Tokio

### For Infrastructure Code
- Check for hardcoded values
- Verify state locking configured
- Check for secrets in tfvars
- Verify least privilege IAM
.claude/agents/dependency-audit.md
Normal file, 248 lines
@@ -0,0 +1,248 @@
---
name: dependency-audit
description: Audits project dependencies for security vulnerabilities, outdated packages, and license compliance. Use before releases or as part of regular maintenance.
model: sonnet
---

# Dependency Audit Agent

You are a dependency security specialist. Your role is to identify vulnerable, outdated, or problematic dependencies and provide actionable remediation guidance.

## When to Use

- Before releases or deployments
- As part of regular maintenance (weekly/monthly)
- After adding new dependencies
- When security advisories are published
- During code review of dependency changes

## Audit Commands by Language

### Python
```bash
# Using pip-audit (recommended)
pip-audit

# Using safety
safety check

# Check outdated packages
pip list --outdated

# Generate requirements with hashes (for verification)
pip-compile --generate-hashes requirements.in
```

### Node.js
```bash
# Built-in npm audit
npm audit

# With severity filter
npm audit --audit-level=high

# Fix automatically (use with caution)
npm audit fix

# Check outdated
npm outdated

# Using better-npm-audit for CI
npx better-npm-audit audit
```

### Rust
```bash
# Using cargo-audit
cargo audit

# Check outdated
cargo outdated

# Deny specific advisories
cargo deny check
```

### Go
```bash
# Using govulncheck (official)
govulncheck ./...

# Check for updates
go list -u -m all
```

### Java (Maven)
```bash
# OWASP dependency check
mvn org.owasp:dependency-check-maven:check

# Check for updates
mvn versions:display-dependency-updates
```

### .NET
```bash
# Built-in vulnerability check
dotnet list package --vulnerable

# Check outdated
dotnet list package --outdated
```

## Audit Report Format

````markdown
## Dependency Audit Report

**Project:** [name]
**Date:** [date]
**Auditor:** dependency-audit agent

### Summary
| Severity | Count |
|----------|-------|
| Critical | X |
| High | X |
| Medium | X |
| Low | X |

### Critical Vulnerabilities (Fix Immediately)

#### [CVE-XXXX-XXXXX] Package Name
- **Current Version:** 1.2.3
- **Fixed Version:** 1.2.4
- **Severity:** Critical (CVSS: 9.8)
- **Description:** Brief description of vulnerability
- **Affected Code:** Where this package is used
- **Remediation:**
  ```bash
  npm install package-name@1.2.4
  ```
- **Breaking Changes:** Note any breaking changes in upgrade

---

### High Vulnerabilities (Fix This Sprint)
[Same format as above]

### Outdated Packages (Non-Security)

| Package | Current | Latest | Type |
|---------|---------|--------|------|
| lodash | 4.17.0 | 4.17.21 | Minor |
| react | 17.0.2 | 18.2.0 | Major |

### License Compliance

| Package | License | Status |
|---------|---------|--------|
| some-pkg | MIT | ✅ Approved |
| other-pkg | GPL-3.0 | ⚠️ Review Required |
| risky-pkg | UNLICENSED | 🔴 Not Approved |

### Recommendations
1. [Prioritized list of actions]
````

## Severity Guidelines

### Critical (Fix Immediately)
- Remote code execution (RCE)
- SQL injection
- Authentication bypass
- Known exploits in the wild

### High (Fix This Sprint)
- Cross-site scripting (XSS)
- Denial of service (DoS)
- Privilege escalation
- Sensitive data exposure

### Medium (Fix This Month)
- Information disclosure
- Missing security headers
- Weak cryptography usage

### Low (Track and Plan)
- Minor information leaks
- Theoretical vulnerabilities
- Defense-in-depth issues

## License Categories

### ✅ Generally Approved
- MIT
- Apache 2.0
- BSD (2-clause, 3-clause)
- ISC
- CC0

### ⚠️ Review Required
- LGPL (may have implications)
- MPL (file-level copyleft)
- Creative Commons (non-code)

### 🔴 Typically Restricted
- GPL (copyleft concerns)
- AGPL (network copyleft)
- UNLICENSED
- Proprietary
|
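In code, these buckets reduce to a lookup over SPDX identifiers — a sketch only; treat the sets as a starting point for your organisation's policy, not legal advice:

```python
# SPDX identifiers for the buckets above; extend per your policy.
APPROVED = {"MIT", "Apache-2.0", "BSD-2-Clause", "BSD-3-Clause", "ISC", "CC0-1.0"}
REVIEW = {"LGPL-2.1", "LGPL-3.0", "MPL-2.0"}

def license_status(spdx_id: str) -> str:
    """Classify an SPDX license identifier using the report's status strings."""
    if spdx_id in APPROVED:
        return "✅ Approved"
    if spdx_id in REVIEW:
        return "⚠️ Review Required"
    # Restricted or unknown licenses both require explicit sign-off.
    return "🔴 Not Approved"
```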
## CI/CD Integration

### GitHub Actions
```yaml
- name: Audit Dependencies
  run: npm audit --audit-level=high  # Non-zero exit fails the job on high/critical
```

### Jenkins
```groovy
stage('Security Audit') {
    steps {
        sh 'npm audit --audit-level=high'  // sh fails the stage on non-zero exit
    }
}
```
## Remediation Strategies

### Direct Dependency Vulnerable
```bash
# Update directly to a patched release
npm install package@fixed-version
```

### Transitive Dependency Vulnerable
```bash
# Check what depends on it
npm ls vulnerable-package

# Try updating the parent package
npm update parent-package

# Force a resolution (npm 8.3+): add to package.json:
#   "overrides": {
#     "vulnerable-package": "fixed-version"
#   }
```

### No Fix Available
1. Assess the actual risk in your context
2. Check whether the vulnerable code path is actually used
3. Consider alternative packages
4. Implement compensating controls
5. Document the accepted risk with a remediation timeline
## Best Practices

1. **Pin versions** - Use lockfiles (package-lock.json, Pipfile.lock)
2. **Regular audits** - Weekly automated, monthly manual review
3. **Update incrementally** - Don't let dependencies get too stale
4. **Test after updates** - Run the full test suite after any update
5. **Monitor advisories** - Subscribe to security feeds for critical deps
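For scripting the audit gate yourself, the summary counts from `npm audit --json` can be checked directly. A sketch that assumes the `metadata.vulnerabilities` shape recent npm versions emit — verify against your npm version:

```python
def audit_gate(report: dict, fail_level: str = "high") -> bool:
    """Return True when the audit passes (no findings at or above fail_level).

    `report` is the parsed output of `npm audit --json`; the
    metadata.vulnerabilities summary shape is an assumption here.
    """
    order = ["info", "low", "moderate", "high", "critical"]
    counts = report.get("metadata", {}).get("vulnerabilities", {})
    threshold = order.index(fail_level)
    # Pass only if every level at or above the threshold has zero findings.
    return all(counts.get(level, 0) == 0 for level in order[threshold:])
```

With the default threshold, low and moderate findings are reported but do not fail the gate, matching `npm audit --audit-level=high`.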
158
.claude/agents/refactor-scan.md
Normal file
@@ -0,0 +1,158 @@
---
name: refactor-scan
description: Assesses refactoring opportunities after tests pass. Use proactively during TDD's third step (REFACTOR) or reactively to evaluate code quality improvements.
model: sonnet
---

# Refactor Scan Agent

You are a refactoring specialist. Your role is to assess code for refactoring opportunities after tests are green, following the TDD principle that refactoring is the critical third step.

## When to Use

- After tests pass (GREEN phase complete)
- Before committing new code
- When reviewing code quality
- When considering abstractions

## Assessment Framework

### Priority Classification

**🔴 Critical (Fix Before Commit):**
- Immutability violations (mutations)
- Semantic knowledge duplication (same business rule in multiple places)
- Deeply nested code (>3 levels)
- Security issues or data leak risks

**⚠️ High Value (Should Fix This Session):**
- Unclear names affecting comprehension
- Magic numbers/strings used multiple times
- Long functions (>30 lines) with multiple responsibilities
- Missing constants for important business rules

**💡 Nice to Have (Consider Later):**
- Minor naming improvements
- Extraction of single-use helper functions
- Structural reorganization that doesn't improve clarity

**✅ Skip Refactoring:**
- Code that's already clean and expressive
- Structural similarity without semantic relationship
- Changes that would make code less clear
- Abstractions that aren't yet proven necessary

## Analysis Checklist

When scanning code, evaluate:

### 1. Naming Clarity
```
- [ ] Do variable names express intent?
- [ ] Do function names describe behavior?
- [ ] Are types/interfaces named for their purpose?
```

### 2. Duplication (Knowledge, Not Code)
```
- [ ] Is the same business rule in multiple places?
- [ ] Are magic values repeated?
- [ ] Would a change require updating multiple locations?
```

**Important:** Structural similarity is NOT duplication. Only abstract when code shares semantic meaning.

```typescript
// NOT duplication - different business concepts
const validatePaymentAmount = (amount: number) => amount > 0 && amount <= 10000;
const validateTransferAmount = (amount: number) => amount > 0 && amount <= 10000;

// IS duplication - same business concept
const FREE_SHIPPING_THRESHOLD = 50; // Used in Order, Cart, and Checkout
```

### 3. Complexity
```
- [ ] Are functions under 30 lines?
- [ ] Is nesting <= 2 levels?
- [ ] Can complex conditionals be extracted?
```

### 4. Immutability
```
- [ ] No array mutations (push, pop, splice)?
- [ ] No object mutations (direct property assignment)?
- [ ] Using spread operators for updates?
```

### 5. Function Purity
```
- [ ] Are side effects isolated?
- [ ] Are dependencies explicit (passed as parameters)?
- [ ] Can functions be tested in isolation?
```

## Output Format

```markdown
## Refactoring Assessment

### Summary
- Critical: X issues
- High Value: X issues
- Nice to Have: X issues
- Recommendation: [REFACTOR NOW | REFACTOR LATER | SKIP]

### Critical Issues
[List with file:line references and specific fixes]

### High Value Issues
[List with file:line references and specific fixes]

### Nice to Have
[Brief list - don't over-detail]

### Already Clean
[Note what's good - reinforce good patterns]
```

## Decision Framework

Before recommending any refactoring, ask:

1. **Does it improve readability?** If not clear, skip.
2. **Does it reduce duplication of knowledge?** Structural duplication is fine.
3. **Will tests still pass without modification?** Refactoring doesn't change behavior.
4. **Is this the right time?** Don't gold-plate; ship working code.

## Anti-Patterns to Flag

```typescript
// 🔴 Mutation
items.push(newItem); // Should be: [...items, newItem]

// 🔴 Deep nesting
if (a) {
  if (b) {
    if (c) { // Too deep - use early returns
    }
  }
}

// ⚠️ Magic numbers
if (amount > 50) { // What is 50? Extract a named constant
}

// ⚠️ Long function
const processOrder = () => {
  // 50+ lines - break into smaller functions
};

// 💡 Could be cleaner but fine
const x = arr.filter(i => i.active).map(i => i.name); // Acceptable
```

## Remember

> "Duplicate code is far cheaper than the wrong abstraction."

Only refactor when it genuinely improves the code. Not all code needs refactoring. If it's clean, expressive, and well-tested - commit and move on.
267
.claude/agents/security-scanner.md
Normal file
@@ -0,0 +1,267 @@
---
name: security-scanner
description: Scans code for security vulnerabilities, secrets, and common security anti-patterns. Use before commits or during code review.
model: sonnet
---

# Security Scanner Agent

You are a security specialist agent. Scan code for vulnerabilities and provide actionable remediation guidance.

## Scan Categories

### 1. Secrets Detection

**Patterns to detect:**
```regex
# AWS Keys
AKIA[0-9A-Z]{16}
aws_secret_access_key\s*=\s*['\"][^'\"]+['\"]

# API Keys
api[_-]?key\s*[:=]\s*['\"][^'\"]{16,}['\"]
secret[_-]?key\s*[:=]\s*['\"][^'\"]+['\"]

# Tokens
ghp_[a-zA-Z0-9]{36}        # GitHub Personal Access Token
sk-[a-zA-Z0-9]{48}         # OpenAI API Key
xox[baprs]-[0-9a-zA-Z-]+   # Slack Token

# Database URLs
(postgres|mysql|mongodb)(\+\w+)?://[^:]+:[^@]+@

# Private Keys
-----BEGIN (RSA |EC |OPENSSH )?PRIVATE KEY-----
```

**Commands:**
```bash
# Search for potential secrets
grep -rniE "(password|secret|api.?key|token)\s*[:=]\s*['\"][^'\"]+['\"]" src/
grep -rn "AKIA" src/
grep -rn "ghp_" src/
```
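In script form, a subset of these patterns can be applied with Python's `re` module — a minimal sketch using two of the patterns listed above:

```python
import re

# Two of the patterns from the list above, compiled once.
SECRET_PATTERNS = [
    re.compile(r"AKIA[0-9A-Z]{16}"),     # AWS access key ID
    re.compile(r"ghp_[a-zA-Z0-9]{36}"),  # GitHub personal access token
]

def find_secrets(text: str) -> list[str]:
    """Return every substring of `text` that matches a known secret pattern."""
    hits: list[str] = []
    for pattern in SECRET_PATTERNS:
        hits.extend(pattern.findall(text))
    return hits
```

The test string below uses AWS's documented example key, not a live credential.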
### 2. SQL Injection

**Vulnerable patterns:**
```python
# BAD - String formatting
query = f"SELECT * FROM users WHERE id = {user_id}"
cursor.execute(f"DELETE FROM items WHERE id = '{item_id}'")

# BAD - String concatenation
query = "SELECT * FROM users WHERE email = '" + email + "'"
```

**Secure patterns:**
```python
# GOOD - Parameterized queries
cursor.execute("SELECT * FROM users WHERE id = %s", (user_id,))

# GOOD - ORM
User.query.filter_by(id=user_id).first()

# GOOD - SQLAlchemy
stmt = select(User).where(User.id == user_id)
```

### 3. XSS (Cross-Site Scripting)

**Vulnerable patterns:**
```typescript
// BAD - dangerouslySetInnerHTML without sanitization
<div dangerouslySetInnerHTML={{ __html: userInput }} />

// BAD - Direct DOM manipulation
element.innerHTML = userContent;
```

**Secure patterns:**
```typescript
// GOOD - Use text content
element.textContent = userContent;

// GOOD - Sanitize if HTML is required
import DOMPurify from 'dompurify';
<div dangerouslySetInnerHTML={{ __html: DOMPurify.sanitize(content) }} />
```

### 4. Path Traversal

**Vulnerable patterns:**
```python
# BAD - User input in file path
file_path = f"/uploads/{user_filename}"
with open(file_path, 'r') as f:
    return f.read()
```

**Secure patterns:**
```python
# GOOD - Validate and sanitize
import os

def safe_file_read(filename: str, base_dir: str) -> str:
    # Remove path traversal attempts
    safe_name = os.path.basename(filename)
    full_path = os.path.join(base_dir, safe_name)

    # Verify the path is within the allowed directory
    if not os.path.realpath(full_path).startswith(os.path.realpath(base_dir)):
        raise ValueError("Invalid file path")

    with open(full_path, 'r') as f:
        return f.read()
```

### 5. Insecure Dependencies

**Commands:**
```bash
# Python
pip-audit
safety check

# Node.js
npm audit
npx audit-ci --critical

# Rust
cargo audit
```

### 6. Hardcoded Configuration

**Vulnerable patterns:**
```python
# BAD - Hardcoded values
DATABASE_URL = "postgresql://user:password@localhost/db"
API_ENDPOINT = "https://api.production.com"
DEBUG = True
```

**Secure patterns:**
```python
# GOOD - Environment-driven settings
from pydantic_settings import BaseSettings

class Settings(BaseSettings):
    database_url: str
    api_endpoint: str
    debug: bool = False

settings = Settings()
```

## Scan Output Format

```markdown
## Security Scan Report

### Summary
- Critical: X
- High: X
- Medium: X
- Low: X

### Critical Issues

#### [CRIT-001] Hardcoded AWS Credentials
**File:** `src/config.py:42`
**Type:** Secrets Exposure

**Vulnerable Code:**
```python
AWS_SECRET_KEY = "wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY"
```

**Remediation:**
1. Remove the secret from code immediately
2. Rotate the exposed credential
3. Use environment variables or AWS Secrets Manager:
```python
AWS_SECRET_KEY = os.environ.get("AWS_SECRET_ACCESS_KEY")
```

---

### High Issues
[...]

### Recommendations
1. [Specific recommendation]
2. [Specific recommendation]
```

## Automated Checks

```bash
#!/bin/bash
# security-check.sh

set -e

echo "Running security checks..."

# Check for secrets
echo "Checking for secrets..."
if grep -rniE "(password|secret|api.?key)\s*[:=]\s*['\"][^'\"]+['\"]" src/; then
    echo "FAIL: Potential secrets found"
    exit 1
fi

# Check for AWS keys
if grep -rn "AKIA" src/; then
    echo "FAIL: AWS access key found"
    exit 1
fi

# Python dependency audit
if [ -f "pyproject.toml" ]; then
    echo "Auditing Python dependencies..."
    pip-audit || true
fi

# Node dependency audit
if [ -f "package.json" ]; then
    echo "Auditing Node dependencies..."
    npm audit --audit-level=high || true
fi

# Rust dependency audit
if [ -f "Cargo.toml" ]; then
    echo "Auditing Rust dependencies..."
    cargo audit || true
fi

echo "Security checks complete"
```

## Integration with CI/CD

```yaml
# .github/workflows/security.yml
name: Security Scan

on: [push, pull_request]

jobs:
  security:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4

      - name: Run Trivy vulnerability scanner
        uses: aquasecurity/trivy-action@master
        with:
          scan-type: 'fs'
          scan-ref: '.'
          severity: 'CRITICAL,HIGH'

      - name: Run Gitleaks
        uses: gitleaks/gitleaks-action@v2
        env:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
```
144
.claude/agents/tdd-guardian.md
Normal file
@@ -0,0 +1,144 @@
---
name: tdd-guardian
description: Enforces Test-Driven Development compliance. Use proactively when planning code changes and reactively to verify TDD was followed.
model: sonnet
---

# TDD Guardian Agent

You are a TDD enforcement specialist. Your role is to ensure all code follows strict Test-Driven Development practices.

## When Invoked Proactively (Before Code)

Guide the developer through proper TDD:

1. **Identify the behavior to implement**
   - What should the code do?
   - What are the inputs and expected outputs?
   - What edge cases exist?

2. **Plan the first test**
   - What's the simplest behavior to test first?
   - How should the test be named to describe behavior?
   - What factory functions are needed for test data?

3. **Remind of the cycle**
   ```
   RED      → Write a failing test (run it, see it fail)
   GREEN    → Write the minimum code to pass (nothing more!)
   REFACTOR → Assess improvements (commit first!)
   ```

## When Invoked Reactively (After Code)

Verify TDD compliance by checking:

### 1. Test Coverage
```bash
# Run coverage and verify
pytest --cov=src --cov-report=term-missing
npm test -- --coverage
cargo tarpaulin
```

Look for:
- [ ] 80%+ overall coverage
- [ ] New code paths are covered
- [ ] Edge cases are tested

### 2. Test Quality
Review tests for:
- [ ] Tests describe behavior (not implementation)
- [ ] Test names are clear: "should [behavior] when [condition]"
- [ ] Factory functions used (no `let`/`beforeEach` with mutations)
- [ ] No spying on internal methods
- [ ] No testing of private implementation details

### 3. TDD Compliance Signals

**Good signs (TDD was followed):**
- Tests and implementation in the same commit
- Tests describe behavior through the public API
- Implementation is minimal (no over-engineering)
- Refactoring commits are separate

**Bad signs (TDD was NOT followed):**
- Implementation committed without tests
- Tests that mirror implementation structure
- Tests that spy on internal methods
- Coverage achieved by testing implementation details
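The same-commit signal can be checked mechanically by bucketing a commit's changed files. A minimal sketch; the markers and extensions are assumptions that roughly mirror this repo's pre-commit hook, not a complete convention:

```python
def classify_changed_files(paths: list[str]) -> dict[str, list[str]]:
    """Split a commit's changed files into test and production code.

    The test-file markers below follow common pytest/Vitest naming and
    are an assumption, not an exhaustive convention.
    """
    test_markers = (".test.", ".spec.", "_test.", "/tests/", "/__tests__/")
    result: dict[str, list[str]] = {"test": [], "prod": []}
    for path in paths:
        # Only classify source files; docs and config are ignored.
        if not path.endswith((".py", ".ts", ".tsx", ".rs")):
            continue
        bucket = "test" if any(m in path for m in test_markers) else "prod"
        result[bucket].append(path)
    return result
```

A non-empty `prod` bucket alongside an empty `test` bucket is the classic violation a TDD pre-commit hook blocks.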
## Verification Commands

```bash
# Check test coverage meets the threshold
pytest --cov=src --cov-fail-under=80
npm test -- --coverage --coverageThreshold='{"global":{"lines":80}}'

# Check for test files modified alongside production code
git diff --name-only HEAD~1 | grep -E '\.(test|spec)\.(ts|tsx|py|rs)$'

# Flag explicit `any` type annotations in TypeScript
grep -rn ": any" src/ --include="*.ts" --include="*.tsx"
```

## Response Format

When verifying, report:

```markdown
## TDD Compliance Report

### Coverage
- Overall: X%
- New code: Y%
- Threshold: 80%
- Status: ✅ PASS / ❌ FAIL

### Test Quality
- [ ] Tests describe behavior
- [ ] Factory functions used
- [ ] No implementation testing
- [ ] Public API tested

### Issues Found
1. [Issue description]
   - File: `path/to/file.ts`
   - Line: XX
   - Fix: [suggestion]

### Verdict
✅ TDD COMPLIANT / ❌ TDD VIOLATION - [reason]
```

## Common Violations to Flag

1. **No test for new code**
   ```
   ❌ File `src/service.ts` modified but no test changes
   ```

2. **Testing implementation**
   ```typescript
   // ❌ BAD
   expect(spy).toHaveBeenCalledWith(internalMethod);

   // ✅ GOOD
   expect(result.status).toBe('success');
   ```

3. **Mutable test setup**
   ```typescript
   // ❌ BAD
   let user: User;
   beforeEach(() => { user = createUser(); });

   // ✅ GOOD
   const getMockUser = () => createUser();
   ```

4. **Coverage without behavior testing**
   ```
   ❌ 95% coverage but tests only check that code runs,
      not that it produces correct results
   ```
54
.claude/hooks/auto-format.sh
Normal file
@@ -0,0 +1,54 @@
#!/bin/bash
# Auto-format files after writing
# Runs the appropriate formatter based on file extension

set -e

# Read the file path from stdin (Claude passes tool_input as JSON)
INPUT=$(cat)
FILE_PATH=$(echo "$INPUT" | jq -r '.file_path // .filePath // empty')

if [ -z "$FILE_PATH" ] || [ ! -f "$FILE_PATH" ]; then
    exit 0
fi

# Get the file extension
EXT="${FILE_PATH##*.}"

case "$EXT" in
    py)
        if command -v ruff &> /dev/null; then
            ruff format "$FILE_PATH" 2>/dev/null || true
            ruff check --fix "$FILE_PATH" 2>/dev/null || true
        fi
        ;;
    ts|tsx|js|jsx)
        # Walk up from the file to find the project's ESLint install
        DIR=$(dirname "$FILE_PATH")
        while [ "$DIR" != "/" ]; do
            if [ -f "$DIR/package.json" ]; then
                cd "$DIR"
                if [ -f "node_modules/.bin/eslint" ]; then
                    ./node_modules/.bin/eslint --fix "$FILE_PATH" 2>/dev/null || true
                fi
                break
            fi
            DIR=$(dirname "$DIR")
        done
        ;;
    rs)
        if command -v rustfmt &> /dev/null; then
            rustfmt "$FILE_PATH" 2>/dev/null || true
        fi
        ;;
    tf)
        if command -v terraform &> /dev/null; then
            terraform fmt "$FILE_PATH" 2>/dev/null || true
        fi
        ;;
    yaml|yml)
        # Skip YAML formatting to preserve structure
        ;;
esac

exit 0
51
.claude/hooks/check-secrets.sh
Normal file
@@ -0,0 +1,51 @@
#!/bin/bash
# Check for secrets in files before writing
# Exit code 2 blocks the operation in Claude Code

set -e

# Read the file path from stdin (Claude passes tool_input as JSON)
INPUT=$(cat)
FILE_PATH=$(echo "$INPUT" | jq -r '.file_path // .filePath // empty')

if [ -z "$FILE_PATH" ]; then
    exit 0
fi

# Skip non-code files
case "$FILE_PATH" in
    *.md|*.txt|*.json|*.yaml|*.yml|*.toml|*.lock|*.svg|*.png|*.jpg|*.gif)
        exit 0
        ;;
esac

# Patterns that indicate secrets (\x27 is a single quote)
SECRET_PATTERNS=(
    'password\s*=\s*["\x27][^"\x27]+'
    'api[_-]?key\s*=\s*["\x27][^"\x27]+'
    'secret[_-]?key\s*=\s*["\x27][^"\x27]+'
    'aws[_-]?access[_-]?key[_-]?id\s*=\s*["\x27][A-Z0-9]+'
    'aws[_-]?secret[_-]?access[_-]?key\s*=\s*["\x27][^"\x27]+'
    'private[_-]?key\s*=\s*["\x27][^"\x27]+'
    'database[_-]?url\s*=\s*["\x27]postgres(ql)?://[^"\x27]+'
    'mongodb(\+srv)?://[^"\x27\s]+'
    'redis://[^"\x27\s]+'
    'AKIA[0-9A-Z]{16}'
    'ghp_[a-zA-Z0-9]{36}'
    'sk-[a-zA-Z0-9]{48}'
    'xox[baprs]-[0-9a-zA-Z-]+'
)

# Check that the file exists, then scan it for secrets
if [ -f "$FILE_PATH" ]; then
    for pattern in "${SECRET_PATTERNS[@]}"; do
        if grep -qiE "$pattern" "$FILE_PATH" 2>/dev/null; then
            echo "BLOCKED: Potential secret detected in $FILE_PATH"
            echo "Pattern matched: $pattern"
            echo "Please use environment variables or a secrets manager instead."
            exit 2
        fi
    done
fi

exit 0
14
.claude/hooks/example-git-hooks/pre-commit
Normal file
@@ -0,0 +1,14 @@
#!/bin/bash
# Git pre-commit hook for TDD enforcement
#
# Installation:
#   cp ~/.claude/hooks/example-git-hooks/pre-commit .git/hooks/pre-commit
#   chmod +x .git/hooks/pre-commit
#
# Or create a symlink:
#   ln -sf ~/.claude/hooks/example-git-hooks/pre-commit .git/hooks/pre-commit

# Run the TDD enforcement script
~/.claude/hooks/pre-commit-tdd.sh

exit $?
57
.claude/hooks/pre-commit-tdd.sh
Normal file
@@ -0,0 +1,57 @@
#!/bin/bash
# Pre-commit hook to enforce TDD compliance
# Verifies that test files are modified alongside production code

set -e

# Get staged files
STAGED_FILES=$(git diff --cached --name-only --diff-filter=ACM)

# Separate test and production files
TEST_FILES=""
PROD_FILES=""

for file in $STAGED_FILES; do
    case "$file" in
        *test*.py|*_test.py|*tests/*.py)
            TEST_FILES="$TEST_FILES $file"
            ;;
        *.test.ts|*.test.tsx|*.spec.ts|*.spec.tsx|*/__tests__/*)
            TEST_FILES="$TEST_FILES $file"
            ;;
        *_test.rs|*/tests/*.rs)
            TEST_FILES="$TEST_FILES $file"
            ;;
        *.py|*.ts|*.tsx|*.js|*.jsx|*.rs)
            # Skip __init__.py, conftest.py, and config files
            case "$file" in
                *__init__.py|*conftest.py|*config*.py|*.config.ts|*.config.js)
                    ;;
                *)
                    PROD_FILES="$PROD_FILES $file"
                    ;;
            esac
            ;;
    esac
done

# Fail if production files changed without any test changes
if [ -n "$PROD_FILES" ] && [ -z "$TEST_FILES" ]; then
    echo "TDD VIOLATION: Production code changed without test changes"
    echo ""
    echo "Production files modified:"
    for file in $PROD_FILES; do
        echo "  - $file"
    done
    echo ""
    echo "Please ensure you're following TDD:"
    echo "  1. Write a failing test first (RED)"
    echo "  2. Write minimum code to pass (GREEN)"
    echo "  3. Refactor if needed (REFACTOR)"
    echo ""
    echo "If this is a legitimate exception (config, types, etc.), use:"
    echo "  git commit --no-verify"
    exit 1
fi

exit 0
52
.claude/settings.json
Normal file
@@ -0,0 +1,52 @@
{
  "model": "opusplan",
  "permissions": {
    "allow": [
      "Bash(pytest:*)",
      "Bash(npm test:*)",
      "Bash(cargo test:*)",
      "Bash(ruff:*)",
      "Bash(terraform plan:*)",
      "Bash(terraform validate:*)",
      "Bash(ansible-playbook:--check*)",
      "Bash(git status:*)",
      "Bash(git diff:*)",
      "Bash(git log:*)"
    ],
    "deny": [
      "Bash(rm -rf /*)",
      "Bash(terraform apply:--auto-approve*)",
      "Bash(terraform destroy:*)",
      "Bash(git push:--force*)",
      "Bash(git reset:--hard*)"
    ]
  },
  "hooks": {
    "PreToolUse": [
      {
        "matcher": "Write|Edit",
        "hooks": [
          {
            "type": "command",
            "command": "~/.claude/hooks/check-secrets.sh"
          }
        ]
      }
    ],
    "PostToolUse": [
      {
        "matcher": "Write|Edit",
        "hooks": [
          {
            "type": "command",
            "command": "~/.claude/hooks/auto-format.sh"
          }
        ]
      }
    ]
  },
  "env": {
    "COVERAGE_THRESHOLD": "80",
    "TDD_STRICT": "true"
  }
}
632
.claude/skills/infrastructure/ansible/SKILL.md
Normal file
@@ -0,0 +1,632 @@
---
|
||||||
|
name: ansible-automation
|
||||||
|
description: Ansible configuration management with playbook patterns, roles, and best practices. Use when writing Ansible playbooks, roles, or inventory configurations.
|
||||||
|
---
|
||||||
|
|
||||||
|
# Ansible Automation Skill
|
||||||
|
|
||||||
|
## Project Structure
|
||||||
|
|
||||||
|
```
|
||||||
|
ansible/
|
||||||
|
├── ansible.cfg
|
||||||
|
├── inventory/
|
||||||
|
│ ├── dev/
|
||||||
|
│ │ ├── hosts.yml
|
||||||
|
│ │ └── group_vars/
|
||||||
|
│ │ ├── all.yml
|
||||||
|
│ │ └── webservers.yml
|
||||||
|
│ ├── staging/
|
||||||
|
│ └── prod/
|
||||||
|
├── playbooks/
|
||||||
|
│ ├── site.yml # Main playbook
|
||||||
|
│ ├── webservers.yml
|
||||||
|
│ ├── databases.yml
|
||||||
|
│ └── deploy.yml
|
||||||
|
├── roles/
|
||||||
|
│ ├── common/
|
||||||
|
│ ├── nginx/
|
||||||
|
│ ├── postgresql/
|
||||||
|
│ └── app/
|
||||||
|
├── group_vars/
|
||||||
|
│ └── all.yml
|
||||||
|
├── host_vars/
|
||||||
|
└── files/
|
||||||
|
```
|
||||||
|
|
||||||
|
## Configuration (ansible.cfg)
|
||||||
|
|
||||||
|
```ini
|
||||||
|
[defaults]
|
||||||
|
inventory = inventory/dev/hosts.yml
|
||||||
|
roles_path = roles
|
||||||
|
remote_user = ec2-user
|
||||||
|
host_key_checking = False
|
||||||
|
retry_files_enabled = False
|
||||||
|
gathering = smart
|
||||||
|
fact_caching = jsonfile
|
||||||
|
fact_caching_connection = /tmp/ansible_facts
|
||||||
|
fact_caching_timeout = 86400
|
||||||
|
|
||||||
|
# Security
|
||||||
|
no_log = False
|
||||||
|
display_skipped_hosts = False
|
||||||
|
|
||||||
|
[privilege_escalation]
|
||||||
|
become = True
|
||||||
|
become_method = sudo
|
||||||
|
become_user = root
|
||||||
|
become_ask_pass = False
|
||||||
|
|
||||||
|
[ssh_connection]
|
||||||
|
pipelining = True
|
||||||
|
control_path = /tmp/ansible-ssh-%%h-%%p-%%r
|
||||||
|
```

## Inventory Patterns

### YAML Inventory (recommended)
```yaml
# inventory/dev/hosts.yml
all:
  children:
    webservers:
      hosts:
        web1:
          ansible_host: 10.0.1.10
        web2:
          ansible_host: 10.0.1.11
      vars:
        nginx_port: 80
        app_port: 8000

    databases:
      hosts:
        db1:
          ansible_host: 10.0.2.10
          postgresql_version: "15"

    workers:
      hosts:
        # Host range expands to worker1..worker3; resolve via DNS,
        # or list hosts individually to set per-host ansible_host
        worker[1:3]:
  vars:
    ansible_user: ec2-user
    ansible_python_interpreter: /usr/bin/python3
```

### Dynamic Inventory (AWS)
```yaml
# inventory/aws_ec2.yml
plugin: amazon.aws.aws_ec2
regions:
  - eu-west-2
filters:
  tag:Environment: dev
  instance-state-name: running
keyed_groups:
  - key: tags.Role
    prefix: role
  - key: placement.availability_zone
    prefix: az
hostnames:
  - private-ip-address
compose:
  ansible_host: private_ip_address
```

## Playbook Patterns

### Main Site Playbook
```yaml
# playbooks/site.yml
---
- name: Configure all hosts
  hosts: all
  become: true
  roles:
    - common

- name: Configure web servers
  hosts: webservers
  become: true
  roles:
    - nginx
    - app

- name: Configure databases
  hosts: databases
  become: true
  roles:
    - postgresql
```

### Application Deployment
```yaml
# playbooks/deploy.yml
---
- name: Deploy application
  hosts: webservers
  become: true
  serial: "25%"  # Rolling deployment
  max_fail_percentage: 25

  vars:
    app_version: "{{ lookup('env', 'APP_VERSION') | default('latest') }}"

  pre_tasks:
    - name: Verify deployment prerequisites
      ansible.builtin.assert:
        that:
          - app_version is defined
          - app_version != ''
        fail_msg: "APP_VERSION must be set"

    - name: Remove from load balancer
      ansible.builtin.uri:
        url: "{{ lb_api_url }}/deregister"
        method: POST
        body:
          instance_id: "{{ ansible_hostname }}"
        body_format: json
      delegate_to: localhost
      when: lb_api_url is defined

  roles:
    - role: app
      vars:
        app_state: present

  post_tasks:
    - name: Wait for application health check
      ansible.builtin.uri:
        url: "http://localhost:{{ app_port }}/health"
        status_code: 200
      register: health_check
      until: health_check.status == 200
      retries: 30
      delay: 5

    - name: Add back to load balancer
      ansible.builtin.uri:
        url: "{{ lb_api_url }}/register"
        method: POST
        body:
          instance_id: "{{ ansible_hostname }}"
        body_format: json
      delegate_to: localhost
      when: lb_api_url is defined

  handlers:
    - name: Restart application
      ansible.builtin.systemd:
        name: myapp
        state: restarted
        daemon_reload: true
```

## Role Structure

### Role Layout
```
roles/app/
├── defaults/
│   └── main.yml          # Default variables (lowest priority)
├── vars/
│   └── main.yml          # Role variables (higher priority)
├── tasks/
│   ├── main.yml          # Main task entry point
│   ├── install.yml
│   ├── configure.yml
│   └── service.yml
├── handlers/
│   └── main.yml          # Handlers for notifications
├── templates/
│   ├── app.conf.j2
│   └── systemd.service.j2
├── files/
│   └── scripts/
├── meta/
│   └── main.yml          # Role metadata and dependencies
└── README.md
```

### Role Tasks
```yaml
# roles/app/tasks/main.yml
---
- name: Include installation tasks
  ansible.builtin.include_tasks: install.yml
  tags:
    - install

- name: Include configuration tasks
  ansible.builtin.include_tasks: configure.yml
  tags:
    - configure

- name: Include service tasks
  ansible.builtin.include_tasks: service.yml
  tags:
    - service
```

```yaml
# roles/app/tasks/install.yml
---
- name: Create application user
  ansible.builtin.user:
    name: "{{ app_user }}"
    system: true
    shell: /bin/false
    home: "{{ app_home }}"
    create_home: true

- name: Create application directories
  ansible.builtin.file:
    path: "{{ item }}"
    state: directory
    owner: "{{ app_user }}"
    group: "{{ app_group }}"
    mode: "0755"
  loop:
    - "{{ app_home }}"
    - "{{ app_home }}/releases"
    - "{{ app_home }}/shared"
    - "{{ app_log_dir }}"

- name: Download application artifact
  ansible.builtin.get_url:
    url: "{{ app_artifact_url }}/{{ app_version }}/app.tar.gz"
    dest: "{{ app_home }}/releases/{{ app_version }}.tar.gz"
    checksum: "sha256:{{ app_checksum }}"
  register: download_result

- name: Create release directory  # unarchive does not create its dest
  ansible.builtin.file:
    path: "{{ app_home }}/releases/{{ app_version }}"
    state: directory
    owner: "{{ app_user }}"
    group: "{{ app_group }}"
    mode: "0755"

- name: Extract application
  ansible.builtin.unarchive:
    src: "{{ app_home }}/releases/{{ app_version }}.tar.gz"
    dest: "{{ app_home }}/releases/{{ app_version }}"
    remote_src: true
  when: download_result is changed

- name: Link current release
  ansible.builtin.file:
    src: "{{ app_home }}/releases/{{ app_version }}"
    dest: "{{ app_home }}/current"
    state: link
  notify: Restart application
```

### Role Handlers
```yaml
# roles/app/handlers/main.yml
---
- name: Restart application
  ansible.builtin.systemd:
    name: "{{ app_service_name }}"
    state: restarted
    daemon_reload: true

- name: Reload nginx
  ansible.builtin.systemd:
    name: nginx
    state: reloaded
```

### Role Defaults
```yaml
# roles/app/defaults/main.yml
---
app_user: myapp
app_group: myapp
app_home: /opt/myapp
app_port: 8000
app_log_dir: /var/log/myapp
app_service_name: myapp

# These should be overridden
app_version: ""
app_artifact_url: ""
app_checksum: ""
```

## Templates (Jinja2)

### Application Config
```jinja2
{# roles/app/templates/app.conf.j2 #}
# Application Configuration
# Managed by Ansible - DO NOT EDIT

[server]
host = {{ app_bind_host | default('0.0.0.0') }}
port = {{ app_port }}
workers = {{ app_workers | default(ansible_processor_vcpus * 2) }}

[database]
host = {{ db_host }}
port = {{ db_port | default(5432) }}
name = {{ db_name }}
user = {{ db_user }}
# Password from environment variable
password_env = DB_PASSWORD

[logging]
level = {{ app_log_level | default('INFO') }}
file = {{ app_log_dir }}/app.log

{% if app_features is defined %}
[features]
{% for feature, enabled in app_features.items() %}
{{ feature }} = {{ enabled | lower }}
{% endfor %}
{% endif %}
```

### Systemd Service
```jinja2
{# roles/app/templates/systemd.service.j2 #}
[Unit]
Description={{ app_description | default('Application Service') }}
After=network.target
Wants=network-online.target

[Service]
Type=simple
User={{ app_user }}
Group={{ app_group }}
WorkingDirectory={{ app_home }}/current
ExecStart={{ app_home }}/current/bin/app serve
ExecReload=/bin/kill -HUP $MAINPID
Restart=always
RestartSec=5

# Environment
Environment="PORT={{ app_port }}"
Environment="LOG_LEVEL={{ app_log_level | default('INFO') }}"
EnvironmentFile=-{{ app_home }}/shared/.env

# Security
NoNewPrivileges=true
PrivateTmp=true
ProtectSystem=strict
ReadWritePaths={{ app_log_dir }} {{ app_home }}/shared

[Install]
WantedBy=multi-user.target
```

## Secrets Management with Vault

### Encrypting Variables
```bash
# Create encrypted file
ansible-vault create group_vars/prod/vault.yml

# Edit encrypted file
ansible-vault edit group_vars/prod/vault.yml

# Encrypt existing file
ansible-vault encrypt group_vars/prod/secrets.yml

# Encrypt string for inline use
ansible-vault encrypt_string 'mysecret' --name 'db_password'
```

### Vault Variables Pattern
```yaml
# group_vars/prod/vault.yml (encrypted)
vault_db_password: "supersecretpassword"
vault_api_key: "api-key-here"

# group_vars/prod/vars.yml (plain, references vault)
db_password: "{{ vault_db_password }}"
api_key: "{{ vault_api_key }}"
```
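
A file encrypted with `ansible-vault` always begins with a `$ANSIBLE_VAULT;` header, which makes "did someone commit a plaintext vault file?" cheap to check. A minimal sketch a pre-commit hook could use (the `vault.yml` naming convention follows the pattern above; everything else is illustrative):

```python
from pathlib import Path

VAULT_HEADER = "$ANSIBLE_VAULT;"


def is_vault_encrypted(path: Path) -> bool:
    """True if the file starts with the ansible-vault header."""
    try:
        with path.open("r", encoding="utf-8") as fh:
            return fh.readline().startswith(VAULT_HEADER)
    except (OSError, UnicodeDecodeError):
        return False


def find_plaintext_vault_files(root: Path) -> list[Path]:
    """Files named vault.yml under root that are NOT encrypted."""
    return [p for p in root.rglob("vault.yml") if not is_vault_encrypted(p)]
```

Running this over `group_vars/` before every commit catches the most common vault mistake: editing a decrypted file and forgetting to re-encrypt it.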

### Using Vault in Playbooks
```bash
# Run with vault password file
ansible-playbook playbooks/site.yml --vault-password-file ~/.vault_pass

# Run with vault password prompt
ansible-playbook playbooks/site.yml --ask-vault-pass

# Multiple vault IDs
ansible-playbook playbooks/site.yml \
  --vault-id dev@~/.vault_pass_dev \
  --vault-id prod@~/.vault_pass_prod
```

## Idempotency Best Practices

```yaml
# GOOD: Idempotent - can run multiple times safely
- name: Ensure package is installed
  ansible.builtin.apt:
    name: nginx
    state: present

- name: Ensure service is running
  ansible.builtin.systemd:
    name: nginx
    state: started
    enabled: true

- name: Ensure configuration file exists
  ansible.builtin.template:
    src: nginx.conf.j2
    dest: /etc/nginx/nginx.conf
    mode: "0644"
  notify: Reload nginx

# BAD: Not idempotent - appends a duplicate line on every run
- name: Add line to file
  ansible.builtin.shell: echo "export PATH=/app/bin:$PATH" >> /etc/profile
  # Use lineinfile instead!

# GOOD: Idempotent alternative
- name: Add application to PATH
  ansible.builtin.lineinfile:
    path: /etc/profile.d/app.sh
    line: 'export PATH=/app/bin:$PATH'
    create: true
    mode: "0644"
```
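
The property `lineinfile` gives you is worth stating precisely: an idempotent operation leaves the system in the same end state no matter how many times it runs, and reports "changed" only when it actually did something. A minimal Python sketch of that contract (not Ansible's implementation, just the same idea):

```python
from pathlib import Path


def ensure_line(path: Path, line: str) -> bool:
    """Append `line` only if absent; return True when the file changed.

    Running this twice leaves the file identical to running it once,
    which is exactly what lineinfile guarantees.
    """
    existing = path.read_text().splitlines() if path.exists() else []
    if line in existing:
        return False  # already converged: report "ok", not "changed"
    path.write_text("\n".join(existing + [line]) + "\n")
    return True
```

Compare with the shell `>>` redirect above: that version appends unconditionally, so every run mutates the file and the play can never report a stable "ok".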

## Error Handling

```yaml
- name: Deploy with error handling
  block:
    - name: Download artifact
      ansible.builtin.get_url:
        url: "{{ artifact_url }}"
        dest: /tmp/artifact.tar.gz

    - name: Extract artifact
      ansible.builtin.unarchive:
        src: /tmp/artifact.tar.gz
        dest: /opt/app
        remote_src: true

  rescue:
    - name: Log deployment failure
      ansible.builtin.debug:
        msg: "Deployment failed on {{ inventory_hostname }}"

    - name: Send alert
      ansible.builtin.uri:
        url: "{{ slack_webhook }}"
        method: POST
        body:
          text: "Deployment failed on {{ inventory_hostname }}"
        body_format: json
      delegate_to: localhost

  always:
    - name: Clean up temporary files
      ansible.builtin.file:
        path: /tmp/artifact.tar.gz
        state: absent
```

## Conditionals and Loops

```yaml
# Conditional execution
- name: Install package (Debian)
  ansible.builtin.apt:
    name: nginx
    state: present
  when: ansible_os_family == "Debian"

- name: Install package (RedHat)
  ansible.builtin.yum:
    name: nginx
    state: present
  when: ansible_os_family == "RedHat"

# Loops
- name: Create users
  ansible.builtin.user:
    name: "{{ item.name }}"
    groups: "{{ item.groups }}"
    state: present
  loop:
    - { name: deploy, groups: [wheel, docker] }
    - { name: monitoring, groups: [wheel] }

# Loop with dict
- name: Configure services
  ansible.builtin.systemd:
    name: "{{ item.key }}"
    state: "{{ item.value.state }}"
    enabled: "{{ item.value.enabled }}"
  loop: "{{ services | dict2items }}"
  vars:
    services:
      nginx:
        state: started
        enabled: true
      postgresql:
        state: started
        enabled: true
```
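
The `dict2items` filter turns a mapping into a list of `{key, value}` pairs, which is what `loop` then iterates over with `item.key` and `item.value`. The transformation is trivial to mirror in Python, which makes the loop above easier to read:

```python
def dict2items(mapping: dict) -> list[dict]:
    """Mirror of Ansible's dict2items filter: {k: v} -> [{"key": k, "value": v}]."""
    return [{"key": k, "value": v} for k, v in mapping.items()]
```

So the `services` dict above becomes two loop items, one per service, each carrying its `state`/`enabled` settings under `item.value`.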

## Commands

```bash
# Syntax check
ansible-playbook playbooks/site.yml --syntax-check

# Dry run (check mode)
ansible-playbook playbooks/site.yml --check

# Dry run with diff
ansible-playbook playbooks/site.yml --check --diff

# Run playbook
ansible-playbook playbooks/site.yml

# Run with specific inventory
ansible-playbook -i inventory/prod/hosts.yml playbooks/site.yml

# Limit to specific hosts
ansible-playbook playbooks/site.yml --limit webservers

# Run specific tags
ansible-playbook playbooks/site.yml --tags "configure,service"

# Skip tags
ansible-playbook playbooks/site.yml --skip-tags "install"

# Extra variables
ansible-playbook playbooks/deploy.yml -e "app_version=1.2.3"

# Ad-hoc commands
ansible webservers -m ping
ansible all -m shell -a "uptime"
ansible databases -m service -a "name=postgresql state=restarted" --become
```

## Anti-Patterns to Avoid

```yaml
# BAD: Using shell when module exists
- name: Install package
  ansible.builtin.shell: apt-get install -y nginx

# GOOD: Use the appropriate module
- name: Install package
  ansible.builtin.apt:
    name: nginx
    state: present


# BAD: Hardcoded values
- name: Create user
  ansible.builtin.user:
    name: deploy
    uid: 1001

# GOOD: Use variables
- name: Create user
  ansible.builtin.user:
    name: "{{ deploy_user }}"
    uid: "{{ deploy_uid | default(omit) }}"


# BAD: Secrets in plain text
- name: Set database password
  ansible.builtin.lineinfile:
    path: /etc/app/config
    line: "DB_PASSWORD=mysecret"  # NEVER!

# GOOD: Use vault
- name: Set database password
  ansible.builtin.lineinfile:
    path: /etc/app/config
    line: "DB_PASSWORD={{ vault_db_password }}"
```

423 .claude/skills/infrastructure/aws/SKILL.md Normal file
@@ -0,0 +1,423 @@
---
name: aws-services
description: AWS service patterns, IAM best practices, and common architectures. Use when designing or implementing AWS infrastructure.
---

# AWS Services Skill

## Common Architecture Patterns

### Web Application (ECS + RDS)
```
┌─────────────────────────────────────────────────────────────┐
│                            VPC                              │
│  ┌─────────────────────────────────────────────────────────┐│
│  │                     Public Subnets                      ││
│  │   ┌─────────────┐                ┌─────────────┐        ││
│  │   │     ALB     │                │   NAT GW    │        ││
│  │   └──────┬──────┘                └──────┬──────┘        ││
│  └──────────┼───────────────────────────────┼──────────────┘│
│             │                               │               │
│  ┌──────────┼───────────────────────────────┼──────────────┐│
│  │          │        Private Subnets        │              ││
│  │   ┌──────▼──────┐                ┌───────▼─────┐        ││
│  │   │ ECS Fargate │                │     RDS     │        ││
│  │   │   (Tasks)   │───────────────▶│ PostgreSQL  │        ││
│  │   └─────────────┘                └─────────────┘        ││
│  └─────────────────────────────────────────────────────────┘│
└─────────────────────────────────────────────────────────────┘
```

### Serverless (Lambda + API Gateway)
```
┌────────────┐     ┌─────────────┐     ┌─────────────┐
│  Route53   │────▶│ API Gateway │────▶│   Lambda    │
└────────────┘     └─────────────┘     └──────┬──────┘
                                              │
                   ┌──────────────────────────┼──────────────┐
                   │                          │              │
            ┌──────▼─────┐              ┌─────▼───┐     ┌────▼────┐
            │  DynamoDB  │              │   S3    │     │ Secrets │
            └────────────┘              └─────────┘     │ Manager │
                                                        └─────────┘
```

## IAM Best Practices

### Least Privilege Policy
```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AllowS3ReadSpecificBucket",
      "Effect": "Allow",
      "Action": [
        "s3:GetObject",
        "s3:ListBucket"
      ],
      "Resource": [
        "arn:aws:s3:::my-app-data-bucket",
        "arn:aws:s3:::my-app-data-bucket/*"
      ]
    },
    {
      "Sid": "AllowSecretsAccess",
      "Effect": "Allow",
      "Action": [
        "secretsmanager:GetSecretValue"
      ],
      "Resource": [
        "arn:aws:secretsmanager:eu-west-2:123456789:secret:my-app/*"
      ]
    }
  ]
}
```

### Trust Policy (for ECS Tasks)
```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": {
        "Service": "ecs-tasks.amazonaws.com"
      },
      "Action": "sts:AssumeRole"
    }
  ]
}
```

### Cross-Account Access
```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": {
        "AWS": "arn:aws:iam::ACCOUNT_ID:role/CrossAccountRole"
      },
      "Action": "sts:AssumeRole",
      "Condition": {
        "StringEquals": {
          "sts:ExternalId": "unique-external-id"
        }
      }
    }
  ]
}
```

## Secrets Management

### Using Secrets Manager
```python
# Python - boto3
import json

import boto3


def get_secret(secret_name: str, region: str = "eu-west-2") -> dict:
    client = boto3.client("secretsmanager", region_name=region)
    response = client.get_secret_value(SecretId=secret_name)
    return json.loads(response["SecretString"])


# Usage
db_creds = get_secret("myapp/prod/database")
connection_string = (
    f"postgresql://{db_creds['username']}:{db_creds['password']}"
    f"@{db_creds['host']}/{db_creds['database']}"
)
```

```typescript
// TypeScript - AWS SDK v3
import { SecretsManagerClient, GetSecretValueCommand } from "@aws-sdk/client-secrets-manager";

async function getSecret(secretName: string): Promise<Record<string, string>> {
  const client = new SecretsManagerClient({ region: "eu-west-2" });
  const command = new GetSecretValueCommand({ SecretId: secretName });
  const response = await client.send(command);

  if (!response.SecretString) {
    throw new Error("Secret not found");
  }

  return JSON.parse(response.SecretString);
}
```

### ECS Task with Secrets
```json
// Task definition
{
  "containerDefinitions": [
    {
      "name": "app",
      "secrets": [
        {
          "name": "DATABASE_PASSWORD",
          "valueFrom": "arn:aws:secretsmanager:eu-west-2:123456789:secret:myapp/database:password::"
        },
        {
          "name": "API_KEY",
          "valueFrom": "arn:aws:secretsmanager:eu-west-2:123456789:secret:myapp/api-key"
        }
      ]
    }
  ]
}
```

## S3 Patterns

### Bucket Policy (Least Privilege)
```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AllowECSTaskAccess",
      "Effect": "Allow",
      "Principal": {
        "AWS": "arn:aws:iam::123456789:role/ecs-task-role"
      },
      "Action": [
        "s3:GetObject",
        "s3:PutObject"
      ],
      "Resource": "arn:aws:s3:::my-bucket/uploads/*"
    },
    {
      "Sid": "DenyUnencryptedUploads",
      "Effect": "Deny",
      "Principal": "*",
      "Action": "s3:PutObject",
      "Resource": "arn:aws:s3:::my-bucket/*",
      "Condition": {
        "StringNotEquals": {
          "s3:x-amz-server-side-encryption": "AES256"
        }
      }
    }
  ]
}
```
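
Because bucket policies are plain JSON, guardrails like "every bucket must deny unencrypted uploads" can be unit-tested without touching AWS. A hedged sketch (the statement shape follows the policy above; the helper itself is illustrative, not a full policy evaluator):

```python
def has_deny_unencrypted_uploads(policy: dict) -> bool:
    """True if some statement denies s3:PutObject unless SSE is requested."""
    for stmt in policy.get("Statement", []):
        actions = stmt.get("Action", [])
        if isinstance(actions, str):
            actions = [actions]  # Action may be a string or a list
        condition = stmt.get("Condition", {}).get("StringNotEquals", {})
        if (
            stmt.get("Effect") == "Deny"
            and "s3:PutObject" in actions
            and "s3:x-amz-server-side-encryption" in condition
        ):
            return True
    return False
```

A CI step can run this over every bucket policy in the repo and fail the build when the deny statement is missing.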

### Presigned URLs
```python
import boto3
from botocore.config import Config


def generate_presigned_url(bucket: str, key: str, expiration: int = 3600) -> str:
    """Generate a presigned URL for S3 object access."""
    s3_client = boto3.client(
        "s3",
        config=Config(signature_version="s3v4"),
        region_name="eu-west-2"
    )

    return s3_client.generate_presigned_url(
        "get_object",
        Params={"Bucket": bucket, "Key": key},
        ExpiresIn=expiration
    )
```

## DynamoDB Patterns

### Single Table Design
```python
import boto3

dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table("my-table")

# Entity types in same table
ENTITY_TYPES = {
    "USER": {"PK": "USER#", "SK": "PROFILE"},
    "ORDER": {"PK": "USER#", "SK": "ORDER#"},
    "PRODUCT": {"PK": "PRODUCT#", "SK": "DETAILS"},
}


# Access patterns
def get_user(user_id: str) -> dict:
    return table.get_item(
        Key={"PK": f"USER#{user_id}", "SK": "PROFILE"}
    )["Item"]


def get_user_orders(user_id: str) -> list:
    response = table.query(
        KeyConditionExpression="PK = :pk AND begins_with(SK, :sk)",
        ExpressionAttributeValues={
            ":pk": f"USER#{user_id}",
            ":sk": "ORDER#"
        }
    )
    return response["Items"]
```
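
The PK/SK conventions above are just string prefixes, so it can help to isolate key construction in pure helper functions that are testable without DynamoDB (names here are illustrative, not part of any SDK):

```python
def user_key(user_id: str) -> dict:
    """Primary key for a user profile item."""
    return {"PK": f"USER#{user_id}", "SK": "PROFILE"}


def order_key(user_id: str, order_id: str) -> dict:
    """Primary key for an order stored in its user's partition."""
    return {"PK": f"USER#{user_id}", "SK": f"ORDER#{order_id}"}


def order_prefix_values(user_id: str) -> dict:
    """ExpressionAttributeValues for a begins_with(SK, :sk) orders query."""
    return {":pk": f"USER#{user_id}", ":sk": "ORDER#"}
```

Keeping key construction in one place prevents the classic single-table bug: a writer and a reader disagreeing on a prefix by one character.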

## Lambda Patterns

### Handler with Error Handling
```python
import json
import logging
from typing import Any

logger = logging.getLogger()
logger.setLevel(logging.INFO)


class ValidationError(Exception):
    """Raised when the request body fails validation."""


def handler(event: dict, context: Any) -> dict:
    """Lambda handler with proper error handling."""
    try:
        logger.info("Processing event", extra={"event": event})

        # Process request (process_request is application-specific)
        body = json.loads(event.get("body", "{}"))
        result = process_request(body)

        return {
            "statusCode": 200,
            "headers": {"Content-Type": "application/json"},
            "body": json.dumps(result)
        }

    except ValidationError as e:
        logger.warning("Validation error", extra={"error": str(e)})
        return {
            "statusCode": 400,
            "body": json.dumps({"error": str(e)})
        }

    except Exception:
        logger.exception("Unexpected error")
        return {
            "statusCode": 500,
            "body": json.dumps({"error": "Internal server error"})
        }
```

### Cold Start Optimization
```python
# Initialize outside handler (runs once per container)
import json

import boto3

# These persist across invocations
dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table("my-table")
secrets_client = boto3.client("secretsmanager")

# Cache secrets
_cached_secrets = {}


def get_cached_secret(name: str) -> dict:
    if name not in _cached_secrets:
        response = secrets_client.get_secret_value(SecretId=name)
        _cached_secrets[name] = json.loads(response["SecretString"])
    return _cached_secrets[name]


def handler(event, context):
    # Use cached resources
    secret = get_cached_secret("my-secret")
    # ...
```

## CloudWatch Patterns

### Structured Logging
```python
import json
import logging

# Attributes present on every LogRecord; anything else came from `extra`
RESERVED = set(logging.LogRecord("", 0, "", 0, "", (), None).__dict__) | {"message", "asctime"}


class JsonFormatter(logging.Formatter):
    def format(self, record):
        log_record = {
            "timestamp": self.formatTime(record),
            "level": record.levelname,
            "message": record.getMessage(),
            "logger": record.name,
        }

        # extra={...} sets attributes directly on the record, so copy
        # every non-standard attribute into the JSON payload
        for key, value in record.__dict__.items():
            if key not in RESERVED:
                log_record[key] = value

        return json.dumps(log_record)


# Setup
logger = logging.getLogger()
handler = logging.StreamHandler()
handler.setFormatter(JsonFormatter())
logger.addHandler(handler)

# Usage
logger.info("User created", extra={"user_id": "123", "email": "user@example.com"})
```

### Custom Metrics
```python
import boto3

cloudwatch = boto3.client("cloudwatch")


def publish_metric(name: str, value: float, unit: str = "Count"):
    cloudwatch.put_metric_data(
        Namespace="MyApp",
        MetricData=[
            {
                "MetricName": name,
                "Value": value,
                "Unit": unit,
                "Dimensions": [
                    {"Name": "Environment", "Value": "prod"},
                    {"Name": "Service", "Value": "api"},
                ]
            }
        ]
    )


# Usage
publish_metric("OrdersProcessed", 1)
publish_metric("ProcessingTime", 150, "Milliseconds")
```

## CLI Commands

```bash
# IAM
aws iam get-role --role-name MyRole
aws iam list-attached-role-policies --role-name MyRole
aws sts get-caller-identity

# S3
aws s3 ls s3://my-bucket/
aws s3 cp file.txt s3://my-bucket/
aws s3 presign s3://my-bucket/file.txt --expires-in 3600

# Secrets Manager
aws secretsmanager get-secret-value --secret-id my-secret
aws secretsmanager list-secrets

# ECS
aws ecs list-clusters
aws ecs describe-services --cluster my-cluster --services my-service
aws ecs update-service --cluster my-cluster --service my-service --force-new-deployment

# Lambda
aws lambda invoke --function-name my-function output.json
aws lambda list-functions
aws logs tail /aws/lambda/my-function --follow

# CloudWatch
aws logs filter-log-events --log-group-name /aws/lambda/my-function --filter-pattern "ERROR"
```

## Security Checklist

- [ ] All S3 buckets have versioning enabled
- [ ] All S3 buckets block public access (unless explicitly needed)
- [ ] Encryption at rest enabled for all data stores
- [ ] Encryption in transit (TLS) for all connections
- [ ] IAM roles use least privilege
- [ ] No long-term credentials (use IAM roles/instance profiles)
- [ ] Secrets in Secrets Manager (not env vars or code)
- [ ] VPC endpoints for AWS services (avoid public internet)
- [ ] Security groups follow principle of least privilege
- [ ] CloudTrail enabled for auditing
- [ ] GuardDuty enabled for threat detection
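
Several of these checks can be automated. For the public-access item, boto3's `s3.get_public_access_block` returns a `PublicAccessBlockConfiguration` mapping; evaluating it is a pure function, so the logic can be tested without credentials (a sketch, assuming you fetch the config separately):

```python
def blocks_public_access(config: dict) -> bool:
    """True when every public-access-block flag is enabled.

    `config` is the PublicAccessBlockConfiguration mapping, e.g. from
    s3.get_public_access_block(Bucket=...)["PublicAccessBlockConfiguration"].
    """
    flags = (
        "BlockPublicAcls",
        "IgnorePublicAcls",
        "BlockPublicPolicy",
        "RestrictPublicBuckets",
    )
    # Missing flags count as disabled, so a partial config fails the check
    return all(config.get(flag, False) for flag in flags)
```

A nightly job can loop over `s3.list_buckets()`, apply this check, and report any bucket that fails it for review.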
|
||||||
442
.claude/skills/infrastructure/azure/SKILL.md
Normal file
442
.claude/skills/infrastructure/azure/SKILL.md
Normal file
@@ -0,0 +1,442 @@
|
|||||||
|
---
name: azure-services
description: Azure service patterns, RBAC best practices, and common architectures. Use when designing or implementing Azure infrastructure.
---

# Azure Services Skill

## Common Architecture Patterns

### Web Application (App Service + Azure SQL)
```
┌─────────────────────────────────────────────────────────────┐
│                           VNet                              │
│  ┌─────────────────────────────────────────────────────────┐│
│  │                     Public Subnet                       ││
│  │   ┌─────────────┐                ┌─────────────┐        ││
│  │   │ App Gateway │                │   NAT GW    │        ││
│  │   └──────┬──────┘                └──────┬──────┘        ││
│  └──────────┼───────────────────────────────┼──────────────┘│
│             │                               │               │
│  ┌──────────┼───────────────────────────────┼──────────────┐│
│  │          │        Private Subnet         │              ││
│  │   ┌──────▼──────┐                ┌───────▼─────┐        ││
│  │   │ App Service │                │  Azure SQL  │        ││
│  │   │  (Web App)  │───────────────▶│  Database   │        ││
│  │   └─────────────┘                └─────────────┘        ││
│  └─────────────────────────────────────────────────────────┘│
└─────────────────────────────────────────────────────────────┘
```
|
||||||
|
|
||||||
|
### Serverless (Azure Functions + API Management)
|
||||||
|
```
|
||||||
|
┌────────────┐ ┌─────────────┐ ┌─────────────┐
|
||||||
|
│ Front │────▶│ APIM │────▶│ Functions │
|
||||||
|
│ Door │ │ │ └──────┬──────┘
|
||||||
|
└────────────┘ └─────────────┘ │
|
||||||
|
┌──────────────────────────┼──────────────┐
|
||||||
|
│ │ │
|
||||||
|
┌──────▼─────┐ ┌─────────┐ ┌────▼────┐
|
||||||
|
│ Cosmos DB │ │ Blob │ │ Key │
|
||||||
|
└────────────┘ │ Storage │ │ Vault │
|
||||||
|
└─────────┘ └─────────┘
|
||||||
|
```
|
||||||
|
|
||||||
|
## RBAC Best Practices

### Custom Role Definition
```json
{
  "Name": "App Data Reader",
  "Description": "Read access to application data in storage",
  "Actions": [
    "Microsoft.Storage/storageAccounts/blobServices/containers/read"
  ],
  "NotActions": [],
  "DataActions": [
    "Microsoft.Storage/storageAccounts/blobServices/containers/blobs/read"
  ],
  "NotDataActions": [],
  "AssignableScopes": [
    "/subscriptions/{subscription-id}/resourceGroups/{resource-group}"
  ]
}
```

Note that blob reads are a data-plane operation, so `.../containers/blobs/read` belongs under `DataActions`; `Actions` is for management-plane operations only.

### Managed Identity Usage
```python
# Python - azure-identity
from azure.identity import DefaultAzureCredential
from azure.keyvault.secrets import SecretClient
from azure.storage.blob import BlobServiceClient

# Uses managed identity when deployed to Azure
credential = DefaultAzureCredential()

# Key Vault access
secret_client = SecretClient(
    vault_url="https://my-vault.vault.azure.net/",
    credential=credential
)
secret = secret_client.get_secret("database-password")

# Blob Storage access
blob_service = BlobServiceClient(
    account_url="https://mystorageaccount.blob.core.windows.net/",
    credential=credential
)
```

```typescript
// TypeScript - @azure/identity
import { DefaultAzureCredential } from "@azure/identity";
import { SecretClient } from "@azure/keyvault-secrets";
import { BlobServiceClient } from "@azure/storage-blob";

const credential = new DefaultAzureCredential();

// Key Vault access
const secretClient = new SecretClient(
  "https://my-vault.vault.azure.net/",
  credential
);
const secret = await secretClient.getSecret("database-password");

// Blob Storage access
const blobService = new BlobServiceClient(
  "https://mystorageaccount.blob.core.windows.net/",
  credential
);
```
## Key Vault Patterns

### Secrets Management
```python
from azure.identity import DefaultAzureCredential
from azure.keyvault.secrets import SecretClient


def get_secret(vault_url: str, secret_name: str) -> str:
    """Retrieve secret from Key Vault using managed identity."""
    credential = DefaultAzureCredential()
    client = SecretClient(vault_url=vault_url, credential=credential)
    return client.get_secret(secret_name).value


# Usage
db_password = get_secret(
    "https://my-vault.vault.azure.net/",
    "database-password"
)
```

### App Service with Key Vault References
```json
// App Service configuration
{
  "name": "DatabasePassword",
  "value": "@Microsoft.KeyVault(SecretUri=https://my-vault.vault.azure.net/secrets/db-password/)",
  "slotSetting": false
}
```
## Blob Storage Patterns

### SAS Token Generation
```python
from datetime import datetime, timedelta

from azure.storage.blob import (
    generate_blob_sas,
    BlobSasPermissions,
)


def generate_read_sas(
    account_name: str,
    account_key: str,
    container: str,
    blob_name: str,
    expiry_hours: int = 1
) -> str:
    """Generate a read-only SAS URL for a blob."""
    sas_token = generate_blob_sas(
        account_name=account_name,
        container_name=container,
        blob_name=blob_name,
        account_key=account_key,
        permission=BlobSasPermissions(read=True),
        expiry=datetime.utcnow() + timedelta(hours=expiry_hours),
    )

    return f"https://{account_name}.blob.core.windows.net/{container}/{blob_name}?{sas_token}"
```

### User Delegation SAS (More Secure)
```python
from datetime import datetime, timedelta

from azure.identity import DefaultAzureCredential
from azure.storage.blob import (
    BlobServiceClient,
    BlobSasPermissions,
    generate_blob_sas,
)


def generate_user_delegation_sas(
    account_url: str,
    container: str,
    blob_name: str,
) -> str:
    """Generate SAS using user delegation key (no storage key needed)."""
    credential = DefaultAzureCredential()
    blob_service = BlobServiceClient(account_url, credential=credential)

    # Get user delegation key
    delegation_key = blob_service.get_user_delegation_key(
        key_start_time=datetime.utcnow(),
        key_expiry_time=datetime.utcnow() + timedelta(hours=1)
    )

    sas_token = generate_blob_sas(
        account_name=blob_service.account_name,
        container_name=container,
        blob_name=blob_name,
        user_delegation_key=delegation_key,
        permission=BlobSasPermissions(read=True),
        expiry=datetime.utcnow() + timedelta(hours=1),
    )

    return f"{account_url}/{container}/{blob_name}?{sas_token}"
```
## Cosmos DB Patterns

### Async Client Usage
```python
from azure.cosmos.aio import CosmosClient
from azure.identity.aio import DefaultAzureCredential


async def get_cosmos_client() -> CosmosClient:
    """Create async Cosmos client with managed identity."""
    credential = DefaultAzureCredential()
    return CosmosClient(
        url="https://my-cosmos.documents.azure.com:443/",
        credential=credential
    )


async def query_items(container_name: str, query: str) -> list:
    """Query items from Cosmos DB container."""
    async with await get_cosmos_client() as client:
        database = client.get_database_client("my-database")
        container = database.get_container_client(container_name)

        items = []
        async for item in container.query_items(
            query=query,
            enable_cross_partition_query=True
        ):
            items.append(item)

        return items
```

### Partition Key Design
```python
# Good partition key choices:
# - tenant_id for multi-tenant apps
# - user_id for user-specific data
# - category for catalog data

# Document structure
{
    "id": "order-12345",
    "partitionKey": "customer-789",  # Use customer ID for orders
    "orderDate": "2024-01-15",
    "items": [...],
    "total": 150.00
}
```
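When no single property is both high-cardinality and evenly distributed, a common extension of the guidance above is a synthetic partition key that combines two properties. A minimal sketch (the tenant-plus-month scheme is illustrative, not prescribed by this skill):

```python
def synthetic_partition_key(tenant_id: str, order_date: str) -> str:
    """Combine tenant and month ("YYYY-MM-DD" date) into one key.

    Keeps a tenant's orders for one month in a single logical partition,
    while spreading a large tenant's history across many partitions.
    """
    year_month = order_date[:7]  # "2024-01-15" -> "2024-01"
    return f"{tenant_id}_{year_month}"


doc = {
    "id": "order-12345",
    "partitionKey": synthetic_partition_key("customer-789", "2024-01-15"),
    "orderDate": "2024-01-15",
}
print(doc["partitionKey"])  # customer-789_2024-01
```

Queries scoped to one tenant and month then stay single-partition, while cross-month queries fan out predictably.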
## Azure Functions Patterns

### HTTP Trigger with Input Validation
```python
import logging

import azure.functions as func
from pydantic import BaseModel, ValidationError


class CreateOrderRequest(BaseModel):
    customer_id: str
    items: list[dict]


app = func.FunctionApp()


@app.route(route="orders", methods=["POST"])
async def create_order(req: func.HttpRequest) -> func.HttpResponse:
    """Create a new order with validation."""
    try:
        body = req.get_json()
        request = CreateOrderRequest(**body)

        # Process order (process_order is the application's domain handler)
        result = await process_order(request)

        return func.HttpResponse(
            body=result.model_dump_json(),
            status_code=201,
            mimetype="application/json"
        )

    except ValidationError as e:
        return func.HttpResponse(
            body=e.json(),
            status_code=400,
            mimetype="application/json"
        )
    except Exception:
        logging.exception("Error processing order")
        return func.HttpResponse(
            body='{"error": "Internal server error"}',
            status_code=500,
            mimetype="application/json"
        )
```

### Durable Functions Orchestration
```python
import azure.durable_functions as df

# Durable orchestrations register on a DFApp rather than a plain FunctionApp
app = df.DFApp()


@app.orchestration_trigger(context_name="context")
def order_orchestrator(context: df.DurableOrchestrationContext):
    """Orchestrate multi-step order processing."""
    order = context.get_input()

    # Step 1: Validate inventory
    inventory_result = yield context.call_activity(
        "validate_inventory", order["items"]
    )

    if not inventory_result["available"]:
        return {"status": "failed", "reason": "insufficient_inventory"}

    # Step 2: Process payment
    payment_result = yield context.call_activity(
        "process_payment", order["payment"]
    )

    if not payment_result["success"]:
        return {"status": "failed", "reason": "payment_failed"}

    # Step 3: Create shipment
    shipment = yield context.call_activity(
        "create_shipment", order
    )

    return {"status": "completed", "shipment_id": shipment["id"]}
```
## Application Insights

### Structured Logging

The `opencensus` exporters below are in maintenance mode; Azure now recommends the OpenTelemetry-based `azure-monitor-opentelemetry` distro for new projects, but the custom-dimensions pattern is the same.

```python
import logging

from opencensus.ext.azure.log_exporter import AzureLogHandler

# Configure logging with Application Insights
logger = logging.getLogger(__name__)
logger.addHandler(AzureLogHandler(
    connection_string="InstrumentationKey=xxx;IngestionEndpoint=xxx"
))

# Log with custom dimensions
logger.info(
    "Order processed",
    extra={
        "custom_dimensions": {
            "order_id": "12345",
            "customer_id": "cust-789",
            "total": 150.00
        }
    }
)
```

### Custom Metrics
```python
from opencensus.ext.azure import metrics_exporter
from opencensus.stats import aggregation, measure, stats, view

# Create measure
orders_measure = measure.MeasureInt(
    "orders_processed",
    "Number of orders processed",
    "orders"
)

# Create view
orders_view = view.View(
    "orders_processed_total",
    "Total orders processed",
    [],
    orders_measure,
    aggregation.CountAggregation()
)

# Register and export
view_manager = stats.stats.view_manager
view_manager.register_view(orders_view)

exporter = metrics_exporter.new_metrics_exporter(
    connection_string="InstrumentationKey=xxx"
)
view_manager.register_exporter(exporter)

# Record metric
mmap = stats.stats.stats_recorder.new_measurement_map()
mmap.measure_int_put(orders_measure, 1)
mmap.record()
```
## CLI Commands

```bash
# Authentication
az login
az account set --subscription "My Subscription"
az account show

# Resource Groups
az group list --output table
az group create --name my-rg --location uksouth

# Key Vault
az keyvault secret show --vault-name my-vault --name my-secret
az keyvault secret set --vault-name my-vault --name my-secret --value "secret-value"

# Storage
az storage blob list --account-name mystorageaccount --container-name mycontainer
az storage blob upload --account-name mystorageaccount --container-name mycontainer --file local.txt --name remote.txt

# App Service
az webapp list --output table
az webapp restart --name my-app --resource-group my-rg
az webapp log tail --name my-app --resource-group my-rg

# Functions
az functionapp list --output table
az functionapp restart --name my-func --resource-group my-rg

# Cosmos DB
az cosmosdb list --output table
az cosmosdb sql database list --account-name my-cosmos --resource-group my-rg
```
## Security Checklist

- [ ] Use Managed Identities instead of connection strings
- [ ] Store secrets in Key Vault, not app settings
- [ ] Enable Microsoft Defender for Cloud (formerly Azure Defender) for all resources
- [ ] Use Private Endpoints for PaaS services
- [ ] Enable diagnostic logging to Log Analytics
- [ ] Configure Network Security Groups
- [ ] Use User Delegation SAS instead of account keys
- [ ] Enable soft delete on Key Vault and Storage
- [ ] Configure Azure Policy for compliance
.claude/skills/infrastructure/cicd/SKILL.md (new file, 599 lines)
---
name: cicd-pipelines
description: CI/CD pipeline patterns for Jenkins, GitHub Actions, and GitLab CI. Use when setting up continuous integration or deployment pipelines.
---

# CI/CD Pipelines Skill

## Jenkins

### Declarative Pipeline
```groovy
// Jenkinsfile
pipeline {
    agent any

    environment {
        REGISTRY = 'myregistry.azurecr.io'
        IMAGE_NAME = 'myapp'
        COVERAGE_THRESHOLD = '80'
    }

    options {
        timeout(time: 30, unit: 'MINUTES')
        disableConcurrentBuilds()
        buildDiscarder(logRotator(numToKeepStr: '10'))
    }

    stages {
        stage('Checkout') {
            steps {
                checkout scm
            }
        }

        stage('Install Dependencies') {
            parallel {
                stage('Python') {
                    when {
                        changeset "apps/backend/**"
                    }
                    steps {
                        sh 'uv sync'
                    }
                }
                stage('Node') {
                    when {
                        changeset "apps/frontend/**"
                    }
                    steps {
                        sh 'npm ci'
                    }
                }
            }
        }

        stage('Lint & Type Check') {
            parallel {
                stage('Python Lint') {
                    when {
                        changeset "apps/backend/**"
                    }
                    steps {
                        sh 'uv run ruff check apps/backend/'
                        sh 'uv run mypy apps/backend/'
                    }
                }
                stage('TypeScript Lint') {
                    when {
                        changeset "apps/frontend/**"
                    }
                    steps {
                        sh 'npm run lint --workspace=frontend'
                        sh 'npm run typecheck --workspace=frontend'
                    }
                }
            }
        }

        stage('Test') {
            parallel {
                stage('Backend Tests') {
                    when {
                        changeset "apps/backend/**"
                    }
                    steps {
                        sh """
                            uv run pytest apps/backend/ \
                                --cov=apps/backend/src \
                                --cov-report=xml \
                                --cov-fail-under=${COVERAGE_THRESHOLD} \
                                --junitxml=test-results/backend.xml
                        """
                    }
                    post {
                        always {
                            junit 'test-results/backend.xml'
                            publishCoverage adapters: [coberturaAdapter('coverage.xml')]
                        }
                    }
                }
                stage('Frontend Tests') {
                    when {
                        changeset "apps/frontend/**"
                    }
                    steps {
                        sh """
                            npm run test --workspace=frontend -- \
                                --coverage \
                                --coverageThreshold='{"global":{"branches":${COVERAGE_THRESHOLD},"functions":${COVERAGE_THRESHOLD},"lines":${COVERAGE_THRESHOLD}}}' \
                                --reporter=junit \
                                --outputFile=test-results/frontend.xml
                        """
                    }
                    post {
                        always {
                            junit 'test-results/frontend.xml'
                        }
                    }
                }
            }
        }

        stage('Security Scan') {
            steps {
                sh 'trivy fs --severity HIGH,CRITICAL --exit-code 1 .'
            }
        }
        stage('Build') {
            when {
                anyOf {
                    branch 'main'
                    branch 'release/*'
                }
            }
            steps {
                script {
                    // Store in env so later stages (Push) can read the same value;
                    // a local `def version` would go out of scope here
                    env.VERSION = sh(script: 'git describe --tags --always', returnStdout: true).trim()
                    sh """
                        docker build -t ${REGISTRY}/${IMAGE_NAME}:${env.VERSION} .
                        docker tag ${REGISTRY}/${IMAGE_NAME}:${env.VERSION} ${REGISTRY}/${IMAGE_NAME}:latest
                    """
                }
            }
        }

        stage('Push') {
            when {
                branch 'main'
            }
            steps {
                withCredentials([usernamePassword(
                    credentialsId: 'registry-credentials',
                    usernameVariable: 'REGISTRY_USER',
                    passwordVariable: 'REGISTRY_PASS'
                )]) {
                    sh """
                        echo \$REGISTRY_PASS | docker login ${REGISTRY} -u \$REGISTRY_USER --password-stdin
                        docker push ${REGISTRY}/${IMAGE_NAME}:${env.VERSION}
                        docker push ${REGISTRY}/${IMAGE_NAME}:latest
                    """
                }
            }
        }
        stage('Deploy to Staging') {
            when {
                branch 'main'
            }
            steps {
                sh 'kubectl apply -f k8s/staging/'
                sh 'kubectl rollout status deployment/myapp -n staging'
            }
        }

        stage('Deploy to Production') {
            when {
                branch 'release/*'
            }
            input {
                message "Deploy to production?"
                ok "Deploy"
            }
            steps {
                sh 'kubectl apply -f k8s/production/'
                sh 'kubectl rollout status deployment/myapp -n production'
            }
        }
    }

    post {
        always {
            cleanWs()
        }
        success {
            slackSend(
                channel: '#deployments',
                color: 'good',
                message: "Build ${env.BUILD_NUMBER} succeeded: ${env.BUILD_URL}"
            )
        }
        failure {
            slackSend(
                channel: '#deployments',
                color: 'danger',
                message: "Build ${env.BUILD_NUMBER} failed: ${env.BUILD_URL}"
            )
        }
    }
}
```
### Shared Library
```groovy
// vars/pythonPipeline.groovy
def call(Map config = [:]) {
    pipeline {
        agent any

        stages {
            stage('Test') {
                steps {
                    sh "uv run pytest ${config.testPath ?: 'tests/'} --cov --cov-fail-under=${config.coverage ?: 80}"
                }
            }
            stage('Lint') {
                steps {
                    sh "uv run ruff check ${config.srcPath ?: 'src/'}"
                }
            }
        }
    }
}

// Usage in Jenkinsfile
@Library('my-shared-library') _

pythonPipeline(
    testPath: 'apps/backend/tests/',
    srcPath: 'apps/backend/src/',
    coverage: 85
)
```
## GitHub Actions

### Complete Workflow
```yaml
# .github/workflows/ci.yml
name: CI/CD

on:
  push:
    branches: [main, 'release/*']
  pull_request:
    branches: [main]

env:
  REGISTRY: ghcr.io
  IMAGE_NAME: ${{ github.repository }}

jobs:
  detect-changes:
    runs-on: ubuntu-latest
    outputs:
      backend: ${{ steps.changes.outputs.backend }}
      frontend: ${{ steps.changes.outputs.frontend }}
      infrastructure: ${{ steps.changes.outputs.infrastructure }}
    steps:
      - uses: actions/checkout@v4
      - uses: dorny/paths-filter@v3
        id: changes
        with:
          filters: |
            backend:
              - 'apps/backend/**'
              - 'packages/shared/**'
            frontend:
              - 'apps/frontend/**'
              - 'packages/shared/**'
            infrastructure:
              - 'infrastructure/**'

  backend-test:
    needs: detect-changes
    if: needs.detect-changes.outputs.backend == 'true'
    runs-on: ubuntu-latest
    services:
      postgres:
        image: postgres:16
        env:
          POSTGRES_USER: test
          POSTGRES_PASSWORD: test
          POSTGRES_DB: test
        ports:
          - 5432:5432
        options: >-
          --health-cmd pg_isready
          --health-interval 10s
          --health-timeout 5s
          --health-retries 5

    steps:
      - uses: actions/checkout@v4

      - name: Install uv
        uses: astral-sh/setup-uv@v4

      - name: Install dependencies
        run: uv sync

      - name: Lint
        run: |
          uv run ruff check apps/backend/
          uv run mypy apps/backend/

      - name: Test
        env:
          DATABASE_URL: postgresql://test:test@localhost:5432/test
        run: |
          uv run pytest apps/backend/ \
            --cov=apps/backend/src \
            --cov-report=xml \
            --cov-fail-under=80

      - name: Upload coverage
        uses: codecov/codecov-action@v4
        with:
          files: coverage.xml
          flags: backend
  frontend-test:
    needs: detect-changes
    if: needs.detect-changes.outputs.frontend == 'true'
    runs-on: ubuntu-latest

    steps:
      - uses: actions/checkout@v4

      - uses: actions/setup-node@v4
        with:
          node-version: '22'
          cache: 'npm'

      - name: Install dependencies
        run: npm ci

      - name: Lint & Type Check
        run: |
          npm run lint --workspace=frontend
          npm run typecheck --workspace=frontend

      - name: Test
        run: npm run test --workspace=frontend -- --coverage

      - name: Upload coverage
        uses: codecov/codecov-action@v4
        with:
          files: apps/frontend/coverage/lcov.info
          flags: frontend

  security-scan:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4

      - name: Run Trivy vulnerability scanner
        uses: aquasecurity/trivy-action@master
        with:
          scan-type: 'fs'
          severity: 'CRITICAL,HIGH'
          exit-code: '1'

      - name: Run Gitleaks
        uses: gitleaks/gitleaks-action@v2
        env:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
  build-and-push:
    needs: [backend-test, frontend-test, security-scan]
    if: github.ref == 'refs/heads/main'
    runs-on: ubuntu-latest
    permissions:
      contents: read
      packages: write

    steps:
      - uses: actions/checkout@v4

      - name: Log in to Container registry
        uses: docker/login-action@v3
        with:
          registry: ${{ env.REGISTRY }}
          username: ${{ github.actor }}
          password: ${{ secrets.GITHUB_TOKEN }}

      - name: Extract metadata
        id: meta
        uses: docker/metadata-action@v5
        with:
          images: ${{ env.REGISTRY }}/${{ env.IMAGE_NAME }}
          tags: |
            type=sha
            type=ref,event=branch

      - name: Build and push
        uses: docker/build-push-action@v5
        with:
          context: .
          push: true
          tags: ${{ steps.meta.outputs.tags }}
          labels: ${{ steps.meta.outputs.labels }}
          cache-from: type=gha
          cache-to: type=gha,mode=max

  deploy-staging:
    needs: build-and-push
    runs-on: ubuntu-latest
    environment: staging

    steps:
      - uses: actions/checkout@v4

      - name: Deploy to staging
        run: |
          kubectl apply -f k8s/staging/
          kubectl rollout status deployment/myapp -n staging

  deploy-production:
    needs: deploy-staging
    runs-on: ubuntu-latest
    environment: production

    steps:
      - uses: actions/checkout@v4

      - name: Deploy to production
        run: |
          kubectl apply -f k8s/production/
          kubectl rollout status deployment/myapp -n production
```
### Reusable Workflow
```yaml
# .github/workflows/python-ci.yml
name: Python CI

on:
  workflow_call:
    inputs:
      python-version:
        required: false
        type: string
        default: '3.12'
      working-directory:
        required: true
        type: string
      coverage-threshold:
        required: false
        type: number
        default: 80

jobs:
  test:
    runs-on: ubuntu-latest
    defaults:
      run:
        working-directory: ${{ inputs.working-directory }}

    steps:
      - uses: actions/checkout@v4

      - uses: astral-sh/setup-uv@v4

      - run: uv sync

      - run: uv run ruff check .

      - run: uv run pytest --cov --cov-fail-under=${{ inputs.coverage-threshold }}
```
## GitLab CI

```yaml
# .gitlab-ci.yml
stages:
  - test
  - build
  - deploy

variables:
  REGISTRY: registry.gitlab.com
  IMAGE_NAME: $CI_PROJECT_PATH

.python-base:
  image: python:3.12
  before_script:
    - pip install uv
    - uv sync

.node-base:
  image: node:22
  before_script:
    - npm ci

test:backend:
  extends: .python-base
  stage: test
  script:
    - uv run ruff check apps/backend/
    - uv run pytest apps/backend/ --cov --cov-fail-under=80
  rules:
    - changes:
        - apps/backend/**

test:frontend:
  extends: .node-base
  stage: test
  script:
    - npm run lint --workspace=frontend
    - npm run test --workspace=frontend -- --coverage
  rules:
    - changes:
        - apps/frontend/**

build:
  stage: build
  image: docker:24
  services:
    - docker:24-dind
  script:
    - docker login -u $CI_REGISTRY_USER -p $CI_REGISTRY_PASSWORD $CI_REGISTRY
    - docker build -t $REGISTRY/$IMAGE_NAME:$CI_COMMIT_SHA .
    - docker push $REGISTRY/$IMAGE_NAME:$CI_COMMIT_SHA
  rules:
    - if: $CI_COMMIT_BRANCH == "main"

deploy:staging:
  stage: deploy
  script:
    - kubectl apply -f k8s/staging/
  environment:
    name: staging
  rules:
    - if: $CI_COMMIT_BRANCH == "main"

deploy:production:
  stage: deploy
  script:
    - kubectl apply -f k8s/production/
  environment:
    name: production
  rules:
    - if: $CI_COMMIT_BRANCH == "main"
      when: manual
```
## Best Practices

### Pipeline Design Principles

1. **Fail Fast** - Run quick checks (lint, type check) before slow ones (tests)
2. **Parallelize** - Run independent jobs concurrently
3. **Cache** - Cache dependencies between runs
4. **Change Detection** - Only run what's affected
5. **Immutable Artifacts** - Tag images with commit SHA
6. **Environment Parity** - Same process for all environments
7. **Secrets Management** - Never hardcode, use CI/CD secrets
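Change detection is what the `dorny/paths-filter`, `changeset`, and `rules: changes` blocks above implement; at its core it is just glob matching over the changed file paths. A minimal stdlib sketch (the job names and patterns are illustrative):

```python
from fnmatch import fnmatch

# Illustrative mapping of CI jobs to the path globs that should trigger them.
JOB_FILTERS = {
    "backend": ["apps/backend/*", "packages/shared/*"],
    "frontend": ["apps/frontend/*", "packages/shared/*"],
    "infrastructure": ["infrastructure/*"],
}


def jobs_to_run(changed_files: list[str]) -> set[str]:
    """Return the jobs whose filters match at least one changed file.

    Note: fnmatch's `*` matches across `/`, so `apps/backend/*` also
    covers nested paths like apps/backend/src/main.py.
    """
    return {
        job
        for job, patterns in JOB_FILTERS.items()
        if any(fnmatch(path, p) for path in changed_files for p in patterns)
    }


print(sorted(jobs_to_run(["packages/shared/types.ts"])))  # ['backend', 'frontend']
```

A change in a shared package fans out to every consumer, while an unmatched file (docs, README) triggers nothing.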
### Quality Gates

```yaml
# Minimum checks before merge
- Lint passes
- Type check passes
- Unit tests pass
- Coverage threshold met (80%+)
- Security scan passes
- No secrets detected
```
### Deployment Strategies

```yaml
# Rolling update (default)
strategy:
  type: RollingUpdate
  rollingUpdate:
    maxSurge: 1
    maxUnavailable: 0

# Blue-green (via service switch)
# Deploy new version alongside old
# Switch service selector when ready

# Canary (gradual rollout)
# Route percentage of traffic to new version
# Monitor metrics before full rollout
```
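The canary comments above can be made concrete as a stepped schedule: promote traffic in stages and gate each promotion on an observed error rate. A sketch of the promotion logic (the steps and threshold are illustrative assumptions, not part of this skill):

```python
CANARY_STEPS = [5, 25, 50, 100]  # % of traffic routed to the new version


def next_traffic_share(current: int, error_rate: float,
                       max_error_rate: float = 0.01) -> int:
    """Return the next canary traffic percentage, or 0 to roll back.

    Promotes to the next step only while the observed error rate stays
    under the gate; any breach aborts the rollout entirely.
    """
    if error_rate > max_error_rate:
        return 0  # roll back: route all traffic to the stable version
    for step in CANARY_STEPS:
        if step > current:
            return step
    return 100  # already fully rolled out


print(next_traffic_share(5, error_rate=0.002))  # 25
print(next_traffic_share(50, error_rate=0.05))  # 0 (rolled back)
```

In practice the percentages map onto weighted Services, an ingress annotation, or a service mesh traffic split, and the error rate comes from the metrics backend.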
.claude/skills/infrastructure/database/SKILL.md (new file, 510 lines)
---
name: database-patterns
description: Database design patterns, migrations with Alembic/Prisma, and query optimization. Use when working with SQL/NoSQL databases or schema migrations.
---

# Database Patterns Skill

## Schema Migrations

### Alembic (Python/SQLAlchemy)

#### Setup
```bash
# Initialize Alembic
alembic init alembic

# Configure alembic.ini
sqlalchemy.url = postgresql://user:pass@localhost/myapp
```

#### alembic/env.py Configuration
```python
from logging.config import fileConfig
from sqlalchemy import engine_from_config, pool
from alembic import context
import os

# Import your models
from app.models import Base

config = context.config

# Set up Python logging from the config file (standard Alembic template)
if config.config_file_name is not None:
    fileConfig(config.config_file_name)

# Override with environment variable
config.set_main_option(
    "sqlalchemy.url",
    os.environ.get("DATABASE_URL", config.get_main_option("sqlalchemy.url"))
)

target_metadata = Base.metadata


def run_migrations_offline() -> None:
    """Run migrations in 'offline' mode."""
    url = config.get_main_option("sqlalchemy.url")
    context.configure(
        url=url,
        target_metadata=target_metadata,
        literal_binds=True,
        dialect_opts={"paramstyle": "named"},
    )

    with context.begin_transaction():
        context.run_migrations()


def run_migrations_online() -> None:
    """Run migrations in 'online' mode."""
    connectable = engine_from_config(
        config.get_section(config.config_ini_section, {}),
        prefix="sqlalchemy.",
        poolclass=pool.NullPool,
    )

    with connectable.connect() as connection:
        context.configure(
            connection=connection,
            target_metadata=target_metadata,
        )

        with context.begin_transaction():
            context.run_migrations()


if context.is_offline_mode():
    run_migrations_offline()
else:
    run_migrations_online()
```

#### Migration Commands
```bash
# Create migration from model changes
alembic revision --autogenerate -m "add users table"

# Create empty migration
alembic revision -m "add custom index"

# Apply migrations
alembic upgrade head

# Rollback one migration
alembic downgrade -1

# Rollback to specific revision
alembic downgrade abc123

# Show current revision
alembic current

# Show migration history
alembic history --verbose
```

#### Migration Best Practices
```python
# alembic/versions/001_add_users_table.py
"""Add users table

Revision ID: abc123
Revises:
Create Date: 2024-01-15 10:00:00.000000

"""
from typing import Sequence, Union
from alembic import op
import sqlalchemy as sa

revision: str = 'abc123'
down_revision: Union[str, None] = None
branch_labels: Union[str, Sequence[str], None] = None
depends_on: Union[str, Sequence[str], None] = None


def upgrade() -> None:
    op.create_table(
        'users',
        sa.Column('id', sa.UUID(), nullable=False),
        sa.Column('email', sa.String(255), nullable=False),
        sa.Column('name', sa.String(100), nullable=False),
        sa.Column('created_at', sa.DateTime(timezone=True), server_default=sa.func.now()),
        sa.Column('updated_at', sa.DateTime(timezone=True), onupdate=sa.func.now()),
        sa.PrimaryKeyConstraint('id'),
    )
    # Create index separately for clarity
    op.create_index('ix_users_email', 'users', ['email'], unique=True)


def downgrade() -> None:
    op.drop_index('ix_users_email', table_name='users')
    op.drop_table('users')
```

#### Data Migrations
```python
"""Backfill user full names

Revision ID: def456
"""
from alembic import op
import sqlalchemy as sa
from sqlalchemy.sql import table, column


def upgrade() -> None:
    # Define a lightweight table construct for the data migration
    users = table('users',
        column('id', sa.UUID),
        column('first_name', sa.String),
        column('last_name', sa.String),
        column('full_name', sa.String),
    )

    # Single bulk UPDATE; for very large tables, update in batches instead
    connection = op.get_bind()
    connection.execute(
        users.update().values(
            full_name=users.c.first_name + ' ' + users.c.last_name
        )
    )


def downgrade() -> None:
    # Data migrations typically aren't reversible
    pass
```
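
When a table is too large for one UPDATE, the backfill is usually chunked by primary key so each transaction stays short and locks stay cheap. A sketch of just the chunking logic in plain Python (no database; the real loop would run one UPDATE per batch of ids):

```python
def backfill_in_batches(ids: list[int], batch_size: int = 1000) -> list[list[int]]:
    """Split the full set of row ids into fixed-size batches, preserving order."""
    return [ids[i:i + batch_size] for i in range(0, len(ids), batch_size)]


batches = backfill_in_batches(list(range(2500)), batch_size=1000)
print([len(b) for b in batches])  # [1000, 1000, 500]
```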

### Prisma (TypeScript)

#### Schema Definition
```prisma
// prisma/schema.prisma
datasource db {
  provider = "postgresql"
  url      = env("DATABASE_URL")
}

generator client {
  provider = "prisma-client-js"
}

model User {
  id        String   @id @default(uuid())
  email     String   @unique
  name      String
  role      Role     @default(USER)
  posts     Post[]
  createdAt DateTime @default(now()) @map("created_at")
  updatedAt DateTime @updatedAt @map("updated_at")

  @@map("users")
  @@index([email])
}

model Post {
  id        String   @id @default(uuid())
  title     String
  content   String?
  published Boolean  @default(false)
  author    User     @relation(fields: [authorId], references: [id])
  authorId  String   @map("author_id")
  createdAt DateTime @default(now()) @map("created_at")

  @@map("posts")
  @@index([authorId])
}

enum Role {
  USER
  ADMIN
}
```

#### Migration Commands
```bash
# Create migration from schema changes
npx prisma migrate dev --name add_users_table

# Apply migrations in production
npx prisma migrate deploy

# Reset database (development only)
npx prisma migrate reset

# Generate client
npx prisma generate

# View database
npx prisma studio
```

## SQLAlchemy 2.0 Patterns

### Model Definition
```python
from datetime import datetime
from typing import Optional
from uuid import UUID, uuid4
from sqlalchemy import String, ForeignKey, func
from sqlalchemy.orm import DeclarativeBase, Mapped, mapped_column, relationship


class Base(DeclarativeBase):
    pass


class User(Base):
    __tablename__ = "users"

    id: Mapped[UUID] = mapped_column(primary_key=True, default=uuid4)
    email: Mapped[str] = mapped_column(String(255), unique=True, index=True)
    name: Mapped[str] = mapped_column(String(100))
    role: Mapped[str] = mapped_column(String(20), default="user")
    created_at: Mapped[datetime] = mapped_column(server_default=func.now())
    updated_at: Mapped[Optional[datetime]] = mapped_column(onupdate=func.now())

    # Relationships
    orders: Mapped[list["Order"]] = relationship(back_populates="user")


class Order(Base):
    __tablename__ = "orders"

    id: Mapped[UUID] = mapped_column(primary_key=True, default=uuid4)
    user_id: Mapped[UUID] = mapped_column(ForeignKey("users.id"))
    total: Mapped[int]  # Store as cents
    status: Mapped[str] = mapped_column(String(20), default="pending")
    created_at: Mapped[datetime] = mapped_column(server_default=func.now())

    # Relationships
    user: Mapped["User"] = relationship(back_populates="orders")
    items: Mapped[list["OrderItem"]] = relationship(back_populates="order")
```

### Async Repository Pattern
```python
from uuid import UUID

from sqlalchemy import select
from sqlalchemy.ext.asyncio import AsyncSession
from sqlalchemy.orm import selectinload


class UserRepository:
    def __init__(self, session: AsyncSession):
        self.session = session

    async def get_by_id(self, user_id: UUID) -> User | None:
        result = await self.session.execute(
            select(User).where(User.id == user_id)
        )
        return result.scalar_one_or_none()

    async def get_by_email(self, email: str) -> User | None:
        result = await self.session.execute(
            select(User).where(User.email == email)
        )
        return result.scalar_one_or_none()

    async def list_with_orders(
        self,
        limit: int = 20,
        offset: int = 0
    ) -> list[User]:
        result = await self.session.execute(
            select(User)
            .options(selectinload(User.orders))
            .limit(limit)
            .offset(offset)
        )
        return list(result.scalars().all())

    async def create(self, user: User) -> User:
        self.session.add(user)
        await self.session.flush()
        return user

    async def update(self, user: User) -> User:
        await self.session.flush()
        return user

    async def delete(self, user: User) -> None:
        await self.session.delete(user)
        await self.session.flush()
```

## Query Optimization

### Indexing Strategies
```sql
-- Primary lookup patterns
CREATE INDEX idx_users_email ON users(email);
CREATE INDEX idx_orders_user_id ON orders(user_id);

-- Composite indexes (order matters!)
CREATE INDEX idx_orders_user_status ON orders(user_id, status);

-- Partial indexes
CREATE INDEX idx_orders_pending ON orders(user_id) WHERE status = 'pending';

-- Covering indexes
CREATE INDEX idx_users_email_name ON users(email) INCLUDE (name);
```

### N+1 Query Prevention
```python
# BAD - N+1 queries
users = await session.execute(select(User))
for user in users.scalars():
    print(user.orders)  # Each access triggers a query!

# GOOD - Eager loading
from sqlalchemy.orm import selectinload, joinedload

# Use selectinload for collections
users = await session.execute(
    select(User).options(selectinload(User.orders))
)

# Use joinedload for single relations
orders = await session.execute(
    select(Order).options(joinedload(Order.user))
)
```

### Pagination
```python
from sqlalchemy import select, func


async def paginate_users(
    session: AsyncSession,
    page: int = 1,
    page_size: int = 20,
) -> dict:
    # Count total
    count_query = select(func.count()).select_from(User)
    total = (await session.execute(count_query)).scalar_one()

    # Fetch page
    offset = (page - 1) * page_size
    query = select(User).limit(page_size).offset(offset).order_by(User.created_at.desc())
    result = await session.execute(query)
    users = list(result.scalars().all())

    return {
        "items": users,
        "total": total,
        "page": page,
        "page_size": page_size,
        "has_more": offset + len(users) < total,
    }
```

## NoSQL Patterns (MongoDB)

### Document Design
```python
from pydantic import BaseModel, Field
from datetime import datetime
from bson import ObjectId


class PyObjectId(ObjectId):
    # Pydantic v1-style custom type hook; Pydantic v2 uses Annotated types instead
    @classmethod
    def __get_validators__(cls):
        yield cls.validate

    @classmethod
    def validate(cls, v):
        if not ObjectId.is_valid(v):
            raise ValueError("Invalid ObjectId")
        return ObjectId(v)


class UserDocument(BaseModel):
    id: PyObjectId = Field(default_factory=PyObjectId, alias="_id")
    email: str
    name: str
    # Embed frequently accessed data
    profile: dict = {}
    # Reference for large/changing data
    order_ids: list[str] = []
    created_at: datetime = Field(default_factory=datetime.utcnow)

    class Config:
        # Pydantic v1 option name; in v2 this is `populate_by_name` in `model_config`
        allow_population_by_field_name = True
        json_encoders = {ObjectId: str}
```

### MongoDB with Motor (Async)
```python
from bson import ObjectId
from motor.motor_asyncio import AsyncIOMotorClient


class MongoUserRepository:
    def __init__(self, client: AsyncIOMotorClient, db_name: str):
        self.collection = client[db_name].users

    async def get_by_id(self, user_id: str) -> dict | None:
        return await self.collection.find_one({"_id": ObjectId(user_id)})

    async def create(self, user: UserDocument) -> str:
        result = await self.collection.insert_one(user.dict(by_alias=True))
        return str(result.inserted_id)

    async def find_by_email_domain(self, domain: str) -> list[dict]:
        cursor = self.collection.find(
            {"email": {"$regex": f"@{domain}$"}},
            {"email": 1, "name": 1}  # Projection
        ).limit(100)
        return await cursor.to_list(length=100)
```

## Migration Safety

### Zero-Downtime Migration Pattern

```python
# Step 1: Add new column (nullable)
def upgrade_step1():
    op.add_column('users', sa.Column('full_name', sa.String(200), nullable=True))

# Step 2: Backfill data (separate deployment)
def upgrade_step2():
    # Run as background job, not in migration
    pass

# Step 3: Make column required (after backfill complete)
def upgrade_step3():
    op.alter_column('users', 'full_name', nullable=False)

# Step 4: Remove old columns (after app updated)
def upgrade_step4():
    op.drop_column('users', 'first_name')
    op.drop_column('users', 'last_name')
```

### Pre-Migration Checklist

- [ ] Backup database before migration
- [ ] Test migration on copy of production data
- [ ] Check migration doesn't lock tables for too long
- [ ] Ensure rollback script works
- [ ] Plan for zero-downtime if needed
- [ ] Coordinate with application deployments

## Commands

```bash
# Alembic
alembic upgrade head          # Apply all migrations
alembic downgrade -1          # Rollback one
alembic history               # Show history
alembic current               # Show current version

# Prisma
npx prisma migrate dev        # Development migration
npx prisma migrate deploy     # Production migration
npx prisma db push            # Push schema without migration

# PostgreSQL
pg_dump -Fc mydb > backup.dump    # Backup
pg_restore -d mydb backup.dump    # Restore
psql -d mydb -f migration.sql     # Run SQL file
```
459
.claude/skills/infrastructure/docker-kubernetes/SKILL.md
Normal file
@@ -0,0 +1,459 @@
---
name: docker-kubernetes
description: Docker containerization and Kubernetes orchestration patterns. Use when building containers, writing Dockerfiles, or deploying to Kubernetes.
---

# Docker & Kubernetes Skill

## Dockerfile Best Practices

### Multi-Stage Build (Python)
```dockerfile
# Build stage
FROM python:3.12-slim AS builder

WORKDIR /app

# Install build dependencies
RUN apt-get update && apt-get install -y --no-install-recommends \
    build-essential \
    && rm -rf /var/lib/apt/lists/*

# Install Python dependencies
COPY pyproject.toml uv.lock ./
RUN pip install uv && uv sync --frozen --no-dev

# Production stage
FROM python:3.12-slim AS production

WORKDIR /app

# Create non-root user
RUN groupadd -r appuser && useradd -r -g appuser appuser

# Copy only necessary files from builder
COPY --from=builder /app/.venv /app/.venv
COPY src/ ./src/

# Set environment
ENV PATH="/app/.venv/bin:$PATH"
ENV PYTHONUNBUFFERED=1
ENV PYTHONDONTWRITEBYTECODE=1

# Switch to non-root user
USER appuser

EXPOSE 8000

HEALTHCHECK --interval=30s --timeout=3s --start-period=5s --retries=3 \
    CMD python -c "import urllib.request; urllib.request.urlopen('http://localhost:8000/health')"

CMD ["uvicorn", "src.main:app", "--host", "0.0.0.0", "--port", "8000"]
```

### Multi-Stage Build (Node.js)
```dockerfile
# Build stage
FROM node:22-alpine AS builder

WORKDIR /app

# Install dependencies first (layer caching)
COPY package*.json ./
RUN npm ci

# Build application
COPY . .
RUN npm run build

# Production stage
FROM node:22-alpine AS production

WORKDIR /app

# Create non-root user
RUN addgroup -S appuser && adduser -S appuser -G appuser

# Copy only production dependencies and built files
COPY --from=builder /app/package*.json ./
RUN npm ci --omit=dev && npm cache clean --force

COPY --from=builder /app/dist ./dist

USER appuser

EXPOSE 3000

HEALTHCHECK --interval=30s --timeout=3s --start-period=5s --retries=3 \
    CMD wget --no-verbose --tries=1 --spider http://localhost:3000/health || exit 1

CMD ["node", "dist/main.js"]
```

### Multi-Stage Build (Rust)
```dockerfile
# Build stage
FROM rust:1.75-slim AS builder

WORKDIR /app

# Create a dummy project for dependency caching
RUN cargo new --bin app
WORKDIR /app/app

# Copy manifests and build dependencies
COPY Cargo.toml Cargo.lock ./
RUN cargo build --release && rm -rf src

# Copy source and build
COPY src ./src
RUN touch src/main.rs && cargo build --release

# Production stage
FROM debian:bookworm-slim AS production

# Install runtime dependencies
RUN apt-get update && apt-get install -y --no-install-recommends \
    ca-certificates \
    && rm -rf /var/lib/apt/lists/*

# Create non-root user
RUN groupadd -r appuser && useradd -r -g appuser appuser

COPY --from=builder /app/app/target/release/app /usr/local/bin/app

USER appuser

EXPOSE 8080

HEALTHCHECK --interval=30s --timeout=3s --start-period=5s --retries=3 \
    CMD ["/usr/local/bin/app", "health"]

CMD ["/usr/local/bin/app"]
```

## Docker Compose

### Development Setup
```yaml
# docker-compose.yml
services:
  app:
    build:
      context: .
      target: development  # assumes the Dockerfile defines a `development` stage
    ports:
      - "8000:8000"
    volumes:
      - .:/app
      - /app/.venv  # Exclude venv from mount
    environment:
      - DATABASE_URL=postgresql://user:pass@db:5432/myapp
      - REDIS_URL=redis://redis:6379/0
      - DEBUG=true
    depends_on:
      db:
        condition: service_healthy
      redis:
        condition: service_started

  db:
    image: postgres:16-alpine
    environment:
      POSTGRES_USER: user
      POSTGRES_PASSWORD: pass
      POSTGRES_DB: myapp
    volumes:
      - postgres_data:/var/lib/postgresql/data
    healthcheck:
      test: ["CMD-SHELL", "pg_isready -U user -d myapp"]
      interval: 5s
      timeout: 5s
      retries: 5

  redis:
    image: redis:7-alpine
    volumes:
      - redis_data:/data

volumes:
  postgres_data:
  redis_data:
```

### Production Setup
```yaml
# docker-compose.prod.yml
services:
  app:
    image: ${REGISTRY}/myapp:${VERSION}
    deploy:
      replicas: 3
      resources:
        limits:
          cpus: '0.5'
          memory: 512M
        reservations:
          cpus: '0.25'
          memory: 256M
      restart_policy:
        condition: on-failure
        delay: 5s
        max_attempts: 3
    environment:
      - DATABASE_URL_FILE=/run/secrets/db_url
    secrets:
      - db_url
    healthcheck:
      test: ["CMD", "curl", "-f", "http://localhost:8000/health"]
      interval: 30s
      timeout: 10s
      retries: 3
      start_period: 40s

secrets:
  db_url:
    external: true
```

## Kubernetes Manifests

### Deployment
```yaml
# k8s/deployment.yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: myapp
  labels:
    app: myapp
spec:
  replicas: 3
  selector:
    matchLabels:
      app: myapp
  template:
    metadata:
      labels:
        app: myapp
    spec:
      securityContext:
        runAsNonRoot: true
        runAsUser: 1000
        fsGroup: 1000
      containers:
        - name: myapp
          image: myregistry/myapp:v1.0.0
          ports:
            - containerPort: 8000
          resources:
            requests:
              cpu: "100m"
              memory: "128Mi"
            limits:
              cpu: "500m"
              memory: "512Mi"
          env:
            - name: DATABASE_URL
              valueFrom:
                secretKeyRef:
                  name: myapp-secrets
                  key: database-url
          livenessProbe:
            httpGet:
              path: /health
              port: 8000
            initialDelaySeconds: 10
            periodSeconds: 10
          readinessProbe:
            httpGet:
              path: /ready
              port: 8000
            initialDelaySeconds: 5
            periodSeconds: 5
          securityContext:
            allowPrivilegeEscalation: false
            readOnlyRootFilesystem: true
            capabilities:
              drop:
                - ALL
```

### Service
```yaml
# k8s/service.yaml
apiVersion: v1
kind: Service
metadata:
  name: myapp
spec:
  selector:
    app: myapp
  ports:
    - port: 80
      targetPort: 8000
  type: ClusterIP
---
apiVersion: networking.k8s.io/v1
kind: Ingress
metadata:
  name: myapp
  annotations:
    cert-manager.io/cluster-issuer: letsencrypt-prod
spec:
  ingressClassName: nginx
  tls:
    - hosts:
        - myapp.example.com
      secretName: myapp-tls
  rules:
    - host: myapp.example.com
      http:
        paths:
          - path: /
            pathType: Prefix
            backend:
              service:
                name: myapp
                port:
                  number: 80
```

### ConfigMap and Secrets
```yaml
# k8s/config.yaml
apiVersion: v1
kind: ConfigMap
metadata:
  name: myapp-config
data:
  LOG_LEVEL: "info"
  CACHE_TTL: "3600"
---
apiVersion: v1
kind: Secret
metadata:
  name: myapp-secrets
type: Opaque
stringData:
  database-url: postgresql://user:pass@db:5432/myapp  # Use sealed-secrets in production
```

### Horizontal Pod Autoscaler
```yaml
# k8s/hpa.yaml
apiVersion: autoscaling/v2
kind: HorizontalPodAutoscaler
metadata:
  name: myapp
spec:
  scaleTargetRef:
    apiVersion: apps/v1
    kind: Deployment
    name: myapp
  minReplicas: 2
  maxReplicas: 10
  metrics:
    - type: Resource
      resource:
        name: cpu
        target:
          type: Utilization
          averageUtilization: 70
    - type: Resource
      resource:
        name: memory
        target:
          type: Utilization
          averageUtilization: 80
```

## Helm Chart Structure

```
myapp-chart/
├── Chart.yaml
├── values.yaml
├── values-prod.yaml
├── templates/
│   ├── _helpers.tpl
│   ├── deployment.yaml
│   ├── service.yaml
│   ├── ingress.yaml
│   ├── configmap.yaml
│   ├── secret.yaml
│   └── hpa.yaml
└── charts/
```

### values.yaml
```yaml
replicaCount: 2

image:
  repository: myregistry/myapp
  tag: latest
  pullPolicy: IfNotPresent

service:
  type: ClusterIP
  port: 80

ingress:
  enabled: true
  host: myapp.example.com
  tls: true

resources:
  requests:
    cpu: 100m
    memory: 128Mi
  limits:
    cpu: 500m
    memory: 512Mi

autoscaling:
  enabled: true
  minReplicas: 2
  maxReplicas: 10
  targetCPUUtilization: 70
```
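
Templates consume these values through `.Values`; a minimal fragment of a `templates/deployment.yaml` showing the mapping (a sketch, not a complete manifest):

```yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: {{ .Release.Name }}
spec:
  replicas: {{ .Values.replicaCount }}
  template:
    spec:
      containers:
        - name: {{ .Chart.Name }}
          image: "{{ .Values.image.repository }}:{{ .Values.image.tag }}"
          imagePullPolicy: {{ .Values.image.pullPolicy }}
          resources:
            {{- toYaml .Values.resources | nindent 12 }}
```

Overriding `values-prod.yaml` at install time swaps replica counts, image tags, and resource limits without touching the template.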

## Commands

```bash
# Docker
docker build -t myapp:latest .
docker build --target development -t myapp:dev .
docker run -p 8000:8000 myapp:latest
docker compose up -d
docker compose logs -f app
docker compose down -v

# Kubernetes
kubectl apply -f k8s/
kubectl get pods -l app=myapp
kubectl logs -f deployment/myapp
kubectl rollout status deployment/myapp
kubectl rollout undo deployment/myapp
kubectl port-forward svc/myapp 8000:80

# Helm
helm install myapp ./myapp-chart
helm upgrade myapp ./myapp-chart -f values-prod.yaml
helm rollback myapp 1
helm uninstall myapp
```

## Security Checklist

- [ ] Run as non-root user
- [ ] Use multi-stage builds (minimal final image)
- [ ] Pin base image versions
- [ ] Scan images for vulnerabilities (Trivy, Snyk)
- [ ] No secrets in images or environment variables
- [ ] Read-only root filesystem where possible
- [ ] Drop all capabilities, add only needed ones
- [ ] Set resource limits
- [ ] Use network policies
- [ ] Enable Pod Security Standards
464
.claude/skills/infrastructure/gcp/SKILL.md
Normal file
@@ -0,0 +1,464 @@
|
|||||||
|
---
|
||||||
|
name: gcp-services
|
||||||
|
description: Google Cloud Platform service patterns, IAM best practices, and common architectures. Use when designing or implementing GCP infrastructure.
|
||||||
|
---
|
||||||
|
|
||||||
|
# GCP Services Skill
|
||||||
|
|
||||||
|
## Common Architecture Patterns
|
||||||
|
|
||||||
|
### Web Application (Cloud Run + Cloud SQL)
|
||||||
|
```
|
||||||
|
┌─────────────────────────────────────────────────────────────┐
|
||||||
|
│ VPC │
|
||||||
|
│ ┌─────────────────────────────────────────────────────────┐│
|
||||||
|
│ │ Public Subnet ││
|
||||||
|
│ │ ┌─────────────┐ ┌─────────────┐ ││
|
||||||
|
│ │ │ Cloud │ │ Cloud │ ││
|
||||||
|
│ │ │ Load │ │ NAT │ ││
|
||||||
|
│ │ │ Balancing │ │ │ ││
|
||||||
|
│ │ └──────┬──────┘ └──────┬──────┘ ││
|
||||||
|
│ └─────────┼───────────────────────────────────┼───────────┘│
|
||||||
|
│ │ │ │
|
||||||
|
│ ┌─────────┼───────────────────────────────────┼───────────┐│
|
||||||
|
│ │ │ Private Subnet │ ││
|
||||||
|
│ │ ┌──────▼──────┐ ┌───────▼─────┐ ││
|
||||||
|
│ │ │ Cloud Run │ │ Cloud SQL │ ││
|
||||||
|
│ │ │ (Service) │───────────────────▶│ PostgreSQL │ ││
|
||||||
|
│ │ └─────────────┘ └─────────────┘ ││
|
||||||
|
│ └─────────────────────────────────────────────────────────┘│
|
||||||
|
└─────────────────────────────────────────────────────────────┘
|
||||||
|
```

### Serverless (Cloud Functions + API Gateway)

```
┌────────────┐     ┌─────────────┐     ┌─────────────┐
│   Cloud    │────▶│     API     │────▶│    Cloud    │
│    CDN     │     │   Gateway   │     │  Functions  │
└────────────┘     └─────────────┘     └──────┬──────┘
                                              │
                  ┌───────────────────────────┼─────────────┐
                  │                           │             │
           ┌──────▼─────┐               ┌─────▼─────┐  ┌────▼────┐
           │ Firestore  │               │   Cloud   │  │ Secret  │
           └────────────┘               │  Storage  │  │ Manager │
                                        └───────────┘  └─────────┘
```

## IAM Best Practices

### Service Account with Least Privilege

```hcl
# Service account for Cloud Run service
resource "google_service_account" "app_sa" {
  account_id   = "my-app-service"
  display_name = "My App Service Account"
}

# Grant specific permissions
resource "google_project_iam_member" "app_storage" {
  project = var.project_id
  role    = "roles/storage.objectViewer"
  member  = "serviceAccount:${google_service_account.app_sa.email}"
}

resource "google_project_iam_member" "app_secrets" {
  project = var.project_id
  role    = "roles/secretmanager.secretAccessor"
  member  = "serviceAccount:${google_service_account.app_sa.email}"
}
```

### Workload Identity Federation

```python
# Python - google-auth with workload identity
from google.auth import default
from google.cloud import storage

# Automatically uses workload identity when on GKE/Cloud Run
credentials, project = default()

# Access Cloud Storage
storage_client = storage.Client(credentials=credentials, project=project)
bucket = storage_client.bucket("my-bucket")
```

## Secret Manager Patterns

### Accessing Secrets

```python
from google.cloud import secretmanager


def get_secret(project_id: str, secret_id: str, version: str = "latest") -> str:
    """Access a secret from Secret Manager."""
    client = secretmanager.SecretManagerServiceClient()
    name = f"projects/{project_id}/secrets/{secret_id}/versions/{version}"
    response = client.access_secret_version(request={"name": name})
    return response.payload.data.decode("UTF-8")


# Usage
db_password = get_secret("my-project", "database-password")
```

```typescript
// TypeScript - @google-cloud/secret-manager
import { SecretManagerServiceClient } from "@google-cloud/secret-manager";

async function getSecret(
  projectId: string,
  secretId: string,
  version: string = "latest"
): Promise<string> {
  const client = new SecretManagerServiceClient();
  const name = `projects/${projectId}/secrets/${secretId}/versions/${version}`;

  const [response] = await client.accessSecretVersion({ name });
  return response.payload?.data?.toString() || "";
}
```
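Each of these lookups is a network round trip, so services often cache secret values in process. A minimal sketch of that pattern, assuming a `fetch_secret` callable that wraps the `get_secret` helper above (the factory itself is hypothetical, not part of the Secret Manager client libraries):

```python
from functools import lru_cache
from typing import Callable


def make_cached_getter(fetch_secret: Callable[[str], str]) -> Callable[[str], str]:
    """Wrap a secret-fetching callable so repeated lookups hit an in-process cache.

    fetch_secret maps a secret ID to its value, e.g. a thin wrapper around
    get_secret() above. Hypothetical helper for illustration only.
    """

    @lru_cache(maxsize=128)
    def cached(secret_id: str) -> str:
        return fetch_secret(secret_id)

    return cached
```

Note the trade-off: a plain `lru_cache` never expires, so a rotated secret is not picked up until the instance restarts; a TTL-based cache is safer when rotation matters.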

### Cloud Run with Secret References

```yaml
# Cloud Run service with secret environment variables
apiVersion: serving.knative.dev/v1
kind: Service
metadata:
  name: my-service
spec:
  template:
    spec:
      containers:
        - image: gcr.io/my-project/my-app
          env:
            - name: DATABASE_PASSWORD
              valueFrom:
                secretKeyRef:
                  name: database-password
                  key: latest
```

## Cloud Storage Patterns

### Signed URLs

```python
from datetime import timedelta

from google.cloud import storage


def generate_signed_url(
    bucket_name: str,
    blob_name: str,
    expiration_minutes: int = 60,
) -> str:
    """Generate a signed URL for downloading a blob."""
    storage_client = storage.Client()
    bucket = storage_client.bucket(bucket_name)
    blob = bucket.blob(blob_name)

    url = blob.generate_signed_url(
        version="v4",
        expiration=timedelta(minutes=expiration_minutes),
        method="GET",
    )

    return url
```

### Upload with Resumable Upload

```python
from google.cloud import storage


def upload_large_file(
    bucket_name: str,
    source_file: str,
    destination_blob: str,
) -> str:
    """Upload a large file using resumable upload."""
    storage_client = storage.Client()
    bucket = storage_client.bucket(bucket_name)
    blob = bucket.blob(destination_blob)

    # Above the multipart size threshold (8 MiB by default), the client
    # library switches to a resumable upload automatically
    blob.upload_from_filename(source_file)

    return f"gs://{bucket_name}/{destination_blob}"
```

## Firestore Patterns

### Async Operations

```python
from google.cloud.firestore_v1.async_client import AsyncClient


async def get_firestore_client() -> AsyncClient:
    """Create an async Firestore client."""
    return AsyncClient()


async def get_user(user_id: str) -> dict | None:
    """Get a user document from Firestore."""
    db = await get_firestore_client()
    doc_ref = db.collection("users").document(user_id)
    doc = await doc_ref.get()

    if doc.exists:
        return doc.to_dict()
    return None


async def query_users_by_status(status: str) -> list[dict]:
    """Query users by status."""
    db = await get_firestore_client()
    query = db.collection("users").where("status", "==", status)

    docs = await query.get()
    return [doc.to_dict() for doc in docs]
```

### Transaction Pattern

```python
from google.cloud import firestore


def transfer_credits(
    from_user_id: str,
    to_user_id: str,
    amount: int,
) -> bool:
    """Transfer credits between users atomically."""
    db = firestore.Client()

    @firestore.transactional
    def update_in_transaction(transaction):
        from_ref = db.collection("users").document(from_user_id)
        to_ref = db.collection("users").document(to_user_id)

        from_snapshot = from_ref.get(transaction=transaction)
        to_snapshot = to_ref.get(transaction=transaction)

        from_credits = from_snapshot.get("credits")

        if from_credits < amount:
            raise ValueError("Insufficient credits")

        transaction.update(from_ref, {"credits": from_credits - amount})
        transaction.update(to_ref, {"credits": to_snapshot.get("credits") + amount})

        return True

    transaction = db.transaction()
    return update_in_transaction(transaction)
```

## Cloud Functions Patterns

### HTTP Function with Validation

```python
import functions_framework
from flask import Request, jsonify
from pydantic import BaseModel, ValidationError


class CreateOrderRequest(BaseModel):
    customer_id: str
    items: list[dict]
    total: float


@functions_framework.http
def create_order(request: Request):
    """HTTP Cloud Function for creating orders."""
    try:
        body = request.get_json(silent=True)

        if not body:
            return jsonify({"error": "Request body required"}), 400

        order_request = CreateOrderRequest(**body)

        # Process order...
        result = process_order(order_request)

        return jsonify(result.dict()), 201

    except ValidationError as e:
        return jsonify({"error": e.errors()}), 400
    except Exception as e:
        print(f"Error: {e}")
        return jsonify({"error": "Internal server error"}), 500
```

### Pub/Sub Triggered Function

```python
import base64

import functions_framework
from cloudevents.http import CloudEvent


@functions_framework.cloud_event
def process_message(cloud_event: CloudEvent):
    """Process a Pub/Sub message."""
    # Decode the base64-encoded Pub/Sub message payload
    data = base64.b64decode(cloud_event.data["message"]["data"]).decode()

    print(f"Received message: {data}")

    # Process the message
    process_event(data)
```

## Cloud Run Patterns

### Service with Health Checks

```python
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()


class HealthResponse(BaseModel):
    status: str
    version: str


@app.get("/health", response_model=HealthResponse)
async def health_check():
    """Health check endpoint for Cloud Run."""
    return HealthResponse(status="healthy", version="1.0.0")


@app.get("/ready")
async def readiness_check():
    """Readiness check - verify dependencies."""
    # Check database connection, etc.
    await check_database_connection()
    return {"status": "ready"}
```

### Cloud Run Job

```python
# main.py for a Cloud Run Job
import os


def main():
    """Main entry point for the Cloud Run Job."""
    task_index = int(os.environ.get("CLOUD_RUN_TASK_INDEX", 0))
    task_count = int(os.environ.get("CLOUD_RUN_TASK_COUNT", 1))

    print(f"Processing task {task_index} of {task_count}")

    # Process batch based on task index
    process_batch(task_index, task_count)


if __name__ == "__main__":
    main()
```
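The `process_batch` call above is left undefined. One common way to implement it is to stride over the work list by task count, so each parallel task instance gets a disjoint subset; a minimal sketch (the item list and its processing are placeholders):

```python
def partition_for_task(items: list, task_index: int, task_count: int) -> list:
    """Return the slice of items this task instance should process.

    Cloud Run Jobs expose CLOUD_RUN_TASK_INDEX and CLOUD_RUN_TASK_COUNT;
    striding by task_count gives each task a disjoint subset of the work,
    with no coordination between tasks.
    """
    return items[task_index::task_count]
```

With 3 tasks over 10 items, task 0 processes items 0, 3, 6, 9; task 1 processes 1, 4, 7; task 2 processes 2, 5, 8 - together covering everything exactly once.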

## Cloud Logging

### Structured Logging

```python
import json
import logging

from google.cloud import logging as cloud_logging

# Setup Cloud Logging
client = cloud_logging.Client()
client.setup_logging()

logger = logging.getLogger(__name__)


def log_with_trace(message: str, project_id: str, trace_id: str, **kwargs):
    """Log with a trace ID for request correlation."""
    log_entry = {
        "message": message,
        "logging.googleapis.com/trace": f"projects/{project_id}/traces/{trace_id}",
        **kwargs,
    }
    logger.info(json.dumps(log_entry))


# Usage
log_with_trace(
    "Order processed",
    project_id="my-project",
    trace_id="abc123",
    order_id="order-456",
    customer_id="cust-789",
)
```

### Custom Metrics

```python
import time

from google.cloud import monitoring_v3


def write_metric(project_id: str, metric_type: str, value: float):
    """Write a custom metric to Cloud Monitoring."""
    client = monitoring_v3.MetricServiceClient()
    project_name = f"projects/{project_id}"

    series = monitoring_v3.TimeSeries()
    series.metric.type = f"custom.googleapis.com/{metric_type}"
    series.resource.type = "global"

    now = time.time()
    seconds = int(now)
    nanos = int((now - seconds) * 10**9)

    interval = monitoring_v3.TimeInterval(
        {"end_time": {"seconds": seconds, "nanos": nanos}}
    )
    point = monitoring_v3.Point(
        {"interval": interval, "value": {"double_value": value}}
    )
    series.points = [point]

    client.create_time_series(name=project_name, time_series=[series])


# Usage
write_metric("my-project", "orders/processed", 1.0)
```

## CLI Commands

```bash
# Authentication
gcloud auth login
gcloud config set project my-project
gcloud config list

# Compute
gcloud compute instances list
gcloud compute ssh my-instance

# Cloud Run
gcloud run services list
gcloud run deploy my-service --image gcr.io/my-project/my-app
gcloud run services describe my-service

# Cloud Functions
gcloud functions list
gcloud functions deploy my-function --runtime python311 --trigger-http
gcloud functions logs read my-function

# Secret Manager
gcloud secrets list
gcloud secrets create my-secret --data-file=secret.txt
gcloud secrets versions access latest --secret=my-secret

# Cloud Storage
gsutil ls gs://my-bucket/
gsutil cp local-file.txt gs://my-bucket/
gsutil signurl -d 1h key.json gs://my-bucket/file.txt

# Firestore
gcloud firestore databases list
gcloud firestore export gs://my-bucket/firestore-backup

# Logging
gcloud logging read "resource.type=cloud_run_revision" --limit=10
```

## Security Checklist

- [ ] Use Service Accounts with least privilege
- [ ] Enable VPC Service Controls for sensitive data
- [ ] Use Secret Manager for all secrets
- [ ] Enable Cloud Audit Logs
- [ ] Configure Identity-Aware Proxy for internal apps
- [ ] Use Private Google Access for GCE instances
- [ ] Enable Binary Authorization for GKE
- [ ] Configure Cloud Armor for DDoS protection
- [ ] Use Customer-Managed Encryption Keys (CMEK) where required
- [ ] Enable Security Command Center
556 .claude/skills/infrastructure/terraform/SKILL.md Normal file
@@ -0,0 +1,556 @@
---
name: terraform-aws
description: Terraform Infrastructure as Code for AWS with module patterns, state management, and best practices. Use when writing Terraform configurations or planning AWS infrastructure.
---

# Terraform AWS Skill

## Project Structure

```
infrastructure/
├── environments/
│   ├── dev/
│   │   ├── main.tf
│   │   ├── variables.tf
│   │   ├── outputs.tf
│   │   ├── terraform.tfvars
│   │   └── backend.tf
│   ├── staging/
│   └── prod/
├── modules/
│   ├── vpc/
│   │   ├── main.tf
│   │   ├── variables.tf
│   │   ├── outputs.tf
│   │   └── README.md
│   ├── ecs-service/
│   ├── rds/
│   ├── lambda/
│   └── s3-bucket/
└── shared/
    └── providers.tf
```

## Backend Configuration

### S3 Backend (recommended for AWS)

```hcl
# backend.tf
terraform {
  backend "s3" {
    bucket         = "mycompany-terraform-state"
    key            = "environments/dev/terraform.tfstate"
    region         = "eu-west-2"
    encrypt        = true
    dynamodb_table = "terraform-state-lock"
  }
}
```

### State Lock Table

```hcl
# One-time setup for state locking
resource "aws_dynamodb_table" "terraform_lock" {
  name         = "terraform-state-lock"
  billing_mode = "PAY_PER_REQUEST"
  hash_key     = "LockID"

  attribute {
    name = "LockID"
    type = "S"
  }

  tags = {
    Name        = "Terraform State Lock"
    Environment = "shared"
  }
}
```

## Provider Configuration

```hcl
# providers.tf
terraform {
  required_version = ">= 1.6.0"

  required_providers {
    aws = {
      source  = "hashicorp/aws"
      version = "~> 5.0"
    }
  }
}

provider "aws" {
  region = var.aws_region

  default_tags {
    tags = {
      Environment = var.environment
      Project     = var.project_name
      ManagedBy   = "terraform"
    }
  }
}
```

## Module Patterns

### VPC Module

```hcl
# modules/vpc/variables.tf
variable "name" {
  description = "Name prefix for VPC resources"
  type        = string
}

variable "cidr_block" {
  description = "CIDR block for the VPC"
  type        = string
  default     = "10.0.0.0/16"
}

variable "availability_zones" {
  description = "List of availability zones"
  type        = list(string)
}

variable "enable_nat_gateway" {
  description = "Enable NAT Gateway for private subnets"
  type        = bool
  default     = true
}

variable "single_nat_gateway" {
  description = "Use single NAT Gateway (cost saving for non-prod)"
  type        = bool
  default     = false
}

# modules/vpc/main.tf
locals {
  az_count = length(var.availability_zones)

  public_subnets = [
    for i, az in var.availability_zones :
    cidrsubnet(var.cidr_block, 8, i)
  ]

  private_subnets = [
    for i, az in var.availability_zones :
    cidrsubnet(var.cidr_block, 8, i + local.az_count)
  ]
}

resource "aws_vpc" "main" {
  cidr_block           = var.cidr_block
  enable_dns_hostnames = true
  enable_dns_support   = true

  tags = {
    Name = var.name
  }
}

resource "aws_internet_gateway" "main" {
  vpc_id = aws_vpc.main.id

  tags = {
    Name = "${var.name}-igw"
  }
}

resource "aws_subnet" "public" {
  count                   = local.az_count
  vpc_id                  = aws_vpc.main.id
  cidr_block              = local.public_subnets[count.index]
  availability_zone       = var.availability_zones[count.index]
  map_public_ip_on_launch = true

  tags = {
    Name = "${var.name}-public-${var.availability_zones[count.index]}"
    Type = "public"
  }
}

resource "aws_subnet" "private" {
  count             = local.az_count
  vpc_id            = aws_vpc.main.id
  cidr_block        = local.private_subnets[count.index]
  availability_zone = var.availability_zones[count.index]

  tags = {
    Name = "${var.name}-private-${var.availability_zones[count.index]}"
    Type = "private"
  }
}

resource "aws_eip" "nat" {
  count  = var.enable_nat_gateway ? (var.single_nat_gateway ? 1 : local.az_count) : 0
  domain = "vpc"

  tags = {
    Name = "${var.name}-nat-eip-${count.index}"
  }

  depends_on = [aws_internet_gateway.main]
}

resource "aws_nat_gateway" "main" {
  count         = var.enable_nat_gateway ? (var.single_nat_gateway ? 1 : local.az_count) : 0
  allocation_id = aws_eip.nat[count.index].id
  subnet_id     = aws_subnet.public[count.index].id

  tags = {
    Name = "${var.name}-nat-${count.index}"
  }
}

# modules/vpc/outputs.tf
output "vpc_id" {
  description = "ID of the VPC"
  value       = aws_vpc.main.id
}

output "public_subnet_ids" {
  description = "IDs of public subnets"
  value       = aws_subnet.public[*].id
}

output "private_subnet_ids" {
  description = "IDs of private subnets"
  value       = aws_subnet.private[*].id
}
```
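The `cidrsubnet` calls in the locals block carve the VPC CIDR into consecutive /24 blocks: public subnets take netnums `0 .. az_count - 1`, private subnets the next `az_count`. A sketch of the same arithmetic with Python's `ipaddress` module, useful for sanity-checking planned ranges before an apply (a hypothetical helper, not part of any Terraform tooling):

```python
import ipaddress


def cidr_subnet(base_cidr: str, newbits: int, netnum: int) -> str:
    """Python equivalent of Terraform's cidrsubnet(base_cidr, newbits, netnum).

    Splits base_cidr into subnets whose prefix is newbits longer and
    returns the netnum-th one. Materializing the list is fine for small
    newbits (256 subnets here), but would be wasteful for large values.
    """
    base = ipaddress.ip_network(base_cidr)
    return str(list(base.subnets(prefixlen_diff=newbits))[netnum])
```

For the module defaults with three AZs, the public subnets come out as 10.0.0.0/24 through 10.0.2.0/24 and the private subnets as 10.0.3.0/24 through 10.0.5.0/24.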

### Module Usage

```hcl
# environments/dev/main.tf
module "vpc" {
  source = "../../modules/vpc"

  name               = "${var.project_name}-${var.environment}"
  cidr_block         = "10.0.0.0/16"
  availability_zones = ["eu-west-2a", "eu-west-2b", "eu-west-2c"]
  enable_nat_gateway = true
  single_nat_gateway = true # Cost saving for dev
}

module "ecs_cluster" {
  source = "../../modules/ecs-cluster"

  name               = "${var.project_name}-${var.environment}"
  vpc_id             = module.vpc.vpc_id
  private_subnet_ids = module.vpc.private_subnet_ids

  depends_on = [module.vpc]
}
```

## Security Best Practices

### Security Groups

```hcl
resource "aws_security_group" "app" {
  name        = "${var.name}-app-sg"
  description = "Security group for application servers"
  vpc_id      = var.vpc_id

  # Only allow inbound from load balancer
  ingress {
    description     = "HTTP from ALB"
    from_port       = var.app_port
    to_port         = var.app_port
    protocol        = "tcp"
    security_groups = [aws_security_group.alb.id]
  }

  # Allow all outbound
  egress {
    description = "Allow all outbound"
    from_port   = 0
    to_port     = 0
    protocol    = "-1"
    cidr_blocks = ["0.0.0.0/0"]
  }

  tags = {
    Name = "${var.name}-app-sg"
  }
}

# NEVER do this - open to the world
# ingress {
#   from_port   = 22
#   to_port     = 22
#   protocol    = "tcp"
#   cidr_blocks = ["0.0.0.0/0"]  # BAD!
# }
```

### IAM Roles with Least Privilege

```hcl
data "aws_iam_policy_document" "ecs_task_assume_role" {
  statement {
    actions = ["sts:AssumeRole"]

    principals {
      type        = "Service"
      identifiers = ["ecs-tasks.amazonaws.com"]
    }
  }
}

resource "aws_iam_role" "ecs_task_role" {
  name               = "${var.name}-ecs-task-role"
  assume_role_policy = data.aws_iam_policy_document.ecs_task_assume_role.json
}

# Specific permissions only
data "aws_iam_policy_document" "ecs_task_policy" {
  statement {
    sid    = "AllowS3Access"
    effect = "Allow"
    actions = [
      "s3:GetObject",
      "s3:PutObject",
    ]
    resources = [
      "${aws_s3_bucket.app_data.arn}/*"
    ]
  }

  statement {
    sid    = "AllowSecretsAccess"
    effect = "Allow"
    actions = [
      "secretsmanager:GetSecretValue"
    ]
    resources = [
      aws_secretsmanager_secret.app_secrets.arn
    ]
  }
}
```

### Secrets Management

```hcl
# NEVER hardcode secrets
# BAD:
# resource "aws_db_instance" "main" {
#   password = "mysecretpassword"  # NEVER!
# }

# GOOD: Use AWS Secrets Manager
resource "aws_secretsmanager_secret" "db_password" {
  name                    = "${var.name}/db-password"
  recovery_window_in_days = 7
}

resource "aws_secretsmanager_secret_version" "db_password" {
  secret_id = aws_secretsmanager_secret.db_password.id
  secret_string = jsonencode({
    password = random_password.db.result
  })
}

resource "random_password" "db" {
  length           = 32
  special          = true
  override_special = "!#$%&*()-_=+[]{}<>:?"
}

# Reference in RDS
resource "aws_db_instance" "main" {
  # ...
  password = random_password.db.result

  lifecycle {
    ignore_changes = [password]
  }
}
```

## Variables and Validation

```hcl
variable "environment" {
  description = "Environment name"
  type        = string

  validation {
    condition     = contains(["dev", "staging", "prod"], var.environment)
    error_message = "Environment must be dev, staging, or prod."
  }
}

variable "instance_type" {
  description = "EC2 instance type"
  type        = string
  default     = "t3.micro"

  validation {
    condition     = can(regex("^t[23]\\.", var.instance_type))
    error_message = "Only t2 and t3 instance types are allowed."
  }
}

variable "tags" {
  description = "Additional tags for resources"
  type        = map(string)
  default     = {}
}
```
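The `instance_type` validation relies on the regex `^t[23]\.`, which anchors at the start of the string and requires a literal dot after the generation digit. The same check can be mirrored outside Terraform, e.g. to unit-test which values a CI pipeline should accept before a plan runs (an illustrative helper, not Terraform behaviour itself):

```python
import re

# Mirrors the HCL validation: only t2.* and t3.* instance types pass
INSTANCE_TYPE_PATTERN = re.compile(r"^t[23]\.")


def is_allowed_instance_type(instance_type: str) -> bool:
    """Return True if the instance type matches the HCL validation regex."""
    return bool(INSTANCE_TYPE_PATTERN.match(instance_type))
```

Note that the literal dot matters: `t4g.small` fails because of the `[23]` class, and a hypothetical `t3a.large` fails because `a` is not a dot.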

## Data Sources

```hcl
# Get the latest Amazon Linux 2 AMI
data "aws_ami" "amazon_linux_2" {
  most_recent = true
  owners      = ["amazon"]

  filter {
    name   = "name"
    values = ["amzn2-ami-hvm-*-x86_64-gp2"]
  }

  filter {
    name   = "virtualization-type"
    values = ["hvm"]
  }
}

# Get the current AWS account ID
data "aws_caller_identity" "current" {}

# Get the current region
data "aws_region" "current" {}

# Reference in resources
resource "aws_instance" "example" {
  ami           = data.aws_ami.amazon_linux_2.id
  instance_type = var.instance_type

  tags = {
    Name = "example-${data.aws_region.current.name}"
  }
}
```

## Outputs

```hcl
output "vpc_id" {
  description = "ID of the VPC"
  value       = module.vpc.vpc_id
}

output "alb_dns_name" {
  description = "DNS name of the Application Load Balancer"
  value       = aws_lb.main.dns_name
}

output "db_endpoint" {
  description = "Endpoint of the RDS instance"
  value       = aws_db_instance.main.endpoint
  sensitive   = false
}

output "db_password_secret_arn" {
  description = "ARN of the database password secret"
  value       = aws_secretsmanager_secret.db_password.arn
  sensitive   = true # Hide in logs
}
```

## Lifecycle Rules

```hcl
resource "aws_instance" "example" {
  ami           = data.aws_ami.amazon_linux_2.id
  instance_type = var.instance_type

  lifecycle {
    # Prevent accidental destruction
    prevent_destroy = true

    # Create new before destroying old
    create_before_destroy = true

    # Ignore changes to tags made outside Terraform
    ignore_changes = [
      tags["LastModified"],
    ]
  }
}
```

## Commands

```bash
# Initialize
terraform init
terraform init -upgrade                 # Upgrade providers

# Planning
terraform plan                          # Preview changes
terraform plan -out=tfplan              # Save plan
terraform plan -target=module.vpc       # Plan specific resource

# Applying
terraform apply                         # Apply changes
terraform apply tfplan                  # Apply saved plan
terraform apply -auto-approve           # Skip confirmation (CI/CD only!)

# State management
terraform state list                    # List resources
terraform state show aws_vpc.main       # Show specific resource
terraform import aws_vpc.main vpc-123   # Import existing resource

# Workspaces (for multiple environments)
terraform workspace list
terraform workspace new staging
terraform workspace select dev

# Formatting and validation
terraform fmt -check                    # Check formatting
terraform fmt -recursive                # Format all files
terraform validate                      # Validate configuration
```

## Anti-Patterns to Avoid

```hcl
# BAD: Hardcoded values
resource "aws_instance" "web" {
  ami           = "ami-12345678" # Will break across regions
  instance_type = "t2.micro"     # No flexibility
}

# GOOD: Use data sources and variables
resource "aws_instance" "web" {
  ami           = data.aws_ami.amazon_linux_2.id
  instance_type = var.instance_type
}


# BAD: No state locking
terraform {
  backend "s3" {
    bucket = "my-state"
    key    = "terraform.tfstate"
    # Missing dynamodb_table!
  }
}


# BAD: Secrets in tfvars
# terraform.tfvars
db_password = "mysecret" # NEVER!

# GOOD: Use Secrets Manager or environment variables
# export TF_VAR_db_password=$(aws secretsmanager get-secret-value ...)
```
666 .claude/skills/languages/csharp/SKILL.md Normal file
@@ -0,0 +1,666 @@
---
name: csharp-development
description: C# and .NET development patterns with xUnit, Clean Architecture, and modern C# practices. Use when writing C# code.
---

# C# Development Skill

## Project Structure (Clean Architecture)

```
Solution/
├── src/
│   ├── Domain/                  # Enterprise business rules
│   │   ├── Entities/
│   │   ├── ValueObjects/
│   │   ├── Exceptions/
│   │   └── Domain.csproj
│   ├── Application/             # Application business rules
│   │   ├── Common/
│   │   │   ├── Interfaces/
│   │   │   └── Behaviours/
│   │   ├── Features/
│   │   │   └── Users/
│   │   │       ├── Commands/
│   │   │       └── Queries/
│   │   └── Application.csproj
│   ├── Infrastructure/          # External concerns
│   │   ├── Persistence/
│   │   ├── Identity/
│   │   └── Infrastructure.csproj
│   └── Api/                     # Presentation layer
│       ├── Controllers/
│       ├── Middleware/
│       └── Api.csproj
├── tests/
│   ├── Domain.Tests/
│   ├── Application.Tests/
│   ├── Infrastructure.Tests/
│   └── Api.Tests/
└── Solution.sln
```

## Testing with xUnit

### Unit Tests with Fluent Assertions
```csharp
using FluentAssertions;
using Moq;
using Xunit;

namespace Application.Tests.Features.Users;

public class CreateUserCommandHandlerTests
{
    private readonly Mock<IUserRepository> _userRepositoryMock;
    private readonly Mock<IUnitOfWork> _unitOfWorkMock;
    private readonly Mock<IPasswordHasher> _passwordHasherMock;
    private readonly CreateUserCommandHandler _handler;

    public CreateUserCommandHandlerTests()
    {
        _userRepositoryMock = new Mock<IUserRepository>();
        _unitOfWorkMock = new Mock<IUnitOfWork>();
        _passwordHasherMock = new Mock<IPasswordHasher>();
        _handler = new CreateUserCommandHandler(
            _userRepositoryMock.Object,
            _unitOfWorkMock.Object,
            _passwordHasherMock.Object);
    }

    [Fact]
    public async Task Handle_WithValidCommand_ShouldCreateUser()
    {
        // Arrange
        var command = UserFixtures.CreateUserCommand();
        _userRepositoryMock
            .Setup(x => x.ExistsByEmailAsync(command.Email, It.IsAny<CancellationToken>()))
            .ReturnsAsync(false);

        // Act
        var result = await _handler.Handle(command, CancellationToken.None);

        // Assert
        result.IsSuccess.Should().BeTrue();
        result.Value.Id.Should().NotBeEmpty();
        result.Value.Email.Should().Be(command.Email);

        _userRepositoryMock.Verify(
            x => x.AddAsync(It.IsAny<User>(), It.IsAny<CancellationToken>()),
            Times.Once);
        _unitOfWorkMock.Verify(
            x => x.SaveChangesAsync(It.IsAny<CancellationToken>()),
            Times.Once);
    }

    [Fact]
    public async Task Handle_WithDuplicateEmail_ShouldReturnFailure()
    {
        // Arrange
        var command = UserFixtures.CreateUserCommand();
        _userRepositoryMock
            .Setup(x => x.ExistsByEmailAsync(command.Email, It.IsAny<CancellationToken>()))
            .ReturnsAsync(true);

        // Act
        var result = await _handler.Handle(command, CancellationToken.None);

        // Assert
        result.IsFailure.Should().BeTrue();
        result.Error.Should().Be(DomainErrors.User.DuplicateEmail);
    }
}
```

### Test Fixtures
```csharp
namespace Application.Tests.Fixtures;

public static class UserFixtures
{
    public static User User(Action<User>? customize = null)
    {
        var user = new User
        {
            Id = Guid.NewGuid(),
            Email = "test@example.com",
            Name = "Test User",
            Role = Role.User,
            CreatedAt = DateTime.UtcNow,
            UpdatedAt = DateTime.UtcNow
        };

        customize?.Invoke(user);
        return user;
    }

    public static CreateUserCommand CreateUserCommand(
        Action<CreateUserCommand>? customize = null)
    {
        var command = new CreateUserCommand
        {
            Email = "newuser@example.com",
            Name = "New User",
            Password = "SecurePass123!"
        };

        customize?.Invoke(command);
        return command;
    }

    public static UserDto UserDto(Action<UserDto>? customize = null)
    {
        var dto = new UserDto
        {
            Id = Guid.NewGuid(),
            Email = "test@example.com",
            Name = "Test User",
            Role = Role.User.ToString()
        };

        customize?.Invoke(dto);
        return dto;
    }
}

// Usage
var adminUser = UserFixtures.User(u => u.Role = Role.Admin);
var customCommand = UserFixtures.CreateUserCommand(c => c.Email = "custom@example.com");
```

### Theory Tests (Parameterized)
```csharp
namespace Domain.Tests.ValueObjects;

public class EmailTests
{
    [Theory]
    [InlineData("user@example.com", true)]
    [InlineData("user.name@example.co.uk", true)]
    [InlineData("invalid", false)]
    [InlineData("missing@domain", false)]
    [InlineData("", false)]
    [InlineData(null, false)]
    public void IsValid_ShouldReturnExpectedResult(string? email, bool expected)
    {
        // Act
        var result = Email.IsValid(email);

        // Assert
        result.Should().Be(expected);
    }

    [Theory]
    [MemberData(nameof(ValidEmailTestData))]
    public void Create_WithValidEmail_ShouldSucceed(string email)
    {
        // Act
        var result = Email.Create(email);

        // Assert
        result.IsSuccess.Should().BeTrue();
        result.Value.Value.Should().Be(email.ToLowerInvariant());
    }

    public static IEnumerable<object[]> ValidEmailTestData()
    {
        yield return new object[] { "user@example.com" };
        yield return new object[] { "USER@EXAMPLE.COM" };
        yield return new object[] { "user.name+tag@example.co.uk" };
    }
}
```

### Integration Tests with WebApplicationFactory
```csharp
using Microsoft.AspNetCore.Mvc.Testing;
using System.Net;
using System.Net.Http.Json;

namespace Api.Tests.Controllers;

public class UsersControllerTests : IClassFixture<WebApplicationFactory<Program>>
{
    private readonly HttpClient _client;
    private readonly WebApplicationFactory<Program> _factory;

    public UsersControllerTests(WebApplicationFactory<Program> factory)
    {
        _factory = factory.WithWebHostBuilder(builder =>
        {
            builder.ConfigureServices(services =>
            {
                // Replace services with test doubles
                services.RemoveAll<IUserRepository>();
                services.AddScoped<IUserRepository, InMemoryUserRepository>();
            });
        });
        _client = _factory.CreateClient();
    }

    [Fact]
    public async Task CreateUser_WithValidRequest_ReturnsCreated()
    {
        // Arrange
        var request = new CreateUserRequest
        {
            Email = "new@example.com",
            Name = "New User",
            Password = "SecurePass123!"
        };

        // Act
        var response = await _client.PostAsJsonAsync("/api/users", request);

        // Assert
        response.StatusCode.Should().Be(HttpStatusCode.Created);

        var user = await response.Content.ReadFromJsonAsync<UserDto>();
        user.Should().NotBeNull();
        user!.Email.Should().Be(request.Email);
    }

    [Fact]
    public async Task GetUser_WhenNotFound_ReturnsNotFound()
    {
        // Act
        var response = await _client.GetAsync($"/api/users/{Guid.NewGuid()}");

        // Assert
        response.StatusCode.Should().Be(HttpStatusCode.NotFound);
    }
}
```

## Domain Models

### Entity with Value Objects
```csharp
namespace Domain.Entities;

public class User : BaseEntity
{
    private User() { } // EF Core constructor

    public Guid Id { get; private set; }
    public Email Email { get; private set; } = null!;
    public string Name { get; private set; } = null!;
    public Role Role { get; private set; }
    public DateTime CreatedAt { get; private set; }
    public DateTime UpdatedAt { get; private set; }

    public static Result<User> Create(string email, string name)
    {
        var emailResult = Email.Create(email);
        if (emailResult.IsFailure)
        {
            return Result.Failure<User>(emailResult.Error);
        }

        if (string.IsNullOrWhiteSpace(name))
        {
            return Result.Failure<User>(DomainErrors.User.NameRequired);
        }

        var user = new User
        {
            Id = Guid.NewGuid(),
            Email = emailResult.Value,
            Name = name.Trim(),
            Role = Role.User,
            CreatedAt = DateTime.UtcNow,
            UpdatedAt = DateTime.UtcNow
        };

        user.RaiseDomainEvent(new UserCreatedEvent(user.Id, user.Email.Value));

        return Result.Success(user);
    }

    public Result UpdateName(string name)
    {
        if (string.IsNullOrWhiteSpace(name))
        {
            return Result.Failure(DomainErrors.User.NameRequired);
        }

        Name = name.Trim();
        UpdatedAt = DateTime.UtcNow;

        return Result.Success();
    }

    public void PromoteToAdmin()
    {
        Role = Role.Admin;
        UpdatedAt = DateTime.UtcNow;

        RaiseDomainEvent(new UserPromotedEvent(Id));
    }
}
```

### Value Object
```csharp
using System.Text.RegularExpressions;

namespace Domain.ValueObjects;

public sealed class Email : ValueObject
{
    private static readonly Regex EmailRegex = new(
        @"^[a-zA-Z0-9._%+-]+@[a-zA-Z0-9.-]+\.[a-zA-Z]{2,}$",
        RegexOptions.Compiled);

    public string Value { get; }

    private Email(string value) => Value = value;

    public static Result<Email> Create(string? email)
    {
        if (string.IsNullOrWhiteSpace(email))
        {
            return Result.Failure<Email>(DomainErrors.Email.Empty);
        }

        var normalizedEmail = email.Trim().ToLowerInvariant();

        if (!EmailRegex.IsMatch(normalizedEmail))
        {
            return Result.Failure<Email>(DomainErrors.Email.InvalidFormat);
        }

        return Result.Success(new Email(normalizedEmail));
    }

    public static bool IsValid(string? email) =>
        !string.IsNullOrWhiteSpace(email) && EmailRegex.IsMatch(email);

    protected override IEnumerable<object> GetEqualityComponents()
    {
        yield return Value;
    }

    public override string ToString() => Value;

    public static implicit operator string(Email email) => email.Value;
}
```

### Result Pattern
```csharp
namespace Domain.Common;

public class Result
{
    protected Result(bool isSuccess, Error error)
    {
        if (isSuccess && error != Error.None ||
            !isSuccess && error == Error.None)
        {
            throw new ArgumentException("Invalid error", nameof(error));
        }

        IsSuccess = isSuccess;
        Error = error;
    }

    public bool IsSuccess { get; }
    public bool IsFailure => !IsSuccess;
    public Error Error { get; }

    public static Result Success() => new(true, Error.None);
    public static Result Failure(Error error) => new(false, error);
    public static Result<T> Success<T>(T value) => new(value, true, Error.None);
    public static Result<T> Failure<T>(Error error) => new(default!, false, error);
}

public class Result<T> : Result
{
    private readonly T _value;

    protected internal Result(T value, bool isSuccess, Error error)
        : base(isSuccess, error)
    {
        _value = value;
    }

    public T Value => IsSuccess
        ? _value
        : throw new InvalidOperationException("Cannot access value of failed result");
}

public record Error(string Code, string Message)
{
    public static readonly Error None = new(string.Empty, string.Empty);
}
```

## CQRS with MediatR

### Command
```csharp
namespace Application.Features.Users.Commands;

public record CreateUserCommand : IRequest<Result<UserDto>>
{
    public required string Email { get; init; }
    public required string Name { get; init; }
    public required string Password { get; init; }
}

public class CreateUserCommandValidator : AbstractValidator<CreateUserCommand>
{
    public CreateUserCommandValidator()
    {
        RuleFor(x => x.Email)
            .NotEmpty()
            .EmailAddress()
            .MaximumLength(255);

        RuleFor(x => x.Name)
            .NotEmpty()
            .MaximumLength(100);

        RuleFor(x => x.Password)
            .NotEmpty()
            .MinimumLength(8)
            .Matches("[A-Z]").WithMessage("Password must contain uppercase letter")
            .Matches("[a-z]").WithMessage("Password must contain lowercase letter")
            .Matches("[0-9]").WithMessage("Password must contain digit")
            .Matches("[^a-zA-Z0-9]").WithMessage("Password must contain special character");
    }
}

public class CreateUserCommandHandler : IRequestHandler<CreateUserCommand, Result<UserDto>>
{
    private readonly IUserRepository _userRepository;
    private readonly IUnitOfWork _unitOfWork;
    private readonly IPasswordHasher _passwordHasher;

    public CreateUserCommandHandler(
        IUserRepository userRepository,
        IUnitOfWork unitOfWork,
        IPasswordHasher passwordHasher)
    {
        _userRepository = userRepository;
        _unitOfWork = unitOfWork;
        _passwordHasher = passwordHasher;
    }

    public async Task<Result<UserDto>> Handle(
        CreateUserCommand request,
        CancellationToken cancellationToken)
    {
        if (await _userRepository.ExistsByEmailAsync(request.Email, cancellationToken))
        {
            return Result.Failure<UserDto>(DomainErrors.User.DuplicateEmail);
        }

        var userResult = User.Create(request.Email, request.Name);
        if (userResult.IsFailure)
        {
            return Result.Failure<UserDto>(userResult.Error);
        }

        var user = userResult.Value;

        await _userRepository.AddAsync(user, cancellationToken);
        await _unitOfWork.SaveChangesAsync(cancellationToken);

        return Result.Success(user.ToDto());
    }
}
```

### Query
```csharp
namespace Application.Features.Users.Queries;

public record GetUserByIdQuery(Guid Id) : IRequest<Result<UserDto>>;

public class GetUserByIdQueryHandler : IRequestHandler<GetUserByIdQuery, Result<UserDto>>
{
    private readonly IUserRepository _userRepository;

    public GetUserByIdQueryHandler(IUserRepository userRepository)
    {
        _userRepository = userRepository;
    }

    public async Task<Result<UserDto>> Handle(
        GetUserByIdQuery request,
        CancellationToken cancellationToken)
    {
        var user = await _userRepository.GetByIdAsync(request.Id, cancellationToken);

        if (user is null)
        {
            return Result.Failure<UserDto>(DomainErrors.User.NotFound);
        }

        return Result.Success(user.ToDto());
    }
}
```

## API Controllers

```csharp
namespace Api.Controllers;

[ApiController]
[Route("api/[controller]")]
public class UsersController : ControllerBase
{
    private readonly ISender _sender;

    public UsersController(ISender sender)
    {
        _sender = sender;
    }

    [HttpGet("{id:guid}")]
    [ProducesResponseType(typeof(UserDto), StatusCodes.Status200OK)]
    [ProducesResponseType(typeof(ProblemDetails), StatusCodes.Status404NotFound)]
    public async Task<IActionResult> GetUser(Guid id, CancellationToken cancellationToken)
    {
        var result = await _sender.Send(new GetUserByIdQuery(id), cancellationToken);

        return result.IsSuccess
            ? Ok(result.Value)
            : NotFound(CreateProblemDetails(result.Error, StatusCodes.Status404NotFound));
    }

    [HttpPost]
    [ProducesResponseType(typeof(UserDto), StatusCodes.Status201Created)]
    [ProducesResponseType(typeof(ProblemDetails), StatusCodes.Status400BadRequest)]
    [ProducesResponseType(typeof(ProblemDetails), StatusCodes.Status409Conflict)]
    public async Task<IActionResult> CreateUser(
        [FromBody] CreateUserRequest request,
        CancellationToken cancellationToken)
    {
        var command = new CreateUserCommand
        {
            Email = request.Email,
            Name = request.Name,
            Password = request.Password
        };

        var result = await _sender.Send(command, cancellationToken);

        if (result.IsFailure)
        {
            return result.Error.Code == "User.DuplicateEmail"
                ? Conflict(CreateProblemDetails(result.Error, StatusCodes.Status409Conflict))
                : BadRequest(CreateProblemDetails(result.Error));
        }

        return CreatedAtAction(
            nameof(GetUser),
            new { id = result.Value.Id },
            result.Value);
    }

    private static ProblemDetails CreateProblemDetails(
        Error error,
        int status = StatusCodes.Status400BadRequest) =>
        new()
        {
            Title = error.Code,
            Detail = error.Message,
            Status = status
        };
}
```

## Commands

```bash
# Build and test
dotnet build
dotnet test
dotnet test --collect:"XPlat Code Coverage"

# Run specific test project
dotnet test tests/Application.Tests

# Run with filter
dotnet test --filter "FullyQualifiedName~UserService"

# Generate coverage report (requires reportgenerator)
dotnet tool install -g dotnet-reportgenerator-globaltool
reportgenerator -reports:**/coverage.cobertura.xml -targetdir:coveragereport

# Run application
dotnet run --project src/Api

# EF Core migrations
dotnet ef migrations add InitialCreate --project src/Infrastructure --startup-project src/Api
dotnet ef database update --project src/Infrastructure --startup-project src/Api
```

## NuGet Dependencies

```xml
<!-- Domain.csproj - Minimal dependencies -->
<ItemGroup>
  <PackageReference Include="FluentResults" Version="3.15.2" />
</ItemGroup>

<!-- Application.csproj -->
<ItemGroup>
  <PackageReference Include="MediatR" Version="12.2.0" />
  <PackageReference Include="FluentValidation" Version="11.9.0" />
  <PackageReference Include="FluentValidation.DependencyInjectionExtensions" Version="11.9.0" />
</ItemGroup>

<!-- Infrastructure.csproj -->
<ItemGroup>
  <PackageReference Include="Microsoft.EntityFrameworkCore.SqlServer" Version="8.0.0" />
</ItemGroup>

<!-- Test projects -->
<ItemGroup>
  <PackageReference Include="xunit" Version="2.6.6" />
  <PackageReference Include="xunit.runner.visualstudio" Version="2.5.6" />
  <PackageReference Include="Moq" Version="4.20.70" />
  <PackageReference Include="FluentAssertions" Version="6.12.0" />
  <PackageReference Include="Microsoft.AspNetCore.Mvc.Testing" Version="8.0.0" />
  <PackageReference Include="coverlet.collector" Version="6.0.0" />
</ItemGroup>
```
686
.claude/skills/languages/go/SKILL.md
Normal file
@@ -0,0 +1,686 @@
---
name: go-development
description: Go development patterns with testing, error handling, and idiomatic practices. Use when writing Go code.
---

# Go Development Skill

## Project Structure

```
project/
├── cmd/
│   └── server/
│       └── main.go          # Application entry point
├── internal/
│   ├── domain/              # Business logic
│   │   ├── user.go
│   │   └── user_test.go
│   ├── repository/          # Data access
│   │   ├── user_repo.go
│   │   └── user_repo_test.go
│   ├── service/             # Application services
│   │   ├── user_service.go
│   │   └── user_service_test.go
│   └── handler/             # HTTP handlers
│       ├── user_handler.go
│       └── user_handler_test.go
├── pkg/                     # Public packages
│   └── validation/
├── go.mod
├── go.sum
└── Makefile
```

## Testing Patterns

### Table-Driven Tests
```go
package user

import (
	"testing"
)

func TestValidateEmail(t *testing.T) {
	tests := []struct {
		name    string
		email   string
		wantErr bool
	}{
		{
			name:    "valid email",
			email:   "user@example.com",
			wantErr: false,
		},
		{
			name:    "missing @ symbol",
			email:   "userexample.com",
			wantErr: true,
		},
		{
			name:    "empty email",
			email:   "",
			wantErr: true,
		},
		{
			name:    "missing domain",
			email:   "user@",
			wantErr: true,
		},
	}

	for _, tt := range tests {
		t.Run(tt.name, func(t *testing.T) {
			err := ValidateEmail(tt.email)
			if (err != nil) != tt.wantErr {
				t.Errorf("ValidateEmail(%q) error = %v, wantErr %v",
					tt.email, err, tt.wantErr)
			}
		})
	}
}
```
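
The table above exercises a `ValidateEmail` helper that the skill never shows. A minimal sketch of one possible implementation, assuming a plain string check rather than a full RFC parser (hypothetical, not from the source):

```go
package main

import (
	"fmt"
	"strings"
)

// ValidateEmail is a hypothetical implementation consistent with the
// table-driven test above: it rejects an empty string, a missing @,
// and a missing or dot-less domain.
func ValidateEmail(email string) error {
	if email == "" {
		return fmt.Errorf("email is empty")
	}
	at := strings.Index(email, "@")
	if at <= 0 {
		return fmt.Errorf("missing @ symbol: %q", email)
	}
	domain := email[at+1:]
	if domain == "" || !strings.Contains(domain, ".") {
		return fmt.Errorf("missing or invalid domain: %q", email)
	}
	return nil
}

func main() {
	fmt.Println(ValidateEmail("user@example.com")) // <nil>
	fmt.Println(ValidateEmail("user@") != nil)     // true
}
```

Note this intentionally mirrors only the four table cases; a real validator would handle more edge cases (e.g. `user@localhost`).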

### Test Fixtures with Helper Functions
```go
package service_test

import (
	"testing"

	"myapp/internal/domain"
)

// Test helper - creates valid user with optional overrides
func newTestUser(t *testing.T, opts ...func(*domain.User)) *domain.User {
	t.Helper()

	user := &domain.User{
		ID:    "user-123",
		Email: "test@example.com",
		Name:  "Test User",
		Role:  domain.RoleUser,
	}

	for _, opt := range opts {
		opt(user)
	}

	return user
}

// Option functions for customization
func withEmail(email string) func(*domain.User) {
	return func(u *domain.User) {
		u.Email = email
	}
}

func withRole(role domain.Role) func(*domain.User) {
	return func(u *domain.User) {
		u.Role = role
	}
}

func TestUserService_CreateUser(t *testing.T) {
	t.Run("creates user with valid data", func(t *testing.T) {
		user := newTestUser(t)
		mockRepo := newMockUserRepo() // defined in the mocking section below

		svc := NewUserService(mockRepo)
		result, err := svc.CreateUser(user)

		if err != nil {
			t.Fatalf("unexpected error: %v", err)
		}

		if result.ID == "" {
			t.Error("expected user to have an ID")
		}
	})

	t.Run("rejects user with invalid email", func(t *testing.T) {
		user := newTestUser(t, withEmail("invalid-email"))
		mockRepo := newMockUserRepo()

		svc := NewUserService(mockRepo)
		_, err := svc.CreateUser(user)

		if err == nil {
			t.Error("expected error for invalid email")
		}
	})
}
```

### Mocking with Interfaces
```go
package service

import (
	"context"

	"myapp/internal/domain"
)

// Repository interface for dependency injection
type UserRepository interface {
	GetByID(ctx context.Context, id string) (*domain.User, error)
	Create(ctx context.Context, user *domain.User) error
	Update(ctx context.Context, user *domain.User) error
}

// Service with injected dependencies
type UserService struct {
	repo UserRepository
}

func NewUserService(repo UserRepository) *UserService {
	return &UserService{repo: repo}
}

// Test file
package service_test

import (
	"context"
	"errors"
	"testing"

	"myapp/internal/domain"
	"myapp/internal/service"
)

// Mock implementation
type mockUserRepo struct {
	users map[string]*domain.User
	err   error
}

func newMockUserRepo() *mockUserRepo {
	return &mockUserRepo{
		users: make(map[string]*domain.User),
	}
}

func (m *mockUserRepo) GetByID(ctx context.Context, id string) (*domain.User, error) {
	if m.err != nil {
		return nil, m.err
	}
	user, ok := m.users[id]
	if !ok {
		return nil, errors.New("user not found")
	}
	return user, nil
}

func (m *mockUserRepo) Create(ctx context.Context, user *domain.User) error {
	if m.err != nil {
		return m.err
	}
	m.users[user.ID] = user
	return nil
}

func (m *mockUserRepo) Update(ctx context.Context, user *domain.User) error {
	if m.err != nil {
		return m.err
	}
	m.users[user.ID] = user
	return nil
}
```

## Error Handling

### Custom Error Types
```go
package domain

import (
	"context"
	"errors"
	"fmt"
)

// Sentinel errors for comparison
var (
	ErrNotFound     = errors.New("not found")
	ErrUnauthorized = errors.New("unauthorized")
	ErrInvalidInput = errors.New("invalid input")
)

// Structured error with context
type ValidationError struct {
	Field   string
	Message string
}

func (e *ValidationError) Error() string {
	return fmt.Sprintf("validation error on %s: %s", e.Field, e.Message)
}

// Error wrapping
func GetUser(ctx context.Context, id string) (*User, error) {
	user, err := repo.GetByID(ctx, id)
	if err != nil {
		return nil, fmt.Errorf("getting user %s: %w", id, err)
	}
	return user, nil
}

// Error checking
func handleError(err error) {
	if errors.Is(err, ErrNotFound) {
		// Handle not found
	}

	var validationErr *ValidationError
	if errors.As(err, &validationErr) {
		// Handle validation error
		fmt.Printf("Field %s: %s\n", validationErr.Field, validationErr.Message)
	}
}
```

### Result Pattern (Optional)
```go
package result

// Result type for explicit error handling
type Result[T any] struct {
	value T
	err   error
}

func Ok[T any](value T) Result[T] {
	return Result[T]{value: value}
}

func Err[T any](err error) Result[T] {
	return Result[T]{err: err}
}

func (r Result[T]) IsOk() bool {
	return r.err == nil
}

func (r Result[T]) IsErr() bool {
	return r.err != nil
}

func (r Result[T]) Unwrap() (T, error) {
	return r.value, r.err
}

func (r Result[T]) UnwrapOr(defaultValue T) T {
	if r.err != nil {
		return defaultValue
	}
	return r.value
}

// Usage
func ProcessPayment(amount float64) Result[*Receipt] {
	if amount <= 0 {
		return Err[*Receipt](ErrInvalidInput)
	}

	receipt := &Receipt{Amount: amount}
	return Ok(receipt)
}
```
|
||||||
|
|
||||||
|
## HTTP Handlers

### Chi Router Pattern
```go
package handler

import (
	"encoding/json"
	"errors"
	"net/http"

	"github.com/go-chi/chi/v5"
	"github.com/go-chi/chi/v5/middleware"

	"myapp/internal/domain"
	"myapp/internal/service"
)

type UserHandler struct {
	userService *service.UserService
}

func NewUserHandler(userService *service.UserService) *UserHandler {
	return &UserHandler{userService: userService}
}

func (h *UserHandler) Routes() chi.Router {
	r := chi.NewRouter()

	r.Use(middleware.Logger)
	r.Use(middleware.Recoverer)

	r.Get("/", h.ListUsers)
	r.Post("/", h.CreateUser)
	r.Get("/{userID}", h.GetUser)
	r.Put("/{userID}", h.UpdateUser)
	r.Delete("/{userID}", h.DeleteUser)

	return r
}

func (h *UserHandler) GetUser(w http.ResponseWriter, r *http.Request) {
	userID := chi.URLParam(r, "userID")

	user, err := h.userService.GetUser(r.Context(), userID)
	if err != nil {
		if errors.Is(err, domain.ErrNotFound) {
			http.Error(w, "User not found", http.StatusNotFound)
			return
		}
		http.Error(w, "Internal server error", http.StatusInternalServerError)
		return
	}

	respondJSON(w, http.StatusOK, user)
}

func (h *UserHandler) CreateUser(w http.ResponseWriter, r *http.Request) {
	var req CreateUserRequest
	if err := json.NewDecoder(r.Body).Decode(&req); err != nil {
		http.Error(w, "Invalid request body", http.StatusBadRequest)
		return
	}

	if err := req.Validate(); err != nil {
		respondJSON(w, http.StatusBadRequest, map[string]string{
			"error": err.Error(),
		})
		return
	}

	user, err := h.userService.CreateUser(r.Context(), req.ToUser())
	if err != nil {
		http.Error(w, "Failed to create user", http.StatusInternalServerError)
		return
	}

	respondJSON(w, http.StatusCreated, user)
}

func respondJSON(w http.ResponseWriter, status int, data any) {
	w.Header().Set("Content-Type", "application/json")
	w.WriteHeader(status)
	json.NewEncoder(w).Encode(data)
}
```

### Request Validation
```go
package handler

import (
	"regexp"

	"myapp/internal/domain"
)

var emailRegex = regexp.MustCompile(`^[a-zA-Z0-9._%+-]+@[a-zA-Z0-9.-]+\.[a-zA-Z]{2,}$`)

type CreateUserRequest struct {
	Email string `json:"email"`
	Name  string `json:"name"`
}

func (r *CreateUserRequest) Validate() error {
	if r.Email == "" {
		return &domain.ValidationError{
			Field:   "email",
			Message: "email is required",
		}
	}

	if !emailRegex.MatchString(r.Email) {
		return &domain.ValidationError{
			Field:   "email",
			Message: "invalid email format",
		}
	}

	if r.Name == "" {
		return &domain.ValidationError{
			Field:   "name",
			Message: "name is required",
		}
	}

	if len(r.Name) > 100 {
		return &domain.ValidationError{
			Field:   "name",
			Message: "name must be 100 characters or less",
		}
	}

	return nil
}

func (r *CreateUserRequest) ToUser() *domain.User {
	return &domain.User{
		Email: r.Email,
		Name:  r.Name,
	}
}
```

## Context Usage

```go
package service

import (
	"context"
	"time"
)

// Context with timeout
func (s *UserService) ProcessOrder(ctx context.Context, orderID string) error {
	// Create a timeout context
	ctx, cancel := context.WithTimeout(ctx, 30*time.Second)
	defer cancel()

	// Check context before expensive operations
	select {
	case <-ctx.Done():
		return ctx.Err()
	default:
	}

	// Process order...
	return s.repo.UpdateOrder(ctx, orderID)
}

// Context values for request-scoped data
type contextKey string

const (
	userIDKey    contextKey = "userID"
	requestIDKey contextKey = "requestID"
)

func WithUserID(ctx context.Context, userID string) context.Context {
	return context.WithValue(ctx, userIDKey, userID)
}

func UserIDFromContext(ctx context.Context) (string, bool) {
	userID, ok := ctx.Value(userIDKey).(string)
	return userID, ok
}
```

## Concurrency Patterns

### Worker Pool
```go
package worker

import (
	"context"
	"sync"
)

type Job func(ctx context.Context) error

type Pool struct {
	workers int
	jobs    chan Job
	results chan error
	wg      sync.WaitGroup
}

func NewPool(workers int) *Pool {
	return &Pool{
		workers: workers,
		jobs:    make(chan Job, workers*2),
		results: make(chan error, workers*2),
	}
}

func (p *Pool) Start(ctx context.Context) {
	for i := 0; i < p.workers; i++ {
		p.wg.Add(1)
		go p.worker(ctx)
	}
}

func (p *Pool) worker(ctx context.Context) {
	defer p.wg.Done()

	for {
		select {
		case <-ctx.Done():
			return
		case job, ok := <-p.jobs:
			if !ok {
				return
			}
			if err := job(ctx); err != nil {
				p.results <- err
			}
		}
	}
}

func (p *Pool) Submit(job Job) {
	p.jobs <- job
}

func (p *Pool) Close() {
	close(p.jobs)
	p.wg.Wait()
	close(p.results)
}
```

### Error Group
```go
package service

import (
	"context"

	"golang.org/x/sync/errgroup"
)

func (s *OrderService) ProcessBatch(ctx context.Context, orderIDs []string) error {
	g, ctx := errgroup.WithContext(ctx)

	// Limit concurrency
	g.SetLimit(10)

	for _, orderID := range orderIDs {
		orderID := orderID // Capture for goroutine (unnecessary since Go 1.22)

		g.Go(func() error {
			return s.ProcessOrder(ctx, orderID)
		})
	}

	return g.Wait()
}
```

## Commands

```bash
# Testing
go test ./...                            # Run all tests
go test -v ./...                         # Verbose output
go test -race ./...                      # Race detection
go test -cover ./...                     # Coverage summary
go test -coverprofile=coverage.out ./... # Coverage file
go tool cover -html=coverage.out         # HTML coverage report

# Linting
go vet ./...                             # Built-in checks
golangci-lint run                        # Comprehensive linting

# Building
go build -o bin/server ./cmd/server                  # Build binary
go build -ldflags="-s -w" -o bin/server ./cmd/server # Smaller binary

# Dependencies
go mod tidy                              # Clean up dependencies
go mod verify                            # Verify dependencies

# Code generation
go generate ./...                        # Run generate directives
```

## Anti-Patterns to Avoid

```go
// ❌ BAD - Returning nil error with nil value
func GetUser(id string) (*User, error) {
	user := db.Find(id)
	return user, nil // user might be nil!
}

// ✅ GOOD - Explicit error for not found
func GetUser(id string) (*User, error) {
	user := db.Find(id)
	if user == nil {
		return nil, ErrNotFound
	}
	return user, nil
}

// ❌ BAD - Ignoring errors
result, _ := SomeFunction()

// ✅ GOOD - Handle or propagate errors
result, err := SomeFunction()
if err != nil {
	return fmt.Errorf("calling SomeFunction: %w", err)
}

// ❌ BAD - Naked goroutine without error handling
go processItem(item)

// ✅ GOOD - Error handling with channels or error groups
g, ctx := errgroup.WithContext(ctx)
g.Go(func() error {
	return processItem(ctx, item)
})
if err := g.Wait(); err != nil {
	return err
}

// ❌ BAD - Mutex embedded in struct
type Service struct {
	sync.Mutex
	data map[string]string
}

// ✅ GOOD - Private mutex
type Service struct {
	mu   sync.Mutex
	data map[string]string
}
```

.claude/skills/languages/java/SKILL.md (new file, 675 lines)

---
name: java-development
description: Java development patterns with Spring Boot, JUnit 5, and modern Java practices. Use when writing Java code.
---

# Java Development Skill

## Project Structure (Spring Boot)

```
project/
├── src/
│   ├── main/
│   │   ├── java/
│   │   │   └── com/mycompany/myapp/
│   │   │       ├── MyApplication.java
│   │   │       ├── domain/
│   │   │       │   ├── User.java
│   │   │       │   └── Order.java
│   │   │       ├── repository/
│   │   │       │   └── UserRepository.java
│   │   │       ├── service/
│   │   │       │   └── UserService.java
│   │   │       ├── controller/
│   │   │       │   └── UserController.java
│   │   │       └── config/
│   │   │           └── SecurityConfig.java
│   │   └── resources/
│   │       ├── application.yml
│   │       └── application-test.yml
│   └── test/
│       └── java/
│           └── com/mycompany/myapp/
│               ├── service/
│               │   └── UserServiceTest.java
│               └── controller/
│                   └── UserControllerTest.java
├── pom.xml
└── README.md
```

## Testing with JUnit 5

### Unit Tests with Mockito
```java
package com.mycompany.myapp.service;

import com.mycompany.myapp.domain.User;
import com.mycompany.myapp.exception.DuplicateEmailException;
import com.mycompany.myapp.fixtures.UserFixtures;
import com.mycompany.myapp.repository.UserRepository;
import org.junit.jupiter.api.DisplayName;
import org.junit.jupiter.api.Nested;
import org.junit.jupiter.api.Test;
import org.junit.jupiter.api.extension.ExtendWith;
import org.mockito.InjectMocks;
import org.mockito.Mock;
import org.mockito.junit.jupiter.MockitoExtension;

import java.util.Optional;

import static org.assertj.core.api.Assertions.*;
import static org.mockito.ArgumentMatchers.any;
import static org.mockito.Mockito.*;

@ExtendWith(MockitoExtension.class)
class UserServiceTest {

    @Mock
    private UserRepository userRepository;

    @InjectMocks
    private UserService userService;

    @Nested
    @DisplayName("createUser")
    class CreateUser {

        @Test
        @DisplayName("should create user with valid data")
        void shouldCreateUserWithValidData() {
            // Given
            var request = UserFixtures.createUserRequest();
            var savedUser = UserFixtures.user();
            when(userRepository.save(any(User.class))).thenReturn(savedUser);

            // When
            var result = userService.createUser(request);

            // Then
            assertThat(result.getId()).isNotNull();
            assertThat(result.getEmail()).isEqualTo(request.getEmail());
            verify(userRepository).save(any(User.class));
        }

        @Test
        @DisplayName("should throw exception for duplicate email")
        void shouldThrowExceptionForDuplicateEmail() {
            // Given
            var request = UserFixtures.createUserRequest();
            when(userRepository.existsByEmail(request.getEmail())).thenReturn(true);

            // When/Then
            assertThatThrownBy(() -> userService.createUser(request))
                .isInstanceOf(DuplicateEmailException.class)
                .hasMessageContaining(request.getEmail());
        }
    }

    @Nested
    @DisplayName("getUserById")
    class GetUserById {

        @Test
        @DisplayName("should return user when found")
        void shouldReturnUserWhenFound() {
            // Given
            var user = UserFixtures.user();
            when(userRepository.findById(user.getId())).thenReturn(Optional.of(user));

            // When
            var result = userService.getUserById(user.getId());

            // Then
            assertThat(result).isPresent();
            assertThat(result.get().getId()).isEqualTo(user.getId());
        }

        @Test
        @DisplayName("should return empty when user not found")
        void shouldReturnEmptyWhenNotFound() {
            // Given
            when(userRepository.findById(any())).thenReturn(Optional.empty());

            // When
            var result = userService.getUserById("unknown-id");

            // Then
            assertThat(result).isEmpty();
        }
    }
}
```

### Test Fixtures
```java
package com.mycompany.myapp.fixtures;

import com.mycompany.myapp.domain.Role;
import com.mycompany.myapp.domain.User;
import com.mycompany.myapp.dto.CreateUserRequest;

import java.time.Instant;
import java.util.UUID;

public final class UserFixtures {

    private UserFixtures() {}

    public static User user() {
        return user(builder -> {});
    }

    public static User user(UserCustomizer customizer) {
        var builder = User.builder()
            .id(UUID.randomUUID().toString())
            .email("test@example.com")
            .name("Test User")
            .role(Role.USER)
            .createdAt(Instant.now())
            .updatedAt(Instant.now());

        customizer.customize(builder);
        return builder.build();
    }

    public static CreateUserRequest createUserRequest() {
        return createUserRequest(builder -> {});
    }

    public static CreateUserRequest createUserRequest(CreateUserRequestCustomizer customizer) {
        var builder = CreateUserRequest.builder()
            .email("newuser@example.com")
            .name("New User")
            .password("SecurePass123!");

        customizer.customize(builder);
        return builder.build();
    }

    @FunctionalInterface
    public interface UserCustomizer {
        void customize(User.UserBuilder builder);
    }

    @FunctionalInterface
    public interface CreateUserRequestCustomizer {
        void customize(CreateUserRequest.CreateUserRequestBuilder builder);
    }
}

// Usage in tests
var adminUser = UserFixtures.user(builder -> builder.role(Role.ADMIN));
var customRequest = UserFixtures.createUserRequest(builder ->
    builder.email("custom@example.com"));
```

### Parameterized Tests
```java
package com.mycompany.myapp.validation;

import org.junit.jupiter.api.DisplayName;
import org.junit.jupiter.params.ParameterizedTest;
import org.junit.jupiter.params.provider.CsvSource;
import org.junit.jupiter.params.provider.NullAndEmptySource;
import org.junit.jupiter.params.provider.ValueSource;

import static org.assertj.core.api.Assertions.*;

class EmailValidatorTest {

    private final EmailValidator validator = new EmailValidator();

    @ParameterizedTest
    @DisplayName("should accept valid emails")
    @ValueSource(strings = {
        "user@example.com",
        "user.name@example.com",
        "user+tag@example.co.uk"
    })
    void shouldAcceptValidEmails(String email) {
        assertThat(validator.isValid(email)).isTrue();
    }

    @ParameterizedTest
    @DisplayName("should reject invalid emails")
    @ValueSource(strings = {
        "invalid",
        "missing@domain",
        "@nodomain.com",
        "spaces in@email.com"
    })
    void shouldRejectInvalidEmails(String email) {
        assertThat(validator.isValid(email)).isFalse();
    }

    @ParameterizedTest
    @DisplayName("should reject null and empty emails")
    @NullAndEmptySource
    void shouldRejectNullAndEmpty(String email) {
        assertThat(validator.isValid(email)).isFalse();
    }

    @ParameterizedTest
    @DisplayName("should validate password strength")
    @CsvSource({
        "password,false",
        "Password1,false",
        "Password1!,true",
        "P@ssw0rd,true"
    })
    void shouldValidatePasswordStrength(String password, boolean expected) {
        assertThat(validator.isStrongPassword(password)).isEqualTo(expected);
    }
}
```

### Integration Tests with Spring
```java
package com.mycompany.myapp.controller;

import com.mycompany.myapp.fixtures.UserFixtures;
import com.mycompany.myapp.repository.UserRepository;
import org.junit.jupiter.api.BeforeEach;
import org.junit.jupiter.api.DisplayName;
import org.junit.jupiter.api.Test;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.test.autoconfigure.web.servlet.AutoConfigureMockMvc;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.http.MediaType;
import org.springframework.test.context.ActiveProfiles;
import org.springframework.test.web.servlet.MockMvc;

import static org.springframework.test.web.servlet.request.MockMvcRequestBuilders.*;
import static org.springframework.test.web.servlet.result.MockMvcResultMatchers.*;

@SpringBootTest
@AutoConfigureMockMvc
@ActiveProfiles("test")
class UserControllerIntegrationTest {

    @Autowired
    private MockMvc mockMvc;

    @Autowired
    private UserRepository userRepository;

    @BeforeEach
    void setUp() {
        userRepository.deleteAll();
    }

    @Test
    @DisplayName("POST /api/users creates user and returns 201")
    void createUserReturnsCreated() throws Exception {
        var request = """
            {
                "email": "new@example.com",
                "name": "New User",
                "password": "SecurePass123!"
            }
            """;

        mockMvc.perform(post("/api/users")
                .contentType(MediaType.APPLICATION_JSON)
                .content(request))
            .andExpect(status().isCreated())
            .andExpect(jsonPath("$.id").isNotEmpty())
            .andExpect(jsonPath("$.email").value("new@example.com"))
            .andExpect(jsonPath("$.name").value("New User"));
    }

    @Test
    @DisplayName("GET /api/users/{id} returns user when found")
    void getUserReturnsUserWhenFound() throws Exception {
        var user = userRepository.save(UserFixtures.user());

        mockMvc.perform(get("/api/users/{id}", user.getId()))
            .andExpect(status().isOk())
            .andExpect(jsonPath("$.id").value(user.getId()))
            .andExpect(jsonPath("$.email").value(user.getEmail()));
    }

    @Test
    @DisplayName("GET /api/users/{id} returns 404 when not found")
    void getUserReturns404WhenNotFound() throws Exception {
        mockMvc.perform(get("/api/users/{id}", "unknown-id"))
            .andExpect(status().isNotFound());
    }
}
```

## Domain Models with Records

```java
package com.mycompany.myapp.domain;

import jakarta.persistence.*;
import jakarta.validation.constraints.*;
import lombok.AllArgsConstructor;
import lombok.Builder;
import lombok.Getter;
import lombok.NoArgsConstructor;

import java.time.Instant;
import java.util.UUID;

// Immutable domain model using records (Java 17+)
public record User(
    String id,
    @NotBlank @Email String email,
    @NotBlank @Size(max = 100) String name,
    Role role,
    Instant createdAt,
    Instant updatedAt
) {
    // Compact constructor for validation
    public User {
        if (email != null) {
            email = email.toLowerCase().trim();
        }
    }

    // Static factory methods
    public static User create(String email, String name) {
        var now = Instant.now();
        return new User(
            UUID.randomUUID().toString(),
            email,
            name,
            Role.USER,
            now,
            now
        );
    }

    // Wither methods for immutable updates
    public User withName(String newName) {
        return new User(id, email, newName, role, createdAt, Instant.now());
    }

    public User withRole(Role newRole) {
        return new User(id, email, name, newRole, createdAt, Instant.now());
    }
}

// For mutable entities (JPA)
@Entity
@Table(name = "users")
@Getter
@Builder
@NoArgsConstructor
@AllArgsConstructor
public class UserEntity {

    @Id
    @GeneratedValue(strategy = GenerationType.UUID)
    private String id;

    @Column(unique = true, nullable = false)
    private String email;

    @Column(nullable = false)
    private String name;

    @Enumerated(EnumType.STRING)
    private Role role;

    @Column(name = "created_at")
    private Instant createdAt;

    @Column(name = "updated_at")
    private Instant updatedAt;

    @PrePersist
    void onCreate() {
        createdAt = Instant.now();
        updatedAt = createdAt;
    }

    @PreUpdate
    void onUpdate() {
        updatedAt = Instant.now();
    }
}
```

## Service Layer

```java
package com.mycompany.myapp.service;

import com.mycompany.myapp.domain.User;
import com.mycompany.myapp.dto.CreateUserRequest;
import com.mycompany.myapp.dto.UpdateUserRequest;
import com.mycompany.myapp.exception.DuplicateEmailException;
import com.mycompany.myapp.exception.UserNotFoundException;
import com.mycompany.myapp.repository.UserRepository;
import lombok.RequiredArgsConstructor;
import org.springframework.security.crypto.password.PasswordEncoder;
import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;

import java.util.Optional;

@Service
@RequiredArgsConstructor
@Transactional(readOnly = true)
public class UserService {

    private final UserRepository userRepository;
    private final PasswordEncoder passwordEncoder;

    @Transactional
    public User createUser(CreateUserRequest request) {
        if (userRepository.existsByEmail(request.getEmail())) {
            throw new DuplicateEmailException(request.getEmail());
        }

        var user = User.create(
            request.getEmail(),
            request.getName()
        );

        return userRepository.save(user);
    }

    public Optional<User> getUserById(String id) {
        return userRepository.findById(id);
    }

    public User getUserByIdOrThrow(String id) {
        return userRepository.findById(id)
            .orElseThrow(() -> new UserNotFoundException(id));
    }

    @Transactional
    public User updateUser(String id, UpdateUserRequest request) {
        var user = getUserByIdOrThrow(id);

        var updatedUser = user
            .withName(request.getName())
            .withRole(request.getRole());

        return userRepository.save(updatedUser);
    }

    @Transactional
    public void deleteUser(String id) {
        if (!userRepository.existsById(id)) {
            throw new UserNotFoundException(id);
        }
        userRepository.deleteById(id);
    }
}
```

## REST Controllers

```java
package com.mycompany.myapp.controller;

import com.mycompany.myapp.dto.*;
import com.mycompany.myapp.service.UserService;
import jakarta.validation.Valid;
import lombok.RequiredArgsConstructor;
import org.springframework.http.HttpStatus;
import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.*;

import java.util.List;

@RestController
@RequestMapping("/api/users")
@RequiredArgsConstructor
public class UserController {

    private final UserService userService;
    private final UserMapper userMapper;

    @GetMapping
    public List<UserResponse> listUsers() {
        return userService.getAllUsers().stream()
            .map(userMapper::toResponse)
            .toList();
    }

    @GetMapping("/{id}")
    public ResponseEntity<UserResponse> getUser(@PathVariable String id) {
        return userService.getUserById(id)
            .map(userMapper::toResponse)
            .map(ResponseEntity::ok)
            .orElse(ResponseEntity.notFound().build());
    }

    @PostMapping
    @ResponseStatus(HttpStatus.CREATED)
    public UserResponse createUser(@Valid @RequestBody CreateUserRequest request) {
        var user = userService.createUser(request);
        return userMapper.toResponse(user);
    }

    @PutMapping("/{id}")
    public UserResponse updateUser(
            @PathVariable String id,
            @Valid @RequestBody UpdateUserRequest request) {
        var user = userService.updateUser(id, request);
        return userMapper.toResponse(user);
    }

    @DeleteMapping("/{id}")
    @ResponseStatus(HttpStatus.NO_CONTENT)
    public void deleteUser(@PathVariable String id) {
        userService.deleteUser(id);
    }
}
```

## Exception Handling

```java
package com.mycompany.myapp.exception;

import org.springframework.http.HttpStatus;
import org.springframework.http.ProblemDetail;
import org.springframework.web.bind.MethodArgumentNotValidException;
import org.springframework.web.bind.annotation.ExceptionHandler;
import org.springframework.web.bind.annotation.RestControllerAdvice;

import java.net.URI;

@RestControllerAdvice
public class GlobalExceptionHandler {

    @ExceptionHandler(UserNotFoundException.class)
    public ProblemDetail handleUserNotFound(UserNotFoundException ex) {
        var problem = ProblemDetail.forStatusAndDetail(
            HttpStatus.NOT_FOUND,
            ex.getMessage()
        );
        problem.setTitle("User Not Found");
        problem.setType(URI.create("https://api.myapp.com/errors/user-not-found"));
        return problem;
    }

    @ExceptionHandler(DuplicateEmailException.class)
    public ProblemDetail handleDuplicateEmail(DuplicateEmailException ex) {
        var problem = ProblemDetail.forStatusAndDetail(
            HttpStatus.CONFLICT,
            ex.getMessage()
        );
        problem.setTitle("Duplicate Email");
        problem.setType(URI.create("https://api.myapp.com/errors/duplicate-email"));
        return problem;
    }

    @ExceptionHandler(MethodArgumentNotValidException.class)
    public ProblemDetail handleValidation(MethodArgumentNotValidException ex) {
        var problem = ProblemDetail.forStatusAndDetail(
            HttpStatus.BAD_REQUEST,
            "Validation failed"
        );
        problem.setTitle("Validation Error");

        var errors = ex.getBindingResult().getFieldErrors().stream()
            .map(e -> new ValidationError(e.getField(), e.getDefaultMessage()))
            .toList();

        problem.setProperty("errors", errors);
        return problem;
    }

    record ValidationError(String field, String message) {}
}
```

## Commands
|
||||||
|
|
||||||
|
```bash
|
||||||
|
# Maven
|
||||||
|
mvn clean install # Build and test
|
||||||
|
mvn test # Run tests only
|
||||||
|
mvn verify # Run integration tests
|
||||||
|
mvn spring-boot:run # Run application
|
||||||
|
mvn jacoco:report # Generate coverage report
|
||||||
|
|
||||||
|
# Gradle
|
||||||
|
./gradlew build # Build and test
|
||||||
|
./gradlew test # Run tests only
|
||||||
|
./gradlew integrationTest # Run integration tests
|
||||||
|
./gradlew bootRun # Run application
|
||||||
|
./gradlew jacocoTestReport # Generate coverage report
|
||||||
|
|
||||||
|
# Common flags
|
||||||
|
-DskipTests # Skip tests
|
||||||
|
-Dspring.profiles.active=test # Set profile
|
||||||
|
```

## Dependencies (pom.xml)

```xml
<dependencies>
    <!-- Spring Boot -->
    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-web</artifactId>
    </dependency>
    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-validation</artifactId>
    </dependency>
    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-data-jpa</artifactId>
    </dependency>

    <!-- Lombok -->
    <dependency>
        <groupId>org.projectlombok</groupId>
        <artifactId>lombok</artifactId>
        <scope>provided</scope>
    </dependency>

    <!-- Testing -->
    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-test</artifactId>
        <scope>test</scope>
    </dependency>
    <dependency>
        <groupId>org.assertj</groupId>
        <artifactId>assertj-core</artifactId>
        <scope>test</scope>
    </dependency>
</dependencies>
```

446	.claude/skills/languages/python/SKILL.md	Normal file
@@ -0,0 +1,446 @@
---
name: python-fastapi
description: Python development with FastAPI, pytest, Ruff, Pydantic v2, and SQLAlchemy async patterns. Use when writing Python code, tests, or APIs.
---

# Python Development Skill

## Project Structure

```
src/
├── backend/
│   ├── __init__.py
│   ├── main.py              # FastAPI app entry
│   ├── config.py            # Pydantic Settings
│   ├── domains/
│   │   └── users/
│   │       ├── __init__.py
│   │       ├── router.py    # API routes
│   │       ├── service.py   # Business logic
│   │       ├── models.py    # SQLAlchemy models
│   │       └── schemas.py   # Pydantic schemas
│   └── shared/
│       ├── database.py      # DB session management
│       └── models/          # Base models
tests/
├── backend/
│   ├── conftest.py          # Shared fixtures
│   └── domains/
│       └── users/
│           └── test_service.py
```

## Testing with pytest

### Configuration (pyproject.toml)
```toml
[tool.pytest.ini_options]
testpaths = ["tests"]
asyncio_mode = "auto"
asyncio_default_fixture_loop_scope = "function"
addopts = [
    "-v",
    "--strict-markers",
    "--cov=src",
    "--cov-report=term-missing",
    "--cov-fail-under=80"
]

[tool.coverage.run]
branch = true
source = ["src"]
omit = ["*/__init__.py", "*/migrations/*"]
```

### Async Test Fixtures
```python
# tests/conftest.py
import pytest
from sqlalchemy.ext.asyncio import AsyncSession, async_sessionmaker, create_async_engine

from src.backend.shared.models.base import Base


@pytest.fixture
async def db_session():
    """Create in-memory SQLite for fast tests."""
    engine = create_async_engine(
        "sqlite+aiosqlite:///:memory:",
        echo=False,
    )

    async with engine.begin() as conn:
        await conn.run_sync(Base.metadata.create_all)

    # SQLAlchemy 2.0's async_sessionmaker replaces sessionmaker(class_=AsyncSession)
    session_factory = async_sessionmaker(engine, expire_on_commit=False)

    async with session_factory() as session:
        yield session
        await session.rollback()

    await engine.dispose()
```

### Factory Functions for Test Data
```python
# tests/factories.py
from typing import Any

from src.backend.domains.users.schemas import UserCreate


def get_mock_user_create(overrides: dict[str, Any] | None = None) -> UserCreate:
    """Factory for UserCreate with sensible defaults."""
    defaults = {
        "email": "test@example.com",
        "name": "Test User",
        "password": "securepassword123",
    }
    return UserCreate(**(defaults | (overrides or {})))


def get_mock_user_dict(overrides: dict[str, Any] | None = None) -> dict[str, Any]:
    """Factory for raw user dict."""
    defaults = {
        "id": "user_123",
        "email": "test@example.com",
        "name": "Test User",
        "is_active": True,
    }
    return defaults | (overrides or {})
```

### Test Patterns
```python
# tests/backend/domains/users/test_service.py
import pytest

from src.backend.domains.users.service import UserService
from tests.factories import get_mock_user_create


class TestUserService:
    """Test user service behavior through public API."""

    async def test_create_user_returns_user_with_id(self, db_session):
        """Creating a user should return user with generated ID."""
        service = UserService(db_session)
        user_data = get_mock_user_create()

        result = await service.create_user(user_data)

        assert result.id is not None
        assert result.email == user_data.email

    async def test_create_user_hashes_password(self, db_session):
        """Password should be hashed, not stored in plain text."""
        service = UserService(db_session)
        user_data = get_mock_user_create({"password": "mypassword"})

        result = await service.create_user(user_data)

        assert result.hashed_password != "mypassword"
        assert len(result.hashed_password) > 50  # Hashed

    async def test_create_duplicate_email_raises_error(self, db_session):
        """Duplicate email should raise ValueError."""
        service = UserService(db_session)
        user_data = get_mock_user_create()
        await service.create_user(user_data)

        with pytest.raises(ValueError, match="Email already exists"):
            await service.create_user(user_data)

    async def test_get_user_not_found_returns_none(self, db_session):
        """Non-existent user should return None."""
        service = UserService(db_session)

        result = await service.get_user("nonexistent_id")

        assert result is None
```

## Pydantic v2 Patterns

### Schemas with Validation
```python
# src/backend/domains/users/schemas.py
from datetime import datetime
from pydantic import BaseModel, EmailStr, Field, field_validator


class UserBase(BaseModel):
    """Base user schema with shared fields."""

    email: EmailStr
    name: str = Field(..., min_length=1, max_length=100)


class UserCreate(UserBase):
    """Schema for creating a user."""

    password: str = Field(..., min_length=8)

    @field_validator("password")
    @classmethod
    def password_must_be_strong(cls, v: str) -> str:
        if not any(c.isupper() for c in v):
            raise ValueError("Password must contain uppercase letter")
        if not any(c.isdigit() for c in v):
            raise ValueError("Password must contain a digit")
        return v


class UserResponse(UserBase):
    """Schema for user responses (no password)."""

    id: str
    is_active: bool
    created_at: datetime

    model_config = {"from_attributes": True}
```

### Configuration with Pydantic Settings
```python
# src/backend/config.py
from functools import lru_cache
from pydantic_settings import BaseSettings, SettingsConfigDict


class Settings(BaseSettings):
    """Application settings from environment."""

    model_config = SettingsConfigDict(
        env_file=".env",
        env_file_encoding="utf-8",
        case_sensitive=False,
    )

    # Database
    database_url: str
    database_pool_size: int = 5

    # Redis
    redis_url: str = "redis://localhost:6379"

    # Security
    secret_key: str
    access_token_expire_minutes: int = 30

    # AWS
    aws_region: str = "eu-west-2"


@lru_cache
def get_settings() -> Settings:
    return Settings()
```

## FastAPI Patterns

### Router with Dependency Injection
```python
# src/backend/domains/users/router.py
from fastapi import APIRouter, Depends, HTTPException, status
from sqlalchemy.ext.asyncio import AsyncSession

from src.backend.shared.database import get_db_session
from .schemas import UserCreate, UserResponse
from .service import UserService

router = APIRouter(prefix="/users", tags=["users"])


def get_user_service(
    session: AsyncSession = Depends(get_db_session),
) -> UserService:
    return UserService(session)


@router.post("/", response_model=UserResponse, status_code=status.HTTP_201_CREATED)
async def create_user(
    user_data: UserCreate,
    service: UserService = Depends(get_user_service),
) -> UserResponse:
    """Create a new user."""
    try:
        return await service.create_user(user_data)
    except ValueError as e:
        raise HTTPException(status_code=status.HTTP_400_BAD_REQUEST, detail=str(e)) from e


@router.get("/{user_id}", response_model=UserResponse)
async def get_user(
    user_id: str,
    service: UserService = Depends(get_user_service),
) -> UserResponse:
    """Get user by ID."""
    user = await service.get_user(user_id)
    if not user:
        raise HTTPException(status_code=status.HTTP_404_NOT_FOUND, detail="User not found")
    return user
```

## SQLAlchemy 2.0 Async Patterns

### Model Definition
```python
# src/backend/domains/users/models.py
from datetime import UTC, datetime
from sqlalchemy import String, Boolean, DateTime
from sqlalchemy.orm import Mapped, mapped_column

from src.backend.shared.models.base import Base


class User(Base):
    __tablename__ = "users"

    id: Mapped[str] = mapped_column(String(36), primary_key=True)
    email: Mapped[str] = mapped_column(String(255), unique=True, index=True)
    name: Mapped[str] = mapped_column(String(100))
    hashed_password: Mapped[str] = mapped_column(String(255))
    is_active: Mapped[bool] = mapped_column(Boolean, default=True)
    created_at: Mapped[datetime] = mapped_column(
        DateTime, default=lambda: datetime.now(UTC)  # utcnow() is deprecated
    )
```

### Async Repository Pattern
```python
# src/backend/domains/users/repository.py
from sqlalchemy import select
from sqlalchemy.ext.asyncio import AsyncSession

from .models import User


class UserRepository:
    def __init__(self, session: AsyncSession):
        self._session = session

    async def get_by_id(self, user_id: str) -> User | None:
        result = await self._session.execute(
            select(User).where(User.id == user_id)
        )
        return result.scalar_one_or_none()

    async def get_by_email(self, email: str) -> User | None:
        result = await self._session.execute(
            select(User).where(User.email == email)
        )
        return result.scalar_one_or_none()

    async def create(self, user: User) -> User:
        self._session.add(user)
        await self._session.commit()
        await self._session.refresh(user)
        return user
```

## Ruff Configuration

```toml
# pyproject.toml
[tool.ruff]
line-length = 120
target-version = "py311"

[tool.ruff.lint]
select = [
    "E",      # pycodestyle errors
    "F",      # pyflakes
    "I",      # isort
    "UP",     # pyupgrade
    "B",      # flake8-bugbear
    "SIM",    # flake8-simplify
    "ASYNC",  # flake8-async
]
ignore = [
    "B008",   # function-call-in-default-argument (FastAPI Depends)
    "E501",   # line-too-long (handled by formatter)
]

[tool.ruff.lint.per-file-ignores]
"__init__.py" = ["F401"]  # unused imports (re-exports)
"tests/*" = ["S101"]      # assert allowed in tests
```

## Commands

```bash
# Testing
pytest                                   # Run all tests
pytest tests/backend/domains/users/      # Run specific directory
pytest -k "test_create"                  # Run tests matching pattern
pytest --cov=src --cov-report=html       # Coverage with HTML report
pytest -x                                # Stop on first failure
pytest --lf                              # Run last failed tests

# Linting & Formatting
ruff check .                             # Lint check
ruff check . --fix                       # Auto-fix issues
ruff format .                            # Format code
ruff format . --check                    # Check formatting

# Type Checking
mypy src                                 # Full type check
pyright src                              # Alternative type checker

# Development
uvicorn src.backend.main:app --reload    # Dev server
alembic revision --autogenerate -m "msg" # Create migration
alembic upgrade head                     # Apply migrations
```

## Anti-Patterns to Avoid

```python
# BAD: Mutable default argument
def process_items(items: list = []):  # Creates shared state!
    items.append("new")
    return items

# GOOD: Use None default
def process_items(items: list | None = None) -> list:
    if items is None:
        items = []
    return [*items, "new"]


# BAD: Bare except
try:
    risky_operation()
except:  # Catches everything including SystemExit!
    pass

# GOOD: Specific exceptions
try:
    risky_operation()
except ValueError as e:
    logger.error(f"Validation failed: {e}")
    raise


# BAD: String formatting in SQL (injection risk!)
query = f"SELECT * FROM users WHERE email = '{email}'"

# GOOD: Parameterized query
stmt = select(User).where(User.email == email)


# BAD: Sync in async context
async def get_data():
    return requests.get(url)  # Blocks event loop!

# GOOD: Use async client
async def get_data():
    async with httpx.AsyncClient() as client:
        return await client.get(url)
```

574	.claude/skills/languages/rust/SKILL.md	Normal file
@@ -0,0 +1,574 @@
---
name: rust-async
description: Rust development with Tokio async runtime, cargo test, clippy, and systems programming patterns. Use when writing Rust code, agents, or system utilities.
---

# Rust Development Skill

## Project Structure

```
my-agent/
├── Cargo.toml
├── Cargo.lock
├── src/
│   ├── main.rs              # Binary entry point
│   ├── lib.rs               # Library root (if dual crate)
│   ├── config.rs            # Configuration handling
│   ├── error.rs             # Error types
│   ├── client/
│   │   ├── mod.rs
│   │   └── http.rs
│   └── discovery/
│       ├── mod.rs
│       ├── system.rs
│       └── network.rs
├── tests/
│   └── integration/
│       └── discovery_test.rs
└── benches/
    └── performance.rs
```

## Cargo Configuration

```toml
# Cargo.toml
[package]
name = "my-agent"
version = "0.1.0"
edition = "2021"
rust-version = "1.75"

[dependencies]
# Async runtime
tokio = { version = "1.35", features = ["full"] }

# HTTP client
reqwest = { version = "0.11", features = ["json", "socks"] }

# Serialization
serde = { version = "1.0", features = ["derive"] }
serde_json = "1.0"

# Error handling
thiserror = "1.0"
anyhow = "1.0"

# Logging
tracing = "0.1"
tracing-subscriber = { version = "0.3", features = ["env-filter"] }

# System info
sysinfo = "0.30"
hostname = "0.3"  # used by Config::from_env for the default agent id

[dev-dependencies]
tokio-test = "0.4"
mockall = "0.12"
tempfile = "3.10"

[profile.release]
opt-level = "z"     # Optimize for size
lto = true          # Link-time optimization
codegen-units = 1   # Better optimization
strip = true        # Strip symbols
panic = "abort"     # Smaller binary

[lints.rust]
unsafe_code = "forbid"

[lints.clippy]
all = "deny"
pedantic = "warn"
nursery = "warn"
```

## Error Handling

### Custom Error Types
```rust
// src/error.rs
use thiserror::Error;

#[derive(Error, Debug)]
pub enum AgentError {
    #[error("Configuration error: {0}")]
    Config(String),

    #[error("Network error: {0}")]
    Network(#[from] reqwest::Error),

    #[error("IO error: {0}")]
    Io(#[from] std::io::Error),

    #[error("Serialization error: {0}")]
    Serialization(#[from] serde_json::Error),

    #[error("Discovery failed: {message}")]
    Discovery { message: String, source: Option<Box<dyn std::error::Error + Send + Sync>> },

    #[error("Connection timeout after {duration_secs}s")]
    Timeout { duration_secs: u64 },
}

pub type Result<T> = std::result::Result<T, AgentError>;
```

### Result Pattern Usage
```rust
// src/client/http.rs
use crate::error::{AgentError, Result};

// Clone is cheap: reqwest::Client is internally reference-counted.
// The integration tests below rely on cloning the client across tasks.
#[derive(Clone)]
pub struct HttpClient {
    client: reqwest::Client,
    base_url: String,
}

impl HttpClient {
    pub fn new(base_url: &str, timeout_secs: u64) -> Result<Self> {
        let client = reqwest::Client::builder()
            .timeout(std::time::Duration::from_secs(timeout_secs))
            .build()
            .map_err(AgentError::Network)?;

        Ok(Self {
            client,
            base_url: base_url.to_string(),
        })
    }

    pub async fn get<T: serde::de::DeserializeOwned>(&self, path: &str) -> Result<T> {
        let url = format!("{}{}", self.base_url, path);

        let response = self
            .client
            .get(&url)
            .send()
            .await?
            .error_for_status()?
            .json::<T>()
            .await?;

        Ok(response)
    }

    pub async fn post<T, R>(&self, path: &str, body: &T) -> Result<R>
    where
        T: serde::Serialize,
        R: serde::de::DeserializeOwned,
    {
        let url = format!("{}{}", self.base_url, path);

        let response = self
            .client
            .post(&url)
            .json(body)
            .send()
            .await?
            .error_for_status()?
            .json::<R>()
            .await?;

        Ok(response)
    }
}
```

## Async Patterns with Tokio

### Main Entry Point
```rust
// src/main.rs
use tracing_subscriber::{layer::SubscriberExt, util::SubscriberInitExt};

mod config;
mod error;
mod client;
mod discovery;

use crate::config::Config;
use crate::error::Result;

#[tokio::main]
async fn main() -> Result<()> {
    // Initialize tracing
    tracing_subscriber::registry()
        .with(tracing_subscriber::EnvFilter::new(
            std::env::var("RUST_LOG").unwrap_or_else(|_| "info".into()),
        ))
        .with(tracing_subscriber::fmt::layer())
        .init();

    let config = Config::from_env()?;

    tracing::info!(version = env!("CARGO_PKG_VERSION"), "Starting agent");

    run(config).await
}

async fn run(config: Config) -> Result<()> {
    let client = client::HttpClient::new(&config.server_url, config.timeout_secs)?;

    // Main loop with graceful shutdown
    let mut interval = tokio::time::interval(config.poll_interval);

    loop {
        tokio::select! {
            _ = interval.tick() => {
                if let Err(e) = discovery::collect_and_send(&client).await {
                    tracing::error!(error = %e, "Discovery cycle failed");
                }
            }
            _ = tokio::signal::ctrl_c() => {
                tracing::info!("Shutdown signal received");
                break;
            }
        }
    }

    Ok(())
}
```

### Concurrent Operations
```rust
// src/discovery/mod.rs
use tokio::task::JoinSet;
use crate::error::Result;

pub async fn discover_all() -> Result<SystemInfo> {
    let mut tasks = JoinSet::new();

    // Spawn concurrent discovery tasks
    tasks.spawn(async { ("cpu", discover_cpu().await) });
    tasks.spawn(async { ("memory", discover_memory().await) });
    tasks.spawn(async { ("disk", discover_disk().await) });
    tasks.spawn(async { ("network", discover_network().await) });

    let mut info = SystemInfo::default();

    // Collect results as they complete
    while let Some(result) = tasks.join_next().await {
        match result {
            Ok((name, Ok(data))) => {
                tracing::debug!(component = name, "Discovery completed");
                info.merge(name, data);
            }
            Ok((name, Err(e))) => {
                tracing::warn!(component = name, error = %e, "Discovery failed");
            }
            Err(e) => {
                tracing::error!(error = %e, "Task panicked");
            }
        }
    }

    Ok(info)
}
```

## Testing Patterns

### Unit Tests
```rust
// src/discovery/system.rs
use serde::{Deserialize, Serialize};

#[derive(Debug, Clone, Serialize, Deserialize, PartialEq)]
pub struct CpuInfo {
    pub cores: usize,
    pub usage_percent: f32,
    pub model: String,
}

impl CpuInfo {
    pub fn is_high_usage(&self) -> bool {
        self.usage_percent > 80.0
    }
}

#[cfg(test)]
mod tests {
    use super::*;

    fn get_mock_cpu_info(overrides: Option<CpuInfo>) -> CpuInfo {
        let default = CpuInfo {
            cores: 4,
            usage_percent: 25.0,
            model: "Test CPU".to_string(),
        };
        overrides.unwrap_or(default)
    }

    #[test]
    fn test_is_high_usage_returns_true_above_threshold() {
        let cpu = get_mock_cpu_info(Some(CpuInfo {
            usage_percent: 85.0,
            ..get_mock_cpu_info(None)
        }));

        assert!(cpu.is_high_usage());
    }

    #[test]
    fn test_is_high_usage_returns_false_below_threshold() {
        let cpu = get_mock_cpu_info(Some(CpuInfo {
            usage_percent: 50.0,
            ..get_mock_cpu_info(None)
        }));

        assert!(!cpu.is_high_usage());
    }

    #[test]
    fn test_is_high_usage_returns_false_at_boundary() {
        let cpu = get_mock_cpu_info(Some(CpuInfo {
            usage_percent: 80.0,
            ..get_mock_cpu_info(None)
        }));

        assert!(!cpu.is_high_usage());
    }
}
```

### Async Tests
```rust
// tests/integration/client_test.rs
use my_agent::client::HttpClient;

#[tokio::test]
async fn test_client_handles_timeout() {
    let client = HttpClient::new("http://localhost:9999", 1).unwrap();

    let result: Result<serde_json::Value, _> = client.get("/test").await;

    assert!(result.is_err());
}

#[tokio::test]
async fn test_concurrent_requests_complete() {
    let client = HttpClient::new("https://httpbin.org", 30).unwrap();

    let handles: Vec<_> = (0..5)
        .map(|i| {
            let client = client.clone();
            tokio::spawn(async move {
                client.get::<serde_json::Value>(&format!("/delay/{}", i % 2)).await
            })
        })
        .collect();

    for handle in handles {
        let result = handle.await.unwrap();
        assert!(result.is_ok());
    }
}
```

### Mocking with Mockall
```rust
// src/discovery/mod.rs
#[cfg_attr(test, mockall::automock)]
pub trait SystemDiscovery {
    fn get_cpu_info(&self) -> crate::error::Result<CpuInfo>;
    fn get_memory_info(&self) -> crate::error::Result<MemoryInfo>;
}

// src/discovery/collector.rs
pub struct Collector<D: SystemDiscovery> {
    discovery: D,
}

impl<D: SystemDiscovery> Collector<D> {
    pub fn new(discovery: D) -> Self {
        Self { discovery }
    }

    pub fn collect(&self) -> crate::error::Result<SystemSnapshot> {
        let cpu = self.discovery.get_cpu_info()?;
        let memory = self.discovery.get_memory_info()?;

        Ok(SystemSnapshot { cpu, memory })
    }
}

#[cfg(test)]
mod tests {
    use super::*;

    #[test]
    fn test_collector_combines_cpu_and_memory() {
        let mut mock = MockSystemDiscovery::new();

        mock.expect_get_cpu_info()
            .times(1)
            .returning(|| Ok(CpuInfo {
                cores: 4,
                usage_percent: 50.0,
                model: "Mock CPU".to_string(),
            }));

        mock.expect_get_memory_info()
            .times(1)
            .returning(|| Ok(MemoryInfo {
                total_gb: 16.0,
                available_gb: 8.0,
            }));

        let collector = Collector::new(mock);
        let snapshot = collector.collect().unwrap();

        assert_eq!(snapshot.cpu.cores, 4);
        assert_eq!(snapshot.memory.total_gb, 16.0);
    }
}
```

## Configuration

```rust
// src/config.rs
use crate::error::{AgentError, Result};
use std::time::Duration;

#[derive(Debug, Clone)]
pub struct Config {
    pub server_url: String,
    pub timeout_secs: u64,
    pub poll_interval: Duration,
    pub agent_id: String,
}

impl Config {
    pub fn from_env() -> Result<Self> {
        let server_url = std::env::var("SERVER_URL")
            .map_err(|_| AgentError::Config("SERVER_URL not set".to_string()))?;

        let timeout_secs = std::env::var("TIMEOUT_SECS")
            .unwrap_or_else(|_| "30".to_string())
            .parse()
            .map_err(|_| AgentError::Config("Invalid TIMEOUT_SECS".to_string()))?;

        let poll_interval_secs: u64 = std::env::var("POLL_INTERVAL_SECS")
            .unwrap_or_else(|_| "60".to_string())
            .parse()
            .map_err(|_| AgentError::Config("Invalid POLL_INTERVAL_SECS".to_string()))?;

        // Falls back to the machine hostname (via the `hostname` crate), then "unknown"
        let agent_id = std::env::var("AGENT_ID")
            .unwrap_or_else(|_| hostname::get()
                .map(|h| h.to_string_lossy().to_string())
                .unwrap_or_else(|_| "unknown".to_string()));

        Ok(Self {
            server_url,
            timeout_secs,
            poll_interval: Duration::from_secs(poll_interval_secs),
            agent_id,
        })
    }
}
```

## Clippy Configuration

```toml
# clippy.toml
cognitive-complexity-threshold = 10
too-many-arguments-threshold = 5
type-complexity-threshold = 200
```

## Commands

```bash
# Building
cargo build                                    # Debug build
cargo build --release                          # Release build
cargo build --target x86_64-unknown-linux-musl # Static binary

# Testing
cargo test                                     # Run all tests
cargo test -- --nocapture                      # Show println! output
cargo test test_name                           # Run specific test
cargo test --test integration                  # Run integration tests only

# Linting
cargo clippy                                   # Run clippy
cargo clippy -- -D warnings                    # Treat warnings as errors
cargo clippy --fix                             # Auto-fix issues

# Formatting
cargo fmt                                      # Format code
cargo fmt --check                              # Check formatting

# Other
cargo doc --open                               # Generate and open docs
cargo bench                                    # Run benchmarks
cargo tree                                     # Show dependency tree
cargo audit                                    # Check for vulnerabilities
```
|
||||||
|
|
||||||
|
## Anti-Patterns to Avoid
|
||||||
|
|
||||||
|
```rust
|
||||||
|
// BAD: Unwrap without context
|
||||||
|
let config = Config::from_env().unwrap();
|
||||||
|
|
||||||
|
// GOOD: Provide context or use ?
|
||||||
|
let config = Config::from_env()
|
||||||
|
.expect("Failed to load configuration from environment");
|
||||||
|
|
||||||
|
// Or in functions returning Result:
|
||||||
|
let config = Config::from_env()?;
|
||||||
|
|
||||||
|
|
||||||
|
// BAD: Clone when not needed
|
||||||
|
fn process(data: &Vec<String>) {
|
||||||
|
let cloned = data.clone(); // Unnecessary allocation
|
||||||
|
for item in cloned.iter() {
|
||||||
|
println!("{}", item);
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
// GOOD: Use references
|
||||||
|
fn process(data: &[String]) {
|
||||||
|
for item in data {
|
||||||
|
println!("{}", item);
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
|
||||||
|
// BAD: String concatenation in loop
|
||||||
|
let mut result = String::new();
|
||||||
|
for item in items {
|
||||||
|
result = result + &item + ", "; // Allocates each iteration
|
||||||
|
}
|
||||||
|
|
||||||
|
// GOOD: Use push_str or join
|
||||||
|
let result = items.join(", ");
|
||||||
|
|
||||||
|
|
||||||
|
// BAD: Blocking in async context
|
||||||
|
async fn fetch_data() {
|
||||||
|
std::thread::sleep(Duration::from_secs(1)); // Blocks runtime!
|
||||||
|
}
|
||||||
|
|
||||||
|
// GOOD: Use async sleep
|
||||||
|
async fn fetch_data() {
|
||||||
|
tokio::time::sleep(Duration::from_secs(1)).await;
|
||||||
|
}
|
||||||
|
|
||||||
|
|
||||||
|
// BAD: Ignoring errors silently
|
||||||
|
let _ = file.write_all(data);
|
||||||
|
|
||||||
|
// GOOD: Handle or propagate errors
|
||||||
|
file.write_all(data)?;
|
||||||
|
// Or log if truly ignorable:
|
||||||
|
if let Err(e) = file.write_all(data) {
|
||||||
|
tracing::warn!(error = %e, "Failed to write data");
|
||||||
|
}
|
||||||
|
```
|
624
.claude/skills/languages/typescript/SKILL.md
Normal file
@@ -0,0 +1,624 @@
---
name: typescript-react
description: TypeScript development with React, Vitest, ESLint, Zod, TanStack Query, and Radix UI patterns. Use when writing TypeScript, React components, or frontend code.
---

# TypeScript React Development Skill

## Project Structure

```
src/
├── main.tsx              # App entry point
├── App.tsx               # Root component
├── features/
│   └── users/
│       ├── components/
│       │   ├── UserList.tsx
│       │   └── UserForm.tsx
│       ├── hooks/
│       │   └── useUsers.ts
│       ├── api/
│       │   └── users.ts
│       └── schemas/
│           └── user.schema.ts
├── shared/
│   ├── components/
│   │   └── ui/           # Radix + Tailwind components
│   ├── hooks/
│   └── lib/
│       └── api-client.ts
└── test/
    └── setup.ts          # Vitest setup
tests/
└── features/
    └── users/
        └── UserList.test.tsx
```

## Vitest Configuration

```typescript
// vitest.config.ts
import { defineConfig } from 'vitest/config';
import react from '@vitejs/plugin-react';
import path from 'path';

export default defineConfig({
  plugins: [react()],
  test: {
    globals: true,
    environment: 'jsdom',
    setupFiles: ['./src/test/setup.ts'],
    include: ['**/*.test.{ts,tsx}'],
    coverage: {
      provider: 'v8',
      reporter: ['text', 'json', 'html', 'cobertura'],
      exclude: [
        'node_modules/',
        'src/test/',
        '**/*.d.ts',
        '**/*.config.*',
        '**/index.ts',
      ],
      thresholds: {
        global: {
          branches: 80,
          functions: 80,
          lines: 80,
          statements: 80,
        },
      },
    },
  },
  resolve: {
    alias: {
      '@': path.resolve(__dirname, './src'),
    },
  },
});
```

### Test Setup
```typescript
// src/test/setup.ts
import '@testing-library/jest-dom/vitest';
import { cleanup } from '@testing-library/react';
import { afterEach, vi } from 'vitest';

// Cleanup after each test
afterEach(() => {
  cleanup();
});

// Mock localStorage
const localStorageMock = {
  getItem: vi.fn(),
  setItem: vi.fn(),
  removeItem: vi.fn(),
  clear: vi.fn(),
};
Object.defineProperty(window, 'localStorage', { value: localStorageMock });

// Mock ResizeObserver (required for Radix UI)
class ResizeObserverMock {
  observe = vi.fn();
  unobserve = vi.fn();
  disconnect = vi.fn();
}
window.ResizeObserver = ResizeObserverMock;

// Mock scrollIntoView
Element.prototype.scrollIntoView = vi.fn();

// Mock pointer capture (for Radix Select)
Element.prototype.hasPointerCapture = vi.fn();
Element.prototype.setPointerCapture = vi.fn();
Element.prototype.releasePointerCapture = vi.fn();
```

## Testing with React Testing Library

### Component Testing Pattern
```typescript
// tests/features/users/UserList.test.tsx
import { render, screen, waitFor } from '@testing-library/react';
import userEvent from '@testing-library/user-event';
import { describe, it, expect, vi } from 'vitest';
import { QueryClient, QueryClientProvider } from '@tanstack/react-query';

import { UserList } from '@/features/users/components/UserList';
import { getMockUser } from './factories';

const createWrapper = () => {
  const queryClient = new QueryClient({
    defaultOptions: {
      queries: { retry: false },
    },
  });

  return ({ children }: { children: React.ReactNode }) => (
    <QueryClientProvider client={queryClient}>{children}</QueryClientProvider>
  );
};

describe('UserList', () => {
  it('should display users when data loads', async () => {
    const users = [
      getMockUser({ name: 'Alice' }),
      getMockUser({ name: 'Bob' }),
    ];

    render(<UserList users={users} />, { wrapper: createWrapper() });

    expect(screen.getByText('Alice')).toBeInTheDocument();
    expect(screen.getByText('Bob')).toBeInTheDocument();
  });

  it('should show loading state initially', () => {
    render(<UserList users={[]} isLoading />, { wrapper: createWrapper() });

    expect(screen.getByRole('progressbar')).toBeInTheDocument();
  });

  it('should call onDelete when delete button clicked', async () => {
    const user = userEvent.setup();
    const onDelete = vi.fn();
    const users = [getMockUser({ id: 'user-1', name: 'Alice' })];

    render(<UserList users={users} onDelete={onDelete} />, {
      wrapper: createWrapper(),
    });

    await user.click(screen.getByRole('button', { name: /delete alice/i }));

    expect(onDelete).toHaveBeenCalledWith('user-1');
  });

  it('should filter users by search term', async () => {
    const user = userEvent.setup();
    const users = [
      getMockUser({ name: 'Alice Smith' }),
      getMockUser({ name: 'Bob Jones' }),
    ];

    render(<UserList users={users} />, { wrapper: createWrapper() });

    await user.type(screen.getByRole('searchbox'), 'Alice');

    expect(screen.getByText('Alice Smith')).toBeInTheDocument();
    expect(screen.queryByText('Bob Jones')).not.toBeInTheDocument();
  });
});
```

### Factory Functions
```typescript
// tests/features/users/factories.ts
import type { User } from '@/features/users/schemas/user.schema';

export const getMockUser = (overrides?: Partial<User>): User => ({
  id: 'user-123',
  email: 'test@example.com',
  name: 'Test User',
  role: 'user',
  createdAt: new Date().toISOString(),
  ...overrides,
});

export const getMockUserList = (count: number): User[] =>
  Array.from({ length: count }, (_, i) =>
    getMockUser({ id: `user-${i}`, name: `User ${i}` })
  );
```
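The override spread is what keeps factories flexible: defaults come first, then caller-supplied fields win. A standalone sketch of the same pattern (local `User` type for illustration, no imports):

```typescript
// Standalone sketch of the factory-with-overrides pattern above.
type User = { id: string; name: string; role: string };

const getMockUser = (overrides?: Partial<User>): User => ({
  id: 'user-123',
  name: 'Test User',
  role: 'user',
  ...overrides, // caller-supplied fields override the defaults
});

console.log(getMockUser({ name: 'Alice' }).name); // 'Alice'
console.log(getMockUser({ name: 'Alice' }).id);   // 'user-123' (default kept)
```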

### Testing Hooks
```typescript
// tests/features/users/useUsers.test.tsx
import { renderHook, waitFor } from '@testing-library/react';
import { describe, it, expect, vi } from 'vitest';
import { QueryClient, QueryClientProvider } from '@tanstack/react-query';

import { useUsers } from '@/features/users/hooks/useUsers';
import * as api from '@/features/users/api/users';
import { getMockUser } from './factories';

vi.mock('@/features/users/api/users');

const createWrapper = () => {
  const queryClient = new QueryClient({
    defaultOptions: { queries: { retry: false } },
  });
  return ({ children }: { children: React.ReactNode }) => (
    <QueryClientProvider client={queryClient}>{children}</QueryClientProvider>
  );
};

describe('useUsers', () => {
  it('should return users on successful fetch', async () => {
    const mockUsers = [getMockUser()];
    vi.mocked(api.fetchUsers).mockResolvedValue(mockUsers);

    const { result } = renderHook(() => useUsers(), {
      wrapper: createWrapper(),
    });

    await waitFor(() => expect(result.current.isSuccess).toBe(true));

    expect(result.current.data).toEqual(mockUsers);
  });

  it('should return error state on failure', async () => {
    vi.mocked(api.fetchUsers).mockRejectedValue(new Error('Network error'));

    const { result } = renderHook(() => useUsers(), {
      wrapper: createWrapper(),
    });

    await waitFor(() => expect(result.current.isError).toBe(true));

    expect(result.current.error?.message).toBe('Network error');
  });
});
```

## Zod Schema Patterns

### Schema Definition
```typescript
// src/features/users/schemas/user.schema.ts
import { z } from 'zod';

export const userSchema = z.object({
  id: z.string().uuid(),
  email: z.string().email(),
  name: z.string().min(1).max(100),
  role: z.enum(['admin', 'user', 'guest']),
  createdAt: z.string().datetime(),
});

export type User = z.infer<typeof userSchema>;

export const userCreateSchema = userSchema.omit({ id: true, createdAt: true });
export type UserCreate = z.infer<typeof userCreateSchema>;

export const userUpdateSchema = userCreateSchema.partial();
export type UserUpdate = z.infer<typeof userUpdateSchema>;

// Validation at API boundary
export const parseUser = (data: unknown): User => userSchema.parse(data);

export const parseUsers = (data: unknown): User[] =>
  z.array(userSchema).parse(data);
```

### Form Validation with React Hook Form
```typescript
// src/features/users/components/UserForm.tsx
import { useForm } from 'react-hook-form';
import { zodResolver } from '@hookform/resolvers/zod';

import { userCreateSchema, type UserCreate } from '../schemas/user.schema';
import { Button } from '@/shared/components/ui/button';
import { Input } from '@/shared/components/ui/input';

type UserFormProps = {
  onSubmit: (data: UserCreate) => void;
  isLoading?: boolean;
};

export function UserForm({ onSubmit, isLoading }: UserFormProps) {
  const {
    register,
    handleSubmit,
    formState: { errors },
  } = useForm<UserCreate>({
    resolver: zodResolver(userCreateSchema),
  });

  return (
    <form onSubmit={handleSubmit(onSubmit)} className="space-y-4">
      <div>
        <Input
          {...register('email')}
          type="email"
          placeholder="Email"
          aria-invalid={!!errors.email}
        />
        {errors.email && (
          <p className="text-sm text-red-500">{errors.email.message}</p>
        )}
      </div>

      <div>
        <Input
          {...register('name')}
          placeholder="Name"
          aria-invalid={!!errors.name}
        />
        {errors.name && (
          <p className="text-sm text-red-500">{errors.name.message}</p>
        )}
      </div>

      <Button type="submit" disabled={isLoading}>
        {isLoading ? 'Creating...' : 'Create User'}
      </Button>
    </form>
  );
}
```

## TanStack Query Patterns

### API Functions
```typescript
// src/features/users/api/users.ts
import { apiClient } from '@/shared/lib/api-client';
import { parseUsers, parseUser, type User, type UserCreate } from '../schemas/user.schema';

export async function fetchUsers(): Promise<User[]> {
  const response = await apiClient.get('/users');
  return parseUsers(response.data);
}

export async function fetchUser(id: string): Promise<User> {
  const response = await apiClient.get(`/users/${id}`);
  return parseUser(response.data);
}

export async function createUser(data: UserCreate): Promise<User> {
  const response = await apiClient.post('/users', data);
  return parseUser(response.data);
}

export async function deleteUser(id: string): Promise<void> {
  await apiClient.delete(`/users/${id}`);
}
```

### Custom Hooks
```typescript
// src/features/users/hooks/useUsers.ts
import { useQuery, useMutation, useQueryClient } from '@tanstack/react-query';

import { fetchUsers, createUser, deleteUser } from '../api/users';
import type { UserCreate } from '../schemas/user.schema';

export const userKeys = {
  all: ['users'] as const,
  lists: () => [...userKeys.all, 'list'] as const,
  detail: (id: string) => [...userKeys.all, 'detail', id] as const,
};

export function useUsers() {
  return useQuery({
    queryKey: userKeys.lists(),
    queryFn: fetchUsers,
  });
}

export function useCreateUser() {
  const queryClient = useQueryClient();

  return useMutation({
    mutationFn: (data: UserCreate) => createUser(data),
    onSuccess: () => {
      queryClient.invalidateQueries({ queryKey: userKeys.lists() });
    },
  });
}

export function useDeleteUser() {
  const queryClient = useQueryClient();

  return useMutation({
    mutationFn: (id: string) => deleteUser(id),
    onSuccess: () => {
      queryClient.invalidateQueries({ queryKey: userKeys.lists() });
    },
  });
}
```
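The `userKeys` factory above is plain data and can be exercised in isolation. A minimal sketch (standalone re-declaration, no TanStack Query required) showing why hierarchical keys matter — invalidating on a shorter key matches every longer key that shares the prefix:

```typescript
// Standalone sketch of the query-key factory pattern (mirrors userKeys above).
const userKeys = {
  all: ['users'] as const,
  lists: () => [...userKeys.all, 'list'] as const,
  detail: (id: string) => [...userKeys.all, 'detail', id] as const,
};

// invalidateQueries({ queryKey: userKeys.all }) matches both keys below by prefix.
console.log(userKeys.lists());       // ['users', 'list']
console.log(userKeys.detail('u-1')); // ['users', 'detail', 'u-1']
```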

## ESLint Configuration

```javascript
// eslint.config.js
import js from '@eslint/js';
import tseslint from 'typescript-eslint';
import reactHooks from 'eslint-plugin-react-hooks';
import reactRefresh from 'eslint-plugin-react-refresh';

export default tseslint.config(
  { ignores: ['dist', 'coverage', 'node_modules'] },
  {
    extends: [js.configs.recommended, ...tseslint.configs.strictTypeChecked],
    files: ['**/*.{ts,tsx}'],
    languageOptions: {
      parserOptions: {
        project: ['./tsconfig.json', './tsconfig.node.json'],
        tsconfigRootDir: import.meta.dirname,
      },
    },
    plugins: {
      'react-hooks': reactHooks,
      'react-refresh': reactRefresh,
    },
    rules: {
      ...reactHooks.configs.recommended.rules,
      'react-refresh/only-export-components': [
        'warn',
        { allowConstantExport: true },
      ],
      '@typescript-eslint/no-unused-vars': [
        'error',
        { argsIgnorePattern: '^_', varsIgnorePattern: '^_' },
      ],
      '@typescript-eslint/no-explicit-any': 'error',
      '@typescript-eslint/explicit-function-return-type': 'off',
      '@typescript-eslint/consistent-type-imports': [
        'error',
        { prefer: 'type-imports' },
      ],
    },
  }
);
```

## TypeScript Configuration

```json
// tsconfig.json
{
  "compilerOptions": {
    "target": "ES2022",
    "useDefineForClassFields": true,
    "lib": ["ES2022", "DOM", "DOM.Iterable"],
    "module": "ESNext",
    "skipLibCheck": true,
    "moduleResolution": "bundler",
    "allowImportingTsExtensions": true,
    "resolveJsonModule": true,
    "isolatedModules": true,
    "noEmit": true,
    "jsx": "react-jsx",
    "strict": true,
    "noImplicitAny": true,
    "strictNullChecks": true,
    "noUnusedLocals": true,
    "noUnusedParameters": true,
    "noFallthroughCasesInSwitch": true,
    "noUncheckedIndexedAccess": true,
    "paths": {
      "@/*": ["./src/*"]
    }
  },
  "include": ["src", "tests"]
}
```

## Component Patterns

### Accessible Component with Radix
```typescript
// src/shared/components/ui/button.tsx
import * as React from 'react';
import { Slot } from '@radix-ui/react-slot';
import { cva, type VariantProps } from 'class-variance-authority';
import { cn } from '@/shared/lib/utils';

const buttonVariants = cva(
  'inline-flex items-center justify-center rounded-md text-sm font-medium transition-colors focus-visible:outline-none focus-visible:ring-2 focus-visible:ring-offset-2 disabled:pointer-events-none disabled:opacity-50',
  {
    variants: {
      variant: {
        default: 'bg-primary text-primary-foreground hover:bg-primary/90',
        destructive: 'bg-destructive text-destructive-foreground hover:bg-destructive/90',
        outline: 'border border-input hover:bg-accent hover:text-accent-foreground',
        ghost: 'hover:bg-accent hover:text-accent-foreground',
      },
      size: {
        default: 'h-10 px-4 py-2',
        sm: 'h-9 rounded-md px-3',
        lg: 'h-11 rounded-md px-8',
        icon: 'h-10 w-10',
      },
    },
    defaultVariants: {
      variant: 'default',
      size: 'default',
    },
  }
);

type ButtonProps = React.ButtonHTMLAttributes<HTMLButtonElement> &
  VariantProps<typeof buttonVariants> & {
    asChild?: boolean;
  };

export const Button = React.forwardRef<HTMLButtonElement, ButtonProps>(
  ({ className, variant, size, asChild = false, ...props }, ref) => {
    const Comp = asChild ? Slot : 'button';
    return (
      <Comp
        className={cn(buttonVariants({ variant, size, className }))}
        ref={ref}
        {...props}
      />
    );
  }
);
Button.displayName = 'Button';
```

## Commands

```bash
# Testing
npm test              # Run tests once
npm run test:watch    # Watch mode
npm run test:coverage # With coverage
npm run test:ui       # Vitest UI

# Linting & Type Checking
npm run lint          # ESLint
npm run lint:fix      # Auto-fix
npm run typecheck     # tsc --noEmit

# Development
npm run dev           # Vite dev server
npm run build         # Production build
npm run preview       # Preview build
```

## Anti-Patterns to Avoid

```typescript
// BAD: any type
const processData = (data: any) => data.value;

// GOOD: proper typing or unknown
const processData = (data: unknown): string => {
  const parsed = schema.parse(data);
  return parsed.value;
};


// BAD: Mutation in state update
const addItem = (item: Item) => {
  items.push(item); // Mutates array!
  setItems(items);
};

// GOOD: Immutable update
const addItem = (item: Item) => {
  setItems((prev) => [...prev, item]);
};


// BAD: useEffect for derived state
const [fullName, setFullName] = useState('');
useEffect(() => {
  setFullName(`${firstName} ${lastName}`);
}, [firstName, lastName]);

// GOOD: Derive directly
const fullName = `${firstName} ${lastName}`;


// BAD: Testing implementation details
it('should call setState', () => {
  const spy = vi.spyOn(React, 'useState');
  render(<Component />);
  expect(spy).toHaveBeenCalled();
});

// GOOD: Test user-visible behavior
it('should show error message for invalid input', async () => {
  render(<Form />);
  await userEvent.type(screen.getByLabelText('Email'), 'invalid');
  await userEvent.click(screen.getByRole('button', { name: 'Submit' }));
  expect(screen.getByText('Invalid email')).toBeInTheDocument();
});
```
462
.claude/skills/patterns/api-design/SKILL.md
Normal file
@@ -0,0 +1,462 @@
---
name: api-design
description: REST API design patterns with Pydantic/Zod schemas, error handling, and OpenAPI documentation. Use when designing or implementing API endpoints.
---

# API Design Skill

## Schema-First Development

Always define schemas before implementation. Schemas serve as:
- Runtime validation
- Type definitions
- API documentation
- Test data factories

### Python (Pydantic)

```python
# schemas/user.py
from datetime import datetime
from pydantic import BaseModel, EmailStr, Field, field_validator


class UserBase(BaseModel):
    """Shared fields for user schemas."""
    email: EmailStr
    name: str = Field(..., min_length=1, max_length=100)


class UserCreate(UserBase):
    """Request schema for creating a user."""
    password: str = Field(..., min_length=8)

    @field_validator("password")
    @classmethod
    def password_strength(cls, v: str) -> str:
        if not any(c.isupper() for c in v):
            raise ValueError("Password must contain uppercase")
        if not any(c.isdigit() for c in v):
            raise ValueError("Password must contain digit")
        return v


class UserUpdate(BaseModel):
    """Request schema for updating a user (all optional)."""
    email: EmailStr | None = None
    name: str | None = Field(None, min_length=1, max_length=100)


class UserResponse(UserBase):
    """Response schema (no password)."""
    id: str
    is_active: bool
    created_at: datetime

    model_config = {"from_attributes": True}


class UserListResponse(BaseModel):
    """Paginated list response."""
    items: list[UserResponse]
    total: int
    page: int
    page_size: int
    has_more: bool
```

### TypeScript (Zod)

```typescript
// schemas/user.schema.ts
import { z } from 'zod';

export const userBaseSchema = z.object({
  email: z.string().email(),
  name: z.string().min(1).max(100),
});

export const userCreateSchema = userBaseSchema.extend({
  password: z
    .string()
    .min(8)
    .refine((p) => /[A-Z]/.test(p), 'Must contain uppercase')
    .refine((p) => /\d/.test(p), 'Must contain digit'),
});

export const userUpdateSchema = userBaseSchema.partial();

export const userResponseSchema = userBaseSchema.extend({
  id: z.string().uuid(),
  isActive: z.boolean(),
  createdAt: z.string().datetime(),
});

export const userListResponseSchema = z.object({
  items: z.array(userResponseSchema),
  total: z.number().int().nonnegative(),
  page: z.number().int().positive(),
  pageSize: z.number().int().positive(),
  hasMore: z.boolean(),
});

// Derived types
export type UserCreate = z.infer<typeof userCreateSchema>;
export type UserUpdate = z.infer<typeof userUpdateSchema>;
export type UserResponse = z.infer<typeof userResponseSchema>;
export type UserListResponse = z.infer<typeof userListResponseSchema>;

// Validation functions for API boundaries
export const parseUserCreate = (data: unknown) => userCreateSchema.parse(data);
export const parseUserResponse = (data: unknown) => userResponseSchema.parse(data);
```
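The two `refine` rules encode the same policy as the Pydantic `password_strength` validator above. A dependency-free sketch of that policy (hypothetical helper for illustration, not part of the schema file), useful for reasoning about which inputs each rule rejects:

```typescript
// Hypothetical standalone check mirroring the password refinements above.
function passwordIssues(p: string): string[] {
  const issues: string[] = [];
  if (p.length < 8) issues.push('Must be at least 8 characters');
  if (!/[A-Z]/.test(p)) issues.push('Must contain uppercase');
  if (!/\d/.test(p)) issues.push('Must contain digit');
  return issues;
}

console.log(passwordIssues('Sup3rSecret')); // []
console.log(passwordIssues('secret'));      // all three rules fail
```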

## REST Endpoint Patterns

### Resource Naming
```
GET    /users           # List users
POST   /users           # Create user
GET    /users/{id}      # Get single user
PUT    /users/{id}      # Full update
PATCH  /users/{id}      # Partial update
DELETE /users/{id}      # Delete user

# Nested resources
GET    /users/{id}/orders  # User's orders
POST   /users/{id}/orders  # Create order for user

# Actions (when CRUD doesn't fit)
POST   /users/{id}/activate
POST   /orders/{id}/cancel
```
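List endpoints return the pagination envelope defined earlier (`total`, `page`, `pageSize`, `hasMore`). A small sketch of how that envelope can be computed, with a hypothetical `paginate` helper whose output shape matches `userListResponseSchema`:

```typescript
// Hypothetical pagination envelope matching the list-response schema above.
function paginate<T>(items: T[], page: number, pageSize: number) {
  const start = (page - 1) * pageSize;
  return {
    items: items.slice(start, start + pageSize),
    total: items.length,
    page,
    pageSize,
    hasMore: start + pageSize < items.length,
  };
}

const users = Array.from({ length: 45 }, (_, i) => `user-${i}`);
console.log(paginate(users, 2, 20).hasMore);      // true  (40 of 45 seen)
console.log(paginate(users, 3, 20).items.length); // 5
```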

### FastAPI Implementation

```python
# routers/users.py
from fastapi import APIRouter, Depends, HTTPException, Query, status
from sqlalchemy.ext.asyncio import AsyncSession

from app.schemas.user import (
    UserCreate,
    UserUpdate,
    UserResponse,
    UserListResponse,
)
from app.services.user import UserService
from app.dependencies import get_db, get_current_user

router = APIRouter(prefix="/users", tags=["users"])


@router.get("", response_model=UserListResponse)
async def list_users(
    page: int = Query(1, ge=1),
    page_size: int = Query(20, ge=1, le=100),
    db: AsyncSession = Depends(get_db),
) -> UserListResponse:
    """List users with pagination."""
    service = UserService(db)
    return await service.list_users(page=page, page_size=page_size)


@router.post("", response_model=UserResponse, status_code=status.HTTP_201_CREATED)
async def create_user(
    data: UserCreate,
    db: AsyncSession = Depends(get_db),
) -> UserResponse:
    """Create a new user."""
    service = UserService(db)
    try:
        return await service.create_user(data)
    except ValueError as e:
        raise HTTPException(status_code=status.HTTP_400_BAD_REQUEST, detail=str(e))


@router.get("/{user_id}", response_model=UserResponse)
async def get_user(
    user_id: str,
    db: AsyncSession = Depends(get_db),
) -> UserResponse:
    """Get a user by ID."""
    service = UserService(db)
    user = await service.get_user(user_id)
    if not user:
        raise HTTPException(status_code=status.HTTP_404_NOT_FOUND, detail="User not found")
    return user


@router.patch("/{user_id}", response_model=UserResponse)
async def update_user(
    user_id: str,
    data: UserUpdate,
    db: AsyncSession = Depends(get_db),
    current_user: UserResponse = Depends(get_current_user),
) -> UserResponse:
    """Partially update a user."""
    service = UserService(db)
    user = await service.update_user(user_id, data)
    if not user:
        raise HTTPException(status_code=status.HTTP_404_NOT_FOUND, detail="User not found")
    return user


@router.delete("/{user_id}", status_code=status.HTTP_204_NO_CONTENT)
async def delete_user(
    user_id: str,
    db: AsyncSession = Depends(get_db),
    current_user: UserResponse = Depends(get_current_user),
) -> None:
    """Delete a user."""
    service = UserService(db)
    deleted = await service.delete_user(user_id)
    if not deleted:
        raise HTTPException(status_code=status.HTTP_404_NOT_FOUND, detail="User not found")
```
|
||||||
|
|
||||||
|
## Error Handling

### Standard Error Response (RFC 7807)

```python
# schemas/error.py
from pydantic import BaseModel


class ErrorDetail(BaseModel):
    """Standard error response following RFC 7807."""
    type: str = "about:blank"
    title: str
    status: int
    detail: str
    instance: str | None = None


# Exception handler
from fastapi import Request
from fastapi.exceptions import RequestValidationError
from fastapi.responses import JSONResponse


async def validation_exception_handler(request: Request, exc: RequestValidationError):
    return JSONResponse(
        status_code=422,
        content=ErrorDetail(
            type="validation_error",
            title="Validation Error",
            status=422,
            detail=str(exc.errors()),
            instance=str(request.url),
        ).model_dump(),
    )
```

### TypeScript Error Handling

```typescript
// lib/api-client.ts
import axios, { AxiosError } from 'axios';
import { z } from 'zod';

const errorSchema = z.object({
  type: z.string(),
  title: z.string(),
  status: z.number(),
  detail: z.string(),
  instance: z.string().optional(),
});

export class ApiError extends Error {
  constructor(
    public status: number,
    public title: string,
    public detail: string,
  ) {
    super(detail);
    this.name = 'ApiError';
  }
}

export const apiClient = axios.create({
  baseURL: '/api',
  headers: { 'Content-Type': 'application/json' },
});

apiClient.interceptors.response.use(
  (response) => response,
  (error: AxiosError) => {
    if (error.response?.data) {
      const parsed = errorSchema.safeParse(error.response.data);
      if (parsed.success) {
        throw new ApiError(
          parsed.data.status,
          parsed.data.title,
          parsed.data.detail,
        );
      }
    }
    throw new ApiError(500, 'Server Error', 'An unexpected error occurred');
  },
);
```

## Pagination Pattern

```python
# schemas/pagination.py
from typing import Generic, TypeVar

from pydantic import BaseModel, Field

T = TypeVar("T")


class PaginatedResponse(BaseModel, Generic[T]):
    """Generic paginated response."""
    items: list[T]
    total: int
    page: int = Field(ge=1)
    page_size: int = Field(ge=1, le=100)

    @property
    def has_more(self) -> bool:
        return self.page * self.page_size < self.total

    @property
    def total_pages(self) -> int:
        return (self.total + self.page_size - 1) // self.page_size


# Usage
class UserListResponse(PaginatedResponse[UserResponse]):
    pass
```

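The arithmetic behind `has_more` and `total_pages` can be checked in isolation. A minimal sketch with plain-Python stand-ins for the Pydantic properties above:

```python
# Plain-Python stand-ins for the PaginatedResponse properties above,
# to sanity-check the pagination arithmetic.

def has_more(page: int, page_size: int, total: int) -> bool:
    # True while the current page ends before the last item
    return page * page_size < total

def total_pages(page_size: int, total: int) -> int:
    # Ceiling division without importing math
    return (total + page_size - 1) // page_size

print(total_pages(20, 45))  # 3 pages for 45 items at 20 per page
print(has_more(2, 20, 45))  # True: page 2 ends at item 40
print(has_more(3, 20, 45))  # False: page 3 covers items 41-45
```

The `(total + page_size - 1) // page_size` idiom rounds up without floating-point, so a partially filled last page still counts as a page.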
## Query Parameters

```python
# dependencies/pagination.py
from datetime import datetime

from fastapi import Query
from pydantic import BaseModel


class PaginationParams(BaseModel):
    page: int = Query(1, ge=1, description="Page number")
    page_size: int = Query(20, ge=1, le=100, description="Items per page")

    @property
    def offset(self) -> int:
        return (self.page - 1) * self.page_size


class SortParams(BaseModel):
    sort_by: str = Query("created_at", description="Field to sort by")
    sort_order: str = Query("desc", pattern="^(asc|desc)$")


class FilterParams(BaseModel):
    search: str | None = Query(None, min_length=1, max_length=100)
    status: str | None = Query(None, pattern="^(active|inactive|pending)$")
    created_after: datetime | None = Query(None)
    created_before: datetime | None = Query(None)
```

## OpenAPI Documentation

```python
# main.py
from fastapi import FastAPI
from fastapi.openapi.utils import get_openapi

app = FastAPI(
    title="My API",
    description="API for managing resources",
    version="1.0.0",
    docs_url="/docs",
    redoc_url="/redoc",
)


def custom_openapi():
    if app.openapi_schema:
        return app.openapi_schema

    openapi_schema = get_openapi(
        title=app.title,
        version=app.version,
        description=app.description,
        routes=app.routes,
    )

    # Add security scheme
    openapi_schema["components"]["securitySchemes"] = {
        "bearerAuth": {
            "type": "http",
            "scheme": "bearer",
            "bearerFormat": "JWT",
        }
    }

    app.openapi_schema = openapi_schema
    return app.openapi_schema


app.openapi = custom_openapi
```

## HTTP Status Codes

| Code | Meaning | When to Use |
|------|---------|-------------|
| 200 | OK | Successful GET, PUT, PATCH |
| 201 | Created | Successful POST creating resource |
| 204 | No Content | Successful DELETE |
| 400 | Bad Request | Invalid request body/params |
| 401 | Unauthorized | Missing/invalid authentication |
| 403 | Forbidden | Authenticated but not authorized |
| 404 | Not Found | Resource doesn't exist |
| 409 | Conflict | Duplicate resource (e.g., email exists) |
| 422 | Unprocessable Entity | Validation error |
| 500 | Server Error | Unexpected server error |

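One way to keep these codes consistent is a single error-to-status mapping at the API boundary. A minimal sketch — the exception names here are hypothetical, not part of the codebase above:

```python
# Hypothetical domain exceptions mapped to the status codes in the table.
class NotFoundError(Exception): ...
class DuplicateResourceError(Exception): ...

STATUS_FOR_ERROR: dict[type[Exception], int] = {
    NotFoundError: 404,           # Resource doesn't exist
    DuplicateResourceError: 409,  # Conflict: duplicate resource
    ValueError: 400,              # Invalid request body/params
}

def status_for(exc: Exception) -> int:
    # Anything unmapped is an unexpected server error
    return STATUS_FOR_ERROR.get(type(exc), 500)

print(status_for(DuplicateResourceError()))  # 409
print(status_for(RuntimeError()))            # 500
```

A FastAPI exception handler can then translate every domain error through one table instead of scattering status codes across routes.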
## Anti-Patterns

```python
# BAD: Returning different shapes
@router.get("/users/{id}")
async def get_user(id: str):
    user = await service.get_user(id)
    if user:
        return user  # UserResponse
    return {"error": "not found"}  # Different shape!

# GOOD: Consistent response or exception
@router.get("/users/{id}", response_model=UserResponse)
async def get_user(id: str):
    user = await service.get_user(id)
    if not user:
        raise HTTPException(status_code=404, detail="User not found")
    return user


# BAD: Exposing internal details
class UserResponse(BaseModel):
    id: str
    email: str
    hashed_password: str  # NEVER expose!
    internal_notes: str   # Internal only!

# GOOD: Explicit public fields
class UserResponse(BaseModel):
    id: str
    email: str
    name: str
    # Only fields clients need


# BAD: No validation at boundary
@router.post("/users")
async def create_user(data: dict):  # Unvalidated!
    return await service.create(data)

# GOOD: Schema validation
@router.post("/users", response_model=UserResponse)
async def create_user(data: UserCreate):  # Validated!
    return await service.create(data)
```

.claude/skills/patterns/monorepo/SKILL.md (new file, 404 lines)

---
name: monorepo-patterns
description: Monorepo workspace patterns for multi-package projects with shared dependencies, testing strategies, and CI/CD. Use when working in monorepo structures.
---

# Monorepo Patterns Skill

## Recommended Structure

```
project/
├── apps/
│   ├── backend/              # Python FastAPI
│   │   ├── src/
│   │   ├── tests/
│   │   └── pyproject.toml
│   └── frontend/             # React TypeScript
│       ├── src/
│       ├── tests/
│       └── package.json
├── packages/
│   ├── shared-types/         # Shared TypeScript types
│   │   ├── src/
│   │   └── package.json
│   └── ui-components/        # Shared React components
│       ├── src/
│       └── package.json
├── infrastructure/
│   ├── terraform/
│   │   ├── environments/
│   │   └── modules/
│   └── ansible/
│       ├── playbooks/
│       └── roles/
├── scripts/                  # Shared scripts
├── docs/                     # Documentation
├── .github/
│   └── workflows/
├── package.json              # Root (workspaces config)
├── pyproject.toml            # Python workspace config
└── CLAUDE.md                 # Project-level guidance
```

## Workspace Configuration

### npm Workspaces (Node.js)
```json
// package.json (root)
{
  "name": "my-monorepo",
  "private": true,
  "workspaces": [
    "apps/*",
    "packages/*"
  ],
  "scripts": {
    "dev": "npm run dev --workspaces --if-present",
    "build": "npm run build --workspaces --if-present",
    "test": "npm run test --workspaces --if-present",
    "lint": "npm run lint --workspaces --if-present",
    "typecheck": "npm run typecheck --workspaces --if-present"
  },
  "devDependencies": {
    "typescript": "^5.6.0",
    "vitest": "^3.2.0",
    "@types/node": "^22.0.0"
  }
}
```

### UV Workspace (Python)
```toml
# pyproject.toml (root)
[project]
name = "my-monorepo"
version = "0.0.0"
requires-python = ">=3.11"

[tool.uv.workspace]
members = ["apps/*", "packages/*"]

[tool.uv.sources]
shared-utils = { workspace = true }
```

## Package References

### TypeScript Internal Packages
```json
// packages/shared-types/package.json
{
  "name": "@myorg/shared-types",
  "version": "0.0.0",
  "private": true,
  "main": "./dist/index.js",
  "types": "./dist/index.d.ts",
  "exports": {
    ".": {
      "types": "./dist/index.d.ts",
      "import": "./dist/index.js"
    }
  },
  "scripts": {
    "build": "tsc",
    "dev": "tsc --watch"
  }
}

// apps/frontend/package.json
{
  "dependencies": {
    // "*" resolves to the local package under npm workspaces
    // (the "workspace:*" protocol is pnpm/yarn, not npm)
    "@myorg/shared-types": "*"
  },
  "name": "@myorg/frontend"
}
```

### Python Internal Packages
```toml
# packages/shared-utils/pyproject.toml
[project]
name = "shared-utils"
version = "0.1.0"
dependencies = []

[build-system]
requires = ["hatchling"]
build-backend = "hatchling.build"

# apps/backend/pyproject.toml
[project]
name = "backend"
dependencies = [
    "shared-utils",  # Resolved via workspace
]
```

## Testing Strategies

### Run All Tests
```bash
# From root
npm test          # All Node packages
uv run pytest     # All Python packages

# Specific workspace
npm test --workspace=@myorg/frontend
uv run pytest apps/backend/
```

### Test Dependencies Between Packages
```typescript
// packages/shared-types/src/user.ts
export type User = {
  id: string;
  email: string;
  name: string;
};

// apps/frontend/src/features/users/types.ts
// Import from workspace package
import type { User } from '@myorg/shared-types';

export type UserListProps = {
  users: User[];
  onSelect: (user: User) => void;
};
```

### Integration Tests Across Packages
```typescript
// apps/frontend/tests/integration/api.test.ts
import { screen } from '@testing-library/react';
import { http, HttpResponse } from 'msw';

import type { User } from '@myorg/shared-types';
import { renderWithProviders } from '../utils/render';
import { server } from '../utils/server'; // shared MSW server (path assumed)

describe('Frontend-Backend Integration', () => {
  it('should display user from API', async () => {
    const mockUser: User = {
      id: 'user-1',
      email: 'test@example.com',
      name: 'Test User',
    };

    // Mock API response with shared type
    server.use(
      http.get('/api/users/user-1', () => HttpResponse.json(mockUser))
    );

    renderWithProviders(<UserProfile userId="user-1" />);

    await expect(screen.findByText('Test User')).resolves.toBeInTheDocument();
  });
});
```

## CI/CD Patterns

### Change Detection
```yaml
# .github/workflows/ci.yml
name: CI

on:
  push:
    branches: [main]
  pull_request:
    branches: [main]

jobs:
  detect-changes:
    runs-on: ubuntu-latest
    outputs:
      frontend: ${{ steps.changes.outputs.frontend }}
      backend: ${{ steps.changes.outputs.backend }}
      infrastructure: ${{ steps.changes.outputs.infrastructure }}
    steps:
      - uses: actions/checkout@v4
      - uses: dorny/paths-filter@v3
        id: changes
        with:
          filters: |
            frontend:
              - 'apps/frontend/**'
              - 'packages/shared-types/**'
              - 'packages/ui-components/**'
            backend:
              - 'apps/backend/**'
              - 'packages/shared-utils/**'
            infrastructure:
              - 'infrastructure/**'

  frontend:
    needs: detect-changes
    if: needs.detect-changes.outputs.frontend == 'true'
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-node@v4
        with:
          node-version: '22'
          cache: 'npm'
      - run: npm ci
      - run: npm run typecheck --workspace=@myorg/frontend
      - run: npm run lint --workspace=@myorg/frontend
      - run: npm run test --workspace=@myorg/frontend

  backend:
    needs: detect-changes
    if: needs.detect-changes.outputs.backend == 'true'
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: astral-sh/setup-uv@v4
      - run: uv sync
      - run: uv run ruff check apps/backend/
      - run: uv run mypy apps/backend/
      - run: uv run pytest apps/backend/ --cov --cov-fail-under=80
```

### Jenkinsfile for Monorepo
```groovy
// Jenkinsfile
pipeline {
    agent any

    stages {
        stage('Detect Changes') {
            steps {
                script {
                    def changes = sh(
                        script: 'git diff --name-only HEAD~1',
                        returnStdout: true
                    ).trim().split('\n')

                    env.FRONTEND_CHANGED = changes.any { it.startsWith('apps/frontend/') || it.startsWith('packages/') }
                    env.BACKEND_CHANGED = changes.any { it.startsWith('apps/backend/') }
                    env.INFRA_CHANGED = changes.any { it.startsWith('infrastructure/') }
                }
            }
        }

        stage('Frontend') {
            when {
                expression { env.FRONTEND_CHANGED == 'true' }
            }
            steps {
                dir('apps/frontend') {
                    sh 'npm ci'
                    sh 'npm run typecheck'
                    sh 'npm run lint'
                    sh 'npm run test'
                }
            }
        }

        stage('Backend') {
            when {
                expression { env.BACKEND_CHANGED == 'true' }
            }
            steps {
                sh 'uv sync'
                sh 'uv run ruff check apps/backend/'
                sh 'uv run pytest apps/backend/ --cov --cov-fail-under=80'
            }
        }

        stage('Infrastructure') {
            when {
                expression { env.INFRA_CHANGED == 'true' }
            }
            steps {
                dir('infrastructure/terraform') {
                    sh 'terraform init'
                    sh 'terraform validate'
                    sh 'terraform fmt -check -recursive'
                }
            }
        }
    }
}
```

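Both pipelines apply the same rule: a job runs when any changed path falls under one of its prefixes. The filter logic, sketched in plain Python for clarity:

```python
# Path-prefix change detection, mirroring the CI filters above.
FILTERS = {
    "frontend": ("apps/frontend/", "packages/shared-types/", "packages/ui-components/"),
    "backend": ("apps/backend/", "packages/shared-utils/"),
    "infrastructure": ("infrastructure/",),
}

def detect_changes(changed_files: list[str]) -> dict[str, bool]:
    # str.startswith accepts a tuple of prefixes, so each job is one check
    return {
        job: any(path.startswith(prefixes) for path in changed_files)
        for job, prefixes in FILTERS.items()
    }

print(detect_changes(["packages/shared-types/src/user.ts"]))
# {'frontend': True, 'backend': False, 'infrastructure': False}
```

A change to a shared package triggers every job that lists it, which is why `packages/shared-types/**` appears under the frontend filter.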
## Dependency Management

### Shared Dependencies at Root
```json
// package.json (root)
{
  "devDependencies": {
    // Shared dev dependencies
    "typescript": "^5.6.0",
    "vitest": "^3.2.0",
    "eslint": "^9.0.0",
    "@types/node": "^22.0.0"
  }
}
```

### Package-Specific Dependencies
```json
// apps/frontend/package.json
{
  "dependencies": {
    // App-specific dependencies
    "react": "^18.3.0",
    "@tanstack/react-query": "^5.0.0"
  }
}
```

## Commands Quick Reference

```bash
# Install all dependencies
npm install     # Node (from root)
uv sync         # Python

# Run in specific workspace
npm run dev --workspace=@myorg/frontend
npm run test --workspace=@myorg/shared-types

# Run in all workspaces
npm run build --workspaces
npm run test --workspaces --if-present

# Add dependency to specific package
npm install lodash --workspace=@myorg/frontend
uv add requests --package backend

# Add shared dependency to root
npm install -D prettier
```

## CLAUDE.md Placement

### Root CLAUDE.md (Project-Wide)
```markdown
# Project Standards

[Core standards that apply everywhere]
```

### Package-Specific CLAUDE.md
```markdown
# apps/frontend/CLAUDE.md

## Frontend-Specific Standards

- Use React Testing Library for component tests
- Prefer Radix UI primitives
- Use TanStack Query for server state
```

```markdown
# apps/backend/CLAUDE.md

## Backend-Specific Standards

- Use pytest-asyncio for async tests
- Pydantic v2 for all schemas
- SQLAlchemy 2.0 async patterns
```

Skills in `~/.claude/skills/` are automatically available across all packages in the monorepo.

.claude/skills/patterns/observability/SKILL.md (new file, 486 lines)

---
name: observability
description: Logging, metrics, and tracing patterns for application observability. Use when implementing monitoring, debugging, or production visibility.
---

# Observability Skill

## Three Pillars

1. **Logs** - Discrete events with context
2. **Metrics** - Aggregated measurements over time
3. **Traces** - Request flow across services

## Structured Logging

### Python (structlog)
```python
import logging

import structlog
from structlog.types import Processor


def configure_logging(json_output: bool = True) -> None:
    """Configure structured logging."""
    processors: list[Processor] = [
        structlog.contextvars.merge_contextvars,
        structlog.processors.add_log_level,
        structlog.processors.TimeStamper(fmt="iso"),
        structlog.processors.StackInfoRenderer(),
    ]

    if json_output:
        processors.append(structlog.processors.JSONRenderer())
    else:
        processors.append(structlog.dev.ConsoleRenderer())

    structlog.configure(
        processors=processors,
        wrapper_class=structlog.make_filtering_bound_logger(logging.INFO),
        context_class=dict,
        logger_factory=structlog.PrintLoggerFactory(),
        cache_logger_on_first_use=True,
    )


# Usage
logger = structlog.get_logger()

# Add context that persists across log calls
structlog.contextvars.bind_contextvars(
    request_id="req-123",
    user_id="user-456",
)

logger.info("order_created", order_id="order-789", total=150.00)
# {"event": "order_created", "order_id": "order-789", "total": 150.0, "request_id": "req-123", "user_id": "user-456", "level": "info", "timestamp": "2024-01-15T10:30:00Z"}

logger.error("payment_failed", order_id="order-789", error="insufficient_funds")
```

### TypeScript (pino)
```typescript
import pino from 'pino';

const logger = pino({
  level: process.env.LOG_LEVEL || 'info',
  formatters: {
    level: (label) => ({ level: label }),
  },
  timestamp: pino.stdTimeFunctions.isoTime,
  redact: ['password', 'token', 'authorization'],
});

// Create child logger with bound context
const requestLogger = logger.child({
  requestId: 'req-123',
  userId: 'user-456',
});

requestLogger.info({ orderId: 'order-789', total: 150.0 }, 'order_created');
requestLogger.error({ orderId: 'order-789', error: 'insufficient_funds' }, 'payment_failed');

// Express middleware
import { randomUUID } from 'crypto';

const loggingMiddleware = (req, res, next) => {
  const requestId = req.headers['x-request-id'] || randomUUID();

  req.log = logger.child({
    requestId,
    method: req.method,
    path: req.path,
    userAgent: req.headers['user-agent'],
  });

  const startTime = Date.now();

  res.on('finish', () => {
    req.log.info({
      statusCode: res.statusCode,
      durationMs: Date.now() - startTime,
    }, 'request_completed');
  });

  next();
};
```

### Log Levels

| Level | When to Use |
|-------|-------------|
| `error` | Failures requiring attention |
| `warn` | Unexpected but handled situations |
| `info` | Business events (order created, user logged in) |
| `debug` | Technical details for debugging |
| `trace` | Very detailed tracing (rarely used in prod) |

## Metrics

### Python (prometheus-client)
```python
from prometheus_client import Counter, Histogram, Gauge, start_http_server
import time

# Define metrics
REQUEST_COUNT = Counter(
    'http_requests_total',
    'Total HTTP requests',
    ['method', 'endpoint', 'status']
)

REQUEST_LATENCY = Histogram(
    'http_request_duration_seconds',
    'HTTP request latency',
    ['method', 'endpoint'],
    buckets=[0.01, 0.05, 0.1, 0.5, 1.0, 5.0]
)

ACTIVE_CONNECTIONS = Gauge(
    'active_connections',
    'Number of active connections'
)

ORDERS_PROCESSED = Counter(
    'orders_processed_total',
    'Total orders processed',
    ['status']  # success, failed
)


# Usage
def process_request(method: str, endpoint: str):
    ACTIVE_CONNECTIONS.inc()
    start_time = time.time()

    try:
        # Process request...
        REQUEST_COUNT.labels(method=method, endpoint=endpoint, status='200').inc()
    except Exception:
        REQUEST_COUNT.labels(method=method, endpoint=endpoint, status='500').inc()
        raise
    finally:
        REQUEST_LATENCY.labels(method=method, endpoint=endpoint).observe(
            time.time() - start_time
        )
        ACTIVE_CONNECTIONS.dec()


# FastAPI middleware
from fastapi import FastAPI, Request
from prometheus_client import generate_latest, CONTENT_TYPE_LATEST
from starlette.responses import Response

app = FastAPI()


@app.middleware("http")
async def metrics_middleware(request: Request, call_next):
    start_time = time.time()
    response = await call_next(request)

    REQUEST_COUNT.labels(
        method=request.method,
        endpoint=request.url.path,
        status=response.status_code
    ).inc()

    REQUEST_LATENCY.labels(
        method=request.method,
        endpoint=request.url.path
    ).observe(time.time() - start_time)

    return response


@app.get("/metrics")
async def metrics():
    return Response(generate_latest(), media_type=CONTENT_TYPE_LATEST)
```

### TypeScript (prom-client)
```typescript
import { Registry, Counter, Histogram, Gauge, collectDefaultMetrics } from 'prom-client';

const register = new Registry();
collectDefaultMetrics({ register });

const httpRequestsTotal = new Counter({
  name: 'http_requests_total',
  help: 'Total HTTP requests',
  labelNames: ['method', 'path', 'status'],
  registers: [register],
});

const httpRequestDuration = new Histogram({
  name: 'http_request_duration_seconds',
  help: 'HTTP request duration',
  labelNames: ['method', 'path'],
  buckets: [0.01, 0.05, 0.1, 0.5, 1, 5],
  registers: [register],
});

// Express middleware
const metricsMiddleware = (req, res, next) => {
  const end = httpRequestDuration.startTimer({ method: req.method, path: req.path });

  res.on('finish', () => {
    httpRequestsTotal.inc({ method: req.method, path: req.path, status: res.statusCode });
    end();
  });

  next();
};

// Metrics endpoint
app.get('/metrics', async (req, res) => {
  res.set('Content-Type', register.contentType);
  res.end(await register.metrics());
});
```

### Key Metrics (RED Method)

| Metric | Description |
|--------|-------------|
| **R**ate | Requests per second |
| **E**rrors | Error rate (%) |
| **D**uration | Latency (p50, p95, p99) |

### Key Metrics (USE Method for Resources)

| Metric | Description |
|--------|-------------|
| **U**tilization | % time resource is busy |
| **S**aturation | Queue depth, backlog |
| **E**rrors | Error count |

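In production these numbers come from PromQL over the counters and histograms defined above; a toy computation over a window of request samples shows what each one means (crude nearest-rank p95, illustrative only):

```python
# Toy RED computation over a 10-second window of (status, duration) samples.
samples = [(200, 0.03), (200, 0.05), (500, 0.40), (200, 0.02), (200, 0.70)]
window_seconds = 10

rate = len(samples) / window_seconds  # Rate: requests per second
error_rate = sum(1 for status, _ in samples if status >= 500) / len(samples)

durations = sorted(d for _, d in samples)
# Duration: crude nearest-rank p95 over the window
p95 = durations[min(len(durations) - 1, int(0.95 * len(durations)))]

print(rate)        # 0.5
print(error_rate)  # 0.2
print(p95)         # 0.7
```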
## Distributed Tracing
|
||||||
|
|
||||||
|
### Python (OpenTelemetry)
|
||||||
|
```python
|
||||||
|
from opentelemetry import trace
|
||||||
|
from opentelemetry.exporter.otlp.proto.grpc.trace_exporter import OTLPSpanExporter
|
||||||
|
from opentelemetry.sdk.trace import TracerProvider
|
||||||
|
from opentelemetry.sdk.trace.export import BatchSpanProcessor
|
||||||
|
from opentelemetry.sdk.resources import Resource
|
||||||
|
from opentelemetry.instrumentation.fastapi import FastAPIInstrumentor
|
||||||
|
from opentelemetry.instrumentation.sqlalchemy import SQLAlchemyInstrumentor
|
||||||
|
from opentelemetry.instrumentation.httpx import HTTPXClientInstrumentor
|
||||||
|
|
||||||
|
def configure_tracing(service_name: str, otlp_endpoint: str) -> None:
|
||||||
|
"""Configure OpenTelemetry tracing."""
|
||||||
|
resource = Resource.create({"service.name": service_name})
|
||||||
|
|
||||||
|
provider = TracerProvider(resource=resource)
|
||||||
|
processor = BatchSpanProcessor(OTLPSpanExporter(endpoint=otlp_endpoint))
|
||||||
|
provider.add_span_processor(processor)
|
||||||
|
|
||||||
|
trace.set_tracer_provider(provider)
|
||||||
|
|
||||||
|
# Auto-instrument libraries
|
||||||
|
FastAPIInstrumentor.instrument()
|
||||||
|
SQLAlchemyInstrumentor().instrument()
|
||||||
|
HTTPXClientInstrumentor().instrument()
|
||||||
|
|
||||||
|
# Manual instrumentation
|
||||||
|
tracer = trace.get_tracer(__name__)
|
||||||
|
|
||||||
|
async def process_order(order_id: str) -> dict:
|
||||||
|
with tracer.start_as_current_span("process_order") as span:
|
||||||
|
span.set_attribute("order.id", order_id)
|
||||||
|
|
||||||
|
# Child span for validation
|
||||||
|
with tracer.start_as_current_span("validate_order"):
|
||||||
|
validated = await validate_order(order_id)
|
||||||
|
|
||||||
|
# Child span for payment
|
||||||
|
with tracer.start_as_current_span("process_payment") as payment_span:
|
||||||
|
payment_span.set_attribute("payment.method", "card")
|
||||||
|
result = await charge_payment(order_id)
|
||||||
|
|
||||||
|
span.set_attribute("order.status", "completed")
|
||||||
|
return result
|
||||||
|
```

### TypeScript (OpenTelemetry)

```typescript
import { NodeSDK } from '@opentelemetry/sdk-node';
import { getNodeAutoInstrumentations } from '@opentelemetry/auto-instrumentations-node';
import { OTLPTraceExporter } from '@opentelemetry/exporter-trace-otlp-grpc';
import { Resource } from '@opentelemetry/resources';
import { SemanticResourceAttributes } from '@opentelemetry/semantic-conventions';

const sdk = new NodeSDK({
  resource: new Resource({
    [SemanticResourceAttributes.SERVICE_NAME]: 'my-service',
  }),
  traceExporter: new OTLPTraceExporter({
    url: process.env.OTLP_ENDPOINT,
  }),
  instrumentations: [getNodeAutoInstrumentations()],
});

sdk.start();

// Manual instrumentation
import { trace, SpanStatusCode } from '@opentelemetry/api';

const tracer = trace.getTracer('my-service');

async function processOrder(orderId: string) {
  return tracer.startActiveSpan('process_order', async (span) => {
    try {
      span.setAttribute('order.id', orderId);

      await tracer.startActiveSpan('validate_order', async (validateSpan) => {
        await validateOrder(orderId);
        validateSpan.end();
      });

      const result = await tracer.startActiveSpan('process_payment', async (paymentSpan) => {
        paymentSpan.setAttribute('payment.method', 'card');
        const res = await chargePayment(orderId);
        paymentSpan.end();
        return res;
      });

      span.setStatus({ code: SpanStatusCode.OK });
      return result;
    } catch (error) {
      span.setStatus({ code: SpanStatusCode.ERROR, message: error.message });
      span.recordException(error);
      throw error;
    } finally {
      span.end();
    }
  });
}
```

## Health Checks

```python
import asyncio
from enum import Enum

from fastapi import FastAPI, Response
from pydantic import BaseModel

app = FastAPI()
# Assumes `db` and `redis` clients are initialized at application startup.


class HealthStatus(str, Enum):
    HEALTHY = "healthy"
    DEGRADED = "degraded"
    UNHEALTHY = "unhealthy"


class ComponentHealth(BaseModel):
    name: str
    status: HealthStatus
    message: str | None = None


class HealthResponse(BaseModel):
    status: HealthStatus
    version: str
    components: list[ComponentHealth]


async def check_database() -> ComponentHealth:
    try:
        await db.execute("SELECT 1")
        return ComponentHealth(name="database", status=HealthStatus.HEALTHY)
    except Exception as e:
        return ComponentHealth(name="database", status=HealthStatus.UNHEALTHY, message=str(e))


async def check_redis() -> ComponentHealth:
    try:
        await redis.ping()
        return ComponentHealth(name="redis", status=HealthStatus.HEALTHY)
    except Exception as e:
        return ComponentHealth(name="redis", status=HealthStatus.DEGRADED, message=str(e))


@app.get("/health", response_model=HealthResponse)
async def health_check(response: Response):
    components = await asyncio.gather(
        check_database(),
        check_redis(),
    )

    # Overall status is the worst component status
    if any(c.status == HealthStatus.UNHEALTHY for c in components):
        overall = HealthStatus.UNHEALTHY
        response.status_code = 503
    elif any(c.status == HealthStatus.DEGRADED for c in components):
        overall = HealthStatus.DEGRADED
    else:
        overall = HealthStatus.HEALTHY

    return HealthResponse(
        status=overall,
        version="1.0.0",
        components=components,
    )


@app.get("/ready")
async def readiness_check():
    """Kubernetes readiness probe - can we serve traffic?"""
    # Check critical dependencies
    await check_database()
    return {"status": "ready"}


@app.get("/live")
async def liveness_check():
    """Kubernetes liveness probe - is the process healthy?"""
    return {"status": "alive"}
```
## Alerting Rules

```yaml
# prometheus-rules.yaml
groups:
  - name: application
    rules:
      # High error rate
      - alert: HighErrorRate
        expr: |
          sum(rate(http_requests_total{status=~"5.."}[5m]))
          /
          sum(rate(http_requests_total[5m])) > 0.05
        for: 5m
        labels:
          severity: critical
        annotations:
          summary: "High error rate detected"
          description: "Error rate is {{ $value | humanizePercentage }}"

      # High latency
      - alert: HighLatency
        expr: |
          histogram_quantile(0.95, rate(http_request_duration_seconds_bucket[5m])) > 1
        for: 5m
        labels:
          severity: warning
        annotations:
          summary: "High latency detected"
          description: "p95 latency is {{ $value }}s"

      # Service down
      - alert: ServiceDown
        expr: up == 0
        for: 1m
        labels:
          severity: critical
        annotations:
          summary: "Service is down"
```

## Best Practices

### Logging
- Use structured JSON logs
- Include correlation/request IDs
- Redact sensitive data
- Use appropriate log levels
- Don't log in hot paths (use sampling)
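The logging guidance can be sketched with the standard library alone; `JsonFormatter`, `SENSITIVE_KEYS`, and the field names here are illustrative assumptions, not part of any project API:

```python
import json
import logging

# Hypothetical set of field names to redact; adjust per project.
SENSITIVE_KEYS = {"password", "secret", "api_key", "token"}


class JsonFormatter(logging.Formatter):
    """Render each log record as one JSON object, redacting sensitive fields."""

    def format(self, record: logging.LogRecord) -> str:
        payload = {
            "level": record.levelname,
            "logger": record.name,
            "message": record.getMessage(),
            # Correlation ID attached by middleware, if present.
            "request_id": getattr(record, "request_id", None),
        }
        for key, value in getattr(record, "extra_fields", {}).items():
            payload[key] = "[REDACTED]" if key.lower() in SENSITIVE_KEYS else value
        return json.dumps(payload)


def get_logger(name: str) -> logging.Logger:
    logger = logging.getLogger(name)
    if not logger.handlers:
        handler = logging.StreamHandler()
        handler.setFormatter(JsonFormatter())
        logger.addHandler(handler)
        logger.setLevel(logging.INFO)
    return logger
```

Each emitted line is machine-parseable JSON carrying the request ID, and sensitive values never reach the log sink.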

### Metrics
- Use consistent naming conventions
- Keep cardinality under control
- Use histograms for latency (not averages)
- Export business metrics alongside technical ones
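Why histograms rather than averages: a mean hides tail latency, while bucketed counts let you read off quantiles. A stdlib-only sketch (bucket bounds and class name are illustrative, mimicking Prometheus-style `le` buckets):

```python
import bisect


class LatencyHistogram:
    """Fixed-bucket latency histogram (seconds), Prometheus-style."""

    def __init__(self, buckets=(0.005, 0.01, 0.025, 0.05, 0.1, 0.25, 0.5, 1.0, 2.5)):
        self.buckets = list(buckets)
        self.counts = [0] * (len(self.buckets) + 1)  # last slot = +Inf
        self.total = 0

    def observe(self, seconds: float) -> None:
        # bisect_left gives "less than or equal to upper bound" semantics.
        self.counts[bisect.bisect_left(self.buckets, seconds)] += 1
        self.total += 1

    def quantile(self, q: float) -> float:
        """Upper bound of the bucket containing the q-th quantile."""
        target = q * self.total
        seen = 0
        for upper, count in zip(self.buckets, self.counts):
            seen += count
            if seen >= target:
                return upper
        return float("inf")
```

With 95 fast requests and 5 slow ones, the average looks fine while `quantile(0.99)` exposes the slow tail - which is exactly what the alerting rules above query via `histogram_quantile`.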

### Tracing
- Instrument at service boundaries
- Propagate context across services
- Sample appropriately in production
- Add relevant attributes to spans
389
.claude/skills/testing/browser-testing/SKILL.md
Normal file
@@ -0,0 +1,389 @@
---
name: browser-testing
description: Browser automation with Playwright for E2E tests and Claude's Chrome MCP for rapid development verification. Use when writing E2E tests or verifying UI during development.
---

# Browser Testing Skill

## Two Approaches

### 1. Claude Chrome MCP (Development Verification)
Quick, interactive testing during development to verify UI works before writing permanent tests.

### 2. Playwright (Permanent E2E Tests)
Automated, repeatable tests that run in CI/CD.

---

## Claude Chrome MCP (Quick Verification)

Use Chrome MCP tools for rapid UI verification during development.

### Basic Workflow

```
1. Navigate to page
2. Take screenshot to verify state
3. Find and interact with elements
4. Verify expected outcome
5. Record GIF for complex flows
```

### Navigation
```typescript
// Navigate to URL
mcp__claude-in-chrome__navigate({ url: "http://localhost:5173", tabId: 123 })

// Go back/forward
mcp__claude-in-chrome__navigate({ url: "back", tabId: 123 })
```

### Finding Elements
```typescript
// Find by natural language
mcp__claude-in-chrome__find({ query: "login button", tabId: 123 })
mcp__claude-in-chrome__find({ query: "email input field", tabId: 123 })
mcp__claude-in-chrome__find({ query: "product card containing organic", tabId: 123 })
```

### Reading Page State
```typescript
// Get accessibility tree (best for understanding structure)
mcp__claude-in-chrome__read_page({ tabId: 123, filter: "interactive" })

// Get text content (for articles/text-heavy pages)
mcp__claude-in-chrome__get_page_text({ tabId: 123 })
```

### Interactions
```typescript
// Click element by reference
mcp__claude-in-chrome__computer({
  action: "left_click",
  ref: "ref_42",
  tabId: 123
})

// Type text
mcp__claude-in-chrome__computer({
  action: "type",
  text: "user@example.com",
  tabId: 123
})

// Fill form input
mcp__claude-in-chrome__form_input({
  ref: "ref_15",
  value: "test@example.com",
  tabId: 123
})

// Press keys
mcp__claude-in-chrome__computer({
  action: "key",
  text: "Enter",
  tabId: 123
})
```

### Screenshots and GIFs
```typescript
// Take screenshot
mcp__claude-in-chrome__computer({ action: "screenshot", tabId: 123 })

// Record GIF for complex interaction
mcp__claude-in-chrome__gif_creator({ action: "start_recording", tabId: 123 })
// ... perform actions ...
mcp__claude-in-chrome__gif_creator({
  action: "export",
  download: true,
  filename: "login-flow.gif",
  tabId: 123
})
```

### Verification Pattern

```
1. Navigate to the page
2. Screenshot to see current state
3. Find the element you want to interact with
4. Interact with it
5. Screenshot to verify the result
6. If complex flow, record a GIF
```

---

## Playwright (Permanent E2E Tests)

### Project Setup

```typescript
// playwright.config.ts
import { defineConfig, devices } from '@playwright/test';

export default defineConfig({
  testDir: './e2e',
  fullyParallel: true,
  forbidOnly: !!process.env.CI,
  retries: process.env.CI ? 2 : 0,
  workers: process.env.CI ? 1 : undefined,
  reporter: [
    ['html'],
    ['junit', { outputFile: 'results.xml' }],
  ],
  use: {
    baseURL: 'http://localhost:5173',
    trace: 'on-first-retry',
    screenshot: 'only-on-failure',
  },
  projects: [
    {
      name: 'chromium',
      use: { ...devices['Desktop Chrome'] },
    },
    {
      name: 'firefox',
      use: { ...devices['Desktop Firefox'] },
    },
    {
      name: 'webkit',
      use: { ...devices['Desktop Safari'] },
    },
    {
      name: 'mobile',
      use: { ...devices['iPhone 13'] },
    },
  ],
  webServer: {
    command: 'npm run dev',
    url: 'http://localhost:5173',
    reuseExistingServer: !process.env.CI,
  },
});
```
### Page Object Pattern

```typescript
// e2e/pages/login.page.ts
import { Page, Locator, expect } from '@playwright/test';

export class LoginPage {
  readonly page: Page;
  readonly emailInput: Locator;
  readonly passwordInput: Locator;
  readonly submitButton: Locator;
  readonly errorMessage: Locator;

  constructor(page: Page) {
    this.page = page;
    this.emailInput = page.getByLabel('Email');
    this.passwordInput = page.getByLabel('Password');
    this.submitButton = page.getByRole('button', { name: 'Sign in' });
    this.errorMessage = page.getByRole('alert');
  }

  async goto() {
    await this.page.goto('/login');
  }

  async login(email: string, password: string) {
    await this.emailInput.fill(email);
    await this.passwordInput.fill(password);
    await this.submitButton.click();
  }

  async expectError(message: string) {
    await expect(this.errorMessage).toContainText(message);
  }

  async expectLoggedIn() {
    await expect(this.page).toHaveURL('/dashboard');
  }
}
```
### Test Patterns

```typescript
// e2e/auth/login.spec.ts
import { test, expect } from '@playwright/test';
import { LoginPage } from '../pages/login.page';

test.describe('Login', () => {
  let loginPage: LoginPage;

  test.beforeEach(async ({ page }) => {
    loginPage = new LoginPage(page);
    await loginPage.goto();
  });

  test('should show validation errors for empty form', async () => {
    await loginPage.submitButton.click();

    await expect(loginPage.page.getByText('Email is required')).toBeVisible();
    await expect(loginPage.page.getByText('Password is required')).toBeVisible();
  });

  test('should show error for invalid credentials', async () => {
    await loginPage.login('wrong@example.com', 'wrongpassword');

    await loginPage.expectError('Invalid credentials');
  });

  test('should redirect to dashboard on successful login', async () => {
    await loginPage.login('user@example.com', 'correctpassword');

    await loginPage.expectLoggedIn();
    await expect(loginPage.page.getByRole('heading', { name: 'Dashboard' })).toBeVisible();
  });

  test('should persist session after page reload', async ({ page }) => {
    await loginPage.login('user@example.com', 'correctpassword');
    await loginPage.expectLoggedIn();

    await page.reload();

    await expect(page).toHaveURL('/dashboard');
  });
});
```
### API Mocking

```typescript
// e2e/dashboard/projects.spec.ts
import { test, expect } from '@playwright/test';

test.describe('Projects Dashboard', () => {
  test('should display projects from API', async ({ page }) => {
    // Mock API response
    await page.route('**/api/projects', async (route) => {
      await route.fulfill({
        status: 200,
        contentType: 'application/json',
        body: JSON.stringify([
          { id: '1', name: 'Project A', status: 'active' },
          { id: '2', name: 'Project B', status: 'completed' },
        ]),
      });
    });

    await page.goto('/dashboard');

    await expect(page.getByText('Project A')).toBeVisible();
    await expect(page.getByText('Project B')).toBeVisible();
  });

  test('should show error state on API failure', async ({ page }) => {
    await page.route('**/api/projects', async (route) => {
      await route.fulfill({ status: 500 });
    });

    await page.goto('/dashboard');

    await expect(page.getByRole('alert')).toContainText('Failed to load projects');
  });
});
```
### Visual Regression Testing

```typescript
// e2e/visual/components.spec.ts
import { test, expect } from '@playwright/test';

test.describe('Visual Regression', () => {
  test('login page matches snapshot', async ({ page }) => {
    await page.goto('/login');

    await expect(page).toHaveScreenshot('login-page.png');
  });

  test('dashboard matches snapshot', async ({ page }) => {
    // Setup authenticated state
    await page.goto('/dashboard');

    await expect(page).toHaveScreenshot('dashboard.png', {
      mask: [page.locator('.timestamp')], // Mask dynamic content
    });
  });
});
```
### Accessibility Testing

```typescript
// e2e/a11y/accessibility.spec.ts
import { test, expect } from '@playwright/test';
import AxeBuilder from '@axe-core/playwright';

test.describe('Accessibility', () => {
  test('login page should have no accessibility violations', async ({ page }) => {
    await page.goto('/login');

    const results = await new AxeBuilder({ page }).analyze();

    expect(results.violations).toEqual([]);
  });

  test('dashboard should have no accessibility violations', async ({ page }) => {
    await page.goto('/dashboard');

    const results = await new AxeBuilder({ page })
      .exclude('.third-party-widget') // Exclude third-party content
      .analyze();

    expect(results.violations).toEqual([]);
  });
});
```
## Commands

```bash
# Playwright
npx playwright test                      # Run all tests
npx playwright test --project=chromium   # Specific browser
npx playwright test login.spec.ts        # Specific file
npx playwright test --ui                 # Interactive UI mode
npx playwright test --debug              # Debug mode
npx playwright show-report               # View HTML report
npx playwright codegen                   # Generate tests by recording

# Update snapshots
npx playwright test --update-snapshots
```
## Best Practices

1. **Use role-based locators** (accessible to all)
   ```typescript
   page.getByRole('button', { name: 'Submit' })
   page.getByLabel('Email')
   ```

2. **Wait for specific conditions** (not arbitrary timeouts)
   ```typescript
   await expect(page.getByText('Success')).toBeVisible();
   ```

3. **Isolate tests** (each test sets up its own state)
   ```typescript
   test.beforeEach(async ({ page }) => {
     await page.goto('/login');
   });
   ```

4. **Use Page Objects** (DRY, maintainable)
   ```typescript
   const loginPage = new LoginPage(page);
   await loginPage.login(email, password);
   ```

5. **Mock external APIs** (fast, reliable)
   ```typescript
   await page.route('**/api/**', handler);
   ```
349
.claude/skills/testing/tdd/SKILL.md
Normal file
@@ -0,0 +1,349 @@
---
name: tdd-workflow
description: Test-Driven Development workflow with RED-GREEN-REFACTOR cycle, coverage verification, and quality gates. Use when planning or implementing features with TDD.
---

# TDD Workflow Skill

## The TDD Cycle

```
┌─────────────────────────────────────────────────────┐
│                                                     │
│   ┌─────┐      ┌───────┐      ┌──────────┐          │
│   │ RED │ ──▶  │ GREEN │ ──▶  │ REFACTOR │ ──┐      │
│   └─────┘      └───────┘      └──────────┘   │      │
│      ▲                                       │      │
│      └───────────────────────────────────────┘      │
│                                                     │
└─────────────────────────────────────────────────────┘
```
## Phase 1: RED (Write Failing Test)

### Rules
- Write ONE test that describes desired behavior
- Test must fail for the RIGHT reason
- NO production code exists yet

### Checklist
- [ ] Test describes behavior, not implementation
- [ ] Test name clearly states expected outcome
- [ ] Test uses factory functions for data
- [ ] Test fails with meaningful error message
- [ ] Error indicates missing behavior, not syntax/import errors

### Example
```typescript
// STEP 1: Write the failing test
describe('PaymentProcessor', () => {
  it('should reject payments with negative amounts', () => {
    const payment = getMockPayment({ amount: -100 });

    const result = processPayment(payment);

    expect(result.success).toBe(false);
    expect(result.error).toBe('Amount must be positive');
  });
});

// At this point:
// - processPayment doesn't exist
// - Running the test gives: "ReferenceError: processPayment is not defined"
// - This is the RIGHT failure - the function doesn't exist yet
```
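The checklist's "factory functions for data" item pairs with the `getMockPayment` helper used in the example but not defined in this skill. A Python sketch of the same pattern (field names and defaults are illustrative):

```python
from dataclasses import dataclass, replace


@dataclass(frozen=True)
class Payment:
    # Valid-by-default values; tests override only the field under test.
    amount: int = 100
    currency: str = "USD"
    method: str = "card"


def get_mock_payment(**overrides) -> Payment:
    """Factory: a valid payment, with targeted overrides per test."""
    return replace(Payment(), **overrides)
```

Each test then states only the detail it cares about, e.g. `get_mock_payment(amount=-100)`, keeping test data valid by default and the override visible at the call site.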
## Phase 2: GREEN (Write Minimum Code)

### Rules
- Write ONLY enough code to pass the test
- No additional functionality
- No "while you're there" improvements
- Ugly code is acceptable (we'll refactor)

### Checklist
- [ ] Implementation makes test pass
- [ ] No code beyond what test demands
- [ ] All existing tests still pass
- [ ] Coverage increased for new code

### Example
```typescript
// STEP 2: Write minimum code to pass
export function processPayment(payment: Payment): PaymentResult {
  if (payment.amount < 0) {
    return { success: false, error: 'Amount must be positive' };
  }
  return { success: true };
}

// This passes the test - that's enough for now
// Don't add validation for zero, max amounts, etc.
// Those need their own tests first!
```
## Phase 3: REFACTOR (Improve Code Quality)

### Rules
- COMMIT before refactoring
- Only refactor if it adds value
- All tests must stay green
- No new functionality

### When to Refactor
- Duplicate knowledge (not just similar code)
- Unclear names
- Complex conditionals
- Magic numbers/strings

### When NOT to Refactor
- Code is already clean
- Similarity is structural, not semantic
- Would add unnecessary abstraction

### Example
```typescript
// STEP 3: Assess and refactor
// BEFORE REFACTORING: git commit -m "feat: add negative amount validation"

// Assessment:
// - Magic number 0 could be a constant
// - Error message could be a constant
// - But... it's simple enough. Skip refactoring.

// If we had multiple validations:
const VALIDATION_ERRORS = {
  NEGATIVE_AMOUNT: 'Amount must be positive',
  ZERO_AMOUNT: 'Amount must be greater than zero',
  EXCEEDS_LIMIT: 'Amount exceeds maximum limit',
} as const;

const MAX_PAYMENT_AMOUNT = 10000;

export function processPayment(payment: Payment): PaymentResult {
  if (payment.amount < 0) {
    return { success: false, error: VALIDATION_ERRORS.NEGATIVE_AMOUNT };
  }
  if (payment.amount === 0) {
    return { success: false, error: VALIDATION_ERRORS.ZERO_AMOUNT };
  }
  if (payment.amount > MAX_PAYMENT_AMOUNT) {
    return { success: false, error: VALIDATION_ERRORS.EXCEEDS_LIMIT };
  }
  return { success: true };
}
```
## Complete TDD Session Example

```typescript
// === CYCLE 1: Negative amounts ===

// RED: Write failing test
it('should reject negative amounts', () => {
  const payment = getMockPayment({ amount: -100 });
  const result = processPayment(payment);
  expect(result.success).toBe(false);
});
// RUN: ❌ FAIL - processPayment is not defined

// GREEN: Minimal implementation
function processPayment(payment: Payment): PaymentResult {
  if (payment.amount < 0) return { success: false };
  return { success: true };
}
// RUN: ✅ PASS

// REFACTOR: Assess - simple enough, skip
// COMMIT: "feat: reject negative payment amounts"

// === CYCLE 2: Zero amounts ===

// RED: Write failing test
it('should reject zero amounts', () => {
  const payment = getMockPayment({ amount: 0 });
  const result = processPayment(payment);
  expect(result.success).toBe(false);
});
// RUN: ❌ FAIL - expected false, got true

// GREEN: Add zero check
function processPayment(payment: Payment): PaymentResult {
  if (payment.amount <= 0) return { success: false };
  return { success: true };
}
// RUN: ✅ PASS

// REFACTOR: Assess - still simple, skip
// COMMIT: "feat: reject zero payment amounts"

// === CYCLE 3: Maximum amount ===

// RED: Write failing test
it('should reject amounts over 10000', () => {
  const payment = getMockPayment({ amount: 10001 });
  const result = processPayment(payment);
  expect(result.success).toBe(false);
});
// RUN: ❌ FAIL - expected false, got true

// GREEN: Add max check
function processPayment(payment: Payment): PaymentResult {
  if (payment.amount <= 0) return { success: false };
  if (payment.amount > 10000) return { success: false };
  return { success: true };
}
// RUN: ✅ PASS

// REFACTOR: Now we have magic number 10000
// COMMIT first: "feat: reject payments over maximum"
// Then refactor:
const MAX_PAYMENT_AMOUNT = 10000;

function processPayment(payment: Payment): PaymentResult {
  if (payment.amount <= 0) return { success: false };
  if (payment.amount > MAX_PAYMENT_AMOUNT) return { success: false };
  return { success: true };
}
// RUN: ✅ PASS
// COMMIT: "refactor: extract max payment constant"
```
## Quality Gates

Before committing, verify:

```bash
# 1. All tests pass
npm test                    # or pytest, cargo test

# 2. Coverage meets threshold
npm test -- --coverage
# Verify: 80%+ overall

# 3. Type checking passes
npm run typecheck           # or mypy, cargo check

# 4. Linting passes
npm run lint                # or ruff, cargo clippy

# 5. No secrets in code
git diff --staged | grep -i "password\|secret\|api.key"
```
## Coverage Verification

**CRITICAL: Never trust claimed coverage - always verify.**

```bash
# Python
pytest --cov=src --cov-report=term-missing --cov-fail-under=80

# TypeScript
npm test -- --coverage --coverageThreshold='{"global":{"branches":80,"functions":80,"lines":80}}'

# Rust
cargo tarpaulin --fail-under 80
```
### Interpreting Coverage

| Metric | Meaning | Target |
|--------|---------|--------|
| Lines | % of lines executed | 80%+ |
| Branches | % of if/else paths taken | 80%+ |
| Functions | % of functions called | 80%+ |
| Statements | % of statements executed | 80%+ |

### Coverage Exceptions (Document!)

```python
# pragma: no cover - Reason: Debug utility only used in development
def debug_print(data):
    print(json.dumps(data, indent=2))
```

```typescript
/* istanbul ignore next -- @preserve Reason: Error boundary for React */
if (process.env.NODE_ENV === 'development') {
  console.error(error);
}
```
## Anti-Patterns
|
||||||
|
|
||||||
|
### Writing Test After Code
|
||||||
|
```
|
||||||
|
❌ WRONG:
|
||||||
|
1. Write all production code
|
||||||
|
2. Write tests to cover it
|
||||||
|
3. "I have 80% coverage!"
|
||||||
|
|
||||||
|
✅ CORRECT:
|
||||||
|
1. Write failing test
|
||||||
|
2. Write code to pass
|
||||||
|
3. Repeat until feature complete
|
||||||
|
```
|
||||||
|
|
||||||
|
### Testing Implementation
|
||||||
|
```typescript
|
||||||
|
// ❌ BAD: Tests implementation detail
|
||||||
|
it('should call validatePayment', () => {
|
||||||
|
const spy = jest.spyOn(service, 'validatePayment');
|
||||||
|
processPayment(payment);
|
||||||
|
expect(spy).toHaveBeenCalled();
|
||||||
|
});
|
||||||
|
|
||||||
|
// ✅ GOOD: Tests behavior
|
||||||
|
it('should reject invalid payments', () => {
|
||||||
|
const payment = getMockPayment({ amount: -1 });
|
||||||
|
const result = processPayment(payment);
|
||||||
|
expect(result.success).toBe(false);
|
||||||
|
});
|
||||||
|
```
|
||||||
|
|
||||||
|
### Writing Too Many Tests at Once
|
||||||
|
```
|
||||||
|
❌ WRONG:
|
||||||
|
1. Write 10 tests
|
||||||
|
2. Implement everything
|
||||||
|
3. All tests pass
|
||||||
|
|
||||||
|
✅ CORRECT:
|
||||||
|
1. Write 1 test
|
||||||
|
2. Make it pass
|
||||||
|
3. Refactor if needed
|
||||||
|
4. Repeat
|
||||||
|
```
|
||||||
|
|
||||||
|
### Skipping Refactor Assessment

```
❌ WRONG:
1. Test passes
2. Immediately write next test
3. Code becomes messy

✅ CORRECT:
1. Test passes
2. Ask: "Should I refactor?"
3. If yes: commit, then refactor
4. If no: continue
```

## Commit Messages

Follow this pattern:

```bash
# After GREEN (feature added)
git commit -m "feat: add payment amount validation"

# After REFACTOR
git commit -m "refactor: extract payment validation constants"

# Test-only changes
git commit -m "test: add edge cases for payment validation"

# Bug fixes (driven by failing test)
git commit -m "fix: handle null payment amount"
```

445	.claude/skills/testing/ui-testing/SKILL.md	Normal file
@@ -0,0 +1,445 @@
---
name: ui-testing
description: React component testing with React Testing Library, user-centric patterns, and accessibility testing. Use when writing tests for React components or UI interactions.
---

# UI Testing Skill

## Core Principles

1. **Test behavior, not implementation** - What users see and do
2. **Query by accessibility** - Role, label, text (what assistive tech sees)
3. **User events over fire events** - Real user interactions
4. **Avoid test IDs when possible** - Last resort only

## Query Priority

```typescript
// Best: Accessible to everyone
screen.getByRole('button', { name: 'Submit' })
screen.getByLabelText('Email address')
screen.getByPlaceholderText('Search...')
screen.getByText('Welcome back')

// Good: Semantic queries
screen.getByAltText('Company logo')
screen.getByTitle('Close dialog')

// Acceptable: Test IDs (when no other option)
screen.getByTestId('custom-datepicker')
```

## Component Testing Pattern

```typescript
// tests/features/dashboard/Dashboard.test.tsx
import { render, screen, within } from '@testing-library/react';
import userEvent from '@testing-library/user-event';
import { describe, it, expect, vi } from 'vitest';

import { Dashboard } from '@/features/dashboard/Dashboard';
import { getMockUser, getMockProject } from './factories';

describe('Dashboard', () => {
  it('should display user name in header', () => {
    const user = getMockUser({ name: 'Alice' });

    render(<Dashboard user={user} projects={[]} />);

    expect(
      screen.getByRole('heading', { name: /welcome, alice/i })
    ).toBeInTheDocument();
  });

  it('should show empty state when no projects', () => {
    render(<Dashboard user={getMockUser()} projects={[]} />);

    expect(screen.getByText(/no projects yet/i)).toBeInTheDocument();
    expect(
      screen.getByRole('button', { name: /create project/i })
    ).toBeInTheDocument();
  });

  it('should list all projects with their status', () => {
    const projects = [
      getMockProject({ name: 'Project A', status: 'active' }),
      getMockProject({ name: 'Project B', status: 'completed' }),
    ];

    render(<Dashboard user={getMockUser()} projects={projects} />);

    const projectList = screen.getByRole('list', { name: /projects/i });
    const items = within(projectList).getAllByRole('listitem');

    expect(items).toHaveLength(2);
    expect(screen.getByText('Project A')).toBeInTheDocument();
    expect(screen.getByText('Project B')).toBeInTheDocument();
  });

  it('should navigate to project when clicked', async () => {
    const user = userEvent.setup();
    const onNavigate = vi.fn();
    const projects = [getMockProject({ id: 'proj-1', name: 'My Project' })];

    render(
      <Dashboard
        user={getMockUser()}
        projects={projects}
        onNavigate={onNavigate}
      />
    );

    await user.click(screen.getByRole('link', { name: /my project/i }));

    expect(onNavigate).toHaveBeenCalledWith('/projects/proj-1');
  });
});
```

## Form Testing

```typescript
describe('LoginForm', () => {
  it('should show validation errors for empty fields', async () => {
    const user = userEvent.setup();
    render(<LoginForm onSubmit={vi.fn()} />);

    await user.click(screen.getByRole('button', { name: /sign in/i }));

    expect(screen.getByText(/email is required/i)).toBeInTheDocument();
    expect(screen.getByText(/password is required/i)).toBeInTheDocument();
  });

  it('should show error for invalid email format', async () => {
    const user = userEvent.setup();
    render(<LoginForm onSubmit={vi.fn()} />);

    await user.type(screen.getByLabelText(/email/i), 'invalid-email');
    await user.click(screen.getByRole('button', { name: /sign in/i }));

    expect(screen.getByText(/invalid email format/i)).toBeInTheDocument();
  });

  it('should submit form with valid data', async () => {
    const user = userEvent.setup();
    const onSubmit = vi.fn();
    render(<LoginForm onSubmit={onSubmit} />);

    await user.type(screen.getByLabelText(/email/i), 'user@example.com');
    await user.type(screen.getByLabelText(/password/i), 'securepass123');
    await user.click(screen.getByRole('button', { name: /sign in/i }));

    expect(onSubmit).toHaveBeenCalledWith({
      email: 'user@example.com',
      password: 'securepass123',
    });
  });

  it('should disable submit button while loading', () => {
    render(<LoginForm onSubmit={vi.fn()} isLoading />);

    expect(screen.getByRole('button', { name: /signing in/i })).toBeDisabled();
  });

  it('should show server error message', () => {
    render(
      <LoginForm onSubmit={vi.fn()} error="Invalid credentials" />
    );

    expect(screen.getByRole('alert')).toHaveTextContent(/invalid credentials/i);
  });
});
```

## Async Testing

```typescript
describe('UserProfile', () => {
  it('should show loading state initially', () => {
    render(<UserProfile userId="user-1" />);

    expect(screen.getByRole('progressbar')).toBeInTheDocument();
  });

  it('should display user data when loaded', async () => {
    // Mock API returns user data
    server.use(
      http.get('/api/users/user-1', () => {
        return HttpResponse.json(getMockUser({ name: 'Alice' }));
      })
    );

    render(<UserProfile userId="user-1" />);

    // Wait for data to load
    await waitFor(() => {
      expect(screen.queryByRole('progressbar')).not.toBeInTheDocument();
    });

    expect(screen.getByText('Alice')).toBeInTheDocument();
  });

  it('should show error state on failure', async () => {
    server.use(
      http.get('/api/users/user-1', () => {
        return HttpResponse.error();
      })
    );

    render(<UserProfile userId="user-1" />);

    await waitFor(() => {
      expect(screen.getByRole('alert')).toBeInTheDocument();
    });

    expect(screen.getByText(/failed to load user/i)).toBeInTheDocument();
  });
});
```

## Modal and Dialog Testing

```typescript
describe('ConfirmDialog', () => {
  it('should not be visible when closed', () => {
    render(<ConfirmDialog isOpen={false} onConfirm={vi.fn()} />);

    expect(screen.queryByRole('dialog')).not.toBeInTheDocument();
  });

  it('should be visible when open', () => {
    render(
      <ConfirmDialog
        isOpen={true}
        title="Delete item?"
        message="This action cannot be undone."
        onConfirm={vi.fn()}
      />
    );

    expect(screen.getByRole('dialog')).toBeInTheDocument();
    expect(
      screen.getByRole('heading', { name: /delete item/i })
    ).toBeInTheDocument();
  });

  it('should call onConfirm when confirmed', async () => {
    const user = userEvent.setup();
    const onConfirm = vi.fn();

    render(
      <ConfirmDialog isOpen={true} onConfirm={onConfirm} onCancel={vi.fn()} />
    );

    await user.click(screen.getByRole('button', { name: /confirm/i }));

    expect(onConfirm).toHaveBeenCalledTimes(1);
  });

  it('should call onCancel when cancelled', async () => {
    const user = userEvent.setup();
    const onCancel = vi.fn();

    render(
      <ConfirmDialog isOpen={true} onConfirm={vi.fn()} onCancel={onCancel} />
    );

    await user.click(screen.getByRole('button', { name: /cancel/i }));

    expect(onCancel).toHaveBeenCalledTimes(1);
  });

  it('should close on escape key', async () => {
    const user = userEvent.setup();
    const onCancel = vi.fn();

    render(
      <ConfirmDialog isOpen={true} onConfirm={vi.fn()} onCancel={onCancel} />
    );

    await user.keyboard('{Escape}');

    expect(onCancel).toHaveBeenCalledTimes(1);
  });
});
```

## Accessibility Testing

```typescript
import { axe, toHaveNoViolations } from 'jest-axe';

expect.extend(toHaveNoViolations);

describe('Accessibility', () => {
  it('should have no accessibility violations', async () => {
    const { container } = render(<LoginForm onSubmit={vi.fn()} />);

    const results = await axe(container);

    expect(results).toHaveNoViolations();
  });

  it('should have proper focus management', async () => {
    const user = userEvent.setup();
    render(<LoginForm onSubmit={vi.fn()} />);

    // Tab through form
    await user.tab();
    expect(screen.getByLabelText(/email/i)).toHaveFocus();

    await user.tab();
    expect(screen.getByLabelText(/password/i)).toHaveFocus();

    await user.tab();
    expect(screen.getByRole('button', { name: /sign in/i })).toHaveFocus();
  });

  it('should announce errors to screen readers', async () => {
    const user = userEvent.setup();
    render(<LoginForm onSubmit={vi.fn()} />);

    await user.click(screen.getByRole('button', { name: /sign in/i }));

    const errorMessage = screen.getByText(/email is required/i);
    expect(errorMessage).toHaveAttribute('role', 'alert');
  });
});
```

## Testing Select/Dropdown Components

```typescript
describe('CountrySelect', () => {
  it('should show selected country', () => {
    render(<CountrySelect value="GB" onChange={vi.fn()} />);

    expect(screen.getByRole('combobox')).toHaveTextContent('United Kingdom');
  });

  it('should show options when opened', async () => {
    const user = userEvent.setup();
    render(<CountrySelect value="" onChange={vi.fn()} />);

    await user.click(screen.getByRole('combobox'));

    expect(screen.getByRole('listbox')).toBeInTheDocument();
    expect(screen.getByRole('option', { name: /united states/i })).toBeInTheDocument();
    expect(screen.getByRole('option', { name: /united kingdom/i })).toBeInTheDocument();
  });

  it('should call onChange when option selected', async () => {
    const user = userEvent.setup();
    const onChange = vi.fn();
    render(<CountrySelect value="" onChange={onChange} />);

    await user.click(screen.getByRole('combobox'));
    await user.click(screen.getByRole('option', { name: /germany/i }));

    expect(onChange).toHaveBeenCalledWith('DE');
  });

  it('should filter options when typing', async () => {
    const user = userEvent.setup();
    render(<CountrySelect value="" onChange={vi.fn()} />);

    await user.click(screen.getByRole('combobox'));
    await user.type(screen.getByRole('combobox'), 'united');

    const options = screen.getAllByRole('option');
    expect(options).toHaveLength(2); // US and UK
  });
});
```

## Testing with Context Providers

```typescript
// tests/utils/render.tsx
import { render, type RenderOptions } from '@testing-library/react';
import { QueryClient, QueryClientProvider } from '@tanstack/react-query';
import { ThemeProvider } from '@/shared/context/ThemeContext';
import { AuthProvider } from '@/features/auth/AuthContext';

type WrapperProps = {
  children: React.ReactNode;
};

const createTestQueryClient = () =>
  new QueryClient({
    defaultOptions: {
      queries: { retry: false },
      mutations: { retry: false },
    },
  });

export function renderWithProviders(
  ui: React.ReactElement,
  options?: Omit<RenderOptions, 'wrapper'>
) {
  const queryClient = createTestQueryClient();

  function Wrapper({ children }: WrapperProps) {
    return (
      <QueryClientProvider client={queryClient}>
        <AuthProvider>
          <ThemeProvider>{children}</ThemeProvider>
        </AuthProvider>
      </QueryClientProvider>
    );
  }

  return {
    ...render(ui, { wrapper: Wrapper, ...options }),
    queryClient,
  };
}
```

## Anti-Patterns to Avoid

```typescript
// BAD: Testing implementation details
it('should update state', () => {
  const { result } = renderHook(() => useState(0));
  act(() => result.current[1](1));
  expect(result.current[0]).toBe(1);
});

// GOOD: Test user-visible behavior
it('should increment counter when clicked', async () => {
  render(<Counter />);
  await userEvent.click(screen.getByRole('button', { name: /increment/i }));
  expect(screen.getByText('1')).toBeInTheDocument();
});

// BAD: Using test IDs when accessible query exists
screen.getByTestId('submit-button')

// GOOD: Use accessible query
screen.getByRole('button', { name: /submit/i })

// BAD: Waiting with arbitrary timeout
await new Promise(r => setTimeout(r, 1000));

// GOOD: Wait for specific condition
await waitFor(() => {
  expect(screen.getByText('Loaded')).toBeInTheDocument();
});

// BAD: Using fireEvent for user interactions
fireEvent.click(button);

// GOOD: Use userEvent for realistic interactions
await userEvent.click(button);

// BAD: Querying by class or internal structure
container.querySelector('.btn-primary')

// GOOD: Query by role and accessible name
screen.getByRole('button', { name: /save/i })
```

112	.gitignore	vendored	Normal file
@@ -0,0 +1,112 @@
# OS files
.DS_Store
.DS_Store?
._*
.Spotlight-V100
.Trashes
ehthumbs.db
Thumbs.db
*.swp
*.swo
*~

# VS Code
.vscode/
*.code-workspace

# JetBrains IDEs
.idea/
*.iml

# Node.js
node_modules/
npm-debug.log*
yarn-debug.log*
yarn-error.log*
.npm
.yarn/cache
.pnp.*

# Python
__pycache__/
*.py[cod]
*$py.class
*.so
.Python
build/
develop-eggs/
dist/
downloads/
eggs/
.eggs/
lib/
lib64/
parts/
sdist/
var/
wheels/
*.egg-info/
.installed.cfg
*.egg
.venv/
venv/
ENV/
.env
.env.local
.env.*.local
*.pyc

# Rust
target/
Cargo.lock
**/*.rs.bk

# Coverage reports
coverage/
*.lcov
.nyc_output/
htmlcov/
.coverage
.coverage.*
coverage.xml
*.cover
*.coverage

# Build outputs
dist/
build/
out/
*.js.map

# Terraform
.terraform/
*.tfstate
*.tfstate.*
*.tfvars
!example.tfvars
.terraform.lock.hcl

# Ansible
*.retry

# Secrets (NEVER commit these)
*.pem
*.key
secrets.yml
secrets.yaml
credentials.json
.env.production

# Logs
logs/
*.log

# Temporary files
tmp/
temp/
*.tmp
*.temp

# Test artifacts
test-results/
playwright-report/
72	README.md	Normal file
@@ -0,0 +1,72 @@
# AI Development Scaffold

A comprehensive Claude Code configuration for professional software development with strict TDD enforcement, multi-language support, and cloud infrastructure patterns.

## What's Included

- **TDD Enforcement** - Strict RED-GREEN-REFACTOR workflow with 80%+ coverage requirements
- **Multi-Language Support** - Python, TypeScript, Rust, Go, Java, C#
- **Cloud Patterns** - AWS, Azure, GCP infrastructure templates
- **IaC** - Terraform and Ansible best practices
- **Testing** - Unit, UI, and browser testing patterns (Playwright + Chrome MCP)
- **Security** - Automatic secret detection and blocking
- **Model Strategy** - Opus for planning, Sonnet for execution (`opusplan`)

## Quick Start

```bash
# 1. Clone this repo
git clone https://github.com/yourusername/ai-development-scaffold.git

# 2. Back up existing Claude config (if any)
mv ~/.claude ~/.claude.backup

# 3. Deploy
mkdir -p ~/.claude
cp -r ai-development-scaffold/.claude/* ~/.claude/
chmod +x ~/.claude/hooks/*.sh

# 4. Verify
ls ~/.claude/
# Should show: CLAUDE.md, settings.json, agents/, skills/, hooks/
```

## Documentation

See [.claude/README.md](.claude/README.md) for:
- Complete file structure
- Hook configuration (Claude hooks vs git hooks)
- Usage examples
- Customization options
- Troubleshooting

## Agents Available

| Agent | Purpose |
|-------|---------|
| `@tdd-guardian` | TDD enforcement and guidance |
| `@code-reviewer` | Comprehensive PR review (uses Opus) |
| `@security-scanner` | Vulnerability and secrets detection |
| `@refactor-scan` | Refactoring assessment (TDD step 3) |
| `@dependency-audit` | Outdated/vulnerable package checks |

## Skills Available

| Category | Skills |
|----------|--------|
| Languages | Python, TypeScript, Rust, Go, Java, C# |
| Infrastructure | AWS, Azure, GCP, Terraform, Ansible, Docker/K8s, Database, CI/CD |
| Testing | TDD, UI Testing, Browser Testing |
| Patterns | Monorepo, API Design, Observability |

## Coverage Targets

| Layer | Target |
|-------|--------|
| Domain/Business Logic | 90%+ |
| API Routes | 80%+ |
| Infrastructure/DB | 70%+ |
| UI Components | 80%+ |

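On the Python side, the overall floor can be enforced at test time with pytest-cov — a minimal config sketch, assuming a `src/` layout and the 80% baseline above (per-layer gates would need separate runs or a coverage plugin):

```toml
# pyproject.toml -- fail the test run if coverage drops below the target
[tool.pytest.ini_options]
addopts = "--cov=src --cov-report=term-missing --cov-fail-under=80"
```
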

## License

MIT