---
name: gcp-services
description: Google Cloud Platform service patterns, IAM best practices, and common architectures. Use when designing or implementing GCP infrastructure.
---

GCP Services Skill

Common Architecture Patterns

Web Application (Cloud Run + Cloud SQL)

┌─────────────────────────────────────────────────────────────┐
│                         VPC                                  │
│  ┌─────────────────────────────────────────────────────────┐│
│  │                   Public Subnet                          ││
│  │  ┌─────────────┐                    ┌─────────────┐     ││
│  │  │  Cloud      │                    │  Cloud      │     ││
│  │  │  Load       │                    │  NAT        │     ││
│  │  │  Balancing  │                    │             │     ││
│  │  └──────┬──────┘                    └──────┬──────┘     ││
│  └─────────┼───────────────────────────────────┼───────────┘│
│            │                                   │             │
│  ┌─────────┼───────────────────────────────────┼───────────┐│
│  │         │       Private Subnet              │           ││
│  │  ┌──────▼──────┐                    ┌───────▼─────┐     ││
│  │  │  Cloud Run  │                    │  Cloud SQL  │     ││
│  │  │  (Service)  │───────────────────▶│  PostgreSQL │     ││
│  │  └─────────────┘                    └─────────────┘     ││
│  └─────────────────────────────────────────────────────────┘│
└─────────────────────────────────────────────────────────────┘
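
In this layout, Cloud Run reaches Cloud SQL over a private connection rather than a public IP. One way to wire that up from application code is the Cloud SQL Python Connector; a minimal sketch, where the instance connection name, user, and database names are hypothetical:

import os

import sqlalchemy
from google.cloud.sql.connector import Connector

connector = Connector()

def getconn():
    # Instance connection name format: PROJECT:REGION:INSTANCE (hypothetical values)
    return connector.connect(
        "my-project:us-central1:my-instance",
        "pg8000",
        user="app-user",
        password=os.environ["DB_PASSWORD"],  # e.g. injected from Secret Manager
        db="app-db",
    )

# SQLAlchemy engine whose connections dial Cloud SQL through the connector
pool = sqlalchemy.create_engine("postgresql+pg8000://", creator=getconn)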

Serverless (Cloud Functions + API Gateway)

┌────────────┐     ┌─────────────┐     ┌─────────────┐
│   Cloud    │────▶│     API     │────▶│   Cloud     │
│   CDN      │     │   Gateway   │     │  Functions  │
└────────────┘     └─────────────┘     └──────┬──────┘
                                              │
                   ┌──────────────────────────┼──────────────┐
                   │                          │              │
            ┌──────▼─────┐  ┌─────────┐  ┌────▼────┐
            │  Firestore │  │  Cloud  │  │ Secret  │
            └────────────┘  │ Storage │  │ Manager │
                            └─────────┘  └─────────┘
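
API Gateway is driven by an OpenAPI 2.0 spec that routes paths to function backends via the x-google-backend extension. Standing the gateway up looks roughly like this (command sketch; the API, config, and gateway names are placeholders):

# Create an API config from an OpenAPI spec, then a gateway that serves it
gcloud api-gateway api-configs create orders-config \
  --api=orders-api --openapi-spec=openapi.yaml
gcloud api-gateway gateways create orders-gateway \
  --api=orders-api --api-config=orders-config --location=us-central1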

IAM Best Practices

Service Account with Least Privilege

# Service account for Cloud Run service
resource "google_service_account" "app_sa" {
  account_id   = "my-app-service"
  display_name = "My App Service Account"
}

# Grant specific permissions
resource "google_project_iam_member" "app_storage" {
  project = var.project_id
  role    = "roles/storage.objectViewer"
  member  = "serviceAccount:${google_service_account.app_sa.email}"
}

resource "google_project_iam_member" "app_secrets" {
  project = var.project_id
  role    = "roles/secretmanager.secretAccessor"
  member  = "serviceAccount:${google_service_account.app_sa.email}"
}
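
Project-level roles like the ones above apply to every bucket and secret in the project. When feasible, scope the grant to a single resource instead; for example, a per-secret binding (sketch, assuming a secret named database-password):

# Grant access to one secret only, not all secrets in the project
resource "google_secret_manager_secret_iam_member" "app_db_password" {
  secret_id = "database-password"
  role      = "roles/secretmanager.secretAccessor"
  member    = "serviceAccount:${google_service_account.app_sa.email}"
}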

Workload Identity Federation

# Python - google-auth with workload identity
from google.auth import default
from google.cloud import storage

# Automatically uses workload identity when on GKE/Cloud Run
credentials, project = default()

# Access Cloud Storage
storage_client = storage.Client(credentials=credentials, project=project)
bucket = storage_client.bucket("my-bucket")

Secret Manager Patterns

Accessing Secrets

from google.cloud import secretmanager

def get_secret(project_id: str, secret_id: str, version: str = "latest") -> str:
    """Access a secret from Secret Manager."""
    client = secretmanager.SecretManagerServiceClient()
    name = f"projects/{project_id}/secrets/{secret_id}/versions/{version}"
    response = client.access_secret_version(request={"name": name})
    return response.payload.data.decode("UTF-8")

# Usage
db_password = get_secret("my-project", "database-password")

// TypeScript - @google-cloud/secret-manager
import { SecretManagerServiceClient } from "@google-cloud/secret-manager";

async function getSecret(
  projectId: string,
  secretId: string,
  version: string = "latest"
): Promise<string> {
  const client = new SecretManagerServiceClient();
  const name = `projects/${projectId}/secrets/${secretId}/versions/${version}`;

  const [response] = await client.accessSecretVersion({ name });
  return response.payload?.data?.toString() || "";
}
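
Each Secret Manager access is a network call that adds latency and cost per invocation. For values that rarely rotate, caching once per process is a common pattern; a sketch reusing the Python get_secret above:

from functools import lru_cache

@lru_cache(maxsize=None)
def get_cached_secret(project_id: str, secret_id: str, version: str = "latest") -> str:
    """Fetch a secret once per process; subsequent calls hit the cache."""
    return get_secret(project_id, secret_id, version)

When caching, prefer pinning a specific version over "latest" so that rotation takes effect predictably on the next deploy rather than at an arbitrary restart.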

Cloud Run with Secret References

# Cloud Run service with secret environment variables
apiVersion: serving.knative.dev/v1
kind: Service
metadata:
  name: my-service
spec:
  template:
    spec:
      containers:
        - image: gcr.io/my-project/my-app
          env:
            - name: DATABASE_PASSWORD
              valueFrom:
                secretKeyRef:
                  name: database-password
                  key: latest

Cloud Storage Patterns

Signed URLs

from datetime import timedelta
from google.cloud import storage

def generate_signed_url(
    bucket_name: str,
    blob_name: str,
    expiration_minutes: int = 60
) -> str:
    """Generate a signed URL for downloading a blob."""
    storage_client = storage.Client()
    bucket = storage_client.bucket(bucket_name)
    blob = bucket.blob(blob_name)

    url = blob.generate_signed_url(
        version="v4",
        expiration=timedelta(minutes=expiration_minutes),
        method="GET",
    )

    return url
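
The same mechanism works for uploads by signing a PUT instead of a GET; the uploader must then send a Content-Type header matching the one that was signed. A sketch:

def generate_upload_url(
    bucket_name: str,
    blob_name: str,
    content_type: str = "application/octet-stream",
    expiration_minutes: int = 15,
) -> str:
    """Generate a signed URL that allows a direct PUT upload."""
    storage_client = storage.Client()
    blob = storage_client.bucket(bucket_name).blob(blob_name)

    return blob.generate_signed_url(
        version="v4",
        expiration=timedelta(minutes=expiration_minutes),
        method="PUT",
        content_type=content_type,  # the uploader must send this exact header
    )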

Resumable Uploads

from google.cloud import storage

def upload_large_file(
    bucket_name: str,
    source_file: str,
    destination_blob: str
) -> str:
    """Upload a large file using resumable upload."""
    storage_client = storage.Client()
    bucket = storage_client.bucket(bucket_name)
    blob = bucket.blob(destination_blob)

    # The client switches to resumable upload automatically for files > 8 MiB
    blob.upload_from_filename(source_file)

    return f"gs://{bucket_name}/{destination_blob}"

Firestore Patterns

Async Operations

from google.cloud import firestore

async def get_firestore_client() -> firestore.AsyncClient:
    """Create an async Firestore client (create once per process and reuse)."""
    return firestore.AsyncClient()

async def get_user(user_id: str) -> dict | None:
    """Get a user document from Firestore."""
    db = await get_firestore_client()
    doc_ref = db.collection("users").document(user_id)
    doc = await doc_ref.get()

    if doc.exists:
        return doc.to_dict()
    return None

async def query_users_by_status(status: str) -> list[dict]:
    """Query users by status."""
    db = await get_firestore_client()
    query = db.collection("users").where("status", "==", status)

    docs = await query.get()
    return [doc.to_dict() for doc in docs]
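
Newer releases of the client deprecate positional where() arguments in favor of an explicit FieldFilter; the query above can be written equivalently as:

from google.cloud.firestore_v1.base_query import FieldFilter

query = db.collection("users").where(filter=FieldFilter("status", "==", status))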

Transaction Pattern

from google.cloud import firestore

def transfer_credits(
    from_user_id: str,
    to_user_id: str,
    amount: int
) -> bool:
    """Transfer credits between users atomically."""
    db = firestore.Client()

    @firestore.transactional
    def update_in_transaction(transaction):
        from_ref = db.collection("users").document(from_user_id)
        to_ref = db.collection("users").document(to_user_id)

        from_snapshot = from_ref.get(transaction=transaction)
        to_snapshot = to_ref.get(transaction=transaction)

        from_credits = from_snapshot.get("credits")

        if from_credits < amount:
            raise ValueError("Insufficient credits")

        transaction.update(from_ref, {"credits": from_credits - amount})
        transaction.update(to_ref, {"credits": to_snapshot.get("credits") + amount})

        return True

    transaction = db.transaction()
    return update_in_transaction(transaction)

Cloud Functions Patterns

HTTP Function with Validation

import functions_framework
from flask import jsonify, Request
from pydantic import BaseModel, ValidationError

class CreateOrderRequest(BaseModel):
    customer_id: str
    items: list[dict]
    total: float

@functions_framework.http
def create_order(request: Request):
    """HTTP Cloud Function for creating orders."""
    try:
        body = request.get_json(silent=True)

        if not body:
            return jsonify({"error": "Request body required"}), 400

        order_request = CreateOrderRequest(**body)

        # Process order...
        result = process_order(order_request)

        return jsonify(result.dict()), 201

    except ValidationError as e:
        return jsonify({"error": e.errors()}), 400
    except Exception as e:
        print(f"Error: {e}")
        return jsonify({"error": "Internal server error"}), 500

Pub/Sub Triggered Function

import base64
import functions_framework
from cloudevents.http import CloudEvent

@functions_framework.cloud_event
def process_message(cloud_event: CloudEvent):
    """Process Pub/Sub message."""
    # Decode the Pub/Sub message
    data = base64.b64decode(cloud_event.data["message"]["data"]).decode()

    print(f"Received message: {data}")

    # Process the message
    process_event(data)
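
For reference, the producing side publishes a bytes payload plus optional string attributes; a sketch, where the project and topic names are hypothetical:

from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path("my-project", "orders")

# Payload must be bytes; extra keyword arguments become message attributes
future = publisher.publish(topic_path, b'{"order_id": "order-456"}', source="api")
message_id = future.result()  # blocks until the publish is acknowledged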

Cloud Run Patterns

Service with Health Checks

from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()

class HealthResponse(BaseModel):
    status: str
    version: str

@app.get("/health", response_model=HealthResponse)
async def health_check():
    """Health check endpoint for Cloud Run."""
    return HealthResponse(status="healthy", version="1.0.0")

@app.get("/ready")
async def readiness_check():
    """Readiness check - verify dependencies."""
    # Check database connection, etc.
    await check_database_connection()
    return {"status": "ready"}

Cloud Run Job

# main.py for Cloud Run Job
import os
from google.cloud import bigquery

def main():
    """Main entry point for Cloud Run Job."""
    task_index = int(os.environ.get("CLOUD_RUN_TASK_INDEX", 0))
    task_count = int(os.environ.get("CLOUD_RUN_TASK_COUNT", 1))

    print(f"Processing task {task_index} of {task_count}")

    # Process batch based on task index
    process_batch(task_index, task_count)

if __name__ == "__main__":
    main()
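
One simple way to split work across parallel tasks is striding over a shared work list by task index, so each task handles a disjoint slice. A hypothetical sketch; fetch_record_ids and handle_record stand in for your own data access:

def process_batch(task_index: int, task_count: int) -> None:
    """Each task processes every task_count-th record, offset by its index."""
    record_ids = fetch_record_ids()  # hypothetical: load the full work list
    for record_id in record_ids[task_index::task_count]:
        handle_record(record_id)  # hypothetical per-record processing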

Cloud Logging

Structured Logging

import json
import logging
from google.cloud import logging as cloud_logging

# Setup Cloud Logging
client = cloud_logging.Client()
client.setup_logging()

logger = logging.getLogger(__name__)

def log_with_trace(message: str, trace_id: str, project_id: str, **kwargs):
    """Log with a trace ID for request correlation."""
    log_entry = {
        "message": message,
        "logging.googleapis.com/trace": f"projects/{project_id}/traces/{trace_id}",
        **kwargs
    }
    logger.info(json.dumps(log_entry))

# Usage
log_with_trace(
    "Order processed",
    trace_id="abc123",
    project_id="my-project",
    order_id="order-456",
    customer_id="cust-789"
)
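
On Cloud Run and behind Google load balancers, the incoming trace ID arrives in the X-Cloud-Trace-Context request header, formatted TRACE_ID/SPAN_ID;o=OPTIONS. Extracting it is a one-liner (sketch):

def trace_id_from_header(header_value: str) -> str:
    """Extract the trace ID from an X-Cloud-Trace-Context header value."""
    # Header format: TRACE_ID/SPAN_ID;o=OPTIONS
    return header_value.split("/", 1)[0]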

Custom Metrics

import time

from google.cloud import monitoring_v3

def write_metric(project_id: str, metric_type: str, value: float):
    """Write a custom metric to Cloud Monitoring."""
    client = monitoring_v3.MetricServiceClient()
    project_name = f"projects/{project_id}"

    series = monitoring_v3.TimeSeries()
    series.metric.type = f"custom.googleapis.com/{metric_type}"
    series.resource.type = "global"

    now = time.time()
    seconds = int(now)
    nanos = int((now - seconds) * 10**9)

    interval = monitoring_v3.TimeInterval(
        {"end_time": {"seconds": seconds, "nanos": nanos}}
    )
    point = monitoring_v3.Point(
        {"interval": interval, "value": {"double_value": value}}
    )
    series.points = [point]

    client.create_time_series(name=project_name, time_series=[series])

# Usage
write_metric("my-project", "orders/processed", 1.0)

CLI Commands

# Authentication
gcloud auth login
gcloud config set project my-project
gcloud config list

# Compute
gcloud compute instances list
gcloud compute ssh my-instance

# Cloud Run
gcloud run services list
gcloud run deploy my-service --image gcr.io/my-project/my-app
gcloud run services describe my-service

# Cloud Functions
gcloud functions list
gcloud functions deploy my-function --runtime python311 --trigger-http
gcloud functions logs read my-function

# Secret Manager
gcloud secrets list
gcloud secrets create my-secret --data-file=secret.txt
gcloud secrets versions access latest --secret=my-secret

# Cloud Storage
gsutil ls gs://my-bucket/
gsutil cp local-file.txt gs://my-bucket/
gsutil signurl -d 1h key.json gs://my-bucket/file.txt

# Firestore
gcloud firestore databases list
gcloud firestore export gs://my-bucket/firestore-backup

# Logging
gcloud logging read "resource.type=cloud_run_revision" --limit=10

Security Checklist

  • Use Service Accounts with least privilege
  • Enable VPC Service Controls for sensitive data
  • Use Secret Manager for all secrets
  • Enable Cloud Audit Logs
  • Configure Identity-Aware Proxy for internal apps
  • Use Private Google Access for GCE instances
  • Enable Binary Authorization for GKE
  • Configure Cloud Armor for DDoS protection
  • Use Customer-Managed Encryption Keys (CMEK) where required
  • Enable Security Command Center