Secrets Management¶
Protect sensitive credentials throughout the development lifecycle.
The Golden Rules¶
- Never commit secrets — Not even "temporarily"
- Never log secrets — Sanitize all output
- Rotate regularly — Assume compromise
- Least privilege — Only access what's needed
- Audit access — Know who accessed what
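Libraries such as pydantic offer a `SecretStr` type that enforces the "never log" rule at the type level; a minimal stdlib sketch of the same idea (this `Secret` class is illustrative, not a standard API):

```python
class Secret:
    """Minimal wrapper that keeps a value out of reprs and log output."""

    def __init__(self, value: str):
        self._value = value

    def reveal(self) -> str:
        """Explicit unwrap; call sites for this are easy to audit."""
        return self._value

    def __str__(self) -> str:
        return "***"

    def __repr__(self) -> str:
        return "Secret('***')"
```

Accidentally interpolating the wrapper into a log line now prints `***` instead of the credential.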
Environment Variables¶
Local Development¶
```bash
# .env (never commit!)
DATABASE_URL=postgresql://user:pass@localhost:5432/db
SECRET_KEY=dev-only-secret-key-12345
STRIPE_API_KEY=sk_test_...
```
```python
# settings.py
from pydantic_settings import BaseSettings, SettingsConfigDict

class Settings(BaseSettings):
    database_url: str
    secret_key: str
    stripe_api_key: str

    model_config = SettingsConfigDict(env_file=".env", env_file_encoding="utf-8")

settings = Settings()
```
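pydantic-settings handles the file parsing and precedence for you; the core behavior it implements — the `.env` file supplies defaults, while real environment variables always win — can be illustrated with a stdlib-only sketch (the helper names here are hypothetical, not part of any library):

```python
import os

def parse_env_file(text: str) -> dict[str, str]:
    """Parse simple KEY=VALUE lines, skipping blanks and comments."""
    values = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue
        key, _, value = line.partition("=")
        values[key.strip()] = value.strip().strip("'\"")
    return values

def resolve_settings(env_file_text: str) -> dict[str, str]:
    """File values are defaults; real environment variables take precedence."""
    values = parse_env_file(env_file_text)
    for key in values:
        if key in os.environ:
            values[key] = os.environ[key]
    return values
```

This precedence order is what lets production deployments inject secrets via the environment without shipping a `.env` file at all.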
.gitignore Protection¶
```gitignore
# Secrets - NEVER commit
.env
.env.*
*.pem
*.key
secrets/
credentials.json

# Except examples
!.env.example
```
Environment Example File¶
```bash
# .env.example (commit this)
DATABASE_URL=postgresql://user:password@localhost:5432/dbname
SECRET_KEY=generate-a-secure-key-here
STRIPE_API_KEY=sk_test_your_test_key

# Generate SECRET_KEY with:
# python -c "import secrets; print(secrets.token_urlsafe(32))"
```
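Example files drift: someone adds a variable to `.env` and forgets to document it. A small, hypothetical helper (not a published tool) can flag undocumented keys in CI:

```python
def env_keys(text: str) -> set[str]:
    """Variable names present in .env-style content."""
    keys = set()
    for line in text.splitlines():
        line = line.strip()
        if line and not line.startswith("#") and "=" in line:
            keys.add(line.partition("=")[0].strip())
    return keys

def missing_from_example(env_text: str, example_text: str) -> set[str]:
    """Keys used locally but not documented in .env.example."""
    return env_keys(env_text) - env_keys(example_text)
```

Run it against the two files in a pre-commit hook and fail if the result is non-empty.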
Production Secrets¶
Cloud Provider Secrets¶
AWS Secrets Manager:
```python
import json
from functools import lru_cache

import boto3

@lru_cache()
def get_secret(secret_name: str) -> dict:
    client = boto3.client("secretsmanager")
    response = client.get_secret_value(SecretId=secret_name)
    return json.loads(response["SecretString"])

# Usage
db_creds = get_secret("prod/database")
connection_string = (
    f"postgresql://{db_creds['username']}:{db_creds['password']}"
    f"@{db_creds['host']}/{db_creds['database']}"
)
```
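One caveat: `@lru_cache` never expires, so a rotated secret won't be picked up until the process restarts. A small TTL cache bounds that staleness; this is a stdlib-only sketch (the `fetch` callable stands in for whatever secret-store call you use):

```python
import time
from typing import Callable

class TTLCache:
    """Cache fetched secrets for a bounded time so rotations propagate."""

    def __init__(self, fetch: Callable[[str], object], ttl_seconds: float = 300.0):
        self._fetch = fetch            # e.g. a wrapper around get_secret()
        self._ttl = ttl_seconds
        self._entries: dict[str, tuple[float, object]] = {}

    def get(self, name: str):
        now = time.monotonic()
        entry = self._entries.get(name)
        if entry is not None and now - entry[0] < self._ttl:
            return entry[1]            # still fresh
        value = self._fetch(name)      # refresh from the secret store
        self._entries[name] = (now, value)
        return value
```

Pick a TTL shorter than your rotation grace period so old and new secrets are never both stale.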
Google Cloud Secret Manager:
```python
from google.cloud import secretmanager

def get_secret(project_id: str, secret_id: str, version: str = "latest") -> str:
    client = secretmanager.SecretManagerServiceClient()
    name = f"projects/{project_id}/secrets/{secret_id}/versions/{version}"
    response = client.access_secret_version(request={"name": name})
    return response.payload.data.decode("UTF-8")
```
Azure Key Vault:
```python
from azure.identity import DefaultAzureCredential
from azure.keyvault.secrets import SecretClient

credential = DefaultAzureCredential()
client = SecretClient(
    vault_url="https://your-vault.vault.azure.net/",
    credential=credential,
)

secret = client.get_secret("database-password")
```
HashiCorp Vault¶
```python
import hvac

client = hvac.Client(url="https://vault.example.com")
client.token = "your-vault-token"  # in production, read this from the environment or an auth method

# Read secret (KV v2 nests the payload under data.data)
secret = client.secrets.kv.v2.read_secret_version(
    path="database/credentials"
)
username = secret["data"]["data"]["username"]
password = secret["data"]["data"]["password"]
```
Secret Rotation¶
Automated Rotation Strategy¶
```python
import secrets
from datetime import datetime, timedelta, timezone

class SecretRotator:
    def __init__(self, secret_manager):
        self.manager = secret_manager

    async def rotate_api_key(self, key_name: str) -> str:
        """Rotate an API key, keeping the old key valid during a grace period."""
        # Generate new key
        new_key = secrets.token_urlsafe(32)

        # Store new key
        await self.manager.set_secret(f"{key_name}_new", new_key)

        # Mark old key for deprecation (allow a 24h grace period)
        old_key = await self.manager.get_secret(key_name)
        await self.manager.set_secret(
            f"{key_name}_deprecated",
            old_key,
            expires_at=datetime.now(timezone.utc) + timedelta(hours=24),
        )

        # Update primary key
        await self.manager.set_secret(key_name, new_key)
        return new_key

    async def is_valid_key(self, key_name: str, provided_key: str) -> bool:
        """Check a key against both the current and the deprecated value."""
        current = await self.manager.get_secret(key_name)
        if secrets.compare_digest(provided_key, current):
            return True

        # Check deprecated key (grace period)
        deprecated = await self.manager.get_secret(f"{key_name}_deprecated")
        if deprecated and secrets.compare_digest(provided_key, deprecated):
            return True

        return False
```
Database Password Rotation¶
```python
import secrets

async def rotate_database_password():
    """Rotate the database password with zero downtime."""
    new_password = secrets.token_urlsafe(32)

    # 1. Update the password (token_urlsafe output contains no quotes,
    #    but prefer parameter binding where the driver supports it)
    await db.execute(f"ALTER USER app_user WITH PASSWORD '{new_password}'")

    # 2. Update secret manager
    await secret_manager.set_secret("db_password", new_password)

    # 3. Signal apps to refresh their connections
    await notify_apps_to_refresh()

    # 4. Verify new connections work
    await verify_database_connection(new_password)
```
CI/CD Secrets¶
GitHub Actions¶
```yaml
# .github/workflows/deploy.yml
name: Deploy

on:
  push:
    branches: [main]

jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4

      - name: Deploy
        env:
          DATABASE_URL: ${{ secrets.DATABASE_URL }}
          AWS_ACCESS_KEY_ID: ${{ secrets.AWS_ACCESS_KEY_ID }}
          AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
        run: |
          # Secrets are available as environment variables
          ./deploy.sh
```
Environment-Specific Secrets¶
```yaml
# Use environments for different secret sets
jobs:
  deploy-staging:
    environment: staging
    steps:
      - run: echo "Using staging secrets"
        env:
          API_KEY: ${{ secrets.API_KEY }}  # staging API_KEY

  deploy-production:
    environment: production
    needs: deploy-staging
    steps:
      - run: echo "Using production secrets"
        env:
          API_KEY: ${{ secrets.API_KEY }}  # production API_KEY
```
Preventing Leaks¶
Pre-commit Hooks¶
```yaml
# .pre-commit-config.yaml
repos:
  - repo: https://github.com/Yelp/detect-secrets
    rev: v1.4.0
    hooks:
      - id: detect-secrets
        args: ['--baseline', '.secrets.baseline']

  - repo: https://github.com/zricethezav/gitleaks
    rev: v8.18.0
    hooks:
      - id: gitleaks
```
Git-secrets¶
```bash
# Install
brew install git-secrets

# Configure for AWS
git secrets --register-aws

# Add custom patterns
git secrets --add 'sk_live_[a-zA-Z0-9]{24}'  # Stripe live keys
git secrets --add 'PRIVATE KEY'

# Install hooks
git secrets --install
```
Log Sanitization¶
```python
import re

import structlog

SENSITIVE_PATTERNS = [
    (r'password["\']?\s*[:=]\s*["\']?([^"\'&\s]+)', 'password=***'),
    (r'api[_-]?key["\']?\s*[:=]\s*["\']?([^"\'&\s]+)', 'api_key=***'),
    (r'bearer\s+([a-zA-Z0-9._-]+)', 'bearer ***'),
    (r'sk_(live|test)_[a-zA-Z0-9]+', 'sk_***'),
]

def sanitize_log_message(message: str) -> str:
    for pattern, replacement in SENSITIVE_PATTERNS:
        message = re.sub(pattern, replacement, message, flags=re.IGNORECASE)
    return message

def sanitize_processor(logger, method_name, event_dict):
    if "event" in event_dict:
        event_dict["event"] = sanitize_log_message(event_dict["event"])
    return event_dict

structlog.configure(
    processors=[
        sanitize_processor,
        structlog.processors.JSONRenderer(),
    ]
)
```
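Note that the processor above scrubs only the `event` string; fields bound onto the logger (e.g. via `logger.bind(...)`) can leak secrets too. A hedged sketch that walks the whole event dict, with the patterns re-declared so the snippet runs standalone:

```python
import re

SENSITIVE_PATTERNS = [
    (r'password["\']?\s*[:=]\s*["\']?([^"\'&\s]+)', 'password=***'),
    (r'api[_-]?key["\']?\s*[:=]\s*["\']?([^"\'&\s]+)', 'api_key=***'),
    (r'bearer\s+([a-zA-Z0-9._-]+)', 'bearer ***'),
    (r'sk_(live|test)_[a-zA-Z0-9]+', 'sk_***'),
]

def sanitize(message: str) -> str:
    for pattern, replacement in SENSITIVE_PATTERNS:
        message = re.sub(pattern, replacement, message, flags=re.IGNORECASE)
    return message

def sanitize_event_dict(event_dict: dict) -> dict:
    """Mask sensitive strings anywhere in a structured log record."""
    clean = {}
    for key, value in event_dict.items():
        if isinstance(value, dict):
            clean[key] = sanitize_event_dict(value)   # recurse into nested fields
        elif isinstance(value, str):
            clean[key] = sanitize(value)
        else:
            clean[key] = value                        # leave non-strings untouched
    return clean
```

Dropping this into a structlog processor in place of the `event`-only version covers bound context as well as the message itself.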
Secret Scanning¶
Detect Existing Secrets¶
```bash
# Scan repository history
gitleaks detect --source . --verbose

# Scan only staged changes
gitleaks protect --staged

# Generate a baseline (for pre-existing secrets)
detect-secrets scan > .secrets.baseline
```
Automated Scanning in CI¶
```yaml
# .github/workflows/security.yml
name: Security Scan

on: [push, pull_request]

jobs:
  secrets-scan:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
        with:
          fetch-depth: 0

      - name: Gitleaks
        uses: gitleaks/gitleaks-action@v2
        env:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
```
Incident Response¶
If a Secret is Leaked¶
````markdown
## Immediate Actions (within minutes)

1. **Revoke the secret immediately**
   - Rotate API keys
   - Change passwords
   - Invalidate tokens

2. **Remove from Git history**

   ```bash
   # Use BFG Repo-Cleaner
   bfg --delete-files .env
   bfg --replace-text passwords.txt
   git reflog expire --expire=now --all
   git gc --prune=now --aggressive
   git push --force
   ```

3. **Notify affected parties**
   - Security team
   - Affected users (if applicable)
   - Third-party services

## Follow-up Actions

4. **Audit access logs**
   - Check whether the secret was used
   - Identify potential data access

5. **Post-mortem**
   - How did it happen?
   - How do we prevent recurrence?
````
Best Practices Summary¶
| Practice | Implementation |
|---|---|
| Use environment variables | pydantic-settings, python-dotenv |
| Never commit secrets | .gitignore, pre-commit hooks |
| Use secret managers | AWS Secrets Manager, Vault |
| Rotate regularly | Automated rotation with grace periods |
| Scan for leaks | gitleaks, detect-secrets |
| Sanitize logs | Remove secrets from log output |
| Least privilege | Separate keys per service/environment |
| Audit access | Log who accessed what secrets |