Shell & CLI Vademecum

Your comprehensive guide to command-line productivity.

Philosophy

The command line is where developers spend significant time. Mastering it means:

  • Faster iteration — No GUI overhead
  • Scriptable workflows — Automate repetitive tasks
  • Composable tools — Combine small tools for complex operations
  • Universal interface — Works everywhere: local, SSH, CI/CD

What's Covered

Section           Focus
---------------   -----------------------------------------
Daily Commands    Git, Docker, bun, uv, pytest workflows
Docker Workflows  Container debugging, compose, networking
Database CLI      psql, pgcli, migrations, data inspection
Debugging Tools   pdb, browser DevTools, network inspection
Productivity      Shell aliases, fzf, tmux, direnv
Scripts           Common automation scripts
Cheatsheet        Quick reference for all tools

Quick Setup

Essential Tools

# macOS
brew install git docker fzf ripgrep jq tmux direnv pgcli

# Ubuntu/Debian
sudo apt install git fzf ripgrep jq tmux direnv
pip install pgcli

# Python tooling
pip install uv
uv tool install ruff     # uv tool install takes one package per invocation
uv tool install pytest
uv tool install ipdb

Shell Configuration

Add to your ~/.zshrc. The snippets below use zsh syntax; for bash, swap setopt SHARE_HISTORY for shopt -s histappend and direnv hook zsh for direnv hook bash:

# Better history
HISTSIZE=10000
SAVEHIST=10000
setopt SHARE_HISTORY

# fzf integration
[ -f ~/.fzf.zsh ] && source ~/.fzf.zsh

# direnv (auto-load .envrc)
eval "$(direnv hook zsh)"

# Useful aliases
alias g="git"
alias dc="docker compose"
alias py="python"
alias pytest="uv run pytest"
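
The fzf integration above wires up keybindings such as Ctrl+R; fzf also composes with any list-producing command. A sketch, assuming fzf and git are installed (fcd and fbr are hypothetical helper names):

```shell
# Fuzzy-cd: pick a directory under the current tree and jump to it
fcd() {
    local dir
    dir=$(find . -type d -not -path '*/.git/*' | fzf) && cd "$dir"
}

# Fuzzy-branch: check out a git branch picked with fzf
fbr() {
    local branch
    branch=$(git branch --all | grep -v HEAD | fzf | sed 's/^[* ]*//; s#^remotes/origin/##') \
        && git checkout "$branch"
}
```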

Tool Philosophy

Unix Philosophy

Do one thing well, compose with pipes

# Find large files in node_modules (NUL-delimited so paths with spaces survive)
find node_modules -type f -size +1M -print0 | xargs -0 ls -lh | sort -k5 -h

# Count lines by file type
find . -name "*.py" -print0 | xargs -0 wc -l | sort -n

# Extract unique error types from logs
grep "ERROR" app.log | cut -d':' -f3 | sort | uniq -c | sort -rn

Modern Replacements

Classic   Modern         Why
-------   ------         ---
find      fd             Faster, better defaults
grep      ripgrep (rg)   Much faster, respects .gitignore
cat       bat            Syntax highlighting
ls        eza            Better output, git integration
cd        zoxide         Frecency-based jumping
man       tldr           Practical examples

# Install modern tools
brew install fd ripgrep bat eza zoxide tldr

# Usage
fd "\.py$"              # Find Python files
rg "TODO" --type py     # Search in Python files only
bat README.md           # View with syntax highlighting
eza -la --git           # List with git status
z project               # Jump to frequently used directory
tldr tar                # Quick examples for tar

Directory Jumping

# zoxide - smart directory jumping
z docs          # Jump to most frequent "docs" directory
zi              # Interactive selection with fzf

# Bookmarks with shell functions (bash; in zsh, replace ${!var} with ${(P)var})
mark() { export "MARK_$1"="$PWD"; }
jump() { local var="MARK_$1"; cd "${!var}"; }

# Usage
mark project
cd /somewhere/else
jump project
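
The mark/jump pair keeps bookmarks only for the current session. A sketch of a persistent variant that stores bookmarks as symlinks (the ~/.marks directory and the marks/unmark names are assumptions, not a standard):

```shell
# Persistent bookmarks: one symlink per bookmark, surviving shell restarts
export MARKS_DIR="${MARKS_DIR:-$HOME/.marks}"
mkdir -p "$MARKS_DIR"

mark()   { ln -sfn "$PWD" "$MARKS_DIR/$1"; }   # save current dir under a name
jump()   { cd -P "$MARKS_DIR/$1"; }            # follow the symlink physically
unmark() { rm -f "$MARKS_DIR/$1"; }            # delete a bookmark
marks()  {                                     # list all bookmarks
    for m in "$MARKS_DIR"/*; do
        [ -e "$m" ] && printf '%s -> %s\n' "${m##*/}" "$(readlink "$m")"
    done
}
```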

Project Root Detection

# Jump to git root
alias groot='cd "$(git rev-parse --show-toplevel)"'

# Find project root (looks for pyproject.toml, package.json, etc.)
project_root() {
    local dir="$PWD"
    while [[ "$dir" != "/" ]]; do
        if [[ -f "$dir/pyproject.toml" ]] || [[ -f "$dir/package.json" ]]; then
            echo "$dir"
            return
        fi
        dir=$(dirname "$dir")
    done
    return 1  # no marker file found anywhere up the tree
}
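
project_root prints the path; a hypothetical companion can run any command from that directory without disturbing the caller's cwd. A sketch building on project_root above (the at_root name is an assumption):

```shell
# Run a command from the project root, leaving the current directory untouched
at_root() {
    local root
    root=$(project_root)
    if [ -z "$root" ]; then
        echo "at_root: no project root found" >&2
        return 1
    fi
    (cd "$root" && "$@")   # subshell, so the caller's cwd is preserved
}

# Usage: at_root uv run pytest
```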

Environment Management

direnv

Automatically load environment variables when entering directories.

# .envrc in project root
export DATABASE_URL="postgresql://localhost/dev"
export DEBUG=1
source .venv/bin/activate

# Allow the .envrc
direnv allow

Multiple Environments

# .envrc with environment selection
if [[ -f .env.local ]]; then
    dotenv .env.local
elif [[ -f .env ]]; then
    dotenv .env
fi
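
Besides dotenv, direnv ships a stdlib of helpers (run direnv stdlib to list them). A sketch of an .envrc using a few of them; dotenv_if_exists needs a reasonably recent direnv:

```shell
# .envrc using direnv stdlib helpers
layout python3              # create and activate a project-local virtualenv
PATH_add bin                # prepend ./bin to PATH
watch_file pyproject.toml   # reload the environment when this file changes
dotenv_if_exists .env.local # load local overrides only when present
```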

Output Processing

jq for JSON

# Pretty print
cat response.json | jq .

# Extract field
cat users.json | jq '.[].email'

# Filter
cat users.json | jq '.[] | select(.active == true)'

# Transform
cat users.json | jq '.[] | {name: .name, email: .email}'

# From API response
curl -s https://api.example.com/users | jq '.data[0].name'
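
Two more jq flags worth knowing: -r emits raw strings without JSON quotes, and --arg passes shell values in safely. A sketch, assuming a users.json with name/email/age fields:

```shell
# Raw output: no quotes, ready to feed other commands
jq -r '.[].email' users.json

# Build CSV from selected fields
jq -r '.[] | [.name, .email] | @csv' users.json

# Pass a shell variable in safely with --arg (it always arrives as a string)
min_age=18
jq --arg min "$min_age" '.[] | select(.age >= ($min | tonumber))' users.json
```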

Text Processing

# Column extraction
cat file.csv | cut -d',' -f2

# Line filtering
cat logs.txt | grep -v DEBUG

# Transformation
cat names.txt | tr '[:lower:]' '[:upper:]'

# Aggregation
cat access.log | awk '{print $1}' | sort | uniq -c | sort -rn | head
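
uniq -c counts, but awk can also sum and group. A minimal sketch on generated sample data (in real combined-format access logs the client IP is $1 and the response size $10):

```shell
# Sample data: client IP and response size per line
printf '1.1.1.1 100\n2.2.2.2 50\n1.1.1.1 200\n' > /tmp/mini.log

# Sum a numeric column
awk '{sum += $2} END {print sum}' /tmp/mini.log
# → 350

# Group and sum: total bytes served per client, largest first
awk '{bytes[$1] += $2} END {for (ip in bytes) print ip, bytes[ip]}' /tmp/mini.log | sort -k2 -rn
# → 1.1.1.1 300
#   2.2.2.2 50
```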

Best Practices

  1. Learn one thing at a time — Master git before Docker
  2. Create aliases for frequent commands — But document them
  3. Use shell history — Ctrl+R to search
  4. Pipe everything — Output of one command → input of next
  5. Script repetitive tasks — If you do it twice, script it
  6. Version control your dotfiles — Your config is valuable
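
For item 5, a minimal template that makes scripts fail loudly rather than silently (set -euo pipefail does the heavy lifting; the greeting body is just a placeholder):

```shell
#!/usr/bin/env bash
# Minimal template for automation scripts
set -euo pipefail   # exit on any error, on unset variables, and on failed pipeline stages

name=${1:-world}    # optional first argument with a default
echo "hello, $name"
```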