# Concurrency & Parallelism

Understanding concurrent programming in Python and JavaScript.
## Core Distinction
| Concept | Definition | Example |
| --- | --- | --- |
| Concurrency | Managing multiple tasks at once | Single cashier serving multiple customers in rotation |
| Parallelism | Executing multiple tasks simultaneously | Multiple cashiers serving customers at the same time |
```
Concurrency (single core):
Task 1: ███░░░███░░░███
Task 2: ░░░███░░░███░░░
        → Time

Parallelism (multiple cores):
Task 1: ███████████████ (Core 1)
Task 2: ███████████████ (Core 2)
        → Time
```
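The single-core timeline above can be demonstrated directly: two simulated I/O waits run under `asyncio` finish in roughly the time of one, because the event loop interleaves them while each is blocked. A minimal stdlib-only sketch, where the 0.2 s sleeps stand in for real I/O and the function names are illustrative:

```python
import asyncio
import time

async def io_task(name: str, delay: float) -> str:
    # asyncio.sleep yields to the event loop, which runs the other
    # task while this one is "waiting on I/O".
    await asyncio.sleep(delay)
    return name

async def main() -> float:
    start = time.perf_counter()
    await asyncio.gather(io_task("a", 0.2), io_task("b", 0.2))
    return time.perf_counter() - start

elapsed = asyncio.run(main())
print(f"two 0.2s waits finished in {elapsed:.2f}s")  # ~0.2s, not 0.4s
```

Run sequentially, the same two waits would take about 0.4 s; interleaving buys the difference without any extra cores.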
## When to Use What

### Decision Tree
```
Is the task...
├── Waiting for I/O (network, disk, database)?
│   └── Use async/await (Python asyncio, JavaScript Promises)
│
├── CPU-intensive computation?
│   ├── Python → Use multiprocessing
│   └── JavaScript → Use Web Workers
│
├── Need shared state between workers?
│   ├── High contention → Be very careful, consider redesign
│   └── Low contention → Use thread-safe data structures
│
└── Simple parallel map over data?
    └── Use process pools or worker pools
```
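For the low-contention branch, Python's stdlib `queue.Queue` is already thread-safe, so worker threads can share it without explicit locks. A minimal sketch; the squaring work and the `None`-sentinel shutdown protocol are illustrative choices, not a fixed API:

```python
import queue
import threading

def worker(jobs: queue.Queue, results: queue.Queue) -> None:
    while True:
        item = jobs.get()
        if item is None:  # sentinel: no more work for this thread
            break
        results.put(item * item)

jobs: queue.Queue = queue.Queue()
results: queue.Queue = queue.Queue()
threads = [threading.Thread(target=worker, args=(jobs, results)) for _ in range(4)]
for t in threads:
    t.start()
for n in range(10):
    jobs.put(n)
for _ in threads:
    jobs.put(None)  # one sentinel per worker
for t in threads:
    t.join()

squares = sorted(results.get() for _ in range(10))
print(squares)  # [0, 1, 4, 9, 16, 25, 36, 49, 64, 81]
```

`Queue.put` and `Queue.get` do their own locking internally, which is exactly what "use thread-safe data structures" means in practice.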
## Python Quick Guide
| Situation | Solution | Module |
| --- | --- | --- |
| HTTP requests | `asyncio` + `aiohttp` | `asyncio` |
| Database queries | `asyncio` + `asyncpg` | `asyncio` |
| File I/O | `asyncio` + `aiofiles` | `asyncio` |
| Image processing | `multiprocessing.Pool` | `multiprocessing` |
| Data crunching | `concurrent.futures.ProcessPoolExecutor` | `concurrent.futures` |
| Background tasks | Celery, RQ | External libraries |
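In practice, the `asyncio` rows above usually need a concurrency cap so hundreds of requests don't hit a server at once. A common pattern is `asyncio.Semaphore`; this stdlib-only sketch replaces the real `aiohttp` call with `asyncio.sleep`, and `bounded_fetch`/`fetch_many` are illustrative names:

```python
import asyncio

async def bounded_fetch(sem: asyncio.Semaphore, url: str) -> str:
    async with sem:  # at most `limit` coroutines pass this point at once
        await asyncio.sleep(0.01)  # stand-in for session.get(url)
        return f"done:{url}"

async def fetch_many(urls: list[str], limit: int = 3) -> list[str]:
    sem = asyncio.Semaphore(limit)
    # gather preserves input order regardless of completion order
    return await asyncio.gather(*(bounded_fetch(sem, u) for u in urls))

results = asyncio.run(fetch_many([f"u{i}" for i in range(6)]))
print(results)
```

The same shape works with `aiohttp`: acquire the semaphore around the `session.get` call, leaving everything else unchanged.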
## JavaScript Quick Guide
| Situation | Solution | API |
| --- | --- | --- |
| API calls | `async`/`await` with `fetch` | Promises |
| Multiple API calls | `Promise.all()` | Promises |
| CPU-intensive | Web Workers | Worker API |
| Heavy computation in React | `useDeferredValue`, Web Workers | React/Worker |
| Background sync | Service Workers | Service Worker API |
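One nuance behind the `Promise.all()` row: it rejects as soon as any input rejects, discarding the results that succeeded. When the calls are independent, `Promise.allSettled()` reports every outcome instead. A sketch, where `settleAll` is a hypothetical helper name:

```javascript
// Promise.all fails fast on the first rejection; Promise.allSettled
// always resolves with one {status, ...} record per input promise.
async function settleAll(promises) {
  const outcomes = await Promise.allSettled(promises);
  return {
    ok: outcomes.filter(o => o.status === 'fulfilled').map(o => o.value),
    failed: outcomes.filter(o => o.status === 'rejected').map(o => o.reason.message),
  };
}

settleAll([
  Promise.resolve(1),
  Promise.reject(new Error('boom')),
  Promise.resolve(3),
]).then(({ ok, failed }) => console.log(ok, failed)); // [ 1, 3 ] [ 'boom' ]
```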
## Section Contents

- Fundamentals
- Python Concurrency
- JavaScript Concurrency
- Patterns & Practice

## Quick Examples

### Python: Async HTTP Requests
```python
import asyncio
import aiohttp

async def fetch_all(urls: list[str]) -> list[dict]:
    async with aiohttp.ClientSession() as session:
        tasks = [fetch_one(session, url) for url in urls]
        return await asyncio.gather(*tasks)

async def fetch_one(session: aiohttp.ClientSession, url: str) -> dict:
    async with session.get(url) as response:
        return await response.json()

# Usage
results = asyncio.run(fetch_all([
    "https://api.example.com/users/1",
    "https://api.example.com/users/2",
    "https://api.example.com/users/3",
]))
```
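One caveat with `asyncio.gather` as used above: by default, a single failed request raises and the surviving results are lost. Passing `return_exceptions=True` collects errors alongside values. A stdlib-only sketch with a deliberately failing coroutine (`flaky` is an illustrative name):

```python
import asyncio

async def flaky(i: int) -> int:
    await asyncio.sleep(0)  # stand-in for real I/O
    if i == 2:
        raise ValueError(f"request {i} failed")
    return i

async def run_all() -> list:
    # Exceptions come back as list items instead of propagating,
    # so one bad URL doesn't discard the other responses.
    return await asyncio.gather(*(flaky(i) for i in range(4)), return_exceptions=True)

results = asyncio.run(run_all())
print(results)  # [0, 1, ValueError('request 2 failed'), 3]
```

The caller then filters the list with `isinstance(r, Exception)` to separate successes from failures.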
### Python: CPU-Bound Processing
```python
from concurrent.futures import ProcessPoolExecutor
import multiprocessing

def process_image(path: str) -> dict:
    # CPU-intensive work
    image = load_image(path)
    result = apply_filters(image)
    return {"path": path, "result": result}

# Process images in parallel. The __main__ guard is required on platforms
# that spawn worker processes (Windows, macOS) so the pool setup doesn't
# re-run in every child.
if __name__ == "__main__":
    with ProcessPoolExecutor(max_workers=multiprocessing.cpu_count()) as executor:
        results = list(executor.map(process_image, image_paths))
```
### JavaScript: Parallel Fetches
```javascript
// Fetch multiple resources in parallel
async function fetchUserData(userId) {
  const [user, posts, comments] = await Promise.all([
    fetch(`/api/users/${userId}`).then(r => r.json()),
    fetch(`/api/users/${userId}/posts`).then(r => r.json()),
    fetch(`/api/users/${userId}/comments`).then(r => r.json()),
  ]);
  return { user, posts, comments };
}
```
### JavaScript: Web Worker
```javascript
// main.js
const worker = new Worker('worker.js');
worker.postMessage({ data: largeDataSet, operation: 'process' });
worker.onmessage = (event) => {
  console.log('Processed:', event.data);
};

// worker.js
self.onmessage = (event) => {
  const { data, operation } = event.data;
  const result = heavyComputation(data);
  self.postMessage(result);
};
```
## Key Takeaways
- **I/O-bound → concurrency:** use async/await; don't waste time waiting
- **CPU-bound → parallelism:** use multiple processes/workers
- **Shared state is hard:** avoid it when possible
- **Measure first:** profile before optimizing
- **Python's GIL limits threading:** threads don't parallelize CPU work
- **JavaScript is single-threaded:** use Web Workers for CPU work