# React Concurrent Features

Keep your UI responsive during expensive renders.
## Overview

React's concurrent features let React pause an expensive render to handle high-priority updates (like user input) first, so the browser stays responsive.

Traditional React:

```
[───── Expensive Render ─────] → UI frozen during render
```

Concurrent React:

```
[── Render ──][user input][── Render ──] → UI stays responsive
```
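The interruption model can be sketched outside React as a cooperative work loop. This is a toy illustration of the idea, not React's actual scheduler:

```typescript
// Toy interruptible work loop: process items until the caller
// says to yield, then return the remainder so work can resume
// after urgent tasks (like input handling) have run.
function workLoop<T>(
  queue: T[],
  work: (item: T) => void,
  shouldYield: () => boolean
): T[] {
  const remaining = [...queue];
  while (remaining.length > 0 && !shouldYield()) {
    work(remaining.shift()!);
  }
  return remaining; // resume from here later
}

// Simulate: render 5 items, but yield after 3 units of work
// as if urgent input had arrived.
const rendered: number[] = [];
const leftover = workLoop(
  [1, 2, 3, 4, 5],
  (item) => rendered.push(item),
  () => rendered.length >= 3
);
```

The important property is that the loop checks `shouldYield` between units of work, so a long render never blocks the thread for its full duration.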
## useTransition

Mark updates as non-urgent. React can interrupt them.

### Basic Usage

```tsx
import { useState, useTransition } from 'react';

function SearchResults() {
  const [query, setQuery] = useState('');
  const [results, setResults] = useState([]);
  const [isPending, startTransition] = useTransition();

  function handleChange(e: React.ChangeEvent<HTMLInputElement>) {
    // Urgent: update the input immediately
    setQuery(e.target.value);

    // Non-urgent: the results update can wait
    startTransition(() => {
      setResults(searchDatabase(e.target.value));
    });
  }

  return (
    <div>
      <input value={query} onChange={handleChange} />
      {isPending && <Spinner />}
      <ResultsList results={results} />
    </div>
  );
}
```
### When to Use

- Search/filter that updates a large list
- Tab switching with heavy content
- Any state update that can be deferred

### What Happens

When the user types "a":

1. `setQuery('a')` → the input updates immediately
2. `startTransition` → the results update is scheduled

When the user then types "b":

1. `setQuery('ab')` → the input updates immediately
2. The pending results transition for `'a'` is cancelled
3. A new transition for `'ab'` starts
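The sequence above can be modeled as two priority lanes: urgent work always runs first, and a newer transition replaces a pending one. A toy sketch of that behavior (illustrative only, not React's real scheduler):

```typescript
type Task = () => void;

// Toy two-lane scheduler: urgent tasks always run before
// transitions, and a newer transition for the same key
// replaces the pending one.
function createScheduler() {
  const urgent: Task[] = [];
  const transitions = new Map<string, Task>();
  return {
    runUrgent(task: Task) {
      urgent.push(task);
    },
    startTransition(key: string, task: Task) {
      transitions.set(key, task); // supersedes any pending transition
    },
    flush() {
      while (urgent.length) urgent.shift()!();
      for (const task of transitions.values()) task();
      transitions.clear();
    },
  };
}

// Typing "a" then "b": both input updates are urgent; the "a"
// results transition is superseded by the "ab" one.
const log: string[] = [];
const scheduler = createScheduler();
scheduler.runUrgent(() => log.push('input: a'));
scheduler.startTransition('results', () => log.push('results: a'));
scheduler.runUrgent(() => log.push('input: ab'));
scheduler.startTransition('results', () => log.push('results: ab'));
scheduler.flush();
```

Flushing runs `input: a`, `input: ab`, then only `results: ab` — the stale results update never renders.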
## useDeferredValue

Defer rendering of a value until urgent updates are done.

### Basic Usage

```tsx
import { useState, useDeferredValue } from 'react';

function SearchResults() {
  const [query, setQuery] = useState('');
  const deferredQuery = useDeferredValue(query);

  // The input updates immediately;
  // deferredQuery updates when React has time.
  return (
    <div>
      <input
        value={query}
        onChange={(e) => setQuery(e.target.value)}
      />
      <ResultsList query={deferredQuery} />
    </div>
  );
}
```
### With Memoization

```tsx
import { useState, useDeferredValue, useMemo } from 'react';

function SearchResults() {
  const [query, setQuery] = useState('');
  const deferredQuery = useDeferredValue(query);

  // Only re-compute when deferredQuery changes
  const results = useMemo(
    () => filterResults(deferredQuery),
    [deferredQuery]
  );

  // Show stale content with a visual indicator
  const isStale = query !== deferredQuery;

  return (
    <div>
      <input
        value={query}
        onChange={(e) => setQuery(e.target.value)}
      />
      <div style={{ opacity: isStale ? 0.5 : 1 }}>
        <ResultsList results={results} />
      </div>
    </div>
  );
}
```
## useTransition vs useDeferredValue

| useTransition | useDeferredValue |
|---|---|
| Wraps the state setter | Wraps the value |
| More control over when to defer | Simpler API |
| `isPending` for loading states | Compare old/new values for staleness |
| Use when you control the update | Use when you receive a prop |
## Suspense

Declaratively wait for async data or code.

### Basic Usage

```tsx
import { Suspense } from 'react';

function App() {
  return (
    <Suspense fallback={<Loading />}>
      <UserProfile userId={1} />
    </Suspense>
  );
}
```
### With Data Fetching

Using React's `use` hook (React 19+). The promise passed to `use` must be stable across renders — creating a fresh promise inside the component would re-suspend on every render — so cache it outside the component (or use a data library that does this for you):

```tsx
import { use, Suspense } from 'react';

// Simple promise cache so each user is fetched only once
const userCache = new Map<number, Promise<any>>();

function fetchUser(id: number) {
  if (!userCache.has(id)) {
    userCache.set(
      id,
      fetch(`/api/users/${id}`).then((res) => res.json())
    );
  }
  return userCache.get(id)!;
}

function UserProfile({ userId }: { userId: number }) {
  // Suspends until the data is ready
  const user = use(fetchUser(userId));
  return <div>{user.name}</div>;
}

function App() {
  return (
    <Suspense fallback={<Loading />}>
      <UserProfile userId={1} />
    </Suspense>
  );
}
```
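One subtlety with `use` is that it needs the *same* promise object across re-renders; a new promise each render re-suspends the component. That caching pattern can be generalized into a small keyed memoizer. A minimal sketch (illustrative names, not a React API):

```typescript
// Keyed promise cache: the factory runs once per key, and every
// later call returns the identical promise — which is what `use`
// needs to avoid re-suspending on each render.
function createPromiseCache<K, V>(factory: (key: K) => Promise<V>) {
  const cache = new Map<K, Promise<V>>();
  return (key: K): Promise<V> => {
    let promise = cache.get(key);
    if (!promise) {
      promise = factory(key);
      cache.set(key, promise);
    }
    return promise;
  };
}

// Each user id triggers exactly one fetch.
let fetchCount = 0;
const getUser = createPromiseCache(async (id: number) => {
  fetchCount++;
  return { id, name: `user-${id}` };
});

const first = getUser(1);
const second = getUser(1); // same promise, no second fetch
```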
### Nested Suspense

```tsx
function App() {
  return (
    <Suspense fallback={<PageSkeleton />}>
      <Header />
      <Suspense fallback={<ContentSkeleton />}>
        <MainContent />
      </Suspense>
      <Suspense fallback={<SidebarSkeleton />}>
        <Sidebar />
      </Suspense>
    </Suspense>
  );
}
```
### With Error Boundaries

```tsx
import { Suspense } from 'react';
import { ErrorBoundary } from 'react-error-boundary';

function App() {
  return (
    <ErrorBoundary fallback={<Error />}>
      <Suspense fallback={<Loading />}>
        <DataComponent />
      </Suspense>
    </ErrorBoundary>
  );
}
```
## SuspenseList (Experimental)

Coordinate the reveal order of multiple Suspense boundaries. This API is only available in experimental React builds, where it is exported as `unstable_SuspenseList`:

```tsx
import { Suspense, unstable_SuspenseList as SuspenseList } from 'react';

function Feed() {
  return (
    <SuspenseList revealOrder="forwards" tail="collapsed">
      <Suspense fallback={<PostSkeleton />}>
        <Post id={1} />
      </Suspense>
      <Suspense fallback={<PostSkeleton />}>
        <Post id={2} />
      </Suspense>
      <Suspense fallback={<PostSkeleton />}>
        <Post id={3} />
      </Suspense>
    </SuspenseList>
  );
}

// revealOrder:
// - "forwards": reveal in order
// - "backwards": reveal in reverse order
// - "together": reveal all at once

// tail:
// - "collapsed": show one fallback at a time
// - "hidden": show nothing until ready
```
## Streaming SSR

Server-side rendering with Suspense.

### Server

```js
// server.js
import express from 'express';
import { renderToPipeableStream } from 'react-dom/server';
import App from './App';

const app = express();

app.get('/', (req, res) => {
  const { pipe } = renderToPipeableStream(
    <App />,
    {
      onShellReady() {
        res.setHeader('content-type', 'text/html');
        pipe(res);
      },
    }
  );
});
```
### Component

```tsx
// App.tsx
import { Suspense } from 'react';

function App() {
  return (
    <html>
      <body>
        <Header />
        <Suspense fallback={<Loading />}>
          <MainContent />
        </Suspense>
      </body>
    </html>
  );
}
```
What happens:

1. The server sends the shell (Header) immediately
2. MainContent suspends on its data fetch
3. The server streams MainContent's HTML when it's ready
4. The client hydrates progressively
## Patterns

### Optimistic Updates

```tsx
import { useOptimistic, useTransition } from 'react';

function LikeButton({ postId, initialLikes }) {
  const [isPending, startTransition] = useTransition();
  const [optimisticLikes, addOptimisticLike] = useOptimistic(
    initialLikes,
    (state, amount) => state + amount
  );

  function handleLike() {
    startTransition(async () => {
      addOptimisticLike(1);
      await likePost(postId);
    });
  }

  return (
    <button onClick={handleLike} disabled={isPending}>
      {optimisticLikes} Likes
    </button>
  );
}
```
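Conceptually, `useOptimistic` shows the confirmed state with pending updates applied on top; once the server confirms, the base advances and the pending updates are dropped. A plain-TypeScript toy model of that idea (not the React API — all names here are illustrative):

```typescript
// Toy model of optimistic state: value() is the confirmed base
// with all pending actions applied via the reducer; confirm()
// advances the base and clears the pending actions.
function createOptimistic<S, A>(
  initial: S,
  reducer: (state: S, action: A) => S
) {
  let base = initial;
  let pending: A[] = [];
  return {
    add(action: A) {
      pending.push(action);
    },
    confirm(confirmed: S) {
      base = confirmed;
      pending = [];
    },
    value(): S {
      return pending.reduce(reducer, base);
    },
  };
}

// 10 confirmed likes; optimistically add one while the request
// is in flight, then confirm the server's count.
const likes = createOptimistic(10, (count, delta: number) => count + delta);
likes.add(1);
const whilePending = likes.value(); // shown immediately
likes.confirm(11);
const afterConfirm = likes.value(); // same number, now confirmed
```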
### Progressive Loading

```tsx
function UserProfile({ userId }) {
  return (
    <div>
      {/* Critical content loads first */}
      <Suspense fallback={<HeaderSkeleton />}>
        <UserHeader userId={userId} />
      </Suspense>

      {/* Less critical can wait */}
      <Suspense fallback={<PostsSkeleton />}>
        <UserPosts userId={userId} />
      </Suspense>

      {/* Least critical loads last */}
      <Suspense fallback={<RecommendationsSkeleton />}>
        <Recommendations userId={userId} />
      </Suspense>
    </div>
  );
}
```
### Parallel Data Fetching

```tsx
// Start both fetches in parallel, then let each section
// suspend independently while its promise resolves.
function ProfilePage({ userId }) {
  // These start fetching immediately, in parallel
  const userPromise = fetchUser(userId);
  const postsPromise = fetchPosts(userId);

  return (
    <>
      <Suspense fallback={<UserSkeleton />}>
        <User promise={userPromise} />
      </Suspense>
      <Suspense fallback={<PostsSkeleton />}>
        <Posts promise={postsPromise} />
      </Suspense>
    </>
  );
}
```
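The key detail is kicking off both promises before awaiting (or suspending on) either one; awaiting the first fetch before starting the second would serialize them. A plain-TypeScript sketch of that ordering, with the fetchers injected so the example is self-contained (names are illustrative):

```typescript
// Both fetches start before either is awaited, so they run
// in parallel; total time is max(user, posts), not the sum.
async function loadProfile(
  fetchUser: (id: number) => Promise<string>,
  fetchPosts: (id: number) => Promise<string[]>,
  id: number
) {
  const userPromise = fetchUser(id);   // starts immediately
  const postsPromise = fetchPosts(id); // also starts immediately
  return { user: await userPromise, posts: await postsPromise };
}

// Record when each fetch starts: both begin before either resolves.
const started: string[] = [];
const profile = loadProfile(
  async (id) => { started.push('user'); return `user-${id}`; },
  async (id) => { started.push('posts'); return []; },
  1
);
```

Because an async function runs synchronously until its first `await`, both `started.push` calls happen before `loadProfile` ever pauses.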
## Best Practices

- Start with useTransition — the simplest way to keep the UI responsive
- Use Suspense for data — with React 19+ `use` or data-fetching libraries
- Add loading states — `isPending` from useTransition, or a Suspense fallback
- Nest Suspense boundaries — more granular loading states
- Combine with memoization — useDeferredValue + useMemo
- Test interaction responsiveness — not just render time
## Common Mistakes

```tsx
// Bad: heavy computation in render
function Bad({ items }) {
  const filtered = items.filter(/* expensive */); // Blocks render
  return <List items={filtered} />;
}

// Good: defer the expensive computation
function Good({ items }) {
  const deferredItems = useDeferredValue(items);
  const filtered = useMemo(
    () => deferredItems.filter(/* expensive */),
    [deferredItems]
  );
  return <List items={filtered} />;
}
```

```tsx
// Bad: Suspense without fallback
<Suspense> {/* No fallback = nothing shown while loading */}
  <AsyncComponent />
</Suspense>

// Good: always provide a fallback
<Suspense fallback={<Skeleton />}>
  <AsyncComponent />
</Suspense>
```

```tsx
// Bad: everything in one Suspense
<Suspense fallback={<FullPageLoader />}>
  <Header />
  <Sidebar />
  <MainContent />
</Suspense>

// Good: granular Suspense boundaries
<Header />
<Suspense fallback={<SidebarSkeleton />}>
  <Sidebar />
</Suspense>
<Suspense fallback={<ContentSkeleton />}>
  <MainContent />
</Suspense>
```