Parallel Requests
Problem
Pages load slowly when each API request waits for the previous one to complete, even though the data is independent. A dashboard that needs user info, notifications, and analytics takes 6 seconds to load when its three 2-second requests run one after another. Users wait unnecessarily while independent fetches happen sequentially instead of concurrently.
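A minimal sketch of the sequential version described above, assuming illustrative /api/user, /api/notifications, and /api/analytics endpoints that each take about two seconds; every await blocks the next request, so the waits add up.
// Sequential: each request starts only after the previous one finishes,
// so three ~2-second requests cost roughly 6 seconds in total.
const user = await fetch('/api/user').then(r => r.json());
const notifications = await fetch('/api/notifications').then(r => r.json());
const analytics = await fetch('/api/analytics').then(r => r.json());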
Solution
Issue independent requests at the same time instead of waiting for each one to finish before starting the next. Total wait time drops from the sum of all request durations to the duration of the slowest request, which improves perceived performance.
Example
This example shows three independent API requests executing simultaneously using Promise.all, reducing total wait time compared to sequential requests.
// Execute all three requests in parallel
// Total time = max(request1, request2, request3), not sum
const [user, posts, comments] = await Promise.all([
  // Fetch user data
  fetch('/api/user').then(r => r.json()),
  // Fetch posts data (doesn't wait for user)
  fetch('/api/posts').then(r => r.json()),
  // Fetch comments data (doesn't wait for posts)
  fetch('/api/comments').then(r => r.json())
]);
// All three results are available once the slowest request completes
Benefits
- Dramatically reduces total load time by executing requests concurrently.
- Improves perceived performance: the page waits only for the slowest request, and sections can render as soon as their own data arrives (see the sketch after this list).
- Makes efficient use of available network capacity and the browser's concurrent connections.
- Enables faster page loads for dashboards and data-heavy interfaces.
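If sections of the page can render independently, one option is to start all requests immediately but handle each promise on its own instead of waiting for Promise.all; this is a sketch, and renderUser, renderPosts, and renderComments are hypothetical placeholders for whatever updates the UI.
// Start all three requests up front; update each section as its data arrives.
const userPromise = fetch('/api/user').then(r => r.json());
const postsPromise = fetch('/api/posts').then(r => r.json());
const commentsPromise = fetch('/api/comments').then(r => r.json());

// The render* helpers are hypothetical UI-update functions.
userPromise.then(user => renderUser(user));
postsPromise.then(posts => renderPosts(posts));
commentsPromise.then(comments => renderComments(comments));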
Tradeoffs
- Can overwhelm servers if too many requests are issued simultaneously (a chunking sketch follows this list).
- Makes error handling more complex, since any one of the parallel requests can fail independently (a Promise.allSettled sketch follows this list).
- May create race conditions if requests have hidden dependencies.
- Requires careful management of loading states for multiple concurrent operations.
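If one failed request should not discard the results of the others, Promise.allSettled is one way to keep whatever succeeded; a minimal sketch, assuming the same endpoints as the example above.
// Promise.allSettled never rejects: each entry is either
// { status: 'fulfilled', value } or { status: 'rejected', reason }.
const [userResult, postsResult, commentsResult] = await Promise.allSettled([
  fetch('/api/user').then(r => r.json()),
  fetch('/api/posts').then(r => r.json()),
  fetch('/api/comments').then(r => r.json())
]);

// Use the value when available; show a per-section error state otherwise.
const user = userResult.status === 'fulfilled' ? userResult.value : null;
To keep a large batch of parallel requests from overwhelming a server, one simple mitigation is to process URLs in fixed-size chunks; this is a rough sketch rather than a tuned rate limiter, and the default chunk size of 5 is an arbitrary assumption.
// Run at most `limit` requests at a time; each chunk still executes in parallel.
async function fetchInChunks(urls, limit = 5) {
  const results = [];
  for (let i = 0; i < urls.length; i += limit) {
    const chunk = urls.slice(i, i + limit);
    const chunkResults = await Promise.all(chunk.map(url => fetch(url).then(r => r.json())));
    results.push(...chunkResults);
  }
  return results;
}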