Feb 2026
Build Tools
Expert
8 min read

Webpack Dynamic Import Race Conditions: The Chunk Loading Trap

Code-split chunks racing against each other cause production-only crashes. Network latency turns microsecond races into user-facing errors.

webpack
code-splitting
dynamic-import
race-condition
performance
production

You've code-split your app perfectly. Chunks load on demand. Bundle sizes are minimal. Then production users on slow 3G connections start seeing blank screens and cryptic errors. The culprit: chunk loading race conditions that only manifest under network latency. Here's how they happen and how to defend against them.

The Basic Race Condition

Consider a route that lazy-loads two related chunks:

// Dashboard.tsx
const Chart = lazy(() => import('./Chart'));
const ChartUtils = lazy(() => import('./ChartUtils'));

function Dashboard() {
  return (
    <Suspense fallback={<Loading />}>
      <Chart />
    </Suspense>
  );
}

// Chart.tsx imports ChartUtils
import { formatData } from './ChartUtils';  // ← Assumed to be available

The problem: Chart expects ChartUtils to be loaded, but under slow networks, Chart might execute before ChartUtils finishes loading. The import fails, and your component crashes.
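One direct fix is to make the pair load as a unit: resolve the lazy component only after both chunks have arrived, so Chart can never execute without its dependency. A minimal sketch, assuming the same Chart/ChartUtils files as above:

```
// Dashboard.tsx — resolve Chart only once both chunks have arrived
import { lazy } from 'react';

const Chart = lazy(() =>
  Promise.all([
    import('./Chart'),
    import('./ChartUtils'),   // pulled in alongside, never behind
  ]).then(([chartModule]) => chartModule)
);
```

Suspense still shows the fallback while either chunk is in flight; the difference is that execution waits for the slower of the two.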

Why This Only Happens in Production

Development mode typically has:

  • Fast local network (no latency)
  • Chunks served from memory (instant)
  • Smaller chunk sizes (less download time)
  • HMR keeping modules pre-loaded

Production introduces:

  • Network latency (50-500ms+)
  • CDN cold starts
  • Parallel chunk requests with unpredictable completion order
  • Users on throttled connections

The race condition window that's microseconds in dev becomes hundreds of milliseconds in production—enough for code to execute before dependencies arrive.
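The window is easy to see in a toy simulation (plain promises, not webpack's actual runtime): two "chunks" start loading in parallel, and code that runs as soon as the fast one arrives misses the slow one.

```javascript
// Toy model of parallel chunk loading (not webpack's real runtime).
// A "chunk" registers itself after a simulated network delay.
function loadChunk(name, delayMs, registry) {
  return new Promise(resolve =>
    setTimeout(() => { registry[name] = true; resolve(name); }, delayMs)
  );
}

// Racy: both requests start, but execution waits only for the fast chunk
async function racyRender(registry) {
  const dashboard = loadChunk('dashboard', 10, registry); // small, fast
  loadChunk('vendors', 50, registry);                     // large, slow
  await dashboard; // dashboard code executes on arrival
  return registry.vendors ? 'rendered' : 'crashed: vendors missing';
}

// Safe: execution waits for every dependency
async function safeRender(registry) {
  await Promise.all([
    loadChunk('dashboard', 10, registry),
    loadChunk('vendors', 50, registry),
  ]);
  return registry.vendors ? 'rendered' : 'crashed: vendors missing';
}

async function demo() {
  console.log(await racyRender({})); // crashed: vendors missing
  console.log(await safeRender({})); // rendered
}
demo();
```

Shrink the delays to sub-millisecond values and the racy version still fails, which is exactly why it slips through local testing.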

Shared Chunks: The Sneaky Variant

Webpack's splitChunks optimization creates shared chunks:

// webpack.config.js
optimization: {
  splitChunks: {
    chunks: 'all',
    minSize: 20000,
  }
}

// This creates:
// - main.js (entry)
// - vendors-lodash-moment.js (shared vendor chunk)
// - dashboard.js (your lazy route)
// - chart.js (depends on vendors chunk)

When dashboard.js loads, it needs vendors-lodash-moment.js. Webpack's runtime handles this ordering... usually. But if the vendor chunk is revalidated from cache (304) while the entry chunk downloads fresh (200), the chunks' arrival order becomes unpredictable.

Race condition timeline:

  1. User navigates to /dashboard
  2. dashboard.js request starts (needs the vendors chunk)
  3. vendors chunk request starts
  4. dashboard.js arrives (5 KB, fast)
  5. dashboard.js executes immediately
  6. lodash() called → undefined is not a function
  7. vendors chunk arrives (too late)
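The mixed 304/200 scenario also argues for immutable, content-hashed chunk filenames, so a cached chunk can never silently mismatch a freshly deployed one. A sketch (the filename patterns are conventional, not mandatory):

```
// webpack.config.js
output: {
  filename: '[name].[contenthash].js',
  chunkFilename: '[name].[contenthash].chunk.js',
},
// Pair with far-future Cache-Control on hashed files; only index.html
// stays un-cached, so each page load references a consistent chunk set.
```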

The ChunkLoadError

When chunk loading fails entirely (network error, 404, timeout), Webpack throws a ChunkLoadError. But partial loading races are sneakier—the chunk "loaded" but dependencies haven't resolved.

// This error is the obvious one:
ChunkLoadError: Loading chunk 5 failed.

// These are the sneaky race condition errors:
TypeError: Cannot read property 'map' of undefined
TypeError: lodash_1.default is not a function
ReferenceError: formatDate is not defined

The second category looks like code bugs, not loading issues. They're hard to reproduce because they depend on network timing.
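To triage these in logs, a small heuristic classifier helps. The patterns below cover webpack's ChunkLoadError plus common failed-import messages; note that the race-condition TypeErrors above will still classify as app bugs, since only outright load failures are detectable this way:

```javascript
// Heuristic: does this error look like a chunk/network failure rather
// than an application bug? Patterns are ours; adjust for your stack.
function isLikelyChunkError(error) {
  if (!error) return false;
  if (error.name === 'ChunkLoadError') return true;
  const msg = error.message || '';
  return /Loading chunk [\w-]+ failed/.test(msg) ||
         /Failed to fetch dynamically imported module/.test(msg);
}

console.log(isLikelyChunkError(new Error('Loading chunk 5 failed.'))); // true
console.log(isLikelyChunkError(
  new TypeError("Cannot read property 'map' of undefined"))); // false
```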

Defense: Proper Async Boundaries

Never assume synchronous imports in code-split modules. Treat chunk boundaries as async boundaries:

// ❌ BAD: Assumes ChartUtils loaded with Chart
// Chart.tsx
import { formatData } from './ChartUtils';

export function Chart({ data }) {
  return <svg>{formatData(data)}</svg>;
}

// ✅ GOOD: Explicit async dependency
// Chart.tsx
import { useEffect, useState } from 'react';

export function Chart({ data }) {
  const [utils, setUtils] = useState(null);

  useEffect(() => {
    // Store the whole module namespace object; it's safe to pass to the
    // setter directly since it isn't a function
    import('./ChartUtils').then(mod => setUtils(mod));
  }, []);

  if (!utils) return <Loading />;
  return <svg>{utils.formatData(data)}</svg>;
}

Defense: Webpack Magic Comments

Control chunk loading order with magic comments:

// Preload: Start loading immediately, high priority
const Chart = lazy(() => import(
  /* webpackPreload: true */
  './Chart'
));

// Prefetch: Load during idle time, low priority
const Settings = lazy(() => import(
  /* webpackPrefetch: true */
  './Settings'
));

// Chunk name: Group related code
const ChartAndUtils = lazy(() => import(
  /* webpackChunkName: "chart-bundle" */
  './ChartWithUtils'  // Single chunk, no race
));

Defense: Error Boundaries With Retry

Wrap lazy components in error boundaries that can retry:

import { Component, Fragment } from 'react';

class ChunkErrorBoundary extends Component {
  state = { hasError: false, isChunkError: false, retryCount: 0 };

  static getDerivedStateFromError(error) {
    // Distinguish chunk loading failures from ordinary render errors
    const isChunkError = error.name === 'ChunkLoadError' ||
      error.message?.includes('Loading chunk');
    return { hasError: true, isChunkError };
  }

  retry = () => {
    this.setState(s => ({
      hasError: false,
      retryCount: s.retryCount + 1
    }));
  };

  render() {
    if (this.state.hasError) {
      // Only offer retry for loading failures, and only a few times
      if (this.state.isChunkError && this.state.retryCount < 3) {
        return (
          <button onClick={this.retry}>
            Loading failed. Click to retry.
          </button>
        );
      }
      return <div>Failed to load. Please refresh.</div>;
    }
    // Changing the key remounts the subtree on each retry
    return (
      <Fragment key={this.state.retryCount}>
        {this.props.children}
      </Fragment>
    );
  }
}
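A remount alone may not be enough: React.lazy caches the module promise, so once an import rejects, the lazy component generally stays rejected. Retrying at the import level, before the error boundary ever fires, is more reliable. A sketch (importWithRetry is our name, not a webpack or React API):

```javascript
// Retry a dynamic import with a delay between attempts. Transient CDN
// or network hiccups often succeed on the second try.
function importWithRetry(importer, retries = 2, delayMs = 500) {
  return importer().catch(err => {
    if (retries <= 0) throw err;
    return new Promise(resolve => setTimeout(resolve, delayMs))
      .then(() => importWithRetry(importer, retries - 1, delayMs));
  });
}

// Usage with React.lazy:
// const Chart = lazy(() => importWithRetry(() => import('./Chart')));
```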

Defense: Module Federation Careful Setup

Module Federation amplifies these issues with remote chunks:

// Remote module loading has even more race conditions
// Because remotes can be on different servers with different latency

// ❌ Dangerous: Assumes remote is loaded
import Button from 'remote/Button';

// ✅ Safe: Explicit async with fallback
const Button = lazy(() => 
  import('remote/Button').catch(() => import('./LocalButton'))
);

// ✅ Even safer: Validate remote before use
async function loadRemote(scope, module) {
  await __webpack_init_sharing__('default');
  const container = window[scope];
  
  if (!container) {
    throw new Error(`Remote ${scope} not loaded`);
  }
  
  await container.init(__webpack_share_scopes__.default);
  const factory = await container.get(module);
  return factory();
}

Testing for Race Conditions

Reproduce timing issues in development:

// 1. Chrome DevTools: Network tab → Throttling → Slow 3G

// 2. Custom delay plugin for webpack-dev-server:
// webpack.config.js
devServer: {
  setupMiddlewares: (middlewares, devServer) => {
    devServer.app.use((req, res, next) => {
      if (req.url.includes('.js')) {
        // Random delay 0-2000ms per chunk
        setTimeout(next, Math.random() * 2000);
      } else {
        next();
      }
    });
    return middlewares;
  }
}

// 3. Playwright test with slow network
test('dashboard loads on slow network', async ({ page }) => {
  await page.route('**/*.js', route => {
    setTimeout(() => route.continue(), 1000 + Math.random() * 2000);
  });
  
  await page.goto('/dashboard');
  await expect(page.locator('.chart')).toBeVisible({ timeout: 30000 });
});

Webpack 5 Improvements

Webpack 5 added better chunk loading guarantees:

// webpack.config.js
output: {
  // Chunk loading method ('jsonp' via script tags, 'import' for ESM output)
  chunkLoading: 'jsonp',

  // Sets crossorigin="anonymous" on chunk script tags, so errors from
  // CDN-hosted chunks report full details (and enables SRI)
  crossOriginLoading: 'anonymous',
}

// Runtime chunk ensures loading logic is always available
optimization: {
  runtimeChunk: 'single',  // ← Critical for consistent loading
  
  splitChunks: {
    chunks: 'all',
    // Ensure shared dependencies are in predictable chunks
    cacheGroups: {
      vendor: {
        test: /[\\/]node_modules[\\/]/,
        name: 'vendors',
        chunks: 'all',
        priority: 10,
      },
    },
  },
}

The runtimeChunk: 'single' setting is crucial—it ensures the chunk loading runtime is consistent across all entry points.

Monitoring in Production

Track chunk loading failures:

// Global error handler for chunk failures
window.addEventListener('error', (event) => {
  if (event.message?.includes('Loading chunk')) {
    // Send to monitoring
    analytics.track('chunk_load_failure', {
      chunk: event.message,
      url: window.location.href,
      connection: navigator.connection?.effectiveType,
    });
  }
});

// React error boundary reporting
componentDidCatch(error, info) {
  if (error.name === 'ChunkLoadError') {
    Sentry.captureException(error, {
      tags: { type: 'chunk_loading' },
      extra: { componentStack: info.componentStack },
    });
  }
}

Key Takeaways

  • Chunk loading races appear in production due to network latency
  • Shared chunks and Module Federation multiply race condition risks
  • Treat chunk boundaries as async boundaries—never assume sync availability
  • Use runtimeChunk: 'single' for consistent loading behavior
  • Test with artificial delays; DevTools throttling isn't enough
  • Error boundaries with retry logic prevent blank screens
  • Monitor chunk failures in production—they look like code bugs
