Engineering

The Frontend Build Pipeline

Why a compile step exists, what each tool does, and how to stop fighting it

Learning Objectives

By the end of this module you will be able to:

  • Explain why frontend applications require a build step and what it produces.
  • Describe the difference between ESM and CommonJS and why that distinction determines whether tree-shaking works.
  • Select the appropriate bundler (Vite, esbuild, Rollup, Webpack) for a given scenario and justify the choice.
  • Explain why tree-shaking fails with CommonJS and barrel files, and how to prevent it.
  • Describe the correct CI pipeline for TypeScript — separating transpilation from type-checking.
  • Use dynamic import() to implement route-based code splitting.

Why the build step exists at all

Backend engineers compile code because the machine cannot run source directly. The frontend has the same problem, but the "machine" is a network.

A naive browser app that loads modules one-by-one quickly becomes unworkable. Importing lodash-es alone triggers 600+ simultaneous HTTP requests without bundling. On a mobile network with high latency and limited bandwidth, that is a disaster. A build step solves several distinct problems in one pass:

  • Request minimization — Hundreds of files become a handful of chunks, reducing round-trips.
  • Dead-code elimination — Unused exports are stripped, reducing bundle size by ~25% on average.
  • Minification — Whitespace and long variable names are removed, significantly reducing bytes transferred.
  • Transpilation — Modern JavaScript (ES2022+) is converted to syntax that older browsers understand.
  • JSX and TypeScript compilation — JSX must be converted to React.createElement() calls; TypeScript annotations must be stripped.
  • CSS optimization — Critical above-the-fold styles are inlined; the rest is deferred.
  • Code splitting — Dividing the app into on-demand chunks can cut initial load from 2MB to 200KB and yield 1.64x+ performance improvements.

Think of it as a release build in Go or Rust: you don't ship debug binaries to production, and you don't ship raw source to browsers.

ESM vs CommonJS: the module system divide

This is the frontend equivalent of the static vs dynamic linking problem.

CommonJS (CJS) was Node.js's original module system. Modules are loaded with require(), which is a regular function call. Because it is a function, it can appear inside an if statement or depend on a runtime variable:

// CJS — require() is dynamic, evaluated at runtime
const feature = process.env.FEATURE_FLAG
  ? require('./feature-a')
  : require('./feature-b');

ES Modules (ESM) use static import/export declarations that must appear at the top level and cannot be conditional:

// ESM — import is static, resolved at parse time
import { debounce } from 'lodash-es';

This is not just syntactic. The consequences are significant:

| Property               | CommonJS                        | ESM                                 |
|------------------------|---------------------------------|-------------------------------------|
| Analysis time          | Runtime                         | Parse time (static)                 |
| Tree-shaking possible  | No (with rare exceptions)       | Yes                                 |
| Loading model          | Synchronous, blocking           | Asynchronous, parallel              |
| Exports                | Value copies                    | Live bindings                       |
| File extension default | .js (with "type": "commonjs")   | .mjs or .js (with "type": "module") |

Because import is static, a bundler can read the entire dependency graph before executing any code. It knows exactly which exports are consumed. CommonJS forces bundlers to assume everything might be needed because require() paths can be dynamic.

Live bindings vs value copies

When an ESM module exports a variable and later mutates it, the importing module sees the updated value. CommonJS copies the value at import time — subsequent mutations in the exporter are invisible. This matters for mutable state and module-level singletons.

The dual package hazard is a related trap: when a library ships both a CJS and an ESM build, and your app accidentally loads both (common in complex bundler configs), each module system gets its own independent instance. If the package maintains a singleton or a configuration map, your app gets two independent versions of it — two different registries, two independent state stores. This causes bugs that are very difficult to trace.

The "exports" field in package.json is the correct fix: it lets a package declare exactly which file is loaded per condition ("import" vs "require"), superseding the older "main" and "module" fields:

{
  "exports": {
    ".": {
      "import": "./dist/esm/index.js",
      "require": "./dist/cjs/index.js"
    }
  }
}

The bundler landscape

The ecosystem has shifted toward native-language implementations (Go, Rust) in recent years, driven by performance demands that JavaScript-based tooling cannot satisfy.

Vite has become the default choice for new frontend projects as of 2025–2026. Its key architectural decision is the "unbundled dev + bundled production" split. During development, Vite serves modules individually via native ESM — no bundling, instant startup, HMR in 10–20ms regardless of app size. For production, it uses Rollup to produce optimized, tree-shaken chunks. This split exists because dev and production have contradictory goals: dev needs instant feedback; production needs minimal network round-trips.

esbuild is written in Go and achieves 100–125x performance gains through parallelism and a three-pass architecture. It bundles 10 copies of three.js in 0.33 seconds versus Webpack's 41.53 seconds. But it intentionally omits features like sophisticated code splitting and dynamic import optimization. Vite uses esbuild for dev-time dependency pre-bundling. esbuild is a strong choice for CI/CD pipelines, monorepos with frequent rebuilds, or component libraries — not for application bundling.

Rollup is purpose-built for library bundling: aggressive tree-shaking, multi-format output (ESM, CommonJS, UMD), and hand-optimized output. The right tool for publishing packages. Not for applications.

Webpack remains appropriate for specific scenarios: pre-ES6 browser support (IE11), Module Federation for micro-frontend architectures, or deep existing investment. Its slow build times and complex configuration make it a poor default for greenfield work.

Turbopack is Vercel's Rust-based bundler with incremental computation at its core. It memoizes build function results and tracks fine-grained dependencies so a single-file change triggers minimal recomputation regardless of app size. It is most relevant for large Next.js applications where traditional bundlers degrade with scale.

Fig 1 — Bundler selection guide

  • New application? → Vite
  • Publishing a library? → Rollup
  • CI / monorepo build? → esbuild
  • IE11 / Module Federation? → Webpack
  • Large Next.js app? → Turbopack
  • Vite 8+ project? → Rolldown (built-in)

Vite 8 and Rolldown

Vite 8 (December 2025) integrates Rolldown — a Rust-based Rollup replacement — for production builds. This unifies the parse, resolve, transform, and minify stages under a single tool while keeping the same developer-facing API.

Tree-shaking: how it works and why it breaks

Tree-shaking is the build-time removal of unused exports via static analysis of the module graph. Because ESM import/export declarations are fixed at the top level, a bundler can trace the full dependency graph before execution and mark unreferenced exports for removal.

This is impossible with CommonJS because require() is a runtime call. The bundler cannot know at build time which branch will execute. It must bundle everything conservatively.

Two important flags control whether tree-shaking can safely remove code:

"sideEffects": false in package.json tells the bundler that every module in the package is "pure" — importing it produces no effects beyond its exports (no global polyfills, no DOM manipulation, no global state). When this flag is absent, bundlers assume modules have side effects and retain them even if nothing imports from them.

/*#__PURE__*/ as a comment before a function call tells the bundler that this specific call has no side effects and can be removed if the return value is unused. Useful for factory functions and constructors.

CSS side effects

CSS imports like import './styles.css' are side effects by definition. If you set "sideEffects": false globally in your library but ship CSS files, those imports will be stripped. Always list CSS files explicitly: "sideEffects": ["**/*.css"].
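Putting the pieces together, a library's package.json might look like this (a sketch; the package name and file layout are illustrative):

```json
{
  "name": "my-ui-lib",
  "sideEffects": ["**/*.css"],
  "exports": {
    ".": {
      "import": "./dist/esm/index.js",
      "require": "./dist/cjs/index.js"
    }
  }
}
```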

Barrel files are the most common cause of tree-shaking failures in real codebases. A barrel file re-exports everything from a package:

// components/index.js — a barrel file
export { Button } from './Button';
export { Modal } from './Modal';
export { Table } from './Table';
// ... 40 more exports

When you write import { Button } from '@mylib/components', the bundler sees the barrel and must analyze all re-exports to determine what is unused. This analysis is expensive and often conservative. In practice, barrel files cause bundlers to include far more than necessary. Named exports directly from their source file (import { Button } from '@mylib/components/Button') give bundlers precise, traceable information.
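One way to keep short import paths without a barrel is subpath exports in package.json, which map granular entrypoints directly to built files (a sketch reusing the component names above):

```json
{
  "name": "@mylib/components",
  "exports": {
    "./Button": "./dist/Button.js",
    "./Modal": "./dist/Modal.js",
    "./Table": "./dist/Table.js"
  }
}
```

Consumers then write import { Button } from '@mylib/components/Button', and the bundler never has to analyze the other components.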

Hot Module Replacement

In development, you want to see your change immediately without losing application state. That is what HMR delivers.

The dev server watches the filesystem and maintains a WebSocket connection to the browser. When a file changes, the server compiles the module and sends an update payload over WebSocket. The browser receives the payload and swaps the module in place — without reloading the page.

If the updated module does not have an HMR handler, the update propagates up the dependency tree until a parent module accepts it. If nothing accepts, the dev server falls back to a full page reload.

Vite's HMR completes in 10–20ms regardless of app size because only the individual changed module is re-compiled and re-fetched. Webpack compiles the affected chunk, generates an update manifest, and re-bundles — typically 500–1600ms.

React Fast Refresh is a framework-specific HMR layer that preserves component state across updates. Without it, HMR would reset all state on every component edit.

HMR code is stripped from production

Webpack's module.hot property (and Vite's import.meta.hot) is undefined in production builds. Any code guarded by if (module.hot) is dead code in production and is removed by minifiers. This is intentional: HMR infrastructure has no place in production bundles.

The TypeScript compilation pipeline

TypeScript in a modern build system is two separate concerns that should run separately:

Transpilation strips type annotations and converts TypeScript syntax to JavaScript. esbuild and SWC do this. They do not check types. They are fast: Vite's esbuild-based transpiler is roughly 20–30x faster than tsc. This runs on every file save during development.

Type checking validates types against the full program graph. Only tsc does this. It is slow precisely because it needs cross-file analysis — it cannot check a single file in isolation. Running it on every keystroke is impractical.

The correct pattern: fast transpilers emit code continuously; tsc --noEmit runs separately in CI, in pre-commit hooks, and in your editor's background language server. The --noEmit flag tells tsc to check only, without writing any output files.

This is not a shortcut. It is the correct architecture. Full type checking cannot be folded into a per-file HMR pipeline like Vite's, because checking types requires the entire module graph.
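In package.json terms, the split might look like this (script names are illustrative):

```json
{
  "scripts": {
    "dev": "vite",
    "build": "vite build",
    "typecheck": "tsc --noEmit",
    "ci": "npm run typecheck && npm run build"
  }
}
```

In CI, "typecheck" and "build" can also run as separate parallel jobs, since neither depends on the other's output.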

isolatedModules

Enable "isolatedModules": true in tsconfig.json when using esbuild, Vite, or SWC. These tools transpile files individually without full project analysis. The flag prevents you from using TypeScript features that require cross-file analysis (like const enum inlining), ensuring each file can be transpiled independently and catching incompatibilities early.
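One reasonable tsconfig.json sketch for an esbuild- or Vite-based project (options beyond isolatedModules are suggestions, not requirements):

```json
{
  "compilerOptions": {
    "isolatedModules": true,
    "noEmit": true,
    "strict": true,
    "module": "esnext",
    "moduleResolution": "bundler"
  }
}
```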

One capability that belongs only to tsc: generating .d.ts declaration files via the declaration compiler option. If you are publishing a library, you must run tsc to emit declarations. esbuild and SWC cannot produce them.

Code splitting and dynamic imports

Code splitting divides your bundle into chunks loaded on demand. The mechanism is the dynamic import() function:

// Static import — always included in the initial bundle
import { HeavyChart } from './HeavyChart';

// Dynamic import — creates a separate chunk, loaded asynchronously
const HeavyChart = React.lazy(() => import('./HeavyChart'));

When a bundler encounters a dynamic import(), it automatically creates a separate chunk for that module and its dependencies. The chunk is fetched from the server only when the import() call executes.

Route-based code splitting is the highest-leverage starting point. Loading page components lazily means a user visiting /dashboard never downloads the code for /settings. This can cut initial bundle size dramatically.

// React Router with lazy-loaded routes
const Dashboard = React.lazy(() => import('./pages/Dashboard'));
const Settings = React.lazy(() => import('./pages/Settings'));

function App() {
  return (
    <Suspense fallback={<PageSpinner />}>
      <Routes>
        <Route path="/dashboard" element={<Dashboard />} />
        <Route path="/settings" element={<Settings />} />
      </Routes>
    </Suspense>
  );
}

When naming chunks, use magic comments so browsers can cache them across deployments. Auto-generated numeric names like 1.js change every build, defeating long-term caching:

import(/* webpackChunkName: "settings" */ './pages/Settings')

The primary metric for code splitting is min+gzip size: the bundle size after minification and gzip compression. This reflects what users actually download.

Key Principles

Match the tool to the goal, not the hype. Vite for applications, Rollup for libraries, esbuild for CI. No single bundler is universally optimal.

ESM is a prerequisite, not an option. Tree-shaking, HMR performance, and static analysis all depend on ESM. Libraries that ship only CommonJS cannot be tree-shaken. Prefer ESM throughout.

Separate transpilation from type checking. Type checking needs the full module graph. Transpilation operates per-file. Keep them in different processes and different pipeline stages.

Barrel files are a performance liability. They obscure the dependency graph and defeat tree-shaking. Prefer direct imports or granular package entrypoints.

Route-based splitting first. Before micro-optimizing individual components, split at route boundaries. The payoff is highest and the complexity is lowest.

Dev and production behave differently by design. Vite serves individual modules in dev and produces bundled chunks in production. HMR code is present in dev and stripped in production. Test production builds in CI.

Worked Example

Scenario: You have a React application using Webpack. The initial bundle is 1.8MB. You need to reduce it.

Step 1: Identify what is in the bundle.

Install webpack-bundle-analyzer. Run a production build and open the visualization. Look for:

  • Large third-party libraries appearing in full (likely not being tree-shaken)
  • Barrel file re-exports pulling in more than expected
  • Page components that are loaded upfront but not needed on first render

Step 2: Fix tree-shaking failures.

You notice import _ from 'lodash' in several files. The default import pulls in the entire library, and lodash ships CommonJS, so tree-shaking cannot remove the unused methods. Change to named imports:

// Before — includes entire lodash
import _ from 'lodash';
_.debounce(fn, 300);

// After — only debounce is included (assuming lodash-es)
import { debounce } from 'lodash-es';
debounce(fn, 300);

Note: lodash (CommonJS) cannot be tree-shaken at all. Switch to lodash-es (ESM) or use lodash/debounce per-method imports.

Step 3: Add route-based splitting.

Your entire app is in one chunk. Add React.lazy() for each page component:

// Before
import Dashboard from './pages/Dashboard';
import Settings from './pages/Settings';
import Reports from './pages/Reports';

// After
const Dashboard = React.lazy(() => import(/* webpackChunkName: "page-dashboard" */ './pages/Dashboard'));
const Settings = React.lazy(() => import(/* webpackChunkName: "page-settings" */ './pages/Settings'));
const Reports = React.lazy(() => import(/* webpackChunkName: "page-reports" */ './pages/Reports'));

Wrap routes in <Suspense fallback={<PageSpinner />}>.

Step 4: Add sideEffects to your own packages.

If your codebase uses internal packages, add "sideEffects": false to their package.json — or "sideEffects": ["**/*.css"] if they import CSS. Without this, Webpack cannot eliminate unused modules even with proper named imports.

Result: Initial bundle drops from 1.8MB to under 400KB min+gzip. The dashboard page component loads only when the user navigates there.

Compare & Contrast

Vite dev server vs Webpack dev server

|                   | Vite (dev)                        | Webpack (dev)                       |
|-------------------|-----------------------------------|-------------------------------------|
| Module serving    | Individual ESM modules, unbundled | Bundled chunks                      |
| Startup time      | Near-instant (no bundling)        | Grows with app size                 |
| HMR speed         | 10–20ms (per-module)              | 500–1600ms (per-chunk)              |
| HMR mechanism     | Re-fetch single module            | Recompile + re-bundle chunk         |
| Production parity | Lower (different serving model)   | Higher (same bundler in both modes) |

The dev/prod gap in Vite is the main operational risk: bugs that exist only in production because the serving model is different. Mitigate with CI production builds and preview deployments.

tsc vs esbuild/SWC for TypeScript

|                    | tsc                               | esbuild / SWC                      |
|--------------------|-----------------------------------|------------------------------------|
| Type checking      | Full                              | None                               |
| Speed              | Slow (full program analysis)      | 20–30x faster than tsc             |
| Output             | JS + .d.ts declarations           | JS only                            |
| const enum support | Yes                               | Limited / unreliable               |
| Use in             | CI type-check, library publishing | Dev builds, HMR, CI transpilation  |

These are not competing tools. They address different parts of the build. Use both.

Named exports vs barrel files

|                   | Named exports (direct) | Barrel file                   |
|-------------------|------------------------|-------------------------------|
| Tree-shaking      | Fine-grained, accurate | Impaired or defeated          |
| Import verbosity  | Longer path            | Short, convenient             |
| Build performance | Faster graph analysis  | Slower, more work for bundler |
| IDE autocomplete  | Works                  | Works                         |

Use direct imports in application code. Use barrel files only when developer ergonomics genuinely matter more than bundle size, and profile the tradeoff.

Common Misconceptions

"Vite doesn't bundle for production, so bundles will be unoptimized." False. Vite uses Rollup (or Rolldown in Vite 8) for production builds with full tree-shaking, code splitting, and minification. The unbundled approach is development-only. Vite produces competitive or superior production bundle sizes compared to Webpack defaults.

"Tree-shaking means I don't need to worry about what I import." Tree-shaking only eliminates unused exports — it is not a magic eraser. If a library uses CommonJS, sets no sideEffects flag, or exports through a barrel, the bundler will include far more than you expect. Always verify with bundle analysis.

"Vite runs tsc in the background, so type errors will fail the build." Vite does not perform type checking. It transpiles TypeScript via esbuild and skips all type validation. You can ship type errors to production if you have no tsc --noEmit step in your CI pipeline. Add it explicitly.

"ESM and CommonJS are just different syntax for the same thing." They differ semantically. ESM exports are live bindings; CJS exports are value copies. ESM loads asynchronously, enabling parallel dependency resolution; CJS loads synchronously. Mixing the two in a single module graph invites the dual package hazard.

"Dynamic import() is only for lazy loading." Dynamic import() is also the mechanism for loading platform-specific code, feature flags, and conditionally loading heavy dependencies. Any place where you want to defer or conditionally include code is a valid use.

Key Takeaways

  1. The build step exists to satisfy network constraints, not developer preference. HTTP request minimization, dead-code elimination, minification, and code splitting are all responses to real delivery costs.
  2. ESM's static import/export is what makes tree-shaking possible. CommonJS require() is dynamic and cannot be statically analyzed. Libraries that ship only CJS cannot be tree-shaken.
  3. Vite's architecture — unbundled dev, bundled production — is an intentional tradeoff. Dev prioritizes feedback speed; production prioritizes delivery. The gap between the two environments is a known risk and should be covered by CI.
  4. TypeScript transpilation and type checking are separate pipeline stages. Fast transpilers (esbuild, SWC) emit code. tsc --noEmit checks types. Run them in parallel in CI, not sequentially.
  5. Barrel files and CommonJS are the most common causes of tree-shaking failure. Named exports from source files, sideEffects: false in package.json, and ESM-first library selection are the practical fixes.

Further Exploration

ESM and the module system

Vite

Tree-shaking

TypeScript build pipeline

Code splitting