Chrome Extension Code Splitting and Tree Shaking: Ship Smaller, Faster Extensions (2026)
Every kilobyte in your Chrome extension has a cost. Users notice slow installs. Chrome Web Store reviewers process smaller packages faster. And bloated background scripts eat memory that Chrome’s process model never gives back. Yet most extensions ship with hundreds of kilobytes of dead code, duplicated utilities, and monolithic bundles that load everything upfront regardless of what the user actually does.
This guide covers the same class of optimization that server-side teams apply to production builds — dead code elimination, chunk splitting, and systematic dependency auditing — translated into the specific constraints of the Chrome extension environment.
Why Bundle Size Matters for Extensions
Chrome extensions are not web pages. They run in a constrained environment with specific loading characteristics that make bundle size more impactful than in typical web apps:
Install and update friction. The Chrome Web Store enforces a 128 MB package limit, but users feel the difference long before that ceiling. A 2 MB extension versus a 200 KB extension creates measurably different first-impression experiences, particularly on slower connections where updates download in the background.
Review queue velocity. Automated and manual review times at the Chrome Web Store correlate with package complexity. Smaller, cleaner packages with fewer files tend to move through review faster. This matters when you are pushing a security patch or a time-sensitive feature update.
Memory overhead. Background service workers in Manifest V3 are supposed to be ephemeral, but a large bundle means a longer parse and initialization time every time Chrome wakes your worker. Content scripts injected into every tab multiply this cost by the number of open tabs. A 400 KB content script loaded in 20 tabs consumes 8 MB of memory just sitting there doing nothing.
Startup latency. Popup scripts must parse and execute before the UI renders. Users who click your extension icon and see a blank popup for 200ms will click again, triggering a second load. Keeping popup bundles under 50 KB eliminates this perception problem entirely.
Setting Up Vite for Chrome Extension Builds
Vite with the @crxjs/vite-plugin is the most ergonomic setup for modern Chrome extension development. It handles manifest-driven entry points, hot module replacement during development, and produces optimized Rollup bundles for production.
```shell
npm create vite@latest my-extension -- --template vanilla-ts
npm install -D @crxjs/vite-plugin
```

A minimal vite.config.ts that gives you full control over chunking:
```typescript
import { defineConfig } from 'vite'
import { crx } from '@crxjs/vite-plugin'
import manifest from './manifest.json'

export default defineConfig({
  plugins: [crx({ manifest })],
  build: {
    rollupOptions: {
      output: {
        // Prevent Rollup from inlining small modules
        // that should stay as separate cacheable chunks
        inlineDynamicImports: false,
        manualChunks: (id) => {
          if (id.includes('node_modules')) {
            // Vendor chunk strategy — explained below
            if (id.includes('react') || id.includes('react-dom')) {
              return 'vendor-react'
            }
            if (id.includes('webext-bridge')) {
              return 'vendor-messaging'
            }
            return 'vendor'
          }
        },
      },
    },
    // Target modern Chrome — enables more aggressive dead code elimination
    target: 'chrome110',
    minify: 'terser',
    terserOptions: {
      compress: {
        drop_console: true,
        drop_debugger: true,
        pure_funcs: ['console.log', 'console.info'],
      },
    },
  },
})
```

The `target: 'chrome110'` setting is important. Vite will skip transpiling modern syntax that Chrome already understands natively, eliminating polyfill overhead that inflates bundles targeting older browsers.
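To make the effect concrete, here is an illustration of syntax that a modern Chrome target ships untouched, where older targets would inject down-level helpers. The names are hypothetical:

```typescript
// Syntax Chrome 110 executes natively, so esbuild leaves it as written
// and no helper functions are injected into the bundle.
type Settings = { theme?: string }

const settings: Settings = {}
const theme = settings?.theme ?? 'light' // optional chaining + nullish coalescing

class Store {
  #cache = new Map<string, number>() // native private class fields

  set(key: string, value: number): this {
    this.#cache.set(key, value)
    return this
  }

  get(key: string): number | undefined {
    return this.#cache.get(key)
  }
}
```

Targeting older browsers would compile the `#cache` field and the `?.`/`??` operators into helper-laden equivalents; with a modern target, they cost exactly the bytes you wrote.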
Tree Shaking: Configuration and Gotchas
Tree shaking is Rollup’s (and therefore Vite’s) mechanism for removing exports that are never imported anywhere in your dependency graph. It works reliably with ES module syntax, but several common patterns defeat it silently.
CommonJS modules are not tree-shakeable. When you import from a library that ships only CommonJS (no module field in package.json, no .mjs files), the entire library is bundled regardless of what you use. Audit your dependencies:
```shell
npx is-esm lodash date-fns axios
```

If a critical dependency is CommonJS-only, look for an ES module alternative or import the specific file directly:
```typescript
// Bad — pulls in all of lodash (72 KB gzipped)
import { debounce } from 'lodash'

// Good — pulls in only debounce (< 1 KB)
import debounce from 'lodash-es/debounce'

// Or better: use lodash-es, which is fully tree-shakeable
import { debounce } from 'lodash-es'
```

Side-effect annotations matter. Libraries without `"sideEffects": false` in their package.json are treated conservatively by Rollup — it assumes any module might have side effects and keeps it even if nothing is imported. You can override this:
```jsonc
// In your own package.json
{
  "sideEffects": ["*.css", "src/polyfills.ts"]
}
```

For third-party packages that incorrectly omit this field, you can patch them with patch-package or use Vite's optimizeDeps to pre-bundle and control what gets included.
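Another option, as a sketch: Rollup's `treeshake.moduleSideEffects` setting can declare specific modules side-effect-free from your own config, without touching the package. Here `legacy-utils` is a hypothetical dependency that omits the `sideEffects` field:

```typescript
// vite.config.ts — tell Rollup that modules from 'legacy-utils' are pure,
// so unused imports from them can be dropped during tree shaking.
import { defineConfig } from 'vite'

export default defineConfig({
  build: {
    rollupOptions: {
      treeshake: {
        // Returning false marks a module as safe to drop when unused;
        // returning true keeps Rollup's default conservative behavior.
        moduleSideEffects: (id) => !id.includes('node_modules/legacy-utils'),
      },
    },
  },
})
```

Use this carefully: if the package genuinely runs side effects on import (registering globals, patching prototypes), marking it pure will break it.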
Re-export patterns can hide dead code. If a barrel file re-exports everything from its submodules, any import of the barrel forces the tree shaker to consider all of those modules as potentially used:

```typescript
// utils/index.ts — this defeats tree shaking for the whole utils barrel
export * from './format'
export * from './network'
export * from './storage'
```

Prefer named imports from the specific module file instead of barrel re-exports, especially for large utility collections.
Code Splitting: Separate Chunks for Each Entry Point
Chrome extensions have a fundamentally multi-entry architecture: popup, options page, background service worker, content scripts, and potentially devtools panels or sidepanel pages. Each runs in isolation. Code that is only used in the popup should never land in the content script bundle.
With @crxjs/vite-plugin, entries are derived from your manifest automatically. For finer control, define them explicitly:
```typescript
// vite.config.ts
import { defineConfig } from 'vite'

export default defineConfig({
  build: {
    rollupOptions: {
      input: {
        popup: 'src/popup/index.html',
        options: 'src/options/index.html',
        background: 'src/background/index.ts',
        content: 'src/content/index.ts',
      },
      output: {
        entryFileNames: '[name].js',
        chunkFileNames: 'chunks/[name]-[hash].js',
        assetFileNames: 'assets/[name]-[hash][extname]',
      },
    },
  },
})
```

The key insight is that shared code between the popup and options page gets extracted into a shared chunk automatically by Rollup's chunk optimization. Code used only in the content script never touches that shared chunk. You get implicit code splitting aligned with Chrome's execution-context boundaries.
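A concrete sketch of that boundary, using a hypothetical pure helper shared by the popup and options entries:

```typescript
// src/shared/format.ts (hypothetical) — imported by both the popup and
// the options entry. Rollup emits it once, in a shared chunk that both
// HTML pages load; the content script never imports it, so its bundle
// stays free of this code.
export function formatBytes(bytes: number): string {
  if (bytes < 1024) return `${bytes} B`
  const kb = bytes / 1024
  return kb < 1024 ? `${kb.toFixed(1)} KB` : `${(kb / 1024).toFixed(1)} MB`
}
```

After a build, you can confirm the split by checking that `dist/chunks/` contains the shared module once and that the content script file does not reference it.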
Dynamic Imports in Extension Pages
Dynamic imports work in extension pages (popup, options, sidepanel) and can defer heavy initialization until the user actually triggers a feature. They do not work in content scripts or background service workers without additional configuration.
A practical pattern for options pages with multiple sections:
```typescript
// options/index.ts
const routes: Record<string, () => Promise<{ default: () => void }>> = {
  '/general': () => import('./sections/general'),
  '/privacy': () => import('./sections/privacy'),
  '/advanced': () => import('./sections/advanced'),
}

async function navigate(path: string) {
  const loader = routes[path]
  if (!loader) return
  const { default: render } = await loader()
  render()
}

// Only load the current section on startup
navigate(location.hash.replace('#', '') || '/general')

// Lazy-load again whenever the user switches sections
window.addEventListener('hashchange', () => {
  navigate(location.hash.replace('#', '') || '/general')
})
```

Each section becomes a separate chunk. A user who only visits the General settings tab never downloads the Advanced settings code. For options pages that have grown large with multiple configuration panels, this pattern routinely cuts initial load size by 40–60%.
For the background service worker, dynamic imports require the "background": { "type": "module" } field in your manifest and Chrome 111+. The worker is already lazy by nature (Chrome unloads it), so dynamic imports are most useful for splitting heavy one-time initialization from the hot path that runs on every wake-up event.
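For reference, a minimal manifest fragment for a module service worker (the file name is illustrative):

```json
{
  "manifest_version": 3,
  "background": {
    "service_worker": "background.js",
    "type": "module"
  }
}
```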
Analyzing Your Bundle
You cannot optimize what you cannot see. rollup-plugin-visualizer generates an interactive treemap of your bundle composition:
```shell
npm install -D rollup-plugin-visualizer
```

```typescript
// vite.config.ts
import { defineConfig } from 'vite'
import { crx } from '@crxjs/vite-plugin'
import { visualizer } from 'rollup-plugin-visualizer'
import manifest from './manifest.json'

export default defineConfig({
  plugins: [
    crx({ manifest }),
    visualizer({
      filename: 'dist/bundle-analysis.html',
      open: false,
      gzipSize: true,
      brotliSize: true,
    }),
  ],
})
```

Run `npm run build` and open dist/bundle-analysis.html. Look for:
- Unexpectedly large vendor modules — anything over 20 KB gzipped in a content script is a red flag
- Duplicate modules — the same utility appearing in multiple chunks means your chunk splitting is not working as intended
- Entire libraries used for one function — the classic example is importing `axios` for a single `GET` request when `fetch` is built into Chrome
A complementary tool is bundlephobia (bundlephobia.com) for checking the install size of any npm package before adding it to your project.
Common Bloat Sources and Their Fixes
These are the dependencies that appear most frequently in oversized extension bundles, along with their lightweight alternatives:
Moment.js → day.js
Moment.js is 67 KB gzipped with locale data. Day.js is 2 KB and mirrors the Moment API for the methods most extensions actually use.
```typescript
// Before: 67 KB
import moment from 'moment'
const formatted = moment(timestamp).format('YYYY-MM-DD')

// After: 2 KB
import dayjs from 'dayjs'
const formatted = dayjs(timestamp).format('YYYY-MM-DD')
```

lodash → lodash-es or native methods
For most extension use cases, native JavaScript covers what lodash provides. Where you genuinely need lodash, use lodash-es with named imports:
```typescript
// Before: 72 KB (full lodash)
import { groupBy, sortBy, uniqBy } from 'lodash'

// After: ~3 KB (three functions)
import { groupBy, sortBy, uniqBy } from 'lodash-es'

// Or better — native equivalents for simple cases
const grouped = items.reduce((acc, item) => {
  ;(acc[item.key] ??= []).push(item)
  return acc
}, {} as Record<string, typeof items>)
```

axios → native fetch
Chrome extensions have full access to the Fetch API. Axios adds 13 KB gzipped for features most extensions do not need: interceptors, request and response transforms, and compatibility shims for legacy browsers.
```typescript
// A thin fetch wrapper that covers 90% of axios use cases
async function request<T>(url: string, options?: RequestInit): Promise<T> {
  const response = await fetch(url, {
    headers: { 'Content-Type': 'application/json' },
    ...options,
  })
  if (!response.ok) throw new Error(`HTTP ${response.status}`)
  return response.json() as Promise<T>
}
```

date-fns — use the subpath import
If you need date-fns, import from the specific function path rather than the top-level package:
```typescript
// Subpath import — pulls in only this function, even without tree shaking
import formatDistanceToNow from 'date-fns/formatDistanceToNow'

// Top-level named imports also tree-shake, provided your bundler
// resolves the 'module' field in package.json
import { formatDistanceToNow } from 'date-fns'
```

Real Metrics: Before and After
Measured bundles for a tab management extension, before and after applying the techniques above:
| Bundle | Before | After | Reduction |
|---|---|---|---|
| Content script | 187 KB | 41 KB | 78% |
| Popup | 312 KB | 89 KB | 71% |
| Background SW | 94 KB | 28 KB | 70% |
| Total install size | 1.2 MB | 340 KB | 72% |
The changes that produced these numbers: replacing moment.js with dayjs (-65 KB), switching from lodash to lodash-es with named imports (-58 KB), removing axios in favor of fetch (-13 KB), enabling target: 'chrome110' to eliminate transpilation polyfills (-47 KB), and adding manual chunk splitting to prevent the vendor bundle from loading in the content script context (-71 KB).
None of these changes required rewriting application logic. They were purely dependency substitutions and build configuration adjustments.
Automated Size Budgets in CI
Manual auditing does not scale. Once you establish baseline sizes, automate enforcement so that pull requests that introduce significant bundle growth fail the build before they merge.
Create a scripts/check-bundle-size.mjs:
```javascript
import { statSync } from 'fs'
import { join } from 'path'

const BUDGETS = {
  'content.js': 50 * 1024, // 50 KB
  'background.js': 30 * 1024, // 30 KB
  'popup.js': 60 * 1024, // 60 KB
}

const distDir = 'dist'
let failed = false

for (const [file, budget] of Object.entries(BUDGETS)) {
  const filePath = join(distDir, file)
  try {
    const { size } = statSync(filePath)
    const kb = (size / 1024).toFixed(1)
    const budgetKb = (budget / 1024).toFixed(0)
    if (size > budget) {
      console.error(`FAIL ${file}: ${kb} KB exceeds ${budgetKb} KB budget`)
      failed = true
    } else {
      console.log(`PASS ${file}: ${kb} KB / ${budgetKb} KB`)
    }
  } catch {
    console.warn(`SKIP ${file}: not found in dist`)
  }
}

if (failed) process.exit(1)
```

Add it to your CI workflow after the build step:
```yaml
# .github/workflows/build.yml
- name: Build extension
  run: npm run build

- name: Check bundle size budgets
  run: node scripts/check-bundle-size.mjs
```

Tighten the budgets over time as you optimize. The script becomes self-documenting evidence of your size targets and catches regressions automatically.
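If you want budgets to track what users actually download, a variant of the check can measure gzipped size with Node's built-in zlib. A sketch, with illustrative values:

```typescript
// Gzip-aware size measurement using only the Node standard library.
import { gzipSync } from 'node:zlib'
import { readFileSync } from 'node:fs'

export function gzippedSize(filePath: string): number {
  // Compressed length approximates the bytes Chrome downloads on update.
  return gzipSync(readFileSync(filePath)).length
}

// Repetitive bundles compress far below their raw size, which is why
// gzip budgets reflect real transfer cost better than raw-byte budgets.
const raw = Buffer.from('export const noop = () => {}\n'.repeat(500))
console.log(`raw ${raw.length} B, gzipped ${gzipSync(raw).length} B`)
```

Swap `gzippedSize` in for `statSync(...).size` in the budget loop and adjust the thresholds downward accordingly.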
A more sophisticated approach uses bundlesize or size-limit packages, which support gzip-aware comparisons and can post size reports as pull request comments via GitHub Actions.
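For reference, size-limit is configured in package.json; this sketch assumes the same entry names and budgets used above:

```json
{
  "size-limit": [
    { "path": "dist/content.js", "limit": "50 KB" },
    { "path": "dist/background.js", "limit": "30 KB" },
    { "path": "dist/popup.js", "limit": "60 KB" }
  ]
}
```

Run it with `npx size-limit` after a build.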
Putting It Together
Bundle optimization for Chrome extensions is not a one-time task — it is a habit applied continuously as your extension grows. The highest-leverage moves, in order:
- Switch to Vite with a `target: 'chrome110'` build to eliminate polyfill overhead immediately
- Audit your top five dependencies with the bundle visualizer and replace CommonJS libraries with ES module alternatives
- Split your entry points explicitly and verify that content script chunks do not contain popup-only code
- Add dynamic imports to options pages with multiple sections
- Set size budgets in CI before the next sprint begins
Extensions that stay lean load faster, review faster, and give users a better first impression. The tooling to achieve this is mature, the techniques are well-understood, and the payoff in install size and runtime performance is measurable within a single build cycle.