High-performance webpack config for front-end delivery

Published Jul 21, 2017 · Last updated Jan 17, 2018
More and more developers are using webpack for easy bundling, but even in 2017, many websites still don’t take advantage of the biggest performance boosts webpack has to offer. webpack ships with more staggeringly powerful built-in optimizations than you may be aware of. Are you utilizing all of them?

In this article, we’ll cover 7 easy webpack optimizations that will serve your app to users faster, no matter what stack you’re using (plus an additional fallback option for when dynamic imports aren’t available). In one example app, starting from no optimization, we were able to compress our entry JS by 700% and leverage intelligent caching for snappy refreshes.

The best part? You can get a faster-loading site in only 15 minutes implementing a few of these tips. You can realistically implement all these tips in about 1 hour.

What this article covers (more 🚀 === more speed):

  1. 🕙 1 min: Scope Hoisting (✨ new in webpack 3 ✨)
  2. 🕙 2 min: Minification and Uglification with UglifyJS2 🚀🚀🚀
  3. 🕙 15 min+: Dynamic Imports for Lazy-loaded Modules 🚀🚀🚀🚀🚀
  4. 🕙 5 min: Deterministic Hashes for better caching 🚀🚀
  5. 🕙 10 min: CommonsChunkPlugin for deduplication and vendor caching 🚀🚀
  6. 🕙 2 min: Offline Plugin for webpack 🚀🚀
  7. 🕙 10 min: webpack Bundle Analyzer 🚀🚀🚀
  8. 🕙 2 min: Multi-entry Automatic CommonsChunkPlugin for special cases where dynamic import isn’t possible 🚀

This article assumes you’re using webpack 3.x (3.x boasts a 98% seamless upgrade from 2.x, and should be a painless upgrade). If you’re still on 1.x, you should upgrade to take advantage of automatic tree-shaking and dead code elimination!

🖥 Code Examples

Working examples of all of these concepts can be found in this webpack optimization sample repo. There will also be links in each section. This is cumulative whenever possible, so features added in one step will be carried over to the next.

1. Scope Hoisting

Est. time: 🕙 1 min

Est. boost: 🤷

webpack 3, released in June 2017, was released with many under-the-hood improvements as well as a more optimized compiling mode called “scope hoisting” that has been saving some people precious kB. To enable scope hoisting, add the following to your production webpack config:

const webpack = require('webpack');

module.exports = {
  plugins: [
    new webpack.optimize.ModuleConcatenationPlugin(),
  ],
};

Jeremy Gayed @tizmagik: 70K => 37K (gzip!) savings on our main bundle using #Webpack 3 RC.2 + ModuleConcatenationPlugin 😲 🔥 Awesome work @TheLarkInn @wSokra et al!

Some have reported a reduction of almost 50% in bundle size, but, in example 1, I didn’t notice any change (it was technically a 6 byte gain for me, which is insignificant). From what it seems, scope hoisting yields the greatest potential for legacy apps with hundreds of modules and not for stripped-down example projects like my test. There seems to be no real downside to using it, so I’d still recommend adding it if it doesn’t raise any errors. The release blog post explains this feature further.
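To see what concatenation actually means, here’s a conceptual sketch (hand-written, not real webpack output) of two tiny modules before and after scope hoisting:

```javascript
// Without scope hoisting, webpack wraps each module in its own function
// closure and resolves imports through a runtime registry, e.g.:
//   { "./a.js": function(module, exports) { /* ... */ }, "./b.js": /* ... */ }
//
// With ModuleConcatenationPlugin, statically analyzable ES modules are
// hoisted into a single scope, roughly as if you had inlined them by hand.

// a.js: export const greet = name => `hello ${name}`;
// b.js: import { greet } from './a'; console.log(greet('webpack'));

// Conceptual concatenated output: one scope, no per-module closures.
const greet = name => `hello ${name}`; // from a.js
console.log(greet('webpack'));         // from b.js; prints "hello webpack"
```

Fewer function wrappers means less boilerplate bytes and less runtime indirection, which is where the reported savings come from.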

Aug 2017 update: further improvements have been added to scope hoisting. This is a feature the webpack core team is serious about, and working to improve even further.

Result: 🤷 It’s a simple add, with no downsides, and potential future payoff. Why not?

🖥 Example 1: Scope hoisting (view code)

2. Minification and Uglification

Est. time: 🕙 2 min

Est. boost: 🚀🚀🚀

Uglification has been a close companion to optimization, and it’s never been easier to take advantage of than with webpack. It’s built right in! Though it’s one of the most accessible optimizations, it’s very easy for a developer in a rush to accidentally forget to uglify when sending assets to production. I personally have seen more than one webpack-bundled site with un-uglified code in production. So in our optimization journey, this is the first check to make sure is in place.

The wrong way

If you’re new to uglification, the main thing to understand is what a difference it makes in file size. Here’s what happens when we run the webpack command (no flags) on our example base app:

                       Asset       Size  Chunks                    Chunk Names
             index.bundle.js    2.46 MB       0  [emitted]  [big]  index

The right way

2.46MB. Ouch! If I inspect that file, it’s full of spaces, line breaks, and gratuitous comments—all things that don’t need to make it to production. In order to fix this, all that’s needed is a simple -p flag:

webpack -p

Let’s see how that -p flag affects our build:

                       Asset       Size  Chunks                    Chunk Names    
             index.bundle.js    1.02 MB       0  [emitted]  [big]  index

1.02MB. That’s a 60% reduction in size, and I didn’t change any code; I only had to type 2 extra keyboard characters! webpack’s -p flag is short for “production” and enables minification and uglification, as well as provides quick enabling of various production improvements.

📝 Note

Despite what it may seem, the -p flag does not set Node’s environment variable to production. When running on your machine, you have to run

NODE_ENV=production PLATFORM=web webpack -p

It’s worth mentioning that some libraries (React and Vue, to name two) are designed to drop development and test features when bundled in production, resulting in smaller file sizes and faster run times. If you run this on the example project, the index bundle actually comes out at 983 kB rather than 1.02 MB. On some other setups, though, there may be no difference in size—it’s up to the libraries being used.
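For context, webpack -p is roughly shorthand for adding two plugins yourself; the DefinePlugin half is what lets libraries like React drop their development-only branches. A sketch of the equivalent config:

```javascript
const webpack = require('webpack');

module.exports = {
  /* ... entry, output, module rules ... */
  plugins: [
    // Replaces process.env.NODE_ENV inside the *bundled* code with
    // "production", so libraries can strip development-only branches
    new webpack.DefinePlugin({
      'process.env.NODE_ENV': JSON.stringify('production'),
    }),
    // Performs the actual minification and uglification
    new webpack.optimize.UglifyJsPlugin(),
  ],
};
```

Running plain webpack against a config like this should produce output comparable to webpack -p.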

💁 Tip

For quick builds, add a "scripts" block to package.json so either you or your server can run npm run build as a shortcut:

"scripts": {
  "build": "webpack -p"
}

Advanced uglification

Uglify’s default setup is good enough for most projects and most people, but if you’re looking to squeeze every little drop of unnecessary code out of your bundles, add a webpack.optimize.UglifyJsPlugin to your production webpack config:

  new webpack.optimize.UglifyJsPlugin({/* options here */}),

For a full list of UglifyJS2 settings, the docs are the most up-to-date references.
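For instance, a more aggressive pass might look like the sketch below. The option names are standard UglifyJS2/plugin options, but treat the combination as a starting point, not a prescription:

```javascript
const webpack = require('webpack');

module.exports = {
  /* ... rest of production config ... */
  plugins: [
    new webpack.optimize.UglifyJsPlugin({
      compress: {
        warnings: false,    // silence dead-code warnings in the build log
        drop_console: true, // strip console.* calls from shipped code
      },
      output: {
        comments: false,    // drop comments (check license obligations first)
      },
      sourceMap: true,      // keep production stack traces debuggable
    }),
  ],
};
```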

⚠️🐌 Build Warning

If you accidentally enable uglification in development, this will significantly slow down webpack build times. It’s best to leave this setting in production-only (see this article for instructions on how to set up production webpack settings).

Result: In our example, we shaved off 60% of file size with default uglification & minification. Not bad!

🖥 Example: No example, but you can try running webpack -p on the base app.

3. Dynamic Imports for Lazy-loaded Modules

Est. time: 🕙 15 min+

Est. boost: 🚀🚀🚀🚀🚀

Dynamic importing is the Crown Jewel of front-end development. The Holy Grail. The Lost Arc. The Temple of Doom —er, scratch that last one; I got caught up naming Indiana Jones movies.

Whatever Harrison Ford movie you compare it to, dynamic, or lazy-loaded imports is a huge deal because it effortlessly achieves one of the central goals of front-end development: load things only when they’re needed, neither sooner, nor later.
In the words of Scott Jehl, “more weight doesn’t mean more wait.” How you deliver your code to users (lazy-loaded) is more important than the sum total code weight.

Let’s measure its impact. This is webpack -p on our starting app:

                       Asset       Size  Chunks                    Chunk Names    
             index.bundle.js    1.02 MB       0  [emitted]  [big]  index

We have 1.02MB of entry JS, which isn’t insignificant. But the crucial problem here is NOT the size. The problem is it’s delivered as one file. That’s bad because all your users must download the whole bundle before they see a thing on screen. We certainly can do better, breaking that file up and allowing paint to happen sooner.

Dynamic import, step 1: Babel setup

Babel and the Dynamic Import Plugin are both requirements to get this working. If you’re not using Babel in your app already, you’ll need it for this entire feature. For first time setups, install Babel:

yarn add babel-loader babel-core babel-preset-env

and update your webpack.config.js file to allow Babel to handle your JS files:

module: {
  rules: [
    {
      test: /\.js$/,
      use: 'babel-loader',
      exclude: /node_modules/,
    },
  ],
},

Once that’s set up, to get dynamic imports working, install the Babel plugin:

yarn add babel-plugin-syntax-dynamic-import

and then enable the plugin by modifying or creating a .babelrc file in your project root:

{
  "presets": ["env"],
  "plugins": ["syntax-dynamic-import", "transform-react-jsx"]
}

Some prefer to add Babel options within webpack so that everything lives in the same file. I simply prefer a separate .babelrc file for a cleaner, reduced webpack config. Either way works.

💁 Tip

In case you’re used to seeing es2015 as the Babel preset rather than env, consider switching. env is a simpler config that can automatically transpile features based on your browserslist (familiar to users of Autoprefixer for CSS).
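As a sketch (the browser targets below are just an illustration), the loader rule from earlier could pass env a browserslist inline:

```javascript
// webpack.config.js (relevant rule only)
module.exports = {
  module: {
    rules: [
      {
        test: /\.js$/,
        exclude: /node_modules/,
        use: {
          loader: 'babel-loader',
          options: {
            presets: [
              ['env', {
                // Illustrative targets; tune to your real audience
                targets: { browsers: ['last 2 versions', '> 1%'] },
                modules: false, // keep ES modules so webpack can tree-shake
              }],
            ],
          },
        },
      },
    ],
  },
};
```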

Dynamic import, step 2: import()

After we’ve got Babel set up, we’ll tell our app what and where we want to lazy-load. That’s as simple as replacing:

import Home from './components/Home';

with:

const Home = import('./components/Home');

Doesn’t look much different, does it? If you’ve ever seen require.ensure before, note that it’s now deprecated in favor of import().
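The other thing to internalize is that import() returns a Promise for the module’s exports. The snippet below fakes that contract with a plain Promise, no webpack involved, just to show the shape your code deals with:

```javascript
// Stand-in for what import('./views/home') resolves to:
// an object whose default export lives on `.default`.
const fakeImport = () => Promise.resolve({ default: () => '<Home />' });

// webpack caches a chunk after its first load; mimic that with a variable.
let cached = null;
function loadHome() {
  return cached || (cached = fakeImport());
}

loadHome().then(mod => {
  console.log(mod.default()); // prints "<Home />" once the "chunk" arrives
});
```

Wrappers like react-code-splitting exist precisely to hide this .then() dance behind a component.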

Some frameworks like Vue already support this out of the box. However, since our example app is using React, we’ll need to add a tiny container called react-code-splitting. This saves us ~7 lines of boilerplate React code per import and handles the rendering lifecycle update for us. But it’s nothing more than syntactic sugar; the core functionality is 100% webpack import(). This is what our app now looks like:

import React from 'react';
import Async from 'react-code-splitting';

const Nav = () => (<Async load={import('./components/Nav')} />);
const Home = () => (<Async load={import('./views/home')} />);
const Countdown = () => (<Async load={import('./views/countdown')} />);

Because webpack turns every import() into a dynamic code split, now let’s see how that affected webpack -p:

                         0.bundle.js     222 kB       0  [emitted]         
                         1.bundle.js     533 kB       1  [emitted]  [big]  
                         2.bundle.js    1.41 kB       2  [emitted]         
                     index.bundle.js     229 kB       3  [emitted]         index

It reduced our core entry index.bundle.js file from 1.02 MB to 229 kB, an 80% reduction in size! This is significant because it’s our entry file: before, painting couldn’t happen until that 1.02 MB had downloaded, parsed, and executed. Now we only need about 20% of the original code to start. And this applies to every page on the site! This doesn’t technically translate to a 5× faster paint—the calculation is more complicated than that—but it’s nonetheless an amazing speed boost for little time investment.

Surely this can’t be it, there has to be more setup! you may be thinking. You’d be wrong!

In our example, index.bundle.js entry file didn’t change its name, so that’s still the only <script> tag needed. webpack handles all the rest for us (though you may run into a polyfill issue if you need to support a browser that doesn’t support Promise)!

Sean Larkinn @TheLarkInn: "Modern UI libraries have code splitting support." @vuejs: "Hold my beer..." #VueJS #vueconf #javascript #webpack #reactjs

Result: We have a paint-able app at only 20% of the original bundle size. Dynamic imports are arguably the single best optimization you can make on the front end, provided you’re using client-side routing for maximum effectiveness.

🖥 Example 3: Dynamic Import (view code)

4. Deterministic Hashes for Caching

Est. time: 🕙 5 min

Est. boost: 🚀🚀

Caching is solely a benefit to returning users and doesn’t affect that critical first experience. By default, webpack doesn’t generate hashed file names (e.g.: app.8087f8d9fed812132141.js), which means everything stays cached and your updates may not be making it to users. This can break the experience and cause frustration.

The quickest way to add hashes in webpack is:

output: {
  filename: '[name].[hash].js',
},

But there’s a catch: these hashes regenerate on every build, whether the file contents have changed or not. If you’ve hooked up your app to automatically run webpack -p on deploy (which is a great idea), this means every deploy users will have to download all your webpack assets over again, even if you didn’t change a line of code.

We can do better with deterministic hashes, hashes that only change if the file changes.

⚠️🐌 Build Warning

Deterministic hashes will slow down your build times. They’re a great idea for every app, but this just means this config should reside in your production webpack config only.

To start, run yarn add chunk-manifest-webpack-plugin webpack-chunk-hash to add the proper plugins. Then add this to your production config:

const webpack = require('webpack');
const ChunkManifestPlugin = require('chunk-manifest-webpack-plugin');
const WebpackChunkHash = require('webpack-chunk-hash');
const HtmlWebpackPlugin = require('html-webpack-plugin');

/* Shared Dev & Production */

const config = {
  /* … our webpack config up until now */

  plugins: [
    /* other plugins here */

    /* Uncomment to enable automatic HTML generation */
    // new HtmlWebpackPlugin({
    //   inlineManifestWebpackName: 'webpackManifest',
    //   template: require('html-webpack-template'),
    // }),
  ],
};

/* Production */

if (process.env.NODE_ENV === 'production') {
  config.output.filename = '[name].[chunkhash].js';
  config.plugins = [
    ...config.plugins, // ES6 spread operator, available in Node 5+
    new webpack.HashedModuleIdsPlugin(),
    new WebpackChunkHash(),
    new ChunkManifestPlugin({
      filename: 'chunk-manifest.json',
      manifestVariable: 'webpackManifest',
      inlineManifest: true,
    }),
  ];
}

module.exports = config;

Our process.env.NODE_ENV === 'production' conditional will now only apply if we’re in production.

💁 Tip

In the above example, running yarn add html-webpack-plugin html-webpack-template, and uncommenting the commented-out plugin, will get webpack to auto-generate HTML for you. This is great if you’re using a JS library like React to generate markup for you. You can even customize the template if needed.

Running webpack -p, you’ll notice a chunk-manifest.json file that needs to be inlined in the <head> of your document. If you’re not using webpack’s HTML plugin, you’ll need to do that manually:

<script>
  window.webpackManifest = { /* contents of chunk-manifest.json */ };
</script>

There’s also a manifest.js file that will need to be added via a <script> tag as well. Once both are in there, you should be good to go!

Result: Users now get updates as soon as we push them, but only if the file has changed its contents. Caching solved!

🖥 Example 4: Caching with Deterministic Hashes (view code)

5. CommonsChunkPlugin for Vendor Caching

Est. time: 🕙 10 min

Est. boost: 🚀🚀

We’ve taken great care to cache our webpack assets, but let’s take it even further and cache our vendor bundles so users don’t have to download the entire entry file again if we change a single line of code. In order to do that, let’s add a vendor entry item to store our third-party libraries:

module.exports = {
  entry: {
    app: './app.js',
    vendor: ['react', 'react-dom', 'react-router'],
  },
};

When we run webpack -p on it…

                       Asset       Size  Chunks                    Chunk Names
             index.bundle.js     230 kB       3  [emitted]         index
            vendor.bundle.js     173 kB       4  [emitted]         vendor

Unfortunately, our index file is bigger than it should be, and the culprit is webpack bundling React, ReactDOM, and React Router in both index.bundle.js and vendor.bundle.js.

webpack isn’t to blame, though, as it did exactly what we asked. When you specify an entry file, you’re telling webpack you want each output file to be independent and complete; webpack assumes you’ll be serving one or the other, not both at once.

However, we will be serving both at once, which will require just a bit more config. We’ll have to add CommonsChunkPlugin to plugins:

const webpack = require('webpack');

plugins: [
  new webpack.optimize.CommonsChunkPlugin({
    name: 'vendor',
  }),
],

CommonsChunkPlugin is now enabled, and knows to use the vendor entry point as the base for the CommonsChunk. You may have seen a minChunks setting on this plugin that we can omit here because we’ve already told webpack exactly what was going in this bundle.

💁 Tip

CommonsChunkPlugin’s name must match the name of the vendor entry file, otherwise we’re back to square one and duplicating vendor libraries across all entry files.

With all this in place, let’s run webpack -p again on the same app:

                       Asset       Size  Chunks                    Chunk Names
             index.bundle.js    55.7 kB       3  [emitted]         index
            vendor.bundle.js     174 kB       4  [emitted]         vendor

Wouldn’t you know it? Our index file has dropped by roughly the size of our vendor bundle: 174 kB. We’re no longer duplicating code; however, we now must load vendor.bundle.js before index on every page, since it’s a dependency wherever index is required:

<!-- vendor comes first! -->
<script src="vendor.bundle.js"></script>
<script src="index.bundle.js"></script>

Now whenever you update your app code, users will only have to redownload that 55.7 kB entry file, rather than all 174 kB. This is a solid win on any app setup.

💁 Tip

When picking modules for the vendor bundle:

  • Limit it to only modules the entire app uses
  • Also limit it to less-frequently-updated dependencies (remember: if one vendor lib updates, the whole bundle will re-download)
  • Only load commonly-used submodules. For example, if the app uses 'rxjs/Observable' frequently, but 'rxjs/Scheduler' rarely, then only load the former. And whatever you do, don’t load all of 'rxjs'!
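Putting those guidelines together, a cherry-picked vendor entry might look something like this (the module list is illustrative, not a recommendation):

```javascript
module.exports = {
  entry: {
    app: './app.js',
    vendor: [
      'react',           // used by the entire app
      'react-dom',
      'react-router',
      'rxjs/Observable', // cherry-picked submodule, NOT all of 'rxjs'
    ],
  },
};
```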

Result: Like any caching effort, this caters mostly to returning visitors. If you have a frequently-referenced site or service, this is absolutely essential.

🖥 Example 5: Commons Chunk Plugin (view code)

6. Offline Plugin for webpack

Est. time: 🕙 2 min

Est. boost: 🚀🚀

Have you ever visited a site on your mobile phone on spotty data, and either accidentally triggered a refresh or had the site itself trigger one? Much of that frustration could be avoided if a site that had already fully loaded cached itself better. Fortunately, there’s a webpack plugin that’s become a staple in the PWA community: OfflinePlugin. By adding only a couple of lines to your webpack config, your users can view your website even while offline!

Install it:

yarn add offline-plugin

Add it to your webpack config:

const OfflinePlugin = require('offline-plugin');

module.exports = {
  entry: {
    // Adding to vendor is recommended, but optional
    vendor: ['offline-plugin/runtime', /* … */],
  },
  plugins: [
    new OfflinePlugin({
      AppCache: false,
      ServiceWorker: { events: true },
    }),
  ],
};

And, somewhere in your app (preferably in your entry file, before rendering code):

/* index.js */

if (process.env.NODE_ENV === 'production') {
  const runtime = require('offline-plugin/runtime');
  runtime.install({
    onUpdateReady() {
      runtime.applyUpdate(); // a new version was downloaded; activate it
    },
    onUpdated() {
      window.location.reload(); // reload so users get the updated assets
    },
  });
}

Overall, it’s a simple addition to any app that can mean a significantly better experience for users like subway riders rapidly dropping in and out of service. For more information about the benefits, and how to configure it for your setup, see the documentation.

Result: We checked off that pesky “Respond 200 when offline” requirement in Lighthouse in only minutes.

🖥 Example 6: Offline Plugin (view code)

7. webpack Bundle Analyzer

Est. time: 🕙 10 min

Est. boost: 🚀🚀🚀

Out of all the options we’ll cover for optimizing your build, this is by far the least automatic, but also helps catch careless mistakes that the automatic optimizations will skip over. In some regard, you start optimizing here because how else can you optimize your bundle if you don’t understand it? To add webpack Bundle Analyzer, run yarn add --dev webpack-bundle-analyzer in your repo, and add it to your development webpack config only:

const BundleAnalyzerPlugin = require('webpack-bundle-analyzer').BundleAnalyzerPlugin;

const config = { /* shared webpack config */ };

if (process.env.NODE_ENV !== 'production' && process.env.NODE_ENV !== 'test') {
  config.plugins = [
    ...(config.plugins || []),
    new BundleAnalyzerPlugin(),
  ];
}

module.exports = config;
Pay attention to the .BundleAnalyzerPlugin property at the end of require('webpack-bundle-analyzer').BundleAnalyzerPlugin—this named export is easy to miss.

Run the following to run the bundle analyzer at localhost:8888:

node_modules/.bin/webpack --profile --json > stats.json


Here, you can see the breakdown of your entire app, byte by byte. Look closely at the moment section in the analyzer output from example 7: there are quite a few languages bundled! All of them, to be exact. While internationalization is a wonderful thing, we’re not ready for it yet in our sample app, so there’s no good reason to serve unused languages to the client.

                          Asset       Size  Chunks                    Chunk Names
      0.cc206a4187c30a32c54e.js     224 kB       0  [emitted]

We’re using dynamic importing, which is great, but we’re still at 224 kB for our moment chunk. Researching a little bit, I found this solution that allowed me to only use the locales I needed.
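If you’d rather solve it in webpack config than in app code, one common approach (a sketch to adapt, not necessarily the exact fix used in the example) is webpack’s IgnorePlugin or ContextReplacementPlugin, which stop moment’s dynamic locale require from dragging in every language:

```javascript
const webpack = require('webpack');

module.exports = {
  plugins: [
    // Option A: drop all bundled locales (moment falls back to English)
    new webpack.IgnorePlugin(/^\.\/locale$/, /moment$/),

    // Option B: keep only the locales you actually serve
    // new webpack.ContextReplacementPlugin(/moment[\/\\]locale$/, /en|es/),
  ],
};
```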


According to the bundle analyzer, it’s looking a lot smaller! But let’s see how our final bundle performed:

                           Asset       Size  Chunks                    Chunk Names
       0.4108c847bef03ae9e840.js    62.7 kB       0  [emitted]         

It’s down by 161 kB. That’s quite a bit of savings! Had we never run the webpack bundle analyzer, we might have never noticed all that bloat in our app, and simply accepted it as dependency weight. You may be surprised how much one simple library swap or one reconfigured webpack line could save on your bundle size!

With webpack Bundle Analyzer, you get some great hints on where to start looking for optimization opportunities. Start at the biggest modules first and work your way down, seeing if there’s anything you can optimize along the way. Can you cherry-pick (e.g., can you use require('rxjs/Observable') instead of require('rxjs'))? Can you replace large libraries with smaller ones (e.g., swap React with Preact)? Are there any modules you can drop entirely? Asking questions like these can often have big payoffs.

Result: We discovered some pretty glaring bloat in our app and were able to save 161 kB on a chunk request. Definitely worth our time.

🖥 Example 7: webpack Bundle Analyzer (view code)

8. Multi-entry Automatic CommonsChunk Plugin

Est. time: 🕙 2 min

Est. boost: 🚀

The last optimization we’ll cover is a technique from the early days of webpack that in my opinion isn’t as needed as it once was (if you disagree, please comment—I’d love to hear how you’re using it). This option should only be pursued if your app meets all the following:

  • Contains many, many entry bundles across the whole app
  • Can’t take advantage of dynamic imports
  • The amount of proprietary code far, far outweighs NPM libraries AND it’s split into ES6 modules

If your app doesn’t meet all these criteria, I’d recommend returning to #3, Dynamic Imports, and #5, CommonsChunkPlugin for vendor caching. If you do meet all the requirements and this is your only option, let’s talk about its pros and cons. First, assume we have the following entries in our app (we’re not using the examples from earlier because this is a very different setup):

module.exports = {
  entry: {
    main: './main.js',
    account: './account.js',
    shop: './shop.js',
  },
};

We can update CommonsChunk to just figure things out automatically:

/* Dev & Production */
new webpack.optimize.CommonsChunkPlugin({
  name: 'commons',
  minChunks: 2,
}),

/* Production */
new webpack.optimize.CommonsChunkPlugin({
  name: 'manifest',
  minChunks: Infinity,
}),

The only setting to really tweak here is minChunks. It determines how many entry bundles a module must appear in before it’s moved into the commons.js file. If a library appeared in only 1 of the 3 entries, it wouldn’t make it in. But as soon as two bundles required it, it would be removed from both and placed in commons.js.
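To make the minChunks rule concrete, here’s a toy simulation of the counting logic (not webpack’s actual implementation):

```javascript
// Toy model: a module moves to commons.js once it appears in
// at least `minChunks` entry bundles.
function splitCommons(entries, minChunks) {
  const counts = {};
  Object.keys(entries).forEach(name => {
    entries[name].forEach(mod => {
      counts[mod] = (counts[mod] || 0) + 1;
    });
  });
  const commons = Object.keys(counts).filter(mod => counts[mod] >= minChunks);
  const bundles = {};
  Object.keys(entries).forEach(name => {
    // Each entry keeps only the modules that did NOT graduate to commons
    bundles[name] = entries[name].filter(mod => commons.indexOf(mod) === -1);
  });
  return { commons, bundles };
}

const result = splitCommons({
  main: ['lodash', 'analytics'],
  account: ['lodash', 'auth'],
  shop: ['cart'],
}, 2);

console.log(result.commons);      // [ 'lodash' ]: shared by 2 of 3 entries
console.log(result.bundles.main); // [ 'analytics' ]
```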

Again, this only works if you’re not taking advantage of dynamic imports. Because with dynamic imports, we were able to intelligently load, on every page, only the code the user needed and nothing more. But with this option, it’s somewhat “dumb” in the sense that it doesn’t know what a user needs per page (bad); it just bundles an average best commons file based on the assumption a user will hit many pages in your app (probably not the case). Further, it’s not truly automatic as you’ll have to test and find the most efficient minChunks setting based on your app’s user flow and architecture.

Result: Not bad, but it’s better to use dynamic importing (#3) and the vendor commons chunk (#5).

Final Example Savings

We’ve covered some powerful new ways you can deliver the same experience to users, much faster. Here’s a breakdown of how many kB we saved at each step in our example app:

Technique                                 Entry size   gzipped   Savings from previous step
Base App                                  2.5 MB       525 kB    —
Scope Hoisting                            2.5 MB       525 kB    —
Uglification of base app via webpack -p   1 MB         266 kB    50–60%
Dynamic import                            229 kB       70 kB     70–75%
Deterministic Hashes                      —            —         —
Commons Chunk Plugin †                    230 kB       71 kB     −1%
webpack Offline Plugin                    —            —         —
webpack Bundle Analyzer ‡                 —            —         —

† The Commons Chunk Plugin technique split one entry file into 2, and the
combined size of both is listed.

‡ The Bundle Analyzer technique saved 161 kB on a particular request. This
is significant savings even if it doesn’t apply to 100% of users.

Were you able to incorporate some techniques into your app? Did your Lighthouse score improve at all? Are you closer to that elusive 1-second paint time? Or were any webpack optimizations missed? Please leave a comment!

Further Reading & Notes

  • Google’s articles on optimization are nothing short of brilliant. I’ve not run across any other resource that demands you load your website in 1 second and then provides excellent suggestions for reaching that goal. If you’re not sure how to improve your application, the RAIL model from that link is the best place to start.
  • Lighthouse, in case you missed the previous mentions in the article, is the current status quo of performance measurements. Lighthouse is very similar to Google’s old PageSpeed ranking, or YSlow, except it’s modernized for cutting-edge web apps in 2017 and holds your app to the latest standards.
  • webpack has made recent improvements to its dynamic import docs, mentioning Vue’s deft handling of import(). It’s exciting to see so much improvement this year!
  • webpack’s AggressiveSplittingPlugin gets an honorable mention here, referenced in an article on webpack & HTTP/2 by the author of webpack. The plugin was originally included in this article, but after testing I found dynamic imports to be a universally better solution, because AggressiveSplittingPlugin removes the option to lazy load and requires 100% of the app to be served upfront. It was designed to tap into HTTP/2’s parallel downloading capabilities, but that small boost will rarely offset the overhead of downloading an entire application versus only the minimal amount needed.
  • 10 things I learned making the fastest website in the world by David Gilbertson sets another high standard for optimization. Unsurprisingly, webpack plays a vital role in achieving his goals.
  • The focus of this article was front-end performance, not build times, but there were still several tips on the subject. If you followed the build tips and are struggling with a slow webpack --watch, try using webpack’s dev server instead. It builds in-memory and doesn’t write to the file system, saving precious time. The general idea with this strategy is to use this for development, and only bundle during deployment, skipping --watch entirely.
Written by Drew Powers