
How we improved page load speed for a Next.js ecommerce website by 50%

How to stop the decline in your ecommerce website's performance indicators and optimise page load performance.

Sergei Pestov

Lead front-end developer

Introduction

For one of our sprints we decided to focus on optimising page load performance – here is the story of how it all worked out.

Here is a short summary of the results:

Not all indicators have reached the green zone yet, but many have improved significantly. Further work is planned; below, we explain the steps we took to achieve the current result.

TS-Prune

First, we use ts-prune, a tool that quickly finds dead exports and unused code in a project.
Here's an example command for package.json:

"deadcode-check": "npx --yes ts-"deadcode-check": "npx --yes ts-prune -s \"pages/[**/]?
(_app|_document|_error|index)|store/(index|sagas)|styles/global\""

This yields the following result:

The console output lists the problem areas; we go through them manually and remove the dead code.

Depcheck

Another package we use for optimization, depcheck, generates a list of unused npm packages. We then go through the list manually and remove everything unnecessary, reducing the project's weight and keeping the dependencies tidy.

Depcheck needs to be run several times: after each run and removal of unnecessary libraries, new redundant dependencies may surface. For example, a library removed in the first run may have been the only consumer of another library, which is then identified as a dead dependency in the second run.
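Depcheck is run from the command line; a minimal package.json script for it could look like this (the script name is just an illustration):

"deps-check": "npx --yes depcheck"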

Finding Duplicated npm Packages

We use a plugin that analyses the project and lists duplicate packages installed in different versions. Here is an example of its output, using one of our projects as an example:
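Assuming the plugin in question is duplicate-package-checker-webpack-plugin (the text itself does not name it), a minimal sketch of wiring it into next.config.js looks roughly like this:

// next.config.js
const DuplicatePackageCheckerPlugin = require('duplicate-package-checker-webpack-plugin');

module.exports = {
  webpack: (config) => {
    // Reports duplicate packages (and which dependencies pull them in) at build time
    config.plugins.push(new DuplicatePackageCheckerPlugin({ verbose: true }));
    return config;
  },
};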

However, the algorithm for further actions may depend on the types of results. For example, consider the following output:

Here, it's clear that the packages need to be updated. By updating the dependencies, you can reduce the bundle's size. Another example:

In this case, updates are also required. If it's not clear what and where to update, you can add an alias in the webpack settings. This tells the bundler which version to use and where to resolve it from. Using Redux as an example:
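A minimal sketch of such an alias, assuming the duplicated package is redux and the desired copy sits in the root node_modules:

// next.config.js
const path = require('path');

module.exports = {
  webpack: (config) => {
    // Every import of "redux" now resolves to the single root copy,
    // so the second version disappears from the bundle.
    config.resolve.alias = {
      ...config.resolve.alias,
      redux: path.resolve(__dirname, 'node_modules/redux'),
    };
    return config;
  },
};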

Image Optimization

Before optimization, the images on the website were not scaled according to the screen size. We replaced all the images with "next/image" and then adjusted the loading priorities: images that enter the viewport first should be loaded with high priority and without lazy loading, i.e. as quickly as possible. This directly affects the site's rendering speed.

Use the "next/image" module for all images.

Introduce modern image formats in place of JPG and PNG. This provides better compression without compromising quality and results in faster loading.

images: { 
formats: ['image/avif', 'image/webp'], 
}

Prioritise the loading of above-the-fold images (those within the viewport when the page initially loads) using the "priority" property.
For vector images, set the "unoptimized" property.

<Image
  src="/static/icons/mainPage/qualityControl.svg"
  width={40}
  height={40}
  unoptimized
  priority
  alt="Quality control"
/>

Code Splitting

It turned out that each page's code contained many duplicates. Components that could have been loaded once, cached in the browser, and reused by all pages were instead included in the code of every page, wasting resources on each subsequent page load.

The code duplication occurred because the default code splitting algorithm in Next.js had been disabled. The previous developers did this to make Linaria (a zero-runtime CSS-in-JS library) work. However, disabling code splitting led to a decrease in performance.

As a result, we removed the line that disabled code splitting, which gave an almost 50% increase in performance. The built-in Next.js mechanism for splitting JavaScript into chunks is well tested and recommended for production. There is no need to disable it; otherwise the code is duplicated for every page, even though reusable modules (such as React, React-DOM, and the UI kit) belong in separate (common) chunks and should not be loaded again on each page.

You should avoid doing this:

// next.config.js
webpack: (config, { isServer }) => {
  config.optimization.splitChunks = false; // disables the built-in code splitting
  return config;
}

Removing Blocking Resources

Next, we used WebPageTest, a service for running website performance tests and collecting various metrics. The test revealed that the website's code contained blocking scripts.

The blocking script was modified to load asynchronously and only on those pages where it was necessary. Synchronous loading blocks the download queue. All scripts should be non-blocking.

Example with Aplaut:
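A minimal sketch of loading such a script asynchronously with next/script, only on the pages that need it (the script URL and page path below are placeholders, not the real ones):

// pages/product/[id].tsx
import Script from 'next/script';

// Placeholder URL: substitute the real Aplaut widget script here
const APLAUT_SRC = 'https://cdn.example.com/aplaut-widget.js';

export default function ProductPage() {
  return (
    <>
      {/* Fetched during browser idle time, so it no longer blocks rendering */}
      <Script id="aplaut-widget" src={APLAUT_SRC} strategy="lazyOnload" />
      {/* ...page content... */}
    </>
  );
}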

Compact Builds

Instead of Terser, we use swcMinify to minify JS, saving about 200 KB on a large project.

// next.config.js
module.exports = {
  swcMinify: true,
}

We compile JS only for modern browsers. The list of default browsers in Next can be overridden in your browserslist.

"chrome 61", 
"edge 16", 
"firefox 60", 
"opera 48", 
"safari 11"
// next.config.js 

module.exports = { 
experimental: { 
legacyBrowsers: false, 
}, 
}

Third-Party Scripts

To reduce the Total Blocking Time (TBT), third-party scripts for the website (e.g., Google Analytics) should be connected using next/script and appropriate loading strategies, most often afterInteractive. Here's an example of connecting Google Analytics using next/script.
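A minimal sketch in _app, with a placeholder measurement ID:

// pages/_app.tsx
import Script from 'next/script';
import type { AppProps } from 'next/app';

const GA_ID = 'G-XXXXXXXXXX'; // placeholder measurement ID

export default function MyApp({ Component, pageProps }: AppProps) {
  return (
    <>
      {/* gtag.js loads only after the page becomes interactive,
          keeping it off the critical rendering path */}
      <Script
        src={`https://www.googletagmanager.com/gtag/js?id=${GA_ID}`}
        strategy="afterInteractive"
      />
      <Script id="ga-init" strategy="afterInteractive">
        {`
          window.dataLayer = window.dataLayer || [];
          function gtag(){dataLayer.push(arguments);}
          gtag('js', new Date());
          gtag('config', '${GA_ID}');
        `}
      </Script>
      <Component {...pageProps} />
    </>
  );
}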

Hidden Components

There are components on the site that remain hidden until the user interacts with them. These components are typically found in modal windows and sidebars. For these components, we use Dynamic Imports to reduce the initial amount of JavaScript required to load the page. The JavaScript for loading the hidden component is deferred until the user needs to interact with it.
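A minimal sketch with next/dynamic (the FeedbackModal component and its path are illustrative assumptions):

import dynamic from 'next/dynamic';
import { useState } from 'react';

// The modal's JavaScript is split into its own chunk and fetched
// only when the user actually opens it.
const FeedbackModal = dynamic(() => import('../components/FeedbackModal'), {
  ssr: false,
});

export default function ContactBlock() {
  const [open, setOpen] = useState(false);

  return (
    <>
      <button onClick={() => setOpen(true)}>Leave feedback</button>
      {open && <FeedbackModal onClose={() => setOpen(false)} />}
    </>
  );
}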

Build Acceleration

You can disable ESLint checking in next.config.js. If linting is performed during the commit stage, there's no need to run linting during the build. This nuance doesn't affect the user experience but speeds up the build process.

module.exports = {
  eslint: {
    // Warning: This allows production builds to successfully complete
    // even if your project has ESLint errors.
    ignoreDuringBuilds: true,
  },
}

You can also use the --no-lint option in package.json.

"scripts": { 
"dev": "ts-node server.ts", 
"build": "next build --no-lint", 
"build-analyze": "rimraf .next && cross-env ANALYZE=true next build", 
"start": "cross-env ts-node server.ts", 
"lint": "tsc && eslint **/*.{js,jsx,ts,tsx} --fix" 
}
Conclusion

If work on a project is conducted regularly with frequent releases, consider setting up performance monitoring after each release. This way, you can promptly address any performance declines after introducing new features, which is more convenient than waiting for a critical drop in performance and optimising the entire system.

To stay proactive, we recommend following the performance improvement checklist by Vitaly Friedman. The checklist evolves every year, so the advice and approaches it contains remain relevant and ready to use in projects.

We thank our lead front-end developer, Sergei Pestov, for helping prepare this article!
