
How To Improve Website Performance In 2025

· 15 min read
Matt Zeunert

Nobody likes waiting for websites to load. But what causes slow website performance, and what can you do about it?

In this article we'll look at 12 things you can do on your website to make it load faster. But first, let's take a look at what website performance is all about and how it's measured.

What is website performance?

Website performance measures how fast your website is. Most importantly, how long does your website take to load?

Sometimes, when a visitor clicks a link on Google or elsewhere, the next page appears almost instantly. Other times, visitors wait a long time for the next page's content to load.

Website performance rendering filmstrip

info

The page load process often consists of multiple stages where page content gradually appears.

Why is website performance important?

Ultimately, a fast website is about providing a better user experience. Visitors who spend less time waiting to see your content will be more engaged, more likely to navigate around your website, and more likely to convert to paying customers.

But Google also uses page speed as a ranking factor. Making your website faster can help you get more organic search engine traffic.

What is a good website speed?

According to Google, a good website loading time is below 2.5 seconds. Load times of up to 4 seconds are still okay, but Google rates anything above that as poor.

We find that in 2025, websites typically load in just under 2 seconds.

How to measure website speed

There are many free tools you can use to measure page load time and find ways to improve it, such as Google PageSpeed Insights, Lighthouse, WebPageTest, and the DebugBear website speed test.

These tools will assign a rating to your website, point out performance metrics you should improve, and provide additional debug data to help you improve your page speed.

DebugBear page speed test result

What is the best metric to assess website performance?

There are many different ways to measure page load time, for example First Contentful Paint or Speed Index. However, for the last few years Google has focused on the Largest Contentful Paint (LCP) metric.

LCP measures how soon after navigating to a page the main content element appears. Specifically, that means the largest element on the page, usually text or an image.

What's the best strategy to improve website performance?

Effective website performance optimization consists of two key strategies:

  1. Using automated tools to identify and test high-impact optimizations
  2. Analyzing request waterfall data to understand why content loads when it does

There are many techniques to make your website faster. Using tools to detect and try out optimizations is often easier than manually checking for ways you can speed up your site.

This article provides an overview of some of the most common causes of poor website performance and what you can do to fix these issues.

Best practices to improve website performance

Fast websites avoid loading unnecessary resources, make sure resources can be loaded quickly, and are built so that only the most critical resources hold back rendering.

Let's look at the most important best practices to speed up your website.

1. Optimize server response time

The first step to loading any website is downloading the HTML document from the server. Improving initial server response time is important, as other resources can't start loading until after the HTML has loaded.

A poor server response time is indicated by a high Time to First Byte metric value.

HTML document request with longer server response time

tip

If there's nothing you can do to speed up response time, consider using 103 Early Hints to start loading other important resources before the HTML is ready.
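As a sketch of what that looks like on the wire, the server sends an interim 103 response listing resources to preload before the final response is ready (the stylesheet URL here is illustrative):

```http
HTTP/1.1 103 Early Hints
Link: </css/main.css>; rel=preload; as=style

HTTP/1.1 200 OK
Content-Type: text/html
```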

Server-side optimizations

To improve server response time you need to profile your backend application code to see what's causing delays.

  • Is the hosting environment not powerful enough?
  • Is a lot of time spent processing data and rendering HTML?
  • Are database queries slow?
  • Is the response delayed by third-party API requests?

Use a Content delivery network

Outside of applying optimizations to your server code, you can also use a Content Delivery Network (CDN) to improve performance. A CDN provides a global network of servers as well as built-in tools to speed up asset delivery.

With a CDN you'll get two key performance benefits:

  • Server connections will take less time to establish, as the CDN has servers close to the visitor
  • CDN caching lets you store some responses on the CDN so they can be served without contacting your origin server

Global server response time test result

tip

Run a global TTFB test to see how the speed of your website varies across the world.

2. Reduce render-blocking resources

Once the HTML document has been loaded, the next step to display the page content is loading other render-blocking resources. That includes CSS stylesheets and some JavaScript code.

Browsers don't display page content until after render-blocking requests are complete. You can see that in the request waterfall below. The filmstrip just shows a blank page until the render-blocking utag.sync.js file has finished loading.

Request waterfall showing a delay caused by render-blocking resources

Defer JavaScript files

HTML <script> tags are render-blocking by default, but usually JavaScript files don't need to block rendering. Instead you can defer running JavaScript code with the defer attribute. Then the page content can render even though the scripts are still loading.

<script src="jquery.js" defer></script>
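If a script doesn't depend on other scripts or on the rest of the page, the async attribute is an alternative to defer. A quick contrast of the two (file names are illustrative):

```html
<!-- defer: downloads in parallel, runs after HTML parsing, preserves script order -->
<script src="app.js" defer></script>

<!-- async: runs as soon as it finishes downloading; only safe for independent scripts -->
<script src="analytics.js" async></script>
```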

3. Optimize image files

While the Largest Contentful Paint element can be a text element, LCP images are typically the cause of poor performance. That's because high-quality images are often large and slow to download.

To find out what the LCP element is on your website, run a test with DebugBear and click on the Largest Contentful Paint metric heading. If the LCP element is an image you can also see additional details for the image request, like the file size and download duration.

LCP element details in DebugBear

You can also see that most of the page load delay comes from the Resource Load Duration component of the LCP metric.

In this case the image is almost 2 megabytes in size, and downloading it takes 4.7 seconds.

LCP download duration

Get more insight into LCP images with CrUX data

If you're wondering if the synthetic page speed test result matches what real users are experiencing, you can check the data Google publishes as part of the Chrome User Experience Report (CrUX).

It includes data on the LCP subparts, which tell you what's really holding back loading the LCP image on your website.

CrUX LCP subpart data

How to reduce image file size

You can take a few concrete steps to optimize your images:

  • Serve modern formats like WebP or AVIF, which compress better than JPEG or PNG
  • Resize images so they aren't larger than the dimensions they're displayed at
  • Compress images to reduce file size with little visible quality loss

Image optimization tools like Squoosh or Optimizilla make it easy to compress your images so they can load more quickly.
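For example, you can serve a modern image format with an automatic fallback using the picture element (the file names here are hypothetical):

```html
<picture>
  <!-- Browsers pick the first format they support -->
  <source srcset="hero.avif" type="image/avif" />
  <source srcset="hero.webp" type="image/webp" />
  <!-- Fallback for browsers without AVIF/WebP support -->
  <img src="hero.jpg" width="1200" height="600" alt="Product hero" />
</picture>
```

Setting explicit width and height attributes also helps avoid layout shifts when the image loads.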

4. Prioritize important content

Every request made by the browser has a request priority from Lowest to Highest. Render-blocking resources are high priority, while for example deferred JavaScript is low priority.

Chrome also uses special heuristics to prioritize image resources: the first 5 images are medium priority, images in the viewport are high priority, and other images are low priority.

Request waterfall with different resource priorities

The screenshot above shows an LCP image with a priority change from Low to High. The red bar on the request waterfall entry indicates when the priority change takes place.

Why does the priority change? It's when Chrome renders the page and realizes that this image is in the viewport.

Loading the LCP image with low priority means that the request will be made later than it should be, and bandwidth may instead be used to load other resources.

Optimize image loading with the fetchpriority attribute

If you know that an image is important you can add the fetchpriority="high" attribute to its <img> tag.

<img
  fetchpriority="high"
  src="https://quickbooks.intuit.com/oidam/intuit/hero_utterwaffle4.jpg"
/>

In DebugBear we can run a page speed experiment for this change to see how it impacts the Largest Contentful Paint score. In this case, the page now loads almost a full second faster.

Impact of high-priority LCP image request on performance

If we take a closer look at the comparison of the request waterfall, we can see that the LCP image request now starts a lot earlier. It also takes over 200 milliseconds less time than before.

Performance impact of fetchpriority high on an image

5. Delay loading unimportant resources

The fewer resources you fetch during the initial page load, the less they compete for bandwidth. As a result, the resources you actually need will download more quickly.

Browsers automatically reduce the priority for less important resources. For example, when you defer a JavaScript file that will reduce its request priority.

You can also use the fetchpriority="low" attribute to mark resources that aren't essential for the initial render.

Lazy-loading images that you know are below the fold with loading="lazy" lets you avoid unnecessary image requests entirely, until they are actually needed to render page content.
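A minimal example, assuming an image that sits far below the fold:

```html
<!-- Only fetched once the visitor scrolls near it -->
<img src="footer-illustration.png" loading="lazy" alt="Footer illustration" />
```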

warning

Be careful not to lazy load all images, as that would also lazy load the LCP image and slow down your website.

6. Ensure key files are discovered early

Sequential request chains are a common cause of poor performance. Instead of loading the document and then immediately loading all other resources necessary to render the page, the first set of requests instead triggers other critical requests.

Here's an example of that: the browser doesn't know about the LCP image until after the CSS stylesheet has finished loading.

Request waterfall showing a CSS stylesheet triggering an image request

On that same page we also find another sequential request chain, this time due to a CSS @import statement.

That further delays the background image: the browser doesn't discover the imported stylesheet until the first stylesheet has loaded, and the imported stylesheet is render-blocking, introducing an additional delay.

CSS import chain

tip

For best performance, all other critical requests should be triggered directly by the document HTML.

What can we do to fix this issue?

Adding this code to the page HTML should have a big positive performance impact:

<link
  rel="preload"
  href="https://www.veeva.com/wp-content/homepage-hero-mobile.jpg"
  as="image"
  fetchpriority="high"
/>
<link
  rel="preload"
  as="style"
  href="https://fonts.googleapis.com/css?family=Roboto:200,300,400,500,700"
/>

If we run this as a DebugBear experiment we can see the before-and-after rendering filmstrips. Both the First Contentful Paint and Largest Contentful Paint metrics are much better now. Content starts to render sooner, and it renders with the background image right away.

Performance impact of preloading resources from later in the request chain

7. Optimize font loading

In addition to optimizing images, you also need to make sure that text shows up immediately after the page starts rendering. This can be tricky, as many websites use web fonts that need to be downloaded first, but there are key steps you can take to improve font performance.

Avoid excessive web font preloading

Preloading fonts is usually a good practice, but it can also make performance worse if you preload too many fonts or if the font files are too large. Browsers can prioritize these fonts over important render-blocking resources.

You can see an example below, where a website preloads over 30 different fonts, some of them over 300 kilobytes in size. As a result, the page renders a lot more slowly.

Excessive font preloading recommendation

tip

Only preload the 2-3 most important fonts on your website. Each font file should be less than 100 kilobytes in size.
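A font preload looks like the snippet below (the file path is illustrative; note that the crossorigin attribute is required for font preloads, even for same-origin files):

```html
<link
  rel="preload"
  as="font"
  type="font/woff2"
  href="/fonts/roboto-regular.woff2"
  crossorigin
/>
```

Combining this with font-display: swap in your @font-face rules keeps text visible while fonts are still loading.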

8. Speed up JavaScript code and CPU processing

Most JavaScript code can be deferred and shouldn't impact the initial load of your website. However, it's not always possible to delay loading all scripts, and running this code can delay the website's rendering process.

The DevTools performance tab can give you a lot of insight into what CPU processing tasks are slowing down your code.

DevTools performance profile showing a long task and a hydrate call

In this example, the JavaScript hydration task is blocking the CPU and causing the page to render more slowly.

tip

If your website is a single page application, check out our guides to React performance, Next.js performance, and Nuxt performance.

How to reduce CPU processing

To speed up CPU tasks on your website you can:

  • Remove or defer JavaScript code that isn't needed during the initial page load
  • Break long tasks into smaller chunks so the browser can render and handle input in between
  • Avoid repeatedly reading and writing the DOM in a loop, which forces extra layout recalculations
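One common technique is splitting a long task into chunks and yielding back to the browser between them. A minimal sketch, where processItem stands in for whatever per-item work your page does:

```html
<script>
  // Process a large array without blocking the main thread for too long
  async function processAll(items, processItem) {
    for (const item of items) {
      processItem(item);
      // Yield to the browser so it can render and respond to input
      await new Promise((resolve) => setTimeout(resolve, 0));
    }
  }
</script>
```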

9. Optimize HTML, CSS, and JavaScript file size

Larger files take longer to download, slowing down your website. You can see that in this request waterfall, where a 354 kilobyte CSS file takes 2.8 seconds to load. During this time, rendering is blocked.

Request waterfall showing a large CSS file

Text files like HTML, CSS, and JavaScript have specific techniques you can apply to reduce their size:

  • Minify code to strip whitespace, comments, and other unneeded characters
  • Enable HTTP compression like gzip or Brotli on your server
  • Remove unused code so visitors only download what the current page needs

One common reason for large code files is images or fonts embedded as Base64 data URLs. DebugBear can help you identify these with the Size Analysis feature: expand the request list in the test result to view the size analysis.

CSS size analysis showing embedded images

10. Cache static content in the browser

When a visitor first comes to your website, the browser cache won't contain any saved content yet, and all resources need to be loaded over the network. However, if you serve website assets with an efficient cache policy, the browser can save them for later visits and navigations on your website.

Servers can indicate that a resource can be cached in the browser using the cache-control HTTP header. In this case the file can be cached for up to one year (31,536,000 seconds).
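For a static asset with a fingerprinted file name, such a header might look like this:

```http
Cache-Control: public, max-age=31536000, immutable
```

The immutable directive additionally tells the browser it never needs to revalidate the file, which works well when file names change whenever their content changes.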

Cache control header for a CSS file

If we then test a second load of the page we can see that the CSS file now loads quickly from the cache and no data needs to be downloaded over the network.

Test result showing a warm load of a website with a cached resource

11. Cache back/forward navigations

Website visits often aren't simple and unidirectional: there are different types of navigations. Visitors reload pages to get up-to-date data, or navigate back to the previous page.

Back/forward navigations should usually feel instant because browsers can store the page in the back/forward cache.

However, sometimes browsers can't restore pages from the cache, either for security reasons or due to technical limitations. Tools like Lighthouse can tell you whether your page prevents back/forward cache restoration, for example due to a cache-control: no-store header.

Back/forward cache audit in Lighthouse

12. Speed up later navigations

After landing on your website, many users will start interacting and navigating around. By prefetching resources that will be loaded later you can provide a faster experience.

One new way to do that is using speculation rules. Speculation rules let you tell the browser when to prefetch different resources on your page. If you know many visitors will go to your login page after opening the pricing page, then you can pre-render the entire login page and achieve an instant navigation.

To set up speculation rules you just need to add a <script> tag with a type="speculationrules" attribute to your HTML.

For example, you can tell the browser to pre-render pages when the user hovers over a link. The preload condition is set by the eagerness attribute – moderate means the page is pre-rendered when a user hovers over the link for 200 milliseconds.

<script type="speculationrules">
  {
    "prerender": [
      {
        "where": { "href_matches": "/*" },
        "eagerness": "moderate"
      }
    ]
  }
</script>

tip

To check how pre-rendering improves performance you can use real user monitoring and compare LCP scores by navigation type.

Pre-render navigations being faster than other navigation types

Core Web Vitals: Going beyond load time

In this post we've looked at the Largest Contentful Paint metric that measures how quickly a page loads. LCP is one of Google's Core Web Vitals that impact rankings. However, there are also two other Core Web Vitals.

Cumulative Layout Shift measures visual stability. Does content stay where it first rendered, or does it move around and disorient users?

Interaction to Next Paint measures how quickly your website responds to user interactions. Do visitors get quick visual feedback, or does the page remain frozen after the interaction?

Improving all three metrics takes you a long way to providing a better overall experience for your visitors.

DebugBear RUM Core Web Vitals dashboard

How to maintain web performance improvements over time

After deploying improvements on your website you need to do two things:

  • Confirm that the optimization actually improves website performance
  • Make sure the improvement is sustained and you don't quickly regress

A website performance monitoring tool can help you achieve both of these goals. You can track different metrics over time and get alerted to performance regressions.

Web performance monitoring dashboard

DebugBear combines synthetic performance tests, CrUX data, and real user monitoring to give you a comprehensive view of your website performance and how you compare to your competition.

CrUX performance dashboard

Illustration of website monitoring

Monitor Page Speed & Core Web Vitals

DebugBear monitoring includes:

  • In-depth Page Speed Reports
  • Automated Recommendations
  • Real User Analytics Data
