Improve web site performance via parallel CSS downloads?


CSS files linked from HTML documents are added to the parallel download queue as the HTML is parsed; the key thing is that non-asynchronous JavaScript links block the HTML parser, preventing later tags from being added to the download queue until that JavaScript is downloaded, parsed and executed.[1]

Here's an example that forces the browser to download three of the four files sequentially (at least three round-trips):

<head>
  <script src="js/fizz.js"></script>
  <link rel="stylesheet" href="css/foo.css"/>
  <script src="js/buzz.js"></script>
  <link rel="stylesheet" href="css/bar.css"/>
</head>

Here's the example reworked so all four files are downloaded in parallel (at least one round-trip):

<head>
  <link rel="stylesheet" href="css/foo.css"/>
  <link rel="stylesheet" href="css/bar.css"/>
  <script async src="js/fizz.js"></script>
  <script src="js/buzz.js"></script>
</head>

Another note: CSS files are (by default) render-blocking, not parser-blocking; the page will continue to be parsed and the DOM constructed, but the render won't begin until the CSSOM is constructed.

The main reason to split your CSS up is to get the minimum rules necessary to render the above-the-fold content delivered to the client and parsed as soon as possible. The rest of the rules, for things that aren't immediately visible, can be marked as not-necessarily-render-blocking with media queries on the link tag, or added to the page by asynchronously loaded JavaScript.
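A minimal sketch of both techniques (the file names here are hypothetical, not from the page above):

<head>
  <!-- Critical above-the-fold rules: render-blocking on purpose -->
  <link rel="stylesheet" href="css/critical.css"/>
  <!-- Only render-blocking while the media query matches -->
  <link rel="stylesheet" href="css/print.css" media="print"/>
  <!-- Non-critical rules injected by script, so they never block the first render -->
  <script>
    var link = document.createElement('link');
    link.rel = 'stylesheet';
    link.href = 'css/below-fold.css';
    document.head.appendChild(link);
  </script>
</head>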

So, there is no clear benefit to parallelizing your CSS downloads just for its own sake. But as always, measure and test!

For further reading, I recommend these articles on 'Web Fundamentals: Optimizing Performance' from Google: https://developers.google.com/web/fundamentals/performance/

[1]: This is ignoring the Speculative Parsing feature of some browsers:

https://docs.google.com/document/d/1JQZXrONw1RrjrdD_Z9jq1ZKsHguh8UVGHY_MZgE63II/preview?hl=en-GB&forcehl=1

https://developer.mozilla.org/en-US/docs/Web/HTML/Optimizing_your_pages_for_speculative_parsing

Author: Abdul Wahid

Updated on September 18, 2022

Comments

  • Abdul Wahid
    Abdul Wahid almost 2 years

    I was optimizing the page load time of a website. One of the approaches was combining multiple HTTP requests for CSS into one combined request. But one of the reviewers asked an interesting question: wouldn't parallelizing the download of multiple CSS files reduce page load times?

    I never considered this option, as the only thing I had read on the internet was that reducing the number of (blocking) HTTP requests is key to a faster web page (although Google PageSpeed Insights doesn't seem to state this clearly).

    I see a few reasons why parallelization would not improve performance, or would matter very little (outweighed by the benefit of using fewer HTTP requests):

    • Setting up a new connection is expensive. While multiple connections can be set up in parallel, browsers will use at most about 4-6 connections per host (depending on the browser), so downloading CSS in parallel would block downloading other assets like JavaScript and images.
    • Setting up an HTTPS connection requires extra handshake data; I've read this can easily be a few KB. That is extra data sent over the wire instead of the CSS we actually want to send.
    • Due to the TCP slow start algorithm, the more data that has been sent over a connection, the faster the connection gets. Longer-lived connections therefore deliver data much faster than new ones; see for example the SPDY protocol, which uses a single connection to improve page load times.
    • TCP is an abstraction: there is (normally) still only one underlying physical link. So even when multiple requests are used, the data sent over the wire may not benefit from multiple connections at all.
    • Internet connections are inherently unreliable, especially on mobile, so one request may finish significantly faster than another. Using multiple requests for CSS means rendering the web page is blocked until the last request has finished, which may be significantly later than the average.

    So, is there any benefit at all in parallelizing HTTP requests for CSS files?

    Note/update: all remaining CSS files are render-blocking; CSS that is not render-blocking has already been moved outside the critical path.

    • Casey Chow
      Casey Chow almost 10 years
      I saw something on Etsy's Code as Craft site where they recommended just using a single cookieless domain for static objects (CSS, images, JS). It turns out modern browsers do a great job with parallel downloads from a single domain. As for multiple files from the same domain (10 CSS files vs 1 big one), I think you are better off combining and minifying them into one large file and making it a single request, especially with CSS, where your site probably needs all the CSS to look correct.
    • Sun
      Sun almost 10 years
      In a way, browsers already download in parallel, based on the number of connections available to them and on the HTML the browser is reading.
  • Gautham Nayak
    Gautham Nayak over 9 years
    Another good read, about how to keep your scripts downloading concurrently, is here: stackoverflow.com/questions/436411/…
  • Chaoley
    Chaoley over 9 years
    It's worth noting that all the .js files should be linked at the end of the code just before the closing body tag, that way they don't block anything.
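    For example (the file name is hypothetical):

    <body>
      <!-- ...page content... -->
      <script src="js/app.js"></script>
    </body>

    On modern browsers, adding the defer attribute to a script in the head achieves a similar effect.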
  • gaurav5430
    gaurav5430 about 5 years
    can you elaborate "So, there is no clear benefit to parallelizing your CSS downloads just for its own sake" ? I somehow could not figure it out from the above answer
  • Hades Black
    Hades Black about 5 years
    @gaurav5430 There doesn't seem to be good reasons for splitting css files, beyond optimizing for the above-the-fold render time (and maybe cache optimization?)