Highlights of the HTTP Archive Web Almanac
- Reading time: 8 min
I spent my Sunday reading the HTTP Archive Web Almanac and shared the surprising and interesting pieces in a Twitter thread. I like to own my data, so here's the thread on my own site. Enjoy!
Spending my Sunday morning reading the Web Almanac sharing internet stats and analyzing HTTP Archive data for 2019.
I'll share facts and stats that I think are interesting in a thread. :)
Edited: I initially tweeted that it's 65% because I missed the fact that gzip and brotli should be counted together.
It always feels like React/Vue/Angular are all over the internet. They're not... jQuery still powers 85% of the crawled sites.
The numbers for sites using "cutting-edge" frameworks are relatively low, with React being the most popular at ~5% on desktop.
Even though ES module support is quite good these days, modules are barely used in production.
~1% is surprisingly low, because with the `nomodule` attribute you can ship modules to supporting browsers today and fall back to a single classic bundle everywhere else.
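The fallback strategy could look something like this (the bundle names are just placeholders):

```html
<!-- Modern browsers load the module bundle... -->
<script type="module" src="/js/app.esm.js"></script>
<!-- ...while legacy browsers ignore type="module" and load this instead -->
<script nomodule src="/js/app.legacy.js"></script>
```

Module-supporting browsers skip the `nomodule` script, so each browser downloads only one bundle.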
Only ~20% of sites use source maps?
Roughly 50% of sites use flexbox, but only 2% use grid.
Only 20% of sites make use of responsive images...
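For reference, a responsive image can be as small as a `srcset`/`sizes` pair (file names made up for illustration):

```html
<img
  src="/img/hero-800.jpg"
  srcset="/img/hero-400.jpg 400w, /img/hero-800.jpg 800w, /img/hero-1600.jpg 1600w"
  sizes="(max-width: 600px) 100vw, 800px"
  alt="A hero image showing the product"
/>
```

The browser then picks the smallest candidate that still fills the rendered slot.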
No surprise here, but yeah... image alt attributes are not used as often as they should be. :/
Edited: As Boris Schapira pointed out, images can be hidden from assistive technology by providing an empty alt attribute (`alt=""`). This fact was not taken into consideration by the Almanac and makes the statistic meaningless.
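For completeness, these are the two valid cases:

```html
<!-- Meaningful image: describe it -->
<img src="/img/team.jpg" alt="The team at the 2019 company retreat" />
<!-- Purely decorative image: an empty alt hides it from assistive technology -->
<img src="/img/divider.png" alt="" />
```

Only an image with no alt attribute at all is a genuine accessibility problem.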
26% of the pages use font-display. That's surprisingly high in my opinion, because browser support is not super-duper yet. I wonder how big Google Fonts' influence on this trend is.
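In case you haven't used it: `font-display` lives inside `@font-face` (font name and path are placeholders here):

```css
@font-face {
  font-family: "MyWebFont";
  src: url("/fonts/my-web-font.woff2") format("woff2");
  /* show fallback text immediately, swap in the web font once it loads */
  font-display: swap;
}
```

Google Fonts can add this for you via the `display=swap` query parameter on its stylesheet URLs.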
Honestly, I expected fewer sites to be served over a secure connection. 80% of sites ship with HTTPS these days.
12-14% of sites use HSTS to ensure that supporting browsers only access them via HTTPS. This is also higher than I expected.
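An HSTS response header can be a single line (one-year `max-age` shown here; tune it to your needs):

```http
Strict-Transport-Security: max-age=31536000; includeSubDomains
```

After seeing this header once over HTTPS, the browser refuses to make plain HTTP requests to the site until the max-age expires.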
I looked up this statistic myself recently, but it's still sooooo low.
Only roughly 5% of crawled sites use Content-Security-Policy (CSP).
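A minimal CSP is also just one response header (the CDN origin below is a made-up example):

```http
Content-Security-Policy: default-src 'self'; script-src 'self' https://cdn.example.com; img-src 'self' data:
```

This would block any script that isn't served from your own origin or the allowed CDN, which takes the sting out of many XSS attacks.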
4 of 5 sites ship with color contrast issues. I really wish we'd get better at this. :/
26% of the pages don't specify the language of their content. This can trouble text-to-speech technology like screen readers.
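The fix is a single attribute on the root element:

```html
<html lang="en">
  <!-- screen readers now know which voice/pronunciation rules to use -->
</html>
```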
4 of 5 forms don't ship with labels for their input elements. :/ I'm used to these bad numbers, but well... filling out forms can be tough for everybody (even tech people). We really have to get better at this. :/
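There are two easy ways to associate a label with an input:

```html
<!-- Explicit association via matching for/id -->
<label for="email">Email address</label>
<input type="email" id="email" name="email" />

<!-- Or implicit association by wrapping the input -->
<label>
  Email address
  <input type="email" name="email" />
</label>
```

Both variants also make the label clickable, which enlarges the tap target for free.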
10% of sites ship without headings at all.
Google shows 50-60 characters in its search results. Generally speaking, title lengths are not optimal across the web (at least for Google).
Service workers are mainstream, right? Not really... Only 0.44% of the crawled sites register a service worker.
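Registering one is a few lines of feature-detected JavaScript (`/sw.js` is a placeholder path):

```html
<script>
  // Only register where the API exists; older browsers skip this entirely
  if ("serviceWorker" in navigator) {
    navigator.serviceWorker
      .register("/sw.js")
      .catch((error) => console.error("Service worker registration failed:", error));
  }
</script>
```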
How often do we click the wrong thing because something moved around? Too often... Jumpy pages are the standard. :/
2 of 3 pages have a huge content shift while loading.
CLS stands for Cumulative Layout Shift.
Speaking of tapping the wrong thing: only 34% of the pages include big enough buttons and links...
WordPress usage is still massive. 75% of sites using a CMS are running on WordPress.
CMS pages are heavy and make many requests... I did WordPress development in the past, and that makes sense considering the typical audience and users of e.g. WordPress. "Just install another plugin"...
HTML is mainly served from its origin server (80%). The most used CDN is Cloudflare (10%).
I thought the median page weight would be higher these days. :D On desktop it's 1.9MB and on mobile it's 1.7MB. That's still fairly high, though, imo. (And the median is clearly only one piece of the puzzle.)
And that's it. I highly recommend checking it out! It's a fascinating and interesting read about the state of the internet. :)