Google has released a report of web page statistics, derived from the 4.2 billion web pages it indexes (pages with high PageRank may be weighted more heavily):
* The average web page is 320KB in size (including the scripts, images, and CSS embedded in the page). Note, however, that some websites use robots.txt to block Google's crawlers from fetching their CSS and JavaScript files.
* Only 2/3 of compressible content is actually served compressed. Note also that some websites serve compressed content to real browsers but uncompressed content to Google's crawlers (a quick way to test this yourself is sketched after this list).
* 80% of pages load 10 or more resources from a single host
* Most popular websites fail to combine the scripts and CSS files served from the same host, resulting in roughly 8 unnecessary extra HTTP requests
* Each web page contains 29.39 images on average, adding up to 205.99KB
* Each web page references 7.09 external scripts and 3.22 external stylesheets on average; the scripts average 57.98KB and the stylesheets 18.72KB
* Only 17 million web pages use SSL encryption, accounting for 0.4% of the total
* The average page takes 4.9 seconds to load, with requests going to 49 different sources
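The report doesn't say how the compression statistics were gathered, but you can probe a site's behavior yourself by sending a request that advertises gzip support and inspecting the `Content-Encoding` of the reply. Here is a minimal Go sketch; the `https://example.com/` URL is just a placeholder for whatever page you want to test:

```go
package main

import (
	"fmt"
	"net/http"
	"os"
)

func main() {
	url := "https://example.com/" // placeholder; pass any URL as an argument
	if len(os.Args) > 1 {
		url = os.Args[1]
	}

	req, err := http.NewRequest("GET", url, nil)
	if err != nil {
		panic(err)
	}
	// Advertise gzip support the way a real browser would.
	req.Header.Set("Accept-Encoding", "gzip")

	// DisableCompression stops Go's transport from adding the header
	// itself and transparently decompressing the answer, so we see the
	// server's actual behavior.
	client := &http.Client{Transport: &http.Transport{DisableCompression: true}}
	resp, err := client.Do(req)
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()

	fmt.Printf("%s -> Content-Encoding: %q\n", url, resp.Header.Get("Content-Encoding"))
}
```

An empty `Content-Encoding` on a large HTML page suggests the server is sending it uncompressed despite the client's support for gzip.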
Google's aim is to draw more attention to optimizing page load speed, and it has published a number of guides to help web developers improve efficiency. Its four main suggestions are:
* Compress pages using gzip
* Optimize JavaScript code
* Use HTTP caching
* Combine scripts and CSS files
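Google's guides describe these suggestions rather than prescribing code, but as a rough sketch of how the first, third, and fourth might fit together in practice, here is a small Go server: a middleware that gzip-compresses responses, marks them cacheable, and serves a static directory where merged bundles could live. All names here (`./static`, `all.js`, `all.css`, the port) are assumptions for illustration, not part of Google's report:

```go
package main

import (
	"compress/gzip"
	"net/http"
	"strings"
)

// gzipWriter routes the response body through a gzip stream.
type gzipWriter struct {
	http.ResponseWriter
	gz *gzip.Writer
}

func (g gzipWriter) Write(b []byte) (int, error) { return g.gz.Write(b) }

func (g gzipWriter) WriteHeader(code int) {
	// The body length changes after compression, so drop any
	// Content-Length the inner handler computed.
	g.Header().Del("Content-Length")
	g.ResponseWriter.WriteHeader(code)
}

// gzipCaching wraps a handler: it marks responses as cacheable and
// gzip-compresses the body when the client advertises gzip support.
func gzipCaching(next http.Handler) http.Handler {
	return http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		// HTTP caching: let browsers reuse static assets for a day.
		w.Header().Set("Cache-Control", "public, max-age=86400")
		w.Header().Set("Vary", "Accept-Encoding")

		if !strings.Contains(r.Header.Get("Accept-Encoding"), "gzip") {
			next.ServeHTTP(w, r) // client cannot handle gzip
			return
		}
		w.Header().Set("Content-Encoding", "gzip")
		gz := gzip.NewWriter(w)
		defer gz.Close()
		next.ServeHTTP(gzipWriter{ResponseWriter: w, gz: gz}, r)
	})
}

func main() {
	// Serving a ./static directory pairs well with the last suggestion:
	// merge scripts and stylesheets into single files (e.g. all.js,
	// all.css) so each page triggers far fewer HTTP requests.
	http.Handle("/", gzipCaching(http.FileServer(http.Dir("./static"))))
	http.ListenAndServe(":8080", nil)
}
```

Deleting `Content-Length` before compressing lets Go fall back to chunked transfer encoding, which stays consistent with the now-shorter body.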
Via GOS