Website Frontend Optimization
A website is generally divided into two parts: the front end and the back end. The back end implements the site's functionality, such as user registration or letting users comment on articles. The front end, then, is how that functionality is presented, and most of what shapes the user experience comes from the front-end pages.
And what is a website for, if not to be visited by its target audience? So the front end is the part that actually touches users. Besides performance optimization on the back end, the front-end pages need performance work of their own; only then can we deliver a better experience to our users. It is a bit like the old question of whether men only look at appearance when choosing a girlfriend. One wit answered: her face and figure decide whether I want to understand her thoughts, and her thoughts decide whether I will cast a one-vote veto against her face and figure. The same is true of websites: the front-end experience determines whether users want to try the site's features, and the features determine whether users will ultimately veto the site despite its front-end experience.
Not only that: a well-optimized front end saves the business money on servers and bandwidth, and the improved experience brings in more users. So, like the girlfriend in the example above, a website must excel both inside and out (￣▽￣). Having said all that, how should we optimize the performance of our front-end pages?
Generally speaking, the web front end covers everything before the site's business logic: browser loading, the site's view layer, image services, CDN services, and so on. The main optimization approaches include browser access optimization, use of reverse proxies, CDNs, and so on.
Browser access optimization
1. Reduce HTTP requests and configure HTTP caching sensibly
HTTP is a stateless application-layer protocol, which means every request must establish a communication link and perform its own data transfer, and on the server side each request occupies an independent thread. This communication and service overhead is expensive, so reducing the number of HTTP requests effectively improves access performance.
Caching is powerful, and proper cache settings can greatly reduce HTTP requests. Suppose a site's homepage, with a cold browser cache, sends 78 requests totaling over 600 KB of data; on a second visit, with the browser cache warm, it sends only 10 requests totaling just over 20 KB. (One caveat: refreshing directly with F5 behaves differently. In that case the request count stays the same, but for cached resources the server returns a 304 response with headers only and no body, which still saves bandwidth.)
What counts as a reasonable setting? The principle is simple: the more you cache, the better, and the longer you cache, the better. For example, image resources that rarely change can be given a far-future expiration directly via the Expires header; resources that change infrequently but may change can use Last-Modified for request revalidation. Try to keep resources in the cache for as long as possible. The specific settings and principles of HTTP caching are not detailed here.
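As a minimal sketch of the two mechanisms just mentioned, the helpers below (hypothetical names, not a real library API) build far-future cache headers and decide whether a conditional request can be answered with a body-less 304:

```javascript
// Build the Expires / Cache-Control / Last-Modified headers for a
// long-lived static resource. Illustrative sketch, not a framework API.
function buildCacheHeaders(lastModified, maxAgeSeconds) {
  return {
    'Cache-Control': `max-age=${maxAgeSeconds}`,
    'Expires': new Date(Date.now() + maxAgeSeconds * 1000).toUTCString(),
    'Last-Modified': lastModified.toUTCString(),
  };
}

// Last-Modified revalidation: if the resource has not changed since the
// browser's If-Modified-Since date, answer 304 (headers only, no body);
// otherwise a full 200 response is needed.
function revalidate(ifModifiedSince, lastModified) {
  if (!ifModifiedSince) return 200;
  return lastModified <= new Date(ifModifiedSince) ? 304 : 200;
}
```

A real server would also need to handle malformed dates and `Cache-Control: no-cache` requests, but the 304 path is what saves the bandwidth described above.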
Use browser cache
Websites that rely on browser caching should roll out static-resource updates gradually. If, say, 10 icon files need updating, it is unwise to push all 10 at once; instead update them one file at a time, with an interval between updates. This avoids invalidating large numbers of entries in users' browser caches at once, which would trigger a concentrated wave of cache refreshes, a sudden spike in server load, and network congestion.
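The gradual rollout above can be expressed as a tiny scheduling function. This is a pure sketch; `intervalMs` is an assumed deployment knob, not a standard setting:

```javascript
// Given a list of changed static files, compute when each one should be
// pushed so cache invalidations are spread out over time instead of all
// hitting the servers at once.
function staggerSchedule(files, intervalMs) {
  return files.map((file, i) => ({ file, pushAtMs: i * intervalMs }));
}
```

For example, 10 icons with a 10-minute interval spreads the cache churn over roughly an hour and a half instead of one spike.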
Enable compression
Combining CSS files, and combining images into sprites, is another good way to reduce the number of requests.
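A minimal build-step sketch of the file-combining idea, assuming stylesheet contents are already in memory (the separator comments are only for debugging the merged output):

```javascript
// Merge several small stylesheets into one payload so the page makes a
// single request instead of N. Input: { filename: cssText }.
function combineCss(named) {
  return Object.entries(named)
    .map(([name, css]) => `/* ${name} */\n${css.trim()}`)
    .join('\n');
}
```

Real build tools (bundlers, minifiers) also rewrite relative URLs and minify, but the request-count saving comes from this same concatenation step.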
Lazy load images (I am still not fully familiar with this area)
This strategy does not necessarily reduce the total number of HTTP requests, but it does reduce them under certain conditions, or at least during the initial page load. For images, load only those in the first screen when the page opens, and load the rest as the user scrolls down. That way, if the user is only interested in the first screen's content, the remaining image requests are saved entirely.
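The core decision in scroll-based lazy loading can be sketched as a pure function (so it runs outside a browser); in a real page you would wire this to scroll events, or simply use `IntersectionObserver` or the `loading="lazy"` attribute instead:

```javascript
// An image should start loading once any part of it overlaps the visible
// viewport, optionally extended by a preload margin so the image is ready
// slightly before the user reaches it. All positions are in CSS pixels.
function shouldLoad(imageTop, imageHeight, scrollY, viewportHeight, marginPx = 0) {
  const viewTop = scrollY - marginPx;
  const viewBottom = scrollY + viewportHeight + marginPx;
  return imageTop < viewBottom && imageTop + imageHeight > viewTop;
}
```

On scroll, any image for which `shouldLoad` turns true gets its real `src` assigned (until then it holds a placeholder), which is what defers the HTTP request.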
The browser renders the page only after downloading all the CSS, so best practice is to put CSS at the top of the page and let the browser download it as early as possible. If CSS is placed elsewhere, such as in the BODY, the browser may begin rendering before the CSS has been downloaded and parsed, causing the page to jump from an unstyled state to a styled one and worsening the user experience. So consider putting CSS in the HEAD.
Asynchronous requests and callbacks (that is, split out behavioral scripts and load non-critical content progressively)
Writing <script> blocks directly into the page, as above, hurts performance: it adds to the burden of the initial page load and delays the DOMContentLoaded and window.onload events. If timeliness permits, consider loading such scripts when the DOMContentLoaded event fires, or use setTimeout to control the loading timing flexibly.
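A small sketch of deferred script injection, assuming `analytics.js` as a stand-in for any non-critical script; the guard lets the sketch also run outside a browser, where it simply does nothing:

```javascript
// Inject a <script> tag dynamically so it does not block the initial
// page load or delay DOMContentLoaded / window.onload.
function injectScript(src) {
  if (typeof document === 'undefined') return null; // not in a browser
  const s = document.createElement('script');
  s.src = src;
  s.async = true;
  document.head.appendChild(s);
  return s;
}

// Flexible timing via setTimeout, as the text suggests: give the page
// `delayMs` of breathing room before loading the non-critical script.
function loadLater(src, delayMs) {
  return setTimeout(() => injectScript(src), delayMs);
}
```

In a browser you would typically call `loadLater` from a `DOMContentLoaded` listener so the deferral starts counting only after the critical content is parsed.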
Reduce cookie transmission
On the one hand, a cookie is included in every request and response, and an oversized cookie seriously slows data transfer, so think carefully about which data truly needs to be written into cookies and keep the amount of cookie data as small as possible. On the other hand, for static resources such as CSS and scripts, sending cookies is pointless. Consider serving static resources from a separate domain so that requests for them carry no cookies, reducing cookie transmission.
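A back-of-envelope sketch of the overhead being described: every request to the cookie's domain carries the full `Cookie` header, so even a modest cookie, multiplied across the many static-asset requests of one page view, adds up. The sample cookie values and the request count of 50 are invented for illustration:

```javascript
// Size, in bytes, of the Cookie request header line for a given cookie set.
function cookieHeaderBytes(cookies) {
  const header = Object.entries(cookies)
    .map(([k, v]) => `${k}=${v}`)
    .join('; ');
  return Buffer.byteLength(`Cookie: ${header}\r\n`);
}

const cookies = { session: 'x'.repeat(64), prefs: 'theme=dark' };
const perRequest = cookieHeaderBytes(cookies);
// If one page view triggers 50 static-asset requests on the same domain,
// all of them repeat this header for no benefit:
const wastedPerPage = perRequest * 50;
```

Serving the static assets from a cookie-free domain drops `wastedPerPage` to zero, which is the motivation for the separate-domain advice above.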
a. HTMLCollection (a live, array-like collection of elements)
In scripts, document.images, document.forms, and getElementsByTagName() return collections of type HTMLCollection. In everyday use these are mostly treated as arrays, since they have a length property and support index access to each element. Their access performance, however, is far worse than a real array's. The reason is that the collection is not a static result: it represents a query, and that query is re-executed every time the collection is accessed so the result stays up to date. "Accessing the collection" includes reading its length property and accessing its elements.
Therefore, when you need to traverse an HTMLCollection, convert it to an array first to improve performance. Even if you do not convert it, access it as little as possible; for example, when iterating, save the length property and any needed members into local variables before the loop, then use the local variables.
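Since there is no DOM outside a browser, the sketch below mimics a live HTMLCollection with a counter, so the re-executed query described above becomes observable. The mock and function names are invented for the demonstration:

```javascript
// Mock of a *live* collection: every read of .length or .item() re-runs
// the underlying "query", which is exactly the hidden cost in the text.
function makeLiveCollection(items) {
  let queries = 0;
  return {
    get length() { queries++; return items.length; },
    item(i) { queries++; return items[i]; },
    get queryCount() { return queries; }, // for observation only
  };
}

// Anti-pattern: reads .length on every iteration (N+1 queries for N items).
function countNaive(coll) {
  let n = 0;
  for (let i = 0; i < coll.length; i++) n++;
  return n;
}

// Recommended: copy length into a local variable first (a single query).
function countCached(coll) {
  const len = coll.length;
  let n = 0;
  for (let i = 0; i < len; i++) n++;
  return n;
}
```

With a real HTMLCollection the same pattern applies, and `Array.from(collection)` gives you a static array you can traverse freely.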
b. Reflow & Repaint
Beyond the point above, DOM operations must also take the browser's reflow and repaint into account, since both consume resources.
Use of reverse proxy
A traditional proxy server sits on the user's side: it acts on behalf of the browser, sending its HTTP requests out to the Internet. A reverse proxy server sits in the website's data center: it acts on behalf of the web servers, receiving HTTP requests for them.