Secrets of a Faster Facebook Revealed
By Partho, Gaea News Network, Friday, February 19, 2010
In the recent past, Facebook has been dogged by complaints of sluggishness. A note published today by Facebook's engineering team counters that impression. According to the note, Facebook strives to make its site as responsive as possible: its engineers have run experiments showing that visitors view more pages and get more value out of the site when it runs faster. The team has made every effort to increase speed and to measure its progress, focusing on the three key components that contribute to the performance of a page load: network time, generation time, and render time. Let's delve into Facebook's secrets to learn the methodologies and techniques used to speed up the site.
Three major components
Network time
It’s time that you have to wait while data is being transferred between computer and Facebook. The network time cannot be controlled completely, as some users accessing Facebook have slower connection than others. Improving upon network time depends on 5 main contributors bytes of cookies, HTML, CSS, JavaScript, and images. The Facebook engineers tried to reduce the number of bytes required to load a page, fewer bytes would mean lesser network time.
Generation time
It’s time between webserver receiving a request from the user to the time when user receives a response. The metric measures the efficiency of code and our webserver, caching, database and network hardware. Facebook made an effort to reduce the generation time through cleaner, faster code and constantly improving upon their backend architectures
Render time
This is the time a user's web browser needs to process a response from Facebook and display the resulting web page. Facebook faced some constraints from the performance and behavior of different browsers, but most of this was still under its control. Reducing render time means minimizing the bytes of HTML, CSS, JavaScript, and images, and executing as little JavaScript as possible before showing the page to the user.
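One way to picture "as little JavaScript as possible before showing the page" is to split script work into a critical queue that runs before the page is shown and a deferred queue that runs afterwards. The sketch below simulates this in Node.js; in a browser the deferred queue would flush after the load event, and `setImmediate` merely stands in for that here. All names are illustrative, not Facebook's code:

```javascript
// Sketch: run only the JavaScript needed to show the page, defer the rest.
const log = [];
const criticalQueue = [];
const deferredQueue = [];

function schedule(fn, { critical = false } = {}) {
  (critical ? criticalQueue : deferredQueue).push(fn);
}

function showPage(onDone) {
  criticalQueue.forEach((fn) => fn());   // minimal JS before the page is visible
  log.push('page visible');
  setImmediate(() => {                   // everything else runs afterwards
    deferredQueue.forEach((fn) => fn());
    if (onDone) onDone();
  });
}

schedule(() => log.push('wire up like button'), { critical: true });
schedule(() => log.push('load chat')); // can wait until after first paint
showPage(() => console.log(log.join(' | ')));
// prints: wire up like button | page visible | load chat
```

The point is ordering: the user sees the page before the non-essential script work runs.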
Facebook decided to roll these three metrics up into a single number that gives a high-level sense of how fast the site is. They call this metric Time-to-Interact (TTI). It captures how long the user must wait for the important content of a page to become visible and usable.
Starting in early 2008, Facebook's engineers followed the best practices laid down by pioneers in the web performance field to improve TTI. By June 2009 they had made significant improvements, halving the median render time for users in the United States.
To go further, the engineers reduced the size of cookies, HTML, and CSS, and cut back on JavaScript. They also developed new frameworks and methodologies to let the browser show content to the user as quickly as possible.
One of the notable breakthroughs of the project was improving on the traditional model for loading a web page: the entire process, from sending the request to the server to receiving the requested page, was pipelined. This is how they describe the process:
Wouldn’t it be great if the server could do a little bit of work, say in ten or fifty milliseconds, and then send a partial response back to the browser which can then start downloading JavaScript and CSS or even start displaying some content? Once the server has done some more processing and has produced another bit of output it can send that back to the browser as well. Then we just repeat the process until the server has nothing left to do. We’ve overlapped a significant portion of the generation time with the render time which will reduce the overall TTI experienced by the user.
The system as a whole is called BigPipe, and the logical blocks of content into which a page is broken are called Pagelets. For instance, the homepage newsfeed can be considered one Pagelet and the suggestions box another. BigPipe reduced the TTI of Facebook pages and made the site faster for users.
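A minimal sketch of the BigPipe idea (again in Node.js, and not Facebook's actual implementation, which is PHP/JavaScript): the server flushes a page skeleton immediately, then streams each Pagelet's markup as its data becomes ready, instead of holding the whole response until every query has finished. The Pagelet names and the `insertPagelet` client function are invented for illustration:

```javascript
// Sketch: stream a skeleton first, then flush each pagelet as it is ready.
async function* bigPipe(pagelets) {
  // Early flush: the browser can start downloading CSS/JS right away.
  yield '<html><body><div id="skeleton"></div>';
  for (const [id, render] of pagelets) {
    const html = await render(); // e.g. the newsfeed query completing
    // Each chunk tells the client where to slot the pagelet's markup.
    yield `<script>insertPagelet("${id}", ${JSON.stringify(html)})</script>`;
  }
  yield '</body></html>';
}

(async () => {
  const chunks = [];
  for await (const c of bigPipe([
    ['newsfeed', async () => '<ul>stories</ul>'],
    ['suggestions', async () => '<ul>people</ul>'],
  ])) {
    chunks.push(c); // a real server would flush each chunk to the socket here
  }
  console.log(chunks.length); // 4: skeleton, two pagelets, closing tags
})();
```

This is exactly the overlap described in the quote above: generation of later Pagelets proceeds while the browser is already rendering earlier ones.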
Web engineers serious about making their sites faster can find further insight in Steve Souders' books, High Performance Web Sites and Even Faster Web Sites.
Facebook’s claims of being faster seems an attempt to defend the recent complaints, which read, Facebook is slower than ever. This could be as a result of slower connections or the increasing burden of users Facebook continues to add. It remains a challenge for Facebook to keep up its fastness with respect to its proliferating user base.
Tags: BigPipe, Facebook secrets