How Facebook Jacked Up JavaScript Performance
A few months ago we learned about HipHop, Facebook's high-performance PHP compiler, which is now open source. But how did Facebook tackle JavaScript performance as its applications and interface complexity grew dramatically? At JSConf this year, Facebook software engineer Makinde Adeagbo walked the audience through Facebook's JavaScript optimization efforts.
Some users probably noticed that Facebook's page load times had been getting longer and longer. Adeagbo said that in 2009 Facebook got very serious about addressing the issue. Back in 2006, the web 2.0 giant had set a goal of page loads under 100 ms. That goal slipped to one second, and by mid-2009 pages were taking as long as five seconds to load. HipHop addressed PHP, but the JavaScript code base had grown even larger by this point, so Facebook's developers also needed a JavaScript solution to get page loads under control.
The new goal for the end of 2009 was 2.5 seconds. One of the big problems was full-page refreshes and scripts that blocked rendering. First, the team exploited their knowledge of the most common interaction pattern:
- The user clicks
- Send an asynchronous request
- Insert or replace some content
Next, they set up elements similar to this one:
<a href="/ring.php" rel="dialog">...</a>
Then a standard listener routine is used to "hijack" these elements, allowing Facebook to update only the sections of a page that actually need updating. The technique has two steps:
- First, build an old-fashioned website that uses hyperlinks and forms to pass information to the server. The server returns whole new pages with each request.
- Now, use JavaScript to intercept those links and form submissions and pass the information via XMLHttpRequest instead. You can then select which parts of the page need to be updated instead of updating the whole page.
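A minimal sketch of such a hijacking listener, under stated assumptions: the `shouldHijack` helper, the `rel="dialog"` opt-in marker (taken from the example element above), and the `#content` container are all hypothetical names for illustration; Facebook's real code differs.

```javascript
// Decide whether a link opts in to hijacking via its rel attribute.
// "dialog" is the marker from the example element; the full set of
// values Facebook recognized is an assumption here.
function shouldHijack(rel) {
  return rel === 'dialog';
}

// Browser-only wiring: one delegated listener covers every link on the page.
if (typeof document !== 'undefined') {
  document.addEventListener('click', function (event) {
    var link = event.target.closest('a');
    if (!link || !shouldHijack(link.getAttribute('rel'))) return;
    event.preventDefault(); // stop the full-page navigation

    // Fetch the same URL asynchronously and swap in only the returned fragment.
    var xhr = new XMLHttpRequest();
    xhr.open('GET', link.getAttribute('href'));
    xhr.onload = function () {
      document.getElementById('content').innerHTML = xhr.responseText;
    };
    xhr.send();
  });
}
```

Because the plain links still work without JavaScript, the site degrades gracefully: the hijacking is purely an optimization layered on top of a conventional website.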
These updates let all the little controls you see on Facebook ("Like", "Comments", "Delete", etc.) refresh asynchronously, making load times faster and giving the site more real-time interactivity. Previously, asynchronous requests carried very little information; now the requests include content. Since 2009, Facebook has been running without full page refreshes by dynamically swapping content and updating the fragment ID (page transitions). Just before the end of 2009, Adeagbo says, Facebook met its goal of 2.5 seconds per page load.
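The fragment-ID transitions can be sketched roughly as follows. The `pathFromFragment` helper and the `#!` prefix are assumptions for illustration; Facebook's actual URL scheme and event wiring (browsers of that era lacked `hashchange` and required polling) differed.

```javascript
// Map a fragment like "#!/profile.php" to the path to fetch asynchronously.
// Returns null for fragments that are not page transitions.
function pathFromFragment(hash) {
  return hash && hash.indexOf('#!') === 0 ? hash.slice(2) : null;
}

// Browser-only wiring: refresh the content area whenever the fragment changes,
// so back/forward buttons and bookmarks keep working without full page loads.
if (typeof window !== 'undefined') {
  window.addEventListener('hashchange', function () {
    var path = pathFromFragment(window.location.hash);
    if (path === null) return;
    var xhr = new XMLHttpRequest();
    xhr.open('GET', path);
    xhr.onload = function () {
      document.getElementById('content').innerHTML = xhr.responseText;
    };
    xhr.send();
  });
}
```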
When building for the web, developers should remember that a feature is not worth implementing unless it justifies its performance cost, especially on sites as complex as Facebook. Adeagbo shared a quote that makes the point:
"Adding a single line to this file requires great internal reflection and thought. You must ask yourself if your one line addition is so important, so critical to the success of the company, that it warrants a slowdown for every user on every page load. Adding a single letter here could cost thousands of man-hours around the world.
That is all."
Adeagbo's presentation was recorded in Ajaxian's live-blogging notes (http://ajaxian.com/archives/facebook-javascript-jsconf), and the presentation slides are on SlideShare (http://www.slideshare.net/makinde/javascript-primer).