The new goal for the end of 2009 had become 2.5 seconds. One of the big problems was whole-page refreshes and scripts that blocked speedy rendering. First, the team exploited their knowledge of the common interaction types:
- User clicks
- Send an async request
- Insert/replace some content
Next, they set up elements similar to this one:
<a href="/ring.php" rel="dialog">...</a>
Then a standard listener routine was used to "hijack" these elements, allowing Facebook to update only the sections of a page that actually needed to change. The two-step technique is outlined in this blog entry:
- First, build an old-fashioned website that uses hyperlinks and forms to pass information to the server. The server returns whole new pages with each request.
- Second, layer JavaScript on top: hijack those links and forms so that, in capable browsers, they send asynchronous requests and update only part of the page instead of fetching a whole new one.
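The hijacking step above can be sketched as a single delegated listener. This is a minimal illustration, not Facebook's actual code; the `shouldHijack` helper and the `#content` target element are assumed names for the sake of the example.

```javascript
// Decide whether a link should be hijacked (pure logic, assumed helper).
function shouldHijack(rel) {
  return rel === 'dialog';
}

// One delegated listener for the whole document (browser-only).
if (typeof document !== 'undefined') {
  document.addEventListener('click', function (event) {
    const link = event.target.closest('a');
    if (!link || !shouldHijack(link.getAttribute('rel'))) return;
    event.preventDefault();       // cancel the full-page navigation
    fetch(link.href)              // send the async request instead
      .then((res) => res.text())
      .then((html) => {
        // Insert/replace only the section that changed (assumed target).
        document.getElementById('content').innerHTML = html;
      });
  });
}
```

Because the plain `<a href="...">` still works without JavaScript, the site degrades gracefully: older browsers get the old-fashioned full-page flow, while capable browsers get the faster partial update.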
These updates let all the little labels you see in Facebook ("Like", "Comments", "Delete", etc.) refresh asynchronously, speeding up load times and making the site feel more real-time. Previously, asynchronous requests carried very little information; now they carry the content itself. Since 2009, Facebook has run without full page refreshes, dynamically swapping content and updating the fragment ID to handle page transitions. Just before the end of 2009, Adeagbo says, Facebook met its goal of 2.5 seconds per page load.
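The fragment-ID transitions mentioned above can be sketched roughly as follows. This is a simplified illustration under assumed names (`pathFromHash`, a `#content` element), not the production implementation; it maps a hash like `#/profile.php` to a path, fetches it asynchronously, and swaps the content in place.

```javascript
// Map a fragment ID like "#/profile.php" to a path to fetch (assumed scheme).
function pathFromHash(hash) {
  return hash.startsWith('#/') ? hash.slice(1) : '/';
}

// When the fragment ID changes, fetch the new page's content
// asynchronously and swap it in -- no full refresh (browser-only).
if (typeof window !== 'undefined') {
  window.addEventListener('hashchange', function () {
    fetch(pathFromHash(window.location.hash))
      .then((res) => res.text())
      .then((html) => {
        document.getElementById('content').innerHTML = html;
      });
  });
}
```

Because only the fragment ID changes, the browser never tears down the page, so scripts, styles, and chrome stay loaded between "page" transitions.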
When building for the web, developers should be aware that it is better NOT to implement a feature unless it is worth risking your site's good performance (especially on a site as complex as Facebook). Adeagbo offers a quote that clarifies this point:
"Adding a single line to this file requires great internal reflection and thought. You must ask yourself if your one line addition is so important, so critical to the success of the company, that it warrants a slowdown for every user on every page load. Adding a single letter here could cost thousands of man-hours around the world.
That is all."