Facebook’s Instant Articles Feature Speeds Up Mobile Load Times
[This article was written by Ryan Pelette]
Earlier this month, Facebook made the groundbreaking announcement that it will launch a feature called Instant Articles for iOS users, hosting news content from major media outlets such as The New York Times, NBC News, and National Geographic directly on its own platform. Until now, media outlets have simply shared their stories on their Facebook pages, with users clicking links that take them to the stories on the outlets’ own sites. That practice will continue for the time being alongside Facebook-hosted content, but depending on how this new experiment goes, who knows what the norm will be going forward?
The reasons behind this move are as wide-ranging as the implications, and involve the growing fear within the news industry that traditional forms of consuming content are rapidly becoming outdated by the growth of digital content platforms. With so many different ways for people to consume content now, these outlets have clearly decided that giving theirs away for free and undermining their own subscription models is worth a shot, if only to remain relevant.
However, rather than looking at this as another step in the decline of traditional news media, we suggest looking at it the way we do: as an innovative solution to a technical problem.
It’s no secret that news sites are among the worst offenders against web performance guidelines, with mobile pages often taking far longer to load than those in other industries, largely due to a glut of third-party tags and advertisements designed to increase the organizations’ revenue (another effect of the failure of news subscription models in the digital age).
On Facebook, however, the article will be pre-loaded, thereby eliminating those third-party bottlenecks. The social media giant is already famous for putting performance standards above ad revenue (hardly much of a sacrifice, given that it can monetize data in other ways through its interface), so while news organizations will see a decline in traffic to their own sites, they’ll likely be able to increase the number of eyes on their content. Facebook is already the dominant social media referral source, with more referrals in Q4 than all other social platforms combined.
We went to the Facebook Page for Instant Articles and analyzed the network requests to learn a little bit more about how it works and see how the promise is delivered. The page itself loads different posts including some examples of Instant Articles. That’s nothing new, though it’s important to note that on loading the page, Facebook has already started the work to make the articles feel responsive. This means that Facebook is sending more bytes on the wire to mobile users, which sounds bad at first considering that not all the articles will be read. On the other hand, as we’ll soon see, the amount of bytes saved by not sending users to the external articles more than makes up for it.
So what is Facebook doing behind the scenes when a page containing Instant Articles is loaded? First, an API call is made to get the article itself. This is a JSON object that holds or references all the content required for building the article and is a very efficient method. In this example, we’re analyzing the National Geographic “Quest for a Superbee” article.
Here is an “above the fold” snapshot of the article, which we’ll use as a reference in later parts of our analysis:
Here is the API call:
“FBNativeArticleQuery” is called for article ID “1082085805138404” – the ID for the “Quest for a Superbee” article.
The JSON returned has links for the video thumbnail of the article:
You can also find the body text in the JSON (some lines omitted for brevity):
“text”: “Brother Adam must have known he had become a beekeeper at an unlucky time. It was 1915, and he was a 16-year-old novice at Buckfast Abbey in southwest England. Rapid bee die-offs have been recorded for centuries, but the catastrophe that confronted the young monk was unprecedented. A mysterious disease had wiped out almost every apiary on the Isle of Wight and now was devastating the rest of England. Brother Adam found his hives suddenly vacant, bees crawling beneath them, unable to fly. That year he lost 29 of the abbey\u2019s 45 hives.”,
This includes the title, images, analytics, and everything else, most of which loads in parallel. We noted earlier that the article gets preloaded; to clarify, not all of the requests are issued up front. In fact, the majority of the network requests associated with the article aren’t issued until the user taps the article link to start reading it.
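This eager-metadata, deferred-media pattern can be sketched in a few lines of Python. Everything here is hypothetical: the payload shape, field names, and `InstantArticle` class are illustrative stand-ins loosely modeled on the JSON shown above, not Facebook's actual client code.

```python
import json

# Hypothetical article payload, shaped loosely like the JSON shown above.
ARTICLE_JSON = """
{
  "id": "1082085805138404",
  "title": "Quest for a Superbee",
  "body": [{"text": "Brother Adam must have known he had become a beekeeper at an unlucky time."}],
  "media": [
    {"type": "image", "url": "https://example.com/thumb.jpg"},
    {"type": "video", "url": "https://example.com/clip.mp4"}
  ]
}
"""

class InstantArticle:
    """Parses lightweight metadata eagerly, but defers media fetches until read()."""

    def __init__(self, raw_json):
        data = json.loads(raw_json)
        # Eager: title and body text are available as soon as the feed loads.
        self.article_id = data["id"]
        self.title = data["title"]
        self.body = " ".join(p["text"] for p in data["body"])
        self._media_urls = [m["url"] for m in data["media"]]
        self.fetched_media = []  # stays empty until the user taps the article

    def read(self, fetch=lambda url: f"<bytes of {url}>"):
        # Deferred: heavyweight media requests are only issued on tap.
        self.fetched_media = [fetch(url) for url in self._media_urls]
        return self.body

article = InstantArticle(ARTICLE_JSON)
assert article.fetched_media == []      # nothing heavy downloaded at feed-load time
article.read()
assert len(article.fetched_media) == 2  # media fetched only when the article is opened
```

The design tradeoff matches what we observed on the wire: a little extra JSON up front in exchange for skipping the media requests entirely for articles the user never opens.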
Now let’s compare the Instant Article to its external counterpart on the National Geographic website, which is what users are linked to when reading the article anywhere other than Facebook for iOS. While the page loads relatively quickly in the small number of samples we took (DomContentLoaded in 500ms and onload in 2s on average), it issues 233 requests totaling 2MB overall.
Since there is no onload or DomContentLoaded equivalent in iOS, a complete apples-to-apples comparison is not possible. What’s relevant is that since the article is preloaded, it appears to load instantly in the eyes of the end user (assuming the Facebook Page containing the article is itself loaded). Therefore, it’s fair to say the article has the equivalent of 0ms load time, and in most tests we performed, no single request exceeded 300ms. When the Facebook Page containing the article is loaded, the user is likely to see the article thumbnail relatively quickly, and the risk of requesting from a slower external source is eliminated since the contents load from Facebook’s servers, to which the user is already connected.
Regarding the amount of data delivered, the JSON response for building the article was 22.46 KB. In addition to that, 117 requests were issued totaling 1.28 MB, including HTTP headers and request/response bodies.
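A quick back-of-envelope calculation with the figures above shows roughly how much less data the Instant Article version transfers. This is only a rough sketch (it assumes 1 MB = 1024 KB, and, as noted below, the two versions are not media-for-media equivalent), but it illustrates the scale of the difference:

```python
# Rough byte comparison between the two versions, using the measured numbers.
# Assumes 1 MB = 1024 KB; both measurements include HTTP headers and bodies.
native_kb = 2 * 1024              # external National Geographic page: ~2 MB over 233 requests
instant_kb = 22.46 + 1.28 * 1024  # article JSON (22.46 KB) + 117 requests (1.28 MB)

saved_kb = native_kb - instant_kb
saved_pct = 100 * saved_kb / native_kb
print(f"Instant Article transfers ~{saved_kb:.0f} KB less (~{saved_pct:.0f}% savings)")
```

On these numbers the Instant Article moves roughly a third less data, even while (as discussed next) embedding considerably richer media.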
Another factor that presents obstacles when it comes to making a direct comparison between the two versions is the amount of media within the article; the native version on the National Geographic site is less rich in media. For example, the Instant Article features various mp3 audio files, a video, and many more images embedded directly in the article. While the native article on National Geographic’s website includes some of the same content, it is all linked to in different pages, with the exception of four thumbnail images and a large banner image.
Since we tested through a Wi-Fi network in order to use a proxy for sniffing the traffic, it is unclear whether any of this traffic is reduced through network-aware development, so Facebook may send less content over the wire on 3G and 4G connections. However, aside from possible optimizations around bandwidth consumption, Instant Articles are a great improvement in mobile user experience.
Published at DZone with permission of Mehdi Daoudi, DZone MVB. See the original article here.