The content of this article was originally written by Grigore Raileanu over at Speed Awareness Month.
Nowadays, complex applications and mission-critical systems demand ever more network speed and capacity. The most demanding sectors are stockbroking, commodities trading, online gaming, video streaming, and VoIP. However, it’s hard to think of any business area that would not benefit from improved network performance.
Latency and throughput are the main factors that define the performance of a network. Whereas throughput is the quantity of data that can pass from source to destination in a given time, latency is the time a single data transaction takes: the round-trip time for a packet to travel from the source to the destination and back.
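The round-trip nature of latency can be made concrete with a small measurement sketch. The snippet below is a rough illustration rather than a precise benchmark (the function name and approach are our own): it times a TCP handshake, which costs roughly one network round trip, so the result is a reasonable proxy for latency to a host.

```python
import socket
import time

def tcp_connect_latency_ms(host: str, port: int = 443, timeout: float = 5.0) -> float:
    """Time a TCP handshake to the given host, in milliseconds.

    A TCP connect takes about one network round trip plus a little
    setup overhead, so this slightly overstates raw packet latency.
    """
    start = time.perf_counter()
    with socket.create_connection((host, port), timeout=timeout):
        pass  # connection established; we only care about the elapsed time
    return (time.perf_counter() - start) * 1000.0
```

In practice you would take the median of several samples, since any single handshake can be skewed by transient queuing or retransmissions.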
Latency and user experience
Latency is one of the key factors in a good user experience. While the focus in user experience so far has been on maximum bitrates, once a certain level of throughput has been achieved, latency often matters more than the throughput, or bitrate, on offer.
Many people know the frustrating feeling of waiting for webpages to load over a slow connection. Consumers are becoming ever more active on the Internet, forming communities to discuss and compare network performance with the aim of getting more from their favorite online activities.
The main question is: how patient will a user be before abandoning a page load? End users’ tolerable waiting time (TWT) is getting shorter. Studies suggest that feedback during the wait, for example percent-done indicators, encourages users to wait longer. A user’s TWT also varies with the task at hand, such as information retrieval, browsing, online shopping, or downloading.
Users can become frustrated by slow websites, which are perceived to have lower credibility and quality – by contrast, faster websites are perceived to be more user-friendly and attractive.
People are becoming more sensitive to latency and it is being increasingly evaluated and discussed. Fast response times matter. With reduced latency, the end user has a better interaction with the web, as webpages download more quickly. Satisfied customers lead to less churn, while fast response times also mean more revenue.
Less latency – more revenue
Low latency increases competitiveness for any service. System latency matters more than raw throughput for many internet-based applications, such as VoIP, multimedia, IPTV, and online gaming.
Real life examples from the Internet show that an increase in latency can directly affect the revenue of an Internet service.
Marissa Mayer, Vice President of Search Product and User Experience at Google, mentioned in a CNET article that when Google increased the number of results per page from 10 to 30, loading time increased from 0.4 to 0.9 seconds. This change decreased traffic and ad revenue by 20%. It turned out that the cause was not the increased number of results as such, but the fact that the page with 30 results was 0.5 seconds slower than the one with 10. “Google consistently returns search results across many billions of webpages and documents in fractions of a second. While this is impressive, is there really a difference between results that come back in 0.05 seconds and results that take 0.25 seconds? Actually, the answer is yes. Any increase in the time it takes Google to return a search result causes the number of search queries to fall. Even very small differences in results speed impact query volume.”
Amazon has also felt the effect of latency on its sales: every 100 ms increase in the load time of Amazon.com decreased sales by one percent.
Tests at Microsoft on Live Search showed a similar pattern: when search results pages were slowed by one second, queries per user declined by 1.0% and ad clicks per user by 1.5%. When the page was slowed by two seconds, queries per user declined by 2.5% and ad clicks per user by 4.4%.
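Figures like these lend themselves to a back-of-the-envelope estimate of what added latency costs. The sketch below assumes an Amazon-style sensitivity of roughly 1% of sales per 100 ms and treats each 100 ms slice as compounding multiplicatively; the function and its parameters are illustrative, not taken from any of the studies above.

```python
def revenue_after_delay(base_revenue: float, added_ms: float,
                        loss_per_100ms: float = 0.01) -> float:
    """Estimate remaining revenue after adding latency.

    Assumes each 100 ms of extra delay multiplies revenue by
    (1 - loss_per_100ms), i.e. the losses compound.
    """
    return base_revenue * (1.0 - loss_per_100ms) ** (added_ms / 100.0)
```

For example, `revenue_after_delay(1_000_000, 500)` estimates what a million-dollar revenue stream shrinks to after half a second of extra load time, under the 1%-per-100 ms assumption.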
Therefore, with low latency, businesses can increase end-user satisfaction and thus reduce churn. For service providers serving latency-sensitive sectors such as gaming, online trading, video streaming, and VoIP, minimizing latency is critical to meeting their customers’ demands. Latency requirements have recently tightened for almost all types of network services, and the list of industries sensitive to latency continues to grow.