
A Brief History of Web Performance ROI

Tammy Everts provides a wonderful history of web performance ROI reaching back to 2008 and pulling forward to today.


I’ve been researching and writing about the intersection of web performance, user experience, and business metrics for many years now, but it was only recently that I gave some thought to the history of how we got from knowing little to nothing less than ten years ago to where we are today.

While there’s still a lot to be learned, I thought it would be interesting to take an historical look at what we’ve learned over the past few years, the big questions we’ve asked during this journey, and some questions I plan to explore moving forward.

(I also covered this topic in a recent webinar. You can watch the recording here.)

2008: Does Web Performance Even Matter?

The answer to this question seems like a no-brainer today, but it wasn’t back then. 2008 was the inaugural year of the Velocity Conference — the first year that players from a number of companies came together to discuss this very question.

This was also the year that Aberdeen Group released its seminal study into the impact of performance on business metrics. Aberdeen’s findings have been cited hundreds (possibly even thousands) of times in articles, blog posts, slide decks and reports. I’d argue that Velocity and Aberdeen can take much of the credit for pushing performance into the spotlight.

[Image: web performance – impact of a 1-second load time delay]

2009: What Business Metrics Does Web Performance Affect?

Building on the culture of information sharing that was seeded at Velocity 2008, Velocity 2009 was a watershed year for web performance case studies. Companies like Amazon, Google, Shopzilla (now Connexity), Yahoo, and AOL came back to Velocity having done in-house research into the impact of load times/delays on metrics relevant to their businesses.

Some notable case studies that are still being cited today:

  • Shopzilla sped up average page load time from 6 seconds to 1.2 seconds and experienced a 12% increase in revenue and a 25% increase in page views.
  • AOL found that visitors who experienced the fastest page loads (the top 10 percent) viewed 50% more pages than visitors who experienced the slowest (the bottom 10 percent).
  • Microsoft Bing and Google teamed up for a co-presentation on the impact of server delays on user metrics. They found that even small delays (under 500 milliseconds) have a negative effect. They also found that the cost of delay increases over time and persists.

These had a galvanizing effect on our industry. It was apparent to anyone who was paying attention that performance really does touch pretty much every metric the business cares about.

2010: Should We Care About Page Slowdowns as Much As We Care About Downtime?

Outages get a lot of attention, for obvious reasons. When twenty minutes of downtime can mean millions in lost revenue… well, that’s pretty scary. Because of this, it’s sometimes been a challenge to get business owners to care about slow load times as much as they care about downtime.

Fortunately, the folks at TRAC Research came to the rescue with an excellent study into the real impact of outages versus slowdowns. They surveyed more than 300 companies and found that, on average:

  • 4.4 seconds is the average response-time delay at which business performance begins to be affected.
  • $21,000 is the average revenue loss for one hour of downtime.
  • $4,100 is the average revenue loss for an hour of slowdowns.
  • Website slowdowns occur 10 times more frequently than website outages.

In other words, because slowdowns happen ten times more often than outages, their aggregate negative impact on revenue is roughly twice as great: about $4,100 × 10 = $41,000 lost to slowdowns for every $21,000 lost to downtime.

2011: Can We Compare Apples to Apples?

Many performance case studies are makeover stories. We see a “before” version of a slow site, then an “after” of a revved-up version of the same site. The “after” version almost invariably results in happier users and better business metrics.

But these “before and after” stories inevitably have detractors who make some good points. Most sites aren’t static. Perhaps the uptick in business metrics is due to a marketing campaign? Or perhaps some of the credit should go to changes in the user interface? Before and after stories are compelling (after all, most of us love a web performance Cinderella story), but they don’t convince everyone. That’s why we need to compare apples to apples, instead of apples to oranges.

In other words, we need to see the impact of performance changes on pages where all other elements are the same and the only changing variable is load time. But doing this kind of research is difficult.

I feel very lucky to have participated in a somewhat unusual study back when I was with Strangeloop. We had a customer who wanted to see firsthand the impact of page slowdowns on their own site, so they agreed to an 18-week study. For the first 12 weeks, the bulk of their traffic was served an optimized version of the site, while three small (but still statistically significant) cohorts were served pages with either a 200 millisecond, 500 millisecond, or 1000 millisecond HTML delay.

The results were eye-opening. While the 200-millisecond delay had a negligible impact on business metrics, at 500 milliseconds the impact was quite pronounced. And at 1000 milliseconds, we saw an 8.3% increase in bounce rate, a 3.5% hit to conversion rate, a 2.1% decrease in cart size, and a 9.4% drop in page views.

[Image: impact of site speed delays on business metrics]

Not only were business metrics affected during the 12 weeks of the study in which we introduced the HTML delay, they continued to be affected even after we resumed optimizing 100% of site traffic. We continued to monitor the visitors from the first 12 weeks of our study for 6 more weeks, and we found that the percentage of visitors who returned was lower for the cohorts that had received the 500 millisecond and 1000 millisecond delays. In other words, the slower load times these users experienced negatively affected their willingness to return to the site, even after the experiment was over.

[Image: impact of site speed delays on returning visitors]
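For readers curious how a cohort-based delay experiment like the one described above can be wired up, here is a minimal sketch. It is not Strangeloop’s implementation; the Express server, cookie handling, and cohort weights are illustrative assumptions. The essential idea is that each visitor is stickily assigned to a cohort, and that cohort’s HTML responses are delayed by a fixed amount, so load time is the only variable that changes.

```typescript
// Minimal sketch (not Strangeloop's implementation) of a cohort-based HTML
// delay experiment: each visitor is sticky-assigned to a cohort, and HTML
// responses for that cohort get a fixed amount of added server-side latency.
import express from "express";
import { randomInt } from "crypto";

const app = express();

// Hypothetical split: most traffic gets the optimized (undelayed) site;
// three small cohorts get 200 ms, 500 ms, or 1000 ms of added latency.
const COHORTS = [
  { name: "control", delayMs: 0, weight: 94 },
  { name: "delay-200", delayMs: 200, weight: 2 },
  { name: "delay-500", delayMs: 500, weight: 2 },
  { name: "delay-1000", delayMs: 1000, weight: 2 },
];

function pickCohort() {
  let roll = randomInt(100); // 0..99
  for (const c of COHORTS) {
    if (roll < c.weight) return c;
    roll -= c.weight;
  }
  return COHORTS[0];
}

app.use((req, res, next) => {
  // Keep the assignment sticky across the visit via a cookie.
  const assigned =
    COHORTS.find((c) => req.headers.cookie?.includes(`cohort=${c.name}`)) ??
    pickCohort();
  res.setHeader("Set-Cookie", `cohort=${assigned.name}; Path=/; Max-Age=604800`);

  // Delay only HTML documents, not static assets.
  const isHtml = req.headers.accept?.includes("text/html") ?? false;
  setTimeout(next, isHtml ? assigned.delayMs : 0);
});

app.get("/", (_req, res) => res.send("<html><body>Product page</body></html>"));

app.listen(3000);
```

A real study would take far more care with cohort sizing and consistent assignment, but the shape is the same: identical pages, one changing variable.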

2012: What Can We Do With a LOT of User Data?

2012 was notable for being the year that real user monitoring (RUM) came on the scene. While people were generally excited by the idea of gathering data about every user experience on their site, we also wondered how the heck to take all this data and make it actionable.
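As a loose illustration of what the collection side of RUM looks like (using today’s browser APIs rather than the tooling of 2012, and with an assumed /rum endpoint and field names), a page can read the browser’s Navigation Timing data for each real visit and beacon a handful of numbers home:

```typescript
// Illustrative RUM beacon: read Navigation Timing data for a real user's
// page load and send it to a hypothetical "/rum" collection endpoint.
window.addEventListener("load", () => {
  // Wait one tick after the load event so loadEventEnd is populated.
  setTimeout(() => {
    const [nav] = performance.getEntriesByType(
      "navigation"
    ) as PerformanceNavigationTiming[];
    if (!nav) return;

    const beacon = {
      page: location.pathname,
      ttfbMs: Math.round(nav.responseStart - nav.startTime),
      domContentLoadedMs: Math.round(nav.domContentLoadedEventEnd - nav.startTime),
      loadMs: Math.round(nav.loadEventEnd - nav.startTime),
    };

    // sendBeacon survives navigation away from the page.
    navigator.sendBeacon("/rum", JSON.stringify(beacon));
  }, 0);
});
```

Collecting the numbers is the easy part; the hard part is joining them to business events such as conversions so that the data becomes actionable, which is exactly what the best early case studies did.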

One of the earliest RUM case studies that showed up on my radar was from Walmart Labs, a tech incubator within Walmart. (The research was led by Cliff Crocker, who served as senior engineering manager at Walmart Labs. Today we’re lucky to have Cliff here at SOASTA as our VP Product. Hi, Cliff!)

The team at Walmart Labs knew that Walmart.com was suffering from performance issues. For instance, initial measurements showed that an item page took about 24 seconds to load for the slowest 5% of users. Why? The usual culprits: too many page elements, slow third-party scripts, multiple hosts (25% of page content was served by third parties), and various best-practice no-nos.

Walmart Labs dedicated a scrum team to one sprint of performance optimization. At the start of the process, the team performed some baseline measurements in which they used their real user monitoring data to look at the load times of key pages and look for patterns. Then the team created targets for page performance and, at the end of the sprint, measured the impact of optimization on key metrics.


Their findings showed a strong correlation between performance and conversions:

  • Overall, converted shoppers were served pages that loaded twice as quickly as pages served to non-converted shoppers.
  • This trend persisted, even on individual pages that experienced greater load times.
  • Non-buyers were served category pages that were 2-3 seconds slower than category pages served to buyers.
  • For every 1 second of improvement to load time, the site experienced up to a 2% improvement in conversion rate.
  • For every 100 milliseconds of improvement, they grew incremental revenue by up to 1%.

Today: Are We Focusing Our Optimization Efforts on the Right Pages?

“We worked to make our home page and other key pages faster, and it didn’t affect our business.”

I don’t hear this very often, but I do hear it. One reason why some companies work hard to optimize key pages yet still don’t see results: they’re optimizing the wrong pages. Some pages (such as product and category pages) are more sensitive to performance changes than others (such as checkout pages).

Knowing this, our data science team here at SOASTA developed something called the Conversion Impact Score, which answers the question: What’s the relative impact of load time changes on business performance per page? Knowing the Conversion Impact Scores for pages on your own site lets you prioritize your optimization efforts. It’s a powerful metric. Every time I talk about it with our customers, I can see the excitement it generates.

READ: Conversion Impact Score: What is it? And why do you need to know yours?
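SOASTA’s actual scoring model isn’t described in this post, so the snippet below is only a generic illustration of the underlying idea (with assumed data shapes and function names): for each page, compare conversion rates for fast and slow real-user sessions, and rank pages by how steeply conversions fall off as the page slows down.

```typescript
// Generic illustration (not SOASTA's Conversion Impact Score): estimate how
// sensitive each page's conversion rate is to its load time.
interface Session {
  page: string;       // e.g. "product", "category", "checkout"
  loadTimeMs: number; // measured via RUM
  converted: boolean; // did this session end in a purchase?
}

function conversionRate(sessions: Session[]): number {
  if (sessions.length === 0) return 0;
  return sessions.filter((s) => s.converted).length / sessions.length;
}

// How much conversion drops for slow sessions versus fast ones on a page.
function loadTimeSensitivity(sessions: Session[], thresholdMs = 3000): number {
  const fast = sessions.filter((s) => s.loadTimeMs <= thresholdMs);
  const slow = sessions.filter((s) => s.loadTimeMs > thresholdMs);
  return conversionRate(fast) - conversionRate(slow);
}

// Rank pages so the most speed-sensitive ones come first.
function rankPagesByImpact(sessions: Session[]): [string, number][] {
  const byPage = new Map<string, Session[]>();
  for (const s of sessions) {
    byPage.set(s.page, [...(byPage.get(s.page) ?? []), s]);
  }
  return [...byPage.entries()]
    .map(([page, group]): [string, number] => [page, loadTimeSensitivity(group)])
    .sort((a, b) => b[1] - a[1]);
}
```

Pages at the top of that ranking are the ones where speed work is most likely to move business metrics, which is the prioritization question the Conversion Impact Score is designed to answer.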

Looking Ahead: New Questions

When I look back and see how much research has been generated and how many questions have been answered over the past seven years, it’s really exciting. But I’m even more excited about the questions we haven’t answered. Here are just a few things I’m looking to explore:

1. How can we better measure how performance affects user satisfaction?

We have tools that measure performance. We have tools that measure customer satisfaction. It stands to reason that we can have these tools speak to each other so that we can visualize the relationship between site speed and user happiness.

2. What impact does web performance have on customer lifetime value (CLV)?

In the performance monitoring space, we tend to measure the user experience in single sessions. But that’s a really short-sighted way of looking at things. In the marketing world, marketers take a big-picture look at the entire customer relationship. Typically, CLV is calculated over three years: that’s the length of the relationship that marketers expect a customer to have with their brand. So when marketing calculates the ROI for bringing in a new customer, they look at the amount of revenue that customer brings in over the entire relationship.

We in the web performance community need to align our metrics around customer experience with marketing’s metrics. Rather than focusing solely on single user sessions, it would be incredibly helpful to gather performance data over a much longer window of time.

3. What impact does performance have on enterprise productivity?

Much of the research on the business impact of web performance focuses on retail, but when it comes to using enterprise apps, performance affects workers, too. Thanks to apps like Slack, workers have high expectations when it comes to speed and ease of use. We need to gain a better understanding of how app performance affects internal metrics like app adoption and employee satisfaction and productivity.

4. Are we always measuring the right things?

As Steve Souders said at WebPerfDays Amsterdam:

[Image: Steve Souders quote on user timing]

At Velocity Amsterdam, there was a lot of talk about using UserTiming as a metric for gathering much more specific data about how pages render and how we measure user satisfaction and other business metrics. This discussion bled into WebPerfDays, with some deep discussion of how to use UserTiming in the field. As Steve also said:

[Image: Steve Souders quote on UserTiming]

DOWNLOAD: Everything You Need to Know about UserTiming, NavigationTiming and ResourceTiming
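As a small example of what “using UserTiming in the field” can look like (the mark names, the hero-image element, and the /rum endpoint are all illustrative), you mark the moments your page actually cares about, measure the spans between them, and ship them along with the rest of your RUM data:

```typescript
// Illustrative User Timing instrumentation: mark business-meaningful moments,
// measure the span between them, and beacon the result with your RUM data.
performance.mark("hero-watch-start"); // when this script started watching

const heroImage = document.querySelector<HTMLImageElement>("#hero-image");
heroImage?.addEventListener("load", () => {
  performance.mark("hero-image-displayed");
  performance.measure(
    "hero-image-visible",   // a custom, page-specific metric
    "hero-watch-start",
    "hero-image-displayed"
  );

  const [measure] = performance.getEntriesByName("hero-image-visible");
  if (!measure) return;

  navigator.sendBeacon(
    "/rum", // assumed endpoint, as in the earlier RUM sketch
    JSON.stringify({ metric: measure.name, durationMs: Math.round(measure.duration) })
  );
});
```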

Six Takeaways

When I talked about performance ROI in my webinar, my hope was that people would leave having internalized these messages:

  1. User expectations and behavior are always changing
  2. Our tools for measuring ROI are evolving
  3. Know your own business success metrics
  4. Understand your own visitors
  5. Target the best (not the slowest) pages for optimization
  6. Monitor, test, repeat

As always, I value your feedback. If you have any questions or comments, I’d love to hear them.



Published at DZone with permission of Tammy Everts, DZone MVB. See the original article here.
