3 Trends in Web Performance

Web performance monitoring has changed — make sure you're prepared with these tips on web performance optimization trends.

By Akshay Ranganath · Aug. 06, 18 · Opinion

Website performance is always an ongoing effort and an elusive target. In the early days, page load time was the gold standard of web performance monitoring. However, websites have evolved to be far more dynamic, JavaScript-heavy, and filled with rich images and third-party content. We need better metrics, a better model, and better tooling to measure and monitor performance. In this blog post, I explain the evolution of three different trends that are shaping the industry: the RAIL framework, perceived performance, and the tooling for perceived performance.

So, let's dig in!

RAIL Framework

RAIL is a framework publicized by Google. According to the Google developer page, it is "a user-centric performance model that breaks down the user's experience into key actions."

The framework breaks down performance into these four key goals (as described in Google's developer documentation):

  • Response: react to user input quickly, ideally within about 100 ms.
  • Animation: produce each frame of an animation or scroll fast enough (roughly 10 ms per frame) to feel smooth.
  • Idle: maximize idle time, and keep background work in chunks of about 50 ms so the main thread stays available.
  • Load: deliver content and become interactive as quickly as possible.

Basically, the RAIL model says: load your page fast so that it becomes interactive quickly, and once it is interactive, ensure that any dynamic behavior is fast enough that the page stays responsive to user input.
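
To make the responsiveness goal concrete, here is a minimal sketch (my own illustration, not from the RAIL documentation) that uses the Long Tasks API to flag main-thread work exceeding RAIL's roughly 50 ms budget; support for the longtask entry type varies by browser.

// A minimal sketch: log main-thread tasks longer than RAIL's ~50 ms budget.
// Assumes the browser supports the Long Tasks API ('longtask' entries).
if ('PerformanceObserver' in window) {
  const longTaskObserver = new PerformanceObserver((list) => {
    for (const entry of list.getEntries()) {
      // Any task this long risks making the page feel unresponsive to input.
      console.warn(
        'Long task of ' + Math.round(entry.duration) + ' ms at ' +
        Math.round(entry.startTime) + ' ms'
      );
    }
  });
  try {
    longTaskObserver.observe({ entryTypes: ['longtask'] });
  } catch (e) {
    // Older browsers may not recognize 'longtask'; fail quietly.
  }
}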

For more information about RAIL, please watch this video about RAIL in the real world.

Perceived Performance

How fast your site appears to load is as important as how fast it actually loads. The problem with perceived performance is that, for a long time, there was no good way to measure it. It's something of a conundrum.

We have had a lot of measurements like Time to First Byte, DOM loaded, onload, and a host of others. These metrics have been immensely helpful in the evolution of the web performance industry. However, they were designed around the events that browsers could measure, and they fail to answer basic questions about what the user actually saw and when the page felt usable.
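
For reference, the traditional milestones above are exposed directly by the browser. The following is a minimal sketch (my own illustration) that reads Time to First Byte, DOMContentLoaded, and onload from the Navigation Timing Level 2 API; browsers that only implement the older performance.timing interface would need a different approach.

// A minimal sketch: read the classic milestones from the Navigation Timing API.
window.addEventListener('load', () => {
  // Defer one tick so loadEventEnd has been recorded.
  setTimeout(() => {
    const [nav] = performance.getEntriesByType('navigation');
    if (!nav) return; // Navigation Timing Level 2 not supported
    console.log('Time to First Byte (ms):', Math.round(nav.responseStart));
    console.log('DOMContentLoaded (ms):', Math.round(nav.domContentLoadedEventEnd));
    console.log('onload (ms):', Math.round(nav.loadEventEnd));
  }, 0);
});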

Going beyond this, we now have a few more measurements, such as First Contentful Paint, First Meaningful Paint, Visually Ready, and First Input Delay.
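
Some of these newer, user-centric milestones can also be captured in the browser. Here is a minimal sketch (my own illustration) that observes First Paint and First Contentful Paint via the Paint Timing API; First Meaningful Paint and "visually ready" typically still come from tools such as Lighthouse or WebPageTest rather than a browser API.

// A minimal sketch: capture First Paint and First Contentful Paint.
// Assumes the browser implements the Paint Timing API ('paint' entries).
const paintObserver = new PerformanceObserver((list) => {
  for (const entry of list.getEntries()) {
    // entry.name is 'first-paint' or 'first-contentful-paint'.
    console.log(entry.name + ': ' + Math.round(entry.startTime) + ' ms');
  }
});
try {
  paintObserver.observe({ type: 'paint', buffered: true });
} catch (e) {
  paintObserver.observe({ entryTypes: ['paint'] }); // older observer syntax
}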

All this boils down to measuring a host of metrics that are spread across different tools and platforms.

If you notice, we've now gone from hard measurements like DNS lookup or onload to softer measurements related to the user's interaction and experience. All of this is harder to measure, but Google has announced that its search indexing will now take these factors into account. Specifically, Google will now:

  • Use mobile page speed as a ranking signal,
  • Encourage "developers to think broadly about performance," and
  • Hint at using suggestions from tools like Lighthouse, CrUX, and PageSpeed Insights, which we'll discuss next.

Tooling for Perceived Performance

WebPageTest has been the de facto tool for visualizing and demonstrating the concept of perceived performance. The filmstrip view, video, and visual progress charts have been the toolset we have relied on for a long time. However, we now have more tools that can help build a case for measuring and optimizing for perceived performance.

  • Lighthouse: Project Lighthouse by Google runs a wide range of audits and presents its findings, from a filmstrip view to a Chrome JavaScript timeline. By using the audits, you can dig into problems like a particular script taking a large amount of time and causing rendering issues.
  • Chrome User Experience Report (CrUX): Google Chrome collects performance measurements from real browsers. This information is sent back to Google if users have opted in to data collection, and the anonymized data is now exposed as real user monitoring (RUM) data for Chrome under the CrUX project. This tool is especially helpful since it gives you the spread of user experience: synthetic tests can only give you data from simulated users with a pre-defined setup, whereas CrUX can provide histograms of actual performance behavior, including perceived performance information.
  • Google PageSpeed Insights: This tool has now evolved to provide both suggestions for improving the page and the CrUX data for the URL being audited. By using the PSI report, you get a picture of current performance along with tips to analyze and fix your website (see the sketch after this list for pulling this data programmatically).
  • Akamai mPulse: Akamai's mPulse real user monitoring solution provides all the perceived performance data along with standard metrics like onload. The tool can also be instrumented to measure rage clicks to identify users' frustration levels when using a website.
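
As an illustration of pulling this data programmatically, here is a minimal sketch that queries the PageSpeed Insights v5 REST API and reads the CrUX field data it returns. The endpoint and response field names are assumptions based on the public API documentation, so verify them against the current docs; an API key is recommended for anything beyond occasional use.

// A minimal sketch: query the PageSpeed Insights v5 API for a URL and
// print the CrUX (field) First Contentful Paint data, if available.
// Endpoint and field names are assumptions; check the current PSI API docs.
const PSI_ENDPOINT = 'https://www.googleapis.com/pagespeedonline/v5/runPagespeed';

async function printFieldData(url) {
  const response = await fetch(PSI_ENDPOINT + '?url=' + encodeURIComponent(url));
  const report = await response.json();

  // loadingExperience holds CrUX field data; it may be absent for low-traffic URLs.
  const metrics = report.loadingExperience && report.loadingExperience.metrics;
  const fcp = metrics && metrics.FIRST_CONTENTFUL_PAINT_MS;

  if (fcp) {
    console.log('Field FCP percentile: ' + fcp.percentile + ' ms (' + fcp.category + ')');
  } else {
    console.log('No CrUX field data available for this URL.');
  }
}

printFieldData('https://www.example.com/');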

What Do You Look For?

By correlating the data from synthetic tests and RUM solutions, you should be able to identify potential issues in perceived performance. Some of the issues could be:

  • Synchronous scripts holding up the browser's main thread and preventing it from rendering content
  • A/B testing solutions whose fixes delay rendering of the page (preventing flickering with A/B solutions)
  • The page appearing to be ready but actually being busy. This can be seen in First Input Delay (FID), as well as in the difference between Time to First Interaction and Time to First Interactive (see the sketch after this list).
  • The page appearing to render but not showing the content that matters. This can be caught by metrics like First Contentful Paint, First Meaningful Paint, and Visually Ready.
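
To make the "appears ready but is actually busy" case measurable in the field, here is a minimal sketch that records First Input Delay using the Event Timing API's first-input entry type. Support for this entry type is an assumption; in practice, a RUM library such as Google's web-vitals package wraps this more robustly.

// A minimal sketch: record First Input Delay (FID).
// Assumes the browser exposes 'first-input' entries (Event Timing API).
const fidObserver = new PerformanceObserver((list) => {
  const firstInput = list.getEntries()[0];
  if (firstInput) {
    // Delay between the user's first interaction and its handler starting.
    const fid = firstInput.processingStart - firstInput.startTime;
    console.log('First Input Delay: ' + Math.round(fid) + ' ms');
    fidObserver.disconnect();
  }
});
try {
  fidObserver.observe({ type: 'first-input', buffered: true });
} catch (e) {
  // Not supported; fall back to a RUM library or polyfill.
}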

I have highlighted a very small subset of the issues that can be tracked and worked on by leveraging perceived performance metrics. If you have any use cases, I'd love to hear about them as well!

Published at DZone with permission of Akshay Ranganath, DZone MVB. See the original article here.

Opinions expressed by DZone contributors are their own.
