The Evolution of the Velocity Conference and Performance Metrics

Read about how web performance metrics have changed over the years to more accurately reflect a site's usability for people in the real world.

By Dawn Parzych · Jul. 27, 17 · Opinion

This year Velocity celebrated its 10th anniversary. A lot has changed in the past 10 years. Velocity has gone from a conference held once a year to three yearly events, including international locations. Velocity was THE conference to attend to network and learn about web performance. It covered performance from both the back-end and front-end perspectives, which I have always believed is critical to understanding the overall performance of an application. This year, O’Reilly decided to switch the focus of the conference, and Velocity no longer included sessions on web performance; those moved to Fluent.

The conferences were co-located, which in theory meant you could attend sessions at both with a joint pass. In actuality, even if you purchased a joint pass, it was very difficult to attend sessions at both due to scheduling. Sessions did not start at the same time, which meant you could only catch the first or second half of a session if you wanted to attend talks from both tracks. As a result, I wasn’t able to see nearly as many keynotes and sessions as I would have liked. Luckily, I have a trial subscription to Safari thanks to the conferences and can catch up on the sessions and keynotes I missed.

Since this was the 10th anniversary, it wasn’t surprising to hear a lot of talks on how the web has evolved over time. Some discussed changes for the better, and some highlighted where improvements are still needed. One of my favorite talks fell into this category, although it wasn’t described that way. Shubhie Panicker from Google and Nic Jansma from SOASTA presented a session on “Reliably Measuring Responsiveness in the Wild.” The talk introduced the concept of “time to interactive” and the use of long tasks in the browser to help identify when a page is interactive.

Like the web itself, the metrics we use to understand performance have evolved and grown over the last 10 years. The web performance community has moved away from metrics like page load time and has tried to find ones that more accurately reflect users’ perception of when a page has loaded. Some measurements, like “above the fold” time, have come and gone, while others, such as render time, time to first paint, and speed index, have remained.
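Metrics like time to first paint are now exposed directly by the browser. As a minimal sketch (assuming a Chromium-style browser that reports "first-paint" and "first-contentful-paint" entries via the Paint Timing API), the paint timestamps can be read like this:

```typescript
// Minimal sketch: read paint-timing entries from the browser's Performance API.
// Assumes a browser that implements the Paint Timing API; entry names and
// availability vary across browsers.
const paintEntries = performance.getEntriesByType("paint");

for (const entry of paintEntries) {
  // entry.name is "first-paint" or "first-contentful-paint";
  // entry.startTime is milliseconds since navigation start.
  console.log(`${entry.name}: ${entry.startTime.toFixed(1)} ms`);
}
```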

Performance is about more than when items load on a page. As a user, seeing content is good, but if I’m trying to scroll or click and can’t, that can be as frustrating as not seeing content at all. The important thing to note is that users will remember the worst-performing interaction on a website. They don’t remember all the times the page loaded quickly and without jank, but they do remember the one time there was jank. Time to interactive is an attempt to measure the point, somewhere between the page painting and all objects fully loading, at which the site becomes interactive. Chrome has introduced a Long Tasks API to provide a way to measure when there is no main-thread contention and a user can interact with the page.
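As a rough sketch of how that measurement looks in practice (assuming a browser, such as Chrome, that supports the "longtask" performance entry type), a PerformanceObserver can report each task that blocks the main thread for more than 50 ms:

```typescript
// Minimal sketch: observe long tasks via the Long Tasks API.
// Assumes "longtask" entries are supported; each entry represents a task
// that blocked the main thread for more than 50 ms.
const longTaskObserver = new PerformanceObserver((list) => {
  for (const entry of list.getEntries()) {
    // entry.startTime and entry.duration are in milliseconds; a common
    // time-to-interactive heuristic looks for a quiet window with no
    // long tasks after the page has painted.
    console.log(
      `Long task at ${entry.startTime.toFixed(1)} ms, ` +
      `duration ${entry.duration.toFixed(1)} ms`
    );
  }
});

longTaskObserver.observe({ entryTypes: ["longtask"] });
```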

Finding new methods to measure applications in a way that matters to the end user, and ultimately the business, will help organizations ensure they meet customers’ expectations. New ways to measure performance will continue to emerge. Investigate the metrics and experiment with them, but keep in mind that every site is different. Identify the core metrics that work for your application and organization.


Published at DZone with permission of Dawn Parzych, DZone MVB. See the original article here.

