
We Test the Presidential Candidates’ Websites: Can You Guess Which One Is Fastest?

You know how they are performing at the polls, but do you know their web performance? Check out the presidential candidates' web performance analysis.


See Gartner’s latest research on the application performance monitoring landscape and how APM suites are becoming more and more critical to the business, brought to you in partnership with AppDynamics.

As the political season continues to heat up, we thought it was the perfect time to check out some of the most watched domain names in the United States—the websites of major Democratic and Republican candidates for President of the United States. In the wake of the South Carolina Republican primary on February 20, we used New Relic Synthetics to set up an automated Web browser to visit campaign sites every 30 minutes and to track the splash screens for campaign contributions, which urge visitors to join the team, family, movement, or revolution.

Although Synthetics can monitor from many different global locations, we used a single monitor near the U.S. capital to collect performance metrics. We wanted to understand how the candidate technology teams built their sites and especially how those sites perform in the real world.

Disclosure: This blog post does not represent the political views of New Relic and should not be taken as an endorsement of any candidate. It is strictly representative of the subject matter within regarding New Relic Synthetics and New Relic Insights, and the response times uncovered by the use of these products in this monitoring test.

Candidate (Web) Platforms: Some Faster Than Others

For supporters who hope their candidate’s site causes others to feel the (Web performance) burn, the results of the New Relic Synthetics monitors are clear: political ideology does not seem to have any connection to overall page load time.

We used New Relic Insights and a small NRQL query to quickly create dashboards from the Synthetics data. The average duration of page load time, or the time it takes for a Google Chrome browser to completely load the landing page of each major campaign site, varied widely.
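The dashboards came from a query along the following lines. This is a sketch: `SyntheticCheck` is the event type Synthetics reports into Insights, but the exact attribute names shown here are assumptions rather than the query used for the article.

```sql
SELECT average(duration) FROM SyntheticCheck FACET monitorName SINCE 1 week ago TIMESERIES
```

Faceting by monitor name produces one line per campaign site, which makes the wide variation in average load time easy to see at a glance.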


Not surprisingly, there’s evidence that the total size of the Web pages—the sum of all of the responses for images, fonts, HTML, CSS, and JavaScript—affects overall performance. In general, all the campaign sites were image-heavy. All those smiling-supporter photos come at a measurable cost.


It’s also possible to connect changes observed in synthetic monitors to current events—there was a significant change on Sunday, February 21, the day after the South Carolina Republican primary, and again during the evening of the Republican primary debate on Thursday, February 25.


Looking at Ted Cruz’s site, for example, segmenting requests by different content types shows that the average load time seems to be most impacted by an increase in JavaScript, CSS, and images served from a single host.


With increasing focus on page weight, reducing response size and the number of requests is critical to improving overall load time performance.
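To make the page-weight idea concrete, here is a small illustrative sketch: given a list of responses like those a synthetic browser monitor captures, it totals the bytes transferred by content type. The data shape is hypothetical, not the actual Synthetics schema.

```javascript
// Sum response sizes by content type to see what dominates page weight.
// The `responses` shape below is a made-up example, not a real API payload.
function weighPage(responses) {
  var totals = {};
  responses.forEach(function (r) {
    totals[r.contentType] = (totals[r.contentType] || 0) + r.bytes;
  });
  return totals;
}

var responses = [
  { url: '/index.html',     contentType: 'html',       bytes: 40 * 1024 },
  { url: '/hero.jpg',       contentType: 'image',      bytes: 900 * 1024 },
  { url: '/supporters.jpg', contentType: 'image',      bytes: 700 * 1024 },
  { url: '/app.js',         contentType: 'javascript', bytes: 300 * 1024 }
];

var totals = weighPage(responses);
console.log(totals); // image responses dominate the total page weight
```

On a breakdown like this one, the smiling-supporter photos account for the overwhelming majority of the bytes a visitor has to download—exactly the pattern the campaign sites showed.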

Some Campaigns Iterate on Their Sites More Than Others

Observing HTTP responses over time revealed some surprising and not-so-surprising results. In the case of Jeb Bush’s site, it’s possible to see third-party payment providers being turned off the moment he ceased campaign operations.


Similarly, after the suspension of Marco Rubio’s campaign, average page load time spiked.


Synthetics data also suggest that some campaigns make changes to their sites more often than others. The Donald Trump, Hillary Clinton, Ben Carson, and John Kasich sites don’t vary much in size over time. In contrast, the Bernie Sanders and Ted Cruz sites changed frequently in ways that impacted the overall response size. Not surprisingly, Marco Rubio’s website response size stabilized after he ceased campaign operations.


Drilling down, the Hillary Clinton site is especially interesting. The total Web page response size steadied several days before the South Carolina Democratic primary after a period of more frequent activity.


Donald Trump’s website changed slightly more often than did Hillary Clinton’s, and there were significant changes on the evenings of Wednesday, March 9, and Sunday, March 20 that increased overall page size.


Something All the Candidates Agree On: HTTPS Is On by Default

Encryption, privacy, and security have been at the center of several campaign debates. However, the campaigns have unanimously embraced strong encryption to secure their sites. Using a free tool to analyze the configuration of campaign Web servers on February 29, we learned that every candidate domain scanned received an “A”: the second-highest grade possible.

New Relic Synthetics captures the time requests spend waiting for an SSL connection to be established. Some sites make this connection slightly faster than do others.


Because all the sites are delivered using Transport Layer Security (TLS), they are positioned to adopt the new HTTP/2 protocol (likely via a CDN provider). The sites for Ted Cruz, Marco Rubio, Bernie Sanders, Donald Trump, and Jeb Bush use the new protocol for some requests.

No Voter I.D. Required for These Election Site Monitors

In the name of transparency and open governance, we are sharing the code we used to generate this data using the New Relic Synthetics API—a simple Node.js script and the popular open-source request library were all that was required.

var request = require('request');

const API_KEY = 'new-relic-admin-api-key';

const websites = [
  { name: 'Ben Carson', site: 'https://www.bencarson.com/' },
  { name: 'Bernie Sanders', site: 'https://berniesanders.com/?nosplash=true' },
  { name: 'Donald Trump', site: 'https://www.donaldjtrump.com/' },
  { name: 'Hillary Clinton', site: 'https://www.hillaryclinton.com/' },
  { name: 'Jeb Bush', site: 'https://jeb2016.com/' },
  { name: 'John Kasich', site: 'https://johnkasich.com/' },
  { name: 'Marco Rubio', site: 'https://marcorubio.com/' },
  { name: 'Ted Cruz', site: 'https://www.tedcruz.org/' }
];

// Create a browser monitor that visits the given site every 30 minutes
// from a single location near the U.S. capital.
var createMonitor = function(site, name) {
  var options = {
    uri: 'https://synthetics.newrelic.com/synthetics/api/v1/monitors',
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      'X-Api-Key': API_KEY
    },
    json: {
      'name': name,
      'frequency': 30,
      'uri': site,
      'locations': [ 'AWS_US_EAST_1' ],
      'type': 'browser'
    }
  };
  request(options, function (error, response, body) {
    if (!error && response.statusCode === 201) {
      console.log('success creating monitor for: ', site);
    } else {
      console.error('error creating monitor for: ', site, error || response.statusCode);
    }
  });
};

websites.forEach(function(w) { createMonitor(w.site, w.name); });

This script creates a single monitor in the AWS U.S. East Region for each of the eight campaign sites, identified by the candidates’ names. Errors and successes are logged: if the response status code is 201, a new monitor that visits the page every half-hour has been successfully created.

Final Thoughts and a Vote of Confidence in Building for the Web

According to NPR, some $4.4 billion is expected to be spent on television advertising alone in this election cycle, and the Web is a crucial part of campaign financing. Our experience monitoring hundreds of millions of application metrics has shown that milliseconds of page load slowdown can result in corresponding page abandonment and lost donations.

Given the vast amounts of money being raised through political websites, it is surprising that site performance is as varied as the political positions of the candidates themselves. For any website, including the ones for the next president of the United States, here are some best practices informed by the performance data that we’ve collected:

  • Understand how your Web pages are working in the real world. Data from simple Selenium scripts reveals a large amount of actionable information you can use to improve the experience of visitors (and donors).
  • When in doubt, reduce Web page bloat. HTTP/2 helps with parallel requests and multiplexing over the same connection, but massive Web pages are slow no matter what.
  • Synthetic testing is a part of a much bigger idea: the power of having visibility into how an entire software stack is actually performing. Truly understanding all the reasons behind slow load times requires end-to-end visibility from the frontend to the backend.

Regardless of whether your websites are hosted in the cloud, at home, or in a state-of-the-art data center, professionals of all political persuasions should be using monitoring tools to build better applications and user experiences.



Published at DZone with permission of Clay Smith, DZone MVB. See the original article here.

Opinions expressed by DZone contributors are their own.
