Is graceful degradation dead?


During my work as a consultant here in Siena, I encountered an Ext JS web application that completely reproduces a desktop experience, with resizable windows, drag and drop, and a Start menu. Of course, without JavaScript enabled this application won't work at all, but the fact that my colleagues didn't consider this a problem made me think.

Before the advent of Ajax technologies, plain old HTML documents and web applications did not execute logic on the client and kept all computation on the server. As a result, a variety of non-ordinary clients, such as screen readers and crawlers, were easy to build and run over web pages; applications, where present at all, were a set of connected web pages and forms, processed by a smart server.

Fast forward to 2010, and now HTML pages can also build themselves with JavaScript on the client side. There is no longer a set of pages, but a set of services on the server side that expose data; presentation and business logic is increasingly moved to the client, to the point that some applications, like the ones based on the Ext JS Web Desktop, expose only JSON responses at the PHP level, while the markup is built on the client by the various widgets that connect to the data store.

JavaScript is fundamental for accomplishing these tasks, and obviously it cannot be avoided in the construction of an up-to-par user experience. I realize that this is not necessarily an issue: in my experience, enterprise applications are a bit different from public websites, since you can mandate (or at least agree on) a single browser to avoid cross-browser hacks and Internet Explorer bugs, and you can require JavaScript support on every client. Besides that, innovation would be cramped in our world if graceful degradation were mandatory: the most modern web applications make use of JavaScript at every level (see below).

The JavaScript can of worms

Mandatory JavaScript support, however, brings up some issues.

First of all, there are the accessibility limitations that JavaScript involves. Not all available clients are capable of executing JavaScript; this is true where the client is driven by a user, but it will also come back to bite you in the other points below. The most famous accessibility case is screen readers, employed by visually impaired users as an assistive technology that reads aloud the text on web pages. An enterprise application can usually ignore the needs of screen readers, but a public one cannot.

Another big issue with JavaScript-based websites and applications is support for crawlers. Google's and Yahoo's crawlers do not execute JavaScript code found in pages (yet), and can miss links or content that an ordinary user would see if they are produced via JavaScript. Web applications are probably less prone to crawling than websites anyway, given their focus on functionality rather than content.
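To see the problem concretely: a crawler that does not execute scripts only ever parses the static markup, so any link the page builds with JavaScript is invisible to it. Here is a minimal sketch in Python with hypothetical markup (the stripping of `<script>` blocks stands in for "not executing JavaScript"):

```python
import re

# Raw HTML as a crawler fetches it: one static link, plus one link
# that only exists after the inline script has run in a browser.
html = """
<a href="/about">About</a>
<div id="nav"></div>
<script>
  document.getElementById('nav').innerHTML =
    '<a href="/products">Products</a>';
</script>
"""

# A non-executing crawler discovers links only in the markup itself.
# We mimic that by discarding script bodies before extracting hrefs.
static_html = re.sub(r"<script>.*?</script>", "", html, flags=re.DOTALL)
links = re.findall(r'href="([^"]+)"', static_html)

print(links)  # only the static link survives: ['/about']
```

A real user's browser would see both links; the crawler indexes only `/about`, and the `/products` page may never be discovered.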

The testability of the application as a whole (acceptance and functional testing) is also a concern. Selenium, the only open source tool available for acceptance testing of JavaScript-based applications, is by design slower and more brittle than Zend_Test or HttpUnit.

For testing at the acceptance level (the most coarse-grained type of testing, where you exercise the application as a real user would in predefined scenarios), you need some kind of client that can programmatically execute your tests. Selenium uses a real web browser, while an application without Ajax calls can usually be tested by faking requests and responses, so that only a basic HTTP client or mock client is needed. In the Java world, this can be done by instantiating servlets directly, or by opening a socket. In the PHP world, stub request and response objects can be created by the testing framework (Zend_Test).
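The stub request/response style can be sketched in a few lines. This is a minimal illustration in Python, in the same spirit as Zend_Test's stub objects; the handler and its parameter names are hypothetical, not any framework's actual API:

```python
class StubRequest:
    """A fake request: just the data the handler reads."""
    def __init__(self, params):
        self.params = params

class StubResponse:
    """A fake response that records what the handler writes."""
    def __init__(self):
        self.status = 200
        self.body = ""

def search_handler(request, response):
    """Hypothetical server-side action under test."""
    query = request.params.get("q", "")
    if not query:
        response.status = 400
        response.body = "missing query"
    else:
        response.body = "results for " + query

# The test drives the handler directly: no browser, no HTTP socket,
# which is why this style is fast and stable compared to Selenium.
req = StubRequest({"q": "graceful degradation"})
res = StubResponse()
search_handler(req, res)
print(res.status, res.body)  # 200 results for graceful degradation
```

The trade-off is exactly the one described above: this exercises the server-side logic precisely, but none of the JavaScript the real user depends on.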

Note that graceful degradation does not satisfy testing requirements here: even if you test a degraded widget, you're still not testing the main (non-degraded) version that users will see. This is no longer the time of plug-in scripts that embellish the user experience, like a lightbox or an Ajax link; this is the time of entire desktop applications being ported to the browser.

How the most popular websites face this

While obviously we can't renounce JavaScript, we must keep an eye on our target users. I bet that if, like me, you have worked on enterprise applications rather than public websites, you have never considered accessibility requirements as primary ones for your applications. But what about public services that are used every day?

It seems that even successful web developers seldom think about accessibility and degradation. Just disable JavaScript for a moment and try going to one of the major applications you use every day. I did, and the results were not comforting:

  • Facebook does not work for the global timeline or private messages. I can view my own profile, but that's not astounding.
  • Gmail solves the problem radically by linking to an alternate HTML version. Not every application can afford an alternate user interface, but maybe it can present a version with simplified controls. The HTML-based Gmail does not have the new Priority Inbox functionality, as far as I can tell.
  • Twitter works for some functionalities, like viewing replies, but it does not let you tweet. It is also fairly minimalistic (it consists of 4 or 5 user stories), so it's a pity, as accessibility could be implemented right away.

There are also more complex solutions. For example, Google Reader has an ARIA-compliant version that can be read by Firefox 3 plus a screen reader, and it uses JavaScript (I couldn't load it without JavaScript enabled). WAI-ARIA is a specification for controlling Rich Internet Application components in an accessible way. This means assistive technology is evolving to support JavaScript, and we can hope for other kinds of clients to evolve too. Selenium itself was created by ThoughtWorks as a solution to this problem, and Google is able to crawl Flash applications.

Moreover, Google can also crawl JavaScript code to a certain extent, performing operations like extracting URLs from JavaScript code. Google is also working on a standard and a prototype crawler implementation for Ajax-based applications.
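Google's Ajax-crawling proposal works by a URL convention: a state the application exposes after `#!` in the fragment is mapped to a `_escaped_fragment_` query parameter, which the server can answer with a crawlable HTML snapshot. A sketch of the mapping in Python (the example URL is hypothetical):

```python
from urllib.parse import quote

def to_crawler_url(url):
    """Map an Ajax '#!' URL to the '_escaped_fragment_' form that
    Google's Ajax-crawling proposal asks servers to answer."""
    if "#!" not in url:
        return url  # no Ajax state to expose
    base, fragment = url.split("#!", 1)
    separator = "&" if "?" in base else "?"
    # The fragment value is percent-encoded in the query string.
    return base + separator + "_escaped_fragment_=" + quote(fragment, safe="")

print(to_crawler_url("http://example.com/app#!inbox/42"))
# http://example.com/app?_escaped_fragment_=inbox%2F42
```

The browser never sends the fragment to the server, which is why the crawler has to rewrite the URL this way before requesting the snapshot.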

We all hope that in the future, testing, crawling, or accessing a JavaScript application will be a standard feature and not an issue. Meanwhile, we shouldn't be scared of using less accessible, cutting-edge (but not bleeding-edge) technologies if they provide a competitive advantage. A modern web developer can't avoid JavaScript anymore.


Opinions expressed by DZone contributors are their own.
