Hexagonal architecture in JavaScript
The goal: unit testing JavaScript code, in isolation from frameworks, the server side and the DOM.
To pursue this goal, the code under development must contain some logic, such as calculations, string manipulation, a state machine and so on. If it's only a callback coloring a button with a blue tint when it's clicked, testing it in isolation doesn't add much to the picture. Instead, testing a state machine describing the views of a single-page application is more interesting and has a higher ROI.
The architecture: ports and adapters
JavaScript applications have an architecture: we can regard the client side of a web application as an application itself, defined by its upper boundaries, the user interface and the connection with the server side (and similar data sources).
A very popular scheme that enables unit testing is the hexagonal architecture (which, despite the name, does not necessarily have six sides).
In this scheme, a central core containing the business logic is surrounded by ports, which can fall under two categories:
- driving side ports, where the interaction is initiated from the outside. DOM events that point to callbacks, framework-originated events, and setTimeout() definitions are all driving side ports.
- driven side ports, where the application interacts with the external infrastructure. As an example, consider the DOM elements to manipulate or where new content is inserted, the Ajax calls made towards the server, or even local databases.
The idea of the architecture is to satisfy each port with an adapter containing the code which is "problematic" to test; in this case, the difficulty originates due to its references to frameworks, Ajax and the DOM. The core remains free from dependencies on the outside and can be tested in a single process, deterministically.
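As a minimal sketch of this separation (all names here are invented for illustration), the core can be a plain constructor that receives its driven ports as arguments: production code passes real adapters, while a test passes doubles and runs deterministically in a single process.

```javascript
// Hypothetical core: navigation logic with no DOM, Ajax or framework references.
// `display` is a driven port: production wires it to a DOM adapter,
// the test passes a recording double.
function Navigation(display) {
  this.current = 'home';
  this.display = display;
}
Navigation.prototype.goTo = function (view) {
  this.current = view;
  this.display(view); // talks to the outside only through the port
};

// deterministic, single-process test
var shown = [];
var navigation = new Navigation(function (view) { shown.push(view); });
navigation.goTo('settings');
// shown is ['settings'], navigation.current is 'settings'
```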
The driving side
To be able to test the core, we need to emulate the adapters for the driving side. Who calls JavaScript code inside your application?
These ports might naturally conform to what the framework or the plugin requires:
- some callbacks to associate with events such as click and change.
- as a natural evolution, a single object containing these callbacks so that they can interact with a shared state.
- in the degenerate case, a single callback produced as an anonymous function.
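The second style in the list above, an object whose callbacks share state, might look like this (a hypothetical sketch; the names are not from the article):

```javascript
// Hypothetical driving-side object: two callbacks sharing a counter.
// The framework wiring would be e.g. $('#button').click(tracker.onClick);
var ClickTracker = function () {
  var clicks = 0; // shared state, hidden in the closure
  return {
    onClick: function () { clicks++; },
    onReset: function () { clicks = 0; },
    count: function () { return clicks; }
  };
};

// the callbacks are testable with no DOM at all
var tracker = ClickTracker();
tracker.onClick();
tracker.onClick();
// tracker.count() is now 2
```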
For example, jQuery('#my_form').submit() lets you specify a callback to call when the user submits the form. Instead of specifying it in place as an anonymous function, it's usually easy to parameterize it and test it independently:
```javascript
var form = $('#my_form');
form.submit(Application.formSubmit($('div#result')));
```
In this snippet, Application.formSubmit returns a closure. The nice thing is you can wire things to the closure by passing them to this Factory Method, making testing easier. If the callback calls $ directly it will be difficult to test in isolation (you'll have to provide some DOM nodes).
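A possible body for Application.formSubmit (invented here for illustration, since the article doesn't show it) is a Factory Method that closes over the injected result element; because the closure only calls html() on it, the test can pass a tiny fake instead of a jQuery node:

```javascript
var Application = {
  // Factory Method: returns the callback to wire to form submission.
  // `resultElement` only needs an html() method, so a fake will do.
  formSubmit: function (resultElement) {
    return function () {
      // logic under test: choose what to display
      resultElement.html('Form submitted');
      return false; // prevent the real submission in this sketch
    };
  }
};

// test: no DOM, no jQuery
var written = [];
var fakeElement = { html: function (content) { written.push(content); } };
var callback = Application.formSubmit(fakeElement);
callback();
// written[0] is 'Form submitted'
```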
Calling $ or other infrastructure while testing can even be appropriate when testing a presentational behavior, like the choice of the content to be displayed; but this is an article on test doubles, and it privileges unit testing over functional testing.
By the way, it's not only the driving ports that must define an easy-to-test interface: $ and similar libraries are driven ports that need test doubles too.
The driven side
The driven ports are defined as objects and functions that you want to substitute with test doubles in order to test your logic in isolation; for example, not making a real Ajax call, but substituting the request with a function that returns a predefined value (the Ajax-specific case is explained here.)
In general, there are two choices:
- introduce a Test Double that directly implements the API called by your code (jQuery.get, jQuery.css, or XMLHttpRequest).
- wrap the original library and define your own interface, which you can then substitute.
The principle of Only Mocking Types You Own points towards the second choice, otherwise we wouldn't be able to use mocking as a design tool. The interface would be fixed and you would be stuck with calling those catch-all framework methods that expect a 10-field JSON object as arguments.
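Following the second choice, a thin adapter can expose only what the core needs, through an interface you own and can substitute at will. A sketch (the function names and the /users endpoint are assumptions for illustration, not jQuery API):

```javascript
// Hypothetical owned interface over jQuery's Ajax: one narrow function
// instead of the catch-all options object.
var makeFetchUser = function ($) {
  return function (id, onLoaded) {
    $.get('/users/' + id, onLoaded);
  };
};

// The core depends only on the narrow signature, so a test double is trivial:
var fakeFetchUser = function (id, onLoaded) {
  onLoaded({ id: id, name: 'Stub User' });
};

// core logic under test, with the driven port injected
var greet = function (fetchUser, id, display) {
  fetchUser(id, function (user) { display('Hello, ' + user.name); });
};

var messages = [];
greet(fakeFetchUser, 42, function (msg) { messages.push(msg); });
// messages[0] is 'Hello, Stub User'
```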
The first solution works well only when methods are called directly:
```javascript
var old$ = $;
$ = function (selector) {
  // mocked behavior
};
// ... test of an object calling $() ...
$ = old$;
```
while if there is a Law of Demeter violation like $('#selector').css('..').html('...').click(...), it's painful to mock directly.
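To get a feel for the pain, here is the hand-rolled chainable stub that a single Demeter-violating call forces on the test (the recording scheme is an illustrative sketch):

```javascript
// Every method in the chain must be stubbed and must return `this`
// so the next call in the chain doesn't blow up.
function makeChainableStub(calls) {
  return {
    css: function (rules) { calls.push(['css', rules]); return this; },
    html: function (markup) { calls.push(['html', markup]); return this; },
    click: function (handler) { calls.push(['click', handler]); return this; }
  };
}

// code under test, inlined for the example
function render($) {
  $('#selector').css({ color: 'red' }).html('<p>done</p>').click(function () {});
}

// test: record the whole chain through a fake $
var calls = [];
var fake$ = function (selector) {
  calls.push(['$', selector]);
  return makeChainableStub(calls);
};
render(fake$);
// calls records ['$', '#selector'], then the css, html and click steps
```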
However, I tend to keep adapters as a very thin layer when introducing unit testing on an existing project, to reduce the cognitive load of the transition. Jumping from closures defined alongside the framework configuration to external objects containing logic is already a large step, and it shouldn't be made any steeper.
Here's what the wrap-and-mock approach looks like:
```javascript
// real implementation: a thin adapter wrapping the jQuery calls
var changeAppearance = function (element, css, html) {
  element.css(css).html(html);
};

// production code: the core receives the adapter as `appearance`
var factory = function (element, appearance) {
  return function () {
    // logic to test (the concrete values are illustrative)
    var css = { color: 'green' };
    var html = 'Done';
    appearance(element, css, html);
  };
};

// test: substitute a mock for the adapter
// (assertEquals comes from the test framework in use)
var mockAppearance = function (element, css, html) {
  assertEquals({ color: 'green' }, css);
  assertEquals('Done', html);
};
var dummyElement = {};
var objectUnderTest = factory(dummyElement, mockAppearance);
objectUnderTest();
```
Conclusions
The more logic you have inside JavaScript, the more you should care about testing it independently of everything else going on in the client side. An evolution towards ports modelling frameworks, Ajax and the DOM is natural; the next step is the introduction of adapters to further isolate the core from external concerns, providing it a clean API instead of the framework's jungle of calls.