
Progressive Progressive Web App, Part 2


In this post, we finish the progressive web application we started last time by getting our backend logic set up and creating a better front-end.


Welcome back! If you missed Part 1, check it out here!

Unified Server and Service Worker Logic - Hoops and Hurdles

It was certainly not easy to get to a shared code base between server and client. The Node + npm ecosystem and the web JS ecosystem are like genetically identical twins who grew up with different families: when they finally meet, there are many similarities, and many differences that need to be overcome... It sounds like a great idea for a movie.

I chose to prefer web APIs across the project. I decided on this because I didn't want to bundle and load extra code into the user's browser; I would rather take that hit on the server (I can scale the server, the user can't scale their device). So, if an API wasn't supported in Node, I had to find a compatible shim.

Here are some of the challenges I faced.

A Broken Module System

As the Node and web ecosystems grew up, they each developed different ways of componentizing, segmenting, and importing code at design time. This was a real issue when I was trying to build out this project.

I didn't want to use CommonJS in the browser. I have an irrational desire to stay away from as much build tooling as possible, and, combined with my dislike of how bundling works, that didn't leave me with many options.

My solution in the browser was to use the flat importScripts method. It works, but it depends on a very specific file ordering, as you can see in the service worker:

sw.js

// Order matters: the router, templating, and platform shims must load
// before the route handlers that depend on them.
importScripts(`/scripts/router.js`);
importScripts(`/scripts/dot.js`);
importScripts(`/scripts/platform/web.js`);
importScripts(`/scripts/platform/common.js`);
importScripts(`/scripts/routes/index.js`);
importScripts(`/scripts/routes/root.js`);
importScripts(`/scripts/routes/proxy.js`);

And then, for Node, I used the normal CommonJS loading mechanism in the same files, gated behind a simple if statement that detects the module system:

// Only true in a CommonJS environment such as Node; browsers skip this block.
if (typeof module !== 'undefined' && module.exports) {
    var doT = require('../dot.js');
    ...

My solution isn't scalable. It worked, but it littered my code with, well, code that I didn't want.
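To illustrate the pattern, a shared file in this style ends up looking something like the sketch below. This is a minimal sketch of the dual-loading idea, not the project's exact code; the file name and greet function are my stand-ins.

// shared/greet.js - loaded with importScripts() in the service worker
// and with require() in Node. Illustrative sketch only.

// In a worker, `self` is the global scope; in Node, it's `global`.
const scope = typeof self !== 'undefined' ? self : global;

// Attach to the global scope so importScripts() consumers can see it.
scope.greet = (name) => `Hello, ${name}!`;

// If a CommonJS module system is present (i.e. Node), export it too.
if (typeof module !== 'undefined' && module.exports) {
  module.exports = scope.greet;
}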

I look forward to the day when Node supports the same module system that the browsers will support... We need something simple, sane, shared, and scalable.

If you check out the code, you will see this pattern in nearly every shared file, and in many cases it was needed because I had to import the WHATWG Streams reference implementation.

Crossed Streams

Streams are probably the most important primitive we have in computing (and probably the least understood), and both Node and the web have their own, completely different, solutions. This was a nightmare to deal with in this project; we really need to standardize on a unified solution (ideally DOM Streams).

Luckily, there is a full reference implementation of the Streams API that you can bring into Node, and all you have to do is write a couple of utilities to map from Web stream to Node stream and from Node stream to Web stream.

// Wrap a Node read stream in a WHATWG ReadableStream (the ReadableStream
// here comes from the reference implementation mentioned above).
const nodeReadStreamToWHATWGReadableStream = (stream) => {
  return new ReadableStream({
    start(controller) {
      stream.on('data', data => controller.enqueue(data));
      // A stream controller signals failure with error(), not abort().
      stream.on('error', error => controller.error(error));
      stream.on('end', () => controller.close());
    }
  });
};

const { Readable } = require('stream');

// Expose a WHATWG ReadableStream as a Node Readable stream.
class FromWHATWGReadableStream extends Readable {
  constructor(options, whatwgStream) {
    super(options);
    const streamReader = whatwgStream.getReader();

    pump(this);

    function pump(outStream) {
      return streamReader.read().then(({ value, done }) => {
        if (done) {
          // Signal end-of-stream to Node consumers.
          outStream.push(null);
          return;
        }

        outStream.push(value.toString());
        return pump(outStream);
      });
    }
  }

  // pump() pushes data eagerly, so the pull-based _read() is a no-op.
  _read() {}
}

These two helper functions were only used on the Node side of this project. They let me get data into Node APIs that couldn't accept WHATWG streams and, likewise, pass data into WHATWG-stream-compatible APIs that didn't understand Node streams. I specifically needed this for the fetch API in Node.
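To make that concrete, here is a rough sketch of how the two helpers might be used on the server. node-fetch, the URL, and the placeholder stream are illustrative assumptions on my part, not the project's code.

// Sketch only: assumes node-fetch provides fetch() on the server.
const fetch = require('node-fetch');

// node-fetch hands back a Node-style body; wrap it so shared code that
// expects a browser-style (WHATWG) stream can consume it.
fetch('https://example.com/feed.xml').then(response => {
  const webBody = nodeReadStreamToWHATWGReadableStream(response.body);
  return webBody.getReader().read()
    .then(({ value }) => console.log('first chunk:', value.toString()));
});

// Going the other way: expose a WHATWG stream as a Node Readable so it
// can be piped into Node APIs (here, stdout).
// someWHATWGStream: any WHATWG ReadableStream (placeholder).
const nodeStream = new FromWHATWGReadableStream({}, someWHATWGStream);
nodeStream.pipe(process.stdout);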

Once I had streams sorted, the final problem and inconsistency was routing (coincidentally, this is where I needed the stream utilities the most).

Shared Routing

The Node ecosystem, particularly Express, is incredibly well known and amazingly robust, but we don't have a routing model that is shared between the server and the service worker.

Years ago I wrote LeviRoutes, a simple browser-side library that handles Express.js-style routes and hooks into the History API and the onhashchange event. No one used it, but I was happy. I managed to dust off the cobwebs (and make a tweak or two) and deploy it in this application. Looking at the code below, you can see that the routing is nearly the same on the server and in the service worker.

server.js

// The root '/'
app.get('/', (req, res, next) => {
  routes['root'](dataPath, assetPath)
    .then(response => node.responseToExpressStream(res, response));
});

// The proxy server '/proxy'
app.get('/proxy', (req, res, next) => {
  routes['proxy'](dataPath, assetPath, req)
    .then(response => response.body.pipe(res, {end: true}));
});

sw.js

// The proxy server '/proxy'
router.get(`${self.location.origin}/proxy`, (e) => {
  e.respondWith(routes['proxy'](dataPath, assetPath, e.request));
}, {urlMatchProperty: 'href'});

// The root '/'
router.get(`${self.location.origin}/$`, (e) => {
  e.respondWith(routes['root'](dataPath, assetPath));
}, {urlMatchProperty: 'href'});

I would love to see a unified solution that brings the service worker onfetch API into Node.

I would also love to see an Express-like framework that unifies request routing across Node and the browser. There were just enough differences that I couldn't use the same source everywhere, but we can already handle routes nearly identically on the client and the server, so we are not that far away.
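As a thought experiment, a minimal sketch of such an adapter might wrap service-worker-style handlers so Express can mount them. Everything here is hypothetical: the event shape, fetchEventAdapter, and the assumption that response.body is a Node stream are mine, not an existing API.

// Hypothetical sketch, not an existing library: adapt a service-worker-style
// onfetch handler so Express can mount it.
function fetchEventAdapter(handler) {
  return (req, res) => {
    const event = {
      request: req,  // a real shim would map req to a proper Request object
      respondWith(responsePromise) {
        Promise.resolve(responsePromise).then(response => {
          res.status(response.status);
          response.body.pipe(res, { end: true });  // assumes a Node-style body
        });
      }
    };
    handler(event);
  };
}

// The same handler shape as in sw.js above:
// app.get('/proxy', fetchEventAdapter(e => {
//   e.respondWith(routes['proxy'](dataPath, assetPath, e.request));
// }));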

No DOM Outside of the Render

When the user has no service worker available, the logic for the site is quite traditional: we render the site on the server and then incrementally refresh the content in the page through traditional AJAX polling.

The client-side logic uses the DOMParser API to turn an RSS feed into something that I can filter and query in the page.

// Get the RSS feed data.
fetch(`/proxy?url=${feedUrl}`)
      .then(feedResponse => feedResponse.text())
      // Convert it into DOM.
      .then(feedText => {
        const parser = new DOMParser();
        return parser.parseFromString(feedText, 'application/xml');
      })
      // Find all the news items.
      .then(doc => doc.querySelectorAll('item'))
      // Convert to an array of plain objects.
      .then(items => Array.prototype.map.call(items, item => convertRSSItemToJSON(item)))
      // Don't add items that already exist in the page.
      .then(items => items.filter(item => !document.getElementById(item.guid)))
      // Apply the DOM template to each item.
      .then(items => items.map(item => applyTemplate(itemTemplate.cloneNode(true), item)))
      // Add them into the page.
      .then(items => items.forEach(item => column.appendChild(item)));
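Since this runs as polling, the chain above presumably fires on a timer; a minimal wrapper might look like this (the refreshFeed name and the 60-second interval are my assumptions, not from the source):

// Hypothetical polling wrapper around the fetch chain above.
const refreshFeed = () => {
  // ... the fetch(`/proxy?url=${feedUrl}`) chain shown above ...
};

refreshFeed();
setInterval(refreshFeed, 60 * 1000);  // interval is assumed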

Accessing the DOM of the RSS feed using the standard browser APIs was incredibly useful, and it allowed me to use my own templating mechanism (which I am rather proud of) to update the page dynamically.

<template id='itemTemplate'>
  <div class="item" data-bind_id='guid'>
    <h3><span data-bind_inner-text='title'></span> (<a data-bind_href='link'>#</a>)</h3>
    <div data-bind_inner-text='pubDate'></div>
  </div>
</template>
<script>
  
const applyTemplate = (templateElement, data) => {
  const element = templateElement.content.cloneNode(true);
  const treeWalker = document.createTreeWalker(element, NodeFilter.SHOW_ELEMENT, () => NodeFilter.FILTER_ACCEPT);

  while (treeWalker.nextNode()) {
    const node = treeWalker.currentNode;
    // Each data-bind_* attribute names the element property to set and the
    // data key to read, e.g. data-bind_inner-text='title' sets node.innerText.
    for (let bindAttr in node.dataset) {
      // dataset camel-cases hyphens, so 'bind_inner-text' arrives as 'bind_innerText'.
      const isBindableAttr = bindAttr.indexOf('bind_') === 0;
      if (isBindableAttr) {
        const dataKey = node.dataset[bindAttr];
        const bindKey = bindAttr.substr(5);  // strip the 'bind_' prefix
        node[bindKey] = data[dataKey];
      }
    }
  }

  return element;
};
</script>

I was very pleased with myself until I realized that I couldn't use any of this on the server or in a service worker, because neither has a DOM. The only solution I had was to bring in a custom XML parser and walk its output to generate the HTML. It added some complications and left me cursing the web.
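The post doesn't name the server-side parser, but to illustrate the idea, here is a minimal DOM-free extraction of RSS items using the sax npm package as a stand-in; the parser choice and parseItems function are my assumptions, not the project's code.

// Sketch: DOM-free RSS parsing with the `sax` package (my choice of
// parser for illustration; the project may use something different).
const sax = require('sax');

function parseItems(xmlText) {
  const parser = sax.parser(true);  // strict XML mode
  const items = [];
  let current = null;
  let textTarget = null;

  parser.onopentag = (node) => {
    if (node.name === 'item') {
      current = {};
    } else if (current && (node.name === 'title' || node.name === 'guid')) {
      textTarget = node.name;  // collect text for this child element
    }
  };
  parser.ontext = (text) => {
    if (current && textTarget) {
      current[textTarget] = (current[textTarget] || '') + text;
    }
  };
  parser.onclosetag = (name) => {
    if (name === 'item') {
      items.push(current);
      current = null;
    }
    textTarget = null;
  };

  parser.write(xmlText).close();
  return items;
}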

In the long run, I would love to see more of the DOM APIs brought into workers and supported in Node, but the solution I have works, even if it isn't optimal.

There are really two questions in this post:

  • Is it practical to build systems that share a common server and service worker?
  • Is it possible to build a fully progressive Progressive Web App?

It is possible to build systems that share a common server and service worker, but is it practical? I like the idea, but I think it needs more research, because if you are going JS all the way, there are a lot of issues between the Node and web platforms that still need to be ironed out.

Personally, I would love to see more "Web" APIs in the Node ecosystem.

Is It Possible to Build a Fully 'Progressive' Progressive Web App?

Yes.

I am very pleased that I did this. Even if you don't share the same language on the client as on the server, there are a number of critical things that I think I have been able to show.

  1. App Shell is not the only model you can follow; the important point is that the service worker gives you control over the network, so you can decide what is best for your use case.
  2. It is possible to build a progressively rendered experience that uses service workers to bring performance and resilience (as well as an installed feel, if you like). You need to think holistically: start by rendering as much as you can on the server, and then take control of the client.
  3. It is possible to build experiences "trisomorphically" (I still think the term isomorphic is best), with a common code base, a common routing structure, and common logic shared across client, service worker, and server.

I leave this as a final thought: we need to investigate more about how we want to build progressive web apps, and we need to keep pushing on the patterns that let us get there. App Shell was a great start, but it is not the end. Progressive rendering and enhancement are the keys to the long-term success of the web; no other medium can do this as well as the web.

If you are interested in the code, check it out on GitHub, but you can also play with it directly and remix it on Glitch.

