Proxy/Cache: A Faster Local Environment


Some projects have a local development environment that isn't completely isolated and depends on infrastructure that we cannot run locally, whether because of licensing issues, time constraints, or other reasons.

In my case, this dependency is a service that receives numerous API calls, each taking some time to respond, and since we don't have the relevant license, we cannot test it locally.

Since we don't want to disclose client data, we'll use this cat-filled page as an example: https://http.cat/.


Searching around, I found an npm library that caches these API calls, so a request only takes time the first time it is made. There are plenty of libraries that can do this; I chose the following one, which seems to work quite well: https://www.npmjs.com/package/json-caching-proxy.


Once installed, we'll create a main.js file containing the following piece of code, which is all we need to spin up a server that listens for requests and redirects them to the real page.


This example is taken from the library's own page; we've just replaced the remoteServerUrl parameter with the domain we want to cache, in our case https://http.cat/, and proxyPort with port 8080, since we want to listen for requests made to our local host.

On the library's page, we can find different configuration options, such as:

  • how to keep the cache between sessions with the inputHarFile parameter (in this example, all data is deleted once we stop the server).
  • how to cache only those requests that go through a particular route.
  • how to exclude certain requests with the excludedRouteMatchers parameter.
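For instance, a combined configuration might look like the sketch below. The option names follow the library's documentation, but the file path and route pattern are hypothetical examples:

```javascript
// Hypothetical configuration — option names come from the json-caching-proxy
// documentation; the HAR file path and route pattern are just examples.
const options = {
  remoteServerUrl: 'https://http.cat',
  proxyPort: 8080,
  inputHarFile: './cache.har',            // preload the cache from a previous session
  cacheEverything: true,                  // cache all routes, not only JSON responses
  cacheBustingParams: ['_', 'dc'],        // query params to ignore when matching requests
  excludedRouteMatchers: [/\/admin\/.*/]  // requests that should never be cached
};
```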

Next, we start the server with this command:



node main.js

The server will be left listening for requests to our local host on port 8080. The console will show this:



JSON Caching Proxy Started:
Remote server url:      https://http.cat/
Proxy running on port:  8080
Proxy Timeout:          500000
Replay cache:           true
Save to cache:          true
Command prefix:         proxy
Proxy response header:  proxy-cache-playback
Cache all:              true
Cache busting params:   _,dc,cacheSlayer
Excluded routes:

Now we can access the page through localhost:8080; our server will be in charge of redirecting requests to https://http.cat/.


If we take a closer look at the messages appearing on the console, we can see all the requests we're making. Yellow messages indicate the first access to a resource: the request is redirected to the real page and the response is stored in the cache, so the next time it is requested we can skip the wait for the real API call.


Now we can navigate the website normally while our server keeps storing all requests in the cache.

Website navigation

Meanwhile, green messages indicate that the resource has been accessed previously: the request is not sent to the real page, and we obtain the response directly from our cache.

Responses from cache


To visualize and debug the results, we can use applications like Charles or Burp. In our example, we'll use Charles.

We'll activate Charles' Reverse Proxies option.

Reverse Proxies

And add this line:

Reverse Proxy Settings

This will make Charles intercept all API calls to localhost:80 and redirect them to our proxy-cache server.

Now we can access the page directly from localhost:80.

Directly access

And examine the results of each of our requests.

Request results


Published at DZone with permission of David Serrano. See the original article here.

