Cache Implementation in MuleSoft

Learn how to use the Cache Scope in Mule to store and reuse frequently called data, such as from a web service or database, to save time and memory.

The Cache Scope is a Mule feature for storing and reusing frequently called data, which saves time and reduces processing load.

Why Use Cache in Mule?

Let us consider a scenario where you call a web service or database very frequently in a Mule flow to get a token value (or any other frequently required parameter).

Assume we get ten requests a day through the API to retrieve the token; each time, the web service is called and the token is retrieved. With that volume, calling the web service on every request may look fine. But as the days roll on and requests keep increasing, we eventually reach a situation where we receive hundreds of requests a day and call the web service each time, which takes more time and consumes more memory.

So, in order to reduce the number of calls to the web service or database for frequently used information, Mule provides a feature called the Cache Scope, which stores frequently used data in an object store and reuses it in subsequent calls.

Mule provides the following storage strategies for cache implementation (a configuration sketch follows the list):

1. In-memory-store: Data is stored in local Mule runtime memory. This store is non-persistent, which means the data is lost if the Mule application is restarted or shut down.

Configuration required:

  • Store name
  • Maximum number of entries (that is, cached responses)
  • The “lifespan” of a cached response within the object store (i.e. time to live)
  • The expiration interval between polls for expired cached responses

2. Managed-store: Cached data is stored in a place defined by a ListableObjectStore inside Mule. This is a persistent storage mechanism, so in the event of a shutdown or restart of the Mule application, the data remains intact.

Configuration required:

  • Store name
  • Persistence of cached responses (true/false)
  • Maximum number of entries (i.e. cached responses)
  • The “lifespan” of a cached response within the object store (i.e. time to live)
  • The expiration interval between polls for expired cached responses

3. Simple-text-file-store: As the name denotes, data in this approach is stored in a file. This is also a persistent storage mechanism, so in the event of a shutdown or restart of the Mule application, the data remains intact.

Configuration required:

  • Store name
  • Maximum number of entries (i.e. cached responses)
  • The “lifespan” of a cached response within the object store (i.e. time to live)
  • The expiration interval between polls for expired cached responses
  • The name and location of the file in which the object store saves cached responses
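
To make the above concrete, here is a minimal sketch, in Mule 3 XML, of how the in-memory and text-file store types could be plugged into a caching strategy. All strategy names, store names, and values are illustrative and not taken from the article's screenshots, and exact attributes may vary by Mule version; the managed-store variant is shown in the next section.

    <!-- Illustrative only: caching strategies backed by the non-managed store types. -->

    <!-- In-memory store (non-persistent) -->
    <ee:object-store-caching-strategy name="InMemory_Caching_Strategy" doc:name="Caching Strategy">
        <in-memory-store name="inMemoryTokenStore"
                         maxEntries="100"
                         entryTTL="18000000"
                         expirationInterval="60000"/>
    </ee:object-store-caching-strategy>

    <!-- Simple text file store (persistent, file-backed); the directory path is an assumption -->
    <ee:object-store-caching-strategy name="File_Caching_Strategy" doc:name="Caching Strategy">
        <simple-text-file-store name="fileTokenStore"
                                maxEntries="100"
                                entryTTL="18000000"
                                expirationInterval="60000"
                                directory="./token-cache"/>
    </ee:object-store-caching-strategy>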

Now that we know why a cache is required and its advantages, let us begin with the actual implementation.

First, let us define the Global Caching Strategy and add all the required configuration.

[Image: Global Caching Strategy configuration]

As seen above, the caching strategy is defined with a managed store (the reason being that cache invalidation can also be performed on it if required, which will be covered in the next blog).

Here, the TTL is set to 5 hours, which means the token is kept in the cache for 5 hours. After that, when a new request enters, the web service is invoked again, the next token is stored, and the process continues as before.
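
Since the screenshot does not come through here, the following sketch shows roughly what such a global caching strategy could look like in Mule 3 XML. The names Token_Caching_Strategy and tokenStore are assumptions, and the 5-hour TTL is expressed in milliseconds:

    <!-- Hypothetical global caching strategy backed by a managed (persistent) object store.
         entryTTL = 5 hours = 18,000,000 ms; all names and limits are illustrative. -->
    <ee:object-store-caching-strategy name="Token_Caching_Strategy" doc:name="Caching Strategy">
        <managed-store storeName="tokenStore"
                       persistent="true"
                       maxEntries="100"
                       entryTTL="18000000"
                       expirationInterval="60000"/>
    </ee:object-store-caching-strategy>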

Now that we have defined the Cache Strategy, let's create a sample flow to see how this works.

In the Mule flow below, requests from the source are accepted by the HTTP Listener. We then enter the Cache Scope; on the first request, the web service is invoked, and the token retrieved from the service is stored in the cache.

Subsequent token requests do not invoke the web service; instead, the token is retrieved from the Mule object store cache for the next 5 hours (this timing can be adjusted based on the business requirement).

[Image: Mule flow with HTTP Listener, Cache Scope, and web service call]

Since we have already defined the Global Cache Strategy, the task here is only to reference the strategy which is already defined.

[Image: Cache Scope configuration referencing the global caching strategy]
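
As a rough sketch (not the exact flow from the screenshots), a Mule 3 flow wired this way could look like the following. The HTTP configuration names, the paths, and the Token_Caching_Strategy reference are all assumptions carried over from the earlier sketch:

    <!-- Illustrative flow: the Cache Scope wraps the outbound call, so the token service
         is only invoked on a cache miss; otherwise the cached response is returned. -->
    <flow name="get-token-flow">
        <http:listener config-ref="HTTP_Listener_Configuration" path="/token" doc:name="HTTP"/>
        <ee:cache cachingStrategy-ref="Token_Caching_Strategy" doc:name="Cache">
            <!-- Reached only when no valid entry exists in the cache -->
            <http:request config-ref="Token_Service_Configuration" path="/oauth/token" method="POST" doc:name="Call Token Service"/>
        </ee:cache>
        <logger message="Token response: #[payload]" level="INFO" doc:name="Logger"/>
    </flow>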

In addition to the above implementation, there are various other capabilities that can be used when implementing the cache, listed below (a small key-generation example follows the list):

  • Filter option within Cache Scope

  • Key generation

  • Key expression

  • Other existing Object Store mechanisms
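
For instance, key generation can be customized on the caching strategy itself with a key expression. The sketch below is hypothetical: it assumes the client ID arrives as an HTTP query parameter and uses it as the cache key so each client gets its own cached entry:

    <!-- Hypothetical: derive the cache key from a query parameter (clientId is an assumed name). -->
    <ee:object-store-caching-strategy name="Per_Client_Caching_Strategy"
                                      keyGenerationExpression="#[message.inboundProperties.'http.query.params'.clientId]"
                                      doc:name="Caching Strategy">
        <in-memory-store name="perClientTokenStore"
                         maxEntries="500"
                         entryTTL="18000000"
                         expirationInterval="60000"/>
    </ee:object-store-caching-strategy>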

Conclusion

The Mule caching strategy is a very useful feature for storing and reusing frequently used data; it improves response time and reduces processing load.
