Setting Rate Limiting in Cloudhub (to Prevent DoS Attacks)


Learn how you can use Cloudhub's API Gateway to set up SLA tier-based rate limiting to prevent any DoS attacks on your API.


1.0 Overview

The MuleSoft CloudHub environment, at an abstract level, consists of three main components: the API Manager, the Runtime Manager, and the API Gateway. If your Mule application is exposed as a REST API endpoint, then you are most likely using the API Manager.

Figure 1.0.

The API Manager allows the user to create RAML definition files as well as policies. As there are many reference documents about RAML definitions, I will not cover them here; instead, we will look at policies. The API Gateway is the component that gets the least attention from users, because it is not exposed for direct use, unlike its better-known counterparts (the API Manager and the Runtime Manager).

The API Gateway is the component that enforces the policies configured in API Manager. Figure 1.0 shows a typical user request to access a Mule application running on the Runtime Manager. At (2), the API Gateway validates and authorizes the user's request; if the request is authorized, the gateway allows it to proceed (3) to the Mule application. But if the request does not meet a policy enforced by the API Gateway, it is rejected, and a 4XX HTTP response code (with its associated message) is sent back to the user.
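The validate-then-forward-or-reject flow above can be sketched in a few lines. This is a hypothetical illustration of the decision sequence, not MuleSoft's implementation; all names here (`handle_request`, `known_clients`, `policy`) are made up for the example.

```python
# Hypothetical sketch of the gateway decision flow: validate/authorize
# the caller, check the configured policy, then forward or reject.
def handle_request(request, known_clients, policy):
    """Return a (status_code, body) pair the way the gateway would."""
    # (2) Validate and authorize the caller's credentials.
    client_id = request.get("client_id")
    if client_id not in known_clients:
        return 401, "Unauthorized"           # invalid credentials -> 4XX
    # Apply the configured policy (e.g. a rate limit check).
    if not policy(client_id):
        return 429, "Too Many Requests"      # policy violated -> 4XX
    # (3) The request is allowed through to the Mule application.
    return 200, "forwarded to Mule application"
```

With an authorized client and a passing policy the request is forwarded; an unknown client or a failed policy check yields a 4XX response, as described above.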

2.0 Setting Up SLA Tier Based Rate Limiting

With the understanding gained in section 1.0, we can now talk about the rate limiting policy and SLA tiers. As the name suggests, a rate limiting policy lets the API Gateway reject requests that exceed a configured number within a given window of time. Figure 2.0 shows an SLA-based rate limiting policy being configured:

Figure 2.0.

The details of this configuration are depicted in Figure 2.0a.

Figure 2.0a.

For rate limiting to work, the API Gateway needs a way to identify the requester, which is why it requires users to submit a client ID and client secret with every request they issue. If the API Gateway sees too many requests from a particular client ID, it rejects further requests until the specified duration has reset.
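A client call carrying these credentials might be built as follows. The endpoint URL is a placeholder, and in CloudHub the exact transport of the credentials (headers vs. query parameters) depends on how the policy is configured; `client_id`/`client_secret` headers are shown here as one common convention.

```python
# Build (but do not send) a request carrying the client credentials
# the gateway uses to identify the requester. URL and credential
# values are placeholders.
import urllib.request

req = urllib.request.Request(
    "https://example.cloudhub.io/api/orders",   # placeholder endpoint
    headers={
        "client_id": "YOUR_CLIENT_ID",          # issued on subscription
        "client_secret": "YOUR_CLIENT_SECRET",  # to an SLA tier
    },
)
# urllib normalises header names (first letter capitalised); the
# gateway matches header names case-insensitively.
```

Without valid credentials the gateway has nothing to key the rate limit on, so such requests are rejected outright rather than counted against a tier.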

The policy we put in place in Figure 2.0 lays the foundation for setting up SLA tiers. Figure 2.0b shows two SLA tiers that are configured.

Figure 2.0b.

Settings for the GOLD SLA tier allow a user to send 5,000 requests per hour; if a GOLD-tier user sends 5,001 requests within one hour, request number 5,001 is rejected (Figure 2.0c depicts the GOLD tier settings).

Figure 2.0c.

API users subscribed to the SILVER tier are only allowed to send 1,000 requests per hour (settings for the SILVER tier are shown in Figure 2.0d).

Figure 2.0d.
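The per-tier counting behaviour can be illustrated with a minimal fixed-window rate limiter. This is a sketch of the mechanism only, not MuleSoft's implementation; the tier names and limits mirror Figures 2.0c and 2.0d, and all identifiers are invented for the example.

```python
# Minimal fixed-window rate limiter keyed by client ID, with per-tier
# limits matching the GOLD (5,000/hr) and SILVER (1,000/hr) tiers above.
import time

TIER_LIMITS = {"GOLD": 5000, "SILVER": 1000}  # requests per hour
WINDOW_SECONDS = 3600

class SlaRateLimiter:
    def __init__(self, now=time.time):
        self.now = now
        self.windows = {}  # client_id -> (window_start, request_count)

    def allow(self, client_id, tier):
        """Return True if the request is within the tier's hourly limit."""
        start, count = self.windows.get(client_id, (self.now(), 0))
        if self.now() - start >= WINDOW_SECONDS:
            start, count = self.now(), 0       # window elapsed: reset count
        if count >= TIER_LIMITS[tier]:
            self.windows[client_id] = (start, count)
            return False                       # over the limit: reject (429)
        self.windows[client_id] = (start, count + 1)
        return True
```

A GOLD client's first 5,000 requests within the hour pass, and the 5,001st is rejected, exactly as described for Figure 2.0c; each client ID is counted independently.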

3.0 Conclusion

At the time of this writing, there are two types of rate limiting policies:

  • Rate Limiting Policy: the simplest form of rate limiting, applied as a blanket policy across all API users. This policy cannot be used in conjunction with SLA tiers; the rate limits of SLA tiers are always ignored when it is applied, as it takes precedence over them.
  • SLA-based Rate Limiting Policy: rate limiting based on the SLA tier subscription allocated to the API user, which is what was demonstrated in this document.

I would recommend SLA-based rate limiting, as it lets us apportion usage of the API to user/usage groups on a case-by-case basis. The other benefit of having rate limiting in place is that it helps prevent denial of service (DoS) attacks.


Published at DZone with permission of Kian Ting, DZone MVB. See the original article here.

Opinions expressed by DZone contributors are their own.
