How to make API rate limit policy in loopback


If you're planning on using Loopback on IBM Bluemix hosting, you can use their API Connect service, which includes customer-plan-based policies with API-level throttling, monitoring, API billing and many other API management features.

The StrongLoop API Microgateway used by API Connect is now open source (as of April 2017).

Since Loopback is just a layer on top of Express, you can alternatively use an Express rate-limiting lib directly.

For rate limiting on a single standalone Loopback server you can use one of the Express rate-limiting libs, for example express-rate-limit; a minimal sketch follows.
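This sketch assumes Loopback's default /api mount path and a LoopBack 3 style app; the window and limit values are arbitrary placeholders, and the library's default in-memory store only counts requests within a single process:

    // A sketch using express-rate-limit; since a Loopback app is an Express app,
    // the limiter is just ordinary Express middleware.
    var rateLimit = require('express-rate-limit');

    module.exports = function mountRateLimit(app) {
      // Limit each client IP to 100 requests per 15 minutes on the REST API.
      // Uses the library's default in-memory store, so counts are per process.
      app.use('/api', rateLimit({
        windowMs: 15 * 60 * 1000,
        max: 100
      }));
    };

In a LoopBack 3 project you could call this from server/server.js before booting, or register the limiter through middleware.json in an early phase.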

If you plan to use this on a cluster of Loopback servers, you'll need to store the API call counts as shared state, per user or per user session, rather than in each process. The weapon of choice for this is Redis, since it's a high-performance in-memory data store that can be scaled. Several Express rate-limiting libs support a Redis backend; alternatively, you can write a small middleware around a Redis counter yourself, as sketched below.
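A minimal sketch of such a shared counter, assuming ioredis and a Redis server on localhost; the key scheme, the 60-second fixed window and the limit of 100 are arbitrary choices:

    // Fixed-window counter shared by every Loopback instance in the cluster.
    var Redis = require('ioredis');
    var redis = new Redis();                 // assumes redis://localhost:6379

    module.exports = function rateLimit(req, res, next) {
      // One key per client IP per minute; INCR is atomic, so concurrent
      // requests across processes cannot lose counts.
      var key = 'rl:' + req.ip + ':' + Math.floor(Date.now() / 60000);
      redis.multi()
        .incr(key)
        .expire(key, 120)                    // let stale windows expire on their own
        .exec(function (err, results) {
          if (err) return next(err);         // decide whether to fail open or closed
          var count = results[0][1];         // results is [[err, value], ...]
          if (count > 100) {
            return res.status(429).send('Too Many Requests');
          }
          next();
        });
    };

Once the caller is authenticated you could key on the user or session ID instead of req.ip.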

Finally, you could also implement rate limiting on a reverse proxy. See Nginx Rate Limiting.
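For example, a minimal sketch with Nginx's limit_req module in front of a Loopback instance; the zone name, rate, burst and upstream address are placeholders:

    # nginx.conf (fragment)
    http {
        # One shared-memory zone keyed by client IP: at most 10 requests/second.
        limit_req_zone $binary_remote_addr zone=api_limit:10m rate=10r/s;

        server {
            listen 80;

            location /api/ {
                limit_req zone=api_limit burst=20 nodelay;   # queue short bursts, reject the rest
                proxy_pass http://127.0.0.1:3000;            # Loopback's default port
            }
        }
    }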


This is an access control policy.

You can handle this with a custom role created by a role resolver.

In the resolver callback, check whether the current user has exceeded the rate limit and grant or deny the role accordingly; a sketch follows.
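A minimal sketch of such a resolver as a LoopBack 3 boot script; the role name withinRateLimit, the window, the limit and the in-memory bookkeeping are all assumptions, and in a cluster you'd keep the counts in Redis instead:

    // server/boot/rate-limit-role.js (LoopBack 3)
    module.exports = function (app) {
      var Role = app.models.Role;
      var WINDOW_MS = 60 * 1000;   // 1-minute window (placeholder)
      var LIMIT = 100;             // max requests per window (placeholder)
      var buckets = {};            // userId -> { start, count }, per process only

      Role.registerResolver('withinRateLimit', function (role, context, cb) {
        var userId = context.accessToken && context.accessToken.userId;
        if (!userId) return cb(null, false);            // deny unauthenticated callers

        var now = Date.now();
        var bucket = buckets[userId];
        if (!bucket || now - bucket.start >= WINDOW_MS) {
          bucket = buckets[userId] = { start: now, count: 0 };  // fresh window
        }
        bucket.count++;
        cb(null, bucket.count <= LIMIT);                // resolve the role only under the limit
      });
    };

Models can then restrict access to that role with an ordinary ACL entry (principalType ROLE, principalId withinRateLimit, permission ALLOW).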


Such a policy can only* be built with a database, such as Redis or Memcached. For my projects I rely on redback, which is based on Redis. It has a built-in RateLimit helper (among others) and it takes care of some race conditions and atomic transactions.

* If you don't have a database, you could store the counts in memory (in a hash or array) and flush them on an interval, as sketched below, but I'd go with redback :)
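For completeness, a minimal sketch of that in-memory fallback as Express middleware; it only works within a single process, and the limit and flush interval are placeholders:

    // In-memory hash of request counts, flushed on an interval.
    var hits = Object.create(null);
    var LIMIT = 100;                        // max requests per window (placeholder)

    setInterval(function () {
      hits = Object.create(null);           // reset every counter once a minute
    }, 60 * 1000).unref();                  // don't keep the process alive just for this

    module.exports = function rateLimit(req, res, next) {
      hits[req.ip] = (hits[req.ip] || 0) + 1;
      if (hits[req.ip] > LIMIT) {
        return res.status(429).send('Too Many Requests');
      }
      next();
    };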