I want to create a layer of rate limiting for my API. Is using something like 20 instances of express-rate-limit a bad idea?
The issues I can foresee are reduced performance and increased RAM usage.
import rateLimit from 'express-rate-limit'
// import express, others...
const standard = {
  hour: rateLimit({
    windowMs: 60 * 60 * 1000, // 1 hour
    max: 30, // limit each IP to 30 requests per windowMs
  }),
  day: rateLimit({
    windowMs: 24 * 60 * 60 * 1000, // 1 day
    max: 100,
  }),
  week: rateLimit({
    windowMs: 7 * 24 * 60 * 60 * 1000, // 1 week
    max: 200,
  }),
}
app.use(standard.hour)
app.use(standard.day)
app.use(standard.week)
Your example with 3 instances is fine and is in fact the recommended way to do something like hourly/daily/weekly limits with express-rate-limit.
With the default memory store, each user costs about 152 bytes, so 60k users across 20 instances works out to roughly 180MB of RAM (about 9MB per instance). I doubt it would add more than a few milliseconds of processing time per request.
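Working through that estimate (the 152-byte figure and the user/instance counts are the ones above):

```javascript
// Back-of-envelope memory estimate for the default memory store.
const bytesPerUser = 152 // approximate cost of one tracked key
const users = 60_000
const instances = 20 // each instance tracks every user separately

const perInstanceMB = (bytesPerUser * users) / 1e6
const totalMB = perInstanceMB * instances

console.log(perInstanceMB.toFixed(1)) // ~9.1 MB per instance
console.log(totalMB.toFixed(1)) // ~182.4 MB across all 20 instances
```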
It would be a bit worse with an external store, since that means 20+ network requests that must run sequentially, but still not the end of the world. Memory usage would probably be similar, but latency would be worse.
Still, even 20 instances is probably fine.
With that out of the way, I do think it could be better. I've been thinking about providing a way for multiple instances of express-rate-limit to share a single store, and I finally wrote down some thoughts at https://github.com/express-rate-limit/express-rate-limit/discussions/435. So a future release may offer a way to apply all of the limits while tracking each user's hit count only once, which would negate both the memory and latency concerns.
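In the meantime, here's a rough sketch of that single-store idea as a hand-rolled middleware. This is hypothetical code, not part of the express-rate-limit API; the multiLimit name and the in-memory Map of timestamps are my own (and storing a timestamp per hit costs more memory than a simple counter, so it's only illustrative):

```javascript
// Hypothetical: check several rate-limit windows against one shared store,
// recording each hit only once instead of once per limiter instance.
const hits = new Map() // key (e.g. IP) -> array of hit timestamps

const limits = [
  { windowMs: 60 * 60 * 1000, max: 30 }, // hourly
  { windowMs: 24 * 60 * 60 * 1000, max: 100 }, // daily
  { windowMs: 7 * 24 * 60 * 60 * 1000, max: 200 }, // weekly
]

function multiLimit(req, res, next) {
  const now = Date.now()
  const key = req.ip

  // Keep only timestamps still relevant to the longest window.
  const longestWindow = Math.max(...limits.map((l) => l.windowMs))
  const timestamps = (hits.get(key) ?? []).filter((t) => now - t < longestWindow)
  timestamps.push(now)
  hits.set(key, timestamps)

  // Reject if any window's limit is exceeded.
  for (const { windowMs, max } of limits) {
    const count = timestamps.filter((t) => now - t < windowMs).length
    if (count > max) {
      return res.status(429).send('Too many requests')
    }
  }
  next()
}
```

It would plug in with a single `app.use(multiLimit)` instead of three `app.use` calls, at the cost of reimplementing what the library already handles (headers, stores, key generation, etc.).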
Disclaimer: I am the author of express-rate-limit