There are a number of articles, tutorials, and even modules that can limit malicious recursive queries by inspecting query depth or cost, or that rate-limit a GraphQL server against a high rate of consecutive requests (DoS attacks).
However, I haven't been able to find anything that would protect a GraphQL server from a "wide" query, one that simply pulls too many instances of a field in the same request, e.g.:
query MaliciousQuery {
alias1: fieldName { subfield1 subfield2 ... }
alias2: fieldName { subfield1 subfield2 ... }
...
alias10: fieldName { subfield1 subfield2 ... }
...
alias100: fieldName { subfield1 subfield2 ... }
...
alias1000: fieldName { subfield1 subfield2 ... }
...
}
Yes, GraphQL allows clients to ask for exactly what they need, but there are situations where we may want to limit the number of objects of a particular type that can be requested, especially if fetching such an object is expensive. I should mention that pagination is not desirable in my use case.
One way, of course, is to limit the overall length of the query string, but that is a crude approach and will have unintended side effects on complex queries that don't even touch the expensive objects. Cost analysis could also be used, but it seems like overkill for something this simple and introduces complexities of its own.
It would be great if we could have a limiting directive on the schema where we could specify something like
@perRequestLimit(count: 5)
so clients cannot request more than, say, 5 of these expensive objects in a single query.
Is anyone aware of such a module, etc.? Or is there a different way to achieve this type of limiting?
It appears that no such module/implementation exists, even though I feel this should be a core feature of GraphQL: it can be a significant DoS vulnerability and, as mentioned, cost/complexity analysis can be overkill in some cases (as in our particular scenario).
I ended up writing a simple @resourceLimit directive to restrict individual fields from being requested multiple times via aliases. This still leaves us the option of adding cost/complexity analysis down the road if we feel it becomes necessary, but, for now, it serves our limited requirement.
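For anyone looking to do something similar, the core of such a directive is just an alias-count check over the incoming query before execution. Below is a minimal, dependency-free sketch of that check. Note the assumptions: the AST shape is a simplified stand-in for the one graphql-js's parse() produces, and the names (countFieldUsage, validateRequest, the limits map, expensiveField) are all illustrative, not from an actual module. In a real server you would implement this as a custom validation rule and read the per-field limits from the @resourceLimit directive declarations in the schema.

```javascript
// Hypothetical per-field limits; in practice these would come from
// @resourceLimit(count: N) directives declared on the schema.
const limits = { expensiveField: 2 };

// Recursively count how many times each underlying field name is selected.
// Every alias of the same field counts separately, which is exactly the
// "wide query" pattern we want to catch.
function countFieldUsage(selections, counts = {}) {
  for (const sel of selections) {
    counts[sel.name] = (counts[sel.name] || 0) + 1;
    if (sel.selections) countFieldUsage(sel.selections, counts);
  }
  return counts;
}

// Reject the request before execution if any limited field is over budget.
function validateRequest(selections) {
  const counts = countFieldUsage(selections);
  const errors = [];
  for (const [field, max] of Object.entries(limits)) {
    if ((counts[field] || 0) > max) {
      errors.push(
        `Field "${field}" requested ${counts[field]} times; limit is ${max}.`
      );
    }
  }
  return errors;
}

// Example: three aliases of the same expensive field, with a limit of 2.
const query = [
  { alias: 'a1', name: 'expensiveField', selections: [{ name: 'id' }] },
  { alias: 'a2', name: 'expensiveField', selections: [{ name: 'id' }] },
  { alias: 'a3', name: 'expensiveField', selections: [{ name: 'id' }] },
];
console.log(validateRequest(query));
// → [ 'Field "expensiveField" requested 3 times; limit is 2.' ]
```

Since this runs against the parsed query before any resolver fires, the expensive fetches are never attempted; with graphql-js you would plug the equivalent logic in via the server's validation-rules hook rather than hand-rolling the AST walk as above.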