I'm currently working on an SPA backed by an ASP.NET MVC API. We recently added client-side caching via HTTP headers on our API responses, with appropriate `max-age` values depending on the expected frequency of changes.
While this has improved performance, we now have the issue where a user makes a change themselves and then gets a cache hit with the old data when reloading the page.
To resolve this, I've added a version parameter to our GET requests that is incremented each time a change is made.
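Here is a minimal sketch of that workaround, assuming a fetch-based client (the `apiGet` and `bumpVersion` names and the `v` query parameter are just illustrative, not our actual code):

```typescript
// Illustrative cache-busting sketch: bump a counter on every write and
// append it to GET URLs so stale cached responses are never re-used.
let dataVersion = 0;

export function bumpVersion(): void {
  // Call this after every successful POST/PUT/DELETE.
  dataVersion++;
}

export async function apiGet<T>(url: string): Promise<T> {
  // A different `v` value produces a different cache key, so the
  // browser's HTTP cache misses and fetches fresh data.
  const separator = url.includes("?") ? "&" : "?";
  const response = await fetch(`${url}${separator}v=${dataVersion}`);
  if (!response.ok) {
    throw new Error(`GET ${url} failed: ${response.status}`);
  }
  return response.json() as Promise<T>;
}
```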
However, I've now found that RFC 7234 Section 4.4 states that a non-error response to a POST, PUT, or DELETE request should invalidate cached GET responses for the same URI.
Given that, I'm wondering how I should better design my APIs so the version parameter is not necessary and the browser will automatically handle this.
For example, I have requests like (paths are illustrative):

1. `GET /api/items`
2. `POST /api/items`
3. `GET /api/items/{id}`
4. `PUT /api/items/{id}`
Request 2 will invalidate request 1, and request 4 will invalidate request 3; however, request 4 should also invalidate request 1.
Is this correct behaviour? Or should request 1 just return a collection of the IDs of all resources, with a separate request 3 made for each ID? That doesn't seem workable, as it would result in hundreds of requests rather than one.
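Concretely, the sequence I'm worried about looks like this (a sketch using the illustrative paths from above, assuming a fetch-based client):

```typescript
// (Assume this runs inside an async function.)
// The problematic sequence: the collection response stays cached even
// though one of its members has just been changed.
const before = await fetch("/api/items").then(r => r.json()); // cached via max-age

await fetch("/api/items/42", {
  method: "PUT",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({ name: "renamed" }),
}); // invalidates the cache for /api/items/42 only, per RFC 7234 Sec 4.4

const after = await fetch("/api/items").then(r => r.json());
// Within max-age, `after` can come straight from the browser cache and
// still show the old data for item 42.
```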
Is there an easy solution to this?
In the same section you are quoting, the specification states:
Note that this does not guarantee that all appropriate responses are invalidated.
Invalidation is a very difficult task in a distributed environment. There might be other caches along the way, or other resources that rely on the same data (as in your case). That means complete invalidation should not be attempted; it is cheaper to design the system around this limitation.
One "workaround" is to make the client force an update on the resource which it knows must be changed because of the PUT
. So you could make a request for yourself (and for the cache) to update the representation of the "parent" resource with this header:
```
Cache-Control: max-age=0
```
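A minimal sketch of that from a fetch-based SPA client; the paths and helper name are illustrative assumptions, not from your API:

```typescript
// After a successful PUT, force the browser cache to revalidate the
// "parent" collection so the next page load doesn't serve stale data.
async function updateItem(id: number, payload: unknown): Promise<void> {
  const res = await fetch(`/api/items/${id}`, {
    method: "PUT",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(payload),
  });
  if (!res.ok) throw new Error(`PUT failed: ${res.status}`);

  // The request header Cache-Control: max-age=0 tells the local cache
  // its stored copy is too old, so it revalidates with the server.
  // (fetch's `cache: "no-cache"` option has a similar effect.)
  await fetch("/api/items", { headers: { "Cache-Control": "max-age=0" } });
}
```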
Again, other caches might still hold out-of-date but still-valid responses, but this ensures that the same client on the same machine does not receive conflicting information within the same workflow.
So I would not "normalize" the representations to return just URIs without any data; I would rather design the workflows in a way that avoids such problems where possible. If that is not possible, force a refresh (as described above), set a sufficiently small caching time, or do not cache at all if all else fails.