I've made a simple app that sets Cache-Control: public, immutable, max-age=3600 in its response headers. The app is deliberately slow (it sleeps for 2 seconds) to reproduce the issue.
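Something along these lines (a minimal Sinatra sketch for illustration, not the exact app; all it needs to do is set the header and be slow):

require 'sinatra'

get '/' do
  sleep 2                                                      # deliberately slow origin
  headers 'Cache-Control' => 'public, immutable, max-age=3600' # the caching directive in question
  'Hello world!'
end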
My goal is to understand whether I can use Heroku for short-term HTTP caching of API responses from my origin Heroku app, which is slow compared to a cache server.
From what I can see, there isn't any HTTP caching by Heroku in front of my Heroku app.
Actual behavior:
Each request hits the origin/Heroku app:
$ curl https://cachecontroltest123.herokuapp.com/ --head
HTTP/1.1 200 OK
Connection: keep-alive
Cache-Control: public, immutable, max-age=3600
Content-Type: text/html;charset=utf-8
Content-Length: 12
X-Xss-Protection: 1; mode=block
X-Content-Type-Options: nosniff
X-Frame-Options: SAMEORIGIN
Server: WEBrick/1.4.2 (Ruby/2.6.6/2020-03-31)
Date: Sat, 24 Oct 2020 09:28:51 GMT
Via: 1.1 vegur
Expected behavior:
After the first request, Heroku should cache the response and, for the next hour, serve it without hitting the (slow) origin.
I found a few related questions on Stack Overflow, each saying that Heroku has a cluster of Varnish HTTP caching servers in front of every app, but those answers are pretty old. Has anything changed since then?
Heroku does not do any HTTP caching.
This has to be done either in your app (in the dyno, by your framework or by something like Varnish/Nginx) or, for heavier load, by a CDN you put in front of the Heroku app (Fastly, CloudFront, Cloudflare).
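For the in-dyno option in a Ruby app, one possibility is Rack::Cache in front of the application. A minimal sketch, assuming a config.ru, the rack-cache gem, and a classic-style Sinatra app in app.rb:

# config.ru
require 'rack/cache'
require './app'

use Rack::Cache,
  metastore:   'heap:/',   # response metadata kept in dyno memory
  entitystore: 'heap:/'    # response bodies kept in dyno memory (lost on restart/deploy)

run Sinatra::Application

Rack::Cache honors the Cache-Control: public, max-age=... header your app already sends, so repeat requests within the hour are answered from the cache instead of hitting the slow route. Note that the heap stores are per-dyno, so each dyno warms its own cache; a shared store or a CDN in front gives you a shared cache.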
There's also the Edge Heroku add-on, which simplifies AWS CloudFront configuration.