I have been trying to set up a custom CDN using Caddy and Varnish. The idea is that Caddy generates an SSL certificate on demand and then passes the request to Varnish, which forwards it to the backend server, a Node.js application. If the request matches a cached object, Varnish returns the cached result; otherwise it fetches fresh data from the backend. The workflow is described in the flow diagram.
Here are the respective files:
docker-compose.yml
version: '3.7'

networks:
  web:
    external: true
  internal:
    external: false
    driver: bridge

services:
  caddy:
    image: caddy
    container_name: caddy
    restart: unless-stopped
    ports:
      - "8080:8080"
      - "80:80"
      - "443:443"
    volumes:
      - $PWD/Caddyfile:/etc/caddy/Caddyfile
      - $PWD/site:/srv
      - caddy_data:/data
      - caddy_config:/config
    networks:
      - web

  varnish:
    container_name: varnish
    image: varnish:stable
    restart: unless-stopped
    volumes:
      - $PWD/data/varnish/default.vcl:/etc/varnish/default.vcl
    networks:
      - web
      - internal

volumes:
  caddy_data:
    external: true
  caddy_config:
Caddyfile
{
    on_demand_tls {
        ask https://check-domain-URL
    }
}

https:// {
    tls {
        on_demand
    }

    reverse_proxy varnish:80 {
        header_up Host {host} # Won't work with another value or transparent preset
        header_up X-Forwarded-Host {host}
        header_up X-Real-IP {remote}
        header_up X-Forwarded-For {remote}
        header_up X-Forwarded-Proto {scheme}
        header_up X-Caddy-Forwarded 1
        header_down Cache-Control "public, max-age=31536000"
    }

    header /_next/static/* {
        Cache-Control "public, max-age=31536000, immutable"
    }
}

:8080 {
    reverse_proxy backend-address:3000
}
default.vcl
vcl 4.0;

backend default {
    .host = "caddy";
    .port = "8080";
}

sub vcl_deliver {
    # Insert diagnostic header to show Hit or Miss
    if (obj.hits > 0) {
        set resp.http.X-Cache = "HIT";
        set resp.http.X-Cache-Hits = obj.hits;
    } else {
        set resp.http.X-Cache = "MISS";
    }
}

sub vcl_backend_response {
    set beresp.ttl = 10s;
    set beresp.grace = 1h;
}
Everything is working fine. The only problem is that the Varnish cache always misses, and caching is the one thing it is supposed to do. I have tried everything I could think of, but it looks like Varnish sees every request as a new request.
Any ideas?
In order to understand why you're getting cache misses, you need to understand the built-in VCL.
This is the VCL code that is executed behind the scenes. Please have a look at the following tutorial that explains this: https://www.varnish-software.com/developers/tutorials/varnish-builtin-vcl/.
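For reference, these are roughly the relevant checks, condensed from the built-in VCL that ships with recent Varnish versions. This is an excerpt, not a complete VCL file, and the exact code may differ slightly per release:

sub vcl_recv {
    if (req.method != "GET" && req.method != "HEAD") {
        # Only GET and HEAD are cached; everything else is passed to the backend
        return (pass);
    }
    if (req.http.Authorization || req.http.Cookie) {
        # Requests with credentials or cookies bypass the cache
        return (pass);
    }
    return (hash);
}

sub vcl_backend_response {
    if (beresp.ttl <= 0s ||
      beresp.http.Set-Cookie ||
      beresp.http.Surrogate-control ~ "(?i)no-store" ||
      (!beresp.http.Surrogate-Control &&
        beresp.http.Cache-Control ~ "(?i:no-cache|no-store|private)") ||
      beresp.http.Vary == "*") {
        # Mark the object as uncacheable ("hit-for-miss") for 2 minutes
        set beresp.ttl = 120s;
        set beresp.uncacheable = true;
    }
    return (deliver);
}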
I'd like to summarize the standard situations where Varnish doesn't cache:
- The request method is something other than GET or HEAD
- The request contains an Authorization header
- The request contains a Cookie header
- The response contains a Set-Cookie header
- The response has an Expires header in the past, or max-age=0 or s-maxage=0 in the Cache-Control header
- The response has private, no-cache or no-store in the Cache-Control header
- The response has a Vary: * header

An easy way to figure out why a cache miss occurs or why the cache is bypassed is by using varnishlog.
You can run the following command to check logs for the homepage:
sudo varnishlog -g request -q "ReqUrl eq '/'"
Please change / into the desired URL.
If your application doesn't behave according to the built-in VCL specifications, you'll have to write some VCL code to compensate for that.
In most cases that is stripping off tracking cookies or defining which URL patterns should bypass the cache while forcing cache lookups on all other pages.
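As an illustration, here is a minimal sketch of a sub vcl_recv you could add to the existing default.vcl. The cookie names and URL patterns below are assumptions made for the example; replace them with whatever your Node.js application actually uses:

sub vcl_recv {
    # Hypothetical example: always bypass the cache for personalised or dynamic endpoints
    if (req.url ~ "^/(api|account|login)") {
        return (pass);
    }

    # Strip common tracking cookies (_ga, _gid, _gat, _fbp) so the Cookie header
    # doesn't force a pass in the built-in VCL
    if (req.http.Cookie) {
        set req.http.Cookie = regsuball(req.http.Cookie, "(_ga|_gid|_gat[^=]*|_fbp)=[^;]*(; )?", "");
        set req.http.Cookie = regsub(req.http.Cookie, "[; ]+$", "");
        if (req.http.Cookie == "") {
            unset req.http.Cookie;
        }
    }

    # Everything else falls through to the built-in vcl_recv, which performs a regular cache lookup
}

Only add this kind of logic once the logs confirm that cookies or specific URLs are in fact what is triggering the pass.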
Have a look at the logs to figure out what's what and if you need help, just add a log transaction to your question and I'll help you out.