Tags: path, browser-cache, cache-control

Is it possible to force a browser to use the same cache entry for different paths?


For example, there are 2 paths at the same domain: some_domain.com/first and some_domain.com/second.

Both of these paths return the same file, let's say contents.html. Is it possible to configure caching in such a way that, after opening some_domain.com/first, no loading is performed for some_domain.com/second - it just reuses the cache entry prepared for some_domain.com/first?

Should it be done server-side by returning some special hypothetical Cache-Key: key_value header? The closest related thing I could find is this question: Is it possible to cache a file in the browser despite its querystring?

Update - I can rephrase it a little. I'm using React with its Router. The server is configured to return index.html for all possible paths, so routes are then processed by client-side JS. Let's say caching is configured so that nothing the server returns expires. But here is the issue: once I open some_domain.com/first, it gets cached, so the next time I access the same path it is not downloaded again. However, even though some_domain.com/second returns the same index.html, it is not cached after some_domain.com/first has been opened - I need to open this second path at least once as well (i.e. by loading the corresponding page in the browser) to get it cached too, which is obviously pointless, as both routes return the same index.html.
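
For illustration, such a catch-all configuration could look roughly like the following sketch (an Express-based example; the file names, paths, and port are placeholders rather than my actual setup):

    const express = require("express");
    const path = require("path");

    const app = express();

    // Static assets (JS bundles, CSS, etc.) are served directly ("build" is a placeholder folder).
    app.use(express.static(path.join(__dirname, "build")));

    // Every other path (/first, /second, ...) returns the same index.html,
    // so the client-side router can then handle the route.
    app.get("*", (req, res) => {
        res.sendFile(path.join(__dirname, "build", "index.html"));
    });

    app.listen(3000); // placeholder port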


Solution

  • I've managed to solve the issue described in the question. In short, it's done by a service worker that checks event.request.referrer for each "fetch" event - in my case the referrer is always empty whenever the root index.html of the SPA is fetched. So every time there is no referrer value, I refresh and cache the same / destination.

    If you are interested in a particular implementation, here is a very simple example server configuration done with Ktor (the key point is to introduce "Last-Modified" headers to drive caching):

    // Imports for Ktor 2.x (package locations may differ in other Ktor versions).
    import io.ktor.server.application.*
    import io.ktor.server.engine.*
    import io.ktor.server.http.content.*
    import io.ktor.server.netty.*
    import io.ktor.server.plugins.compression.*
    import io.ktor.server.plugins.conditionalheaders.*
    import io.ktor.server.routing.*
    
    fun main() {
        embeddedServer(Netty) {
            install(Compression) {
                gzip()
            }
            install(ConditionalHeaders) // adds proper "Last-Modified" headers according to states of served files
            routing {
                singlePageApplication {
                    react("build/dist/js/productionExecutable")
                }
            }
        }.start(wait = true)
    }
    

    And here is the code of my service worker with comments inside:

    // Configs.
    
    /** Key to store HTTP responses. */
    const RESPONSE_CACHE = "responses-v1";
    
    
    
    // Fetching.
    
    /** An entry point to handle all requests. */
    const fetchStrategy = event => {
        const destination = getDestinationToBeProcessedByServiceWorker(event);
        if (destination) {
            event.respondWith(fetchModifiedWithFallbackToCached(destination));
        }
    };
    
    /**
     * Prepares a destination to be processed by the service worker 
     * or returns `null` if the provided `event` should not be handled by the service worker.
     */
    const getDestinationToBeProcessedByServiceWorker = event => {
        // No referrer value implies an attempt to fetch the root `index.html`:
        // it can have different paths, but to avoid unnecessary requests for each of these paths,
        // only `/` will be used to fetch and cache its contents.
        if (!event.request.referrer) {
            return "/";
        }
        // All other types of requests will be processed by default browser means (including caching).
        return null;
    };
    
    /**
     * Tries to fetch a `destination` and cache it if it was modified,
     * returns its cached version if there are no updates.
     * 
     * It seems inevitable to perform at least one additional network request inside a service worker
     * for each response that has already been fetched by the browser before the service worker's activation:
     * there is no straightforward way to share service worker caches with default browser ones
     * or get response data for some static request performed by the browser to fetch a resource.
     */
    const fetchModifiedWithFallbackToCached = async destination => {
        const cachedResponse = await getCachedResponse(destination);
        const request        = createRequestWithProperCacheHeader(destination, cachedResponse);
        let networkResponse;
        try {
            networkResponse = await fetch(request);
        } catch (error) {
            if (!cachedResponse) {
                throw error; // there could be some other fallback instead of just propagating the error
            }
        }
        if (await cacheNetworkResponseIfModified(destination, networkResponse)) {
            return networkResponse;
        }
        return cachedResponse;
    };
    
    /**
     * If there is some `cachedResponse` available, 
     * creates a request with a header to fetch the `destination` only if it was modified.
     * 
     * Returns just a pure `destination` otherwise.
     */
    const createRequestWithProperCacheHeader = (destination, cachedResponse) => {
        if (cachedResponse) {
            return new Request(destination, {
                headers: new Headers({
                    "If-Modified-Since": cachedResponse.headers.get("Last-Modified")
                })
            });
        } else {
            return destination;
        }
    };
    
    /**
     * Caches the provided `networkResponse` only if its status implies that it was modified:
     * returns `true` in this case and `false` otherwise.
     */
    const cacheNetworkResponseIfModified = async (destination, networkResponse) => {
        if (networkResponse && networkResponse.ok) {
            await cacheResponse(destination, networkResponse.clone());
            return true;
        } else {
            return false;
        }
    };
    
    
    
    // Caching.
    
    const getCachedResponse = async destination => { 
        return await (await openResponseCache()).match(destination);
    };
    
    const cacheResponse = async (destination, response) => {
        await (await openResponseCache()).put(destination, response);
    };
    
    const openResponseCache = async () => await caches.open(RESPONSE_CACHE);
    
    
    
    // Service worker's setup.
    
    // It seems there is no reason to perform any caching during the installation phase,
    // because methods like `Cache.add(...)` still perform fetching under the hood
    // and so cannot avoid one more additional request for each resource expected to be cached
    // after the browser has already fetched it.
    self.addEventListener("fetch", fetchStrategy);
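
    For completeness, the page still has to register the service worker; a minimal registration sketch (assuming the worker above is served as /service-worker.js - the actual path depends on the build setup) could look like this:

    // Registers the service worker once the page has finished loading.
    if ("serviceWorker" in navigator) {
        window.addEventListener("load", () => {
            navigator.serviceWorker
                .register("/service-worker.js") // placeholder path; depends on how the worker file is served
                .catch(error => console.error("Service worker registration failed:", error));
        });
    }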
    

    As a bonus, it's pretty easy to extend this service worker to also cache static resources, giving the SPA a kind of offline mode; a rough sketch follows.
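
    Assuming static assets are exactly the requests that do have a referrer (this split, as well as the helper names, follows the worker above; real projects may need finer request matching), the extension could be a cache-first handler reusing the caching helpers:

    /** Returns a cached response for a static asset, falling back to the network and caching the result. */
    const cacheFirst = async request => {
        const cached = await getCachedResponse(request.url);
        if (cached) {
            return cached;
        }
        const networkResponse = await fetch(request);
        if (networkResponse && networkResponse.ok) {
            await cacheResponse(request.url, networkResponse.clone());
        }
        return networkResponse;
    };
    
    // Inside `fetchStrategy`, requests that do have a referrer (i.e. static assets here)
    // could then be handled with `event.respondWith(cacheFirst(event.request))`.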