In my service worker's 'fetch' listener I want to examine the body of a POST request, and then later do event.respondWith(fetch(event.request)). I clone the request and then convert the clone's body to an object, which works fine, but then event.respondWith(fetch(event.request)) throws the "event handler is already finished" error. I thought that if I cloned the request, I was free to read the clone's body without impacting the original request. How can I examine the request body in the 'fetch' listener and then still fetch the request later?
Here's the service worker:
'use strict';

self.addEventListener('fetch', async (event) => {
  try {
    const req_obj = await event.request.clone().json();
    console.log(req_obj);
  } catch (e) {
    console.error(e);
  }
  event.respondWith(fetch(event.request));
});
If I don't clone the request, or if I clone it but don't call the clone's .json() method, then everything is fine. I also tried new Request(event.request) instead of event.request.clone(), and I tried first copying the request with the constructor and then calling .clone() on that copy, but I still get the same error when finally doing event.respondWith(fetch(event.request));
At https://developer.mozilla.org/en-US/docs/Web/API/Request/clone I read some material about backpressure on teed streams, which I didn't really understand, and then this warning:
Beware when you construct a Request from a stream and then clone it.
Does this mean I can't clone my request and then later fetch it?
The issue is that event.respondWith must be called synchronously, while the fetch event is still being dispatched. Awaiting the .json() call yields control back to the browser, so the listener returns before respondWith is reached, and by the time the await resumes the event handler has already finished. (Marking the listener async doesn't help here; the browser doesn't wait on the listener's returned promise.) Cloning the request is not the problem. To resolve this, call event.respondWith synchronously and pass it a promise that performs both the .json() call and the fetch:
self.addEventListener('fetch', (event) => {
  event.respondWith((async () => {
    try {
      const req_obj = await event.request.clone().json();
      console.log(req_obj);
    } catch (e) {
      console.error(e);
    }
    return fetch(event.request);
  })());
});
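One refinement worth considering: .json() on the clone will reject for every GET request and for any non-JSON body, so the catch block fires constantly. A small guard avoids cloning and parsing requests that can't have a JSON body. This is a sketch under my own assumptions; the helper name and the content-type check are mine, not from the question or any spec:

```javascript
// Hypothetical helper: only POST requests that declare a JSON content type
// are worth cloning and parsing. Everything else is forwarded untouched.
function shouldInspectBody(request) {
  const type = request.headers.get('content-type') || '';
  return request.method === 'POST' && type.includes('application/json');
}
```

Inside the promise passed to event.respondWith you would then wrap the event.request.clone().json() call in if (shouldInspectBody(event.request)) { ... } before falling through to return fetch(event.request).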