I'm currently looking into Scotty for web development, and so far it looks pretty good. I'm worried, though, that there seems to be no way to discard a file upload (or better yet an arbitrary POST body) whose size is above a certain limit without receiving the whole file first. The example at https://github.com/scotty-web/scotty/blob/master/examples/upload.hs doesn't mention file size limits and I can't find anything in the documentation.
I could of course do a length on the ByteString, but I can't see how that would work until the whole file is already loaded into memory.
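Concretely, the only check I can think of is something along these lines (just a sketch, with B being Data.ByteString.Lazy and tooLarge a made-up helper), which doesn't help because B.length forces the whole upload before anything can be rejected:

import qualified Data.ByteString.Lazy as B
import Data.Int (Int64)

-- naive check: B.length walks every chunk of the lazy ByteString, so the
-- entire upload has already been received and buffered before we can reject it
tooLarge :: Int64 -> B.ByteString -> Bool
tooLarge maxBytes contents = B.length contents > maxBytes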
You should be able to set some maxBytes parameter, lazily take at most maxBytes + 1 bytes from each file's contents, partition your file uploads into successes and failures by length, and then handle each group. Here's some untested code to illustrate what I mean in the context of your application (it assumes Data.List.partition, System.FilePath's (</>), Control.Monad.IO.Class's liftIO, strict Data.ByteString.Char8 as BS, lazy Data.ByteString.Lazy as B, and fileName/fileContent from Network.Wai.Parse are in scope):
post "/upload" $ do
fs <- files
let maxBytes = 9000 -- etc
fs' = [ (fieldName, BS.unpack (fileName fi), B.take (maxBytes + 1) (fileContent fi)) | (fieldName,fi) <- fs ]
(oks, fails) = partition ((<= maxBytes) . B.length) fs' -- separate out failures
liftIO $ sequence_ [ B.writeFile ("uploads" </> fn) fc | (_,fn,fc) <- oks ]
-- do something with 'fails'
-- and continue...
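What you do with fails is up to you; one possibility (again just a sketch, additionally assuming Control.Monad's unless and status413 from Network.HTTP.Types.Status are in scope) is to reject the whole request when anything was oversized:

    -- one way to handle 'fails': report the problem with a 413 response
    unless (null fails) $ do
        status status413
        text "one or more uploads exceeded the size limit"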
It's also entirely possible to just filter out failures "on the fly", but that solution is more specific to what you want to do with the failures -- the code above should illustrate the idea, though, and a rough sketch of the "on the fly" variant follows below. Either way, this should take care of your concerns: since you're using lazy ByteStrings, B.take shouldn't have to read in the full contents of any file that ends up tagged as a failed upload.
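For completeness, here's an equally untested sketch of the "on the fly" variant (same imports as above), where oversized uploads are simply dropped by a guard in the list comprehension instead of being collected:

post "/upload" $ do
    fs <- files
    let maxBytes = 9000
        -- keep each file's name together with a bounded prefix of its contents
        candidates = [ (BS.unpack (fileName fi), B.take (maxBytes + 1) (fileContent fi))
                     | (_, fi) <- fs ]
    -- the guard silently drops anything longer than maxBytes
    liftIO $ sequence_ [ B.writeFile ("uploads" </> fn) fc
                       | (fn, fc) <- candidates, B.length fc <= maxBytes ]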