Consider a database engine that operates on an externally opened file - like SQLite, except the file handle is passed to its constructor. I'm using a setup like this for my app, but I can't figure out why Node.js insists on closing the file descriptor after 2 seconds of operation. I need that handle to stay open!
const db = await DB.open(await fs.promises.open('/path/to/db/file', 'r+'));
...
(node:100840) Warning: Closing file descriptor 19 on garbage collection
(Use `node --trace-warnings ...` to show where the warning was created)
(node:100840) [DEP0137] DeprecationWarning: Closing a FileHandle object on garbage collection is deprecated. Please close FileHandle objects explicitly using FileHandle.prototype.close(). In the future, an error will be thrown if a file descriptor is closed during garbage collection.
The `DB` class uses the provided file handle extensively over an extended period of time, so having it close is rather annoying. In that class, I'm using methods such as `readFile`, `createReadStream()`, and the `readline` module to step through the lines of the file. I'm passing `{ autoClose: false, emitClose: false }` to any read/write streams I'm using, but to no avail.
Thanks
I suspect you're running into an evil problem with using `await` inside this loop:

for await (const line of readline.createInterface({input: file.createReadStream({start: 0, autoClose: false})}))

If you use `await` anywhere else in the `for` loop body (which you are), the underlying stream fires all its `data` events and finishes while you are parked at that other `await`, and in some cases your process even exits before you get to process any of the `data` or `line` events from the stream. This is a truly flawed design and has bitten many others.
The safest way around this is to not use the async iterator at all, and instead wrap a promise yourself around the regular events from the readline object.