We use Apache FOP to convert a large number of XML files to AFP and PDF output; the current load is around 25k files per run on an HP-UX system. Eight threads in total initialize and trigger the FOP conversions in a producer-consumer fashion. Recently there have been multiple failures during conversion, and on inspection we find generic FOP errors like:
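For context, the producer-consumer layout described above can be sketched with JDK primitives alone. Everything here is illustrative: `convert()` is an empty placeholder for the actual FOP transform, and the class and method names are invented, not our real code.

```java
import java.util.concurrent.*;
import java.util.concurrent.atomic.AtomicInteger;

public class ConversionPool {
    static final int THREADS = 8;

    // Placeholder: in the real system this would run the FOP transform.
    static void convert(String xmlPath) { }

    // Feed n file names through a bounded queue to THREADS consumer
    // threads; returns the number of files "converted".
    static int runBatch(int n) throws InterruptedException {
        BlockingQueue<String> queue = new LinkedBlockingQueue<>(1000);
        ExecutorService consumers = Executors.newFixedThreadPool(THREADS);
        AtomicInteger done = new AtomicInteger();
        final String POISON = ""; // sentinel telling a consumer to stop

        for (int i = 0; i < THREADS; i++) {
            consumers.submit(() -> {
                try {
                    for (String path; !(path = queue.take()).equals(POISON); ) {
                        convert(path);
                        done.incrementAndGet();
                    }
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                }
            });
        }

        for (int i = 0; i < n; i++) queue.put("file" + i + ".xml"); // producer
        for (int i = 0; i < THREADS; i++) queue.put(POISON);        // stop workers
        consumers.shutdown();
        consumers.awaitTermination(1, TimeUnit.MINUTES);
        return done.get();
    }

    public static void main(String[] args) throws InterruptedException {
        System.out.println(runBatch(25)); // prints 25
    }
}
```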
**ERROR,2460364,FOToPDF_Thread_11,FOP Exception, something.pdf,Failed to resolve font with embed-url './Fonts/arial.ttf'**
or it's an error about failing to load the font metrics file, even though the files are intact and have the right permissions. Many other PDFs are generated successfully, so the font files themselves can't be the problem.
We also wind up with:
**java.io.FileNotFoundException: /PDF/20130111130002/something.pdf (Too many open files (errno:24))**
Judging by the logs and the volume being processed, this looks like an FOP problem. I've read that FOP has had this issue with font files in the past: there have been instances where FOP opened each font file multiple times without closing the handles, resulting in a large number of open files. This was supposed to be fixed, but if it still persists, what would be a good and quick workaround, apart from posting this on the Apache mailing lists?
Can the HP-UX maxfiles limit for open file descriptors per process be increased beyond 2048? Would that help? Any other suggestions?
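Independent of the kernel limit, it may help to watch the process's descriptor count from inside the JVM while a run progresses, to confirm whether handles really are climbing toward the limit. A minimal sketch using the `com.sun.management` extension, which exists on HotSpot-derived Unix JVMs (worth verifying that the HP-UX JVM exposes it too):

```java
import java.lang.management.ManagementFactory;
import java.lang.management.OperatingSystemMXBean;
import com.sun.management.UnixOperatingSystemMXBean;

public class FdMonitor {
    public static void main(String[] args) {
        OperatingSystemMXBean os = ManagementFactory.getOperatingSystemMXBean();
        // On Unix JVMs the bean is a UnixOperatingSystemMXBean, which
        // reports the process's current and maximum fd counts.
        if (os instanceof UnixOperatingSystemMXBean) {
            UnixOperatingSystemMXBean unix = (UnixOperatingSystemMXBean) os;
            System.out.println("open fds: " + unix.getOpenFileDescriptorCount()
                    + " / limit: " + unix.getMaxFileDescriptorCount());
        }
    }
}
```

Logging this periodically from one of the worker threads would show whether the count grows steadily per converted file (a leak) or plateaus (the limit is simply too low for 8 concurrent conversions).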
The relevant issue on the Apache FOP project is
https://issues.apache.org/jira/browse/FOP-2189
As I commented there, I was not able to identify any leaked file handle in FOP 1.0. The constructor of FontFileReader that takes an InputStream argument indeed does not close it, but the caller (which created the stream) does close it; see lines 94-106 of
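That caller-closes contract is leak-free as long as the caller uses try-with-resources. The sketch below illustrates the pattern; `FontBytesReader` is a hypothetical stand-in for FontFileReader's behavior (consume the stream, don't close it), not actual FOP code:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;

// Hypothetical reader mirroring FontFileReader's contract: it fully
// consumes the stream it is handed but deliberately does NOT close it.
class FontBytesReader {
    private final byte[] data;

    FontBytesReader(InputStream in) throws IOException {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        byte[] buf = new byte[4096];
        for (int n; (n = in.read(buf)) != -1; ) out.write(buf, 0, n);
        this.data = out.toByteArray(); // stream intentionally left open
    }

    int length() { return data.length; }
}

public class CallerCloses {
    public static void main(String[] args) throws IOException {
        int len;
        // try-with-resources guarantees the caller closes the stream
        // exactly once, even if the reader throws: no descriptor leaks.
        try (InputStream in = new ByteArrayInputStream(new byte[] {0, 1, 2, 3})) {
            len = new FontBytesReader(in).length();
        }
        System.out.println(len); // prints 4
    }
}
```

If the calling code in the affected version does follow this shape, the descriptor exhaustion would have to come from somewhere other than this constructor.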