I am using NPOI 1.2.3.0 in an ASP.NET application to export the results of a rather large SQL query to an Excel 2003 XLS file.
In short, the query results are populated into an ADO.NET DataTable. I then have a routine that loops through the rows in the DataTable and for each row adds a row to an NPOI spreadsheet. It is intelligent enough that once 65,000 rows are exceeded for a single sheet, a new sheet is created and the rows are continued there, starting at the first row in the new sheet.
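For reference, the rollover logic looks roughly like this (a simplified sketch with illustrative names, not my production code):

Imports System.Data
Imports NPOI.HSSF.UserModel

Const MaxRowsPerSheet As Integer = 65000

Function BuildWorkbook(ByVal data As DataTable) As HSSFWorkbook
    Dim workbook As New HSSFWorkbook()
    Dim sheet As HSSFSheet = workbook.CreateSheet("Sheet1")
    Dim sheetCount As Integer = 1
    Dim rowIndex As Integer = 0

    For Each dataRow As DataRow In data.Rows
        ' Start a new sheet once the current one hits the row cap.
        If rowIndex >= MaxRowsPerSheet Then
            sheetCount += 1
            sheet = workbook.CreateSheet("Sheet" & sheetCount)
            rowIndex = 0
        End If

        Dim row As HSSFRow = sheet.CreateRow(rowIndex)
        For colIndex As Integer = 0 To data.Columns.Count - 1
            row.CreateCell(colIndex).SetCellValue(Convert.ToString(dataRow(colIndex)))
        Next
        rowIndex += 1
    Next

    Return workbook
End Function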
This approach works well for some of my smaller database queries that include, say, 30,000 rows and 50 columns, but I have this one query that returns north of 125,000 rows and has roughly 50 columns, many of which have a good deal of text.
I am able to construct the spreadsheet without issue, but when I try to stream the generated spreadsheet down to the browser, I get an OutOfMemoryException when calling the HSSFWorkbook class's Write method. (Internally, the error occurs when the Write method calls the class's GetBytes method.)
If I run the debugger and stop before the Write method is called, I see that the workbook's Size property returns a value of (roughly) 65 million.
This error is noted on the NPOI project site at CodePlex (see the discussion titled "Out of Memory Problems"), but unfortunately no resolution was found.
For completeness, here is the code where the exception is raised (specifically, it's raised on the workbook.Write line).
Using exportData As New MemoryStream()
    workbook.Write(exportData) ' OutOfMemoryException is raised here
    Response.ContentType = "application/vnd.ms-excel"
    Response.AddHeader("Content-Disposition", "Attachment;Filename=" & saveAsName)
    Response.Clear()
    ' ToArray() copies only the bytes actually written; GetBuffer() can include unused trailing bytes.
    Response.BinaryWrite(exportData.ToArray())
    Response.End()
End Using
Thanks!
What I would do in this scenario, bearing in mind that the FileStream approach does not cause errors and that the error stems from the MemoryStream's capacity limits (roughly 512 MB on 32-bit and 2 GB on 64-bit), is try to write the file to the MemoryStream first, catch the error, and fall back to a FileStream for the larger files when the error is encountered.
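Something along these lines (an untested sketch; the temp-file handling is illustrative, and Response.TransmitFile streams the file from disk rather than buffering it in memory):

Try
    Using exportData As New MemoryStream()
        workbook.Write(exportData)
        Response.ContentType = "application/vnd.ms-excel"
        Response.AddHeader("Content-Disposition", "Attachment;Filename=" & saveAsName)
        Response.BinaryWrite(exportData.ToArray())
    End Using
Catch ex As OutOfMemoryException
    ' Requires Imports System.IO. Fall back to disk for workbooks too big for a MemoryStream.
    Dim tempFile As String = Path.GetTempFileName()
    Using fileData As New FileStream(tempFile, FileMode.Create, FileAccess.Write)
        workbook.Write(fileData)
    End Using
    Response.ContentType = "application/vnd.ms-excel"
    Response.AddHeader("Content-Disposition", "Attachment;Filename=" & saveAsName)
    Response.TransmitFile(tempFile)
End Try
Response.End()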
There's an obvious performance trade-off here, but if your users are downloading files larger than 2 GB, they should probably expect them to be a bit slower to arrive. :-)
Would be interested to know if this works for you.
Thanks, Dave