I am using NPOI 1.2.3.0 in an ASP.NET application to export the results of a rather large SQL query to an Excel 2003 XLS file.
In short, the query results are populated into an ADO.NET DataTable. I then have a routine that loops through the rows in the DataTable and for each row adds a row to an NPOI spreadsheet. It is intelligent enough that once 65,000 rows are exceeded for a
single sheet, a new sheet is created and the rows are continued there, starting at the first row in the new sheet.
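Roughly, the sheet-splitting routine looks like the following (a simplified sketch rather than my exact code; the DataTable variable "data", the sheet names, and the 65,000-row cutoff are illustrative, and it assumes Imports NPOI.HSSF.UserModel and Imports System.Data):

Dim workbook As New HSSFWorkbook()
Dim sheetIndex As Integer = 1
Dim sheet = workbook.CreateSheet("Sheet" & sheetIndex)
Dim rowIndex As Integer = 0
For Each dataRow As DataRow In data.Rows
    ' Once the current sheet is full, start a new sheet and reset the row counter.
    If rowIndex > 65000 Then
        sheetIndex += 1
        sheet = workbook.CreateSheet("Sheet" & sheetIndex)
        rowIndex = 0
    End If
    ' Copy every column of the DataRow into the corresponding cell of the new row.
    Dim excelRow = sheet.CreateRow(rowIndex)
    For columnIndex As Integer = 0 To data.Columns.Count - 1
        excelRow.CreateCell(columnIndex).SetCellValue(dataRow(columnIndex).ToString())
    Next
    rowIndex += 1
Next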
This approach works well for some of my smaller database queries that include, say, 30,000 rows and 50 columns, but I have this one query that returns north of 125,000 rows and has roughly 50 columns, many of which have a good deal of text.
I am able to construct the spreadsheet without issue, but when I try to stream the generated spreadsheet down to the browser I get an OutOfMemoryException when calling the HSSFWorkbook class's Write method.
(Internally, the error is happening when the Write method calls the class's GetBytes method.)
If I run the debugger and stop before the Write method is called, I see that the workbook's Size property returns a value of roughly 6.5 million. I'm not sure why such a small size would result in an OutOfMemoryException, seeing that there are gigabytes of RAM sitting around on my dev box.
This error is noted in another discussion here - Out of Memory Problems - but no resolution was found, unfortunately.
For completeness, here is the code where the exception is raised (specifically, it's raised on the workbook.Write line).
Using exportData As New MemoryStream()
    workbook.Write(exportData)
    Response.ContentType = "application/vnd.ms-excel"
    Response.AddHeader("Content-Disposition", "Attachment;Filename=" & saveAsName)
    Response.Clear()
    Response.BinaryWrite(exportData.GetBuffer())
    Response.End()
End Using
Thanks!
Comments: ** Comment from web user: caseychamberlain **
Have you tried writing directly to Response.OutputStream? I've used this approach in the past to greatly reduce the memory required to stream out large amounts of data ...
Response.ContentType = "application/vnd.ms-excel"
Response.AddHeader("Content-Disposition", "Attachment;Filename=" & saveAsName)
Response.Clear()
workbook.Write(Response.OutputStream)
Response.End()