I'm working with a large amount of data in Excel.
I need to read the data and put it into a generic list, applying some business rules along the way.
By the end of the process, the list holds about 1,200,000 objects.
This usually causes an out-of-memory exception because of the volume of data; only rarely does the process finish without throwing one.
I've already read the support documentation and applied the tips for better performance, but they don't work for me.
Could someone help me with this, please?
Thanks in advance,
Thiago
Comments: ** Comment from web user: roberttanenbaum **
Here's the obvious problem: over a million items are being kept in memory twice, once in the open Excel file and again in the in-memory list.
At what point does it run out of memory?
How many items are added to the in-memory generic list before it runs out of memory?
Are you running on a 64-bit machine with 8 GB memory or on a 32-bit machine with 4 GB memory?
Maybe you can move to a 64-bit machine.
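Keep in mind that a process built for x86 is capped at roughly 2 GB of address space even on a 64-bit OS, which 1.2 million objects can easily exhaust. A quick way to check what you're actually running:

```csharp
using System;

class BitnessCheck
{
    static void Main()
    {
        // A 32-bit (x86) process is limited to ~2 GB of address space
        // even on a 64-bit OS; make sure the build is AnyCPU or x64.
        Console.WriteLine($"64-bit process: {Environment.Is64BitProcess}");
        Console.WriteLine($"64-bit OS:      {Environment.Is64BitOperatingSystem}");
    }
}
```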
What is the cache memory size on the server?
Maybe you can increase the size of the cache.
If those things fail, you might have to implement your own memory management.
Pick a batch size, say 100,000. When your generic list reaches that many items, write them out to disk, clear the list back to zero, and read the next 100,000, repeating until everything has been written out.
Once the whole workbook has been read, dispose of it, close the Excel file, and force a garbage collection; then read the items back from disk in batches.
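Roughly like this. This is only a minimal sketch: the `Row` record, the JSON batch files, and the assumption that your Excel reader can yield parsed rows as a lazy `IEnumerable<Row>` are all stand-ins for your real business objects and whatever library you use.

```csharp
using System;
using System.Collections.Generic;
using System.IO;
using System.Text.Json;

// Hypothetical record standing in for the real business objects.
public record Row(string Name, decimal Value);

public static class BatchedExport
{
    // Spill rows to disk in fixed-size batches so the full 1.2M
    // objects never live in memory at the same time.
    public static List<string> WriteInBatches(
        IEnumerable<Row> rows, string dir, int batchSize = 100_000)
    {
        Directory.CreateDirectory(dir);
        var batchFiles = new List<string>();
        var batch = new List<Row>(batchSize);
        int batchIndex = 0;

        foreach (var row in rows)   // rows should be streamed lazily from Excel
        {
            batch.Add(row);
            if (batch.Count == batchSize)
            {
                batchFiles.Add(Flush(batch, dir, batchIndex++));
                batch.Clear();      // reset the list to zero, as above
            }
        }
        if (batch.Count > 0)
            batchFiles.Add(Flush(batch, dir, batchIndex));
        return batchFiles;
    }

    static string Flush(List<Row> batch, string dir, int index)
    {
        string path = Path.Combine(dir, $"batch-{index}.json");
        File.WriteAllText(path, JsonSerializer.Serialize(batch));
        return path;
    }

    // After the workbook is disposed, stream the batches back one at a time
    // instead of rebuilding one giant list.
    public static IEnumerable<Row> ReadBack(IEnumerable<string> batchFiles)
    {
        foreach (var file in batchFiles)
            foreach (var row in JsonSerializer.Deserialize<List<Row>>(
                         File.ReadAllText(file))!)
                yield return row;
    }
}
```

After WriteInBatches returns, dispose of the workbook and call GC.Collect() followed by GC.WaitForPendingFinalizers() before reading the batches back, so the Excel copy of the data is actually released first.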
Let us know what you do to solve this issue.