Hello guys,
I'm working with a large amount of data in Excel.
What I need to do is read the data and put it into a generic list, applying some business rules.
By the end of this process, the list holds a total of 1,200,000 objects.
The process usually throws an out-of-memory exception because of the amount of data; only rarely does it finish without one.
I've already read the support documentation and applied its performance tips, but they don't work for me.
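Here is roughly what the loading loop looks like (a simplified sketch: MyRecord, the file path, the column positions, and the business-rule check are placeholders for the real ones):

```csharp
using System;
using System.Collections.Generic;
using ClosedXML.Excel;

// Placeholder for the real mapped type.
class MyRecord
{
    public string Name { get; set; }
    public decimal Amount { get; set; }
}

class Program
{
    static void Main()
    {
        var records = new List<MyRecord>();

        // "Data.xlsx" is a placeholder path.
        using (var workbook = new XLWorkbook("Data.xlsx"))
        {
            var worksheet = workbook.Worksheet(1);

            // One object per used row; the full list ends up with ~1,200,000 entries.
            foreach (var row in worksheet.RowsUsed())
            {
                var record = new MyRecord
                {
                    Name = row.Cell(1).GetString(),
                    Amount = row.Cell(2).GetValue<decimal>()
                };

                // Stand-in for the real business rules.
                if (record.Amount > 0)
                    records.Add(record);
            }
        }

        Console.WriteLine(records.Count);
    }
}
```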
Could someone help me in this case, please?
Thanks in advance,
Thiago
Comments: ** Comment from web user: ThiagoGrandesso **
Regarding the number of objects in the list when the memory exception is thrown: it's around 100,000.
The machine is 64-bit and has 4 GB of memory.
I suppose implementing my own memory management is the best approach. Thanks to the great DLL you offer, I wouldn't have to change a lot of code.
I have found some articles about the SAX approach using the Open XML SDK; it doesn't load the entire file into memory. Is there any approach like this in ClosedXML? Performance-wise, it's the best scenario I could have. A sketch of that style of read follows the links below.
Here are the articles:
http://msdn.microsoft.com/en-us/library/office/gg575571.aspx
http://msdn.microsoft.com/en-us/library/hh180830(v=office.14).aspx
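Based on those articles, this is the kind of streaming read I mean (a sketch with OpenXmlReader from the Open XML SDK; the file path and the choice of worksheet part are placeholders):

```csharp
using System;
using System.Linq;
using DocumentFormat.OpenXml;
using DocumentFormat.OpenXml.Packaging;
using DocumentFormat.OpenXml.Spreadsheet;

class Program
{
    static void Main()
    {
        // Open read-only; "Data.xlsx" is a placeholder path.
        using (var doc = SpreadsheetDocument.Open("Data.xlsx", false))
        {
            // Takes the first worksheet part; adjust to pick the right sheet.
            WorksheetPart worksheetPart = doc.WorkbookPart.WorksheetParts.First();

            // OpenXmlReader walks the XML forward-only instead of
            // loading the whole worksheet DOM into memory.
            using (OpenXmlReader reader = OpenXmlReader.Create(worksheetPart))
            {
                while (reader.Read())
                {
                    if (reader.ElementType == typeof(Row) && reader.IsStartElement)
                    {
                        // Materializes only the current row.
                        var row = (Row)reader.LoadCurrentElement();
                        foreach (Cell cell in row.Elements<Cell>())
                        {
                            // Raw stored value; SharedString cells would need a
                            // lookup in the SharedStringTablePart.
                            Console.WriteLine(cell.CellValue != null ? cell.CellValue.InnerText : string.Empty);
                        }
                    }
                }
            }
        }
    }
}
```

This way only one row is in memory at a time, so my mapped objects would be the only large allocation.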
Thank you,
Thiago