Channel: ClosedXML - The easy way to OpenXML

New Post: Workaround on big excel files and memory issues?

I'm having a similar issue (except one very large sheet rather than five small ones). I've seen comments elsewhere suggesting a move to x64, but I'm not convinced that will solve it for me: users are likely to be on 4 GB RAM machines, and if I'm already hitting 1.7 GB after around a quarter of one sheet is loaded, I'd expect usage to grow considerably before all the data is there (let alone during the actual save process).

Looking at the object model, it appears cells aren't written back to the underlying file representation immediately; they accumulate in memory until Save or SaveAs is called, which obviously exacerbates the situation. That is to say, there are two points where OOM exceptions show up:
  1. When inserting data/manipulating the sheet (Too many XLCell objects floating around, I think)
  2. When saving the sheet to disk (I haven't dug into this; I'm just hoping it isn't using XmlSerializer internally, because that is terrible for memory footprint)
I'm wondering whether it would be possible to optimize the ClosedXML cell model to explicitly flush cells on command, reducing its memory footprint, and potentially to allow saving in blocks instead of all at once. After all, there are programs that handle this much data even on x86 machines (Access, in my experience, and probably most respectable databases). The age-old adage "don't use Excel as a database" applies, of course, but since Excel itself can handle that much data on users' machines, it would be nice if the library we're using to build the files could as well!
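For what it's worth, until something like that exists in ClosedXML, the underlying OpenXML SDK offers a SAX-style streaming writer (OpenXmlWriter) that keeps memory roughly constant regardless of row count, because each row is written to the part's stream as soon as it is emitted rather than being held until save. A minimal sketch of that approach (the file name, sheet name, and row count below are placeholders, not anything from ClosedXML):

```csharp
using DocumentFormat.OpenXml;
using DocumentFormat.OpenXml.Packaging;
using DocumentFormat.OpenXml.Spreadsheet;

using (var document = SpreadsheetDocument.Create("big.xlsx", SpreadsheetDocumentType.Workbook))
{
    var workbookPart = document.AddWorkbookPart();
    var worksheetPart = workbookPart.AddNewPart<WorksheetPart>();

    // OpenXmlWriter streams each element to disk as it is written,
    // so only the current row needs to live in memory.
    using (var writer = OpenXmlWriter.Create(worksheetPart))
    {
        writer.WriteStartElement(new Worksheet());
        writer.WriteStartElement(new SheetData());

        for (int r = 1; r <= 1000000; r++)
        {
            writer.WriteStartElement(new Row { RowIndex = (uint)r });
            writer.WriteElement(new Cell
            {
                CellValue = new CellValue(r.ToString()),
                DataType = CellValues.Number
            });
            writer.WriteEndElement(); // Row
        }

        writer.WriteEndElement(); // SheetData
        writer.WriteEndElement(); // Worksheet
    }

    // The workbook part still needs a Sheets entry pointing at the worksheet.
    workbookPart.Workbook = new Workbook(
        new Sheets(new Sheet
        {
            Id = workbookPart.GetIdOfPart(worksheetPart),
            SheetId = 1,
            Name = "Data"
        }));
    workbookPart.Workbook.Save();
}
```

The trade-off is that you lose ClosedXML's conveniences (formatting helpers, formula evaluation, random cell access) and must emit elements in schema order yourself, but for a straight data dump it sidesteps both OOM points above.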
