
#1395 java.lang.OutOfMemoryError: Direct buffer memory with CDF files

Milestone: nextrelease
Status: open-fixed
Owner: nobody
Labels: None
Priority: 5
Updated: 2015-09-03
Created: 2015-05-10
Private: No

Seth reported that a .vap file which loads many datasets from CDF files and appends them results in an out-of-memory error. I can reliably demonstrate this bug by running the vap:

See sftp://jfaden.net/home/jbf/ct/autoplot/users/2015/seth/20150508/RBSPA_voltages.vap

and changing the timerange to "May 2014 through Nov 2014"

Discussion

  • Jeremy Faden

    Jeremy Faden - 2015-05-11

    I can reliably reproduce this bug while debugging in NetBeans with Java 1.8.0_45. I'm going to see if putting in an explicit System.gc (garbage collect) will release the references. This vap loads lots of small files, so this seems like a likely cause.

     
  • Jeremy Faden

    Jeremy Faden - 2015-05-11

    This seems to fix the problem.

     
  • Jeremy Faden

    Jeremy Faden - 2015-05-11
    • status: open --> open-fixed
     
  • Jeremy Faden

    Jeremy Faden - 2015-05-14

    This was redone: the out-of-memory condition is now caught and a GC performed. Occasionally this will still fail, and the final fallback is to use a heap buffer.
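
    A minimal sketch of this catch-and-retry strategy (the class and method names here are hypothetical, not Autoplot's actual BufferDataSet code): try a direct allocation, on OutOfMemoryError trigger a GC so the collector can reclaim unreferenced direct buffers, retry once, and finally fall back to an ordinary heap buffer.

```java
import java.nio.ByteBuffer;

public class BufferAlloc {

    /** Allocate a direct buffer if possible, falling back to the heap. */
    public static ByteBuffer allocate(int size) {
        try {
            return ByteBuffer.allocateDirect(size);
        } catch (OutOfMemoryError e1) {
            System.gc(); // let the collector free unreferenced direct buffers
            try {
                return ByteBuffer.allocateDirect(size); // one retry
            } catch (OutOfMemoryError e2) {
                return ByteBuffer.allocate(size); // final fallback: heap buffer
            }
        }
    }

    public static void main(String[] args) {
        ByteBuffer b = allocate(1024);
        System.out.println(b.capacity() + " bytes, direct=" + b.isDirect());
    }
}
```

    Callers get a usable buffer either way; only the backing storage differs.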

     
  • Jeremy Faden

    Jeremy Faden - 2015-09-03
    • summary: java.lang.OutOfMemoryError: Direct buffer memory --> java.lang.OutOfMemoryError: Direct buffer memory with CDF files
     
  • Jeremy Faden

    Jeremy Faden - 2015-09-03

    I'm staring at this code and getting very confused; see BufferDataSet.shouldAllocateDirect. Nand and I are attempting to add support for "huge" CDF files (>2 GB), and it looks like I'll need to open them with a different mode.
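
    One reason >2 GB files need special handling: ByteBuffer positions and capacities are ints, so a single FileChannel.map call is limited to Integer.MAX_VALUE bytes, and a larger file has to be mapped in pieces. A sketch of chunked mapping (this is an illustration only, not how Autoplot's CDF reader is actually structured):

```java
import java.io.RandomAccessFile;
import java.nio.MappedByteBuffer;
import java.nio.channels.FileChannel;

public class ChunkedMap {

    static final long CHUNK = 1L << 30; // 1 GB per mapping

    /** Map an arbitrarily large file as a sequence of <=1 GB read-only mappings. */
    public static MappedByteBuffer[] mapAll(String path) throws Exception {
        try (RandomAccessFile raf = new RandomAccessFile(path, "r");
             FileChannel ch = raf.getChannel()) {
            long size = ch.size();
            int n = (int) ((size + CHUNK - 1) / CHUNK);
            MappedByteBuffer[] chunks = new MappedByteBuffer[n];
            for (int i = 0; i < n; i++) {
                long off = i * CHUNK;
                long len = Math.min(CHUNK, size - off);
                chunks[i] = ch.map(FileChannel.MapMode.READ_ONLY, off, len);
            }
            return chunks;
        }
    }
}
```

    A read at absolute offset p then goes to chunks[(int)(p / CHUNK)] at position (int)(p % CHUNK).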

     
  • Jeremy Faden

    Jeremy Faden - 2015-09-03

    I don't know if I ever commented on the bug where NIO allocations outside of the JVM heap were not freed: freeing relies on the GC removing the last reference to the buffer object that owns the external data. So even if there's plenty of room in the JVM heap, the external (direct) memory can still run out.
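
    The JVM's direct-buffer pool can be observed through the standard BufferPoolMXBean, which makes this behavior visible: the pool grows on allocateDirect and only shrinks after the GC collects the owning ByteBuffer object. A small demonstration:

```java
import java.lang.management.BufferPoolMXBean;
import java.lang.management.ManagementFactory;
import java.nio.ByteBuffer;

public class DirectPool {

    /** Bytes currently held by the JVM's "direct" buffer pool, or -1 if absent. */
    public static long directUsed() {
        for (BufferPoolMXBean pool :
                ManagementFactory.getPlatformMXBeans(BufferPoolMXBean.class)) {
            if ("direct".equals(pool.getName())) {
                return pool.getMemoryUsed();
            }
        }
        return -1;
    }

    public static void main(String[] args) {
        long before = directUsed();
        ByteBuffer b = ByteBuffer.allocateDirect(1 << 20); // 1 MB, held by 'b'
        // The pool grows here; it shrinks only after the GC collects 'b' --
        // there is no explicit free() for direct buffers.
        System.out.println("direct pool grew by " + (directUsed() - before)
                + " bytes (buffer direct=" + b.isDirect() + ")");
    }
}
```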

     
  • Jeremy Faden

    Jeremy Faden - 2015-09-03

    Digging down into the allocateDirect method and its calls, I find there's a limit on the size of direct memory allocations, controlled by:
    -XX:MaxDirectMemorySize=<size>

    I have 16GB on my desktop and 24GB of swap, and I see that this is set to 1.9GB.

    This argument implies to me that something is broken. If I have to pick a size, then why wouldn't I just allocate more heap? The docs say this defaults to 0, in which case the JVM chooses. Without knowing what machine the code will run on, how can I set this?
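
    For what it's worth, when -XX:MaxDirectMemorySize is left at its default of 0, HotSpot caps direct allocations at roughly the maximum heap size, i.e. Runtime.getRuntime().maxMemory() — which would explain a ~1.9 GB limit on a machine with a default-sized heap. A quick way to see the effective value on a given machine:

```java
public class DirectLimit {
    public static void main(String[] args) {
        // With -XX:MaxDirectMemorySize unset (the documented default of 0),
        // HotSpot uses approximately the maximum heap size as the direct
        // memory cap, so maxMemory() is a good proxy for the limit.
        long maxHeap = Runtime.getRuntime().maxMemory();
        System.out.printf("max heap (approx. default direct limit): %.2f GB%n",
                maxHeap / 1e9);
    }
}
```

    So raising the limit means either setting the flag explicitly (e.g. -XX:MaxDirectMemorySize=4g) or raising -Xmx.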

     