ncks memory issue

Forum: Help
Created by xin xi, 2013-10-09; last updated 2013-10-17

  • xin xi - 2013-10-09

    I am using ncks to split a 9.4 GB netCDF file into two parts along the time dimension: records 0-191 as one part and record 192 as the other. The machine has 32 GB of memory in total, and ncks steadily uses over 90% of it before crashing with an out-of-memory error (message below). What is the solution here? Thanks.
    The command I use is: ncks -d Time,0,191 input.nc output.nc
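
    For reference, the full two-way split described above could look like the following minimal sketch, assuming the record dimension is named Time (as in the command) and the file holds 193 records, indices 0-192; the output names part1.nc and part2.nc are placeholders:

        # records 0-191 into one file, the final record into another
        ncks -d Time,0,191 input.nc part1.nc
        ncks -d Time,192,192 input.nc part2.nc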

    ncks: ERROR nco_malloc() unable to allocate 2532611328 B = 2473253 kB = 2415 MB = 2 GB
    ncks: INFO NCO has reported a malloc() failure. malloc() failures usually indicate that your machine does not have enough free memory (RAM+swap) to perform the requested operation. As such, malloc() failures result from the physical limitations imposed by your hardware. Read http://nco.sf.net/nco.html#mmr for a description of NCO memory usage. There are two workarounds in this scenario. One is to process your data in smaller chunks. The other is to use a machine with more free memory.

    Large tasks may uncover memory leaks in NCO. This is likeliest to occur with ncap. ncap scripts are completely dynamic and may be of arbitrary length and complexity. A script that contains many thousands of operations may uncover a slow memory leak even though each single operation consumes little additional memory. Memory leaks are usually identifiable by their memory usage signature. Leaks cause peak memory usage to increase monotonically with time regardless of script complexity. Slow leaks are very difficult to find. Sometimes a malloc() failure is the only noticeable clue to their existence. If you have good reasons to believe that your malloc() failure is ultimately due to an NCO memory leak (rather than inadequate RAM on your system), then we would be very interested in receiving a detailed bug report.
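
    The "process your data in smaller chunks" workaround named above might look like the sketch below, assuming again that the record dimension is named Time with 193 records; the 48-record chunk size and all file names are placeholders:

        # Extract the first 192 records in four 48-record chunks,
        # then concatenate the chunks back together with ncrcat.
        for beg in 0 48 96 144; do
          end=$(( beg + 47 ))
          ncks -O -d Time,${beg},${end} input.nc chunk_$(printf '%03d' ${beg}).nc
        done
        # zero-padded chunk names keep the shell glob in record order
        ncrcat -O chunk_*.nc part1.nc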

     
  • Charlie Zender - 2013-10-09

    Quoting from the error message: "Read http://nco.sf.net/nco.html#mmr for a description of NCO memory usage. There are two workarounds in this scenario. One is to process your data in smaller chunks. The other is to use a machine with more free memory." If you read the manual, you will learn that ncrcat is much more memory-efficient at hyperslabbing record variables than ncks is, so try ncrcat too (sketch below).
    cz
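
    A minimal sketch of that suggestion: ncrcat accepts the same -d hyperslab syntax, but as a record operator it buffers record variables roughly one record at a time instead of loading the whole hyperslab, so peak memory should stay far lower. The output names are placeholders, and Time must be the record dimension for ncrcat to operate on it:

        # same split as before, via the record concatenator
        ncrcat -d Time,0,191 input.nc part1.nc
        ncrcat -d Time,192,192 input.nc part2.nc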

     
