[jgrapht-users] Heap Out of Memory issues on large system
From: Benjamin B. <ba...@ia...> - 2011-12-28 16:53:22
List,

I'm using JGraphT in an implementation of the Shared Nearest Neighbor clustering algorithm on a large graph: O(100K) nodes, each with O(10) edges. I'm running on a Red Hat machine with a 64-bit JVM and 48GB of memory, using the arguments:

-Xms10m -Xmx47G -Dnio.ms=10Mb -Dnio.mx=47Gb

But I'm still getting OOM errors when adding the later edges. Granted, this graph is large, but I have trouble believing it can't fit in 47GB of RAM.

I've tried to get more detail using jvisualvm, but the program takes over 24 hours to reach the point of memory exhaustion, by which point I'm inevitably not watching the console when it dies. Dumping the heap on OOM works when enabled from within jvisualvm, but I don't have enough disk space to store the dump (quota issues).

Am I doing something wrong, or is there an alternate way of using JGraphT for large graphs? Should I be trying to use a DB backend instead of RAM for the graph (quota issues aside)? Is that even possible?

Thanks!
-Ben
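For scale, here is a back-of-envelope estimate of the raw graph size. The per-object byte figures below are assumptions (rough allowances for a vertex/edge object plus its hash-map and adjacency entries, not measured values), but even with generous overhead the graph described above comes out orders of magnitude below 47GB, which suggests the memory is going somewhere other than the graph structure itself:

```java
// Rough memory estimate for the graph described in the post.
// The bytes-per-object constants are assumed, not measured.
public class GraphMemoryEstimate {
    public static void main(String[] args) {
        long nodes = 100_000L;     // O(100K) nodes
        long edgesPerNode = 10L;   // O(10) edges per node
        long edges = nodes * edgesPerNode;

        // Assumed per-object overheads (object header, fields,
        // hash-map entry, adjacency-set entry):
        long bytesPerNode = 200L;
        long bytesPerEdge = 200L;

        long totalBytes = nodes * bytesPerNode + edges * bytesPerEdge;
        System.out.printf("~%d MB estimated for %d nodes and %d edges%n",
                totalBytes / (1024 * 1024), nodes, edges);
        // With these assumptions: ~209 MB for 100,000 nodes and 1,000,000 edges
    }
}
```

If the estimate and reality diverge this badly, a heap dump or jvisualvm's object histogram would show which class is actually consuming the space (e.g. duplicated vertex objects, or per-edge weight wrappers).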