From: Walenz, B. <bw...@jc...> - 2012-07-03 16:44:57
Hi, Christoph-

Are you using CA7 or CVS?  This behavior was introduced to CVS on May 21 and
fixed on the 29th.  The bug came in with an optimization to overlap loading:
only the overlaps in the 'dupStore' are needed, so the 'obtStore' can be
ignored.  This eliminated a huge amount of I/O and overhead from the dedupe
compute.

If updating CVS doesn't fix the problem, can you send some of the logging
from deduplicate?

b

On 7/3/12 6:28 AM, "Christoph Hahn" <chr...@gm...> wrote:

> Dear developers and users,
>
> I am encountering some problems in the deduplicate step.  Unfortunately,
> the memory usage is steadily increasing until the process dies because it
> exceeds the memory limit.  So far, I have used up to 32 GB.  I could of
> course just further increase the available memory, but I was wondering
> whether there is a way to fix and/or predict the maximum memory usage for
> this step (and maybe also for the next steps) beforehand.
>
> Thanks for your help!
>
> much obliged,
> Christoph
>
> University of Oslo, Norway
>
> _______________________________________________
> wgs-assembler-users mailing list
> wgs...@li...
> https://lists.sourceforge.net/lists/listinfo/wgs-assembler-users
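For a rough sense of scale, a back-of-envelope estimate like the sketch below
can help predict peak dedupe memory, assuming the compute holds every loaded
overlap record in RAM at once.  The struct layout, the overlap counts, and all
names here are illustrative assumptions, not the actual CA data structures or
API.

// Hypothetical sketch, not the CA implementation: estimate peak memory for
// dedupe from the number of overlap records kept in RAM, and compare loading
// only the dupStore against loading both the dupStore and the obtStore.
#include <cstdint>
#include <cstdio>

// Stand-in for an in-memory overlap record; the real CA record layout
// differs, so sizeof() here is only a ballpark.
struct OverlapRecord {
  uint32_t aID;     // fragment A
  uint32_t bID;     // fragment B
  int32_t  aHang;   // overlap hangs
  int32_t  bHang;
  float    erate;   // error rate
};

// Rough upper bound: record count times in-memory record size, in MB.
static double estimatePeakMB(uint64_t numOverlaps) {
  return (double)numOverlaps * sizeof(OverlapRecord) / (1024.0 * 1024.0);
}

int main() {
  uint64_t dupOverlaps = 500000000;  // assumed count; substitute your run's numbers
  uint64_t obtOverlaps = 700000000;  // assumed count for the store dedupe can skip

  std::printf("dupStore only       : ~%.1f MB\n", estimatePeakMB(dupOverlaps));
  std::printf("dupStore + obtStore : ~%.1f MB\n",
              estimatePeakMB(dupOverlaps + obtOverlaps));
  return 0;
}

Plugging the actual overlap counts from your run into a calculation like this
gives a rough ceiling to request from the scheduler, and shows why skipping the
obtStore (as in the CVS fix above) cuts both the memory footprint and the I/O.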