From: Geoff H. <ghu...@ws...> - 2004-03-21 01:26:42
On Mar 20, 2004, at 9:10 AM, Michael Banck wrote:

>> We have converted it OK, but split it into smaller chunks first.
>> Unless there is zero memory leak, I am sure this number of molecules
>> in one file could give a problem.

I have not yet found memory leaks. But Michael has a good point that this
would be a nice stress test. I have a few other truly nasty tests I'm
working on, so I'll add this one to the "bulletproofing" tests. I expect,
though, that it's simply a RAM problem or an older version of Open Babel
(and *not* the original babel 1.6!).

The current babel command-line program (and the previous one, for that
matter) assumed it could load whole files into memory. In this particular
case (a multi-molecule file), the whole set of molecules is instantiated
as an OBMolVector before any writing occurs.

I can attest, though, that trying it right now on Mac OS X with Open Babel
1.100.2 works just fine (1.25GHz G4, 512MB RAM):

> [ghutchis@alumina] Downloads: ls -l nciopen3d.mol
> -rwxr-xr-x  1 ghutchis  staff  179873013 Mar 20 20:13 nciopen3d.mol*
> [ghutchis@alumina] Downloads: time babel nciopen3d.mol nciopen3d.mol2
>
> real    0m17.634s
> user    0m9.130s
> sys     0m1.110s

Yes, that's right: we can zip through 172MB of NCI molecules in ~17
seconds. The original babel is still grinding after several minutes.

So Gaokeng, what version of Open Babel are you using? Did you download the
latest 1.100.2 release from http://openbabel.sf.net/ ?

Thanks,
-Geoff
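The chunk-splitting workaround Michael mentions can be sketched as a small
shell helper, assuming the multi-molecule file uses the MDL SD-style
"$$$$" record delimiter. The function name `split_sdf`, the chunk size,
and the `chunk_N.sdf` output names are illustrative, not anything the
thread specifies:

```shell
# Sketch: split a multi-molecule SD-style file into chunks that babel can
# convert one at a time, to keep memory usage bounded. Assumes each record
# ends with the standard "$$$$" delimiter line.
split_sdf() {
  # $1 = input file, $2 = records per chunk (both hypothetical parameters)
  awk -v n="$2" '
    # write every line to the chunk file for the current record group
    { print > ("chunk_" int(count/n) ".sdf") }
    # after each "$$$$" record terminator, advance the record counter
    /^\$\$\$\$/ { count++ }
  ' "$1"
}

# e.g. split_sdf nciopen3d.mol 1000, then run babel over each chunk_*.sdf
```

Each resulting chunk is itself a valid multi-record file, so it can be fed
to babel independently without ever loading the full 172MB set at once.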