From: Friedrich F. <foe...@bi...> - 2016-03-17 07:49:34
Dustin is probably essentially right: the problem may be that the header of
your file(s), where the file dimensions are stored, does not follow the
standard conventions. It may well be that these numbers are somewhat insane
and PyTom tries to allocate humongous amounts of memory. The MRC format (I
guess that is what you are using) has a lot of different flavors, which are
not necessarily all supported in PyTom. For the EM format, assignment of the
'magic number' at the very beginning is sometimes a problem on Macs. Best
try to read in your files interactively in PyTom and see if there are any
complaints. When you have found the culprit, try to write it out or convert
it with a different program.

Best wishes,
Friedrich
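A minimal sketch of the header check described above, for the case that the
file really is MRC: the first three 32-bit words of a standard MRC header are
the grid dimensions NX, NY, NZ, and word four is the data mode, so garbage in
those fields is exactly what would make PyTom try to allocate a humongous
volume. The commented-out lines at the end assume a read() helper in
pytom.basic.files; the sizeX()/sizeY()/sizeZ() accessors are the ones that
appear in the tracebacks below. Adjust both to whatever reader your PyTom
build actually exposes.

import struct

def mrc_dimensions(filename):
    """Read NX, NY, NZ and the data mode straight from an MRC header."""
    with open(filename, 'rb') as f:
        header = f.read(16)                 # words 1-4 of the 1024-byte header
    nx, ny, nz, mode = struct.unpack('<4i', header)  # assumes little-endian data
    return nx, ny, nz, mode

nx, ny, nz, mode = mrc_dimensions('PY79.mrc')        # tomogram from this thread
print('header says %d x %d x %d voxels, mode %d' % (nx, ny, nz, mode))
print('that is ~%.1f GB as 32-bit floats' % (nx * ny * nz * 4 / 1e9))

# If those numbers look sane, try the same file through PyTom itself
# (the import path below is an assumption about this installation):
# from pytom.basic.files import read
# v = read('PY79.mrc')
# print(v.sizeX(), v.sizeY(), v.sizeZ())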
On Thu, Mar 17, 2016 at 4:40 AM, Dustin Morado <dus...@gm...> wrote:

> Anything with alloc in the error is usually a memory error. Do you have
> enough RAM to fit everything? If you use binning does it work?
>
> --
> Cheers,
> Dustin
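To make the RAM question concrete: template matching keeps several buffers of
the tomogram's size alive at once (the tracebacks further down name the search
volume, its Fourier transform, the Fourier-transformed mask, and the score,
orientation, sum and square volumes), and every binning level halves each
axis, i.e. cuts each of those buffers by a factor of eight. The copy count
below is a rough assumption, not an exact accounting of PyTom's buffers, and
the tomogram dimensions are hypothetical.

def localization_memory_gb(nx, ny, nz, copies=8, bytes_per_voxel=4, binning=0):
    """Very rough per-process memory estimate for template matching."""
    shrink = 2 ** binning                   # each binning level halves every axis
    voxels = (nx // shrink) * (ny // shrink) * (nz // shrink)
    return voxels * bytes_per_voxel * copies / 1e9

# Hypothetical 4096 x 4096 x 1024 tomogram; plug in the real dimensions.
for b in (0, 1, 2):
    print('binning %d: ~%.0f GB per process'
          % (b, localization_memory_gb(4096, 4096, 1024, binning=b)))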
> On Mar 16, 2016, at 7:47 PM, Kanika Khanna <kk...@uc...> wrote:
>
> > Hello
> >
> > I have been trying endlessly to run the localization job with parallel
> > processing, but it is always giving an error.
> >
> > Traceback (most recent call last):
> >   File "/usr/local/pytom/bin/localization.py", line 82, in <module>
> >     startLocalizationJob(jobName, splitX, splitY, splitZ, doSplitAngles=False)
> >   File "/usr/local/pytom/bin/localization.py", line 21, in startLocalizationJob
> >     leader.parallelRun(job, splitX, splitY, splitZ, verbose)
> >   File "/usr/local/pytom/localization/parallel_extract_peaks.py", line 1046, in parallelRun
> >     result = self.run(verbose)
> >   File "/usr/local/pytom/localization/parallel_extract_peaks.py", line 104, in run
> >     [resV, orientV, sumV, sqrV] = extractPeaks(v, ref, rot, scoreFnc, m, mIsSphere, wedg, nodeName=self.name, verboseMode=verbose, moreInfo=moreInfo)
> >   File "/usr/local/pytom/localization/extractPeaks.py", line 130, in extractPeaks
> >     meanV = meanUnderMask(volume, maskV, p);
> >   File "/usr/local/pytom/basic/correlation.py", line 244, in meanUnderMask
> >     result = iftshift(ifft(fMask*fft(volume)))/(size*p)
> >   File "/usr/local/pytom/basic/fourier.py", line 122, in fft
> >     returnValue = pytom_volume.vol_comp(theTuple.complexVolume.sizeX(),theTuple.complexVolume.sizeY(),theTuple.complexVolume.sizeZ())
> >   File "/usr/local/pytom/pytomc/swigModules/pytom_volume.py", line 234, in __init__
> >     this = _pytom_volume.new_vol_comp(*args)
> > RuntimeError: std::bad_alloc
> > --------------------------------------------------------------------------
> > mpirun detected that one or more processes exited with non-zero status,
> > thus causing the job to be terminated. The first process to do so was:
> >
> >   Process name: [[6894,1],0]
> >   Exit code: 1
> >
> > On Fri, Mar 11, 2016 at 12:00 PM, Kanika Khanna <kk...@uc...> wrote:
> > Hi
> >
> > I am sorry, but the job did start running and it then gave this weird
> > error and aborted again! :(
> >
> > Primary job terminated normally, but 1 process returned
> > a non-zero exit code.. Per user-direction, the job has been aborted.
> >
> > node_0: send number of 50 rotations to node 2
> > node_0: send number of 25 rotations to node 1
> > node_2: send number of 25 rotations to node 3
> > node_0: starting to calculate 25 rotations
> > node_3: starting to calculate 25 rotations
> > node_1: starting to calculate 25 rotations
> > node_2: starting to calculate 25 rotations
> > Traceback (most recent call last):
> >   File "/usr/local/pytom/bin/localization.py", line 82, in <module>
> >     startLocalizationJob(jobName, splitX, splitY, splitZ, doSplitAngles=False)
> >   File "/usr/local/pytom/bin/localization.py", line 21, in startLocalizationJob
> >     leader.parallelRun(job, splitX, splitY, splitZ, verbose)
> >   File "/usr/local/pytom/localization/parallel_extract_peaks.py", line 1046, in parallelRun
> >     result = self.run(verbose)
> >   File "/usr/local/pytom/localization/parallel_extract_peaks.py", line 104, in run
> >     [resV, orientV, sumV, sqrV] = extractPeaks(v, ref, rot, scoreFnc, m, mIsSphere, wedg, nodeName=self.name, verboseMode=verbose, moreInfo=moreInfo)
> >   File "/usr/local/pytom/localization/extractPeaks.py", line 130, in extractPeaks
> >     meanV = meanUnderMask(volume, maskV, p);
> >   File "/usr/local/pytom/basic/correlation.py", line 244, in meanUnderMask
> >     result = iftshift(ifft(fMask*fft(volume)))/(size*p)
> >   File "/usr/local/pytom/pytomc/swigModules/pytom_volume.py", line 342, in __mul__
> >     return _pytom_volume.vol_comp___mul__(self, *args)
> > RuntimeError: std::bad_alloc
> > -------------------------------------------------------
> > Primary job terminated normally, but 1 process returned
> > a non-zero exit code.. Per user-direction, the job has been aborted.
> >
> > On Fri, Mar 11, 2016 at 11:56 AM, Kanika Khanna <kk...@uc...> wrote:
> > Hi Thomas,
> >
> > I have been able to resolve issue 1. I just killed all the processes and
> > started again. It worked that time.
> >
> > On Sat, Mar 12, 2016 at 1:05 AM, Thomas Hrabe <th...@gm...> wrote:
> > Hi Kanika,
> >
> > 1. To me, it looks like you are trying to open a file called
> >
> > > <Volume Binning="[0, 0, 0]" Filename="PY79.mrc" Sampling="[0, 0, 0]" Subregion="[0, 0, 0, 0, 0, 0]"/>
> >
> > Can you please attach the xml file?
> >
> > 2. I have to look into the script as well. I must re-install it before I
> > can advise about how to use it.
> >
> > Cheers,
> > Thomas
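Before attaching anything, one quick way to see which file the job actually
points at is to walk the job XML and test each Filename from the directory
where mpirun is launched; a plain relative path such as "PY79.mrc" has to
exist there. The Volume element and its Filename attribute are taken from the
<Volume .../> text quoted just above and in the IOError below; 'job.xml' is
only a placeholder for the real job file name.

import os
import xml.etree.ElementTree as ET

root = ET.parse('job.xml').getroot()        # placeholder for the actual job file
for vol in root.iter('Volume'):             # the element printed by the IOError
    fn = vol.get('Filename')
    if fn:
        print('%s -> %s' % (fn, 'found' if os.path.exists(fn) else 'NOT found'))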
> > On Mar 11, 2016, at 11:15 AM, Kanika Khanna <kk...@uc...> wrote:
> >
> > > Hello all,
> > >
> > > 1. When I try to execute my job using the following command
> > >
> > >   mpirun --host HostName -c 4 pytom localization.py PathToJobFile 2 2 2
> > >
> > > I get the following error:
> > >
> > > Traceback (most recent call last):
> > >   File "/usr/local/pytom/bin/localization.py", line 82, in <module>
> > >     startLocalizationJob(jobName, splitX, splitY, splitZ, doSplitAngles=False)
> > >   File "/usr/local/pytom/bin/localization.py", line 11, in startLocalizationJob
> > >     job.check()
> > >   File "/usr/local/pytom/localization/peak_job.py", line 184, in check
> > > Traceback (most recent call last):
> > >   File "/usr/local/pytom/bin/localization.py", line 82, in <module>
> > >     raise IOError('File: ' + str(self.volume) + ' not found!')
> > > IOError: File: <Volume Binning="[0, 0, 0]" Filename="PY79.mrc" Sampling="[0, 0, 0]" Subregion="[0, 0, 0, 0, 0, 0]"/>
> > > not found!
> > >     startLocalizationJob(jobName, splitX, splitY, splitZ, doSplitAngles=False)
> > >   File "/usr/local/pytom/bin/localization.py", line 11, in startLocalizationJob
> > >     job.check()
> > >   File "/usr/local/pytom/localization/peak_job.py", line 184, in check
> > >     raise IOError('File: ' + str(self.volume) + ' not found!')
> > > IOError: File: <Volume Binning="[0, 0, 0]" Filename="PY79.mrc" Sampling="[0, 0, 0]" Subregion="[0, 0, 0, 0, 0, 0]"/>
> > > not found!
> > > Traceback (most recent call last):
> > >   File "/usr/local/pytom/bin/localization.py", line 82, in <module>
> > >     startLocalizationJob(jobName, splitX, splitY, splitZ, doSplitAngles=False)
> > >   File "/usr/local/pytom/bin/localization.py", line 11, in startLocalizationJob
> > >     job.check()
> > >   File "/usr/local/pytom/localization/peak_job.py", line 184, in check
> > >     raise IOError('File: ' + str(self.volume) + ' not found!')
> > > IOError: File: <Volume Binning="[0, 0, 0]" Filename="PY79.mrc" Sampling="[0, 0, 0]" Subregion="[0, 0, 0, 0, 0, 0]"/>
> > > not found!
> > > -------------------------------------------------------
> > > Primary job terminated normally, but 1 process returned
> > > a non-zero exit code.. Per user-direction, the job has been aborted.
> > > -------------------------------------------------------
> > > Traceback (most recent call last):
> > >   File "/usr/local/pytom/bin/localization.py", line 82, in <module>
> > >     startLocalizationJob(jobName, splitX, splitY, splitZ, doSplitAngles=False)
> > >   File "/usr/local/pytom/bin/localization.py", line 11, in startLocalizationJob
> > >     job.check()
> > >   File "/usr/local/pytom/localization/peak_job.py", line 184, in check
> > >     raise IOError('File: ' + str(self.volume) + ' not found!')
> > > IOError: File: <Volume Binning="[0, 0, 0]" Filename="PY79.mrc" Sampling="[0, 0, 0]" Subregion="[0, 0, 0, 0, 0, 0]"/>
> > > not found!
> > > --------------------------------------------------------------------------
> > > mpirun detected that one or more processes exited with non-zero status,
> > > thus causing the job to be terminated. The first process to do so was:
> > >
> > >   Process name: [[27852,1],3]
> > >   Exit code: 1
> > >
> > > Any idea how I can circumvent this? Or what exactly might be going wrong?
> > >
> > > 2. Once you have the pl.xml file generated after extraction of
> > > particles, we have the script VolumeDialog.py for viewing it in Chimera.
> > > I already installed it in the Volume Viewer folder. How exactly are we
> > > supposed to use it? Also, there were a couple of existing files of the
> > > same name already in that folder (is that the case with everyone here?).
> > > Anything to do with them?
> > >
> > > Thanks!

--
Dr. Friedrich Foerster
Max-Planck Institut fuer Biochemie
Am Klopferspitz 18
D-82152 Martinsried

Tel: +49 89 8578 2632
Fax: +49 89 8578 2641

www.biochem.mpg.de/foerster