From: Dmitry S. <Sem...@gm...> - 2021-05-18 13:12:43

Dear Grigory,

Yes, I did that, and the particles look fine. I think the issue still comes from the fact that the cryosparc Export originally split the particle stacks across 624 mrc files, while the total number of particles is about 44 818. So even after the renaming and the export I still see this in the SCIPION export log.

What I guess may help is to combine all those files into one mrcs stack first and then import that into SCIPION. Do you perhaps know how to do that?

Thank you

Sincerely,
Dmitry

> On 18. May 2021, at 15:02, Grigory Sharov <sha...@gm...> wrote:
>
> Hi Dmitry,
>
> as the error states, your star file points to a non-existing image in the mrcs stack. You need to check first if the import from cryosparc with mrcs worked correctly (open / display the particles), then trace all the steps you did before the 2D classification.
>
> Best regards,
> Grigory

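A minimal sketch of how such a merge could be done outside Scipion, assuming the third-party mrcfile and numpy Python packages and that all exported stacks share the same box size; the folder and output file names below are made-up examples:

# Merge many per-micrograph particle stacks (*.mrcs) into a single stack.
import glob
import numpy as np
import mrcfile

stack_files = sorted(glob.glob("cryosparc_export/*.mrcs"))   # hypothetical export folder

pieces = []
for path in stack_files:
    with mrcfile.open(path, permissive=True) as m:
        data = np.asarray(m.data, dtype=np.float32)
        if data.ndim == 2:                  # a one-particle file comes back as a 2D array
            data = data[np.newaxis, ...]
        pieces.append(data)

merged = np.concatenate(pieces, axis=0)
print("particles written:", merged.shape[0])

with mrcfile.new("all_particles.mrcs", overwrite=True) as out:
    out.set_data(merged)
    out.set_image_stack()                   # mark the result as an image stack, not a volume

Note that any star file describing these particles would then also need its image references rewritten to point into the combined stack, so merging the images alone does not repair the metadata.
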
From: Grigory S. <sha...@gm...> - 2021-05-18 13:03:12

Hi Dmitry,

as the error states, your star file points to a non-existing image in the mrcs stack. You need to check first whether the import from cryosparc with the mrcs files worked correctly (open / display the particles), then trace all the steps you did before the 2D classification.

Best regards,
Grigory

--------------------------------------------------------------------------------
Grigory Sharov, Ph.D.

MRC Laboratory of Molecular Biology,
Francis Crick Avenue,
Cambridge Biomedical Campus,
Cambridge CB2 0QH, UK.
tel. +44 (0) 1223 267228
e-mail: gs...@mr...

> On Tue, May 18, 2021 at 1:25 PM Dmitry Semchonok <Sem...@gm...> wrote:
>
> Dear Pablo,
>
> Thank you. I have heard about this option. For that I guess https://pypi.org/project/cs2star/ needs to be installed.

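A quick way to do that first check is to open one of the renamed stacks and confirm it really is read as a multi-image stack, sketched here with the third-party mrcfile package; the file name is a made-up example:

# Confirm a renamed .mrcs file is recognised as a particle stack.
import mrcfile

with mrcfile.open("particles_0001.mrcs", permissive=True) as m:   # hypothetical name
    print("data shape     :", m.data.shape)         # expected (n_particles, ny, nx)
    print("is image stack :", m.is_image_stack())   # expected True for a particle stack
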
From: Roberto M. <ro...@cn...> - 2021-05-18 12:42:24

Furthermore, MRC is a tricky format that has been extended by different packages and reimplemented several times, so there are several dialects. Scipion tries to honor the definition described here:

MRC2014: Extensions to the MRC format header for electron cryo-microscopy and tomography. Journal of Structural Biology 192.2 (2015): 146-150.

I do not know about cryosparc.

Roberto

> On Tue, May 18, 2021 at 2:19 PM Pablo Conesa <pc...@cn...> wrote:
>
> I'd say there is an error in the association of the image files.
>
> readMRC: Image number 11 exceeds stack size 1 of image 000011@Runs/001300_ProtRelionClassify2D/extra/input/1024562735536827037_FoilHole_1618719_Data_1621438_1621440_20200703_085118_Fractions_patch_aligned_doseweighted_particles.mrcs
>
> Basically, 1024562735536827037_FoilHole_1618719_Data_1621438_1621440_20200703_085118_Fractions_patch_aligned_doseweighted_particles.mrcs does not have 11 images?

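As an illustration of what that definition pins down, here is a small sketch (standard library only) that reads the MRC2014 header words separating an image stack from a volume; the file name is a made-up example and little-endian data is assumed:

# Inspect the MRC2014 header fields that distinguish a stack from a volume.
import struct

def describe_mrc(path):
    with open(path, "rb") as f:
        header = f.read(1024)                            # fixed-size MRC main header
    nx, ny, nz = struct.unpack_from("<3i", header, 0)    # words 1-3: dimensions
    mz = struct.unpack_from("<i", header, 36)[0]         # word 10: MZ
    ispg = struct.unpack_from("<i", header, 88)[0]       # word 23: space group
    if ispg == 0 and mz == 1 and nz > 1:
        kind = "MRC2014 image stack (NZ = number of images)"
    elif ispg == 0 and nz == 1:
        kind = "single 2D image"
    else:
        kind = "volume, or a pre-MRC2014 dialect"
    print(f"{path}: NX={nx} NY={ny} NZ={nz} MZ={mz} ISPG={ispg} -> {kind}")

describe_mrc("particles_0001.mrcs")                      # hypothetical file name
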
From: Dmitry S. <Sem...@gm...> - 2021-05-18 12:25:12

Dear Pablo,

Thank you. I have heard about this option. For that I guess https://pypi.org/project/cs2star/ needs to be installed.

In Cryosparc itself there is an option to export files, and what we then get is a set of mrc files with a different number of particles in each. It turned out to be possible to rename the files from mrc to mrcs; then SCIPION can import those particles.

Currently the problem is that neither relion nor cryosparc can process these particles.

Relion stops with this readMRC error (run 001546):

00047: ERROR:
00048: readMRC: Image number 11 exceeds stack size 1 of image 000011@Runs/001546_ProtRelionClassify2D/extra/input/1024562735536827037_FoilHole_1618719_Data_1621438_1621440_20200703_085118_Fractions_patch_aligned_doseweighted_particles.mrcs
00084: FAILED: runRelionStep, step 2, time 2021-05-18 12:33:29.230213
00085: *** Last status is failed
00086: ------------------- PROTOCOL FAILED (DONE 2/3)

Cryosparc (in SCIPION) requires a CTF to run.

This is where I am now. Perhaps there is a solution?

Sincerely,
Dmitry

> On 18. May 2021, at 14:16, Pablo Conesa <pc...@cn...> wrote:
>
> Dear Dmitry, the import of CS metadata files (*.cs) is not supported in Scipion. Does CS have an option to export to star files? It rings a bell.

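One way to see which stack is behind this readMRC complaint is to cross-check the star file against the headers of every file it references. A rough sketch, assuming standard RELION-style "index@file" image names and little-endian MRC headers; the star file name is a made-up example, and the script is meant to be run from the project directory so the relative paths resolve:

# Cross-check star-file image references against the actual stack sizes.
import re
import struct
from collections import defaultdict

def stack_depth(path):
    # NZ (number of sections / images) is header word 3 of an MRC file.
    with open(path, "rb") as f:
        return struct.unpack_from("<i", f.read(12), 8)[0]

highest = defaultdict(int)                        # highest particle index used per stack
ref = re.compile(r"(\d{6,})@(\S+\.mrcs?)")
with open("input_particles.star") as star:        # hypothetical star file name
    for line in star:
        for idx, fname in ref.findall(line):
            highest[fname] = max(highest[fname], int(idx))

for fname, max_idx in sorted(highest.items()):
    nz = stack_depth(fname)
    if max_idx > nz:
        print(f"PROBLEM: {fname} holds {nz} image(s) but the star file asks for image {max_idx}")
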
From: Pablo C. <pc...@cn...> - 2021-05-18 12:19:57

I'd say there is an error in the association of the image files:

readMRC: Image number 11 exceeds stack size 1 of image 000011@Runs/001300_ProtRelionClassify2D/extra/input/1024562735536827037_FoilHole_1618719_Data_1621438_1621440_20200703_085118_Fractions_patch_aligned_doseweighted_particles.mrcs

Basically, 1024562735536827037_FoilHole_1618719_Data_1621438_1621440_20200703_085118_Fractions_patch_aligned_doseweighted_particles.mrcs does not have 11 images?

--
Pablo Conesa - Madrid Scipion team (http://scipion.i2pc.es)

From: Pablo C. <pc...@cn...> - 2021-05-18 12:17:11

Dear Dmitry, the import of CS metadata files (*.cs) is not supported in Scipion. Does CS have an option to export to star files? It rings a bell.

On 18/5/21 9:53, Dmitry Semchonok wrote:

> Dear Grigory,
>
> The files are in mrc format.
>
> Please, let me try to explain in more detail:
>
> I have a project in cryosparc. There I have cryosparc-selected 2D classes. I want to export the particles of those classes into SCIPION.
>
> So I pressed Export (fig 1) and the program (cryosparc) created the folder with mrc + other files (fig 2; 3). I looked into J48 and found many *.mrc files of the particles. But it is not 1 mrc = 1 particle. It seems to be an mrc stack, so I have several images inside 1 *.mrc (fig 4) (you can also notice that they all have different sizes).
>
> So I need to export them somehow into SCIPION.
>
> For that, I used the SCIPION export - images protocol, where for the files to add I put *.mrc. But the protocol seems to add only 1 mrc as 1 picture, and instead of having 46392 particles I have ~600 particles.
>
> (Also the geometry seems not to be preserved.)
>
> So my question is how to export the particles from cryosparc into SCIPION correctly?
>
> Thank you!
>
> https://disk.yandex.com/d/Fv3Q1lpwEzSisg
>
> Sincerely,
>
> Dmitry

--
Pablo Conesa - Madrid Scipion team (http://scipion.i2pc.es)

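Whether or not such a star export exists, the exported metadata can at least be inspected directly: to my knowledge cryosparc .cs files are plain numpy record arrays, so a sketch like the one below (the file name is a made-up example) shows which fields a converter such as cs2star or pyem's csparc2star.py would work from:

# Peek inside a cryosparc .cs metadata file (assumed to be a numpy record array).
import numpy as np

cs = np.load("J48_particles_exported.cs")      # hypothetical export file
print("rows in file    :", len(cs))
print("available fields:")
for field in (cs.dtype.names or []):
    print("   ", field)
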
From: Dmitry S. <Sem...@gm...> - 2021-05-18 10:34:55

Dear colleagues,

We managed to import the particles from cryosparc into SCIPION (thanks to Grigory). But when I want to run the 2D classification I get the error:

1. With relion

Follower 2 runs on host = cryoem01
00031: =================
00032: uniqueHost cryoem01 has 2 ranks.
00033: GPU-ids not specified for this rank, threads will automatically be mapped to available devices.
00034: Thread 0 on follower 1 mapped to device 0
00035: Thread 1 on follower 1 mapped to device 0
00036: Thread 2 on follower 1 mapped to device 0
00037: GPU-ids not specified for this rank, threads will automatically be mapped to available devices.
00038: Thread 0 on follower 2 mapped to device 1
00039: Thread 1 on follower 2 mapped to device 1
00040: Thread 2 on follower 2 mapped to device 1
00041: Running CPU instructions in double precision.
00042: + WARNING: Changing psi sampling rate (before oversampling) to 5.625 degrees, for more efficient GPU calculations
00043: + On host cryoem01: free scratch space = 448.912 Gb.
00044: Copying particles to scratch directory: /data1/new_scratch/relion_volatile/
00045: 000/??? sec ~~(,_,"> [oo]
00046: 0/ 0 sec ~~(,_,">in: /opt/Scipion3/software/em/relion-3.1.2/src/rwMRC.h, line 192
00047: ERROR:
00048: readMRC: Image number 11 exceeds stack size 1 of image 000011@Runs/001300_ProtRelionClassify2D/extra/input/1024562735536827037_FoilHole_1618719_Data_1621438_1621440_20200703_085118_Fractions_patch_aligned_doseweighted_particles.mrcs
00049: === Backtrace ===
00050: /opt/Scipion3/software/em/relion-3.1.2/bin/relion_refine_mpi(_ZN11RelionErrorC1ERKSsS1_l+0x41) [0x4786a1]
00051: /opt/Scipion3/software/em/relion-3.1.2/bin/relion_refine_mpi(_ZN5ImageIdE7readMRCElbRK8FileName+0x99f) [0x4b210f]
00052: /opt/Scipion3/software/em/relion-3.1.2/bin/relion_refine_mpi(_ZN5ImageIdE5_readERK8FileNameR13fImageHandlerblbb+0x17b) [0x4b407b]
00053: /opt/Scipion3/software/em/relion-3.1.2/bin/relion_refine_mpi(_ZN10Experiment22copyParticlesToScratchEibbd+0xda7) [0x5b8f87]
00054: /opt/Scipion3/software/em/relion-3.1.2/bin/relion_refine_mpi(_ZN14MlOptimiserMpi18initialiseWorkLoadEv+0x210) [0x498540]
00055: /opt/Scipion3/software/em/relion-3.1.2/bin/relion_refine_mpi(_ZN14MlOptimiserMpi10initialiseEv+0x9aa) [0x49ab2a]
00056: /opt/Scipion3/software/em/relion-3.1.2/bin/relion_refine_mpi(main+0x55) [0x4322a5]
00057: /lib64/libc.so.6(__libc_start_main+0xf5) [0x7f219de79555]
00058: /opt/Scipion3/software/em/relion-3.1.2/bin/relion_refine_mpi() [0x435fbf]
00059: ==================
00060: ERROR:
00061: readMRC: Image number 11 exceeds stack size 1 of image 000011@Runs/001300_ProtRelionClassify2D/extra/input/1024562735536827037_FoilHole_1618719_Data_1621438_1621440_20200703_085118_Fractions_patch_aligned_doseweighted_particles.mrcs
00062: application called MPI_Abort(MPI_COMM_WORLD, 1) - process 1
00063: Traceback (most recent call last):
00064: File "/usr/local/miniconda/envs/scipion3/lib/python3.8/site-packages/pyworkflow/protocol/protocol.py", line 197, in run
00065: self._run()
00066: File "/usr/local/miniconda/envs/scipion3/lib/python3.8/site-packages/pyworkflow/protocol/protocol.py", line 248, in _run
00067: resultFiles = self._runFunc()
00068: File "/usr/local/miniconda/envs/scipion3/lib/python3.8/site-packages/pyworkflow/protocol/protocol.py", line 244, in _runFunc
00069: return self._func(*self._args)
00070: File "/usr/local/miniconda/envs/scipion3/lib/python3.8/site-packages/relion/protocols/protocol_base.py", line 811, in runRelionStep
00071: self.runJob(self._getProgram(), params)
00072: File "/usr/local/miniconda/envs/scipion3/lib/python3.8/site-packages/pyworkflow/protocol/protocol.py", line 1388, in runJob
00073: self._stepsExecutor.runJob(self._log, program, arguments, **kwargs)
00074: File "/usr/local/miniconda/envs/scipion3/lib/python3.8/site-packages/pyworkflow/protocol/executor.py", line 65, in runJob
00075: process.runJob(log, programName, params,
00076: File "/usr/local/miniconda/envs/scipion3/lib/python3.8/site-packages/pyworkflow/utils/process.py", line 52, in runJob
00077: return runCommand(command, env, cwd)
00078: File "/usr/local/miniconda/envs/scipion3/lib/python3.8/site-packages/pyworkflow/utils/process.py", line 67, in runCommand
00079: check_call(command, shell=True, stdout=sys.stdout, stderr=sys.stderr,
00080: File "/usr/local/miniconda/envs/scipion3/lib/python3.8/subprocess.py", line 364, in check_call
00081: raise CalledProcessError(retcode, cmd)
00082: subprocess.CalledProcessError: Command ' mpirun -np 3 `which relion_refine_mpi` --i Runs/001300_ProtRelionClassify2D/input_particles.star --particle_diameter 690 --zero_mask --K 64 --norm --scale --o Runs/001300_ProtRelionClassify2D/extra/relion --oversampling 1 --flatten_solvent --tau2_fudge 2.0 --iter 25 --offset_range 5.0 --offset_step 2.0 --psi_step 10.0 --dont_combine_weights_via_disc --scratch_dir /data1/new_scratch/ --pool 3 --gpu --j 3' returned non-zero exit status 1.
00083: Protocol failed: Command ' mpirun -np 3 `which relion_refine_mpi` --i Runs/001300_ProtRelionClassify2D/input_particles.star --particle_diameter 690 --zero_mask --K 64 --norm --scale --o Runs/001300_ProtRelionClassify2D/extra/relion --oversampling 1 --flatten_solvent --tau2_fudge 2.0 --iter 25 --offset_range 5.0 --offset_step 2.0 --psi_step 10.0 --dont_combine_weights_via_disc --scratch_dir /data1/new_scratch/ --pool 3 --gpu --j 3' returned non-zero exit status 1.
00084: FAILED: runRelionStep, step 2, time 2021-05-18 12:25:42.553565
00085: *** Last status is failed
00086: ------------------- PROTOCOL FAILED (DONE 2/3)

2. With cryosparc

No CTF information.

Could you please suggest the solution?

Thank you

Sincerely,
Dmitry

From: Grigory S. <sha...@gm...> - 2021-05-18 08:06:41
|
Hi, I don't use cryosparc. You need to rename mrc to mrcs. MRC format was never designed for stacks, mrc is only for a single 2D image or a 3D volume. 2D stacks should be mrcs (as in relion & eman2). Best regards, Grigory -------------------------------------------------------------------------------- Grigory Sharov, Ph.D. MRC Laboratory of Molecular Biology, Francis Crick Avenue, Cambridge Biomedical Campus, Cambridge CB2 0QH, UK. tel. +44 (0) 1223 267228 <+44%201223%20267228> e-mail: gs...@mr... On Tue, May 18, 2021 at 8:53 AM Dmitry Semchonok <Sem...@gm...> wrote: > Dear Grigory, > > > > The files are in mrc format. > > > > Please, let me try to explain more plan: > > I have a project in cryosparc. There I have cryosparc selected 2D > classes. I want to export the particles of those classes into SCIPION. > > So I I pressed Export (fig 1) and the program(cryosparc) created the > folder with mrc + other files (fig 2;3). I looked into J48 and found many > *.mrc files of the particles. But it is not 1 mrc = 1 particle. It seems to > be a mrc stuck - so I have several files inside 1 *.mrc (fig 4) (you can > also notice that they all have different sizes) > > So I need to export them somehow in SCIPION > > For that, I used the SCIPION export - images protocol where for the files > to add I put *.mrc. But the protocol seems to be added only 1 mrc as 1 > picture and instead of having 46392 particles I have ~600 particles. > > (Also the geometry seems not preserved). > > > > So my question how to export the particles from cryosparc into SCIPION > correctly? > > > > Thank you! > > > > https://disk.yandex.com/d/Fv3Q1lpwEzSisg > > Sincerely, > > Dmitry > > > > On 17. May 2021, at 18:12, Grigory Sharov <sha...@gm...> > wrote: > > Hi Dmitry, > > mrc stacks should have "mrcs" extension. Is this the problem you are > getting? > > Best regards, > Grigory > > > -------------------------------------------------------------------------------- > Grigory Sharov, Ph.D. > > MRC Laboratory of Molecular Biology, > Francis Crick Avenue, > Cambridge Biomedical Campus, > Cambridge CB2 0QH, UK. > tel. +44 (0) 1223 267228 <+44%201223%20267228> > e-mail: gs...@mr... > > > On Mon, May 17, 2021 at 4:49 PM Dmitry Semchonok <Sem...@gm...> > wrote: > >> Dear colleagues, >> >> I would like to export the particles from cryosparc to SCIPION. >> >> How to do that? >> >> >> >> >> What I tried: >> >> >> 1. In cryosparc I pressed Export – to export the particles I am >> interested in. >> >> 2. In the folder Export – I found many mrc stacks with particles in >> each. >> >> 3. I tried to export them to SCIPION using Export particles but >> instead of reading each stack and combine them in the 1 dataset I received >> 1 particle / per each mrc stack. >> >> >> Any ideas? >> >> >> Thank you >> >> Sincerely, >> >> Dmitry >> >> _______________________________________________ >> scipion-users mailing list >> sci...@li... >> https://lists.sourceforge.net/lists/listinfo/scipion-users >> > _______________________________________________ > scipion-users mailing list > sci...@li... > https://lists.sourceforge.net/lists/listinfo/scipion-users > > > |
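If only the extension rename described above is needed, a small Python sketch along these lines can do it in bulk. This is a minimal sketch, not part of Scipion or cryosparc; the folder name is an assumption and should point at the cryosparc export directory that holds the particle stacks.

from pathlib import Path

export_dir = Path("J48")  # assumed path to the cryoSPARC export folder with the *.mrc stacks
for stack in sorted(export_dir.glob("*.mrc")):
    target = stack.with_suffix(".mrcs")  # same file, stack-style extension
    stack.rename(target)
    print(f"{stack.name} -> {target.name}")

Keep in mind that any metadata file which still refers to the old .mrc names (for example a .star file) would need the same substitution, otherwise downstream jobs will look for files that no longer exist.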
From: Dmitry S. <Sem...@gm...> - 2021-05-18 07:54:07
|
Dear Grigory,

The files are in mrc format.

Please let me try to explain the plan in more detail:

I have a project in cryosparc. There I have cryosparc-selected 2D classes. I want to export the particles of those classes into SCIPION.

So I pressed Export (fig 1) and the program (cryosparc) created the folder with mrc + other files (fig 2;3). I looked into J48 and found many *.mrc files of the particles. But it is not 1 mrc = 1 particle. It seems to be an mrc stack - so I have several particle images inside 1 *.mrc (fig 4) (you can also notice that they all have different sizes).

So I need to get them somehow into SCIPION.

For that, I used the SCIPION export - images protocol, where for the files to add I put *.mrc. But the protocol seems to have added each mrc as only 1 picture, so instead of having 46392 particles I have ~600 particles. (Also the geometry does not seem to be preserved.)

So my question is: how do I export the particles from cryosparc into SCIPION correctly?

Thank you!

https://disk.yandex.com/d/Fv3Q1lpwEzSisg

Sincerely,
Dmitry

> On 17. May 2021, at 18:12, Grigory Sharov <sha...@gm...> wrote: > > Hi Dmitry, > > mrc stacks should have "mrcs" extension. Is this the problem you are getting? > > Best regards, > Grigory > > -------------------------------------------------------------------------------- > Grigory Sharov, Ph.D. > > MRC Laboratory of Molecular Biology, > Francis Crick Avenue, > Cambridge Biomedical Campus, > Cambridge CB2 0QH, UK. > tel. +44 (0) 1223 267228 <tel:+44%201223%20267228> > e-mail: gs...@mr... <mailto:gs...@mr...> > > > On Mon, May 17, 2021 at 4:49 PM Dmitry Semchonok <Sem...@gm... <mailto:Sem...@gm...>> wrote: > Dear colleagues, > > I would like to export the particles from cryosparc to SCIPION. > > How to do that? > > What I tried: > > 1. In cryosparc I pressed Export – to export the particles I am interested in. > > 2. In the folder Export – I found many mrc stacks with particles in each. > > 3. I tried to export them to SCIPION using Export particles but instead of reading each stack and combine them in the 1 dataset I received 1 particle / per each mrc stack. > > > Any ideas? > > > Thank you > > Sincerely, > > Dmitry > > > _______________________________________________ > scipion-users mailing list > sci...@li... <mailto:sci...@li...> > https://lists.sourceforge.net/lists/listinfo/scipion-users <https://lists.sourceforge.net/lists/listinfo/scipion-users> > _______________________________________________ > scipion-users mailing list > sci...@li... > https://lists.sourceforge.net/lists/listinfo/scipion-users
|
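A quick way to confirm that the exported files really are multi-image stacks, and how many particle images they hold in total, is to read just the MRC headers. A minimal sketch, assuming the mrcfile Python package is installed and that the export job folder is the J48 directory mentioned above:

from pathlib import Path

import mrcfile  # pip install mrcfile

export_dir = Path("J48")  # assumed cryoSPARC export job folder
total = 0
for f in sorted(export_dir.glob("*.mrc")):
    with mrcfile.open(str(f), permissive=True, header_only=True) as mrc:
        n = int(mrc.header.nz)  # number of sections; >1 means the file is a stack
    total += n
    print(f"{f.name}: {n} images")
print("total images across all files:", total)

If the total comes out close to 46392 while the number of files is only ~600, the files are indeed stacks, and the import has to treat each file as a stack (hence the mrcs extension) rather than as a single image per file.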
From: Grigory S. <sha...@gm...> - 2021-05-17 16:13:08
|
Hi Dmitry, mrc stacks should have "mrcs" extension. Is this the problem you are getting? Best regards, Grigory -------------------------------------------------------------------------------- Grigory Sharov, Ph.D. MRC Laboratory of Molecular Biology, Francis Crick Avenue, Cambridge Biomedical Campus, Cambridge CB2 0QH, UK. tel. +44 (0) 1223 267228 <+44%201223%20267228> e-mail: gs...@mr... On Mon, May 17, 2021 at 4:49 PM Dmitry Semchonok <Sem...@gm...> wrote: > Dear colleagues, > > I would like to export the particles from cryosparc to SCIPION. > > How to do that? > > > > > > > What I tried: > > > > 1. In cryosparc I pressed Export – to export the particles I am > interested in. > > 2. In the folder Export – I found many mrc stacks with particles in > each. > > 3. I tried to export them to SCIPION using Export particles but > instead of reading each stack and combine them in the 1 dataset I received > 1 particle / per each mrc stack. > > > > Any ideas? > > > > Thank you > > Sincerely, > > Dmitry > > > _______________________________________________ > scipion-users mailing list > sci...@li... > https://lists.sourceforge.net/lists/listinfo/scipion-users > |
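One way to answer that question is to compare what the metadata asks for with what each file's MRC header declares. The sketch below is an illustration only: it assumes the mrcfile package and a relion-style star file in which particles are written as index@stack (for example the input_particles.star quoted further up this page, with paths resolved from the Scipion project directory); any stack whose header reports fewer sections than the highest requested index is exactly what produces the "Image number N exceeds stack size M" error shown earlier in the thread.

import re
from collections import defaultdict

import mrcfile  # pip install mrcfile

star_path = "input_particles.star"  # assumed star file; adjust to the real one
pattern = re.compile(r"(\d+)@(\S+\.mrcs?)")  # matches e.g. 000011@path/to/stack.mrcs

needed = defaultdict(int)  # highest image index requested per stack
with open(star_path) as fh:
    for line in fh:
        m = pattern.search(line)
        if m:
            stack = m.group(2)
            needed[stack] = max(needed[stack], int(m.group(1)))

for stack, want in sorted(needed.items()):
    with mrcfile.open(stack, permissive=True, header_only=True) as mrc:
        have = int(mrc.header.nz)  # sections declared by the MRC header
    status = "OK" if have >= want else "TOO SMALL"
    print(f"{stack}: star needs up to image {want}, header reports {have} -> {status}")

A stack flagged TOO SMALL holds fewer images on disk than the metadata claims, which usually means the problem is upstream in how the stacks were exported or imported, rather than in the classification protocol itself.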
From: Dmitry S. <Sem...@gm...> - 2021-05-17 15:49:31
|
Dear colleagues,

I would like to export the particles from cryosparc to SCIPION. How can I do that?

What I tried:

1. In cryosparc I pressed Export – to export the particles I am interested in.
2. In the Export folder I found many mrc stacks, with particles in each.
3. I tried to bring them into SCIPION using Export particles, but instead of reading each stack and combining them into one dataset, I received 1 particle per mrc stack.

Any ideas?

Thank you

Sincerely,
Dmitry
|
From: Grigory S. <sha...@gm...> - 2021-05-14 08:17:28
|
Hi Karl, there are two problems: 1) you created a conda environment then activated it and then used scipion installer to create virtualenv inside conda env. This is a very weird setup. If you want to use conda, please follow the installation instructions. You only need to source conda itself (do not create env or activate it), and then run the installer with specific options. 2) you are using cuda 11.1 which is not supported by xmipp. You need to have cuda 10.x or disable CUDA during xmipp installation. Best regards, Grigory -------------------------------------------------------------------------------- Grigory Sharov, Ph.D. MRC Laboratory of Molecular Biology, Francis Crick Avenue, Cambridge Biomedical Campus, Cambridge CB2 0QH, UK. tel. +44 (0) 1223 267228 <+44%201223%20267228> e-mail: gs...@mr... On Fri, May 14, 2021 at 7:03 AM Karl Herbine < Kar...@st...> wrote: > Hi there, > > I can't seem to find the solution to this problem. I have tried installing > scipion3 for the past few days and I am about to give up. Attached I have > included the output of the installation after following the guide on your > website. It fails after running: > > python3 -m scipioninstaller /path/where/you/want/scipion -j 4 > > Things that i've tried so far: > 1) reinstalling libzstd-devel hdf5-devel gcc gcc-c++ openmpi-devel > 2) reinstalling miniconda > 3) uninstalling and reinstalling pip install --user scipion-installer > 4) pip install opencv-python > 5) uninstalling and installing xmipp using the plugin manager on scipion3 > GUI > > Any advice or help would be greatly appreciated. > > Thanks, > Karl > > _______________________________________________ > scipion-users mailing list > sci...@li... > https://lists.sourceforge.net/lists/listinfo/scipion-users > |
From: Karl H. <Kar...@st...> - 2021-05-13 21:22:45
|
Hi there, I can't seem to find the solution to this problem. I have tried installing scipion3 for the past few days and I am about to give up. Attached I have included the output of the installation after following the guide on your website. It fails after running: python3 -m scipioninstaller /path/where/you/want/scipion -j 4 Things that i've tried so far: 1) reinstalling libzstd-devel hdf5-devel gcc gcc-c++ openmpi-devel 2) reinstalling miniconda 3) uninstalling and reinstalling pip install --user scipion-installer 4) pip install opencv-python 5) uninstalling and installing xmipp using the plugin manager on scipion3 GUI Any advice or help would be greatly appreciated. Thanks, Karl |
From: Grigory S. <sha...@gm...> - 2021-05-13 12:42:18
|
Hello Annie, I'm sorry to hear about your troubles with xmipp installation. If you switch to the "Output Log" tab, what does it say there? Could you attach the whole output of that tab here as a text file? It looks like the xmipp plugin got installed but the binaries compilation has failed. Best regards, Grigory -------------------------------------------------------------------------------- Grigory Sharov, Ph.D. MRC Laboratory of Molecular Biology, Francis Crick Avenue, Cambridge Biomedical Campus, Cambridge CB2 0QH, UK. tel. +44 (0) 1223 267228 <+44%201223%20267228> e-mail: gs...@mr... On Thu, May 13, 2021 at 10:27 AM Anne M Dosey <do...@uw...> wrote: > Hello, > > I am a graduate student in Neil King's group at the University of > Washington. We have been trying to get scipion installed on our server > recently but have been running into several python/other package dependency > errors. Specifically, I am trying to get the local rec plugin working > properly. So far, we have scipion installed and running some things in it > works (relion jobs work fine), but the define subparticles module within > local rec does not work and is really what I need to use. I get the > following error in my terminal about Xmipp when I try to run it: > > >>> WARNING: Image library not found! > > Please install Xmipp to get full functionality. > (Configuration->Plugins->scipion-em-xmipp -> expand, in Scipion plugin > manager window) > > > > error when importing from xmipp3.protocols: cannot import name > 'isMdEmpty' from 'xmipp3.base' > (/home/doseya/.local/lib/python3.8/site-packages/xmipp3/base.py) > > And then there's a "'NoneType' object as no attribute 'AddObject'" error > in the output.log. I also think I have xmipp installed as a plugin (black > box is there next to scipion-em-xmipp) but then there's a sub-plugin for > xmipp under that that won't install for some reason (see screenshot), so > not sure how to properly install xmipp, which may fix this issue? > > Thanks for any help on this issue! > Annie Dosey > _______________________________________________ > scipion-users mailing list > sci...@li... > https://lists.sourceforge.net/lists/listinfo/scipion-users > |
From: Carlos O. S. <co...@cn...> - 2021-05-10 18:20:41
|
Dear Genis, El 10/05/2021 a las 19:35, Genís Valentín Gesé escribió: > Hi everyone and thanks for all the replies! > > The three suggested approaches allowed me to automatically separate > good from bad 2D classes. > > "cryoassess", "xmipp-eliminate empty classes" and "gl2d" produce a > SetOfAverages or a SetOfClasses2D. > > I know I can use a SetOfAverages as an input to xmipp-ransac or > eman2-initial model. However, is it possible to use these to generate > an initial model with relion or cryosparc? > I have never used relion or cryosparc with class averages. Probably, as the protocols are written now, they will complain that the set of averages has no CTF information (which is true, the particles have CTF, but the averages do not). Kind regards, Carlos Oscar > Best regards > > *Genis Valentin Gese* | Postdoctoral researcher > > Martin Hällberg group > Cell and Molecular Biology Department | Karolinska Institutet > 171 65 Stockholm | Solnavägen 9 > gen...@ki... <mailto:fir...@ki...> | ki.se > <https://eur01.safelinks.protection.outlook.com/?url=http%3A%2F%2Fki.se%2Fen%2Fstartpage&data=02%7C01%7Cgenis.valentin.gese%40ki.se%7C96f824255829424948f308d82a5e4ec9%7Cbff7eef1cf4b4f32be3da1dda043c05d%7C0%7C0%7C637305931833981292&sdata=yXbUoiFkJqjlBIYtLNaNEXIlydBY8N0LVZ5C9bWR5nU%3D&reserved=0> > > > ------------------------------------------------------------------------ > *Från:* Carlos Oscar Sorzano <co...@cn...> > *Skickat:* den 10 maj 2021 09:39 > *Till:* Mailing list for Scipion users > <sci...@li...>; Genís Valentín Gesé > <gen...@ki...> > *Ämne:* Re: [scipion-users] 2D classification in streaming > > Dear Genis, > > > the idea is that cryoassess is used as a static protocol, labels the > class representatives into good and bad classes, and then GL2D > classifies in streaming all the particles into these classes. > > > Cheers, Carlos Oscar > > > El 09/05/2021 a las 11:15, Genís Valentín Gesé escribió: >> Hello all! >> >> At the moment I'm using "xmipp-gl2d static" to classify a streaming >> set of particles, and it would be great if one could get a stream of >> particles belonging to the good 2d classes. >> >> In this way, one could use 2D classification to filter out bad >> particles before further processing. >> >> I looked into cryoasses, but it does not seem to run in streaming mode. >> >> Any ideas on whether this can be done? >> >> Thanks and best regards >> >> *Genis Valentin Gese* | Postdoctoral researcher >> >> Martin Hällberg group >> Cell and Molecular Biology Department | Karolinska Institutet >> 171 65 Stockholm | Solnavägen 9 >> gen...@ki... <mailto:fir...@ki...> | ki.se >> <https://eur01.safelinks.protection.outlook.com/?url=http%3A%2F%2Fki.se%2Fen%2Fstartpage&data=04%7C01%7Cgenis.valentin.gese%40ki.se%7C9537b354fd5743ec20f808d913978e43%7Cbff7eef1cf4b4f32be3da1dda043c05d%7C0%7C0%7C637562363919527485%7CUnknown%7CTWFpbGZsb3d8eyJWIjoiMC4wLjAwMDAiLCJQIjoiV2luMzIiLCJBTiI6Ik1haWwiLCJXVCI6Mn0%3D%7C1000&sdata=hwzBc77vMs%2F%2FBrwyainuJus8R10IIjI2ABz1EaCShQc%3D&reserved=0> >> >> >> >> >> /När du skickar e-post till Karolinska Institutet (KI) innebär detta >> att KI kommer att behandla dina personuppgifter. 
/Här finns >> information om hur KI behandlar personuppgifter >> <https://eur01.safelinks.protection.outlook.com/?url=https%3A%2F%2Fki.se%2Fmedarbetare%2Fintegritetsskyddspolicy&data=04%7C01%7Cgenis.valentin.gese%40ki.se%7C9537b354fd5743ec20f808d913978e43%7Cbff7eef1cf4b4f32be3da1dda043c05d%7C0%7C0%7C637562363919537440%7CUnknown%7CTWFpbGZsb3d8eyJWIjoiMC4wLjAwMDAiLCJQIjoiV2luMzIiLCJBTiI6Ik1haWwiLCJXVCI6Mn0%3D%7C1000&sdata=E1%2Fl7SnjxsHAapTdk800TbheMBt5v2rtu6G9Z8%2FS%2F8Q%3D&reserved=0>. >> >> >> /Sending email to Karolinska Institutet (KI) will result in KI >> processing your personal data./ You can read more about KI’s >> processing of personal data here >> <https://ki.se/en/staff/data-protection-policy>. >> >> >> >> _______________________________________________ >> scipion-users mailing list >> sci...@li... <mailto:sci...@li...> >> https://lists.sourceforge.net/lists/listinfo/scipion-users <https://eur01.safelinks.protection.outlook.com/?url=https%3A%2F%2Flists.sourceforge.net%2Flists%2Flistinfo%2Fscipion-users&data=04%7C01%7Cgenis.valentin.gese%40ki.se%7C9537b354fd5743ec20f808d913978e43%7Cbff7eef1cf4b4f32be3da1dda043c05d%7C0%7C0%7C637562363919537440%7CUnknown%7CTWFpbGZsb3d8eyJWIjoiMC4wLjAwMDAiLCJQIjoiV2luMzIiLCJBTiI6Ik1haWwiLCJXVCI6Mn0%3D%7C1000&sdata=5w6RBpv%2BFhM8LCZoVlVsIp7lTFVQ29BDZP3k9m%2BuwKM%3D&reserved=0> > > > _______________________________________________ > scipion-users mailing list > sci...@li... > https://lists.sourceforge.net/lists/listinfo/scipion-users |
From: Genís V. G. <gen...@ki...> - 2021-05-10 17:50:20
|
Hi everyone and thanks for all the replies! The three suggested approaches allowed me to automatically separate good from bad 2D classes. "cryoassess", "xmipp-eliminate empty classes" and "gl2d" produce a SetOfAverages or a SetOfClasses2D. I know I can use a SetOfAverages as an input to xmipp-ransac or eman2-initial model. However, is it possible to use these to generate an initial model with relion or cryosparc? Best regards Genis Valentin Gese | Postdoctoral researcher Martin Hällberg group Cell and Molecular Biology Department | Karolinska Institutet 171 65 Stockholm | Solnavägen 9 gen...@ki...<mailto:fir...@ki...> | ki.se<https://eur01.safelinks.protection.outlook.com/?url=http%3A%2F%2Fki.se%2Fen%2Fstartpage&data=02%7C01%7Cgenis.valentin.gese%40ki.se%7C96f824255829424948f308d82a5e4ec9%7Cbff7eef1cf4b4f32be3da1dda043c05d%7C0%7C0%7C637305931833981292&sdata=yXbUoiFkJqjlBIYtLNaNEXIlydBY8N0LVZ5C9bWR5nU%3D&reserved=0> ________________________________ Från: Carlos Oscar Sorzano <co...@cn...> Skickat: den 10 maj 2021 09:39 Till: Mailing list for Scipion users <sci...@li...>; Genís Valentín Gesé <gen...@ki...> Ämne: Re: [scipion-users] 2D classification in streaming Dear Genis, the idea is that cryoassess is used as a static protocol, labels the class representatives into good and bad classes, and then GL2D classifies in streaming all the particles into these classes. Cheers, Carlos Oscar El 09/05/2021 a las 11:15, Genís Valentín Gesé escribió: Hello all! At the moment I'm using "xmipp-gl2d static" to classify a streaming set of particles, and it would be great if one could get a stream of particles belonging to the good 2d classes. In this way, one could use 2D classification to filter out bad particles before further processing. I looked into cryoasses, but it does not seem to run in streaming mode. Any ideas on whether this can be done? Thanks and best regards Genis Valentin Gese | Postdoctoral researcher Martin Hällberg group Cell and Molecular Biology Department | Karolinska Institutet 171 65 Stockholm | Solnavägen 9 gen...@ki...<mailto:fir...@ki...> | ki.se<https://eur01.safelinks.protection.outlook.com/?url=http%3A%2F%2Fki.se%2Fen%2Fstartpage&data=04%7C01%7Cgenis.valentin.gese%40ki.se%7C9537b354fd5743ec20f808d913978e43%7Cbff7eef1cf4b4f32be3da1dda043c05d%7C0%7C0%7C637562363919527485%7CUnknown%7CTWFpbGZsb3d8eyJWIjoiMC4wLjAwMDAiLCJQIjoiV2luMzIiLCJBTiI6Ik1haWwiLCJXVCI6Mn0%3D%7C1000&sdata=hwzBc77vMs%2F%2FBrwyainuJus8R10IIjI2ABz1EaCShQc%3D&reserved=0> När du skickar e-post till Karolinska Institutet (KI) innebär detta att KI kommer att behandla dina personuppgifter. Här finns information om hur KI behandlar personuppgifter<https://eur01.safelinks.protection.outlook.com/?url=https%3A%2F%2Fki.se%2Fmedarbetare%2Fintegritetsskyddspolicy&data=04%7C01%7Cgenis.valentin.gese%40ki.se%7C9537b354fd5743ec20f808d913978e43%7Cbff7eef1cf4b4f32be3da1dda043c05d%7C0%7C0%7C637562363919537440%7CUnknown%7CTWFpbGZsb3d8eyJWIjoiMC4wLjAwMDAiLCJQIjoiV2luMzIiLCJBTiI6Ik1haWwiLCJXVCI6Mn0%3D%7C1000&sdata=E1%2Fl7SnjxsHAapTdk800TbheMBt5v2rtu6G9Z8%2FS%2F8Q%3D&reserved=0>. Sending email to Karolinska Institutet (KI) will result in KI processing your personal data. You can read more about KI’s processing of personal data here<https://ki.se/en/staff/data-protection-policy>. 
_______________________________________________ scipion-users mailing list sci...@li...<mailto:sci...@li...> https://lists.sourceforge.net/lists/listinfo/scipion-users<https://eur01.safelinks.protection.outlook.com/?url=https%3A%2F%2Flists.sourceforge.net%2Flists%2Flistinfo%2Fscipion-users&data=04%7C01%7Cgenis.valentin.gese%40ki.se%7C9537b354fd5743ec20f808d913978e43%7Cbff7eef1cf4b4f32be3da1dda043c05d%7C0%7C0%7C637562363919537440%7CUnknown%7CTWFpbGZsb3d8eyJWIjoiMC4wLjAwMDAiLCJQIjoiV2luMzIiLCJBTiI6Ik1haWwiLCJXVCI6Mn0%3D%7C1000&sdata=5w6RBpv%2BFhM8LCZoVlVsIp7lTFVQ29BDZP3k9m%2BuwKM%3D&reserved=0> |
From: Carlos O. S. <co...@cn...> - 2021-05-10 09:39:59
|
Dear Genis, the idea is that cryoassess is used as a static protocol, labels the class representatives into good and bad classes, and then GL2D classifies in streaming all the particles into these classes. Cheers, Carlos Oscar El 09/05/2021 a las 11:15, Genís Valentín Gesé escribió: > Hello all! > > At the moment I'm using "xmipp-gl2d static" to classify a streaming > set of particles, and it would be great if one could get a stream of > particles belonging to the good 2d classes. > > In this way, one could use 2D classification to filter out bad > particles before further processing. > > I looked into cryoasses, but it does not seem to run in streaming mode. > > Any ideas on whether this can be done? > > Thanks and best regards > > *Genis Valentin Gese* | Postdoctoral researcher > > Martin Hällberg group > Cell and Molecular Biology Department | Karolinska Institutet > 171 65 Stockholm | Solnavägen 9 > gen...@ki... <mailto:fir...@ki...> | ki.se > <https://eur01.safelinks.protection.outlook.com/?url=http%3A%2F%2Fki.se%2Fen%2Fstartpage&data=02%7C01%7Cgenis.valentin.gese%40ki.se%7C96f824255829424948f308d82a5e4ec9%7Cbff7eef1cf4b4f32be3da1dda043c05d%7C0%7C0%7C637305931833981292&sdata=yXbUoiFkJqjlBIYtLNaNEXIlydBY8N0LVZ5C9bWR5nU%3D&reserved=0> > > > > > /När du skickar e-post till Karolinska Institutet (KI) innebär detta > att KI kommer att behandla dina personuppgifter. /Här finns > information om hur KI behandlar personuppgifter > <https://ki.se/medarbetare/integritetsskyddspolicy>. > > > /Sending email to Karolinska Institutet (KI) will result in KI > processing your personal data./ You can read more about KI’s > processing of personal data here > <https://ki.se/en/staff/data-protection-policy>. > > > > _______________________________________________ > scipion-users mailing list > sci...@li... > https://lists.sourceforge.net/lists/listinfo/scipion-users |
From: Pablo C. <pc...@cn...> - 2021-05-10 06:55:30
|
This one works in streaming, not sure how accurate it is for your case. On 9/5/21 13:09, Grigory Sharov wrote: > Hello Genis, > > cryoassess with streaming support is not yet released, but it should > work if you install that plugin in devel mode from git. > > I'm not aware of other (released) tools for automated 2D classes selection > > Best regards, > Grigory > > -------------------------------------------------------------------------------- > Grigory Sharov, Ph.D. > > MRC Laboratory of Molecular Biology, > Francis Crick Avenue, > Cambridge Biomedical Campus, > Cambridge CB2 0QH, UK. > tel. +44 (0) 1223 267228 <tel:+44%201223%20267228> > e-mail: gs...@mr... <mailto:gs...@mr...> > > > On Sun, May 9, 2021 at 11:48 AM Genís Valentín Gesé > <gen...@ki... <mailto:gen...@ki...>> wrote: > > Hello all! > > At the moment I'm using "xmipp-gl2d static" to classify a > streaming set of particles, and it would be great if one could get > a stream of particles belonging to the good 2d classes. > > In this way, one could use 2D classification to filter out bad > particles before further processing. > > I looked into cryoasses, but it does not seem to run in streaming > mode. > > Any ideas on whether this can be done? > > Thanks and best regards > > *Genis Valentin Gese* | Postdoctoral researcher > > Martin Hällberg group > Cell and Molecular Biology Department | Karolinska Institutet > 171 65 Stockholm | Solnavägen 9 > gen...@ki... <mailto:fir...@ki...> | > ki.se > <https://eur01.safelinks.protection.outlook.com/?url=http%3A%2F%2Fki.se%2Fen%2Fstartpage&data=02%7C01%7Cgenis.valentin.gese%40ki.se%7C96f824255829424948f308d82a5e4ec9%7Cbff7eef1cf4b4f32be3da1dda043c05d%7C0%7C0%7C637305931833981292&sdata=yXbUoiFkJqjlBIYtLNaNEXIlydBY8N0LVZ5C9bWR5nU%3D&reserved=0> > > > > > /När du skickar e-post till Karolinska Institutet (KI) innebär > detta att KI kommer att behandla dina personuppgifter. /Här finns > information om hur KI behandlar personuppgifter > <https://ki.se/medarbetare/integritetsskyddspolicy>. > > > /Sending email to Karolinska Institutet (KI) will result in KI > processing your personal data./ You can read more about KI’s > processing of personal data here > <https://ki.se/en/staff/data-protection-policy>. > > _______________________________________________ > scipion-users mailing list > sci...@li... > <mailto:sci...@li...> > https://lists.sourceforge.net/lists/listinfo/scipion-users > <https://lists.sourceforge.net/lists/listinfo/scipion-users> > > > > _______________________________________________ > scipion-users mailing list > sci...@li... > https://lists.sourceforge.net/lists/listinfo/scipion-users -- Pablo Conesa - *Madrid Scipion <http://scipion.i2pc.es> team* |
From: Grigory S. <sha...@gm...> - 2021-05-09 11:09:29
|
Hello Genis, cryoassess with streaming support is not yet released, but it should work if you install that plugin in devel mode from git. I'm not aware of other (released) tools for automated 2D classes selection Best regards, Grigory -------------------------------------------------------------------------------- Grigory Sharov, Ph.D. MRC Laboratory of Molecular Biology, Francis Crick Avenue, Cambridge Biomedical Campus, Cambridge CB2 0QH, UK. tel. +44 (0) 1223 267228 <+44%201223%20267228> e-mail: gs...@mr... On Sun, May 9, 2021 at 11:48 AM Genís Valentín Gesé < gen...@ki...> wrote: > Hello all! > > At the moment I'm using "xmipp-gl2d static" to classify a streaming set of > particles, and it would be great if one could get a stream of particles > belonging to the good 2d classes. > > In this way, one could use 2D classification to filter out bad particles > before further processing. > > I looked into cryoasses, but it does not seem to run in streaming mode. > > Any ideas on whether this can be done? > > Thanks and best regards > > *Genis Valentin Gese* | Postdoctoral researcher > > Martin Hällberg group > Cell and Molecular Biology Department | Karolinska Institutet > 171 65 Stockholm | Solnavägen 9 > gen...@ki... <fir...@ki...> | ki.se > <https://eur01.safelinks.protection.outlook.com/?url=http%3A%2F%2Fki.se%2Fen%2Fstartpage&data=02%7C01%7Cgenis.valentin.gese%40ki.se%7C96f824255829424948f308d82a5e4ec9%7Cbff7eef1cf4b4f32be3da1dda043c05d%7C0%7C0%7C637305931833981292&sdata=yXbUoiFkJqjlBIYtLNaNEXIlydBY8N0LVZ5C9bWR5nU%3D&reserved=0> > > > > *När du skickar e-post till Karolinska Institutet (KI) innebär detta att > KI kommer att behandla dina personuppgifter. *Här finns information om > hur KI behandlar personuppgifter > <https://ki.se/medarbetare/integritetsskyddspolicy>. > > > *Sending email to Karolinska Institutet (KI) will result in KI processing > your personal data.* You can read more about KI’s processing of personal > data here <https://ki.se/en/staff/data-protection-policy>. > _______________________________________________ > scipion-users mailing list > sci...@li... > https://lists.sourceforge.net/lists/listinfo/scipion-users > |
From: Genís V. G. <gen...@ki...> - 2021-05-09 10:48:14
|
Hello all! At the moment I'm using "xmipp-gl2d static" to classify a streaming set of particles, and it would be great if one could get a stream of particles belonging to the good 2d classes. In this way, one could use 2D classification to filter out bad particles before further processing. I looked into cryoasses, but it does not seem to run in streaming mode. Any ideas on whether this can be done? Thanks and best regards Genis Valentin Gese | Postdoctoral researcher Martin Hällberg group Cell and Molecular Biology Department | Karolinska Institutet 171 65 Stockholm | Solnavägen 9 gen...@ki...<mailto:fir...@ki...> | ki.se<https://eur01.safelinks.protection.outlook.com/?url=http%3A%2F%2Fki.se%2Fen%2Fstartpage&data=02%7C01%7Cgenis.valentin.gese%40ki.se%7C96f824255829424948f308d82a5e4ec9%7Cbff7eef1cf4b4f32be3da1dda043c05d%7C0%7C0%7C637305931833981292&sdata=yXbUoiFkJqjlBIYtLNaNEXIlydBY8N0LVZ5C9bWR5nU%3D&reserved=0> När du skickar e-post till Karolinska Institutet (KI) innebär detta att KI kommer att behandla dina personuppgifter. Här finns information om hur KI behandlar personuppgifter<https://ki.se/medarbetare/integritetsskyddspolicy>. Sending email to Karolinska Institutet (KI) will result in KI processing your personal data. You can read more about KI's processing of personal data here<https://ki.se/en/staff/data-protection-policy>. |
From: Dmitry S. <Sem...@gm...> - 2021-05-04 08:08:07
|
Dear Yunior, Thank you very much for taking the actions to improve SCIPION. I am very grateful! Sincerely, Dmitry > On 4. May 2021, at 10:05, Yunior C. Fonseca Reyna <su...@bc...> wrote: > > Hi Dmitry, > > You're right, that job can run without inputting a reference volume. The job can generate an initial volume. I have already extended that functionality but for some reason CS fails. Maybe it's my particles or the amount I'm using. I will try other sets to reach some conclusion. I'll write to you when I release the changes. > > Best Regards, > Yunior > > > Activo Vie, 30 Abr a 10:04 AM , Mailing list for Scipion users <sci...@li...> Escrito: > <>Dear Yunior, > > Small question – > Is it possible to add to the existing cryosparc2 – 3D Helical Refinement (BETA) protocol the function to run the unsupervised helix reconstruction with our input volume --- “when the helical refinement job also doesn't require an input volume”? > > As it described here https://guide.cryosparc.com/processing-data/all-job-types-in-cryosparc/helical-reconstruction-beta/case-study-empiar-10031-mavs <https://guide.cryosparc.com/processing-data/all-job-types-in-cryosparc/helical-reconstruction-beta/case-study-empiar-10031-mavs> in the chapter Helical Refinement — Asymmetric Helical Refinement > > Thank you > > Sincerely, > Dmitry > > Biocomputing unit - National Center for Biotechnology powered by Freshdesk <https://freshdesk.com/freshdesk-demo?utm_source=emailfooter&referrer=bcucnb.freshdesk.com> 926:553135 > _______________________________________________ > scipion-users mailing list > sci...@li... > https://lists.sourceforge.net/lists/listinfo/scipion-users |
From: Yunior C. F. R. <su...@bc...> - 2021-05-04 08:05:29
|
Hi Dmitry, You're right, that job can run without inputting a reference volume. The job can generate an initial volume. I have already extended that functionality but for some reason CS fails. Maybe it's my particles or the amount I'm using. I will try other sets to reach some conclusion. I'll write to you when I release the changes. Best Regards, Yunior Activo Vie, 30 Abr a 10:04 AM , Mailing list for Scipion users <sci...@li...> Escrito: Dear Yunior, Small question – Is it possible to add to the existing cryosparc2 – 3D Helical Refinement (BETA) protocol the function to run the unsupervised helix reconstruction with our input volume --- “when the helical refinement job also doesn't require an input volume”? As it described here https://guide.cryosparc.com/processing-data/all-job-types-in-cryosparc/helical-reconstruction-beta/case-study-empiar-10031-mavs in the chapter Helical Refinement — Asymmetric Helical Refinement Thank you Sincerely, Dmitry |
From: Dmitry S. <Sem...@gm...> - 2021-04-30 08:04:09
|
Dear Yunior,

Small question – is it possible to add to the existing cryosparc2 – 3D Helical Refinement (BETA) protocol the option to run the unsupervised helical reconstruction without an input volume, i.e. the case "when the helical refinement job also doesn't require an input volume"?

It is described here https://guide.cryosparc.com/processing-data/all-job-types-in-cryosparc/helical-reconstruction-beta/case-study-empiar-10031-mavs in the chapter Helical Refinement — Asymmetric Helical Refinement.

Thank you

Sincerely,
Dmitry
|
From: <pc...@cn...> - 2021-04-29 06:11:05
|
You'll need the mrcs file containing the 2d averages. If you have the 2d classification star file, it should point to the mrcs.

On 28 Apr 2021 11:31, Dmitry Semchonok <Sem...@gm...> wrote:
> Dear Pablo,
>
> I am not sure how to get the stack containing my particles out from (standing alone)
> - Relion
> - Cryosparc
>
> So I am 1 step behind import.
>
> Or do you mean I can in SCIPION select — import averages and then as an input specify the 2D classification file from relion // or cryosparc?
>
> Thank you!
>
> Sincerely,
> Dmitry
>
>> On 28. Apr 2021, at 10:22, Pablo Conesa <pconesa@cnb.csic.es> wrote:
>>
>> Hi Dmitry, have you tried "Import averages" to import the 2d classes stack?
>>
>> And then from there you should be able to run any initial model method, except CS one that uses particles instead.
>>
>> On 28/4/21 9:59, Dmitry Semchonok wrote:
>>> Hello Jorge,
>>>
>>> Yes, or into SCIPION.
>>>
>>> Example: — we started a project in relion.
>>> We have some nice 2D class averages.
>>> But the initial modelling for some reason does not work.
>>>
>>> So I would like somehow to:
>>> a) extract those particles (mrc // stk),
>>> b) import them to SCIPION.
>>> c) run initial modelling in SCIPION.
>>> d) Export this initial model to relion and proceed.
>>>
>>> Thank you
>>>
>>> Sincerely,
>>> Dmitry
>>>
>>>> On 28. Apr 2021, at 09:39, Jorge Jimenez <support@bcucnb.freshdesk.com> wrote:
>>>>
>>>> Hi Dmitry,
>>>> Do you mean moving from Relion or Cryosparc from outside of Scipion to process their results inside Scipion?
>>>> Best,
>>>> Scipion Team
>>>>
>>>> Ticket: https://bcucnb.freshdesk.com/helpdesk/tickets/921
>>>>
>>>> On Mon, 26 Apr at 9:39 AM, Mailing list for Scipion users <scipion-users@lists.sourceforge.net> wrote:
>>>>> Dear colleagues,
>>>>> Small question –
>>>>> Do you know how to extract particles from relion & cryosparc to run them in SCIPION?
>>>>> Thank you!
>>>>> Sincerely,
>>>>> Dmitry
>>>>
>>>> Biocomputing unit - National Center for Biotechnology
>>>> _______________________________________________
>>>> scipion-users mailing list
>>>> scipion-users@lists.sourceforge.net
>>>> https://lists.sourceforge.net/lists/listinfo/scipion-users
>>
>> --
>> Pablo Conesa - Madrid Scipion <http://scipion.i2pc.es> team
>
> _______________________________________________
> scipion-users mailing list
> scipion-users@lists.sourceforge.net
> https://lists.sourceforge.net/lists/listinfo/scipion-users
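On Pablo's point above: star files reference images as index@stack, so listing the .mrcs files a 2D classification star file mentions shows which stack holds the class averages that the import-averages protocol needs. A minimal sketch; the file name run_it025_model.star is an assumption (relion typically writes the iteration-25 class averages as run_it025_classes.mrcs next to it) and should be replaced by the actual 2D classification output.

import re

star_path = "run_it025_model.star"  # assumed relion 2D classification output
stacks = set()
with open(star_path) as fh:
    for line in fh:
        stacks.update(re.findall(r"\d+@(\S+\.mrcs)", line))

for s in sorted(stacks):
    print(s)  # each listed .mrcs is a stack referenced by the star file

The same scan works on any star file that uses the index@stack convention for its image column.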