From: Grigory S. <sha...@gm...> - 2021-05-21 07:19:21
Hi Christian,

I'm glad to hear that it finally worked out for you!

PS. I think you don't need to export the variable to run Scipion; it was
only needed to compile Xmipp.

On Fri, May 21, 2021, 07:48 Christian Tüting <chr...@bi...> wrote:
> Hi all,
>
> this is just a summary of our issue and the solution. We had some
> problems during installation: the Scipion3 core was installed, but Xmipp
> failed. There were two different issues. If you observe the same, here
> is our solution:
>
> First issue: 'libfftw3' not found in the system, and the automatic
> installation via conda failed during the install.
> Solution: installed libtiff system-wide with: yum install libtiff*
>
> Second issue:
> /usr/bin/ld: warning: libwebp.so.7, needed by
> /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so, not
> found (try using -rpath or -rpath-link)
> The library libwebp.so.7 was not found, even though it is present in the
> <condapath>/envs/scipion3/lib folder.
> Solution: export the library path of the environment to LD_LIBRARY_PATH
> with:
> export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/path/to/conda/envs/scipion3/lib/
> With this, we were able to compile Xmipp.
>
> To add the path only when scipion3 is executed, rather than permanently,
> I added a small function instead of an alias to ~/.bashrc to start
> scipion3:
>
> # scipion3
> scipion3() {
>     export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/home/user/Data/Software/miniconda/envs/scipion3/lib/
>     /home/user/Data/Software/scipion3/scipion3 "$@"
> }
>
> With this function, the env's lib path is added first and then Scipion
> is started with all arguments given to the function.
>
> Thanks @Grigory for all the support.
>
> Best
> Christian
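Since, as noted above, the LD_LIBRARY_PATH export is only needed while compiling Xmipp and not for running Scipion, the variable can also be limited to the compile command itself instead of being exported into the shell. This is only a minimal sketch, assuming the conda prefix and the ./xmipp helper used elsewhere in this thread:

    # Prefixing the assignment sets LD_LIBRARY_PATH only for this one command;
    # the shell's environment is unchanged once the compilation finishes.
    LD_LIBRARY_PATH="/home/user/Data/Software/miniconda/envs/scipion3/lib:$LD_LIBRARY_PATH" \
        ./xmipp compileAndInstall N=4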
From: Christian T. <chr...@bi...> - 2021-05-21 06:48:33
Hi all,

this is just a summary of our issue and the solution. We had some problems
during installation: the Scipion3 core was installed, but Xmipp failed.
There were two different issues. If you observe the same, here is our
solution:

First issue: 'libfftw3' not found in the system, and the automatic
installation via conda failed during the install.
Solution: installed libtiff system-wide with: yum install libtiff*

Second issue:
/usr/bin/ld: warning: libwebp.so.7, needed by
/home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so, not found
(try using -rpath or -rpath-link)
The library libwebp.so.7 was not found, even though it is present in the
<condapath>/envs/scipion3/lib folder.
Solution: export the library path of the environment to LD_LIBRARY_PATH with:
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/path/to/conda/envs/scipion3/lib/
With this, we were able to compile Xmipp.

To add the path only when scipion3 is executed, rather than permanently, I
added a small function instead of an alias to ~/.bashrc to start scipion3:

# scipion3
scipion3() {
    export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/home/user/Data/Software/miniconda/envs/scipion3/lib/
    /home/user/Data/Software/scipion3/scipion3 "$@"
}

With this function, the env's lib path is added first and then Scipion is
started with all arguments given to the function.

Thanks @Grigory for all the support.

Best
Christian

>>> Christian Tüting <chr...@bi...> 20.05.21 22.48 Uhr >>>
Hi Jose Miguel,

it looks like adding the scipion3 env library path to LD_LIBRARY_PATH did
the job. Xmipp was compiled and installed successfully, and the first tests
were also running fine (preprocess micrographs and particle picking). We
are setting up our workflow with all plugins and will test again tomorrow.
If no further errors occur, I will answer on this mailing thread with a
summarizing mail describing the initial issue and our solution, so that
others with the same issue don't need to go through all the mails.

Best
Christian

>>> Jose Miguel de la Rosa Trevin <del...@gm...> 20.05.21 22.40 Uhr >>>
Hi Christian,

I have a repository with an alternative Scipion installation script and
with pre-compiled Xmipp binaries. Let me know if you want to give it a try.

Best,
Jose Miguel

On Thu, May 20, 2021 at 8:59 PM Christian Tüting <chr...@bi...> wrote:
> Ok. We will test this and keep you informed.
>
> >>> Grigory Sharov <sha...@gm...> 20.05.21 20.56 Uhr >>>
> If you were able to successfully build Xmipp, you need to check whether
> you can display images in Scipion, to make sure it works.
>
> Best regards,
> Grigory
>
> Grigory Sharov, Ph.D.
> MRC Laboratory of Molecular Biology,
> Francis Crick Avenue,
> Cambridge Biomedical Campus,
> Cambridge CB2 0QH, UK.
> tel. +44 (0) 1223 267228
> e-mail: gs...@mr...
>
> On Thu, May 20, 2021 at 7:54 PM Christian Tüting <chr...@bi...> wrote:
> > I (temporarily) fixed the error with:
> >
> > export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/home/user/Data/Software/miniconda/envs/scipion3/lib/
> >
> > Xmipp is currently compiling. I am really no expert in server
> > maintenance, so I am not sure what the implications of this are. AFAIK,
> > this is only a fix for the current shell, so after a reboot the fix
> > should be gone. So if Xmipp needs this library, I guess I have to add
> > this export command to ~/.bashrc, right?
> >
> > Best
> > Christian
> >
> > >>> Christian Tüting <chr...@bi...> 20.05.21 20.20 Uhr >>>
> > Hi,
> >
> > I tried yum install libtiff libtiff-devel before, but yum says "nothing
> > to do". However, yum install libtiff* somehow helped to install the
> > missing packages.
> >
> > So the error disappears from ./xmipp config. There are some paths
> > missing in PATH, but alternatives are found, so I guess this is fine
> > (see output below). But ./xmipp check_config still fails.
> >
> > (scipion3) [user@dataanalysisserver1 xmippSrc-v3.20.07]$ ./xmipp config
> > Configuring -----------------------------------------
> > gcc detected
> > g++ -c -w -mtune=native -march=native -std=c++11 -O3
> > xmipp_test_opencv.cpp -o xmipp_test_opencv.o -I../
> > -I/home/user/Data/Software/miniconda/envs/scipion3/include
> > OpenCV not found
> > rm -v xmipp_test_opencv*
> > 'mpirun' and 'mpiexec' not found in the PATH
> > Alternative found at '/usr/lib64/openmpi/bin'.
> > Please, press [return] to use it or type a path where to locate it:
> > -> /usr/lib64/openmpi/bin
> > 'mpicc' not found in the PATH
> > Alternative found at '/usr/lib64/openmpi/bin'.
> > Please, press [return] to use it or type a path where to locate it:
> > -> /usr/lib64/openmpi/bin
> > 'mpicxx' not found in the PATH
> > Alternative found at '/usr/lib64/openmpi/bin'.
> > Please, press [return] to use it or type a path where to locate it:
> > -> /usr/lib64/openmpi/bin
> > Java detected at: /home/user/Data/Software/miniconda/envs/scipion3
> > 'nvcc' not found in the PATH (either in CUDA_BIN/XMIPP_CUDA_BIN)
> > Alternative found at '/usr/local/cuda/bin'.
> > Please, press [return] to use it or type a path where to locate it:
> > -> /usr/local/cuda/bin
> > CUDA-10.1.243 detected.
> > CUDA-8.0 is recommended.
> > Using '/usr/local/cuda-10.1/targets/x86_64-linux/lib'.
> > Configuration completed.....
> >
> > (scipion3) [user@dataanalysisserver1 xmippSrc-v3.20.07]$ ./xmipp check_config
> > Checking configuration ------------------------------
> > Checking compiler configuration ...
> > g++ 4.8.5 detected
> > g++ -c -w -mtune=native -march=native -std=c++11 -O3 xmipp_test_main.cpp
> > -o xmipp_test_main.o -I../
> > -I/home/user/Data/Software/miniconda/envs/scipion3/include
> > -I/home/user/Data/Software/miniconda/envs/scipion3/include/python3.8
> > -I/home/user/Data/Software/miniconda/envs/scipion3/lib/python3.8/site-packages/numpy/core/include
> > g++ -L/home/user/Data/Software/miniconda/envs/scipion3/lib
> > xmipp_test_main.o -o xmipp_test_main -lfftw3 -lfftw3_threads -lhdf5
> > -lhdf5_cpp -ltiff -ljpeg -lsqlite3 -lpthread
> > /usr/bin/ld: warning: libwebp.so.7, needed by
> > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so, not
> > found (try using -rpath or -rpath-link)
> > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so:
> > undefined reference to `WebPPictureImportRGB'
> > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so:
> > undefined reference to `WebPPictureFree'
> > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so:
> > undefined reference to `WebPIAppend'
> > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so:
> > undefined reference to `WebPIDecGetRGB'
> > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so:
> > undefined reference to `WebPINewDecoder'
> > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so:
> > undefined reference to `WebPPictureImportRGBA'
> > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so:
> > undefined reference to `WebPConfigInitInternal'
> > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so:
> > undefined reference to `WebPEncode'
> > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so:
> > undefined reference to `WebPValidateConfig'
> > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so:
> > undefined reference to `WebPPictureInitInternal'
> > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so:
> > undefined reference to `WebPFreeDecBuffer'
> > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so:
> > undefined reference to `WebPIDelete'
> > collect2: error: ld returned 1 exit status
> > Check the LINKERFORPROGRAMS, LINKFLAGS and LIBDIRFLAGS
> > Cannot compile
> > Possible solutions
> > In Ubuntu: sudo apt-get -y install libsqlite3-dev libfftw3-dev
> > libhdf5-dev libopencv-dev python3-dev python3-numpy python3-scipy
> > python3-mpi4py
> > In Manjaro: sudo pacman -Syu install hdf5 python3-numpy python3-scipy --noconfirm
> > Please, see
> > 'https://scipion-em.github.io/docs/docs/scipion-modes/install-from-sources.html#step-2-dependencies'
> > for more information about libraries dependencies.
> > Remember to re-run './xmipp config' after installing libraries in order
> > to take into account the new system configuration.
> > rm xmipp_test_main*
> > Check failed! Something wrong with the configuration.
> >
> > It is still looking for libwebp.so.7, but this is present:
> >
> > (scipion3) [user@dataanalysisserver1 xmippSrc-v3.20.07]$ ls -ltr ../../../../miniconda/envs/scipion3/lib/libwebp.so*
> > -rwxrwxrwx. 2 user user 598608 Feb  1 16:15 ../../../../miniconda/envs/scipion3/lib/libwebp.so.7.1.1
> > lrwxrwxrwx. 1 user user     16 May 20 10:07 ../../../../miniconda/envs/scipion3/lib/libwebp.so.7 -> libwebp.so.7.1.1
> > lrwxrwxrwx. 1 user user     16 May 20 10:07 ../../../../miniconda/envs/scipion3/lib/libwebp.so -> libwebp.so.7.1.1
> >
> > Best
> > Christian
> >
> > >>> Grigory Sharov <sha...@gm...> 20.05.21 19.54 Uhr >>>
> > Hi Christian,
> >
> > I guess the xmipp script still cannot recognize the libtiff that it has
> > installed. I can reproduce your problem on my machine, but I have a
> > system library installed.
> >
> > So the easiest solution is to install libtiff / libtiff-devel using
> > your package manager (yum).
> >
> > Best regards,
> > Grigory
> >
> > On Thu, May 20, 2021 at 6:34 PM Christian Tüting <chr...@bi...> wrote:
> > > Hi,
> > >
> > > it fails with errors similar to those in the automatic installation:
> > >
> > > (scipion3) [user@dataanalysisserver1 scipion3]$ ldd /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so
> > >     linux-vdso.so.1 => (0x00007ffe872c3000)
> > >     libwebp.so.7 => /home/user/Data/Software/miniconda/envs/scipion3/lib/./libwebp.so.7 (0x00007f8f7253a000)
> > >     libzstd.so.1 => /home/user/Data/Software/miniconda/envs/scipion3/lib/./libzstd.so.1 (0x00007f8f7246e000)
> > >     liblzma.so.5 => /home/user/Data/Software/miniconda/envs/scipion3/lib/./liblzma.so.5 (0x00007f8f723fe000)
> > >     libjpeg.so.9 => /home/user/Data/Software/miniconda/envs/scipion3/lib/./libjpeg.so.9 (0x00007f8f721c2000)
> > >     libz.so.1 => /home/user/Data/Software/miniconda/envs/scipion3/lib/./libz.so.1
> > >     libm.so.6 => /lib64/libm.so.6 (0x00007f8f71ea4000)
> > >     libc.so.6 => /lib64/libc.so.6 (0x00007f8f71ad6000)
> > >     libpthread.so.0 => /lib64/libpthread.so.0 (0x00007f8f718ba000)
> > >     librt.so.1 => /lib64/librt.so.1 (0x00007f8f716b2000)
> > >     /lib64/ld-linux-x86-64.so.2 (0x00007f8f72427000)
> > >
> > > (scipion3) [user@dataanalysisserver1 xmippSrc-v3.20.07]$ ./xmipp config
> > > Configuring -----------------------------------------
> > > gcc detected
> > > 'libtiff' not found in the system
> > > 'libtiff' dependency not found. Do you want to install it using conda? [YES/no] Y
> > > Trying to install libtiff with conda
> > > conda activate scipion3 ; conda install libtiff -y -c defaults
> > > CommandNotFoundError: Your shell has not been properly configured to use 'conda activate'.
> > > To initialize your shell, run
> > >     $ conda init <SHELL_NAME>
> > > Currently supported shells are:
> > >     - bash
> > >     - fish
> > >     - tcsh
> > >     - xonsh
> > >     - zsh
> > >     - powershell
> > > See 'conda init --help' for more information and options.
> > > IMPORTANT: You may need to close and restart your shell after running 'conda init'.
> > > Collecting package metadata (current_repodata.json): ...working... done
> > > Solving environment: ...working... done
> > > # All requested packages already installed.
> > > 'libtiff' installed in conda environ 'scipion3'.
> > > [...]
> > > CUDA-10.1.243 detected.
> > > CUDA-8.0 is recommended.
> > > Using '/usr/local/cuda-10.1/targets/x86_64-linux/lib'.
> > > Configuration completed.....
> > >
> > > (scipion3) [user@dataanalysisserver1 xmippSrc-v3.20.07]$ ./xmipp check_config
> > > Checking configuration ------------------------------
> > > Checking compiler configuration ...
> > > g++ 4.8.5 detected
> > > [...]
> > > /usr/bin/ld: warning: libwebp.so.7, needed by
> > > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so, not
> > > found (try using -rpath or -rpath-link)
> > > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so:
> > > undefined reference to `WebPInitDecBufferInternal'
> > > [...]
> > > collect2: error: ld returned 1 exit status
> > > Check the LINKERFORPROGRAMS, LINKFLAGS and LIBDIRFLAGS
> > > Cannot compile
> > > [...]
> > > Check failed! Trying to install libtiff with conda
> > > conda activate scipion3 ; conda install libtiff -y -c defaults
> > > CommandNotFoundError: Your shell has not been properly configured to use 'conda activate'.
> > >
> > > Because I can run it manually without errors:
> > >
> > > (scipion3) [user@dataanalysisserver1 xmippSrc-v3.20.07]$ conda activate scipion3 ; conda install libtiff -y -c defaults
> > > Collecting package metadata (current_repodata.json): done
> > > Solving environment: done
> > > # All requested packages already installed.
> > >
> > > And also, thanks already in advance. I really appreciate your help.
> > >
> > > Best Christian
> > >
> > > >>> Grigory Sharov <sha...@gm...> 20.05.21 18.12 Uhr >>>
> > > Hi,
> > >
> > > Let's try a manual installation:
> > >
> > > conda activate scipion3
> > > ldd /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so   -> check output for errors
> > > export SCIPION_HOME="/home/user/Data/Software/scipion3/"
> > > cd /home/user/Data/Software/scipion3/software/em/xmippSrc-v3.20.07
> > > ./xmipp config         -> check if all is correct in xmipp.conf
> > > ./xmipp check_config   -> check for errors
> > > ./xmipp compileAndInstall N=4 && ln -srfn build /home/user/Data/Software/scipion3/software/em/xmipp && cd - && touch installation_finished && rm bindings_linked 2> /dev/null
> > >
> > > You can post output errors from these commands here.
> > >
> > > Best regards,
> > > Grigory
> > >
> > > Christian Tüting <chr...@bi...> wrote:
> > > > yes, we have the library in the scipion3 env:
> > > >
> > > > [user@dataanalysisserver1 lib]$ ls -d $PWD/libwebp.so.7
> > > > /home/user/Data/Software/miniconda/envs/scipion3/lib/libwebp.so.7
> > > >
> > > > conda activate scipion3 ; conda install fftw -y -c defaults
> > > > CommandNotFoundError: Your shell has not been properly configured to use 'conda activate'.
> > > >
> > > > When I am copying this command, it runs without any issue.
> > > >
> > > > Thanks for your help.
> > > >
> > > > best
> > > > Christian
> > > >
> > > > >>> Grigory Sharov <sha...@gm...> 20.05.21 16.39 Uhr >>>
> > > > Hi,
> > > >
> > > > do you have libwebp.so.7 in /home/user/Data/Software/miniconda/envs/scipion3/lib/ ?
> > > >
> > > > Best regards,
> > > > Grigory
> > > >
> > > > On Thu, May 20, 2021 at 3:00 PM Dmitry Semchonok <Sem...@gm...> wrote:
> > > > > Dear colleagues,
> > > > >
> > > > > We reinstalled CentOS 7 on our server as well as miniconda.
> > > > >
> > > > > The core was installed correctly, but during the Xmipp installation
> > > > > some error appeared. Please have a look at the attached file.
> > > > >
> > > > > Could you please advise how to proceed?
> > > > >
> > > > > Thank you in advance.
> > > > >
> > > > > Sincerely,
> > > > > Dmitry and Christian
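One detail of the ~/.bashrc wrapper in the summary above: shell functions run in the current shell, so the export inside scipion3() also extends LD_LIBRARY_PATH for the calling session, not only for the Scipion process. If that side effect is unwanted, a variant (a sketch using the same paths as in the mail) passes the variable to the child process only, via env:

    # scipion3 wrapper that leaves the calling shell's environment untouched:
    # env sets LD_LIBRARY_PATH only for the scipion3 process it starts.
    scipion3() {
        env LD_LIBRARY_PATH="${LD_LIBRARY_PATH}:/home/user/Data/Software/miniconda/envs/scipion3/lib/" \
            /home/user/Data/Software/scipion3/scipion3 "$@"
    }

Either form should behave the same from Scipion's point of view; the only difference is whether the extended path lingers in the interactive shell afterwards.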
From: Jose M. de la R. T. <del...@gm...> - 2021-05-20 20:50:45
Great!

On Thu, May 20, 2021 at 10:47 PM Christian Tüting <chr...@bi...> wrote:
> Hi Jose Miguel,
>
> it looks like adding the scipion3 env library path to LD_LIBRARY_PATH
> did the job. Xmipp was compiled and installed successfully, and the
> first tests were also running fine (preprocess micrographs and particle
> picking). We are setting up our workflow with all plugins and will test
> again tomorrow. If no further errors occur, I will answer on this
> mailing thread with a summarizing mail describing the initial issue and
> our solution, so that others with the same issue don't need to go
> through all the mails.
>
> Best
> Christian
information about libraries dependencies. > > > > Remember to re-run './xmipp config' after installing libraries in > > > order > > > > to take into account the new system configuration. > > > > rm xmipp_test_main* > > > > Check failed! Trying to install libtiff with conda > > > > conda activate scipion3 ; conda install libtiff -y -c defaults > > > > CommandNotFoundError: Your shell has not been properly configured > to > > > use > > > > 'conda activate'. > > > > > > > > Because I can run it manually without errors: > > > > > > > > (scipion3) [user@dataanalysisserver1 xmippSrc-v3.20.07]$ conda > > > activate > > > > scipion3 ; conda install libtiff -y -c defaults > > > > Collecting package metadata (current_repodata.json): done > > > > Solving environment: done > > > > # All requested packages alre> > And also, thanks already in > advance. > > I really appreciate your help. > > > > > > > > Best Christian > > > > > > > > > > > > > > > > >>> Grigory Sharov <sha...@gm...> 20.05.21 18.12 Uhr > >>> > > > > Hi, > > > > > > > > Let's try manual installation: > > > > > > > > conda activate scipion3 > > > > > ldd > > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so > > > -> > > > > > check output for errors > > > > > export SCIPION_HOME="/home/user/Data/Software/scipion3/" > > > > > cd > /home/user/Data/Software/scipion3/software/em/xmippSrc-v3.20.07 > > > > > ./xmipp config -> check if all is correct in xmipp.conf > > > > > ./xmipp check_config -> check for errors > > > > > ./xmipp compileAndInstall N=4 && ln -srfn build > > > > > /home/user/Data/Software/scipion3/software/em/xmipp && cd - && > > touch > > > > > installation_finished && rm bindings_linked 2> /dev/null > > > > > > > > > > > > You can post output errors from these commands here. > > > > > > > > Best regards, > > > > Grigory > > > > > > > > > > > > > > > > > > > > > > > > -------------------------------------------------------------------------------- > > > > Grigory Sharov, Ph.D. > > > > > > > > MRC Laboratory of Molecular Biology, > > > > Francis Crick Avenue, > > > > Cambridge Biomedical Campus, > > > > Cambridge CB2 0QH, UK. > > > > tel. +44 (0) 1223 267228 <+44%201223%20267228> > > > > e-mail: gsharov@mrc-lmb.> > > chr...@bi...> > > > wrote: > > > > > > > > > Hi all, > > > > > > > > > > Dmitrys collegue here. > > > > > > > > > > yes we have the library in the scipion3 env: > > > > > > > > > > > > > > > [user@dataanalysisserver1 lib]$ ls -d $PWD/libwebp.so.7 > > > > > > > > > > > > > > > /home/user/Data/Software> > [92m conda activate scipion3 ; conda > > > install fftw -y -c defaults[0m > > > > > CommandNotFoundError: Your shell has not been properly > configured > > to > > > > use > > > > > 'conda activate'. > > > > > > > > > > When I am coping this command, it runs without any issue. > > > > > > > > > > > > > > > Thanks for your help. > > > > > > > > > > best > > > > > Christian > > > > > > > > > > > > > > > > > > > > >>> Grigory Sharov <sha...@gm...> 20.05.21 16.39 Uhr > > >>> > > > > > Hi, > > > > > > > > > > do you have libwebp.so.7 in > > > > > /home/user/Data/Software/miniconda/envs/scipion3/lib/ ? > > > > > > > > > > Best regards, > > > > > Grigory > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > -------------------------------------------------------------------------------- > > > > > Grigory Sharov, Ph.D. > > > > > > > > > > MRC Laboratory of Molecular Biology, > > > > > Francis Cri> > > > e-mail: gs...@mr... 
> > > > > > > > > > > > > > > On Thu, May 20, 2021 at 3:00 PM Dmitry Semchonok > > > <Sem...@gm...> > > > > > wrote: > > > > > > > > > > > Dear colleagues, > > > > > > > > > > > > We reinstalled the centos 7 on our server as well as > miniconda. > > > > > > > > > > > > > > > > > > > > > > > > The core was installed correctly but during xmipp installation > > > some > > > > error > > > > > > appeared. Please have a look on the file attached. > > > > > > > > > > > > > > > > > > > > > > > > Could you please advice how to proceed? > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > Thank you in advance. > > > > > > > > > > > > > > > > > > Sincerely, > > > > > > Dmitry and > > > Christian_______________________________________________ > > > > > > scipion-users mailing list > > > > > > sci...@li... > > > > > > https://lists.sourceforge.net/lists/listinfo/scipion-users > > > > > > > > > > > > > > > > > > > > > > > > > > _______________________________________________ > > > > > scipion-users mailing list > > > > > sci...@li... > > > > > https://lists.sourceforge.net/lists/listinfo/scipion-users > > > > > > > > > > > > > > > > > > > > > _______________________________________________ > > > > scipion-users mailing list > > > > sci...@li... > > > > https://lists.sourceforge.net/lists/listinfo/scipion-users > > > > > > > > > > > > > > > > _______________________________________________ > > > scipion-users mailing list > > > sci...@li... > > > https://lists.sourceforge.net/lists/listinfo/scipion-users > > > > > > > > > _______________________________________________ > > > scipion-users mailing list > > > sci...@li... > > > https://lists.sourceforge.net/lists/listinfo/sci > > > > > > _______________________________________________ > > scipion-users mailing list > > sci...@li... > > https://lists.sourceforge.net/lists/listinfo/scipion-users > > > > > > _______________________________________________ > scipion-users mailing list > sci...@li... > https://lists.sourceforge.net/lists/listinfo/scipion-users > |
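For readers following the thread: the manual build sequence that Grigory lists above can be run end-to-end roughly as below. This is only a sketch assembled from the commands quoted in this thread and assumes the same Miniconda and Scipion install locations used here; adjust the paths for your own system.

  conda activate scipion3
  # verify that the conda-provided libtiff resolves all of its shared-library dependencies
  ldd /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so
  export SCIPION_HOME="/home/user/Data/Software/scipion3/"
  cd /home/user/Data/Software/scipion3/software/em/xmippSrc-v3.20.07
  ./xmipp config         # then review the generated xmipp.conf
  ./xmipp check_config   # stop here if the compiler/linker test fails
  ./xmipp compileAndInstall N=4 && ln -srfn build /home/user/Data/Software/scipion3/software/em/xmipp && cd - && touch installation_finished && rm bindings_linked 2> /dev/null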
From: Christian T. <chr...@bi...> - 2021-05-20 20:47:49
|
Hi Jose Miguel, it looks like adding the scipion3 env library path to LD_LIBRARY_PATH did the job. xmipp was compiled and installed successfully and the first tests were also running fine (preprocess micrographs and particle picking). We are setting up our workflow with all plugins and will test again tomorrow. But if no further errors occur, I will answer on this mailing thread with a summarizing mail, describing the initial issue and our solution, so that others with the same issue don't need to go through all the mails. Best Christian >>> Jose Miguel de la Rosa Trevin <del...@gm...> 20.05.21 22.40 Uhr >>> Hi Christian, I have a repository with an alternative Scipion installation script and with pre-compiled Xmipp binaries. Let me know if you want to give it a try. Best, Jose Miguel On Thu, May 20, 2021 at 8:59 PM Christian Tüting < chr...@bi...> wrote: > Ok. We will test this and keep you informed. > > >>> Grigory Sharov <sha...@gm...> 20.05.21 20.56 Uhr >>> > If you were able to successfully build xmipp, you need to check if you > can > display images in scipion to make sure it works. > > Best regards, > Grigory > > > -------------------------------------------------------------------------------- > Grigory Sharov, Ph.D. > > MRC Laboratory of Molecular Biology, > Francis Crick Avenue, > Cambridge Biomedical Campus, > Cambridge CB2 0QH, UK. > tel. +44 (0) 1223 267228 <+44%201223%20267228> > e-mail: gs...@mr... > > > On Thu, May 20, 2021 at 7:54 PM Christian Tüting < > chr...@bi...> wrote: > > > I (temporarily) fixed the error with: > > > > export > > > > > > LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/home/user/Data/Software/miniconda/envs/scipion3/lib/ > > > > > > xmipp is currently compiling. I am really no expert in server > > maintainance, so I am not sure what are the implications of this. > Afaik, > > this is only a fix for the current shell, so after reboot, this fix > > should be gone. So if xmipp needs this library, I guess I have to add > > this export command to ~/.bashrc, right? > > > > Best > > > > Christian > > > > > > >>> Christian Tüting <chr...@bi...> > > 20.05.21 20.20 Uhr >>> > > Hi, > > > > I tried yum install libtiff libtiff-devel before, but yum says, > "nothing > > to do". But, yum install libtiff* helped to install the missing > packages > > somehow. > > > > So the error disappers from ./xmipp_config. There are some path > missing > > in PATH, but alternatives are found so I guess this is fine (see > output > > below). > > > > But ./xmipp check_config still fails. > > > > (scipion3) [user@dataanalysisserver1 xmippSrc-v3.20.07]$ ./xmipp > config > > Configuring ----------------------------------------- > > gcc detected > > g++ -c -w -mtune=native -march=native -std=c++11 -O3 > > xmipp_test_opencv.cpp -o xmipp_test_opencv.o -I../ > > -I/home/user/Data/Software/miniconda/envs/scipion3/include > > OpenCV not found > > rm -v xmipp_test_opencv* > > 'mpirun' and 'mpiexec' not found in the PATH > > Alternative found at '/usr/lib64/openmpi/bin'. > > Please, press [return] to use it or type a path where to locate it: > > -> /usr/lib64/openmpi/bin > > 'mpicc' not found in the PATH > > Alternative found at '/usr/lib64/openmpi/bin'. > > Please, press [return] to use it or type a path where to locate it: > > -> /usr/lib64/openmpi/bin > > 'mpicxx' not found in the PATH > > Alternative found at '/usr/lib64/openmpi/bin'. 
> > Please, press [return] to use it or type a path where to locate it: > > -> /usr/lib64/openmpi/bin > > Java detected at: /home/user/Data/Software/miniconda/envs/scipion3 > > 'nvcc' not found in the PATH (either in CUDA_BIN/XMIPP_CUDA_BIN) > > Alternative found at '/usr/local/cuda/bin'. > > Please, press [return] to use it or type a path where to locate it: > > -> /usr/local/cuda/bin > > CUDA-10.1.243 detected. > > CUDA-8.0 is recommended. > > Using '> > (scipion3) [user@dataanalysisserver1 xmippSrc-v3.20.07]$ ./xmipp > > check_config > > Checking configuration ------------------------------ > > Checking compiler configuration ... > > g++ 4.8.5 detected > > g++ -c -w -mtune=native -march=native -std=c++11 -O3 > xmipp_test_main.cpp > > -o xmipp_test_main.o -I../ > > -I/home/user/Data/Software/miniconda/envs/scipion3/include > > -I/home/user/Data/Software/miniconda/envs/scipion3/include/python3.8 > > > > > > -I/home/user/Data/Software/miniconda/envs/scipion3/lib/python3.8/site-packages/numpy/core/include > > g++ -L/home/user/Data/Software/miniconda/envs/scipion3/lib > > xmipp_test_main.o -o xmipp_test_main -lfftw3 -lfftw3_threads -lhdf5 > > -lhdf5_cpp -ltiff -ljpeg -lsqlite3 -lpthread > > /usr/bin/ld: warning: libwebp.so.7, needed by > > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so, not > > found (try using -rpath or -rpath-link) > > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > > undefined reference to `WebPPictureImportRGB' > > /home/user/Data/Software/> > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > > undefined reference to `WebPPictureFree' > > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > > undefined reference to `WebPIAppend' > > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > > undefined reference to `WebPIDecGetRGB' > > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > > undefined reference to `WebPINewDecoder' > > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > > undefined reference to `WebPPictureImportRGBA' > > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > > undefined reference to `WebPConfigInitInternal' > > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > > undefined reference to `WebPEncoundefined reference to > `WebPValidateConfig' > > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > > undefined reference to `WebPPictureInitInternal' > > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > > undefined reference to `WebPFreeDecBuffer' > > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > > undefined reference to `WebPIDelete' > > collect2: error: ld returned 1 exit status > > Check the LINKERFORPROGRAMS, LINKFLAGS and LIBDIRFLAGS > > Cannot compile > > Possible solutiolibhdf5-dev libopencv-dev python3-dev python3-numpy > > python3-scipy > > python3-mpi4py > > In Manjaro: sudo pacman -Syu install hdf5 python3-numpy python3-scipy > > --noconfirm > > Please, see > > ' > > > > https://scipion-em.github.io/docs/docs/scipion-modes/install-from-sources.html#step-2-dependencies > > ' > > for more information about libraries dependencies. > > Remember to re-run './xmipp config' after installing libraries in > order > > to take into account the new system configuration. > > rm xmipp_test_main* > > Check failed! Something wrong with the configuration. 
> > > > > > It's still looking for libwebp.so.7, but this is present: > > > > (scipion3) [user@dataanalysisserver1 xmippSrc-v3.20.07]$ ls -ltr > > ../../../../miniconda/envs/scipion3/lib/libwebp.so* > > -rwxrwxrwx. 2 user user 598608 Feb 1 16:15 > > ../../../../miniconda/envs/scipion3/lib/libwebp.so.7.1.1 > > lrwxrwxrwx. 1 user user > > 16 May 20 10:07 ../../../../miniconda/envs/scipion3/lib/libwebp.so.7 > -> > > libwebp.so.7.1.1 > > lrwxrwxrwx. 1 user user > > 16 May 20 10:07 ../../../../miniconda/envs/scipion3/lib/libwebp.so -> > > libwebp.so.7.1.1 > > > > Best > > > > Christian > > > > > > >>> Grigory Sharov <sha...@gm...> 20.05.21 19.54 Uhr >>> > > Hi Christian, > > > > I guess xmipp script still cannot recognize libtiff that it has > > installed. > > I can reproduce your problem on my machine. But I have a system > library > > installed. > > > > So, the easiest solution is to install libtiff / libtiff-devel using > > your > > package manager (yum) > > > > Best regards, > >> > Grigory Sharov, Ph.D. > > > > MRC Laboratory of Molecular Biology, > > Francis Crick Avenue, > > Cambridge Biomedical Campus, > > Cambridge CB2 0QH, UK. > > tel. +44 (0) 1223 267228 <+44%201223%20267228> > > e-mail: gs...@mr... > > > > > > On Thu, May 20, 2021 at 6:34 PM Christian Tüting < > > chr...@bi...> wrote: > > > > > Hi, > > > > > > it fails with similar errors like in the automatic installation: > > > > > > > > > (scipion3) [user@dataanalysisserver1 scipion3]$ ldd > > > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so > > > linux-vdso.so.1 => (0x00007ffe872c3000) > > > libwebp.so.7 => > > > /home/user/Data/Software/miniconda/envs/scipion3/lib/./libwebp.so.7 > > > (0x00007f8f7253a000) > > > libzstd.so.1 => > > > /home/user/Data/Software/miniconda/envs/scipion3/lib/./libzstd.so.1 > > > (0x00007f8f7246e000) > > > liblzma.so.5 => > > > /home/user/Data/Software/miniconda/envs/scipion3/lib/./liblzma.so.5 > > > (0x00007f8f723fe000) > > > libjpeg.so.9 => > > > /home/user/Data/Software/miniconda/envs/scipion3/lib/./libjpeg.so.9 > > > (0x00007f8f721c2000) > > > libz.so.1 => > > > /home/user/Data/Softw> > libm.so.6 => /lib64/libm.so.6 > (0x00007f8f71ea4000) > > > libc.so.6 => /lib64/libc.so.6 (0x00007f8f71ad6000) > > > libpthread.so.0 => /lib64/libpthread.so.0 (0x00007f8f718ba000) > > > librt.so.1 => /lib64/librt.so.1 (0x00007f8f716b2000) > > > /lib64/ld-linux-x86-64.so.2 (0x00007f8f72427000) > > > > > > (scipion3) [user@dataanalysisserver1 xmippSrc-v3.20.07]$ ./xmipp > > config > > > Configuring ----------------------------------------- > > > gcc detected > > > 'libtiff' not found in the system > > > 'libtiff' dependency not found. Do you want to install it using > conda? > > > [YES/no] Y > > > Trying to install libtiff with conda > > > conda activate scipion3 ; conda install libtiff -y -c defaults > > > CommandNotFoundError: Your shell has not been properly configured to > > use > > > 'conda activate'. > > > To initialize your shell, run > > > $ conda init <SHELL_NAME> > > > Currently supported shells are: > > > - bash > > > - fish > > > - tcsh > > > - x> IMPORTANT: You may need to close and restart your shell after > > running > > > 'conda init'. > > > Collecting package metadata (current_repodata.json): ...working... > > done > > > Solving environment: ...working... done > > > # All requested packages already installed. > > > 'libtiff' installed in conda environ 'scipion3'. 
> > > g++ -c -w -mtune=native -march=native -std=c++11 -O3 > > > xmipp_test_opencv.cpp -o xmipp_test_opencv.o -I..> 'mpirun' and > > 'mpiexec' not found in the PATH > > > Alternative found at '/usr/lib64/openmpi/bin'. > > > Please, press [return] to use it or type a path where to locate it: > > > -> /usr/lib64/openmpi/bin > > > 'mpicc' not found in the PATH > > > Alternative found at '/usr/lib64/openmpi/bin'. > > > Please, press [return] to use it or type a path where to locate it: > > > -> /usr/lib64/openmpi/bin > > > 'mpicxx' not found in the PATH > > > Alternative found at '/usr/lib64/openmpi/bin'. > > > Please, press [return] to use it or type a path where to locate it: > > > -> /usr/lib64/openmpi/bin > > > Java detected at: /home/user/Data/Software/miniconda/envs/scipion3 > > > 'nvcc' not found in the PATH (either in CUDA_BIN/XMIPP_CUDA_BIN) > > > Alternative found at '/usr/local/cuda/bin'. > > > Please, press [return] to use it or type a path where to locate it: > > > -> /usr/local/cuda/bin > > > CUDA-10.1.243 detected. > > > CUDA-8.0 is recommended. > > > Using '/usr/local/cuda-10.1/targets/x86_64-linux/lib'. > > > Configuration completed..... > > > (scipion3) [user@dataanalysisserver1 xmippSrc-v3.20.07]$ ./xmipp > > config > > > Configuring ----------------------------------------- > > > gcc detected > > > 'libtiff' not found in the system > > > 'libtiff' dependency not found. Do you want to install it using > conda? > > > [YES/no] > > > Trying to install libtiff with conda > > > conda activate scipi> > > CommandNotFoundError: Your shell has not been properly configured to > > use > > > 'conda activate'. > > > To initialize your shell, run > > > $ conda init <SHELL_NAME> > > > Currently supported shells are: > > > - bash > > > - fish > > > - tcsh > > > - xonsh > > > - zsh > > > - powershell > > > See 'conda init --help' for more information and options. > > > IMPORTANT: You may need to close and restart your shell after > running > > > 'conda init'. > > > Collecting package metadata (current_repodata.json): ...working... > > done > > > Solving environment: ...working... done > > > # All requested packages already installed. > > > 'libtiff' installed in conda exmipp_test_opencv.cpp -o > > xmipp_test_opencv.o > > > -I../ > > > -I/home/user/Data/Software/miniconda/envs/scipion3/include > > > OpenCV not found > > > rm -v xmipp_test_opencv* > > > 'mpirun' and 'mpiexec' not found in the PATH > > > Alternative found at '/usr/lib64/openmpi/bin'. > > > Please, press [return] to use it or type a path where to locate it: > > > -> /usr/lib64/openmpi/bin > > > 'mpicc' not found in the PATH > > > Alternative found at '/usr/lib64/openmpi/bin'. > > > Please, press [return] to use it or type a path where to locate it: > > > -> /usr/lib64/openmpi/bin > > > 'mpicxx' not found in the PATH > > > Alternative found at '/usr/lib64/openmpi/bin'. > > > Please, press [return> > Java detected at: > /home/user/Data/Software/miniconda/envs/scipion3 > > > 'nvcc' not found in the PATH (either in CUDA_BIN/XMIPP_CUDA_BIN) > > > Alternative found at '/usr/local/cuda/bin'. > > > Please, press [return] to use it or type a path where to locate it: > > > -> /usr/local/cuda/bin > > > CUDA-10.1.243 detected. > > > CUDA-8.0 is recommended. > > > Using '/usr/local/cuda-10.1/targets/x86_64-linux/lib'. > > > Configuration completed..... 
> > > > > > (scipion3) [user@dataanalysisserver1 xmippSrc-v3.20.07]$ ./xmipp > > > check_config > > > Checking configuration ------------------------------ > > > Checking compiler configuration ... > > > g++ 4.8.5 detected > > > g++ -c -w -mtune=native -march=native -std=c++11 -O3 > > xmipp_test_main.cpp > > > -o xmipp_test_main.o -I../ > > > -I/home/user/Data/Software/miniconda/envs/scipion3/include > > > -I/home/user/Data/Software/miniconda/envs/scipion3/include/python3.8 > > > > > > > > > > -I/home/user/Data/Software/miniconda/envs/scipion3/lib/python3.8/site-packages/numpy/c> > > xmipp_test_main.o -o xmipp_test_main -lfftw3 -lfftw3_threads -lhdf5 > > > -lhdf5_cpp -ltiff -ljpeg -lsqlite3 -lpthread > > > /usr/bin/ld: warning: libwebp.so.7, needed by > > > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so, not > > > found (try using -rpath or -rpath-link) > > > /home/user/Data/Software/miniconda/envs/scipion3/lib/lib> > > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > > > undefined reference to `WebPInitDecBufferInternal' > > > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > > > undefined reference to `WebPPictureFree' > > > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > > > undefined reference to `WebPIAppend' > > > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > > > undefined reference to `WebPIDecGetRGB' > > > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > > > undefined reference to `WebPINewDecoder' > > > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > > > undefined reference to `WebPPictureImportRGBA' > > > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > > > undefined reference to `WebPConfigInitInternal' > > > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > > > undefined reference to `WebPEncode' > > > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > > > undefined reference to `WebPValidateConfig' > > > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > > > undefined reference to `WebPPictureInitInternal' > > > /home/user/Dat> > > undefined reference to `WebPFreeDecBuffer' > > > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > > > undefined reference to `WebPIDelete' > > > collect2: error: ld returned 1 exit status > > > Check the LINKERFORPROGRAMS, LINKFLAGS and LIBDIRFLAGS > > > Cannot compile > > > Possible solutions > > > In Ubuntu: sudo apt-get -y install libsqlite3-dev libfftw3-dev > > > libhdf5-dev libopencv-dev python3-dev python3-numpy python3-scipy > > > python3-mpi4py > > > In Manjaro: sudo pacman -Syu install hdf5 python3-numpy > python3-scipy > > > --noconfirm > > > Please, see > > > ' > > > > > > > > > https://scipion-em.github.io/docs/docs/scipion-modes/install-from-sources.html#step-2-dependencies > > > ' > > > for more information about libraries dependencies. > > > Remember to re-run './xmipp config' after installing libraries in > > order > > > to take into account the new system configuration. > > > rm xmipp_test_main* > > > Check failed! Trying to install libtiff with conda > > > conda activate scipion3 ; conda install libtiff -y -c defaults > > > CommandNotFoundError: Your shell has not been properly configured to > > use > > > 'conda activate'. 
> > > > > > Because I can run it manually without errors: > > > > > > (scipion3) [user@dataanalysisserver1 xmippSrc-v3.20.07]$ conda > > activate > > > scipion3 ; conda install libtiff -y -c defaults > > > Collecting package metadata (current_repodata.json): done > > > Solving environment: done > > > # All requested packages alre> > And also, thanks already in advance. > I really appreciate your help. > > > > > > Best Christian > > > > > > > > > > > > >>> Grigory Sharov <sha...@gm...> 20.05.21 18.12 Uhr >>> > > > Hi, > > > > > > Let's try manual installation: > > > > > > conda activate scipion3 > > > > ldd > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so > > -> > > > > check output for errors > > > > export SCIPION_HOME="/home/user/Data/Software/scipion3/" > > > > cd /home/user/Data/Software/scipion3/software/em/xmippSrc-v3.20.07 > > > > ./xmipp config -> check if all is correct in xmipp.conf > > > > ./xmipp check_config -> check for errors > > > > ./xmipp compileAndInstall N=4 && ln -srfn build > > > > /home/user/Data/Software/scipion3/software/em/xmipp && cd - && > touch > > > > installation_finished && rm bindings_linked 2> /dev/null > > > > > > > > > You can post output errors from these commands here. > > > > > > Best regards, > > > Grigory > > > > > > > > > > > > > > > -------------------------------------------------------------------------------- > > > Grigory Sharov, Ph.D. > > > > > > MRC Laboratory of Molecular Biology, > > > Francis Crick Avenue, > > > Cambridge Biomedical Campus, > > > Cambridge CB2 0QH, UK. > > > tel. +44 (0) 1223 267228 <+44%201223%20267228> > > > e-mail: gsharov@mrc-lmb.> > chr...@bi...> > > wrote: > > > > > > > Hi all, > > > > > > > > Dmitrys collegue here. > > > > > > > > yes we have the library in the scipion3 env: > > > > > > > > > > > > [user@dataanalysisserver1 lib]$ ls -d $PWD/libwebp.so.7 > > > > > > > > > > > > /home/user/Data/Software> > [92m conda activate scipion3 ; conda > > install fftw -y -c defaults[0m > > > > CommandNotFoundError: Your shell has not been properly configured > to > > > use > > > > 'conda activate'. > > > > > > > > When I am coping this command, it runs without any issue. > > > > > > > > > > > > Thanks for your help. > > > > > > > > best > > > > Christian > > > > > > > > > > > > > > > > >>> Grigory Sharov <sha...@gm...> 20.05.21 16.39 Uhr > >>> > > > > Hi, > > > > > > > > do you have libwebp.so.7 in > > > > /home/user/Data/Software/miniconda/envs/scipion3/lib/ ? > > > > > > > > Best regards, > > > > Grigory > > > > > > > > > > > > > > > > > > > > > > > > -------------------------------------------------------------------------------- > > > > Grigory Sharov, Ph.D. > > > > > > > > MRC Laboratory of Molecular Biology, > > > > Francis Cri> > > > e-mail: gs...@mr... > > > > > > > > > > > > On Thu, May 20, 2021 at 3:00 PM Dmitry Semchonok > > <Sem...@gm...> > > > > wrote: > > > > > > > > > Dear colleagues, > > > > > > > > > > We reinstalled the centos 7 on our server as well as miniconda. > > > > > > > > > > > > > > > > > > > > The core was installed correctly but during xmipp installation > > some > > > error > > > > > appeared. Please have a look on the file attached. > > > > > > > > > > > > > > > > > > > > Could you please advice how to proceed? > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > Thank you in advance. 
> > > > > > > > > > > > > > > Sincerely, > > > > > Dmitry and > > Christian_______________________________________________ > > > > > scipion-users mailing list > > > > > sci...@li... > > > > > https://lists.sourceforge.net/lists/listinfo/scipion-users > > > > > > > > > > > > > > > > > > > > > _______________________________________________ > > > > scipion-users mailing list > > > > sci...@li... > > > > https://lists.sourceforge.net/lists/listinfo/scipion-users > > > > > > > > > > > > > > > > _______________________________________________ > > > scipion-users mailing list > > > sci...@li... > > > https://lists.sourceforge.net/lists/listinfo/scipion-users > > > > > > > > > > > _______________________________________________ > > scipion-users mailing list > > sci...@li... > > https://lists.sourceforge.net/lists/listinfo/scipion-users > > > > > > _______________________________________________ > > scipion-users mailing list > > sci...@li... > > https://lists.sourceforge.net/lists/listinfo/sci > > > _______________________________________________ > scipion-users mailing list > sci...@li... > https://lists.sourceforge.net/lists/listinfo/scipion-users > |
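The workaround described in this message amounts to pointing the linker at the conda environment's own library directory. A minimal sketch of the session-level fix, assuming the same environment path as in this thread:

  # confirm the dependency the linker complains about is actually present
  ls -l /home/user/Data/Software/miniconda/envs/scipion3/lib/libwebp.so.7
  # make the environment's lib directory visible when resolving libtiff's dependencies (current shell only)
  export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/home/user/Data/Software/miniconda/envs/scipion3/lib/
  # re-run the check before compiling
  ./xmipp check_config

Note that the export only lasts for the current shell; a new terminal or a reboot starts without it.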
From: Jose M. de la R. T. <del...@gm...> - 2021-05-20 20:39:12
|
Hi Christian, I have a repository with an alternative Scipion installation script and with pre-compiled Xmipp binaries. Let me know if you want to give it a try. Best, Jose Miguel On Thu, May 20, 2021 at 8:59 PM Christian Tüting < chr...@bi...> wrote: > Ok. We will test this and keep you informed. > > >>> Grigory Sharov <sha...@gm...> 20.05.21 20.56 Uhr >>> > If you were able to successfully build xmipp, you need to check if you > can > display images in scipion to make sure it works. > > Best regards, > Grigory > > > -------------------------------------------------------------------------------- > Grigory Sharov, Ph.D. > > MRC Laboratory of Molecular Biology, > Francis Crick Avenue, > Cambridge Biomedical Campus, > Cambridge CB2 0QH, UK. > tel. +44 (0) 1223 267228 <+44%201223%20267228> > e-mail: gs...@mr... > > > On Thu, May 20, 2021 at 7:54 PM Christian Tüting < > chr...@bi...> wrote: > > > I (temporarily) fixed the error with: > > > > export > > > > > > LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/home/user/Data/Software/miniconda/envs/scipion3/lib/ > > > > > > xmipp is currently compiling. I am really no expert in server > > maintainance, so I am not sure what are the implications of this. > Afaik, > > this is only a fix for the current shell, so after reboot, this fix > > should be gone. So if xmipp needs this library, I guess I have to add > > this export command to ~/.bashrc, right? > > > > Best > > > > Christian > > > > > > >>> Christian Tüting <chr...@bi...> > > 20.05.21 20.20 Uhr >>> > > Hi, > > > > I tried yum install libtiff libtiff-devel before, but yum says, > "nothing > > to do". But, yum install libtiff* helped to install the missing > packages > > somehow. > > > > So the error disappers from ./xmipp_config. There are some path > missing > > in PATH, but alternatives are found so I guess this is fine (see > output > > below). > > > > But ./xmipp check_config still fails. > > > > (scipion3) [user@dataanalysisserver1 xmippSrc-v3.20.07]$ ./xmipp > config > > Configuring ----------------------------------------- > > gcc detected > > g++ -c -w -mtune=native -march=native -std=c++11 -O3 > > xmipp_test_opencv.cpp -o xmipp_test_opencv.o -I../ > > -I/home/user/Data/Software/miniconda/envs/scipion3/include > > OpenCV not found > > rm -v xmipp_test_opencv* > > 'mpirun' and 'mpiexec' not found in the PATH > > Alternative found at '/usr/lib64/openmpi/bin'. > > Please, press [return] to use it or type a path where to locate it: > > -> /usr/lib64/openmpi/bin > > 'mpicc' not found in the PATH > > Alternative found at '/usr/lib64/openmpi/bin'. > > Please, press [return] to use it or type a path where to locate it: > > -> /usr/lib64/openmpi/bin > > 'mpicxx' not found in the PATH > > Alternative found at '/usr/lib64/openmpi/bin'. > > Please, press [return] to use it or type a path where to locate it: > > -> /usr/lib64/openmpi/bin > > Java detected at: /home/user/Data/Software/miniconda/envs/scipion3 > > 'nvcc' not found in the PATH (either in CUDA_BIN/XMIPP_CUDA_BIN) > > Alternative found at '/usr/local/cuda/bin'. > > Please, press [return] to use it or type a path where to locate it: > > -> /usr/local/cuda/bin > > CUDA-10.1.243 detected. > > CUDA-8.0 is recommended. > > Using '/usr/local/cuda-10.1/targets/x86_64-linux/lib'. > > Configuration completed..... > > > > (scipion3) [user@dataanalysisserver1 xmippSrc-v3.20.07]$ ./xmipp > > check_config > > Checking configuration ------------------------------ > > Checking compiler configuration ... 
> > g++ 4.8.5 detected > > g++ -c -w -mtune=native -march=native -std=c++11 -O3 > xmipp_test_main.cpp > > -o xmipp_test_main.o -I../ > > -I/home/user/Data/Software/miniconda/envs/scipion3/include > > -I/home/user/Data/Software/miniconda/envs/scipion3/include/python3.8 > > > > > > -I/home/user/Data/Software/miniconda/envs/scipion3/lib/python3.8/site-packages/numpy/core/include > > g++ -L/home/user/Data/Software/miniconda/envs/scipion3/lib > > xmipp_test_main.o -o xmipp_test_main -lfftw3 -lfftw3_threads -lhdf5 > > -lhdf5_cpp -ltiff -ljpeg -lsqlite3 -lpthread > > /usr/bin/ld: warning: libwebp.so.7, needed by > > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so, not > > found (try using -rpath or -rpath-link) > > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > > undefined reference to `WebPPictureImportRGB' > > /home/user/Data/Software/> > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > > undefined reference to `WebPPictureFree' > > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > > undefined reference to `WebPIAppend' > > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > > undefined reference to `WebPIDecGetRGB' > > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > > undefined reference to `WebPINewDecoder' > > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > > undefined reference to `WebPPictureImportRGBA' > > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > > undefined reference to `WebPConfigInitInternal' > > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > > undefined reference to `WebPEncoundefined reference to > `WebPValidateConfig' > > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > > undefined reference to `WebPPictureInitInternal' > > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > > undefined reference to `WebPFreeDecBuffer' > > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > > undefined reference to `WebPIDelete' > > collect2: error: ld returned 1 exit status > > Check the LINKERFORPROGRAMS, LINKFLAGS and LIBDIRFLAGS > > Cannot compile > > Possible solutiolibhdf5-dev libopencv-dev python3-dev python3-numpy > > python3-scipy > > python3-mpi4py > > In Manjaro: sudo pacman -Syu install hdf5 python3-numpy python3-scipy > > --noconfirm > > Please, see > > ' > > > > https://scipion-em.github.io/docs/docs/scipion-modes/install-from-sources.html#step-2-dependencies > > ' > > for more information about libraries dependencies. > > Remember to re-run './xmipp config' after installing libraries in > order > > to take into account the new system configuration. > > rm xmipp_test_main* > > Check failed! Something wrong with the configuration. > > > > > > It's still looking for libwebp.so.7, but this is present: > > > > (scipion3) [user@dataanalysisserver1 xmippSrc-v3.20.07]$ ls -ltr > > ../../../../miniconda/envs/scipion3/lib/libwebp.so* > > -rwxrwxrwx. 2 user user 598608 Feb 1 16:15 > > ../../../../miniconda/envs/scipion3/lib/libwebp.so.7.1.1 > > lrwxrwxrwx. 1 user user > > 16 May 20 10:07 ../../../../miniconda/envs/scipion3/lib/libwebp.so.7 > -> > > libwebp.so.7.1.1 > > lrwxrwxrwx. 
1 user user > > 16 May 20 10:07 ../../../../miniconda/envs/scipion3/lib/libwebp.so -> > > libwebp.so.7.1.1 > > > > Best > > > > Christian > > > > > > >>> Grigory Sharov <sha...@gm...> 20.05.21 19.54 Uhr >>> > > Hi Christian, > > > > I guess xmipp script still cannot recognize libtiff that it has > > installed. > > I can reproduce your problem on my machine. But I have a system > library > > installed. > > > > So, the easiest solution is to install libtiff / libtiff-devel using > > your > > package manager (yum) > > > > Best regards, > > Grigory > > > > > > > > -------------------------------------------------------------------------------- > > Grigory Sharov, Ph.D. > > > > MRC Laboratory of Molecular Biology, > > Francis Crick Avenue, > > Cambridge Biomedical Campus, > > Cambridge CB2 0QH, UK. > > tel. +44 (0) 1223 267228 <+44%201223%20267228> > > e-mail: gs...@mr... > > > > > > On Thu, May 20, 2021 at 6:34 PM Christian Tüting < > > chr...@bi...> wrote: > > > > > Hi, > > > > > > it fails with similar errors like in the automatic installation: > > > > > > > > > (scipion3) [user@dataanalysisserver1 scipion3]$ ldd > > > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so > > > linux-vdso.so.1 => (0x00007ffe872c3000) > > > libwebp.so.7 => > > > /home/user/Data/Software/miniconda/envs/scipion3/lib/./libwebp.so.7 > > > (0x00007f8f7253a000) > > > libzstd.so.1 => > > > /home/user/Data/Software/miniconda/envs/scipion3/lib/./libzstd.so.1 > > > (0x00007f8f7246e000) > > > liblzma.so.5 => > > > /home/user/Data/Software/miniconda/envs/scipion3/lib/./liblzma.so.5 > > > (0x00007f8f723fe000) > > > libjpeg.so.9 => > > > /home/user/Data/Software/miniconda/envs/scipion3/lib/./libjpeg.so.9 > > > (0x00007f8f721c2000) > > > libz.so.1 => > > > /home/user/Data/Softw> > libm.so.6 => /lib64/libm.so.6 > (0x00007f8f71ea4000) > > > libc.so.6 => /lib64/libc.so.6 (0x00007f8f71ad6000) > > > libpthread.so.0 => /lib64/libpthread.so.0 (0x00007f8f718ba000) > > > librt.so.1 => /lib64/librt.so.1 (0x00007f8f716b2000) > > > /lib64/ld-linux-x86-64.so.2 (0x00007f8f72427000) > > > > > > (scipion3) [user@dataanalysisserver1 xmippSrc-v3.20.07]$ ./xmipp > > config > > > Configuring ----------------------------------------- > > > gcc detected > > > 'libtiff' not found in the system > > > 'libtiff' dependency not found. Do you want to install it using > conda? > > > [YES/no] Y > > > Trying to install libtiff with conda > > > conda activate scipion3 ; conda install libtiff -y -c defaults > > > CommandNotFoundError: Your shell has not been properly configured to > > use > > > 'conda activate'. > > > To initialize your shell, run > > > $ conda init <SHELL_NAME> > > > Currently supported shells are: > > > - bash > > > - fish > > > - tcsh > > > - x> IMPORTANT: You may need to close and restart your shell after > > running > > > 'conda init'. > > > Collecting package metadata (current_repodata.json): ...working... > > done > > > Solving environment: ...working... done > > > # All requested packages already installed. > > > 'libtiff' installed in conda environ 'scipion3'. > > > g++ -c -w -mtune=native -march=native -std=c++11 -O3 > > > xmipp_test_opencv.cpp -o xmipp_test_opencv.o -I..> 'mpirun' and > > 'mpiexec' not found in the PATH > > > Alternative found at '/usr/lib64/openmpi/bin'. > > > Please, press [return] to use it or type a path where to locate it: > > > -> /usr/lib64/openmpi/bin > > > 'mpicc' not found in the PATH > > > Alternative found at '/usr/lib64/openmpi/bin'. 
> > > Please, press [return] to use it or type a path where to locate it: > > > -> /usr/lib64/openmpi/bin > > > 'mpicxx' not found in the PATH > > > Alternative found at '/usr/lib64/openmpi/bin'. > > > Please, press [return] to use it or type a path where to locate it: > > > -> /usr/lib64/openmpi/bin > > > Java detected at: /home/user/Data/Software/miniconda/envs/scipion3 > > > 'nvcc' not found in the PATH (either in CUDA_BIN/XMIPP_CUDA_BIN) > > > Alternative found at '/usr/local/cuda/bin'. > > > Please, press [return] to use it or type a path where to locate it: > > > -> /usr/local/cuda/bin > > > CUDA-10.1.243 detected. > > > CUDA-8.0 is recommended. > > > Using '/usr/local/cuda-10.1/targets/x86_64-linux/lib'. > > > Configuration completed..... > > > (scipion3) [user@dataanalysisserver1 xmippSrc-v3.20.07]$ ./xmipp > > config > > > Configuring ----------------------------------------- > > > gcc detected > > > 'libtiff' not found in the system > > > 'libtiff' dependency not found. Do you want to install it using > conda? > > > [YES/no] > > > Trying to install libtiff with conda > > > conda activate scipion3 ; conda install libtiff -y -c defaults > > > CommandNotFoundError: Your shell has not been properly configured to > > use > > > 'conda activate'. > > > To initialize your shell, run > > > $ conda init <SHELL_NAME> > > > Currently supported shells are: > > > - bash > > > - fish > > > - tcsh > > > - xonsh > > > - zsh > > > - powershell > > > See 'conda init --help' for more information and options. > > > IMPORTANT: You may need to close and restart your shell after > running > > > 'conda init'. > > > Collecting package metadata (current_repodata.json): ...working... > > done > > > Solving environment: ...working... done > > > # All requested packages already installed. > > > 'libtiff' installed in conda exmipp_test_opencv.cpp -o > > xmipp_test_opencv.o > > > -I../ > > > -I/home/user/Data/Software/miniconda/envs/scipion3/include > > > OpenCV not found > > > rm -v xmipp_test_opencv* > > > 'mpirun' and 'mpiexec' not found in the PATH > > > Alternative found at '/usr/lib64/openmpi/bin'. > > > Please, press [return] to use it or type a path where to locate it: > > > -> /usr/lib64/openmpi/bin > > > 'mpicc' not found in the PATH > > > Alternative found at '/usr/lib64/openmpi/bin'. > > > Please, press [return] to use it or type a path where to locate it: > > > -> /usr/lib64/openmpi/bin > > > 'mpicxx' not found in the PATH > > > Alternative found at '/usr/lib64/openmpi/bin'. > > > Please, press [return> > Java detected at: > /home/user/Data/Software/miniconda/envs/scipion3 > > > 'nvcc' not found in the PATH (either in CUDA_BIN/XMIPP_CUDA_BIN) > > > Alternative found at '/usr/local/cuda/bin'. > > > Please, press [return] to use it or type a path where to locate it: > > > -> /usr/local/cuda/bin > > > CUDA-10.1.243 detected. > > > CUDA-8.0 is recommended. > > > Using '/usr/local/cuda-10.1/targets/x86_64-linux/lib'. > > > Configuration completed..... > > > > > > (scipion3) [user@dataanalysisserver1 xmippSrc-v3.20.07]$ ./xmipp > > > check_config > > > Checking configuration ------------------------------ > > > Checking compiler configuration ... 
> > > g++ 4.8.5 detected > > > g++ -c -w -mtune=native -march=native -std=c++11 -O3 > > xmipp_test_main.cpp > > > -o xmipp_test_main.o -I../ > > > -I/home/user/Data/Software/miniconda/envs/scipion3/include > > > -I/home/user/Data/Software/miniconda/envs/scipion3/include/python3.8 > > > > > > > > > > -I/home/user/Data/Software/miniconda/envs/scipion3/lib/python3.8/site-packages/numpy/c> > > xmipp_test_main.o -o xmipp_test_main -lfftw3 -lfftw3_threads -lhdf5 > > > -lhdf5_cpp -ltiff -ljpeg -lsqlite3 -lpthread > > > /usr/bin/ld: warning: libwebp.so.7, needed by > > > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so, not > > > found (try using -rpath or -rpath-link) > > > /home/user/Data/Software/miniconda/envs/scipion3/lib/lib> > > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > > > undefined reference to `WebPInitDecBufferInternal' > > > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > > > undefined reference to `WebPPictureFree' > > > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > > > undefined reference to `WebPIAppend' > > > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > > > undefined reference to `WebPIDecGetRGB' > > > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > > > undefined reference to `WebPINewDecoder' > > > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > > > undefined reference to `WebPPictureImportRGBA' > > > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > > > undefined reference to `WebPConfigInitInternal' > > > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > > > undefined reference to `WebPEncode' > > > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > > > undefined reference to `WebPValidateConfig' > > > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > > > undefined reference to `WebPPictureInitInternal' > > > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > > > undefined reference to `WebPFreeDecBuffer' > > > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > > > undefined reference to `WebPIDelete' > > > collect2: error: ld returned 1 exit status > > > Check the LINKERFORPROGRAMS, LINKFLAGS and LIBDIRFLAGS > > > Cannot compile > > > Possible solutions > > > In Ubuntu: sudo apt-get -y install libsqlite3-dev libfftw3-dev > > > libhdf5-dev libopencv-dev python3-dev python3-numpy python3-scipy > > > python3-mpi4py > > > In Manjaro: sudo pacman -Syu install hdf5 python3-numpy > python3-scipy > > > --noconfirm > > > Please, see > > > ' > > > > > > > > > https://scipion-em.github.io/docs/docs/scipion-modes/install-from-sources.html#step-2-dependencies > > > ' > > > for more information about libraries dependencies. > > > Remember to re-run './xmipp config' after installing libraries in > > order > > > to take into account the new system configuration. > > > rm xmipp_test_main* > > > Check failed! Trying to install libtiff with conda > > > conda activate scipion3 ; conda install libtiff -y -c defaults > > > CommandNotFoundError: Your shell has not been properly configured to > > use > > > 'conda activate'. 
> > > > > > Because I can run it manually without errors: > > > > > > (scipion3) [user@dataanalysisserver1 xmippSrc-v3.20.07]$ conda > > activate > > > scipion3 ; conda install libtiff -y -c defaults > > > Collecting package metadata (current_repodata.json): done > > > Solving environment: done > > > # All requested packages alre> > And also, thanks already in advance. > I really appreciate your help. > > > > > > Best Christian > > > > > > > > > > > > >>> Grigory Sharov <sha...@gm...> 20.05.21 18.12 Uhr >>> > > > Hi, > > > > > > Let's try manual installation: > > > > > > conda activate scipion3 > > > > ldd > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so > > -> > > > > check output for errors > > > > export SCIPION_HOME="/home/user/Data/Software/scipion3/" > > > > cd /home/user/Data/Software/scipion3/software/em/xmippSrc-v3.20.07 > > > > ./xmipp config -> check if all is correct in xmipp.conf > > > > ./xmipp check_config -> check for errors > > > > ./xmipp compileAndInstall N=4 && ln -srfn build > > > > /home/user/Data/Software/scipion3/software/em/xmipp && cd - && > touch > > > > installation_finished && rm bindings_linked 2> /dev/null > > > > > > > > > You can post output errors from these commands here. > > > > > > Best regards, > > > Grigory > > > > > > > > > > > > > > > -------------------------------------------------------------------------------- > > > Grigory Sharov, Ph.D. > > > > > > MRC Laboratory of Molecular Biology, > > > Francis Crick Avenue, > > > Cambridge Biomedical Campus, > > > Cambridge CB2 0QH, UK. > > > tel. +44 (0) 1223 267228 <+44%201223%20267228> > > > e-mail: gsharov@mrc-lmb.> > chr...@bi...> > > wrote: > > > > > > > Hi all, > > > > > > > > Dmitrys collegue here. > > > > > > > > yes we have the library in the scipion3 env: > > > > > > > > > > > > [user@dataanalysisserver1 lib]$ ls -d $PWD/libwebp.so.7 > > > > > > > > > > > > /home/user/Data/Software> > [92m conda activate scipion3 ; conda > > install fftw -y -c defaults[0m > > > > CommandNotFoundError: Your shell has not been properly configured > to > > > use > > > > 'conda activate'. > > > > > > > > When I am coping this command, it runs without any issue. > > > > > > > > > > > > Thanks for your help. > > > > > > > > best > > > > Christian > > > > > > > > > > > > > > > > >>> Grigory Sharov <sha...@gm...> 20.05.21 16.39 Uhr > >>> > > > > Hi, > > > > > > > > do you have libwebp.so.7 in > > > > /home/user/Data/Software/miniconda/envs/scipion3/lib/ ? > > > > > > > > Best regards, > > > > Grigory > > > > > > > > > > > > > > > > > > > > > > > > -------------------------------------------------------------------------------- > > > > Grigory Sharov, Ph.D. > > > > > > > > MRC Laboratory of Molecular Biology, > > > > Francis Crick Avenue, > > > > Cambridge Biomedical Campus, > > > > Cambridge CB2 0QH, UK. > > > > tel. +44 (0) 1223 267228 <+44%201223%20267228> > > > > e-mail: gs...@mr... > > > > > > > > > > > > On Thu, May 20, 2021 at 3:00 PM Dmitry Semchonok > > <Sem...@gm...> > > > > wrote: > > > > > > > > > Dear colleagues, > > > > > > > > > > We reinstalled the centos 7 on our server as well as miniconda. > > > > > > > > > > > > > > > > > > > > The core was installed correctly but during xmipp installation > > some > > > error > > > > > appeared. Please have a look on the file attached. > > > > > > > > > > > > > > > > > > > > Could you please advice how to proceed? > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > Thank you in advance. 
> > > > > > > > > > > > > > > Sincerely, > > > > > Dmitry and > > Christian_______________________________________________ > > > > > scipion-users mailing list > > > > > sci...@li... > > > > > https://lists.sourceforge.net/lists/listinfo/scipion-users > > > > > > > > > > > > > > > > > > > > > _______________________________________________ > > > > scipion-users mailing list > > > > sci...@li... > > > > https://lists.sourceforge.net/lists/listinfo/scipion-users > > > > > > > > > > > > > > > > _______________________________________________ > > > scipion-users mailing list > > > sci...@li... > > > https://lists.sourceforge.net/lists/listinfo/scipion-users > > > > > > > > > > > _______________________________________________ > > scipion-users mailing list > > sci...@li... > > https://lists.sourceforge.net/lists/listinfo/scipion-users > > > > > > _______________________________________________ > > scipion-users mailing list > > sci...@li... > > https://lists.sourceforge.net/lists/listinfo/sci > > > _______________________________________________ > scipion-users mailing list > sci...@li... > https://lists.sourceforge.net/lists/listinfo/scipion-users > |
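On CentOS 7, the system-wide libtiff route suggested in the quoted messages looks roughly like this; the wildcard form is what finally pulled in the missing sub-packages in this thread (run as root or via sudo):

  # install the libtiff runtime and development packages system-wide
  yum install libtiff libtiff-devel
  # if yum answers "nothing to do", the wildcard catches the remaining libtiff sub-packages
  yum install libtiff*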
From: Christian T. <chr...@bi...> - 2021-05-20 18:59:12
|
Ok. We will test this and keep you informed. >>> Grigory Sharov <sha...@gm...> 20.05.21 20.56 Uhr >>> If you were able to successfully build xmipp, you need to check if you can display images in scipion to make sure it works. Best regards, Grigory -------------------------------------------------------------------------------- Grigory Sharov, Ph.D. MRC Laboratory of Molecular Biology, Francis Crick Avenue, Cambridge Biomedical Campus, Cambridge CB2 0QH, UK. tel. +44 (0) 1223 267228 <+44%201223%20267228> e-mail: gs...@mr... On Thu, May 20, 2021 at 7:54 PM Christian Tüting < chr...@bi...> wrote: > I (temporarily) fixed the error with: > > export > > LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/home/user/Data/Software/miniconda/envs/scipion3/lib/ > > > xmipp is currently compiling. I am really no expert in server > maintainance, so I am not sure what are the implications of this. Afaik, > this is only a fix for the current shell, so after reboot, this fix > should be gone. So if xmipp needs this library, I guess I have to add > this export command to ~/.bashrc, right? > > Best > > Christian > > > >>> Christian Tüting <chr...@bi...> > 20.05.21 20.20 Uhr >>> > Hi, > > I tried yum install libtiff libtiff-devel before, but yum says, "nothing > to do". But, yum install libtiff* helped to install the missing packages > somehow. > > So the error disappers from ./xmipp_config. There are some path missing > in PATH, but alternatives are found so I guess this is fine (see output > below). > > But ./xmipp check_config still fails. > > (scipion3) [user@dataanalysisserver1 xmippSrc-v3.20.07]$ ./xmipp config > Configuring ----------------------------------------- > gcc detected > g++ -c -w -mtune=native -march=native -std=c++11 -O3 > xmipp_test_opencv.cpp -o xmipp_test_opencv.o -I../ > -I/home/user/Data/Software/miniconda/envs/scipion3/include > OpenCV not found > rm -v xmipp_test_opencv* > 'mpirun' and 'mpiexec' not found in the PATH > Alternative found at '/usr/lib64/openmpi/bin'. > Please, press [return] to use it or type a path where to locate it: > -> /usr/lib64/openmpi/bin > 'mpicc' not found in the PATH > Alternative found at '/usr/lib64/openmpi/bin'. > Please, press [return] to use it or type a path where to locate it: > -> /usr/lib64/openmpi/bin > 'mpicxx' not found in the PATH > Alternative found at '/usr/lib64/openmpi/bin'. > Please, press [return] to use it or type a path where to locate it: > -> /usr/lib64/openmpi/bin > Java detected at: /home/user/Data/Software/miniconda/envs/scipion3 > 'nvcc' not found in the PATH (either in CUDA_BIN/XMIPP_CUDA_BIN) > Alternative found at '/usr/local/cuda/bin'. > Please, press [return] to use it or type a path where to locate it: > -> /usr/local/cuda/bin > CUDA-10.1.243 detected. > CUDA-8.0 is recommended. > Using '/usr/local/cuda-10.1/targets/x86_64-linux/lib'. > Configuration completed..... > > (scipion3) [user@dataanalysisserver1 xmippSrc-v3.20.07]$ ./xmipp > check_config > Checking configuration ------------------------------ > Checking compiler configuration ... 
> g++ 4.8.5 detected > g++ -c -w -mtune=native -march=native -std=c++11 -O3 xmipp_test_main.cpp > -o xmipp_test_main.o -I../ > -I/home/user/Data/Software/miniconda/envs/scipion3/include > -I/home/user/Data/Software/miniconda/envs/scipion3/include/python3.8 > > -I/home/user/Data/Software/miniconda/envs/scipion3/lib/python3.8/site-packages/numpy/core/include > g++ -L/home/user/Data/Software/miniconda/envs/scipion3/lib > xmipp_test_main.o -o xmipp_test_main -lfftw3 -lfftw3_threads -lhdf5 > -lhdf5_cpp -ltiff -ljpeg -lsqlite3 -lpthread > /usr/bin/ld: warning: libwebp.so.7, needed by > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so, not > found (try using -rpath or -rpath-link) > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > undefined reference to `WebPPictureImportRGB' > /home/user/Data/Software/> /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > undefined reference to `WebPPictureFree' > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > undefined reference to `WebPIAppend' > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > undefined reference to `WebPIDecGetRGB' > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > undefined reference to `WebPINewDecoder' > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > undefined reference to `WebPPictureImportRGBA' > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > undefined reference to `WebPConfigInitInternal' > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > undefined reference to `WebPEncoundefined reference to `WebPValidateConfig' > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > undefined reference to `WebPPictureInitInternal' > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > undefined reference to `WebPFreeDecBuffer' > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > undefined reference to `WebPIDelete' > collect2: error: ld returned 1 exit status > Check the LINKERFORPROGRAMS, LINKFLAGS and LIBDIRFLAGS > Cannot compile > Possible solutiolibhdf5-dev libopencv-dev python3-dev python3-numpy > python3-scipy > python3-mpi4py > In Manjaro: sudo pacman -Syu install hdf5 python3-numpy python3-scipy > --noconfirm > Please, see > ' > https://scipion-em.github.io/docs/docs/scipion-modes/install-from-sources.html#step-2-dependencies > ' > for more information about libraries dependencies. > Remember to re-run './xmipp config' after installing libraries in order > to take into account the new system configuration. > rm xmipp_test_main* > Check failed! Something wrong with the configuration. > > > It's still looking for libwebp.so.7, but this is present: > > (scipion3) [user@dataanalysisserver1 xmippSrc-v3.20.07]$ ls -ltr > ../../../../miniconda/envs/scipion3/lib/libwebp.so* > -rwxrwxrwx. 2 user user 598608 Feb 1 16:15 > ../../../../miniconda/envs/scipion3/lib/libwebp.so.7.1.1 > lrwxrwxrwx. 1 user user > 16 May 20 10:07 ../../../../miniconda/envs/scipion3/lib/libwebp.so.7 -> > libwebp.so.7.1.1 > lrwxrwxrwx. 1 user user > 16 May 20 10:07 ../../../../miniconda/envs/scipion3/lib/libwebp.so -> > libwebp.so.7.1.1 > > Best > > Christian > > > >>> Grigory Sharov <sha...@gm...> 20.05.21 19.54 Uhr >>> > Hi Christian, > > I guess xmipp script still cannot recognize libtiff that it has > installed. > I can reproduce your problem on my machine. But I have a system library > installed. 
> > So, the easiest solution is to install libtiff / libtiff-devel using > your > package manager (yum) > > Best regards, > Grigory > > > -------------------------------------------------------------------------------- > Grigory Sharov, Ph.D. > > MRC Laboratory of Molecular Biology, > Francis Crick Avenue, > Cambridge Biomedical Campus, > Cambridge CB2 0QH, UK. > tel. +44 (0) 1223 267228 <+44%201223%20267228> > e-mail: gs...@mr... > > > On Thu, May 20, 2021 at 6:34 PM Christian Tüting < > chr...@bi...> wrote: > > > Hi, > > > > it fails with similar errors like in the automatic installation: > > > > > > (scipion3) [user@dataanalysisserver1 scipion3]$ ldd > > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so > > linux-vdso.so.1 => (0x00007ffe872c3000) > > libwebp.so.7 => > > /home/user/Data/Software/miniconda/envs/scipion3/lib/./libwebp.so.7 > > (0x00007f8f7253a000) > > libzstd.so.1 => > > /home/user/Data/Software/miniconda/envs/scipion3/lib/./libzstd.so.1 > > (0x00007f8f7246e000) > > liblzma.so.5 => > > /home/user/Data/Software/miniconda/envs/scipion3/lib/./liblzma.so.5 > > (0x00007f8f723fe000) > > libjpeg.so.9 => > > /home/user/Data/Software/miniconda/envs/scipion3/lib/./libjpeg.so.9 > > (0x00007f8f721c2000) > > libz.so.1 => > > /home/user/Data/Softw> > libm.so.6 => /lib64/libm.so.6 (0x00007f8f71ea4000) > > libc.so.6 => /lib64/libc.so.6 (0x00007f8f71ad6000) > > libpthread.so.0 => /lib64/libpthread.so.0 (0x00007f8f718ba000) > > librt.so.1 => /lib64/librt.so.1 (0x00007f8f716b2000) > > /lib64/ld-linux-x86-64.so.2 (0x00007f8f72427000) > > > > (scipion3) [user@dataanalysisserver1 xmippSrc-v3.20.07]$ ./xmipp > config > > Configuring ----------------------------------------- > > gcc detected > > 'libtiff' not found in the system > > 'libtiff' dependency not found. Do you want to install it using conda? > > [YES/no] Y > > Trying to install libtiff with conda > > conda activate scipion3 ; conda install libtiff -y -c defaults > > CommandNotFoundError: Your shell has not been properly configured to > use > > 'conda activate'. > > To initialize your shell, run > > $ conda init <SHELL_NAME> > > Currently supported shells are: > > - bash > > - fish > > - tcsh > > - x> IMPORTANT: You may need to close and restart your shell after > running > > 'conda init'. > > Collecting package metadata (current_repodata.json): ...working... > done > > Solving environment: ...working... done > > # All requested packages already installed. > > 'libtiff' installed in conda environ 'scipion3'. > > g++ -c -w -mtune=native -march=native -std=c++11 -O3 > > xmipp_test_opencv.cpp -o xmipp_test_opencv.o -I..> 'mpirun' and > 'mpiexec' not found in the PATH > > Alternative found at '/usr/lib64/openmpi/bin'. > > Please, press [return] to use it or type a path where to locate it: > > -> /usr/lib64/openmpi/bin > > 'mpicc' not found in the PATH > > Alternative found at '/usr/lib64/openmpi/bin'. > > Please, press [return] to use it or type a path where to locate it: > > -> /usr/lib64/openmpi/bin > > 'mpicxx' not found in the PATH > > Alternative found at '/usr/lib64/openmpi/bin'. > > Please, press [return] to use it or type a path where to locate it: > > -> /usr/lib64/openmpi/bin > > Java detected at: /home/user/Data/Software/miniconda/envs/scipion3 > > 'nvcc' not found in the PATH (either in CUDA_BIN/XMIPP_CUDA_BIN) > > Alternative found at '/usr/local/cuda/bin'. > > Please, press [return] to use it or type a path where to locate it: > > -> /usr/local/cuda/bin > > CUDA-10.1.243 detected. > > CUDA-8.0 is recommended. 
> > Using '/usr/local/cuda-10.1/targets/x86_64-linux/lib'. > > Configuration completed..... > > (scipion3) [user@dataanalysisserver1 xmippSrc-v3.20.07]$ ./xmipp > config > > Configuring ----------------------------------------- > > gcc detected > > 'libtiff' not found in the system > > 'libtiff' dependency not found. Do you want to install it using conda? > > [YES/no] > > Trying to install libtiff with conda > > conda activate scipion3 ; conda install libtiff -y -c defaults > > CommandNotFoundError: Your shell has not been properly configured to > use > > 'conda activate'. > > To initialize your shell, run > > $ conda init <SHELL_NAME> > > Currently supported shells are: > > - bash > > - fish > > - tcsh > > - xonsh > > - zsh > > - powershell > > See 'conda init --help' for more information and options. > > IMPORTANT: You may need to close and restart your shell after running > > 'conda init'. > > Collecting package metadata (current_repodata.json): ...working... > done > > Solving environment: ...working... done > > # All requested packages already installed. > > 'libtiff' installed in conda exmipp_test_opencv.cpp -o > xmipp_test_opencv.o > > -I../ > > -I/home/user/Data/Software/miniconda/envs/scipion3/include > > OpenCV not found > > rm -v xmipp_test_opencv* > > 'mpirun' and 'mpiexec' not found in the PATH > > Alternative found at '/usr/lib64/openmpi/bin'. > > Please, press [return] to use it or type a path where to locate it: > > -> /usr/lib64/openmpi/bin > > 'mpicc' not found in the PATH > > Alternative found at '/usr/lib64/openmpi/bin'. > > Please, press [return] to use it or type a path where to locate it: > > -> /usr/lib64/openmpi/bin > > 'mpicxx' not found in the PATH > > Alternative found at '/usr/lib64/openmpi/bin'. > > Please, press [return> > Java detected at: /home/user/Data/Software/miniconda/envs/scipion3 > > 'nvcc' not found in the PATH (either in CUDA_BIN/XMIPP_CUDA_BIN) > > Alternative found at '/usr/local/cuda/bin'. > > Please, press [return] to use it or type a path where to locate it: > > -> /usr/local/cuda/bin > > CUDA-10.1.243 detected. > > CUDA-8.0 is recommended. > > Using '/usr/local/cuda-10.1/targets/x86_64-linux/lib'. > > Configuration completed..... > > > > (scipion3) [user@dataanalysisserver1 xmippSrc-v3.20.07]$ ./xmipp > > check_config > > Checking configuration ------------------------------ > > Checking compiler configuration ... 
> > g++ 4.8.5 detected > > g++ -c -w -mtune=native -march=native -std=c++11 -O3 > xmipp_test_main.cpp > > -o xmipp_test_main.o -I../ > > -I/home/user/Data/Software/miniconda/envs/scipion3/include > > -I/home/user/Data/Software/miniconda/envs/scipion3/include/python3.8 > > > > > -I/home/user/Data/Software/miniconda/envs/scipion3/lib/python3.8/site-packages/numpy/c> > xmipp_test_main.o -o xmipp_test_main -lfftw3 -lfftw3_threads -lhdf5 > > -lhdf5_cpp -ltiff -ljpeg -lsqlite3 -lpthread > > /usr/bin/ld: warning: libwebp.so.7, needed by > > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so, not > > found (try using -rpath or -rpath-link) > > /home/user/Data/Software/miniconda/envs/scipion3/lib/lib> > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > > undefined reference to `WebPInitDecBufferInternal' > > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > > undefined reference to `WebPPictureFree' > > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > > undefined reference to `WebPIAppend' > > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > > undefined reference to `WebPIDecGetRGB' > > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > > undefined reference to `WebPINewDecoder' > > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > > undefined reference to `WebPPictureImportRGBA' > > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > > undefined reference to `WebPConfigInitInternal' > > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > > undefined reference to `WebPEncode' > > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > > undefined reference to `WebPValidateConfig' > > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > > undefined reference to `WebPPictureInitInternal' > > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > > undefined reference to `WebPFreeDecBuffer' > > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > > undefined reference to `WebPIDelete' > > collect2: error: ld returned 1 exit status > > Check the LINKERFORPROGRAMS, LINKFLAGS and LIBDIRFLAGS > > Cannot compile > > Possible solutions > > In Ubuntu: sudo apt-get -y install libsqlite3-dev libfftw3-dev > > libhdf5-dev libopencv-dev python3-dev python3-numpy python3-scipy > > python3-mpi4py > > In Manjaro: sudo pacman -Syu install hdf5 python3-numpy python3-scipy > > --noconfirm > > Please, see > > ' > > > > https://scipion-em.github.io/docs/docs/scipion-modes/install-from-sources.html#step-2-dependencies > > ' > > for more information about libraries dependencies. > > Remember to re-run './xmipp config' after installing libraries in > order > > to take into account the new system configuration. > > rm xmipp_test_main* > > Check failed! Trying to install libtiff with conda > > conda activate scipion3 ; conda install libtiff -y -c defaults > > CommandNotFoundError: Your shell has not been properly configured to > use > > 'conda activate'. > > > > Because I can run it manually without errors: > > > > (scipion3) [user@dataanalysisserver1 xmippSrc-v3.20.07]$ conda > activate > > scipion3 ; conda install libtiff -y -c defaults > > Collecting package metadata (current_repodata.json): done > > Solving environment: done > > # All requested packages alre> > And also, thanks already in advance. I really appreciate your help. 
> > > > Best Christian > > > > > > > > >>> Grigory Sharov <sha...@gm...> 20.05.21 18.12 Uhr >>> > > Hi, > > > > Let's try manual installation: > > > > conda activate scipion3 > > > ldd /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so > -> > > > check output for errors > > > export SCIPION_HOME="/home/user/Data/Software/scipion3/" > > > cd /home/user/Data/Software/scipion3/software/em/xmippSrc-v3.20.07 > > > ./xmipp config -> check if all is correct in xmipp.conf > > > ./xmipp check_config -> check for errors > > > ./xmipp compileAndInstall N=4 && ln -srfn build > > > /home/user/Data/Software/scipion3/software/em/xmipp && cd - && touch > > > installation_finished && rm bindings_linked 2> /dev/null > > > > > > You can post output errors from these commands here. > > > > Best regards, > > Grigory > > > > > > > > -------------------------------------------------------------------------------- > > Grigory Sharov, Ph.D. > > > > MRC Laboratory of Molecular Biology, > > Francis Crick Avenue, > > Cambridge Biomedical Campus, > > Cambridge CB2 0QH, UK. > > tel. +44 (0) 1223 267228 <+44%201223%20267228> > > e-mail: gsharov@mrc-lmb.> chr...@bi...> > wrote: > > > > > Hi all, > > > > > > Dmitrys collegue here. > > > > > > yes we have the library in the scipion3 env: > > > > > > > > > [user@dataanalysisserver1 lib]$ ls -d $PWD/libwebp.so.7 > > > > > > > > > /home/user/Data/Software> > [92m conda activate scipion3 ; conda > install fftw -y -c defaults[0m > > > CommandNotFoundError: Your shell has not been properly configured to > > use > > > 'conda activate'. > > > > > > When I am coping this command, it runs without any issue. > > > > > > > > > Thanks for your help. > > > > > > best > > > Christian > > > > > > > > > > > > >>> Grigory Sharov <sha...@gm...> 20.05.21 16.39 Uhr >>> > > > Hi, > > > > > > do you have libwebp.so.7 in > > > /home/user/Data/Software/miniconda/envs/scipion3/lib/ ? > > > > > > Best regards, > > > Grigory > > > > > > > > > > > > > > > -------------------------------------------------------------------------------- > > > Grigory Sharov, Ph.D. > > > > > > MRC Laboratory of Molecular Biology, > > > Francis Crick Avenue, > > > Cambridge Biomedical Campus, > > > Cambridge CB2 0QH, UK. > > > tel. +44 (0) 1223 267228 <+44%201223%20267228> > > > e-mail: gs...@mr... > > > > > > > > > On Thu, May 20, 2021 at 3:00 PM Dmitry Semchonok > <Sem...@gm...> > > > wrote: > > > > > > > Dear colleagues, > > > > > > > > We reinstalled the centos 7 on our server as well as miniconda. > > > > > > > > > > > > > > > > The core was installed correctly but during xmipp installation > some > > error > > > > appeared. Please have a look on the file attached. > > > > > > > > > > > > > > > > Could you please advice how to proceed? > > > > > > > > > > > > > > > > > > > > > > > > > > > > Thank you in advance. > > > > > > > > > > > > Sincerely, > > > > Dmitry and > Christian_______________________________________________ > > > > scipion-users mailing list > > > > sci...@li... > > > > https://lists.sourceforge.net/lists/listinfo/scipion-users > > > > > > > > > > > > > > > > _______________________________________________ > > > scipion-users mailing list > > > sci...@li... > > > https://lists.sourceforge.net/lists/listinfo/scipion-users > > > > > > > > > > > _______________________________________________ > > scipion-users mailing list > > sci...@li... 
> > https://lists.sourceforge.net/lists/listinfo/scipion-users |
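[Note on the link failure quoted in the message above: it is a search-path problem. The conda-packaged libtiff.so depends on libwebp.so.7, and while -L tells the linker where to find libraries named with -l, the dependencies of those shared libraries are looked up via -rpath-link, -rpath or LD_LIBRARY_PATH instead. A minimal sketch to confirm this, assuming the miniconda prefix shown in the logs (the test file name is made up for illustration):

  CONDA_LIB=/home/user/Data/Software/miniconda/envs/scipion3/lib    # assumed prefix, taken from the logs above
  ldd "$CONDA_LIB/libtiff.so" | grep webp                           # the run-time loader already finds libwebp.so.7 next to libtiff
  echo 'int main(){return 0;}' > webp_link_test.cpp                 # hypothetical throw-away test program
  g++ webp_link_test.cpp -L"$CONDA_LIB" -Wl,-rpath-link,"$CONDA_LIB" -ltiff -o webp_link_test \
      && echo "libtiff links once the linker can also see its libwebp dependency"

That ldd resolves libwebp.so.7 while the link step does not is consistent with only the compile/link in check_config failing.]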
From: Grigory S. <sha...@gm...> - 2021-05-20 18:56:29
|
If you were able to successfully build xmipp, you need to check if you can display images in scipion to make sure it works. Best regards, Grigory -------------------------------------------------------------------------------- Grigory Sharov, Ph.D. MRC Laboratory of Molecular Biology, Francis Crick Avenue, Cambridge Biomedical Campus, Cambridge CB2 0QH, UK. tel. +44 (0) 1223 267228 <+44%201223%20267228> e-mail: gs...@mr... On Thu, May 20, 2021 at 7:54 PM Christian Tüting < chr...@bi...> wrote: > I (temporarily) fixed the error with: > > export > > LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/home/user/Data/Software/miniconda/envs/scipion3/lib/ > > > xmipp is currently compiling. I am really no expert in server > maintainance, so I am not sure what are the implications of this. Afaik, > this is only a fix for the current shell, so after reboot, this fix > should be gone. So if xmipp needs this library, I guess I have to add > this export command to ~/.bashrc, right? > > Best > > Christian > > > >>> Christian Tüting <chr...@bi...> > 20.05.21 20.20 Uhr >>> > Hi, > > I tried yum install libtiff libtiff-devel before, but yum says, "nothing > to do". But, yum install libtiff* helped to install the missing packages > somehow. > > So the error disappers from ./xmipp_config. There are some path missing > in PATH, but alternatives are found so I guess this is fine (see output > below). > > But ./xmipp check_config still fails. > > (scipion3) [user@dataanalysisserver1 xmippSrc-v3.20.07]$ ./xmipp config > Configuring ----------------------------------------- > gcc detected > g++ -c -w -mtune=native -march=native -std=c++11 -O3 > xmipp_test_opencv.cpp -o xmipp_test_opencv.o -I../ > -I/home/user/Data/Software/miniconda/envs/scipion3/include > OpenCV not found > rm -v xmipp_test_opencv* > 'mpirun' and 'mpiexec' not found in the PATH > Alternative found at '/usr/lib64/openmpi/bin'. > Please, press [return] to use it or type a path where to locate it: > -> /usr/lib64/openmpi/bin > 'mpicc' not found in the PATH > Alternative found at '/usr/lib64/openmpi/bin'. > Please, press [return] to use it or type a path where to locate it: > -> /usr/lib64/openmpi/bin > 'mpicxx' not found in the PATH > Alternative found at '/usr/lib64/openmpi/bin'. > Please, press [return] to use it or type a path where to locate it: > -> /usr/lib64/openmpi/bin > Java detected at: /home/user/Data/Software/miniconda/envs/scipion3 > 'nvcc' not found in the PATH (either in CUDA_BIN/XMIPP_CUDA_BIN) > Alternative found at '/usr/local/cuda/bin'. > Please, press [return] to use it or type a path where to locate it: > -> /usr/local/cuda/bin > CUDA-10.1.243 detected. > CUDA-8.0 is recommended. > Using '/usr/local/cuda-10.1/targets/x86_64-linux/lib'. > Configuration completed..... > > (scipion3) [user@dataanalysisserver1 xmippSrc-v3.20.07]$ ./xmipp > check_config > Checking configuration ------------------------------ > Checking compiler configuration ... 
> g++ 4.8.5 detected > g++ -c -w -mtune=native -march=native -std=c++11 -O3 xmipp_test_main.cpp > -o xmipp_test_main.o -I../ > -I/home/user/Data/Software/miniconda/envs/scipion3/include > -I/home/user/Data/Software/miniconda/envs/scipion3/include/python3.8 > > -I/home/user/Data/Software/miniconda/envs/scipion3/lib/python3.8/site-packages/numpy/core/include > g++ -L/home/user/Data/Software/miniconda/envs/scipion3/lib > xmipp_test_main.o -o xmipp_test_main -lfftw3 -lfftw3_threads -lhdf5 > -lhdf5_cpp -ltiff -ljpeg -lsqlite3 -lpthread > /usr/bin/ld: warning: libwebp.so.7, needed by > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so, not > found (try using -rpath or -rpath-link) > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > undefined reference to `WebPPictureImportRGB' > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > undefined reference to `WebPInitDecBufferInternal' > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > undefined reference to `WebPPictureFree' > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > undefined reference to `WebPIAppend' > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > undefined reference to `WebPIDecGetRGB' > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > undefined reference to `WebPINewDecoder' > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > undefined reference to `WebPPictureImportRGBA' > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > undefined reference to `WebPConfigInitInternal' > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > undefined reference to `WebPEncoundefined reference to `WebPValidateConfig' > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > undefined reference to `WebPPictureInitInternal' > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > undefined reference to `WebPFreeDecBuffer' > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > undefined reference to `WebPIDelete' > collect2: error: ld returned 1 exit status > Check the LINKERFORPROGRAMS, LINKFLAGS and LIBDIRFLAGS > Cannot compile > Possible solutiolibhdf5-dev libopencv-dev python3-dev python3-numpy > python3-scipy > python3-mpi4py > In Manjaro: sudo pacman -Syu install hdf5 python3-numpy python3-scipy > --noconfirm > Please, see > ' > https://scipion-em.github.io/docs/docs/scipion-modes/install-from-sources.html#step-2-dependencies > ' > for more information about libraries dependencies. > Remember to re-run './xmipp config' after installing libraries in order > to take into account the new system configuration. > rm xmipp_test_main* > Check failed! Something wrong with the configuration. > > > It's still looking for libwebp.so.7, but this is present: > > (scipion3) [user@dataanalysisserver1 xmippSrc-v3.20.07]$ ls -ltr > ../../../../miniconda/envs/scipion3/lib/libwebp.so* > -rwxrwxrwx. 2 user user 598608 Feb 1 16:15 > ../../../../miniconda/envs/scipion3/lib/libwebp.so.7.1.1 > lrwxrwxrwx. 1 user user > 16 May 20 10:07 ../../../../miniconda/envs/scipion3/lib/libwebp.so.7 -> > libwebp.so.7.1.1 > lrwxrwxrwx. 1 user user > 16 May 20 10:07 ../../../../miniconda/envs/scipion3/lib/libwebp.so -> > libwebp.so.7.1.1 > > Best > > Christian > > > >>> Grigory Sharov <sha...@gm...> 20.05.21 19.54 Uhr >>> > Hi Christian, > > I guess xmipp script still cannot recognize libtiff that it has > installed. > I can reproduce your problem on my machine. 
But I have a system library > installed. > > So, the easiest solution is to install libtiff / libtiff-devel using > your > package manager (yum) > > Best regards, > Grigory > > > -------------------------------------------------------------------------------- > Grigory Sharov, Ph.D. > > MRC Laboratory of Molecular Biology, > Francis Crick Avenue, > Cambridge Biomedical Campus, > Cambridge CB2 0QH, UK. > tel. +44 (0) 1223 267228 <+44%201223%20267228> > e-mail: gs...@mr... > > > On Thu, May 20, 2021 at 6:34 PM Christian Tüting < > chr...@bi...> wrote: > > > Hi, > > > > it fails with similar errors like in the automatic installation: > > > > > > (scipion3) [user@dataanalysisserver1 scipion3]$ ldd > > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so > > linux-vdso.so.1 => (0x00007ffe872c3000) > > libwebp.so.7 => > > /home/user/Data/Software/miniconda/envs/scipion3/lib/./libwebp.so.7 > > (0x00007f8f7253a000) > > libzstd.so.1 => > > /home/user/Data/Software/miniconda/envs/scipion3/lib/./libzstd.so.1 > > (0x00007f8f7246e000) > > liblzma.so.5 => > > /home/user/Data/Software/miniconda/envs/scipion3/lib/./liblzma.so.5 > > (0x00007f8f723fe000) > > libjpeg.so.9 => > > /home/user/Data/Software/miniconda/envs/scipion3/lib/./libjpeg.so.9 > > (0x00007f8f721c2000) > > libz.so.1 => > > /home/user/Data/Software/miniconda/envs/scipion3/lib/./libz.so.1 > > (0x00007f8f7244d000) > > libm.so.6 => /lib64/libm.so.6 (0x00007f8f71ea4000) > > libc.so.6 => /lib64/libc.so.6 (0x00007f8f71ad6000) > > libpthread.so.0 => /lib64/libpthread.so.0 (0x00007f8f718ba000) > > librt.so.1 => /lib64/librt.so.1 (0x00007f8f716b2000) > > /lib64/ld-linux-x86-64.so.2 (0x00007f8f72427000) > > > > (scipion3) [user@dataanalysisserver1 xmippSrc-v3.20.07]$ ./xmipp > config > > Configuring ----------------------------------------- > > gcc detected > > 'libtiff' not found in the system > > 'libtiff' dependency not found. Do you want to install it using conda? > > [YES/no] Y > > Trying to install libtiff with conda > > conda activate scipion3 ; conda install libtiff -y -c defaults > > CommandNotFoundError: Your shell has not been properly configured to > use > > 'conda activate'. > > To initialize your shell, run > > $ conda init <SHELL_NAME> > > Currently supported shells are: > > - bash > > - fish > > - tcsh > > - x> IMPORTANT: You may need to close and restart your shell after > running > > 'conda init'. > > Collecting package metadata (current_repodata.json): ...working... > done > > Solving environment: ...working... done > > # All requested packages already installed. > > 'libtiff' installed in conda environ 'scipion3'. > > g++ -c -w -mtune=native -march=native -std=c++11 -O3 > > xmipp_test_opencv.cpp -o xmipp_test_opencv.o -I..> 'mpirun' and > 'mpiexec' not found in the PATH > > Alternative found at '/usr/lib64/openmpi/bin'. > > Please, press [return] to use it or type a path where to locate it: > > -> /usr/lib64/openmpi/bin > > 'mpicc' not found in the PATH > > Alternative found at '/usr/lib64/openmpi/bin'. > > Please, press [return] to use it or type a path where to locate it: > > -> /usr/lib64/openmpi/bin > > 'mpicxx' not found in the PATH > > Alternative found at '/usr/lib64/openmpi/bin'. > > Please, press [return] to use it or type a path where to locate it: > > -> /usr/lib64/openmpi/bin > > Java detected at: /home/user/Data/Software/miniconda/envs/scipion3 > > 'nvcc' not found in the PATH (either in CUDA_BIN/XMIPP_CUDA_BIN) > > Alternative found at '/usr/local/cuda/bin'. 
> > Please, press [return] to use it or type a path where to locate it: > > -> /usr/local/cuda/bin > > CUDA-10.1.243 detected. > > CUDA-8.0 is recommended. > > Using '/usr/local/cuda-10.1/targets/x86_64-linux/lib'. > > Configuration completed..... > > (scipion3) [user@dataanalysisserver1 xmippSrc-v3.20.07]$ ./xmipp > config > > Configuring ----------------------------------------- > > gcc detected > > 'libtiff' not found in the system > > 'libtiff' dependency not found. Do you want to install it using conda? > > [YES/no] > > Trying to install libtiff with conda > > conda activate scipion3 ; conda install libtiff -y -c defaults > > CommandNotFoundError: Your shell has not been properly configured to > use > > 'conda activate'. > > To initialize your shell, run > > $ conda init <SHELL_NAME> > > Currently supported shells are: > > - bash > > - fish > > - tcsh > > - xonsh > > - zsh > > - powershell > > See 'conda init --help' for more information and options. > > IMPORTANT: You may need to close and restart your shell after running > > 'conda init'. > > Collecting package metadata (current_repodata.json): ...working... > done > > Solving environment: ...working... done > > # All requested packages already installed. > > 'libtiff' installed in conda exmipp_test_opencv.cpp -o > xmipp_test_opencv.o > > -I../ > > -I/home/user/Data/Software/miniconda/envs/scipion3/include > > OpenCV not found > > rm -v xmipp_test_opencv* > > 'mpirun' and 'mpiexec' not found in the PATH > > Alternative found at '/usr/lib64/openmpi/bin'. > > Please, press [return] to use it or type a path where to locate it: > > -> /usr/lib64/openmpi/bin > > 'mpicc' not found in the PATH > > Alternative found at '/usr/lib64/openmpi/bin'. > > Please, press [return] to use it or type a path where to locate it: > > -> /usr/lib64/openmpi/bin > > 'mpicxx' not found in the PATH > > Alternative found at '/usr/lib64/openmpi/bin'. > > Please, press [return] to use it or type a path where to locate it: > > -> /usr/lib64/openmpi/bin > > Java detected at: /home/user/Data/Software/miniconda/envs/scipion3 > > 'nvcc' not found in the PATH (either in CUDA_BIN/XMIPP_CUDA_BIN) > > Alternative found at '/usr/local/cuda/bin'. > > Please, press [return] to use it or type a path where to locate it: > > -> /usr/local/cuda/bin > > CUDA-10.1.243 detected. > > CUDA-8.0 is recommended. > > Using '/usr/local/cuda-10.1/targets/x86_64-linux/lib'. > > Configuration completed..... > > > > (scipion3) [user@dataanalysisserver1 xmippSrc-v3.20.07]$ ./xmipp > > check_config > > Checking configuration ------------------------------ > > Checking compiler configuration ... 
> > g++ 4.8.5 detected > > g++ -c -w -mtune=native -march=native -std=c++11 -O3 > xmipp_test_main.cpp > > -o xmipp_test_main.o -I../ > > -I/home/user/Data/Software/miniconda/envs/scipion3/include > > -I/home/user/Data/Software/miniconda/envs/scipion3/include/python3.8 > > > > > -I/home/user/Data/Software/miniconda/envs/scipion3/lib/python3.8/site-packages/numpy/c> > xmipp_test_main.o -o xmipp_test_main -lfftw3 -lfftw3_threads -lhdf5 > > -lhdf5_cpp -ltiff -ljpeg -lsqlite3 -lpthread > > /usr/bin/ld: warning: libwebp.so.7, needed by > > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so, not > > found (try using -rpath or -rpath-link) > > /home/user/Data/Software/miniconda/envs/scipion3/lib/lib> > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > > undefined reference to `WebPInitDecBufferInternal' > > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > > undefined reference to `WebPPictureFree' > > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > > undefined reference to `WebPIAppend' > > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > > undefined reference to `WebPIDecGetRGB' > > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > > undefined reference to `WebPINewDecoder' > > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > > undefined reference to `WebPPictureImportRGBA' > > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > > undefined reference to `WebPConfigInitInternal' > > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > > undefined reference to `WebPEncode' > > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > > undefined reference to `WebPValidateConfig' > > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > > undefined reference to `WebPPictureInitInternal' > > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > > undefined reference to `WebPFreeDecBuffer' > > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > > undefined reference to `WebPIDelete' > > collect2: error: ld returned 1 exit status > > Check the LINKERFORPROGRAMS, LINKFLAGS and LIBDIRFLAGS > > Cannot compile > > Possible solutions > > In Ubuntu: sudo apt-get -y install libsqlite3-dev libfftw3-dev > > libhdf5-dev libopencv-dev python3-dev python3-numpy python3-scipy > > python3-mpi4py > > In Manjaro: sudo pacman -Syu install hdf5 python3-numpy python3-scipy > > --noconfirm > > Please, see > > ' > > > > https://scipion-em.github.io/docs/docs/scipion-modes/install-from-sources.html#step-2-dependencies > > ' > > for more information about libraries dependencies. > > Remember to re-run './xmipp config' after installing libraries in > order > > to take into account the new system configuration. > > rm xmipp_test_main* > > Check failed! Trying to install libtiff with conda > > conda activate scipion3 ; conda install libtiff -y -c defaults > > CommandNotFoundError: Your shell has not been properly configured to > use > > 'conda activate'. > > > > Because I can run it manually without errors: > > > > (scipion3) [user@dataanalysisserver1 xmippSrc-v3.20.07]$ conda > activate > > scipion3 ; conda install libtiff -y -c defaults > > Collecting package metadata (current_repodata.json): done > > Solving environment: done > > # All requested packages already installed. > > > > > > And also, thanks already in advance. I really appreciate your help. 
> > > > Best Christian > > > > > > > > >>> Grigory Sharov <sha...@gm...> 20.05.21 18.12 Uhr >>> > > Hi, > > > > Let's try manual installation: > > > > conda activate scipion3 > > > ldd /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so > -> > > > check output for errors > > > export SCIPION_HOME="/home/user/Data/Software/scipion3/" > > > cd /home/user/Data/Software/scipion3/software/em/xmippSrc-v3.20.07 > > > ./xmipp config -> check if all is correct in xmipp.conf > > > ./xmipp check_config -> check for errors > > > ./xmipp compileAndInstall N=4 && ln -srfn build > > > /home/user/Data/Software/scipion3/software/em/xmipp && cd - && touch > > > installation_finished && rm bindings_linked 2> /dev/null > > > > > > You can post output errors from these commands here. > > > > Best regards, > > Grigory > > > > > > > > -------------------------------------------------------------------------------- > > Grigory Sharov, Ph.D. > > > > MRC Laboratory of Molecular Biology, > > Francis Crick Avenue, > > Cambridge Biomedical Campus, > > Cambridge CB2 0QH, UK. > > tel. +44 (0) 1223 267228 <+44%201223%20267228> > > e-mail: gsharov@mrc-lmb.> chr...@bi...> > wrote: > > > > > Hi all, > > > > > > Dmitrys collegue here. > > > > > > yes we have the library in the scipion3 env: > > > > > > > > > [user@dataanalysisserver1 lib]$ ls -d $PWD/libwebp.so.7 > > > > > > > > > /home/user/Data/Software> > [92m conda activate scipion3 ; conda > install fftw -y -c defaults[0m > > > CommandNotFoundError: Your shell has not been properly configured to > > use > > > 'conda activate'. > > > > > > When I am coping this command, it runs without any issue. > > > > > > > > > Thanks for your help. > > > > > > best > > > Christian > > > > > > > > > > > > >>> Grigory Sharov <sha...@gm...> 20.05.21 16.39 Uhr >>> > > > Hi, > > > > > > do you have libwebp.so.7 in > > > /home/user/Data/Software/miniconda/envs/scipion3/lib/ ? > > > > > > Best regards, > > > Grigory > > > > > > > > > > > > > > > -------------------------------------------------------------------------------- > > > Grigory Sharov, Ph.D. > > > > > > MRC Laboratory of Molecular Biology, > > > Francis Crick Avenue, > > > Cambridge Biomedical Campus, > > > Cambridge CB2 0QH, UK. > > > tel. +44 (0) 1223 267228 <+44%201223%20267228> > > > e-mail: gs...@mr... > > > > > > > > > On Thu, May 20, 2021 at 3:00 PM Dmitry Semchonok > <Sem...@gm...> > > > wrote: > > > > > > > Dear colleagues, > > > > > > > > We reinstalled the centos 7 on our server as well as miniconda. > > > > > > > > > > > > > > > > The core was installed correctly but during xmipp installation > some > > error > > > > appeared. Please have a look on the file attached. > > > > > > > > > > > > > > > > Could you please advice how to proceed? > > > > > > > > > > > > > > > > > > > > > > > > > > > > Thank you in advance. > > > > > > > > > > > > Sincerely, > > > > Dmitry and > Christian_______________________________________________ > > > > scipion-users mailing list > > > > sci...@li... > > > > https://lists.sourceforge.net/lists/listinfo/scipion-users > > > > > > > > > > > > > > > > _______________________________________________ > > > scipion-users mailing list > > > sci...@li... > > > https://lists.sourceforge.net/lists/listinfo/scipion-users > > > > > > > > > > > _______________________________________________ > > scipion-users mailing list > > sci...@li... 
> > https://lists.sourceforge.net/lists/listinfo/scipion-users |
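[Note on the repeated "CommandNotFoundError: Your shell has not been properly configured to use 'conda activate'" messages in the quoted logs: they appear because the shell spawned by the xmipp installer has never loaded conda's activation hook. A hedged workaround sketch, assuming the miniconda prefix from the logs above, is to source the hook manually and repeat the install the script attempted:

  source /home/user/Data/Software/miniconda/etc/profile.d/conda.sh   # conda's standard shell hook; prefix is assumed
  conda activate scipion3
  conda install -y -c defaults libtiff                               # same command the installer tried to run itself
]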
From: Grigory S. <sha...@gm...> - 2021-05-20 18:55:01
|
I'm running out of options.. Can you try to downgrade libtiff inside scipion conda environment? (conda install libtiff=4.1.0) Best regards, Grigory -------------------------------------------------------------------------------- Grigory Sharov, Ph.D. MRC Laboratory of Molecular Biology, Francis Crick Avenue, Cambridge Biomedical Campus, Cambridge CB2 0QH, UK. tel. +44 (0) 1223 267228 <+44%201223%20267228> e-mail: gs...@mr... On Thu, May 20, 2021 at 7:19 PM Christian Tüting < chr...@bi...> wrote: > Hi, > > I tried yum install libtiff libtiff-devel before, but yum says, "nothing > to do". But, yum install libtiff* helped to install the missing packages > somehow. > > So the error disappers from ./xmipp_config. There are some path missing > in PATH, but alternatives are found so I guess this is fine (see output > below). > > But ./xmipp check_config still fails. > > (scipion3) [user@dataanalysisserver1 xmippSrc-v3.20.07]$ ./xmipp config > Configuring ----------------------------------------- > gcc detected > g++ -c -w -mtune=native -march=native -std=c++11 -O3 > xmipp_test_opencv.cpp -o xmipp_test_opencv.o -I../ > -I/home/user/Data/Software/miniconda/envs/scipion3/include > OpenCV not found > rm -v xmipp_test_opencv* > 'mpirun' and 'mpiexec' not found in the PATH > Alternative found at '/usr/lib64/openmpi/bin'. > Please, press [return] to use it or type a path where to locate it: > -> /usr/lib64/openmpi/bin > 'mpicc' not found in the PATH > Alternative found at '/usr/lib64/openmpi/bin'. > Please, press [return] to use it or type a path where to locate it: > -> /usr/lib64/openmpi/bin > 'mpicxx' not found in the PATH > Alternative found at '/usr/lib64/openmpi/bin'. > Please, press [return] to use it or type a path where to locate it: > -> /usr/lib64/openmpi/bin > Java detected at: /home/user/Data/Software/miniconda/envs/scipion3 > 'nvcc' not found in the PATH (either in CUDA_BIN/XMIPP_CUDA_BIN) > Alternative found at '/usr/local/cuda/bin'. > Please, press [return] to use it or type a path where to locate it: > -> /usr/local/cuda/bin > CUDA-10.1.243 detected. > CUDA-8.0 is recommended. > Using '/usr/local/cuda-10.1/targets/x86_64-linux/lib'. > Configuration completed..... > > (scipion3) [user@dataanalysisserver1 xmippSrc-v3.20.07]$ ./xmipp > check_config > Checking configuration ------------------------------ > Checking compiler configuration ... 
> g++ 4.8.5 detected > g++ -c -w -mtune=native -march=native -std=c++11 -O3 xmipp_test_main.cpp > -o xmipp_test_main.o -I../ > -I/home/user/Data/Software/miniconda/envs/scipion3/include > -I/home/user/Data/Software/miniconda/envs/scipion3/include/python3.8 > > -I/home/user/Data/Software/miniconda/envs/scipion3/lib/python3.8/site-packages/numpy/core/include > g++ -L/home/user/Data/Software/miniconda/envs/scipion3/lib > xmipp_test_main.o -o xmipp_test_main -lfftw3 -lfftw3_threads -lhdf5 > -lhdf5_cpp -ltiff -ljpeg -lsqlite3 -lpthread > /usr/bin/ld: warning: libwebp.so.7, needed by > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so, not > found (try using -rpath or -rpath-link) > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > undefined reference to `WebPPictureImportRGB' > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > undefined reference to `WebPInitDecBufferInternal' > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > undefined reference to `WebPPictureFree' > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > undefined reference to `WebPIAppend' > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > undefined reference to `WebPIDecGetRGB' > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > undefined reference to `WebPINewDecoder' > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > undefined reference to `WebPPictureImportRGBA' > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > undefined reference to `WebPConfigInitInternal' > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > undefined reference to `WebPEncode' > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > undefined reference to `WebPValidateConfig' > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > undefined reference to `WebPPictureInitInternal' > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > undefined reference to `WebPFreeDecBuffer' > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > undefined reference to `WebPIDelete' > collect2: error: ld returned 1 exit status > Check the LINKERFORPROGRAMS, LINKFLAGS and LIBDIRFLAGS > Cannot compile > Possible solutiolibhdf5-dev libopencv-dev python3-dev python3-numpy > python3-scipy > python3-mpi4py > In Manjaro: sudo pacman -Syu install hdf5 python3-numpy python3-scipy > --noconfirm > Please, see > ' > https://scipion-em.github.io/docs/docs/scipion-modes/install-from-sources.html#step-2-dependencies > ' > for more information about libraries dependencies. > Remember to re-run './xmipp config' after installing libraries in order > to take into account the new system configuration. > rm xmipp_test_main* > Check failed! Something wrong with the configuration. > > > It's still looking for libwebp.so.7, but this is present: > > (scipion3) [user@dataanalysisserver1 xmippSrc-v3.20.07]$ ls -ltr > ../../../../miniconda/envs/scipion3/lib/libwebp.so* > -rwxrwxrwx. 2 user user 598608 Feb 1 16:15 > ../../../../miniconda/envs/scipion3/lib/libwebp.so.7.1.1 > lrwxrwxrwx. 1 user user > 16 May 20 10:07 ../../../../miniconda/envs/scipion3/lib/libwebp.so.7 -> > libwebp.so.7.1.1 > lrwxrwxrwx. 1 user user > 16 May 20 10:07 ../../../../miniconda/envs/scipion3/lib/libwebp.so -> > libwebp.so.7.1.1 > > Best > > Christian > > > >>> Grigory Sharov <sha...@gm...> 20.05.21 19.54 Uhr >>> > Hi Christian, > > I guess xmipp script still cannot recognize libtiff that it has > installed. 
> I can reproduce your problem on my machine. But I have a system library > installed. > > So, the easiest solution is to install libtiff / libtiff-devel using > your > package manager (yum) > > Best regards, > Grigory > > > -------------------------------------------------------------------------------- > Grigory Sharov, Ph.D. > > MRC Laboratory of Molecular Biology, > Francis Crick Avenue, > Cambridge Biomedical Campus, > Cambridge CB2 0QH, UK. > tel. +44 (0) 1223 267228 <+44%201223%20267228> > e-mail: gs...@mr... > > > On Thu, May 20, 2021 at 6:34 PM Christian Tüting < > chr...@bi...> wrote: > > > Hi, > > > > it fails with similar errors like in the automatic installation: > > > > > > (scipion3) [user@dataanalysisserver1 scipion3]$ ldd > > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so > > linux-vdso.so.1 => (0x00007ffe872c3000) > > libwebp.so.7 => > > /home/user/Data/Software/miniconda/envs/scipion3/lib/./libwebp.so.7 > > (0x00007f8f7253a000) > > libzstd.so.1 => > > /home/user/Data/Software/miniconda/envs/scipion3/lib/./libzstd.so.1 > > (0x00007f8f7246e000) > > liblzma.so.5 => > > /home/user/Data/Software/miniconda/envs/scipion3/lib/./liblzma.so.5 > > (0x00007f8f723fe000) > > libjpeg.so.9 => > > /home/user/Data/Software/miniconda/envs/scipion3/lib/./libjpeg.so.9 > > (0x00007f8f721c2000) > > libz.so.1 => > > /home/user/Data/Software/miniconda/envs/scipion3/lib/./libz.so.1 > > (0x00007f8f7244d000) > > libm.so.6 => /lib64/libm.so.6 (0x00007f8f71ea4000) > > libc.so.6 => /lib64/libc.so.6 (0x00007f8f71ad6000) > > libpthread.so.0 => /lib64/libpthread.so.0 (0x00007f8f718ba000) > > librt.so.1 => /lib64/librt.so.1 (0x00007f8f716b2000) > > /lib64/ld-linux-x86-64.so.2 (0x00007f8f72427000) > > > > (scipion3) [user@dataanalysisserver1 xmippSrc-v3.20.07]$ ./xmipp > config > > Configuring ----------------------------------------- > > gcc detected > > 'libtiff' not found in the system > > 'libtiff' dependency not found. Do you want to install it using conda? > > [YES/no] Y > > Trying to install libtiff with conda > > conda activate scipion3 ; conda install libtiff -y -c defaults > > CommandNotFoundError: Your shell has not been properly configured to > use > > 'conda activate'. > > To initialize your shell, run > > $ conda init <SHELL_NAME> > > Currently supported shells are: > > - bash > > - fish > > - tcsh > > - xonsh > > - zsh > > - powershell > > See 'conda init --help' for more information and options. > > IMPORTANT: You may need to close and restart your shell after running > > 'conda init'. > > Collecting package metadata (current_repodata.json): ...working... > done > > Solving environment: ...working... done > > # All requested packages already installed. > > 'libtiff' installed in conda environ 'scipion3'. > > g++ -c -w -mtune=native -march=native -std=c++11 -O3 > > xmipp_test_opencv.cpp -o xmipp_test_opencv.o -I..> 'mpirun' and > 'mpiexec' not found in the PATH > > Alternative found at '/usr/lib64/openmpi/bin'. > > Please, press [return] to use it or type a path where to locate it: > > -> /usr/lib64/openmpi/bin > > 'mpicc' not found in the PATH > > Alternative found at '/usr/lib64/openmpi/bin'. > > Please, press [return] to use it or type a path where to locate it: > > -> /usr/lib64/openmpi/bin > > 'mpicxx' not found in the PATH > > Alternative found at '/usr/lib64/openmpi/bin'. 
> > Please, press [return] to use it or type a path where to locate it: > > -> /usr/lib64/openmpi/bin > > Java detected at: /home/user/Data/Software/miniconda/envs/scipion3 > > 'nvcc' not found in the PATH (either in CUDA_BIN/XMIPP_CUDA_BIN) > > Alternative found at '/usr/local/cuda/bin'. > > Please, press [return] to use it or type a path where to locate it: > > -> /usr/local/cuda/bin > > CUDA-10.1.243 detected. > > CUDA-8.0 is recommended. > > Using '/usr/local/cuda-10.1/targets/x86_64-linux/lib'. > > Configuration completed..... > > (scipion3) [user@dataanalysisserver1 xmippSrc-v3.20.07]$ ./xmipp > config > > Configuring ----------------------------------------- > > gcc detected > > 'libtiff' not found in the system > > 'libtiff' dependency not found. Do you want to install it using conda? > > [YES/no] > > Trying to install libtiff with conda > > conda activate scipion3 ; conda install libtiff -y -c defaults > > CommandNotFoundError: Your shell has not been properly configured to > use > > 'conda activate'. > > To initialize your shell, run > > $ conda init <SHELL_NAME> > > Currently supported shells are: > > - bash > > - fish > > - tcsh > > - xonsh > > - zsh > > - powershell > > See 'conda init --help' for more information and options. > > IMPORTANT: You may need to close and restart your shell after running > > 'conda init'. > > Collecting package metadata (current_repodata.json): ...working... > done > > Solving environment: ...working... done > > # All requested packages already installed. > > 'libtiff' installed in conda exmipp_test_opencv.cpp -o > xmipp_test_opencv.o > > -I../ > > -I/home/user/Data/Software/miniconda/envs/scipion3/include > > OpenCV not found > > rm -v xmipp_test_opencv* > > 'mpirun' and 'mpiexec' not found in the PATH > > Alternative found at '/usr/lib64/openmpi/bin'. > > Please, press [return] to use it or type a path where to locate it: > > -> /usr/lib64/openmpi/bin > > 'mpicc' not found in the PATH > > Alternative found at '/usr/lib64/openmpi/bin'. > > Please, press [return] to use it or type a path where to locate it: > > -> /usr/lib64/openmpi/bin > > 'mpicxx' not found in the PATH > > Alternative found at '/usr/lib64/openmpi/bin'. > > Please, press [return] to use it or type a path where to locate it: > > -> /usr/lib64/openmpi/bin > > Java detected at: /home/user/Data/Software/miniconda/envs/scipion3 > > 'nvcc' not found in the PATH (either in CUDA_BIN/XMIPP_CUDA_BIN) > > Alternative found at '/usr/local/cuda/bin'. > > Please, press [return] to use it or type a path where to locate it: > > -> /usr/local/cuda/bin > > CUDA-10.1.243 detected. > > CUDA-8.0 is recommended. > > Using '/usr/local/cuda-10.1/targets/x86_64-linux/lib'. > > Configuration completed..... > > > > (scipion3) [user@dataanalysisserver1 xmippSrc-v3.20.07]$ ./xmipp > > check_config > > Checking configuration ------------------------------ > > Checking compiler configuration ... 
> > g++ 4.8.5 detected > > g++ -c -w -mtune=native -march=native -std=c++11 -O3 > xmipp_test_main.cpp > > -o xmipp_test_main.o -I../ > > -I/home/user/Data/Software/miniconda/envs/scipion3/include > > -I/home/user/Data/Software/miniconda/envs/scipion3/include/python3.8 > > > > > > -I/home/user/Data/Software/miniconda/envs/scipion3/lib/python3.8/site-packages/numpy/core/include > > g++ -L/home/user/Data/Software/miniconda/envs/scipion3/lib > > xmipp_test_main.o -o xmipp_test_main -lfftw3 -lfftw3_threads -lhdf5 > > -lhdf5_cpp -ltiff -ljpeg -lsqlite3 -lpthread > > /usr/bin/ld: warning: libwebp.so.7, needed by > > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so, not > > found (try using -rpath or -rpath-link) > > /home/user/Data/Software/miniconda/envs/scipion3/lib/lib> > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > > undefined reference to `WebPInitDecBufferInternal' > > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > > undefined reference to `WebPPictureFree' > > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > > undefined reference to `WebPIAppend' > > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > > undefined reference to `WebPIDecGetRGB' > > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > > undefined reference to `WebPINewDecoder' > > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > > undefined reference to `WebPPictureImportRGBA' > > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > > undefined reference to `WebPConfigInitInternal' > > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > > undefined reference to `WebPEncode' > > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > > undefined reference to `WebPValidateConfig' > > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > > undefined reference to `WebPPictureInitInternal' > > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > > undefined reference to `WebPFreeDecBuffer' > > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > > undefined reference to `WebPIDelete' > > collect2: error: ld returned 1 exit status > > Check the LINKERFORPROGRAMS, LINKFLAGS and LIBDIRFLAGS > > Cannot compile > > Possible solutions > > In Ubuntu: sudo apt-get -y install libsqlite3-dev libfftw3-dev > > libhdf5-dev libopencv-dev python3-dev python3-numpy python3-scipy > > python3-mpi4py > > In Manjaro: sudo pacman -Syu install hdf5 python3-numpy python3-scipy > > --noconfirm > > Please, see > > ' > > > > https://scipion-em.github.io/docs/docs/scipion-modes/install-from-sources.html#step-2-dependencies > > ' > > for more information about libraries dependencies. > > Remember to re-run './xmipp config' after installing libraries in > order > > to take into account the new system configuration. > > rm xmipp_test_main* > > Check failed! Trying to install libtiff with conda > > conda activate scipion3 ; conda install libtiff -y -c defaults > > CommandNotFoundError: Your shell has not been properly configured to > use > > 'conda activate'. > > > > Because I can run it manually without errors: > > > > (scipion3) [user@dataanalysisserver1 xmippSrc-v3.20.07]$ conda > activate > > scipion3 ; conda install libtiff -y -c defaults > > Collecting package metadata (current_repodata.json): done > > Solving environment: done > > # All requested packages already installed. > > > > > > And also, thanks already in advance. I really appreciate your help. 
> > > > Best Christian > > > > > > > > >>> Grigory Sharov <sha...@gm...> 20.05.21 18.12 Uhr >>> > > Hi, > > > > Let's try manual installation: > > > > conda activate scipion3 > > > ldd /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so > -> > > > check output for errors > > > export SCIPION_HOME="/home/user/Data/Software/scipion3/" > > > cd /home/user/Data/Software/scipion3/software/em/xmippSrc-v3.20.07 > > > ./xmipp config -> check if all is correct in xmipp.conf > > > ./xmipp check_config -> check for errors > > > ./xmipp compileAndInstall N=4 && ln -srfn build > > > /home/user/Data/Software/scipion3/software/em/xmipp && cd - && touch > > > installation_finished && rm bindings_linked 2> /dev/null > > > > > > You can post output errors from these commands here. > > > > Best regards, > > Grigory > > > > > > > > -------------------------------------------------------------------------------- > > Grigory Sharov, Ph.D. > > > > MRC Laboratory of Molecular Biology, > > Francis Crick Avenue, > > Cambridge Biomedical Campus, > > Cambridge CB2 0QH, UK. > > tel. +44 (0) 1223 267228 <+44%201223%20267228> > > e-mail: gs...@mr... > > > > > > On Thu, May 20, 2021 at 4:49 PM Christian Tüting < > > chr...@bi...> wrote: > > > > > Hi all, > > > > > > Dmitrys collegue here. > > > > > > yes we have the library in the scipion3 env: > > > > > > > > > [user@dataanalysisserver1 lib]$ ls -d $PWD/libwebp.so.7 > > > > > > > > > /home/user/Data/Software> > [92m conda activate scipion3 ; conda > install fftw -y -c defaults[0m > > > CommandNotFoundError: Your shell has not been properly configured to > > use > > > 'conda activate'. > > > > > > When I am coping this command, it runs without any issue. > > > > > > > > > Thanks for your help. > > > > > > best > > > Christian > > > > > > > > > > > > >>> Grigory Sharov <sha...@gm...> 20.05.21 16.39 Uhr >>> > > > Hi, > > > > > > do you have libwebp.so.7 in > > > /home/user/Data/Software/miniconda/envs/scipion3/lib/ ? > > > > > > Best regards, > > > Grigory > > > > > > > > > > > > > > > -------------------------------------------------------------------------------- > > > Grigory Sharov, Ph.D. > > > > > > MRC Laboratory of Molecular Biology, > > > Francis Crick Avenue, > > > Cambridge Biomedical Campus, > > > Cambridge CB2 0QH, UK. > > > tel. +44 (0) 1223 267228 <+44%201223%20267228> > > > e-mail: gs...@mr... > > > > > > > > > On Thu, May 20, 2021 at 3:00 PM Dmitry Semchonok > <Sem...@gm...> > > > wrote: > > > > > > > Dear colleagues, > > > > > > > > We reinstalled the centos 7 on our server as well as miniconda. > > > > > > > > > > > > > > > > The core was installed correctly but during xmipp installation > some > > error > > > > appeared. Please have a look on the file attached. > > > > > > > > > > > > > > > > Could you please advice how to proceed? > > > > > > > > > > > > > > > > > > > > > > > > > > > > Thank you in advance. > > > > > > > > > > > > Sincerely, > > > > Dmitry and > Christian_______________________________________________ > > > > scipion-users mailing list > > > > sci...@li... > > > > https://lists.sourceforge.net/lists/listinfo/scipion-users > > > > > > > > > > > > > > > > _______________________________________________ > > > scipion-users mailing list > > > sci...@li... > > > https://lists.sourceforge.net/lists/listinfo/scipion-users > > > > > > > > > > > _______________________________________________ > > scipion-users mailing list > > sci...@li... 
> > https://lists.sourceforge.net/lists/listinfo/scipion-users |
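[Note on the downgrade suggested in the message above: a short sketch of pinning the version inside the already activated scipion3 environment (see the activation note earlier). Whether the 4.1.0 build actually drops the libwebp dependency is an assumption to be verified with ldd, not a guarantee:

  conda install -y -c defaults libtiff=4.1.0
  ldd "$CONDA_PREFIX/lib/libtiff.so" | grep -i webp || echo "libtiff no longer pulls in libwebp"
]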
From: Christian T. <chr...@bi...> - 2021-05-20 18:54:21
|
I (temporarily) fixed the error with: export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/home/user/Data/Software/miniconda/envs/scipion3/lib/ xmipp is currently compiling. I am really no expert in server maintenance, so I am not sure what the implications of this are. Afaik, this is only a fix for the current shell, so after reboot, this fix should be gone. So if xmipp needs this library, I guess I have to add this export command to ~/.bashrc, right? Best Christian >>> Christian Tüting <chr...@bi...> 20.05.21 20.20 Uhr >>> Hi, I tried yum install libtiff libtiff-devel before, but yum says, "nothing to do". But yum install libtiff* helped to install the missing packages somehow. So the error disappears from ./xmipp config. There are some paths missing from PATH, but alternatives are found, so I guess this is fine (see output below). But ./xmipp check_config still fails. (scipion3) [user@dataanalysisserver1 xmippSrc-v3.20.07]$ ./xmipp config Configuring ----------------------------------------- gcc detected g++ -c -w -mtune=native -march=native -std=c++11 -O3 xmipp_test_opencv.cpp -o xmipp_test_opencv.o -I../ -I/home/user/Data/Software/miniconda/envs/scipion3/include OpenCV not found rm -v xmipp_test_opencv* 'mpirun' and 'mpiexec' not found in the PATH Alternative found at '/usr/lib64/openmpi/bin'. Please, press [return] to use it or type a path where to locate it: -> /usr/lib64/openmpi/bin 'mpicc' not found in the PATH Alternative found at '/usr/lib64/openmpi/bin'. Please, press [return] to use it or type a path where to locate it: -> /usr/lib64/openmpi/bin 'mpicxx' not found in the PATH Alternative found at '/usr/lib64/openmpi/bin'. Please, press [return] to use it or type a path where to locate it: -> /usr/lib64/openmpi/bin Java detected at: /home/user/Data/Software/miniconda/envs/scipion3 'nvcc' not found in the PATH (either in CUDA_BIN/XMIPP_CUDA_BIN) Alternative found at '/usr/local/cuda/bin'. Please, press [return] to use it or type a path where to locate it: -> /usr/local/cuda/bin CUDA-10.1.243 detected. CUDA-8.0 is recommended. Using '/usr/local/cuda-10.1/targets/x86_64-linux/lib'. Configuration completed..... (scipion3) [user@dataanalysisserver1 xmippSrc-v3.20.07]$ ./xmipp check_config Checking configuration ------------------------------ Checking compiler configuration ...
g++ 4.8.5 detected g++ -c -w -mtune=native -march=native -std=c++11 -O3 xmipp_test_main.cpp -o xmipp_test_main.o -I../ -I/home/user/Data/Software/miniconda/envs/scipion3/include -I/home/user/Data/Software/miniconda/envs/scipion3/include/python3.8 -I/home/user/Data/Software/miniconda/envs/scipion3/lib/python3.8/site-packages/numpy/core/include g++ -L/home/user/Data/Software/miniconda/envs/scipion3/lib xmipp_test_main.o -o xmipp_test_main -lfftw3 -lfftw3_threads -lhdf5 -lhdf5_cpp -ltiff -ljpeg -lsqlite3 -lpthread /usr/bin/ld: warning: libwebp.so.7, needed by /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so, not found (try using -rpath or -rpath-link) /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: undefined reference to `WebPPictureImportRGB' /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: undefined reference to `WebPInitDecBufferInternal' /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: undefined reference to `WebPPictureFree' /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: undefined reference to `WebPIAppend' /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: undefined reference to `WebPIDecGetRGB' /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: undefined reference to `WebPINewDecoder' /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: undefined reference to `WebPPictureImportRGBA' /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: undefined reference to `WebPConfigInitInternal' /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: undefined reference to `WebPEncoundefined reference to `WebPValidateConfig' /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: undefined reference to `WebPPictureInitInternal' /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: undefined reference to `WebPFreeDecBuffer' /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: undefined reference to `WebPIDelete' collect2: error: ld returned 1 exit status Check the LINKERFORPROGRAMS, LINKFLAGS and LIBDIRFLAGS Cannot compile Possible solutiolibhdf5-dev libopencv-dev python3-dev python3-numpy python3-scipy python3-mpi4py In Manjaro: sudo pacman -Syu install hdf5 python3-numpy python3-scipy --noconfirm Please, see 'https://scipion-em.github.io/docs/docs/scipion-modes/install-from-sources.html#step-2-dependencies' for more information about libraries dependencies. Remember to re-run './xmipp config' after installing libraries in order to take into account the new system configuration. rm xmipp_test_main* Check failed! Something wrong with the configuration. It's still looking for libwebp.so.7, but this is present: (scipion3) [user@dataanalysisserver1 xmippSrc-v3.20.07]$ ls -ltr ../../../../miniconda/envs/scipion3/lib/libwebp.so* -rwxrwxrwx. 2 user user 598608 Feb 1 16:15 ../../../../miniconda/envs/scipion3/lib/libwebp.so.7.1.1 lrwxrwxrwx. 1 user user 16 May 20 10:07 ../../../../miniconda/envs/scipion3/lib/libwebp.so.7 -> libwebp.so.7.1.1 lrwxrwxrwx. 1 user user 16 May 20 10:07 ../../../../miniconda/envs/scipion3/lib/libwebp.so -> libwebp.so.7.1.1 Best Christian >>> Grigory Sharov <sha...@gm...> 20.05.21 19.54 Uhr >>> Hi Christian, I guess xmipp script still cannot recognize libtiff that it has installed. I can reproduce your problem on my machine. But I have a system library installed. 
So, the easiest solution is to install libtiff / libtiff-devel using your package manager (yum) Best regards, Grigory -------------------------------------------------------------------------------- Grigory Sharov, Ph.D. MRC Laboratory of Molecular Biology, Francis Crick Avenue, Cambridge Biomedical Campus, Cambridge CB2 0QH, UK. tel. +44 (0) 1223 267228 <+44%201223%20267228> e-mail: gs...@mr... On Thu, May 20, 2021 at 6:34 PM Christian Tüting < chr...@bi...> wrote: > Hi, > > it fails with similar errors like in the automatic installation: > > > (scipion3) [user@dataanalysisserver1 scipion3]$ ldd > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so > linux-vdso.so.1 => (0x00007ffe872c3000) > libwebp.so.7 => > /home/user/Data/Software/miniconda/envs/scipion3/lib/./libwebp.so.7 > (0x00007f8f7253a000) > libzstd.so.1 => > /home/user/Data/Software/miniconda/envs/scipion3/lib/./libzstd.so.1 > (0x00007f8f7246e000) > liblzma.so.5 => > /home/user/Data/Software/miniconda/envs/scipion3/lib/./liblzma.so.5 > (0x00007f8f723fe000) > libjpeg.so.9 => > /home/user/Data/Software/miniconda/envs/scipion3/lib/./libjpeg.so.9 > (0x00007f8f721c2000) > libz.so.1 => > /home/user/Data/Software/miniconda/envs/scipion3/lib/./libz.so.1 > (0x00007f8f7244d000) > libm.so.6 => /lib64/libm.so.6 (0x00007f8f71ea4000) > libc.so.6 => /lib64/libc.so.6 (0x00007f8f71ad6000) > libpthread.so.0 => /lib64/libpthread.so.0 (0x00007f8f718ba000) > librt.so.1 => /lib64/librt.so.1 (0x00007f8f716b2000) > /lib64/ld-linux-x86-64.so.2 (0x00007f8f72427000) > > (scipion3) [user@dataanalysisserver1 xmippSrc-v3.20.07]$ ./xmipp config > Configuring ----------------------------------------- > gcc detected > 'libtiff' not found in the system > 'libtiff' dependency not found. Do you want to install it using conda? > [YES/no] Y > Trying to install libtiff with conda > conda activate scipion3 ; conda install libtiff -y -c defaults > CommandNotFoundError: Your shell has not been properly configured to use > 'conda activate'. > To initialize your shell, run > $ conda init <SHELL_NAME> > Currently supported shells are: > - bash > - fish > - tcsh > - x> IMPORTANT: You may need to close and restart your shell after running > 'conda init'. > Collecting package metadata (current_repodata.json): ...working... done > Solving environment: ...working... done > # All requested packages already installed. > 'libtiff' installed in conda environ 'scipion3'. > g++ -c -w -mtune=native -march=native -std=c++11 -O3 > xmipp_test_opencv.cpp -o xmipp_test_opencv.o -I..> 'mpirun' and 'mpiexec' not found in the PATH > Alternative found at '/usr/lib64/openmpi/bin'. > Please, press [return] to use it or type a path where to locate it: > -> /usr/lib64/openmpi/bin > 'mpicc' not found in the PATH > Alternative found at '/usr/lib64/openmpi/bin'. > Please, press [return] to use it or type a path where to locate it: > -> /usr/lib64/openmpi/bin > 'mpicxx' not found in the PATH > Alternative found at '/usr/lib64/openmpi/bin'. > Please, press [return] to use it or type a path where to locate it: > -> /usr/lib64/openmpi/bin > Java detected at: /home/user/Data/Software/miniconda/envs/scipion3 > 'nvcc' not found in the PATH (either in CUDA_BIN/XMIPP_CUDA_BIN) > Alternative found at '/usr/local/cuda/bin'. > Please, press [return] to use it or type a path where to locate it: > -> /usr/local/cuda/bin > CUDA-10.1.243 detected. > CUDA-8.0 is recommended. > Using '/usr/local/cuda-10.1/targets/x86_64-linux/lib'. > Configuration completed..... 
> (scipion3) [user@dataanalysisserver1 xmippSrc-v3.20.07]$ ./xmipp config > Configuring ----------------------------------------- > gcc detected > 'libtiff' not found in the system > 'libtiff' dependency not found. Do you want to install it using conda? > [YES/no] > Trying to install libtiff with conda > conda activate scipion3 ; conda install libtiff -y -c defaults > CommandNotFoundError: Your shell has not been properly configured to use > 'conda activate'. > To initialize your shell, run > $ conda init <SHELL_NAME> > Currently supported shells are: > - bash > - fish > - tcsh > - xonsh > - zsh > - powershell > See 'conda init --help' for more information and options. > IMPORTANT: You may need to close and restart your shell after running > 'conda init'. > Collecting package metadata (current_repodata.json): ...working... done > Solving environment: ...working... done > # All requested packages already installed. > 'libtiff' installed in conda exmipp_test_opencv.cpp -o xmipp_test_opencv.o > -I../ > -I/home/user/Data/Software/miniconda/envs/scipion3/include > OpenCV not found > rm -v xmipp_test_opencv* > 'mpirun' and 'mpiexec' not found in the PATH > Alternative found at '/usr/lib64/openmpi/bin'. > Please, press [return] to use it or type a path where to locate it: > -> /usr/lib64/openmpi/bin > 'mpicc' not found in the PATH > Alternative found at '/usr/lib64/openmpi/bin'. > Please, press [return] to use it or type a path where to locate it: > -> /usr/lib64/openmpi/bin > 'mpicxx' not found in the PATH > Alternative found at '/usr/lib64/openmpi/bin'. > Please, press [return] to use it or type a path where to locate it: > -> /usr/lib64/openmpi/bin > Java detected at: /home/user/Data/Software/miniconda/envs/scipion3 > 'nvcc' not found in the PATH (either in CUDA_BIN/XMIPP_CUDA_BIN) > Alternative found at '/usr/local/cuda/bin'. > Please, press [return] to use it or type a path where to locate it: > -> /usr/local/cuda/bin > CUDA-10.1.243 detected. > CUDA-8.0 is recommended. > Using '/usr/local/cuda-10.1/targets/x86_64-linux/lib'. > Configuration completed..... > > (scipion3) [user@dataanalysisserver1 xmippSrc-v3.20.07]$ ./xmipp > check_config > Checking configuration ------------------------------ > Checking compiler configuration ... 
> g++ 4.8.5 detected > g++ -c -w -mtune=native -march=native -std=c++11 -O3 xmipp_test_main.cpp > -o xmipp_test_main.o -I../ > -I/home/user/Data/Software/miniconda/envs/scipion3/include > -I/home/user/Data/Software/miniconda/envs/scipion3/include/python3.8 > > -I/home/user/Data/Software/miniconda/envs/scipion3/lib/python3.8/site-packages/numpy/c> xmipp_test_main.o -o xmipp_test_main -lfftw3 -lfftw3_threads -lhdf5 > -lhdf5_cpp -ltiff -ljpeg -lsqlite3 -lpthread > /usr/bin/ld: warning: libwebp.so.7, needed by > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so, not > found (try using -rpath or -rpath-link) > /home/user/Data/Software/miniconda/envs/scipion3/lib/lib> /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > undefined reference to `WebPInitDecBufferInternal' > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > undefined reference to `WebPPictureFree' > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > undefined reference to `WebPIAppend' > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > undefined reference to `WebPIDecGetRGB' > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > undefined reference to `WebPINewDecoder' > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > undefined reference to `WebPPictureImportRGBA' > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > undefined reference to `WebPConfigInitInternal' > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > undefined reference to `WebPEncode' > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > undefined reference to `WebPValidateConfig' > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > undefined reference to `WebPPictureInitInternal' > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > undefined reference to `WebPFreeDecBuffer' > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > undefined reference to `WebPIDelete' > collect2: error: ld returned 1 exit status > Check the LINKERFORPROGRAMS, LINKFLAGS and LIBDIRFLAGS > Cannot compile > Possible solutions > In Ubuntu: sudo apt-get -y install libsqlite3-dev libfftw3-dev > libhdf5-dev libopencv-dev python3-dev python3-numpy python3-scipy > python3-mpi4py > In Manjaro: sudo pacman -Syu install hdf5 python3-numpy python3-scipy > --noconfirm > Please, see > ' > https://scipion-em.github.io/docs/docs/scipion-modes/install-from-sources.html#step-2-dependencies > ' > for more information about libraries dependencies. > Remember to re-run './xmipp config' after installing libraries in order > to take into account the new system configuration. > rm xmipp_test_main* > Check failed! Trying to install libtiff with conda > conda activate scipion3 ; conda install libtiff -y -c defaults > CommandNotFoundError: Your shell has not been properly configured to use > 'conda activate'. > > Because I can run it manually without errors: > > (scipion3) [user@dataanalysisserver1 xmippSrc-v3.20.07]$ conda activate > scipion3 ; conda install libtiff -y -c defaults > Collecting package metadata (current_repodata.json): done > Solving environment: done > # All requested packages already installed. > > > And also, thanks already in advance. I really appreciate your help. 
> > Best Christian > > > > >>> Grigory Sharov <sha...@gm...> 20.05.21 18.12 Uhr >>> > Hi, > > Let's try manual installation: > > conda activate scipion3 > > ldd /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so -> > > check output for errors > > export SCIPION_HOME="/home/user/Data/Software/scipion3/" > > cd /home/user/Data/Software/scipion3/software/em/xmippSrc-v3.20.07 > > ./xmipp config -> check if all is correct in xmipp.conf > > ./xmipp check_config -> check for errors > > ./xmipp compileAndInstall N=4 && ln -srfn build > > /home/user/Data/Software/scipion3/software/em/xmipp && cd - && touch > > installation_finished && rm bindings_linked 2> /dev/null > > > You can post output errors from these commands here. > > Best regards, > Grigory > > > -------------------------------------------------------------------------------- > Grigory Sharov, Ph.D. > > MRC Laboratory of Molecular Biology, > Francis Crick Avenue, > Cambridge Biomedical Campus, > Cambridge CB2 0QH, UK. > tel. +44 (0) 1223 267228 <+44%201223%20267228> > e-mail: gsharov@mrc-lmb.> chr...@bi...> wrote: > > > Hi all, > > > > Dmitrys collegue here. > > > > yes we have the library in the scipion3 env: > > > > > > [user@dataanalysisserver1 lib]$ ls -d $PWD/libwebp.so.7 > > > > > > /home/user/Data/Software> > [92m conda activate scipion3 ; conda install fftw -y -c defaults[0m > > CommandNotFoundError: Your shell has not been properly configured to > use > > 'conda activate'. > > > > When I am coping this command, it runs without any issue. > > > > > > Thanks for your help. > > > > best > > Christian > > > > > > > > >>> Grigory Sharov <sha...@gm...> 20.05.21 16.39 Uhr >>> > > Hi, > > > > do you have libwebp.so.7 in > > /home/user/Data/Software/miniconda/envs/scipion3/lib/ ? > > > > Best regards, > > Grigory > > > > > > > > -------------------------------------------------------------------------------- > > Grigory Sharov, Ph.D. > > > > MRC Laboratory of Molecular Biology, > > Francis Crick Avenue, > > Cambridge Biomedical Campus, > > Cambridge CB2 0QH, UK. > > tel. +44 (0) 1223 267228 <+44%201223%20267228> > > e-mail: gs...@mr... > > > > > > On Thu, May 20, 2021 at 3:00 PM Dmitry Semchonok <Sem...@gm...> > > wrote: > > > > > Dear colleagues, > > > > > > We reinstalled the centos 7 on our server as well as miniconda. > > > > > > > > > > > > The core was installed correctly but during xmipp installation some > error > > > appeared. Please have a look on the file attached. > > > > > > > > > > > > Could you please advice how to proceed? > > > > > > > > > > > > > > > > > > > > > Thank you in advance. > > > > > > > > > Sincerely, > > > Dmitry and Christian_______________________________________________ > > > scipion-users mailing list > > > sci...@li... > > > https://lists.sourceforge.net/lists/listinfo/scipion-users > > > > > > > > > > > _______________________________________________ > > scipion-users mailing list > > sci...@li... > > https://lists.sourceforge.net/lists/listinfo/scipion-users > > > > > > _______________________________________________ > scipion-users mailing list > sci...@li... > https://lists.sourceforge.net/lists/listinfo/scipion-users > _______________________________________________ scipion-users mailing list sci...@li... https://lists.sourceforge.net/lists/listinfo/scipion-users |
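A quick way to confirm whether the export is still needed at run time, once the compilation above finishes, is to inspect one of the freshly built binaries with ldd. A minimal sketch, assuming the build ends up linked under software/em/xmipp as in the install commands further down this thread; the binary name is only an example:

# Sketch: with the conda lib dir on LD_LIBRARY_PATH, the tiff/webp entries below
# should resolve inside .../envs/scipion3/lib; any "not found" line means the
# export (or an rpath added via LIBDIRFLAGS in xmipp.conf) is still required
# whenever Scipion/Xmipp runs.
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/home/user/Data/Software/miniconda/envs/scipion3/lib/
ldd /home/user/Data/Software/scipion3/software/em/xmipp/bin/xmipp_image_header | grep -E 'tiff|webp|not found'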
From: Christian T. <chr...@bi...> - 2021-05-20 18:19:32
|
Hi,

I tried yum install libtiff libtiff-devel before, but yum says "nothing to do". But yum install libtiff* helped to install the missing packages somehow, so the error disappears from ./xmipp config. There are some paths missing from PATH, but alternatives are found, so I guess this is fine (see output below). But ./xmipp check_config still fails.

(scipion3) [user@dataanalysisserver1 xmippSrc-v3.20.07]$ ./xmipp config
Configuring -----------------------------------------
gcc detected
g++ -c -w -mtune=native -march=native -std=c++11 -O3 xmipp_test_opencv.cpp -o xmipp_test_opencv.o -I../ -I/home/user/Data/Software/miniconda/envs/scipion3/include
OpenCV not found
rm -v xmipp_test_opencv*
'mpirun' and 'mpiexec' not found in the PATH
Alternative found at '/usr/lib64/openmpi/bin'.
Please, press [return] to use it or type a path where to locate it: -> /usr/lib64/openmpi/bin
'mpicc' not found in the PATH
Alternative found at '/usr/lib64/openmpi/bin'.
Please, press [return] to use it or type a path where to locate it: -> /usr/lib64/openmpi/bin
'mpicxx' not found in the PATH
Alternative found at '/usr/lib64/openmpi/bin'.
Please, press [return] to use it or type a path where to locate it: -> /usr/lib64/openmpi/bin
Java detected at: /home/user/Data/Software/miniconda/envs/scipion3
'nvcc' not found in the PATH (either in CUDA_BIN/XMIPP_CUDA_BIN)
Alternative found at '/usr/local/cuda/bin'.
Please, press [return] to use it or type a path where to locate it: -> /usr/local/cuda/bin
CUDA-10.1.243 detected.
CUDA-8.0 is recommended.
Using '/usr/local/cuda-10.1/targets/x86_64-linux/lib'.
Configuration completed.....

(scipion3) [user@dataanalysisserver1 xmippSrc-v3.20.07]$ ./xmipp check_config
Checking configuration ------------------------------
Checking compiler configuration ...
g++ 4.8.5 detected
g++ -c -w -mtune=native -march=native -std=c++11 -O3 xmipp_test_main.cpp -o xmipp_test_main.o -I../ -I/home/user/Data/Software/miniconda/envs/scipion3/include -I/home/user/Data/Software/miniconda/envs/scipion3/include/python3.8 -I/home/user/Data/Software/miniconda/envs/scipion3/lib/python3.8/site-packages/numpy/core/include
g++ -L/home/user/Data/Software/miniconda/envs/scipion3/lib xmipp_test_main.o -o xmipp_test_main -lfftw3 -lfftw3_threads -lhdf5 -lhdf5_cpp -ltiff -ljpeg -lsqlite3 -lpthread
/usr/bin/ld: warning: libwebp.so.7, needed by /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so, not found (try using -rpath or -rpath-link)
/home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: undefined reference to `WebPPictureImportRGB'
/home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: undefined reference to `WebPInitDecBufferInternal'
/home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: undefined reference to `WebPPictureFree'
/home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: undefined reference to `WebPIAppend'
/home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: undefined reference to `WebPIDecGetRGB'
/home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: undefined reference to `WebPINewDecoder'
/home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: undefined reference to `WebPPictureImportRGBA'
/home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: undefined reference to `WebPConfigInitInternal'
/home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: undefined reference to `WebPEncode'
/home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: undefined reference to `WebPValidateConfig'
/home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: undefined reference to `WebPPictureInitInternal'
/home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: undefined reference to `WebPFreeDecBuffer'
/home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: undefined reference to `WebPIDelete'
collect2: error: ld returned 1 exit status
Check the LINKERFORPROGRAMS, LINKFLAGS and LIBDIRFLAGS
Cannot compile
Possible solutions
In Ubuntu: sudo apt-get -y install libsqlite3-dev libfftw3-dev libhdf5-dev libopencv-dev python3-dev python3-numpy python3-scipy python3-mpi4py
In Manjaro: sudo pacman -Syu install hdf5 python3-numpy python3-scipy --noconfirm
Please, see 'https://scipion-em.github.io/docs/docs/scipion-modes/install-from-sources.html#step-2-dependencies' for more information about libraries dependencies.
Remember to re-run './xmipp config' after installing libraries in order to take into account the new system configuration.
rm xmipp_test_main*
Check failed!

Something is wrong with the configuration. It's still looking for libwebp.so.7, but this is present:

(scipion3) [user@dataanalysisserver1 xmippSrc-v3.20.07]$ ls -ltr ../../../../miniconda/envs/scipion3/lib/libwebp.so*
-rwxrwxrwx. 2 user user 598608 Feb 1 16:15 ../../../../miniconda/envs/scipion3/lib/libwebp.so.7.1.1
lrwxrwxrwx. 1 user user 16 May 20 10:07 ../../../../miniconda/envs/scipion3/lib/libwebp.so.7 -> libwebp.so.7.1.1
lrwxrwxrwx. 1 user user 16 May 20 10:07 ../../../../miniconda/envs/scipion3/lib/libwebp.so -> libwebp.so.7.1.1

Best
Christian

>>> Grigory Sharov <sha...@gm...> 20.05.21 19.54 Uhr >>>
Hi Christian,

I guess xmipp script still cannot recognize libtiff that it has installed. I can reproduce your problem on my machine. But I have a system library installed.
So, the easiest solution is to install libtiff / libtiff-devel using your package manager (yum) Best regards, Grigory -------------------------------------------------------------------------------- Grigory Sharov, Ph.D. MRC Laboratory of Molecular Biology, Francis Crick Avenue, Cambridge Biomedical Campus, Cambridge CB2 0QH, UK. tel. +44 (0) 1223 267228 <+44%201223%20267228> e-mail: gs...@mr... On Thu, May 20, 2021 at 6:34 PM Christian Tüting < chr...@bi...> wrote: > Hi, > > it fails with similar errors like in the automatic installation: > > > (scipion3) [user@dataanalysisserver1 scipion3]$ ldd > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so > linux-vdso.so.1 => (0x00007ffe872c3000) > libwebp.so.7 => > /home/user/Data/Software/miniconda/envs/scipion3/lib/./libwebp.so.7 > (0x00007f8f7253a000) > libzstd.so.1 => > /home/user/Data/Software/miniconda/envs/scipion3/lib/./libzstd.so.1 > (0x00007f8f7246e000) > liblzma.so.5 => > /home/user/Data/Software/miniconda/envs/scipion3/lib/./liblzma.so.5 > (0x00007f8f723fe000) > libjpeg.so.9 => > /home/user/Data/Software/miniconda/envs/scipion3/lib/./libjpeg.so.9 > (0x00007f8f721c2000) > libz.so.1 => > /home/user/Data/Software/miniconda/envs/scipion3/lib/./libz.so.1 > (0x00007f8f7244d000) > libm.so.6 => /lib64/libm.so.6 (0x00007f8f71ea4000) > libc.so.6 => /lib64/libc.so.6 (0x00007f8f71ad6000) > libpthread.so.0 => /lib64/libpthread.so.0 (0x00007f8f718ba000) > librt.so.1 => /lib64/librt.so.1 (0x00007f8f716b2000) > /lib64/ld-linux-x86-64.so.2 (0x00007f8f72427000) > > (scipion3) [user@dataanalysisserver1 xmippSrc-v3.20.07]$ ./xmipp config > Configuring ----------------------------------------- > gcc detected > 'libtiff' not found in the system > 'libtiff' dependency not found. Do you want to install it using conda? > [YES/no] Y > Trying to install libtiff with conda > conda activate scipion3 ; conda install libtiff -y -c defaults > CommandNotFoundError: Your shell has not been properly configured to use > 'conda activate'. > To initialize your shell, run > $ conda init <SHELL_NAME> > Currently supported shells are: > - bash > - fish > - tcsh > - xonsh > - zsh > - powershell > See 'conda init --help' for more information and options. > IMPORTANT: You may need to close and restart your shell after running > 'conda init'. > Collecting package metadata (current_repodata.json): ...working... done > Solving environment: ...working... done > # All requested packages already installed. > 'libtiff' installed in conda environ 'scipion3'. > g++ -c -w -mtune=native -march=native -std=c++11 -O3 > xmipp_test_opencv.cpp -o xmipp_test_opencv.o -I..> 'mpirun' and 'mpiexec' not found in the PATH > Alternative found at '/usr/lib64/openmpi/bin'. > Please, press [return] to use it or type a path where to locate it: > -> /usr/lib64/openmpi/bin > 'mpicc' not found in the PATH > Alternative found at '/usr/lib64/openmpi/bin'. > Please, press [return] to use it or type a path where to locate it: > -> /usr/lib64/openmpi/bin > 'mpicxx' not found in the PATH > Alternative found at '/usr/lib64/openmpi/bin'. > Please, press [return] to use it or type a path where to locate it: > -> /usr/lib64/openmpi/bin > Java detected at: /home/user/Data/Software/miniconda/envs/scipion3 > 'nvcc' not found in the PATH (either in CUDA_BIN/XMIPP_CUDA_BIN) > Alternative found at '/usr/local/cuda/bin'. > Please, press [return] to use it or type a path where to locate it: > -> /usr/local/cuda/bin > CUDA-10.1.243 detected. > CUDA-8.0 is recommended. 
> Using '/usr/local/cuda-10.1/targets/x86_64-linux/lib'. > Configuration completed..... > (scipion3) [user@dataanalysisserver1 xmippSrc-v3.20.07]$ ./xmipp config > Configuring ----------------------------------------- > gcc detected > 'libtiff' not found in the system > 'libtiff' dependency not found. Do you want to install it using conda? > [YES/no] > Trying to install libtiff with conda > conda activate scipion3 ; conda install libtiff -y -c defaults > CommandNotFoundError: Your shell has not been properly configured to use > 'conda activate'. > To initialize your shell, run > $ conda init <SHELL_NAME> > Currently supported shells are: > - bash > - fish > - tcsh > - xonsh > - zsh > - powershell > See 'conda init --help' for more information and options. > IMPORTANT: You may need to close and restart your shell after running > 'conda init'. > Collecting package metadata (current_repodata.json): ...working... done > Solving environment: ...working... done > # All requested packages already installed. > 'libtiff' installed in conda exmipp_test_opencv.cpp -o xmipp_test_opencv.o > -I../ > -I/home/user/Data/Software/miniconda/envs/scipion3/include > OpenCV not found > rm -v xmipp_test_opencv* > 'mpirun' and 'mpiexec' not found in the PATH > Alternative found at '/usr/lib64/openmpi/bin'. > Please, press [return] to use it or type a path where to locate it: > -> /usr/lib64/openmpi/bin > 'mpicc' not found in the PATH > Alternative found at '/usr/lib64/openmpi/bin'. > Please, press [return] to use it or type a path where to locate it: > -> /usr/lib64/openmpi/bin > 'mpicxx' not found in the PATH > Alternative found at '/usr/lib64/openmpi/bin'. > Please, press [return] to use it or type a path where to locate it: > -> /usr/lib64/openmpi/bin > Java detected at: /home/user/Data/Software/miniconda/envs/scipion3 > 'nvcc' not found in the PATH (either in CUDA_BIN/XMIPP_CUDA_BIN) > Alternative found at '/usr/local/cuda/bin'. > Please, press [return] to use it or type a path where to locate it: > -> /usr/local/cuda/bin > CUDA-10.1.243 detected. > CUDA-8.0 is recommended. > Using '/usr/local/cuda-10.1/targets/x86_64-linux/lib'. > Configuration completed..... > > (scipion3) [user@dataanalysisserver1 xmippSrc-v3.20.07]$ ./xmipp > check_config > Checking configuration ------------------------------ > Checking compiler configuration ... 
> g++ 4.8.5 detected > g++ -c -w -mtune=native -march=native -std=c++11 -O3 xmipp_test_main.cpp > -o xmipp_test_main.o -I../ > -I/home/user/Data/Software/miniconda/envs/scipion3/include > -I/home/user/Data/Software/miniconda/envs/scipion3/include/python3.8 > > -I/home/user/Data/Software/miniconda/envs/scipion3/lib/python3.8/site-packages/numpy/core/include > g++ -L/home/user/Data/Software/miniconda/envs/scipion3/lib > xmipp_test_main.o -o xmipp_test_main -lfftw3 -lfftw3_threads -lhdf5 > -lhdf5_cpp -ltiff -ljpeg -lsqlite3 -lpthread > /usr/bin/ld: warning: libwebp.so.7, needed by > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so, not > found (try using -rpath or -rpath-link) > /home/user/Data/Software/miniconda/envs/scipion3/lib/lib> /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > undefined reference to `WebPInitDecBufferInternal' > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > undefined reference to `WebPPictureFree' > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > undefined reference to `WebPIAppend' > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > undefined reference to `WebPIDecGetRGB' > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > undefined reference to `WebPINewDecoder' > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > undefined reference to `WebPPictureImportRGBA' > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > undefined reference to `WebPConfigInitInternal' > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > undefined reference to `WebPEncode' > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > undefined reference to `WebPValidateConfig' > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > undefined reference to `WebPPictureInitInternal' > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > undefined reference to `WebPFreeDecBuffer' > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > undefined reference to `WebPIDelete' > collect2: error: ld returned 1 exit status > Check the LINKERFORPROGRAMS, LINKFLAGS and LIBDIRFLAGS > Cannot compile > Possible solutions > In Ubuntu: sudo apt-get -y install libsqlite3-dev libfftw3-dev > libhdf5-dev libopencv-dev python3-dev python3-numpy python3-scipy > python3-mpi4py > In Manjaro: sudo pacman -Syu install hdf5 python3-numpy python3-scipy > --noconfirm > Please, see > ' > https://scipion-em.github.io/docs/docs/scipion-modes/install-from-sources.html#step-2-dependencies > ' > for more information about libraries dependencies. > Remember to re-run './xmipp config' after installing libraries in order > to take into account the new system configuration. > rm xmipp_test_main* > Check failed! Trying to install libtiff with conda > conda activate scipion3 ; conda install libtiff -y -c defaults > CommandNotFoundError: Your shell has not been properly configured to use > 'conda activate'. > > Because I can run it manually without errors: > > (scipion3) [user@dataanalysisserver1 xmippSrc-v3.20.07]$ conda activate > scipion3 ; conda install libtiff -y -c defaults > Collecting package metadata (current_repodata.json): done > Solving environment: done > # All requested packages already installed. > > > And also, thanks already in advance. I really appreciate your help. 
> > Best Christian > > > > >>> Grigory Sharov <sha...@gm...> 20.05.21 18.12 Uhr >>> > Hi, > > Let's try manual installation: > > conda activate scipion3 > > ldd /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so -> > > check output for errors > > export SCIPION_HOME="/home/user/Data/Software/scipion3/" > > cd /home/user/Data/Software/scipion3/software/em/xmippSrc-v3.20.07 > > ./xmipp config -> check if all is correct in xmipp.conf > > ./xmipp check_config -> check for errors > > ./xmipp compileAndInstall N=4 && ln -srfn build > > /home/user/Data/Software/scipion3/software/em/xmipp && cd - && touch > > installation_finished && rm bindings_linked 2> /dev/null > > > You can post output errors from these commands here. > > Best regards, > Grigory > > > -------------------------------------------------------------------------------- > Grigory Sharov, Ph.D. > > MRC Laboratory of Molecular Biology, > Francis Crick Avenue, > Cambridge Biomedical Campus, > Cambridge CB2 0QH, UK. > tel. +44 (0) 1223 267228 <+44%201223%20267228> > e-mail: gs...@mr... > > > On Thu, May 20, 2021 at 4:49 PM Christian Tüting < > chr...@bi...> wrote: > > > Hi all, > > > > Dmitrys collegue here. > > > > yes we have the library in the scipion3 env: > > > > > > [user@dataanalysisserver1 lib]$ ls -d $PWD/libwebp.so.7 > > > > > > /home/user/Data/Software> > [92m conda activate scipion3 ; conda install fftw -y -c defaults[0m > > CommandNotFoundError: Your shell has not been properly configured to > use > > 'conda activate'. > > > > When I am coping this command, it runs without any issue. > > > > > > Thanks for your help. > > > > best > > Christian > > > > > > > > >>> Grigory Sharov <sha...@gm...> 20.05.21 16.39 Uhr >>> > > Hi, > > > > do you have libwebp.so.7 in > > /home/user/Data/Software/miniconda/envs/scipion3/lib/ ? > > > > Best regards, > > Grigory > > > > > > > > -------------------------------------------------------------------------------- > > Grigory Sharov, Ph.D. > > > > MRC Laboratory of Molecular Biology, > > Francis Crick Avenue, > > Cambridge Biomedical Campus, > > Cambridge CB2 0QH, UK. > > tel. +44 (0) 1223 267228 <+44%201223%20267228> > > e-mail: gs...@mr... > > > > > > On Thu, May 20, 2021 at 3:00 PM Dmitry Semchonok <Sem...@gm...> > > wrote: > > > > > Dear colleagues, > > > > > > We reinstalled the centos 7 on our server as well as miniconda. > > > > > > > > > > > > The core was installed correctly but during xmipp installation some > error > > > appeared. Please have a look on the file attached. > > > > > > > > > > > > Could you please advice how to proceed? > > > > > > > > > > > > > > > > > > > > > Thank you in advance. > > > > > > > > > Sincerely, > > > Dmitry and Christian_______________________________________________ > > > scipion-users mailing list > > > sci...@li... > > > https://lists.sourceforge.net/lists/listinfo/scipion-users > > > > > > > > > > > _______________________________________________ > > scipion-users mailing list > > sci...@li... > > https://lists.sourceforge.net/lists/listinfo/scipion-users > > > > > > _______________________________________________ > scipion-users mailing list > sci...@li... > https://lists.sourceforge.net/lists/listinfo/scipion-users > |
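The ld warning quoted above already points at the mechanism: -L lets the linker find libtiff.so itself, but libtiff's own dependency libwebp.so.7 is resolved through -rpath/-rpath-link (and, for a native GNU ld, LD_LIBRARY_PATH), not through -L, which is why exporting LD_LIBRARY_PATH cures the build. A minimal sketch to test this outside the xmipp script, using the same conda paths as above (the test file name is arbitrary):

# Sketch: link a trivial program against the conda libtiff. Without the rpath
# flags this should reproduce the "libwebp.so.7 ... not found" warning and the
# WebP* undefined references; with them, the link should succeed.
CONDA_LIB=/home/user/Data/Software/miniconda/envs/scipion3/lib
echo 'int main(void){return 0;}' > /tmp/tiff_link_test.c
gcc /tmp/tiff_link_test.c -o /tmp/tiff_link_test \
    -L"$CONDA_LIB" -Wl,-rpath-link,"$CONDA_LIB" -Wl,-rpath,"$CONDA_LIB" -ltiff \
  && echo "link OK"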
From: Grigory S. <sha...@gm...> - 2021-05-20 17:53:52
|
Hi Christian, I guess xmipp script still cannot recognize libtiff that it has installed. I can reproduce your problem on my machine. But I have a system library installed. So, the easiest solution is to install libtiff / libtiff-devel using your package manager (yum) Best regards, Grigory -------------------------------------------------------------------------------- Grigory Sharov, Ph.D. MRC Laboratory of Molecular Biology, Francis Crick Avenue, Cambridge Biomedical Campus, Cambridge CB2 0QH, UK. tel. +44 (0) 1223 267228 <+44%201223%20267228> e-mail: gs...@mr... On Thu, May 20, 2021 at 6:34 PM Christian Tüting < chr...@bi...> wrote: > Hi, > > it fails with similar errors like in the automatic installation: > > > (scipion3) [user@dataanalysisserver1 scipion3]$ ldd > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so > linux-vdso.so.1 => (0x00007ffe872c3000) > libwebp.so.7 => > /home/user/Data/Software/miniconda/envs/scipion3/lib/./libwebp.so.7 > (0x00007f8f7253a000) > libzstd.so.1 => > /home/user/Data/Software/miniconda/envs/scipion3/lib/./libzstd.so.1 > (0x00007f8f7246e000) > liblzma.so.5 => > /home/user/Data/Software/miniconda/envs/scipion3/lib/./liblzma.so.5 > (0x00007f8f723fe000) > libjpeg.so.9 => > /home/user/Data/Software/miniconda/envs/scipion3/lib/./libjpeg.so.9 > (0x00007f8f721c2000) > libz.so.1 => > /home/user/Data/Software/miniconda/envs/scipion3/lib/./libz.so.1 > (0x00007f8f7244d000) > libm.so.6 => /lib64/libm.so.6 (0x00007f8f71ea4000) > libc.so.6 => /lib64/libc.so.6 (0x00007f8f71ad6000) > libpthread.so.0 => /lib64/libpthread.so.0 (0x00007f8f718ba000) > librt.so.1 => /lib64/librt.so.1 (0x00007f8f716b2000) > /lib64/ld-linux-x86-64.so.2 (0x00007f8f72427000) > > (scipion3) [user@dataanalysisserver1 xmippSrc-v3.20.07]$ ./xmipp config > Configuring ----------------------------------------- > gcc detected > 'libtiff' not found in the system > 'libtiff' dependency not found. Do you want to install it using conda? > [YES/no] Y > Trying to install libtiff with conda > conda activate scipion3 ; conda install libtiff -y -c defaults > CommandNotFoundError: Your shell has not been properly configured to use > 'conda activate'. > To initialize your shell, run > $ conda init <SHELL_NAME> > Currently supported shells are: > - bash > - fish > - tcsh > - xonsh > - zsh > - powershell > See 'conda init --help' for more information and options. > IMPORTANT: You may need to close and restart your shell after running > 'conda init'. > Collecting package metadata (current_repodata.json): ...working... done > Solving environment: ...working... done > # All requested packages already installed. > 'libtiff' installed in conda environ 'scipion3'. > g++ -c -w -mtune=native -march=native -std=c++11 -O3 > xmipp_test_opencv.cpp -o xmipp_test_opencv.o -I../ > -I/home/user/Data/Software/miniconda/envs/scipion3/include > OpenCV not found > rm -v xmipp_test_opencv* > 'mpirun' and 'mpiexec' not found in the PATH > Alternative found at '/usr/lib64/openmpi/bin'. > Please, press [return] to use it or type a path where to locate it: > -> /usr/lib64/openmpi/bin > 'mpicc' not found in the PATH > Alternative found at '/usr/lib64/openmpi/bin'. > Please, press [return] to use it or type a path where to locate it: > -> /usr/lib64/openmpi/bin > 'mpicxx' not found in the PATH > Alternative found at '/usr/lib64/openmpi/bin'. 
> Please, press [return] to use it or type a path where to locate it: > -> /usr/lib64/openmpi/bin > Java detected at: /home/user/Data/Software/miniconda/envs/scipion3 > 'nvcc' not found in the PATH (either in CUDA_BIN/XMIPP_CUDA_BIN) > Alternative found at '/usr/local/cuda/bin'. > Please, press [return] to use it or type a path where to locate it: > -> /usr/local/cuda/bin > CUDA-10.1.243 detected. > CUDA-8.0 is recommended. > Using '/usr/local/cuda-10.1/targets/x86_64-linux/lib'. > Configuration completed..... > (scipion3) [user@dataanalysisserver1 xmippSrc-v3.20.07]$ ./xmipp config > Configuring ----------------------------------------- > gcc detected > 'libtiff' not found in the system > 'libtiff' dependency not found. Do you want to install it using conda? > [YES/no] > Trying to install libtiff with conda > conda activate scipion3 ; conda install libtiff -y -c defaults > CommandNotFoundError: Your shell has not been properly configured to use > 'conda activate'. > To initialize your shell, run > $ conda init <SHELL_NAME> > Currently supported shells are: > - bash > - fish > - tcsh > - xonsh > - zsh > - powershell > See 'conda init --help' for more information and options. > IMPORTANT: You may need to close and restart your shell after running > 'conda init'. > Collecting package metadata (current_repodata.json): ...working... done > Solving environment: ...working... done > # All requested packages already installed. > 'libtiff' installed in conda exmipp_test_opencv.cpp -o xmipp_test_opencv.o > -I../ > -I/home/user/Data/Software/miniconda/envs/scipion3/include > OpenCV not found > rm -v xmipp_test_opencv* > 'mpirun' and 'mpiexec' not found in the PATH > Alternative found at '/usr/lib64/openmpi/bin'. > Please, press [return] to use it or type a path where to locate it: > -> /usr/lib64/openmpi/bin > 'mpicc' not found in the PATH > Alternative found at '/usr/lib64/openmpi/bin'. > Please, press [return] to use it or type a path where to locate it: > -> /usr/lib64/openmpi/bin > 'mpicxx' not found in the PATH > Alternative found at '/usr/lib64/openmpi/bin'. > Please, press [return] to use it or type a path where to locate it: > -> /usr/lib64/openmpi/bin > Java detected at: /home/user/Data/Software/miniconda/envs/scipion3 > 'nvcc' not found in the PATH (either in CUDA_BIN/XMIPP_CUDA_BIN) > Alternative found at '/usr/local/cuda/bin'. > Please, press [return] to use it or type a path where to locate it: > -> /usr/local/cuda/bin > CUDA-10.1.243 detected. > CUDA-8.0 is recommended. > Using '/usr/local/cuda-10.1/targets/x86_64-linux/lib'. > Configuration completed..... > > (scipion3) [user@dataanalysisserver1 xmippSrc-v3.20.07]$ ./xmipp > check_config > Checking configuration ------------------------------ > Checking compiler configuration ... 
> g++ 4.8.5 detected > g++ -c -w -mtune=native -march=native -std=c++11 -O3 xmipp_test_main.cpp > -o xmipp_test_main.o -I../ > -I/home/user/Data/Software/miniconda/envs/scipion3/include > -I/home/user/Data/Software/miniconda/envs/scipion3/include/python3.8 > > -I/home/user/Data/Software/miniconda/envs/scipion3/lib/python3.8/site-packages/numpy/core/include > g++ -L/home/user/Data/Software/miniconda/envs/scipion3/lib > xmipp_test_main.o -o xmipp_test_main -lfftw3 -lfftw3_threads -lhdf5 > -lhdf5_cpp -ltiff -ljpeg -lsqlite3 -lpthread > /usr/bin/ld: warning: libwebp.so.7, needed by > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so, not > found (try using -rpath or -rpath-link) > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > undefined reference to `WebPPictureImportRGB' > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > undefined reference to `WebPInitDecBufferInternal' > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > undefined reference to `WebPPictureFree' > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > undefined reference to `WebPIAppend' > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > undefined reference to `WebPIDecGetRGB' > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > undefined reference to `WebPINewDecoder' > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > undefined reference to `WebPPictureImportRGBA' > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > undefined reference to `WebPConfigInitInternal' > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > undefined reference to `WebPEncode' > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > undefined reference to `WebPValidateConfig' > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > undefined reference to `WebPPictureInitInternal' > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > undefined reference to `WebPFreeDecBuffer' > /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: > undefined reference to `WebPIDelete' > collect2: error: ld returned 1 exit status > Check the LINKERFORPROGRAMS, LINKFLAGS and LIBDIRFLAGS > Cannot compile > Possible solutions > In Ubuntu: sudo apt-get -y install libsqlite3-dev libfftw3-dev > libhdf5-dev libopencv-dev python3-dev python3-numpy python3-scipy > python3-mpi4py > In Manjaro: sudo pacman -Syu install hdf5 python3-numpy python3-scipy > --noconfirm > Please, see > ' > https://scipion-em.github.io/docs/docs/scipion-modes/install-from-sources.html#step-2-dependencies > ' > for more information about libraries dependencies. > Remember to re-run './xmipp config' after installing libraries in order > to take into account the new system configuration. > rm xmipp_test_main* > Check failed! Trying to install libtiff with conda > conda activate scipion3 ; conda install libtiff -y -c defaults > CommandNotFoundError: Your shell has not been properly configured to use > 'conda activate'. > > Because I can run it manually without errors: > > (scipion3) [user@dataanalysisserver1 xmippSrc-v3.20.07]$ conda activate > scipion3 ; conda install libtiff -y -c defaults > Collecting package metadata (current_repodata.json): done > Solving environment: done > # All requested packages already installed. > > > And also, thanks already in advance. I really appreciate your help. 
> > Best Christian > > > > >>> Grigory Sharov <sha...@gm...> 20.05.21 18.12 Uhr >>> > Hi, > > Let's try manual installation: > > conda activate scipion3 > > ldd /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so -> > > check output for errors > > export SCIPION_HOME="/home/user/Data/Software/scipion3/" > > cd /home/user/Data/Software/scipion3/software/em/xmippSrc-v3.20.07 > > ./xmipp config -> check if all is correct in xmipp.conf > > ./xmipp check_config -> check for errors > > ./xmipp compileAndInstall N=4 && ln -srfn build > > /home/user/Data/Software/scipion3/software/em/xmipp && cd - && touch > > installation_finished && rm bindings_linked 2> /dev/null > > > You can post output errors from these commands here. > > Best regards, > Grigory > > > -------------------------------------------------------------------------------- > Grigory Sharov, Ph.D. > > MRC Laboratory of Molecular Biology, > Francis Crick Avenue, > Cambridge Biomedical Campus, > Cambridge CB2 0QH, UK. > tel. +44 (0) 1223 267228 <+44%201223%20267228> > e-mail: gs...@mr... > > > On Thu, May 20, 2021 at 4:49 PM Christian Tüting < > chr...@bi...> wrote: > > > Hi all, > > > > Dmitrys collegue here. > > > > yes we have the library in the scipion3 env: > > > > > > [user@dataanalysisserver1 lib]$ ls -d $PWD/libwebp.so.7 > > > > > > /home/user/Data/Software/miniconda/envs/scipion3/lib/libwebp.so.7 > > > > What's puzzles me a bit, is this in the error log: > > > > [92m conda activate scipion3 ; conda install fftw -y -c defaults[0m > > CommandNotFoundError: Your shell has not been properly configured to > use > > 'conda activate'. > > > > When I am coping this command, it runs without any issue. > > > > > > Thanks for your help. > > > > best > > Christian > > > > > > > > >>> Grigory Sharov <sha...@gm...> 20.05.21 16.39 Uhr >>> > > Hi, > > > > do you have libwebp.so.7 in > > /home/user/Data/Software/miniconda/envs/scipion3/lib/ ? > > > > Best regards, > > Grigory > > > > > > > > -------------------------------------------------------------------------------- > > Grigory Sharov, Ph.D. > > > > MRC Laboratory of Molecular Biology, > > Francis Crick Avenue, > > Cambridge Biomedical Campus, > > Cambridge CB2 0QH, UK. > > tel. +44 (0) 1223 267228 <+44%201223%20267228> > > e-mail: gs...@mr... > > > > > > On Thu, May 20, 2021 at 3:00 PM Dmitry Semchonok <Sem...@gm...> > > wrote: > > > > > Dear colleagues, > > > > > > We reinstalled the centos 7 on our server as well as miniconda. > > > > > > > > > > > > The core was installed correctly but during xmipp installation some > error > > > appeared. Please have a look on the file attached. > > > > > > > > > > > > Could you please advice how to proceed? > > > > > > > > > > > > > > > > > > > > > Thank you in advance. > > > > > > > > > Sincerely, > > > Dmitry and Christian_______________________________________________ > > > scipion-users mailing list > > > sci...@li... > > > https://lists.sourceforge.net/lists/listinfo/scipion-users > > > > > > > > > > > _______________________________________________ > > scipion-users mailing list > > sci...@li... > > https://lists.sourceforge.net/lists/listinfo/scipion-users > > > > > > _______________________________________________ > scipion-users mailing list > sci...@li... > https://lists.sourceforge.net/lists/listinfo/scipion-users > |
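In command form, the suggestion above boils down to the following (a sketch for CentOS 7; package names can differ on other distributions):

# Sketch: install a system-wide libtiff plus its headers, then confirm that the
# dynamic linker can see it.
sudo yum install -y libtiff libtiff-devel
ldconfig -p | grep libtiff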
From: Christian T. <chr...@bi...> - 2021-05-20 17:33:49
|
Hi, it fails with similar errors like in the automatic installation: (scipion3) [user@dataanalysisserver1 scipion3]$ ldd /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so linux-vdso.so.1 => (0x00007ffe872c3000) libwebp.so.7 => /home/user/Data/Software/miniconda/envs/scipion3/lib/./libwebp.so.7 (0x00007f8f7253a000) libzstd.so.1 => /home/user/Data/Software/miniconda/envs/scipion3/lib/./libzstd.so.1 (0x00007f8f7246e000) liblzma.so.5 => /home/user/Data/Software/miniconda/envs/scipion3/lib/./liblzma.so.5 (0x00007f8f723fe000) libjpeg.so.9 => /home/user/Data/Software/miniconda/envs/scipion3/lib/./libjpeg.so.9 (0x00007f8f721c2000) libz.so.1 => /home/user/Data/Software/miniconda/envs/scipion3/lib/./libz.so.1 (0x00007f8f7244d000) libm.so.6 => /lib64/libm.so.6 (0x00007f8f71ea4000) libc.so.6 => /lib64/libc.so.6 (0x00007f8f71ad6000) libpthread.so.0 => /lib64/libpthread.so.0 (0x00007f8f718ba000) librt.so.1 => /lib64/librt.so.1 (0x00007f8f716b2000) /lib64/ld-linux-x86-64.so.2 (0x00007f8f72427000) (scipion3) [user@dataanalysisserver1 xmippSrc-v3.20.07]$ ./xmipp config Configuring ----------------------------------------- gcc detected 'libtiff' not found in the system 'libtiff' dependency not found. Do you want to install it using conda? [YES/no] Y Trying to install libtiff with conda conda activate scipion3 ; conda install libtiff -y -c defaults CommandNotFoundError: Your shell has not been properly configured to use 'conda activate'. To initialize your shell, run $ conda init <SHELL_NAME> Currently supported shells are: - bash - fish - tcsh - xonsh - zsh - powershell See 'conda init --help' for more information and options. IMPORTANT: You may need to close and restart your shell after running 'conda init'. Collecting package metadata (current_repodata.json): ...working... done Solving environment: ...working... done # All requested packages already installed. 'libtiff' installed in conda environ 'scipion3'. g++ -c -w -mtune=native -march=native -std=c++11 -O3 xmipp_test_opencv.cpp -o xmipp_test_opencv.o -I../ -I/home/user/Data/Software/miniconda/envs/scipion3/include OpenCV not found rm -v xmipp_test_opencv* 'mpirun' and 'mpiexec' not found in the PATH Alternative found at '/usr/lib64/openmpi/bin'. Please, press [return] to use it or type a path where to locate it: -> /usr/lib64/openmpi/bin 'mpicc' not found in the PATH Alternative found at '/usr/lib64/openmpi/bin'. Please, press [return] to use it or type a path where to locate it: -> /usr/lib64/openmpi/bin 'mpicxx' not found in the PATH Alternative found at '/usr/lib64/openmpi/bin'. Please, press [return] to use it or type a path where to locate it: -> /usr/lib64/openmpi/bin Java detected at: /home/user/Data/Software/miniconda/envs/scipion3 'nvcc' not found in the PATH (either in CUDA_BIN/XMIPP_CUDA_BIN) Alternative found at '/usr/local/cuda/bin'. Please, press [return] to use it or type a path where to locate it: -> /usr/local/cuda/bin CUDA-10.1.243 detected. CUDA-8.0 is recommended. Using '/usr/local/cuda-10.1/targets/x86_64-linux/lib'. Configuration completed..... (scipion3) [user@dataanalysisserver1 xmippSrc-v3.20.07]$ ./xmipp config Configuring ----------------------------------------- gcc detected 'libtiff' not found in the system 'libtiff' dependency not found. Do you want to install it using conda? [YES/no] Trying to install libtiff with conda conda activate scipion3 ; conda install libtiff -y -c defaults CommandNotFoundError: Your shell has not been properly configured to use 'conda activate'. 
To initialize your shell, run $ conda init <SHELL_NAME> Currently supported shells are: - bash - fish - tcsh - xonsh - zsh - powershell See 'conda init --help' for more information and options. IMPORTANT: You may need to close and restart your shell after running 'conda init'. Collecting package metadata (current_repodata.json): ...working... done Solving environment: ...working... done # All requested packages already installed. 'libtiff' installed in conda environ 'scipion3'. g++ -c -w -mtune=native -march=native -std=c++11 -O3 xmipp_test_opencv.cpp -o xmipp_test_opencv.o -I../ -I/home/user/Data/Software/miniconda/envs/scipion3/include OpenCV not found rm -v xmipp_test_opencv* 'mpirun' and 'mpiexec' not found in the PATH Alternative found at '/usr/lib64/openmpi/bin'. Please, press [return] to use it or type a path where to locate it: -> /usr/lib64/openmpi/bin 'mpicc' not found in the PATH Alternative found at '/usr/lib64/openmpi/bin'. Please, press [return] to use it or type a path where to locate it: -> /usr/lib64/openmpi/bin 'mpicxx' not found in the PATH Alternative found at '/usr/lib64/openmpi/bin'. Please, press [return] to use it or type a path where to locate it: -> /usr/lib64/openmpi/bin Java detected at: /home/user/Data/Software/miniconda/envs/scipion3 'nvcc' not found in the PATH (either in CUDA_BIN/XMIPP_CUDA_BIN) Alternative found at '/usr/local/cuda/bin'. Please, press [return] to use it or type a path where to locate it: -> /usr/local/cuda/bin CUDA-10.1.243 detected. CUDA-8.0 is recommended. Using '/usr/local/cuda-10.1/targets/x86_64-linux/lib'. Configuration completed..... (scipion3) [user@dataanalysisserver1 xmippSrc-v3.20.07]$ ./xmipp check_config Checking configuration ------------------------------ Checking compiler configuration ... g++ 4.8.5 detected g++ -c -w -mtune=native -march=native -std=c++11 -O3 xmipp_test_main.cpp -o xmipp_test_main.o -I../ -I/home/user/Data/Software/miniconda/envs/scipion3/include -I/home/user/Data/Software/miniconda/envs/scipion3/include/python3.8 -I/home/user/Data/Software/miniconda/envs/scipion3/lib/python3.8/site-packages/numpy/core/include g++ -L/home/user/Data/Software/miniconda/envs/scipion3/lib xmipp_test_main.o -o xmipp_test_main -lfftw3 -lfftw3_threads -lhdf5 -lhdf5_cpp -ltiff -ljpeg -lsqlite3 -lpthread /usr/bin/ld: warning: libwebp.so.7, needed by /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so, not found (try using -rpath or -rpath-link) /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: undefined reference to `WebPPictureImportRGB' /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: undefined reference to `WebPInitDecBufferInternal' /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: undefined reference to `WebPPictureFree' /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: undefined reference to `WebPIAppend' /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: undefined reference to `WebPIDecGetRGB' /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: undefined reference to `WebPINewDecoder' /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: undefined reference to `WebPPictureImportRGBA' /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: undefined reference to `WebPConfigInitInternal' /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: undefined reference to `WebPEncode' /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: undefined reference to `WebPValidateConfig' /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: undefined reference to 
`WebPPictureInitInternal' /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: undefined reference to `WebPFreeDecBuffer' /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so: undefined reference to `WebPIDelete' collect2: error: ld returned 1 exit status Check the LINKERFORPROGRAMS, LINKFLAGS and LIBDIRFLAGS Cannot compile Possible solutions In Ubuntu: sudo apt-get -y install libsqlite3-dev libfftw3-dev libhdf5-dev libopencv-dev python3-dev python3-numpy python3-scipy python3-mpi4py In Manjaro: sudo pacman -Syu install hdf5 python3-numpy python3-scipy --noconfirm Please, see 'https://scipion-em.github.io/docs/docs/scipion-modes/install-from-sources.html#step-2-dependencies' for more information about libraries dependencies. Remember to re-run './xmipp config' after installing libraries in order to take into account the new system configuration. rm xmipp_test_main* Check failed! Trying to install libtiff with conda conda activate scipion3 ; conda install libtiff -y -c defaults CommandNotFoundError: Your shell has not been properly configured to use 'conda activate'. Because I can run it manually without errors: (scipion3) [user@dataanalysisserver1 xmippSrc-v3.20.07]$ conda activate scipion3 ; conda install libtiff -y -c defaults Collecting package metadata (current_repodata.json): done Solving environment: done # All requested packages already installed. And also, thanks already in advance. I really appreciate your help. Best Christian >>> Grigory Sharov <sha...@gm...> 20.05.21 18.12 Uhr >>> Hi, Let's try manual installation: conda activate scipion3 > ldd /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so -> > check output for errors > export SCIPION_HOME="/home/user/Data/Software/scipion3/" > cd /home/user/Data/Software/scipion3/software/em/xmippSrc-v3.20.07 > ./xmipp config -> check if all is correct in xmipp.conf > ./xmipp check_config -> check for errors > ./xmipp compileAndInstall N=4 && ln -srfn build > /home/user/Data/Software/scipion3/software/em/xmipp && cd - && touch > installation_finished && rm bindings_linked 2> /dev/null You can post output errors from these commands here. Best regards, Grigory -------------------------------------------------------------------------------- Grigory Sharov, Ph.D. MRC Laboratory of Molecular Biology, Francis Crick Avenue, Cambridge Biomedical Campus, Cambridge CB2 0QH, UK. tel. +44 (0) 1223 267228 <+44%201223%20267228> e-mail: gs...@mr... On Thu, May 20, 2021 at 4:49 PM Christian Tüting < chr...@bi...> wrote: > Hi all, > > Dmitrys collegue here. > > yes we have the library in the scipion3 env: > > > [user@dataanalysisserver1 lib]$ ls -d $PWD/libwebp.so.7 > > > /home/user/Data/Software/miniconda/envs/scipion3/lib/libwebp.so.7 > > What's puzzles me a bit, is this in the error log: > > [92m conda activate scipion3 ; conda install fftw -y -c defaults[0m > CommandNotFoundError: Your shell has not been properly configured to use > 'conda activate'. > > When I am coping this command, it runs without any issue. > > > Thanks for your help. > > best > Christian > > > > >>> Grigory Sharov <sha...@gm...> 20.05.21 16.39 Uhr >>> > Hi, > > do you have libwebp.so.7 in > /home/user/Data/Software/miniconda/envs/scipion3/lib/ ? > > Best regards, > Grigory > > > -------------------------------------------------------------------------------- > Grigory Sharov, Ph.D. > > MRC Laboratory of Molecular Biology, > Francis Crick Avenue, > Cambridge Biomedical Campus, > Cambridge CB2 0QH, UK. > tel. 
+44 (0) 1223 267228 <+44%201223%20267228> > e-mail: gs...@mr... > > > On Thu, May 20, 2021 at 3:00 PM Dmitry Semchonok <Sem...@gm...> > wrote: > > > Dear colleagues, > > > > We reinstalled the centos 7 on our server as well as miniconda. > > > > > > > > The core was installed correctly but during xmipp installation some error > > appeared. Please have a look on the file attached. > > > > > > > > Could you please advice how to proceed? > > > > > > > > > > > > > > Thank you in advance. > > > > > > Sincerely, > > Dmitry and Christian_______________________________________________ > > scipion-users mailing list > > sci...@li... > > https://lists.sourceforge.net/lists/listinfo/scipion-users > > > > > > _______________________________________________ > scipion-users mailing list > sci...@li... > https://lists.sourceforge.net/lists/listinfo/scipion-users > |
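A note for readers following this thread: the two recurring symptoms above (the CommandNotFoundError when the installer itself runs 'conda activate', and libwebp.so.7 not being found even though it sits in the environment's lib directory) can usually be worked around from the same shell before re-running the xmipp configuration. The snippet below is only a sketch, not an official Scipion recipe; the Miniconda and Scipion paths are the ones quoted in this thread and need adapting, and sourcing etc/profile.d/conda.sh is the generic conda mechanism for enabling 'conda activate' in non-interactive shells.

    # Paths as quoted in this thread -- adjust to your own installation.
    CONDA_ROOT=/home/user/Data/Software/miniconda

    # Enable 'conda activate' in non-interactive/sub-shells, which is how the
    # installer invokes its conda commands.
    source "$CONDA_ROOT/etc/profile.d/conda.sh"
    conda activate scipion3

    # Let the linker and the runtime loader resolve libwebp.so.7, which sits
    # next to libtiff.so inside the environment.
    export LD_LIBRARY_PATH="$CONDA_ROOT/envs/scipion3/lib:$LD_LIBRARY_PATH"

    # Re-run the xmipp configuration and the failing check.
    cd /home/user/Data/Software/scipion3/software/em/xmippSrc-v3.20.07
    ./xmipp config
    ./xmipp check_config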
From: Grigory S. <sha...@gm...> - 2021-05-20 16:12:02
|
Hi, Let's try manual installation: conda activate scipion3 > ldd /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so -> > check output for errors > export SCIPION_HOME="/home/user/Data/Software/scipion3/" > cd /home/user/Data/Software/scipion3/software/em/xmippSrc-v3.20.07 > ./xmipp config -> check if all is correct in xmipp.conf > ./xmipp check_config -> check for errors > ./xmipp compileAndInstall N=4 && ln -srfn build > /home/user/Data/Software/scipion3/software/em/xmipp && cd - && touch > installation_finished && rm bindings_linked 2> /dev/null You can post output errors from these commands here. Best regards, Grigory -------------------------------------------------------------------------------- Grigory Sharov, Ph.D. MRC Laboratory of Molecular Biology, Francis Crick Avenue, Cambridge Biomedical Campus, Cambridge CB2 0QH, UK. tel. +44 (0) 1223 267228 <+44%201223%20267228> e-mail: gs...@mr... On Thu, May 20, 2021 at 4:49 PM Christian Tüting < chr...@bi...> wrote: > Hi all, > > Dmitrys collegue here. > > yes we have the library in the scipion3 env: > > > [user@dataanalysisserver1 lib]$ ls -d $PWD/libwebp.so.7 > > > /home/user/Data/Software/miniconda/envs/scipion3/lib/libwebp.so.7 > > What's puzzles me a bit, is this in the error log: > > [92m conda activate scipion3 ; conda install fftw -y -c defaults[0m > CommandNotFoundError: Your shell has not been properly configured to use > 'conda activate'. > > When I am coping this command, it runs without any issue. > > > Thanks for your help. > > best > Christian > > > > >>> Grigory Sharov <sha...@gm...> 20.05.21 16.39 Uhr >>> > Hi, > > do you have libwebp.so.7 in > /home/user/Data/Software/miniconda/envs/scipion3/lib/ ? > > Best regards, > Grigory > > > -------------------------------------------------------------------------------- > Grigory Sharov, Ph.D. > > MRC Laboratory of Molecular Biology, > Francis Crick Avenue, > Cambridge Biomedical Campus, > Cambridge CB2 0QH, UK. > tel. +44 (0) 1223 267228 <+44%201223%20267228> > e-mail: gs...@mr... > > > On Thu, May 20, 2021 at 3:00 PM Dmitry Semchonok <Sem...@gm...> > wrote: > > > Dear colleagues, > > > > We reinstalled the centos 7 on our server as well as miniconda. > > > > > > > > The core was installed correctly but during xmipp installation some error > > appeared. Please have a look on the file attached. > > > > > > > > Could you please advice how to proceed? > > > > > > > > > > > > > > Thank you in advance. > > > > > > Sincerely, > > Dmitry and Christian_______________________________________________ > > scipion-users mailing list > > sci...@li... > > https://lists.sourceforge.net/lists/listinfo/scipion-users > > > > > > _______________________________________________ > scipion-users mailing list > sci...@li... > https://lists.sourceforge.net/lists/listinfo/scipion-users > |
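Regarding the 'ldd ... check output for errors' step above: the unresolved dependency that later breaks the xmipp test link can be spotted directly by filtering the ldd output. The lines below are a generic sketch with paths taken from this thread; the linker-flag alternative corresponds to the 'try using -rpath or -rpath-link' hint printed by ld and is shown only as a comment, since the exact place to add it (e.g. LINKFLAGS in xmipp.conf) may differ between xmipp versions.

    # Show only the dependencies of the env's libtiff.so that cannot be resolved.
    ldd /home/user/Data/Software/miniconda/envs/scipion3/lib/libtiff.so | grep "not found"

    # If libwebp.so.7 is reported as "not found", either export LD_LIBRARY_PATH
    # to include /home/user/Data/Software/miniconda/envs/scipion3/lib (see the
    # sketch further above), or pass that directory to the linker, e.g.:
    #   -Wl,-rpath-link,/home/user/Data/Software/miniconda/envs/scipion3/lib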
From: Christian T. <chr...@bi...> - 2021-05-20 15:49:05
|
Hi all, Dmitry's colleague here. Yes, we have the library in the scipion3 env: [user@dataanalysisserver1 lib]$ ls -d $PWD/libwebp.so.7 /home/user/Data/Software/miniconda/envs/scipion3/lib/libwebp.so.7 What puzzles me a bit is this in the error log: conda activate scipion3 ; conda install fftw -y -c defaults CommandNotFoundError: Your shell has not been properly configured to use 'conda activate'. When I copy this command and run it manually, it runs without any issue. Thanks for your help. Best, Christian >>> Grigory Sharov <sha...@gm...> 20.05.21 16.39 Uhr >>> Hi, do you have libwebp.so.7 in /home/user/Data/Software/miniconda/envs/scipion3/lib/ ? Best regards, Grigory -------------------------------------------------------------------------------- Grigory Sharov, Ph.D. MRC Laboratory of Molecular Biology, Francis Crick Avenue, Cambridge Biomedical Campus, Cambridge CB2 0QH, UK. tel. +44 (0) 1223 267228 <+44%201223%20267228> e-mail: gs...@mr... On Thu, May 20, 2021 at 3:00 PM Dmitry Semchonok <Sem...@gm...> wrote: > Dear colleagues, > > We reinstalled the centos 7 on our server as well as miniconda. > > > > The core was installed correctly but during xmipp installation some error > appeared. Please have a look on the file attached. > > > > Could you please advice how to proceed? > > > > > > > Thank you in advance. > > > Sincerely, > Dmitry and Christian_______________________________________________ > scipion-users mailing list > sci...@li... > https://lists.sourceforge.net/lists/listinfo/scipion-users > |
From: Grigory S. <sha...@gm...> - 2021-05-20 14:39:04
|
Hi, do you have libwebp.so.7 in /home/user/Data/Software/miniconda/envs/scipion3/lib/ ? Best regards, Grigory -------------------------------------------------------------------------------- Grigory Sharov, Ph.D. MRC Laboratory of Molecular Biology, Francis Crick Avenue, Cambridge Biomedical Campus, Cambridge CB2 0QH, UK. tel. +44 (0) 1223 267228 <+44%201223%20267228> e-mail: gs...@mr... On Thu, May 20, 2021 at 3:00 PM Dmitry Semchonok <Sem...@gm...> wrote: > Dear colleagues, > > We reinstalled the centos 7 on our server as well as miniconda. > > > > The core was installed correctly but during xmipp installation some error > appeared. Please have a look on the file attached. > > > > Could you please advice how to proceed? > > > > > > > Thank you in advance. > > > Sincerely, > Dmitry and Christian_______________________________________________ > scipion-users mailing list > sci...@li... > https://lists.sourceforge.net/lists/listinfo/scipion-users > |
From: Dmitry S. <Sem...@gm...> - 2021-05-20 13:59:55
|
Dear colleagues, We reinstalled CentOS 7 on our server, as well as Miniconda. The core was installed correctly, but during the xmipp installation an error appeared. Please have a look at the attached file. |
From: Jose L. V. P. <jos...@ce...> - 2021-05-19 19:54:52
|
Dear EM community, From I2PC we are proud to announce the second part of our seminar series on image processing, this time addressed to electron tomography methods. The aim of the seminar series is to gather expert developers and users in the same framework, where the algorithms can gain visibility and users can understand the applications and mechanisms of the presented algorithms. The webinars will be held online at 16:00 GMT+1 (Berlin, Paris, Madrid), free of charge and with no registration required. You can join the sessions through the following link (an I2PC - Instruct Course room on BigBlueButton; click the link and enter your name to join): https://conectaha.csic.es/b/bla-rkh-dqa-rpn The planned speakers and contents (speakers are free to present other contents) are:
3rd June: DeepFinder. Emmanuel Moebel, CNRS-UMR 144, Inria, CNRS, Institut Curie, PSL Research University, France.
10th June: NOVACTF. Beata Turonova, Max-Planck Institute of Biophysics, Germany.
17th June: PySeg. Antonio Martínez-Sánchez, Universidad de Oviedo, Spain.
24th June: CRYOCARE. Tim Oliver, Max Planck Institute of Molecular Cell Biology and Genetics (MPI-CBG), Germany.
8th June: Dynamo. Daniel Castaño, Biozentrum, University of Basel, Switzerland.
With best wishes, I2PC Team |
From: Grigory S. <sha...@gm...> - 2021-05-19 09:48:22
|
Thank you, Pablo! Indeed I never considered mrcs to have a single 2D particle, which is entirely possible. Best regards, Grigory -------------------------------------------------------------------------------- Grigory Sharov, Ph.D. MRC Laboratory of Molecular Biology, Francis Crick Avenue, Cambridge Biomedical Campus, Cambridge CB2 0QH, UK. tel. +44 (0) 1223 267228 <+44%201223%20267228> e-mail: gs...@mr... On Wed, May 19, 2021 at 10:45 AM Pablo Conesa <pc...@cn...> wrote: > Hi, we've found the issue. > > Although import particles (files mode) seemed correct....it wasn't for > only those mrcs files having a single image. 30 of them. > > Removing them at import time worked and now relion 2d classification works > as expected. > > We'll fix it. > > > Cheers! > > > On 19/5/21 10:56, Grigory Sharov wrote: > > Hi Dmitry, > > I'll try to reproduce the error when I get a chance > > On Wed, May 19, 2021, 09:41 Pablo Conesa <pc...@cn...> wrote: > >> I see, maybe we can arrange a tele conf to see in detail what is wrong. >> I'll contact you. >> On 19/5/21 10:26, Dmitry Semchonok wrote: >> >> Dear Pablo and Grigory, >> >> >> Thank you! >> >> Yes, I am well aware of the fact that there is no info for the CTF etc :) >> >> >> All I need is just a nice aligned 2D set of images (from the set I >> imported) >> >> (Ideally I would like to have this set just from cryosparc but I have no >> idea how to do that right away :) ) >> >> Please see the log of relion >> >> >> >> RUNNING PROTOCOL ----------------- >> 00002: Hostname: cryoem01 >> 00003: PID: 36440 >> 00004: pyworkflow: 3.0.13 >> 00005: plugin: relion >> 00006: plugin v: 3.1.2 >> 00007: currentDir: /data1/ScipionUserData/projects/Caro__helix >> 00008: workingDir: Runs/001735_ProtRelionClassify2D >> 00009: runMode: Continue >> 00010: MPI: 3 >> 00011: threads: 3 >> 00012: Starting at step: 1 >> 00013: Running steps >> 00014: STARTED: convertInputStep, step 1, time 2021-05-18 >> 15:06:21.557294 >> 00015: Converting set from >> 'Runs/001662_ProtImportParticles/particles.sqlite' into >> 'Runs/001735_ProtRelionClassify2D/input_particles.star' >> 00016: convertBinaryFiles: creating soft links. >> 00017: Root: Runs/001735_ProtRelionClassify2D/extra/input -> >> Runs/001662_ProtImportParticles/extra >> 00018: FINISHED: convertInputStep, step 1, time 2021-05-18 >> 15:06:22.474076 >> 00019: STARTED: runRelionStep, step 2, time 2021-05-18 15:06:22.502665 >> 00020: mpirun -np 3 `which relion_refine_mpi` --i >> Runs/001735_ProtRelionClassify2D/input_particles.star --particle_diameter >> 690 --zero_mask --K 64 --norm --scale --o >> Runs/001735_ProtRelionClassify2D/extra/relion --oversampling 1 >> --flatten_solvent --tau2_fudge 2.0 --iter 25 --offset_range 5.0 >> --offset_step 2.0 --psi_step 10.0 --dont_combine_weights_via_disc >> --scratch_dir /data1/new_scratch/ --pool 3 --gpu --j 3 >> 00021: RELION version: 3.1.2 >> 00022: Precision: BASE=double, CUDA-ACC=single >> 00023: >> 00024: === RELION MPI setup === >> 00025: + Number of MPI processes = 3 >> 00026: + Number of threads per MPI process = 3 >> 00027: + Total number of threads therefore = 9 >> 00028: + Leader (0) runs on host = cryoem01 >> 00029: + Follower 1 runs on host = cryoem01 >> 00030: + Follower 2 runs on host = cryoem01 >> 00031: ================= >> 00032: uniqueHost cryoem01 has 2 ranks. >> 00033: GPU-ids not specified for this rank, threads will automatically >> be mapped to available devices. 
>> 00034: Thread 0 on follower 1 mapped to device 0 >> 00035: Thread 1 on follower 1 mapped to device 0 >> 00036: Thread 2 on follower 1 mapped to device 0 >> 00037: GPU-ids not specified for this rank, threads will automatically >> be mapped to available devices. >> 00038: Thread 0 on follower 2 mapped to device 1 >> 00039: Thread 1 on follower 2 mapped to device 1 >> 00040: Thread 2 on follower 2 mapped to device 1 >> 00041: Running CPU instructions in double precision. >> 00042: + WARNING: Changing psi sampling rate (before oversampling) to >> 5.625 degrees, for more efficient GPU calculations >> 00043: + On host cryoem01: free scratch space = 447.485 Gb. >> 00044: Copying particles to scratch directory: >> /data1/new_scratch/relion_volatile/ >> 00045: 000/??? sec ~~(,_,"> >> [oo] >> 00046: 0/ 0 sec ~~(,_,">in: >> /opt/Scipion3/software/em/relion-3.1.2/src/rwMRC.h, line 192 >> 00047: ERROR: >> 00048: readMRC: Image number 11 exceeds stack size 1 of image >> 000011@Runs/001735_ProtRelionClassify2D/extra/input/1024562735536827037_FoilHole_1618719_Data_1621438_1621440_20200703_085118_Fractions_patch_aligned_doseweighted_particles.mrcs >> 00049: === Backtrace === >> 00050: >> /opt/Scipion3/software/em/relion-3.1.2/bin/relion_refine_mpi(_ZN11RelionErrorC1ERKSsS1_l+0x41) >> [0x4786a1] >> 00051: >> /opt/Scipion3/software/em/relion-3.1.2/bin/relion_refine_mpi(_ZN5ImageIdE7readMRCElbRK8FileName+0x99f) >> [0x4b210f] >> 00052: >> /opt/Scipion3/software/em/relion-3.1.2/bin/relion_refine_mpi(_ZN5ImageIdE5_readERK8FileNameR13fImageHandlerblbb+0x17b) >> [0x4b407b] >> 00053: >> /opt/Scipion3/software/em/relion-3.1.2/bin/relion_refine_mpi(_ZN10Experiment22copyParticlesToScratchEibbd+0xda7) >> [0x5b8f87] >> 00054: >> /opt/Scipion3/software/em/relion-3.1.2/bin/relion_refine_mpi(_ZN14MlOptimiserMpi18initialiseWorkLoadEv+0x210) >> [0x498540] >> 00055: >> /opt/Scipion3/software/em/relion-3.1.2/bin/relion_refine_mpi(_ZN14MlOptimiserMpi10initialiseEv+0x9aa) >> [0x49ab2a] >> 00056: >> /opt/Scipion3/software/em/relion-3.1.2/bin/relion_refine_mpi(main+0x55) >> [0x4322a5] >> 00057: /lib64/libc.so.6(__libc_start_main+0xf5) [0x7fea6a54e555] >> 00058: /opt/Scipion3/software/em/relion-3.1.2/bin/relion_refine_mpi() >> [0x435fbf] >> 00059: ================== >> 00060: ERROR: >> 00061: readMRC: Image number 11 exceeds stack size 1 of image >> 000011@Runs/001735_ProtRelionClassify2D/extra/input/1024562735536827037_FoilHole_1618719_Data_1621438_1621440_20200703_085118_Fractions_patch_aligned_doseweighted_particles.mrcs >> 00062: application called MPI_Abort(MPI_COMM_WORLD, 1) - process 1 >> 00063: Traceback (most recent call last): >> 00064: File >> "/usr/local/miniconda/envs/scipion3/lib/python3.8/site-packages/pyworkflow/protocol/protocol.py", >> line 197, in run >> 00065: self._run() >> 00066: File >> "/usr/local/miniconda/envs/scipion3/lib/python3.8/site-packages/pyworkflow/protocol/protocol.py", >> line 248, in _run >> 00067: resultFiles = self._runFunc() >> 00068: File >> "/usr/local/miniconda/envs/scipion3/lib/python3.8/site-packages/pyworkflow/protocol/protocol.py", >> line 244, in _runFunc >> 00069: return self._func(*self._args) >> 00070: File >> "/usr/local/miniconda/envs/scipion3/lib/python3.8/site-packages/relion/protocols/protocol_base.py", >> line 811, in runRelionStep >> 00071: self.runJob(self._getProgram(), params) >> 00072: File >> "/usr/local/miniconda/envs/scipion3/lib/python3.8/site-packages/pyworkflow/protocol/protocol.py", >> line 1388, in runJob >> 00073: self._stepsExecutor.runJob(self._log, 
program, arguments, >> **kwargs) >> 00074: File >> "/usr/local/miniconda/envs/scipion3/lib/python3.8/site-packages/pyworkflow/protocol/executor.py", >> line 65, in runJob >> 00075: process.runJob(log, programName, params, >> 00076: File >> "/usr/local/miniconda/envs/scipion3/lib/python3.8/site-packages/pyworkflow/utils/process.py", >> line 52, in runJob >> 00077: return runCommand(command, env, cwd) >> 00078: File >> "/usr/local/miniconda/envs/scipion3/lib/python3.8/site-packages/pyworkflow/utils/process.py", >> line 67, in runCommand >> 00079: check_call(command, shell=True, stdout=sys.stdout, >> stderr=sys.stderr, >> 00080: File >> "/usr/local/miniconda/envs/scipion3/lib/python3.8/subprocess.py", line 364, >> in check_call >> 00081: raise CalledProcessError(retcode, cmd) >> 00082: subprocess.CalledProcessError: Command ' mpirun -np 3 `which >> relion_refine_mpi` --i >> Runs/001735_ProtRelionClassify2D/input_particles.star --particle_diameter >> 690 --zero_mask --K 64 --norm --scale --o >> Runs/001735_ProtRelionClassify2D/extra/relion --oversampling 1 >> --flatten_solvent --tau2_fudge 2.0 --iter 25 --offset_range 5.0 >> --offset_step 2.0 --psi_step 10.0 --dont_combine_weights_via_disc >> --scratch_dir /data1/new_scratch/ --pool 3 --gpu --j 3' returned >> non-zero exit status 1. >> 00083: Protocol failed: Command ' mpirun -np 3 `which >> relion_refine_mpi` --i >> Runs/001735_ProtRelionClassify2D/input_particles.star --particle_diameter >> 690 --zero_mask --K 64 --norm --scale --o >> Runs/001735_ProtRelionClassify2D/extra/relion --oversampling 1 >> --flatten_solvent --tau2_fudge 2.0 --iter 25 --offset_range 5.0 >> --offset_step 2.0 --psi_step 10.0 --dont_combine_weights_via_disc >> --scratch_dir /data1/new_scratch/ --pool 3 --gpu --j 3' returned >> non-zero exit status 1. >> 00084: FAILED: runRelionStep, step 2, time 2021-05-18 15:06:24.609548 >> 00085: *** Last status is failed >> 00086: ------------------- PROTOCOL FAILED (DONE 2/3) >> >> >> *Additionally,* >> >> So it seemed from the first look that there is some issue with the image >> 11 — I delete the image 11 but the problem still remained. >> >> >> >> >> *Optionally,* >> >> I believe that xmipp-2D should work but I did not try it yet. >> >> Thank you >> >> Sincerely, >> Dmitry >> >> >> >> >> >> >> >> >> >> On 19. May 2021, at 10:13, Pablo Conesa <pc...@cn...> wrote: >> >> Hi! So, I think Grigory is right, you've gone through the import >> particles "without metadata info" therefore you only have the images >> without any alignment information. >> >> >> In theory, 2d classification should work with this kind of import. Could >> you please share the logs of one of the relion classification? >> On 19/5/21 9:26, Dmitry Semchonok wrote: >> >> Dear Grigory, >> >> >> I did nothing much, just tried to start relion 2D // or cryosparc. >> >> >> The only thing I tried additionally since I could no proceed is to a) >> change the box size; b) just resave the subset with the same number of >> images. >> >> >> Please see the image. >> >> <Screenshot 2021-05-19 at 09.23.41.png> >> >> >> Thank you >> >> Sincerely, >> >> Dmitry >> >> >> On 18. May 2021, at 15:19, Grigory Sharov <sha...@gm...> >> wrote: >> >> Hi, >> >> I imagine you have 624 micrographs, so particles are exported to mrc on a >> mic basis. I see you used the "files" option to import mrcs particles into >> Scipion. This means the imported particles have no metadata except pixel >> size you provided. >> >> What did you do with them after import? 
>> >> Best regards, >> Grigory >> >> >> -------------------------------------------------------------------------------- >> Grigory Sharov, Ph.D. >> >> MRC Laboratory of Molecular Biology, >> Francis Crick Avenue, >> Cambridge Biomedical Campus, >> Cambridge CB2 0QH, UK. >> tel. +44 (0) 1223 267228 <+44%201223%20267228> >> e-mail: gs...@mr... >> >> >> On Tue, May 18, 2021 at 2:12 PM Dmitry Semchonok <Sem...@gm...> >> wrote: >> >>> Dear Grigory, >>> >>> Yes I did that — the particles are looking fine. >>> >>> >>> I guess the issue still comes from the fact that originally in cryosparc >>> Export the stacks of particle were placed into 624 mrc. But the number of >>> particles is about 44 818. So even after the renaming and the export I see >>> this in SCIPION export log >>> >>> <Screenshot 2021-05-18 at 15.09.11.png> >>> >>> What I guess may help is if I somehow combine all those files in 1 mrcs >>> first and then add import them to SCIPION. >>> Do you perhaps know how to do that? >>> >>> Thank you >>> >>> Sincerely, >>> Dmitry >>> >>> >>> >>> On 18. May 2021, at 15:02, Grigory Sharov <sha...@gm...> >>> wrote: >>> >>> Hi Dmitry, >>> >>> as the error states your star file points to a non-existing image in the >>> mrcs stack. You need to check first if you import from cryosparc with mrcs >>> worked correctly (open / display particles) then trace all the steps you >>> did before 2D classification. >>> >>> Best regards, >>> Grigory >>> >>> >>> -------------------------------------------------------------------------------- >>> Grigory Sharov, Ph.D. >>> >>> MRC Laboratory of Molecular Biology, >>> Francis Crick Avenue, >>> Cambridge Biomedical Campus, >>> Cambridge CB2 0QH, UK. >>> tel. +44 (0) 1223 267228 <+44%201223%20267228> >>> e-mail: gs...@mr... >>> >>> >>> On Tue, May 18, 2021 at 1:25 PM Dmitry Semchonok <Sem...@gm...> >>> wrote: >>> >>>> Dear Pablo, >>>> >>>> >>>> Thank you. I heard about this option. For that I guess the >>>> https://pypi.org/project/cs2star/ needs to be installed. >>>> >>>> >>>> >>>> >>>> In Cryosparc itself there is an option to export files. And then what >>>> we get is the mrc files with different number of particles in each. >>>> >>>> It appeared to be possible to rename mrc —> to —>mrcs. Then SCIPION can >>>> import those particles. >>>> >>>> Currently the problem is that not relion nor cryosparc can run these >>>> particles. >>>> >>>> >>>> >>>> >>>> Relion stops with error: >>>> >>>> >>>> 00001: RUNNING PROTOCOL ----------------- >>>> 00002: Hostname: cryoem01 >>>> 00003: PID: 46455 >>>> 00004: pyworkflow: 3.0.13 >>>> 00005: plugin: relion >>>> 00006: plugin v: 3.1.2 >>>> 00007: currentDir: /data1/ScipionUserData/projects/Caro__helix >>>> 00008: workingDir: Runs/001546_ProtRelionClassify2D >>>> 00009: runMode: Continue >>>> 00010: MPI: 3 >>>> 00011: threads: 3 >>>> 00012: Starting at step: 1 >>>> 00013: Running steps >>>> 00014: STARTED: convertInputStep, step 1, time 2021-05-18 >>>> 12:33:26.198123 >>>> 00015: Converting set from >>>> 'Runs/001492_ProtUserSubSet/particles.sqlite' into >>>> 'Runs/001546_ProtRelionClassify2D/input_particles.star' >>>> 00016: convertBinaryFiles: creating soft links. 
>>>> 00017: Root: Runs/001546_ProtRelionClassify2D/extra/input -> >>>> Runs/001057_ProtImportParticles/extra >>>> 00018: FINISHED: convertInputStep, step 1, time 2021-05-18 >>>> 12:33:27.117588 >>>> 00019: STARTED: runRelionStep, step 2, time 2021-05-18 12:33:27.145974 >>>> 00020: mpirun -np 3 `which relion_refine_mpi` --i >>>> Runs/001546_ProtRelionClassify2D/input_particles.star --particle_diameter >>>> 690 --zero_mask --K 64 --norm --scale --o >>>> Runs/001546_ProtRelionClassify2D/extra/relion --oversampling 1 >>>> --flatten_solvent --tau2_fudge 2.0 --iter 25 --offset_range 5.0 >>>> --offset_step 2.0 --psi_step 10.0 --dont_combine_weights_via_disc >>>> --scratch_dir /data1/new_scratch/ --pool 3 --gpu --j 3 >>>> 00021: RELION version: 3.1.2 >>>> 00022: Precision: BASE=double, CUDA-ACC=single >>>> 00023: >>>> 00024: === RELION MPI setup === >>>> 00025: + Number of MPI processes = 3 >>>> 00026: + Number of threads per MPI process = 3 >>>> 00027: + Total number of threads therefore = 9 >>>> 00028: + Leader (0) runs on host = cryoem01 >>>> 00029: + Follower 1 runs on host = cryoem01 >>>> 00030: + Follower 2 runs on host = cryoem01 >>>> 00031: ================= >>>> 00032: uniqueHost cryoem01 has 2 ranks. >>>> 00033: GPU-ids not specified for this rank, threads will >>>> automatically be mapped to available devices. >>>> 00034: Thread 0 on follower 1 mapped to device 0 >>>> 00035: Thread 1 on follower 1 mapped to device 0 >>>> 00036: Thread 2 on follower 1 mapped to device 0 >>>> 00037: GPU-ids not specified for this rank, threads will >>>> automatically be mapped to available devices. >>>> 00038: Thread 0 on follower 2 mapped to device 1 >>>> 00039: Thread 1 on follower 2 mapped to device 1 >>>> 00040: Thread 2 on follower 2 mapped to device 1 >>>> 00041: Running CPU instructions in double precision. >>>> 00042: + WARNING: Changing psi sampling rate (before oversampling) >>>> to 5.625 degrees, for more efficient GPU calculations >>>> 00043: + On host cryoem01: free scratch space = 448.252 Gb. >>>> 00044: Copying particles to scratch directory: >>>> /data1/new_scratch/relion_volatile/ >>>> 00045: 000/??? 
sec ~~(,_,"> >>>> [oo] >>>> 00046: 1/ 60 sec ~~(,_,">in: >>>> /opt/Scipion3/software/em/relion-3.1.2/src/rwMRC.h, line 192 >>>> 00047: ERROR: >>>> 00048: readMRC: Image number 11 exceeds stack size 1 of image >>>> 000011@Runs/001546_ProtRelionClassify2D/extra/input/1024562735536827037_FoilHole_1618719_Data_1621438_1621440_20200703_085118_Fractions_patch_aligned_doseweighted_particles.mrcs >>>> 00049: === Backtrace === >>>> 00050: >>>> /opt/Scipion3/software/em/relion-3.1.2/bin/relion_refine_mpi(_ZN11RelionErrorC1ERKSsS1_l+0x41) >>>> [0x4786a1] >>>> 00051: >>>> /opt/Scipion3/software/em/relion-3.1.2/bin/relion_refine_mpi(_ZN5ImageIdE7readMRCElbRK8FileName+0x99f) >>>> [0x4b210f] >>>> 00052: >>>> /opt/Scipion3/software/em/relion-3.1.2/bin/relion_refine_mpi(_ZN5ImageIdE5_readERK8FileNameR13fImageHandlerblbb+0x17b) >>>> [0x4b407b] >>>> 00053: >>>> /opt/Scipion3/software/em/relion-3.1.2/bin/relion_refine_mpi(_ZN10Experiment22copyParticlesToScratchEibbd+0xda7) >>>> [0x5b8f87] >>>> 00054: >>>> /opt/Scipion3/software/em/relion-3.1.2/bin/relion_refine_mpi(_ZN14MlOptimiserMpi18initialiseWorkLoadEv+0x210) >>>> [0x498540] >>>> 00055: >>>> /opt/Scipion3/software/em/relion-3.1.2/bin/relion_refine_mpi(_ZN14MlOptimiserMpi10initialiseEv+0x9aa) >>>> [0x49ab2a] >>>> 00056: >>>> /opt/Scipion3/software/em/relion-3.1.2/bin/relion_refine_mpi(main+0x55) >>>> [0x4322a5] >>>> 00057: /lib64/libc.so.6(__libc_start_main+0xf5) [0x7f657f51a555] >>>> 00058: /opt/Scipion3/software/em/relion-3.1.2/bin/relion_refine_mpi() >>>> [0x435fbf] >>>> 00059: ================== >>>> 00060: ERROR: >>>> 00061: readMRC: Image number 11 exceeds stack size 1 of image >>>> 000011@Runs/001546_ProtRelionClassify2D/extra/input/1024562735536827037_FoilHole_1618719_Data_1621438_1621440_20200703_085118_Fractions_patch_aligned_doseweighted_particles.mrcs >>>> 00062: application called MPI_Abort(MPI_COMM_WORLD, 1) - process 1 >>>> 00063: Traceback (most recent call last): >>>> 00064: File >>>> "/usr/local/miniconda/envs/scipion3/lib/python3.8/site-packages/pyworkflow/protocol/protocol.py", >>>> line 197, in run >>>> 00065: self._run() >>>> 00066: File >>>> "/usr/local/miniconda/envs/scipion3/lib/python3.8/site-packages/pyworkflow/protocol/protocol.py", >>>> line 248, in _run >>>> 00067: resultFiles = self._runFunc() >>>> 00068: File >>>> "/usr/local/miniconda/envs/scipion3/lib/python3.8/site-packages/pyworkflow/protocol/protocol.py", >>>> line 244, in _runFunc >>>> 00069: return self._func(*self._args) >>>> 00070: File >>>> "/usr/local/miniconda/envs/scipion3/lib/python3.8/site-packages/relion/protocols/protocol_base.py", >>>> line 811, in runRelionStep >>>> 00071: self.runJob(self._getProgram(), params) >>>> 00072: File >>>> "/usr/local/miniconda/envs/scipion3/lib/python3.8/site-packages/pyworkflow/protocol/protocol.py", >>>> line 1388, in runJob >>>> 00073: self._stepsExecutor.runJob(self._log, program, arguments, >>>> **kwargs) >>>> 00074: File >>>> "/usr/local/miniconda/envs/scipion3/lib/python3.8/site-packages/pyworkflow/protocol/executor.py", >>>> line 65, in runJob >>>> 00075: process.runJob(log, programName, params, >>>> 00076: File >>>> "/usr/local/miniconda/envs/scipion3/lib/python3.8/site-packages/pyworkflow/utils/process.py", >>>> line 52, in runJob >>>> 00077: return runCommand(command, env, cwd) >>>> 00078: File >>>> "/usr/local/miniconda/envs/scipion3/lib/python3.8/site-packages/pyworkflow/utils/process.py", >>>> line 67, in runCommand >>>> 00079: check_call(command, shell=True, stdout=sys.stdout, >>>> stderr=sys.stderr, >>>> 00080: 
File >>>> "/usr/local/miniconda/envs/scipion3/lib/python3.8/subprocess.py", line 364, >>>> in check_call >>>> 00081: raise CalledProcessError(retcode, cmd) >>>> 00082: subprocess.CalledProcessError: Command ' mpirun -np 3 `which >>>> relion_refine_mpi` --i >>>> Runs/001546_ProtRelionClassify2D/input_particles.star --particle_diameter >>>> 690 --zero_mask --K 64 --norm --scale --o >>>> Runs/001546_ProtRelionClassify2D/extra/relion --oversampling 1 >>>> --flatten_solvent --tau2_fudge 2.0 --iter 25 --offset_range 5.0 >>>> --offset_step 2.0 --psi_step 10.0 --dont_combine_weights_via_disc >>>> --scratch_dir /data1/new_scratch/ --pool 3 --gpu --j 3' returned >>>> non-zero exit status 1. >>>> 00083: Protocol failed: Command ' mpirun -np 3 `which >>>> relion_refine_mpi` --i >>>> Runs/001546_ProtRelionClassify2D/input_particles.star --particle_diameter >>>> 690 --zero_mask --K 64 --norm --scale --o >>>> Runs/001546_ProtRelionClassify2D/extra/relion --oversampling 1 >>>> --flatten_solvent --tau2_fudge 2.0 --iter 25 --offset_range 5.0 >>>> --offset_step 2.0 --psi_step 10.0 --dont_combine_weights_via_disc >>>> --scratch_dir /data1/new_scratch/ --pool 3 --gpu --j 3' returned >>>> non-zero exit status 1. >>>> 00084: FAILED: runRelionStep, step 2, time 2021-05-18 12:33:29.230213 >>>> 00085: *** Last status is failed >>>> 00086: ------------------- PROTOCOL FAILED (DONE 2/3) >>>> >>>> >>>> Cryosparc (in SCIPION) requires CTF to run. >>>> >>>> >>>> >>>> Thais is where I am now. >>>> >>>> Perhaps there is a solution? >>>> >>>> >>>> Sincerely, >>>> Dmitry >>>> >>>> >>>> >>>> On 18. May 2021, at 14:16, Pablo Conesa <pc...@cn...> wrote: >>>> >>>> Dear Dmitry, the import of CS metadata files (*.cs) is not supported in >>>> Scipion. Does CS has an option to export to star files. It rings a bell. >>>> On 18/5/21 9:53, Dmitry Semchonok wrote: >>>> >>>> Dear Grigory, >>>> >>>> >>>> The files are in mrc format. >>>> >>>> >>>> Please, let me try to explain more plan: >>>> >>>> I have a project in cryosparc. There I have cryosparc selected 2D >>>> classes. I want to export the particles of those classes into SCIPION. >>>> >>>> So I I pressed Export (fig 1) and the program(cryosparc) created the >>>> folder with mrc + other files (fig 2;3). I looked into J48 and found many >>>> *.mrc files of the particles. But it is not 1 mrc = 1 particle. It seems to >>>> be a mrc stuck - so I have several files inside 1 *.mrc (fig 4) (you can >>>> also notice that they all have different sizes) >>>> >>>> So I need to export them somehow in SCIPION >>>> >>>> For that, I used the SCIPION export - images protocol where for the >>>> files to add I put *.mrc. But the protocol seems to be added only 1 mrc as >>>> 1 picture and instead of having 46392 particles I have ~600 particles. >>>> >>>> (Also the geometry seems not preserved). >>>> >>>> >>>> So my question how to export the particles from cryosparc into SCIPION >>>> correctly? >>>> >>>> >>>> Thank you! >>>> >>>> >>>> >>>> https://disk.yandex.com/d/Fv3Q1lpwEzSisg >>>> >>>> Sincerely, >>>> >>>> Dmitry >>>> >>>> >>>> >>>> On 17. May 2021, at 18:12, Grigory Sharov <sha...@gm...> >>>> wrote: >>>> >>>> Hi Dmitry, >>>> >>>> mrc stacks should have "mrcs" extension. Is this the problem you are >>>> getting? >>>> >>>> Best regards, >>>> Grigory >>>> >>>> >>>> -------------------------------------------------------------------------------- >>>> Grigory Sharov, Ph.D. 
>>>> >>>> MRC Laboratory of Molecular Biology, >>>> Francis Crick Avenue, >>>> Cambridge Biomedical Campus, >>>> Cambridge CB2 0QH, UK. >>>> tel. +44 (0) 1223 267228 <+44%201223%20267228> >>>> e-mail: gs...@mr... >>>> >>>> >>>> On Mon, May 17, 2021 at 4:49 PM Dmitry Semchonok <Sem...@gm...> >>>> wrote: >>>> >>>>> Dear colleagues, >>>>> >>>>> I would like to export the particles from cryosparc to SCIPION. >>>>> >>>>> How to do that? >>>>> >>>>> >>>>> >>>>> >>>>> What I tried: >>>>> >>>>> >>>>> 1. In cryosparc I pressed Export – to export the particles I am >>>>> interested in. >>>>> >>>>> 2. In the folder Export – I found many mrc stacks with particles >>>>> in each. >>>>> >>>>> 3. I tried to export them to SCIPION using Export particles but >>>>> instead of reading each stack and combine them in the 1 dataset I received >>>>> 1 particle / per each mrc stack. >>>>> >>>>> >>>>> Any ideas? >>>>> >>>>> >>>>> Thank you >>>>> >>>>> Sincerely, >>>>> >>>>> Dmitry >>>>> >>>>> _______________________________________________ >>>>> scipion-users mailing list >>>>> sci...@li... >>>>> https://lists.sourceforge.net/lists/listinfo/scipion-users >>>>> >>>> _______________________________________________ >>>> scipion-users mailing list >>>> sci...@li... >>>> https://lists.sourceforge.net/lists/listinfo/scipion-users >>>> >>>> >>>> >>>> >>>> _______________________________________________ >>>> scipion-users mailing lis...@li...https://lists.sourceforge.net/lists/listinfo/scipion-users >>>> >>>> -- >>>> Pablo Conesa - *Madrid Scipion <http://scipion.i2pc.es/> team* >>>> _______________________________________________ >>>> scipion-users mailing list >>>> sci...@li... >>>> https://lists.sourceforge.net/lists/listinfo/scipion-users >>>> >>>> >>>> _______________________________________________ >>>> scipion-users mailing list >>>> sci...@li... >>>> https://lists.sourceforge.net/lists/listinfo/scipion-users >>>> >>> _______________________________________________ >>> scipion-users mailing list >>> sci...@li... >>> https://lists.sourceforge.net/lists/listinfo/scipion-users >>> >>> >>> _______________________________________________ >> scipion-users mailing list >> sci...@li... >> https://lists.sourceforge.net/lists/listinfo/scipion-users >> >> >> >> >> _______________________________________________ >> scipion-users mailing lis...@li...https://lists.sourceforge.net/lists/listinfo/scipion-users >> >> -- >> Pablo Conesa - *Madrid Scipion <http://scipion.i2pc.es/> team* >> _______________________________________________ >> scipion-users mailing list >> sci...@li... >> https://lists.sourceforge.net/lists/listinfo/scipion-users >> >> >> >> >> _______________________________________________ >> scipion-users mailing lis...@li...https://lists.sourceforge.net/lists/listinfo/scipion-users >> >> -- >> Pablo Conesa - *Madrid Scipion <http://scipion.i2pc.es> team* >> > > > _______________________________________________ > scipion-users mailing lis...@li...https://lists.sourceforge.net/lists/listinfo/scipion-users > > -- > Pablo Conesa - *Madrid Scipion <http://scipion.i2pc.es> team* > _______________________________________________ > scipion-users mailing list > sci...@li... > https://lists.sourceforge.net/lists/listinfo/scipion-users > |
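Since the culprit turned out to be the handful of exported stacks that contain only a single particle image, they can be listed before import with a quick header check. The loop below is a rough sketch that assumes standard little-endian MRC files and uses a placeholder export path; it reads NZ (the number of images in a stack) straight from the MRC header, and a short Python script using the mrcfile package would be a more robust way to do the same thing.

    # Placeholder path: directory with the cryoSPARC-exported particle stacks.
    for f in /path/to/cryosparc/export/*.mrcs; do
        # NZ (number of sections, i.e. images in the stack) is the third
        # 4-byte little-endian integer of the MRC header (bytes 8-11).
        nz=$(od -An -t d4 -j 8 -N 4 "$f" | tr -d ' ')
        if [ "$nz" -eq 1 ]; then
            echo "single-image stack (exclude at import time): $f"
        fi
    done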
From: Pablo C. <pc...@cn...> - 2021-05-19 09:44:42
|
Hi, we've found the issue. Although import particles (files mode) seemed correct....it wasn't for only those mrcs files having a single image. 30 of them. Removing them at import time worked and now relion 2d classification works as expected. We'll fix it. Cheers! On 19/5/21 10:56, Grigory Sharov wrote: > Hi Dmitry, > > I'll try to reproduce the error when I get a chance > > On Wed, May 19, 2021, 09:41 Pablo Conesa <pc...@cn... > <mailto:pc...@cn...>> wrote: > > I see, maybe we can arrange a tele conf to see in detail what is > wrong. I'll contact you. > > On 19/5/21 10:26, Dmitry Semchonok wrote: >> Dear Pablo and Grigory, >> >> >> Thank you! >> >> Yes, I am well aware of the fact that there is no info for the >> CTF etc :) >> >> >> All I need is just a nice aligned 2D set of images (from the set >> I imported) >> >> (Ideally I would like to have this set just from cryosparc but I >> have no idea how to do that right away :) ) >> >> Please see the log of relion >> >> >> >> RUNNING PROTOCOL ----------------- >> 00002: Hostname: cryoem01 >> 00003: PID: 36440 >> 00004: pyworkflow: 3.0.13 >> 00005: plugin: relion >> 00006: plugin v: 3.1.2 >> 00007: currentDir: /data1/ScipionUserData/projects/Caro__helix >> 00008: workingDir: Runs/001735_ProtRelionClassify2D >> 00009: runMode: Continue >> 00010: MPI: 3 >> 00011: threads: 3 >> 00012: Starting at step: 1 >> 00013: Running steps >> 00014: STARTED: convertInputStep, step 1, time 2021-05-18 >> 15:06:21.557294 >> 00015: Converting set from >> 'Runs/001662_ProtImportParticles/particles.sqlite' into >> 'Runs/001735_ProtRelionClassify2D/input_particles.star' >> 00016: convertBinaryFiles: creating soft links. >> 00017: Root: Runs/001735_ProtRelionClassify2D/extra/input -> >> Runs/001662_ProtImportParticles/extra >> 00018: FINISHED: convertInputStep, step 1, time 2021-05-18 >> 15:06:22.474076 >> 00019: STARTED: runRelionStep, step 2, time 2021-05-18 >> 15:06:22.502665 >> 00020: mpirun -np 3 `which relion_refine_mpi` --i >> Runs/001735_ProtRelionClassify2D/input_particles.star >> --particle_diameter 690 --zero_mask --K 64 --norm --scale --o >> Runs/001735_ProtRelionClassify2D/extra/relion --oversampling 1 >> --flatten_solvent --tau2_fudge 2.0 --iter 25 --offset_range 5.0 >> --offset_step 2.0 --psi_step 10.0 --dont_combine_weights_via_disc >> --scratch_dir /data1/new_scratch/ --pool 3 --gpu --j 3 >> 00021: RELION version: 3.1.2 >> 00022: Precision: BASE=double, CUDA-ACC=single >> 00023: >> 00024: === RELION MPI setup === >> 00025: + Number of MPI processes = 3 >> 00026: + Number of threads per MPI process = 3 >> 00027: + Total number of threads therefore = 9 >> 00028: + Leader (0) runs on host = cryoem01 >> 00029: + Follower 1 runs on host = cryoem01 >> 00030: + Follower 2 runs on host = cryoem01 >> 00031: ================= >> 00032: uniqueHost cryoem01 has 2 ranks. >> 00033: GPU-ids not specified for this rank, threads will >> automatically be mapped to available devices. >> 00034: Thread 0 on follower 1 mapped to device 0 >> 00035: Thread 1 on follower 1 mapped to device 0 >> 00036: Thread 2 on follower 1 mapped to device 0 >> 00037: GPU-ids not specified for this rank, threads will >> automatically be mapped to available devices. >> 00038: Thread 0 on follower 2 mapped to device 1 >> 00039: Thread 1 on follower 2 mapped to device 1 >> 00040: Thread 2 on follower 2 mapped to device 1 >> 00041: Running CPU instructions in double precision. 
>> 00042: + WARNING: Changing psi sampling rate (before >> oversampling) to 5.625 degrees, for more efficient GPU calculations >> 00043: + On host cryoem01: free scratch space = 447.485 Gb. >> 00044: Copying particles to scratch directory: >> /data1/new_scratch/relion_volatile/ >> 00045: 000/??? sec ~~(,_,"> [oo] >> 00046: 0/ 0 sec ~~(,_,">in: >> /opt/Scipion3/software/em/relion-3.1.2/src/rwMRC.h, line 192 >> 00047: ERROR: >> 00048: readMRC: Image number 11 exceeds stack size 1 of image >> 000011@Runs/001735_ProtRelionClassify2D/extra/input/1024562735536827037_FoilHole_1618719_Data_1621438_1621440_20200703_085118_Fractions_patch_aligned_doseweighted_particles.mrcs >> <mailto:000011@Runs/001735_ProtRelionClassify2D/extra/input/1024562735536827037_FoilHole_1618719_Data_1621438_1621440_20200703_085118_Fractions_patch_aligned_doseweighted_particles.mrcs> >> 00049: === Backtrace === >> 00050: >> /opt/Scipion3/software/em/relion-3.1.2/bin/relion_refine_mpi(_ZN11RelionErrorC1ERKSsS1_l+0x41) >> [0x4786a1] >> 00051: >> /opt/Scipion3/software/em/relion-3.1.2/bin/relion_refine_mpi(_ZN5ImageIdE7readMRCElbRK8FileName+0x99f) >> [0x4b210f] >> 00052: >> /opt/Scipion3/software/em/relion-3.1.2/bin/relion_refine_mpi(_ZN5ImageIdE5_readERK8FileNameR13fImageHandlerblbb+0x17b) >> [0x4b407b] >> 00053: >> /opt/Scipion3/software/em/relion-3.1.2/bin/relion_refine_mpi(_ZN10Experiment22copyParticlesToScratchEibbd+0xda7) >> [0x5b8f87] >> 00054: >> /opt/Scipion3/software/em/relion-3.1.2/bin/relion_refine_mpi(_ZN14MlOptimiserMpi18initialiseWorkLoadEv+0x210) >> [0x498540] >> 00055: >> /opt/Scipion3/software/em/relion-3.1.2/bin/relion_refine_mpi(_ZN14MlOptimiserMpi10initialiseEv+0x9aa) >> [0x49ab2a] >> 00056: >> /opt/Scipion3/software/em/relion-3.1.2/bin/relion_refine_mpi(main+0x55) >> [0x4322a5] >> 00057: /lib64/libc.so.6(__libc_start_main+0xf5) [0x7fea6a54e555] >> 00058: >> /opt/Scipion3/software/em/relion-3.1.2/bin/relion_refine_mpi() >> [0x435fbf] >> 00059: ================== >> 00060: ERROR: >> 00061: readMRC: Image number 11 exceeds stack size 1 of image >> 000011@Runs/001735_ProtRelionClassify2D/extra/input/1024562735536827037_FoilHole_1618719_Data_1621438_1621440_20200703_085118_Fractions_patch_aligned_doseweighted_particles.mrcs >> <mailto:000011@Runs/001735_ProtRelionClassify2D/extra/input/1024562735536827037_FoilHole_1618719_Data_1621438_1621440_20200703_085118_Fractions_patch_aligned_doseweighted_particles.mrcs> >> 00062: application called MPI_Abort(MPI_COMM_WORLD, 1) - process 1 >> 00063: Traceback (most recent call last): >> 00064: File >> "/usr/local/miniconda/envs/scipion3/lib/python3.8/site-packages/pyworkflow/protocol/protocol.py", >> line 197, in run >> 00065: self._run() >> 00066: File >> "/usr/local/miniconda/envs/scipion3/lib/python3.8/site-packages/pyworkflow/protocol/protocol.py", >> line 248, in _run >> 00067: resultFiles = self._runFunc() >> 00068: File >> "/usr/local/miniconda/envs/scipion3/lib/python3.8/site-packages/pyworkflow/protocol/protocol.py", >> line 244, in _runFunc >> 00069: return self._func(*self._args) >> 00070: File >> "/usr/local/miniconda/envs/scipion3/lib/python3.8/site-packages/relion/protocols/protocol_base.py", >> line 811, in runRelionStep >> 00071: self.runJob(self._getProgram(), params) >> 00072: File >> "/usr/local/miniconda/envs/scipion3/lib/python3.8/site-packages/pyworkflow/protocol/protocol.py", >> line 1388, in runJob >> 00073: self._stepsExecutor.runJob(self._log, program, arguments, >> **kwargs) >> 00074: File >> 
"/usr/local/miniconda/envs/scipion3/lib/python3.8/site-packages/pyworkflow/protocol/executor.py", >> line 65, in runJob >> 00075: process.runJob(log, programName, params, >> 00076: File >> "/usr/local/miniconda/envs/scipion3/lib/python3.8/site-packages/pyworkflow/utils/process.py", >> line 52, in runJob >> 00077: return runCommand(command, env, cwd) >> 00078: File >> "/usr/local/miniconda/envs/scipion3/lib/python3.8/site-packages/pyworkflow/utils/process.py", >> line 67, in runCommand >> 00079: check_call(command, shell=True, stdout=sys.stdout, >> stderr=sys.stderr, >> 00080: File >> "/usr/local/miniconda/envs/scipion3/lib/python3.8/subprocess.py", >> line 364, in check_call >> 00081: raise CalledProcessError(retcode, cmd) >> 00082: subprocess.CalledProcessError: Command ' mpirun -np 3 >> `which relion_refine_mpi` --i >> Runs/001735_ProtRelionClassify2D/input_particles.star >> --particle_diameter 690 --zero_mask --K 64 --norm --scale --o >> Runs/001735_ProtRelionClassify2D/extra/relion --oversampling 1 >> --flatten_solvent --tau2_fudge 2.0 --iter 25 --offset_range 5.0 >> --offset_step 2.0 --psi_step 10.0 --dont_combine_weights_via_disc >> --scratch_dir /data1/new_scratch/ --pool 3 --gpu --j 3' returned >> non-zero exit status 1. >> 00083: Protocol failed: Command ' mpirun -np 3 `which >> relion_refine_mpi` --i >> Runs/001735_ProtRelionClassify2D/input_particles.star >> --particle_diameter 690 --zero_mask --K 64 --norm --scale --o >> Runs/001735_ProtRelionClassify2D/extra/relion --oversampling 1 >> --flatten_solvent --tau2_fudge 2.0 --iter 25 --offset_range 5.0 >> --offset_step 2.0 --psi_step 10.0 --dont_combine_weights_via_disc >> --scratch_dir /data1/new_scratch/ --pool 3 --gpu --j 3' returned >> non-zero exit status 1. >> 00084: FAILED: runRelionStep, step 2, time 2021-05-18 >> 15:06:24.609548 >> 00085: *** Last status is failed >> 00086: ------------------- PROTOCOL FAILED (DONE 2/3) >> >> >> *Additionally,* >> >> So it seemed from the first look that there is some issue with >> the image 11 — I delete the image 11 but the problem still remained. >> >> >> >> >> *Optionally,* >> >> I believe that xmipp-2D should work but I did not try it yet. >> >> Thank you >> >> Sincerely, >> Dmitry >> >> >> >> >> >> >> >> >> >>> On 19. May 2021, at 10:13, Pablo Conesa <pc...@cn... >>> <mailto:pc...@cn...>> wrote: >>> >>> Hi! So, I think Grigory is right, you've gone through the >>> import particles "without metadata info" therefore you only have >>> the images without any alignment information. >>> >>> >>> In theory, 2d classification should work with this kind of >>> import. Could you please share the logs of one of the relion >>> classification? >>> >>> On 19/5/21 9:26, Dmitry Semchonok wrote: >>>> >>>> Dear Grigory, >>>> >>>> >>>> I did nothing much, just tried to start relion 2D // or cryosparc. >>>> >>>> >>>> The only thing I tried additionally since I could no proceed is >>>> to a) change the box size; b) just resave the subset with the >>>> same number of images. >>>> >>>> >>>> Please see the image. >>>> >>>> >>>> <Screenshot 2021-05-19 at 09.23.41.png> >>>> >>>> >>>> Thank you >>>> >>>> >>>> Sincerely, >>>> >>>> Dmitry >>>> >>>> >>>> >>>>> On 18. May 2021, at 15:19, Grigory Sharov >>>>> <sha...@gm... <mailto:sha...@gm...>> >>>>> wrote: >>>>> >>>>> Hi, >>>>> >>>>> I imagine you have 624 micrographs, so particles are exported >>>>> to mrc on a mic basis. I see you used the "files" option to >>>>> import mrcs particles into Scipion. 
This means the imported >>>>> particles have no metadata except pixel size you provided. >>>>> >>>>> What did you do with them after import? >>>>> >>>>> Best regards, >>>>> Grigory >>>>> >>>>> -------------------------------------------------------------------------------- >>>>> Grigory Sharov, Ph.D. >>>>> >>>>> MRC Laboratory of Molecular Biology, >>>>> Francis Crick Avenue, >>>>> Cambridge Biomedical Campus, >>>>> Cambridge CB2 0QH, UK. >>>>> tel. +44 (0) 1223 267228 <tel:+44%201223%20267228> >>>>> e-mail: gs...@mr... >>>>> <mailto:gs...@mr...> >>>>> >>>>> >>>>> On Tue, May 18, 2021 at 2:12 PM Dmitry Semchonok >>>>> <Sem...@gm... <mailto:Sem...@gm...>> wrote: >>>>> >>>>> Dear Grigory, >>>>> >>>>> Yes I did that — the particles are looking fine. >>>>> >>>>> >>>>> I guess the issue still comes from the fact that >>>>> originally in cryosparc Export the stacks of particle were >>>>> placed into 624 mrc. But the number of particles is about >>>>> 44 818. So even after the renaming and the export I see >>>>> this in SCIPION export log >>>>> >>>>> <Screenshot 2021-05-18 at 15.09.11.png> >>>>> >>>>> What I guess may help is if I somehow combine all those >>>>> files in 1 mrcs first and then add import them to SCIPION. >>>>> Do you perhaps know how to do that? >>>>> >>>>> Thank you >>>>> >>>>> Sincerely, >>>>> Dmitry >>>>> >>>>> >>>>> >>>>>> On 18. May 2021, at 15:02, Grigory Sharov >>>>>> <sha...@gm... >>>>>> <mailto:sha...@gm...>> wrote: >>>>>> >>>>>> Hi Dmitry, >>>>>> >>>>>> as the error states your star file points to a >>>>>> non-existing image in the mrcs stack. You need to check >>>>>> first if you import from cryosparc with mrcs worked >>>>>> correctly (open / display particles) then trace all the >>>>>> steps you did before 2D classification. >>>>>> >>>>>> Best regards, >>>>>> Grigory >>>>>> >>>>>> -------------------------------------------------------------------------------- >>>>>> Grigory Sharov, Ph.D. >>>>>> >>>>>> MRC Laboratory of Molecular Biology, >>>>>> Francis Crick Avenue, >>>>>> Cambridge Biomedical Campus, >>>>>> Cambridge CB2 0QH, UK. >>>>>> tel. +44 (0) 1223 267228 <tel:+44%201223%20267228> >>>>>> e-mail: gs...@mr... >>>>>> <mailto:gs...@mr...> >>>>>> >>>>>> >>>>>> On Tue, May 18, 2021 at 1:25 PM Dmitry Semchonok >>>>>> <Sem...@gm... <mailto:Sem...@gm...>> wrote: >>>>>> >>>>>> Dear Pablo, >>>>>> >>>>>> >>>>>> Thank you. I heard about this option. For that I >>>>>> guess the https://pypi.org/project/cs2star/ >>>>>> <https://pypi.org/project/cs2star/> needs to be >>>>>> installed. >>>>>> >>>>>> >>>>>> >>>>>> >>>>>> In Cryosparc itself there is an option to export >>>>>> files. And then what we get is the mrc files with >>>>>> different number of particles in each. >>>>>> >>>>>> It appeared to be possible to rename mrc —> to >>>>>> —>mrcs. Then SCIPION can import those particles. >>>>>> >>>>>> Currently the problem is that not relion nor >>>>>> cryosparc can run these particles. 
>>>>>> >>>>>> >>>>>> >>>>>> >>>>>> Relion stops with error: >>>>>> >>>>>> >>>>>> 00001: RUNNING PROTOCOL ----------------- >>>>>> 00002: Hostname: cryoem01 >>>>>> 00003: PID: 46455 >>>>>> 00004: pyworkflow: 3.0.13 >>>>>> 00005: plugin: relion >>>>>> 00006: plugin v: 3.1.2 >>>>>> 00007: currentDir: >>>>>> /data1/ScipionUserData/projects/Caro__helix >>>>>> 00008: workingDir: Runs/001546_ProtRelionClassify2D >>>>>> 00009: runMode: Continue >>>>>> 00010: MPI: 3 >>>>>> 00011: threads: 3 >>>>>> 00012: Starting at step: 1 >>>>>> 00013: Running steps >>>>>> 00014: STARTED: convertInputStep, step 1, time >>>>>> 2021-05-18 12:33:26.198123 >>>>>> 00015: Converting set from >>>>>> 'Runs/001492_ProtUserSubSet/particles.sqlite' into >>>>>> 'Runs/001546_ProtRelionClassify2D/input_particles.star' >>>>>> 00016: convertBinaryFiles: creating soft links. >>>>>> 00017: Root: >>>>>> Runs/001546_ProtRelionClassify2D/extra/input -> >>>>>> Runs/001057_ProtImportParticles/extra >>>>>> 00018: FINISHED: convertInputStep, step 1, time >>>>>> 2021-05-18 12:33:27.117588 >>>>>> 00019: STARTED: runRelionStep, step 2, time >>>>>> 2021-05-18 12:33:27.145974 >>>>>> 00020: mpirun -np 3 `which relion_refine_mpi` --i >>>>>> Runs/001546_ProtRelionClassify2D/input_particles.star >>>>>> --particle_diameter 690 --zero_mask --K 64 --norm >>>>>> --scale --o >>>>>> Runs/001546_ProtRelionClassify2D/extra/relion >>>>>> --oversampling 1 --flatten_solvent --tau2_fudge 2.0 >>>>>> --iter 25 --offset_range 5.0 --offset_step 2.0 >>>>>> --psi_step 10.0 --dont_combine_weights_via_disc >>>>>> --scratch_dir /data1/new_scratch/ --pool 3 --gpu --j 3 >>>>>> 00021: RELION version: 3.1.2 >>>>>> 00022: Precision: BASE=double, CUDA-ACC=single >>>>>> 00023: >>>>>> 00024: === RELION MPI setup === >>>>>> 00025: + Number of MPI processes = 3 >>>>>> 00026: + Number of threads per MPI process = 3 >>>>>> 00027: + Total number of threads therefore = 9 >>>>>> 00028: + Leader (0) runs on host = cryoem01 >>>>>> 00029: + Follower 1 runs on host = cryoem01 >>>>>> 00030: + Follower 2 runs on host = cryoem01 >>>>>> 00031: ================= >>>>>> 00032: uniqueHost cryoem01 has 2 ranks. >>>>>> 00033: GPU-ids not specified for this rank, threads >>>>>> will automatically be mapped to available devices. >>>>>> 00034: Thread 0 on follower 1 mapped to device 0 >>>>>> 00035: Thread 1 on follower 1 mapped to device 0 >>>>>> 00036: Thread 2 on follower 1 mapped to device 0 >>>>>> 00037: GPU-ids not specified for this rank, threads >>>>>> will automatically be mapped to available devices. >>>>>> 00038: Thread 0 on follower 2 mapped to device 1 >>>>>> 00039: Thread 1 on follower 2 mapped to device 1 >>>>>> 00040: Thread 2 on follower 2 mapped to device 1 >>>>>> 00041: Running CPU instructions in double precision. >>>>>> 00042: + WARNING: Changing psi sampling rate >>>>>> (before oversampling) to 5.625 degrees, for more >>>>>> efficient GPU calculations >>>>>> 00043: + On host cryoem01: free scratch space = >>>>>> 448.252 Gb. >>>>>> 00044: Copying particles to scratch directory: >>>>>> /data1/new_scratch/relion_volatile/ >>>>>> 00045: 000/??? 
sec ~~(,_,"> [oo] >>>>>> 00046: 1/ 60 sec ~~(,_,">in: >>>>>> /opt/Scipion3/software/em/relion-3.1.2/src/rwMRC.h, >>>>>> line 192 >>>>>> 00047: ERROR: >>>>>> 00048: readMRC: Image number 11 exceeds stack size 1 >>>>>> of image >>>>>> 000011@Runs/001546_ProtRelionClassify2D/extra/input/1024562735536827037_FoilHole_1618719_Data_1621438_1621440_20200703_085118_Fractions_patch_aligned_doseweighted_particles.mrcs >>>>>> <mailto:000011@Runs/001546_ProtRelionClassify2D/extra/input/1024562735536827037_FoilHole_1618719_Data_1621438_1621440_20200703_085118_Fractions_patch_aligned_doseweighted_particles.mrcs> >>>>>> 00049: === Backtrace === >>>>>> 00050: >>>>>> /opt/Scipion3/software/em/relion-3.1.2/bin/relion_refine_mpi(_ZN11RelionErrorC1ERKSsS1_l+0x41) >>>>>> [0x4786a1] >>>>>> 00051: >>>>>> /opt/Scipion3/software/em/relion-3.1.2/bin/relion_refine_mpi(_ZN5ImageIdE7readMRCElbRK8FileName+0x99f) >>>>>> [0x4b210f] >>>>>> 00052: >>>>>> /opt/Scipion3/software/em/relion-3.1.2/bin/relion_refine_mpi(_ZN5ImageIdE5_readERK8FileNameR13fImageHandlerblbb+0x17b) >>>>>> [0x4b407b] >>>>>> 00053: >>>>>> /opt/Scipion3/software/em/relion-3.1.2/bin/relion_refine_mpi(_ZN10Experiment22copyParticlesToScratchEibbd+0xda7) >>>>>> [0x5b8f87] >>>>>> 00054: >>>>>> /opt/Scipion3/software/em/relion-3.1.2/bin/relion_refine_mpi(_ZN14MlOptimiserMpi18initialiseWorkLoadEv+0x210) >>>>>> [0x498540] >>>>>> 00055: >>>>>> /opt/Scipion3/software/em/relion-3.1.2/bin/relion_refine_mpi(_ZN14MlOptimiserMpi10initialiseEv+0x9aa) >>>>>> [0x49ab2a] >>>>>> 00056: >>>>>> /opt/Scipion3/software/em/relion-3.1.2/bin/relion_refine_mpi(main+0x55) >>>>>> [0x4322a5] >>>>>> 00057: /lib64/libc.so.6(__libc_start_main+0xf5) >>>>>> [0x7f657f51a555] >>>>>> 00058: >>>>>> /opt/Scipion3/software/em/relion-3.1.2/bin/relion_refine_mpi() >>>>>> [0x435fbf] >>>>>> 00059: ================== >>>>>> 00060: ERROR: >>>>>> 00061: readMRC: Image number 11 exceeds stack size 1 >>>>>> of image >>>>>> 000011@Runs/001546_ProtRelionClassify2D/extra/input/1024562735536827037_FoilHole_1618719_Data_1621438_1621440_20200703_085118_Fractions_patch_aligned_doseweighted_particles.mrcs >>>>>> <mailto:000011@Runs/001546_ProtRelionClassify2D/extra/input/1024562735536827037_FoilHole_1618719_Data_1621438_1621440_20200703_085118_Fractions_patch_aligned_doseweighted_particles.mrcs> >>>>>> 00062: application called MPI_Abort(MPI_COMM_WORLD, >>>>>> 1) - process 1 >>>>>> 00063: Traceback (most recent call last): >>>>>> 00064: File >>>>>> "/usr/local/miniconda/envs/scipion3/lib/python3.8/site-packages/pyworkflow/protocol/protocol.py", >>>>>> line 197, in run >>>>>> 00065: self._run() >>>>>> 00066: File >>>>>> "/usr/local/miniconda/envs/scipion3/lib/python3.8/site-packages/pyworkflow/protocol/protocol.py", >>>>>> line 248, in _run >>>>>> 00067: resultFiles = self._runFunc() >>>>>> 00068: File >>>>>> "/usr/local/miniconda/envs/scipion3/lib/python3.8/site-packages/pyworkflow/protocol/protocol.py", >>>>>> line 244, in _runFunc >>>>>> 00069: return self._func(*self._args) >>>>>> 00070: File >>>>>> "/usr/local/miniconda/envs/scipion3/lib/python3.8/site-packages/relion/protocols/protocol_base.py", >>>>>> line 811, in runRelionStep >>>>>> 00071: self.runJob(self._getProgram(), params) >>>>>> 00072: File >>>>>> "/usr/local/miniconda/envs/scipion3/lib/python3.8/site-packages/pyworkflow/protocol/protocol.py", >>>>>> line 1388, in runJob >>>>>> 00073: self._stepsExecutor.runJob(self._log, program, >>>>>> arguments, **kwargs) >>>>>> 00074: File >>>>>> 
"/usr/local/miniconda/envs/scipion3/lib/python3.8/site-packages/pyworkflow/protocol/executor.py", >>>>>> line 65, in runJob >>>>>> 00075: process.runJob(log, programName, params, >>>>>> 00076: File >>>>>> "/usr/local/miniconda/envs/scipion3/lib/python3.8/site-packages/pyworkflow/utils/process.py", >>>>>> line 52, in runJob >>>>>> 00077: return runCommand(command, env, cwd) >>>>>> 00078: File >>>>>> "/usr/local/miniconda/envs/scipion3/lib/python3.8/site-packages/pyworkflow/utils/process.py", >>>>>> line 67, in runCommand >>>>>> 00079: check_call(command, shell=True, >>>>>> stdout=sys.stdout, stderr=sys.stderr, >>>>>> 00080: File >>>>>> "/usr/local/miniconda/envs/scipion3/lib/python3.8/subprocess.py", >>>>>> line 364, in check_call >>>>>> 00081: raise CalledProcessError(retcode, cmd) >>>>>> 00082: subprocess.CalledProcessError: Command ' >>>>>> mpirun -np 3 `which relion_refine_mpi` --i >>>>>> Runs/001546_ProtRelionClassify2D/input_particles.star >>>>>> --particle_diameter 690 --zero_mask --K 64 --norm >>>>>> --scale --o >>>>>> Runs/001546_ProtRelionClassify2D/extra/relion >>>>>> --oversampling 1 --flatten_solvent --tau2_fudge 2.0 >>>>>> --iter 25 --offset_range 5.0 --offset_step 2.0 >>>>>> --psi_step 10.0 --dont_combine_weights_via_disc >>>>>> --scratch_dir /data1/new_scratch/ --pool 3 --gpu --j >>>>>> 3' returned non-zero exit status 1. >>>>>> 00083: Protocol failed: Command ' mpirun -np 3 >>>>>> `which relion_refine_mpi` --i >>>>>> Runs/001546_ProtRelionClassify2D/input_particles.star >>>>>> --particle_diameter 690 --zero_mask --K 64 --norm >>>>>> --scale --o >>>>>> Runs/001546_ProtRelionClassify2D/extra/relion >>>>>> --oversampling 1 --flatten_solvent --tau2_fudge 2.0 >>>>>> --iter 25 --offset_range 5.0 --offset_step 2.0 >>>>>> --psi_step 10.0 --dont_combine_weights_via_disc >>>>>> --scratch_dir /data1/new_scratch/ --pool 3 --gpu --j >>>>>> 3' returned non-zero exit status 1. >>>>>> 00084: FAILED: runRelionStep, step 2, time 2021-05-18 >>>>>> 12:33:29.230213 >>>>>> 00085: *** Last status is failed >>>>>> 00086: ------------------- PROTOCOL FAILED (DONE 2/3) >>>>>> >>>>>> >>>>>> Cryosparc (in SCIPION) requires CTF to run. >>>>>> >>>>>> >>>>>> >>>>>> Thais is where I am now. >>>>>> >>>>>> Perhaps there is a solution? >>>>>> >>>>>> >>>>>> Sincerely, >>>>>> Dmitry >>>>>> >>>>>> >>>>>> >>>>>>> On 18. May 2021, at 14:16, Pablo Conesa >>>>>>> <pc...@cn... <mailto:pc...@cn...>> >>>>>>> wrote: >>>>>>> >>>>>>> Dear Dmitry, the import of CS metadata files (*.cs) >>>>>>> is not supported in Scipion. Does CS has an option >>>>>>> to export to star files. It rings a bell. >>>>>>> >>>>>>> On 18/5/21 9:53, Dmitry Semchonok wrote: >>>>>>>> >>>>>>>> Dear Grigory, >>>>>>>> >>>>>>>> >>>>>>>> The files are in mrc format. >>>>>>>> >>>>>>>> >>>>>>>> Please, let me try to explain more plan: >>>>>>>> >>>>>>>> I have a project in cryosparc. There I have >>>>>>>> cryosparc selected 2D classes. I want to export the >>>>>>>> particles of those classes into SCIPION. >>>>>>>> >>>>>>>> So I I pressed Export (fig 1) and the >>>>>>>> program(cryosparc) created the folder with mrc + >>>>>>>> other files (fig 2;3). I looked into J48 and found >>>>>>>> many *.mrc files of the particles. But it is not 1 >>>>>>>> mrc = 1 particle. 
It seems to be a mrc stuck - so I >>>>>>>> have several files inside 1 *.mrc (fig 4) (you can >>>>>>>> also notice that they all have different sizes) >>>>>>>> >>>>>>>> So I need to export them somehow in SCIPION >>>>>>>> >>>>>>>> For that, I used the SCIPION export - images >>>>>>>> protocol where for the files to add I put *.mrc. >>>>>>>> But the protocol seems to be added only 1 mrc as 1 >>>>>>>> picture and instead of having 46392 particles I >>>>>>>> have ~600 particles. >>>>>>>> >>>>>>>> (Also the geometry seems not preserved). >>>>>>>> >>>>>>>> >>>>>>>> So my question how to export the particles from >>>>>>>> cryosparc into SCIPION correctly? >>>>>>>> >>>>>>>> >>>>>>>> Thank you! >>>>>>>> >>>>>>>> >>>>>>>> >>>>>>>> https://disk.yandex.com/d/Fv3Q1lpwEzSisg >>>>>>>> <https://disk.yandex.com/d/Fv3Q1lpwEzSisg> >>>>>>>> >>>>>>>> >>>>>>>> Sincerely, >>>>>>>> >>>>>>>> Dmitry >>>>>>>> >>>>>>>> >>>>>>>> >>>>>>>> >>>>>>>>> On 17. May 2021, at 18:12, Grigory Sharov >>>>>>>>> <sha...@gm... >>>>>>>>> <mailto:sha...@gm...>> wrote: >>>>>>>>> >>>>>>>>> Hi Dmitry, >>>>>>>>> >>>>>>>>> mrc stacks should have "mrcs" extension. Is this >>>>>>>>> the problem you are getting? >>>>>>>>> >>>>>>>>> Best regards, >>>>>>>>> Grigory >>>>>>>>> >>>>>>>>> -------------------------------------------------------------------------------- >>>>>>>>> Grigory Sharov, Ph.D. >>>>>>>>> >>>>>>>>> MRC Laboratory of Molecular Biology, >>>>>>>>> Francis Crick Avenue, >>>>>>>>> Cambridge Biomedical Campus, >>>>>>>>> Cambridge CB2 0QH, UK. >>>>>>>>> tel. +44 (0) 1223 267228 <tel:+44%201223%20267228> >>>>>>>>> e-mail: gs...@mr... >>>>>>>>> <mailto:gs...@mr...> >>>>>>>>> >>>>>>>>> >>>>>>>>> On Mon, May 17, 2021 at 4:49 PM Dmitry Semchonok >>>>>>>>> <Sem...@gm... <mailto:Sem...@gm...>> >>>>>>>>> wrote: >>>>>>>>> >>>>>>>>> Dear colleagues, >>>>>>>>> >>>>>>>>> I would like to export the particles from >>>>>>>>> cryosparc to SCIPION. >>>>>>>>> >>>>>>>>> How to do that? >>>>>>>>> >>>>>>>>> >>>>>>>>> >>>>>>>>> >>>>>>>>> >>>>>>>>> What I tried: >>>>>>>>> >>>>>>>>> >>>>>>>>> 1.In cryosparc I pressed Export – to export >>>>>>>>> the particles I am interested in. >>>>>>>>> >>>>>>>>> 2.In the folder Export – I found many mrc >>>>>>>>> stacks with particles in each. >>>>>>>>> >>>>>>>>> 3.I tried to export them to SCIPION using >>>>>>>>> Export particles but instead of reading each >>>>>>>>> stack and combine them in the 1 dataset I >>>>>>>>> received 1 particle / per each mrc stack. >>>>>>>>> >>>>>>>>> >>>>>>>>> Any ideas? >>>>>>>>> >>>>>>>>> >>>>>>>>> Thank you >>>>>>>>> >>>>>>>>> Sincerely, >>>>>>>>> >>>>>>>>> Dmitry >>>>>>>>> >>>>>>>>> >>>>>>>>> _______________________________________________ >>>>>>>>> scipion-users mailing list >>>>>>>>> sci...@li... >>>>>>>>> <mailto:sci...@li...> >>>>>>>>> https://lists.sourceforge.net/lists/listinfo/scipion-users >>>>>>>>> <https://lists.sourceforge.net/lists/listinfo/scipion-users> >>>>>>>>> >>>>>>>>> _______________________________________________ >>>>>>>>> scipion-users mailing list >>>>>>>>> sci...@li... >>>>>>>>> <mailto:sci...@li...> >>>>>>>>> https://lists.sourceforge.net/lists/listinfo/scipion-users >>>>>>>>> <https://lists.sourceforge.net/lists/listinfo/scipion-users> >>>>>>>> >>>>>>>> >>>>>>>> >>>>>>>> _______________________________________________ >>>>>>>> scipion-users mailing list >>>>>>>> sci...@li... 
-- Pablo Conesa - *Madrid Scipion <http://scipion.i2pc.es> team* |
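A quick way to see why RELION reports "exceeds stack size 1" is to look at what the MRC headers of the exported files actually claim. The short Python sketch below is not part of the thread; it assumes the mrcfile package is installed and that the exported stacks sit in one folder (the path is a placeholder). It only prints the two header fields that matter here: nz (number of images in the stack) and ispg (0 for images or image stacks, 1 for volumes).

import glob
import mrcfile

# Placeholder path: point this at the folder holding the exported particle stacks.
for path in sorted(glob.glob("cryosparc_export/*.mrcs")):
    with mrcfile.open(path, permissive=True, header_only=True) as mrc:
        nz = int(mrc.header.nz)      # how many images the header says the file contains
        ispg = int(mrc.header.ispg)  # 0 = image(s) / image stack, 1 = volume
        print(f"{path}  nz={nz}  ispg={ispg}")

If a file that the .star file addresses as 000011@... reports nz=1 here, the problem lies in the file headers, the extension, or the import step rather than in RELION itself.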
From: Grigory S. <sha...@gm...> - 2021-05-19 08:57:08
|
Hi Dmitry, I'll try to reproduce the error when I get a chance On Wed, May 19, 2021, 09:41 Pablo Conesa <pc...@cn...> wrote: > I see, maybe we can arrange a tele conf to see in detail what is wrong. > I'll contact you. > On 19/5/21 10:26, Dmitry Semchonok wrote: > > Dear Pablo and Grigory, > > > Thank you! > > Yes, I am well aware of the fact that there is no info for the CTF etc :) > > > All I need is just a nice aligned 2D set of images (from the set I > imported) > > (Ideally I would like to have this set just from cryosparc but I have no > idea how to do that right away :) ) > > Please see the log of relion > > > > RUNNING PROTOCOL ----------------- > 00002: Hostname: cryoem01 > 00003: PID: 36440 > 00004: pyworkflow: 3.0.13 > 00005: plugin: relion > 00006: plugin v: 3.1.2 > 00007: currentDir: /data1/ScipionUserData/projects/Caro__helix > 00008: workingDir: Runs/001735_ProtRelionClassify2D > 00009: runMode: Continue > 00010: MPI: 3 > 00011: threads: 3 > 00012: Starting at step: 1 > 00013: Running steps > 00014: STARTED: convertInputStep, step 1, time 2021-05-18 15:06:21.557294 > 00015: Converting set from > 'Runs/001662_ProtImportParticles/particles.sqlite' into > 'Runs/001735_ProtRelionClassify2D/input_particles.star' > 00016: convertBinaryFiles: creating soft links. > 00017: Root: Runs/001735_ProtRelionClassify2D/extra/input -> > Runs/001662_ProtImportParticles/extra > 00018: FINISHED: convertInputStep, step 1, time 2021-05-18 > 15:06:22.474076 > 00019: STARTED: runRelionStep, step 2, time 2021-05-18 15:06:22.502665 > 00020: mpirun -np 3 `which relion_refine_mpi` --i > Runs/001735_ProtRelionClassify2D/input_particles.star --particle_diameter > 690 --zero_mask --K 64 --norm --scale --o > Runs/001735_ProtRelionClassify2D/extra/relion --oversampling 1 > --flatten_solvent --tau2_fudge 2.0 --iter 25 --offset_range 5.0 > --offset_step 2.0 --psi_step 10.0 --dont_combine_weights_via_disc > --scratch_dir /data1/new_scratch/ --pool 3 --gpu --j 3 > 00021: RELION version: 3.1.2 > 00022: Precision: BASE=double, CUDA-ACC=single > 00023: > 00024: === RELION MPI setup === > 00025: + Number of MPI processes = 3 > 00026: + Number of threads per MPI process = 3 > 00027: + Total number of threads therefore = 9 > 00028: + Leader (0) runs on host = cryoem01 > 00029: + Follower 1 runs on host = cryoem01 > 00030: + Follower 2 runs on host = cryoem01 > 00031: ================= > 00032: uniqueHost cryoem01 has 2 ranks. > 00033: GPU-ids not specified for this rank, threads will automatically > be mapped to available devices. > 00034: Thread 0 on follower 1 mapped to device 0 > 00035: Thread 1 on follower 1 mapped to device 0 > 00036: Thread 2 on follower 1 mapped to device 0 > 00037: GPU-ids not specified for this rank, threads will automatically > be mapped to available devices. > 00038: Thread 0 on follower 2 mapped to device 1 > 00039: Thread 1 on follower 2 mapped to device 1 > 00040: Thread 2 on follower 2 mapped to device 1 > 00041: Running CPU instructions in double precision. > 00042: + WARNING: Changing psi sampling rate (before oversampling) to > 5.625 degrees, for more efficient GPU calculations > 00043: + On host cryoem01: free scratch space = 447.485 Gb. > 00044: Copying particles to scratch directory: > /data1/new_scratch/relion_volatile/ > 00045: 000/??? 
sec ~~(,_,"> > [oo] > 00046: 0/ 0 sec ~~(,_,">in: > /opt/Scipion3/software/em/relion-3.1.2/src/rwMRC.h, line 192 > 00047: ERROR: > 00048: readMRC: Image number 11 exceeds stack size 1 of image > 000011@Runs/001735_ProtRelionClassify2D/extra/input/1024562735536827037_FoilHole_1618719_Data_1621438_1621440_20200703_085118_Fractions_patch_aligned_doseweighted_particles.mrcs > 00049: === Backtrace === > 00050: > /opt/Scipion3/software/em/relion-3.1.2/bin/relion_refine_mpi(_ZN11RelionErrorC1ERKSsS1_l+0x41) > [0x4786a1] > 00051: > /opt/Scipion3/software/em/relion-3.1.2/bin/relion_refine_mpi(_ZN5ImageIdE7readMRCElbRK8FileName+0x99f) > [0x4b210f] > 00052: > /opt/Scipion3/software/em/relion-3.1.2/bin/relion_refine_mpi(_ZN5ImageIdE5_readERK8FileNameR13fImageHandlerblbb+0x17b) > [0x4b407b] > 00053: > /opt/Scipion3/software/em/relion-3.1.2/bin/relion_refine_mpi(_ZN10Experiment22copyParticlesToScratchEibbd+0xda7) > [0x5b8f87] > 00054: > /opt/Scipion3/software/em/relion-3.1.2/bin/relion_refine_mpi(_ZN14MlOptimiserMpi18initialiseWorkLoadEv+0x210) > [0x498540] > 00055: > /opt/Scipion3/software/em/relion-3.1.2/bin/relion_refine_mpi(_ZN14MlOptimiserMpi10initialiseEv+0x9aa) > [0x49ab2a] > 00056: > /opt/Scipion3/software/em/relion-3.1.2/bin/relion_refine_mpi(main+0x55) > [0x4322a5] > 00057: /lib64/libc.so.6(__libc_start_main+0xf5) [0x7fea6a54e555] > 00058: /opt/Scipion3/software/em/relion-3.1.2/bin/relion_refine_mpi() > [0x435fbf] > 00059: ================== > 00060: ERROR: > 00061: readMRC: Image number 11 exceeds stack size 1 of image > 000011@Runs/001735_ProtRelionClassify2D/extra/input/1024562735536827037_FoilHole_1618719_Data_1621438_1621440_20200703_085118_Fractions_patch_aligned_doseweighted_particles.mrcs > 00062: application called MPI_Abort(MPI_COMM_WORLD, 1) - process 1 > 00063: Traceback (most recent call last): > 00064: File > "/usr/local/miniconda/envs/scipion3/lib/python3.8/site-packages/pyworkflow/protocol/protocol.py", > line 197, in run > 00065: self._run() > 00066: File > "/usr/local/miniconda/envs/scipion3/lib/python3.8/site-packages/pyworkflow/protocol/protocol.py", > line 248, in _run > 00067: resultFiles = self._runFunc() > 00068: File > "/usr/local/miniconda/envs/scipion3/lib/python3.8/site-packages/pyworkflow/protocol/protocol.py", > line 244, in _runFunc > 00069: return self._func(*self._args) > 00070: File > "/usr/local/miniconda/envs/scipion3/lib/python3.8/site-packages/relion/protocols/protocol_base.py", > line 811, in runRelionStep > 00071: self.runJob(self._getProgram(), params) > 00072: File > "/usr/local/miniconda/envs/scipion3/lib/python3.8/site-packages/pyworkflow/protocol/protocol.py", > line 1388, in runJob > 00073: self._stepsExecutor.runJob(self._log, program, arguments, > **kwargs) > 00074: File > "/usr/local/miniconda/envs/scipion3/lib/python3.8/site-packages/pyworkflow/protocol/executor.py", > line 65, in runJob > 00075: process.runJob(log, programName, params, > 00076: File > "/usr/local/miniconda/envs/scipion3/lib/python3.8/site-packages/pyworkflow/utils/process.py", > line 52, in runJob > 00077: return runCommand(command, env, cwd) > 00078: File > "/usr/local/miniconda/envs/scipion3/lib/python3.8/site-packages/pyworkflow/utils/process.py", > line 67, in runCommand > 00079: check_call(command, shell=True, stdout=sys.stdout, > stderr=sys.stderr, > 00080: File > "/usr/local/miniconda/envs/scipion3/lib/python3.8/subprocess.py", line 364, > in check_call > 00081: raise CalledProcessError(retcode, cmd) > 00082: subprocess.CalledProcessError: Command ' mpirun -np 3 `which 
> relion_refine_mpi` --i > Runs/001735_ProtRelionClassify2D/input_particles.star --particle_diameter > 690 --zero_mask --K 64 --norm --scale --o > Runs/001735_ProtRelionClassify2D/extra/relion --oversampling 1 > --flatten_solvent --tau2_fudge 2.0 --iter 25 --offset_range 5.0 > --offset_step 2.0 --psi_step 10.0 --dont_combine_weights_via_disc > --scratch_dir /data1/new_scratch/ --pool 3 --gpu --j 3' returned > non-zero exit status 1. > 00083: Protocol failed: Command ' mpirun -np 3 `which > relion_refine_mpi` --i > Runs/001735_ProtRelionClassify2D/input_particles.star --particle_diameter > 690 --zero_mask --K 64 --norm --scale --o > Runs/001735_ProtRelionClassify2D/extra/relion --oversampling 1 > --flatten_solvent --tau2_fudge 2.0 --iter 25 --offset_range 5.0 > --offset_step 2.0 --psi_step 10.0 --dont_combine_weights_via_disc > --scratch_dir /data1/new_scratch/ --pool 3 --gpu --j 3' returned > non-zero exit status 1. > 00084: FAILED: runRelionStep, step 2, time 2021-05-18 15:06:24.609548 > 00085: *** Last status is failed > 00086: ------------------- PROTOCOL FAILED (DONE 2/3) > > > *Additionally,* > > So it seemed from the first look that there is some issue with the image > 11 — I delete the image 11 but the problem still remained. > > > > > *Optionally,* > > I believe that xmipp-2D should work but I did not try it yet. > > Thank you > > Sincerely, > Dmitry > > > > > > > > > > On 19. May 2021, at 10:13, Pablo Conesa <pc...@cn...> wrote: > > Hi! So, I think Grigory is right, you've gone through the import > particles "without metadata info" therefore you only have the images > without any alignment information. > > > In theory, 2d classification should work with this kind of import. Could > you please share the logs of one of the relion classification? > On 19/5/21 9:26, Dmitry Semchonok wrote: > > Dear Grigory, > > > I did nothing much, just tried to start relion 2D // or cryosparc. > > > The only thing I tried additionally since I could no proceed is to a) > change the box size; b) just resave the subset with the same number of > images. > > > Please see the image. > > <Screenshot 2021-05-19 at 09.23.41.png> > > > Thank you > > Sincerely, > > Dmitry > > > On 18. May 2021, at 15:19, Grigory Sharov <sha...@gm...> > wrote: > > Hi, > > I imagine you have 624 micrographs, so particles are exported to mrc on a > mic basis. I see you used the "files" option to import mrcs particles into > Scipion. This means the imported particles have no metadata except pixel > size you provided. > > What did you do with them after import? > > Best regards, > Grigory > > > -------------------------------------------------------------------------------- > Grigory Sharov, Ph.D. > > MRC Laboratory of Molecular Biology, > Francis Crick Avenue, > Cambridge Biomedical Campus, > Cambridge CB2 0QH, UK. > tel. +44 (0) 1223 267228 <+44%201223%20267228> > e-mail: gs...@mr... > > > On Tue, May 18, 2021 at 2:12 PM Dmitry Semchonok <Sem...@gm...> > wrote: > >> Dear Grigory, >> >> Yes I did that — the particles are looking fine. >> >> >> I guess the issue still comes from the fact that originally in cryosparc >> Export the stacks of particle were placed into 624 mrc. But the number of >> particles is about 44 818. So even after the renaming and the export I see >> this in SCIPION export log >> >> <Screenshot 2021-05-18 at 15.09.11.png> >> >> What I guess may help is if I somehow combine all those files in 1 mrcs >> first and then add import them to SCIPION. >> Do you perhaps know how to do that? 
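The "combine all those files in 1 mrcs" question is never answered directly in the thread. A minimal sketch of one way to do it is below; it assumes the mrcfile and numpy packages, that all stacks share the same box size, and that the folder pattern and pixel size are placeholders to be replaced. It only merges the images and cannot recover any per-particle metadata (coordinates, CTF, alignments).

import glob
import mrcfile
import numpy as np

# Placeholder pattern for the exported per-micrograph stacks.
stack_paths = sorted(glob.glob("J48/*_particles.mrcs"))
chunks = []
for path in stack_paths:
    with mrcfile.open(path, permissive=True) as mrc:
        data = np.asarray(mrc.data, dtype=np.float32)
        if data.ndim == 2:                 # a file holding a single particle image
            data = data[np.newaxis, :, :]  # promote it to a 1-image stack
        chunks.append(data)

combined = np.concatenate(chunks, axis=0)
with mrcfile.new("all_particles.mrcs", overwrite=True) as out:
    out.set_data(combined)
    out.set_image_stack()   # flag the output as an image stack, not a 3D volume
    out.voxel_size = 1.07   # placeholder pixel size in Angstrom; use the real value
print(f"wrote {combined.shape[0]} particles from {len(stack_paths)} files")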
>> >> Thank you >> >> Sincerely, >> Dmitry >> >> >> >> On 18. May 2021, at 15:02, Grigory Sharov <sha...@gm...> >> wrote: >> >> Hi Dmitry, >> >> as the error states your star file points to a non-existing image in the >> mrcs stack. You need to check first if you import from cryosparc with mrcs >> worked correctly (open / display particles) then trace all the steps you >> did before 2D classification. >> >> Best regards, >> Grigory >> >> >> -------------------------------------------------------------------------------- >> Grigory Sharov, Ph.D. >> >> MRC Laboratory of Molecular Biology, >> Francis Crick Avenue, >> Cambridge Biomedical Campus, >> Cambridge CB2 0QH, UK. >> tel. +44 (0) 1223 267228 <+44%201223%20267228> >> e-mail: gs...@mr... >> >> >> On Tue, May 18, 2021 at 1:25 PM Dmitry Semchonok <Sem...@gm...> >> wrote: >> >>> Dear Pablo, >>> >>> >>> Thank you. I heard about this option. For that I guess the >>> https://pypi.org/project/cs2star/ needs to be installed. >>> >>> >>> >>> >>> In Cryosparc itself there is an option to export files. And then what we >>> get is the mrc files with different number of particles in each. >>> >>> It appeared to be possible to rename mrc —> to —>mrcs. Then SCIPION can >>> import those particles. >>> >>> Currently the problem is that not relion nor cryosparc can run these >>> particles. >>> >>> >>> >>> >>> Relion stops with error: >>> >>> >>> 00001: RUNNING PROTOCOL ----------------- >>> 00002: Hostname: cryoem01 >>> 00003: PID: 46455 >>> 00004: pyworkflow: 3.0.13 >>> 00005: plugin: relion >>> 00006: plugin v: 3.1.2 >>> 00007: currentDir: /data1/ScipionUserData/projects/Caro__helix >>> 00008: workingDir: Runs/001546_ProtRelionClassify2D >>> 00009: runMode: Continue >>> 00010: MPI: 3 >>> 00011: threads: 3 >>> 00012: Starting at step: 1 >>> 00013: Running steps >>> 00014: STARTED: convertInputStep, step 1, time 2021-05-18 >>> 12:33:26.198123 >>> 00015: Converting set from >>> 'Runs/001492_ProtUserSubSet/particles.sqlite' into >>> 'Runs/001546_ProtRelionClassify2D/input_particles.star' >>> 00016: convertBinaryFiles: creating soft links. >>> 00017: Root: Runs/001546_ProtRelionClassify2D/extra/input -> >>> Runs/001057_ProtImportParticles/extra >>> 00018: FINISHED: convertInputStep, step 1, time 2021-05-18 >>> 12:33:27.117588 >>> 00019: STARTED: runRelionStep, step 2, time 2021-05-18 12:33:27.145974 >>> 00020: mpirun -np 3 `which relion_refine_mpi` --i >>> Runs/001546_ProtRelionClassify2D/input_particles.star --particle_diameter >>> 690 --zero_mask --K 64 --norm --scale --o >>> Runs/001546_ProtRelionClassify2D/extra/relion --oversampling 1 >>> --flatten_solvent --tau2_fudge 2.0 --iter 25 --offset_range 5.0 >>> --offset_step 2.0 --psi_step 10.0 --dont_combine_weights_via_disc >>> --scratch_dir /data1/new_scratch/ --pool 3 --gpu --j 3 >>> 00021: RELION version: 3.1.2 >>> 00022: Precision: BASE=double, CUDA-ACC=single >>> 00023: >>> 00024: === RELION MPI setup === >>> 00025: + Number of MPI processes = 3 >>> 00026: + Number of threads per MPI process = 3 >>> 00027: + Total number of threads therefore = 9 >>> 00028: + Leader (0) runs on host = cryoem01 >>> 00029: + Follower 1 runs on host = cryoem01 >>> 00030: + Follower 2 runs on host = cryoem01 >>> 00031: ================= >>> 00032: uniqueHost cryoem01 has 2 ranks. >>> 00033: GPU-ids not specified for this rank, threads will automatically >>> be mapped to available devices. 
>>> 00034: Thread 0 on follower 1 mapped to device 0 >>> 00035: Thread 1 on follower 1 mapped to device 0 >>> 00036: Thread 2 on follower 1 mapped to device 0 >>> 00037: GPU-ids not specified for this rank, threads will automatically >>> be mapped to available devices. >>> 00038: Thread 0 on follower 2 mapped to device 1 >>> 00039: Thread 1 on follower 2 mapped to device 1 >>> 00040: Thread 2 on follower 2 mapped to device 1 >>> 00041: Running CPU instructions in double precision. >>> 00042: + WARNING: Changing psi sampling rate (before oversampling) to >>> 5.625 degrees, for more efficient GPU calculations >>> 00043: + On host cryoem01: free scratch space = 448.252 Gb. >>> 00044: Copying particles to scratch directory: >>> /data1/new_scratch/relion_volatile/ >>> 00045: 000/??? sec ~~(,_,"> >>> [oo] >>> 00046: 1/ 60 sec ~~(,_,">in: >>> /opt/Scipion3/software/em/relion-3.1.2/src/rwMRC.h, line 192 >>> 00047: ERROR: >>> 00048: readMRC: Image number 11 exceeds stack size 1 of image >>> 000011@Runs/001546_ProtRelionClassify2D/extra/input/1024562735536827037_FoilHole_1618719_Data_1621438_1621440_20200703_085118_Fractions_patch_aligned_doseweighted_particles.mrcs >>> 00049: === Backtrace === >>> 00050: >>> /opt/Scipion3/software/em/relion-3.1.2/bin/relion_refine_mpi(_ZN11RelionErrorC1ERKSsS1_l+0x41) >>> [0x4786a1] >>> 00051: >>> /opt/Scipion3/software/em/relion-3.1.2/bin/relion_refine_mpi(_ZN5ImageIdE7readMRCElbRK8FileName+0x99f) >>> [0x4b210f] >>> 00052: >>> /opt/Scipion3/software/em/relion-3.1.2/bin/relion_refine_mpi(_ZN5ImageIdE5_readERK8FileNameR13fImageHandlerblbb+0x17b) >>> [0x4b407b] >>> 00053: >>> /opt/Scipion3/software/em/relion-3.1.2/bin/relion_refine_mpi(_ZN10Experiment22copyParticlesToScratchEibbd+0xda7) >>> [0x5b8f87] >>> 00054: >>> /opt/Scipion3/software/em/relion-3.1.2/bin/relion_refine_mpi(_ZN14MlOptimiserMpi18initialiseWorkLoadEv+0x210) >>> [0x498540] >>> 00055: >>> /opt/Scipion3/software/em/relion-3.1.2/bin/relion_refine_mpi(_ZN14MlOptimiserMpi10initialiseEv+0x9aa) >>> [0x49ab2a] >>> 00056: >>> /opt/Scipion3/software/em/relion-3.1.2/bin/relion_refine_mpi(main+0x55) >>> [0x4322a5] >>> 00057: /lib64/libc.so.6(__libc_start_main+0xf5) [0x7f657f51a555] >>> 00058: /opt/Scipion3/software/em/relion-3.1.2/bin/relion_refine_mpi() >>> [0x435fbf] >>> 00059: ================== >>> 00060: ERROR: >>> 00061: readMRC: Image number 11 exceeds stack size 1 of image >>> 000011@Runs/001546_ProtRelionClassify2D/extra/input/1024562735536827037_FoilHole_1618719_Data_1621438_1621440_20200703_085118_Fractions_patch_aligned_doseweighted_particles.mrcs >>> 00062: application called MPI_Abort(MPI_COMM_WORLD, 1) - process 1 >>> 00063: Traceback (most recent call last): >>> 00064: File >>> "/usr/local/miniconda/envs/scipion3/lib/python3.8/site-packages/pyworkflow/protocol/protocol.py", >>> line 197, in run >>> 00065: self._run() >>> 00066: File >>> "/usr/local/miniconda/envs/scipion3/lib/python3.8/site-packages/pyworkflow/protocol/protocol.py", >>> line 248, in _run >>> 00067: resultFiles = self._runFunc() >>> 00068: File >>> "/usr/local/miniconda/envs/scipion3/lib/python3.8/site-packages/pyworkflow/protocol/protocol.py", >>> line 244, in _runFunc >>> 00069: return self._func(*self._args) >>> 00070: File >>> "/usr/local/miniconda/envs/scipion3/lib/python3.8/site-packages/relion/protocols/protocol_base.py", >>> line 811, in runRelionStep >>> 00071: self.runJob(self._getProgram(), params) >>> 00072: File >>> "/usr/local/miniconda/envs/scipion3/lib/python3.8/site-packages/pyworkflow/protocol/protocol.py", >>> 
line 1388, in runJob >>> 00073: self._stepsExecutor.runJob(self._log, program, arguments, >>> **kwargs) >>> 00074: File >>> "/usr/local/miniconda/envs/scipion3/lib/python3.8/site-packages/pyworkflow/protocol/executor.py", >>> line 65, in runJob >>> 00075: process.runJob(log, programName, params, >>> 00076: File >>> "/usr/local/miniconda/envs/scipion3/lib/python3.8/site-packages/pyworkflow/utils/process.py", >>> line 52, in runJob >>> 00077: return runCommand(command, env, cwd) >>> 00078: File >>> "/usr/local/miniconda/envs/scipion3/lib/python3.8/site-packages/pyworkflow/utils/process.py", >>> line 67, in runCommand >>> 00079: check_call(command, shell=True, stdout=sys.stdout, >>> stderr=sys.stderr, >>> 00080: File >>> "/usr/local/miniconda/envs/scipion3/lib/python3.8/subprocess.py", line 364, >>> in check_call >>> 00081: raise CalledProcessError(retcode, cmd) >>> 00082: subprocess.CalledProcessError: Command ' mpirun -np 3 `which >>> relion_refine_mpi` --i >>> Runs/001546_ProtRelionClassify2D/input_particles.star --particle_diameter >>> 690 --zero_mask --K 64 --norm --scale --o >>> Runs/001546_ProtRelionClassify2D/extra/relion --oversampling 1 >>> --flatten_solvent --tau2_fudge 2.0 --iter 25 --offset_range 5.0 >>> --offset_step 2.0 --psi_step 10.0 --dont_combine_weights_via_disc >>> --scratch_dir /data1/new_scratch/ --pool 3 --gpu --j 3' returned >>> non-zero exit status 1. >>> 00083: Protocol failed: Command ' mpirun -np 3 `which >>> relion_refine_mpi` --i >>> Runs/001546_ProtRelionClassify2D/input_particles.star --particle_diameter >>> 690 --zero_mask --K 64 --norm --scale --o >>> Runs/001546_ProtRelionClassify2D/extra/relion --oversampling 1 >>> --flatten_solvent --tau2_fudge 2.0 --iter 25 --offset_range 5.0 >>> --offset_step 2.0 --psi_step 10.0 --dont_combine_weights_via_disc >>> --scratch_dir /data1/new_scratch/ --pool 3 --gpu --j 3' returned >>> non-zero exit status 1. >>> 00084: FAILED: runRelionStep, step 2, time 2021-05-18 12:33:29.230213 >>> 00085: *** Last status is failed >>> 00086: ------------------- PROTOCOL FAILED (DONE 2/3) >>> >>> >>> Cryosparc (in SCIPION) requires CTF to run. >>> >>> >>> >>> Thais is where I am now. >>> >>> Perhaps there is a solution? >>> >>> >>> Sincerely, >>> Dmitry >>> >>> >>> >>> On 18. May 2021, at 14:16, Pablo Conesa <pc...@cn...> wrote: >>> >>> Dear Dmitry, the import of CS metadata files (*.cs) is not supported in >>> Scipion. Does CS has an option to export to star files. It rings a bell. >>> On 18/5/21 9:53, Dmitry Semchonok wrote: >>> >>> Dear Grigory, >>> >>> >>> The files are in mrc format. >>> >>> >>> Please, let me try to explain more plan: >>> >>> I have a project in cryosparc. There I have cryosparc selected 2D >>> classes. I want to export the particles of those classes into SCIPION. >>> >>> So I I pressed Export (fig 1) and the program(cryosparc) created the >>> folder with mrc + other files (fig 2;3). I looked into J48 and found many >>> *.mrc files of the particles. But it is not 1 mrc = 1 particle. It seems to >>> be a mrc stuck - so I have several files inside 1 *.mrc (fig 4) (you can >>> also notice that they all have different sizes) >>> >>> So I need to export them somehow in SCIPION >>> >>> For that, I used the SCIPION export - images protocol where for the >>> files to add I put *.mrc. But the protocol seems to be added only 1 mrc as >>> 1 picture and instead of having 46392 particles I have ~600 particles. >>> >>> (Also the geometry seems not preserved). 
>>> >>> >>> So my question how to export the particles from cryosparc into SCIPION >>> correctly? >>> >>> >>> Thank you! >>> >>> >>> >>> https://disk.yandex.com/d/Fv3Q1lpwEzSisg >>> >>> Sincerely, >>> >>> Dmitry >>> >>> >>> >>> On 17. May 2021, at 18:12, Grigory Sharov <sha...@gm...> >>> wrote: >>> >>> Hi Dmitry, >>> >>> mrc stacks should have "mrcs" extension. Is this the problem you are >>> getting? >>> >>> Best regards, >>> Grigory >>> >>> >>> -------------------------------------------------------------------------------- >>> Grigory Sharov, Ph.D. >>> >>> MRC Laboratory of Molecular Biology, >>> Francis Crick Avenue, >>> Cambridge Biomedical Campus, >>> Cambridge CB2 0QH, UK. >>> tel. +44 (0) 1223 267228 <+44%201223%20267228> >>> e-mail: gs...@mr... >>> >>> >>> On Mon, May 17, 2021 at 4:49 PM Dmitry Semchonok <Sem...@gm...> >>> wrote: >>> >>>> Dear colleagues, >>>> >>>> I would like to export the particles from cryosparc to SCIPION. >>>> >>>> How to do that? >>>> >>>> >>>> >>>> >>>> What I tried: >>>> >>>> >>>> 1. In cryosparc I pressed Export – to export the particles I am >>>> interested in. >>>> >>>> 2. In the folder Export – I found many mrc stacks with particles >>>> in each. >>>> >>>> 3. I tried to export them to SCIPION using Export particles but >>>> instead of reading each stack and combine them in the 1 dataset I received >>>> 1 particle / per each mrc stack. >>>> >>>> >>>> Any ideas? >>>> >>>> >>>> Thank you >>>> >>>> Sincerely, >>>> >>>> Dmitry >>>> >>>> _______________________________________________ >>>> scipion-users mailing list >>>> sci...@li... >>>> https://lists.sourceforge.net/lists/listinfo/scipion-users >>>> >>> _______________________________________________ >>> scipion-users mailing list >>> sci...@li... >>> https://lists.sourceforge.net/lists/listinfo/scipion-users >>> >>> >>> >>> >>> _______________________________________________ >>> scipion-users mailing lis...@li...https://lists.sourceforge.net/lists/listinfo/scipion-users >>> >>> -- >>> Pablo Conesa - *Madrid Scipion <http://scipion.i2pc.es/> team* >>> _______________________________________________ >>> scipion-users mailing list >>> sci...@li... >>> https://lists.sourceforge.net/lists/listinfo/scipion-users >>> >>> >>> _______________________________________________ >>> scipion-users mailing list >>> sci...@li... >>> https://lists.sourceforge.net/lists/listinfo/scipion-users >>> >> _______________________________________________ >> scipion-users mailing list >> sci...@li... >> https://lists.sourceforge.net/lists/listinfo/scipion-users >> >> >> _______________________________________________ > scipion-users mailing list > sci...@li... > https://lists.sourceforge.net/lists/listinfo/scipion-users > > > > > _______________________________________________ > scipion-users mailing lis...@li...https://lists.sourceforge.net/lists/listinfo/scipion-users > > -- > Pablo Conesa - *Madrid Scipion <http://scipion.i2pc.es/> team* > _______________________________________________ > scipion-users mailing list > sci...@li... > https://lists.sourceforge.net/lists/listinfo/scipion-users > > > > > _______________________________________________ > scipion-users mailing lis...@li...https://lists.sourceforge.net/lists/listinfo/scipion-users > > -- > Pablo Conesa - *Madrid Scipion <http://scipion.i2pc.es> team* > |
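The extension change Grigory recommends (mrc stacks should end in .mrcs) can be scripted. The sketch below is an illustration only; it assumes the exported stacks sit in a single folder (J48 is the cryoSPARC job folder mentioned earlier, used here as a placeholder) and renames *.mrc to *.mrcs so that Scipion and RELION treat each file as a particle stack rather than a single image.

from pathlib import Path

export_dir = Path("J48")  # placeholder: the cryoSPARC export folder with the *.mrc stacks
for mrc_path in sorted(export_dir.glob("*.mrc")):
    # Rename in place; create symlinks instead if the original export must stay untouched.
    mrc_path.rename(mrc_path.with_suffix(".mrcs"))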
From: Pablo C. <pc...@cn...> - 2021-05-19 08:40:58
|
I see, maybe we can arrange a tele conf to see in detail what is wrong. I'll contact you. On 19/5/21 10:26, Dmitry Semchonok wrote: > Dear Pablo and Grigory, > > > Thank you! > > Yes, I am well aware of the fact that there is no info for the CTF etc :) > > > All I need is just a nice aligned 2D set of images (from the set I > imported) > > (Ideally I would like to have this set just from cryosparc but I have > no idea how to do that right away :) ) > > Please see the log of relion > > > > RUNNING PROTOCOL ----------------- > 00002: Hostname: cryoem01 > 00003: PID: 36440 > 00004: pyworkflow: 3.0.13 > 00005: plugin: relion > 00006: plugin v: 3.1.2 > 00007: currentDir: /data1/ScipionUserData/projects/Caro__helix > 00008: workingDir: Runs/001735_ProtRelionClassify2D > 00009: runMode: Continue > 00010: MPI: 3 > 00011: threads: 3 > 00012: Starting at step: 1 > 00013: Running steps > 00014: STARTED: convertInputStep, step 1, time 2021-05-18 > 15:06:21.557294 > 00015: Converting set from > 'Runs/001662_ProtImportParticles/particles.sqlite' into > 'Runs/001735_ProtRelionClassify2D/input_particles.star' > 00016: convertBinaryFiles: creating soft links. > 00017: Root: Runs/001735_ProtRelionClassify2D/extra/input -> > Runs/001662_ProtImportParticles/extra > 00018: FINISHED: convertInputStep, step 1, time 2021-05-18 > 15:06:22.474076 > 00019: STARTED: runRelionStep, step 2, time 2021-05-18 15:06:22.502665 > 00020: mpirun -np 3 `which relion_refine_mpi` --i > Runs/001735_ProtRelionClassify2D/input_particles.star > --particle_diameter 690 --zero_mask --K 64 --norm --scale --o > Runs/001735_ProtRelionClassify2D/extra/relion --oversampling 1 > --flatten_solvent --tau2_fudge 2.0 --iter 25 --offset_range 5.0 > --offset_step 2.0 --psi_step 10.0 --dont_combine_weights_via_disc > --scratch_dir /data1/new_scratch/ --pool 3 --gpu --j 3 > 00021: RELION version: 3.1.2 > 00022: Precision: BASE=double, CUDA-ACC=single > 00023: > 00024: === RELION MPI setup === > 00025: + Number of MPI processes = 3 > 00026: + Number of threads per MPI process = 3 > 00027: + Total number of threads therefore = 9 > 00028: + Leader (0) runs on host = cryoem01 > 00029: + Follower 1 runs on host = cryoem01 > 00030: + Follower 2 runs on host = cryoem01 > 00031: ================= > 00032: uniqueHost cryoem01 has 2 ranks. > 00033: GPU-ids not specified for this rank, threads will > automatically be mapped to available devices. > 00034: Thread 0 on follower 1 mapped to device 0 > 00035: Thread 1 on follower 1 mapped to device 0 > 00036: Thread 2 on follower 1 mapped to device 0 > 00037: GPU-ids not specified for this rank, threads will > automatically be mapped to available devices. > 00038: Thread 0 on follower 2 mapped to device 1 > 00039: Thread 1 on follower 2 mapped to device 1 > 00040: Thread 2 on follower 2 mapped to device 1 > 00041: Running CPU instructions in double precision. > 00042: + WARNING: Changing psi sampling rate (before oversampling) > to 5.625 degrees, for more efficient GPU calculations > 00043: + On host cryoem01: free scratch space = 447.485 Gb. > 00044: Copying particles to scratch directory: > /data1/new_scratch/relion_volatile/ > 00045: 000/??? 
sec ~~(,_,"> [oo] > 00046: 0/ 0 sec ~~(,_,">in: > /opt/Scipion3/software/em/relion-3.1.2/src/rwMRC.h, line 192 > 00047: ERROR: > 00048: readMRC: Image number 11 exceeds stack size 1 of image > 000011@Runs/001735_ProtRelionClassify2D/extra/input/1024562735536827037_FoilHole_1618719_Data_1621438_1621440_20200703_085118_Fractions_patch_aligned_doseweighted_particles.mrcs > 00049: === Backtrace === > 00050: > /opt/Scipion3/software/em/relion-3.1.2/bin/relion_refine_mpi(_ZN11RelionErrorC1ERKSsS1_l+0x41) > [0x4786a1] > 00051: > /opt/Scipion3/software/em/relion-3.1.2/bin/relion_refine_mpi(_ZN5ImageIdE7readMRCElbRK8FileName+0x99f) > [0x4b210f] > 00052: > /opt/Scipion3/software/em/relion-3.1.2/bin/relion_refine_mpi(_ZN5ImageIdE5_readERK8FileNameR13fImageHandlerblbb+0x17b) > [0x4b407b] > 00053: > /opt/Scipion3/software/em/relion-3.1.2/bin/relion_refine_mpi(_ZN10Experiment22copyParticlesToScratchEibbd+0xda7) > [0x5b8f87] > 00054: > /opt/Scipion3/software/em/relion-3.1.2/bin/relion_refine_mpi(_ZN14MlOptimiserMpi18initialiseWorkLoadEv+0x210) > [0x498540] > 00055: > /opt/Scipion3/software/em/relion-3.1.2/bin/relion_refine_mpi(_ZN14MlOptimiserMpi10initialiseEv+0x9aa) > [0x49ab2a] > 00056: > /opt/Scipion3/software/em/relion-3.1.2/bin/relion_refine_mpi(main+0x55) > [0x4322a5] > 00057: /lib64/libc.so.6(__libc_start_main+0xf5) [0x7fea6a54e555] > 00058: /opt/Scipion3/software/em/relion-3.1.2/bin/relion_refine_mpi() > [0x435fbf] > 00059: ================== > 00060: ERROR: > 00061: readMRC: Image number 11 exceeds stack size 1 of image > 000011@Runs/001735_ProtRelionClassify2D/extra/input/1024562735536827037_FoilHole_1618719_Data_1621438_1621440_20200703_085118_Fractions_patch_aligned_doseweighted_particles.mrcs > 00062: application called MPI_Abort(MPI_COMM_WORLD, 1) - process 1 > 00063: Traceback (most recent call last): > 00064: File > "/usr/local/miniconda/envs/scipion3/lib/python3.8/site-packages/pyworkflow/protocol/protocol.py", > line 197, in run > 00065: self._run() > 00066: File > "/usr/local/miniconda/envs/scipion3/lib/python3.8/site-packages/pyworkflow/protocol/protocol.py", > line 248, in _run > 00067: resultFiles = self._runFunc() > 00068: File > "/usr/local/miniconda/envs/scipion3/lib/python3.8/site-packages/pyworkflow/protocol/protocol.py", > line 244, in _runFunc > 00069: return self._func(*self._args) > 00070: File > "/usr/local/miniconda/envs/scipion3/lib/python3.8/site-packages/relion/protocols/protocol_base.py", > line 811, in runRelionStep > 00071: self.runJob(self._getProgram(), params) > 00072: File > "/usr/local/miniconda/envs/scipion3/lib/python3.8/site-packages/pyworkflow/protocol/protocol.py", > line 1388, in runJob > 00073: self._stepsExecutor.runJob(self._log, program, arguments, **kwargs) > 00074: File > "/usr/local/miniconda/envs/scipion3/lib/python3.8/site-packages/pyworkflow/protocol/executor.py", > line 65, in runJob > 00075: process.runJob(log, programName, params, > 00076: File > "/usr/local/miniconda/envs/scipion3/lib/python3.8/site-packages/pyworkflow/utils/process.py", > line 52, in runJob > 00077: return runCommand(command, env, cwd) > 00078: File > "/usr/local/miniconda/envs/scipion3/lib/python3.8/site-packages/pyworkflow/utils/process.py", > line 67, in runCommand > 00079: check_call(command, shell=True, stdout=sys.stdout, > stderr=sys.stderr, > 00080: File > "/usr/local/miniconda/envs/scipion3/lib/python3.8/subprocess.py", line > 364, in check_call > 00081: raise CalledProcessError(retcode, cmd) > 00082: subprocess.CalledProcessError: Command ' mpirun -np 3 `which > 
relion_refine_mpi` --i > Runs/001735_ProtRelionClassify2D/input_particles.star > --particle_diameter 690 --zero_mask --K 64 --norm --scale --o > Runs/001735_ProtRelionClassify2D/extra/relion --oversampling 1 > --flatten_solvent --tau2_fudge 2.0 --iter 25 --offset_range 5.0 > --offset_step 2.0 --psi_step 10.0 --dont_combine_weights_via_disc > --scratch_dir /data1/new_scratch/ --pool 3 --gpu --j 3' returned > non-zero exit status 1. > 00083: Protocol failed: Command ' mpirun -np 3 `which > relion_refine_mpi` --i > Runs/001735_ProtRelionClassify2D/input_particles.star > --particle_diameter 690 --zero_mask --K 64 --norm --scale --o > Runs/001735_ProtRelionClassify2D/extra/relion --oversampling 1 > --flatten_solvent --tau2_fudge 2.0 --iter 25 --offset_range 5.0 > --offset_step 2.0 --psi_step 10.0 --dont_combine_weights_via_disc > --scratch_dir /data1/new_scratch/ --pool 3 --gpu --j 3' returned > non-zero exit status 1. > 00084: FAILED: runRelionStep, step 2, time 2021-05-18 15:06:24.609548 > 00085: *** Last status is failed > 00086: ------------------- PROTOCOL FAILED (DONE 2/3) > > > *Additionally,* > > So it seemed from the first look that there is some issue with the > image 11 — I delete the image 11 but the problem still remained. > > > > > *Optionally,* > > I believe that xmipp-2D should work but I did not try it yet. > > Thank you > > Sincerely, > Dmitry > > > > > > > > > >> On 19. May 2021, at 10:13, Pablo Conesa <pc...@cn... >> <mailto:pc...@cn...>> wrote: >> >> Hi! So, I think Grigory is right, you've gone through the import >> particles "without metadata info" therefore you only have the images >> without any alignment information. >> >> >> In theory, 2d classification should work with this kind of import. >> Could you please share the logs of one of the relion classification? >> >> On 19/5/21 9:26, Dmitry Semchonok wrote: >>> >>> Dear Grigory, >>> >>> >>> I did nothing much, just tried to start relion 2D // or cryosparc. >>> >>> >>> The only thing I tried additionally since I could no proceed is to >>> a) change the box size; b) just resave the subset with the same >>> number of images. >>> >>> >>> Please see the image. >>> >>> >>> <Screenshot 2021-05-19 at 09.23.41.png> >>> >>> >>> Thank you >>> >>> >>> Sincerely, >>> >>> Dmitry >>> >>> >>> >>>> On 18. May 2021, at 15:19, Grigory Sharov <sha...@gm... >>>> <mailto:sha...@gm...>> wrote: >>>> >>>> Hi, >>>> >>>> I imagine you have 624 micrographs, so particles are exported to >>>> mrc on a mic basis. I see you used the "files" option to import >>>> mrcs particles into Scipion. This means the imported particles >>>> have no metadata except pixel size you provided. >>>> >>>> What did you do with them after import? >>>> >>>> Best regards, >>>> Grigory >>>> >>>> -------------------------------------------------------------------------------- >>>> Grigory Sharov, Ph.D. >>>> >>>> MRC Laboratory of Molecular Biology, >>>> Francis Crick Avenue, >>>> Cambridge Biomedical Campus, >>>> Cambridge CB2 0QH, UK. >>>> tel. +44 (0) 1223 267228 <tel:+44%201223%20267228> >>>> e-mail: gs...@mr... <mailto:gs...@mr...> >>>> >>>> >>>> On Tue, May 18, 2021 at 2:12 PM Dmitry Semchonok >>>> <Sem...@gm... <mailto:Sem...@gm...>> wrote: >>>> >>>> Dear Grigory, >>>> >>>> Yes I did that — the particles are looking fine. >>>> >>>> >>>> I guess the issue still comes from the fact that originally in >>>> cryosparc Export the stacks of particle were placed into 624 >>>> mrc. But the number of particles is about 44 818. 
So even >>>> after the renaming and the export I see this in SCIPION export log >>>> >>>> <Screenshot 2021-05-18 at 15.09.11.png> >>>> >>>> What I guess may help is if I somehow combine all those files >>>> in 1 mrcs first and then add import them to SCIPION. >>>> Do you perhaps know how to do that? >>>> >>>> Thank you >>>> >>>> Sincerely, >>>> Dmitry >>>> >>>> >>>> >>>>> On 18. May 2021, at 15:02, Grigory Sharov >>>>> <sha...@gm... <mailto:sha...@gm...>> >>>>> wrote: >>>>> >>>>> Hi Dmitry, >>>>> >>>>> as the error states your star file points to a non-existing >>>>> image in the mrcs stack. You need to check first if you >>>>> import from cryosparc with mrcs worked correctly (open / >>>>> display particles) then trace all the steps you did before 2D >>>>> classification. >>>>> >>>>> Best regards, >>>>> Grigory >>>>> >>>>> -------------------------------------------------------------------------------- >>>>> Grigory Sharov, Ph.D. >>>>> >>>>> MRC Laboratory of Molecular Biology, >>>>> Francis Crick Avenue, >>>>> Cambridge Biomedical Campus, >>>>> Cambridge CB2 0QH, UK. >>>>> tel. +44 (0) 1223 267228 <tel:+44%201223%20267228> >>>>> e-mail: gs...@mr... >>>>> <mailto:gs...@mr...> >>>>> >>>>> >>>>> On Tue, May 18, 2021 at 1:25 PM Dmitry Semchonok >>>>> <Sem...@gm... <mailto:Sem...@gm...>> wrote: >>>>> >>>>> Dear Pablo, >>>>> >>>>> >>>>> Thank you. I heard about this option. For that I guess the >>>>> https://pypi.org/project/cs2star/ >>>>> <https://pypi.org/project/cs2star/> needs to be installed. >>>>> >>>>> >>>>> >>>>> >>>>> In Cryosparc itself there is an option to export files. >>>>> And then what we get is the mrc files with different >>>>> number of particles in each. >>>>> >>>>> It appeared to be possible to rename mrc —> to —>mrcs. >>>>> Then SCIPION can import those particles. >>>>> >>>>> Currently the problem is that not relion nor cryosparc can >>>>> run these particles. >>>>> >>>>> >>>>> >>>>> >>>>> Relion stops with error: >>>>> >>>>> >>>>> 00001: RUNNING PROTOCOL ----------------- >>>>> 00002: Hostname: cryoem01 >>>>> 00003: PID: 46455 >>>>> 00004: pyworkflow: 3.0.13 >>>>> 00005: plugin: relion >>>>> 00006: plugin v: 3.1.2 >>>>> 00007: currentDir: /data1/ScipionUserData/projects/Caro__helix >>>>> 00008: workingDir: Runs/001546_ProtRelionClassify2D >>>>> 00009: runMode: Continue >>>>> 00010: MPI: 3 >>>>> 00011: threads: 3 >>>>> 00012: Starting at step: 1 >>>>> 00013: Running steps >>>>> 00014: STARTED: convertInputStep, step 1, time 2021-05-18 >>>>> 12:33:26.198123 >>>>> 00015: Converting set from >>>>> 'Runs/001492_ProtUserSubSet/particles.sqlite' into >>>>> 'Runs/001546_ProtRelionClassify2D/input_particles.star' >>>>> 00016: convertBinaryFiles: creating soft links. 
>>>>> 00017: Root: Runs/001546_ProtRelionClassify2D/extra/input >>>>> -> Runs/001057_ProtImportParticles/extra >>>>> 00018: FINISHED: convertInputStep, step 1, time 2021-05-18 >>>>> 12:33:27.117588 >>>>> 00019: STARTED: runRelionStep, step 2, time 2021-05-18 >>>>> 12:33:27.145974 >>>>> 00020: mpirun -np 3 `which relion_refine_mpi` --i >>>>> Runs/001546_ProtRelionClassify2D/input_particles.star >>>>> --particle_diameter 690 --zero_mask --K 64 --norm >>>>> --scale --o >>>>> Runs/001546_ProtRelionClassify2D/extra/relion >>>>> --oversampling 1 --flatten_solvent --tau2_fudge 2.0 >>>>> --iter 25 --offset_range 5.0 --offset_step 2.0 --psi_step >>>>> 10.0 --dont_combine_weights_via_disc --scratch_dir >>>>> /data1/new_scratch/ --pool 3 --gpu --j 3 >>>>> 00021: RELION version: 3.1.2 >>>>> 00022: Precision: BASE=double, CUDA-ACC=single >>>>> 00023: >>>>> 00024: === RELION MPI setup === >>>>> 00025: + Number of MPI processes = 3 >>>>> 00026: + Number of threads per MPI process = 3 >>>>> 00027: + Total number of threads therefore = 9 >>>>> 00028: + Leader (0) runs on host = cryoem01 >>>>> 00029: + Follower 1 runs on host = cryoem01 >>>>> 00030: + Follower 2 runs on host = cryoem01 >>>>> 00031: ================= >>>>> 00032: uniqueHost cryoem01 has 2 ranks. >>>>> 00033: GPU-ids not specified for this rank, threads will >>>>> automatically be mapped to available devices. >>>>> 00034: Thread 0 on follower 1 mapped to device 0 >>>>> 00035: Thread 1 on follower 1 mapped to device 0 >>>>> 00036: Thread 2 on follower 1 mapped to device 0 >>>>> 00037: GPU-ids not specified for this rank, threads will >>>>> automatically be mapped to available devices. >>>>> 00038: Thread 0 on follower 2 mapped to device 1 >>>>> 00039: Thread 1 on follower 2 mapped to device 1 >>>>> 00040: Thread 2 on follower 2 mapped to device 1 >>>>> 00041: Running CPU instructions in double precision. >>>>> 00042: + WARNING: Changing psi sampling rate (before >>>>> oversampling) to 5.625 degrees, for more efficient GPU >>>>> calculations >>>>> 00043: + On host cryoem01: free scratch space = 448.252 Gb. >>>>> 00044: Copying particles to scratch directory: >>>>> /data1/new_scratch/relion_volatile/ >>>>> 00045: 000/??? 
sec ~~(,_,"> [oo] >>>>> 00046: 1/ 60 sec ~~(,_,">in: >>>>> /opt/Scipion3/software/em/relion-3.1.2/src/rwMRC.h, line 192 >>>>> 00047: ERROR: >>>>> 00048: readMRC: Image number 11 exceeds stack size 1 of >>>>> image >>>>> 000011@Runs/001546_ProtRelionClassify2D/extra/input/1024562735536827037_FoilHole_1618719_Data_1621438_1621440_20200703_085118_Fractions_patch_aligned_doseweighted_particles.mrcs >>>>> 00049: === Backtrace === >>>>> 00050: >>>>> /opt/Scipion3/software/em/relion-3.1.2/bin/relion_refine_mpi(_ZN11RelionErrorC1ERKSsS1_l+0x41) >>>>> [0x4786a1] >>>>> 00051: >>>>> /opt/Scipion3/software/em/relion-3.1.2/bin/relion_refine_mpi(_ZN5ImageIdE7readMRCElbRK8FileName+0x99f) >>>>> [0x4b210f] >>>>> 00052: >>>>> /opt/Scipion3/software/em/relion-3.1.2/bin/relion_refine_mpi(_ZN5ImageIdE5_readERK8FileNameR13fImageHandlerblbb+0x17b) >>>>> [0x4b407b] >>>>> 00053: >>>>> /opt/Scipion3/software/em/relion-3.1.2/bin/relion_refine_mpi(_ZN10Experiment22copyParticlesToScratchEibbd+0xda7) >>>>> [0x5b8f87] >>>>> 00054: >>>>> /opt/Scipion3/software/em/relion-3.1.2/bin/relion_refine_mpi(_ZN14MlOptimiserMpi18initialiseWorkLoadEv+0x210) >>>>> [0x498540] >>>>> 00055: >>>>> /opt/Scipion3/software/em/relion-3.1.2/bin/relion_refine_mpi(_ZN14MlOptimiserMpi10initialiseEv+0x9aa) >>>>> [0x49ab2a] >>>>> 00056: >>>>> /opt/Scipion3/software/em/relion-3.1.2/bin/relion_refine_mpi(main+0x55) >>>>> [0x4322a5] >>>>> 00057: /lib64/libc.so.6(__libc_start_main+0xf5) >>>>> [0x7f657f51a555] >>>>> 00058: >>>>> /opt/Scipion3/software/em/relion-3.1.2/bin/relion_refine_mpi() >>>>> [0x435fbf] >>>>> 00059: ================== >>>>> 00060: ERROR: >>>>> 00061: readMRC: Image number 11 exceeds stack size 1 of >>>>> image >>>>> 000011@Runs/001546_ProtRelionClassify2D/extra/input/1024562735536827037_FoilHole_1618719_Data_1621438_1621440_20200703_085118_Fractions_patch_aligned_doseweighted_particles.mrcs >>>>> 00062: application called MPI_Abort(MPI_COMM_WORLD, 1) - >>>>> process 1 >>>>> 00063: Traceback (most recent call last): >>>>> 00064: File >>>>> "/usr/local/miniconda/envs/scipion3/lib/python3.8/site-packages/pyworkflow/protocol/protocol.py", >>>>> line 197, in run >>>>> 00065: self._run() >>>>> 00066: File >>>>> "/usr/local/miniconda/envs/scipion3/lib/python3.8/site-packages/pyworkflow/protocol/protocol.py", >>>>> line 248, in _run >>>>> 00067: resultFiles = self._runFunc() >>>>> 00068: File >>>>> "/usr/local/miniconda/envs/scipion3/lib/python3.8/site-packages/pyworkflow/protocol/protocol.py", >>>>> line 244, in _runFunc >>>>> 00069: return self._func(*self._args) >>>>> 00070: File >>>>> "/usr/local/miniconda/envs/scipion3/lib/python3.8/site-packages/relion/protocols/protocol_base.py", >>>>> line 811, in runRelionStep >>>>> 00071: self.runJob(self._getProgram(), params) >>>>> 00072: File >>>>> "/usr/local/miniconda/envs/scipion3/lib/python3.8/site-packages/pyworkflow/protocol/protocol.py", >>>>> line 1388, in runJob >>>>> 00073: self._stepsExecutor.runJob(self._log, program, >>>>> arguments, **kwargs) >>>>> 00074: File >>>>> "/usr/local/miniconda/envs/scipion3/lib/python3.8/site-packages/pyworkflow/protocol/executor.py", >>>>> line 65, in runJob >>>>> 00075: process.runJob(log, programName, params, >>>>> 00076: File >>>>> "/usr/local/miniconda/envs/scipion3/lib/python3.8/site-packages/pyworkflow/utils/process.py", >>>>> line 52, in runJob >>>>> 00077: return runCommand(command, env, cwd) >>>>> 00078: File >>>>> "/usr/local/miniconda/envs/scipion3/lib/python3.8/site-packages/pyworkflow/utils/process.py", >>>>> line 67, in runCommand >>>>> 
00079: check_call(command, shell=True, stdout=sys.stdout, >>>>> stderr=sys.stderr, >>>>> 00080: File >>>>> "/usr/local/miniconda/envs/scipion3/lib/python3.8/subprocess.py", >>>>> line 364, in check_call >>>>> 00081: raise CalledProcessError(retcode, cmd) >>>>> 00082: subprocess.CalledProcessError: Command ' mpirun -np >>>>> 3 `which relion_refine_mpi` --i >>>>> Runs/001546_ProtRelionClassify2D/input_particles.star >>>>> --particle_diameter 690 --zero_mask --K 64 --norm >>>>> --scale --o >>>>> Runs/001546_ProtRelionClassify2D/extra/relion >>>>> --oversampling 1 --flatten_solvent --tau2_fudge 2.0 >>>>> --iter 25 --offset_range 5.0 --offset_step 2.0 --psi_step >>>>> 10.0 --dont_combine_weights_via_disc --scratch_dir >>>>> /data1/new_scratch/ --pool 3 --gpu --j 3' returned >>>>> non-zero exit status 1. >>>>> 00083: Protocol failed: Command ' mpirun -np 3 `which >>>>> relion_refine_mpi` --i >>>>> Runs/001546_ProtRelionClassify2D/input_particles.star >>>>> --particle_diameter 690 --zero_mask --K 64 --norm >>>>> --scale --o >>>>> Runs/001546_ProtRelionClassify2D/extra/relion >>>>> --oversampling 1 --flatten_solvent --tau2_fudge 2.0 >>>>> --iter 25 --offset_range 5.0 --offset_step 2.0 --psi_step >>>>> 10.0 --dont_combine_weights_via_disc --scratch_dir >>>>> /data1/new_scratch/ --pool 3 --gpu --j 3' returned >>>>> non-zero exit status 1. >>>>> 00084: FAILED: runRelionStep, step 2, time 2021-05-18 >>>>> 12:33:29.230213 >>>>> 00085: *** Last status is failed >>>>> 00086: ------------------- PROTOCOL FAILED (DONE 2/3) >>>>> >>>>> >>>>> Cryosparc (in SCIPION) requires CTF to run. >>>>> >>>>> >>>>> >>>>> Thais is where I am now. >>>>> >>>>> Perhaps there is a solution? >>>>> >>>>> >>>>> Sincerely, >>>>> Dmitry >>>>> >>>>> >>>>> >>>>>> On 18. May 2021, at 14:16, Pablo Conesa >>>>>> <pc...@cn... <mailto:pc...@cn...>> wrote: >>>>>> >>>>>> Dear Dmitry, the import of CS metadata files (*.cs) is >>>>>> not supported in Scipion. Does CS has an option to export >>>>>> to star files. It rings a bell. >>>>>> >>>>>> On 18/5/21 9:53, Dmitry Semchonok wrote: >>>>>>> >>>>>>> Dear Grigory, >>>>>>> >>>>>>> >>>>>>> The files are in mrc format. >>>>>>> >>>>>>> >>>>>>> Please, let me try to explain more plan: >>>>>>> >>>>>>> I have a project in cryosparc. There I have cryosparc >>>>>>> selected 2D classes. I want to export the particles of >>>>>>> those classes into SCIPION. >>>>>>> >>>>>>> So I I pressed Export (fig 1) and the >>>>>>> program(cryosparc) created the folder with mrc + other >>>>>>> files (fig 2;3). I looked into J48 and found many *.mrc >>>>>>> files of the particles. But it is not 1 mrc = 1 >>>>>>> particle. It seems to be a mrc stuck - so I have several >>>>>>> files inside 1 *.mrc (fig 4) (you can also notice that >>>>>>> they all have different sizes) >>>>>>> >>>>>>> So I need to export them somehow in SCIPION >>>>>>> >>>>>>> For that, I used the SCIPION export - images protocol >>>>>>> where for the files to add I put *.mrc. But the protocol >>>>>>> seems to be added only 1 mrc as 1 picture and instead of >>>>>>> having 46392 particles I have ~600 particles. >>>>>>> >>>>>>> (Also the geometry seems not preserved). >>>>>>> >>>>>>> >>>>>>> So my question how to export the particles from >>>>>>> cryosparc into SCIPION correctly? >>>>>>> >>>>>>> >>>>>>> Thank you! >>>>>>> >>>>>>> >>>>>>> >>>>>>> https://disk.yandex.com/d/Fv3Q1lpwEzSisg >>>>>>> <https://disk.yandex.com/d/Fv3Q1lpwEzSisg> >>>>>>> >>>>>>> >>>>>>> Sincerely, >>>>>>> >>>>>>> Dmitry >>>>>>> >>>>>>> >>>>>>> >>>>>>> >>>>>>>> On 17. 
May 2021, at 18:12, Grigory Sharov >>>>>>>> <sha...@gm... >>>>>>>> <mailto:sha...@gm...>> wrote: >>>>>>>> >>>>>>>> Hi Dmitry, >>>>>>>> >>>>>>>> mrc stacks should have "mrcs" extension. Is this the >>>>>>>> problem you are getting? >>>>>>>> >>>>>>>> Best regards, >>>>>>>> Grigory >>>>>>>> >>>>>>>> -------------------------------------------------------------------------------- >>>>>>>> Grigory Sharov, Ph.D. >>>>>>>> >>>>>>>> MRC Laboratory of Molecular Biology, >>>>>>>> Francis Crick Avenue, >>>>>>>> Cambridge Biomedical Campus, >>>>>>>> Cambridge CB2 0QH, UK. >>>>>>>> tel. +44 (0) 1223 267228 <tel:+44%201223%20267228> >>>>>>>> e-mail: gs...@mr... >>>>>>>> <mailto:gs...@mr...> >>>>>>>> >>>>>>>> >>>>>>>> On Mon, May 17, 2021 at 4:49 PM Dmitry Semchonok >>>>>>>> <Sem...@gm... <mailto:Sem...@gm...>> wrote: >>>>>>>> >>>>>>>> Dear colleagues, >>>>>>>> >>>>>>>> I would like to export the particles from cryosparc >>>>>>>> to SCIPION. >>>>>>>> >>>>>>>> How to do that? >>>>>>>> >>>>>>>> >>>>>>>> >>>>>>>> >>>>>>>> >>>>>>>> What I tried: >>>>>>>> >>>>>>>> >>>>>>>> 1.In cryosparc I pressed Export – to export the >>>>>>>> particles I am interested in. >>>>>>>> >>>>>>>> 2.In the folder Export – I found many mrc stacks >>>>>>>> with particles in each. >>>>>>>> >>>>>>>> 3.I tried to export them to SCIPION using Export >>>>>>>> particles but instead of reading each stack and >>>>>>>> combine them in the 1 dataset I received 1 particle >>>>>>>> / per each mrc stack. >>>>>>>> >>>>>>>> >>>>>>>> Any ideas? >>>>>>>> >>>>>>>> >>>>>>>> Thank you >>>>>>>> >>>>>>>> Sincerely, >>>>>>>> >>>>>>>> Dmitry >>>>>>>> >>>>>>>> >>>>>>>> _______________________________________________ >>>>>>>> scipion-users mailing list >>>>>>>> sci...@li... >>>>>>>> <mailto:sci...@li...> >>>>>>>> https://lists.sourceforge.net/lists/listinfo/scipion-users >>>>>>>> <https://lists.sourceforge.net/lists/listinfo/scipion-users> >>>>>>>> >>>>>>>> _______________________________________________ >>>>>>>> scipion-users mailing list >>>>>>>> sci...@li... >>>>>>>> <mailto:sci...@li...> >>>>>>>> https://lists.sourceforge.net/lists/listinfo/scipion-users >>>>>>>> <https://lists.sourceforge.net/lists/listinfo/scipion-users> >>>>>>> >>>>>>> >>>>>>> >>>>>>> _______________________________________________ >>>>>>> scipion-users mailing list >>>>>>> sci...@li... <mailto:sci...@li...> >>>>>>> https://lists.sourceforge.net/lists/listinfo/scipion-users <https://lists.sourceforge.net/lists/listinfo/scipion-users> >>>>>> -- >>>>>> Pablo Conesa - *Madrid Scipion <http://scipion.i2pc.es/> >>>>>> team* >>>>>> _______________________________________________ >>>>>> scipion-users mailing list >>>>>> sci...@li... >>>>>> <mailto:sci...@li...> >>>>>> https://lists.sourceforge.net/lists/listinfo/scipion-users >>>>>> <https://lists.sourceforge.net/lists/listinfo/scipion-users> >>>>> >>>>> _______________________________________________ >>>>> scipion-users mailing list >>>>> sci...@li... >>>>> <mailto:sci...@li...> >>>>> https://lists.sourceforge.net/lists/listinfo/scipion-users >>>>> <https://lists.sourceforge.net/lists/listinfo/scipion-users> >>>>> >>>>> _______________________________________________ >>>>> scipion-users mailing list >>>>> sci...@li... >>>>> <mailto:sci...@li...> >>>>> https://lists.sourceforge.net/lists/listinfo/scipion-users >>>>> <https://lists.sourceforge.net/lists/listinfo/scipion-users> >>>> >>>> _______________________________________________ >>>> scipion-users mailing list >>>> sci...@li... 
-- Pablo Conesa - *Madrid Scipion <http://scipion.i2pc.es> team* |
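Grigory's diagnosis, that the generated .star file points at image numbers which do not exist in the stacks, can be checked directly. The sketch below is an aside rather than part of the thread; it assumes the mrcfile package and that it is run from the Scipion project directory so the relative paths in the .star file resolve. The .star path is the one shown in the error log.

import re
import mrcfile

star_path = "Runs/001546_ProtRelionClassify2D/input_particles.star"  # path from the error log
stack_sizes = {}   # cache: stack file -> nz read from its MRC header
bad = 0

with open(star_path) as fh:
    for line in fh:
        m = re.search(r"(\d+)@(\S+\.mrcs?)", line)   # matches e.g. 000011@Runs/.../stack.mrcs
        if m is None:
            continue
        index, stack = int(m.group(1)), m.group(2)
        if stack not in stack_sizes:
            with mrcfile.open(stack, permissive=True, header_only=True) as mrc:
                stack_sizes[stack] = int(mrc.header.nz)
        if index > stack_sizes[stack]:
            bad += 1
            print(f"{index}@{stack} but the header reports nz={stack_sizes[stack]}")

print(f"{bad} particle references point past the end of their stack")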
From: Dmitry S. <Sem...@gm...> - 2021-05-19 08:26:34
|
Dear Pablo and Grigory, Thank you! Yes, I am well aware of the fact that there is no info for the CTF etc :) All I need is just a nice aligned 2D set of images (from the set I imported) (Ideally I would like to have this set just from cryosparc but I have no idea how to do that right away :) ) Please see the log of relion RUNNING PROTOCOL ----------------- 00002: Hostname: cryoem01 00003: PID: 36440 00004: pyworkflow: 3.0.13 00005: plugin: relion 00006: plugin v: 3.1.2 00007: currentDir: /data1/ScipionUserData/projects/Caro__helix 00008: workingDir: Runs/001735_ProtRelionClassify2D 00009: runMode: Continue 00010: MPI: 3 00011: threads: 3 00012: Starting at step: 1 00013: Running steps 00014: STARTED: convertInputStep, step 1, time 2021-05-18 15:06:21.557294 00015: Converting set from 'Runs/001662_ProtImportParticles/particles.sqlite' into 'Runs/001735_ProtRelionClassify2D/input_particles.star' 00016: convertBinaryFiles: creating soft links. 00017: Root: Runs/001735_ProtRelionClassify2D/extra/input -> Runs/001662_ProtImportParticles/extra 00018: FINISHED: convertInputStep, step 1, time 2021-05-18 15:06:22.474076 00019: STARTED: runRelionStep, step 2, time 2021-05-18 15:06:22.502665 00020: mpirun -np 3 `which relion_refine_mpi` --i Runs/001735_ProtRelionClassify2D/input_particles.star --particle_diameter 690 --zero_mask --K 64 --norm --scale --o Runs/001735_ProtRelionClassify2D/extra/relion --oversampling 1 --flatten_solvent --tau2_fudge 2.0 --iter 25 --offset_range 5.0 --offset_step 2.0 --psi_step 10.0 --dont_combine_weights_via_disc --scratch_dir /data1/new_scratch/ --pool 3 --gpu --j 3 00021: RELION version: 3.1.2 00022: Precision: BASE=double, CUDA-ACC=single 00023: 00024: === RELION MPI setup === 00025: + Number of MPI processes = 3 00026: + Number of threads per MPI process = 3 00027: + Total number of threads therefore = 9 00028: + Leader (0) runs on host = cryoem01 00029: + Follower 1 runs on host = cryoem01 00030: + Follower 2 runs on host = cryoem01 00031: ================= 00032: uniqueHost cryoem01 has 2 ranks. 00033: GPU-ids not specified for this rank, threads will automatically be mapped to available devices. 00034: Thread 0 on follower 1 mapped to device 0 00035: Thread 1 on follower 1 mapped to device 0 00036: Thread 2 on follower 1 mapped to device 0 00037: GPU-ids not specified for this rank, threads will automatically be mapped to available devices. 00038: Thread 0 on follower 2 mapped to device 1 00039: Thread 1 on follower 2 mapped to device 1 00040: Thread 2 on follower 2 mapped to device 1 00041: Running CPU instructions in double precision. 00042: + WARNING: Changing psi sampling rate (before oversampling) to 5.625 degrees, for more efficient GPU calculations 00043: + On host cryoem01: free scratch space = 447.485 Gb. 00044: Copying particles to scratch directory: /data1/new_scratch/relion_volatile/ 00045: 000/??? 
sec ~~(,_,"> [oo] 00046: 0/ 0 sec ~~(,_,">in: /opt/Scipion3/software/em/relion-3.1.2/src/rwMRC.h, line 192 00047: ERROR: 00048: readMRC: Image number 11 exceeds stack size 1 of image 000011@Runs/001735_ProtRelionClassify2D/extra/input/1024562735536827037_FoilHole_1618719_Data_1621438_1621440_20200703_085118_Fractions_patch_aligned_doseweighted_particles.mrcs 00049: === Backtrace === 00050: /opt/Scipion3/software/em/relion-3.1.2/bin/relion_refine_mpi(_ZN11RelionErrorC1ERKSsS1_l+0x41) [0x4786a1] 00051: /opt/Scipion3/software/em/relion-3.1.2/bin/relion_refine_mpi(_ZN5ImageIdE7readMRCElbRK8FileName+0x99f) [0x4b210f] 00052: /opt/Scipion3/software/em/relion-3.1.2/bin/relion_refine_mpi(_ZN5ImageIdE5_readERK8FileNameR13fImageHandlerblbb+0x17b) [0x4b407b] 00053: /opt/Scipion3/software/em/relion-3.1.2/bin/relion_refine_mpi(_ZN10Experiment22copyParticlesToScratchEibbd+0xda7) [0x5b8f87] 00054: /opt/Scipion3/software/em/relion-3.1.2/bin/relion_refine_mpi(_ZN14MlOptimiserMpi18initialiseWorkLoadEv+0x210) [0x498540] 00055: /opt/Scipion3/software/em/relion-3.1.2/bin/relion_refine_mpi(_ZN14MlOptimiserMpi10initialiseEv+0x9aa) [0x49ab2a] 00056: /opt/Scipion3/software/em/relion-3.1.2/bin/relion_refine_mpi(main+0x55) [0x4322a5] 00057: /lib64/libc.so.6(__libc_start_main+0xf5) [0x7fea6a54e555] 00058: /opt/Scipion3/software/em/relion-3.1.2/bin/relion_refine_mpi() [0x435fbf] 00059: ================== 00060: ERROR: 00061: readMRC: Image number 11 exceeds stack size 1 of image 000011@Runs/001735_ProtRelionClassify2D/extra/input/1024562735536827037_FoilHole_1618719_Data_1621438_1621440_20200703_085118_Fractions_patch_aligned_doseweighted_particles.mrcs 00062: application called MPI_Abort(MPI_COMM_WORLD, 1) - process 1 00063: Traceback (most recent call last): 00064: File "/usr/local/miniconda/envs/scipion3/lib/python3.8/site-packages/pyworkflow/protocol/protocol.py", line 197, in run 00065: self._run() 00066: File "/usr/local/miniconda/envs/scipion3/lib/python3.8/site-packages/pyworkflow/protocol/protocol.py", line 248, in _run 00067: resultFiles = self._runFunc() 00068: File "/usr/local/miniconda/envs/scipion3/lib/python3.8/site-packages/pyworkflow/protocol/protocol.py", line 244, in _runFunc 00069: return self._func(*self._args) 00070: File "/usr/local/miniconda/envs/scipion3/lib/python3.8/site-packages/relion/protocols/protocol_base.py", line 811, in runRelionStep 00071: self.runJob(self._getProgram(), params) 00072: File "/usr/local/miniconda/envs/scipion3/lib/python3.8/site-packages/pyworkflow/protocol/protocol.py", line 1388, in runJob 00073: self._stepsExecutor.runJob(self._log, program, arguments, **kwargs) 00074: File "/usr/local/miniconda/envs/scipion3/lib/python3.8/site-packages/pyworkflow/protocol/executor.py", line 65, in runJob 00075: process.runJob(log, programName, params, 00076: File "/usr/local/miniconda/envs/scipion3/lib/python3.8/site-packages/pyworkflow/utils/process.py", line 52, in runJob 00077: return runCommand(command, env, cwd) 00078: File "/usr/local/miniconda/envs/scipion3/lib/python3.8/site-packages/pyworkflow/utils/process.py", line 67, in runCommand 00079: check_call(command, shell=True, stdout=sys.stdout, stderr=sys.stderr, 00080: File "/usr/local/miniconda/envs/scipion3/lib/python3.8/subprocess.py", line 364, in check_call 00081: raise CalledProcessError(retcode, cmd) 00082: subprocess.CalledProcessError: Command ' mpirun -np 3 `which relion_refine_mpi` --i Runs/001735_ProtRelionClassify2D/input_particles.star --particle_diameter 690 --zero_mask --K 64 --norm --scale --o 
Runs/001735_ProtRelionClassify2D/extra/relion --oversampling 1 --flatten_solvent --tau2_fudge 2.0 --iter 25 --offset_range 5.0 --offset_step 2.0 --psi_step 10.0 --dont_combine_weights_via_disc --scratch_dir /data1/new_scratch/ --pool 3 --gpu --j 3' returned non-zero exit status 1. 00083: Protocol failed: Command ' mpirun -np 3 `which relion_refine_mpi` --i Runs/001735_ProtRelionClassify2D/input_particles.star --particle_diameter 690 --zero_mask --K 64 --norm --scale --o Runs/001735_ProtRelionClassify2D/extra/relion --oversampling 1 --flatten_solvent --tau2_fudge 2.0 --iter 25 --offset_range 5.0 --offset_step 2.0 --psi_step 10.0 --dont_combine_weights_via_disc --scratch_dir /data1/new_scratch/ --pool 3 --gpu --j 3' returned non-zero exit status 1. 00084: FAILED: runRelionStep, step 2, time 2021-05-18 15:06:24.609548 00085: *** Last status is failed 00086: ------------------- PROTOCOL FAILED (DONE 2/3) Additionally, So it seemed from the first look that there is some issue with the image 11 — I delete the image 11 but the problem still remained. Optionally, I believe that xmipp-2D should work but I did not try it yet. Thank you Sincerely, Dmitry > On 19. May 2021, at 10:13, Pablo Conesa <pc...@cn...> wrote: > > Hi! So, I think Grigory is right, you've gone through the import particles "without metadata info" therefore you only have the images without any alignment information. > > > > In theory, 2d classification should work with this kind of import. Could you please share the logs of one of the relion classification? > > On 19/5/21 9:26, Dmitry Semchonok wrote: >> Dear Grigory, >> >> >> I did nothing much, just tried to start relion 2D // or cryosparc. >> >> >> >> The only thing I tried additionally since I could no proceed is to a) change the box size; b) just resave the subset with the same number of images. >> >> >> Please see the image. >> >> >> <Screenshot 2021-05-19 at 09.23.41.png> >> >> >> Thank you >> >> >> Sincerely, >> >> Dmitry >> >> >> >>> On 18. May 2021, at 15:19, Grigory Sharov <sha...@gm... <mailto:sha...@gm...>> wrote: >>> >>> Hi, >>> >>> I imagine you have 624 micrographs, so particles are exported to mrc on a mic basis. I see you used the "files" option to import mrcs particles into Scipion. This means the imported particles have no metadata except pixel size you provided. >>> >>> What did you do with them after import? >>> >>> Best regards, >>> Grigory >>> >>> -------------------------------------------------------------------------------- >>> Grigory Sharov, Ph.D. >>> >>> MRC Laboratory of Molecular Biology, >>> Francis Crick Avenue, >>> Cambridge Biomedical Campus, >>> Cambridge CB2 0QH, UK. >>> tel. +44 (0) 1223 267228 <tel:+44%201223%20267228> >>> e-mail: gs...@mr... <mailto:gs...@mr...> >>> >>> >>> On Tue, May 18, 2021 at 2:12 PM Dmitry Semchonok <Sem...@gm... <mailto:Sem...@gm...>> wrote: >>> Dear Grigory, >>> >>> Yes I did that — the particles are looking fine. >>> >>> >>> I guess the issue still comes from the fact that originally in cryosparc Export the stacks of particle were placed into 624 mrc. But the number of particles is about 44 818. So even after the renaming and the export I see this in SCIPION export log >>> >>> <Screenshot 2021-05-18 at 15.09.11.png> >>> >>> What I guess may help is if I somehow combine all those files in 1 mrcs first and then add import them to SCIPION. >>> Do you perhaps know how to do that? >>> >>> Thank you >>> >>> Sincerely, >>> Dmitry >>> >>> >>> >>>> On 18. May 2021, at 15:02, Grigory Sharov <sha...@gm... 
<mailto:sha...@gm...>> wrote: >>>> >>>> Hi Dmitry, >>>> >>>> as the error states your star file points to a non-existing image in the mrcs stack. You need to check first if you import from cryosparc with mrcs worked correctly (open / display particles) then trace all the steps you did before 2D classification. >>>> >>>> Best regards, >>>> Grigory >>>> >>>> -------------------------------------------------------------------------------- >>>> Grigory Sharov, Ph.D. >>>> >>>> MRC Laboratory of Molecular Biology, >>>> Francis Crick Avenue, >>>> Cambridge Biomedical Campus, >>>> Cambridge CB2 0QH, UK. >>>> tel. +44 (0) 1223 267228 <tel:+44%201223%20267228> >>>> e-mail: gs...@mr... <mailto:gs...@mr...> >>>> >>>> >>>> On Tue, May 18, 2021 at 1:25 PM Dmitry Semchonok <Sem...@gm... <mailto:Sem...@gm...>> wrote: >>>> Dear Pablo, >>>> >>>> >>>> Thank you. I heard about this option. For that I guess the https://pypi.org/project/cs2star/ <https://pypi.org/project/cs2star/> needs to be installed. >>>> >>>> >>>> >>>> >>>> In Cryosparc itself there is an option to export files. And then what we get is the mrc files with different number of particles in each. >>>> >>>> It appeared to be possible to rename mrc —> to —>mrcs. Then SCIPION can import those particles. >>>> >>>> Currently the problem is that not relion nor cryosparc can run these particles. >>>> >>>> >>>> >>>> >>>> Relion stops with error: >>>> >>>> >>>> 00001: RUNNING PROTOCOL ----------------- >>>> 00002: Hostname: cryoem01 >>>> 00003: PID: 46455 >>>> 00004: pyworkflow: 3.0.13 >>>> 00005: plugin: relion >>>> 00006: plugin v: 3.1.2 >>>> 00007: currentDir: /data1/ScipionUserData/projects/Caro__helix >>>> 00008: workingDir: Runs/001546_ProtRelionClassify2D >>>> 00009: runMode: Continue >>>> 00010: MPI: 3 >>>> 00011: threads: 3 >>>> 00012: Starting at step: 1 >>>> 00013: Running steps >>>> 00014: STARTED: convertInputStep, step 1, time 2021-05-18 12:33:26.198123 >>>> 00015: Converting set from 'Runs/001492_ProtUserSubSet/particles.sqlite' into 'Runs/001546_ProtRelionClassify2D/input_particles.star' >>>> 00016: convertBinaryFiles: creating soft links. >>>> 00017: Root: Runs/001546_ProtRelionClassify2D/extra/input -> Runs/001057_ProtImportParticles/extra >>>> 00018: FINISHED: convertInputStep, step 1, time 2021-05-18 12:33:27.117588 >>>> 00019: STARTED: runRelionStep, step 2, time 2021-05-18 12:33:27.145974 >>>> 00020: mpirun -np 3 `which relion_refine_mpi` --i Runs/001546_ProtRelionClassify2D/input_particles.star --particle_diameter 690 --zero_mask --K 64 --norm --scale --o Runs/001546_ProtRelionClassify2D/extra/relion --oversampling 1 --flatten_solvent --tau2_fudge 2.0 --iter 25 --offset_range 5.0 --offset_step 2.0 --psi_step 10.0 --dont_combine_weights_via_disc --scratch_dir /data1/new_scratch/ --pool 3 --gpu --j 3 >>>> 00021: RELION version: 3.1.2 >>>> 00022: Precision: BASE=double, CUDA-ACC=single >>>> 00023: >>>> 00024: === RELION MPI setup === >>>> 00025: + Number of MPI processes = 3 >>>> 00026: + Number of threads per MPI process = 3 >>>> 00027: + Total number of threads therefore = 9 >>>> 00028: + Leader (0) runs on host = cryoem01 >>>> 00029: + Follower 1 runs on host = cryoem01 >>>> 00030: + Follower 2 runs on host = cryoem01 >>>> 00031: ================= >>>> 00032: uniqueHost cryoem01 has 2 ranks. >>>> 00033: GPU-ids not specified for this rank, threads will automatically be mapped to available devices. 
>>>> 00034: Thread 0 on follower 1 mapped to device 0 >>>> 00035: Thread 1 on follower 1 mapped to device 0 >>>> 00036: Thread 2 on follower 1 mapped to device 0 >>>> 00037: GPU-ids not specified for this rank, threads will automatically be mapped to available devices. >>>> 00038: Thread 0 on follower 2 mapped to device 1 >>>> 00039: Thread 1 on follower 2 mapped to device 1 >>>> 00040: Thread 2 on follower 2 mapped to device 1 >>>> 00041: Running CPU instructions in double precision. >>>> 00042: + WARNING: Changing psi sampling rate (before oversampling) to 5.625 degrees, for more efficient GPU calculations >>>> 00043: + On host cryoem01: free scratch space = 448.252 Gb. >>>> 00044: Copying particles to scratch directory: /data1/new_scratch/relion_volatile/ >>>> 00045: 000/??? sec ~~(,_,"> [oo] >>>> 00046: 1/ 60 sec ~~(,_,">in: /opt/Scipion3/software/em/relion-3.1.2/src/rwMRC.h, line 192 >>>> 00047: ERROR: >>>> 00048: readMRC: Image number 11 exceeds stack size 1 of image 000011@Runs/001546_ProtRelionClassify2D/extra/input/1024562735536827037_FoilHole_1618719_Data_1621438_1621440_20200703_085118_Fractions_patch_aligned_doseweighted_particles.mrcs <mailto:000011@Runs/001546_ProtRelionClassify2D/extra/input/1024562735536827037_FoilHole_1618719_Data_1621438_1621440_20200703_085118_Fractions_patch_aligned_doseweighted_particles.mrcs> >>>> 00049: === Backtrace === >>>> 00050: /opt/Scipion3/software/em/relion-3.1.2/bin/relion_refine_mpi(_ZN11RelionErrorC1ERKSsS1_l+0x41) [0x4786a1] >>>> 00051: /opt/Scipion3/software/em/relion-3.1.2/bin/relion_refine_mpi(_ZN5ImageIdE7readMRCElbRK8FileName+0x99f) [0x4b210f] >>>> 00052: /opt/Scipion3/software/em/relion-3.1.2/bin/relion_refine_mpi(_ZN5ImageIdE5_readERK8FileNameR13fImageHandlerblbb+0x17b) [0x4b407b] >>>> 00053: /opt/Scipion3/software/em/relion-3.1.2/bin/relion_refine_mpi(_ZN10Experiment22copyParticlesToScratchEibbd+0xda7) [0x5b8f87] >>>> 00054: /opt/Scipion3/software/em/relion-3.1.2/bin/relion_refine_mpi(_ZN14MlOptimiserMpi18initialiseWorkLoadEv+0x210) [0x498540] >>>> 00055: /opt/Scipion3/software/em/relion-3.1.2/bin/relion_refine_mpi(_ZN14MlOptimiserMpi10initialiseEv+0x9aa) [0x49ab2a] >>>> 00056: /opt/Scipion3/software/em/relion-3.1.2/bin/relion_refine_mpi(main+0x55) [0x4322a5] >>>> 00057: /lib64/libc.so.6(__libc_start_main+0xf5) [0x7f657f51a555] >>>> 00058: /opt/Scipion3/software/em/relion-3.1.2/bin/relion_refine_mpi() [0x435fbf] >>>> 00059: ================== >>>> 00060: ERROR: >>>> 00061: readMRC: Image number 11 exceeds stack size 1 of image 000011@Runs/001546_ProtRelionClassify2D/extra/input/1024562735536827037_FoilHole_1618719_Data_1621438_1621440_20200703_085118_Fractions_patch_aligned_doseweighted_particles.mrcs <mailto:000011@Runs/001546_ProtRelionClassify2D/extra/input/1024562735536827037_FoilHole_1618719_Data_1621438_1621440_20200703_085118_Fractions_patch_aligned_doseweighted_particles.mrcs> >>>> 00062: application called MPI_Abort(MPI_COMM_WORLD, 1) - process 1 >>>> 00063: Traceback (most recent call last): >>>> 00064: File "/usr/local/miniconda/envs/scipion3/lib/python3.8/site-packages/pyworkflow/protocol/protocol.py", line 197, in run >>>> 00065: self._run() >>>> 00066: File "/usr/local/miniconda/envs/scipion3/lib/python3.8/site-packages/pyworkflow/protocol/protocol.py", line 248, in _run >>>> 00067: resultFiles = self._runFunc() >>>> 00068: File "/usr/local/miniconda/envs/scipion3/lib/python3.8/site-packages/pyworkflow/protocol/protocol.py", line 244, in _runFunc >>>> 00069: return self._func(*self._args) >>>> 00070: File 
"/usr/local/miniconda/envs/scipion3/lib/python3.8/site-packages/relion/protocols/protocol_base.py", line 811, in runRelionStep >>>> 00071: self.runJob(self._getProgram(), params) >>>> 00072: File "/usr/local/miniconda/envs/scipion3/lib/python3.8/site-packages/pyworkflow/protocol/protocol.py", line 1388, in runJob >>>> 00073: self._stepsExecutor.runJob(self._log, program, arguments, **kwargs) >>>> 00074: File "/usr/local/miniconda/envs/scipion3/lib/python3.8/site-packages/pyworkflow/protocol/executor.py", line 65, in runJob >>>> 00075: process.runJob(log, programName, params, >>>> 00076: File "/usr/local/miniconda/envs/scipion3/lib/python3.8/site-packages/pyworkflow/utils/process.py", line 52, in runJob >>>> 00077: return runCommand(command, env, cwd) >>>> 00078: File "/usr/local/miniconda/envs/scipion3/lib/python3.8/site-packages/pyworkflow/utils/process.py", line 67, in runCommand >>>> 00079: check_call(command, shell=True, stdout=sys.stdout, stderr=sys.stderr, >>>> 00080: File "/usr/local/miniconda/envs/scipion3/lib/python3.8/subprocess.py", line 364, in check_call >>>> 00081: raise CalledProcessError(retcode, cmd) >>>> 00082: subprocess.CalledProcessError: Command ' mpirun -np 3 `which relion_refine_mpi` --i Runs/001546_ProtRelionClassify2D/input_particles.star --particle_diameter 690 --zero_mask --K 64 --norm --scale --o Runs/001546_ProtRelionClassify2D/extra/relion --oversampling 1 --flatten_solvent --tau2_fudge 2.0 --iter 25 --offset_range 5.0 --offset_step 2.0 --psi_step 10.0 --dont_combine_weights_via_disc --scratch_dir /data1/new_scratch/ --pool 3 --gpu --j 3' returned non-zero exit status 1. >>>> 00083: Protocol failed: Command ' mpirun -np 3 `which relion_refine_mpi` --i Runs/001546_ProtRelionClassify2D/input_particles.star --particle_diameter 690 --zero_mask --K 64 --norm --scale --o Runs/001546_ProtRelionClassify2D/extra/relion --oversampling 1 --flatten_solvent --tau2_fudge 2.0 --iter 25 --offset_range 5.0 --offset_step 2.0 --psi_step 10.0 --dont_combine_weights_via_disc --scratch_dir /data1/new_scratch/ --pool 3 --gpu --j 3' returned non-zero exit status 1. >>>> 00084: FAILED: runRelionStep, step 2, time 2021-05-18 12:33:29.230213 >>>> 00085: *** Last status is failed >>>> 00086: ------------------- PROTOCOL FAILED (DONE 2/3) >>>> >>>> >>>> Cryosparc (in SCIPION) requires CTF to run. >>>> >>>> >>>> >>>> Thais is where I am now. >>>> >>>> Perhaps there is a solution? >>>> >>>> >>>> Sincerely, >>>> Dmitry >>>> >>>> >>>> >>>>> On 18. May 2021, at 14:16, Pablo Conesa <pc...@cn... <mailto:pc...@cn...>> wrote: >>>>> >>>>> Dear Dmitry, the import of CS metadata files (*.cs) is not supported in Scipion. Does CS has an option to export to star files. It rings a bell. >>>>> >>>>> On 18/5/21 9:53, Dmitry Semchonok wrote: >>>>>> <>Dear Grigory, >>>>>> >>>>>> >>>>>> The files are in mrc format. >>>>>> >>>>>> >>>>>> Please, let me try to explain more plan: >>>>>> >>>>>> I have a project in cryosparc. There I have cryosparc selected 2D classes. I want to export the particles of those classes into SCIPION. >>>>>> >>>>>> So I I pressed Export (fig 1) and the program(cryosparc) created the folder with mrc + other files (fig 2;3). I looked into J48 and found many *.mrc files of the particles. But it is not 1 mrc = 1 particle. 
It seems to be a mrc stuck - so I have several files inside 1 *.mrc (fig 4) (you can also notice that they all have different sizes) >>>>>> >>>>>> So I need to export them somehow in SCIPION >>>>>> >>>>>> For that, I used the SCIPION export - images protocol where for the files to add I put *.mrc. But the protocol seems to be added only 1 mrc as 1 picture and instead of having 46392 particles I have ~600 particles. >>>>>> >>>>>> (Also the geometry seems not preserved). >>>>>> >>>>>> >>>>>> So my question how to export the particles from cryosparc into SCIPION correctly? >>>>>> >>>>>> >>>>>> Thank you! >>>>>> >>>>>> >>>>>> >>>>>> >>>>>> >>>>>> https://disk.yandex.com/d/Fv3Q1lpwEzSisg <https://disk.yandex.com/d/Fv3Q1lpwEzSisg> >>>>>> >>>>>> Sincerely, >>>>>> >>>>>> Dmitry >>>>>> >>>>>> >>>>>> >>>>>> >>>>>>> On 17. May 2021, at 18:12, Grigory Sharov <sha...@gm... <mailto:sha...@gm...>> wrote: >>>>>>> >>>>>>> Hi Dmitry, >>>>>>> >>>>>>> mrc stacks should have "mrcs" extension. Is this the problem you are getting? >>>>>>> >>>>>>> Best regards, >>>>>>> Grigory >>>>>>> >>>>>>> -------------------------------------------------------------------------------- >>>>>>> Grigory Sharov, Ph.D. >>>>>>> >>>>>>> MRC Laboratory of Molecular Biology, >>>>>>> Francis Crick Avenue, >>>>>>> Cambridge Biomedical Campus, >>>>>>> Cambridge CB2 0QH, UK. >>>>>>> tel. +44 (0) 1223 267228 <tel:+44%201223%20267228> >>>>>>> e-mail: gs...@mr... <mailto:gs...@mr...> >>>>>>> >>>>>>> >>>>>>> On Mon, May 17, 2021 at 4:49 PM Dmitry Semchonok <Sem...@gm... <mailto:Sem...@gm...>> wrote: >>>>>>> <>Dear colleagues, >>>>>>> >>>>>>> I would like to export the particles from cryosparc to SCIPION. >>>>>>> >>>>>>> How to do that? >>>>>>> >>>>>>> >>>>>>> >>>>>>> >>>>>>> >>>>>>> What I tried: >>>>>>> >>>>>>> >>>>>>> 1. In cryosparc I pressed Export – to export the particles I am interested in. >>>>>>> >>>>>>> 2. In the folder Export – I found many mrc stacks with particles in each. >>>>>>> >>>>>>> 3. I tried to export them to SCIPION using Export particles but instead of reading each stack and combine them in the 1 dataset I received 1 particle / per each mrc stack. >>>>>>> >>>>>>> >>>>>>> Any ideas? >>>>>>> >>>>>>> >>>>>>> Thank you >>>>>>> >>>>>>> Sincerely, >>>>>>> >>>>>>> Dmitry >>>>>>> >>>>>>> >>>>>>> _______________________________________________ >>>>>>> scipion-users mailing list >>>>>>> sci...@li... <mailto:sci...@li...> >>>>>>> https://lists.sourceforge.net/lists/listinfo/scipion-users <https://lists.sourceforge.net/lists/listinfo/scipion-users> >>>>>>> _______________________________________________ >>>>>>> scipion-users mailing list >>>>>>> sci...@li... <mailto:sci...@li...> >>>>>>> https://lists.sourceforge.net/lists/listinfo/scipion-users <https://lists.sourceforge.net/lists/listinfo/scipion-users> >>>>>> >>>>>> >>>>>> >>>>>> >>>>>> _______________________________________________ >>>>>> scipion-users mailing list >>>>>> sci...@li... <mailto:sci...@li...> >>>>>> https://lists.sourceforge.net/lists/listinfo/scipion-users <https://lists.sourceforge.net/lists/listinfo/scipion-users> >>>>> -- >>>>> Pablo Conesa - Madrid Scipion <http://scipion.i2pc.es/> team >>>>> _______________________________________________ >>>>> scipion-users mailing list >>>>> sci...@li... <mailto:sci...@li...> >>>>> https://lists.sourceforge.net/lists/listinfo/scipion-users <https://lists.sourceforge.net/lists/listinfo/scipion-users> >>>> >>>> _______________________________________________ >>>> scipion-users mailing list >>>> sci...@li... 
<mailto:sci...@li...> >>>> https://lists.sourceforge.net/lists/listinfo/scipion-users <https://lists.sourceforge.net/lists/listinfo/scipion-users> >>>> _______________________________________________ >>>> scipion-users mailing list >>>> sci...@li... <mailto:sci...@li...> >>>> https://lists.sourceforge.net/lists/listinfo/scipion-users <https://lists.sourceforge.net/lists/listinfo/scipion-users> >>> >>> _______________________________________________ >>> scipion-users mailing list >>> sci...@li... <mailto:sci...@li...> >>> https://lists.sourceforge.net/lists/listinfo/scipion-users <https://lists.sourceforge.net/lists/listinfo/scipion-users> >> >> >> >> >> _______________________________________________ >> scipion-users mailing list >> sci...@li... <mailto:sci...@li...> >> https://lists.sourceforge.net/lists/listinfo/scipion-users <https://lists.sourceforge.net/lists/listinfo/scipion-users> > -- > Pablo Conesa - Madrid Scipion <http://scipion.i2pc.es/> team > _______________________________________________ > scipion-users mailing list > sci...@li... > https://lists.sourceforge.net/lists/listinfo/scipion-users |
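The readMRC error above says the star file asks for image 11 in a stack whose header reports only 1 image, so a quick way to narrow this down is to compare every rlnImageName index in input_particles.star against the depth of the stack it points to. A rough diagnostic sketch, assuming the third-party Python packages starfile and mrcfile are installed (pip install starfile mrcfile) and using the path from the log above only as an example; run it from the Scipion project directory so the relative Runs/... paths resolve:

import mrcfile
import starfile

star_path = "Runs/001735_ProtRelionClassify2D/input_particles.star"  # example path taken from the log above
data = starfile.read(star_path)
# RELION 3.1 star files carry an optics block plus a particles block
particles = data["particles"] if isinstance(data, dict) else data

depth = {}  # cache: stack file -> number of images its header reports
for image_name in particles["rlnImageName"]:
    index_str, stack = image_name.split("@")
    index = int(index_str)
    if stack not in depth:
        with mrcfile.open(stack, permissive=True, header_only=True) as mrc:
            depth[stack] = int(mrc.header.nz)
    if index > depth[stack]:
        print(f"{stack}: star file wants image {index}, header reports only {depth[stack]}")

If every flagged stack reports nz = 1, the exported files were written (or re-linked) as single images rather than stacks, which is consistent with the "exceeds stack size 1" message RELION prints for image 11.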
From: Pablo C. <pc...@cn...> - 2021-05-19 08:13:56
|
Hi! So, I think Grigory is right, you've gone through the import particles "without metadata info" therefore you only have the images without any alignment information. In theory, 2d classification should work with this kind of import. Could you please share the logs of one of the relion classification? On 19/5/21 9:26, Dmitry Semchonok wrote: > > Dear Grigory, > > I did nothing much, just tried to start relion 2D // or cryosparc. > > > The only thing I tried additionally since I could no proceed is to a) > change the box size; b) just resave the subset with the same number of > images. > > Please see the image. > > > > Thank you > > > Sincerely, > > Dmitry > > >> On 18. May 2021, at 15:19, Grigory Sharov <sha...@gm... >> <mailto:sha...@gm...>> wrote: >> >> Hi, >> >> I imagine you have 624 micrographs, so particles are exported to mrc >> on a mic basis. I see you used the "files" option to import mrcs >> particles into Scipion. This means the imported particles have no >> metadata except pixel size you provided. >> >> What did you do with them after import? >> >> Best regards, >> Grigory >> >> -------------------------------------------------------------------------------- >> Grigory Sharov, Ph.D. >> >> MRC Laboratory of Molecular Biology, >> Francis Crick Avenue, >> Cambridge Biomedical Campus, >> Cambridge CB2 0QH, UK. >> tel. +44 (0) 1223 267228 <tel:+44%201223%20267228> >> e-mail: gs...@mr... <mailto:gs...@mr...> >> >> >> On Tue, May 18, 2021 at 2:12 PM Dmitry Semchonok <Sem...@gm... >> <mailto:Sem...@gm...>> wrote: >> >> Dear Grigory, >> >> Yes I did that — the particles are looking fine. >> >> >> I guess the issue still comes from the fact that originally in >> cryosparc Export the stacks of particle were placed into 624 mrc. >> But the number of particles is about 44 818. So even after the >> renaming and the export I see this in SCIPION export log >> >> <Screenshot 2021-05-18 at 15.09.11.png> >> >> What I guess may help is if I somehow combine all those files in >> 1 mrcs first and then add import them to SCIPION. >> Do you perhaps know how to do that? >> >> Thank you >> >> Sincerely, >> Dmitry >> >> >> >>> On 18. May 2021, at 15:02, Grigory Sharov >>> <sha...@gm... <mailto:sha...@gm...>> wrote: >>> >>> Hi Dmitry, >>> >>> as the error states your star file points to a non-existing >>> image in the mrcs stack. You need to check first if you import >>> from cryosparc with mrcs worked correctly (open / display >>> particles) then trace all the steps you did before 2D >>> classification. >>> >>> Best regards, >>> Grigory >>> >>> -------------------------------------------------------------------------------- >>> Grigory Sharov, Ph.D. >>> >>> MRC Laboratory of Molecular Biology, >>> Francis Crick Avenue, >>> Cambridge Biomedical Campus, >>> Cambridge CB2 0QH, UK. >>> tel. +44 (0) 1223 267228 <tel:+44%201223%20267228> >>> e-mail: gs...@mr... <mailto:gs...@mr...> >>> >>> >>> On Tue, May 18, 2021 at 1:25 PM Dmitry Semchonok >>> <Sem...@gm... <mailto:Sem...@gm...>> wrote: >>> >>> Dear Pablo, >>> >>> >>> Thank you. I heard about this option. For that I guess the >>> https://pypi.org/project/cs2star/ >>> <https://pypi.org/project/cs2star/> needs to be installed. >>> >>> >>> >>> >>> In Cryosparc itself there is an option to export files. And >>> then what we get is the mrc files with different number of >>> particles in each. >>> >>> It appeared to be possible to rename mrc —> to —>mrcs. Then >>> SCIPION can import those particles. 
>>> >>> Currently the problem is that not relion nor cryosparc can >>> run these particles. >>> >>> >>> >>> >>> Relion stops with error: >>> >>> >>> 00001: RUNNING PROTOCOL ----------------- >>> 00002: Hostname: cryoem01 >>> 00003: PID: 46455 >>> 00004: pyworkflow: 3.0.13 >>> 00005: plugin: relion >>> 00006: plugin v: 3.1.2 >>> 00007: currentDir: /data1/ScipionUserData/projects/Caro__helix >>> 00008: workingDir: Runs/001546_ProtRelionClassify2D >>> 00009: runMode: Continue >>> 00010: MPI: 3 >>> 00011: threads: 3 >>> 00012: Starting at step: 1 >>> 00013: Running steps >>> 00014: STARTED: convertInputStep, step 1, time 2021-05-18 >>> 12:33:26.198123 >>> 00015: Converting set from >>> 'Runs/001492_ProtUserSubSet/particles.sqlite' into >>> 'Runs/001546_ProtRelionClassify2D/input_particles.star' >>> 00016: convertBinaryFiles: creating soft links. >>> 00017: Root: >>> Runs/001546_ProtRelionClassify2D/extra/input -> >>> Runs/001057_ProtImportParticles/extra >>> 00018: FINISHED: convertInputStep, step 1, time 2021-05-18 >>> 12:33:27.117588 >>> 00019: STARTED: runRelionStep, step 2, time 2021-05-18 >>> 12:33:27.145974 >>> 00020: mpirun -np 3 `which relion_refine_mpi` --i >>> Runs/001546_ProtRelionClassify2D/input_particles.star >>> --particle_diameter 690 --zero_mask --K 64 --norm --scale >>> --o Runs/001546_ProtRelionClassify2D/extra/relion >>> --oversampling 1 --flatten_solvent --tau2_fudge 2.0 --iter >>> 25 --offset_range 5.0 --offset_step 2.0 --psi_step 10.0 >>> --dont_combine_weights_via_disc --scratch_dir >>> /data1/new_scratch/ --pool 3 --gpu --j 3 >>> 00021: RELION version: 3.1.2 >>> 00022: Precision: BASE=double, CUDA-ACC=single >>> 00023: >>> 00024: === RELION MPI setup === >>> 00025: + Number of MPI processes = 3 >>> 00026: + Number of threads per MPI process = 3 >>> 00027: + Total number of threads therefore = 9 >>> 00028: + Leader (0) runs on host = cryoem01 >>> 00029: + Follower 1 runs on host = cryoem01 >>> 00030: + Follower 2 runs on host = cryoem01 >>> 00031: ================= >>> 00032: uniqueHost cryoem01 has 2 ranks. >>> 00033: GPU-ids not specified for this rank, threads will >>> automatically be mapped to available devices. >>> 00034: Thread 0 on follower 1 mapped to device 0 >>> 00035: Thread 1 on follower 1 mapped to device 0 >>> 00036: Thread 2 on follower 1 mapped to device 0 >>> 00037: GPU-ids not specified for this rank, threads will >>> automatically be mapped to available devices. >>> 00038: Thread 0 on follower 2 mapped to device 1 >>> 00039: Thread 1 on follower 2 mapped to device 1 >>> 00040: Thread 2 on follower 2 mapped to device 1 >>> 00041: Running CPU instructions in double precision. >>> 00042: + WARNING: Changing psi sampling rate (before >>> oversampling) to 5.625 degrees, for more efficient GPU >>> calculations >>> 00043: + On host cryoem01: free scratch space = 448.252 Gb. >>> 00044: Copying particles to scratch directory: >>> /data1/new_scratch/relion_volatile/ >>> 00045: 000/??? 
sec ~~(,_,"> [oo] >>> 00046: 1/ 60 sec ~~(,_,">in: >>> /opt/Scipion3/software/em/relion-3.1.2/src/rwMRC.h, line 192 >>> 00047: ERROR: >>> 00048: readMRC: Image number 11 exceeds stack size 1 of >>> image >>> 000011@Runs/001546_ProtRelionClassify2D/extra/input/1024562735536827037_FoilHole_1618719_Data_1621438_1621440_20200703_085118_Fractions_patch_aligned_doseweighted_particles.mrcs >>> 00049: === Backtrace === >>> 00050: >>> /opt/Scipion3/software/em/relion-3.1.2/bin/relion_refine_mpi(_ZN11RelionErrorC1ERKSsS1_l+0x41) >>> [0x4786a1] >>> 00051: >>> /opt/Scipion3/software/em/relion-3.1.2/bin/relion_refine_mpi(_ZN5ImageIdE7readMRCElbRK8FileName+0x99f) >>> [0x4b210f] >>> 00052: >>> /opt/Scipion3/software/em/relion-3.1.2/bin/relion_refine_mpi(_ZN5ImageIdE5_readERK8FileNameR13fImageHandlerblbb+0x17b) >>> [0x4b407b] >>> 00053: >>> /opt/Scipion3/software/em/relion-3.1.2/bin/relion_refine_mpi(_ZN10Experiment22copyParticlesToScratchEibbd+0xda7) >>> [0x5b8f87] >>> 00054: >>> /opt/Scipion3/software/em/relion-3.1.2/bin/relion_refine_mpi(_ZN14MlOptimiserMpi18initialiseWorkLoadEv+0x210) >>> [0x498540] >>> 00055: >>> /opt/Scipion3/software/em/relion-3.1.2/bin/relion_refine_mpi(_ZN14MlOptimiserMpi10initialiseEv+0x9aa) >>> [0x49ab2a] >>> 00056: >>> /opt/Scipion3/software/em/relion-3.1.2/bin/relion_refine_mpi(main+0x55) >>> [0x4322a5] >>> 00057: /lib64/libc.so.6(__libc_start_main+0xf5) [0x7f657f51a555] >>> 00058: >>> /opt/Scipion3/software/em/relion-3.1.2/bin/relion_refine_mpi() >>> [0x435fbf] >>> 00059: ================== >>> 00060: ERROR: >>> 00061: readMRC: Image number 11 exceeds stack size 1 of >>> image >>> 000011@Runs/001546_ProtRelionClassify2D/extra/input/1024562735536827037_FoilHole_1618719_Data_1621438_1621440_20200703_085118_Fractions_patch_aligned_doseweighted_particles.mrcs >>> 00062: application called MPI_Abort(MPI_COMM_WORLD, 1) - >>> process 1 >>> 00063: Traceback (most recent call last): >>> 00064: File >>> "/usr/local/miniconda/envs/scipion3/lib/python3.8/site-packages/pyworkflow/protocol/protocol.py", >>> line 197, in run >>> 00065: self._run() >>> 00066: File >>> "/usr/local/miniconda/envs/scipion3/lib/python3.8/site-packages/pyworkflow/protocol/protocol.py", >>> line 248, in _run >>> 00067: resultFiles = self._runFunc() >>> 00068: File >>> "/usr/local/miniconda/envs/scipion3/lib/python3.8/site-packages/pyworkflow/protocol/protocol.py", >>> line 244, in _runFunc >>> 00069: return self._func(*self._args) >>> 00070: File >>> "/usr/local/miniconda/envs/scipion3/lib/python3.8/site-packages/relion/protocols/protocol_base.py", >>> line 811, in runRelionStep >>> 00071: self.runJob(self._getProgram(), params) >>> 00072: File >>> "/usr/local/miniconda/envs/scipion3/lib/python3.8/site-packages/pyworkflow/protocol/protocol.py", >>> line 1388, in runJob >>> 00073: self._stepsExecutor.runJob(self._log, program, >>> arguments, **kwargs) >>> 00074: File >>> "/usr/local/miniconda/envs/scipion3/lib/python3.8/site-packages/pyworkflow/protocol/executor.py", >>> line 65, in runJob >>> 00075: process.runJob(log, programName, params, >>> 00076: File >>> "/usr/local/miniconda/envs/scipion3/lib/python3.8/site-packages/pyworkflow/utils/process.py", >>> line 52, in runJob >>> 00077: return runCommand(command, env, cwd) >>> 00078: File >>> "/usr/local/miniconda/envs/scipion3/lib/python3.8/site-packages/pyworkflow/utils/process.py", >>> line 67, in runCommand >>> 00079: check_call(command, shell=True, stdout=sys.stdout, >>> stderr=sys.stderr, >>> 00080: File >>> 
"/usr/local/miniconda/envs/scipion3/lib/python3.8/subprocess.py", >>> line 364, in check_call >>> 00081: raise CalledProcessError(retcode, cmd) >>> 00082: subprocess.CalledProcessError: Command ' mpirun -np 3 >>> `which relion_refine_mpi` --i >>> Runs/001546_ProtRelionClassify2D/input_particles.star >>> --particle_diameter 690 --zero_mask --K 64 --norm --scale >>> --o Runs/001546_ProtRelionClassify2D/extra/relion >>> --oversampling 1 --flatten_solvent --tau2_fudge 2.0 --iter >>> 25 --offset_range 5.0 --offset_step 2.0 --psi_step 10.0 >>> --dont_combine_weights_via_disc --scratch_dir >>> /data1/new_scratch/ --pool 3 --gpu --j 3' returned >>> non-zero exit status 1. >>> 00083: Protocol failed: Command ' mpirun -np 3 `which >>> relion_refine_mpi` --i >>> Runs/001546_ProtRelionClassify2D/input_particles.star >>> --particle_diameter 690 --zero_mask --K 64 --norm --scale >>> --o Runs/001546_ProtRelionClassify2D/extra/relion >>> --oversampling 1 --flatten_solvent --tau2_fudge 2.0 --iter >>> 25 --offset_range 5.0 --offset_step 2.0 --psi_step 10.0 >>> --dont_combine_weights_via_disc --scratch_dir >>> /data1/new_scratch/ --pool 3 --gpu --j 3' returned >>> non-zero exit status 1. >>> 00084: FAILED: runRelionStep, step 2, time 2021-05-18 >>> 12:33:29.230213 >>> 00085: *** Last status is failed >>> 00086: ------------------- PROTOCOL FAILED (DONE 2/3) >>> >>> >>> Cryosparc (in SCIPION) requires CTF to run. >>> >>> >>> >>> Thais is where I am now. >>> >>> Perhaps there is a solution? >>> >>> >>> Sincerely, >>> Dmitry >>> >>> >>> >>>> On 18. May 2021, at 14:16, Pablo Conesa >>>> <pc...@cn... <mailto:pc...@cn...>> wrote: >>>> >>>> Dear Dmitry, the import of CS metadata files (*.cs) is not >>>> supported in Scipion. Does CS has an option to export to >>>> star files. It rings a bell. >>>> >>>> On 18/5/21 9:53, Dmitry Semchonok wrote: >>>>> >>>>> Dear Grigory, >>>>> >>>>> >>>>> The files are in mrc format. >>>>> >>>>> >>>>> Please, let me try to explain more plan: >>>>> >>>>> I have a project in cryosparc. There I have cryosparc >>>>> selected 2D classes. I want to export the particles of >>>>> those classes into SCIPION. >>>>> >>>>> So I I pressed Export (fig 1) and the program(cryosparc) >>>>> created the folder with mrc + other files (fig 2;3). I >>>>> looked into J48 and found many *.mrc files of the >>>>> particles. But it is not 1 mrc = 1 particle. It seems to >>>>> be a mrc stuck - so I have several files inside 1 *.mrc >>>>> (fig 4) (you can also notice that they all have different >>>>> sizes) >>>>> >>>>> So I need to export them somehow in SCIPION >>>>> >>>>> For that, I used the SCIPION export - images protocol >>>>> where for the files to add I put *.mrc. But the protocol >>>>> seems to be added only 1 mrc as 1 picture and instead of >>>>> having 46392 particles I have ~600 particles. >>>>> >>>>> (Also the geometry seems not preserved). >>>>> >>>>> >>>>> So my question how to export the particles from cryosparc >>>>> into SCIPION correctly? >>>>> >>>>> >>>>> Thank you! >>>>> >>>>> >>>>> >>>>> https://disk.yandex.com/d/Fv3Q1lpwEzSisg >>>>> <https://disk.yandex.com/d/Fv3Q1lpwEzSisg> >>>>> >>>>> >>>>> Sincerely, >>>>> >>>>> Dmitry >>>>> >>>>> >>>>> >>>>> >>>>>> On 17. May 2021, at 18:12, Grigory Sharov >>>>>> <sha...@gm... >>>>>> <mailto:sha...@gm...>> wrote: >>>>>> >>>>>> Hi Dmitry, >>>>>> >>>>>> mrc stacks should have "mrcs" extension. Is this the >>>>>> problem you are getting? 
>>>>>> >>>>>> Best regards, >>>>>> Grigory >>>>>> >>>>>> -------------------------------------------------------------------------------- >>>>>> Grigory Sharov, Ph.D. >>>>>> >>>>>> MRC Laboratory of Molecular Biology, >>>>>> Francis Crick Avenue, >>>>>> Cambridge Biomedical Campus, >>>>>> Cambridge CB2 0QH, UK. >>>>>> tel. +44 (0) 1223 267228 <tel:+44%201223%20267228> >>>>>> e-mail: gs...@mr... >>>>>> <mailto:gs...@mr...> >>>>>> >>>>>> >>>>>> On Mon, May 17, 2021 at 4:49 PM Dmitry Semchonok >>>>>> <Sem...@gm... <mailto:Sem...@gm...>> wrote: >>>>>> >>>>>> Dear colleagues, >>>>>> >>>>>> I would like to export the particles from cryosparc >>>>>> to SCIPION. >>>>>> >>>>>> How to do that? >>>>>> >>>>>> >>>>>> >>>>>> >>>>>> >>>>>> What I tried: >>>>>> >>>>>> >>>>>> 1.In cryosparc I pressed Export – to export the >>>>>> particles I am interested in. >>>>>> >>>>>> 2.In the folder Export – I found many mrc stacks with >>>>>> particles in each. >>>>>> >>>>>> 3.I tried to export them to SCIPION using Export >>>>>> particles but instead of reading each stack and >>>>>> combine them in the 1 dataset I received 1 particle / >>>>>> per each mrc stack. >>>>>> >>>>>> >>>>>> Any ideas? >>>>>> >>>>>> >>>>>> Thank you >>>>>> >>>>>> Sincerely, >>>>>> >>>>>> Dmitry >>>>>> >>>>>> >>>>>> _______________________________________________ >>>>>> scipion-users mailing list >>>>>> sci...@li... >>>>>> <mailto:sci...@li...> >>>>>> https://lists.sourceforge.net/lists/listinfo/scipion-users >>>>>> <https://lists.sourceforge.net/lists/listinfo/scipion-users> >>>>>> >>>>>> _______________________________________________ >>>>>> scipion-users mailing list >>>>>> sci...@li... >>>>>> <mailto:sci...@li...> >>>>>> https://lists.sourceforge.net/lists/listinfo/scipion-users >>>>>> <https://lists.sourceforge.net/lists/listinfo/scipion-users> >>>>> >>>>> >>>>> >>>>> _______________________________________________ >>>>> scipion-users mailing list >>>>> sci...@li... <mailto:sci...@li...> >>>>> https://lists.sourceforge.net/lists/listinfo/scipion-users <https://lists.sourceforge.net/lists/listinfo/scipion-users> >>>> -- >>>> Pablo Conesa - *Madrid Scipion <http://scipion.i2pc.es/> team* >>>> _______________________________________________ >>>> scipion-users mailing list >>>> sci...@li... >>>> <mailto:sci...@li...> >>>> https://lists.sourceforge.net/lists/listinfo/scipion-users >>>> <https://lists.sourceforge.net/lists/listinfo/scipion-users> >>> >>> _______________________________________________ >>> scipion-users mailing list >>> sci...@li... >>> <mailto:sci...@li...> >>> https://lists.sourceforge.net/lists/listinfo/scipion-users >>> <https://lists.sourceforge.net/lists/listinfo/scipion-users> >>> >>> _______________________________________________ >>> scipion-users mailing list >>> sci...@li... >>> <mailto:sci...@li...> >>> https://lists.sourceforge.net/lists/listinfo/scipion-users >>> <https://lists.sourceforge.net/lists/listinfo/scipion-users> >> >> _______________________________________________ >> scipion-users mailing list >> sci...@li... >> <mailto:sci...@li...> >> https://lists.sourceforge.net/lists/listinfo/scipion-users > > > > _______________________________________________ > scipion-users mailing list > sci...@li... > https://lists.sourceforge.net/lists/listinfo/scipion-users -- Pablo Conesa - *Madrid Scipion <http://scipion.i2pc.es> team* |
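On the question quoted above about merging the per-micrograph stacks into a single file: a minimal sketch with the mrcfile package, assuming all exported stacks share the same box size and pixel size (the file pattern and output name are only examples). Note that merging only concatenates the images; it does not bring back the alignment or CTF metadata that the plain file import lacks.

import glob
import mrcfile
import numpy as np

stacks = sorted(glob.glob("*_particles.mrcs"))  # example pattern for the exported stacks
images = []
for path in stacks:
    with mrcfile.open(path, permissive=True) as mrc:
        data = mrc.data
        # a single-image file comes back as a 2D array; promote it to a 1-image stack
        images.append(data[np.newaxis] if data.ndim == 2 else data)
        pixel_size = float(mrc.voxel_size.x)

with mrcfile.new("combined_particles.mrcs", overwrite=True) as out:
    out.set_data(np.concatenate(images).astype(np.float32))
    out.set_image_stack()        # flag the output as a 2D image stack, not a volume
    out.voxel_size = pixel_size  # pixel size carried over from the last input stack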
From: Dmitry S. <Sem...@gm...> - 2021-05-19 07:27:21
|
Dear Grigory, I did nothing much, just tried to start relion 2D // or cryosparc. The only thing I tried additionally since I could no proceed is to a) change the box size; b) just resave the subset with the same number of images. Please see the image. Thank you Sincerely, Dmitry > On 18. May 2021, at 15:19, Grigory Sharov <sha...@gm...> wrote: > > Hi, > > I imagine you have 624 micrographs, so particles are exported to mrc on a mic basis. I see you used the "files" option to import mrcs particles into Scipion. This means the imported particles have no metadata except pixel size you provided. > > What did you do with them after import? > > Best regards, > Grigory > > -------------------------------------------------------------------------------- > Grigory Sharov, Ph.D. > > MRC Laboratory of Molecular Biology, > Francis Crick Avenue, > Cambridge Biomedical Campus, > Cambridge CB2 0QH, UK. > tel. +44 (0) 1223 267228 <tel:+44%201223%20267228> > e-mail: gs...@mr... <mailto:gs...@mr...> > > > On Tue, May 18, 2021 at 2:12 PM Dmitry Semchonok <Sem...@gm... <mailto:Sem...@gm...>> wrote: > Dear Grigory, > > Yes I did that — the particles are looking fine. > > > I guess the issue still comes from the fact that originally in cryosparc Export the stacks of particle were placed into 624 mrc. But the number of particles is about 44 818. So even after the renaming and the export I see this in SCIPION export log > > <Screenshot 2021-05-18 at 15.09.11.png> > > What I guess may help is if I somehow combine all those files in 1 mrcs first and then add import them to SCIPION. > Do you perhaps know how to do that? > > Thank you > > Sincerely, > Dmitry > > > >> On 18. May 2021, at 15:02, Grigory Sharov <sha...@gm... <mailto:sha...@gm...>> wrote: >> >> Hi Dmitry, >> >> as the error states your star file points to a non-existing image in the mrcs stack. You need to check first if you import from cryosparc with mrcs worked correctly (open / display particles) then trace all the steps you did before 2D classification. >> >> Best regards, >> Grigory >> >> -------------------------------------------------------------------------------- >> Grigory Sharov, Ph.D. >> >> MRC Laboratory of Molecular Biology, >> Francis Crick Avenue, >> Cambridge Biomedical Campus, >> Cambridge CB2 0QH, UK. >> tel. +44 (0) 1223 267228 <tel:+44%201223%20267228> >> e-mail: gs...@mr... <mailto:gs...@mr...> >> >> >> On Tue, May 18, 2021 at 1:25 PM Dmitry Semchonok <Sem...@gm... <mailto:Sem...@gm...>> wrote: >> Dear Pablo, >> >> >> Thank you. I heard about this option. For that I guess the https://pypi.org/project/cs2star/ <https://pypi.org/project/cs2star/> needs to be installed. >> >> >> >> >> In Cryosparc itself there is an option to export files. And then what we get is the mrc files with different number of particles in each. >> >> It appeared to be possible to rename mrc —> to —>mrcs. Then SCIPION can import those particles. >> >> Currently the problem is that not relion nor cryosparc can run these particles. 
>> >> >> >> >> Relion stops with error: >> >> >> 00001: RUNNING PROTOCOL ----------------- >> 00002: Hostname: cryoem01 >> 00003: PID: 46455 >> 00004: pyworkflow: 3.0.13 >> 00005: plugin: relion >> 00006: plugin v: 3.1.2 >> 00007: currentDir: /data1/ScipionUserData/projects/Caro__helix >> 00008: workingDir: Runs/001546_ProtRelionClassify2D >> 00009: runMode: Continue >> 00010: MPI: 3 >> 00011: threads: 3 >> 00012: Starting at step: 1 >> 00013: Running steps >> 00014: STARTED: convertInputStep, step 1, time 2021-05-18 12:33:26.198123 >> 00015: Converting set from 'Runs/001492_ProtUserSubSet/particles.sqlite' into 'Runs/001546_ProtRelionClassify2D/input_particles.star' >> 00016: convertBinaryFiles: creating soft links. >> 00017: Root: Runs/001546_ProtRelionClassify2D/extra/input -> Runs/001057_ProtImportParticles/extra >> 00018: FINISHED: convertInputStep, step 1, time 2021-05-18 12:33:27.117588 >> 00019: STARTED: runRelionStep, step 2, time 2021-05-18 12:33:27.145974 >> 00020: mpirun -np 3 `which relion_refine_mpi` --i Runs/001546_ProtRelionClassify2D/input_particles.star --particle_diameter 690 --zero_mask --K 64 --norm --scale --o Runs/001546_ProtRelionClassify2D/extra/relion --oversampling 1 --flatten_solvent --tau2_fudge 2.0 --iter 25 --offset_range 5.0 --offset_step 2.0 --psi_step 10.0 --dont_combine_weights_via_disc --scratch_dir /data1/new_scratch/ --pool 3 --gpu --j 3 >> 00021: RELION version: 3.1.2 >> 00022: Precision: BASE=double, CUDA-ACC=single >> 00023: >> 00024: === RELION MPI setup === >> 00025: + Number of MPI processes = 3 >> 00026: + Number of threads per MPI process = 3 >> 00027: + Total number of threads therefore = 9 >> 00028: + Leader (0) runs on host = cryoem01 >> 00029: + Follower 1 runs on host = cryoem01 >> 00030: + Follower 2 runs on host = cryoem01 >> 00031: ================= >> 00032: uniqueHost cryoem01 has 2 ranks. >> 00033: GPU-ids not specified for this rank, threads will automatically be mapped to available devices. >> 00034: Thread 0 on follower 1 mapped to device 0 >> 00035: Thread 1 on follower 1 mapped to device 0 >> 00036: Thread 2 on follower 1 mapped to device 0 >> 00037: GPU-ids not specified for this rank, threads will automatically be mapped to available devices. >> 00038: Thread 0 on follower 2 mapped to device 1 >> 00039: Thread 1 on follower 2 mapped to device 1 >> 00040: Thread 2 on follower 2 mapped to device 1 >> 00041: Running CPU instructions in double precision. >> 00042: + WARNING: Changing psi sampling rate (before oversampling) to 5.625 degrees, for more efficient GPU calculations >> 00043: + On host cryoem01: free scratch space = 448.252 Gb. >> 00044: Copying particles to scratch directory: /data1/new_scratch/relion_volatile/ >> 00045: 000/??? 
sec ~~(,_,"> [oo] >> 00046: 1/ 60 sec ~~(,_,">in: /opt/Scipion3/software/em/relion-3.1.2/src/rwMRC.h, line 192 >> 00047: ERROR: >> 00048: readMRC: Image number 11 exceeds stack size 1 of image 000011@Runs/001546_ProtRelionClassify2D/extra/input/1024562735536827037_FoilHole_1618719_Data_1621438_1621440_20200703_085118_Fractions_patch_aligned_doseweighted_particles.mrcs >> 00049: === Backtrace === >> 00050: /opt/Scipion3/software/em/relion-3.1.2/bin/relion_refine_mpi(_ZN11RelionErrorC1ERKSsS1_l+0x41) [0x4786a1] >> 00051: /opt/Scipion3/software/em/relion-3.1.2/bin/relion_refine_mpi(_ZN5ImageIdE7readMRCElbRK8FileName+0x99f) [0x4b210f] >> 00052: /opt/Scipion3/software/em/relion-3.1.2/bin/relion_refine_mpi(_ZN5ImageIdE5_readERK8FileNameR13fImageHandlerblbb+0x17b) [0x4b407b] >> 00053: /opt/Scipion3/software/em/relion-3.1.2/bin/relion_refine_mpi(_ZN10Experiment22copyParticlesToScratchEibbd+0xda7) [0x5b8f87] >> 00054: /opt/Scipion3/software/em/relion-3.1.2/bin/relion_refine_mpi(_ZN14MlOptimiserMpi18initialiseWorkLoadEv+0x210) [0x498540] >> 00055: /opt/Scipion3/software/em/relion-3.1.2/bin/relion_refine_mpi(_ZN14MlOptimiserMpi10initialiseEv+0x9aa) [0x49ab2a] >> 00056: /opt/Scipion3/software/em/relion-3.1.2/bin/relion_refine_mpi(main+0x55) [0x4322a5] >> 00057: /lib64/libc.so.6(__libc_start_main+0xf5) [0x7f657f51a555] >> 00058: /opt/Scipion3/software/em/relion-3.1.2/bin/relion_refine_mpi() [0x435fbf] >> 00059: ================== >> 00060: ERROR: >> 00061: readMRC: Image number 11 exceeds stack size 1 of image 000011@Runs/001546_ProtRelionClassify2D/extra/input/1024562735536827037_FoilHole_1618719_Data_1621438_1621440_20200703_085118_Fractions_patch_aligned_doseweighted_particles.mrcs >> 00062: application called MPI_Abort(MPI_COMM_WORLD, 1) - process 1 >> 00063: Traceback (most recent call last): >> 00064: File "/usr/local/miniconda/envs/scipion3/lib/python3.8/site-packages/pyworkflow/protocol/protocol.py", line 197, in run >> 00065: self._run() >> 00066: File "/usr/local/miniconda/envs/scipion3/lib/python3.8/site-packages/pyworkflow/protocol/protocol.py", line 248, in _run >> 00067: resultFiles = self._runFunc() >> 00068: File "/usr/local/miniconda/envs/scipion3/lib/python3.8/site-packages/pyworkflow/protocol/protocol.py", line 244, in _runFunc >> 00069: return self._func(*self._args) >> 00070: File "/usr/local/miniconda/envs/scipion3/lib/python3.8/site-packages/relion/protocols/protocol_base.py", line 811, in runRelionStep >> 00071: self.runJob(self._getProgram(), params) >> 00072: File "/usr/local/miniconda/envs/scipion3/lib/python3.8/site-packages/pyworkflow/protocol/protocol.py", line 1388, in runJob >> 00073: self._stepsExecutor.runJob(self._log, program, arguments, **kwargs) >> 00074: File "/usr/local/miniconda/envs/scipion3/lib/python3.8/site-packages/pyworkflow/protocol/executor.py", line 65, in runJob >> 00075: process.runJob(log, programName, params, >> 00076: File "/usr/local/miniconda/envs/scipion3/lib/python3.8/site-packages/pyworkflow/utils/process.py", line 52, in runJob >> 00077: return runCommand(command, env, cwd) >> 00078: File "/usr/local/miniconda/envs/scipion3/lib/python3.8/site-packages/pyworkflow/utils/process.py", line 67, in runCommand >> 00079: check_call(command, shell=True, stdout=sys.stdout, stderr=sys.stderr, >> 00080: File "/usr/local/miniconda/envs/scipion3/lib/python3.8/subprocess.py", line 364, in check_call >> 00081: raise CalledProcessError(retcode, cmd) >> 00082: subprocess.CalledProcessError: Command ' mpirun -np 3 `which relion_refine_mpi` --i 
Runs/001546_ProtRelionClassify2D/input_particles.star --particle_diameter 690 --zero_mask --K 64 --norm --scale --o Runs/001546_ProtRelionClassify2D/extra/relion --oversampling 1 --flatten_solvent --tau2_fudge 2.0 --iter 25 --offset_range 5.0 --offset_step 2.0 --psi_step 10.0 --dont_combine_weights_via_disc --scratch_dir /data1/new_scratch/ --pool 3 --gpu --j 3' returned non-zero exit status 1. >> 00083: Protocol failed: Command ' mpirun -np 3 `which relion_refine_mpi` --i Runs/001546_ProtRelionClassify2D/input_particles.star --particle_diameter 690 --zero_mask --K 64 --norm --scale --o Runs/001546_ProtRelionClassify2D/extra/relion --oversampling 1 --flatten_solvent --tau2_fudge 2.0 --iter 25 --offset_range 5.0 --offset_step 2.0 --psi_step 10.0 --dont_combine_weights_via_disc --scratch_dir /data1/new_scratch/ --pool 3 --gpu --j 3' returned non-zero exit status 1. >> 00084: FAILED: runRelionStep, step 2, time 2021-05-18 12:33:29.230213 >> 00085: *** Last status is failed >> 00086: ------------------- PROTOCOL FAILED (DONE 2/3) >> >> >> Cryosparc (in SCIPION) requires CTF to run. >> >> >> >> Thais is where I am now. >> >> Perhaps there is a solution? >> >> >> Sincerely, >> Dmitry >> >> >> >>> On 18. May 2021, at 14:16, Pablo Conesa <pc...@cn... <mailto:pc...@cn...>> wrote: >>> >>> Dear Dmitry, the import of CS metadata files (*.cs) is not supported in Scipion. Does CS has an option to export to star files. It rings a bell. >>> >>> On 18/5/21 9:53, Dmitry Semchonok wrote: >>>> <>Dear Grigory, >>>> >>>> >>>> The files are in mrc format. >>>> >>>> >>>> Please, let me try to explain more plan: >>>> >>>> I have a project in cryosparc. There I have cryosparc selected 2D classes. I want to export the particles of those classes into SCIPION. >>>> >>>> So I I pressed Export (fig 1) and the program(cryosparc) created the folder with mrc + other files (fig 2;3). I looked into J48 and found many *.mrc files of the particles. But it is not 1 mrc = 1 particle. It seems to be a mrc stuck - so I have several files inside 1 *.mrc (fig 4) (you can also notice that they all have different sizes) >>>> >>>> So I need to export them somehow in SCIPION >>>> >>>> For that, I used the SCIPION export - images protocol where for the files to add I put *.mrc. But the protocol seems to be added only 1 mrc as 1 picture and instead of having 46392 particles I have ~600 particles. >>>> >>>> (Also the geometry seems not preserved). >>>> >>>> >>>> So my question how to export the particles from cryosparc into SCIPION correctly? >>>> >>>> >>>> Thank you! >>>> >>>> >>>> >>>> >>>> >>>> https://disk.yandex.com/d/Fv3Q1lpwEzSisg <https://disk.yandex.com/d/Fv3Q1lpwEzSisg> >>>> >>>> Sincerely, >>>> >>>> Dmitry >>>> >>>> >>>> >>>> >>>>> On 17. May 2021, at 18:12, Grigory Sharov <sha...@gm... <mailto:sha...@gm...>> wrote: >>>>> >>>>> Hi Dmitry, >>>>> >>>>> mrc stacks should have "mrcs" extension. Is this the problem you are getting? >>>>> >>>>> Best regards, >>>>> Grigory >>>>> >>>>> -------------------------------------------------------------------------------- >>>>> Grigory Sharov, Ph.D. >>>>> >>>>> MRC Laboratory of Molecular Biology, >>>>> Francis Crick Avenue, >>>>> Cambridge Biomedical Campus, >>>>> Cambridge CB2 0QH, UK. >>>>> tel. +44 (0) 1223 267228 <tel:+44%201223%20267228> >>>>> e-mail: gs...@mr... <mailto:gs...@mr...> >>>>> >>>>> >>>>> On Mon, May 17, 2021 at 4:49 PM Dmitry Semchonok <Sem...@gm... 
<mailto:Sem...@gm...>> wrote: >>>>> <>Dear colleagues, >>>>> >>>>> I would like to export the particles from cryosparc to SCIPION. >>>>> >>>>> How to do that? >>>>> >>>>> >>>>> >>>>> >>>>> >>>>> What I tried: >>>>> >>>>> >>>>> 1. In cryosparc I pressed Export – to export the particles I am interested in. >>>>> >>>>> 2. In the folder Export – I found many mrc stacks with particles in each. >>>>> >>>>> 3. I tried to export them to SCIPION using Export particles but instead of reading each stack and combine them in the 1 dataset I received 1 particle / per each mrc stack. >>>>> >>>>> >>>>> Any ideas? >>>>> >>>>> >>>>> Thank you >>>>> >>>>> Sincerely, >>>>> >>>>> Dmitry >>>>> >>>>> >>>>> _______________________________________________ >>>>> scipion-users mailing list >>>>> sci...@li... <mailto:sci...@li...> >>>>> https://lists.sourceforge.net/lists/listinfo/scipion-users <https://lists.sourceforge.net/lists/listinfo/scipion-users> >>>>> _______________________________________________ >>>>> scipion-users mailing list >>>>> sci...@li... <mailto:sci...@li...> >>>>> https://lists.sourceforge.net/lists/listinfo/scipion-users <https://lists.sourceforge.net/lists/listinfo/scipion-users> >>>> >>>> >>>> >>>> >>>> _______________________________________________ >>>> scipion-users mailing list >>>> sci...@li... <mailto:sci...@li...> >>>> https://lists.sourceforge.net/lists/listinfo/scipion-users <https://lists.sourceforge.net/lists/listinfo/scipion-users> >>> -- >>> Pablo Conesa - Madrid Scipion <http://scipion.i2pc.es/> team >>> _______________________________________________ >>> scipion-users mailing list >>> sci...@li... <mailto:sci...@li...> >>> https://lists.sourceforge.net/lists/listinfo/scipion-users <https://lists.sourceforge.net/lists/listinfo/scipion-users> >> >> _______________________________________________ >> scipion-users mailing list >> sci...@li... <mailto:sci...@li...> >> https://lists.sourceforge.net/lists/listinfo/scipion-users <https://lists.sourceforge.net/lists/listinfo/scipion-users> >> _______________________________________________ >> scipion-users mailing list >> sci...@li... <mailto:sci...@li...> >> https://lists.sourceforge.net/lists/listinfo/scipion-users <https://lists.sourceforge.net/lists/listinfo/scipion-users> > > _______________________________________________ > scipion-users mailing list > sci...@li... > https://lists.sourceforge.net/lists/listinfo/scipion-users |
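For the rename step mentioned in the quoted exchange ("mrc stacks should have 'mrcs' extension"), a small sketch, assuming the exported stacks sit in one folder; the folder name and file pattern are only examples based on the J48 export job mentioned above:

from pathlib import Path

export_dir = Path("J48")  # example: the cryoSPARC export job folder
for mrc_path in sorted(export_dir.glob("*_particles.mrc")):
    # only the particle stacks should be renamed; other .mrc files are left alone
    mrc_path.rename(mrc_path.with_suffix(".mrcs"))

After the rename, the files can be picked up with a *.mrcs pattern in Scipion's import particles protocol, but as noted in the thread the import still carries no per-particle metadata beyond the pixel size.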
From: Grigory S. <sha...@gm...> - 2021-05-18 13:20:06
|
Hi, I imagine you have 624 micrographs, so particles are exported to mrc on a mic basis. I see you used the "files" option to import mrcs particles into Scipion. This means the imported particles have no metadata except pixel size you provided. What did you do with them after import? Best regards, Grigory -------------------------------------------------------------------------------- Grigory Sharov, Ph.D. MRC Laboratory of Molecular Biology, Francis Crick Avenue, Cambridge Biomedical Campus, Cambridge CB2 0QH, UK. tel. +44 (0) 1223 267228 <+44%201223%20267228> e-mail: gs...@mr... On Tue, May 18, 2021 at 2:12 PM Dmitry Semchonok <Sem...@gm...> wrote: > Dear Grigory, > > Yes I did that — the particles are looking fine. > > > I guess the issue still comes from the fact that originally in cryosparc > Export the stacks of particle were placed into 624 mrc. But the number of > particles is about 44 818. So even after the renaming and the export I see > this in SCIPION export log > > > What I guess may help is if I somehow combine all those files in 1 mrcs > first and then add import them to SCIPION. > Do you perhaps know how to do that? > > Thank you > > Sincerely, > Dmitry > > > > On 18. May 2021, at 15:02, Grigory Sharov <sha...@gm...> > wrote: > > Hi Dmitry, > > as the error states your star file points to a non-existing image in the > mrcs stack. You need to check first if you import from cryosparc with mrcs > worked correctly (open / display particles) then trace all the steps you > did before 2D classification. > > Best regards, > Grigory > > > -------------------------------------------------------------------------------- > Grigory Sharov, Ph.D. > > MRC Laboratory of Molecular Biology, > Francis Crick Avenue, > Cambridge Biomedical Campus, > Cambridge CB2 0QH, UK. > tel. +44 (0) 1223 267228 <+44%201223%20267228> > e-mail: gs...@mr... > > > On Tue, May 18, 2021 at 1:25 PM Dmitry Semchonok <Sem...@gm...> > wrote: > >> Dear Pablo, >> >> >> Thank you. I heard about this option. For that I guess the >> https://pypi.org/project/cs2star/ needs to be installed. >> >> >> >> >> In Cryosparc itself there is an option to export files. And then what we >> get is the mrc files with different number of particles in each. >> >> It appeared to be possible to rename mrc —> to —>mrcs. Then SCIPION can >> import those particles. >> >> Currently the problem is that not relion nor cryosparc can run these >> particles. >> >> >> >> >> Relion stops with error: >> >> >> 00001: RUNNING PROTOCOL ----------------- >> 00002: Hostname: cryoem01 >> 00003: PID: 46455 >> 00004: pyworkflow: 3.0.13 >> 00005: plugin: relion >> 00006: plugin v: 3.1.2 >> 00007: currentDir: /data1/ScipionUserData/projects/Caro__helix >> 00008: workingDir: Runs/001546_ProtRelionClassify2D >> 00009: runMode: Continue >> 00010: MPI: 3 >> 00011: threads: 3 >> 00012: Starting at step: 1 >> 00013: Running steps >> 00014: STARTED: convertInputStep, step 1, time 2021-05-18 >> 12:33:26.198123 >> 00015: Converting set from >> 'Runs/001492_ProtUserSubSet/particles.sqlite' into >> 'Runs/001546_ProtRelionClassify2D/input_particles.star' >> 00016: convertBinaryFiles: creating soft links. 
>> 00017: Root: Runs/001546_ProtRelionClassify2D/extra/input -> >> Runs/001057_ProtImportParticles/extra >> 00018: FINISHED: convertInputStep, step 1, time 2021-05-18 >> 12:33:27.117588 >> 00019: STARTED: runRelionStep, step 2, time 2021-05-18 12:33:27.145974 >> 00020: mpirun -np 3 `which relion_refine_mpi` --i >> Runs/001546_ProtRelionClassify2D/input_particles.star --particle_diameter >> 690 --zero_mask --K 64 --norm --scale --o >> Runs/001546_ProtRelionClassify2D/extra/relion --oversampling 1 >> --flatten_solvent --tau2_fudge 2.0 --iter 25 --offset_range 5.0 >> --offset_step 2.0 --psi_step 10.0 --dont_combine_weights_via_disc >> --scratch_dir /data1/new_scratch/ --pool 3 --gpu --j 3 >> 00021: RELION version: 3.1.2 >> 00022: Precision: BASE=double, CUDA-ACC=single >> 00023: >> 00024: === RELION MPI setup === >> 00025: + Number of MPI processes = 3 >> 00026: + Number of threads per MPI process = 3 >> 00027: + Total number of threads therefore = 9 >> 00028: + Leader (0) runs on host = cryoem01 >> 00029: + Follower 1 runs on host = cryoem01 >> 00030: + Follower 2 runs on host = cryoem01 >> 00031: ================= >> 00032: uniqueHost cryoem01 has 2 ranks. >> 00033: GPU-ids not specified for this rank, threads will automatically >> be mapped to available devices. >> 00034: Thread 0 on follower 1 mapped to device 0 >> 00035: Thread 1 on follower 1 mapped to device 0 >> 00036: Thread 2 on follower 1 mapped to device 0 >> 00037: GPU-ids not specified for this rank, threads will automatically >> be mapped to available devices. >> 00038: Thread 0 on follower 2 mapped to device 1 >> 00039: Thread 1 on follower 2 mapped to device 1 >> 00040: Thread 2 on follower 2 mapped to device 1 >> 00041: Running CPU instructions in double precision. >> 00042: + WARNING: Changing psi sampling rate (before oversampling) to >> 5.625 degrees, for more efficient GPU calculations >> 00043: + On host cryoem01: free scratch space = 448.252 Gb. >> 00044: Copying particles to scratch directory: >> /data1/new_scratch/relion_volatile/ >> 00045: 000/??? 
sec ~~(,_,"> >> [oo] >> 00046: 1/ 60 sec ~~(,_,">in: >> /opt/Scipion3/software/em/relion-3.1.2/src/rwMRC.h, line 192 >> 00047: ERROR: >> 00048: readMRC: Image number 11 exceeds stack size 1 of image >> 000011@Runs >> /001546_ProtRelionClassify2D/extra/input/1024562735536827037_FoilHole_1618719_Data_1621438_1621440_20200703_085118_Fractions_patch_aligned_doseweighted_particles.mrcs >> 00049: === Backtrace === >> 00050: >> /opt/Scipion3/software/em/relion-3.1.2/bin/relion_refine_mpi(_ZN11RelionErrorC1ERKSsS1_l+0x41) >> [0x4786a1] >> 00051: >> /opt/Scipion3/software/em/relion-3.1.2/bin/relion_refine_mpi(_ZN5ImageIdE7readMRCElbRK8FileName+0x99f) >> [0x4b210f] >> 00052: >> /opt/Scipion3/software/em/relion-3.1.2/bin/relion_refine_mpi(_ZN5ImageIdE5_readERK8FileNameR13fImageHandlerblbb+0x17b) >> [0x4b407b] >> 00053: >> /opt/Scipion3/software/em/relion-3.1.2/bin/relion_refine_mpi(_ZN10Experiment22copyParticlesToScratchEibbd+0xda7) >> [0x5b8f87] >> 00054: >> /opt/Scipion3/software/em/relion-3.1.2/bin/relion_refine_mpi(_ZN14MlOptimiserMpi18initialiseWorkLoadEv+0x210) >> [0x498540] >> 00055: >> /opt/Scipion3/software/em/relion-3.1.2/bin/relion_refine_mpi(_ZN14MlOptimiserMpi10initialiseEv+0x9aa) >> [0x49ab2a] >> 00056: >> /opt/Scipion3/software/em/relion-3.1.2/bin/relion_refine_mpi(main+0x55) >> [0x4322a5] >> 00057: /lib64/libc.so.6(__libc_start_main+0xf5) [0x7f657f51a555] >> 00058: /opt/Scipion3/software/em/relion-3.1.2/bin/relion_refine_mpi() >> [0x435fbf] >> 00059: ================== >> 00060: ERROR: >> 00061: readMRC: Image number 11 exceeds stack size 1 of image >> 000011@Runs >> /001546_ProtRelionClassify2D/extra/input/1024562735536827037_FoilHole_1618719_Data_1621438_1621440_20200703_085118_Fractions_patch_aligned_doseweighted_particles.mrcs >> 00062: application called MPI_Abort(MPI_COMM_WORLD, 1) - process 1 >> 00063: Traceback (most recent call last): >> 00064: File >> "/usr/local/miniconda/envs/scipion3/lib/python3.8/site-packages/pyworkflow/protocol/protocol.py", >> line 197, in run >> 00065: self._run() >> 00066: File >> "/usr/local/miniconda/envs/scipion3/lib/python3.8/site-packages/pyworkflow/protocol/protocol.py", >> line 248, in _run >> 00067: resultFiles = self._runFunc() >> 00068: File >> "/usr/local/miniconda/envs/scipion3/lib/python3.8/site-packages/pyworkflow/protocol/protocol.py", >> line 244, in _runFunc >> 00069: return self._func(*self._args) >> 00070: File >> "/usr/local/miniconda/envs/scipion3/lib/python3.8/site-packages/relion/protocols/protocol_base.py", >> line 811, in runRelionStep >> 00071: self.runJob(self._getProgram(), params) >> 00072: File >> "/usr/local/miniconda/envs/scipion3/lib/python3.8/site-packages/pyworkflow/protocol/protocol.py", >> line 1388, in runJob >> 00073: self._stepsExecutor.runJob(self._log, program, arguments, >> **kwargs) >> 00074: File >> "/usr/local/miniconda/envs/scipion3/lib/python3.8/site-packages/pyworkflow/protocol/executor.py", >> line 65, in runJob >> 00075: process.runJob(log, programName, params, >> 00076: File >> "/usr/local/miniconda/envs/scipion3/lib/python3.8/site-packages/pyworkflow/utils/process.py", >> line 52, in runJob >> 00077: return runCommand(command, env, cwd) >> 00078: File >> "/usr/local/miniconda/envs/scipion3/lib/python3.8/site-packages/pyworkflow/utils/process.py", >> line 67, in runCommand >> 00079: check_call(command, shell=True, stdout=sys.stdout, >> stderr=sys.stderr, >> 00080: File >> "/usr/local/miniconda/envs/scipion3/lib/python3.8/subprocess.py", line 364, >> in check_call >> 00081: raise 
CalledProcessError(retcode, cmd) >> 00082: subprocess.CalledProcessError: Command ' mpirun -np 3 `which >> relion_refine_mpi` --i >> Runs/001546_ProtRelionClassify2D/input_particles.star --particle_diameter >> 690 --zero_mask --K 64 --norm --scale --o >> Runs/001546_ProtRelionClassify2D/extra/relion --oversampling 1 >> --flatten_solvent --tau2_fudge 2.0 --iter 25 --offset_range 5.0 >> --offset_step 2.0 --psi_step 10.0 --dont_combine_weights_via_disc >> --scratch_dir /data1/new_scratch/ --pool 3 --gpu --j 3' returned >> non-zero exit status 1. >> 00083: Protocol failed: Command ' mpirun -np 3 `which >> relion_refine_mpi` --i >> Runs/001546_ProtRelionClassify2D/input_particles.star --particle_diameter >> 690 --zero_mask --K 64 --norm --scale --o >> Runs/001546_ProtRelionClassify2D/extra/relion --oversampling 1 >> --flatten_solvent --tau2_fudge 2.0 --iter 25 --offset_range 5.0 >> --offset_step 2.0 --psi_step 10.0 --dont_combine_weights_via_disc >> --scratch_dir /data1/new_scratch/ --pool 3 --gpu --j 3' returned >> non-zero exit status 1. >> 00084: FAILED: runRelionStep, step 2, time 2021-05-18 12:33:29.230213 >> 00085: *** Last status is failed >> 00086: ------------------- PROTOCOL FAILED (DONE 2/3) >> >> >> Cryosparc (in SCIPION) requires CTF information to run. >> >> >> This is where I am now. >> >> Perhaps there is a solution? >> >> >> Sincerely, >> Dmitry >> >> >> On 18. May 2021, at 14:16, Pablo Conesa <pc...@cn...> wrote: >> >> Dear Dmitry, the import of CS metadata files (*.cs) is not supported in >> Scipion. Does CS have an option to export to star files? It rings a bell. >> On 18/5/21 9:53, Dmitry Semchonok wrote: >> >> Dear Grigory, >> >> >> The files are in mrc format. >> >> >> Please, let me try to explain in more detail: >> >> I have a project in cryosparc. There I have cryosparc-selected 2D >> classes. I want to export the particles of those classes into SCIPION. >> >> So I pressed Export (fig 1) and the program (cryosparc) created the >> folder with mrc + other files (fig 2; 3). I looked into J48 and found many >> *.mrc files of the particles. But it is not 1 mrc = 1 particle. It seems to >> be an mrc stack - so I have several particles inside 1 *.mrc (fig 4) (you can >> also notice that they all have different sizes). >> >> So I need to import them somehow into SCIPION. >> >> For that, I used the SCIPION import - images protocol, where for the >> files to add I put *.mrc. But the protocol seems to have added only 1 mrc as >> 1 picture, and instead of having 46392 particles I have ~600 particles. >> >> (Also, the geometry seems not to be preserved.) >> >> >> So my question is: how to export the particles from cryosparc into SCIPION >> correctly? >> >> >> Thank you! >> >> >> https://disk.yandex.com/d/Fv3Q1lpwEzSisg >> >> Sincerely, >> >> Dmitry >> >> >> On 17. May 2021, at 18:12, Grigory Sharov <sha...@gm...> >> wrote: >> >> Hi Dmitry, >> >> mrc stacks should have the "mrcs" extension. Is this the problem you are >> getting? >> >> Best regards, >> Grigory >> >> >> -------------------------------------------------------------------------------- >> Grigory Sharov, Ph.D. >> >> MRC Laboratory of Molecular Biology, >> Francis Crick Avenue, >> Cambridge Biomedical Campus, >> Cambridge CB2 0QH, UK. >> tel. +44 (0) 1223 267228 >> e-mail: gs...@mr... >> >> >> On Mon, May 17, 2021 at 4:49 PM Dmitry Semchonok <Sem...@gm...> >> wrote: >> >>> Dear colleagues, >>> >>> I would like to export the particles from cryosparc to SCIPION. >>> >>> How to do that? >>> >>> What I tried: >>> >>> 1. 
In cryosparc I pressed Export – to export the particles I am >>> interested in. >>> >>> 2. In the folder Export – I found many mrc stacks with particles in >>> each. >>> >>> 3. I tried to export them to SCIPION using Export particles but >>> instead of reading each stack and combining them into one dataset I received >>> 1 particle per mrc stack. >>> >>> Any ideas? >>> >>> Thank you >>> >>> Sincerely, >>> >>> Dmitry >>> >>> _______________________________________________ >>> scipion-users mailing list >>> sci...@li... >>> https://lists.sourceforge.net/lists/listinfo/scipion-users >>> >> -- >> Pablo Conesa - *Madrid Scipion <http://scipion.i2pc.es/> team* >> _______________________________________________ >> scipion-users mailing list >> sci...@li... >> https://lists.sourceforge.net/lists/listinfo/scipion-users > > |
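For anyone who lands on this thread with the same problem: renaming the exported files from .mrc to .mrcs (as suggested above) only makes Scipion treat each file as a stack; it does not merge the many per-micrograph files into one set. Below is a minimal, hypothetical sketch of the "combine everything into one stack" idea discussed in the thread. It is not part of Scipion or cryoSPARC; it assumes the third-party mrcfile Python package (pip install mrcfile), and the folder name J48/exports, the output name all_particles.mrcs and the 1.06 A/px value are invented placeholders to be adjusted for your own project.

# Hypothetical sketch, not part of Scipion or cryoSPARC: merge the
# per-micrograph particle files written by cryoSPARC's Export into a
# single .mrcs stack. Requires the third-party "mrcfile" package.
import glob
import os

import mrcfile
import numpy as np

export_dir = "J48/exports"           # folder produced by cryoSPARC "Export" (placeholder)
output_stack = "all_particles.mrcs"  # single stack to import into Scipion (placeholder)
pixel_size = 1.06                    # A/px, must match the value given at import (placeholder)

# Both .mrc and already renamed .mrcs files match this pattern.
files = sorted(glob.glob(os.path.join(export_dir, "*particles.mrc*")))

chunks = []
for path in files:
    with mrcfile.open(path, permissive=True) as mrc:
        data = mrc.data
        # A file holding a single particle is read as 2D; promote it to 3D
        # so every chunk has the shape (n_particles, ny, nx).
        if data.ndim == 2:
            data = data[np.newaxis, :, :]
        chunks.append(np.array(data, dtype=np.float32, copy=True))

# All files from one job share the same box size, so they can be stacked.
particles = np.concatenate(chunks, axis=0)
print("Merged %d files into one stack of %d particles" % (len(files), particles.shape[0]))

# For tens of thousands of large boxes this holds everything in RAM;
# mrcfile.new_mmap would be the memory-friendly alternative.
with mrcfile.new(output_stack, overwrite=True) as out:
    out.set_data(particles)
    out.set_image_stack()       # mark the file as an image stack, not a volume
    out.voxel_size = pixel_size

The resulting all_particles.mrcs can then be brought in with the normal particle import, and Scipion should report the full particle count instead of one particle per file. A stack built this way still carries no CTF or alignment metadata, so exporting a .star file from cryoSPARC (for example with cs2star) and importing that remains the more complete route.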