From: Pablo C. <pc...@cn...> - 2022-03-22 08:47:42
That is going to be a challenge. All of the conda environments rely on having an internet connection to download and install the packages. We had to use an HPC cluster without internet ourselves and we struggled to make things work. If you still need to go down this route, here are some ideas:

1.- Do you have a similar machine but with an internet connection? Maybe doing the installation there and moving the conda environments over afterwards may work (there is a sketch of one way to do this right after the log below).

2.- In theory all packages, at least the "pip ones", can be downloaded and installed locally. If you run

    scipion3 installb deepLearningToolkit

you should get all the commands needed to install deepLearningToolkit, see below:

Scipion v3.0.10 - Eugenius

Building deepLearningToolkit-0.2 ...
Skipping command: *wget -nv -c -O /home/pablo/software/scipion/software/em/deepLearningToolkit.tgz.part http://scipion.cnb.csic.es/downloads/scipion/software/external/deepLearningToolkit.tgz*
mv -v /home/pablo/software/scipion/software/em/deepLearningToolkit.tgz.part /home/pablo/software/scipion/software/em/deepLearningToolkit.tgz
All targets exist.
Skipping command: tar -xf deepLearningToolkit.tgz
All targets exist.
cd /home/pablo/software/scipion/software/em/deepLearningToolkit
if ls /home/pablo/software/scipion/software/em/xmipp/lib/libXmipp.so > /dev/null ; then touch xmippLibToken; echo ' > CUDA support find. Driver version: 510' ; else echo ; echo ' > Xmipp installation not found, please install it first (xmippSrc or xmippBin*).';echo; fi
 > CUDA support find. Driver version: 510
Skipping command: export PYTHONPATH="" && eval "$(/extra/miniconda3/bin/conda shell.bash hook)" && *conda create --force --yes -n xmipp_DLTK_v0.3 python=3 pandas=0.23 scikit-image=0.14 opencv=3.4 tensorflow-gpu=1.15 keras=2.2 scikit-learn=0.22 -c anaconda && conda activate xmipp_DLTK_v0.3 && pip install numpy==1.18.4 numpy==1.18.4 numpy==1.18.4* && conda env export > xmipp_DLTK_v0.3.yml
All targets exist.
Skipping command: export PYTHONPATH="" && eval "$(/extra/miniconda3/bin/conda shell.bash hook)" && *conda create --force --yes -n xmipp_MicCleaner python=3.6 micrograph-cleaner-em=0.35 -c rsanchez1369 -c anaconda -c conda-forge && conda activate xmipp_MicCleaner* && conda env export > xmipp_MicCleaner.yml
All targets exist.
Skipping command: export PYTHONPATH="" && eval "$(/extra/miniconda3/bin/conda shell.bash hook)" && *conda create --force --yes -n xmipp_deepEMhancer python=3.6 deepemhancer=0.12 numba=0.45 -c rsanchez1369 -c anaconda -c conda-forge && conda activate xmipp_deepEMhancer* && conda env export > xmipp_deepEMhancer.yml
All targets exist.
cd /home/pablo/software/scipion/software/em/deepLearningToolkit
rm models_UPDATED_on_* xmippLibToken 2>/dev/null ; echo 'Downloading pre-trained models...' ; /home/pablo/software/scipion/software/em/xmipp/bin/xmipp_sync_data update /home/pablo/software/scipion/software/em/xmipp/models http://scipion.cnb.csic.es/downloads/scipion/software/em DLmodels && touch models_UPDATED_on_22_3_2022 && echo ' > Tensorflow installed with CUDA SUPPORT.'
Downloading pre-trained models...
Regenerating local MANIFEST...
Verifying MD5s... 12%... 62%... ...done.
Updated files: 0
Uncompressing models: cat /home/pablo/software/scipion/software/em/xmipp/models/xmipp_model_*.tgz | tar xzf - -i --directory=/home/pablo/software/scipion/software/em/xmipp/models
...done.
Updated files: 4
Uncompressing models: cat /home/pablo/software/scipion/software/em/xmipp/models/xmipp_model_*.tgz | tar xzf - -i --directory=/home/pablo/software/scipion/software/em/xmipp/models
 > Tensorflow installed with CUDA SUPPORT.
Skipping command: Link 'deepLearningToolkit-0.2 -> deepLearningToolkit'
All targets exist.
Done (1 m 06 s)
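For idea 1, a minimal sketch of moving an already-built environment with the conda-pack tool (I have not tried this exact recipe; it assumes both machines share the same OS and CPU architecture, that you can install conda-pack on the connected machine, and the /extra/miniconda3 paths below are just taken from the log above, adjust to your installation):

    # On the machine with internet, after building the environment as in the log above:
    conda install -c conda-forge conda-pack
    conda pack -n xmipp_DLTK_v0.3 -o xmipp_DLTK_v0.3.tar.gz

    # Copy the tarball to the offline cluster, then unpack it into the envs
    # directory of the miniconda that Scipion uses:
    mkdir -p /extra/miniconda3/envs/xmipp_DLTK_v0.3
    tar -xzf xmipp_DLTK_v0.3.tar.gz -C /extra/miniconda3/envs/xmipp_DLTK_v0.3
    source /extra/miniconda3/envs/xmipp_DLTK_v0.3/bin/activate
    conda-unpack   # rewrites the prefixes hard-coded inside the environment

You would have to repeat this for each environment in the log (xmipp_DLTK_v0.3, xmipp_MicCleaner, xmipp_deepEMhancer).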
Now comes the tedious part:

    wget -nv -c -O /home/pablo/software/scipion/software/em/deepLearningToolkit.tgz.part http://scipion.cnb.csic.es/downloads/scipion/software/external/deepLearningToolkit.tgz

should be easy to get and move to the GPU node. For the environments you'll need to create them manually:

    conda create --force --yes -n xmipp_DLTK_v0.3 python=3 pandas=0.23 scikit-image=0.14 opencv=3.4 tensorflow-gpu=1.15 keras=2.2 scikit-learn=0.22 -c anaconda && conda activate xmipp_DLTK_v0.3 && pip install numpy==1.18.4 numpy==1.18.4 numpy==1.18.4

This command will use a connection to download pandas=0.23, scikit-image=0.14, .... You can get any of those from PyPI, e.g. for scikit-image --> https://pypi.org/project/scikit-image/#files (a hedged sketch of this route is at the very end of this message). You will need to match the CPU architecture of the GPU nodes and, in the worst case, download the sources. There are many things that can go wrong and I haven't tried this. Actually, when I had to install some packages from source, I gave up. I think this is not an easy or trivial task, but it might be doable.

All the best, Pablo.

Maybe it is worth giving the GPU node a temporary internet connection, if possible?

On 22/3/22 3:53, Thu Nguyen wrote:
>
> Hi Heechang,
>
> May I ask if you ended up finding a solution to this problem?
>
> I am currently in the same situation. Worse, my GPU nodes do not have
> internet so I cannot even use them to install deepLearning.
>
> Regards,
>
> *----------------------------------------------------------------------------------------------------------------*
>
> *Thu D. Nguyen*
>
> Bio21 Institute
>
> The University of Melbourne, Victoria 3010 Australia
>
>
>
> _______________________________________________
> scipion-users mailing list
> sci...@li...
> https://lists.sourceforge.net/lists/listinfo/scipion-users

--
Pablo Conesa - *Madrid Scipion <http://scipion.i2pc.es> team*
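To make the PyPI route a bit more concrete, a minimal sketch (again untested here; the exact version pins and the ./offline_pkgs folder name are only examples, and the download machine should match the GPU node's OS, CPU architecture and Python version):

    # On the connected machine: download the wheels (or sdists) into a folder.
    # pip download also accepts --platform/--python-version/--abi if the
    # download machine does not match the GPU node exactly.
    pip download -d ./offline_pkgs scikit-image==0.14.2 pandas==0.23.4 numpy==1.18.4

    # Copy ./offline_pkgs to the GPU node, activate the target conda
    # environment there, and install without touching the network:
    pip install --no-index --find-links=./offline_pkgs scikit-image==0.14.2 pandas==0.23.4 numpy==1.18.4

Note that tensorflow-gpu, opencv, etc. come from conda channels in the command above rather than from PyPI, so for the full environments the conda-pack sketch earlier in this message is probably the less painful option.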