Hi,
I just installed OpenMPI on my machine, which has multicore CPUs, and followed the ROOTPWA installation instructions. I ran "cmake .." and "make", and got the following message at the end of the ROOTPWA compilation.
[100%] Building CXX object mpi/CMakeFiles/testMpiPerformance.cpp.dir/testMpiPerformance.cpp.o
cc1plus: warnings being treated as errors
In file included from /usr/local/hep/boost-svn/boost/mpi/detail/mpi_datatype_oarchive.hpp:18,
from /usr/local/hep/boost-svn/boost/mpi/detail/mpi_datatype_cache.hpp:13,
from /usr/local/hep/boost-svn/boost/mpi/datatype.hpp:27,
from /usr/local/hep/boost-svn/boost/mpi/communicator.hpp:20,
from /usr/local/hep/boost-svn/boost/mpi/collectives.hpp:21,
from /usr/local/hep/boost-svn/boost/mpi.hpp:23,
from /home/matsuda/rootpwa-code/mpi/testMpiPerformance.cpp:11:
/usr/local/hep/boost-svn/boost/mpi/detail/mpi_datatype_primitive.hpp: In constructor 'boost::mpi::detail::mpi_datatype_primitive::mpi_datatype_primitive(const void*)':
/usr/local/hep/boost-svn/boost/mpi/detail/mpi_datatype_primitive.hpp:52: error: 'int MPI_Address(void*, MPI_Aint*)' is deprecated (declared at /home/matsuda/openmpi/include/mpi.h:1178)
/usr/local/hep/boost-svn/boost/mpi/detail/mpi_datatype_primitive.hpp:52: error: 'int MPI_Address(void*, MPI_Aint*)' is deprecated (declared at /home/matsuda/openmpi/include/mpi.h:1178)
/usr/local/hep/boost-svn/boost/mpi/detail/mpi_datatype_primitive.hpp: In member function 'ompi_datatype_t* boost::mpi::detail::mpi_datatype_primitive::get_mpi_datatype()':
/usr/local/hep/boost-svn/boost/mpi/detail/mpi_datatype_primitive.hpp:75: error: 'int MPI_Type_struct(int, int*, MPI_Aint*, ompi_datatype_t**, ompi_datatype_t**)' is deprecated (declared at /home/matsuda/openmpi/include/mpi.h:1787)
/usr/local/hep/boost-svn/boost/mpi/detail/mpi_datatype_primitive.hpp:75: error: 'int MPI_Type_struct(int, int*, MPI_Aint*, ompi_datatype_t**, ompi_datatype_t**)' is deprecated (declared at /home/matsuda/openmpi/include/mpi.h:1787)
/usr/local/hep/boost-svn/boost/mpi/detail/mpi_datatype_primitive.hpp: In member function 'void boost::mpi::detail::mpi_datatype_primitive::save_impl(const void*, ompi_datatype_t*, int)':
/usr/local/hep/boost-svn/boost/mpi/detail/mpi_datatype_primitive.hpp:108: error: 'int MPI_Address(void*, MPI_Aint*)' is deprecated (declared at /home/matsuda/openmpi/include/mpi.h:1178)
/usr/local/hep/boost-svn/boost/mpi/detail/mpi_datatype_primitive.hpp:108: error: 'int MPI_Address(void*, MPI_Aint*)' is deprecated (declared at /home/matsuda/openmpi/include/mpi.h:1178)
make[2]: *** [mpi/CMakeFiles/testMpiPerformance.cpp.dir/testMpiPerformance.cpp.o] Error 1
make[1]: *** [mpi/CMakeFiles/testMpiPerformance.cpp.dir/all] Error 2
make: *** [all] Error 2
What does the above message mean?
Thank you very much in advance,
Tatsuro
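[Editor's note on what the log above means, based on the messages themselves: the Boost.MPI headers in this boost-svn checkout still call the MPI-1 functions MPI_Address and MPI_Type_struct. In MPI-2 these were replaced by MPI_Get_address and MPI_Type_create_struct, and newer OpenMPI releases mark the old names as deprecated. Because the build treats warnings as errors (the "cc1plus: warnings being treated as errors" line), the deprecation warnings abort the compilation. This would also explain why openmpi-1.6.5 works below, presumably because that older release does not yet flag these calls as deprecated. One possible workaround, if one wants to keep openmpi-1.8.3, would be to silence that particular warning class at configure time; an untested sketch using standard GCC flags, not ROOTPWA-specific options:

```shell
# Untested sketch: suppress the deprecation warnings so that
# warnings-as-errors no longer turns them into hard errors.
cmake .. -DCMAKE_CXX_FLAGS="-Wno-deprecated-declarations"
make
```
]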
Hi,
I was able to solve the problem I raised myself, even though the reason remains unknown.
I had been using openmpi-1.8.3 and the build failed. With openmpi-1.6.5 instead, I succeeded
in compiling ROOTPWA with MPI. The result is as follows:
[100%] Building CXX object mpi/CMakeFiles/testMpiPerformance.cpp.dir/testMpiPerformance.cpp.o
Linking CXX executable ../bin/testMpiPerformance.cpp
[100%] Built target testMpiPerformance.cpp
Thank you so much,
Tatsuro
Dear Tatsuro,
sorry for my late answer. The MPI part of ROOTPWA is still experimental.
You do not need it. However, since you had OpenMPI installed and had
also compiled the MPI Boost libraries, the compilation of the MPI part
was enabled automatically.
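[Editor's note: the automatic enabling described above typically comes from the build system probing for MPI and Boost.MPI. A hypothetical CMake sketch of the mechanism, not ROOTPWA's actual CMakeLists.txt:

```cmake
# Hypothetical sketch: if both an MPI installation and the Boost.MPI
# library are found, the mpi/ subdirectory is built automatically.
find_package(MPI)
find_package(Boost COMPONENTS mpi serialization)
if(MPI_CXX_FOUND AND Boost_MPI_FOUND)
    add_subdirectory(mpi)
endif()
```
]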
Cheers,
Boris.
On 16.12.14 17:06, Tatsuro Matsuda wrote:
Hi,
I was able to solve the problem I raised myself, even though the reason remains unknown.
I had been using openmpi-1.8.3 and the build failed. With openmpi-1.6.5 instead, I succeeded
in compiling ROOTPWA with MPI. The result is as follows:
[100%] Building CXX object mpi/CMakeFiles/testMpiPerformance.cpp.dir/testMpiPerformance.cpp.o
Linking CXX executable ../bin/testMpiPerformance.cpp
[100%] Built target testMpiPerformance.cpp
Dear Boris,
Many thanks for your comments. I would just like to ask what "experimental" means. Does it mean that MPI does not work for ROOTPWA at the moment, even though the compilation of the MPI parts succeeded?
Cheers,
Tatsuro