From: Benjamin S. K. <be...@cf...> - 2004-03-08 00:26:11
I have found a Solaris 2.9 machine (thanks Sourceforge!) that should be
similar to yours, and I was able to get the default ./configure to work.
You will need to change two lines in the Make.common file after running
configure: change "RPATHFLAG = -Wl,-rpath," to "RPATHFLAG = -Wl,-R," and
"LIBS =" to "LIBS = -lrpcsvc" (there is a before/after sketch at the end
of this message). This is due to differences between the Linux and Sun
linkers.

I have checked these changes into the CVS repository. If you follow the
anonymous CVS access instructions
(http://libmesh.sourceforge.net/installation.php) and download what is
currently in the CVS repository, you should be able to just ./configure
and go. It does seem to take a long time to link the libraries, but all
the tests seem to run fine.

A few notes on the output from configure that you sent me:

--------------

- configure is a GNU tool, and as such it looks for GNU compilers first.
In your case it found gcc-3.3 and is using it to build the library
instead of the Sun Forte compiler. This shouldn't be a problem (indeed,
I have never built libMesh with the Forte compiler, so this might be
good). However, problems *may* arise if PETSc is built with the Forte
compiler and libMesh is not. You should be able to figure out which
compiler PETSc was built with by looking in the file

  $PETSC_DIR/bmake/$PETSC_ARCH/variables

To specify which compiler you want to use, instead of letting configure
detect one automatically, set the corresponding environment variables.
For instance, under bash or sh this should work:

  CXX=CC CC=cc F77=f77 ./configure ....

(For other shells you may need to set the environment variables before
running configure; there is a csh-style example at the end of this
message.) After you have configured the library you don't need to worry
about the compiler; the right one will be specified in the Make.common
file.

--------------

- As for the MPI stuff: libMesh will use MPI if it is available, but it
is not required. If PETSc is found then MPI is definitely available, and
the file

  $PETSC_DIR/bmake/$PETSC_ARCH/packages

is queried to find it. This could pose a problem if your PETSc was
provided as binaries rather than built from source; in that case I would
be interested in hearing about it so I can improve the configuration. If
PETSc is not there, a number of other tests are run to find a valid MPI
implementation. First, MPI libraries and header files are looked for in
/usr/lib and /usr/include (you can point configure at a particular
installation with --with-mpi=/path/to/mpi). If nothing is found there, a
final test checks whether the compiler supports MPI directly, as is the
case with the AIX compilers or the mpiCC wrapper provided by MPICH.

There was also a bug in the ParMETIS configuration that tried to build
ParMETIS even if no valid MPI installation was found. I have fixed that
in the CVS branch.
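Finally, a few sketches that might help; treat them as illustrations
rather than tested recipes. Here is roughly what the two Make.common
lines look like before and after the edit (only these two variables
change, everything else configure writes stays as-is):

  # as written by ./configure (Linux-style rpath flag, empty LIBS)
  RPATHFLAG = -Wl,-rpath,
  LIBS =

  # after the edit: the Sun linker takes -R instead of -rpath,
  # and -lrpcsvc is needed at link time
  RPATHFLAG = -Wl,-R,
  LIBS = -lrpcsvc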
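To check which compilers PETSc was built with, something like the
following should turn up the relevant lines (the exact variable names in
that file vary between PETSc versions, so just scan the output for the
compiler entries):

  # look for the compiler settings recorded when PETSc was configured
  egrep 'CC|CXX|F77' $PETSC_DIR/bmake/$PETSC_ARCH/variables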
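If your login shell is csh or tcsh rather than sh or bash, the compiler
selection would look something like this instead (same compiler names as
above, just csh syntax):

  # select the Sun Forte compilers before running configure (csh/tcsh)
  setenv CXX CC
  setenv CC  cc
  setenv F77 f77
  ./configure ....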
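And if configure's automatic MPI search misses your installation (on its
own it only checks /usr/lib and /usr/include, or whatever PETSc points
to), you can hand it the location explicitly; the path below is just a
placeholder:

  # tell configure where your MPI installation lives
  ./configure --with-mpi=/path/to/mpi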