From: Ondrej C. <on...@ce...> - 2007-07-08 16:38:00
|
Hi,

the Debian package is still in the NEW queue, but hopefully they will soon look at it and upload it into the archive. In the meantime, I have already created an updated version, which can be downloaded from:

http://debian.wgdd.de/debian/

Just put

deb http://debian.wgdd.de/debian unstable main contrib non-free

into your /etc/apt/sources.list.

However, I need to use libmesh without petsc, because I want to solve the matrices myself (using python-petsc), and the problem is that MPI makes it impossible to use python-petsc and a petsc-enabled libmesh at the same time (it complains about calling MPI functions after MPI_Finalize). I used to solve this problem by configuring libmesh against laspack, but since laspack is not free, I created these binary libmesh packages:

libmesh0.6.0-dev
libmesh0.6.0-pure-dev
libmesh0.6.0
libmesh0.6.0-pure

The -dev packages contain the header files; the others contain just the runtime .so library. The "libmesh0.6.0-pure" packages contain libmesh configured without petsc and laspack; the "libmesh0.6.0" packages are linked against petsc.

How I achieved that: I patched the laspack C++ interface in libmesh (I removed all calls to laspack), so I configure libmesh with laspack, but there is actually no laspack in it, which means it can go into Debian main. The details can be checked here:

http://debian.wgdd.de/debian/dists/unstable/main/source/libs/

The file libmesh_0.6.0~rc2.dfsg.orig.tar.gz is just the libmesh rc2 release (I haven't yet updated the package to work with the latest libmesh release) with all non-free stuff deleted. The file libmesh_0.6.0~rc2.dfsg-1oc3.diff.gz contains all the Debian-specific stuff - the commands to configure && compile && install libmesh, and my patches to mock up laspack.
So people like me, who just need the nice finite element code from libmesh, will install libmesh0.6.0-pure-dev; the other people, who need the full-blown libmesh linked against petsc, will install the libmesh0.6.0-dev package. Instructions for compiling the examples are in README.Debian, but it's very easy:

g++ -I/usr/include/libmesh -I/usr/include/mpi -I/usr/include/petsc -c -o ex2.o ex2.C
g++ -o ex2 ex2.o -lmesh -lpetsc -lpetscdm -lpetscksp -lpetscmat -lpetscsnes -lpetscts -lpetscvec

Actually, technically the packages at debian.wgdd.de are linked against a petsc that is itself linked against LAM (there is a special package for that in that repository); that is because python-petsc doesn't work properly when linked against petsc with MPICH. This will be resolved with the petsc maintainer in Debian, so you don't have to worry - whether you install it from above or from Debian (once libmesh gets into unstable), it will work out of the box. Only python-petsc works fine when installed from above; there are problems with the one in Debian.

And to also ask a question - how about adding a special option to libmesh's configure to configure and compile without petsc and laspack? That way I wouldn't have to patch the sources with each libmesh release.

Ondrej |
From: Roy S. <roy...@ic...> - 2007-07-08 16:54:17
|
On Sun, 8 Jul 2007, Ondrej Certik wrote:

> So to also ask some question - how about adding a special option to
> configure of libmesh to configure and compile without petsc and
> laspack? So that I don't have to patch the sources with each libmesh
> release.

Does the CVS libMesh still need such a patch? "./configure --disable-petsc --disable-laspack" compiles for me now, although naturally most of the example codes fail at runtime when they try to build numeric vectors of type "INVALID_SOLVER_PACKAGE".

---
Roy |
From: John P. <pet...@cf...> - 2007-07-08 17:40:06
|
Roy Stogner writes:

> On Sun, 8 Jul 2007, Ondrej Certik wrote:
>
> > So to also ask some question - how about adding a special option to
> > configure of libmesh to configure and compile without petsc and
> > laspack? So that I don't have to patch the sources with each libmesh
> > release.
>
> Does the CVS libMesh still need such a patch?
> "./configure --disable-petsc --disable-laspack" compiles for me now,
> although naturally most of the example codes fail at runtime when they
> try to build numeric vectors of type "INVALID_SOLVER_PACKAGE".

Oh, I wasn't aware that you had fixed the compilation issues with no valid solver packages defined.

-J |
From: John P. <pet...@cf...> - 2007-07-08 16:56:26
|
Ondrej Certik writes:

> [...]
>
> So to also ask some question - how about adding a special option to
> configure of libmesh to configure and compile without petsc and
> laspack? So that I don't have to patch the sources with each libmesh
> release.

This should be possible... I would probably set it up as

./configure --disable-laspack --disable-petsc

It would also give useful error messages when someone tries to actually *solve* after configuring petsc and laspack off (which just happened recently). My initial thought is to create a concrete "Dummy" LinearSolver class and SolverPackage enum which throws errors for all the pure virtual functions in the LinearSolver interface. Another option would be to make LinearSolver a non-abstract base, but I think this option is less preferable from a "good C++ practices" point of view.

-John |
From: Ondrej C. <on...@ce...> - 2007-07-08 18:37:34
|
> Does the CVS libMesh still need such a patch?

I think it does, see below.

> "./configure --disable-petsc --disable-laspack" compiles for me now,
> although naturally most of the example codes fail at runtime when they
> try to build numeric vectors of type "INVALID_SOLVER_PACKAGE".

The fix you are probably talking about is just adding a missing #include, so that the compiler doesn't stop on an AutoPtr error. But the real problem is that even though I am not calling "solve" in libmesh, it still fails on INVALID_SOLVER_PACKAGE, because I need to use equation_systems, since I am then calling

equation_systems.get_system<LinearImplicitSystem>("Poisson").get_dof_map()

etc. If you know a way around it, I am interested (I posted the relevant code below).

> This should be possible... I would probably set it up as
>
> ./configure --disable-laspack --disable-petsc
>
> It would also give useful error messages when someone tries to
> actually *solve* after configuring petsc and laspack off (which just
> happened recently).

Unfortunately, it gives an error message even during the calculation of the element matrices.

> My initial thought is to create a concrete "Dummy" LinearSolver
> class and SolverPackage enum which throws errors for all the pure
> virtual functions in the LinearSolver interface.

My thought is to create a concrete Dummy LinearSolver that does nothing. This is exactly what I am doing, only I used the laspack interface for the dummy solver. The other option is to fix libmesh so that it doesn't call the sparse things when they are not needed.
Ondrej

code for the element matrices:
-----------------------------------

void mesh(const std::string& fmesh, const std::string& fmatrices,
          const std::string& fboundaries, double* bvalues, int vsize,
          int* bidx, int isize, double* lambda, int lsize, Updater *up)
{
  char *p[1] = {"./lmesh"};
  int argc = 1;
  char **argv = p;
  libMesh::init(argc, argv);
  {
    Mesh mesh(3);
    mesh.read(fmesh);
    mesh.find_neighbors();
    int linear = mesh.elem(0)->type() == TET4;

    EquationSystems equation_systems (mesh);
    equation_systems.add_system<LinearImplicitSystem> ("Poisson");
    if (linear)
      equation_systems.get_system("Poisson").add_variable("u", FIRST);
    else
      equation_systems.get_system("Poisson").add_variable("u", SECOND);
    equation_systems.init();

    const unsigned int dim = mesh.mesh_dimension();
    LinearImplicitSystem& system =
      equation_systems.get_system<LinearImplicitSystem>("Poisson");
    const DofMap& dof_map = system.get_dof_map();
    FEType fe_type = dof_map.variable_type(0);

    AutoPtr<FEBase> fe (FEBase::build(dim, fe_type));
    QGauss qrule (dim, FIFTH);
    fe->attach_quadrature_rule (&qrule);

    AutoPtr<FEBase> fe_face (FEBase::build(dim, fe_type));
    QGauss qface (dim-1, FIFTH);
    fe_face->attach_quadrature_rule (&qface);

    const std::vector<Real>& JxW = fe->get_JxW();
    const std::vector<std::vector<Real> >& phi = fe->get_phi();
    const std::vector<std::vector<RealGradient> >& dphi = fe->get_dphi();

    DenseMatrix<Number> Ke;
    DenseVector<Number> Fee;
    std::vector<unsigned int> dof_indices;

    BC bc(fboundaries.c_str(), bvalues, bidx, isize);
    matrices mymatrices(fmatrices.c_str());
    mymatrices.setsize(mesh.n_nodes(), mesh.n_elem(), linear);

    unsigned int nodemap[mesh.n_nodes()];
    for (unsigned int i=0; i<mesh.n_nodes(); i++)
      nodemap[mesh.node(i).dof_number(0,0,0)] = i;

    MeshBase::const_element_iterator el = mesh.elements_begin();
    const MeshBase::const_element_iterator end_el = mesh.elements_end();
    if (up != NULL) up->init(mesh.n_elem()-1);

    for ( ; el != end_el ; ++el)
      {
        const Elem* elem = *el;
        if (up != NULL) up->update(elem->id());
        dof_map.dof_indices (elem, dof_indices);
        //std::cout << dof_indices.size() << " "
        //          << elem->type() << std::endl;
        fe->reinit (elem);
        Ke.resize (dof_indices.size(), dof_indices.size());
        Fee.resize (dof_indices.size());

        for (unsigned int qp=0; qp<qrule.n_points(); qp++)
          {
            Real lam = lambda[elem->id()];
            for (unsigned int i=0; i<phi.size(); i++)
              for (unsigned int j=0; j<phi.size(); j++)
                Ke(i,j) += JxW[qp]*(dphi[i][qp]*dphi[j][qp])*lam;
          }

        {
          int b, s;
          double bval;
          if (bc.find2(elem->id()+1, &b, &s, &bval))
            for (unsigned int side=0; side<elem->n_sides(); side++)
              if (side+1 == (unsigned int)s)
                {
                  if (elem->neighbor(side) != NULL) error();
                  const std::vector<std::vector<Real> >& phi_face =
                    fe_face->get_phi();
                  const std::vector<Real>& JxW_face = fe_face->get_JxW();
                  fe_face->reinit(elem, side);
                  Real value = bval;
                  for (unsigned int qp=0; qp<qface.n_points(); qp++)
                    {
                      const Real penalty = 1.e10;
                      for (unsigned int i=0; i<phi_face.size(); i++)
                        for (unsigned int j=0; j<phi_face.size(); j++)
                          Ke(i,j) += JxW_face[qp]*penalty*
                                     phi_face[i][qp]*phi_face[j][qp];
                      for (unsigned int i=0; i<phi_face.size(); i++)
                        Fee(i) += JxW_face[qp]*penalty*value*phi_face[i][qp];
                    }
                }
        }

        mymatrices.addtoA(Ke, dof_indices, nodemap);
        mymatrices.addtoF(Fee);
      } // for element
  }
  libMesh::close();
} |
From: Roy S. <roy...@ic...> - 2007-07-08 18:49:33
|
On Sun, 8 Jul 2007, Ondrej Certik wrote:

>> "./configure --disable-petsc --disable-laspack" compiles for me now,
>> although naturally most of the example codes fail at runtime when they
>> try to build numeric vectors of type "INVALID_SOLVER_PACKAGE".
>
> The fix which you are probably talking about is just adding a missing
> #include, so that the compiler doesn't stop on AutoPtr error. But the
> real problem is that even though I am not calling "solve" in libmesh,
> it still fails on INVALID_SOLVER_PACKAGE, becuase I need to use
> equation_systems, because I am then calling
>
> equation_systems.get_system<LinearImplicitSystem>("Poisson").get_dof_map()
>
> etc. If you know a way how around it, I am interested (I posted the
> relevant code below).

Even adding a system in the first place may be a problem, because the System class will try to create a NumericVector for its solution - it doesn't matter that you're never going to be solving with that vector, because the data structures for NumericVector subclasses depend on what solver you expect to hand them to later. I'm afraid the only fix may be to add a "Dummy" subclass like you suggested, except that it's not a dummy linear solver you need - it's dummy NumericVector (and, if you insist on creating ImplicitSystem objects, SparseMatrix) instantiations.

---
Roy |
From: John P. <pet...@cf...> - 2007-07-08 18:56:51
|
Roy Stogner writes:

> [...]
>
> Even adding a system in the first place may be a problem, because the
> System class will try to create a NumericVector for it's solution -
> it doesn't matter that you're never going to be solving with that
> vector, because the data structures for NumericVector subclasses
> depend on what solver you expect to hand them to later. I'm afraid
> the only fix may be to add a "Dummy" subclass like you suggested
> except that it's not a dummy linear solver you need, it's dummy
> NumericVector (and if you insist on creating ImplicitSystem objects,
> SparseMatrix) instantiations.

DistributedVector (include/numerics/distributed_vector.h) should be a working built-in implementation of NumericVector that we can use for the "Dummy" solver package.

-John |
From: Ondrej C. <on...@ce...> - 2007-07-08 20:06:25
|
> > Even adding a system in the first place may be a problem, because the
> > System class will try to create a NumericVector for it's solution -
> > it doesn't matter that you're never going to be solving with that
> > vector, because the data structures for NumericVector subclasses
> > depend on what solver you expect to hand them to later. I'm afraid
> > the only fix may be to add a "Dummy" subclass like you suggested
> > except that it's not a dummy linear solver you need, it's dummy
> > NumericVector (and if you insist on creating ImplicitSystem objects,
> > SparseMatrix) instantiations.

Exactly - that's what I did in the Debian package. I just made the laspack vector and laspack matrix a dummy vector and a dummy matrix (only I used the word laspack instead of dummy).

> DistributedVector (include/numerics/distributed_vector.h) should be a
> working built-in implementation of NumericVector
> that we can use for the "Dummy" solver package.

The LaspackVector actually derives from NumericVector directly. But I guess any solution would be fine.

Ondrej |
From: Roy S. <roy...@ic...> - 2007-07-08 20:33:31
|
On Sun, 8 Jul 2007, Ondrej Certik wrote:

>> DistributedVector (include/numerics/distributed_vector.h) should be a
>> working built-in implementation of NumericVector
>> that we can use for the "Dummy" solver package.
>
> The LaspackVector actually derives from NumericVector directly.
> But I guess any solution would be fine.

DistributedVector (whose existence I'd forgotten about - thanks, John) derives from PetscVector directly too, doesn't it? Would it be a partial fix if NumericVector::build() created a DistributedVector by default instead of just exiting with an error?

---
Roy |
From: Ondrej C. <on...@ce...> - 2007-07-08 21:03:05
|
> DistributedVector (whose existence I'd forgotten about - thanks, John)
> derives from PetscVector directly too, doesn't it? Would it be a

DistributedVector derives from NumericVector, at least in 0.6.0-rc2.

> partial fix if NumericVector::build() created a DistributedVector by
> default instead of just exiting with an error?

There are actually two places in the system that exit with the solver error, and I tried to return NULL, which obviously segfaulted. :) But maybe returning the DistributedVector would do the job.

Ondrej |
From: John P. <pet...@cf...> - 2007-07-08 21:18:08
|
Ondrej Certik writes:

> > DistributedVector (whose existence I'd forgotten about - thanks, John)
> > derives from PetscVector directly too, doesn't it? Would it be a
>
> DistributedVector derives from NumericVector, at least in 0.6.0-rc2.

Yes, AFAIK it has always derived from the abstract NumericVector base.

> > partial fix if NumericVector::build() created a DistributedVector by
> > default instead of just exiting with an error?
>
> There are actually two places in the system that exit with the solver
> error and I tried to return NULL, which obviously segfaulted. :) But
> maybe returning the DistributedVector would do the job.

I hope it works. This is also a good opportunity (for anyone out there who wants to) to actually *write* a LibMeshSparseMatrix/DistributedMatrix class and a LibMeshLinearSolver. Serial is fine at first; we can work on parallel later ;)

-John |
From: Ondrej C. <on...@ce...> - 2007-07-08 23:18:00
|
> the Debian package is still in the NEW queue, but hopefully they will
> soon look at it and upload it into the archive. In the meantime, I

They probably heard me and just uploaded it a few minutes ago. :) It will be in Debian unstable tomorrow or the day after, and it will usually get into Ubuntu a day or two after that.

Ondrej |