From: Ondrej C. <on...@ce...> - 2006-03-27 14:09:19
|
Hello,

if anyone is also interested in computing the lowest eigenvalues, below is how.

Libmesh already has support for slepc, but it's not really needed if the only thing I want is to construct global matrices, save them to disk and solve them (externally).

I am solving a generalized hermitian problem Ax=kBx. So after assembling matrices A and B, I save them to a file using the PETSC_VIEWER_BINARY_DEFAULT format. Then I use example 7 from slepc

http://www.grycap.upv.es/slepc/handson/handson3.html

to solve them. The advantage is that now I can play with various options on the command line.

1) to find the largest eigenvalues:

./ex7 -f1 matrixA -f2 matrixB

works with any solver.

2) to find the smallest eigenvalues, either:

./ex7 -f1 matrixA -f2 matrixB -st_type sinvert

works with any solver, or:

./ex7 -f1 matrixA -f2 matrixB -eps_type arpack -eps_smallest_real

works only with the arpack solver. It's very slow; I have no idea why. This was around 20 times faster:

./ex7 -f1 matrixA -f2 matrixB -st_type sinvert -eps_type arpack

You can also change the tolerance or the number of requested eigenvalues using the options:

-eps_nev 2 -eps_tol 1e-3

All of this can of course be built into libmesh; I think it already is. Another option is to use pysparse; that solver is also very good.

Ondrej |
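The large speedup Ondrej reports from -st_type sinvert comes from the shift-and-invert spectral transformation: iterating x <- (A - sigma*B)^{-1} B x amplifies the eigenvector whose eigenvalue lies closest to the shift sigma, so eigenvalues near zero converge quickly. A toy, pure-Python sketch of the idea (dense matrices, made-up helper names; nothing to do with the actual SLEPc implementation):

```python
# Toy shift-and-invert iteration for the generalized problem A x = lam B x.
# Dense, pure-Python illustration only; real codes use sparse factorizations.

def solve(M, b):
    """Solve M x = b by Gaussian elimination with partial pivoting."""
    n = len(b)
    M = [row[:] for row in M]
    b = b[:]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n):
                M[r][c] -= f * M[col][c]
            b[r] -= f * b[col]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        s = sum(M[r][c] * x[c] for c in range(r + 1, n))
        x[r] = (b[r] - s) / M[r][r]
    return x

def matvec(M, x):
    return [sum(M[i][j] * x[j] for j in range(len(x))) for i in range(len(M))]

def shift_invert_eig(A, B, sigma=0.0, iters=50):
    """Eigenvalue of A x = lam B x nearest the shift sigma."""
    n = len(A)
    # Shifted operator C = A - sigma*B; each step solves C x_new = B x.
    C = [[A[i][j] - sigma * B[i][j] for j in range(n)] for i in range(n)]
    x = [1.0] * n
    for _ in range(iters):
        x = solve(C, matvec(B, x))
        norm = max(abs(v) for v in x)
        x = [v / norm for v in x]
    # Rayleigh quotient recovers the eigenvalue of the original problem.
    return sum(v * w for v, w in zip(x, matvec(A, x))) / \
           sum(v * w for v, w in zip(x, matvec(B, x)))

A = [[2.0, -1.0, 0.0], [-1.0, 2.0, -1.0], [0.0, -1.0, 2.0]]
B = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]  # identity B for a checkable answer
print(shift_invert_eig(A, B, sigma=0.1))  # smallest eigenvalue of A is 2 - sqrt(2)
```

With sigma near zero this converges to the smallest eigenvalue, which is what -st_type sinvert achieves inside SLEPc with proper sparse solves.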
From: David X. <dx...@my...> - 2006-03-28 05:43:08
|
Ondrej,

Thanks for the tips. How's the matrix file I/O performance when using the PETSC_VIEWER_BINARY_DEFAULT format for large matrices (size > 100000)? What's your libmesh code for exporting matrices in the PETSC_VIEWER_BINARY_DEFAULT format?

As we discussed before, I tried pysparse with success; the bottleneck is the file I/O. I also compiled ARPACK with SLEPc and enabled ARPACK in libmesh. It worked in finding the smallest eigenvalues, but it wasn't very fast compared to pysparse plus the time spent on file I/O.

It would be nice to implement the JDBSYM algorithm in libmesh. I've been trying to compile the JDBSYM library (from its C implementation) with my libmesh code, but haven't gotten it to work yet. The reason I need the eigensolver integrated with my libmesh code is to be able to post-process the eigenvectors, say, for normalization. Do you know how to do that with pysparse?

Another great eigenvalue solver is BLOPEX (http://www-math.cudenver.edu/~aknyazev/software/BLOPEX/); its performance is comparable to Pysparse/JDBSYM and it has become one of PETSc 2.3.1's external packages. But I haven't figured out how to use it in libmesh. Maybe it's worth a try.

Thanks again for sharing your experience!

David

On 3/27/06, Ondrej Certik <on...@ce...> wrote:
> Hello,
>
> if anyone is also interested in computing the lowest eigenvalues, below is how.
>
> Libmesh already has support for slepc, but it's not really needed if the only
> thing I want is to construct global matrices, save them to disk and solve them
> (externally).
|
From: Ondrej C. <on...@ce...> - 2006-03-28 09:34:50
|
On Mon, Mar 27, 2006 at 09:43:04PM -0800, David Xu wrote:
> Ondrej,
>
> Thanks for the tips. How's the matrix file i/o performance when using
> the PETSC_VIEWER_BINARY_DEFAULT format for large matrices (size >
> 100000)?

I was doing these tests - it's something like 0.5 s. The same with loading it in my external petsc program. So it's ok. But I wasn't trying very large matrices, like size > million.

> What's your libmesh code for exporting PETSC_VIEWER_BINARY_DEFAULT
> format matrices?

void save_sparse_matrix(SparseMatrix<Number>& M, const char *fname)
{
    //M.print_matlab(fname);
    int ierr = 0;
    PetscViewer petsc_viewer;
    M.close();
    ierr = PetscViewerCreate(libMesh::COMM_WORLD, &petsc_viewer);
    CHKERRABORT(libMesh::COMM_WORLD, ierr);
    ierr = PetscViewerBinaryOpen(libMesh::COMM_WORLD, fname,
                                 FILE_MODE_WRITE, &petsc_viewer);
    CHKERRABORT(libMesh::COMM_WORLD, ierr);
    ierr = PetscViewerSetFormat(petsc_viewer, PETSC_VIEWER_BINARY_DEFAULT);
    CHKERRABORT(libMesh::COMM_WORLD, ierr);
    Mat mat = ((PetscMatrix<Number>&)(M)).mat();
    ierr = MatView(mat, petsc_viewer);
    CHKERRABORT(libMesh::COMM_WORLD, ierr);
    ierr = PetscViewerDestroy(petsc_viewer);
    CHKERRABORT(libMesh::COMM_WORLD, ierr);
}

> As we discussed before, I tried pysparse with success, the bottleneck
> is the file i/o. I also compiled ARPACK with SLEPc and enabled ARPACK

Then load the sparse matrix into pysparse in matlab format. That should be fast enough. The code I sent you was slow because I assembled from element matrices in python, which is obviously slow.

> in libmesh. It worked in finding smallest eigenvalues, but it wasn't
> very fast compared to pysparse + time spent on file i/o.

Arpack can also be very fast - see my last email. It was as fast as pysparse, even a little faster than the arnoldi/lanczos in slepc.

> It would be nice to implement JDBSYM algorithm in libmesh. I've been
> trying to compile JDBSYM library (from its C implementation)
> with my libmesh code, haven't gotten it to work yet.
> The reason I need the
> eigensolver integrated with libmesh code is to be able to post-process
> the eigenvectors, say, normalization. Do you know how to do that with
> pysparse?

Sure. jdsym.jdsym(...) returns both eigenvalues and eigenvectors, so export the eigenvector to a file and do what you want with it. Better would be to integrate jdsym into slepc.

> Another great eigenvalue solver is BLOPEX
> (http://www-math.cudenver.edu/~aknyazev/software/BLOPEX/), the
> performance is comparable to Pysparse/JDBSYM and it has become one of
> PETSc 2.3.1 external packages. But I haven't figured out how to use it
> in libmesh. Maybe it's worth a try.

It is - but why is the eigensolver in petsc? I thought that petsc doesn't have interfaces to any eigensolvers (slepc does). And slepc doesn't (yet) have a blopex interface. Do you use it through petsc, or directly? And if through petsc, how?

Ondrej |
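The normalization step David asks about needs nothing solver-specific: once an eigenvector v of the generalized problem A x = lam B x is exported, the natural scaling is v^T B v = 1, which takes a few lines in any language. A hypothetical pure-Python sketch, independent of pysparse (dense B here only to keep it self-contained):

```python
# B-normalize an eigenvector of the generalized problem A x = lam B x,
# i.e. rescale v so that v^T B v = 1. Toy dense version; a real code
# would apply the sparse B from pysparse or petsc instead.
import math

def b_normalize(v, B):
    # v^T B v for the current scaling of v
    vBv = sum(v[i] * sum(B[i][j] * v[j] for j in range(len(v)))
              for i in range(len(v)))
    scale = 1.0 / math.sqrt(vBv)
    return [scale * x for x in v]

B = [[2.0, 0.0], [0.0, 0.5]]
v = [3.0, 4.0]
w = b_normalize(v, B)
# afterwards w^T B w equals 1 up to rounding
```

For a hermitian problem the same formula applies with a conjugated inner product.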
From: David X. <dx...@my...> - 2006-03-28 09:39:44
|
> It is - but why is the eigensolver in petsc? I thought that petsc doesn't have
> interfaces to any eigensolvers (slepc does). And slepc doesn't (yet) have
> a blopex interface. Do you use it through petsc, or directly? And if through
> petsc, how?

This is also my question: how to get it working with libmesh, since it comes with PETSc? See the bottom of this link:

http://www-unix.mcs.anl.gov/petsc/petsc-2/documentation/changes/231.html |
From: Ondrej C. <on...@ce...> - 2006-03-28 09:44:13
|
I also wasn't able to find any other documentation besides the sources. But arpack, slepc arnoldi, and pysparse are enough working options for me.

Ondrej

On Tue, Mar 28, 2006 at 01:39:42AM -0800, David Xu wrote:
> > It is - but why is the eigensolver in petsc? I thought that petsc doesn't have
> > interfaces to any eigensolvers (slepc does). And slepc doesn't (yet) have
> > a blopex interface. Do you use it through petsc, or directly? And if through
> > petsc, how?
>
> This is also my question: how to get it working with libmesh, since it
> comes with PETSc?
> See the bottom of this link:
> http://www-unix.mcs.anl.gov/petsc/petsc-2/documentation/changes/231.html |
From: David X. <dx...@my...> - 2006-03-28 09:49:42
|
> Then load the sparse matrix to pysparse in a matlab format. Should be fast
> enough. The code I sent you was slow because I assemble from element matrices
> in python, which is obviously slow.

I did load a matlab sparse matrix into pysparse. It still requires some conversion, and if the matlab matrix file gets as large as a gigabyte in size, the conversion and loading will surely drag down the speed.

> > Another great eigenvalue solver is BLOPEX
> > (http://www-math.cudenver.edu/~aknyazev/software/BLOPEX/), the
> > performance is comparable to Pysparse/JDBSYM and it has become one of
> > PETSc 2.3.1 external packages. But I haven't figured out how to use it
> > in libmesh. Maybe it's worth a try.
>
> It is - but why is the eigensolver in petsc? I thought that petsc doesn't have
> interfaces to any eigensolvers (slepc does). And slepc doesn't (yet) have
> a blopex interface. Do you use it through petsc, or directly? And if through
> petsc, how?

The BLOPEX web site http://www-math.cudenver.edu/~aknyazev/software/BLOPEX/ has PETSc driver code for download. Maybe that will shed some light on how to make it work with libmesh.

David |
From: Ondrej C. <on...@ce...> - 2006-03-28 10:21:18
|
> I did load matlab sparse matrix to pysparse. It still requires some
> conversion and if the matlab matrix file gets as large as gigabyte in
> size, the simple conversion and loading will surely drag down the
> speed.

I see. That's why I use the petsc binary format. And as for pysparse - you will have to construct the matrix in your libmesh code using the pysparse C code, probably.

> BLOPEX web site http://www-math.cudenver.edu/~aknyazev/software/BLOPEX/
> has a PETSc driver code for download. Maybe that will shed some
> light on how to make it work with libmesh.

Yes, this driver is in src/contrib/blopex in the petsc directory. It calls the lobpcg_solve function to solve the eigenvalue problem, which is probably in the blopex package (I don't have it installed).

Ondrej |
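The reason the petsc binary format sidesteps the conversion cost Ondrej and David are discussing is that the on-disk layout is just a short integer header plus three flat arrays, so an external tool can stream it in with no text parsing. A hypothetical pure-Python sketch of my understanding of the classic AIJ layout (big-endian int32 header [MAT_FILE_CLASSID, rows, cols, nnz], then per-row nonzero counts, column indices, and float64 values); verify against the PETSc documentation before relying on it, and prefer MatLoad/MatView in real code:

```python
# Sketch of a reader/writer for the classic PETSc binary sparse (AIJ)
# matrix format, as I understand it: all integers big-endian int32,
# values big-endian float64. Illustration only, not a MatLoad substitute.
import struct

MAT_FILE_CLASSID = 1211216  # magic number identifying a Mat in the file

def write_petsc_aij(fname, nrows, ncols, rows):
    """rows: one list of (col, value) pairs per matrix row."""
    nnz = sum(len(r) for r in rows)
    with open(fname, "wb") as f:
        f.write(struct.pack(">4i", MAT_FILE_CLASSID, nrows, ncols, nnz))
        f.write(struct.pack(">%di" % nrows, *[len(r) for r in rows]))
        cols = [c for r in rows for (c, _) in r]
        vals = [v for r in rows for (_, v) in r]
        f.write(struct.pack(">%di" % nnz, *cols))
        f.write(struct.pack(">%dd" % nnz, *vals))

def read_petsc_aij(fname):
    with open(fname, "rb") as f:
        classid, nrows, ncols, nnz = struct.unpack(">4i", f.read(16))
        assert classid == MAT_FILE_CLASSID
        rowlens = struct.unpack(">%di" % nrows, f.read(4 * nrows))
        cols = struct.unpack(">%di" % nnz, f.read(4 * nnz))
        vals = struct.unpack(">%dd" % nnz, f.read(8 * nnz))
    rows, k = [], 0
    for rl in rowlens:
        rows.append(list(zip(cols[k:k + rl], vals[k:k + rl])))
        k += rl
    return nrows, ncols, rows
```

Reading is three fixed-size bulk reads, which is why loading even a large matrix this way takes a fraction of a second compared with parsing a text (matlab-style) dump.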
From: David X. <dx...@my...> - 2006-03-28 17:43:26
|
On 3/28/06, Ondrej Certik <on...@ce...> wrote:
> > I did load matlab sparse matrix to pysparse. It still requires some
> > conversion and if the matlab matrix file gets as large as gigabyte in
> > size, the simple conversion and loading will surely drag down the
> > speed.
>
> I see. That's why I use the petsc binary format. And as for pysparse - you will
> have to construct the matrix in your libmesh code using the pysparse C code,
> probably.
>
> > BLOPEX web site http://www-math.cudenver.edu/~aknyazev/software/BLOPEX/
> > has a PETSc driver code for download. Maybe that will shed some
> > light on how to make it work with libmesh.
>
> Yes, this driver is in src/contrib/blopex in the petsc directory. It calls
> the lobpcg_solve function to solve the eigenvalue problem, which is probably
> in the blopex package (I don't have it installed).

When I installed petsc-2.3.1, I added --download-blopex=1 --with-blopex=1 and it automatically installed and compiled blopex. |
From: David X. <dx...@my...> - 2006-03-28 09:52:04
|
On 3/28/06, Ondrej Certik <on...@ce...> wrote:
> I also wasn't able to find any other documentation besides the sources.

"BLOPEX is available as an external block to the PETSc package from the Mathematics and Computer Science Division of ANL. The BLOPEX-PETSc archive blopex_petsc.tgz contains a primitive BLOPEX-PETSc test driver that can be used as an example to call BLOPEX eigensolvers from PETSc."

http://www-math.cudenver.edu/~aknyazev/software/BLOPEX/blopex_petsc.tgz |