On Mon, Mar 27, 2006 at 09:43:04PM -0800, David Xu wrote:
> Thanks for the tips. How's the matrix file i/o performance when using
> the PETSC_VIEWER_BINARY_DEFAULT format for large matrices (size >
I did these tests - saving takes something like 0.5 s, and the same for loading it
in my external petsc program. So it's ok. But I haven't tried very large
matrices, like size > million.
> What's your libmesh code for exporting PETSC_VIEWER_BINARY_DEFAULT
> format matrices?
void save_sparse_matrix(SparseMatrix<Number>& M, const char *fname)
{
  PetscErrorCode ierr;
  PetscViewer petsc_viewer;
  ierr = PetscViewerBinaryOpen (libMesh::COMM_WORLD, fname, FILE_MODE_WRITE,
                                &petsc_viewer);
  ierr = PetscViewerSetFormat (petsc_viewer, PETSC_VIEWER_BINARY_DEFAULT);
  // get the underlying petsc Mat and dump it through the binary viewer
  Mat mat = ((PetscMatrix<Number>&)(M)).mat();
  ierr = MatView (mat, petsc_viewer);
  ierr = PetscViewerDestroy (petsc_viewer);
}
> As we discussed before, I tried pysparse with success, the bottleneck
> is the file i/o. I also compiled ARPACK with SLEPc and enabled ARPACK
Then load the sparse matrix into pysparse in a matlab format. That should be fast
enough. The code I sent you was slow because I assembled from element matrices
in python, which is obviously slow.
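If the file i/o really is the bottleneck, you can also skip the matlab file and read
the petsc binary file from python directly. This is not code from the thread, just a
minimal sketch assuming the documented petsc binary Mat layout (big-endian: a classid
of 1211216, then m, n, nnz, the per-row nonzero counts, the column indices, and the
double-precision values); read_petsc_binary_matrix is my own name, not a petsc or
pysparse routine:

```python
# Hypothetical sketch: read a matrix written by MatView with a binary
# viewer, assuming the documented big-endian petsc layout.
import struct

MAT_FILE_CLASSID = 1211216  # petsc's magic number for Mat objects

def read_petsc_binary_matrix(fname):
    """Return (m, n, rows, cols, vals) in coordinate (COO) form."""
    with open(fname, "rb") as f:
        classid, m, n, nnz = struct.unpack(">4i", f.read(16))
        assert classid == MAT_FILE_CLASSID, "not a petsc Mat file"
        row_nnz = struct.unpack(">%di" % m, f.read(4 * m))   # nonzeros per row
        col_idx = struct.unpack(">%di" % nnz, f.read(4 * nnz))
        vals = struct.unpack(">%dd" % nnz, f.read(8 * nnz))
    # expand the per-row counts into explicit row indices
    rows = []
    for i, cnt in enumerate(row_nnz):
        rows.extend([i] * cnt)
    return m, n, rows, list(col_idx), list(vals)
```

From the coordinate form you can fill a pysparse ll_mat (or whatever structure you
need) without going through an intermediate matlab file.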
> in libmesh. It worked in finding smallest eigenvalues, but it wasn't
> very fast compared to pysparse + time spent on file i/o.
Arpack can also be very fast - see my last email. It was as fast as pysparse,
even a little faster than the arnoldi/lanczos in slepc.
> It would be nice to implement JDBSYM algorithm in libmesh. I've been
> trying to compile JDBSYM library (from its C code implementation)
> with my libmesh code, haven't got it work yet. The reason I need the
> eigensolver integrated with libmesh code is to be able to post-process
> the eigenvectors, say, normalization. Do you know how to do that with
Sure. The jdsym.jdsym(...) call returns both eigenvalues and eigenvectors, so export
the eigenvectors to a file and do what you want with them.
Better still would be to integrate jdsym into slepc.
> Another great eigenvalue solver is BLOPEX
> (http://www-math.cudenver.edu/~aknyazev/software/BLOPEX/), the
> performance is comparable to Pysparse/JDBSYM and it has become one of
> PETSc 2.3.1 external packages. But I haven't figured out how to use it
> in libmesh. Maybe it's worth a try.
It is - but why is the eigensolver in petsc? I thought that petsc doesn't have
interfaces to any eigensolvers (slepc does), and slepc doesn't (yet) have a
blopex interface. Do you use it through petsc, or directly? And if through