From: Kirk, Benjamin (JSC-EG311) <benjamin.kirk1@na...>  2009-03-03 19:42:29

> Thanks Ben.
>
> I use mpich2-1.0.7.
>
> After a discussion with the PETSc developers, the problem might come from
> the many allocations made by LibMesh within the call to PETSc. In fact, if
> you look at the PETSc log summary of the problem I solve, you can clearly
> see that most of the time (more than 90%) is spent in the SNESSolve stage.
> The KSPSolve stage for solving the linear system in Newton takes at most 5%
> of the time. Actually, my problem is really at the very first Newton
> iteration, which can last an hour out of a 3-hour total solve time. Here is
> the behavior I have:
>
> ==> Solving time step 0, time = 0.01
> NL step 0, residual_2 = 5.346581e-05
> .. 1 hour ..
> NL step 1, residual_2 = 8.790777e-10
>
> ==> Solving time step 1, time = 2.000000e-02
> NL step 0, residual_2 = 6.043076e-05
> NL step 1, residual_2 = 9.936468e-10
> ...
> ...
> etc. until the end, for a total CPU time of 3 hours.
>
> In the end I always get the right solution, but I don't understand the
> stall at the beginning. It might not be only VecScatterCreate; I think it's
> a whole bunch of memory allocations that happen.
>
> What do you think?

I think the problem is most definitely in the sparse matrix allocation. libMesh builds the graph of (what it thinks is) your sparse matrix so that the underlying PETSc data structures can be allocated perfectly. If for some reason the linear system you are assembling has a different structure than what we thought it would, insertions into the sparse matrix can be horrifically slow the first time you assemble the linear system.

What you should look for is something like 'number of mallocs used during MatSetValues calls' when you run with -info. We want that to be 0. What is it on the first linear solve? What type of elements are you using?

Ben