Holy smokes, did I drop the ball.  No excuses... I just flat didn't do this.

Now I'm finally getting back around to doing some Trilinos work... and I _knew_ that there had been a patch for this stuff but it wasn't in the code!  So I went digging through my email and found out that I seriously screwed up here.

Norbert: I apologize for not applying this patch sooner (it is applied now). Also... if you did happen to do some more work on Trilinos with libMesh... I would be grateful for any patches you wish to send along.  I promise that it won't take me 11 months to apply them...

Derek

On Wed, Oct 29, 2008 at 10:37 AM, Derek Gaston <friedmud@gmail.com> wrote:
Thanks Norbert!  This is something that's been on my list for a while now.

I'm finishing up an abstract for a conference today... then I'll take a look at this and get it committed... and then I should be freed up to do some work on this stuff myself.

Thanks again,
Derek


On Oct 29, 2008, at 9:05 AM, Norbert Stoop wrote:

Hi Derek,

Attached is another small Trilinos patch. Essentially, it sets the
various Aztec/NOX parameters to the same values as the PETSc solver does.
Additionally, there seems to be an inconsistency in Trilinos regarding
the size of the Krylov subspace: the docs state that the default value
is 30 (as in PETSc), whereas the code has it set to 300. I'm no expert,
but this seems far too large, at least for the problems I'm dealing
with, and it results in Aztec convergence problems. Setting it to 30 produces
identical results in PETSc and Trilinos (by identical I mean identical
residuals at every linear iteration).
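
In case it's useful, the change boils down to the following (a minimal
sketch -- the helper name is made up, but the sublist and key names are
the ones the patch below touches):

  #include "Teuchos_ParameterList.hpp"

  // Sketch only: force the GMRES restart ("Size of Krylov Subspace")
  // down to PETSc's default of 30 in the NOX linear solver sublist.
  void set_petsc_like_gmres_defaults (Teuchos::ParameterList & nlParams)
  {
    Teuchos::ParameterList & lsParams =
      nlParams.sublist("Direction").sublist("Newton").sublist("Linear Solver");

    lsParams.set("Aztec Solver", "GMRES");
    lsParams.set("Size of Krylov Subspace", 30); // PETSc's default restart
  }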

-Norbert

Derek Gaston wrote:
On Oct 15, 2008, at 2:33 AM, Norbert Stoop wrote:

Parallel is where the issue is... and is why I haven't enabled that
capability yet.  I haven't yet done the work to make EpetraMatrix work
in parallel like I did for EpetraVector.

But if you have it working in serial... by all means send a patch.

> I understand that Trilinos support is unofficial and a work in
> progress. I am still new to libMesh, and to Trilinos in particular,
> but I will invest some more hours into this problem. Not sure if that
> will be of any help, though...

Any help is appreciated... I'm currently being pulled in about 7
different directions... so even small progress in this direction is
better than none.

> What puzzles me is that the error is already fully present after the
> first nonlinear iteration, i.e. after solving the linearized system.

Yes... if there is a problem in parallel then you probably aren't
getting the correct summations of off-processor contributions when you
do a residual evaluation... so you would see it _immediately_.  Your
very first residual evaluation would be wrong.

Like I say though... that shouldn't be broken anymore (for pure
matrix-free).  Are you sure you've updated your libMesh?  There was a
small bug where closing the EpetraVector was not calling
GlobalAssemble... but I think I committed the fix for that a little
while ago.
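
If you want to double-check your build by hand, the call that close()
relies on is just Epetra's GlobalAssemble.  Roughly (an untested sketch
in raw Epetra, not libMesh code -- the same idea applies to
Epetra_FECrsMatrix on the matrix side):

  #include "Epetra_MpiComm.h"
  #include "Epetra_Map.h"
  #include "Epetra_FEVector.h"

  // Off-processor contributions are only shipped and summed by
  // GlobalAssemble() -- exactly the call the EpetraVector::close() bug
  // was skipping.
  void assemble_residual_example (const Epetra_MpiComm & comm)
  {
    Epetra_Map map(100, 0, comm);   // 100 global entries, 0-based
    Epetra_FEVector residual(map);

    int    gid   = 42;              // may be owned by another processor
    double value = 1.0;
    residual.SumIntoGlobalValues(1, &gid, &value);

    residual.GlobalAssemble();      // combine mode defaults to Add
  }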

Derek


Index: trilinos_nox_nonlinear_solver.C
===================================================================
--- trilinos_nox_nonlinear_solver.C     (revision 3121)
+++ trilinos_nox_nonlinear_solver.C     (working copy)
@@ -201,9 +201,10 @@

 Teuchos::ParameterList& lsParams = newtonParams.sublist("Linear Solver");
 lsParams.set("Aztec Solver", "GMRES");
-  lsParams.set("Max Iterations", 800);
-  lsParams.set("Tolerance", 1e-10);
+  lsParams.set("Max Iterations", static_cast<int>(this->max_linear_iterations));
+  lsParams.set("Tolerance", this->initial_linear_tolerance);
 lsParams.set("Output Frequency", 1);  
+  lsParams.set("Size of Krylov Subspace", 30);
//  lsParams.set("Preconditioner", "AztecOO");

 Teuchos::RCP<NOX::Epetra::Interface::Jacobian> iJac = MF;
@@ -220,16 +221,22 @@
 NOX::Epetra::Group& grp = *(grpPtr.get());

 Teuchos::RCP<NOX::StatusTest::NormF> absresid =
-    Teuchos::rcp(new NOX::StatusTest::NormF(1.0e-12, NOX::StatusTest::NormF::Unscaled));
+    Teuchos::rcp(new NOX::StatusTest::NormF(this->absolute_residual_tolerance, NOX::StatusTest::NormF::Unscaled));
+  Teuchos::RCP<NOX::StatusTest::NormF> relresid =
+    Teuchos::rcp(new NOX::StatusTest::NormF(grp, this->relative_residual_tolerance));
 Teuchos::RCP<NOX::StatusTest::MaxIters> maxiters =
-    Teuchos::rcp(new NOX::StatusTest::MaxIters(25));
+    Teuchos::rcp(new NOX::StatusTest::MaxIters(this->max_nonlinear_iterations));
 Teuchos::RCP<NOX::StatusTest::FiniteValue> finiteval =
   Teuchos::rcp(new NOX::StatusTest::FiniteValue());
+  Teuchos::RCP<NOX::StatusTest::NormUpdate> normupdate =
+    Teuchos::rcp(new NOX::StatusTest::NormUpdate(this->absolute_step_tolerance));
 Teuchos::RCP<NOX::StatusTest::Combo> combo =
   Teuchos::rcp(new NOX::StatusTest::Combo(NOX::StatusTest::Combo::OR));
 combo->addStatusTest(absresid);
+  combo->addStatusTest(relresid);
 combo->addStatusTest(maxiters);
 combo->addStatusTest(finiteval);
+  combo->addStatusTest(normupdate);


 Teuchos::RCP<NOX::Solver::Generic> solver =
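
One note on the Combo::OR test above: the solve stops as soon as any
single criterion fires (absolute or relative residual, step norm,
iteration cap, or a non-finite value), and the returned status tells
you whether it converged or failed.  A minimal usage sketch, assuming
the patch is applied and `solver` is the object built above:

  #include "NOX_Solver_Generic.H"
  #include "NOX_StatusTest_Generic.H"
  #include <iostream>

  // With a Combo::OR status test, solve() returns as soon as any one
  // sub-test reports Converged or Failed.
  void run_nonlinear_solve (NOX::Solver::Generic & solver)
  {
    NOX::StatusTest::StatusType status = solver.solve();

    if (status != NOX::StatusTest::Converged)
      std::cerr << "Nonlinear solve did not converge" << std::endl;
  }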
