
#35 [foam-extend 4.1] MPI bug with GAMG solver for pressure (simpleFoam, but may be others)

Milestone: 1.0
Status: closed
Owner: nobody
Labels: None
Updated: 2020-04-16
Created: 2019-11-19
Private: No

MPI runs of simpleFoam (likely other solvers as well) using the GAMG solver for the pressure field fail with:

BiCGStab:  Solving for Ux, Initial residual = 1, Final residual = 0.00464911, No Iterations 1
BiCGStab:  Solving for Uy, Initial residual = 0, Final residual = 0, No Iterations 0
[ubuntu:22474] *** An error occurred in MPI_Waitall
[ubuntu:22474] *** reported by process [1435959297,1]
[ubuntu:22474] *** on communicator MPI_COMM_WORLD
[ubuntu:22474] *** MPI_ERR_TRUNCATE: message truncated
[ubuntu:22474] *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
[ubuntu:22474] ***    and potentially your MPI job)

i.e. an MPI_Waitall error (MPI_ERR_TRUNCATE). Serial runs work fine.

Systems tested:

  • CentOS 7, self-compiled foam-extend, using gcc4, gcc5, and gcc7 toolchains: same error
    with both openmpi 1.8 (foam-extend ThirdParty) and openmpi 1.10 (CentOS 7 system openmpi)
  • Ubuntu 18.04 LTS, foam-extend .deb package (amd64_Ubuntu1804_5832cfe25, 02/11/2019), openmpi 2.1.1 (Ubuntu 18.04 system openmpi)
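The toolchain and MPI versions listed above can be checked with commands along these lines (a sketch; the dpkg query only applies to the Ubuntu .deb install):

gcc --version                    # compiler used for the build
mpirun --version                 # OpenMPI version picked up at runtime
dpkg -l | grep -i foam-extend    # installed .deb package (Ubuntu only)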

To reproduce:

pitzDaily testcase:

serial run (works):

blockMesh
simpleFoam

parallel run, original case as in tutorials (works):

blockMesh
pyFoamDecomposePar ./ 4
mpirun -np 4 simpleFoam -parallel

This shows that the original case using BiCGStab/PCG works.
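For convenience, this working baseline can be scripted roughly as follows (a sketch; it assumes a clean copy of the pitzDaily tutorial case as the working directory and the PyFoam utilities on the PATH; the log file name is just a choice made here):

#!/bin/sh
set -e
blockMesh                                              # mesh the pitzDaily case
pyFoamDecomposePar ./ 4                                # decompose into 4 subdomains (PyFoam, as above)
mpirun -np 4 simpleFoam -parallel > log.baseline 2>&1  # completes with the stock BiCGStab/PCG settings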

Change system/fvSolution to:

solvers
{
/*
    p
    {
        solver           PCG;
        preconditioner   DIC;
        tolerance        1e-06;
        relTol           0.01;
    }
*/
    p
    {
        solver          GAMG;
        tolerance       1e-06;
        relTol          0.01;
        smoother        GaussSeidel;
        cacheAgglomeration true;
        nCellsInCoarsestLevel 10;
        agglomerator    faceAreaPair;
        mergeLevels     1;
    }
........

With this change the case will only run in serial mode; when started with:

mpirun -np 4 simpleFoam -parallel

it will fail with the error message above.
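To make the failure easy to spot, the parallel run of the modified case can be wrapped like this (a sketch; the log file name is arbitrary):

mpirun -np 4 simpleFoam -parallel > log.gamg 2>&1 || true     # run is expected to abort
grep -q "MPI_ERR_TRUNCATE" log.gamg && echo "GAMG parallel run hit MPI_ERR_TRUNCATE"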

Discussion

  • Torsten Schenkel

    Apologies, I've just seen that this is a duplicate of ticket #34. I'll add the post there; please delete this.

     
  • Hrvoje Jasak

    Hrvoje Jasak - 2019-11-20
    • status: open --> closed
     
  • Hrvoje Jasak

    Hrvoje Jasak - 2019-11-20

    commit 196399ef6ac61aeff40f752daf53b83d59191171

     
  • Thomas

    Thomas - 2020-04-15

    The bug persists in foam-extend-4.1_amd64_Ubuntu1804_5832cfe25.deb.

     
  • Hrvoje Jasak

    Hrvoje Jasak - 2020-04-16

    Cannot reproduce - see #34

    Are you sure you know what you are doing?

    git log
    commit f2c557318fe73d31f2784b614998e85e0fd79761
    Author: Hrvoje Jasak h.jasak@wikki.co.uk
    Date: Wed Apr 8 18:23:43 2020 +0100

    git branch
    * master

    pwd
    foam-extend-4.1/tutorials/incompressible/simpleFoam/motorBike

    Starting time loop

    Time = 1

    smoothSolver: Solving for Ux, Initial residual = 1, Final residual = 0.04761956263, No Iterations 4
    smoothSolver: Solving for Uy, Initial residual = 0, Final residual = 0, No Iterations 0
    smoothSolver: Solving for Uz, Initial residual = 0, Final residual = 0, No Iterations 0
    GAMG: Solving for p, Initial residual = 1, Final residual = 0.06564726748, No Iterations 3
    GAMG: Solving for p, Initial residual = 0.1653323196, Final residual = 0.01239828005, No Iterations 1
    GAMG: Solving for p, Initial residual = 0.02296662651, Final residual = 0.002109357219, No Iterations 2
    time step continuity errors : sum local = 0.0008505700774, global = 6.167094958e-05, cumulative = 6.167094958e-05
    smoothSolver: Solving for omega, Initial residual = 0.02312237873, Final residual = 0.0009182389728, No Iterations 3
    bounding omega, min: -1005.354836 max: 18192.56711 average: 262.3292888
    smoothSolver: Solving for k, Initial residual = 1, Final residual = 0.05731544116, No Iterations 3
    ExecutionTime = 1.35 s ClockTime = 1 s

    This is not the best solver I ever wrote, so I don't use it every day, but as you can see it works without problems. Not sure what to do next.

     
  • Torsten Schenkel

    Of course it works on the git master branch, since you fixed the problem there (commit 196399ef6ac61aeff40f752daf53b83d59191171).
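    For source builds, whether a given checkout already contains that fix can be checked with something like (a sketch, assuming a git clone of the foam-extend-4.1 repository):

    git merge-base --is-ancestor 196399ef6ac61aeff40f752daf53b83d59191171 HEAD \
        && echo "fix commit is included in this checkout" \
        || echo "fix commit is missing from this checkout"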

    Thomas has pointed out that the bug is still present in the .deb file in the downloads section, since that has not been updated.

    Grumpy?

     
