
#2 cartesianMesh mpi errors

Milestone: 1.0
Status: closed
Updated: 2015-12-16
Created: 2015-12-06
Creator: Vyacheslav
Private: No

cartesianMesh frequently fails with MPI errors when run in parallel.

1 Attachment

Discussion

  • Vyacheslav

    Vyacheslav - 2015-12-06

    Here is one of the most common MPI errors:
    [cof:1346] *** An error occurred in MPI_Bsend
    [cof:1346] *** on communicator MPI_COMM_WORLD
    [cof:1346] *** MPI_ERR_BUFFER: invalid buffer pointer
    [cof:1346] *** MPI_ERRORS_ARE_FATAL: your MPI job will now abort
    [0] #0 Foam::error::printStack(Foam::Ostream&)
    --------------------------------------------------------------------------
    mpirun has exited due to process rank 1 with PID 1345 on
    node cof exiting improperly. There are two reasons this could occur:

    1. this process did not call "init" before exiting, but others in
      the job did. This can cause a job to hang indefinitely while it waits
      for all processes to call "init". By rule, if one process calls "init",
      then ALL processes must call "init" prior to termination.

    2. this process called "init", but exited without calling "finalize".
      By rule, all processes that call "init" MUST call "finalize" prior to
      exiting or it will be considered an "abnormal termination"

    This may have caused other processes in the application to be
    terminated by signals sent by mpirun (as reported here).

  • Franjo Juretic

    Franjo Juretic - 2015-12-16

    Hello,

    Thank you for reporting the problem. The solution is available in the development branch.

    Regards,

    Franjo

  • Franjo Juretic

    Franjo Juretic - 2015-12-16
    • status: open --> closed
    • assigned_to: Franjo Juretic
