pympi-users Mailing List for MPI Python (Page 5)
Status: Alpha
Brought to you by: patmiller
Message counts by month:

| Year | Jan | Feb | Mar | Apr | May | Jun | Jul | Aug | Sep | Oct | Nov | Dec |
|------|-----|-----|-----|-----|-----|-----|-----|-----|-----|-----|-----|-----|
| 2003 |     |     | (1) |     |     |     |     |     |     |     |     |     |
| 2004 |     |     |     |     | (1) |     |     |     |     | (2) | (3) | (8) |
| 2005 | (9) | (4) | (3) | (1) |     | (2) | (16) | (11) | (10) |   |     |     |
| 2006 |     | (4) |     |     |     | (5) | (7) | (2) |     |     | (7) | (4) |
| 2007 |     |     | (1) | (4) | (4) | (3) |     |     |     |     |     |     |
| 2008 |     |     |     |     |     |     |     |     |     | (2) |     |     |
| 2009 |     | (2) |     |     |     |     |     |     |     |     |     |     |
From: Julian C. <rjc...@cs...> - 2005-01-18 16:44:24
Today I noticed yet another Python-to-MPI binding released, as announced on the Daily Python URL ( www.pythonware.com/daily ). It was created by an engineering school in Argentina. They do mention the other MPI projects in the preamble to the documentation, but curiously, they list pyMPI as not being interactive. Obviously the word curious is a bit of an understatement here, since the pyMPI docs rightly emphasize its interactive abilities.
From: Pat M. <pat...@ll...> - 2004-12-10 16:12:06
First of all, thanks for using pyMPI -- I truly love Python and parallel programming, so it is nice to be able to work on both at the same time. -- Pat

********************************************************

Greetings all... I hadn't noticed that anyone was actually signed into the pympi-users list, and I had not actually subscribed myself way back when the list was created [oops!]. I only became [re]aware of its existence when a large message bounced into the moderator's [my] box. So, anyway, now at least I'm lurking on the list and can help with any problems.

********************************

Now to actually give some useful advice... There are two easy ways to spread information in pyMPI. The first is bcast(). This sends the same information to all processors. This typically looks like:

    if mpi.rank == 0:
        xyz = <<< do some work here >>>
        info = mpi.bcast(xyz)
    else:
        info = mpi.bcast()

The other typical way is to use scatter(). This is more appropriate for arrays and lists...

    if mpi.rank == 0:
        original_A = <<< some work >>>
        A = mpi.scatter(original_A)
    else:
        A = mpi.scatter()

In the above, each rank gets a [nearly] equal piece of the array.

---- %< -----------------------------------

The flip side of spreading stuff is to bring it back together. One way is to use reduce (or allreduce). Say you want the average of a value spread over the ranks...

    avg = mpi.allreduce(x, mpi.SUM) / mpi.size

If you want to concatenate a bunch of lists spread over the ranks, use gather (or allgather).

    big_list = mpi.gather(small_list)  # Note: big_list is None on rank 1, 2, 3, ...

If you want one item from each rank, do

    all_items = mpi.gather([item])

Pat

--
Pat Miller | (925) 423-0309 | http://www.llnl.gov/CASC/people/pmiller

The most dangerous of all falsehoods is a slightly distorted truth. -- G.C. Lichtenberg (1742-1799)
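The snippets above compose directly. As a concrete illustration, here is a minimal script chaining bcast, scatter, allreduce, and gather exactly as Pat describes them; it is a sketch assuming a working pyMPI build, and the script name and data are invented for the example (pyMPI of this era runs Python 2.3, hence the print statements):

    # collectives_demo.py -- hypothetical demo of the idioms above.
    # Run under pyMPI, e.g.:  mpirun -np 4 pyMPI collectives_demo.py
    import mpi

    # bcast: rank 0 produces a value, every rank receives a copy
    if mpi.rank == 0:
        info = mpi.bcast("hello from rank 0")
    else:
        info = mpi.bcast()

    # scatter: rank 0 owns the full list, each rank gets a [nearly] equal piece
    if mpi.rank == 0:
        A = mpi.scatter(range(16))
    else:
        A = mpi.scatter()

    # allreduce: average a per-rank value over all ranks
    x = float(sum(A))
    avg = mpi.allreduce(x, mpi.SUM) / mpi.size

    # gather: concatenate the pieces back together (None on ranks 1, 2, 3, ...)
    big_list = mpi.gather(A)
    if mpi.rank == 0:
        print "info:", info
        print "average per-rank sum:", avg
        print "reassembled:", big_list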
From: Nils W. <nw...@me...> - 2004-12-10 07:33:21
Pat Miller wrote:

> There are weirdnesses in how mpirun handles console I/O
> (e.g. does isatty() return true or false). There are
> a couple of configuration controls that you can set
> to get interactive working right...
>
> % ./configure --with-isatty=yes --with-prompt-nl=yes
>
> Try one or the other or both.
>
> Cheers,
>
> Pat

Pat,

I tried all possible combinations of configuration flags, but it failed again. However, if I reduce the number of processors np to one, it works.

    /home/nwagner> mpirun -v -np 1 pyMPI
    running /home/nwagner/pyMPI on 1 LINUX ch_p4 processors
    Created /home/nwagner/PI9893
    Python 2.3+ (pyMPI 2.0b0) on linux2
    Type "help", "copyright", "credits" or "license" for more information.
    >>>

What is going on there?

Nils

> --- Nils Wagner <nw...@me...> wrote:
>
> > Password:
> > Python 2.3+ (pyMPI 2.0b0) on linux2
> > Type "help", "copyright", "credits" or "license" for more information.
> >
> > get stuck without a prompt.
> >
> > For what reason ??
> >
> > Nils
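The isatty() behavior Pat alludes to is easy to probe for yourself: Python's sys.stdin.isatty() reports whether the launcher left stdin attached to a terminal, and mpirun often redirects console I/O so that it does not. A tiny hypothetical probe script (not part of pyMPI) makes the difference visible when run directly versus under mpirun:

    # isatty_probe.py -- hypothetical helper, not part of pyMPI.
    # Compare "python isatty_probe.py" (run directly) with
    # "mpirun -np 2 pyMPI isatty_probe.py"; if the launcher redirects
    # console I/O, stdin/stdout stop looking like a terminal and an
    # interactive interpreter may suppress its prompt.
    import sys
    print "stdin  isatty:", sys.stdin.isatty()
    print "stdout isatty:", sys.stdout.isatty()

If the values flip to False under mpirun, the --with-isatty and --with-prompt-nl switches quoted above are the knobs pyMPI provides to compensate.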
From: Mike S. <st...@gm...> - 2004-12-10 04:34:03
> Your test works fine for me
>
> /home/nwagner> mpicc mpi_test.c
> /home/nwagner> mpirun -np 2 a.out
> Password:
> I am rank 0
> I am rank 1
> /home/nwagner>
>
> but
>
> /home/nwagner> mpirun -v -np 2 pyMPI
> running /home/nwagner/pyMPI on 2 LINUX ch_p4 processors
> Created /home/nwagner/PI29497
> Password:
> Python 2.3+ (pyMPI 2.0b0) on linux2
> Type "help", "copyright", "credits" or "license" for more information.
>
> get stuck without a prompt.

Please reconfigure pyMPI with one or both of the following options:

    --with-isatty
    --with-prompt-nl

That should give you back the interactive prompt.

~Mike
From: Nils W. <nw...@me...> - 2004-12-09 16:04:42
Patrick Miller wrote:

> Short answer: Try % env CC=mpicc ./configure
>
> Often, the simplest way to get pyMPI to work is to find the right
> version of mpicc. MPICH and LAM and other MPI libraries often
> define this script as a way to simplify the often complex set of
> library and include files needed to link.
>
> The name is typically something like mpicc mpcc mpigcc mpiicc or
> you can use an MPI C++ compiler like mpiCC mpiiCC.
>
> First use this test program to make sure it works right...
>
> % cat test.c
> #include <mpi.h>
> #include <stdio.h>
>
> int main(int argc, char** argv) {
>     int rank;
>     MPI_Init(&argc, &argv);
>     MPI_Comm_rank(MPI_COMM_WORLD, &rank);
>     printf("I am rank %d\n", rank);
>     MPI_Finalize();
>     return 0;
> }
>
> % mpicc test.c
> % mpirun -np 4 a.out
> I am rank 3
> I am rank 1
> I am rank 0
> I am rank 2
>
> Once you have verified a working MPI C compiler, just tell
> the configure script. An easy way to do that is to set
> an environment variable.
>
> % env CC=mpicc ./configure --prefix=$HOME/pub
>
> Cheers,
>
> Pat

Your test works fine for me:

    /home/nwagner> mpicc mpi_test.c
    /home/nwagner> mpirun -np 2 a.out
    Password:
    I am rank 0
    I am rank 1
    /home/nwagner>

but

    /home/nwagner> mpirun -v -np 2 pyMPI
    running /home/nwagner/pyMPI on 2 LINUX ch_p4 processors
    Created /home/nwagner/PI29497
    Password:
    Python 2.3+ (pyMPI 2.0b0) on linux2
    Type "help", "copyright", "credits" or "license" for more information.

gets stuck without a prompt. For what reason?

Nils
From: Nils W. <nw...@me...> - 2004-12-09 15:24:32
Hi all,

make check yields

    snip
    [0] MPI Abort by user Aborting program !
    [0] Aborting program!
    p0_28324: p4_error: : 1
    FAIL: micro_tests/pyMPI_comm_misc_011
    pyMPI_comm_misc_012
    [0] MPI Abort by user Aborting program !
    [0] Aborting program!
    p0_28329: p4_error: : 77
    FAIL: micro_tests/pyMPI_comm_misc_012
    snip
    =======================
    2 of 87 tests failed
    (18 tests were not run)
    =======================
    make[1]: *** [check-TESTS] Error 1
    make[1]: Leaving directory `/usr/local/src/pyMPI-2.0b0'
    make: *** [check-am] Error 2

Any comments?

Nils
From: Mike S. <st...@gm...> - 2004-12-09 00:25:31
Nils,

I find that this is most easily resolved by simply specifying the name of your C compiler. For instance, you can set the environment variable CC for a single command by doing the following:

    $ CC=/opt/mpich/bin/mpicc ./configure --with-python=python2.3

Otherwise you could try adding the --with-libs option and specifying where your mpich libraries are, in addition to specifying the include path. However, I find that simply setting CC is much easier and reproducible across systems. Given that mpicc is usually already in the user's path, they can just do

    $ CC=mpicc ./configure

and expect the configure to work.

One other thing worth mentioning: if you want to use pyMPI interactively you may need to specify --with-isatty and/or --with-prompt-nl at configure time. Keep this in mind if you run into problems with interactive mode.

~Mike

On Wed, 08 Dec 2004 10:13:01 -0800, pym...@li... <pym...@li...> wrote:
> Date: Wed, 08 Dec 2004 15:02:02 +0100
> From: Nils Wagner <nw...@me...>
> Organization: Institute A of Mechanics
> To: pym...@li...
> Subject: [Pympi-users] WARNING: MPI must need some more libraries...
>
> Hi all,
>
> I am going to install pyMPI-2.0b0.
>
> ./configure --with-includes=/opt/mpich/include --with-python=python2.3
>
> yields a warning
>
> configure: WARNING: MPI must need some more libraries... Look at
> config.log. You may need to add --with-libs info
>
> Which additional libraries must be included ?
>
> Any pointer would be appreciated.
>
> Nils
>
> A make in /var/tmp/pyMPI-2.0b0 yields
>
> snip
From: Patrick M. <pnm...@pa...> - 2004-12-08 18:21:47
Short answer: Try

    % env CC=mpicc ./configure

Often, the simplest way to get pyMPI to work is to find the right version of mpicc. MPICH and LAM and other MPI libraries often define this script as a way to simplify the often complex set of library and include files needed to link.

The name is typically something like mpicc, mpcc, mpigcc, or mpiicc, or you can use an MPI C++ compiler like mpiCC or mpiiCC.

First use this test program to make sure it works right...

    % cat test.c
    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char** argv) {
        int rank;
        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        printf("I am rank %d\n", rank);
        MPI_Finalize();
        return 0;
    }

    % mpicc test.c
    % mpirun -np 4 a.out
    I am rank 3
    I am rank 1
    I am rank 0
    I am rank 2

Once you have verified a working MPI C compiler, just tell the configure script. An easy way to do that is to set an environment variable.

    % env CC=mpicc ./configure --prefix=$HOME/pub

Cheers,

Pat
From: Nils W. <nw...@me...> - 2004-12-08 14:02:13
Hi all,

I am going to install pyMPI-2.0b0.

    ./configure --with-includes=/opt/mpich/include --with-python=python2.3

yields a warning

    configure: WARNING: MPI must need some more libraries... Look at
    config.log. You may need to add --with-libs info

Which additional libraries must be included?

Any pointer would be appreciated.

Nils

A make in /var/tmp/pyMPI-2.0b0 yields

    snip
    /var/tmp/pyMPI-2.0b0/initmpi.c:724: undefined reference to `MPI_Error_string'
    initmpi.o(.text+0x25b): In function `mpi_trace':
    /var/tmp/pyMPI-2.0b0/initmpi.c:889: undefined reference to `MPI_Errhandler_get'
    initmpi.o(.text+0x266):/var/tmp/pyMPI-2.0b0/initmpi.c:889: undefined reference to `MPI_Errhandler_set'
    initmpi.o(.text+0x276):/var/tmp/pyMPI-2.0b0/initmpi.c:889: undefined reference to `MPI_Comm_rank'
    initmpi.o(.text+0x288):/var/tmp/pyMPI-2.0b0/initmpi.c:889: undefined reference to `MPI_Errhandler_set'
    initmpi.o(.text+0x37c):/var/tmp/pyMPI-2.0b0/initmpi.c:907: undefined reference to `MPI_Comm_rank'
    initmpi.o(.text+0x3a2):/var/tmp/pyMPI-2.0b0/initmpi.c:907: undefined reference to `MPI_Error_string'
    initmpi.o(.text+0x51d): In function `wtick':
    /var/tmp/pyMPI-2.0b0/initmpi.c:846: undefined reference to `MPI_Wtick'
    initmpi.o(.text+0x64d): In function `wtime':
    /var/tmp/pyMPI-2.0b0/initmpi.c:871: undefined reference to `MPI_Wtime'
    initmpi.o(.text+0x759): In function `mpiReady':
    /var/tmp/pyMPI-2.0b0/initmpi.h:257: undefined reference to `MPI_Initialized'
    snip [hundreds more "undefined reference" lines of the same form follow,
    covering MPI_* symbols (MPI_Bcast, MPI_Send, MPI_Recv, MPI_Wait, MPI_Pack,
    MPI_Unpack, MPI_Comm_size, MPI_Comm_dup, MPI_Cart_create, ...) throughout
    initmpi.o, mpicomm.o, mpistatus.o, mpifile.o, mpirequest.o, mpicart.o,
    commhelpers.o, comm_point_to_point.o, and comm_collective.o]
comm_collective.o(.text+0x5ad):/var/tmp/pyMPI-2.0b0/comm_collective.c:1025: undefined reference to `MPI_Bcast' comm_collective.o(.text+0x5df):/var/tmp/pyMPI-2.0b0/comm_collective.c:1025: undefined reference to `MPI_Bcast' comm_collective.o(.text+0x608):/var/tmp/pyMPI-2.0b0/comm_collective.c:1025: undefined reference to `MPI_Error_string' comm_collective.o(.text+0x7a2): In function `native_collective_reduce': /var/tmp/pyMPI-2.0b0/comm_collective.c:955: undefined reference to `MPI_Errhandler_get' comm_collective.o(.text+0x7af):/var/tmp/pyMPI-2.0b0/comm_collective.c:955: undefined reference to `MPI_Errhandler_set' comm_collective.o(.text+0x7df):/var/tmp/pyMPI-2.0b0/comm_collective.c:955: undefined reference to `MPI_Reduce' comm_collective.o(.text+0x7f4):/var/tmp/pyMPI-2.0b0/comm_collective.c:955: undefined reference to `MPI_Errhandler_set' comm_collective.o(.text+0x908):/var/tmp/pyMPI-2.0b0/comm_collective.c:967: undefined reference to `MPI_Reduce' comm_collective.o(.text+0x965):/var/tmp/pyMPI-2.0b0/comm_collective.c:944: undefined reference to `MPI_Errhandler_get' comm_collective.o(.text+0x972):/var/tmp/pyMPI-2.0b0/comm_collective.c:944: undefined reference to `MPI_Errhandler_set' comm_collective.o(.text+0x9a2):/var/tmp/pyMPI-2.0b0/comm_collective.c:944: undefined reference to `MPI_Reduce' comm_collective.o(.text+0x9b7):/var/tmp/pyMPI-2.0b0/comm_collective.c:944: undefined reference to `MPI_Errhandler_set' comm_collective.o(.text+0xa13):/var/tmp/pyMPI-2.0b0/comm_collective.c:946: undefined reference to `MPI_Error_string' comm_collective.o(.text+0xa7a):/var/tmp/pyMPI-2.0b0/comm_collective.c:946: undefined reference to `MPI_Reduce' comm_collective.o(.text+0xc05): In function `native_collective_allreduce': /var/tmp/pyMPI-2.0b0/comm_collective.c:883: undefined reference to `MPI_Errhandler_get' comm_collective.o(.text+0xc15):/var/tmp/pyMPI-2.0b0/comm_collective.c:883: undefined reference to `MPI_Errhandler_set' comm_collective.o(.text+0xc40):/var/tmp/pyMPI-2.0b0/comm_collective.c:883: undefined reference to `MPI_Allreduce' comm_collective.o(.text+0xc58):/var/tmp/pyMPI-2.0b0/comm_collective.c:883: undefined reference to `MPI_Errhandler_set' comm_collective.o(.text+0xcbe):/var/tmp/pyMPI-2.0b0/comm_collective.c:898: undefined reference to `MPI_Error_string' comm_collective.o(.text+0xd45):/var/tmp/pyMPI-2.0b0/comm_collective.c:897: undefined reference to `MPI_Allreduce' comm_collective.o(.text+0x1612): In function `pympi_collective_bcast_implementation': /var/tmp/pyMPI-2.0b0/comm_collective.c:125: undefined reference to `MPI_Errhandler_get' comm_collective.o(.text+0x161f):/var/tmp/pyMPI-2.0b0/comm_collective.c:125: undefined reference to `MPI_Errhandler_set' comm_collective.o(.text+0x1632):/var/tmp/pyMPI-2.0b0/comm_collective.c:125: undefined reference to `MPI_Bcast' comm_collective.o(.text+0x1647):/var/tmp/pyMPI-2.0b0/comm_collective.c:125: undefined reference to `MPI_Errhandler_set' comm_collective.o(.text+0x167c):/var/tmp/pyMPI-2.0b0/comm_collective.c:130: undefined reference to `MPI_Errhandler_get' comm_collective.o(.text+0x1689):/var/tmp/pyMPI-2.0b0/comm_collective.c:130: undefined reference to `MPI_Errhandler_set' comm_collective.o(.text+0x16ad):/var/tmp/pyMPI-2.0b0/comm_collective.c:130: undefined reference to `MPI_Unpack' comm_collective.o(.text+0x16c2):/var/tmp/pyMPI-2.0b0/comm_collective.c:130: undefined reference to `MPI_Errhandler_set' comm_collective.o(.text+0x1717):/var/tmp/pyMPI-2.0b0/comm_collective.c:137: undefined reference to `MPI_Errhandler_get' 
comm_collective.o(.text+0x1724):/var/tmp/pyMPI-2.0b0/comm_collective.c:137: undefined refer... [truncated message content] |
From: Julian C. <rjc...@cs...> - 2004-11-27 13:39:45
|
Hi Mike,

I did everything you said and it works just like you said it would!

1. I checked out the latest cvs (version 2.1b4 as of 26Nov04), i.e.
   cvs -d:pserver:ano...@cv...:/cvsroot/pympi checkout pyMPI
2. ran ./boot
3. ran ./configure --with-isatty --with-prompt-nl
4. ran make
5. ran make install
6. ran /opt/mpich/bin/mpirun -np 2 -machinefile ../machines.LINUX /usr/local/bin/pyMPI

When you run interactive pyMPI, the extra carriage return after the prompt will appear before the last node replies, i.e.

>>> print mpi.rank
0
>>> 1

As long as you know what's going on, it works fine. tks for your help,

-----Original Message-----
From: pym...@li... [mailto:pym...@li...] On Behalf Of Mike Steder
Sent: Wednesday, November 24, 2004 1:34 AM
To: pym...@li...
Subject: [Pympi-users] Re: Never get to the python prompt
|
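If the interleaved prompt output above ever becomes a nuisance, one workaround is to let rank 0 do all the printing. A minimal sketch, assuming only the mpi.gather and mpi.rank behavior described elsewhere on this list (gather returns the combined list on rank 0 and None on the other ranks):

    import mpi

    # Each rank builds its report locally; only rank 0 prints, so remote
    # output cannot interleave with the interactive prompt.
    line = 'rank %d of %d checking in' % (mpi.rank, mpi.size)
    lines = mpi.gather([line])   # combined list on rank 0, None elsewhere
    if mpi.rank == 0:
        for text in lines:
            print text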
From: Mike S. <st...@gm...> - 2004-11-24 06:33:34
|
Hi Julian,

Sorry for the late response.

> I'm currently having a problem starting up pyMPI on a simple 2 node test
> setup
>
> 1. mpirun works correctly across 2+ nodes
> 2. pyMPI runs correctly if started up in single cpu mode, with -np set to 1.
>    This works on both nodes.
> 3. pyMPI never gets to the >>> prompt, if I run it with -np set to 2.
> 4. I added printf statements to the pyMPI initialisation and it appears that
>    both nodes complete the MPI portion (because I get duplicate printfs). The
>    first (local) node then enters Py_Main and displays the credits, but the
>    second node enters Py_Main and never displays anything (assuming it should).
> 5. Result is that the process waits indefinitely for something to occur and
>    never displays the >>> prompt.
> 6. I am able to ssh to the second node and run pyMPI with -np set to 1.

I was having identical problems. After contacting Pat he pointed out to me a configure option worth trying. Apparently the problem is related to I/O buffers not being properly flushed for some reason. We can enforce/improve this behavior by adding some options to our configure.

However, I would first recommend that you go ahead and check out the latest version of pyMPI from CVS. If you're unfamiliar with how to do that, check out this page: http://sourceforge.net/cvs/?group_id=5335

Specifically you want to execute the following:

$ cvs -d:pserver:ano...@cv...:/cvsroot/pympi login
$ cvs -d:pserver:ano...@cv...:/cvsroot/pympi checkout pyMPI

Assuming you execute these commands from /home/cook/ it will create a directory /home/cook/pyMPI with the most recent sources. Now that you've got fresh sources you need to bootstrap the CVS distribution by doing (from the pyMPI directory):

$ ./boot

And then configure however you would normally, but add either "--with-isatty" or "--with-isatty --with-prompt-nl". For instance:

$ CC=mpicc ./configure --prefix=/usr --with-isatty --with-prompt-nl
$ make
$ ./pyMPI

There is a minor formatting glitch with this version, which should be resolved before a stable tarball is released, in which the prompt begins on the line below the >>>, like

>>>
[]<-- cursor

Hopefully these changes will be rolled into the latest version 2.0b4 in the next few days. I would suggest going with CVS rather than waiting, however. Please let us know if you still have problems with the newer version with the --with-isatty flag on. Heck, let us know if it works for you now as well.

Good luck, keep us posted,
~Mike
|
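A quick way to confirm that a rebuild like the one above actually works in parallel is a non-interactive smoke test. A minimal sketch; hello.py is a hypothetical file name, and the mpirun line assumes the MPICH setup used elsewhere in this thread:

    # hello.py -- run with something like:
    #   mpirun -np 2 /usr/local/bin/pyMPI hello.py
    import mpi

    # Every rank should report in; with -np 2 you expect two lines of output.
    print 'Hello from rank %d of %d' % (mpi.rank, mpi.size)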
From: Julian C. <rjc...@cs...> - 2004-11-11 20:19:46
|
I'm currently having a problem starting up pyMPI on a simple 2 node test setup:

1. mpirun works correctly across 2+ nodes
2. pyMPI runs correctly if started up in single cpu mode, with -np set to 1. This works on both nodes.
3. pyMPI never gets to the >>> prompt, if I run it with -np set to 2.
4. I added printf statements to the pyMPI initialisation and it appears that both nodes complete the MPI portion (because I get duplicate printfs). The first (local) node then enters Py_Main and displays the credits, but the second node enters Py_Main and never displays anything (assuming it should).
5. Result is that the process waits indefinitely for something to occur and never displays the >>> prompt.
6. I am able to ssh to the second node and run pyMPI with -np set to 1.

Setup:
Running pyMPI-2.0b0
Running suse 8.1 using mpich 1.x w/ssh, installed as precompiled binary from suse setup CD's

Is there anything I can look at to narrow down the problem?

tks
Julian Cook
|
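One way to narrow a hang like this down is to take the interactive console out of the picture and run a script that exercises a collective. A minimal sketch; count.py is a hypothetical file name, and it assumes only the mpi.allreduce/mpi.SUM calls shown elsewhere on this list:

    # count.py -- run with something like:
    #   mpirun -np 2 pyMPI count.py
    import mpi

    # Each rank contributes 1. If both ranks print "counted 2", MPI traffic
    # is fine and the hang is confined to interactive console I/O.
    total = mpi.allreduce(1, mpi.SUM)
    print 'rank %d counted %d ranks' % (mpi.rank, total)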
From: Jonathan H. <jh...@ne...> - 2004-10-19 04:34:28
|
Hi Barry,

Haven't seen this specific build problem before, but I have found that after running configure my 2.0b0 makefile had path errors in it that required hand editing to fix. I suggest taking a close look at your makefile.

Cheers,
Jon Hujsak

pym...@li... wrote:

>Message: 1
>From: Barry Rountree <rou...@cs...>
>To: pym...@li...
>Date: Mon, 18 Oct 2004 17:31:07 -0400
>Subject: [Pympi-users] pyMPI-2.0b0 build errors
|
From: Barry R. <rou...@cs...> - 2004-10-18 21:31:17
|
python v2.3.3, Mandrake 10.0, kernel 2.6.3-7mdk

./configure works fine (gzipped config.log attached)

make gives the following error:

gcc -pthread -o pyMPI_linker pyMPI_linker.o
cp pyMPI pyMPI2.0
cp pyMPI_linker pyMPI2.0_linker
rm -f augmentedMakefile
/usr/bin/python2.3 ./utils/editMakefile.py < /usr/lib/python2.3/config/Makefile CC /usr/bin/mpicc | \
    /usr/bin/python2.3 ./utils/editMakefile.py CXX /usr/bin/mpiCC | \
    /usr/bin/python2.3 ./utils/editMakefile.py OPT -DHAVE_MPI -I/usr/include/python2.3 -g -O2 -DNDEBUG -O2 -fomit-frame-pointer -pipe -march=i586 -mcpu=pentiumpro -g | \
    /usr/bin/python2.3 ./utils/editMakefile.py LIBS `echo -lm -L/usr/lib/python2.3/config -lpython2.3 -Xlinker -export-dynamic -lpthread -ldl -lutil | tr ' ' '\012' | egrep '^-(l|L)'` | \
    /usr/bin/python2.3 ./utils/editMakefile.py LINKFORSHARED -Xlinker -export-dynamic | \
    /usr/bin/python2.3 ./utils/editMakefile.py LDSHARED /usr/bin/pyMPI2.0_linker gcc -pthread -shared | \
    /usr/bin/python2.3 ./utils/editMakefile.py BLDSHARED /usr/bin/pyMPI2.0_linker gcc -pthread -shared \
    >> augmentedMakefile
make[1]: Leaving directory `/opt/pyMPI-2.0b0'
[rountree@shuttle pyMPI-2.0b0]$

make clean, then python setup.py build gives the following error:

gcc -pthread -o pyMPI_linker pyMPI_linker.o
cp pyMPI pyMPI2.0
cp pyMPI_linker pyMPI2.0_linker
rm -f augmentedMakefile
/usr/bin/python2.3 ./utils/editMakefile.py < /usr/lib/python2.3/config/Makefile CC /usr/bin/mpicc | \
    /usr/bin/python2.3 ./utils/editMakefile.py CXX /usr/bin/mpiCC | \
    /usr/bin/python2.3 ./utils/editMakefile.py OPT -DHAVE_MPI -I/usr/include/python2.3 -g -O2 -DNDEBUG -O2 -fomit-frame-pointer -pipe -march=i586 -mcpu=pentiumpro -g | \
    /usr/bin/python2.3 ./utils/editMakefile.py LIBS `echo -lm -L/usr/lib/python2.3/config -lpython2.3 -Xlinker -export-dynamic -lpthread -ldl -lutil | tr ' ' '\012' | egrep '^-(l|L)'` | \
    /usr/bin/python2.3 ./utils/editMakefile.py LINKFORSHARED -Xlinker -export-dynamic | \
    /usr/bin/python2.3 ./utils/editMakefile.py LDSHARED /usr/bin/pyMPI2.0_linker gcc -pthread -shared | \
    /usr/bin/python2.3 ./utils/editMakefile.py BLDSHARED /usr/bin/pyMPI2.0_linker gcc -pthread -shared \
    >> augmentedMakefile
make[1]: Leaving directory `/opt/pyMPI-2.0b0'
running build
running build_ext
building 'mpi' extension
gcc -pthread -fno-strict-aliasing -DNDEBUG -O2 -fomit-frame-pointer -pipe -march=i586 -mcpu=pentiumpro -g -fPIC -I/usr/include/python2.3 -c pyMPI_softload.c -o build/temp.linux-i686-2.3/pyMPI_softload.o
gcc: pyMPI_softload.c: No such file or directory
gcc: no input files
error: command 'gcc' failed with exit status 1
[rountree@shuttle pyMPI-2.0b0]$

Ideas?

Thanks,

Barry Rountree
|
From: <ben...@id...> - 2004-05-22 13:45:07
|
Dear Open Source developer

I am doing a research project on "Fun and Software Development" in which I kindly invite you to participate. You will find the online survey under http://fasd.ethz.ch/qsf/. The questionnaire consists of 53 questions and you will need about 15 minutes to complete it.

With the FASD project (Fun and Software Development) we want to define the motivational significance of fun when software developers decide to engage in Open Source projects. What is special about our research project is that a similar survey is planned with software developers in commercial firms. This procedure allows the immediate comparison between the involved individuals and the conditions of production of these two development models. Thus we hope to obtain substantial new insights into the phenomenon of Open Source Development.

With many thanks for your participation,
Benno Luthiger

PS: The results of the survey will be published under http://www.isu.unizh.ch/fuehrung/blprojects/FASD/. We have set up the mailing list fa...@we... for this study. Please see http://fasd.ethz.ch/qsf/mailinglist_en.html for registration to this mailing list.

_______________________________________________________________________
Benno Luthiger
Swiss Federal Institute of Technology Zurich
8092 Zurich
Mail: benno.luthiger(at)id.ethz.ch
_______________________________________________________________________
|
From: <oli...@en...> - 2003-03-19 19:53:53
|
Hello,

I'd like to perform non-blocking asynchronous communications between different MPI processes. 0 sends some message to 1 and 2 at given times while 1 and 2 are busy (doing some time-consuming operation like sleeping, for instance ;). However, they check what is in their mail box from time to time. To achieve this task I used mpi.isend and mpi.irecv, but according to the following I must have made a mistake somewhere, because none of the sent messages ever arrives at 1 or 2. What have I done wrong?

Thanks
Olivier

PS: Here is the sample program testMPI.py:

<quote>
#!/usr/bin/env python
import mpi, time

if mpi.rank == 0:
    print '** 0 : sleeping ...'
    time.sleep(3.0)
    print '** 0 : sending hello to %d ...'%1
    request1 = mpi.isend('Hello 1', 1)
    print '** 0 : sleeping ...'
    time.sleep(4.0)
    print '** 0 : sending hello to %d ...'%2
    request2 = mpi.isend('Hello 2', 2)
    while (not request1) or (not request2):
        pass
    print '** 0 : sequence finished'
else:
    msg = None
    while msg == None:
        print '**', mpi.rank, ': sleeping 1s ...'
        time.sleep(1.0)
        print '**', mpi.rank, ': receiving...'
        reception = mpi.irecv(0)
        if reception:
            msg = reception.message
    print '**', mpi.rank, ':', 'received :', msg
</quote>

And here is the output (interrupted by a keyboard ^C stroke):

sync17% mpirun -np 3 -machinefile machines pyMPI testMPI.py
** 1 : sleeping 1s ...
** 0 : sleeping ...
** 2 : sleeping 1s ...
** 1 : receiving...
** 1 : sleeping 1s ...
** 2 : receiving...
** 2 : sleeping 1s ...
** 1 : receiving...
** 1 : sleeping 1s ...
** 2 : receiving...
** 2 : sleeping 1s ...
** 1 : receiving...
** 1 : sleeping 1s ...
** 2 : receiving...
** 2 : sleeping 1s ...
** 0 : sending hello to 1 ...
** 0 : sleeping ...
** 1 : receiving...
** 1 : sleeping 1s ...
** 2 : receiving...
** 2 : sleeping 1s ...
** 1 : receiving...
** 1 : sleeping 1s ...
** 2 : receiving...
** 2 : sleeping 1s ...
** 1 : receiving...
** 1 : sleeping 1s ...
** 2 : receiving...
** 2 : sleeping 1s ...
** 1 : receiving...
** 1 : sleeping 1s ...
** 2 : receiving...
** 2 : sleeping 1s ...
** 0 : sending hello to 2 ...
** 1 : receiving...
** 1 : sleeping 1s ...
** 2 : receiving...
** 2 : sleeping 1s ...
** 1 : receiving...
** 1 : sleeping 1s ...
** 2 : receiving...
** 2 : sleeping 1s ...
** 1 : receiving...
** 1 : sleeping 1s ...
** 2 : receiving...
** 2 : sleeping 1s ...
** 1 : receiving...
** 1 : sleeping 1s ...
** 2 : receiving...
** 2 : sleeping 1s ...
bm_list_20012: p4_error: interrupt SIGINT: 2
Killed by signal 2.
Killed by signal 2.
p0_20011: p4_error: interrupt SIGINT: 2
|
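A likely explanation, though the archive records no reply, is that the receive loop above posts a brand-new mpi.irecv on every pass and tests it immediately, so no single request is ever given time to complete. A minimal sketch of the post-once, poll-many pattern, using only the mpi.isend/mpi.irecv calls from the message above:

    import mpi, time

    if mpi.rank == 0:
        time.sleep(3.0)
        request1 = mpi.isend('Hello 1', 1)
        time.sleep(4.0)
        request2 = mpi.isend('Hello 2', 2)
        while (not request1) or (not request2):   # wait for both sends to complete
            pass
        print '** 0 : sequence finished'
    else:
        reception = mpi.irecv(0)      # post the receive once, outside the loop ...
        while not reception:          # ... then keep polling the same request
            print '**', mpi.rank, ': sleeping 1s ...'
            time.sleep(1.0)
        print '**', mpi.rank, ': received :', reception.message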