Re: [Pympi-users] Problem running pyMPI with OpenMPI, isatty compile problem
From: Julian C. <jul...@ya...> - 2006-07-18 19:11:53
Luigi
I have a slightly different version of this file; I pasted the entire file below. The difference appears to be that the __THROW portion is defined at the top instead.
You can try this version instead. Alternatively, you could make a more adventurous change and remove the HAVE_MPC_ISATTY section entirely, though you would need to understand which #if and #endif lines to remove.
Julian
#include "mpi.h"
#include "Python.h"
#include "pyMPI.h"
#include "pyMPI_Macros.h"
#ifdef HAVE_MPC_ISATTY
#include <pm_util.h>
#endif
#ifndef __THROW
#define __THROW
#endif
START_CPLUSPLUS
#ifdef HAVE_MPC_ISATTY
/**************************************************************************/
/* GLOBAL ************** isatty ************************/
/**************************************************************************/
/* Replacement for isatty() with correct results under AIX's POE */
/**************************************************************************/
int isatty(int filedes) __THROW {
  int status;

  /* ----------------------------------------------- */
  /* Do the isatty() work                            */
  /* ----------------------------------------------- */
  status = (mpc_isatty(filedes) == 1);
  return status;
}
#else
#if PYMPI_ISATTY
/**************************************************************************/
/* GLOBAL ************** isatty ************************/
/**************************************************************************/
/* Assume stdin,stdout,stderr are attached to a tty */
/**************************************************************************/
int isatty(int filedes) __THROW {
  return (filedes == 0 || filedes == 1 || filedes == 2);
}
#endif
#endif
END_CPLUSPLUS
----- Original Message ----
From: Luigi Paioro <lu...@la...>
To: Julian Cook <jul...@ya...>
Cc: pym...@li...
Sent: Tuesday, July 18, 2006 4:47:55 AM
Subject: Re: [Pympi-users] Problem running pyMPI with OpenMPI
> 2. It appears that your pympi build will actually run non-interactively,
> though I suggest you confirm it by creating a non-trivial script, such
> as the pi example and running it as a file:
>
> $ mpirun -np 3 pyMPI pi_test.py
It seems to work:
$ mpirun -np 3 pyMPI fractal.py
Starting computation (groan)
process 1 done with computation!!
process 2 done with computation!!
process 0 done with computation!!
Header length is 54
BMP size is (400, 400)
Data length is 480000
The output image looks right! For the time being I can test on only one
CPU, but 3 parallel processes started anyway.
> 3. If this runs with good output over all cpu's then the probable cause
> is the build; you need to add --isatty to the configure. There is an
> example of building on Solaris in the mailing list [2005] that discusses
> this. Also there are newline config flags that need to be considered.
Well, I've tried with these options:
$ CC=mpicc; ./configure -prefix=<inst path> --with-includes=-I<mpi
path>/include --with-isatty --with-prompt-nl
but I get this error:
mpicc -DHAVE_CONFIG_H -I. -I. -I. -I<mpi path>/include
-I/usr/include/python2.4 -g -O2 -g -O2 -c `test -f 'pyMPI_isatty.c' ||
echo './'`pyMPI_isatty.c
pyMPI_isatty.c:52: error: syntax error before '{' token
make[1]: *** [pyMPI_isatty.o] Error 1
make[1]: Leaving directory `<src path>/pyMPI-2.4b4'
make: *** [all] Error 2
> 4. If the above test doesn't work, you need to fall back to testing mpi
> itself, using the examples in the mpi installation. pympi is effectively
> an mpi program, so mpi itself must work for python to work.
MPI itself works!
Thank you.
Luigi