[Pympi-users] About Gather/Scatter
From: Luigi P. <lu...@la...> - 2006-07-18 09:28:43
This is a general question about how Gather and Scatter functions work.
I refer to the Miller 2002 paper, which says that a "simple way to achieve
parallelism is with gather/scatter parallelism. A scatter operation will
take a container, split it into equal (or nearly equal) parts that are
messaged to various slave tasks. A gather reverses that and collects
sub-containers together into one larger Python list."
This is the example code:
import mpi
import crypt

if mpi.rank == 0:
    words = open('/usr/dict/words').read().split()
else:
    words = []

local_words = mpi.scatter(words)

target = 'xxaGcwiAKoYgc'
for word in local_words:
    if crypt.crypt(word, target[:2]) == target:
        print 'the word is', word
        break
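
On the gather side, if I read the paper correctly, the per-rank results could be collected back with something like the sketch below. This is only my guess at the idiom: I am assuming mpi.gather takes the local list and hands the combined list back to rank 0, as the quoted description suggests.

import mpi
import crypt

if mpi.rank == 0:
    words = open('/usr/dict/words').read().split()
else:
    words = []

local_words = mpi.scatter(words)
target = 'xxaGcwiAKoYgc'

# each rank keeps the words that crack the target hash
matches = [w for w in local_words if crypt.crypt(w, target[:2]) == target]

# gather the per-rank sub-lists back into one larger list (assumption:
# this is what mpi.gather does, per the paper's description)
all_matches = mpi.gather(matches)
if mpi.rank == 0:
    print 'matches found:', all_matches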
I would like to understand whether:
a) the list is split into mpi.size groups and each group is sent to a parallel task, or
b) each list entry is sent, one after another, to the first parallel task that is free?
Example (np = 2):
words = ['Luigi', 'Dough', 'Patrick', 'Julian', 'James']
a)
words_0 = ['Luigi', 'Dough', 'Patrick'] --> to rank 0
words_1 = ['Julian', 'James'] --> to rank 1
b)
for word in words:
    word --> single list element sent to the first free rank
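
To make reading a) concrete, here is a small pure-Python sketch of the contiguous split I have in mind (chunk_split is only my own illustration, not anything from pyMPI):

def chunk_split(items, size):
    # split items into `size` nearly equal contiguous chunks, one per rank
    base, extra = divmod(len(items), size)
    chunks, start = [], 0
    for rank in range(size):
        length = base + (rank < extra)   # first `extra` ranks get one extra item
        chunks.append(items[start:start + length])
        start += length
    return chunks

words = ['Luigi', 'Dough', 'Patrick', 'Julian', 'James']
print chunk_split(words, 2)
# --> [['Luigi', 'Dough', 'Patrick'], ['Julian', 'James']]

Reading b), as I understand it, would instead be a dynamic master/worker dispatch, which is why I would like to know which of the two pyMPI actually does.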
Hope I'm clear.
Thanks.
Luigi