### Email Archive: pympi-users (read-only)

 [Pympi-users] About Gather/Scatter From: Luigi Paioro - 2006-07-18 09:28 ```This is a general question about how the Gather and Scatter functions work. I refer to the Miller 2002 paper, which says that a "simple way to achieve parallelism is with gather/scatter parallelism. A scatter operation will take a container, split it into equal (or nearly equal) parts that are messaged to various slave tasks. A gather reverses that and collects sub-containers together into one larger Python list."

This is the example code:

import mpi
import crypt

if mpi.rank == 0:
    words = open('/usr/dict/words').read().split()
else:
    words = []
local_words = mpi.scatter(words)
target = 'xxaGcwiAKoYgc'
for word in local_words:
    if crypt.crypt(word, target[:2]) == target:
        print 'the word is', word
        break

I would like to understand whether:

a) the list is split into mpi.size groups and each group is then sent to a parallel task, or
b) each list entry is sent, one at a time, to the first free parallel task?

Example (np=2):

local_words = ['Luigi', 'Dough', 'Patrick', 'Julian', 'James']

a) local_words_0 = ['Luigi', 'Dough', 'Patrick'] --> to rank 0
   local_words_1 = ['Julian', 'James'] --> to rank 1

b) for word in local_words:
       word --> single list element sent to the first "rank" free

Hope I'm clear. Thanks.

Luigi ```
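The two candidate semantics in the question can be made concrete in plain Python (no MPI). This is an illustrative sketch only: the function names `block_partition` and `round_robin` are not pyMPI API, and (b) is approximated by round-robin assignment since "first free task" depends on runtime timing.

```python
# Sketch of the two interpretations in Luigi's question (names are
# illustrative, not pyMPI API).

words = ['Luigi', 'Dough', 'Patrick', 'Julian', 'James']

def block_partition(items, size):
    """(a) Split the list up front into `size` contiguous groups,
    one group per rank."""
    n, extra = divmod(len(items), size)
    groups, start = [], 0
    for rank in range(size):
        count = n + (1 if rank < extra else 0)
        groups.append(items[start:start + count])
        start += count
    return groups

def round_robin(items, size):
    """(b) Hand out elements one at a time; round-robin stands in for
    'first free task'."""
    groups = [[] for _ in range(size)]
    for i, item in enumerate(items):
        groups[i % size].append(item)
    return groups

print(block_partition(words, 2))
# [['Luigi', 'Dough', 'Patrick'], ['Julian', 'James']]
print(round_robin(words, 2))
# [['Luigi', 'Patrick', 'James'], ['Dough', 'Julian']]
```

As Pat's reply in this thread confirms, pyMPI's scatter behaves like (a).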

 Re: [Pympi-users] About Gather/Scatter From: Pat Miller - 2006-07-18 11:36 ```Hello all, I've just started my new job in New York City and will shortly be active in the pyMPI world again.

To answer Luigi's question: scatter takes any container (one that supports length and slicing) and splits it into nearly equal pieces (the low ranks get the extras). Each piece is sent in a single message to the target rank. So, with np=2:

A = [11, 22, 33, 44, 55]
localA = mpi.scatter(A)

on rank 0, localA is [11, 22, 33]
on rank 1, localA is [44, 55]

Notice that this works for anything that looks vaguely like a list, so you can scatter a dictionary with D.iteritems(), for instance. The result localA is always a list, however; the original container type is not preserved.

Pat ```
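Pat's partitioning rule can be sketched in plain Python. This is an assumed reconstruction of the behavior he describes (nearly equal pieces, low ranks getting the extras), not pyMPI's actual source; `scatter_pieces` is a hypothetical helper name.

```python
# Sketch of the splitting rule Pat describes: nearly equal pieces,
# with the low ranks receiving one extra element each when the
# container does not divide evenly.

def scatter_pieces(container, size):
    """Split a container supporting len() and slicing into `size`
    nearly equal lists, low ranks first."""
    n, extra = divmod(len(container), size)
    pieces, start = [], 0
    for rank in range(size):
        count = n + (1 if rank < extra else 0)  # low ranks get one extra
        pieces.append(list(container[start:start + count]))
        start += count
    return pieces

# Pat's np=2 example:
A = [11, 22, 33, 44, 55]
print(scatter_pieces(A, 2))  # [[11, 22, 33], [44, 55]]

# Any sliceable container works, and each piece comes back as a plain
# list, mirroring "localA is always a list":
print(scatter_pieces((1, 2, 3), 2))  # [[1, 2], [3]]
```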