Newsgroups: comp.parallel.mpi
From: arjen@fwi.uva.nl (Arjen Schoneveld)
Subject: MPI_Type_vector problems
Organization: FWI, University of Amsterdam
Date: 10 Jan 1995 10:16:43 GMT
Message-ID: <3etmqb$7te@mail.fwi.uva.nl>

Hello,

Could someone help me with the following problem? I am having trouble
with the MPI_Type_vector function.

I have defined a structure:

typedef struct {
  int chrom[MAXSTRING];        
  double value;                 
  double fitness;             
} cell;

and a matrix:

cell **pop; /* indices [0..proc_x+1][0..proc_y+1]; the outermost rows and
               columns hold the rows and columns communicated from
               neighbouring processors */
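In case the storage layout matters: pop is an array of row pointers, set up
roughly like this sketch (alloc_pop is not my real function name, and the
MAXSTRING value here is just a placeholder):

```c
#include <stdlib.h>

#define MAXSTRING 32          /* placeholder value; the real one does not matter here */

typedef struct {
    int    chrom[MAXSTRING];
    double value;
    double fitness;
} cell;

/* Sketch of one possible allocation (my actual code may differ): one
   malloc for the row-pointer array, then one malloc per row, so each
   row is a separate block of proc_y + 2 cells. */
cell **alloc_pop(int proc_x, int proc_y)
{
    int i;
    cell **pop = malloc((proc_x + 2) * sizeof *pop);
    for (i = 0; i < proc_x + 2; i++)
        pop[i] = malloc((proc_y + 2) * sizeof **pop);
    return pop;
}
```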

I have created an MPI_CELL datatype:

cell c;
int blockcounts[2] = { MAXSTRING, 2 };   /* MAXSTRING ints, then the two
                                            doubles value and fitness */
MPI_Datatype types[2], MPI_CELL;
MPI_Aint displ[2];

MPI_Address(&c.chrom, &displ[0]);
MPI_Address(&c.value, &displ[1]);
types[0] = MPI_INT;
types[1] = MPI_DOUBLE;
displ[1] -= displ[0];
displ[0] = 0;
MPI_Type_struct(2, blockcounts, displ, types, &MPI_CELL);
MPI_Type_commit(&MPI_CELL);
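As a sanity check on the displacement arithmetic (this part seems fine,
since the row transfers below work), the value left in displ[1] should
equal the byte offset of value inside the struct. A standalone check, no
MPI needed (MAXSTRING again a placeholder):

```c
#include <stddef.h>

#define MAXSTRING 32          /* placeholder; the real value does not matter here */

typedef struct {
    int    chrom[MAXSTRING];
    double value;
    double fitness;
} cell;

/* Same arithmetic as the MPI_Address calls above, done with plain
   pointer subtraction: the result should equal offsetof(cell, value). */
long value_displacement(void)
{
    cell c;
    return (long)((char *)&c.value - (char *)c.chrom);
}
```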

Furthermore, I have defined a communicator, MPI_COMM_GRID, on a 2D periodic
grid.

I have also defined an MPI_BOUNDARY datatype, which corresponds to a column
of the pop matrix:

MPI_Type_vector(proc_x + 1, 1, proc_y + 1, MPI_CELL, &MPI_BOUNDARY);
MPI_Type_commit(&MPI_BOUNDARY);
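My reading of the standard is that the stride in MPI_Type_vector is counted
in extents of the old type, so MPI_BOUNDARY should take one cell every
proc_y + 1 cells starting at the given buffer address; i.e. (assuming for
the moment a contiguous buffer) element i of the column sits at this cell
offset:

```c
/* Cell offset of element i of a vector type with blocklength 1:
   element i starts i * stride old-type extents past the buffer start. */
int vector_cell_offset(int i, int stride)
{
    return i * stride;
}
```

With count = proc_x + 1 = 6 and stride = proc_y + 1 = 6 (the sizes in the
example output below), that gives cell offsets 0, 6, 12, 18, 24, 30.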

If I send the top and bottom rows of the matrix, everything goes well:

MPI_Sendrecv(&pop[1][1], proc_y, MPI_CELL, proc_north, FROM_SOUTH,
             &pop[proc_x+1][1], proc_y, MPI_CELL, proc_south, FROM_SOUTH,
             MPI_COMM_GRID, &status);
MPI_Sendrecv(&pop[proc_x][1], proc_y, MPI_CELL, proc_south, FROM_NORTH,
             &pop[0][1], proc_y, MPI_CELL, proc_north, FROM_NORTH,
             MPI_COMM_GRID, &status);

However, if I want to send the columns using the MPI_BOUNDARY datatype,
things go wrong:

MPI_Sendrecv(&pop[0][1], 1, MPI_BOUNDARY, proc_west, FROM_EAST,
             &pop[0][proc_y+1], 1, MPI_BOUNDARY, proc_east, FROM_EAST,
             MPI_COMM_GRID, &status);
MPI_Sendrecv(&pop[0][proc_y], 1, MPI_BOUNDARY, proc_east, FROM_WEST,
             &pop[0][0], 1, MPI_BOUNDARY, proc_west, FROM_WEST,
             MPI_COMM_GRID, &status);

Example output:
---------------
Consider four processors (0, 1, 2, 3), each of which contains a matrix pop
of size [proc_x+2][proc_y+2]. The processors are arranged in the grid as
follows:
| proc 1, proc 3 |
| proc 0, proc 2 |

Now the behaviour of procs 0 and 2 (I display only cell.value, cast to int):

before communication:
proc 0: (north = 1, south = 1, east = 2, west = 2)
 0  0  0  0  0  0  0
 0  1  2  3  4  5  0
 0  2  4  6  8 10  0
 0  3  6  9 12 15  0
 0  4  8 12 16 20  0
 0  5 10 15 20 25  0
 0  0  0  0  0  0  0

proc 2:
 0  0  0  0  0  0  0
 0 11 12 13 14 15  0
 0 12 14 16 18 20  0
 0 13 16 19 22 25  0
 0 14 18 22 26 30  0
 0 15 20 25 30 35  0
 0  0  0  0  0  0  0

after communication:
proc 0:
 0 10 15 20 25 30  0
 0  1  2  3  4  5 11
 0  2  4  6  0 10  0
 0  3  6  9 12 15 13
 0  4  8 12  0 20  0
 0  5 10 15 20 25  0
 0  6  7  8  9 10  0

proc 2:
 0 20 25 30 35 40  0
 0 11 12 13 14 15  1
 0 12 14 16  0 20  0
 0  3 16 19 22 25  3
 0 14 18 22 26 30  0
 0 15 20 25 30 35  0
 0 16 17 18 19 20  0

This is obviously wrong. 

Could someone help me with this one (I am new to MPI)?

Thanks a Lot in Advance (TaLiA),

Arjen.
-- 
Arjen Schoneveld		Parallel Scientific Computing & Simulation Group
tel: (+31) 20 525 7539		University of Amsterdam
room: F2.22			Kruislaan 403, 1098 SJ Amsterdam
e-mail: arjen@fwi.uva.nl	The Netherlands

