Newsgroups: comp.parallel.mpi
From: smith@epcc.ed.ac.uk (A Gordon Smith)
Subject: Re: mpi availability & performance
Organization: Edinburgh Parallel Computing Centre
Date: Mon, 29 May 1995 09:41:30 GMT
Message-ID: <D9C2x7.HHs@dcs.ed.ac.uk>

In article <3piq5e$4aa@NNTP.MsState.Edu>, tony@aurora.cs.msstate.edu (Tony Skjellum) writes:
> 
> CampbellM (campbellm@aol.com) wrote:
> 
> : 1. What platforms is mpi *currently* available on?
> We know that there are many MPI's around.  The key ones are as follows:
> 	LAM - for clusters, not emphasizing performance but rather environment for
> 		development
> 	MPICH - Argonne/MSU, runs on T3D, Paragon, iPSC860, SGI multiprocessors,
> 			SP-2, SP-1, clusters (Sun, HP, etc), clusters with Myrinet,
> 			Convex, PVP's from Cray, etc.
> 	CHIMP/MPI - T3D, and probably others.
> Both MPICH and CHIMP are aimed at high performance.

CHIMP/MPI has been ported to the following platforms:

      o    Sun workstations running SunOS 4.1.x [sun4] 
      o    Sun workstations running Solaris 2.x [sun5] 
      o    Silicon Graphics running IRIX 4 [sgi4] 
      o    Silicon Graphics running IRIX 5 [sgi5] 
      o    IBM RS/6000 running AIX 3.2 [rs6000] 
      o    Sequent Symmetry [symm] 
      o    DEC Alpha AXP running OSF/1 [axposf] 
      o    Meiko CS-1 - transputer node [t800] 
      o    Meiko CS-1 - i860 node (MK096) [i860] 
      o    Meiko CS-1 - SPARC node [cshost] 
      o    Meiko CS-2 [cs2]
      o    Hewlett-Packard Series 700 workstations running HP-UX [hpux7]

CHIMP has not been ported to the Cray T3D; indeed, CHIMP is no longer
being actively ported to new platforms.

EPCC has instead implemented MPI for the T3D from scratch, in
collaboration with Cray Research. Contact <t3dmpi@epcc.ed.ac.uk> for
more information.
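Since the original question also asked about performance, the usual first
check on any of these implementations is a ping-pong timing between two
processes. A minimal sketch using only standard MPI-1 calls (it assumes an
MPI library and the conventional `mpicc`/`mpirun` commands, whose names vary
by installation; the message size and repetition count are arbitrary
choices, not recommendations):

```c
/* pingpong.c - rough round-trip timing between ranks 0 and 1.
 * Build and run (command names vary by installation):
 *   mpicc -o pingpong pingpong.c
 *   mpirun -np 2 pingpong
 */
#include <mpi.h>
#include <stdio.h>

#define NBYTES 1024   /* message size in bytes (arbitrary) */
#define REPS   1000   /* round trips to average over (arbitrary) */

int main(int argc, char *argv[])
{
    int rank, i;
    char buf[NBYTES];
    double t0, t1;
    MPI_Status status;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    t0 = MPI_Wtime();
    for (i = 0; i < REPS; i++) {
        if (rank == 0) {
            /* rank 0 sends first, then waits for the echo */
            MPI_Send(buf, NBYTES, MPI_CHAR, 1, 0, MPI_COMM_WORLD);
            MPI_Recv(buf, NBYTES, MPI_CHAR, 1, 0, MPI_COMM_WORLD, &status);
        } else if (rank == 1) {
            /* rank 1 echoes each message straight back */
            MPI_Recv(buf, NBYTES, MPI_CHAR, 0, 0, MPI_COMM_WORLD, &status);
            MPI_Send(buf, NBYTES, MPI_CHAR, 0, 0, MPI_COMM_WORLD);
        }
    }
    t1 = MPI_Wtime();

    if (rank == 0)
        printf("%d-byte round trip: %g usec (avg over %d reps)\n",
               NBYTES, 1e6 * (t1 - t0) / REPS, REPS);

    MPI_Finalize();
    return 0;
}
```

The same source should compile unchanged against MPICH, CHIMP/MPI, LAM or
the T3D implementation; that portability is precisely the point of the
standard, and running one benchmark across implementations is the fairest
way to compare them.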


-- 
-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-
 -=-=- A. Gordon Smith -=- Edinburgh Parallel Computing Centre -=-=-
 =-= Email <smith@epcc.ed.ac.uk> -=- Phone {+44 (0)131 650 6712} =-=
=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=

