Newsgroups: comp.parallel.mpi
From: cameron@epcc.ed.ac.uk (Kenneth Cameron)
Subject: Re: CM5 or T3D versions of PETSc Chameleon SLES?
Organization: Edinburgh Parallel Computing Centre
Date: Tue, 31 Jan 1995 12:08:58 GMT
Message-ID: <D39r2y.1tD@dcs.ed.ac.uk>

In article <SAROFF.95Jan26094552@ec.msc.edu>, (Stephen Saroff) writes:
> Also is there a T#d version of MPI?  And if so, is it based on PVM on
> SHMEM?
> 

        We have developed a native MPI for the T3D here at EPCC. The work
(6 months) was funded by CRI. We finished at the beginning of December and
have a number of local projects running successfully with it.
How it will be made available to Cray users, support arrangements etc. are
still to be sorted out. If you are interested in it, you should let your
local CRI bod know. You can also email
			t3dmpi-requests@epcc.ed.ac.uk
(this isn't a mailing list), which will reach the project manager at EPCC.
He can then keep you up to date. You can also email any technical questions
to the same address and we'll do our best to answer them.

It is written from scratch on top of SHMEM.

> How does MPI interact with SCALAPACK?

I've never looked at SCALAPACK before, so I'm not sure what you mean.
A quick search of a nearby netlib mirror tells me it's some kind of
parallel maths lib, but the intro docs don't say what it uses to communicate.
If it uses shared memory, then there should be no interaction at all. If
it is designed to run on top of MPI, then it should do so just fine.

-- 
e||) Kenneth Cameron (kenneth@ed)     Edinburgh Parallel Computing Centre e||)
c||c Applications Scientist             University of Edinburgh, Scotland c||c
               This message has been digitally re-mastered.

