Newsgroups: comp.parallel.mpi
From: aps@theo23.RZ-Berlin.MPG.DE (Ari P Seitsonen)
Reply-To: Ari.P.Seitsonen@iki.fi
Subject: Re: Problems with Cray T3D and T3E implementations of MPI
Date: 02 Nov 1996 19:21:47 GMT
Message-ID: <APS.96Nov2202147@theo23.RZ-Berlin.MPG.DE>


Hello Lutz,

In article <55faur$du1@gwdu19.gwdg.de> lpressl@avca.uni-geophys.gwdg.de (Lutz Pressler) writes:

>As I don't see why this barrier should be necessary, I think this is
>some kind of implementation error. 

Maybe this is the same error as I found on the T3E; there's some info at

http://www.kfa-juelich.de/zam/docs/tki/tki_html/t0299/chapter_5/section_1.html

  My program also doesn't hang anymore after putting an MPI_Barrier
just before every MPI_Allreduce(...,MPI_SUM,...). Turning 'streams' off
(you can also do this at link time, with f90 -Wl"-Dstreams=off") is
not a very good choice: it runs about 20-30 % slower than with streams
on plus those barriers. It's a known problem; you might ask some Cray
programming experts for more help/info.
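  For what it's worth, the barrier workaround looks roughly like this
(a minimal, hypothetical sketch, not the original poster's code; the
variable names are just illustrative):

```fortran
program allreduce_workaround
  implicit none
  include 'mpif.h'
  integer :: ierr, rank
  double precision :: local_sum, global_sum

  call MPI_Init(ierr)
  call MPI_Comm_rank(MPI_COMM_WORLD, rank, ierr)
  local_sum = dble(rank)

  ! Workaround: synchronize all PEs immediately before the reduction;
  ! without this barrier the MPI_Allreduce can hang on the T3D/T3E
  ! when streams are enabled.
  call MPI_Barrier(MPI_COMM_WORLD, ierr)
  call MPI_Allreduce(local_sum, global_sum, 1, MPI_DOUBLE_PRECISION, &
                     MPI_SUM, MPI_COMM_WORLD, ierr)

  call MPI_Finalize(ierr)
end program allreduce_workaround
```

The extra barrier costs one synchronization per reduction, but that was
still much cheaper for me than linking with streams disabled.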

    Good luck,

       apsi
--
-=*=-=*=-=*=-=*=-=*=-=*=-=*=-=*=-=*=-=*=-=*=-=*=-=*=-=*=-=*=-=*=-=*=-=*=-

  Ari P Seitsonen

  Fritz-Haber-Institut der Max-Planck-Gesellschaft
  Abteilung Theorie,            Faradayweg 4 - 6
  14195 Berlin - Dahlem,        Deutschland

  Telefon +49 - 30 - 8413 4850
  Fax     +49 - 30 - 8413 4701 (FHI)
  Fax     +49 - 30 - 8413 4920 (Personal)

  Email Ari.P.Seitsonen@FHI-Berlin.MPG.DE / Ari.P.Seitsonen@iki.fi
  WWW   http://www.FHI-Berlin.MPG.DE/th/personal/apsi/apsi.html

