
[debbugs-tracker] bug#31434: closed ([PATCH 0/2] OpenMPI 3.0)


From: GNU bug Tracking System
Subject: [debbugs-tracker] bug#31434: closed ([PATCH 0/2] OpenMPI 3.0)
Date: Fri, 25 May 2018 11:45:01 +0000

Your message dated Fri, 25 May 2018 13:44:36 +0200
with message-id <address@hidden>
and subject line Re: [bug#31434] [PATCH 0/2] OpenMPI 3.0
has caused the debbugs.gnu.org bug report #31434,
regarding [PATCH 0/2] OpenMPI 3.0
to be marked as done.

(If you believe you have received this mail in error, please contact
address@hidden)


-- 
31434: http://debbugs.gnu.org/cgi/bugreport.cgi?bug=31434
GNU Bug Tracking System
Contact address@hidden with problems
--- Begin Message ---
Subject: [PATCH 0/2] OpenMPI 3.0
Date: Sat, 12 May 2018 18:01:21 +0200
Hello,

The attached patch updates OpenMPI to 3.0.1.  The ‘superlu-dist’ upgrade
was necessary to get there.
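
(For readers who don't have the patches at hand: a Guix package update of
this kind boils down to bumping the version and source hash in the
package's definition.  The following is a deliberately stripped-down,
hypothetical sketch of such a bump, not the contents of the actual
patches; the real ‘openmpi’ definition in gnu/packages/mpi.scm carries
many more inputs and build phases, and the hash below is a placeholder
you would normally obtain with ‘guix download’.)

  (define-module (sketch openmpi)             ;hypothetical module name
    #:use-module (guix packages)
    #:use-module (guix download)
    #:use-module (guix utils)
    #:use-module (guix build-system gnu)
    #:use-module ((guix licenses) #:prefix license:))

  (define-public openmpi
    (package
      (name "openmpi")
      (version "3.0.1")                       ;the version being targeted
      (source (origin
                (method url-fetch)
                (uri (string-append
                      "https://www.open-mpi.org/software/ompi/v"
                      (version-major+minor version)
                      "/downloads/openmpi-" version ".tar.bz2"))
                (sha256
                 (base32
                  ;; Placeholder hash, not the real one.
                  "0000000000000000000000000000000000000000000000000000"))))
      (build-system gnu-build-system)
      (home-page "https://www.open-mpi.org")
      (synopsis "MPI-3 implementation")
      (description "Stripped-down sketch of a version bump; see
  gnu/packages/mpi.scm for the real definition.")
      (license license:bsd-3)))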

I was able to rebuild everything that depends on it on x86_64, except
‘dealii-openmpi’, where GCC would eat all the memory of the machine
before completing (I suppose it might work with --cores=1, but I haven’t
tried).

Ludo’.

Ludovic Courtès (2):
  gnu: superlu-dist: Update to 5.3.0.
  gnu: openmpi: Update to 3.0.1.

 gnu/packages/maths.scm | 87 ++++++++++++++++++++++++++++++++++--------
 gnu/packages/mpi.scm   |  8 ++--
 2 files changed, 76 insertions(+), 19 deletions(-)

-- 
2.17.0




--- End Message ---
--- Begin Message ---
Subject: Re: [bug#31434] [PATCH 0/2] OpenMPI 3.0
Date: Fri, 25 May 2018 13:44:36 +0200
User-agent: Gnus/5.13 (Gnus v5.13) Emacs/25.3 (gnu/linux)
Hello,

address@hidden (Ludovic Courtès) writes:

> Ludovic Courtès <address@hidden> writes:
>
>> The attached patch updates OpenMPI to 3.0.1.  The ‘superlu-dist’ upgrade
>> was necessary to get there.
>
> Did you have a chance to look into it?  You mentioned on IRC that there
> might be a better way than setting OMPI_MCA_plm_rsh_agent.

I went ahead and applied the patches as they were so we can have a recent
Open MPI.  We can always revisit the OMPI_MCA_plm_rsh_agent issue later.
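
(For context, OMPI_MCA_plm_rsh_agent is the environment-variable form of
Open MPI's plm_rsh_agent MCA parameter, which names the helper program
mpirun uses to launch processes on remote nodes.  The snippet below is
only an illustrative Guile sketch of setting it for a single run, not the
approach discussed or taken in Guix; the /bin/false path and the
assumption that mpirun and hostname are on PATH are assumptions for the
sake of the sketch.)

  ;; Illustrative sketch: point the remote-launch agent at a no-op
  ;; program so mpirun never tries to invoke a real ssh/rsh client,
  ;; then launch one local rank.
  (setenv "OMPI_MCA_plm_rsh_agent" "/bin/false")
  (system* "mpirun" "-np" "1" "hostname")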

Thanks,
Ludo’.


--- End Message ---
