octave-maintainers

Re: mpi 1.1.1 released


From: c.
Subject: Re: mpi 1.1.1 released
Date: Tue, 10 Dec 2013 08:43:33 +0100

On 4 Dec 2013, at 18:21, Sukanta Basu <address@hidden> wrote:

> Hi Carlo,
> 
> I hope all is well.
> 
> For the past few weeks, I have been using mpi (1.1.1) with openmpi
> successfully. The memory leak issue still remains. However, I was
> unable to compile mpi with mpich2. I get the following error. I would
> appreciate any suggestion!
> 
> Best regards,
> Sukanta

Hi Sukanta,

Sorry for the late reply.

The problem with mpich2 is caused by the following lines in the Makefile:

 MPICC := mpic++
 JUNK := $(shell $(MPICC) -showme:compile)
 MPIINC := $(shell echo $(JUNK) | sed -e "s/-pthread/-lpthread/g")
 JUNK := $(shell $(MPICC) -showme:link)
 MPILIBS := $(shell echo $(JUNK) | sed -e "s/-pthread/ /g")

which try to extract the compile and link flags needed to build against
your MPI library by running

 mpic++ -showme:compile

and 

 mpic++ -showme:link

respectively. It appears mpich2 does not support these options, and I'm
actually not even sure it provides the mpic++ wrapper at all.
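
For reference, this is roughly what those queries print with a typical
Open MPI installation (the exact paths and flags will of course differ
on your system):

 $ mpic++ -showme:compile
 -I/opt/openmpi/1.6.5/include -pthread
 $ mpic++ -showme:link
 -pthread -L/opt/openmpi/1.6.5/lib -lmpi_cxx -lmpi -lm

If I remember correctly, mpich2's wrappers understand -show instead
(and possibly -compile-info / -link-info), which print the full compiler
command line rather than just the extra flags, so the sed trick above
cannot be reused as is.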

So the best option for the moment is to make these variables overridable;
I have changed their definitions in the Makefile as follows:

 MPICC     ?= mpic++
 OFMPIINC  ?= $(shell $(MPICC) -showme:compile | sed -e "s/-pthread/-lpthread/g")
 MPIINC    := $(OFMPIINC)
 OFMPILIBS ?= $(shell $(MPICC) -showme:link | sed -e "s/-pthread/ /g")
 MPILIBS   := $(OFMPILIBS)
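
The point of '?=' is that make uses those defaults only when the
variables are not already set, e.g. in the environment, so a value you
export yourself takes precedence over the -showme based ones. Purely as
an illustration (the mpich2 paths and library names here are an
assumption on my part, please check with your wrapper), running make by
hand inside src/ of the unpacked package would look like:

 $ OFMPIINC="-I/usr/include/mpich2" OFMPILIBS="-lmpichcxx -lmpich" make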

With this change you can set the include and link flags for your MPI
library yourself, from within Octave, before installing the package:

>> setenv ("OFMPIINC", "-I/opt/openmpi/1.6.5/include")
>> setenv ("OFMPILIBS", "-L/opt/openmpi/1.6.5/lib -lmpi_cxx -lmpi -lm")
>> pkg install mpi.tar.gz
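
The same should also work from the shell before starting Octave
(untested here, and assuming a Bourne-style shell):

 $ export OFMPIINC="-I/opt/openmpi/1.6.5/include"
 $ export OFMPILIBS="-L/opt/openmpi/1.6.5/lib -lmpi_cxx -lmpi -lm"
 $ octave --eval 'pkg install mpi.tar.gz'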

I have made this change in the subversion repository; it is not released
yet, so you must either download the package from there or unpack and
patch the released tarball yourself.
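
If you go for patching the tarball, the steps would be roughly the
following (I'm assuming the tarball unpacks to a directory called
mpi-1.1.1 and that the Makefile lives in its src/ subdirectory, please
double check):

 $ tar xzf mpi-1.1.1.tar.gz
 $ $EDITOR mpi-1.1.1/src/Makefile    # apply the variable changes shown above
 $ tar czf mpi-1.1.1.tar.gz mpi-1.1.1
 >> pkg install mpi-1.1.1.tar.gz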

I'd be glad to hear feedback from you if you try this out.

HTH,
c.




