Re: Parallel and MPI with default branch
From: Sebastian Schöps
Subject: Re: Parallel and MPI with default branch
Date: Sun, 18 Feb 2018 17:56:54 +0100
> On 17 Feb 2018, at 01:19, c. <address@hidden> wrote:
>> On 17 Feb 2018, at 01:16, c. <address@hidden> wrote:
>>
>> Hi, I just updated the uploaded package at
>>
>> https://gitserver.mate.polimi.it/redmine/attachments/download/64/mpi-2.2.0.tar.gz
> sorry, wrong link, the correct one is:
> https://gitserver.mate.polimi.it/redmine/attachments/download/65/mpi-2.2.0.tar.gz
Sorry Carlo,
it does not work for me :(
octave:1> pkg install -verbose mpi-2.2.0.tar.gz
mkdir (/var/folders/yr/0kxqs6ns5_ncvr7v_sg568ww0000gp/T/oct-B32mL7)
untar (mpi-2.2.0.tar.gz,
/var/folders/yr/0kxqs6ns5_ncvr7v_sg568ww0000gp/T/oct-B32mL7)
/usr/local/Cellar/octave/HEAD-90bd5649983c_1/bin/mkoctfile-4.3.0+ --verbose
-I/usr/local/Cellar/open-mpi/3.0.0_2/include -c MPI_Init.cc
/usr/local/Cellar/octave/HEAD-90bd5649983c_1/bin/mkoctfile-4.3.0+ --verbose
-I/usr/local/Cellar/open-mpi/3.0.0_2/include -c MPI_Initialized.cc
/usr/local/Cellar/octave/HEAD-90bd5649983c_1/bin/mkoctfile-4.3.0+ --verbose
-I/usr/local/Cellar/open-mpi/3.0.0_2/include -c MPI_Comm_rank.cc
/usr/local/Cellar/octave/HEAD-90bd5649983c_1/bin/mkoctfile-4.3.0+ --verbose
-I/usr/local/Cellar/open-mpi/3.0.0_2/include -c MPI_Comm_size.cc
clang++ -std=gnu++11 -c -I/usr/X11/include -fPIC
-I/usr/local/Cellar/octave/HEAD-90bd5649983c_1/include/octave-4.3.0+/octave/..
-I/usr/local/Cellar/octave/HEAD-90bd5649983c_1/include/octave-4.3.0+/octave
-I/usr/local/Cellar/octave/HEAD-90bd5649983c_1/include -D_THREAD_SAFE -pthread
-I/Library/Java/JavaVirtualMachines/jdk-9.jdk/Contents/Home/include
-I/Library/Java/JavaVirtualMachines/jdk-9.jdk/Contents/Home/include/darwin
-I/usr/local/Cellar/open-mpi/3.0.0_2/include MPI_Init.cc -o MPI_Init.o
clang++ -std=gnu++11 -c -I/usr/X11/include -fPIC
-I/usr/local/Cellar/octave/HEAD-90bd5649983c_1/include/octave-4.3.0+/octave/..
-I/usr/local/Cellar/octave/HEAD-90bd5649983c_1/include/octave-4.3.0+/octave
-I/usr/local/Cellar/octave/HEAD-90bd5649983c_1/include -D_THREAD_SAFE -pthread
-I/Library/Java/JavaVirtualMachines/jdk-9.jdk/Contents/Home/include
-I/Library/Java/JavaVirtualMachines/jdk-9.jdk/Contents/Home/include/darwin
-I/usr/local/Cellar/open-mpi/3.0.0_2/include MPI_Initialized.cc -o
MPI_Initialized.o
clang++ -std=gnu++11 -c -I/usr/X11/include -fPIC
-I/usr/local/Cellar/octave/HEAD-90bd5649983c_1/include/octave-4.3.0+/octave/..
-I/usr/local/Cellar/octave/HEAD-90bd5649983c_1/include/octave-4.3.0+/octave
-I/usr/local/Cellar/octave/HEAD-90bd5649983c_1/include -D_THREAD_SAFE -pthread
-I/Library/Java/JavaVirtualMachines/jdk-9.jdk/Contents/Home/include
-I/Library/Java/JavaVirtualMachines/jdk-9.jdk/Contents/Home/include/darwin
-I/usr/local/Cellar/open-mpi/3.0.0_2/include MPI_Comm_rank.cc -o
MPI_Comm_rank.o
clang++ -std=gnu++11 -c -I/usr/X11/include -fPIC
-I/usr/local/Cellar/octave/HEAD-90bd5649983c_1/include/octave-4.3.0+/octave/..
-I/usr/local/Cellar/octave/HEAD-90bd5649983c_1/include/octave-4.3.0+/octave
-I/usr/local/Cellar/octave/HEAD-90bd5649983c_1/include -D_THREAD_SAFE -pthread
-I/Library/Java/JavaVirtualMachines/jdk-9.jdk/Contents/Home/include
-I/Library/Java/JavaVirtualMachines/jdk-9.jdk/Contents/Home/include/darwin
-I/usr/local/Cellar/open-mpi/3.0.0_2/include MPI_Comm_size.cc -o
MPI_Comm_size.o
/usr/local/Cellar/octave/HEAD-90bd5649983c_1/bin/mkoctfile-4.3.0+ --verbose
-I/usr/local/Cellar/open-mpi/3.0.0_2/include -c MPI_Finalize.cc
clang++ -std=gnu++11 -c -I/usr/X11/include -fPIC
-I/usr/local/Cellar/octave/HEAD-90bd5649983c_1/include/octave-4.3.0+/octave/..
-I/usr/local/Cellar/octave/HEAD-90bd5649983c_1/include/octave-4.3.0+/octave
-I/usr/local/Cellar/octave/HEAD-90bd5649983c_1/include -D_THREAD_SAFE -pthread
-I/Library/Java/JavaVirtualMachines/jdk-9.jdk/Contents/Home/include
-I/Library/Java/JavaVirtualMachines/jdk-9.jdk/Contents/Home/include/darwin
-I/usr/local/Cellar/open-mpi/3.0.0_2/include MPI_Finalize.cc -o MPI_Finalize.o
/usr/local/Cellar/octave/HEAD-90bd5649983c_1/bin/mkoctfile-4.3.0+ --verbose
-I/usr/local/Cellar/open-mpi/3.0.0_2/include -c MPI_Finalized.cc
clang++ -std=gnu++11 -c -I/usr/X11/include -fPIC
-I/usr/local/Cellar/octave/HEAD-90bd5649983c_1/include/octave-4.3.0+/octave/..
-I/usr/local/Cellar/octave/HEAD-90bd5649983c_1/include/octave-4.3.0+/octave
-I/usr/local/Cellar/octave/HEAD-90bd5649983c_1/include -D_THREAD_SAFE -pthread
-I/Library/Java/JavaVirtualMachines/jdk-9.jdk/Contents/Home/include
-I/Library/Java/JavaVirtualMachines/jdk-9.jdk/Contents/Home/include/darwin
-I/usr/local/Cellar/open-mpi/3.0.0_2/include MPI_Finalized.cc -o
MPI_Finalized.o
/usr/local/Cellar/octave/HEAD-90bd5649983c_1/bin/mkoctfile-4.3.0+ --verbose
-I/usr/local/Cellar/open-mpi/3.0.0_2/include -c MPI_Send.cc
clang++ -std=gnu++11 -c -I/usr/X11/include -fPIC
-I/usr/local/Cellar/octave/HEAD-90bd5649983c_1/include/octave-4.3.0+/octave/..
-I/usr/local/Cellar/octave/HEAD-90bd5649983c_1/include/octave-4.3.0+/octave
-I/usr/local/Cellar/octave/HEAD-90bd5649983c_1/include -D_THREAD_SAFE -pthread
-I/Library/Java/JavaVirtualMachines/jdk-9.jdk/Contents/Home/include
-I/Library/Java/JavaVirtualMachines/jdk-9.jdk/Contents/Home/include/darwin
-I/usr/local/Cellar/open-mpi/3.0.0_2/include MPI_Send.cc -o MPI_Send.o
/usr/local/Cellar/octave/HEAD-90bd5649983c_1/bin/mkoctfile-4.3.0+ --verbose
-I/usr/local/Cellar/open-mpi/3.0.0_2/include -c MPI_Recv.cc
clang++ -std=gnu++11 -c -I/usr/X11/include -fPIC
-I/usr/local/Cellar/octave/HEAD-90bd5649983c_1/include/octave-4.3.0+/octave/..
-I/usr/local/Cellar/octave/HEAD-90bd5649983c_1/include/octave-4.3.0+/octave
-I/usr/local/Cellar/octave/HEAD-90bd5649983c_1/include -D_THREAD_SAFE -pthread
-I/Library/Java/JavaVirtualMachines/jdk-9.jdk/Contents/Home/include
-I/Library/Java/JavaVirtualMachines/jdk-9.jdk/Contents/Home/include/darwin
-I/usr/local/Cellar/open-mpi/3.0.0_2/include MPI_Recv.cc -o MPI_Recv.o
/usr/local/Cellar/octave/HEAD-90bd5649983c_1/bin/mkoctfile-4.3.0+ --verbose
-I/usr/local/Cellar/open-mpi/3.0.0_2/include -c MPI_Barrier.cc
clang++ -std=gnu++11 -c -I/usr/X11/include -fPIC
-I/usr/local/Cellar/octave/HEAD-90bd5649983c_1/include/octave-4.3.0+/octave/..
-I/usr/local/Cellar/octave/HEAD-90bd5649983c_1/include/octave-4.3.0+/octave
-I/usr/local/Cellar/octave/HEAD-90bd5649983c_1/include -D_THREAD_SAFE -pthread
-I/Library/Java/JavaVirtualMachines/jdk-9.jdk/Contents/Home/include
-I/Library/Java/JavaVirtualMachines/jdk-9.jdk/Contents/Home/include/darwin
-I/usr/local/Cellar/open-mpi/3.0.0_2/include MPI_Barrier.cc -o MPI_Barrier.o
/usr/local/Cellar/octave/HEAD-90bd5649983c_1/bin/mkoctfile-4.3.0+ --verbose
-I/usr/local/Cellar/open-mpi/3.0.0_2/include -c MPI_Iprobe.cc
clang++ -std=gnu++11 -c -I/usr/X11/include -fPIC
-I/usr/local/Cellar/octave/HEAD-90bd5649983c_1/include/octave-4.3.0+/octave/..
-I/usr/local/Cellar/octave/HEAD-90bd5649983c_1/include/octave-4.3.0+/octave
-I/usr/local/Cellar/octave/HEAD-90bd5649983c_1/include -D_THREAD_SAFE -pthread
-I/Library/Java/JavaVirtualMachines/jdk-9.jdk/Contents/Home/include
-I/Library/Java/JavaVirtualMachines/jdk-9.jdk/Contents/Home/include/darwin
-I/usr/local/Cellar/open-mpi/3.0.0_2/include MPI_Iprobe.cc -o MPI_Iprobe.o
MPI_Send.cc:90:26: error: no viable overloaded '='
retval = info;
~~~~~~ ^ ~~~~
/usr/local/Cellar/octave/HEAD-90bd5649983c_1/include/octave-4.3.0+/octave/ov.h:359:17:
note: candidate function not viable: no
known conversion from 'Array<int>' to 'const octave_value' for 1st
argument
octave_value& operator = (const octave_value& a)
^
MPI_Recv.cc:72:15: error: no matching function for call to 'MPI_Get_count'
MPI_Get_count (&status, MPI_CHAR, &num);
^~~~~~~~~~~~~
/usr/local/Cellar/open-mpi/3.0.0_2/include/mpi.h:1477:20: note: candidate
function not viable: no known conversion from
'octave_idx_type *' (aka 'long long *') to 'int *' for 3rd argument
OMPI_DECLSPEC int MPI_Get_count(const MPI_Status *status, MPI_Datatype
datatype, int *count);
^
1 error generated.
make: *** [MPI_Send.o] Error 1
make: *** Waiting for unfinished jobs....
1 error generated.
make: *** [MPI_Recv.o] Error 1
rm MPI_Finalize.o MPI_Barrier.o MPI_Comm_rank.o MPI_Iprobe.o MPI_Finalized.o
MPI_Initialized.o MPI_Comm_size.o
pkg: error running `make' for the mpi package.
error: called from
configure_make at line 95 column 9
install at line 192 column 7
pkg at line 394 column 9
octave:1>
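For what it's worth, the first error does not look macOS-specific: at MPI_Send.cc:90 an Array<int> is assigned directly to an octave_value, and the development branch no longer accepts that conversion implicitly. A minimal sketch of one possible fix, assuming the names retval and info from the error output (the int32NDArray detour is my guess, not necessarily how the package will actually fix it):

    // MPI_Send.cc, around line 90: Array<int> no longer converts
    // implicitly to octave_value, so copy the data into an
    // int32NDArray, which octave_value does accept.
    int32NDArray tmp (info.dims ());
    for (octave_idx_type i = 0; i < info.numel (); i++)
      tmp(i) = info(i);
    retval = octave_value (tmp);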
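The second error is a plain argument-type mismatch: MPI_Get_count wants an int*, but the code passes an octave_idx_type*, which is long long on a 64-bit-index build. The usual workaround is to receive the count into a plain int and widen it afterwards; a sketch, again assuming the names status and num from MPI_Recv.cc:72:

    // MPI_Recv.cc, around line 72: MPI_Get_count expects int*,
    // so use a temporary int and assign it to the wider
    // octave_idx_type afterwards.
    int count = 0;
    MPI_Get_count (&status, MPI_CHAR, &count);
    octave_idx_type num = count;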