espressomd-devel

Re: [ESPResSo-devel] MPI + Cuda gives error


From: Joost de Graaf
Subject: Re: [ESPResSo-devel] MPI + Cuda gives error
Date: Wed, 20 Sep 2017 12:36:18 +0100

Hello Flo,

The error only occurs for n > 1 MPI processes. This is odd, since a CUDA kernel is launched, which should be executed on only one machine anyway.

Best Wishes, Joost
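
[Editor's note: a minimal sketch, not from the thread and not the actual ESPResSo code, of the pattern Joost describes: only the head MPI rank drives the GPU, so the kernel launch should be identical for n = 1 and n > 1 processes. The kernel name, buffer, and rank-0 guard are illustrative assumptions.]

    // Sketch (illustrative, not ESPResSo code): only rank 0 touches CUDA,
    // so the launch configuration cannot depend on the number of MPI ranks.
    #include <mpi.h>
    #include <cstdio>

    __global__ void scale_kernel(float *data) {
      int i = blockIdx.x * blockDim.x + threadIdx.x;
      data[i] *= 2.0f;
    }

    int main(int argc, char **argv) {
      MPI_Init(&argc, &argv);
      int rank;
      MPI_Comm_rank(MPI_COMM_WORLD, &rank);

      if (rank == 0) { // head node only; the other ranks never call CUDA
        float *d_data;
        cudaMalloc(&d_data, 64 * sizeof(float));
        scale_kernel<<<64, 1>>>(d_data); // grid 64 1 1, block 1 1 1
        cudaError_t err = cudaGetLastError();
        if (err != cudaSuccess)
          fprintf(stderr, "launch failed: %s\n", cudaGetErrorString(err));
        cudaDeviceSynchronize();
        cudaFree(d_data);
      }

      MPI_Finalize();
      return 0;
    }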

On 19 September 2017 at 17:24, Florian Weik <address@hidden> wrote:
Hi Joost,
"dim 0 4 1" looks in deed like an invalid argument. A block size of 0 makes little sense.

Flo
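
[Editor's note: a minimal reproducer for Flo's diagnosis, assuming (this is not confirmed to be the ESPResSo code path) that the launch really does use a zero block dimension. The CUDA runtime reports exactly the message from the log, "invalid configuration argument"; noop_kernel is a hypothetical stand-in for reset_boundaries.]

    // Launching any kernel with a zero block dimension is rejected by the
    // CUDA runtime with cudaErrorInvalidConfiguration.
    #include <cstdio>

    __global__ void noop_kernel() {}

    int main() {
      dim3 block(0, 4, 1); // the "dim 0 4 1" from the log: x-dimension is zero
      dim3 grid(64, 1, 1); // the "grid 64 1 1" from the log

      noop_kernel<<<grid, block>>>();
      cudaError_t err = cudaGetLastError();
      if (err != cudaSuccess)
        printf("error \"%s\" calling noop_kernel\n", cudaGetErrorString(err));
      // expected: error "invalid configuration argument" calling noop_kernel
      return 0;
    }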

On Tue, Sep 19, 2017 at 6:14 PM Joost de Graaf <address@hidden> wrote:
Hi Guys,

Do you also experience the following error message:

error "invalid configuration argument" calling reset_boundaries with dim 0 4 1, grid 64 1 1 in /home/joost/FILES/ESPRESSO/src/core/lbgpu_cuda.cu:3287
application called MPI_Abort(comm=0x84000004, 1) - process 1

in the latest Python version of ESPResSo? I run my script with mpiexec -n 4 ./pypresso [script + options].

I just wanted to check this before filing a bug report, so that I don't waste time. I have recently rebuilt my machine, so I can imagine there is an issue with one of the libraries, but the error does not seem to imply that. I hope you can help. That way I can do the final checks on the Active Matter part of the upcoming ESPResSo summer school.

Thanks in advance,

Joost

