
From: ricky95
Subject: Re: [ESPResSo-users] [SPAM] Re: activate magnetostatics method DipolarDirectSumCpu
Date: Sun, 12 May 2019 15:42:41 +0800

Dear Rudolf,

I tried to run the script "" from the supplementary information with "mpirun -n 32" and got
      13: misplaced part id 7647. 0x29a4fa0 != 0x29a4cd0
during thermalization.
I have checked the documentation and searched the mailing list, but unfortunately I did not find an answer. Does anyone have a clue about this?

Best regards
On 10 May 2019, at 10:32 PM, Rudolf Weeber <address@hidden> wrote:

On Fri, May 10, 2019 at 09:05:12PM +0800, address@hidden wrote:
> I think the DipolarDirectSumCpu is a long range interaction and the command of mpi version will divide the whole box into local boxes so that each node sees everything on its own local box and on one layer of cells. Due to that, the DipolarDirectSumCpu can't work fine! Is that why I could not activate magnetostatics method DipolarDirectSumCpu with the command of mpi version? If it's true, how can I calculate the dipolar interaction with open boundary?
The dipolar direct sum on the CPU is not parallelized; you can only use it on a single core.
For systems small enough that direct summation is feasible, the rest of the simulation typically also runs fine on a single CPU core.
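To illustrate why the method is O(N^2) and hard to parallelize by domain decomposition (every dipole interacts with every other, regardless of distance), here is a minimal NumPy sketch of a dipolar direct sum with open boundaries. This is an illustrative reimplementation, not ESPResSo's actual code; the function name and the reduced-unit prefactor are my own choices.

```python
import numpy as np

def dipolar_direct_sum_energy(pos, mu, prefactor=1.0):
    """O(N^2) dipole-dipole energy with open (non-periodic) boundaries.

    pos: (N, 3) array of positions; mu: (N, 3) array of dipole moments.
    Illustrative sketch only, not ESPResSo's implementation.
    """
    n = len(pos)
    energy = 0.0
    for i in range(n):
        for j in range(i + 1, n):  # every pair contributes -> O(N^2)
            r_vec = pos[j] - pos[i]
            r = np.linalg.norm(r_vec)
            r_hat = r_vec / r
            # U_ij = prefactor * [ mu_i.mu_j - 3 (mu_i.r_hat)(mu_j.r_hat) ] / r^3
            energy += prefactor * (
                np.dot(mu[i], mu[j])
                - 3.0 * np.dot(mu[i], r_hat) * np.dot(mu[j], r_hat)
            ) / r**3
    return energy

# Two head-to-tail unit dipoles along z at distance 2: U = -2 m^2 / r^3 = -0.25
pos = np.array([[0.0, 0.0, 0.0], [0.0, 0.0, 2.0]])
mu = np.array([[0.0, 0.0, 1.0], [0.0, 0.0, 1.0]])
print(dipolar_direct_sum_energy(pos, mu))  # -0.25
```

In an actual ESPResSo script, the serial method is enabled via the `DipolarDirectSumCpu` actor mentioned in the subject line, and the script should then be started on a single MPI rank (plain `python` or `mpirun -n 1`).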

There is a GPU implementation (DipolarDirectSumGpu) optimized for systems with more than ~2000 magnetic particles.
For large systems (>~10000 dipoles), you can use the P2NFFT method.
It scales as N log N in the number of dipoles (as opposed to N^2 for direct summation). The supplementary information contains usage examples for ESPResSo.
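To give a feel for the asymptotic difference, here is a rough cost-model comparison. It counts only the scaling terms and ignores all constant factors and prefactors, so it says nothing about the actual crossover point of the two methods:

```python
import math

# Illustrative cost model: direct summation does ~N^2 pair evaluations,
# an FFT-based method like P2NFFT does ~N log N work (constants ignored).
for n in (2_000, 10_000, 100_000):
    ratio = (n * n) / (n * math.log(n))  # simplifies to N / ln(N)
    print(f"N={n:>7}: N^2 / (N log N) ~ {ratio:,.0f}")
```

At N = 10000 the scaling terms already differ by roughly a factor of 1000, which is why an N log N method pays off for large dipolar systems even though its per-step constants are higher.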

Hope that helps!
Regards, Rudolf
