Re: [ESPResSo-users] iccp3m question

From: Stefan Kesselheim
Subject: Re: [ESPResSo-users] iccp3m question
Date: Wed, 1 Oct 2014 20:41:32 +0200

I very briefly checked your script and it looks OK. The error does appear strange 
indeed; I hope there is no serious issue with ICCP3M. 
However, P3M can produce different results depending on the parallelization. I 
strongly recommend using P3M with the same parameters (r_cut, alpha, cao, mesh) 
on all numbers of CPUs. 
I have recently quit science, so my time to check it more carefully is somewhat 
limited. 
For better debugging, you might also want to consider smaller systems, where 
results are obtained more quickly.
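As a side note on the neutrality criterion itself, here is a minimal sketch (plain Python, no ESPResSo dependency; the function names and the 1% tolerance are my own choices, not anything from the attached script) of the check described below: sum the test charge and the induced wall charges, and call the system neutral if the residual is small relative to the 50e test charge.

```python
def net_charge(charges):
    """Sum of all charges in the system (test charge + induced wall charges)."""
    return sum(charges)

def is_neutral(charges, test_charge, rel_tol=0.01):
    """Neutral if |net charge| is below rel_tol * |test charge| (tolerance is arbitrary)."""
    return abs(net_charge(charges)) <= rel_tol * abs(test_charge)

# Numbers from the mail below: a 50e test charge with residual net charges
# of 0.2717e (1 core) and 6.224e (8 cores).
print(is_neutral([50.0, -49.7283], 50.0))  # residual 0.2717e
print(is_neutral([50.0, -43.776], 50.0))   # residual 6.224e
```

With a 1% tolerance the 1-core result (0.2717e) passes and the 8-core result (6.224e) clearly fails, matching the observation in the mail.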
Cheers and good luck

On Sep 29, 2014, at 1:25 AM, Xikai Jiang <address@hidden> wrote:

> Dear All:
> I'm testing iccp3m for a system that consists of a single charged particle 
> confined between two parallel walls with zero electrical potential difference 
> in Espresso-3.3.0. All atoms are fixed in space, and the charged particle 
> carries a charge of 50e. My purpose is to test whether the system is 
> electrically neutral (it should be neutral in principle).
> When I put the test charge 0.2nm above the bottom wall and run Espresso for 
> one step on 1 core, the net charge of the system is 0.2717e, which can be 
> considered neutral. But when I run the same simulation on 8 cores, the net 
> charge of the system becomes 6.224e, which cannot be considered neutral 
> within numerical accuracy.
> I also tried putting the test charge farther away from the wall (1nm away); 
> the net charge in the system then becomes 4.9e on both 1 core and 8 cores. 
> Refining the grid on the wall (spacing from 0.3nm to 0.1nm) helps to reduce 
> the net charge in the system to 2.4e, about 4.8% of the test charge.
> Does anyone have ideas where the problem is? I have attached the test code, 
> any help or comments are appreciated.
> Regards.
> Xikai
> <test_constant_potential_parallel_plate_nve_fixed_atoms.tcl>
