
[ESPResSo] Diffusion of a probe particle

From: Olaf Lenz
Subject: [ESPResSo] Diffusion of a probe particle
Date: Mon, 06 Aug 2007 16:13:13 +0200



I'm currently computing diffusion constants of probe particles in
systems with different obstacles. For a start, I just want to simulate
free diffusion of non-interacting particles without any obstacles and
measure the mean-square displacement (MSD).
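Just to fix conventions: a minimal NumPy sketch of what I mean by the MSD (plain Python for illustration, not ESPResSo code; the function name and array layout are my own):

```python
import numpy as np

def msd(positions):
    """Mean-square displacement for lag times 1..n-1.

    positions: array of shape (n_frames, n_particles, 3),
    unfolded coordinates (no periodic wrapping),
    averaged over all time origins and all particles.
    """
    n = positions.shape[0]
    out = np.empty(n - 1)
    for lag in range(1, n):
        disp = positions[lag:] - positions[:-lag]          # all origins at this lag
        out[lag - 1] = np.mean(np.sum(disp**2, axis=-1))   # |dr|^2, averaged
    return out

# For free diffusion the Einstein relation MSD(t) = 2*d*D*t holds
# (d = dimension), so D follows from a linear fit of MSD against time.
```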

In ESPResSo, this runs into a number of problems that render the
simulation pretty inefficient. I would like to ask whether any of you
has ideas on how to treat the problem efficiently.

The main problem seems to originate from the following:
On the one hand, when I set up a single probe particle and let it
diffuse, the script needs to return control to the Tcl level very often,
which creates a significant time overhead. The only solution I see
would be to implement the MSD measurement in C, so that the simulation
does not have to return to the Tcl level. However, this would be neither
simple nor in the spirit of ESPResSo, so I would like to avoid it.
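One workaround I am considering is to sample the MSD on a logarithmically spaced time grid, so that the script returns to the Tcl level only O(log T) times instead of once per integration step. A sketch of such a grid (plain Python, illustrative only; the function name is my own):

```python
def log_spaced_samples(total_steps, samples_per_decade=10):
    """Integration-step counts at roughly logarithmically spaced times.

    A run of total_steps steps then needs only about
    samples_per_decade * log10(total_steps) script-level measurements
    instead of one per step, while still resolving the MSD on all
    time scales.
    """
    times = []
    t = 1
    while t < total_steps:
        times.append(t)
        # grow multiplicatively, but advance by at least one step
        t = max(t + 1, int(round(t * 10 ** (1.0 / samples_per_decade))))
    times.append(total_steps)
    return times
```

Between two successive sample times one would then call a single long integration (e.g. `integrate [expr $t_next - $t_prev]` in the Tcl script) and record the particle positions only at the sample points.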

On the other hand, when I set up a larger number of probe particles
(e.g. 10000), this also seems to create a time overhead, even though the
particles do not interact, and the memory requirements are still far
from any hard memory limits (approx. 10 MB). Apparently, the particles
are somehow included in the Verlet lists of the other particles, even
though they do not interact. To get rid of this problem, I tried to make
the box length as large as possible (e.g. box_l=1000). This should put
the particles far from each other, so that they no longer occur in
each other's Verlet lists. This did indeed reduce the problem, but a
large number of particles still slows down the simulation
significantly. Also, changing the Verlet skin affected the timing.
Where, exactly, does this overhead for larger particle numbers come from?

Furthermore, making the box size very large will not work any more as
soon as obstacles of a certain density are introduced. Either it is
necessary to create lots of replicas of the obstacles, or I have to make
the box relatively small - both methods would make the simulation slow
again. Does anyone have an idea how to handle this?

Using a large box length, I have also tried to disable the Verlet lists.
From my understanding, this should speed up the simulation, as no Verlet
list update would ever be required and the size of the default cell
lists should be small enough not to contain any neighboring particles.
However, this did not seem to be the case. Instead, the simulation was
significantly slower (factor 3 or so). Can anybody explain this to me?

Best regards

