
Re: Problem with thousands of small jobs.


From: Rob Sargent
Subject: Re: Problem with thousands of small jobs.
Date: Fri, 06 Feb 2015 09:40:20 -0700
User-agent: Mozilla/5.0 (X11; Linux x86_64; rv:31.0) Gecko/20100101 Thunderbird/31.4.0

What is your actual CPU configuration? E.g. 16 dual-core CPUs, 8 quad-core CPUs without hyperthreading, 8 dual-core CPUs with hyperthreading, or whatever.
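
A few ways to check this, assuming GNU coreutils and GNU parallel are on the PATH; the /proc/cpuinfo route should also work under Cygwin:

  nproc                                # logical CPUs seen by the OS
  parallel --number-of-cpus            # physical CPUs as counted by GNU parallel
  parallel --number-of-cores           # CPU cores as counted by GNU parallel
  grep -c '^processor' /proc/cpuinfo   # logical CPUs; works under Cygwin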

On 02/06/2015 08:39 AM, xmoon 2000 wrote:
Do you think the arguments to the parallel command are OK?

On 6 February 2015 at 14:58, Felipe Alvarez <felipe.alvarez@gmail.com> wrote:
echo $SHELL

Bash is the default in Cygwin


On Sat, 7 Feb 2015 00:56 xmoon 2000 <xmoon2000@googlemail.com> wrote:
I am running this in Cygwin. I call parallel from the standard
window/shell supplied by Cygwin.

What makes you think it is bash? (You may well be right)

On 6 February 2015 at 14:43, Felipe Alvarez <felipe.alvarez@gmail.com>
wrote:
Is there a"lighter" shell that you can use other than bash?


On Fri, 6 Feb 2015 23:20 xmoon 2000 <xmoon2000@googlemail.com> wrote:
Hi,

I need to run about 4,000 jobs that each take around 20 seconds to
complete

using:

  cat /tmp/parList | parallel -j 28 --eta;

On my 32-core machine this works OK, BUT there is a "lull" in processing
every few seconds as new jobs are started once the current crop has
completed. I assume this is due to an overhead in starting jobs that
is only noticeable because my jobs are so short.

Is there any way I could make this more efficient, so that my cores are
fully utilised and the whole run finishes faster?

Moon
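
A hedged sketch of things that might reduce the per-job overhead, assuming
the lull really is process-startup cost rather than the jobs themselves:

  # Drop --eta and let output through ungrouped (-u) to cut bookkeeping:
  parallel -j 28 -u < /tmp/parList

  # If the lines in /tmp/parList could be rewritten as arguments to a single
  # program, -X packs many arguments into each invocation so far fewer
  # processes are spawned ('myprog' is a hypothetical placeholder):
  parallel -j 28 -X myprog < /tmp/parList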




