
From: Nelson A. de Oliveira
Subject: Too much memory being used with "find" and parallel
Date: Wed, 23 Jul 2014 23:51:59 -0300

Hi!

I have about 3 million PNG files across some directories and I would
like to optimize them.
I wrote a shell script with this function:

optimize () {
        # Quantize in place to a 16-color palette, then recompress losslessly.
        pngquant --force --speed 1 --ext .png 16 "$1"
        optipng -o7 -quiet "$1"
}
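
For a single file that is just (the path here is only an example):

optimize ./some-tile.png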

And I am trying to run it against all of the PNGs with this:

find "$TILEDIR" -type f -name "*.png" | \
        parallel --env optimize -j +0 --eta optimize
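
(Side note, probably unrelated to the memory problem: since parallel
starts a new shell for each job, I believe the function also has to be
exported; in bash that would be

        export -f optimize

before the pipeline, and using find's -print0 together with parallel -0
would make it safe for file names with spaces or newlines.)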

But there is one problem: parallel starts to eat all my RAM and I just
can't run it.
After it has used about 8 GB of RAM I have to kill it (it seems that
parallel waits for the full output of find and stores all the file
paths before calling optimize() on them).

Is there a way to accomplish this task without running into this
memory problem, please?
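
One workaround I am considering (just a guess; I have not confirmed
that this is where the memory goes) is to drop --eta, since computing
an ETA presumably needs the total number of jobs up front, and to feed
the file names one directory at a time, roughly like this:

for dir in "$TILEDIR"/*/; do
        find "$dir" -type f -name "*.png" -print0 | \
                parallel -0 --env optimize -j +0 optimize
done

But I would prefer to keep a single parallel invocation (and the ETA
display) if possible.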

The parallel version is 20140722.

Thank you!

Best regards,
Nelson


