Re: Severe memleak in sequence expressions?

From: Marc Schiffbauer
Subject: Re: Severe memleak in sequence expressions?
Date: Thu, 1 Dec 2011 13:08:37 +0100
User-agent: Mutt/1.5.21 (2010-09-15)

* Bob Proulx wrote on 01.12.11 at 05:34:
> Marc Schiffbauer wrote:
> > Greg Wooledge wrote:
> > > Marc Schiffbauer wrote:
> > > > echo {0..10000000}>/dev/null
> > > > 
> > > > This makes my system start to swap as bash will use several GiB of
> > > > memory.
> > >
> > > In my opinion, no.  You're asking bash to generate a list of words from 0
> > > to 1000000000000000000 all at once.  It faithfully attempts to do so.
> > 
> > Yeah, ok but it will not free the mem it allocated later on (see
> > other mail)
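Greg's point, that the shell generates every word before `echo` ever runs, can be checked on a much smaller range (a quick illustration, not from the original thread):

```shell
# Brace expansion happens first: the shell builds one word per number
# and hands the whole argument list to echo in a single call.
echo {0..9999} | wc -w    # prints 10000
```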

Hi Bob,

> In total to generate all of the arguments for {0..10000000} consumes
> at least 78,888,899 bytes or 75 megabytes of memory(!) if I did all of
> the math right.  Each order of magnitude added grows the amount of
> required memory by an *order of magnitude*.  This should not in any
> way be surprising.  In order to generate 1000000000000000000 arguments
> it might consume 7.8e7 * 1e11 equals 7.8e18 bytes ignoring the smaller
> second order effects.  That is a lot of exabytes of memory!  And it
> is terribly inefficient.  You would never really want to do it this
> way.  You wouldn't want to burn that much memory all at once.  Instead
> you would want to make a for-loop to iterate over the sequence such as
> the "for ((i=1; i<=1000000000000000000; i++)); do" construct that Greg
> suggested.  That is a much more efficient way to do a loop over that
> many items.  And it will execute much faster.  Although a loop that
> large will take a long time to complete.

I was hit by this by accident. Normally I use ordinary for-loops
instead, so I was a bit surprised that my machine stopped
responding ;-)

Thinking about it again, it is more or less obvious.
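The two constructs can be compared directly; the bound below is deliberately small so the loop finishes quickly (an illustration of the idea, not Greg's exact command):

```shell
#!/bin/bash
# Brace expansion materializes every word in memory before echo runs:
#   echo {0..10000000} >/dev/null    # ~75 MB allocated up front
# The arithmetic for-loop holds only the loop counter at any moment:
count=0
for ((i = 0; i <= 100000; i++)); do
    (( count += 1 ))    # memory use stays flat per iteration
done
echo "$count"           # prints 100001
```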

> Put yourself in a shell author's position.  What would you think of
> this situation?  Trying to generate an unreasonably large number of
> program arguments is, well, unreasonable.  I think this is clearly an
> abuse of the feature.  You can't expect any program to be able to
> generate and use that much memory.


> And as for whether a program should return unused memory back to the
> operating system for better or worse very few programs actually do it.
> It isn't simple.  It requires more accounting to keep track of memory
> in order to know what can be returned.  It adds to the complexity of
> the code and complexity tends to create bugs.  I would rather have a
> simple and bug free program than one that is full of features but also
> full of bugs.  Especially the shell where bugs are really bad.
> Especially in a case like this where that large memory footprint was
> only due to the unreasonably large argument list it was asked to
> create.  Using a more efficient language construct avoids the memory
> growth, which is undesirable no matter what, and once that memory
> growth is avoided then there isn't a need to return the memory it
> isn't using to the system either.
> If you want bash to be reduced to a smaller size, try having it
> exec itself:
>   $ exec bash
> That is my 2 cents worth plus a little more for free. :-)
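A rough way to watch the effect of `exec bash` (the `ps -o rss=` usage is POSIX; the described RSS behavior is an assumption about a typical Linux bash, not a measurement from the thread):

```shell
# Bash keeps the pages allocated by a large brace expansion even after
# the words are discarded; exec'ing bash replaces the process image and
# returns that memory to the operating system without changing the PID.
echo {0..1000000} >/dev/null    # resident set grows by tens of MB
ps -o rss= -p $$                # large RSS reported here
exec bash                       # same PID, fresh (small) memory image
```

Note that since `exec` replaces the current shell, any unexported variables and functions are lost along with the memory.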

Thank you for the explanation. 

I will not consider this a bug anymore ;-)

8AAC 5F46 83B4 DB70 8317  3723 296C 6CCA 35A6 4134
