
Re: [igraph] Memory usage


From: Russell Neches
Subject: Re: [igraph] Memory usage
Date: Thu, 11 Nov 2010 14:05:03 -0800

Yeah, I guessed that it was a superabundance of strings.

The reason I didn't use igraph.Graph.De_Bruijn() is that it generates
every possible substring relationship, which I definitely do not want. I
only want the relationships that actually exist in the corpus, along with
their multiplicities and orientation (forward or reverse strand on the
DNA).

Also, because it's necessary to check whether a substring already exists
in the graph, I created a vertex attribute containing the substring. I
assumed that this is what is causing the huge memory footprint, but
6.3 kB seems a little extreme (the strings are only 32
characters long).
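
Roughly, the construction looks like this (a minimal sketch, not the
actual script; the k-mer length is a placeholder and reverse-strand
handling is omitted). Only the adjacencies seen in the corpus become
edges, multiplicities go into an edge weight, and the substring-to-vertex
lookup lives in a plain Python dict instead of being scanned out of a
vertex attribute:

import igraph

def build_corpus_debruijn(sequences, k=32):
    g = igraph.Graph(directed=True)
    index = {}         # k-mer string -> vertex id (dict lookup, no attribute scan)
    multiplicity = {}  # (source, target) -> number of times the adjacency was seen

    def vertex_for(kmer):
        v = index.get(kmer)
        if v is None:
            v = g.vcount()
            g.add_vertices(1)
            index[kmer] = v
        return v

    for seq in sequences:
        for i in range(len(seq) - k):
            u = vertex_for(seq[i:i + k])
            w = vertex_for(seq[i + 1:i + k + 1])
            multiplicity[(u, w)] = multiplicity.get((u, w), 0) + 1

    # Add all edges in one call; adding them one at a time is much slower
    # in igraph. keys() and values() of an unmodified dict line up, so the
    # weights match the edges.
    g.add_edges(list(multiplicity.keys()))
    g.es["weight"] = list(multiplicity.values())
    # The k-mer strings could still be attached as a vertex attribute if
    # they are needed later; keeping them only in the dict avoids storing
    # them twice.
    return g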

Also, I'm not sure what you mean about the Python interpreter's tendency
(or not) to return memory to the OS -- I rely on the garbage collector
pretty heavily in other things I do! I was suddenly a bit worried, so I
tried appending the contents of a large-ish file to a list object 25
times, and plotted the memory usage of the interpreter at each
iteration. Then I let the list go out of scope and checked the memory
usage again (see attached plot). As you can see, my interpreter, at
least, is returning the memory to the OS...
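
The experiment was essentially the following (a minimal sketch, not the
exact script; the file name is a placeholder and the RSS readout assumes
Linux's /proc):

def rss_kb():
    # Current resident set size in kB, read from /proc (Linux-specific).
    with open("/proc/self/status") as status:
        for line in status:
            if line.startswith("VmRSS:"):
                return int(line.split()[1])

readings = []
chunks = []
for i in range(25):
    with open("largeish_file.txt") as f:   # placeholder file name
        chunks.append(f.read())            # a fresh string object every time
    readings.append(rss_kb())

del chunks                                 # let the list go "out of scope"
readings.append(rss_kb())                  # does the interpreter hand it back?

for i, kb in enumerate(readings):
    print(i, kb)

One possible reconciliation with your point, though this is just a guess:
each of these blobs is large enough to bypass CPython's small-object
allocator, and big malloc'd blocks usually do get handed back to the OS
when freed, whereas a graph full of small strings might not show the same
behaviour.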

Perhaps you mean the igraph bindings for python? That would be scary
indeed...


Russell

On Thu, 2010-11-11 at 12:01 -0500, address@hidden wrote:
> Hi again,
> 
> I think what may have happened is that you used tons of string
> operations in your implementation to construct the graph. The Python
> interpreter may have interned some of the strings (i.e. it keeps the
> corresponding Python object around in memory in case you need that
> string later), and that's what could have caused the increased memory
> consumption.  Another thing that may have caused this is the fact that
> the Python interpreter never releases memory back to the OS on its own
> (again, for performance reasons), so if the peak memory consumption
> was larger than the final one, you can only measure the peak
> consumption and not the actual one.
> 
> -- 
> Tamas 
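
As an aside, the interning behaviour mentioned above is easy to observe
directly (a minimal sketch; in Python 3 the hook is sys.intern(), in
Python 2 it was the intern() builtin). Whether any of my k-mer strings
actually got interned is another question, of course:

import sys

base = "ACGT"
a = base * 8              # built at run time: a fresh 32-character string
b = base * 8              # equal contents, but normally a separate object
print(a == b, a is b)     # typically: True False

c = sys.intern(a)
d = sys.intern(b)
print(c is d)             # True: interned copies share a single object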

Attachment: python_memory_usage.png
Description: PNG image



reply via email to

[Prev in Thread] Current Thread [Next in Thread]