[igraph] graphs too large for RAM

From: Richard Llewellyn
Subject: [igraph] graphs too large for RAM
Date: Fri, 3 Jul 2009 09:28:10 -0700

I've been shopping for a Python package that will help me with big graphs (>1e7 nodes), too large to store in RAM on my 8 GB 64-bit Linux desktop.  igraph looks efficient, unlike some more popular Python packages, as it is largely implemented in C -- I have reluctantly left kjbuckets behind, as it was quite speedy but is now out of date.

It doesn't look like igraph has support for graphs larger than RAM, though, but I thought I'd ask around to see how others deal with the problem (buying more RAM won't work for this machine).  I currently store the graph in HDF5 as a sparse matrix using a PyTables variable-length array, but I am new to HDF5 and haven't done much to optimize disk I/O.
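For what it's worth, here is a minimal sketch of the general idea of keeping neighbor lists on disk and only an offset index in RAM -- a toy CSR-style layout using only the standard library, not the format igraph or PyTables actually uses, and the file layout here is purely hypothetical:

```python
import array
import os
import tempfile

# Toy graph as adjacency lists; in practice this would be streamed from
# an edge list far too large to hold in memory.
adjacency = {0: [1, 2], 1: [2], 2: [0, 3], 3: []}
ITEM = array.array("q").itemsize  # 8-byte signed node ids

# Write all neighbor ids contiguously to one binary file; keep only the
# per-node start offsets in RAM (CSR-style index).
path = os.path.join(tempfile.mkdtemp(), "neighbors.bin")
offsets = [0]
with open(path, "wb") as f:
    for node in sorted(adjacency):
        row = array.array("q", adjacency[node])
        row.tofile(f)
        offsets.append(offsets[-1] + len(row))

def neighbors(node):
    """Fetch one node's neighbor list from disk; RAM cost is O(degree)."""
    start, end = offsets[node], offsets[node + 1]
    with open(path, "rb") as f:
        f.seek(start * ITEM)
        return list(array.array("q", f.read((end - start) * ITEM)))

print(neighbors(2))  # -> [0, 3]
```

An HDF5 variable-length array plays the same role as the binary file here, with the library managing the offsets and chunked I/O for you.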

It sure would be nice to have a Python package that could operate intelligently on big graphs that require disk I/O -- this seems like an area where some of that fancy algorithmics could be put to practical use.  For example, getting the transitive closure reasonably efficiently has been a challenge by brute force, but someone smarter than me might identify highly connected nodes of the graph, load that subgraph into RAM, process it, and then determine the next subgraph to load... or something.
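One simple version of that "load only what you need" idea is a traversal that touches the on-disk graph solely through a neighbor-fetch callback, so that only the frontier and the visited set live in RAM.  A sketch (the `fetch` callback is a stand-in for a real HDF5 row read; this is my illustration, not anything igraph provides):

```python
from collections import deque

def reachable(start, fetch):
    """Breadth-first reachability; fetch(node) returns that node's
    neighbors, e.g. by reading one row from disk per call."""
    seen = {start}
    frontier = deque([start])
    while frontier:
        node = frontier.popleft()
        for nbr in fetch(node):
            if nbr not in seen:
                seen.add(nbr)
                frontier.append(nbr)
    return seen

# Toy in-memory "disk" for demonstration; on a real graph, each fetch
# would be a seek-and-read against the HDF5 file.
graph = {0: [1], 1: [2], 2: [0], 3: [4], 4: []}
print(sorted(reachable(0, graph.__getitem__)))  # -> [0, 1, 2]
```

Running this per source node gives the transitive closure row by row; the smarter strategies mentioned above (processing a well-connected subgraph in RAM at a time) would amortize the disk reads across many such queries.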

