
From:  Ruchika Salwan 
Subject:  Re: [igraph] Graphs too large for RAM 
Date:  Mon, 19 Dec 2016 17:02:56 +0530 
Hi,
That's true. I have developed the basic version with igraph. Can you tell me about any other library that I could use to implement the algorithm for massive graphs?
Thanks,
Ruchika
I am following a research paper whose findings I have to replicate, and one of their graphs has 5 million nodes and 69 million edges. That is the smallest dataset they use.
igraph has no problems with a graph of that size on a decent machine. (Mine has 8 GB of RAM, and an Erdős-Rényi random graph of that size fits easily.) Larger graphs can become problematic, but in any case, working with in-memory graphs and on-disk graphs is radically different. igraph was designed for the former use case, so it won't be of any help to you if your graph does not fit into RAM.

The problem is that igraph makes assumptions about the cost of certain operations; for instance, it assumes that looking up the neighbors of a vertex can be done in constant time. These assumptions do not hold if the graph is on disk, because the operations become much more costly. So, in that case, you are better off either using another library that stores the graph in a database, or implementing your algorithm from scratch.

T.
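To see why a graph of this size can fit comfortably in 8 GB of RAM, here is a rough back-of-the-envelope estimate. This is only a sketch: the constants (8-byte integer endpoints, two extra index arrays for fast neighbor lookup) are assumptions for illustration, not igraph's exact internal layout.

```python
# Rough memory estimate for an in-memory graph with the sizes from the
# thread: 5 million nodes, 69 million edges.
# Assumptions (illustrative, not igraph's exact representation):
#   - each edge endpoint is an 8-byte integer,
#   - roughly two additional index arrays of the same size are kept
#     so that neighbor lookup is constant time,
#   - a small per-vertex bookkeeping array.

NODES = 5_000_000
EDGES = 69_000_000
BYTES_PER_INT = 8

edge_list = 2 * EDGES * BYTES_PER_INT      # source + target per edge
indices = 2 * EDGES * BYTES_PER_INT        # assumed incidence indices
node_overhead = NODES * BYTES_PER_INT      # per-vertex bookkeeping

total_gb = (edge_list + indices + node_overhead) / 1024**3
print(f"estimated footprint: {total_gb:.1f} GB")  # well under 8 GB
```

Even with a generous safety factor for the library's other structures, the estimate stays within the 8 GB mentioned above, which is consistent with the report that such a graph "fits easily".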
_______________________________________________
igraph-help mailing list
address@hidden
https://lists.nongnu.org/mailman/listinfo/igraph-help