## [igraph] problems with large graphs

**From**: Eytan Bakshy

**Subject**: [igraph] problems with large graphs

**Date**: Sat, 6 Sep 2008 15:01:33 -0400

Hello,

I am using igraph 0.5.1 for Python and I am running into some difficulty doing various things with large graphs. I am using a Mac Pro with 18GB of RAM (not that I can generally address more than 3-4GB for any given 32-bit application, like igraph/python).
Erdos-Renyi graphs with many vertices
---
Constructing a random graph as follows works:

    g = Graph.Erdos_Renyi(n=249553, m=100000)

but when n is any larger, e.g.:

    g = Graph.Erdos_Renyi(n=249554, m=100000)

I get the error:

    ValueError: m must be between 0 and n^2

This seems to happen for various values of m between 0 and n^2 for any n > 249,553.
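The sharp cutoff at n = 249,553 is suggestive of a signed 32-bit overflow in the "m must be between 0 and n^2" bound check: the low 32 bits of 249554^2 cross 2^31 right at this boundary, so a product formed in 32-bit signed arithmetic flips negative there. A quick sanity check of that arithmetic (plain Python; the 32-bit product is my guess about igraph's internals, not something I have confirmed in the source):

```python
def as_int32(x):
    """Interpret the low 32 bits of x as a signed 32-bit integer."""
    x &= 0xFFFFFFFF
    return x - (1 << 32) if x & (1 << 31) else x

# 249553 is the last n in this window whose square still looks positive:
print(as_int32(249553 ** 2))   # 2147157665  (just under 2**31)
print(as_int32(249554 ** 2))   # -2147310524 (wrapped negative, so any m > 0 "exceeds" n^2)
```

If that guess is right, the check would also misfire for other ranges of n whose truncated squares happen to have the high bit set.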
Computing similarity measures for large graphs
---

I am running into out-of-memory errors and segfaults using similarity_inverse_log_weighted() on my dataset (~350k nodes, ~4.5m edges). For my graph and for randomly generated graphs of this size, Jaccard and Dice seem to work just fine. Here is a reproducible example of my problem:

Generate a very sparse random graph with 20,000 nodes (the problem also holds for denser graphs):

    g = Graph.Erdos_Renyi(n=20000, m=400)
    g.similarity_inverse_log_weighted([2000, 6000])

gives me:

    python2.5(34740) malloc: *** mmap(size=3200000000) failed (error code=12)
    *** error: can't allocate region
    *** set a breakpoint in malloc_error_break to debug
    ---------------------------------------------------------------------------
    InternalError                             Traceback (most recent call last)

    /Volumes/Data/projects/secondlife/septAnalysis/data/transactions/<ipython console> in <module>()

    InternalError: Error at cocitation.c:171: , Out of memory
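For what it's worth, the failed allocation size lines up exactly with one dense n-by-n matrix of C doubles, which would explain the scaling (this assumes a dense vertex-by-vertex intermediate inside cocitation.c; I have not verified that against the source):

```python
n = 20000
print(n * n * 8)               # 3200000000 -- the exact size of the failed mmap

n_ba = 300000
print(n_ba * n_ba * 8 / 1e9)   # 720.0 (GB) -- hopeless in a 32-bit address space
```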

For BA graphs with 300,000 nodes using the same measure, I get a segfault. Would it be possible to perform this computation using less memory?
Thanks,
-e
