[igraph] Calculating metrics for very large / complex graphs
From: Chris Wj
Subject: [igraph] Calculating metrics for very large / complex graphs
Date: Wed, 14 Jan 2009 08:11:30 -0500
I am starting to calculate metrics for very large graphs. I am looking to calculate betweenness centrality, closeness centrality, average path length, edge betweenness, and maybe some others. Computing betweenness centrality for all vertices of a graph with 300K vertices and 100K edges took 53 minutes on a 3 GHz processor. I upped the edge count to 600K, left it running overnight, and it still hasn't finished (over 16 hours).
I was wondering what approaches others have taken to calculating metrics for large graphs. I see that there are methods for estimating these numbers, but I am not sure of the best way to pick the cutoff values for the estimation.
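For context on what the cutoff does: the estimation variants restrict the shortest-path search to paths of at most `cutoff` hops, so each per-vertex search explores only a local neighbourhood instead of the whole graph. Below is a minimal, self-contained sketch of Brandes' betweenness algorithm with such a cutoff, written in plain Python for illustration (it is not igraph's implementation, and the `adj` dict-of-neighbours representation is my own assumption):

```python
from collections import deque, defaultdict

def betweenness(adj, cutoff=None):
    """Brandes' algorithm for unweighted betweenness centrality.

    adj:    dict mapping vertex -> list of neighbours (undirected graph;
            list each edge in both directions).
    cutoff: if set, shortest paths longer than `cutoff` hops are ignored,
            which is the kind of truncation the estimation routines use.
    """
    bc = dict.fromkeys(adj, 0.0)
    for s in adj:
        # Forward phase: BFS from s, counting shortest paths (sigma)
        # and recording shortest-path predecessors.
        dist = {s: 0}
        sigma = defaultdict(float)
        sigma[s] = 1.0
        preds = defaultdict(list)
        order = []
        q = deque([s])
        while q:
            v = q.popleft()
            order.append(v)
            if cutoff is not None and dist[v] >= cutoff:
                continue  # do not expand beyond the cutoff radius
            for w in adj[v]:
                if w not in dist:
                    dist[w] = dist[v] + 1
                    q.append(w)
                if dist[w] == dist[v] + 1:
                    sigma[w] += sigma[v]
                    preds[w].append(v)
        # Backward phase: accumulate dependencies in reverse BFS order.
        delta = dict.fromkeys(order, 0.0)
        for w in reversed(order):
            for v in preds[w]:
                delta[v] += sigma[v] / sigma[w] * (1.0 + delta[w])
            if w != s:
                bc[w] += delta[w]
    # Undirected graph: each unordered pair was counted in both directions.
    for v in bc:
        bc[v] /= 2.0
    return bc
```

On a five-vertex path graph `0-1-2-3-4`, the exact score of the middle vertex is 4 (all pairs with one endpoint on each side), while `cutoff=2` only credits it for the single pair at distance 2, which shows how a small cutoff trades accuracy for a much smaller search per vertex.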
Ideally, I want each metric calculation to take no longer than 6 hours, obtaining as high a degree of accuracy as possible within that time. Could I use the big-O complexity of each function to estimate the running time and then choose an appropriate cutoff value?
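One practical alternative to a purely analytic big-O estimate would be to calibrate empirically: time runs at increasing cutoffs and keep the largest one that still finishes inside the budget. The sketch below assumes a caller-supplied `run_metric(cutoff)` function (hypothetical; it would wrap whatever estimation call you are timing) and doubles the cutoff until the budget is exceeded:

```python
import time

def largest_cutoff_within_budget(run_metric, budget_seconds,
                                 start_cutoff=2, max_cutoff=1024):
    """Return the largest cutoff whose run finished within the budget.

    run_metric(cutoff) is assumed to compute the estimate at that cutoff;
    max_cutoff bounds the search (e.g. an upper bound on the diameter).
    Returns None if even the starting cutoff is too expensive.
    """
    best = None
    cutoff = start_cutoff
    while cutoff <= max_cutoff:
        t0 = time.perf_counter()
        run_metric(cutoff)
        elapsed = time.perf_counter() - t0
        if elapsed > budget_seconds:
            break  # this cutoff already blows the budget; stop here
        best = cutoff
        cutoff *= 2  # geometric growth keeps the number of trial runs small
    return best
```

In practice you would run the trials on the full graph with small cutoffs (or on a subsample), since the cost of a cutoff-limited run grows with the neighbourhood size rather than linearly with the cutoff itself.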