## Re: Sparse Eigenvalues

**From**: David Bateman

**Subject**: Re: Sparse Eigenvalues

**Date**: Sun, 21 Dec 2008 17:02:51 +0100

**User-agent**: Mozilla-Thunderbird 2.0.0.17 (X11/20081018)

Chaman Singh Verma wrote:

> Well, here is my sincere question. It is my observation that eigs (which is probably based on the ARPACK code) seems to take at least "N" sparse matrix-vector operations, which could be quite expensive for reasonably large matrices. One application where someone may need the second eigenvalue is graph partitioning (and the famous Google rank matrix). For a typical graph of 200,000 rows and 100-200 columns, it takes more than 45-50 minutes on a single-node machine, which is not competitive with other graph partitioning methods such as METIS.

The Google rank matrix is just a power method to find the largest eigenvalue, and the Google rank matrix is documented as converging in about 6 iterations of the power method. Google PageRank doesn't find the second largest eigenvalue at all. ARPACK is not the same beast!!
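For reference, the power method itself is only a few lines. Here is a minimal NumPy sketch (my own illustration, not Google's actual implementation): one matrix-vector product per iteration, converging to the dominant eigenpair.

```python
import numpy as np

def power_method(A, tol=1e-10, max_iter=1000):
    """Estimate the dominant eigenvalue/eigenvector of A by power iteration."""
    rng = np.random.default_rng(0)
    v = rng.standard_normal(A.shape[0])
    v /= np.linalg.norm(v)
    lam = 0.0
    for _ in range(max_iter):
        w = A @ v                # one matrix-vector product per iteration
        lam_new = v @ w          # Rayleigh quotient (v has unit norm)
        converged = abs(lam_new - lam) < tol
        lam = lam_new
        v = w / np.linalg.norm(w)
        if converged:
            break
    return lam, v

# Tiny symmetric example; its dominant eigenvalue is (5 + sqrt(5))/2.
A = np.array([[2.0, 1.0], [1.0, 3.0]])
lam, v = power_method(A)
```

The convergence rate is governed by the ratio of the second eigenvalue to the first, which is why a matrix engineered like Google's converges in a handful of iterations.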

If you only want the largest eigenvalue then use the power method. What ARPACK gets you is the N largest eigenvalues (or N smallest, or N closest to a particular value), and to do that it uses a method a bit like the Lanczos method with restarting. The Lanczos method finds the largest eigenvalue, then forms a new problem with the largest eigenvalue removed and repeats. The new problem is formulated with multiple matrix-vector products. Arnoldi's method adds a restarting step so that it can handle eigenvalues that are close together without convergence issues.
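Octave's eigs wraps these ARPACK routines, and SciPy wraps the same library, so a Python sketch of the implicitly restarted Arnoldi call looks like this (the matrix and sizes here are made up for illustration):

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import eigs

# Illustrative sparse matrix; ARPACK only ever touches it through
# matrix-vector products, so A is never densified or factored.
n = 500
A = sp.random(n, n, density=0.02, random_state=0, format="csr") + sp.eye(n)

# The 4 largest-magnitude eigenvalues via implicitly restarted Arnoldi;
# each Arnoldi step costs one A @ v product. A fixed start vector makes
# the run deterministic.
vals, vecs = eigs(A, k=4, which="LM", v0=np.ones(n))
```

The `which` argument selects largest/smallest magnitude or, with `sigma`, eigenvalues closest to a given shift, matching the options Octave's eigs exposes.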

How did you formulate the problem for eigs? Perhaps there is an issue in the manner eigs is called. For example, a function call for the matrix-vector product is likely to be slower than a call to BLAS xGEMM or the sparse equivalent internal to eigs itself.
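To illustrate the point in Python terms (SciPy's eigs likewise accepts either the matrix itself or a callback, analogous to Octave's function-handle form; this is my sketch, not the poster's code):

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import LinearOperator, eigs

n = 400
A = sp.random(n, n, density=0.02, random_state=1, format="csr") + sp.eye(n)
v0 = np.ones(n)  # fixed start vector so the two runs are comparable

# 1) Pass the sparse matrix itself: each product runs in compiled code.
vals_direct, _ = eigs(A, k=2, which="LM", v0=v0)

# 2) Pass a callback (Octave: a function handle): the same ARPACK
#    iteration, but every matrix-vector product pays an interpreted-call
#    overhead on top of the product itself.
op = LinearOperator((n, n), matvec=lambda x: A @ x, dtype=A.dtype)
vals_handle, _ = eigs(op, k=2, which="LM", v0=v0)
```

Both routes compute the same eigenvalues; only the per-product cost differs, and that overhead is multiplied by the number of Arnoldi iterations.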

Frankly, I've never looked at how METIS does its graph partitioning, but I didn't have the impression it used an eigenvalue method. METIS has a reputation as one of the better graph partitioners out there, however.

> Here is my query to all the knowledgeable people.
>
> (1) What are the most competitive algorithms for finding a few eigenvalues/eigenvectors compared to ARPACK? Has somebody done a study, with numbers to show their superiority?

Is the matrix positive definite? Are there eigenvalues in the set you are looking for that are close together? If the matrix is PD and you don't have eigenvalues close together, a straight Lanczos method will probably be the fastest.
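As a concrete illustration of the symmetric case, SciPy's eigsh calls ARPACK's symmetric Lanczos driver; for an SPD matrix with well-separated extremal eigenvalues it converges quickly (the test matrix below is my own choice, picked because its spectrum is known in closed form):

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import eigsh

# Symmetric positive definite test matrix: 1-D Laplacian shifted by I.
# Its eigenvalues are known exactly: 3 - 2*cos(k*pi/(n+1)), k = 1..n.
n = 100
off = -np.ones(n - 1)
A = sp.diags([off, 2.0 * np.ones(n), off], [-1, 0, 1], format="csr") + sp.eye(n)

# Three largest (algebraic) eigenvalues via the symmetric Lanczos iteration.
vals = eigsh(A, k=3, which="LA", return_eigenvectors=False)
```

For a symmetric problem, eigsh halves the work relative to the general Arnoldi path in eigs, since the three-term Lanczos recurrence replaces the full orthogonalization.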

> (2) Can we reduce the number of matrix-vector operations?

Getting rid of the restarting in the Arnoldi technique will certainly reduce the number, but at the cost of poor convergence if there are eigenvalues close together.

> (3) Can spectral decomposition beat METIS-type decompositions?

Perhaps, but I'm an engineer and not a mathematician, so you might want to ask someone who knows more.
D.

> Thanks.
> csv

--
David Bateman address@hidden
35 rue Gambetta +33 1 46 04 02 18 (Home)
92100 Boulogne-Billancourt FRANCE +33 6 72 01 06 33 (Mob)