6

I have a symmetric, weighted adjacency matrix over $n$ elements. What algorithms exist for clustering the elements from such a matrix?

The matrix has values between $0$ and $1$. In the case of a similarity matrix the elements on the main diagonal are $1$; in the case of a distance matrix the values on the main diagonal are $0$.

Is it possible to use $k$-means directly on this "adjacency" matrix?
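For concreteness, here is a toy example of such a matrix (a minimal NumPy sketch; the values are made up, and the conversion $D = 1 - S$ is just one common convention for going from similarities to distances):

```python
import numpy as np

# Toy symmetric similarity matrix for n = 4 elements:
# values in [0, 1], ones on the main diagonal.
S = np.array([[1.0, 0.9, 0.1, 0.2],
              [0.9, 1.0, 0.2, 0.1],
              [0.1, 0.2, 1.0, 0.8],
              [0.2, 0.1, 0.8, 1.0]])

# One common convention for the corresponding distance matrix
# (zeros on the main diagonal): D = 1 - S.
D = 1.0 - S
```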

hu_chin
  • It has been asked about 100 times here already: no, k-means cannot be used on a distance matrix, because k-means computes *means* and deviations from the mean; it does not use pairwise distances. There is PAM, however, and HAC, and DBSCAN, and OPTICS, and ... – Has QUIT--Anony-Mousse Apr 25 '15 at 15:22
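For what it's worth, here is a minimal sketch of one of the distance-matrix-friendly alternatives mentioned in the comment, DBSCAN with a precomputed metric in scikit-learn; the toy matrix and the `eps`/`min_samples` values are illustrative placeholders only:

```python
import numpy as np
from sklearn.cluster import DBSCAN

# Toy symmetric distance matrix (zeros on the diagonal, values in [0, 1]).
D = np.array([[0.0, 0.1, 0.9, 0.8],
              [0.1, 0.0, 0.8, 0.9],
              [0.9, 0.8, 0.0, 0.2],
              [0.8, 0.9, 0.2, 0.0]])

# metric='precomputed' tells DBSCAN that D already contains pairwise
# distances; eps and min_samples are problem-dependent and must be tuned.
labels = DBSCAN(eps=0.3, min_samples=2, metric='precomputed').fit_predict(D)
print(labels)   # e.g. [0 0 1 1]
```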

2 Answers

2

If you have a similarity matrix, try spectral methods for clustering; take a look at Laplacian eigenmaps, for example. The idea is to compute the eigenvectors of the graph Laplacian (built from the similarity matrix) and use them to form a feature vector for each element that respects the similarities. You can then cluster these feature vectors with, for example, the k-means algorithm.
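A minimal sketch of that pipeline with NumPy and scikit-learn, assuming `S` is the $n \times n$ similarity matrix; it uses the unnormalized Laplacian for brevity (scikit-learn's `SpectralClustering(affinity='precomputed')` packages a normalized variant of the same idea):

```python
import numpy as np
from sklearn.cluster import KMeans

def spectral_clusters(S, k):
    """Cluster the elements of a similarity matrix S via Laplacian eigenvectors + k-means."""
    degrees = S.sum(axis=1)
    L = np.diag(degrees) - S           # unnormalized graph Laplacian
    # eigh returns eigenvalues in ascending order; the eigenvectors of the
    # k smallest eigenvalues give one k-dimensional feature vector per element.
    _, eigvecs = np.linalg.eigh(L)
    X = eigvecs[:, :k]
    return KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(X)

# Toy similarity matrix (symmetric, values in [0, 1], ones on the diagonal).
S = np.array([[1.0, 0.9, 0.1, 0.2],
              [0.9, 1.0, 0.2, 0.1],
              [0.1, 0.2, 1.0, 0.8],
              [0.2, 0.1, 0.8, 1.0]])
print(spectral_clusters(S, k=2))       # e.g. [0 0 1 1] (labels may be permuted)
```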

From a practical perspective, if your matrix is big and dense, spectral methods can quickly become very computationally intensive and memory-hungry.

I used spectral methods for image clustering and, eventually, classification. The results were pretty good. The difficult part was getting a good similarity matrix.

Vladislavs Dovgalecs
0

Two ideas immediately come to mind:

The simpler one is hierarchical clustering (http://en.wikipedia.org/wiki/Hierarchical_clustering), which only requires pairwise distances between points.
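A minimal sketch with SciPy, using a toy distance matrix; the linkage method and the number of clusters are choices you would tune for your own data:

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

# Toy symmetric distance matrix with zeros on the diagonal.
D = np.array([[0.0, 0.1, 0.9, 0.8],
              [0.1, 0.0, 0.8, 0.9],
              [0.9, 0.8, 0.0, 0.2],
              [0.8, 0.9, 0.2, 0.0]])

# SciPy's linkage expects a condensed distance vector, not the square matrix.
Z = linkage(squareform(D), method='average')

# Cut the dendrogram into a fixed number of clusters (here 2).
labels = fcluster(Z, t=2, criterion='maxclust')
print(labels)   # e.g. [1 1 2 2]
```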

The other is much more involved. There are techniques which, given pairwise distances between points, provide a distance-preserving embedding into a Euclidean space. Once the points are embedded there, lots of clustering techniques apply. When I remember (or successfully google) the names of several of these techniques, I'll edit the answer.
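One concrete family of such techniques is multidimensional scaling (MDS), which embeds the elements in $\mathbb{R}^d$ while approximately preserving the given pairwise distances. A minimal sketch with scikit-learn, where the toy matrix, the embedding dimension, and the number of clusters are all illustrative assumptions:

```python
import numpy as np
from sklearn.manifold import MDS
from sklearn.cluster import KMeans

# Toy symmetric distance matrix with zeros on the diagonal.
D = np.array([[0.0, 0.1, 0.9, 0.8],
              [0.1, 0.0, 0.8, 0.9],
              [0.9, 0.8, 0.0, 0.2],
              [0.8, 0.9, 0.2, 0.0]])

# Embed the elements into a 2-D Euclidean space that approximately
# preserves the given pairwise distances.
X = MDS(n_components=2, dissimilarity='precomputed', random_state=0).fit_transform(D)

# Once the points live in Euclidean space, ordinary k-means applies.
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
print(labels)   # e.g. [0 0 1 1]
```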

jlimahaverford