Questions tagged [matrix-decomposition]

Matrix decomposition refers to the process of factorizing a matrix into a product of matrices, typically factors with convenient structure such as triangular, diagonal, or orthogonal matrices. Working with such a factorization makes many matrix algorithms (solving linear systems, inverting, computing determinants) more efficient and more numerically stable.

Common examples of matrix decompositions, each with its advantages and applications, include:

  • Singular value decomposition (SVD)
  • Spectral (eigen) decomposition
  • LU decomposition
  • Cholesky decomposition
  • QR decomposition
  • Schur decomposition
278 questions
33 votes, 1 answer

Dimensionality reduction (SVD or PCA) on a large, sparse matrix

/edit: Further follow-up: now you can use irlba::prcomp_irlba. /edit: following up on my own post. irlba now has "center" and "scale" arguments, which let you use it to calculate principal components, e.g.: pc <- M %*% irlba(M, nv=5, nu=0,…
Zach
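
A minimal sketch of the approach the excerpt describes, in R. irlba and Matrix are the real packages involved; the sparse matrix M below is a made-up example and the sizes are arbitrary.

    # Truncated PCA of a large sparse matrix without densifying it.
    library(Matrix)
    library(irlba)
    set.seed(1)
    M <- rsparsematrix(1000, 200, density = 0.01)  # hypothetical sparse data
    # prcomp_irlba centers (and optionally scales) implicitly, so the
    # sparsity of M is never destroyed by an explicit centering step.
    pc <- prcomp_irlba(M, n = 5, center = TRUE, scale. = FALSE)
    head(pc$x)  # scores on the first 5 principal components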
28 votes, 2 answers

Why PCA of data by means of SVD of the data?

This question is about an efficient way to compute principal components. Many texts on linear PCA advocate using singular-value decomposition of the casewise data. That is, if we have data $\bf X$ and want to replace the variables (its columns) by…
ttnphns
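
Since several questions here turn on this equivalence, a short base-R illustration may help: the right singular vectors of the centered data matrix are the eigenvectors of its covariance matrix (up to sign), and the eigenvalues are the squared singular values divided by $n-1$. The matrix X is simulated.

    set.seed(42)
    X  <- matrix(rnorm(100 * 4), 100, 4)
    Xc <- scale(X, center = TRUE, scale = FALSE)  # column-centered data
    s  <- svd(Xc)
    scores   <- s$u %*% diag(s$d)                 # principal component scores
    loadings <- s$v                               # PC directions
    e <- eigen(cov(Xc))
    all.equal(abs(loadings), abs(e$vectors))      # TRUE (signs may flip)
    all.equal(e$values, s$d^2 / (nrow(X) - 1))    # TRUE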
28 votes, 1 answer

What norm of the reconstruction error is minimized by the low-rank approximation matrix obtained with PCA?

Given a PCA (or SVD) approximation of a matrix $X$ with a matrix $\hat X$, we know that $\hat X$ is the best low-rank approximation of $X$. Is this according to the induced $\|\cdot\|_2$ norm (i.e., the spectral norm, given by the largest singular value) or…
Donbeo
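
The relevant fact is the Eckart–Young(–Mirsky) theorem: the rank-$k$ truncated SVD minimizes the reconstruction error in the spectral norm and in the Frobenius norm simultaneously. A quick numeric check in base R (toy matrix, k chosen arbitrarily):

    set.seed(1)
    X <- matrix(rnorm(50 * 10), 50, 10)
    s <- svd(X)
    k <- 3
    Xhat <- s$u[, 1:k] %*% diag(s$d[1:k]) %*% t(s$v[, 1:k])
    # Spectral-norm error equals the (k+1)-th singular value:
    all.equal(norm(X - Xhat, type = "2"), s$d[k + 1])
    # Frobenius-norm error equals the root sum of the remaining d_i^2:
    all.equal(norm(X - Xhat, type = "F"), sqrt(sum(s$d[(k + 1):10]^2)))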
25 votes, 2 answers

Efficient calculation of matrix inverse in R

I need to calculate a matrix inverse and have been using the solve function. While it works well on small matrices, solve tends to be very slow on large matrices. I was wondering if there is any other function or combination of functions (through SVD, QR,…
jitendra
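
A sketch of the usual advice in base R (A and b are simulated, sizes arbitrary): solving the linear system directly, or factoring once via Cholesky when the matrix is symmetric positive definite, is typically faster and more stable than forming the inverse explicitly.

    set.seed(1)
    n <- 500
    A <- crossprod(matrix(rnorm(n * n), n))  # SPD by construction
    b <- rnorm(n)
    x1 <- solve(A, b)                  # solve A x = b; no explicit inverse
    R  <- chol(A)                      # A = R'R with R upper triangular
    x2 <- backsolve(R, forwardsolve(t(R), b))
    all.equal(x1, x2)                  # TRUE up to rounding
    Ainv <- chol2inv(R)                # if the inverse itself is required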
25 votes, 3 answers

How to choose an optimal number of latent factors in non-negative matrix factorization?

Given a matrix $\mathbf V^{m \times n}$, non-negative matrix factorization (NMF) finds two non-negative matrices $\mathbf W^{m \times k}$ and $\mathbf H^{k \times n}$ (i.e. with all elements $\ge 0$) to represent the decomposed matrix as $$\mathbf V \approx \mathbf W \mathbf H.$$ …
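
One widely used heuristic, sketched with the NMF package on CRAN: fit a range of candidate ranks and inspect the quality measures (residuals, cophenetic correlation coefficient) as a function of $k$. The matrix V below is made up, and the rank range and number of runs are arbitrary choices.

    library(NMF)
    set.seed(1)
    V <- matrix(runif(100 * 20), 100, 20)   # m x n, all entries >= 0
    est <- nmf(V, rank = 2:6, nrun = 10, seed = 123456)
    plot(est)   # quality measures vs. candidate rank k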
22 votes, 1 answer

Updating SVD decomposition after adding one new row to the matrix

Suppose that I have a dense matrix $\mathbf{A}$ of size $m \times n$, with the SVD $$\mathbf{A}=\mathbf{USV}^\top.$$ In R I can calculate the SVD as follows: svd(A). If a new $(m+1)$-th row is added to $\mathbf A$, can one compute the new…
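
A sketch of the standard row-update trick in base R: if the thin SVD is $\mathbf A = \mathbf U\mathbf S\mathbf V^\top$ with $m \ge n$ and full column rank (so $\mathbf V\mathbf V^\top = \mathbf I$), appending a row only requires the SVD of a small $(n+1) \times n$ core matrix rather than of the whole updated matrix.

    set.seed(1)
    m <- 200; n <- 10
    A <- matrix(rnorm(m * n), m, n)
    s <- svd(A)                                   # A = U diag(d) V'
    a <- rnorm(n)                                 # the new row
    K <- rbind(diag(s$d), crossprod(a, s$v))      # (n+1) x n core matrix
    ks <- svd(K)
    U_new <- rbind(cbind(s$u, 0), c(rep(0, n), 1)) %*% ks$u
    V_new <- s$v %*% ks$v
    d_new <- ks$d
    all.equal(d_new, svd(rbind(A, a))$d)          # TRUE: matches a full recompute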
19 votes, 2 answers

How to plot an ellipse from eigenvalues and eigenvectors in R?

Could someone come up with R code to plot an ellipse from the eigenvalues and the eigenvectors of the following matrix $$ \mathbf{A} = \left( \begin{array} {cc} 2.2 & 0.4\\ 0.4 & 2.8 \end{array} \right) $$
MYaseen208
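
A minimal base-R sketch of one way to do this: map the unit circle through the eigendecomposition, so the ellipse axes point along the eigenvectors with half-lengths $\sqrt{\lambda_i}$. This draws the level set $x^\top \mathbf A^{-1} x = 1$; other conventions rescale differently.

    A  <- matrix(c(2.2, 0.4, 0.4, 2.8), 2, 2)
    e  <- eigen(A)
    theta  <- seq(0, 2 * pi, length.out = 200)
    circle <- rbind(cos(theta), sin(theta))       # parametrized unit circle
    ell    <- e$vectors %*% diag(sqrt(e$values)) %*% circle
    plot(t(ell), type = "l", asp = 1, xlab = "x", ylab = "y")
    # principal axes along the eigenvectors:
    arrows(0, 0, sqrt(e$values) * e$vectors[1, ], sqrt(e$values) * e$vectors[2, ])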
18 votes, 5 answers

Essential papers on matrix decompositions

I recently read Skillicorn's book on matrix decompositions, and was a bit disappointed, as it was targeted at an undergraduate audience. I would like to compile (for myself and others) a short bibliography of essential papers (surveys, but also…
gappy
17 votes, 1 answer

Relationship between Cholesky decomposition and matrix inversion?

I've been reviewing Gaussian processes and, from what I can tell, there is some debate about whether the "covariance matrix" (returned by the kernel), which needs to be inverted, should be handled by explicit matrix inversion (expensive and numerically…
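
The usual pattern in GP code, sketched in base R with a made-up SPD "covariance" matrix: factor once with Cholesky, reuse triangular solves, and read the log-determinant off the factor, instead of ever calling solve(K).

    set.seed(1)
    n <- 100
    K <- crossprod(matrix(rnorm(n * n), n)) + diag(n)  # SPD by construction
    y <- rnorm(n)
    L <- t(chol(K))                                # lower triangular, K = L L'
    alpha <- backsolve(t(L), forwardsolve(L, y))   # solves K alpha = y
    all.equal(alpha, solve(K, y))                  # TRUE
    logdetK <- 2 * sum(log(diag(L)))               # log|K| for the likelihood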
15 votes, 1 answer

Eigenfunctions of an adjacency matrix of a time series?

Consider a simple time series: > tp <- seq_len(10) > tp [1] 1 2 3 4 5 6 7 8 9 10 we can compute an adjacency matrix for this time series representing the temporal links between samples. In computing this matrix we add an imaginary site at…
Gavin Simpson
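
A simplified base-R sketch of the setup (ignoring the question's extra imaginary site): build the chain-graph adjacency matrix linking consecutive time points; its eigenvectors are the smooth, Fourier-like temporal patterns the question is after.

    tp  <- seq_len(10)
    adj <- 1 * (abs(outer(tp, tp, "-")) == 1)  # link neighbouring samples
    e   <- eigen(adj)
    matplot(tp, e$vectors[, 1:3], type = "l",
            xlab = "time", ylab = "eigenvector")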
14 votes, 1 answer

State-of-the-art in Collaborative Filtering

I am working on a project on collaborative filtering (CF), i.e. completing a partially observed matrix or, more generally, tensor. I am a newbie to the field, and for this project I eventually have to compare our method to other well-known ones that…
Cupitor
13 votes, 1 answer

Explain how `eigen` helps inverting a matrix

My question relates to a computation technique exploited in geoR:::.negloglik.GRF or geoR:::solve.geoR. In a linear mixed model setup: $$ Y=X\beta+Zb+e $$ where $\beta$ and $b$ are the fixed and random effects respectively. Also,…
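
The identity at the heart of this trick, checked in base R on a made-up symmetric matrix: if $A = V \operatorname{diag}(\lambda) V^\top$, then $A^{-1} = V \operatorname{diag}(1/\lambda) V^\top$, so a single eigen() call yields the inverse and the log-determinant together.

    set.seed(1)
    A <- crossprod(matrix(rnorm(25), 5))      # symmetric positive definite
    e <- eigen(A, symmetric = TRUE)
    Ainv <- e$vectors %*% diag(1 / e$values) %*% t(e$vectors)
    all.equal(Ainv, solve(A))                 # TRUE
    logdetA <- sum(log(e$values))             # log|A| from the same factorization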
12 votes, 3 answers

Difference between Matrix Factorization and PCA

I'm studying matrix factorization (to use in recommender systems as a link predictor) and I want to know whether there is any similarity with PCA. Can the latent features be compared to the eigenvectors?
Augusto
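
A tiny base-R illustration of the overlap: PCA is the particular factorization in which the centered data matrix is written as $UDV^\top$ with orthogonal factors, and prcomp's rotation matrix is exactly the $V$ of that SVD (up to column signs). General matrix factorizations used in recommenders drop the orthogonality and centering requirements.

    set.seed(1)
    X <- matrix(rnorm(30 * 5), 30, 5)
    p <- prcomp(X, center = TRUE, scale. = FALSE)
    s <- svd(scale(X, center = TRUE, scale = FALSE))
    all.equal(abs(unname(p$rotation)), abs(s$v))   # TRUE (signs may flip)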
11 votes, 1 answer

Why is non-negativity important for collaborative filtering/recommender systems?

In all modern recommender systems that I have seen that rely on matrix factorization, a non-negative matrix factorization is performed on the user-movie matrix. I can understand why non-negativity is important for interpretability and/or if you…
11 votes, 2 answers

Finding matrix eigenvectors using QR decomposition

First, a general linear algebra question: can a matrix have more than one set of (unit-size) eigenvectors? From a different angle: is it possible that different decomposition methods/algorithms (QR, NIPALS, SVD, Householder, etc.) give different sets…
Bliss
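
A bare-bones sketch in base R of the unshifted QR iteration the question refers to (practical eigensolvers add shifts and deflation). For a symmetric matrix with distinct eigenvalues, $A_k$ converges to a diagonal matrix of eigenvalues and the accumulated $Q$ to the eigenvectors, each column determined only up to sign; this is one reason different algorithms can return sign-flipped but otherwise identical eigenvectors.

    set.seed(1)
    A  <- crossprod(matrix(rnorm(16), 4))   # symmetric test matrix
    Ak <- A
    Q_acc <- diag(4)
    for (i in 1:200) {
      qrk <- qr(Ak)
      Qk  <- qr.Q(qrk)
      Ak  <- qr.R(qrk) %*% Qk               # A_{k+1} = R_k Q_k
      Q_acc <- Q_acc %*% Qk
    }
    diag(Ak)                                # approximate eigenvalues
    all.equal(abs(Q_acc), abs(eigen(A)$vectors), tolerance = 1e-4)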