There are many empirical results showing that truncated SVD (TSVD) can help denoise images, but I wonder what the theoretical support behind this is.
We know that TSVD gives the best low-rank approximation of a matrix in terms of both the Frobenius norm and the spectral norm (Eckart–Young), but why does this have anything to do with denoising?
Formally, say we have an image $A$ and a noise matrix $E$ whose entries $e_{ij}$ are i.i.d. $N(0, \sigma^2)$, and we observe the corrupted image $B = A + E$. We then apply TSVD to $B$ to denoise it. The denoising performance must depend on $A$ and $\sigma^2$, and I wonder where I can find some theoretical study of this.
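To make the setup concrete, here is a minimal numerical sketch of what I mean; the size $n$, rank $r$, noise level $\sigma$, and truncation level $k$ below are illustrative assumptions on my part, not part of the question:

```python
import numpy as np

rng = np.random.default_rng(0)
n, r, sigma, k = 256, 10, 0.1, 10  # illustrative choices

# Low-rank ground-truth "image" A (rank r) and i.i.d. Gaussian noise E
A = rng.standard_normal((n, r)) @ rng.standard_normal((r, n))
E = sigma * rng.standard_normal((n, n))   # e_ij ~ N(0, sigma^2)
B = A + E                                  # corrupted observation

# Truncated SVD of B: keep only the top-k singular triplets
U, s, Vt = np.linalg.svd(B, full_matrices=False)
B_k = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

# Compare the error of the raw observation vs. the TSVD estimate (Frobenius norm)
print("||B   - A||_F =", np.linalg.norm(B - A))
print("||B_k - A||_F =", np.linalg.norm(B_k - A))
```

Empirically $\|B_k - A\|_F$ comes out much smaller than $\|B - A\|_F$ when $A$ is low rank and $k$ is chosen near its rank, and it is this gap that I would like to understand theoretically as a function of $A$ and $\sigma^2$.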
So, overall, I hope I can find the answer to these two questions:
- Why is a small approximation error in the Frobenius norm or spectral norm related to denoising performance?
- What is the relationship between the denoising performance, $A$, and $\sigma^2$?