
I came across a repository that uses Tikhonov regularization to compute an inverse, but then in the inference step it multiplies by the Tikhonov factor again...

  1. Compute $\Phi\Phi^T$
  2. Compute the inverse $(\Phi\Phi^T + \lambda I)^{-1}$
  3. Use the inverse to compute the posterior variance in a Bayesian linear model, but with the regularization term multiplied back in: $\lambda \phi^T(\Phi\Phi^T + \lambda I)^{-1}\phi$

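The three steps above can be sketched in NumPy as follows. This is my own minimal reconstruction, not the edward2 code itself; the names `Phi`, `phi`, and `lam` are placeholders I introduce here, where `Phi` is an $(n \times d)$ random-feature matrix, `phi` a single feature vector, and `lam` the Tikhonov/ridge factor:

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 8, 5
Phi = rng.standard_normal((n, d))   # random-feature matrix
phi = rng.standard_normal(n)        # features for one test point
lam = 0.5                           # Tikhonov/ridge factor

# Steps 1-2: form Phi Phi^T and its regularized inverse
K = Phi @ Phi.T                            # (n, n)
K_inv = np.linalg.inv(K + lam * np.eye(n))

# Step 3: posterior variance with the regularization factor
# multiplied back in, as on the linked edward2 line
var = lam * phi @ K_inv @ phi
print(var)
```

Since $K + \lambda I$ is positive definite, the quadratic form (and hence `var`) is strictly positive for any nonzero `phi`.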
The code I am referencing is here: https://github.com/google/edward2/blob/d2571a25bd4ed4a4575f4f64f5048b5e1a8bf233/edward2/tensorflow/layers/random_feature.py#L411

And it is from this paper: https://arxiv.org/pdf/2006.10108.pdf

...although that $\lambda$ factor does not appear in the paper's equations. I have never seen the regularization term multiplied back into the posterior variance like this. Is this expected?

kjetil b halvorsen
Joff

0 Answers