
In the context of scale-space image pyramids using Gaussian filters, I noticed that it's common to downsample the image after blurring with $\sigma = 2\sigma_{init}$. My question is: what is the relation between the variance and the downsampling factor? How can one prove that no information is lost when the image is downsampled after blurring by this amount?

I think this can be shown using the Nyquist criterion and the cut-off frequency of the Gaussian filter, but I'm not sure how.

Royi

1 Answer


You're correct: it has to do with the cut-off frequency of the Gaussian blur filter in the frequency domain.

To see it, apply a DFT to the Gaussian kernel (in MATLAB this is done with fft / fft2) and look at the magnitude of the result.
Find the -3 dB point and compare it to the Nyquist frequency of the downsampled grid: as long as the cut-off lies below the new Nyquist frequency, the blurred image can be downsampled without aliasing.
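As a minimal sketch of this check (in Python/NumPy rather than MATLAB, and assuming a 1-D Gaussian with an illustrative $\sigma = 2$), one can compute the kernel's DFT magnitude, locate the -3 dB frequency, and compare it to the Nyquist frequency after 2x downsampling (0.25 cycles/sample):

```python
import numpy as np

sigma = 2.0          # illustrative blur scale (not from the original post)
n = 256              # number of samples in the 1-D grid

# Build a normalized 1-D Gaussian kernel centered on the grid.
x = np.arange(n) - n // 2
kernel = np.exp(-x**2 / (2 * sigma**2))
kernel /= kernel.sum()

# DFT magnitude (ifftshift moves the kernel's center to sample 0,
# so the spectrum is real-valued up to numerical noise).
H = np.abs(np.fft.fft(np.fft.ifftshift(kernel)))
freqs = np.fft.fftfreq(n)          # frequencies in cycles/sample

# -3 dB point: first frequency where |H| drops below 1/sqrt(2).
half = H[: n // 2]                 # positive frequencies only
cutoff_idx = int(np.argmax(half < 1 / np.sqrt(2)))
cutoff = float(freqs[cutoff_idx])

print(f"-3 dB cut-off: {cutoff:.4f} cycles/sample")
print(f"Nyquist after 2x downsampling: 0.25 cycles/sample")
```

For $\sigma = 2$ the measured cut-off sits well below 0.25 cycles/sample, which is the Nyquist frequency of the grid after discarding every second sample, so the downsampling step does not alias significant energy.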

There is also an intuitive explanation in the original article, which says that blurring with a Gaussian kernel of parameter $ \sigma $ removes any detail whose size is smaller than $ \sqrt{ \sigma } $.

Royi