
This might seem like an easy question, and no doubt it is, but I have been trying to calculate the variance of white Gaussian noise without success.

The power spectral density (PSD) of additive white Gaussian noise (AWGN) is $\frac{N_0}{2}$, while the autocorrelation is $\frac{N_0}{2}\delta(\tau)$. Does that mean the variance is infinite?

Royi
Mazzy
  • Isn't the noise power the variance of the noise voltage? One could also ask about the variance (or standard deviation) of the power measured over a specific time interval. I think the central limit theorem would describe the relationship between the measurement duration and the variance of the results. –  Sep 09 '15 at 14:22

3 Answers


White Gaussian noise in the continuous-time case is not what is called a second-order process (meaning $E[X^2(t)]$ is finite) and so, yes, the variance is infinite. Fortunately, we can never observe a white noise process (whether Gaussian or not) in nature; it is only observable through some kind of device, e.g. a (BIBO-stable) linear filter with transfer function $H(f)$ in which case what you get is a stationary Gaussian process with power spectral density $\frac{N_0}{2}|H(f)|^2$ and finite variance $$\sigma^2 = \int_{-\infty}^\infty \frac{N_0}{2}|H(f)|^2\,\mathrm df.$$
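As a sanity check on the formula above, here is a short numerical sketch (with illustrative values for $N_0$, the sampling rate, and the filter bandwidth, none of which come from the answer): continuous-time white noise is approximated by discrete samples of variance $(N_0/2)f_s$, passed through an ideal brick-wall low-pass filter of bandwidth $B$, and the output variance is compared against $\sigma^2 = \int_{-\infty}^\infty \frac{N_0}{2}|H(f)|^2\,\mathrm df = N_0 B$.

```python
import numpy as np

# Assumed, illustrative parameters (not from the answer itself).
rng = np.random.default_rng(0)
N0 = 2.0      # so the two-sided PSD N0/2 equals 1
fs = 1000.0   # sampling rate of the discrete surrogate (Hz)
B = 100.0     # brick-wall filter bandwidth (Hz)
n = 2**20     # number of samples

# Discrete surrogate for white noise: sample variance (N0/2)*fs makes the
# PSD flat at N0/2 across [-fs/2, fs/2].
x = rng.normal(0.0, np.sqrt(N0 / 2 * fs), n)

# Ideal (brick-wall) low-pass filter |H(f)| = 1 for |f| <= B, applied in
# the frequency domain.
f = np.fft.fftfreq(n, d=1 / fs)
X = np.fft.fft(x)
y = np.fft.ifft(np.where(np.abs(f) <= B, X, 0)).real

sigma2_theory = N0 * B   # integral of (N0/2)*|H(f)|^2 over [-B, B]
sigma2_sim = y.var()
print(sigma2_theory, sigma2_sim)
```

The sample variance of the filtered output should agree with $N_0 B$ to within a few percent for this many samples.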

More than what you probably want to know about white Gaussian noise can be found in the Appendix of this lecture note of mine.

Dilip Sarwate
  • The curious thing about this for me is that the $\sigma^2$ parameter that is used as the "variance" of the Gaussian distribution of $x(t)$ is not the variance of the sequence. As you say, it's because $E[x^2(t)]$ is infinite. Thanks for the clear explanation! – Peter K. Apr 13 '13 at 00:29
  • @PeterK. There is a difference between the notions of white Gaussian noise for discrete time and continuous time. If a discrete-time process is considered as _samples_ from a continuous-time process, then, taking into consideration that the sampler is a device with a finite bandwidth, we get a sequence of independent Gaussian random variables of common variance $\sigma^2$ which is what you have in your answer. If your $Y[n]$ is $$Y[n]=\int_{(n-1)T}^{nT}X(t)\,\mathrm dt$$ where $X(t)$ is the OP's AWGN, then $\sigma_{Y[n]}^2=\frac{N_0}{2}T$, not $\frac{N_0}{2}$ as you have it (except if $T=1$). – Dilip Sarwate Apr 13 '13 at 01:22
  • Understood, Dilip! I've not had this aspect pointed out before (or I did, and it's been forgotten). Good stuff! – Peter K. Apr 13 '13 at 02:18
  • @DilipSarwate I read your interesting appendix. But you say "One should not, however, infer that the random variables in the WGN process are themselves Gaussian random variables". I did not fully understand this. If the random variables aren't Gaussian (and this seems reasonable to me since they have infinite variance), why is the process named Gaussian? – Surfer on the fall Jul 04 '17 at 07:04
  • @Surferonthefall Try writing down the _probability density function_ $f_{X(t)}(x)$ of the alleged Gaussian random variables in the white Gaussian noise process $\{X(t)\colon -\infty < t < \infty\}$. The density function has value $0$ for all $x$. How can $X(t)$ be viewed as a Gaussian random variable? As I said repeatedly in the document you read, one should not look too closely at the random variables in a white noise process $\{X(t)\colon -\infty < t < \infty\}$. The process is a _mythical one_ and it is defined by what it produces at the output of a linear filter, not by anything else. – Dilip Sarwate Jul 04 '17 at 14:21
  • @DilipSarwate sorry, I can't understand you. I know that, let's say $X_4$ (the value assumed at $t=4$) is a random variable. Why can't it be Gaussian as WGN seems to suggest? [I originally thought that the process was a collection of independent Gaussian variables, but this clearly conflicts with the variance being infinite.] I just can't get what you mean by saying "The density function has value 0 for all x."! Could you please explain it a bit? I would be very grateful to you! I find this topic really confusing as it is traditionally treated :( – Surfer on the fall Jul 04 '17 at 15:08
  • The probability density function of a zero-mean Gaussian random variable is $$f_X(x) = \frac{1}{\sigma\sqrt{2\pi}}\exp(-x^2/2\sigma^2), -\infty < x < \infty.$$ What is the value of $f_X(1)$ if you "set" $\sigma=\infty$, or more properly, take the limit as $\sigma \to 0$? of $f_X(35.2869)$? of $f_X(x)$ for each and every choice of real number $x$, $-\infty < x < \infty$? – Dilip Sarwate Jul 04 '17 at 15:55
  • @DilipSarwate Ok, I got it. So you are saying that they are sort of degenerate gaussian random variable.. However this problem doesn't exist if we consider a white noise in a weak sense, so that S(f) is a large rect, since variance would then be finite and random variables just traditional Gaussian.. am I right? – Surfer on the fall Jul 04 '17 at 21:05
  • Sorry, that should have read ".... take the limit as $\sigma \to \infty$" not as $\sigma \to 0$. – Dilip Sarwate Jul 04 '17 at 21:27
  • @DilipSarwate sure, I got it. Could you tell me if the reasoning in my last message is correct? Thanks again! – Surfer on the fall Jul 05 '17 at 06:04
  • I've just read this and I'm really confused now. AFAIK, WGN means that each $X(t_0)$ is a Gaussian random variable with *finite* variance, and independent from every other $X(t) \ \forall t\neq t_0$. Is this intuition wrong? – Tendero Mar 02 '18 at 22:12
  • @Tendero For _continuous_ time, White Gaussian Noise is _not_ what you say it is: for discrete time it is. See my comment in response to Peter K's comment above regarding discrete-time white Gaussian noise, which is indeed "$X[n] \sim N(0,\sigma^2)$ for all $n$; $X[n]$ and $X[m]$ independent for $n\neq m$", but here $m$ and $n$ are restricted to be integers. The process that you state **for continuous time** _is_ also called white Gaussian noise by some mathematicians, but it has vastly different properties. See [this question](https://math.stackexchange.com/q/134193/15941) over on math.SE – Dilip Sarwate Mar 02 '18 at 22:39

Suppose we have a discrete-time sequence $x[t]$ which is stationary, zero-mean white noise with variance $\sigma^2$. Then the autocorrelation of $x$ is: $$R_{xx}[\tau] = E\left[x[t]\,x[t+\tau]\right] = \begin{cases} E\left[x[t]^2\right], & \text{if } \tau = 0 \\ 0, & \text{otherwise} \end{cases} = \sigma^2\,\delta[\tau]$$ where $\delta[\tau]$ is the Kronecker delta.

So, matching this with the stated autocorrelation $\frac{N_0}{2}\delta[\tau]$ implies that $\sigma^2 = \frac{N_0}{2}$.
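A quick numerical check of the autocorrelation above (with an illustrative $\sigma^2$ and sequence length, assumed for the example): the sample autocorrelation of a white Gaussian sequence should be close to $\sigma^2$ at lag $0$ and near $0$ at any other lag.

```python
import numpy as np

# Assumed, illustrative parameters.
rng = np.random.default_rng(2)
sigma2 = 3.0
n = 200000
x = rng.normal(0.0, np.sqrt(sigma2), n)

def autocorr(x, lag):
    """Biased sample estimate of E[x[t] x[t+lag]]."""
    if lag == 0:
        return np.mean(x * x)
    return np.mean(x[:-lag] * x[lag:])

r0 = autocorr(x, 0)   # should be close to sigma2
r5 = autocorr(x, 5)   # should be close to 0
print(r0, r5)
```

For a sequence this long, the lag-0 estimate sits within a fraction of a percent of $\sigma^2$, while nonzero lags hover around zero with only sampling noise.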

Peter K.

Yes, it is: unless you take into account that infinite power is hard to come by in these post-big-bang times. In practice, every white noise process ends up in a physical implementation that has some capacitance and thus a limit on the effective bandwidth. Consider the (reasonable) arguments leading to Johnson noise in a resistor: taken literally, they would predict infinite energy, except that there are always bandwidth limits in any implementation. A similar situation applies at the opposite end of the spectrum: $1/f$ noise. Yes, some processes fit $1/f$ noise very well over long times; I have measured them. But in the end you are constrained by physical laws.

rrogers