6

Entropy is defined as $H = - \int_\chi p(x) \log p(x)\, dx$

The Cauchy distribution is defined as

$$f(x) = \frac{\gamma}{\pi} \, \frac{1}{\gamma^2 + x^2}$$

Could you kindly show the steps to calculate the entropy of a Cauchy distribution, which is

$$\log(4 \pi \gamma)$$

Reference: Cauchy distribution

garej
  • 227
  • 4
  • 16
RF_LSE
  • 63
  • 5
  • First you will need to use the [correct definition of entropy!](http://en.wikipedia.org/wiki/Differential_entropy#Definition) – whuber May 25 '14 at 17:23
  • Thank you very much for your comment. Sorry about the typo. Could you also kindly help with my question by any chance? Thanks again! – RF_LSE May 25 '14 at 18:20
  • What have you tried? Please, show us where you are stuck in the derivation...THEN we can help ;-). – Néstor May 25 '14 at 18:26
  • 1
    This formula for the entropy will be correct only provided $\tau = 4^{\pi-1}$. – whuber May 25 '14 at 19:36

2 Answers

9

As shown at How does entropy depend on location and scale?, the integral is easily reduced (via an appropriate change of variable) to the case $\gamma=1$, for which

$$H = \int_{-\infty}^{\infty} \frac{\log(1+x^2)}{1+x^2}\,dx.$$

Letting $x=\tan(\theta)$ implies $dx = \sec^2(\theta)d\theta$ whence, since $1+\tan^2(\theta) = 1/\cos^2(\theta)$,

$$H = -2\int_{-\pi/2}^{\pi/2} \log(\cos(\theta))d\theta = -4\int_{0}^{\pi/2} \log(\cos(\theta))d\theta .$$

There is an elementary way to compute this integral. Write $I= \int_{0}^{\pi/2} \log(\cos(\theta))d\theta$. Because $\cos$ on this interval $[0, \pi/2]$ is just the reflection of $\sin$, it is also the case that $I= \int_{0}^{\pi/2} \log(\sin(\theta))d\theta.$ Add the integrands:

$$\log\cos(\theta) + \log\sin(\theta) = \log(\cos(\theta)\sin(\theta)) = \log(\sin(2\theta)/2) = \log\sin(2\theta) - \log(2).$$

Therefore

$$2I = \int_0^{\pi/2} \left(\log\sin(2\theta) - \log(2)\right)d\theta =-\frac{\pi}{2} \log(2) + \int_0^{\pi/2} \log\sin(2\theta) d\theta.$$

Changing variables to $t=2\theta$ in the integral shows that

$$\int_0^{\pi/2} \log\sin(2\theta) d\theta = \frac{1}{2}\int_0^{\pi} \log\sin(t) dt = \frac{1}{2}\left(\int_0^{\pi/2} + \int_{\pi/2}^\pi\right)\log\sin(t)dt \\= \frac{1}{2}(I+I) = I$$

because $\sin$ on the interval $[\pi/2,\pi]$ merely retraces the values it attained on the interval $[0,\pi/2]$. Consequently $2I = -\frac{\pi}{2} \log(2) + I,$ giving the solution $I = -\frac{\pi}{2} \log(2)$. We conclude that

$$H = -4I = 2\pi\log(2).$$
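As a quick numerical sanity check of both the intermediate integral $I$ and the final value, using `scipy.integrate.quad`:

```python
import numpy as np
from scipy.integrate import quad

# I = ∫_0^{π/2} log(cos θ) dθ, shown above to equal -(π/2) log 2.
I, _ = quad(lambda t: np.log(np.cos(t)), 0, np.pi / 2)

# H = ∫_{-∞}^{∞} log(1 + x²)/(1 + x²) dx, shown above to equal 2π log 2.
H, _ = quad(lambda x: np.log(1 + x**2) / (1 + x**2), -np.inf, np.inf)

print(I, -(np.pi / 2) * np.log(2))  # both ≈ -1.0888
print(H, 2 * np.pi * np.log(2))     # both ≈ 4.3552
```

(QUADPACK's Gauss–Kronrod rule never evaluates the endpoints, so the integrable logarithmic singularity of $\log\cos\theta$ at $\theta = \pi/2$ causes no trouble.)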


An alternative approach factors $1+x^2 = (1 + ix)(1-ix)$ to re-express the integrand as

$$\frac{\log(1+x^2)}{1+x^2} = \frac{1}{2}\left(\frac{i}{x+i} - \frac{i}{x-i}\right)\log(1+ix) + \frac{1}{2}\left(\frac{i}{x+i} - \frac{i}{x-i}\right)\log(1-ix)$$

The integral of the first term on the right can be expressed as the limiting value as $R\to\infty$ of a contour integral from $-R$ to $+R$ followed by tracing the lower semi-circle of radius $R$ back to $-R.$ For $R\gt 1$ the interior of the region bounded by this path clearly has a single pole only at $x=-i$ where the residue is

$$\operatorname{Res}_{x=-i}\left(\left(\frac{i}{x+i} - \frac{i}{x-i}\right)\log(1+ix)\right) = i\left.\log(1 + ix)\right|_{x=-i} = i\log(2),$$

whence (because this is a negatively oriented path) the Residue Theorem says

$$\oint \left(\frac{1}{1+ix} + \frac{1}{1-ix}\right)\log(1+ix) \mathrm{d}x = -2\pi i (i\log(2)) = 2\pi\log(2).$$

Because the integrand on the circle is $O(\log(R)/R^2)$ while the circle has length $\pi R$, the circle's contribution to the integral is $O(\log(R)/R)$, which grows vanishingly small as $R\to\infty$; in the limit we obtain

$$\int_{-\infty}^\infty \frac{1}{2}\left(\frac{1}{1+ix} + \frac{1}{1-ix}\right)\log(1+ix) \mathrm{d}x = \pi\log(2).$$

The second term of the integrand is equal to the first (use the substitution $x\to -x$), whence $H=2(\pi\log(2)) = 2\pi\log(2),$ just as before.
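This first-term value can also be checked numerically. Since $\log(1+ix) = \tfrac12\log(1+x^2) + i\arctan(x)$, the real part of the integrand is $\log(1+x^2)/(2(1+x^2))$ and the imaginary part is odd, so only the real part survives:

```python
import numpy as np
from scipy.integrate import quad

# (1/2)(1/(1+ix) + 1/(1-ix)) = 1/(1+x²) is real; multiply by log(1+ix).
f = lambda x: (0.5 * (1 / (1 + 1j * x) + 1 / (1 - 1j * x))
               * np.log(1 + 1j * x))

re, _ = quad(lambda x: f(x).real, -np.inf, np.inf)
im, _ = quad(lambda x: f(x).imag, -np.inf, np.inf)

print(re, np.pi * np.log(2))  # both ≈ 2.1776
print(im)                     # ≈ 0 (odd integrand over symmetric range)
```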

whuber
  • 281,159
  • 54
  • 637
  • 1,101
  • your calculations are just fine, but can you please explain why you ignore $\pi$ (the scale parameter) in the definition of the Cauchy PDF? All books give the result of the differential entropy as $\log(4 \pi)$. – garej Jun 30 '19 at 04:10
  • please, see the [link](http://www.de.ufpe.br/~hmo/B25.pdf) as an example (p. 67). – garej Jun 30 '19 at 04:11
  • 1
    @garej Please see the first line of the answer. – whuber Jun 30 '19 at 16:49
  • 1
    @garej Because this issue is more general and applies to any continuous distribution, not just the Cauchy, I have posted an explanation and edited my answer to link to the explanation. – whuber Jun 30 '19 at 18:57
2

This is not a full-scale answer, just a modest extension of @whuber's answer.

If we take $\gamma = 1$, the pdf of the Cauchy distribution boils down to the following:

$$ p(x) = \frac {1} {\pi (1 + x^2)},$$

where $\pi$ is just a scaling factor (see the picture).

[Figure: plot of the Cauchy density $p(x) = 1/(\pi(1+x^2))$.]

Since this is a valid pdf, its differential entropy $H = -\int p \log p \, dx = \int p \log(1/p)\, dx$ is as follows:

$$H = \int_{-\infty}^{\infty} \frac{\log(\pi (1+x^2))}{\pi(1+x^2)}\,dx $$

We can expand the logarithm to split the integral:

$$ H = \frac{\log(\pi)}{\pi} \int_{-\infty}^{\infty} \frac{1}{1+x^2}\,dx + \frac{1}{\pi} \int_{-\infty}^{\infty} \frac{\log(1+x^2)}{1+x^2}\,dx = \frac{\log(\pi)}{\pi} H_1 + \frac{1}{\pi} H_2 $$

Step 1: $H_1 = \pi$, because the antiderivative of the integrand is $\arctan(x)$, which increases from $-\pi/2$ to $\pi/2$ over the real line.

Step 2: The integral $H_2$ is elaborated in detail in the accepted answer; moving the factor of $2$ inside the $\log$ gives

$$H_2 = 2\pi\log(2) = \pi \log(4).$$

Step 3: Now we may combine everything to get:

$$ H = \frac{\log(\pi)}{\pi} H_1 + \frac{1} {\pi} H_2 = \frac{\log(\pi)}{\pi} \pi + \frac{1} {\pi} \pi \log(4) = \log(\pi) + \log(4). $$

Conclusion: Now we get the expected result.

$$ H(\gamma = 1) = \log(4\pi).$$

garej
  • 227
  • 4
  • 16