Let $X$ be a random variable. If $Y=aX+b$, where $a,b \in \mathbb{R}$, is the entropy of $Y$ the same as the entropy of $X$?
The answer for continuous variables is at https://stats.stackexchange.com/questions/415435. – whuber Mar 28 '21 at 17:54
1 Answer
This depends on whether $X$ is discrete or continuous.
If $X$ is discrete
Let $R_X = \{x_1, x_2, \ldots, x_N\}$ be the range of $X$, and let $P(X = x_i)$ be the probability mass function of $X$. The entropy of $X$ is
$$
H(X) = -\sum_{i=1}^N P(X=x_i) \log P(X=x_i)
$$
Now let $Y = aX + b$, where $a, b \in \mathbb{R}$ and $a \neq 0$, so that the map $x \mapsto ax + b$ is one-to-one. The range of $Y$ is
$$
R_Y = \{y_1, y_2, \ldots, y_N\} = \{ax_1+b, ax_2+b, \ldots, ax_N+b\}
$$
and the probability mass function of $Y$ is
$$
\begin{align}
P(Y = y_i) &= P(aX+b = y_i) \\
&= P\left(X = \frac{y_i-b}{a}\right), \quad i = 1, 2, \ldots, N
\end{align}
$$
So the entropy of $Y$ is
$$
\begin{align}
H(Y) &= -\sum_{i=1}^N P(Y=y_i) \log P(Y=y_i) \\
&= -\sum_{i=1}^N P\left(X = \frac{y_i-b}{a}\right) \log P\left(X = \frac{y_i-b}{a}\right) \\
&= -\sum_{i=1}^N P(X=x_i) \log P(X=x_i) \\
&= H(X)
\end{align}
$$
So when $X$ is discrete and $a \neq 0$, the entropy of $Y$ is the same as the entropy of $X$: the affine map only relabels the support points and leaves the probabilities untouched. (If $a = 0$, then $Y = b$ is constant and $H(Y) = 0$.)
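As a quick sanity check, here is a minimal Python sketch; the three-point pmf and the values of $a$ and $b$ are arbitrary choices for illustration:

```python
import numpy as np

def entropy(probs):
    """Shannon entropy (in nats) of an array of probabilities."""
    p = np.asarray(probs, dtype=float)
    p = p[p > 0]                      # convention: 0 * log(0) = 0
    return -np.sum(p * np.log(p))

# An arbitrary pmf for X on {1, 2, 3}
pmf_x = {1.0: 0.2, 2.0: 0.5, 3.0: 0.3}

# Y = aX + b (with a != 0) relabels each support point;
# the probability mass each point carries is unchanged.
a, b = 3.0, -7.0
pmf_y = {a * x + b: px for x, px in pmf_x.items()}

print("H(X) =", entropy(list(pmf_x.values())))  # ~1.0297 nats
print("H(Y) =", entropy(list(pmf_y.values())))  # same probabilities, same entropy
```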
If $X$ is continuous
Let $X$ be a continuous random variable with probability density function $p_X(x)$ and support $\mathcal{X}$, and let $h(X)$ denote its differential entropy:
$$
h(X) = -\int_{x\in\mathcal{X}} p_X(x) \log p_X(x) \, dx
$$
Now let $Y = aX + b$ with $a \neq 0$, so that $x = \frac{y-b}{a}$. By the change-of-variables formula, the probability density function of $Y$ is
$$
\begin{align}
p_Y(y) &= p_X\left(\frac{y-b}{a}\right) \cdot \left|\frac{dx}{dy}\right| \\
&= \frac{1}{|a|} p_X\left(\frac{y-b}{a}\right)
\end{align}
$$
The differential entropy of $Y$ is therefore
$$
\begin{align}
h(Y) &= -\int_{y\in\mathcal{Y}} p_Y(y) \log p_Y(y) \, dy \\
&= -\int_{y\in\mathcal{Y}} \frac{1}{|a|} p_X\left(\frac{y-b}{a}\right) \log\left[\frac{1}{|a|} p_X\left(\frac{y-b}{a}\right)\right] dy
\end{align}
$$
where $\mathcal{Y} = \{ax + b : x \in \mathcal{X}\}$ is the support of $Y$. Substituting $y = ax + b$ (so that $dy = a \, dx$) yields
$$
h(Y) = h(X) + \log{|a|}
$$
So when $X$ is continuous, the differential entropy of $Y$ is generally not the same as that of $X$: it is shifted by $\log|a|$, and the two coincide only when $|a| = 1$.
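This can also be checked numerically. For a Gaussian $X \sim N(\mu, \sigma^2)$ the differential entropy has the closed form $h(X) = \frac{1}{2}\log(2\pi e \sigma^2)$, and $Y = aX + b \sim N(a\mu + b, a^2\sigma^2)$. The sketch below (with arbitrary $\sigma$, $a$, $b$) confirms that $h(Y) - h(X) = \log|a|$:

```python
import numpy as np

def gaussian_diff_entropy(sigma):
    """Differential entropy (in nats) of a Gaussian: 0.5 * ln(2*pi*e*sigma^2)."""
    return 0.5 * np.log(2 * np.pi * np.e * sigma**2)

# X ~ N(0, sigma^2);  Y = aX + b ~ N(b, (a*sigma)^2)
sigma = 1.5
a, b = 3.0, -7.0

h_x = gaussian_diff_entropy(sigma)
h_y = gaussian_diff_entropy(abs(a) * sigma)

print(h_y - h_x)       # 1.0986... nats
print(np.log(abs(a)))  # ln 3 = 1.0986..., i.e. h(Y) = h(X) + log|a|
```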
