Question: Is the concept of entropy rate defined for real-valued random processes? If so, does it have the same interpretation as in the discrete case; and how can it be estimated?
Background: As far as I know, the entropy rate of a random process is defined as:
$$ h = \lim_{n \rightarrow \infty} \frac{1}{n}H(X_1, \ldots, X_n) \,.$$
In conventional, finite-alphabet information theory the entropy rate of a process quantifies how many bits the process generates per symbol (time step), on average. That is different from the marginal entropy $H(X_i)$ of the individual variables.
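To make that discrete interpretation concrete, here is a minimal sketch of my own (not from any reference; it assumes a symmetric two-state Markov chain with flip probability $p$, whose entropy rate is the binary entropy $H_b(p)$), showing the block estimate $H(X_1,\ldots,X_n)/n$ approaching that value as $n$ grows:

```python
# Illustration only: plug-in block-entropy estimate of the entropy rate
# of a symmetric binary Markov chain with flip probability p.
import numpy as np
from collections import Counter

rng = np.random.default_rng(0)
p = 0.1          # probability of flipping state at each step
T = 200_000      # length of the simulated sample path

# Simulate the chain: state flips with probability p at every step.
flips = rng.random(T) < p
x = np.mod(np.cumsum(flips), 2).astype(np.uint8)

def block_entropy(seq, n):
    """Plug-in estimate of H(X_1, ..., X_n) in bits from overlapping length-n blocks."""
    blocks = [tuple(seq[i:i + n]) for i in range(len(seq) - n + 1)]
    counts = np.array(list(Counter(blocks).values()), dtype=float)
    probs = counts / counts.sum()
    return -np.sum(probs * np.log2(probs))

# Analytic entropy rate: the binary entropy of the flip probability.
h_true = -(p * np.log2(p) + (1 - p) * np.log2(1 - p))
for n in (1, 2, 4, 8):
    print(n, round(block_entropy(x, n) / n, 3), "->", round(h_true, 3))
```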
However, if the $X_i$ are continuous variables, the corresponding quantity is the differential entropy, which doesn't have the same interpretation as in the discrete case: it can be negative, and it is not invariant under a change of variables.
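For instance (a small numeric check of my own, assuming Gaussian variables): the differential entropy $h(X) = \tfrac{1}{2}\log_2(2\pi e \sigma^2)$ of $X \sim \mathcal{N}(0,\sigma^2)$ is negative for small $\sigma$, and rescaling to $aX$ shifts it by $\log_2|a|$:

```python
# Illustration only: differential entropy of a Gaussian can be negative
# and changes under an invertible change of variables, h(aX) = h(X) + log2|a|.
import numpy as np

def gaussian_diff_entropy_bits(sigma):
    """Closed-form differential entropy of N(0, sigma^2) in bits."""
    return 0.5 * np.log2(2 * np.pi * np.e * sigma**2)

for sigma in (1.0, 0.1):
    h_x = gaussian_diff_entropy_bits(sigma)
    h_2x = gaussian_diff_entropy_bits(2 * sigma)   # Y = 2X is N(0, (2*sigma)^2)
    print(f"sigma={sigma}: h(X)={h_x:.3f} bits, h(2X)={h_2x:.3f} bits")
# sigma=0.1 gives a negative h(X); in both cases h(2X) - h(X) = 1 bit.
```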