
In section 6.1 of the notes Stat 3701 Lecture Notes: Bayesian Inference via Markov Chain Monte Carlo (MCMC) by Charles J. Geyer, the author states

Suppose we have a probability or expectation we want to estimate. Probability is a special case of expectation: if $g$ is a zero-or-one valued function, then $$ E\{ g(X) \} = \Pr\{ g(X) = 1 \} $$ and any probability can be written this way. So we just consider expectations.

I would assume that $g$ in this context is the function $$ g(X) = \begin{cases} 1 & \ \text{if} \ \ X \in A \\ 0 & \ \text{if} \ \ X \notin A \end{cases} $$ such that $$ \Pr\{ g(X) = 1 \} = \Pr\{ X \in A \} $$ where $A$ is some subset of the range (image) of $X$. Is this true for any random variable $X$?

mhdadk

2 Answers


Yes, you are correct. You can use an indicator function to define a new random variable, say $Y = I_{\{X \in A\}}$, and then $Y$ follows a Bernoulli distribution with "probability of success" $p = \Pr(Y = 1) = \Pr(X \in A)$. For a Bernoulli distribution, the expected value is equal to the probability of success: $E[Y] = 1 \times p + 0 \times (1-p) = p$.
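As a quick numerical illustration (not part of the original answer; assuming, for concreteness, $X \sim N(0,1)$ and $A = (0,1)$), the sample mean of the indicator $Y = I_{\{X \in A\}}$ estimates $\Pr(X \in A)$:

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(42)

# Draw a large sample from X ~ N(0, 1) and take A = (0, 1).
x = rng.standard_normal(100_000)
y = ((x > 0) & (x < 1)).astype(float)  # Y = I_{X in A}, a Bernoulli(p) variable

# Sample mean of Y (Monte Carlo estimate of E[Y]) vs. the exact P(X in A).
print(y.mean())                   # approximately 0.341
print(norm.cdf(1) - norm.cdf(0))  # 0.3413...
```

The two printed values agree up to Monte Carlo error, which is precisely the idea in Geyer's notes: any probability can be estimated by averaging the corresponding indicator.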

Tim
Peter Whittle (1927–2021) made this a main theme of his engaging text, which passed through two titles, three publishers, and four editions between 1970 and 2000. _Probability via Expectation_ was the later title. https://link.springer.com/book/10.1007/978-1-4612-0509-8 – Nick Cox Dec 30 '21 at 13:22

The expectation is a Lebesgue integral with respect to a probability measure $\mu$ on a measure space $(\Omega, \mathcal{F}, \mu)$, $$\int f \, d\mu,$$ where $f$ is an $\mathcal{F}$-measurable function; in probability theory, these measurable functions are our random variables $X$.

By construction, the Lebesgue integral of an indicator function is the measure of the set associated with that indicator, $$\int I_A \, d\mu = \mu(A),$$ thus under a probability space $(\Omega, \mathcal{F}, \mathbb{P})$, $$\mathbb{E}[I_A(X)] = \mathbb{P}(X \in A) = \mathbb{P}\left(X^{-1}(A)\right).$$
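To make the last step explicit (spelling out an intermediate identity the answer uses implicitly): $I_A(X(\omega)) = 1$ exactly when $X(\omega) \in A$, i.e. when $\omega \in X^{-1}(A)$, so the composition $I_A \circ X$ is itself an indicator on $\Omega$, namely $I_{X^{-1}(A)}$. Hence $$\mathbb{E}[I_A(X)] = \int_\Omega I_{X^{-1}(A)} \, d\mathbb{P} = \mathbb{P}\left(X^{-1}(A)\right) = \mathbb{P}(X \in A).$$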

BelwarDissengulp