
I have a question regarding the support of an importance sampling distribution relative to the support of the original distribution. I was reading that the support of the importance sampling distribution (say the importance distribution is denoted $q(x)$, and the set $Q$ contains all values of $x$ satisfying $q(x)>0$) has to cover the support of the original distribution (say that distribution is denoted $p(x)$, and let $D$ denote the set of $x$ satisfying $p(x)>0$), which means $$D \subset Q.$$

The goal is to estimate an integral, say

$$\int_{D}f(x)p(x)dx$$

by using this instead:

$$\int_{Q} f(x) \frac{p(x)}{q(x)} q(x) dx $$

The author says that $q(x)$ doesn't have to be positive everywhere, as long as it is positive whenever $f(x)p(x)\neq 0$, i.e. $x \in Q$ whenever $f(x)p(x) \neq 0$.
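To make this concrete for myself, here is a small numerical sketch (my own illustration, not from the notes) where $q$ is zero on part of the support of $p$ but positive wherever $f(x)p(x)\neq 0$; the particular $f$, $p$, and $q$ below are hypothetical choices:

```python
import numpy as np

rng = np.random.default_rng(0)

# Target density p: standard normal.  Integrand f(x) = x^2 for x > 0, else 0,
# so f(x)*p(x) vanishes on x <= 0 even though p is positive there.
def p(x):
    return np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)

def f(x):
    return np.where(x > 0, x**2, 0.0)

# Hypothetical importance density q: half-normal on (0, inf), q(x) = 2*p(x) for x > 0.
# q is zero on x <= 0 (so Q does not cover all of D), yet q > 0 wherever f*p != 0.
def q(x):
    return np.where(x > 0, 2 * np.exp(-x**2 / 2) / np.sqrt(2 * np.pi), 0.0)

n = 100_000
x = np.abs(rng.standard_normal(n))        # draws from the half-normal q
estimate = np.mean(f(x) * p(x) / q(x))    # importance sampling estimator

print(estimate)  # should be close to int f(x) p(x) dx = E_p[X^2 1{X>0}] = 0.5
```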

The notes then go on to say this:

$$E_{q}\left[\frac{f(X)p(X)}{q(X)}\right] = \int_{Q} \frac{f(x)p(x)}{q(x)}\, q(x)\, dx = \int_{D} f(x)p(x)\, dx + \int_{Q \cap D^c} f(x)p(x)\, dx - \int_{D \cap Q^c} f(x)p(x)\, dx$$

My question has to do with the last expression above: I don't really see why it should equal

$\int_{Q} \frac{f(x)p(x)}{q(x)}\, q(x)\, dx$.

It seems to be saying that the set $Q$ is the union of the two sets $D$ and $Q\cap D^c$, and that $D\cap Q^c$ is the intersection of $D$ and $Q\cap D^c$.

My question is: is my understanding of the union and intersection above correct?

Because it seems to be based on the formula

$P(A\cup B) = P(A) + P(B) - P(A \cap B)$

If it is indeed based on this simple "set union" operation, then my question is how the terms

$$\int_{D} f(x)p(x)\, dx + \int_{Q \cap D^c} f(x)p(x)\, dx - \int_{D \cap Q^c} f(x)p(x)\, dx$$

equal $\int_{Q} \frac{f(x)p(x)}{q(x)} q(x) dx$.

P.S. Hi Xian, yes it is here: http://statweb.stanford.edu/~owen/mc/Ch-var-is.pdf

These are just some notes I read from the PDF file.

john_w
  • Yes Xian, the link is there now. Thank you very much again for your clear explanation. – john_w Nov 25 '16 at 16:52
  • Most useful for readers, thank you! Especially since they are lecture notes by Art Owen, a major actor in the field of Monte Carlo methods. – Xi'an Nov 25 '16 at 17:55

1 Answer


This can be easily resolved via the use of (set) indicator functions:

My question is that the author is saying $q(x)$ doesn't have to be entirely bigger than zero as long as it is bigger than zero when $f(x)p(x)≠0$, i.e. $x∈Q$ whenever $f(x)p(x)≠0$.

Indeed, the only constraint for the identity to hold is that the support $Q$ of $q$ contains the support of the function $x \mapsto f(x)p(x)$. If $H$ denotes this support (note that $H\subseteq D$, since $p(x)>0$ whenever $f(x)p(x)\neq 0$), then $$\int_D f(x)p(x)\, dx=\int_D f(x)p(x)\mathbb{I}_{H}(x)\, dx=\int_{D\cap H} f(x)p(x)\, dx=\int_H f(x)p(x)\, dx$$

As for the second part, if $D\subset Q$, \begin{align} \int_{Q} f(x)p(x)\, dx&=\int_{Q} \{\mathbb{I}_D(x)+\mathbb{I}_{D^c}(x)\} f(x)p(x)\, dx\\ &=\int_{Q \cap D} f(x)p(x)\, dx+\int_{Q \cap D^c} f(x)p(x)\, dx \end{align} In general (without assuming $D\subset Q$), since $$\mathbb{I}_{Q\cup D}=\mathbb{I}_Q+\mathbb{I}_{D\cap Q^c}=\mathbb{I}_D+\mathbb{I}_{Q\cap D^c}$$ we have $$\mathbb{I}_Q=\mathbb{I}_D+\mathbb{I}_{Q\cap D^c}-\mathbb{I}_{D\cap Q^c}$$ and hence $$\int_{Q} \frac{f(x)p(x)}{q(x)}\, q(x)\, dx=\int_{Q} f(x)p(x)\, dx=\int_{D} f(x)p(x)\, dx+\int_{Q \cap D^c} f(x)p(x)\, dx-\int_{D \cap Q^c} f(x)p(x)\, dx$$
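As a quick numerical sanity check of this indicator identity (a sketch of my own, using arbitrary example sets $D$ and $Q$, not taken from Owen's notes):

```python
import numpy as np

# Hypothetical example sets on the real line: D = (-1, 2), Q = (0, 3),
# so neither set contains the other and both Q n D^c and D n Q^c are non-empty.
x = np.linspace(-2.0, 4.0, 1001)

in_D = (x > -1) & (x < 2)
in_Q = (x > 0) & (x < 3)

lhs = in_Q.astype(float)
rhs = (in_D.astype(float)
       + (in_Q & ~in_D).astype(float)    # indicator of Q n D^c
       - (in_D & ~in_Q).astype(float))   # indicator of D n Q^c

print(np.allclose(lhs, rhs))  # True: I_Q = I_D + I_{Q n D^c} - I_{D n Q^c} pointwise
```

Integrating this pointwise identity against $f(x)p(x)$ is exactly what produces the three-integral decomposition in the question.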

Xi'an
  • Oh, thanks a lot Xi'an, that is very clear. Sorry, I just have one more question. The author says there is no need to worry about sampling a point where $q(x)=0$ (i.e. where the denominator would be zero), because "we will never see one". I am just wondering why it is true that, when sampling from the distribution $q$, we will never see a sample $x$ such that $q(x)=0$? – john_w Nov 25 '16 at 12:32
  • With probability 1, a realisation of $X$ from the density $q$ belongs to the support of $q$, $\{x:\,q(x)>0\}$. Hence there is zero probability of obtaining a realisation outside this support, and $q(X)>0$ with probability one. – Xi'an Nov 25 '16 at 12:40
  • Hi Xi'an, I have attached the link to the file that contains the materials in the edited question above now. – john_w Nov 25 '16 at 16:49