Suppose I want to work with a truncated normal distribution, but instead of being defined on a single interval $(a,b)$, where $-\infty<a<b<\infty$, it is defined on a union of two disjoint intervals $(a,b)\cup(c,d)$, where $-\infty<a<b<c<d<\infty$.
First of all, would this still satisfy the definition of a truncated normal distribution? The Wikipedia article on the truncated normal distribution only defines it on a single interval $(a,b)$, where $-\infty<a<b<\infty$ and $a<X<b$ (with $X$ normal with mean $\mu$ and variance $\sigma^{2}$). If it isn't a truncated normal distribution, then what is it?
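For reference, the single-interval density from that article is
$$f(x;\mu,\sigma,a,b)=\frac{\frac{1}{\sigma}\,\varphi\!\left(\frac{x-\mu}{\sigma}\right)}{\Phi\!\left(\frac{b-\mu}{\sigma}\right)-\Phi\!\left(\frac{a-\mu}{\sigma}\right)},\qquad a\le x\le b,$$
where $\varphi$ and $\Phi$ are the standard normal pdf and cdf.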
If it is a truncated normal distribution, how would I compute its density? I was thinking of approaching it via the Law of Total Probability, but that would just give me $0.5$ times the truncated normal density on each interval of the union. This doesn't really make sense to me: instead of there being one value of $X$ with maximum probability, the distribution would have two peaks of equal height (unless I am doing it wrong).
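To make the question concrete, here is a sketch in Python/SciPy of the density I would write down by direct analogy with the single-interval case: restrict the normal pdf to the union and divide by the total mass on the union. All parameter values below are placeholders I picked for illustration; they are not given anywhere above.

```python
import numpy as np
from scipy.stats import norm
from scipy.integrate import quad

# Placeholder values -- mu, sigma, and the endpoints are made up
# for illustration.
mu, sigma = 0.0, 1.0
a, b, c, d = -2.0, -0.5, 0.5, 2.0

X = norm(loc=mu, scale=sigma)

# Total mass the untruncated normal puts on (a,b) U (c,d); by analogy
# with the single-interval case, this is the normalizing constant.
Z = (X.cdf(b) - X.cdf(a)) + (X.cdf(d) - X.cdf(c))

def truncated_pdf(x):
    """Normal density restricted to (a,b) U (c,d) and renormalized."""
    x = np.asarray(x, dtype=float)
    in_support = ((x > a) & (x < b)) | ((x > c) & (x < d))
    return np.where(in_support, X.pdf(x) / Z, 0.0)

# Sanity check: the density should integrate to 1 over the union.
total = quad(truncated_pdf, a, b)[0] + quad(truncated_pdf, c, d)[0]
print(total)  # ~1.0
```

Is this direct renormalization the right computation, and if so, how does it square with the $0.5$/$0.5$ mixture I got from the Law of Total Probability?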