In order to obtain simultaneous confidence intervals for linear combinations of the mean, $\mathbf{a}'\boldsymbol{\mu}$, where $\mathbf{a}, \boldsymbol{\mu} \in \mathbb{R}^p$, from a sample of $n$ iid normal random vectors $\mathbf{x}_i \sim N(\boldsymbol{\mu}, \Sigma)$, we use the following procedure: $$t^2=\frac{n(\mathbf{a}'(\overline{\mathbf{x}}-\boldsymbol{\mu}))^2}{\mathbf{a}'\mathbf{S}\mathbf{a}},$$ where $\overline{\mathbf{x}}$ is the sample mean and $\mathbf{S}$ the sample covariance matrix.
$$Pr\left(t^2 \le c\ \ \forall\, \mathbf{a} \in \mathbb{R}^p\right)=Pr\left(\max_{\mathbf{a} \in \mathbb{R}^p}t^2\le c\right)=Pr\left(T^2\le c\right)$$
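For the middle equality I am relying on the extended Cauchy–Schwarz (maximization) lemma, which as far as I understand is the standard argument here: for a positive definite matrix $\mathbf{S}$ and any vector $\mathbf{d} \in \mathbb{R}^p$,
$$\max_{\mathbf{a}\neq\mathbf{0}} \frac{(\mathbf{a}'\mathbf{d})^2}{\mathbf{a}'\mathbf{S}\mathbf{a}} = \mathbf{d}'\mathbf{S}^{-1}\mathbf{d},$$
with the maximum attained at $\mathbf{a} \propto \mathbf{S}^{-1}\mathbf{d}$ (here $\mathbf{d}=\overline{\mathbf{x}}-\boldsymbol{\mu}$).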
Here $T^2 = n(\overline{\mathbf{x}}-\boldsymbol{\mu})'\mathbf{S}^{-1}(\overline{\mathbf{x}}-\boldsymbol{\mu})$ is the Hotelling statistic. Under the null hypothesis $\mathcal{H}_0$ (i.e. with the true mean $\boldsymbol{\mu}$ plugged in), the scaled Hotelling statistic is distributed as $\frac{n-p}{p(n-1)} T^2 \sim F_{p,n-p}$. If we choose $c=\frac{p(n-1)}{n-p}F_{p,n-p}(\alpha)$, where $F_{p,n-p}(\alpha)$ is the upper $\alpha$-quantile of the $F$ distribution with $p$ and $n-p$ degrees of freedom, we obtain $Pr(T^2 \le c)=1-\alpha$. Thus $Pr\left(t^2\le \frac{p(n-1)}{n-p}F_{p,n-p}(\alpha)\ \forall\, \mathbf{a} \in \mathbb{R}^p \right)=1-\alpha$. So all the statements obtained by varying $\mathbf{a}$ over $\mathbb{R}^p$ hold simultaneously with probability $1-\alpha$; that is, we get infinitely many confidence intervals, one for each linear combination $\mathbf{a}'\boldsymbol{\mu}$, and the intersection of the corresponding coverage events has probability $1-\alpha$.

I'm not sure about this last conclusion. My doubt is: may I take only $p$ linearly independent vectors $\mathbf{a} \in \mathbb{R}^p$ and still obtain simultaneous confidence intervals with "global" confidence level $1-\alpha$? If so, how can I prove it? I tried to prove that if the inequalities hold for $p$ linearly independent vectors, then they also hold for all their linear combinations, but I could not manage it.
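To make the procedure (and the setting of my question) concrete, here is a small numerical sketch; the simulated data, the choice of directions, and the helper name `simultaneous_ci` are purely illustrative:

```python
import numpy as np
from scipy import stats

def simultaneous_ci(X, a, alpha=0.05):
    """T^2-based simultaneous confidence interval for a' mu.

    X : (n, p) data matrix, rows assumed iid N(mu, Sigma)
    a : (p,) direction vector
    Returns (lower, upper); over all choices of a, these intervals
    hold jointly with probability 1 - alpha.
    """
    n, p = X.shape
    xbar = X.mean(axis=0)
    S = np.cov(X, rowvar=False)  # sample covariance (divides by n-1)
    # critical value c = p(n-1)/(n-p) * F_{p, n-p}(alpha), upper alpha quantile
    c = p * (n - 1) / (n - p) * stats.f.ppf(1 - alpha, p, n - p)
    # from t^2 <= c:  |a'xbar - a'mu| <= sqrt(c * a'Sa / n)
    half_width = np.sqrt(c * (a @ S @ a) / n)
    center = a @ xbar
    return center - half_width, center + half_width

# toy example: p = 3, n = 50
rng = np.random.default_rng(0)
mu = np.array([1.0, -2.0, 0.5])
Sigma = np.array([[2.0, 0.5, 0.0],
                  [0.5, 1.0, 0.3],
                  [0.0, 0.3, 1.5]])
X = rng.multivariate_normal(mu, Sigma, size=50)

# the p coordinate directions (the case my question is about) plus one combination
directions = [np.eye(3)[i] for i in range(3)] + [np.array([1.0, 1.0, -1.0])]
for a in directions:
    lo, hi = simultaneous_ci(X, a)
    print(f"a = {a},  CI for a'mu: [{lo:.3f}, {hi:.3f}]")
```

The same helper works for any direction $\mathbf{a}$, since the critical value $c$ does not depend on $\mathbf{a}$.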