Let's consider a problem where you want to find $\arg \min_{\theta, \pi} f(\theta, \pi)$. Define the set of minimizing pairs as $S = \{(\theta^*, \pi^*)\} = \arg \min_{\theta, \pi} f(\theta, \pi)$.
This is the "more options" problem. The "fewer options" problem is one where we don't get to choose $\pi$: it is fixed, often at 0 or 1, which can make various sub-expressions of $f$ vanish altogether. This problem can be written as finding $\arg \min_{\theta(\pi)} f(\theta(\pi), \pi)$, where we write $\theta(\pi)$ to make clear that the optimal $\theta$ depends on the fixed value of $\pi$.
Define $\Pi^* = \{\pi^*\}$, the set of $\pi$ for which there exists a $\theta$ such that $f(\theta, \pi)$ achieves its minimum, and similarly define $\Theta^*$. Now, if $\pi \in \Pi^*$, there clearly exist $\theta(\pi) \in \Theta^*$ such that $f(\theta(\pi), \pi) = \min_{\theta, \pi} f(\theta, \pi)$; you can find them by extracting from $S$ all the pairs whose second component is $\pi$. However, there is no $\theta(\pi)$ for which $f(\theta(\pi), \pi) < \min_{\theta, \pi} f(\theta, \pi)$, which can be shown with a short proof by contradiction: such a value would undercut the minimum of $f$ over all $(\theta, \pi)$, contradicting the definition of the minimum. So, in this case, giving ourselves more options by allowing ourselves to choose $\pi$ is neutral: the optimal value is the same for the more-option problem as for the fewer-option problem.
On the other hand, if $\pi \notin \Pi^*$, then there does not exist any $\theta(\pi)$ such that $f(\theta(\pi), \pi) = \min_{\theta, \pi} f(\theta, \pi)$, because the minimum is achieved only for values of $\pi \in \Pi^*$. This implies that $\min_{\theta(\pi)} f(\theta(\pi), \pi) > \min_{\theta, \pi} f(\theta, \pi)$ (the strict "$>$" follows from a short proof by contradiction). In this case, giving ourselves more options by allowing ourselves to optimize over $(\theta, \pi)$ instead of just over $\theta$ has improved the value of the objective function by the difference $\min_{\theta(\pi)} f(\theta(\pi), \pi) - \min_{\theta, \pi} f(\theta, \pi)$.
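The contradiction argument behind the strict inequality can be sketched as follows (using the notation defined above):

```latex
% Proof sketch: strict inequality when \pi \notin \Pi^*.
Fix $\pi \notin \Pi^*$ and suppose, for contradiction, that
\[
  \min_{\theta(\pi)} f(\theta(\pi), \pi) \le \min_{\theta, \pi} f(\theta, \pi).
\]
The fixed-$\pi$ problem optimizes over a subset of the pairs available to the
joint problem, so the reverse inequality
$\min_{\theta(\pi)} f(\theta(\pi), \pi) \ge \min_{\theta, \pi} f(\theta, \pi)$
always holds, forcing equality. But equality means some pair
$(\theta(\pi), \pi)$ achieves the joint minimum, which would put $\pi$ in
$\Pi^*$, contradicting $\pi \notin \Pi^*$. Hence the inequality is strict.
```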
Therefore, having more parameters to optimize over never hurts: it is either neutral or helpful. Whether adding those parameters makes the problem so complex that it becomes impossible to actually solve is quite a different question!
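Both cases can be checked numerically. Here is a minimal sketch with a made-up quadratic objective (the specific function and grid are illustrative assumptions, not from the text): $f(\theta, \pi) = (\theta - \pi)^2 + (\pi - 1)^2$, whose joint minimum is $0$ at $(\theta, \pi) = (1, 1)$, so $\Pi^* = \{1\}$.

```python
# Hypothetical objective for illustration: joint minimum is 0 at (theta, pi) = (1, 1),
# so Pi* = {1}.
def f(theta, pi):
    return (theta - pi) ** 2 + (pi - 1) ** 2

# Coarse grid search; the grid happens to contain the exact minimizers,
# so it suffices for this demonstration.
thetas = [i / 100 - 2 for i in range(401)]  # theta in [-2, 2]
pis = [i / 100 - 2 for i in range(401)]     # pi in [-2, 2]

# "More options": optimize jointly over (theta, pi).
joint_min = min(f(t, p) for t in thetas for p in pis)

# "Fewer options": pi is fixed; optimize over theta only.
min_pi_in_star = min(f(t, 1.0) for t in thetas)   # pi = 1 is in Pi*: same optimum
min_pi_not_star = min(f(t, 0.0) for t in thetas)  # pi = 0 is not in Pi*: strictly worse

print(joint_min)        # 0.0
print(min_pi_in_star)   # 0.0
print(min_pi_not_star)  # 1.0
```

Fixing $\pi$ at a value inside $\Pi^*$ leaves the optimal value unchanged (the neutral case), while fixing it outside $\Pi^*$ strictly worsens it, here by $1.0$, matching the difference $\min_{\theta(\pi)} f(\theta(\pi), \pi) - \min_{\theta, \pi} f(\theta, \pi)$.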