Consider the following mixture of two densities $$ f(x)=\lambda g(x-\mu_1)+(1-\lambda)g(x-\mu_2) $$ with $\lambda\in [0,1]$, $g(\cdot)$ symmetric around zero, $\mu_1<\mu_2$.
Claim: the mixture is symmetric (about some point) if and only if $\lambda\in \{0,1,\frac{1}{2}\}$.
Could you help me show this? I understand that $\lambda\in \{0,1\}$ implies that the mixture is symmetric, because in those cases $f$ reduces to a single shifted copy of the symmetric density $g$. What I'm struggling to understand is why, for $\mu_1<\mu_2$, the weights must be equal to get a symmetric distribution.
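As a numerical sanity check (not a proof), here is a small sketch that uses the fact that any symmetric density must be symmetric about its own mean, $\lambda\mu_1+(1-\lambda)\mu_2$. The choice of $g$ as the standard normal density and the particular values $\mu_1=-1$, $\mu_2=2$ are my assumptions for illustration only:

```python
import numpy as np

def g(x):
    # standard normal density -- an assumed choice of symmetric g
    return np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)

def mixture(x, lam, mu1, mu2):
    # f(x) = lam * g(x - mu1) + (1 - lam) * g(x - mu2)
    return lam * g(x - mu1) + (1 - lam) * g(x - mu2)

def asymmetry(lam, mu1=-1.0, mu2=2.0):
    # A symmetric density is symmetric about its mean,
    # so measure max |f(m + t) - f(m - t)| with m the mixture mean.
    m = lam * mu1 + (1 - lam) * mu2
    t = np.linspace(0, 6, 1001)
    return np.max(np.abs(mixture(m + t, lam, mu1, mu2)
                         - mixture(m - t, lam, mu1, mu2)))

print(asymmetry(0.5))  # essentially zero: symmetric about the midpoint
print(asymmetry(0.3))  # clearly nonzero: no symmetry
```

With $\lambda=\frac12$ the discrepancy is at machine precision, while any unequal weight produces a visibly lopsided density; of course this only illustrates the claim, it does not prove it.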