Let $\mathbf{X}$ be an $N \times p$ matrix whose rows $x_i$ are assumed to be independent with distribution $\mathcal{N}(z_i\mathbf{v},\Sigma)$, where $\mathbf{z}=(z_1,\dots,z_N)^T\in\mathbb{R}^N$ is a fixed vector. I would like to show that the MLE of $\mathbf{v} = (v_1,\dots,v_p)^T$ is
$$\hat{\mathbf{v}}^T=\frac{\mathbf{z}^T\mathbf{X}}{||\mathbf{z}||_2^2}$$
I've tried to work on the log-likelihood function $$L(\cdot)=\sum_{i=1}^N\log\left(\frac{\exp\left(-\frac{1}{2}(x_i-z_i\mathbf{v})^T\Sigma^{-1}(x_i-z_i\mathbf{v})\right)}{\sqrt{(2\pi)^p|\Sigma|}}\right)$$ to no avail.
Edit: my attempt. Here $x_i$ is the $i$th row of $\mathbf{X}$. Even though I use $T$ for transpose, assume $p = 1$ for now. $$L(\cdot)\propto -\frac{1}{2}\sum_{i=1}^N (x_i-z_i\mathbf{v})^T\Sigma^{-1}(x_i-z_i\mathbf{v})$$ $$=-\frac{1}{2}\sum_{i=1}^N \left(x_i^T\Sigma^{-1}x_i - 2(z_i\mathbf{v})^T\Sigma^{-1}x_i + (z_i\mathbf{v})^T\Sigma^{-1}z_i\mathbf{v}\right)$$
Taking the derivative with respect to $\mathbf{v}$, the first-order condition is $$ -\frac{1}{2}\sum_{i=1}^N \left(- 2z_i x_i^T\Sigma^{-1}+ 2 z_i^2\mathbf{v}^T\Sigma^{-1}\right)= \sum_{i=1}^N \left(z_i x_i^T\Sigma^{-1}- z_i^2\mathbf{v}^T\Sigma^{-1}\right)=0.$$ Multiplying both sides on the right by $\Sigma$ (the inverse of the concentration matrix): $$\Rightarrow\sum_{i=1}^N z_i x_i^T- z_i^2\mathbf{v}^T=0$$ $$\Rightarrow \mathbf{v}^T=\frac{\sum_{i=1}^N z_i x_i^T}{\sum_{i=1}^N z_i^2}=\frac{\mathbf{z}^T\mathbf{X}}{||\mathbf{z}||_2^2}$$
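For what it's worth, the closed form can be sanity-checked numerically. Note that the estimator does not depend on $\Sigma$, since $\Sigma^{-1}$ cancels out of the first-order condition. A minimal simulation sketch (all names, dimensions, and parameter values below are my own choices for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
N, p = 200, 3

# Fixed design vector z and an arbitrary "true" v and covariance Sigma
z = rng.normal(size=N)
v_true = np.array([1.0, -2.0, 0.5])
Sigma = np.array([[2.0, 0.3, 0.0],
                  [0.3, 1.0, 0.2],
                  [0.0, 0.2, 1.5]])

# Simulate rows x_i ~ N(z_i * v_true, Sigma) via a Cholesky factor
L = np.linalg.cholesky(Sigma)
X = np.outer(z, v_true) + rng.normal(size=(N, p)) @ L.T

# Closed-form estimator: v_hat^T = z^T X / ||z||^2
v_hat = z @ X / (z @ z)

# Log-likelihood up to an additive constant (the |Sigma| and 2*pi terms
# do not involve v, so they can be dropped)
Sigma_inv = np.linalg.inv(Sigma)
def loglik(v):
    R = X - np.outer(z, v)                       # residuals x_i - z_i * v
    return -0.5 * np.einsum('ij,jk,ik->', R, Sigma_inv, R)

# v_hat should beat any nearby perturbation, since the objective is
# a concave quadratic in v with maximizer v_hat
base = loglik(v_hat)
for _ in range(100):
    assert loglik(v_hat + 1e-3 * rng.normal(size=p)) <= base
```

The perturbation check confirms that the closed-form $\hat{\mathbf{v}}$ is a local (hence, by concavity, global) maximizer of the log-likelihood, for a $\Sigma$ that is not a multiple of the identity.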