A model is given as $Y_i = \mu + u_i$ where $u_i \sim \text{IID}(0,\sigma^2)$, with a sample of $n$ observations.
I have to prove that the OLS estimator $\hat{\mu} = \frac{\sum Y_i}{n}$ is the BLUE of $\mu$.
I derive its variance to be $\frac{\sigma^2}{n}$, but now I need to show that this variance is the smallest among all linear unbiased estimators (LUEs).
I have taken a stab at it with the help of a short solution, but I would like to understand every step of it, and also how one arrives at the solution in the first place.
Let $\hat{\mu}^\star = \sum w_i Y_i$ denote an arbitrary linear estimator. Since $E(\hat{\mu}^\star) = \mu \sum_{i=1}^n w_i$, unbiasedness requires $\sum_{i=1}^n w_i = 1$.
I start by simplifying the variance of $\hat{\mu}^\star$, using the independence of the $u_i$ in the last step:
$V(\hat{\mu}^\star) = V\left(\mu \sum_{i=1}^{n} w_i + \sum_{i=1}^{n} w_i u_i\right) = V\left(\sum_{i=1}^{n} w_i u_i\right) = \sigma^2 \sum_{i=1}^{n} w_i^2$
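As a sanity check (not part of the proof), the variance formula $\sigma^2 \sum w_i^2$ can be verified by simulation; the values of $n$, $\mu$, $\sigma$ and the weights below are arbitrary illustrations, chosen only so that $\sum w_i = 1$:

```python
import numpy as np

rng = np.random.default_rng(0)
n, mu, sigma = 5, 10.0, 2.0                 # arbitrary illustrative values
w = np.array([0.4, 0.3, 0.1, 0.1, 0.1])     # arbitrary weights summing to 1

# Draw many samples of size n and compute the weighted estimator each time
Y = mu + sigma * rng.standard_normal((200_000, n))
est = Y @ w

print(est.mean())                    # ~ mu, since sum(w) == 1 (unbiased)
print(est.var())                     # ~ sigma^2 * sum(w_i^2) = 4 * 0.28 = 1.12
print(sigma**2 * np.sum(w**2))       # 1.12, larger than sigma^2/n = 0.8
```

Note that $1.12 > \sigma^2/n = 0.8$, consistent with equal weights having the smaller variance.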
The solution is to add and subtract $\frac{1}{n}$ inside the square, writing $\sigma^2 \sum (w_i - \frac{1}{n} + \frac{1}{n})^2$, from which it can be derived that the lowest variance is attained at $w_i = \frac{1}{n}$.
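Writing out that step in full (grouping the square as $(w_i - \frac{1}{n}) + \frac{1}{n}$ and using $\sum w_i = 1$, so the cross term vanishes):

$$V(\hat{\mu}^\star) = \sigma^2 \sum_{i=1}^{n}\left[\left(w_i - \tfrac{1}{n}\right) + \tfrac{1}{n}\right]^2 = \sigma^2\left[\sum_{i=1}^{n}\left(w_i - \tfrac{1}{n}\right)^2 + \frac{2}{n}\sum_{i=1}^{n}\left(w_i - \tfrac{1}{n}\right) + \frac{1}{n}\right] = \sigma^2 \sum_{i=1}^{n}\left(w_i - \tfrac{1}{n}\right)^2 + \frac{\sigma^2}{n},$$

since $\sum_{i=1}^n (w_i - \frac{1}{n}) = 1 - 1 = 0$. The remaining sum of squares is nonnegative and zero exactly when every $w_i = \frac{1}{n}$, so $V(\hat{\mu}^\star) \ge \frac{\sigma^2}{n}$.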
The problem I am having is that I do not understand how one comes up with adding and subtracting $\frac{1}{n}$ without already knowing this to be the answer. I feel like I should instead find an expression for the variance, differentiate it, and find the minimum point.
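For what it's worth, the differentiate-and-minimize route does work: minimizing $\sum w_i^2$ subject to $\sum w_i = 1$ with a Lagrange multiplier yields $w_i = \frac{1}{n}$ directly. A minimal symbolic sketch (assuming SymPy is available; $n = 4$ is an arbitrary choice):

```python
import sympy as sp

n = 4                                  # arbitrary illustrative sample size
w = sp.symbols(f'w1:{n + 1}')          # weights w1, ..., w4
lam = sp.symbols('lam')                # Lagrange multiplier

# Minimize sum(w_i^2) (the variance up to a sigma^2 factor)
# subject to the unbiasedness constraint sum(w_i) = 1.
L = sum(wi**2 for wi in w) - lam * (sum(w) - 1)

# First-order conditions: set all partial derivatives to zero
eqs = [sp.diff(L, v) for v in (*w, lam)]
sol = sp.solve(eqs, (*w, lam), dict=True)[0]

print(sol)   # every w_i comes out as 1/n = 1/4
```

The first-order condition $2w_i - \lambda = 0$ forces all weights to be equal, and the constraint then pins them at $\frac{1}{n}$; the add-and-subtract trick is just a way to reach the same conclusion without calculus.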