Let $Y_{i} = \beta_{0} + \beta_{1}x_{i} + \epsilon_{i}$ $(i = 1,2,\ldots,n)$, where $\textbf{E}[\epsilon] = 0$ and $\textbf{Var}[\epsilon] = \sigma^{2}\textbf{I}_{n}$. Find the least squares estimates of $\beta_{0}$ and $\beta_{1}$. Prove they are uncorrelated if and only if $\overline{x} = 0$.
MY ATTEMPT
As far as I understand, $(\hat{\beta}_{0},\hat{\beta}_{1})^{\prime} = (\textbf{X}^{\prime}\textbf{X})^{-1}\textbf{X}^{\prime}\textbf{Y}$, where $\textbf{X}$ is the design matrix whose first column consists of ones and whose second column is $(x_{1},\ldots,x_{n})^{\prime}$. But I do not know how to answer the second question. Can someone help me out? Thanks in advance!
EDIT
The covariance matrix of $(\hat{\beta}_{0},\hat{\beta}_{1})^{\prime}$ is given by \begin{align*} \textbf{Var}((\textbf{X}^{\prime}\textbf{X})^{-1}\textbf{X}^{\prime}\textbf{Y}) = (\textbf{X}^{\prime}\textbf{X})^{-1}\textbf{X}^{\prime}\textbf{Var}(\textbf{Y})\textbf{X}(\textbf{X}^{\prime}\textbf{X})^{-1} = \sigma^{2}(\textbf{X}^{\prime}\textbf{X})^{-1} \end{align*}
since $\textbf{Var}(\textbf{Y}) = \textbf{Var}(\textbf{Y} - \textbf{X}\beta) = \textbf{Var}(\epsilon) = \sigma^{2}\textbf{I}_{n}$. For the simple linear model we have \begin{align*} \textbf{X}^{\prime}\textbf{X} = \left[ {\begin{array}{cc} n & \displaystyle\sum_{k=1}^{n}x_{k}\\ \displaystyle\sum_{k=1}^{n}x_{k} & \displaystyle\sum_{k=1}^{n}x^{2}_{k} \\ \end{array} } \right] \Longrightarrow (\textbf{X}^{\prime}\textbf{X})^{-1} = \frac{1}{\det(\textbf{X}^{\prime}\textbf{X})}\left[ {\begin{array}{cc} \displaystyle\sum_{k=1}^{n}x^{2}_{k} & -\displaystyle\sum_{k=1}^{n}x_{k}\\ -\displaystyle\sum_{k=1}^{n}x_{k} & n \\ \end{array} } \right] \end{align*}
Therefore $\text{Cov}(\hat{\beta}_{0},\hat{\beta}_{1})$ is the off-diagonal entry of $\sigma^{2}(\textbf{X}^{\prime}\textbf{X})^{-1}$, namely $-\sigma^{2}\displaystyle\sum_{k=1}^{n}x_{k}\big/\det(\textbf{X}^{\prime}\textbf{X})$, so (assuming the $x_{k}$ are not all equal, so that $\det(\textbf{X}^{\prime}\textbf{X}) > 0$) it vanishes iff $\displaystyle\sum_{k=1}^{n}x_{k} = 0$, that is, iff $\overline{x} = 0$. Am I on the right track?
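As a quick numerical sanity check of the derivation above (my own addition, not part of the original question), one can compute $\sigma^{2}(\textbf{X}^{\prime}\textbf{X})^{-1}$ for an arbitrary regressor vector and for its centered version, and verify that the off-diagonal entry vanishes exactly when $\overline{x} = 0$. The value of $\sigma^{2}$ below is arbitrary, since it only scales the matrix:

```python
import numpy as np

sigma2 = 2.0  # arbitrary error variance; it only scales the covariance matrix

def beta_cov(x, sigma2=sigma2):
    """Return sigma^2 (X'X)^{-1} for the design matrix X = [1, x]."""
    X = np.column_stack([np.ones_like(x), x])
    return sigma2 * np.linalg.inv(X.T @ X)

x = np.array([1.0, 2.0, 3.0, 4.0])   # x-bar = 2.5, so Cov should be nonzero
centered = x - x.mean()              # x-bar = 0, so Cov should vanish

cov_raw = beta_cov(x)
cov_centered = beta_cov(centered)

print(cov_raw[0, 1])       # nonzero covariance between beta0-hat and beta1-hat
print(cov_centered[0, 1])  # zero (up to floating-point error) after centering
```

For `x = (1,2,3,4)` one has $\textbf{X}^{\prime}\textbf{X} = \begin{bmatrix}4 & 10\\ 10 & 30\end{bmatrix}$ with determinant $20$, so the off-diagonal entry of $\sigma^{2}(\textbf{X}^{\prime}\textbf{X})^{-1}$ is $-2 \cdot 10/20 = -1$, which the first printed value reproduces.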