Assuming $X$ is a full-rank $n \times p$ design matrix and $y$ is an $n$-vector, one can solve for the regression coefficient vector $\hat{\beta}$ either via $\hat{\beta}=(X^T X)^{-1} X^T y$, i.e. the OLS method, or by computing the sample covariance matrices $S_{XX}$ and $S_{Xy}$ and then taking $\hat{\beta} = S_{XX}^{-1} S_{Xy}$.
I would like to prove that the two expressions are equivalent. The step I can manage is expanding the sample covariance terms (writing $\bar{X}$ for the $1 \times p$ row vector of column means of $X$ and $\bar{y}$ for the mean of $y$):
$$\begin{aligned} \hat{\beta} &= S_{XX}^{-1} S_{Xy}\\ &= \left( \frac{1}{n-1}\left( X^TX - n\,\bar{X}^T\bar{X} \right) \right)^{-1} \left( \frac{1}{n-1}\left( X^Ty - n\,\bar{X}^T\bar{y} \right) \right)\\ &=\left( X^TX - n\,\bar{X}^T\bar{X} \right)^{-1} \left( X^Ty - n\,\bar{X}^T\bar{y} \right) \end{aligned}$$
Where I'm stuck is showing that the mean terms $\bar{X}^T\bar{X}$ and $\bar{X}^T\bar{y}$ drop out, so that this reduces to the OLS expression $(X^TX)^{-1}X^Ty$. How can this be done?
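As a numerical sanity check of the claimed equivalence, here is a small NumPy sketch. It assumes the columns of $X$ and the vector $y$ have been centered (mean subtracted), which is exactly the situation where the mean terms vanish; the variable names are my own, not from any particular reference:

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 100, 3
X = rng.normal(size=(n, p))
y = rng.normal(size=n)

# Center the data so the mean terms are zero (the key assumption here).
Xc = X - X.mean(axis=0)
yc = y - y.mean()

# OLS coefficients on the centered data: (Xc' Xc)^{-1} Xc' yc.
beta_ols = np.linalg.solve(Xc.T @ Xc, Xc.T @ yc)

# Sample covariances (np.cov uses the 1/(n-1) convention by default).
S_full = np.cov(Xc, yc, rowvar=False)  # (p+1) x (p+1), y appended as last variable
S_XX = S_full[:p, :p]                  # p x p covariance of the X columns
S_Xy = S_full[:p, p]                   # p-vector of covariances of X columns with y
beta_cov = np.linalg.solve(S_XX, S_Xy)

print(np.allclose(beta_ols, beta_cov))  # True: the 1/(n-1) factors cancel
```

On centered data the two formulas agree because $S_{XX} = \frac{1}{n-1}X_c^T X_c$ and $S_{Xy} = \frac{1}{n-1}X_c^T y_c$, and the common $\frac{1}{n-1}$ cancels on inversion.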