Your notation makes some interesting assumptions. Assuming $Y_i$ and $y_i$ are the same, $a_i,b_i$ should be real numbers representing your data points. But in your notation they are treated like vectors, e.g. you write $a_ia_i'$. If they were vectors, say $m\times 1$, you couldn't also have $a_iy_i$: for $y_i=\beta_1a_i+\beta_2b_i+u_i$ to hold, $y_i$ must have the same dimension as $a_i$, which makes the product $a_iy_i$ invalid. If it is just a typo and you meant something like $\sum a_iy_i'$, which is algebraically valid, then either you are solving a multiple-output system with vector coefficients (i.e. $\beta_1,\beta_2$ are vectors, with a separate pair for each element of the $y_i$ vector, and the systems can be solved separately), or you are constraining the whole system to share the same scalar $\beta_1,\beta_2$, in which case each component of $y_i=[y_{i1},y_{i2},\ldots,y_{im}]$ can be treated as an additional data point, giving $nm$ data points in total. Either way, there is no need to solve something like that here.
But if we assume these are just scalars, so that the problem is the usual multiple regression, then your solution $\beta_1=\frac{\sum a_i y_i}{\sum a_i^2}$ is correct only when the second regressor is absent. In other words, your answers are valid for the individual models $y_i=\beta_1 a_i+u_i$ and $y_i=\beta_2 b_i+u_i$, but they ignore the dependence between $\beta_1$ and $\beta_2$: unless the regressors are orthogonal, i.e. $\sum a_ib_i=0$, each coefficient estimate changes when the other term enters the model.
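If it helps, here is a minimal NumPy sketch of that point (the data-generating numbers are made up purely for illustration): with correlated regressors, fitting each one alone gives noticeably different estimates than fitting both jointly.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: b is deliberately correlated with a
n = 200
a = rng.normal(size=n)
b = 0.7 * a + rng.normal(scale=0.5, size=n)
y = 1.0 * a + 2.0 * b + rng.normal(scale=0.1, size=n)

# Individual fits: only valid for the one-regressor models
beta1_solo = (a @ y) / (a @ a)
beta2_solo = (b @ y) / (b @ b)

# Joint fit: least squares on both regressors at once
X = np.column_stack([a, b])
beta_joint, *_ = np.linalg.lstsq(X, y, rcond=None)

print(beta1_solo, beta2_solo)  # biased away from the true (1, 2)
print(beta_joint)              # close to (1, 2)
```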
The general formula for multiple regression is $\beta=(X^TX)^{-1}X^Ty$, where $X$ is $n\times p$, $y$ is $n\times 1$, and $\beta$ is $p\times 1$ (here $p=2$). Each row of $X$ holds one data point $(a_i,b_i)$, and the rows of $\beta$ correspond to $\beta_1$ and $\beta_2$. The same solution follows from differentiating $\text{SSE}=\sum(y_i-\hat{y}_i)^2$ with respect to $\beta_1,\beta_2$ and setting the derivatives to zero (the resulting normal equations are written out after the setup below). Correctly formulating the problem yields the following setup:
$$X=\begin{bmatrix} a_1 & b_1 \\ \vdots & \vdots \\ a_n & b_n\end{bmatrix},\qquad y=\begin{bmatrix} y_1 \\ \vdots \\ y_n\end{bmatrix},\qquad \beta=\begin{bmatrix}\beta_1 \\ \beta_2\end{bmatrix}$$
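For reference, the normal equations mentioned above come from setting the partial derivatives of the SSE to zero,
$$\frac{\partial \text{SSE}}{\partial \beta_1}=-2\sum a_i(y_i-\beta_1 a_i-\beta_2 b_i)=0,\qquad \frac{\partial \text{SSE}}{\partial \beta_2}=-2\sum b_i(y_i-\beta_1 a_i-\beta_2 b_i)=0,$$
which rearrange to
$$\beta_1\sum a_i^2+\beta_2\sum a_ib_i=\sum a_iy_i,\qquad \beta_1\sum a_ib_i+\beta_2\sum b_i^2=\sum b_iy_i,$$
i.e. exactly the system $X^TX\beta=X^Ty$ written out component-wise.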
Now we just substitute everything into the formula:
$$(X^TX)^{-1}=\left(\begin{bmatrix} a_1 & \ldots & a_n \\ b_1 & \ldots & b_n\end{bmatrix}\begin{bmatrix}a_1 & b_1 \\ \vdots&\vdots \\ a_n & b_n \end{bmatrix}\right)^{-1}=\begin{bmatrix}\sum a_i^2 & \sum a_ib_i \\ \sum a_ib_i & \sum b_i^2\end{bmatrix}^{-1}=\frac{1}{D}\begin{bmatrix}\sum b_i^2 & -\sum a_ib_i \\ -\sum a_ib_i & \sum a_i^2\end{bmatrix}$$
where $D=(\sum a_i^2)(\sum b_i^2)-(\sum a_ib_i)^2$ is the determinant. By the Cauchy–Schwarz inequality $D\ge 0$, with $D=0$ exactly when the $a_i$ and $b_i$ are proportional, so the inverse exists whenever the two regressors are not collinear. We also have
$X^Ty=\begin{bmatrix}\sum a_iy_i \\ \sum b_iy_i\end{bmatrix}$.
Finally, we'll have $$\beta=\frac{1}{D}\begin{bmatrix}(\sum b_i^2)(\sum a_iy_i)-(\sum a_ib_i)(\sum b_iy_i) \\ (\sum a_i^2)(\sum b_iy_i)-(\sum a_ib_i)(\sum a_iy_i) \end{bmatrix}$$
The first row is $\beta_1$ and the second row is $\beta_2$. Note that when $\sum a_ib_i=0$ these reduce to your individual-model formulas, confirming the orthogonality point above.
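As a sanity check, here is a short NumPy sketch (again with made-up data) implementing the closed-form result above and comparing it with NumPy's least-squares solver.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical data, just to verify the closed-form solution
n = 100
a = rng.normal(size=n)
b = rng.normal(size=n)
y = 1.5 * a - 0.5 * b + rng.normal(scale=0.1, size=n)

Saa, Sbb, Sab = a @ a, b @ b, a @ b
Say, Sby = a @ y, b @ y
D = Saa * Sbb - Sab**2  # the determinant; must be nonzero

beta1 = (Sbb * Say - Sab * Sby) / D
beta2 = (Saa * Sby - Sab * Say) / D

# Compare with NumPy's least-squares solver
X = np.column_stack([a, b])
beta_np, *_ = np.linalg.lstsq(X, y, rcond=None)

print(beta1, beta2)  # matches beta_np up to floating-point error
print(beta_np)
```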