Other answers nicely cover how to derive the $\beta$ coefficients. I'm not sure what you mean by $n, \beta$ "put together." But if it means that you'd like to use the coefficients to derive the model's predicted values, that's simply the product $XB$, where $X$ is an $m \times n$ matrix of $m$ observations, each with $n$ independent variables, and $B$ an $n \times p$ matrix of regression coefficients. ($p$ here is the number of dependent variables.)
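For instance, a minimal numpy sketch (the shapes and variable names follow the notation above; the data here is made up):

```python
import numpy as np

rng = np.random.default_rng(0)
m, n, p = 50, 3, 2            # observations, independent vars, dependent vars

X = rng.normal(size=(m, n))   # m x n matrix of observations
B = rng.normal(size=(n, p))   # n x p matrix of regression coefficients

Y_hat = X @ B                 # m x p matrix of predicted values
```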
To your question about the standard error: for a single coefficient $\beta_j$ (with a single dependent variable), the formula is:
$s.e.(\beta_j) = \sqrt{s^2 (X'X)^{-1}_{jj} }$
where $s^2$ is the residual variance: the sum of squared residuals, $\sum_i (y_i -\hat y_i)^2$, divided by $m - n$. To broaden the formula to return a vector of standard errors corresponding to each coefficient:
$s.e.(\beta) = \sqrt{s^2 \, diag\left((X'X)^{-1}\right) }$
where $diag$ returns the diagonal of the matrix, and the square root is applied element-wise.
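As a rough numpy sketch of this single-response case (the variable names are mine; `np.linalg.lstsq` stands in for whatever fitting routine you use):

```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 50, 3
X = rng.normal(size=(m, n))
y = X @ np.array([1.5, -2.0, 0.5]) + rng.normal(size=m)

beta = np.linalg.lstsq(X, y, rcond=None)[0]   # fitted coefficients
resid = y - X @ beta
s2 = resid @ resid / (m - n)                  # residual variance s^2

XtX_inv = np.linalg.inv(X.T @ X)
se = np.sqrt(s2 * np.diag(XtX_inv))           # n-vector of standard errors
se_j = se[0]                                  # s.e. of one coefficient, here j = 0
```

To further broaden that formula to return a matrix of standard errors corresponding to each coefficient in $B$, replace $s^2$ with $S^2$, a row vector containing the $s^2$ for each dependent variable: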
$s.e.(B) = \sqrt{diag\left((X'X)^{-1}\right) S^2 }$
Note that the vector returned by $diag$ is $n \times 1$ and $S^2$ is $1 \times p$, so their product is $n \times p$: standard errors corresponding to the coefficients in $B$. (The square root is again applied element-wise.)
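And a sketch of the multi-response version under the same assumptions, checking the shapes claimed above:

```python
import numpy as np

rng = np.random.default_rng(0)
m, n, p = 50, 3, 2
X = rng.normal(size=(m, n))
Y = X @ rng.normal(size=(n, p)) + rng.normal(size=(m, p))

B_hat = np.linalg.lstsq(X, Y, rcond=None)[0]  # n x p coefficients
resid = Y - X @ B_hat                         # m x p residuals
S2 = (resid ** 2).sum(axis=0) / (m - n)       # residual variance per dependent var

d = np.diag(np.linalg.inv(X.T @ X))           # diag((X'X)^{-1}), length n
SE = np.sqrt(np.outer(d, S2))                 # n x p standard errors, matching B_hat
```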