In Section 3.2.3 of Elements of Statistical Learning (link), there is this statement about multiple regression coefficients on page 54:
> we have shown that the $j^{th}$ multiple regression coefficient is the univariate regression coefficient of $\mathbf{y}$ on $\mathbf{x}_{j \cdot 012 \cdots (j-1)(j+1) \cdots, p}$, the residual after regressing $\mathbf{x}_j$ on $\mathbf{x}_0, \ldots, \mathbf{x}_{j-1}, \mathbf{x}_{j+1}, \ldots, \mathbf{x}_p$
If I go by the definition of "regress $\mathbf{b}$ on $\mathbf{a}$" (introduced on page 53), the statement above would mean that the $j^{th}$ multiple regression coefficient is given by
$$ \hat{\beta}_j = \frac{\langle \mathbf{r}, \mathbf{y} \rangle}{\langle \mathbf{r}, \mathbf{r} \rangle}, $$
where
$$ \mathbf{r} = \mathbf{x}_j - \frac{\langle \mathbf{x}_j, \mathbf{x}_0 \rangle}{\langle \mathbf{x}_0, \mathbf{x}_0 \rangle} \mathbf{x}_0 - \ldots - \frac{\langle \mathbf{x}_j, \mathbf{x}_{j-1} \rangle}{\langle \mathbf{x}_{j-1}, \mathbf{x}_{j-1} \rangle} \mathbf{x}_{j-1} - \frac{\langle \mathbf{x}_j, \mathbf{x}_{j+1} \rangle}{\langle \mathbf{x}_{j+1}, \mathbf{x}_{j+1} \rangle} \mathbf{x}_{j+1} - \ldots - \frac{\langle \mathbf{x}_j, \mathbf{x}_p \rangle}{\langle \mathbf{x}_p, \mathbf{x}_p \rangle} \mathbf{x}_p. $$
Edit: I understand that the expression for $\hat{\beta}_j$ is wrong. However, I would like to understand what I am interpreting wrong in the notation "the residual after regressing $\mathbf{x}_j$ on $\mathbf{x}_0, \ldots, \mathbf{x}_{j-1}, \mathbf{x}_{j+1}, \ldots, \mathbf{x}_p$".
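For concreteness, here is a small numerical sketch (my own example, not from the book; all variable names are mine) contrasting two readings of "residual after regressing $\mathbf{x}_j$ on the other predictors": the residual from a single *multiple* regression of $\mathbf{x}_j$ on all the other columns jointly, versus the residual built by subtracting one-at-a-time univariate projections as in my formula for $\mathbf{r}$ above:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100
X = rng.normal(size=(n, 3))
X[:, 2] += 0.5 * X[:, 0]                 # make the predictors correlated
X1 = np.column_stack([np.ones(n), X])    # prepend x_0 = 1 (the intercept column)
y = X1 @ np.array([1.0, 2.0, -1.0, 0.5]) + rng.normal(size=n)

j = 2                                    # coefficient under inspection
others = [k for k in range(X1.shape[1]) if k != j]

# Full multiple regression coefficients for reference
beta_hat = np.linalg.lstsq(X1, y, rcond=None)[0]

# Reading 1: residual of x_j from a joint multiple regression on all other columns
gamma = np.linalg.lstsq(X1[:, others], X1[:, j], rcond=None)[0]
z = X1[:, j] - X1[:, others] @ gamma
coef_joint = z @ y / (z @ z)             # matches beta_hat[j]

# Reading 2: subtract one-at-a-time univariate projections (my formula for r)
r = X1[:, j].copy()
for k in others:
    r -= (X1[:, j] @ X1[:, k]) / (X1[:, k] @ X1[:, k]) * X1[:, k]
coef_onebyone = r @ y / (r @ r)          # differs when predictors are correlated

print(beta_hat[j], coef_joint, coef_onebyone)
```

Under reading 1 the quoted identity holds; reading 2 agrees with it only when the predictors are mutually orthogonal, which is the gap my formula for $\mathbf{r}$ falls into.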