Consider a linear regression model: $$ y_i = \mathbf x_i \cdot \boldsymbol \beta + \varepsilon _i, \quad i=1,\ldots ,n, $$ where $\varepsilon _i \sim \mathcal L(0, b)$, that is, the Laplace distribution with mean $0$ and scale parameter $b$, and the $\varepsilon_i$ are mutually independent. Consider maximum likelihood estimation of the unknown parameter $\boldsymbol \beta$. The negative log-likelihood is $$ -\log p(\mathbf y \mid \mathbf X, \boldsymbol \beta, b) = n\log (2b) + \frac 1b\sum _{i=1}^n |\mathbf x_i \cdot \boldsymbol \beta - y_i|, $$ from which $$ \hat{\boldsymbol \beta}_{\mathrm {ML}} = \arg\min _{\boldsymbol \beta \in \mathbb R^m} \sum _{i=1}^n |\mathbf x_i \cdot \boldsymbol \beta - y_i|. $$
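For concreteness, this estimator can be computed numerically via the standard linear-programming reformulation of least absolute deviations (introduce $t_i \ge |\mathbf x_i \cdot \boldsymbol \beta - y_i|$ and minimize $\sum_i t_i$). Below is a minimal sketch using `scipy.optimize.linprog`; the helper name `lad_fit` is mine, just for illustration:

```python
import numpy as np
from scipy.optimize import linprog

def lad_fit(X, y):
    """LAD fit: argmin_beta sum_i |x_i . beta - y_i|, posed as a linear program
    with auxiliary variables t_i >= |x_i . beta - y_i|."""
    n, m = X.shape
    # Decision vector z = [beta (m entries), t (n entries)]; minimize sum(t).
    c = np.concatenate([np.zeros(m), np.ones(n)])
    # X beta - t <= y  and  -X beta - t <= -y  together encode t >= |X beta - y|.
    A_ub = np.block([[X, -np.eye(n)], [-X, -np.eye(n)]])
    b_ub = np.concatenate([y, -y])
    bounds = [(None, None)] * m + [(0, None)] * n  # beta free, t nonnegative
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    return res.x[:m]
```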
How can one find the distribution of the residuals $\mathbf y - \mathbf X\hat{\boldsymbol \beta}_{\mathrm {ML}}$ in this model?
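In case it helps, here is how I inspected the residual distribution empirically. This is only a Monte Carlo sketch, assuming statsmodels' `QuantReg` at $q = 0.5$ (median regression coincides with the LAD/ML fit above); the sample sizes, the true $\boldsymbol\beta$, and the seed are arbitrary:

```python
import numpy as np
import statsmodels.api as sm
from scipy.stats import laplace

rng = np.random.default_rng(0)
n, m, b = 100, 3, 1.0
beta_true = rng.normal(size=m)

# Pool residuals over many Monte Carlo replications of the model.
resids = []
for _ in range(200):
    X = rng.normal(size=(n, m))
    y = X @ beta_true + rng.laplace(loc=0.0, scale=b, size=n)
    beta_hat = sm.QuantReg(y, X).fit(q=0.5).params  # median regression = LAD
    resids.append(y - X @ beta_hat)
resids = np.concatenate(resids)

# Compare empirical residual quantiles against Laplace(0, b) as a baseline.
qs = np.array([0.05, 0.25, 0.5, 0.75, 0.95])
print("empirical:", np.quantile(resids, qs))
print("Laplace  :", laplace.ppf(qs, loc=0.0, scale=b))
```

The comparison against the $\mathcal L(0, b)$ quantiles shows how far the residual law deviates from the error law, but I am after the exact distribution, not a simulation.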