I understand that we prefer working with squared error over absolute error because it makes the calculus easy. But I was wondering what the parameters of linear regression would be if we minimized the absolute error instead.
To make this a bit more concrete: in the derivation of the parameters for linear regression, we minimize
$$ \sum_{i=1}^{n} (Y_i - \hat{Y_i})^2 $$
to arrive at the parameters $$ \beta_0 = \bar{Y} - \beta_1\bar{x} $$ $$ \beta_1 = \frac{\sum_{i=1}^{n} (x_i - \bar{x})(Y_i - \bar{Y})}{\sum_{i=1}^{n} (x_i - \bar{x})^2} $$
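For concreteness, here is a quick numerical check of those closed-form estimates on some made-up toy data (the data values are just for illustration):

```python
import numpy as np

# Hypothetical toy data, just to illustrate the closed-form OLS estimates.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
Y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

x_bar, Y_bar = x.mean(), Y.mean()

# beta_1 = sum((x_i - x_bar)(Y_i - Y_bar)) / sum((x_i - x_bar)^2)
beta1 = np.sum((x - x_bar) * (Y - Y_bar)) / np.sum((x - x_bar) ** 2)

# beta_0 = Y_bar - beta_1 * x_bar
beta0 = Y_bar - beta1 * x_bar
```

These agree with what `np.polyfit(x, Y, 1)` returns for the same data.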
Now, how would the parameters change if I were instead to minimize $$ \sum_{i=1}^{n} |Y_i - \hat{Y_i}| \,? $$
I cannot find a derivation online.