I know that $R^2$ is for the least square regression. Is there an analogous measure of fit to $R^2$ in LAD (Least Absolute Deviations) regression?
Here I am concerned with the "fitting quality".
The definition of "analogue" is not clear.
If you view $R^2$ as a metric for measuring the "goodness of fit" in a regression setting, then the likelihood can be used to evaluate the "goodness of fit" for LDA (Linear Discriminant Analysis).
EDIT: I think the OP was asking about LAD (least absolute deviations regression), not LDA. Here is my answer for LAD.
I would suggest the OP review the loss functions used in the regression setting. For squared-loss (least squares) regression, the loss function is
$$\sum_i (y_i-\hat y_i)^2$$
where $\hat y_i$ is the predicted value for data point $i$ and the sum runs over all data points.
For least absolute deviations (LAD) regression, the loss function is
$$\sum_i |y_i-\hat y_i|$$
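As a concrete illustration, here is a minimal sketch of computing both loss values, assuming NumPy arrays `y` and `y_hat` of observed and predicted values (these names and numbers are illustrative, not from the original question):

```python
import numpy as np

# Hypothetical observed values and model predictions (illustrative only).
y = np.array([2.0, 3.5, 4.1, 5.0, 7.2])
y_hat = np.array([2.2, 3.1, 4.4, 5.3, 6.8])

residuals = y - y_hat

squared_loss = np.sum(residuals ** 2)      # sum_i (y_i - y_hat_i)^2
absolute_loss = np.sum(np.abs(residuals))  # sum_i |y_i - y_hat_i|

print(squared_loss, absolute_loss)
```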
Therefore, to evaluate the "goodness of fit", we can examine the loss value. The problem with using the raw loss value as a metric is that it depends on the number of data points: the more data points we have, the larger, in general, the values of the two formulas above will be.
One way of dealing with this is to divide by the number of data points. And if we want to go one step further, we can "normalize" it to a 0-to-1 scale relative to the total sum of squares, which is how the RSS is turned into $R^2$.
But I think dividing by the number of data points should be good enough for your purpose.
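A minimal sketch of that scaling, assuming the same illustrative `y` and `y_hat` arrays as above: dividing each loss by the number of data points gives a per-point loss (mean squared error and mean absolute error), and the squared loss can additionally be normalized against the total sum of squares to recover $R^2$.

```python
import numpy as np

# Same illustrative data as before.
y = np.array([2.0, 3.5, 4.1, 5.0, 7.2])
y_hat = np.array([2.2, 3.1, 4.4, 5.3, 6.8])

n = len(y)
residuals = y - y_hat

mse = np.sum(residuals ** 2) / n        # per-point squared loss
mae = np.sum(np.abs(residuals)) / n     # per-point absolute loss

# Normalizing the residual sum of squares by the total sum of squares gives R^2.
tss = np.sum((y - y.mean()) ** 2)
r_squared = 1 - np.sum(residuals ** 2) / tss

print(mse, mae, r_squared)
```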