
I have been trying to compare several matrix factorization algorithms, and I've noticed that all the papers and libraries I've seen measure the Root Mean Square Error (RMSE), when intuitively I would have expected a good metric to be the Mean Absolute Error (MAE).

Is there any reason why there appears to be a preference to use RMSE?

SARose
    I'd suggest reading https://medium.com/human-in-a-machine-world/mae-and-rmse-which-metric-is-better-e60ac3bde13d – matt Jun 23 '18 at 14:43
  • Answering with a question: why would you "intuitively" prefer MAE? – Tim Jun 23 '18 at 15:35
  • @Tim because all I care about in my problem is how wrong my predictions are on average, since outliers are both rare and unimportant (again, for my problem) – SARose Jun 23 '18 at 15:46
    Please look at [this page](https://stats.stackexchange.com/q/48267/28500) for a good general introduction to RMSE vs Mean Absolute Error. (Some may use "MAE" to mean "median absolute error.") Then edit your question to focus on any remaining questions you have specific to matrix factorization; otherwise, this question seems like a duplicate of the question I linked. – EdM Jun 23 '18 at 17:37

1 Answer


RMSE places a larger weight on larger errors, since each difference is squared before the mean and square root are taken. If I remember correctly, the squared-error loss is also smooth and differentiable everywhere (the absolute error is not differentiable at zero), which has some implications for analytical optimization.
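To make the weighting concrete, here is a minimal sketch (the prediction and target values are made up purely for illustration) showing how a single large error inflates RMSE much more than MAE:

```python
import numpy as np

# Hypothetical predictions and targets, purely for illustration.
y_true = np.array([3.0, 4.0, 2.5, 5.0, 1.0])
y_pred = np.array([2.8, 4.1, 2.7, 1.0, 1.1])  # one large miss on the 4th item

errors = y_pred - y_true

mae = np.mean(np.abs(errors))         # average magnitude of the errors
rmse = np.sqrt(np.mean(errors ** 2))  # square, average, then take the root

print(f"MAE:  {mae:.3f}")   # the large error counts in proportion to its size
print(f"RMSE: {rmse:.3f}")  # the large error is squared, so RMSE > MAE
```

On these numbers MAE is about 0.92 while RMSE is about 1.79, even though only one of the five predictions is badly off.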

JDT