
I have a prediction model and experimental data. I initially tried to test the accuracy of the model by looking at the difference between the observed and predicted dependent variable for a given value of the independent variable. However, I'm not sure within what range of error I can say that the model is accurate. Is there any range that is commonly used? Or are there any other methods of testing the accuracy of a prediction model?

Donghwi Min

2 Answers


I'm not sure within what range of error I can say that the model is accurate. Is there any range that is commonly used?

What accuracy is "good" will depend on your specific domain; there are no general benchmarks. It is best to compare your algorithm against very simple benchmark methods, as in the sketch below. More at Is there any standard / criteria of good forecast measured by SMAPE and MASE?
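
A minimal sketch of such a benchmark comparison, assuming a generic regression setting; `y_train`, `y_test`, and `y_pred` are hypothetical placeholders for your own data and predictions:

```python
import numpy as np

def rmse(actual, predicted):
    """Root mean squared error."""
    actual = np.asarray(actual, dtype=float)
    predicted = np.asarray(predicted, dtype=float)
    return np.sqrt(np.mean((actual - predicted) ** 2))

# Hypothetical data: replace with your own observations and predictions.
rng = np.random.default_rng(0)
y_train = rng.normal(10, 2, size=100)
y_test = rng.normal(10, 2, size=50)
y_pred = y_test + rng.normal(0, 1, size=50)  # stand-in for your model's predictions

# Benchmark: always predict the training mean.
naive_pred = np.full_like(y_test, y_train.mean())

print("model RMSE:    ", rmse(y_test, y_pred))
print("benchmark RMSE:", rmse(y_test, naive_pred))
# Your model only adds value if it clearly beats such a trivial benchmark.
```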

Or are there any other methods of testing the accuracy of a prediction model?

There are many possible KPIs; see, e.g., here. Note that different KPIs elicit different functionals of the unknown future distribution (Kolassa, 2020), as the sketch below illustrates.
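
A minimal illustration of that point (the lognormal is just one example of an asymmetric distribution): the mean minimizes squared error, while the median minimizes absolute error, so choosing MSE versus MAE implicitly chooses which functional of the distribution you are asking your model for.

```python
import numpy as np

# For a skewed distribution, the point forecast that minimizes squared
# error (the mean) differs from the one that minimizes absolute error
# (the median): the KPI implicitly picks the target functional.
rng = np.random.default_rng(0)
y = rng.lognormal(mean=0.0, sigma=1.0, size=100_000)

for name, point in [("mean", y.mean()), ("median", np.median(y))]:
    mse = np.mean((y - point) ** 2)
    mae = np.mean(np.abs(y - point))
    print(f"{name:>6}: MSE = {mse:.3f}, MAE = {mae:.3f}")
# The mean scores better on MSE, the median scores better on MAE.
```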

Stephan Kolassa

I am assuming you are building a regression model. There is no single metric that works well for all kinds of data. For regression, you can generally use MSE, RMSE, MAPE, etc. MSE and RMSE are scale-dependent, so they are most useful when comparing multiple models on the same data. If you only have one model, MAPE is easier to interpret because it is expressed as a percentage: values near 0% indicate a good model, and larger values indicate a worse one. Note, however, that MAPE is not bounded above by 100%; it can exceed 100% when errors are large relative to the actuals. MAPE is also undefined when the actual value is zero for some observation. A sketch computing these metrics follows below.
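
A minimal sketch of these metrics, assuming plain NumPy arrays (the example numbers are made up):

```python
import numpy as np

def regression_metrics(actual, predicted):
    """Compute MSE, RMSE, and MAPE for one set of predictions.

    Note: MAPE is undefined when any actual value is zero, and it is
    not bounded above by 100%.
    """
    actual = np.asarray(actual, dtype=float)
    predicted = np.asarray(predicted, dtype=float)
    errors = actual - predicted

    mse = np.mean(errors ** 2)
    rmse = np.sqrt(mse)

    if np.any(actual == 0):
        mape = np.nan  # undefined: division by a zero actual
    else:
        mape = np.mean(np.abs(errors / actual)) * 100

    return {"MSE": mse, "RMSE": rmse, "MAPE (%)": mape}

# Hypothetical example values:
print(regression_metrics([10, 12, 15], [11, 11, 14]))
```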

Ram