
The paper on SHAP values gives a formula for Shapley values in (4) and, apparently (?), for SHAP values in (8).

Still, I don't really understand the difference between Shapley and SHAP values. As far as I understand, for Shapley values I need to retrain my model on each possible subset of features, whereas for SHAP I just use the base model trained on all features. Is that it? So SHAP is computationally cheaper?
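To make the "retrain on each subset" view concrete, here is a minimal sketch of the exact Shapley formula (equation (4) in the paper) for a toy value function. The function `v(S)` stands in for whatever a coalition `S` is worth; in the retraining interpretation, `v(S)` would be the prediction of a model retrained on only the features in `S`, which SHAP instead approximates using the single model trained on all features. The names and the additive toy game below are illustrative assumptions, not from the paper.

```python
from itertools import combinations
from math import factorial

def shapley_values(players, v):
    """Exact Shapley values via the subset formula:
    phi_i = sum over S not containing i of
            |S|! (n - |S| - 1)! / n! * (v(S + {i}) - v(S))."""
    n = len(players)
    phi = {}
    for i in players:
        others = [p for p in players if p != i]
        total = 0.0
        for k in range(n):
            for S in combinations(others, k):
                weight = factorial(k) * factorial(n - k - 1) / factorial(n)
                total += weight * (v(set(S) | {i}) - v(set(S)))
        phi[i] = total
    return phi

# Toy additive game: a coalition's worth is the sum of its members'
# individual values, so each Shapley value recovers that value exactly.
worth = {"a": 1.0, "b": 2.0, "c": 3.0}
v = lambda S: sum(worth[p] for p in S)
print(shapley_values(["a", "b", "c"], v))  # {'a': 1.0, 'b': 2.0, 'c': 3.0}
```

Note the loop visits all 2^(n-1) subsets per player, which is why exact Shapley values (with retraining for every `v(S)`) are intractable for many features, and why SHAP's approximations matter.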

Quastiat
  • Shapley values represent the contribution of features/players to a certain prediction/reward. SHAP applies this concept to find feature importance in a machine-learning context. – farheen Dec 09 '19 at 13:36

0 Answers