
If we want to use categorical variables in a regression context, we can encode them with dummy codings such as these schemes.

Is this also required in a Bayesian (MCMC) context, such as with WinBUGS/OpenBUGS? That is, must we model a factor with k levels using k-1 dummy variables, or can we use k dummy variables because the linear dependence among them is not an issue there?
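To make the linear-dependence concern concrete, here is a minimal sketch (using numpy, with made-up data) showing why the full k-dummy coding plus an intercept gives a rank-deficient design matrix, while k-1 dummies with an intercept does not:

```python
import numpy as np

# Toy factor with k = 3 levels observed on n = 6 cases (illustrative data)
levels = np.array([0, 1, 2, 0, 1, 2])
k = 3
n = len(levels)

# Full one-hot coding: k indicator columns
one_hot = np.eye(k)[levels]

# Intercept + k dummies: the dummy columns sum to the intercept column,
# so the 4-column matrix only has rank 3 (rank deficient)
X_full = np.column_stack([np.ones(n), one_hot])
rank_full = np.linalg.matrix_rank(X_full)

# Intercept + (k-1) dummies (reference-level coding): full column rank
X_ref = np.column_stack([np.ones(n), one_hot[:, 1:]])
rank_ref = np.linalg.matrix_rank(X_ref)

print(rank_full, X_full.shape[1])  # rank < number of columns
print(rank_ref, X_ref.shape[1])    # rank == number of columns
```

Under maximum likelihood this rank deficiency makes the coefficients non-identifiable; with proper priors (or an equivalent penalty) the posterior is still well defined, which is what the linked discussions below turn on.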

Danica
user28937
  • Bayesian or non-Bayesian is not an issue here. Whether you fit your regression using maximum likelihood or Bayesian estimation doesn't affect how you code the dummy variables... – Rasmus Bååth Jun 08 '14 at 19:32
  • Bayesian regression with Normal priors on the regression coefficients corresponds to ridge regression, which adds a penalty to the original regression objective. For penalized regression, there are strong arguments to use k rather than k-1 dummy variables; see [here](https://stats.stackexchange.com/questions/440126/do-we-use-different-number-of-dummy-variables-for-classical-and-bayesian-stats) and [here](https://stats.stackexchange.com/questions/231285/dropping-one-of-the-columns-when-using-one-hot-encoding/329281#329281) – Johannes May 27 '20 at 13:10

0 Answers