
I'm having problems doing OLS in R using the lm() function on the following linear model:

$Y_t = \bar{Y} \cdot (1-a-b-c) + a \cdot X_{1t} + b \cdot X_{2t} + c \cdot X_{3t} + \varepsilon_{t}$

where $\bar{Y}$ is the sample mean. Moreover, the parameters need to satisfy the following constraints: $a,b,c\geq 0$ and $a+b+c<1$ in order for $Y_t$ to be valid.

To be completely honest, I'm not sure this is even a linear regression problem, due to the inequality constraints. However, the textbook I'm following states, somewhat vaguely, that the above model can be estimated using OLS; I'm open to other suggestions. If you use another method, I would also like to know how to obtain the standard errors from the estimation procedure. Feel free to cook up an example using random data.
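Since this is least squares under linear inequality constraints, one option is quadratic programming. Below is a minimal sketch using `quadprog::solve.QP` on simulated data, with case-resampling bootstrap standard errors; the package choice, the simulated data, and the "true" parameter values are illustrative assumptions, not anything from the textbook. The strict inequality $a+b+c<1$ is imposed as $a+b+c\leq 1$, which quadratic programming can handle; an estimate landing exactly on that boundary would be a warning sign in its own right.

```r
## Constrained least squares via quadratic programming (sketch; quadprog is
## one option among several, and the data below are purely illustrative).
library(quadprog)

set.seed(1)
n   <- 500
mu  <- 2                           # population level that the sample mean estimates
a0  <- 0.3; b0 <- 0.2; c0 <- 0.4   # "true" values satisfying the constraints
X1  <- rnorm(n, mean = mu)
X2  <- rnorm(n, mean = mu)
X3  <- rnorm(n, mean = mu)
Y   <- mu * (1 - a0 - b0 - c0) + a0 * X1 + b0 * X2 + c0 * X3 + rnorm(n, sd = 0.5)

## Rearranging Y_t = Ybar(1-a-b-c) + a X1t + b X2t + c X3t + e_t gives
## Y_t - Ybar = a (X1t - Ybar) + b (X2t - Ybar) + c (X3t - Ybar) + e_t,
## a regression through the origin in demeaned variables.
Ybar <- mean(Y)
X    <- cbind(X1, X2, X3) - Ybar
y    <- Y - Ybar

## solve.QP minimises (1/2) th' D th - d' th  subject to  t(Amat) th >= bvec.
## Here D = X'X and d = X'y reproduce least squares; the four constraint
## columns encode a >= 0, b >= 0, c >= 0 and -(a+b+c) >= -1.
Dmat <- crossprod(X)
dvec <- drop(crossprod(X, y))
Amat <- cbind(diag(3), -rep(1, 3))
bvec <- c(0, 0, 0, -1)
theta_hat <- solve.QP(Dmat, dvec, Amat, bvec)$solution
theta_hat                          # estimates of (a, b, c)

## Standard errors via case-resampling bootstrap: the textbook OLS formulas
## no longer apply once an inequality constraint binds, and re-estimating on
## resampled rows also accounts for Ybar being estimated.
B <- 999
boot <- replicate(B, {
  i  <- sample(n, replace = TRUE)
  yb <- Y[i] - mean(Y[i])
  Xb <- cbind(X1, X2, X3)[i, ] - mean(Y[i])
  solve.QP(crossprod(Xb), drop(crossprod(Xb, yb)), Amat, bvec)$solution
})
apply(boot, 1, sd)                 # bootstrap SEs for (a, b, c)
```

A useful sanity check: when the unconstrained OLS estimates already satisfy the constraints, `solve.QP` returns exactly the OLS solution, so the constrained and unconstrained fits should then coincide.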

user262734
    This falls under the branch of "constrained optimization" and cannot be done using vanilla `lm()`, you will have to use another function. – user2974951 Oct 14 '20 at 11:29
  • I thought so. However, I'm unsure how to do this using constrained optimization. – user262734 Oct 14 '20 at 11:33
  • This will require some manual work. Have a look at http://www.stat.ucla.edu/~handcock/combining/software/glmc.html – user2974951 Oct 14 '20 at 11:40
  • Thank you, I will take a look at it. I was also wondering how you would estimate it if the constraints on the parameters were removed? Then it should reduce to a problem that vanilla `lm()` can handle; in that case, I'm having trouble specifying the intercept in the `lm()` formula. – user262734 Oct 14 '20 at 11:49
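Regarding the last comment: with the constraints dropped, the model rearranges into a form that vanilla `lm()` handles directly. Subtracting $\bar{Y}$ from both sides gives $Y_t - \bar{Y} = a(X_{1t} - \bar{Y}) + b(X_{2t} - \bar{Y}) + c(X_{3t} - \bar{Y}) + \varepsilon_t$, a regression through the origin, so the intercept is suppressed with `0 +` (equivalently `-1`) in the formula. A minimal sketch, reusing the simulated data from above:

```r
## Unconstrained fit: demean both sides by mean(Y) and suppress the intercept.
Ybar    <- mean(Y)
fit_ols <- lm(I(Y - Ybar) ~ 0 + I(X1 - Ybar) + I(X2 - Ybar) + I(X3 - Ybar))
summary(fit_ols)   # coefficients are (a, b, c); note the reported standard
                   # errors treat Ybar as a fixed constant rather than estimated
```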
