
I need to solve a least-squares problem for $\gamma$ in the following model.

The model is $y_i = \beta x_i^{\gamma} + u_i$, where the $u_i$ are i.i.d. normal with mean zero and variance $\sigma^2$.

The null hypothesis is $H_0\colon \beta = 1$, so that the restricted model becomes $y_i = x_i^{\gamma} + u_i$.

In matrix form, I want to solve the minimization $\min_{\gamma}\, u^T u = \min_{\gamma}\, (y - x^{\gamma})^T (y - x^{\gamma})$, where $x$ is a vector and the power $x^{\gamma}$ is taken elementwise.

I am struggling to take the derivative of this objective with respect to $\gamma$ because $\gamma$ appears in the exponent. Can someone help me with this problem?
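For concreteness, here is a minimal numerical sketch of the minimization above. The data ($x$, $y$, the true $\gamma$, and the noise level) are simulated purely for illustration, and the one-dimensional objective is minimized with an off-the-shelf bounded scalar optimizer rather than a hand-derived solution:

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Simulated data for illustration (not from the original problem).
# x must be positive so that x**gamma is well defined for real gamma.
rng = np.random.default_rng(0)
x = rng.uniform(0.5, 3.0, size=200)
gamma_true = 1.7                               # assumed true value, for the simulation only
y = x**gamma_true + rng.normal(0.0, 0.1, size=200)  # restricted model: beta = 1

def sse(gamma):
    """Objective u^T u = (y - x^gamma)^T (y - x^gamma), with x^gamma elementwise."""
    r = y - x**gamma
    return r @ r

# One-dimensional minimization over gamma on an assumed search interval.
res = minimize_scalar(sse, bounds=(0.0, 3.0), method="bounded")
gamma_hat = res.x
```

With enough data and modest noise, `gamma_hat` should land close to the value used in the simulation.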

David
  • By definition, $x^\gamma = \exp(\gamma\log x).$ Use the Chain Rule to take the derivative. – whuber Dec 01 '20 at 15:19
  • @whuber Thank you. How would I solve for $\gamma$ once I have taken the first-order condition? I then have $\sum_i \log(x_i)\, e^{\gamma \log(x_i)}\bigl(y_i - e^{\gamma \log(x_i)}\bigr) = 0$. How do I solve this equation for $\gamma$? – David Dec 01 '20 at 15:54
  • I presented a solution to (a generalization of) this problem at https://stats.stackexchange.com/questions/496838. – whuber Dec 01 '20 at 16:08
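The first-order condition discussed in the comments generally has no closed-form solution in $\gamma$, but it can be solved numerically as a one-dimensional root-finding problem. The sketch below uses simulated data (invented for illustration) and a bracketing root finder; the derivative follows whuber's hint, since $\frac{d}{d\gamma}x^{\gamma} = \log(x)\,e^{\gamma\log x}$ by the Chain Rule:

```python
import numpy as np
from scipy.optimize import brentq

# Simulated data for illustration (assumption: x > 0 so log(x) is defined).
rng = np.random.default_rng(1)
x = rng.uniform(0.5, 3.0, size=200)
y = x**1.7 + rng.normal(0.0, 0.1, size=200)

def foc(gamma):
    """First-order condition: sum_i log(x_i) * x_i^gamma * (y_i - x_i^gamma).

    This is (minus one half of) the derivative of the sum of squared
    residuals with respect to gamma, so its root is the minimizer.
    """
    xg = np.exp(gamma * np.log(x))  # x^gamma written as exp(gamma * log x)
    return np.sum(np.log(x) * xg * (y - xg))

# The bracket (0, 3) is an assumption; foc must change sign on it.
gamma_hat = brentq(foc, 0.0, 3.0)
```

Because the objective is smooth in $\gamma$, a Newton iteration would also work, but a bracketing method like `brentq` avoids having to derive the second derivative.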

0 Answers