
One can read everywhere that linear regression is a convex optimization problem and thus gradient descent will find the global optimum.

But can someone explain how to prove that it is a convex problem? Is it because of the linear regression function, or because of the RSS cost function?
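A minimal sketch of the standard argument, assuming the usual notation (design matrix $X \in \mathbb{R}^{n \times p}$, response $y \in \mathbb{R}^n$, coefficient vector $\beta \in \mathbb{R}^p$; none of these symbols appear in the question itself):

$$\mathrm{RSS}(\beta) = \lVert y - X\beta \rVert_2^2 = \beta^\top X^\top X \beta - 2\, y^\top X \beta + y^\top y, \qquad \nabla^2 \mathrm{RSS}(\beta) = 2\, X^\top X.$$

For any $v \in \mathbb{R}^p$ we have $v^\top (2 X^\top X) v = 2 \lVert X v \rVert_2^2 \ge 0$, so the Hessian is positive semidefinite and $\mathrm{RSS}$ is convex in $\beta$. In that sense the answer is "both": because the model is linear (affine) in $\beta$, the residual $y - X\beta$ is affine in $\beta$, and composing an affine map with the convex function $\lVert \cdot \rVert_2^2$ yields a convex cost.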

MikeHuber
  • Here you go: http://math.stackexchange.com/questions/483339/proof-of-convexity-of-linear-least-squares – Mark L. Stone Mar 08 '16 at 14:55
  • That said, gradient descent is not a very good method for solving linear least squares problems: http://stats.stackexchange.com/questions/160179/do-we-need-gradient-descent-to-find-the-coefficients-of-a-linear-regression-mode/164164#164164 – Mark L. Stone Mar 08 '16 at 15:07

0 Answers