Questions tagged [polynomial]

A mathematical expression with more than one term containing the same variable (for example, $x$ and $x^2$). Polynomials are commonly used to model curvilinear relationships.


380 questions
65 votes · 4 answers

Does it make sense to add a quadratic term but not the linear term to a model?

I have a (mixed) model in which one of my predictors should a priori be only quadratically related to the outcome (due to the experimental manipulation). Hence, I would like to add only the quadratic term to the model. Two things keep me from…
Henrik · 13,314
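As a minimal sketch of the idea behind this question (synthetic data and plain NumPy least squares rather than a mixed model), a design matrix can simply omit the linear column and keep only an intercept and a squared term:

```python
import numpy as np

# Synthetic data from a purely quadratic relationship: y = 2 + 3*x^2 + noise
rng = np.random.default_rng(0)
x = np.linspace(-2, 2, 50)
y = 2 + 3 * x**2 + rng.normal(scale=0.1, size=x.size)

# Design matrix with an intercept and x^2 only -- the linear term is omitted
X = np.column_stack([np.ones_like(x), x**2])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print(beta)  # beta[0] ~ 2, beta[1] ~ 3
```

Whether omitting the linear term is statistically defensible is exactly what the answers debate; mechanically, nothing stops the fit.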
47 votes · 3 answers

Why is polynomial regression considered a special case of multiple linear regression?

If polynomial regression models nonlinear relationships, how can it be considered a special case of multiple linear regression? Wikipedia notes that "Although polynomial regression fits a nonlinear model to the data, as a statistical estimation…
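The point the answers make can be shown in a few lines (synthetic, noiseless data): the model is nonlinear in $x$ but linear in the coefficients, so ordinary least squares on a polynomial design matrix recovers them directly.

```python
import numpy as np

# A cubic in x is still *linear* in the unknown coefficients:
# y = b0 + b1*x + b2*x^2 + b3*x^3 is just y = X @ b for the matrix below.
x = np.linspace(-1, 1, 20)
y = 1 - 2 * x + 0.5 * x**3              # noiseless cubic

X = np.vander(x, 4, increasing=True)    # columns: 1, x, x^2, x^3
b, *_ = np.linalg.lstsq(X, y, rcond=None)
print(b)  # recovers [1, -2, 0, 0.5] up to floating-point error
```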
40 votes · 4 answers

Polynomial regression using scikit-learn

I am trying to use scikit-learn for polynomial regression. From what I read, polynomial regression is a special case of linear regression. I was hoping that maybe one of scikit's generalized linear models can be parameterised to fit higher order…
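A minimal sketch of the usual scikit-learn answer (synthetic quadratic data): chain `PolynomialFeatures` to `LinearRegression` so the feature expansion makes an ordinary linear model fit a curve.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures

# Quadratic data; PolynomialFeatures expands x into [x, x^2] (the bias is
# handled by LinearRegression's intercept), so a linear model fits a curve.
x = np.linspace(-3, 3, 30).reshape(-1, 1)
y = 0.5 * x.ravel() ** 2 - x.ravel() + 2

model = make_pipeline(PolynomialFeatures(degree=2, include_bias=False),
                      LinearRegression())
model.fit(x, y)
print(model.predict([[4.0]]))  # close to 0.5*16 - 4 + 2 = 6
```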
36 votes · 5 answers

Why use regularisation in polynomial regression instead of lowering the degree?

When doing regression, for example, two hyperparameters to choose are often the capacity of the function (e.g. the largest exponent of a polynomial) and the amount of regularisation. What I'm confused about is why not just choose a low capacity…
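A small sketch of the trade-off the question asks about (synthetic data, scikit-learn): keep a flexible high-degree basis but shrink its coefficients with ridge, instead of discarding degrees outright.

```python
import numpy as np
from sklearn.linear_model import LinearRegression, Ridge
from sklearn.preprocessing import PolynomialFeatures

# High-capacity basis (degree 9) on few points; compare unpenalised OLS
# with ridge.  Ridge keeps the flexible basis but shrinks the coefficients.
rng = np.random.default_rng(1)
x = np.linspace(-1, 1, 15).reshape(-1, 1)
y = np.sin(3 * x).ravel() + rng.normal(scale=0.1, size=15)

X = PolynomialFeatures(degree=9, include_bias=False).fit_transform(x)
ols = LinearRegression().fit(X, y)
ridge = Ridge(alpha=1.0).fit(X, y)

print(np.linalg.norm(ols.coef_), np.linalg.norm(ridge.coef_))
# The ridge coefficient norm is smaller than the OLS norm.
```

Unlike lowering the degree, the penalty strength varies continuously, which is part of why it is preferred for tuning.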
31 votes · 9 answers

How can we explain the "bad reputation" of higher-order polynomials?

We must all have heard it by now: when we start learning about statistical models overfitting data, the first example we are often given involves "polynomial functions" (e.g., see the picture here). We are warned that although higher-degree…
stats_noob · 5,882
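One concrete piece of the story can be sketched with synthetic data: for nested polynomial fits, the training error can only go down as the degree grows, so in-sample fit alone can never argue against a higher degree — which is exactly why it is such a seductive overfitting trap.

```python
import numpy as np

# Training error of nested polynomial least-squares fits never increases
# with the degree, regardless of whether the extra terms are meaningful.
rng = np.random.default_rng(2)
x = np.linspace(-1, 1, 30)
y = np.sin(2 * x) + rng.normal(scale=0.2, size=30)

def train_sse(degree):
    X = np.vander(x, degree + 1, increasing=True)
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return np.sum((y - X @ beta) ** 2)

errors = [train_sse(d) for d in range(1, 8)]
print(errors)  # monotonically non-increasing
```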
30 votes · 4 answers

Raw or orthogonal polynomial regression?

I want to regress a variable $y$ onto $x,x^2,\ldots,x^5$. Should I do this using raw or orthogonal polynomials? I looked at the question on this site that deals with these, but I don't really understand the difference between using them. Why…
l7ll7 · 1,075
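One way to see what is (and is not) at stake, sketched with synthetic data: build an orthogonal basis via a QR decomposition of the raw Vandermonde matrix (mimicking, in spirit, what R's poly(x, 5, raw = FALSE) does). The fitted values are identical; only the parameterisation changes.

```python
import numpy as np

# Raw vs. orthogonal basis for a degree-5 polynomial fit.
rng = np.random.default_rng(3)
x = np.linspace(-1, 1, 40)
y = x**2 - x + rng.normal(scale=0.5, size=40)

V = np.vander(x, 6, increasing=True)   # raw basis: 1, x, ..., x^5
Q, R = np.linalg.qr(V)                 # Q has orthonormal columns

fit_raw = V @ np.linalg.lstsq(V, y, rcond=None)[0]
fit_orth = Q @ (Q.T @ y)               # regression on Q is a projection
print(np.allclose(fit_raw, fit_orth))  # True: same fitted values
```

The difference lies in the numerical conditioning and in how the individual coefficients are interpreted, not in the fit itself.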
24 votes · 5 answers

Why is the use of high order polynomials for regression discouraged?

I've read many times on this site that high-order polynomials (generally above third order) shouldn't be used in linear regression unless there is substantial justification to do so. I understand the issues about extrapolation (and prediction at…
Marco Rudelli · 550
22 votes · 2 answers

What happens when I include a squared variable in my regression?

I start with my OLS regression: $$ y = \beta_0 + \beta_1x_1+\beta_2 D + \varepsilon $$ where $D$ is a dummy variable; the estimates become different from zero with a low p-value. I then perform a Ramsey RESET test and find that I have some…
22 votes · 2 answers

Recovering raw coefficients and variances from orthogonal polynomial regression

It seems that if I have a regression model such as $y_i \sim \beta_0 + \beta_1 x_i+\beta_2 x_i^2 +\beta_3 x_i^3$ I can either fit a raw polynomial and get unreliable results or fit an orthogonal polynomial and get coefficients that don't have a…
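The back-transformation the question is after can be sketched directly (synthetic, noiseless data so the check is exact): with the QR factorisation $V = QR$ of the raw design, coefficients $g$ fitted on the orthogonal basis $Q$ map back to the raw scale as $b = R^{-1}g$.

```python
import numpy as np

# Recover raw-scale polynomial coefficients from an orthogonal-basis fit.
x = np.linspace(-1, 1, 25)
y = 2 + x - 3 * x**2 + 0.5 * x**3       # noiseless, for an exact check

V = np.vander(x, 4, increasing=True)    # raw basis: 1, x, x^2, x^3
Q, R = np.linalg.qr(V)

g = Q.T @ y                             # coefficients in the orthogonal basis
b = np.linalg.solve(R, g)               # back-transformed raw coefficients
print(b)  # recovers [2, 1, -3, 0.5]
```

(Standard errors transform the same way, via the covariance rule for a linear map of the coefficient vector.)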
18 votes · 3 answers

Why are there large coefficients for higher-order polynomials?

Bishop's book on machine learning discusses the problem of curve-fitting a polynomial function to a set of data points. Let $M$ be the order of the fitted polynomial. It states that, as $M$ increases, the magnitude of the…
15 votes · 3 answers

Perform linear regression, but force solution to go through some particular data points

I know how to perform a linear regression on a set of points; that is, I know how to fit a polynomial of my choice to a given data set (in the least-squares sense). However, what I do not know is how to force my solution to go through some particular…
Spacey · 1,639
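One standard route (a sketch with synthetic data, not the only approach) is equality-constrained least squares: minimise $\|Xb - y\|^2$ subject to $Ab = c$, where each row of $A$ evaluates the polynomial basis at a point the curve must pass through. The stationarity conditions give a single linear (KKT) system.

```python
import numpy as np

# Fit a quadratic by least squares, but force it through the point (0, 5):
# minimise ||X b - y||^2 subject to A b = c, via the KKT linear system
#   [ 2 X^T X   A^T ] [ b ]   [ 2 X^T y ]
#   [    A       0  ] [ l ] = [    c    ]
rng = np.random.default_rng(4)
x = np.linspace(-2, 2, 30)
y = x**2 + rng.normal(scale=0.3, size=30)

X = np.vander(x, 3, increasing=True)      # basis: 1, x, x^2
A = np.array([[1.0, 0.0, 0.0]])           # basis evaluated at x = 0 ...
c = np.array([5.0])                       # ... must equal 5

K = np.block([[2 * X.T @ X, A.T],
              [A, np.zeros((1, 1))]])
rhs = np.concatenate([2 * X.T @ y, c])
b = np.linalg.solve(K, rhs)[:3]
print(b[0])  # 5.0: the fitted curve passes through (0, 5)
```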
14 votes · 2 answers

Is there ever a reason not to use orthogonal polynomials when fitting regressions?

In general, I'm wondering whether it is ever better not to use orthogonal polynomials when fitting a regression with higher-order variables. In particular, I'm wondering about the use of R: if poly() with raw = FALSE produces the same fitted values…
user2374133 · 143
13 votes · 1 answer

What are multivariate orthogonal polynomials as computed in R?

Orthogonal polynomials on a univariate set of points are polynomials that produce values at those points such that their dot products and pairwise correlations are zero. R can produce orthogonal polynomials with the function poly. The same function has…
Pere · 5,875
12 votes · 1 answer

Computation of polynomial contrast variables

Please give me an idea of how to efficiently recode a categorical variable (factor) into a set of orthogonal polynomial contrast variables. For many types of contrast variables (e.g. deviation, simple, Helmert, etc.) the approach is: compose the contrast…
ttnphns · 51,648
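A sketch of one construction: orthogonal polynomial contrasts for a $k$-level factor can be built from a QR decomposition of the Vandermonde matrix of the level scores $1,\dots,k$ (this mirrors, up to column signs, what R's contr.poly(k) returns).

```python
import numpy as np

# Orthogonal polynomial contrasts for a 4-level ordered factor.
k = 4
levels = np.arange(1, k + 1, dtype=float)
V = np.vander(levels, k, increasing=True)   # 1, x, x^2, x^3 at the levels
Q, _ = np.linalg.qr(V)
contrasts = Q[:, 1:]                        # drop the constant column

print(contrasts.round(3))
# Each column sums to zero and the columns are orthonormal: these are the
# linear, quadratic and cubic contrast variables for the factor's levels.
```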
12 votes · 1 answer

Can I interpret the inclusion of a quadratic term in logistic regression as indicating a turning point?

In a logistic regression with linear and quadratic terms only, if I have a linear coefficient $\beta_1$ and a quadratic coefficient $\beta_2$, can I say that there is a turning point of the probability at $-\beta_1 / (2\beta_2)$?
FZo · 121
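The claim can be checked numerically with made-up coefficients: the sigmoid is monotone, so the fitted probability peaks exactly where the quadratic in the logit peaks, at $x^* = -\beta_1/(2\beta_2)$ (a maximum when $\beta_2 < 0$).

```python
import numpy as np

# Logit p(x) = b0 + b1*x + b2*x^2 with b2 < 0: the probability is
# maximised where the quadratic peaks, at x* = -b1 / (2*b2), because the
# monotone sigmoid cannot move the location of the extremum.
b0, b1, b2 = -1.0, 2.0, -0.5

def p(x):
    return 1.0 / (1.0 + np.exp(-(b0 + b1 * x + b2 * x**2)))

turning_point = -b1 / (2 * b2)                  # = 2.0 here
grid = np.linspace(-5, 9, 2001)
print(turning_point, grid[np.argmax(p(grid))])  # both ~2.0
```

Whether that interior extremum is meaningful still depends on it lying within the observed range of $x$, which is what the answer discusses.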