
I have two random variables $X$ and $Y$ and I'm trying to model $\mathbb{E}[Y|X]$.

To this end, I'd like to pick a collection of functions $f_1, f_2 \dots f_n : \mathbb{R} \to \mathbb{R}$ and then fit a model to my data set $(x_i, y_i)_{1 \dots m}$ by OLS:

$$\mathrm{argmin}_{\alpha_1 \dots \alpha_n} \sum_{i=1}^m \left(y_i - \sum_{j=1}^n \alpha_j f_j(x_i) \right)^2$$
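For concreteness, here is a minimal NumPy sketch of the fit I have in mind; the monomial basis $f_j(x) = x^{j-1}$ and the synthetic data below are just illustrative placeholders:

```python
import numpy as np

# Synthetic data (illustrative only)
rng = np.random.default_rng(0)
m, n = 200, 6
x = rng.uniform(-1, 1, size=m)
y = np.sin(3 * x) + 0.1 * rng.standard_normal(m)

# Design matrix A[i, j] = f_j(x_i); here f_j(x) = x**j (monomial basis)
A = np.vander(x, N=n, increasing=True)

# Solve argmin_alpha ||y - A alpha||^2 via SVD-based least squares
alpha, *_ = np.linalg.lstsq(A, y, rcond=None)
```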

I'm looking for a reference that compares different choices of families of $\{ f_i\}$ and answers questions like:

  • Will any linearly independent $f_i$ work?
  • Is an orthogonal collection (Laguerre polynomials, a Fourier basis) better, and if so, why mathematically?

I am particularly keen to learn about the numerical stability of the resulting least squares problem. Do some bases $\{ f_i \}$ perform noticeably worse if the least squares problem is solved by Cholesky or QR factorisation instead of SVD?
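To make the stability question concrete, the following small experiment is the kind of comparison I have in mind (the degree, the raw-monomial vs. Chebyshev comparison, and the data are again placeholders):

```python
import numpy as np
from numpy.polynomial import chebyshev

rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, size=500)
n = 12  # number of basis functions

# Raw monomial basis vs. an orthogonal (Chebyshev) basis on [-1, 1]
A_mono = np.vander(x, N=n, increasing=True)
A_cheb = chebyshev.chebvander(x, n - 1)

print("cond(monomial design): ", np.linalg.cond(A_mono))
print("cond(Chebyshev design):", np.linalg.cond(A_cheb))

# The normal equations square the condition number, so a Cholesky-based
# solve of (A^T A) alpha = A^T y loses roughly twice as many digits as a
# QR- or SVD-based solve applied to A directly.
y = np.cos(4 * x) + 0.05 * rng.standard_normal(x.size)

G = A_mono.T @ A_mono
L = np.linalg.cholesky(G)                     # Cholesky of the Gram matrix
z = np.linalg.solve(L, A_mono.T @ y)          # forward substitution
alpha_chol = np.linalg.solve(L.T, z)          # back substitution

alpha_svd, *_ = np.linalg.lstsq(A_mono, y, rcond=None)  # SVD-based solve
print("max coefficient discrepancy:", np.max(np.abs(alpha_chol - alpha_svd)))
```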

user357269
  • This may be of interest: https://math.stackexchange.com/questions/279608/how-to-work-out-orthogonal-polynomials-for-regression-model – user217285 Dec 21 '17 at 06:29
  • The nature of this question is unclear: *from what set of functions* do you wish to select this finite collection of $n$ functions, *based on what information?* And what does "work" mean? *Of course* you can find a solution for any finite collection of $f_i$ whatsoever (proof: apply the standard OLS formula), so "work" must imply something besides solving the OLS problem as you have stated it. In what sense do you mean "better"? The literature looks at many aspects, including *interpretability,* *multicollinearity,* and *applicability* (to a particular problem), as well as numerical stability. – whuber Feb 12 '22 at 22:34

0 Answers