Fitting very flexible models: Linear regression with large numbers of parameters. (arXiv:2101.07256v1 [physics.data-an])
David W. Hogg (NYU), Soledad Villar (JHU)

There are many uses for linear fitting; the context here is interpolation and
denoising of data, as when you have calibration data and you want to fit a
smooth, flexible function to those data. Or you want to fit a flexible function
to de-trend a time series or normalize a spectrum. In these contexts,
investigators often choose a polynomial basis, or a Fourier basis, or wavelets,
or something equally general. They also choose an order, or number of basis
functions to fit, and (often) some kind of regularization. We discuss how this
basis-function fitting is done, with ordinary least squares and extensions
thereof. We emphasize that it is often valuable to choose far more parameters
than data points, despite folk rules to the contrary: Suitably regularized
models with enormous numbers of parameters generalize well and make good
predictions for held-out data; over-fitting is not (mainly) a problem of having
too many parameters. It is even possible to take the limit of infinite
parameters, at which, if the basis and regularization are chosen correctly, the
least-squares fit becomes the mean of a Gaussian process. We recommend
cross-validation as a good empirical method for model selection (for example,
setting the number of parameters and the form of the regularization), and
jackknife resampling as a good empirical method for estimating the
uncertainties of the predictions made by the model. We also give advice for
building stable computational implementations.
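To make the abstract's claims concrete, here is a minimal NumPy sketch of the kind of fit it describes: a Fourier basis with far more parameters than data points, made well-posed by ridge regularization. Everything below (the toy data, the 1/k damping of high-frequency basis functions, the regularization strength, and all names) is an illustrative assumption, not code from the paper.

import numpy as np

rng = np.random.default_rng(17)

# Toy "calibration" data: a few noisy samples of a smooth function.
n = 23
x = np.sort(rng.uniform(0.0, 1.0, n))
y = np.sin(7.0 * x) / (1.0 + 10.0 * x ** 2) + 0.05 * rng.normal(size=n)

def design_matrix(x, p, L=2.0):
    # Fourier features on a period-L domain; column j has frequency
    # k = (j + 1) // 2, damped by 1 / k so that high frequencies are
    # suppressed (one choice of the kind that tames the infinite-p limit).
    X = np.ones((len(x), p))
    for j in range(1, p):
        k = (j + 1) // 2
        omega = 2.0 * np.pi * k / L
        phase = np.sin(omega * x) if j % 2 else np.cos(omega * x)
        X[:, j] = phase / k
    return X

def fit_ridge(X, y, lam=1e-6):
    # Regularized least squares: for p > n the unregularized normal
    # equations are singular, but adding lam * I picks out a unique,
    # small-norm solution.
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)

p = 301                          # far more parameters than the n = 23 data
X = design_matrix(x, p)
beta = fit_ridge(X, y)

xg = np.linspace(0.0, 1.0, 200)  # fine grid for interpolation
yg = design_matrix(xg, p) @ beta # smooth, denoised prediction

Despite p being much larger than n, the damped basis plus the ridge term keeps the prediction smooth between the data points, which is the abstract's point about suitably regularized over-parameterized models generalizing well.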

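Continuing the sketch above, leave-one-out cross-validation can set the regularization strength (and, by the same recipe, the number of parameters), and jackknife resampling can put an empirical error bar on the prediction. The grid of lam values and the specific jackknife variance estimator are again assumptions for illustration, not the paper's own code.

def loo_cv_mse(x, y, p, lam):
    # Leave-one-out cross-validation: refit with each point held out and
    # score the prediction at the held-out location.
    errs = []
    for i in range(len(x)):
        keep = np.arange(len(x)) != i
        b = fit_ridge(design_matrix(x[keep], p), y[keep], lam)
        pred = design_matrix(x[i:i + 1], p) @ b
        errs.append((pred[0] - y[i]) ** 2)
    return np.mean(errs)

# Model selection: choose lam by cross-validation on held-out points.
lams = 10.0 ** np.arange(-9.0, 1.0)
lam_best = lams[np.argmin([loo_cv_mse(x, y, p, lam) for lam in lams])]

# Jackknife: the spread of the n leave-one-out predictions gives an
# empirical uncertainty on the prediction at every grid point.
preds = np.array([
    design_matrix(xg, p) @ fit_ridge(design_matrix(x[np.arange(n) != i], p),
                                     y[np.arange(n) != i], lam_best)
    for i in range(n)
])
yg_best = preds.mean(axis=0)
yg_err = np.sqrt((n - 1) * preds.var(axis=0))  # jackknife variance estimate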
