I am in the seemingly unusual situation of having practically unlimited data. In this setting, will linear regression choose as good a model as any other algorithm as the number of samples approaches infinity?
To clarify, I have two cases in mind. First, with the same set of features, will ordinary least squares produce as good a prediction on out-of-sample data as algorithms such as lasso, ridge, or elastic net - algorithms meant to improve the generalization of a prediction? My intuition says that with enough data, OLS will do just as well. Second, given that the underlying relationship is linear, will simple linear regression, with enough data, produce as good a prediction as a boosting algorithm such as gradient boosting regression, or as a bagging algorithm such as random forests?