I am working with linear regression. A weakness of the method is its tendency to overfit, so some papers add regularization to reduce it. Are there other ways to reduce overfitting? In particular, can we use a prior term to reduce it?
Given data $D=\{(x_1,y_1),(x_2,y_2),\dots,(x_n,y_n)\}$, the linear regression model for $D$ is:
$$h(x)=w^\top x+b$$
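For concreteness, here is a minimal sketch of the plain least-squares fit I am doing (the use of scikit-learn and the toy data are my own assumptions, just to make the snippet runnable):

```python
# Plain least-squares fit of h(x) = w^T x + b on toy data (illustration only).
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))                     # n = 100 samples, 5 features
true_w = np.array([1.0, -2.0, 0.5, 0.0, 3.0])
y = X @ true_w + 0.5 + rng.normal(scale=0.1, size=100)

ols = LinearRegression()
ols.fit(X, y)
print(ols.coef_, ols.intercept_)                  # learned w and b
```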
To reduce overfitting, we add a regularization term, so the loss function becomes:
$$J=\sum_{i=1}^{n}\bigl(h(x_i)-y_i\bigr)^2+\lambda_1\sum_{j} w_j^2$$
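As a sketch of what this objective looks like in code (assuming scikit-learn's `Ridge`, whose `alpha` plays the role of $\lambda_1$, and the same toy `X`, `y` as above):

```python
# Ridge regression minimizes sum_i (h(x_i) - y_i)^2 + alpha * sum_j w_j^2,
# so alpha corresponds to lambda_1 in the loss J above.
from sklearn.linear_model import Ridge

ridge = Ridge(alpha=1.0)    # lambda_1 fixed by hand here
ridge.fit(X, y)
print(ridge.coef_)          # weights are shrunk relative to the plain fit
```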
But choosing $\lambda_1$ is hard. Can we avoid this tuning by using other terms, for example a prior, and get more effective results? Thanks. For reference, the way I currently select $\lambda_1$ is sketched below.
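A minimal sketch of that selection, assuming scikit-learn's `RidgeCV` and the same toy `X`, `y` as above (the candidate grid is arbitrary):

```python
# Try a grid of candidate lambda_1 values and keep the one with the best
# cross-validated error; this is the search I would like to avoid.
import numpy as np
from sklearn.linear_model import RidgeCV

candidates = np.logspace(-4, 4, 50)   # arbitrary grid of lambda_1 values
cv_model = RidgeCV(alphas=candidates, cv=5)
cv_model.fit(X, y)
print(cv_model.alpha_)                # the selected lambda_1
```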