So I've gone through this SE question and all the answers, where the general consensus is that you should never remove the intercept from a linear regression model. The most upvoted answer says:
The shortest answer: never, unless you are sure that your linear approximation of the data-generating process (linear regression model), either for some theoretical or any other reason, is forced to go through the origin. If not, the other regression parameters will be biased even if the intercept is statistically insignificant.
However, I'd like to know: is it a good idea to remove the intercept if it gives better prediction results (adjusted R-squared of 0.82)?
I trained the model without specifying the intercept (extra detail: I used OLS regression from the Python statsmodels package) and then tested the results.
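For reference, here is a minimal sketch of what I did (with placeholder data; in my actual case X and y come from my own dataset):

```python
import numpy as np
import statsmodels.api as sm

# Placeholder data standing in for my real dataset.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
y = 5 + X @ np.array([1.0, -2.0, 0.5]) + rng.normal(size=100)

# Model WITH an intercept: statsmodels only includes a constant
# if you add the column of ones yourself via add_constant.
with_const = sm.OLS(y, sm.add_constant(X)).fit()

# Model WITHOUT an intercept: passing X as-is omits the constant.
no_const = sm.OLS(y, X).fit()

# Note: with no constant, statsmodels computes R-squared against the
# uncentered total sum of squares, so the two values below are not
# directly comparable.
print(with_const.rsquared_adj)
print(no_const.rsquared_adj)

# Compare predictions against actual values.
pred = no_const.predict(X)
print(np.mean(np.abs(pred - y)))  # average absolute difference
```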
I cross-checked the results and saw that the predicted values and actual values were very close (a difference of roughly ±2 on average). Is that a good enough reason to forsake the intercept?