For regression problems, I have seen people use the "coefficient of determination" (a.k.a. R squared) to perform model selection, e.g., to find the appropriate penalty coefficient for regularization.
However, it is also common to use "mean squared error" or "root mean squared error" as a measure of regression accuracy.
So what is the main difference between these two? Could they be used interchangeably for "regularization" and "regression" tasks? And what is the main usage of each in practice, for example in machine learning and data mining tasks?
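For concreteness, here is a minimal sketch of the kind of setup I have in mind, using scikit-learn's `Ridge` and `cross_val_score` (the toy dataset and the grid of `alpha` values are just placeholders, not a real experiment):

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.linear_model import Ridge
from sklearn.model_selection import cross_val_score

# Toy data purely for illustration
X, y = make_regression(n_samples=200, n_features=10, noise=10.0, random_state=0)

for alpha in [0.01, 0.1, 1.0, 10.0]:
    model = Ridge(alpha=alpha)
    # Selecting the penalty coefficient by cross-validated R^2 ...
    r2 = cross_val_score(model, X, y, cv=5, scoring="r2").mean()
    # ... versus by cross-validated (negative) mean squared error
    neg_mse = cross_val_score(model, X, y, cv=5, scoring="neg_mean_squared_error").mean()
    print(f"alpha={alpha:5.2f}  mean CV R^2={r2:.4f}  mean CV MSE={-neg_mse:.2f}")
```

Would picking `alpha` by the highest mean R^2 ever give a different answer than picking it by the lowest mean MSE, and if so, why?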