This is a very broad question, although it may not seem so. Two comments:
You say "The coefficient of determination is" but whether the formula you give acts as a definition of fundamentals for anyone is unclear. I'd characterise it rather as one of several available computing formulas.
You ask "Why is this used" but that confuses or conflates the question of why the coefficient of determination is used at all with why the particular formula you cite might be used.
For me, the attractions of $R^2$ lie in its being (a) a single, simple measure linked to the correlation coefficient $r$, or an analogue of it, and (b) free of the units of measurement of the original variable. In multiple regression, the correlation concerned is that between the values observed and those predicted from the model.
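As a quick check on (a), here is a minimal sketch in plain NumPy (simulated data; all names are mine, not from the question) verifying that $R^2$ from a least-squares fit with an intercept equals the squared correlation between observed and fitted values:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
X = np.column_stack([np.ones(n), rng.normal(size=(n, 2))])   # intercept + 2 predictors
y = X @ np.array([1.0, 2.0, -0.5]) + rng.normal(scale=1.5, size=n)

beta, *_ = np.linalg.lstsq(X, y, rcond=None)                 # least-squares fit
fitted = X @ beta

ss_res = np.sum((y - fitted) ** 2)
ss_tot = np.sum((y - y.mean()) ** 2)
r2 = 1 - ss_res / ss_tot                                     # the usual computing formula

r = np.corrcoef(y, fitted)[0, 1]                             # corr(observed, predicted)
print(r2, r ** 2)   # agree up to floating point, because an intercept is fitted
```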
The disadvantages of $R^2$ flow from precisely the same properties: no single summary measure can capture all the virtues and limitations of a regression, and there is often much point in summarising lack of fit on the scale of the measured response.
To that end, $SS_\text{res}/n$ is, contrary to your implication, often used, if indirectly. Summarising the residuals by their mean square is at base a good idea, although its square root is a better one on dimensional grounds, and for detailed technical reasons there is a case for using as divisor the sample size minus the number of parameters fitted. (Looking at the detailed pattern of the residuals is usually an even better idea.)
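To make those residual summaries concrete, a second small sketch under the same assumptions as above contrasts $SS_\text{res}/n$, its square root, and the version with divisor $n - p$ (the residual standard error most regression software reports):

```python
import numpy as np

rng = np.random.default_rng(1)
n, p = 100, 3                                        # p = number of fitted parameters
X = np.column_stack([np.ones(n), rng.normal(size=(n, p - 1))])
y = X @ np.array([0.5, 1.0, -2.0]) + rng.normal(size=n)

beta, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta

mse = np.sum(resid ** 2) / n                         # SS_res / n
rmse = np.sqrt(mse)                                  # same units as the response
resid_se = np.sqrt(np.sum(resid ** 2) / (n - p))     # divisor n - p
print(mse, rmse, resid_se)
```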
More broadly, $R^2$ is often over-valued in that a low $R^2$ may be a worthwhile achievement and a high $R^2$ a scientific or practical failure. Much depends on what is interesting, useful and possible scientifically or practically.