I understand that when the data errors are independent, identically distributed Gaussian, the maximum likelihood solution is the least squares solution (i.e. I understand this answer). However, some of the sources I am reading imply that the equivalence goes in both directions; e.g., one asks "The choice of parameters resulting from a least squares linear regression correspond to the maximum likelihood estimate of which Likelihood function", and their intended answer is presumably the Gaussian likelihood.
But my understanding is that any function satisfying both of the following conditions could in theory also be a likelihood function that corresponds to the least squares estimator (I sketch this below):
- is an increasing function of $-\sum_i(y_i-x_i\beta)^2$
- is a normalized probability density function
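To spell out the reasoning behind those two conditions: if the likelihood is a fixed, strictly increasing transform of the negative sum of squares (with a normalizing constant that does not depend on $\beta$), then it shares its maximizer with the least squares criterion:

$$
L(\beta) = g\!\left(-\sum_i (y_i - x_i\beta)^2\right),\; g \text{ strictly increasing} \quad\Longrightarrow\quad \arg\max_\beta L(\beta) = \arg\min_\beta \sum_i (y_i - x_i\beta)^2 .
$$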
Is this correct, and were my sources simply being sloppy with language, or am I missing something and the Gaussian is in fact the only likelihood function that can correspond to the least squares estimator?
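In case it helps make the question concrete, here is a quick numerical sketch of what I mean (the multivariate-$t$ error density with fixed scale, and all of the variable names, are just my own illustration, not taken from any of the sources):

```python
# Numerical sketch: compare OLS with the MLE under a non-Gaussian likelihood
# that is an increasing function of -SSE (multivariate Student-t errors with
# fixed identity scale and fixed degrees of freedom). If my reasoning is right,
# the two estimates should coincide up to optimizer tolerance.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
n, p = 200, 3
X = rng.normal(size=(n, p))
beta_true = np.array([1.5, -2.0, 0.5])
y = X @ beta_true + rng.normal(scale=0.8, size=n)

# Ordinary least squares solution
beta_ols, *_ = np.linalg.lstsq(X, y, rcond=None)

# Up to constants not involving beta, the multivariate-t negative log-likelihood
# with fixed nu and identity scale is ((nu + n) / 2) * log(1 + SSE / nu),
# which is an increasing function of SSE = sum((y - X @ beta)**2).
nu = 4.0
def neg_loglik(beta):
    sse = np.sum((y - X @ beta) ** 2)
    return 0.5 * (nu + n) * np.log1p(sse / nu)

beta_t = minimize(neg_loglik, x0=np.zeros(p), method="BFGS").x

print("OLS estimate:  ", np.round(beta_ols, 6))
print("t-MLE estimate:", np.round(beta_t, 6))
```

On my understanding, the two printouts should match to within optimizer tolerance, which is exactly why I suspect the Gaussian is not the only likelihood whose MLE is least squares.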