I understand that for simple linear regression, say y = b0 + b1*x + error, the errors should be normally distributed, uncorrelated, and have constant variance. I also understand that the predictors should not be collinear (when there is more than one), and that the relationship between x and y should be linear.
Those are the standard assumptions in an introductory regression course. However, as I'm learning about power transforms such as the Box-Cox transformation, it's often said that Box-Cox attempts to transform the predictor x or the response y toward normality if they are skewed to begin with. By this I mean the marginal distributions of x and y, not the distribution of y conditional on x (which is what the residuals relate to).
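To make concrete what I mean by "transforming the marginal distribution", here is a minimal sketch using SciPy's `boxcox`; the lognormal sample and its parameters are just made-up illustration data:

```python
# Sketch: Box-Cox applied to a skewed *marginal* distribution (illustrative data only).
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
x_skewed = rng.lognormal(mean=0.0, sigma=0.8, size=1000)  # right-skewed, strictly positive

x_transformed, lam = stats.boxcox(x_skewed)  # lambda chosen by maximum likelihood

print(f"estimated lambda: {lam:.3f}")
print(f"skewness before: {stats.skew(x_skewed):.3f}")
print(f"skewness after:  {stats.skew(x_transformed):.3f}")
```

This operates on the variable by itself, without any reference to a regression model, which is why I read it as targeting the marginal distribution rather than the conditional one.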
But other sources say it's not important for x and y themselves to be normally distributed. So I'm wondering: if both x and y are uniformly distributed rather than normally distributed, and there is a roughly linear relationship between them, wouldn't linear regression still work? The errors can still be normally distributed in that case, right?
For example, say you have a cylindrical bar of chocolate, and you want to regress the weight against the length of the piece you cut. Since the cross-section is constant, the weight is linear in the length. But suppose the cutting machine picks each length at random from a uniform distribution (1 to 10 inches). Then the weight should be roughly uniformly distributed too. You should still get normally distributed residuals due to manufacturing tolerances, the accuracy of the machine, measurement error, and so on (see the simulation sketch below).
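Here is a small simulation sketch of that chocolate-bar scenario; the slope (grams per inch), intercept, and noise level are hypothetical numbers I chose purely for illustration:

```python
# Sketch: uniform x, (roughly) uniform y, but normally distributed residuals.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
n = 500

length = rng.uniform(1.0, 10.0, size=n)           # x: uniform cut lengths, 1-10 inches
true_slope = 25.0                                  # hypothetical grams per inch
true_intercept = 0.0
noise = rng.normal(0.0, 2.0, size=n)               # machine / measurement error
weight = true_intercept + true_slope * length + noise  # y: marginal is roughly uniform

# Ordinary least squares fit
result = stats.linregress(length, weight)
residuals = weight - (result.intercept + result.slope * length)

print(f"fitted slope: {result.slope:.2f}, intercept: {result.intercept:.2f}")
# The marginals of x and y are far from normal, yet the residuals look normal:
print(f"Shapiro-Wilk p-value, residuals: {stats.shapiro(residuals).pvalue:.3f}")
print(f"Shapiro-Wilk p-value, weight y:  {stats.shapiro(weight).pvalue:.3g}")
```

In runs like this the fitted line recovers the true relationship and the residuals pass a normality check, even though the marginal distributions of x and y are uniform rather than normal.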
Linear regression is clearly suitable in this case, yet neither x nor y is normally distributed. So I'm guessing it's not required for x and y to be normally distributed. Please advise.