I am trying to understand the properties of the least squares estimator. In particular, as I understand it, the Gauss-Markov theorem implies that the unbiasedness of the estimator does not depend on the distribution of the error term.
Thus if I understand correctly, the following should be true:
$\mathbb{E}[\hat\beta \mid e \sim N(\mu, \sigma^2)] = \mathbb{E}[\hat\beta \mid e \sim \operatorname{Beta}(2,5)] = \beta$
However when I tried out an example I got slightly different results:
# code in Julia
using Distributions

(() -> begin
    n = 10000
    A = [ones(n) rand(n)]
    β = [3.1, 12.7]
    y = A * β + rand(Beta(2, 5), n)
    b = inv(A'A) * A'y   # least-squares estimate
    println(b)
end)()
> 3.38 12.69
and
using Distributions

(() -> begin
    n = 10000
    A = [ones(n) rand(n)]
    β = [3.1, 12.7]
    y = A * β + rand(Normal(), n)
    b = inv(A'A) * A'y   # least-squares estimate
    println(b)
end)()
> 3.10 12.71
Now with normally distributed errors the estimate is indeed unbiased, while with Beta(2,5) errors there seems to be a bias of $3.38 - 3.10 = 0.28$ in the intercept, which according to the theory should not be there, right?
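One thing I noticed while experimenting (not sure if this is the resolution): Beta(2,5) does not have mean zero; its mean is $2/(2+5) \approx 0.286$, which is close to the intercept offset I observe. If I center the errors to mean zero first, the offset appears to go away. A quick sketch (the seed and sample size are my own choices):

```julia
using Distributions, Random

Random.seed!(42)
n = 10000
A = [ones(n) rand(n)]
β = [3.1, 12.7]
d = Beta(2, 5)

# mean(Beta(2,5)) = 2/7 ≈ 0.2857 — roughly the intercept offset above
e = rand(d, n) .- mean(d)   # center the errors to mean zero
y = A * β + e
b = inv(A'A) * A'y          # least-squares estimate
println(b)
```

With the centered errors, both coefficients come out close to the true $\beta$, so maybe the zero-mean assumption on the errors is what I am missing?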