Some more details are needed for this question about "Independence and homoskedasticity". Assume we are talking about a linear regression of the form:
$y = X'\beta + u$
Does $E(u|x)=0$ imply homoscedasticity? If yes, why?
In general, no: $E(u|x)=0$ restricts only the conditional mean of $u$ and says nothing about its conditional variance, so $E[u^2|x]$ can still depend on $x$.
Also, does $E(u|x)=0$ mean that $u$ and $x$ are fully independent?
No: stochastic independence is a stronger condition. $E(u|x)=0$ is only mean independence; higher moments of $u$, such as its variance, may still depend on $x$.
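A single counterexample illustrates both answers above. Take $u = x e$ with $e \sim N(0,1)$ drawn independently of $x$: then $E(u|x) = x\,E(e) = 0$, yet $V(u|x) = x^2$, so $u$ is heteroskedastic and not independent of $x$. A minimal NumPy sketch (the setup and variable names are mine, not from the question):

```python
import numpy as np

# Counterexample: u = x * e with e ~ N(0,1) independent of x.
# E(u | x) = x * E(e) = 0 holds, but V(u | x) = x^2 varies with x,
# so u is heteroskedastic and not independent of x.
rng = np.random.default_rng(0)
n = 200_000
x = rng.uniform(1.0, 3.0, n)
u = x * rng.standard_normal(n)

lo, hi = x < 2.0, x >= 2.0
print(u[lo].mean(), u[hi].mean())  # both near 0: mean independence holds
print(u[lo].var(), u[hi].var())    # roughly 7/3 vs 19/3: heteroskedastic
```

The conditional variances match $E[x^2]$ over each half of the support of $x$, while both conditional means stay at zero.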
If the answers to both of these questions are no: does full independence of
$u$ and $x$ imply homoscedasticity? If yes, why?
As before, in general no. However, heteroskedasticity is sometimes defined through:
$V[u|X=x] = E[u^2|X=x] = \sigma^2(x)$ (a function of $x$)
and in this case stochastic independence between $u$ and $X$ implies homoscedasticity: if $u$ is independent of $X$, then $E[u^2|X=x] = E[u^2]$ for every $x$, so $\sigma^2(x)$ is constant.
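A quick numerical check of this implication (a toy setup of my own, assuming nothing beyond NumPy):

```python
import numpy as np

# If u is drawn independently of x, then sigma^2(x) = V(u | x) cannot
# vary with x: conditional variances over any split of x agree.
rng = np.random.default_rng(1)
n = 200_000
x = rng.uniform(0.0, 1.0, n)
u = rng.standard_normal(n)  # independent of x by construction

print(u[x < 0.5].var(), u[x >= 0.5].var())  # both near 1
```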
However sometimes the definition is like:
$V[u|Z=z] = E[u^2|Z=z] = \sigma^2(z)$
where $Z$ need not coincide with $X$; in particular, $Z$ may contain variables that do not appear in $X$. In this case, independence between $u$ and $X$ and heteroskedasticity (in $Z$) can hold simultaneously. See also: Homoscedasticity and independence of errors
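A sketch of this case (an assumed setup of mine: $x$, $z$, and $e$ mutually independent, with $u = z e$): $u$ is then fully independent of $x$, yet $V(u|z) = z^2$.

```python
import numpy as np

# u = z * e with x, z, e mutually independent: u is independent of x,
# yet V(u | z) = z^2, i.e. heteroskedastic in z.
rng = np.random.default_rng(2)
n = 200_000
x = rng.standard_normal(n)
z = rng.uniform(1.0, 3.0, n)
u = z * rng.standard_normal(n)

print(np.corrcoef(x, u**2)[0, 1])       # near 0: u^2 unrelated to x
print(u[z < 2].var(), u[z >= 2].var())  # roughly 7/3 vs 19/3: varies with z
```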
Finally
According to the Author, the homoscedasticity assumption is added just
to "simplify calculations". However, this idea is not elaborated any
further
The OLS estimator can still be used under heteroskedasticity, because it remains consistent. However, in this situation heteroskedasticity-robust standard errors, such as White's, are needed.
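To sketch this last point with NumPy only (the data-generating process is my own example): OLS on heteroskedastic data still recovers the coefficients, while White's HC0 sandwich $(X'X)^{-1} X' \operatorname{diag}(\hat u_i^2) X (X'X)^{-1}$ replaces the classical variance $s^2 (X'X)^{-1}$:

```python
import numpy as np

# OLS remains consistent under heteroskedastic errors, but the classical
# variance formula s^2 (X'X)^{-1} is invalid; White's (HC0) sandwich
# estimator (X'X)^{-1} X' diag(e_i^2) X (X'X)^{-1} gives valid SEs.
rng = np.random.default_rng(3)
n = 20_000
x = rng.uniform(1.0, 3.0, n)
y = 1.0 + 2.0 * x + x * rng.standard_normal(n)  # V(u|x) = x^2

X = np.column_stack([np.ones(n), x])
XtX_inv = np.linalg.inv(X.T @ X)
beta = XtX_inv @ X.T @ y  # OLS point estimates: still consistent
e = y - X @ beta          # residuals

classical = e @ e / (n - 2) * XtX_inv          # assumes constant variance
white = XtX_inv @ (X.T * e**2) @ X @ XtX_inv   # HC0 sandwich

print(beta)  # near [1, 2]
print(np.sqrt(np.diag(classical)), np.sqrt(np.diag(white)))
```

In practice one would not code this by hand; for example, statsmodels exposes it via `fit(cov_type="HC0")` on an OLS model.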