For a neural network, I'm trying to find the best bound $a$ for the weight-initialization interval $[-a, a]$, given the function for the output of a single-layer neural network
$y = \sigma(Wx)$
where $x \in R^n$ is the input and $W \in R^{h \times n}$ is the matrix of weights.
The exercise states that the optimal $a$ should result in $Var(y) = 1$, given that $E(x) = 0$ and $Var(x) = 1$. It also states that for small weights $\sigma$ is approximately linear, so you can omit it.
How would I go about solving this?
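For context, here is how I've been checking candidate values of $a$ empirically (a sketch only: the dimensions, sample count, and the candidate $a$ below are arbitrary choices of mine, and $\sigma$ is omitted as the exercise allows):

```python
import numpy as np

rng = np.random.default_rng(0)
n, h = 256, 256            # input and output dimensions (arbitrary choices)
a = np.sqrt(3.0 / n)       # candidate bound; note Var(U[-a, a]) = a**2 / 3

# Inputs with E[x] = 0 and Var[x] = 1, many samples to estimate Var(y)
x = rng.standard_normal((n, 10000))
W = rng.uniform(-a, a, size=(h, n))

# Linearized layer y = W x (sigma dropped for small weights)
y = W @ x
print(y.var())  # empirical Var(y); close to 1 if a is chosen well
```

Running this lets me compare the measured variance of $y$ against the target of 1 for different candidate bounds.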