
So for a neural network, I'm trying to find the best value $a$ to form the weight-initialization interval $[-a, a]$, given the function for the output of a single-layer neural network

$y = \sigma(Wx)$

where $x \in R^n$ is the input, $W \in R^{h \times n}$ is the weight matrix, and $\sigma$ is the activation function.

The exercise states that the optimal $a$ should result in $Var(y) = 1$, given that $E(x) = 0$ and $Var(x) = 1$. It also states that for small weights $\sigma$ is approximately linear, so you can omit it.

How would I go about solving this?
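Here is how far I can get on my own, under the assumption (not stated in the exercise, so it may be wrong) that the entries of $W$ are drawn i.i.d. from the uniform distribution on $[-a, a]$, independently of $x$. Dropping $\sigma$ per the linearization hint, each output coordinate is $y_i = \sum_{j=1}^{n} W_{ij} x_j$, and since the factors are independent with zero mean,

$$Var(y_i) = \sum_{j=1}^{n} Var(W_{ij})\,Var(x_j) = n \cdot \frac{a^2}{3},$$

using $Var(W_{ij}) = (2a)^2/12 = a^2/3$ for a uniform variable on $[-a, a]$. If that's right, setting $Var(y_i) = 1$ would give $a = \sqrt{3/n}$, but I'm not sure the independence assumptions are the intended ones.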

roughosing

1 Answer


You could normalize the data to $[0, 1]$ or $[-1, 1]$ simply by dividing by the maximum value, or normalize over the range (data maximum value minus data minimum value):

normalized input x = (x - minVal)/(maxVal - minVal)
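A minimal NumPy sketch of that min-max rescaling, plus an illustrative check of the variance condition from the question (the helper name `min_max_normalize` and all the sizes below are my own choices, not anything prescribed by the exercise):

```python
import numpy as np

rng = np.random.default_rng(0)

def min_max_normalize(x):
    # Shift by the minimum and divide by the range so the result lies in [0, 1].
    return (x - x.min()) / (x.max() - x.min())

# Illustrative check of the question's variance condition: under the
# linear approximation y ≈ Wx with W_ij ~ U[-a, a] i.i.d., Var(y_i)
# works out to n * a^2 / 3, so a = sqrt(3/n) should give Var(y) ≈ 1.
n, h, batch = 256, 128, 10_000        # arbitrary sizes for the demo
a = np.sqrt(3.0 / n)

x = rng.standard_normal((batch, n))   # E(x) = 0, Var(x) = 1, as the exercise assumes
W = rng.uniform(-a, a, size=(h, n))   # weights drawn uniformly from [-a, a]
y = x @ W.T                           # sigma omitted (approximately linear for small weights)

print(y.var())                        # prints a value close to 1
```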

Peter