
I'm trying to learn about neural networks, using data series prediction as a case study. I've set up a small application to try different configurations of data smoothing and input neuron count. However, no matter what I do, I observe a delay in the neural net's output. As the picture below illustrates quite well, the network's output (red) tracks the actual values (blue) nicely, only a bit late...
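To be concrete about the setup, here is a simplified sketch of the sliding-window idea I mean (not my actual code; names are illustrative, and the window length corresponds to the input neuron count):

```python
import numpy as np

def make_windows(series, n_inputs):
    """Build (input window, next value) training pairs from a 1-D series.
    n_inputs is the number of input neurons, i.e. how many past values
    the network sees when predicting the next one."""
    X, y = [], []
    for i in range(len(series) - n_inputs):
        X.append(series[i:i + n_inputs])
        y.append(series[i + n_inputs])
    return np.array(X), np.array(y)
```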

I've even tried to use the training data as test data to make sure there weren't any problems with my test data preparation.

Anyone know what the problem might be?

[Plot: network output (red) lagging behind the actual values (blue)]

I'm using 1 output neuron and training with backpropagation. A bipolar sigmoid is used as the activation function.
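For reference, by bipolar sigmoid I mean the variant that maps into (-1, 1), equivalent to tanh(x/2). A minimal NumPy sketch of the function and the derivative form used in the backpropagation update:

```python
import numpy as np

def bipolar_sigmoid(x):
    """Bipolar sigmoid: maps any real input into (-1, 1).
    Equivalent to tanh(x / 2)."""
    return 2.0 / (1.0 + np.exp(-x)) - 1.0

def bipolar_sigmoid_derivative(y):
    """Derivative expressed in terms of the activation y = f(x),
    as typically used in the backpropagation weight update."""
    return 0.5 * (1.0 - y ** 2)
```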

palaslet
  • it seems a bit like the red value is some average of the blue value within a few steps. – dontloo Jan 14 '16 at 02:24
  • Possible duplicate of [What should I do when my neural network doesn't learn?](https://stats.stackexchange.com/questions/352036/what-should-i-do-when-my-neural-network-doesnt-learn) – Sycorax Jul 07 '18 at 22:52

1 Answer


I solved it by rewriting my normalization algorithm; there must have been some sort of error in there. I was trying to use a scaling formula I found in a paper about neural networks. When I fell back to plain feature scaling, the offset in the output was gone.

https://en.wikipedia.org/wiki/Feature_scaling#Rescaling
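Roughly, this is the rescaling formula from the link as a sketch (variable names and the [-1, 1] target range are my own choices here; [-1, 1] just matches the bipolar sigmoid's output range):

```python
import numpy as np

def rescale(x, lo=-1.0, hi=1.0):
    """Plain min-max feature scaling:
    x' = lo + (x - min) * (hi - lo) / (max - min)."""
    x = np.asarray(x, dtype=float)
    return lo + (x - x.min()) * (hi - lo) / (x.max() - x.min())

# Compute min/max on the training series only, and reuse those same
# values when scaling the test data, so both are on the same scale.
```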

palaslet
  • Can you elaborate a bit more (on the plain feature scaling)? I'm using standard normalization and I still get the lag, no matter what I do... – Molasar Dec 01 '16 at 20:00
  • @Molasar, sorry about the late reply to your question. It's probably too late, but for future reference I've included a link in my answer. – palaslet Apr 20 '17 at 08:02