
I'm trying a neural network for the first time and I'm getting a weird result: the loss approaches 0 within the first epoch itself, yet the predictions are trash! Can someone explain what's going on? Here is the code I used:

    from sklearn.model_selection import train_test_split
    from sklearn.preprocessing import StandardScaler
    import tensorflow as tf
    from tensorflow import keras

    # Train/test split, then standardize (scaler fitted on the training set only)
    X_train, X_test, y_train, y_test = train_test_split(predictors, outcome, test_size=0.3, random_state=0)
    cs = StandardScaler()
    X_train_scaled = cs.fit_transform(X_train)
    X_test_scaled = cs.transform(X_test)

    # 30 input features -> Dense(5) -> Dense(5, ReLU) -> Dense(1)
    l0 = keras.layers.Dense(units=5, input_shape=[30])
    l1 = keras.layers.Dense(units=5, activation=tf.nn.relu)
    l2 = keras.layers.Dense(units=1)
    model = keras.Sequential([l0, l1, l2])
    model.compile(loss='mean_squared_error', optimizer=keras.optimizers.Adam(0.001))
    history = model.fit(X_train_scaled, y_train, epochs=500, verbose=False)
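Since the issue is a near-zero training loss combined with bad predictions, one way to pin it down is to print the loss as an actual number and score the model on the held-out test set. A minimal sketch, reusing `model`, `history`, `X_test_scaled` and `y_test` from the code above and assuming scikit-learn is available for `r2_score` (the metric choice is only for illustration):

    import numpy as np
    from sklearn.metrics import r2_score

    # The final training loss as a number, not a pixel on a plot
    print("last training MSE:", history.history['loss'][-1])

    # Test-set loss and a quick look at predictions vs. true targets
    test_mse = model.evaluate(X_test_scaled, y_test, verbose=0)
    y_pred = model.predict(X_test_scaled).ravel()
    print("test MSE:", test_mse)
    print("test R^2:", r2_score(y_test, y_pred))
    print("first few (prediction, target) pairs:",
          list(zip(y_pred[:5].round(2), np.asarray(y_test)[:5])))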

[Image: plot of the training loss over the epochs]

  • Can you properly format your post? See the guide here: https://stackoverflow.com/editing-help – Jan Kukacka Aug 19 '19 at 08:06
  • Also, are you sure the loss is zero? Or does it just **look** like zero on the graph stretched to 60000? Try plotting from epoch 10 onward to get better scaling, or use logarithmic y-axis. – Jan Kukacka Aug 19 '19 at 08:07
  • Finally, check [this post](https://stats.stackexchange.com/questions/352036/what-should-i-do-when-my-neural-network-doesnt-learn) for common neural network learning troubleshooting – Jan Kukacka Aug 19 '19 at 08:07
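Following the suggestion in the second comment, a minimal sketch of re-plotting the loss with better scaling, assuming matplotlib is installed and `history` is the object returned by `model.fit` above:

    import matplotlib.pyplot as plt

    loss = history.history['loss']

    # The curve can "look like zero" simply because the first value dwarfs the rest;
    # print the raw values and re-plot from epoch 10 onward on a logarithmic y-axis.
    print("first 5 losses:", loss[:5])
    print("last 5 losses:", loss[-5:])

    plt.plot(range(10, len(loss)), loss[10:])
    plt.yscale('log')
    plt.xlabel('epoch')
    plt.ylabel('training MSE (log scale)')
    plt.show()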

0 Answers