
I am working in R, trying to fit a neural network with TensorFlow and Keras. Generally, I am not satisfied with the results from the model.

I tried to fit this sequential model:

    model <- keras_model_sequential()
    # Add layers to the model
    model %>%
      layer_dense(units = 512, activation = 'relu', input_shape = dim(train_data_tf)[2]) %>%
      layer_dropout(rate = 0.5) %>%
      layer_dense(units = 256, activation = 'linear') %>%
      layer_dropout(rate = 0.5) %>%
      layer_dense(units = 128, activation = 'linear') %>%
      layer_dropout(rate = 0.5) %>%
      layer_dense(units = 64,  activation = 'linear') %>%
      layer_dropout(rate = 0.5) %>%
      layer_dense(units = 32,  activation = 'linear') %>%
      layer_dropout(rate = 0.5) %>%
      layer_dense(units = 16,  activation = 'linear') %>%
      layer_dropout(rate = 0.5) %>%
      layer_dense(units = 1)
    summary(model)

I also used `optimizer = optimizer_rmsprop(lr = 0.001)`. You can see the forecasting result in the picture below: the red dots are the original values, while the black ones are the predictions. Any suggestions on how to tune the parameters of this model to get better predictions?
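For completeness, the compile/fit step was roughly as follows (the question only states the optimizer, so the loss function, epoch count, batch size, and the `train_labels` object are assumptions for illustration):

    library(keras)

    # Hypothetical compile/fit step; only the optimizer comes from the
    # question, everything else here is an assumed placeholder.
    model %>% compile(
      optimizer = optimizer_rmsprop(lr = 0.001),
      loss = "mse",
      metrics = list("mae")
    )

    history <- model %>% fit(
      train_data_tf, train_labels,
      epochs = 100,
      batch_size = 32,
      validation_split = 0.2
    )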

[Plot: red dots are the original values, black dots are the model's predictions]

  • Linear functions are closed under composition, so repeated linear layers are the same as a single linear layer. The inclusion of dropout breaks that identity, but also having so much dropout in the model is plausibly destroying the ability of the model to learn; I suspect the **training** loss does not decrease much or at all. These points, and general tips for improving the fit of NNs, are addressed in the duplicate threads. If you have a specific question not addressed in these threads, please [edit] to clarify. – Sycorax Jan 11 '22 at 15:32
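A minimal sketch of what the comment suggests: give each hidden layer a nonlinear activation (otherwise consecutive linear layers collapse into a single linear layer) and reduce the dropout drastically. The layer sizes and dropout rate below are illustrative choices, not values from the thread:

    library(keras)

    # Revised architecture following the comment's two points:
    # 1) nonlinear activations in hidden layers, 2) much less dropout.
    model <- keras_model_sequential() %>%
      layer_dense(units = 128, activation = "relu",
                  input_shape = dim(train_data_tf)[2]) %>%
      layer_dropout(rate = 0.1) %>%           # far below 0.5
      layer_dense(units = 64, activation = "relu") %>%
      layer_dense(units = 1)                  # linear output for regression

Checking that the training loss actually decreases with this smaller model would confirm the comment's suspicion that the original dropout was preventing learning.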

0 Answers