I'm having an interesting dilemma with the neuralnet and nnet packages in R. I recently fit a series of feed-forward neural networks, giving each the same data sets, and every single time, no matter how I tweak the algorithm, hidden layers, number of neurons, maximum iterations, or error threshold, both functions converge their predictions to approximately the mean of whatever they are training on.
A linear regression fits each series far better, and both packages seem to do a better job fitting random data from rnorm than real data. Mathematically, what could be causing this, and how should I resolve it? Sample code is below, and I can paste a sample dataset if requested. Thanks!
library(neuralnet)

model6 <- neuralnet(
  target ~ 1 + majorholiday + mon + sat + sun + thu + tue + wed + tickets + l1_target + l7_target,
  data = data_nn,
  algorithm = "rprop+", hidden = c(8), stepmax = 500000,
  err.fct = "sse", threshold = 0.01, lifesign = "full", lifesign.step = 100,
  linear.output = TRUE
)
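For reference, the linear-regression baseline and the rnorm check I mention above look roughly like this (a sketch, not my exact code; model_lm, fake, model_fake and the toy variables y, x1, x2 are just illustrative names, and the predictors are copied from the neuralnet formula above):

# linear baseline on the same data and predictors
model_lm <- lm(target ~ majorholiday + mon + sat + sun + thu + tue + wed
               + tickets + l1_target + l7_target, data = data_nn)
summary(model_lm)   # fits each series noticeably better than the networks

# sanity check: the same network setup on purely random data
set.seed(1)
fake <- data.frame(y = rnorm(200), x1 = rnorm(200), x2 = rnorm(200))
model_fake <- neuralnet(y ~ x1 + x2, data = fake, hidden = c(8),
                        linear.output = TRUE)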
EDIT
A user requested that I paste some data. Here is one set below. I re-ran the same code just before uploading, and the same thing happens: the predictions converge to the mean of target, at about 17.45.
row.names target majorholiday mon sat sun thu tue wed backtickets l1_target l7_target
1 8 18.976573088 0 0 0 0 0 0 0 13806 18.114001584 36.521334684
2 9 20.701716096 0 1 0 0 0 0 0 15308 18.976573088 35.477867979
3 10 25.014573616 0 0 1 0 0 0 0 13439 20.701716096 28.173601042
4 11 15.706877377 1 0 0 0 0 0 0 11283 25.014573616 27.602288128
5 12 19.633596721 0 0 0 0 1 0 0 12272 15.706877377 13.801144064
6 13 20.049395337 0 0 0 0 0 1 0 9528 19.633596721 32.777717152
7 14 21.720178282 0 0 0 1 0 0 0 13747 20.049395337 18.114001584
8 15 23.390961226 0 0 0 0 0 0 0 15277 21.720178282 18.976573088
9 16 16.707829447 0 1 0 0 0 0 0 16058 23.390961226 20.701716096
10 17 15.872437975 0 0 1 0 0 0 0 14218 16.707829447 25.014573616
11 18 23.295531996 1 0 0 0 0 0 0 11249 15.872437975 15.706877377
12 19 22.363710716 0 0 0 0 1 0 0 13993 23.295531996 19.633596721
13 20 24.227353276 0 0 0 0 0 1 0 13402 22.363710716 20.049395337
14 21 20.500068156 0 0 0 1 0 0 0 14244 24.227353276 21.720178282
15 22 26.090995836 0 0 0 0 0 0 0 14502 20.500068156 23.390961226
16 23 18.636425597 0 1 0 0 0 0 0 16296 26.090995836 16.707829447
17 24 15.840961757 0 0 1 0 0 0 0 13694 18.636425597 15.872437975
18 25 20.650050308 1 0 0 0 0 0 0 10774 15.840961757 23.295531996
19 26 13.467424114 0 0 0 0 1 0 0 12348 20.650050308 22.363710716
20 27 19.752222033 0 0 0 0 0 1 0 12936 13.467424114 24.227353276
21 28 27.832676502 0 0 0 1 0 0 0 14342 19.752222033 20.500068156
22 29 18.854393759 0 0 0 0 0 0 0 14390 27.832676502 26.090995836
23 30 10.773939291 0 1 0 0 0 0 0 16724 18.854393759 18.636425597
24 31 12.569595839 0 0 1 0 0 0 0 14091 10.773939291 15.840961757
25 32 28.153882107 1 0 0 0 0 0 0 11250 12.569595839 20.650050308
26 33 24.400031160 0 0 0 0 1 0 0 12803 28.153882107 13.467424114
27 34 21.584642949 0 0 0 0 0 1 0 13318 24.400031160 19.752222033
28 35 27.215419370 0 0 0 1 0 0 0 14193 21.584642949 27.832676502
29 36 21.584642949 0 0 0 0 0 0 0 14312 27.215419370 18.854393759
30 37 15.015403791 0 1 0 0 0 0 0 16445 21.584642949 10.773939291
31 38 26.276956633 0 0 1 0 0 0 0 13753 15.015403791 12.569595839
32 39 15.139500902 1 0 0 0 0 0 0 11619 26.276956633 28.153882107
33 40 12.467824272 0 0 0 0 1 0 0 14006 15.139500902 24.400031160
34 41 21.373413039 0 0 0 0 0 1 0 14098 12.467824272 21.584642949
35 42 8.015029889 0 0 0 1 0 0 0 14462 21.373413039 27.215419370
36 43 16.030059779 0 0 0 0 0 0 0 15367 8.015029889 21.584642949
37 44 19.592295285 0 1 0 0 0 0 0 17868 16.030059779 15.015403791
38 45 18.701736409 0 0 1 0 0 0 0 15052 19.592295285 26.276956633
39 46 16.002499062 1 0 0 0 0 0 0 10035 18.701736409 15.139500902
40 47 16.943822536 0 0 0 0 1 0 0 13708 16.002499062 12.467824272
41 48 11.295881691 0 0 0 0 0 1 0 13463 16.943822536 21.373413039
42 49 19.767792959 0 0 0 1 0 0 0 13998 11.295881691 8.015029889
43 50 19.767792959 0 0 0 0 0 0 0 14745 19.767792959 16.030059779
44 51 16.943822536 0 1 0 0 0 0 0 16156 19.767792959 19.592295285
45 52 14.119852113 0 0 1 0 0 0 0 13552 16.943822536 18.701736409
46 53 22.869570079 1 0 0 0 0 0 0 11554 14.119852113 16.002499062
47 54 10.481886286 0 0 0 0 1 0 0 13437 22.869570079 16.943822536
48 55 19.057975066 0 0 0 0 0 1 0 14076 10.481886286 11.295881691
49 56 20.010873819 0 0 0 1 0 0 0 14567 19.057975066 19.767792959
50 57 9.528987533 0 0 0 0 0 0 0 14277 20.010873819 19.767792959
51 58 21.916671326 0 1 0 0 0 0 0 16545 9.528987533 16.943822536
52 59 11.000000000 1 0 0 0 0 0 1 15599 21.916671326 14.119852113
53 60 17.000000000 0 0 0 0 1 0 1 17463 11.000000000 22.869570079
54 61 10.000000000 0 0 0 0 0 1 1 17935 17.000000000 10.481886286
55 62 20.000000000 0 0 0 1 0 0 1 18357 10.000000000 19.057975066
56 63 19.000000000 0 0 0 0 0 0 1 19246 20.000000000 20.010873819
57 64 17.000000000 0 1 0 0 0 0 1 21234 19.000000000 9.528987533
58 65 11.000000000 0 0 1 0 0 0 1 18493 17.000000000 21.916671326
59 66 9.000000000 1 0 0 0 0 0 1 15315 11.000000000 11.000000000
60 67 22.000000000 0 0 0 0 1 0 1 17841 9.000000000 17.000000000
61 68 9.000000000 0 0 0 0 0 1 1 18312 22.000000000 10.000000000
62 69 11.000000000 0 0 0 1 0 0 1 17880 9.000000000 20.000000000
63 70 5.000000000 0 0 0 0 0 0 1 19371 11.000000000 19.000000000
64 71 15.000000000 0 1 0 0 0 0 1 21696 5.000000000 17.000000000
65 72 12.000000000 0 0 1 0 0 0 1 18829 15.000000000 11.000000000
66 73 10.000000000 1 0 0 0 0 0 1 14749 12.000000000 9.000000000
67 74 15.000000000 0 0 0 0 1 0 1 17928 10.000000000 22.000000000
68 75 7.000000000 0 0 0 0 0 1 1 18254 15.000000000 9.000000000
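For completeness, this is roughly how I'm checking the fitted values against the mean of target (a sketch; model6 is the fit from the code above and fitted_nn is just an illustrative name):

mean(data_nn$target)                   # roughly 17.45 on this set
fitted_nn <- model6$net.result[[1]]    # fitted values stored by neuralnet
summary(as.vector(fitted_nn))          # everything clusters near the mean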