I am using neural network algorithms on a relatively large dataset (1,700 observations, 40 features).
I tuned the hyperparameters with nested cross-validation.
I also wanted to compare five algorithms with each other by benchmarking.
When I selected about five hyperparameters to tune (number of nodes, number of layers, alpha, dropout, and epochs), the computation took so long that I cancelled it.
Since tuning many hyperparameters, especially on a large dataset with many features, is so computationally expensive,
is it acceptable to tune only a limited subset of the hyperparameters (e.g., just the number of nodes and dropout) rather than all or most of them?
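For context, here is a minimal sketch of what I mean by tuning only a subset: a nested cross-validation where the inner search covers just two hyperparameters and everything else stays at its default. This assumes scikit-learn's `MLPClassifier` (which exposes `hidden_layer_sizes` and the L2 penalty `alpha`, but not dropout, so `alpha` stands in for dropout here) and synthetic data in place of my real dataset:

```python
# Hypothetical sketch: nested CV tuning ONLY two hyperparameters of an MLP.
# Synthetic data stands in for the real 1,700 x 40 dataset.
from sklearn.datasets import make_classification
from sklearn.model_selection import GridSearchCV, cross_val_score
from sklearn.neural_network import MLPClassifier

X, y = make_classification(n_samples=1700, n_features=40, random_state=0)

# Only a small subset of hyperparameters is searched;
# layers, epochs, etc. keep their default values.
param_grid = {
    "hidden_layer_sizes": [(10,), (20,)],
    "alpha": [1e-4, 1e-2],
}

inner = GridSearchCV(
    MLPClassifier(max_iter=200, random_state=0),
    param_grid,
    cv=3,      # inner loop: hyperparameter tuning
    n_jobs=-1,
)

# Outer loop: unbiased performance estimate of the tuned model.
scores = cross_val_score(inner, X, y, cv=3)
print(scores.mean())
```

With 2 x 2 = 4 candidate settings instead of a full five-dimensional grid, the fit count drops dramatically, which is the practical motivation behind the question.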
I searched Google and the SO questions but did not find an answer.
I appreciate your kind help.