In machine learning, I've learned that one way to optimize a model's hyperparameters is a grid search, which evaluates the model at evenly spaced hyperparameter values and determines which combination gives the best results on the validation set.
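To make sure I understand grid search correctly, here's a toy sketch of what I mean (the hyperparameter names and the score function are invented stand-ins for an actual train-and-validate step):

```python
from itertools import product

# Toy stand-in for "train the model and score it on the validation set";
# in reality this is the expensive part. The hyperparameters (lr, n_units)
# and the formula below are made up purely for illustration.
def validation_score(lr, n_units):
    return -(lr - 0.01) ** 2 - ((n_units - 64) ** 2) * 1e-6

# Evenly spaced candidate values for each hyperparameter.
grid = {"lr": [0.001, 0.01, 0.1], "n_units": [32, 64, 128]}

# Exhaustively try every combination and keep the best-scoring one.
best = max(product(grid["lr"], grid["n_units"]),
           key=lambda combo: validation_score(*combo))
print(best)  # (0.01, 64)
```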
Since the mapping from hyperparameter values to model performance can have multiple local optima, would it make sense to use a metaheuristic search method, such as a genetic algorithm?
The genome could be a binary sequence encoding the hyperparameter values, and an individual's fitness could be the model's validation score for the hyperparameters represented by its genetic material.
The major flaw that comes to mind is that the model has to be trained over and over again, which would take a lot of time. Is there a way to compensate for that by running fewer training iterations per evaluation, and would that distort the results much? Then again, grid search also has to train the model for every hyperparameter combination, so maybe the time difference between the two wouldn't be that big?
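A back-of-the-envelope comparison of the training budgets, with entirely hypothetical numbers: grid search trains the model once per grid point, while a naive GA trains it once per individual per generation (fewer if elite individuals carry their cached score forward):

```python
# Hypothetical budget comparison: number of full training runs each needs.
grid_evals = 3 * 3 * 4    # e.g. 3 learning rates x 3 batch sizes x 4 depths
ga_evals = 8 * 20         # e.g. population of 8 evolved for 20 generations
print(grid_evals, ga_evals)  # 36 160
```

So depending on the population size and generation count, the GA could easily end up costing *more* training runs than the grid, not fewer, unless evaluations are cached or truncated.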
What do you think?