
I wanted to ask if this is a valid way of doing hyperparameter tuning. I have 7 parameters for my model. Since there are too many parameters to do a grid search, I was going to try a different method:

  1. Run 500 iterations, drawing random values for the hyperparameters, and record the parameter values and the final score metric that I am trying to minimize.
  2. From the 500 results, take the top 20% by score and calculate the mean and standard deviation of each parameter.
  3. Repeat, except that in step 1 each parameter is sampled from a normal distribution with the mean and standard deviation calculated in step 2.

Would this be a way to at least roughly do hyperparameter tuning?
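For what it's worth, the procedure described above closely resembles the cross-entropy method: sample candidates, keep an elite fraction, refit a Gaussian per parameter, and resample. A minimal sketch of those three steps, using a toy quadratic in place of the real model-training-and-scoring loop (the objective, the initial mean/std, and the number of generations are all assumptions for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for the real training run: any function mapping a
# 7-dimensional hyperparameter vector to a score to be minimized.
def score(params):
    return np.sum((params - 3.0) ** 2)

n_params = 7
n_samples = 500
elite_frac = 0.20

# Step 1: start with broad random draws for each hyperparameter.
mean = np.zeros(n_params)
std = np.full(n_params, 10.0)

for generation in range(5):
    # Sample candidate hyperparameter vectors from the current distribution.
    samples = rng.normal(mean, std, size=(n_samples, n_params))
    scores = np.array([score(s) for s in samples])

    # Step 2: keep the top 20% by score and refit mean/std per parameter.
    elite = samples[np.argsort(scores)[: int(elite_frac * n_samples)]]

    # Step 3: the refit distribution is what the next generation samples from.
    mean, std = elite.mean(axis=0), elite.std(axis=0)

print(mean)  # concentrates near the minimizer of the toy objective
```

One practical caveat with this scheme: the fitted standard deviations can collapse too quickly, so the search converges prematurely to a local region. Implementations often add a small smoothing term or a floor on `std` to keep exploring.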

  • Why not use Bayesian tuning? – matt Jul 09 '20 at 16:58
  • There are a lot of ways to make hyperparameter tuning more intelligent. Some examples: https://stats.stackexchange.com/questions/193306/optimization-when-cost-function-slow-to-evaluate/193310#193310 Why are you trying to invent a new method instead of using one of the established ones? Is there a particular shortcoming or drawback you're trying to ameliorate, or another reason that the existing methods are unsuitable? – Sycorax Jul 09 '20 at 18:57
  • Seems you are both right; I think Bayesian tuning would be the better option (I had no knowledge of it, but I am reading into it now). Thanks! – Ryohei Namiki Jul 09 '20 at 19:11

0 Answers