
I know that, for the grid search algorithm, optimizing $p$ parameters each of which can take on $v$ values, the computational cost would be $O(v^p)$. But I am not sure what the computational cost of the random search algorithm is. Would it simply be $O(p)$?
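
For concreteness, here is a minimal Python sketch of how I picture the two strategies (the hyperparameter names and candidate values are made up); it only counts how many candidate configurations each strategy produces:

```python
# Sketch with hypothetical hyperparameter names/values, comparing how many
# candidate models each search strategy would fit.
import itertools
import random

# p = 3 hyperparameters, each with v = 4 candidate values
param_grid = {
    "learning_rate": [0.001, 0.01, 0.1, 1.0],
    "max_depth": [2, 4, 6, 8],
    "n_estimators": [50, 100, 200, 400],
}

# Grid search: one model per point of the Cartesian product
grid_configs = list(itertools.product(*param_grid.values()))
print(len(grid_configs))  # 4**3 = 64 configurations

# Random search: one model per iteration, sampling a value for each parameter
k = 10  # number of random draws
random_configs = [
    {name: random.choice(values) for name, values in param_grid.items()}
    for _ in range(k)
]
print(len(random_configs))  # 10 configurations
```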

Thank you,

HDB
  • To make progress, consider what $O(v^p)$ denotes. Is $v^p$ the number of machine learning models you estimate? How many machine learning models does your random search estimate? – Sycorax Nov 23 '20 at 04:04
  • @Sycorax I am not sure what you mean by "number of machine learning models I estimate". $p$ here stands for the number of hyperparameters that the model includes, not the number of models. To me, the random search algorithm does not have a set grid for each parameter, so I think the computational cost for the random search algorithm is $O(p)$, but I can't be sure. – HDB Nov 23 '20 at 06:13
  • The $O(\cdot)$ notation appears to quantify some kind of "cost complexity," apparently expressed in terms of the number of models that you estimate. In the case of grid search over $v$ values for each of $p$ parameters, you estimate $v^p$ models, so the meaning of $O(v^p)$ here appears to be that you estimate exactly $v^p$ models. Likewise, if I run random search for $k$ iterations, how many models am I estimating? – Sycorax Nov 23 '20 at 16:31

0 Answers