The answer here depends on whether you want to explore the space by collecting a batch of informative data points at once, or whether you'd like to converge quickly to an optimal point.
If you want to collect a batch of points at once that are informative about the rest of the space, then you want to maximize the entropy of the points you collect. If you know what covariance function you're using in your GP model, you can calculate that entropy from the prior covariance at those points before you actually make any measurements, because the entropy of a Gaussian depends only on its covariance, not on the observed values. The entropy of a multivariate Gaussian is:
$$\frac{1}{2}\ln\det\left(2\pi e\,\boldsymbol{\Sigma}\right)$$
where $\boldsymbol{\Sigma}$ is the covariance matrix of the GP evaluated at the candidate points.
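Here's a minimal sketch of that calculation, assuming an RBF kernel and a couple of candidate batches (both are placeholder choices, not anything specific to your setup): build the prior covariance at the candidates, then take half the log-determinant of $2\pi e\,\boldsymbol{\Sigma}$.

```python
import numpy as np
from sklearn.gaussian_process.kernels import RBF

def prior_entropy(X, kernel=RBF(length_scale=1.0), jitter=1e-6):
    """Differential entropy of the GP prior evaluated at candidate points X."""
    # Prior covariance at the candidate points, with a small jitter for stability.
    K = kernel(X) + jitter * np.eye(len(X))
    # 0.5 * ln det(2*pi*e*K), via a numerically stable log-determinant.
    sign, logdet = np.linalg.slogdet(2 * np.pi * np.e * K)
    return 0.5 * logdet

# Compare two candidate batches and keep the higher-entropy one.
X_clustered = np.array([[0.0], [0.1], [0.2]])  # tightly clustered points
X_spread = np.array([[0.0], [1.0], [2.0]])     # spread-out points
print(prior_entropy(X_clustered), prior_entropy(X_spread))  # spread-out batch wins
```

Greedily adding whichever candidate raises this entropy the most gives you a simple space-filling design under your kernel.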
On the other hand, if you want to find an optimal point, then you want a Bayesian black-box optimization algorithm. One simple example is the upper confidence bound (UCB) algorithm (sketched in code after the list below):
1. Collect an initial data point (or a few).
2. Train a GP regression model on everything measured so far.
3. Find the point in your search space that maximizes the predicted mean plus some multiple of the predicted standard deviation.
4. Measure that point.
5. Repeat steps 2-4 until some stopping condition is met.
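Here's a minimal sketch of that loop, assuming a 1-D search space on [0, 10], a toy objective `f`, and a candidate grid for the argmax step; all of those are illustrative stand-ins, and I'm using scikit-learn's `GaussianProcessRegressor` for the GP.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def f(x):
    """Toy black-box objective we want to maximize."""
    return -(x - 7.3) ** 2 + np.sin(3 * x)

rng = np.random.default_rng(0)
grid = np.linspace(0, 10, 500).reshape(-1, 1)  # candidate grid for step 3's argmax

# Step 1: collect a few initial data points.
X = rng.uniform(0, 10, size=(3, 1))
y = f(X).ravel()

for _ in range(20):  # stopping condition: fixed evaluation budget
    # Step 2: train a GP regression model on everything measured so far.
    gp = GaussianProcessRegressor(kernel=RBF(length_scale=1.0), normalize_y=True)
    gp.fit(X, y)

    # Step 3: maximize the upper confidence bound, mean + kappa * std.
    mu, sigma = gp.predict(grid, return_std=True)
    kappa = 2.0
    x_next = grid[np.argmax(mu + kappa * sigma)].reshape(1, -1)

    # Step 4: measure that point and add it to the training set.
    X = np.vstack([X, x_next])
    y = np.append(y, f(x_next).ravel())

print("best x:", X[np.argmax(y)], "best y:", y.max())
```

The `kappa` weight trades off exploitation (mean) against exploration (standard deviation); larger values explore more.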
I believe one of the other commenters linked you to more resources on Bayesian black-box optimization. scikit-optimize implements some of these algorithms with Gaussian processes.
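For example, scikit-optimize's `gp_minimize` wraps the whole loop (GP model plus acquisition function). Note that it minimizes, so negate your objective if you want a maximum; the toy objective and bounds below are just placeholders.

```python
from skopt import gp_minimize

# Toy objective; skopt minimizes, so this plays the role of a negated reward.
def objective(params):
    x = params[0]
    return (x - 7.3) ** 2  # minimum near x = 7.3

result = gp_minimize(
    objective,
    dimensions=[(0.0, 10.0)],  # one continuous search dimension
    acq_func="LCB",            # lower confidence bound (UCB for minimization)
    n_calls=20,
    random_state=0,
)
print("best x:", result.x, "best value:", result.fun)
```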