I am finding it difficult to differentiate estimation methods from optimization methods in the context of machine learning. Both are used to calculate the parameters of a model, so how do they differ? Is one of them a subset of the other?
Optimization is a general term for finding the maximum or minimum of some function. Estimation is a statistical term for finding an estimate of an unknown parameter, given some data. Estimation can be done using optimization methods, e.g. by maximizing the likelihood function or minimizing a loss function. So optimization is broader: you can use it without any data, just by finding the maximum of some abstract function, while estimation has a clearly statistical scope. Optimization can be used for estimation, but there are also estimation methods that are not based on optimization.
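To make the distinction concrete, here is a minimal sketch (the data and the crude grid-search "optimizer" are made up for illustration). The same estimate can be reached by numerically minimizing a loss, or directly via a closed-form statistical formula:

```python
# Assumed toy data for illustration.
data = [1.0, 2.0, 4.0, 5.0]

# Optimization route: pick the value m that minimizes a squared-error loss.
def loss(m):
    return sum((x - m) ** 2 for x in data)

# Crude grid search over candidate values (a stand-in for a real optimizer).
candidates = [i / 1000 for i in range(0, 10001)]  # 0.000 .. 10.000
m_opt = min(candidates, key=loss)

# Estimation route without explicit optimization: the closed-form sample mean,
# which happens to be the exact minimizer of the squared-error loss above.
m_mean = sum(data) / len(data)

print(m_opt, m_mean)  # both equal 3.0 for this data
```

Here estimation is the statistical goal (recover the unknown location parameter from data); optimization is just one route to it, and the closed-form formula reaches the same answer without any search.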

Glorfindel

Tim
- I'm not sure I understand "optimization is broader". Does it mean that estimation is a subset of optimization? Of course, there are many optimization problems that are not about estimation. However, are all estimation problems optimization problems? For example, the method of moments doesn't seem to be constructed as an optimization method. – Pere Oct 07 '17 at 09:50
- No, there are estimation methods that do not use optimization, like the method of moments or Bayes methods. – kjetil b halvorsen Oct 07 '17 at 11:43
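To illustrate the comment's point, a hypothetical method-of-moments sketch (the sample is made up): the estimate comes from solving a moment equation, with no loss function being optimized anywhere.

```python
# Method-of-moments estimate for an Exponential(rate) model.
# The population mean is 1/rate, so equating it to the sample mean
# and solving gives rate_hat = 1 / sample_mean -- no optimizer involved.
data = [0.5, 1.5, 2.0, 4.0]  # assumed sample
sample_mean = sum(data) / len(data)  # 2.0 for this sample
rate_hat = 1.0 / sample_mean  # 0.5
print(rate_hat)
```

The estimator is obtained by plain algebra on the moment equation, which is exactly why the method of moments is a standard example of estimation without optimization.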