Any statistical analysis, machine learning, or data science task involves some sort of optimization at the end of the day.
I'm looking for good linear and nonlinear optimization textbooks for self-study. Here are some of my (subjective) criteria for a good optimization book:
- Written in a didactic or problem-based-learning style with examples. If an algorithm is iterative, the book should work through at least the first two iterations as a hand calculation (see the small sketch after this list for the kind of thing I mean). Theorems and proofs are great, but having numerical examples improves learning.
- Presents optimization algorithms in pseudocode, not as a paragraph of prose.
- Computer programs should accompany the text in a high-level language such as MATLAB, Mathematica, or Python.
- Presents traditional unconstrained optimization methods such as gradient descent, quasi-Newton, and conjugate gradient, as well as constrained optimization methods such as quadratic programming, sequential quadratic programming, and reduced-gradient methods. Should cover both line-search and trust-region approaches.
- Also presents nature-inspired methods such as evolutionary computing and other techniques for nonconvex optimization, since in most real-world problems one cannot expect convexity, and the objective is often noisy, discrete, and/or discontinuous.
- Should be written in plain English, using Latin letters in the notation rather than Greek letters wherever possible.
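
To illustrate the kind of worked example I have in mind, here is a minimal Python sketch of plain gradient descent on a toy quadratic, printing the first two iterations the way I'd want a book to show them by hand. The objective, step size, and starting point are just made up for illustration; there is no line search or convergence test here.

```python
import numpy as np

def f(x):
    # Toy quadratic: f(x, y) = (x - 1)^2 + 2*(y + 0.5)^2, minimum at (1, -0.5)
    return (x[0] - 1.0)**2 + 2.0 * (x[1] + 0.5)**2

def grad_f(x):
    # Gradient of the toy quadratic
    return np.array([2.0 * (x[0] - 1.0), 4.0 * (x[1] + 0.5)])

x = np.array([0.0, 0.0])   # arbitrary starting point
alpha = 0.1                # fixed step size, chosen only for illustration

for k in range(2):         # show just the first two iterations, as a book might by hand
    g = grad_f(x)
    x_new = x - alpha * g
    print(f"iter {k}: x = {x}, f(x) = {f(x):.4f}, grad = {g}, x_new = {x_new}")
    x = x_new
```

A book that walks through exactly these two updates on paper, and then states the general method in pseudocode, is the style I'm after.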
Any recommendations?