We approach the problem of iteratively finding better solutions from a stochastic viewpoint. We consider three principal objectives. First, we want to impose the fewest prior assumptions by maintaining maximal entropy, unbiasedness, and invariance. Second, subject to the first objective, we want to exploit all information available in the black-box scenario. Third, we want to solve at least simple functions reasonably fast. Pursuing this idea, information geometry suggests following the natural gradient in the parameter space of distributions when sampling new solutions. More specifically, the so-called Covariance Matrix Adaptation Evolution Strategy (CMA-ES) can be derived from these principles. The CMA-ES is a comparatively well-established evolutionary method for numerical optimization which, perhaps surprisingly, complies with information-geometric principles. The CMA-ES can be viewed as a variable-metric approach; it performs well on ill-conditioned problems and is particularly robust on rugged search landscapes.
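The sample-and-update loop underlying this family of methods can be illustrated with a minimal sketch. The following pure-Python (mu, lambda) evolution strategy is an illustrative assumption, not the CMA-ES itself: it samples offspring from an isotropic Gaussian around the current mean and recombines the best candidates, but omits the covariance matrix adaptation and step-size control that define the full algorithm (the function name `simple_es` and the fixed geometric step-size decay are placeholders).

```python
import random

def sphere(x):
    # Simple convex test function: f(x) = sum(x_i^2), minimum at the origin.
    return sum(xi * xi for xi in x)

def simple_es(f, x0, sigma=1.0, lam=20, mu=5, iters=200, seed=1):
    """Minimal (mu, lambda) evolution strategy: sample lam offspring from an
    isotropic Gaussian around the mean and recombine the best mu of them.
    Unlike CMA-ES, neither the covariance matrix nor the step size is adapted."""
    rng = random.Random(seed)
    mean = list(x0)
    n = len(mean)
    for _ in range(iters):
        offspring = []
        for _ in range(lam):
            # Sample a candidate solution around the current mean.
            x = [m + sigma * rng.gauss(0.0, 1.0) for m in mean]
            offspring.append((f(x), x))
        # Rank by fitness and keep the mu best candidates.
        offspring.sort(key=lambda t: t[0])
        best = [x for _, x in offspring[:mu]]
        # New mean = average of the selected candidates.
        mean = [sum(col) / mu for col in zip(*best)]
        sigma *= 0.98  # crude fixed decay in place of adapted step-size control
    return mean

result = simple_es(sphere, [3.0, -2.0])
```

On the sphere function this sketch converges close to the optimum, but on ill-conditioned or rugged problems the missing covariance adaptation is exactly what makes the difference between this toy loop and the CMA-ES.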