Journal of Machine Learning Research 15 (2014) 949-980    Submitted 4/11; Revised 5/13; Published 3/14

Natural Evolution Strategies

Daan Wierstra  [email protected]
Tom Schaul  [email protected]
DeepMind Technologies Ltd.
Fountain House, 130 Fenchurch Street
London, United Kingdom

Tobias Glasmachers  [email protected]
Institute for Neural Computation
Universitätsstrasse 150
Ruhr-University Bochum, Germany

Yi Sun  [email protected]
Google Inc.
1600 Amphitheatre Pkwy
Mountain View, United States

Jan Peters  [email protected]
Intelligent Autonomous Systems Institute
Hochschulstrasse 10
Technische Universität Darmstadt, Germany
Jürgen Schmidhuber  [email protected]
Istituto Dalle Molle di Studi sull'Intelligenza Artificiale (IDSIA)
University of Lugano (USI)/SUPSI
Galleria 2
Manno-Lugano, Switzerland

Editor: Una-May O'Reilly

Abstract

This paper presents Natural Evolution Strategies (NES), a recent family of black-box optimization algorithms that use the natural gradient to update a parameterized search distribution in the direction of higher expected fitness. We introduce a collection of techniques that address issues of convergence, robustness, sample complexity, computational complexity and sensitivity to hyperparameters. This paper explores a number of implementations of the NES family, such as general-purpose multi-variate normal distributions and separable distributions tailored towards search in high-dimensional spaces. Experimental results show best published performance on various standard benchmarks, as well as competitive performance on others.

Keywords: natural gradient, stochastic search, evolution strategies, black-box optimization, sampling

1. Introduction

Many real-world optimization problems are too difficult or complex to model directly. Therefore, they might best be solved in a 'black-box' manner, requiring no additional information about the function being optimized beyond fitness evaluations at selected points in the search space.
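As a concrete illustration of the scheme summarized in the abstract, the sketch below runs a NES-style loop for a separable Gaussian search distribution: sample candidates, weight them by fitness, and take natural-gradient steps on the mean and per-dimension step sizes. It is a sketch only, not the authors' reference implementation; the names (nes_sketch, lr_mu, lr_sigma) are ours, and plain fitness standardization stands in for the rank-based utility functions developed later in the paper.

```python
import numpy as np

def nes_sketch(f, mu, sigma, iterations=200, pop=20, lr_mu=1.0, lr_sigma=0.1):
    """Illustrative NES-style loop with a separable Gaussian N(mu, diag(sigma^2)).

    Maximizes f; a sketch under our own naming, not the paper's reference code.
    """
    mu, sigma = np.array(mu, dtype=float), np.array(sigma, dtype=float)
    d = mu.size
    for _ in range(iterations):
        s = np.random.randn(pop, d)          # standard-normal samples
        z = mu + sigma * s                   # candidate solutions
        fit = np.array([f(zi) for zi in z])
        # Crude stand-in for the paper's rank-based fitness shaping:
        u = (fit - fit.mean()) / (fit.std() + 1e-12)
        # Monte Carlo estimates of the natural gradient of expected fitness
        g_mu = s.T @ u / pop                 # with respect to the mean
        g_sigma = (s**2 - 1).T @ u / pop     # with respect to log step sizes
        mu += lr_mu * sigma * g_mu
        sigma *= np.exp(0.5 * lr_sigma * g_sigma)
    return mu, sigma

# Example: maximize a concave function (negative sphere), optimum at the origin
best_mu, _ = nes_sketch(lambda x: -np.sum(x**2), mu=np.full(5, 3.0), sigma=np.ones(5))
```

The multiplicative exp-update keeps the step sizes positive, which corresponds to a natural-gradient step in log sigma; the body of the paper derives these updates and the rank-based utilities rigorously.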