

Evolutionary algorithms and other popular black-box optimization techniques are highly parametrized algorithms. To run these algorithms, we typically need to decide upon their population sizes, mutation strengths, crossover rates, selective pressure, etc. This parametrization allows us to adjust the behavior of the algorithms to the problem at hand. The chosen parameter values can have a decisive influence on performance. Unfortunately, the identification of good parameter values is still one of the most challenging tasks in evolutionary computation. What complicates the parameter selection problem is the observation that different parameter values can be optimal in different stages of the optimization process. At the beginning of an optimization process, for example, one may want to allow for more exploration, while later on a more focused search ("exploitation") may be preferable. This observation calls for adaptive parameter choices, which automatically adjust the parameter values to the current state of the optimization process. Adaptive parameter choices are today standard in continuous optimization. Quite surprisingly, however, this is not the case in discrete optimization, where they play only a rather marginal role.

A paradigm change towards a more systematic use of non-static parameter choices is much needed. This tutorial aims to contribute to this goal by providing an in-depth discussion of online parameter selection techniques. We survey both experimental and theoretical results which demonstrate the unexploited potential of non-static parameter choices.

Other information: This tutorial addresses experimentally as well as theory-oriented researchers. No specific background is required to follow this tutorial, but some familiarity with evolutionary computation is certainly of advantage.

Carola Doerr is a permanent CNRS researcher at Pierre and Marie Curie University in Paris, France.

Tutorial slides will be uploaded shortly before or after the tutorial.
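To make the idea of state-dependent parameter choices concrete, here is a minimal sketch (not taken from the tutorial itself) of one classic online control scheme: a (1+1) evolutionary algorithm whose mutation rate is adjusted by a one-fifth-success-rule-style update. The OneMax fitness function, the update factor, and all parameter values below are illustrative assumptions, not prescriptions from the tutorial.

```python
import random

def onemax(x):
    """Toy benchmark fitness: the number of 1-bits (illustrative choice)."""
    return sum(x)

def one_plus_one_ea_adaptive(n=100, budget=20000, factor=1.1, seed=0):
    """(1+1) EA with a one-fifth-success-rule-style adaptive mutation rate.

    The mutation rate p is multiplied by `factor` after a successful step
    and divided by factor**(1/4) after a failure, so that roughly one
    success in five steps keeps p stable. The rate thus tracks the current
    state of the search instead of staying fixed at the textbook value 1/n.
    """
    rng = random.Random(seed)
    x = [rng.randint(0, 1) for _ in range(n)]
    fx = onemax(x)
    p = 1.0 / n  # standard static choice, used here only as the starting point
    for _ in range(budget):
        if fx == n:
            break  # optimum of OneMax reached
        # standard bit mutation: flip each bit independently with probability p
        y = [bit ^ (rng.random() < p) for bit in x]
        fy = onemax(y)
        if fy > fx:  # success: accept the offspring and raise the rate
            x, fx = y, fy
            p = min(0.5, p * factor)
        else:        # failure: keep the parent and lower the rate
            p = max(1.0 / n, p / factor ** 0.25)
    return fx, p
```

With typical settings the rate rises early, when improvements are easy to find, and falls back toward 1/n as the search approaches the optimum, mirroring the exploration-to-exploitation shift described above.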
