General lower bounds for randomized direct search with isotropic sampling

Date

2007-07

Abstract

The focus is on a certain class of randomized direct-search methods for optimization in (high-dimensional) Euclidean space, namely for the minimization of a function f: R^n -> R, where f is given by an oracle, i.e. a black box for f-evaluations. The iterative methods under consideration generate a sequence of candidate solutions; potential candidate solutions are obtained by adding an isotropically distributed vector to the current candidate solution (possibly several times, after which one of these samples is chosen to become the next candidate solution in the sequence). This class of randomized direct-search methods covers, in particular, several evolutionary algorithms. Lower bounds are proved on the number of samples (i.e. queries to the f-oracle) necessary for such a method to reduce the approximation error in the search space. The lower bounds presented do not hold merely in expectation; rather, runtimes below these bounds are observed only with a probability that is exponentially small in the search-space dimension n. To derive such strong bounds, an appealingly simple yet powerful method is applied: the guided/directed random search is regarded as a selected fragment of a purely/obliviously random search. Interestingly, the lower bounds thus obtained turn out to be tight (up to an absolute constant).
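To make the algorithm class concrete, the following minimal Python sketch shows one instance of the scheme described above: in each iteration, a few candidates are generated by adding an isotropically distributed vector to the current solution, and one of them may be selected as the next candidate solution. The sphere test function, the fixed step length, and the elitist selection rule are illustrative assumptions and are not taken from the paper.

```python
import numpy as np

def isotropic_direct_search(f, x0, step_size, num_samples, max_queries, rng=None):
    """Sketch of a randomized direct-search method with isotropic sampling.

    Each iteration draws `num_samples` isotropic mutation vectors, queries the
    f-oracle for the resulting candidates, and keeps the best one if it
    improves on the current solution.  The fixed step size and elitist
    selection are assumptions for illustration only.
    """
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x0, dtype=float)
    fx = f(x)
    queries = 1
    while queries + num_samples <= max_queries:
        # Isotropic sampling: uniformly distributed directions, fixed length.
        directions = rng.standard_normal((num_samples, x.size))
        directions /= np.linalg.norm(directions, axis=1, keepdims=True)
        candidates = x + step_size * directions
        values = np.array([f(c) for c in candidates])  # queries to the f-oracle
        queries += num_samples
        best = values.argmin()
        if values[best] < fx:  # select one sample as the next candidate solution
            x, fx = candidates[best], values[best]
    return x, fx, queries

if __name__ == "__main__":
    n = 50
    sphere = lambda x: float(np.dot(x, x))  # assumed test function
    x, fx, q = isotropic_direct_search(sphere, np.ones(n), 0.1, 4, 10_000)
    print(f"f-value {fx:.4f} after {q} oracle queries")
```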

Keywords

direct search, heuristic optimization, probabilistic analysis, random search, theory
