Published on Sat Sep 07 2019

Unlimited Budget Analysis of Randomised Search Heuristics

Jun He, Thomas Jansen, Christine Zarges


Abstract

Performance analysis of all kinds of randomised search heuristics is a rapidly growing and developing field. Run time and solution quality are two popular measures of the performance of these algorithms. The focus of this paper is on the solution quality an optimisation heuristic achieves, not on the time it takes to reach this goal, setting it far apart from runtime analysis. We contribute to its further development by introducing a novel analytical framework, called unlimited budget analysis, to derive the expected fitness value after an arbitrary number of computational steps. It has its roots in the very recently introduced approximation error analysis and bears some similarity to fixed budget analysis. We present the framework and apply it to simple mutation-based algorithms, covering both local and global search. We provide analytical results for a number of pseudo-Boolean functions within unlimited budget analysis and compare them to results derived within the fixed budget framework for the same algorithms and functions. We also report experiments comparing the bounds obtained in the two frameworks with the actual observed performance. The study shows that unlimited budget analysis may lead to the same or more general estimates than fixed budget analysis.
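As an illustration of the kind of expected-fitness statement both frameworks target, here is a minimal sketch (not taken from the paper; the function names and parameter values are ours): it simulates random local search (RLS) on OneMax and compares the observed mean fitness after a budget of t steps with the standard closed form E[f_t] = n - (n - f_0)(1 - 1/n)^t for this algorithm and function.

```python
import random

def rls_onemax(n, steps, rng):
    # Random Local Search on OneMax: flip one uniformly chosen bit per step,
    # accept the offspring if it is at least as good as the parent.
    x = [rng.randint(0, 1) for _ in range(n)]
    fitness = sum(x)  # OneMax fitness = number of 1-bits
    for _ in range(steps):
        i = rng.randrange(n)
        if x[i] == 0:      # flipping a 0-bit improves fitness: accept
            x[i] = 1
            fitness += 1
        # flipping a 1-bit would decrease fitness: reject, x unchanged
    return fitness

def expected_fitness(n, f0, t):
    # Closed form for RLS on OneMax: E[f_t] = n - (n - f0) * (1 - 1/n)^t
    return n - (n - f0) * (1 - 1 / n) ** t

rng = random.Random(42)
n, t, runs = 100, 200, 2000
observed = sum(rls_onemax(n, t, rng) for _ in range(runs)) / runs
predicted = expected_fitness(n, n / 2, t)  # E[f0] = n/2 for a uniform start
print(f"observed mean fitness:  {observed:.2f}")
print(f"predicted mean fitness: {predicted:.2f}")
```

Because the per-step drift of the distance n - f_t is exactly a factor (1 - 1/n) here, prediction and simulation agree closely; for global mutation operators or harder functions, the two frameworks give bounds rather than exact values.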

Wed Oct 21 2020
Neural Networks
Improved Runtime Results for Simple Randomised Search Heuristics on Linear Functions with a Uniform Constraint
In the last decade remarkable progress has been made in the development of suitable proof techniques for analysing randomised search heuristics. The theoretical investigation of these algorithms on classes of functions is essential to understanding the underlying stochastic process.
Mon Jul 09 2018
Neural Networks
Optimal Parameter Choices via Precise Black-Box Analysis
We prove that the unary unbiased black-box complexity of the OneMax benchmark function class is $n \ln(n) - cn \pm o(n)$ for a constant $c$ of at most $0.2665$. This runtime can be achieved with a
Sat Jan 06 2018
Neural Networks
Complexity Theory for Discrete Black-Box Optimization Heuristics
Running time analysis aims at understanding the performance of a given heuristic on a given problem. It is most useful when it is complemented by a meaningful complexity theory. Several black-box complexity models have been developed to analyze the best possible performance.
Fri Jun 12 2020
Neural Networks
Improved Fixed-Budget Results via Drift Analysis
Fixed-budget theory is concerned with computing or bounding the fitness value achievable within a given budget of fitness evaluations. We transfer drift theory, the key tool used to derive expected optimization times, to the fixed-budget perspective.
Tue Mar 26 2019
Neural Networks
On the Benefits of Populations on the Exploitation Speed of Standard Steady-State Genetic Algorithms
Populations are useful for the global exploration of multi-modal optimisation problems. In this paper we provide evidence that evolving populations via crossover and mutation may also benefit the optimisation time for hillclimbing on unimodal functions.
Tue Jun 13 2017
Neural Networks
Evaluating Noisy Optimisation Algorithms: First Hitting Time is Problematic
A key part of any evolutionary algorithm is fitness evaluation. When fitness evaluations are corrupted by noise, a strategy is needed in order to cope with this. Resampling is one of the most common strategies. This paper argues that focussing on final solution quality is more realistic.