Published on Mon Jan 06 2020

Frequency Fitness Assignment: Making Optimization Algorithms Invariant under Bijective Transformations of the Objective Function Value

Thomas Weise, Zhize Wu, Xinlu Li, Yan Chen


Abstract

Under Frequency Fitness Assignment (FFA), the fitness corresponding to an objective value is its encounter frequency in fitness assignment steps and is subject to minimization. FFA renders optimization processes invariant under bijective transformations of the objective function value. On TwoMax, Jump, and Trap functions of dimension s, the classical (1+1)-EA with standard mutation at rate 1/s can have expected runtimes exponential in s. In our experiments, a (1+1)-FEA, the same algorithm but using FFA, exhibits mean runtimes that seem to scale as s² ln s. Since Jump and Trap are bijective transformations of OneMax, it behaves identically on all three. On OneMax, LeadingOnes, and Plateau problems, it seems to be slower than the (1+1)-EA by a factor linear in s. The (1+1)-FEA performs much better than the (1+1)-EA on W-Model and MaxSat instances. We further verify the bijection invariance by applying the MD5 checksum computation as a transformation to some of the above problems and observe the same behaviors. Finally, we show that FFA can improve the performance of a memetic algorithm for job shop scheduling.
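The abstract describes FFA only in words. As a concrete illustration, the following minimal Python sketch shows one plausible reading of a (1+1)-FEA on a bit-string problem; it is not the authors' reference implementation, and the objective function, the exact bookkeeping of the frequency table H, and all parameter names are assumptions made here for clarity.

```python
import random


def onemax(x):
    """Objective value of a bit string: number of zero bits (0 is optimal)."""
    return x.count(0)


def flip_mutation(x, rng):
    """Standard mutation: flip each bit independently with probability 1/s."""
    s = len(x)
    return [b ^ 1 if rng.random() < 1.0 / s else b for b in x]


def one_plus_one_fea(f, s, budget, seed=None):
    """Sketch of a (1+1)-EA using Frequency Fitness Assignment (FFA).

    Instead of comparing raw objective values, the algorithm keeps a table H
    counting how often each objective value has been encountered in fitness
    assignment steps, and prefers the solution whose objective value has the
    lower (rarer) encounter frequency.
    """
    rng = random.Random(seed)
    x = [rng.randint(0, 1) for _ in range(s)]
    fx = f(x)
    H = {}                              # objective value -> encounter count
    best_value, best_solution = fx, list(x)
    for _ in range(budget):
        y = flip_mutation(x, rng)
        fy = f(y)
        # fitness assignment step: count both encountered objective values
        H[fx] = H.get(fx, 0) + 1
        H[fy] = H.get(fy, 0) + 1
        if fy < best_value:
            best_value, best_solution = fy, list(y)
        # FFA acceptance: move to the solution whose objective value is rarer
        if H[fy] <= H[fx]:
            x, fx = y, fy
    return best_value, best_solution


if __name__ == "__main__":
    value, _ = one_plus_one_fea(onemax, s=32, budget=200_000, seed=42)
    print("best objective value found:", value)
```

Because the acceptance decision depends only on how often each objective value has been encountered, relabelling the objective values with any bijection (for instance an MD5-style checksum, as in the experiments mentioned above) leaves the sequence of accepted parents unchanged for the same random seed; this is the invariance the title refers to. Only the best-so-far bookkeeping in this sketch still compares raw objective values.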

Thu Mar 18 2021
Neural Networks
On Steady-State Evolutionary Algorithms and Selective Pressure: Why Inverse Rank-Based Allocation of Reproductive Trials is Best
Fri Oct 20 2006
Neural Networks
Fitness Uniform Optimization
Mon Dec 14 2020
Neural Networks
Theoretical Analyses of Multi-Objective Evolutionary Algorithms on Multi-Modal Objectives
This paper takes a first step towards a deeper understanding of how evolutionary algorithms solve multi-modal multi-objective problems. We propose the OneJumpZeroJump problem, a bi-objective problem whose single objectives are isomorphic to the classic jump functions benchmark.
Mon Apr 15 2019
Neural Networks
The 1/5-th Rule with Rollbacks: On Self-Adjustment of the Population Size in the (1+(λ,λ)) GA
Self-adjustment of parameters can significantly improve the performance of evolutionary algorithms. The one-fifth rule, which guides this self-adjustment, can raise the population size too fast on problems that are too far away from perfect fitness-distance correlation.
Fri Aug 17 2018
Neural Networks
Towards a Theory-Guided Benchmarking Suite for Discrete Black-Box Optimization Heuristics: Profiling EA Variants on OneMax and LeadingOnes
No widely accepted equivalent exists in the domain of discrete black-box optimization. We adjust the COCO software to pseudo-Boolean optimization problems. This allows a fine-grained empirical analysis of black-box heuristics.
Mon Jul 09 2018
Neural Networks
Optimal Parameter Choices via Precise Black-Box Analysis
We prove that the unary unbiased black-box complexity of the OneMax benchmark function class is $n \ln(n) - cn \pm o(n)$ for a constant $c$ that is at most 0.2665. This runtime can be achieved with a simple (1+1)-type algorithm using a fitness-dependent mutation strength.