Published on Wed Jun 07 2017

Stochastic Global Optimization Algorithms: A Systematic Formal Approach

Jonatan Gomez

Abstract

Some global optimization problems cannot be solved using analytic methods, so numeric/algorithmic approaches are used to find near-optimal solutions for them. A stochastic global optimization algorithm (SGoal) is an iterative algorithm that generates a new population (a set of candidate solutions) from a previous population using stochastic operations. Although some research works have formalized SGoals using Markov kernels, such formalization is not general and is sometimes blurred. In this paper, we propose a comprehensive and systematic formal approach for studying SGoals. First, we present the required probability theory (\sigma-algebras, measurable functions, kernels, Markov chains, products, convergence, and so on) and prove that some algorithmic functions, like swapping and projection, can be represented by kernels. Then, we introduce the notion of a join-kernel as a way of characterizing the combination of stochastic methods. Next, we define the optimization space, a formal structure (a set with a \sigma-algebra that contains strict \epsilon-optimal states) for studying SGoals, and we develop kernels, such as sort and permutation, on this structure. Finally, we express some popular SGoals in terms of the developed theory, introduce sufficient conditions for the convergence of an SGoal, and prove convergence of some popular SGoals.
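Not part of the paper itself, but as a rough illustration of the abstract's kernel viewpoint: an algorithmic operation on a population can be read as a Markov kernel, where a deterministic operation like swapping corresponds to a degenerate kernel (all probability mass on one next population) and a random permutation corresponds to a genuinely stochastic one. The function names below are our own, not the paper's.

```python
import random

def swap_kernel(population, i, j):
    """Deterministic swap viewed as a degenerate Markov kernel:
    given the current population, all probability mass lands on
    the population with positions i and j exchanged."""
    new_pop = list(population)
    new_pop[i], new_pop[j] = new_pop[j], new_pop[i]
    return new_pop

def random_permutation_kernel(population, rng=random):
    """Stochastic kernel: the next population is sampled uniformly
    from all permutations of the current one."""
    new_pop = list(population)
    rng.shuffle(new_pop)
    return new_pop
```

Composing such kernels (e.g. a selection kernel followed by a variation kernel) is, informally, what the paper's join-kernel construction makes precise.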

Sun Oct 11 2020
Neural Networks
Non-Stationary Stochastic Global Optimization Algorithms
Gomez proposes a formal and systematic approach for characterizing stochastic global optimization algorithms. Using it, Gomez formalizes algorithms with a fixed next-population stochastic method.
Sat Apr 28 2007
Neural Networks
Stochastic Optimization Algorithms
Deterministic methods are very CPU-intensive and are useless on intractable NP-hard problems. In order to get a result, one needs to resort to stochastic algorithms. Such algorithms can find very good results, without any guarantee that the global optimum has been reached.
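As a minimal sketch (our own, not from the cited work) of what "good results without a guarantee" means in practice, pure random search samples candidates and keeps the best one seen; it can get close to an optimum but never certifies it:

```python
import random

def random_search(f, lower, upper, iters=1000, rng=random):
    """Minimal stochastic optimizer: sample candidates uniformly on
    [lower, upper] and keep the best. Often finds good solutions,
    but never proves the global optimum has been reached."""
    best_x = rng.uniform(lower, upper)
    best_f = f(best_x)
    for _ in range(iters):
        x = rng.uniform(lower, upper)
        fx = f(x)
        if fx < best_f:
            best_x, best_f = x, fx
    return best_x, best_f
```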
Thu Jun 14 2018
Neural Networks
Theory of Estimation-of-Distribution Algorithms
Estimation-of-distribution algorithms (EDAs) are general metaheuristics used in optimization. EDAs represent a more recent alternative to classical approaches like evolutionary algorithms. Recently, significant progress has been made in the theoretical understanding of EDAs.
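For concreteness, a toy sketch (ours, not from the cited survey) of the simplest EDA, a UMDA-style univariate model on the OneMax problem: sample a population from a product of Bernoulli marginals, refit the marginals to the best individuals, and repeat.

```python
import random

def umda_onemax(n=20, pop=100, elite=20, iters=50, rng=None):
    """Univariate EDA (UMDA) on OneMax: keep a probability vector p,
    sample bitstrings from it, re-estimate p from the elite, repeat."""
    rng = rng or random.Random(0)
    p = [0.5] * n
    for _ in range(iters):
        population = [[1 if rng.random() < p[i] else 0 for i in range(n)]
                      for _ in range(pop)]
        population.sort(key=sum, reverse=True)
        best = population[:elite]
        p = [sum(ind[i] for ind in best) / elite for i in range(n)]
        # clamp marginals away from 0/1 so the model stays non-degenerate
        p = [min(max(pi, 1.0 / n), 1 - 1.0 / n) for pi in p]
    return max(population, key=sum)
```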
Fri Nov 27 2015
Artificial Intelligence
A Stochastic Process Model of Classical Search
Among classical search algorithms with the same heuristic information, with sufficient memory A* is essentially as fast as possible in finding a proven optimal solution. In many situations optimal solutions are simply infeasible, and thus search algorithms that trade solution quality for speed are desirable.
Mon Jan 25 2021
Neural Networks
Population-Based Methods: PARTICLE SWARM OPTIMIZATION -- Development of a General-Purpose Optimizer and Applications
This thesis is concerned with continuous, static, and single-objective optimization problems subject to inequality constraints. The particle swarm optimization paradigm was inspired by previous simulations of the cooperative behaviour observed in social beings. It is a bottom-up, randomly weighted, population-based method whose ability to optimize emerges from
Wed Jan 31 2018
Artificial Intelligence
A Cross Entropy based Optimization Algorithm with Global Convergence Guarantees
The cross entropy (CE) method is a model based search method to solve optimization problems where the objective function has minimal structure. The Monte-Carlo version of the CE method employs the naive sample averaging technique which is inefficient, both computationally and space wise.
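The naive Monte-Carlo CE loop mentioned above can be sketched for a 1-D objective as follows (our own illustration, assuming a Gaussian sampling model; not the cited paper's improved algorithm): sample from the current distribution, keep the elite samples, and refit the distribution's parameters to them by plain sample averaging.

```python
import random
import statistics

def cross_entropy_min(f, mu=0.0, sigma=5.0, pop=100, elite=10,
                      iters=50, rng=None):
    """Monte-Carlo cross-entropy method for a 1-D objective:
    sample from N(mu, sigma), refit mu/sigma to the elite samples
    via naive sample averages, and iterate."""
    rng = rng or random.Random(0)
    for _ in range(iters):
        xs = sorted((rng.gauss(mu, sigma) for _ in range(pop)), key=f)
        best = xs[:elite]
        mu = statistics.fmean(best)
        sigma = statistics.pstdev(best) + 1e-6  # avoid collapse to zero
    return mu
```

The per-iteration storage of all `pop` samples is exactly the inefficiency the cited paper targets.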