Published on Sun Jul 26 2020

Beyond the Worst-Case Analysis of Algorithms (Introduction)

Tim Roughgarden

Abstract

One of the primary goals of the mathematical analysis of algorithms is to provide guidance about which algorithm is the "best" for solving a given computational problem. Worst-case analysis summarizes the performance profile of an algorithm by its worst performance on any input of a given size, implicitly advocating for the algorithm with the best-possible worst-case performance. Strong worst-case guarantees are the holy grail of algorithm design, providing an application-agnostic certification of an algorithm's robustly good performance. However, for many fundamental problems and performance measures, such guarantees are impossible and a more nuanced analysis approach is called for. This chapter surveys several alternatives to worst-case analysis that are discussed in detail later in the book.
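To pin the definition down, worst-case analysis can be written as a maximum over inputs. A minimal formalization, where cost(A, x) is a placeholder for whatever performance measure is being tracked (running time, space, approximation error), is

$$\mathrm{WC}_A(n) \;=\; \max_{x \,:\, |x| = n} \mathrm{cost}(A, x),$$

and the worst-case recommendation is any algorithm minimizing this quantity, i.e. $A^* \in \arg\min_A \mathrm{WC}_A(n)$.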

Thu Jul 09 2009
Neural Networks
Beyond No Free Lunch: Realistic Algorithms for Arbitrary Problem Classes
We show how the necessary and sufficient conditions for the No Free Lunch (NFL) theorem to apply can be reduced to the single requirement that the set of objective functions under consideration be closed under permutation. We then provide a more refined definition of performance under which we show that revisiting algorithms are always trumped by enumerative algorithms.
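As a concrete reading of the closure condition, the Python sketch below encodes each objective function on a fixed finite domain {x_1, ..., x_n} as the tuple (f(x_1), ..., f(x_n)) and checks whether a set of such functions is closed under every permutation of the domain. The encoding and helper names are illustrative assumptions, not code from the paper.

from itertools import permutations, product

def closed_under_permutation(functions):
    """Return True if the set of objective functions (each a tuple of values
    over a fixed finite domain) contains f composed with every permutation
    of the domain, for every f in the set."""
    fset = set(functions)
    n = len(next(iter(fset)))
    for f in fset:
        for perm in permutations(range(n)):
            if tuple(f[perm[i]] for i in range(n)) not in fset:
                return False
    return True

# The set of all 0/1-valued functions on a 3-point domain is closed under
# permutation, so the NFL conditions apply to it; a single asymmetric
# function on its own is not.
print(closed_under_permutation(list(product((0, 1), repeat=3))))  # True
print(closed_under_permutation([(0, 0, 1)]))                      # False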
Tue Oct 30 2012
Artificial Intelligence
An Atypical Survey of Typical-Case Heuristic Algorithms
Heuristic approaches often perform so well that they seem to give the right answer almost every time. How close can heuristic algorithms get to always giving the right answer, without inducing seismic complexity-theoretic consequences?
Tue Jul 16 2013
Neural Networks
The Fitness Level Method with Tail Bounds
The fitness-level method, also called the method of f-based partitions, is an intuitive and widely used technique for the running-time analysis of randomized local search heuristics. It was originally defined to prove upper and lower bounds on the expected running time.
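For context, the basic upper-bound form of the method (stated here in its standard formulation for elitist heuristics such as the (1+1) EA, not quoted from this paper) is: partition the search space into levels $A_1 <_f A_2 <_f \dots <_f A_m$ ordered by fitness, with $A_m$ containing only optima. If $s_i$ lower-bounds the probability that a single step starting from a point in $A_i$ produces a point in a strictly higher level, then the expected optimization time satisfies

$$\mathrm{E}[T] \;\le\; \sum_{i=1}^{m-1} \frac{1}{s_i}.$$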
Wed Jan 15 2020
Neural Networks
Parameterized Complexity Analysis of Randomized Search Heuristics
This chapter compiles a number of results that apply the theory of parameterized algorithmics to the running-time analysis of randomized search heuristics. The parameterized approach articulates the running time of algorithms solving combinatorial problems in finer detail than traditional approaches.
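The schematic form of such a finer-grained bound is the standard fixed-parameter tractable (FPT) running time, in which an instance of size $n$ comes with a parameter $k$ (e.g., solution size or treewidth) and the combinatorial explosion is confined to the parameter:

$$T(n, k) \;\le\; f(k) \cdot n^{O(1)},$$

for some computable function $f$ depending only on $k$. (This is the general definition from parameterized complexity, not a result specific to this chapter.)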
Sat Apr 28 2007
Neural Networks
Stochastic Optimization Algorithms
Deterministic methods are very CPU-intensive and are useless on intractable NP-hard problems. To obtain a result, one has to resort to stochastic algorithms. Such algorithms can find very good results, though without any guarantee that the global optimum has been reached.
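As a minimal, self-contained illustration of such an algorithm (not an implementation from the paper), the Python sketch below runs random-restart hill climbing on bit strings: it usually returns a very good solution, but nothing certifies that the global optimum was reached.

import random

def hill_climb(objective, n_bits, restarts=20, steps=1000, rng=random):
    """Random-restart hill climbing: start from random bit strings and
    accept single-bit flips that do not decrease the objective.
    Returns the best solution found, with no optimality guarantee."""
    best, best_val = None, float("-inf")
    for _ in range(restarts):
        x = [rng.randint(0, 1) for _ in range(n_bits)]
        val = objective(x)
        for _ in range(steps):
            i = rng.randrange(n_bits)
            x[i] ^= 1                 # flip one bit
            new_val = objective(x)
            if new_val >= val:
                val = new_val         # keep the (non-worsening) flip
            else:
                x[i] ^= 1             # undo it
        if val > best_val:
            best, best_val = x[:], val
    return best, best_val

# Toy objective: OneMax, i.e. the number of ones; the optimum is all ones.
solution, value = hill_climb(lambda x: sum(x), n_bits=30)
print(value)   # usually 30 here, but stochastic search gives no such promise in general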
Mon Nov 23 2015
Machine Learning
A PAC Approach to Application-Specific Algorithm Selection
This paper adapts concepts from statistical and online learning theory to reason about application-specific algorithm selection. We present one framework that models algorithm selection as a statistical learning problem. We also study the online version of the algorithm selection problem.
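A minimal sketch of the statistical-learning view, under assumed names (portfolio, cost) and not taken from the paper, is empirical risk minimization over a finite portfolio: draw sample instances from the application's unknown distribution and pick the algorithm with the best average performance on the sample.

import heapq, random, timeit

def select_algorithm(algorithms, sample_instances, cost):
    """Empirical-risk-minimization style selection: return the algorithm
    (by name) with the lowest average cost over a sample of instances
    drawn from the application's instance distribution."""
    def empirical_cost(alg):
        return sum(cost(alg, inst) for inst in sample_instances) / len(sample_instances)
    return min(algorithms, key=empirical_cost)

# Hypothetical portfolio: two correct sorting routines with different profiles.
def heap_sort(xs):
    h = list(xs)
    heapq.heapify(h)
    return [heapq.heappop(h) for _ in range(len(h))]

portfolio = {"builtin_sort": sorted, "heap_sort": heap_sort}
sample = [[random.random() for _ in range(1000)] for _ in range(20)]
best = select_algorithm(
    list(portfolio), sample,
    cost=lambda name, inst: timeit.timeit(lambda: portfolio[name](list(inst)), number=3),
)
print(best)   # the empirically faster routine on this sample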