Published on Tue Feb 16 2021

Comparing and Combining Approximate Computing Frameworks

Saeid Barati, Gordon Kindlmann, Hank Hoffmann

Approximate computing frameworks configure applications so they can operate at a range of points in an accuracy-performance trade-off space. VIPER and BOA compare and combine three different approximation frameworks from across the system stack.

Abstract

Approximate computing frameworks configure applications so they can operate at a range of points in an accuracy-performance trade-off space. Prior work has introduced many frameworks to create approximate programs. As approximation frameworks proliferate, it is natural to ask how they can be compared and combined to create even larger, richer trade-off spaces. We address these questions by presenting VIPER and BOA. VIPER compares trade-off spaces induced by different approximation frameworks by visualizing performance improvements across the full range of possible accuracies. BOA is a family of exploration techniques that quickly locate Pareto-efficient points in the immense trade-off space produced by the combination of two or more approximation frameworks. We use VIPER and BOA to compare and combine three different approximation frameworks from across the system stack, including: one that changes numerical precision, one that skips loop iterations, and one that manipulates existing application parameters. Compared to simply looking at Pareto-optimal curves, we find VIPER's visualizations provide a quicker and more convenient way to determine the best approximation technique for any accuracy loss. Compared to a state-of-the-art evolutionary algorithm, we find that BOA explores 14x fewer configurations yet locates 35% more Pareto-efficient points.
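To make the notion of Pareto efficiency in an accuracy-performance trade-off space concrete, the sketch below (not from the paper, using hypothetical configuration data) filters a set of (accuracy loss, speedup) configurations down to the non-dominated ones, which is the kind of frontier that VIPER visualizes and BOA searches for.

```python
# Minimal sketch: identify Pareto-efficient points in an accuracy-performance
# trade-off space. Each configuration is an (accuracy_loss, speedup) pair;
# a point is Pareto-efficient if no other point has both lower (or equal)
# accuracy loss and higher (or equal) speedup while differing from it.

def pareto_efficient(configs):
    """Return configurations not dominated by any other configuration.

    configs: list of (accuracy_loss, speedup) tuples, where lower
    accuracy_loss and higher speedup are both better.
    """
    frontier = []
    for loss, speedup in configs:
        dominated = any(
            other_loss <= loss and other_speedup >= speedup
            and (other_loss, other_speedup) != (loss, speedup)
            for other_loss, other_speedup in configs
        )
        if not dominated:
            frontier.append((loss, speedup))
    return frontier


if __name__ == "__main__":
    # Hypothetical configurations from combining two approximation frameworks.
    configs = [(0.00, 1.0), (0.01, 1.4), (0.02, 1.3), (0.05, 2.1), (0.10, 2.0)]
    print(pareto_efficient(configs))
    # (0.02, 1.3) and (0.10, 2.0) are dominated and therefore dropped.
```

This brute-force filter is quadratic in the number of configurations; the point of BOA, as described in the abstract, is to avoid evaluating the full combined configuration space in the first place.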

Thu Dec 19 2019
Neural Networks
Benchmarking Discrete Optimization Heuristics with IOHprofiler
Automated benchmarking environments aim to support researchers in understanding how different algorithms perform on different types of optimization problems. Such comparisons provide insights into the strengths and weaknesses of different approaches, which can be leveraged into designing new algorithms.
Wed Jul 08 2020
Neural Networks
IOHanalyzer: Performance Analysis for Iterative Optimization Heuristics
IOHanalyzer is a new tool for the analysis, comparison, and visualization of performance data of iterative optimization heuristics (IOHs). Implemented in R and C++, it is fully open source and available on CRAN and GitHub.
Thu Mar 07 2019
Neural Networks
jMetalPy: a Python Framework for Multi-Objective Optimization with Metaheuristics
jMetalPy is an object-oriented Python-based framework for multi-objective optimization with metaheuristic techniques. jMetalPy additionally offers support for parallel computing in multicore and cluster systems.
Sun Jul 01 2018
Machine Learning
New Heuristics for Parallel and Scalable Bayesian Optimization
Bayesian optimization has emerged as a strong candidate tool for global optimization of functions with expensive evaluation costs. However, due to the evolution of computing technology, using Bayesian optimization in a parallel computing environment remains a challenge for the non-expert.
Wed May 11 2016
Neural Networks
COCO: Performance Assessment
We present an any-time performance assessment for benchmarking numerical optimization algorithms in a black-box scenario. The performance assessment is based on runtimes measured in number of objective function evaluations to reach one or several quality indicators.
Thu Oct 11 2018
Neural Networks
IOHprofiler: A Benchmarking and Profiling Tool for Iterative Optimization Heuristics
IOHprofiler is a new tool for analyzing and comparing iterative optimization algorithms. Given as input algorithms and problems written in C or Python, it provides as output a statistical evaluation of the algorithms' performance. It consists of two parts: an experimental part and a post-processing part.