Published on Fri Mar 27 2020

Deep-n-Cheap: An Automated Search Framework for Low Complexity Deep Learning

Sourya Dey, Saikrishna C. Kanala, Keith M. Chugg, Peter A. Beerel

Deep-n-Cheap is an open-source AutoML framework to search for deep learning models. This search includes both architecture and training hyperparameters, and supports convolutional neural networks and multi-layer perceptrons.

Abstract

We present Deep-n-Cheap -- an open-source AutoML framework to search for deep learning models. This search includes both architecture and training hyperparameters, and supports convolutional neural networks and multi-layer perceptrons. Our framework is targeted for deployment on both benchmark and custom datasets, and as a result, offers a greater degree of search space customizability as compared to a more limited search over only pre-existing models from the literature. We also introduce the technique of 'search transfer', which demonstrates the generalization capabilities of the models found by our framework to multiple datasets. Deep-n-Cheap includes a user-customizable complexity penalty which trades off performance with training time or number of parameters. Specifically, our framework results in models offering performance comparable to state-of-the-art while taking 1-2 orders of magnitude less time to train than models from other AutoML and model search frameworks. Additionally, this work investigates and develops various insights regarding the search process. In particular, we show the superiority of a greedy strategy and justify our choice of Bayesian optimization as the primary search methodology over random and grid search.
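To make the complexity penalty concrete, here is a minimal Python sketch of a penalized search objective, assuming a hypothetical log-scaled form f = performance + w_c * complexity; the function name, the reference time t_ref, and the exact functional form are illustrative assumptions, not the paper's published definition:

import math

def penalized_objective(val_loss, train_time_per_epoch, w_c, t_ref=1.0):
    # Lower is better: validation loss plus a complexity penalty on
    # training time per epoch, weighted by the user-set w_c.
    # (Hypothetical form; the paper's exact objective may differ.)
    return math.log(val_loss) + w_c * math.log(train_time_per_epoch / t_ref)

# Toy illustration of the trade-off the search optimizes over:
candidates = [
    {"val_loss": 0.30, "time": 12.0},  # more accurate but slow
    {"val_loss": 0.35, "time": 3.0},   # slightly worse, 4x faster
]
for w_c in (0.1, 0.3):
    best = min(candidates,
               key=lambda c: penalized_objective(c["val_loss"], c["time"], w_c))
    print(w_c, best)  # small w_c picks the accurate model, larger w_c the fast one

In a framework like this one, such a score is the quantity the Bayesian optimizer would minimize over candidate configurations; w_c = 0 recovers a pure performance search.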

Related Papers

Fri Apr 28 2017
Machine Learning
DeepArchitect: Automatically Designing and Training Deep Architectures
In deep learning, performance is strongly affected by the choice of architecture and hyperparameters. We propose a framework for automatically designing and training deep models. The resulting search spaces are tree-structured and easy to traverse.
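As a rough illustration of a tree-structured search space (a hypothetical nested-choice layout and a brute-force traversal in Python, not DeepArchitect's actual API):

from itertools import product

# Hypothetical space: dict values are either leaf choices (lists)
# or subtrees (dicts), giving a tree of decisions.
space = {
    "num_layers": [2, 3],
    "layer": {
        "units": [64, 128],
        "activation": ["relu", "tanh"],
    },
}

def enumerate_configs(tree):
    # Depth-first traversal: expand subtrees, then take the
    # Cartesian product of all choices at this node.
    keys = list(tree)
    options = [enumerate_configs(v) if isinstance(v, dict) else v
               for v in tree.values()]
    return [dict(zip(keys, combo)) for combo in product(*options)]

for cfg in enumerate_configs(space):
    print(cfg)  # 2 * (2 * 2) = 8 leaf configurations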
Fri Dec 07 2018
Machine Learning
ShuffleNASNets: Efficient CNN models through modified Efficient Neural Architecture Search
Neural network architectures found by sophisticated search algorithms achieve strikingly good test performance. Although computationally efficient, their designs are often very complex, impairing execution speed. We implement undiscoverable expert knowledge into the economical search algorithm Efficient Neural Architecture Search (ENAS).
Mon Feb 25 2019
Machine Learning
NAS-Bench-101: Towards Reproducible Neural Architecture Search
NAS-Bench-101 is the first public architecture dataset for NAS research. It allows researchers to evaluate the quality of a diverse range of models in milliseconds.
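The millisecond-scale evaluation is a table lookup: every architecture in the space was trained ahead of time and its metrics stored. A usage sketch along the lines of the github.com/google-research/nasbench README (the dataset path is a placeholder, and the API details are recalled from the repository rather than guaranteed here):

from nasbench import api

# Load the precomputed dataset (path is a placeholder).
nasbench = api.NASBench('/path/to/nasbench_only108.tfrecord')

# A cell is a small DAG: an upper-triangular adjacency matrix
# plus one operation label per node.
cell = api.ModelSpec(
    matrix=[[0, 1, 1, 0],
            [0, 0, 0, 1],
            [0, 0, 0, 1],
            [0, 0, 0, 0]],
    ops=['input', 'conv3x3-bn-relu', 'maxpool3x3', 'output'])

# A lookup, not a training run: returns precomputed metrics.
data = nasbench.query(cell)
print(data['test_accuracy'], data['training_time'])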
Fri Mar 27 2020
Machine Learning
MiLeNAS: Efficient Neural Architecture Search via Mixed-Level Reformulation
Many recently proposed methods for Neural Architecture Search (NAS) can be formulated as bilevel optimization problems, which are typically solved with approximate gradient-based methods. We demonstrate that the gradient errors caused by such approximations lead to suboptimality. We propose a mixed-level reformulation for NAS that can be optimized efficiently and reliably.
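For reference, the two formulations side by side, in a common NAS notation with network weights $w$, architecture variables $\alpha$, and a mixing weight $\lambda$ (the notation is a standard convention and may differ in detail from the paper's):

% Bilevel formulation, usually solved with approximate gradients:
\min_{\alpha} \; \mathcal{L}_{\mathrm{val}}\big(w^{*}(\alpha), \alpha\big)
\quad \text{s.t.} \quad
w^{*}(\alpha) = \arg\min_{w} \mathcal{L}_{\mathrm{train}}(w, \alpha)

% Mixed-level reformulation: a single weighted objective over both
% variable groups, avoiding the inner argmin approximation:
\min_{w,\,\alpha} \; \mathcal{L}_{\mathrm{train}}(w, \alpha) + \lambda \, \mathcal{L}_{\mathrm{val}}(w, \alpha)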
Wed Jun 16 2021
Machine Learning
Efficient Deep Learning: A Survey on Making Deep Learning Models Smaller, Faster, and Better
Deep Learning has revolutionized the fields of computer vision, natural language understanding, speech recognition, information retrieval and more. We believe this is the first comprehensive survey in the efficient deep learning space that covers the landscape of model efficiency from modeling techniques to hardware support.
Tue Dec 20 2016
Neural Networks
Exploring the Design Space of Deep Convolutional Neural Networks at Large Scale
There is no single CNN/DNN architecture that solves all problems optimally. Instead, the "right" CNN/DNN architecture varies depending on the application at hand. A small region of the CNN design space contains 30 billion different CNN architectures.