We present Deep-n-Cheap -- an open-source AutoML framework to search for deep
learning models. This search includes both architecture and training
hyperparameters, and supports convolutional neural networks and multi-layer
perceptrons. Our framework is designed for deployment on both benchmark and
custom datasets, and consequently offers greater search-space customizability
than frameworks that search only over pre-existing models from the literature.
We also introduce 'search transfer', a technique demonstrating that models
found by our framework generalize to multiple datasets.
Deep-n-Cheap includes a user-customizable complexity penalty which trades off
performance with training time or number of parameters. Specifically, our
framework results in models offering performance comparable to state-of-the-art
while taking 1-2 orders of magnitude less time to train than models from other
AutoML and model search frameworks. Additionally, we investigate the search
process itself and develop several insights. In particular, we show the
superiority of a greedy strategy and justify our choice of Bayesian
optimization over random and grid search as the primary search methodology.
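To make the complexity penalty concrete, one way such a user-customizable trade-off can be realized is as a weighted objective combining validation loss with a complexity term (training time or parameter count). The function name, the logarithmic weighting, and all values below are illustrative assumptions, not Deep-n-Cheap's actual formula or API:

```python
import math

def penalized_objective(val_loss, complexity, wc):
    """Score a candidate model: lower is better.

    val_loss   -- validation loss of the candidate model
    complexity -- a cost proxy, e.g. training time (s) or parameter count
    wc         -- user-chosen penalty coefficient; wc = 0 optimizes
                  performance alone, larger wc favors cheaper models

    The log1p weighting here is a hypothetical choice for illustration.
    """
    return val_loss + wc * math.log1p(complexity)

# A slightly less accurate but far cheaper model wins once the
# penalty coefficient is nonzero:
cheap = penalized_objective(val_loss=0.30, complexity=1e5, wc=0.05)
heavy = penalized_objective(val_loss=0.28, complexity=1e8, wc=0.05)
print(cheap < heavy)  # the cheap model scores better under this penalty
```

With `wc = 0` the score reduces to the validation loss alone, recovering a pure performance search; increasing `wc` steers the search toward models that are faster to train or smaller.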