Published on Sun Jan 12 2020

Aggregated Learning: A Vector-Quantization Approach to Learning Neural Network Classifiers

Masoumeh Soflaei, Hongyu Guo, Ali Al-Bashabsheh, Yongyi Mao, Richong Zhang

In this framework, several objects are jointly classified by a single neural network. The effectiveness of this framework is verified through extensive experiments on standard image recognition and text classification tasks.

Abstract

We consider the problem of learning a neural network classifier. Under the information bottleneck (IB) principle, we associate with this classification problem a representation learning problem, which we call "IB learning". We show that IB learning is, in fact, equivalent to a special class of the quantization problem. The classical results in rate-distortion theory then suggest that IB learning can benefit from a "vector quantization" approach, namely, simultaneously learning the representations of multiple input objects. Such an approach, assisted with some variational techniques, results in a novel learning framework, "Aggregated Learning", for classification with neural network models. In this framework, several objects are jointly classified by a single neural network. The effectiveness of this framework is verified through extensive experiments on standard image recognition and text classification tasks.
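To make the aggregation idea concrete, below is a minimal sketch (not the authors' code) of how several objects can be jointly classified by a single network: the inputs are stacked along the channel axis, a shared backbone processes them together, and one classification head emits a separate set of logits per object. The names AggregatedClassifier, n_fold, and the toy convolutional backbone are illustrative assumptions, not from the paper.

# A minimal sketch of the aggregation idea, assuming a PyTorch setup:
# n_fold inputs are concatenated channel-wise, a shared network produces
# n_fold sets of logits, and the per-object losses are summed.
import torch
import torch.nn as nn

class AggregatedClassifier(nn.Module):
    def __init__(self, n_fold=4, in_channels=3, num_classes=10):
        super().__init__()
        self.n_fold = n_fold
        self.num_classes = num_classes
        # The backbone sees all n_fold inputs stacked along the channel axis.
        self.backbone = nn.Sequential(
            nn.Conv2d(in_channels * n_fold, 64, 3, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
            nn.Flatten(),
        )
        # One head emits logits for all n_fold objects at once.
        self.head = nn.Linear(64, num_classes * n_fold)

    def forward(self, xs):  # xs: (batch, n_fold, C, H, W)
        b, n, c, h, w = xs.shape
        z = self.backbone(xs.reshape(b, n * c, h, w))
        return self.head(z).reshape(b, n, self.num_classes)

def aggregated_loss(logits, labels):
    # logits: (batch, n_fold, num_classes); labels: (batch, n_fold)
    return nn.functional.cross_entropy(
        logits.flatten(0, 1), labels.flatten(0, 1)
    )

# Hypothetical usage: jointly classify 4 CIFAR-10-sized images.
model = AggregatedClassifier(n_fold=4)
xs = torch.randn(8, 4, 3, 32, 32)   # batch of 8 aggregated inputs
ys = torch.randint(0, 10, (8, 4))   # one label per aggregated object
loss = aggregated_loss(model(xs), ys)

In training, each batch element would hold n_fold samples drawn from the dataset, so the network optimizes against multiple data objects simultaneously, mirroring the vector-quantization view described in the abstract.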

Thu Jul 26 2018
Artificial Intelligence
Aggregated Learning: A Deep Learning Framework Based on Information-Bottleneck Vector Quantization
AgrLearn can reduce the number of training samples needed for ResNet training by up to 80%. Unlike standard networks, AgrLearn simultaneously optimizes against multiple data samples.
Sun Jan 14 2018
Machine Learning
Fix your classifier: the marginal value of training the last weight layer
Neural networks are commonly used as classification models for a wide variety of tasks. We argue that the final classifier layer can be fixed, up to a global scale constant, with little or no loss of accuracy for most tasks.
Mon May 02 2016
Machine Learning
Some Insights into the Geometry and Training of Neural Networks
Neural networks have been successfully used for classification tasks in a rapidly growing number of practical applications. Despite their popularity, there are still many aspects of training and classification that are not well understood.
Wed May 28 2008
Machine Learning
From Data Topology to a Modular Classifier
This article describes an approach to designing a distributed and modular neural classifier, based on a new hierarchical clustering procedure. A multilayer perceptron is then associated with each of the detected clusters and charged with recognizing elements of its cluster while rejecting all others.
Tue Aug 05 2014
Neural Networks
Multilayer bootstrap networks
A multilayer bootstrap network builds a gradually narrowed multilayer nonlinear network from the bottom up for unsupervised nonlinear dimensionality reduction. Each layer of the network is a nonparametric density estimator.
Fri Dec 09 2016
Artificial Intelligence
Learning Representations by Stochastic Meta-Gradient Descent in Neural Networks
The performance of a learning system depends on the type of representation used for representing the data. Typically, these representations are hand-engineered using domain knowledge. Learning the representations directly from the incoming data stream reduces the human labour involved in designing a learning system.