Published on Sat Jul 04 2020

On Class Orderings for Incremental Learning

Marc Masana, Bartłomiej Twardowski, Joost van de Weijer

Abstract

The influence of class orderings in the evaluation of incremental learning has received very little attention. In this paper, we investigate the impact of class orderings for incrementally learned classifiers. We propose a method to compute various orderings for a dataset. The orderings are derived by simulated annealing optimization from the confusion matrix and reflect different incremental learning scenarios, including maximally and minimally confusing tasks. We evaluate a wide range of state-of-the-art incremental learning methods on the proposed orderings. Results show that orderings can have a significant impact on performance and the ranking of the methods.
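
The abstract states that the orderings are derived by simulated annealing optimization from a confusion matrix, but does not spell out the objective here. The sketch below is a minimal illustration under stated assumptions, not the paper's implementation: it assumes a precomputed confusion matrix, a cost that sums the off-diagonal confusion between classes placed in the same task, and a linear cooling schedule with pairwise swap moves. The names ordering_cost and anneal_ordering, the schedule, and the file name in the usage comments are illustrative assumptions.

import numpy as np

def ordering_cost(order, confusion, num_tasks):
    # Sum of off-diagonal confusion between classes assigned to the same task
    # (assumed cost; maximizing it yields maximally confusing tasks, minimizing
    # it yields minimally confusing ones).
    tasks = np.array_split(np.asarray(order), num_tasks)
    cost = 0.0
    for task in tasks:
        block = confusion[np.ix_(task, task)]
        cost += block.sum() - np.trace(block)  # ignore correct predictions on the diagonal
    return cost

def anneal_ordering(confusion, num_tasks, maximize=True, steps=20000, t0=1.0, seed=0):
    # Simulated annealing over class permutations using pairwise swap moves.
    rng = np.random.default_rng(seed)
    n = confusion.shape[0]
    sign = 1.0 if maximize else -1.0
    order = rng.permutation(n)
    cur = sign * ordering_cost(order, confusion, num_tasks)
    best, best_cost = order.copy(), cur
    for step in range(steps):
        temp = t0 * (1.0 - step / steps) + 1e-8  # linear cooling schedule (assumption)
        i, j = rng.integers(0, n, size=2)
        cand = order.copy()
        cand[i], cand[j] = cand[j], cand[i]
        cand_cost = sign * ordering_cost(cand, confusion, num_tasks)
        # Always accept improvements; accept worse orderings with Boltzmann probability.
        if cand_cost >= cur or rng.random() < np.exp((cand_cost - cur) / temp):
            order, cur = cand, cand_cost
            if cur > best_cost:
                best, best_cost = order.copy(), cur
    return best

# Usage (illustrative): a confusion matrix estimated from a model trained on the full dataset.
# conf = np.load("confusion_cifar100.npy")                         # hypothetical file
# max_order = anneal_ordering(conf, num_tasks=10, maximize=True)   # maximally confusing tasks
# min_order = anneal_ordering(conf, num_tasks=10, maximize=False)  # minimally confusing tasks

Swap moves keep the search space to permutations of the class labels, and running the same routine with maximize set to True or False produces the two extreme scenarios mentioned in the abstract.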

Fri Feb 08 2019
Machine Learning
EILearn: Learning Incrementally Using Previous Knowledge Obtained From an Ensemble of Classifiers
In incremental learning, the general convention is to use only the knowledge acquired in the previous phase. We follow this convention by retaining the previously acquired knowledge that is relevant and using it along with the current data. Experimental results show that the proposed approach outperforms the existing incremental learning approaches.
Tue Jan 05 2021
Machine Learning
One vs Previous and Similar Classes Learning -- A Comparative Study
The work proposes three learning paradigms that allow trained models to be updated without retraining from scratch. A comparative analysis is performed to evaluate them against a baseline. Results show that two of them are faster than the baseline at updating.
Tue May 02 2017
Machine Learning
A Strategy for an Uncompromising Incremental Learner
We show that phantom sampling helps avoid catastrophic forgetting during incremental learning. We apply these strategies to competitive multi-class learning of deep neural networks. We further put our strategy to test on challenging cases, including cross-domain increments and incrementing on a novel label space.
Fri Apr 09 2021
Machine Learning
Unsupervised Class-Incremental Learning Through Confusion
Fri Oct 16 2020
Machine Learning
Class-incremental Learning with Pre-allocated Fixed Classifiers
In class-incremental learning, a learning agent faces a stream of data with the goal of learning new classes while not forgetting previous ones. To address this problem, effective methods exploit past data stored in an episodic memory while expanding the final classifier nodes to accommodate the new classes.
Thu Sep 01 2016
Neural Networks
A Novel Progressive Learning Technique for Multi-class Classification
A progressive learning technique for multi-class classification is proposed. This newly developed learning technique is independent of the number of class constraints and can learn new classes while still retaining the knowledge of previous classes.