Published on Thu Jul 18 2019

Automating concept-drift detection by self-evaluating predictive model degradation

Tania Cerquitelli, Stefano Proto, Francesco Ventura, Daniele Apiletti, Elena Baralis

Abstract

A key aspect of automating predictive machine learning entails the capability of properly triggering the update of the trained model. To this aim, suitable automatic solutions to self-assess the prediction quality and the data distribution drift between the original training set and the new data have to be devised. In this paper, we propose a novel methodology to automatically detect prediction-quality degradation of machine learning models due to class-based concept drift, i.e., when new data contains samples that do not fit the set of class labels known by the currently-trained predictive model. Experiments on synthetic and real-world public datasets show the effectiveness of the proposed methodology in automatically detecting and describing concept drift caused by changes in the class-label data distributions.
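The paper's core idea, detecting degradation caused by samples whose class labels the trained model has never seen, can be illustrated with a minimal sketch. The function below is a simplified assumption of how such a check might work (the names `detect_class_drift` and `unseen_threshold` are illustrative, not the authors' method): it compares the label set of new data against the labels seen at training time and flags drift when the fraction of unseen-class samples exceeds a tolerance.

```python
from collections import Counter

def detect_class_drift(train_labels, new_labels, unseen_threshold=0.05):
    """Illustrative sketch, not the paper's algorithm: flag class-based
    concept drift when new data contains a non-negligible fraction of
    samples whose class label was absent from the training set."""
    known = set(train_labels)          # class labels the model was trained on
    counts = Counter(new_labels)       # label frequencies in the new batch
    total = sum(counts.values())
    # Fraction of new samples whose label the model has never seen.
    unseen_fraction = sum(c for lbl, c in counts.items() if lbl not in known) / total
    return unseen_fraction > unseen_threshold, unseen_fraction
```

For example, `detect_class_drift(["a", "b"], ["a", "b", "c", "c"])` returns `(True, 0.5)`, since half of the new samples carry the previously unseen label `"c"`, which would trigger a model update.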

Tue May 04 2021
Artificial Intelligence
Automatic Learning to Detect Concept Drift
Mon Aug 16 2021
Machine Learning
Task-Sensitive Concept Drift Detector with Metric Learning
Detecting drifts in data is essential for machine learning applications. Most of the available drift detection methods require access to true labels during inference time. In a real-world scenario, however, true labels are usually available only during model training.
Fri Jul 28 2017
Machine Learning
Proceedings of the IJCAI 2017 Workshop on Learning in the Presence of Class Imbalance and Concept Drift (LPCICD'17)
Class imbalance occurs when the data categories are not equally represented; it can bias learning towards the majority class and lead to poor generalization. Concept drift is a change in the underlying distribution of the problem over time. Both problems should be considered during algorithm design.
Mon Mar 01 2021
Machine Learning
STUDD: A Student-Teacher Method for Unsupervised Concept Drift Detection
Mon Jun 25 2018
Artificial Intelligence
Request-and-Reverify: Hierarchical Hypothesis Testing for Concept Drift Detection with Expensive Labels
Concept drift detection aims to identify changes in the data distribution so that the model can be adapted and any deterioration in its performance mitigated. Most existing concept drift detection methods rely on the strong and over-optimistic assumption that true labels are available immediately for all already-classified instances.
Mon Mar 20 2017
Machine Learning
A Systematic Study of Online Class Imbalance Learning with Concept Drift
Online class imbalance learning combines the challenges of both class imbalance and concept drift: it deals with data streams that have very skewed class distributions and in which concept drift may occur. Based on the analysis, a general guideline is proposed for the development of an effective algorithm.