Published on Tue Jan 14 2020

Anomaly Detection with Density Estimation

Benjamin Nachman, David Shih

ANODE is a new unsupervised anomaly detection technique that uses neural density estimation to search for localized overdensities in data. The method is robust against systematic differences between the signal region and the sidebands, giving it broader applicability than other approaches.

Abstract

We leverage recent breakthroughs in neural density estimation to propose a new unsupervised anomaly detection technique (ANODE). By estimating the probability density of the data in a signal region and in sidebands, and interpolating the latter into the signal region, a likelihood ratio of data vs. background can be constructed. This likelihood ratio is broadly sensitive to overdensities in the data that could be due to localized anomalies. In addition, a unique potential benefit of the ANODE method is that the background can be directly estimated using the learned densities. Finally, ANODE is robust against systematic differences between signal region and sidebands, giving it broader applicability than other methods. We demonstrate the power of this new approach using the LHC Olympics 2020 R&D Dataset. We show how ANODE can enhance the significance of a dijet bump hunt by up to a factor of 7 with a 10% accuracy on the background prediction. While the LHC is used as the recurring example, the methods developed here have a much broader applicability to anomaly detection in physics and beyond.
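
The abstract describes the core ANODE construction: fit the data density in the signal region, fit the background density from the sidebands, and cut on the ratio of the two. Below is a minimal illustrative sketch of that idea on toy data, using scikit-learn's KernelDensity as a stand-in for the neural (conditional) density estimators used in the paper and omitting the interpolation step; all data, names, and parameters here are assumptions for illustration, not the authors' implementation.

import numpy as np
from sklearn.neighbors import KernelDensity

rng = np.random.default_rng(0)

# Toy setup (illustrative only): 2D features, Gaussian background,
# plus a small localized overdensity playing the role of the signal.
background_sr = rng.normal(0.0, 1.0, size=(10000, 2))
signal_sr = rng.normal(1.5, 0.2, size=(300, 2))      # localized anomaly
data_sr = np.vstack([background_sr, signal_sr])      # "data" in the signal region
sideband = rng.normal(0.0, 1.0, size=(20000, 2))     # background-only sidebands

# Density estimators: stand-ins for the neural density estimators in ANODE.
p_data = KernelDensity(bandwidth=0.2).fit(data_sr)   # p(x) in the signal region
p_bkg = KernelDensity(bandwidth=0.2).fit(sideband)   # p(x) from the sidebands, i.e. background

# Anomaly score: log of the likelihood ratio R(x) = p_data(x) / p_bkg(x).
# score_samples returns log densities, so the ratio becomes a difference.
log_r = p_data.score_samples(data_sr) - p_bkg.score_samples(data_sr)

# Selecting events with large R enhances any localized overdensity.
threshold = np.quantile(log_r, 0.99)
selected = data_sr[log_r > threshold]
print(f"kept {len(selected)} of {len(data_sr)} events above the R cut")

In the full method the sideband density is interpolated into the signal region (in the paper, by conditioning the density estimators on the resonant variable), and the learned background density can also be used directly for the background estimate, as noted in the abstract.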

Wed Dec 14 2011
Machine Learning
Semi-Supervised Anomaly Detection - Towards Model-Independent Searches of New Physics
Most classification algorithms used in high energy physics fall under the category of supervised machine learning. Such methods require a training set containing both signal and background events and are prone to classification errors. To complement such model-dependent searches, we propose an algorithm based on semi-supervised anomaly detection techniques.
Mon Dec 05 2016
Neural Networks
Known Unknowns: Uncertainty Quality in Bayesian Neural Networks
We evaluate the uncertainty quality in neural networks using anomaly detection. We extract uncertainty measures (e.g. entropy) from the predictions of candidate models. We also propose a novel method for sampling a variational approximation of a Bayesian neural network.
Wed May 23 2018
Machine Learning
Deep Active Learning for Anomaly Detection
Anomalies are intuitively easy for human experts to understand, but they are hard to define mathematically. In order to have performance assurances in unsupervised anomaly detection, priors need to be assumed on what the anomalies are. Active learning provides the necessary priors through appropriate expert feedback.
Mon Dec 07 2020
Machine Learning
Perfect density models cannot guarantee anomaly detection
Deep generative models show promise for seemingly straightforward but important applications such as anomaly detection. However, the likelihood values they assign to anomalies conflict with what these applications expect. We conclude that the use of these likelihoods for out-of-distribution detection relies on strong and implicit hypotheses.
Mon Aug 31 2020
Machine Learning
Anomaly Detection by Recombining Gated Unsupervised Experts
ARGUE is a novel unsupervised anomaly detection method. It combines multiple expert networks, each of which specialises on part of the input data, and fuses the distributed knowledge across these experts using a gated mixture-of-experts architecture.
Fri Dec 11 2020
Machine Learning
Comparison of Anomaly Detectors: Context Matters
Deep generative models are challenging the classical methods in the field of anomaly detection. Each new method provides evidence of outperforming its predecessors, yet the reported results often contradict one another. Different methods perform best in different contexts.