Published on Mon Aug 31 2020

Multi-Scale One-Class Recurrent Neural Networks for Discrete Event Sequence Anomaly Detection

Zhiwei Wang, Zhengzhang Chen, Jingchao Ni, Hui Liu, Haifeng Chen, Jiliang Tang

OC4Seq is a multi-scale neural network for detecting anomalies in discrete event sequences. It integrates the anomaly detection objective with recurrent neural networks (RNNs) to embed event sequences into latent spaces where anomalies can be easily detected.

Abstract

Discrete event sequences are ubiquitous, such as an ordered series of process interactions in Information and Communication Technology systems. Recent years have witnessed increasing efforts in detecting anomalies in discrete event sequences. However, it remains an extremely difficult task due to several intrinsic challenges, including data imbalance, the discrete nature of the events, and the sequential nature of the data. To address these challenges, in this paper, we propose OC4Seq, a multi-scale one-class recurrent neural network for detecting anomalies in discrete event sequences. Specifically, OC4Seq integrates the anomaly detection objective with recurrent neural networks (RNNs) to embed the discrete event sequences into latent spaces, where anomalies can be easily detected. In addition, since an anomalous sequence could be caused by individual events, subsequences of events, or the whole sequence, we design a multi-scale RNN framework to capture different levels of sequential patterns simultaneously. Experimental results on three benchmark datasets show that OC4Seq consistently outperforms various representative baselines by a large margin. Moreover, both quantitative and qualitative analyses verify the importance of capturing multi-scale sequential patterns for event anomaly detection.
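The paper's implementation is not reproduced here, but the core idea — embedding event sequences with an RNN and training against a one-class (Deep SVDD-style) objective, then using distance to the center as the anomaly score — can be sketched in a few lines. This is a minimal single-scale numpy sketch (the paper additionally uses a local, subsequence-level RNN); all names (`rnn_embed`, `one_class_loss`, `anomaly_score`) and the toy dimensions are illustrative assumptions, not the authors' code:

```python
import numpy as np

rng = np.random.default_rng(0)

def rnn_embed(seq, Wx, Wh, b):
    """Embed a sequence of one-hot event vectors with a vanilla RNN;
    the final hidden state serves as the sequence representation."""
    h = np.zeros(Wh.shape[0])
    for x in seq:
        h = np.tanh(Wx @ x + Wh @ h + b)
    return h

def one_class_loss(embeddings, center):
    """One-class objective: mean squared distance of the embedded
    training (normal) sequences to a fixed center in latent space."""
    return np.mean(np.sum((embeddings - center) ** 2, axis=1))

def anomaly_score(seq, Wx, Wh, b, center):
    """At test time, distance to the center is the anomaly score:
    sequences embedded far from the center are flagged as anomalous."""
    diff = rnn_embed(seq, Wx, Wh, b) - center
    return float(diff @ diff)

# Toy setup: 5 event types (one-hot encoded), hidden size 4.
V, H = 5, 4
Wx = rng.normal(size=(H, V)) * 0.1
Wh = rng.normal(size=(H, H)) * 0.1
b = np.zeros(H)
seqs = [np.eye(V)[rng.integers(0, V, size=8)] for _ in range(16)]
center = np.mean([rnn_embed(s, Wx, Wh, b) for s in seqs], axis=0)
embs = np.stack([rnn_embed(s, Wx, Wh, b) for s in seqs])
print(one_class_loss(embs, center))               # objective to minimize
print(anomaly_score(seqs[0], Wx, Wh, b, center))  # per-sequence score
```

In the full method this loss would be minimized over the RNN weights (and combined with a second, local-scale RNN over sliding windows), so that normal sequences cluster tightly around the center while anomalies fall outside.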

Wed Nov 13 2019
Machine Learning
Uncertainty on Asynchronous Time Event Prediction
Asynchronous event sequences are the basis of many applications throughout different industries. We present two new architectures, WGP-LN and FD-Dir, to model the evolution of the distribution on the probability simplex.
Mon Apr 19 2021
Machine Learning
SALAD: Self-Adaptive Lightweight Anomaly Detection for Real-time Recurrent Time Series
Fri Feb 08 2019
Artificial Intelligence
BINet: Multi-perspective Business Process Anomaly Classification
BINet is designed to handle both the control flow and the data perspective of a business process. We demonstrate that BINet can be used to detect anomalies in event logs not only on a case level but also on event attribute level.
Fri Jul 16 2021
Machine Learning
Neural Contextual Anomaly Detection for Time Series
Neural Contextual Anomaly Detection (NCAD) scales seamlessly from the unsupervised to supervised setting. Our window-based approach facilitates learning the boundary between normal and anomalous classes. Our method can effectively take advantage of all the available information.
Mon Apr 15 2019
Machine Learning
Exploiting Event Log Event Attributes in RNN Based Prediction
In predictive process analytics, current and historical process data is used to predict the future. Recurrent neural networks have been demonstrated to be well suited for creating prediction models. The biggest challenge in exploiting them in prediction models is the potentially large amount of event attributes and attribute values.
Fri Jan 24 2020
Machine Learning
RePAD: Real-time Proactive Anomaly Detection for Time Series
RePAD is a Real-time Proactive Anomaly Detection algorithm for streaming time series based on Long Short-Term Memory (LSTM). RePAD utilizes short-term historic data points to predict and determine whether or not an anomaly is likely to happen in the near future.
Mon Dec 22 2014
Machine Learning
Adam: A Method for Stochastic Optimization
Adam is an algorithm for first-order gradient-based optimization of stochastic objective functions. The method is straightforward to implement and has little memory requirements. It is well suited for problems that are large in terms of data and parameters.
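The Adam update itself is compact enough to sketch. Below is a minimal single-parameter version following the published algorithm (exponential moving averages of the gradient and its square, with bias correction); the function name and default hyperparameters (`lr=0.001`, `beta1=0.9`, `beta2=0.999`, `eps=1e-8`) follow the paper's recommended settings:

```python
import math

def adam_step(theta, grad, m, v, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update for a single scalar parameter.

    m, v are the running first and second moment estimates;
    t is the 1-based timestep used for bias correction."""
    m = beta1 * m + (1 - beta1) * grad          # biased first moment estimate
    v = beta2 * v + (1 - beta2) * grad ** 2     # biased second moment estimate
    m_hat = m / (1 - beta1 ** t)                # bias-corrected first moment
    v_hat = v / (1 - beta2 ** t)                # bias-corrected second moment
    theta = theta - lr * m_hat / (math.sqrt(v_hat) + eps)
    return theta, m, v

# First step from zero-initialized moments: the bias correction makes the
# effective step size close to lr regardless of the gradient's magnitude.
theta, m, v = adam_step(theta=1.0, grad=0.5, m=0.0, v=0.0, t=1)
print(theta)
```

Because the moments start at zero, the bias-correction terms are essential early in training; without them the first steps would be heavily damped.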
Tue Mar 13 2018
Neural Networks
Recurrent Neural Network Attention Mechanisms for Interpretable System Log Anomaly Detection
Deep learning has recently demonstrated state-of-the-art performance on key tasks related to the maintenance of computer systems. In these contexts, model interpretability is crucial for administrators and analysts to trust and act on the automated analysis of machine learning models.
Tue Feb 13 2018
Machine Learning
Detecting Spacecraft Anomalies Using LSTMs and Nonparametric Dynamic Thresholding
As spacecraft send back increasing amounts of telemetry data, improved anomaly detection systems are needed. We demonstrate the effectiveness of Long Short-Term Memory networks (LSTMs) in overcoming these issues. We also propose a complementary unsupervised and nonparametric anomaly thresholding approach.
Sat Dec 02 2017
Neural Networks
Recurrent Neural Network Language Models for Open Vocabulary Event-Level Cyber Anomaly Detection
This work introduces a flexible, powerful, and unsupervised approach to detecting anomalous behavior in computer and network logs. By treating logs as threads of interleaved "sentences" (event log lines), our approach provides an adaptive model of normal network behavior.
Tue Jun 03 2014
Neural Networks
Learning Phrase Representations using RNN Encoder-Decoder for Statistical Machine Translation
The proposed model learns a semantically and syntactically meaningful representation of linguistic phrases. The performance of a statistical machine translation system is empirically found to improve by using the conditional probabilities of phrase pairs computed by the RNN Encoder-Decoder.
Wed Nov 04 2015
Machine Learning
Semi-supervised Sequence Learning
We present two approaches that use unlabeled data to improve sequence learning with recurrent networks. The first approach is to predict what comes next in a sequence, a conventional language model in natural language processing. The second is to use a sequence autoencoder.