MAD: Self-Supervised Masked Anomaly Detection Task for Multivariate Time Series
- GE Research, Niskayuna, NY, USA
In this paper, we introduce Masked Anomaly Detection (MAD), a general self-supervised learning task for multivariate time series anomaly detection. With the increasing availability of sensor data from industrial systems, being able to detect anomalies from streams of multivariate time series data is of significant importance. Given the scarcity of anomalies in real-world applications, the majority of the literature has focused on modeling normality. The learned normal representations can empower anomaly detection, as the model has learned to capture certain key underlying data regularities. A typical formulation is to learn a predictive model, i.e., use a window of time series data to predict future data values. In this paper, we propose an alternative self-supervised learning task. By randomly masking a portion of the inputs and training a model to estimate them using the remaining ones, MAD is an improvement over the traditional left-to-right next step prediction (NSP) task. Our experimental results demonstrate that MAD can achieve better anomaly detection rates than traditional NSP approaches when using exactly the same neural network (NN) base models, and can be modified to run as fast as NSP models at test time on the same hardware, making it an ideal upgrade for many existing NSP-based NN anomaly detection models.
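The core idea of the MAD task described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the helper names (`mask_inputs`, `mad_loss`), the mask ratio, and the stand-in estimator are all assumptions, and the paper's exact masking scheme and base models may differ.

```python
import numpy as np

rng = np.random.default_rng(0)

def mask_inputs(window, mask_ratio=0.15, rng=rng):
    """Randomly mask a fraction of entries in a (time, features) window.

    Hypothetical helper: returns the window with masked entries zeroed
    out, plus the boolean mask indicating which entries were hidden.
    """
    mask = rng.random(window.shape) < mask_ratio
    masked = window.copy()
    masked[mask] = 0.0
    return masked, mask

def mad_loss(estimates, targets, mask):
    """MSE computed only on the masked positions: the model is trained
    to estimate the hidden values from the remaining visible ones."""
    diff = (estimates - targets)[mask]
    return float(np.mean(diff ** 2))

# Toy example: a 10-step window with 3 sensor channels.
window = rng.standard_normal((10, 3))
masked, mask = mask_inputs(window)

# A trained NN would estimate the masked values from the visible ones;
# as a stand-in, use each channel's mean over its visible entries.
col_means = np.where(mask, 0.0, window).sum(axis=0) / (~mask).sum(axis=0)
estimates = np.broadcast_to(col_means, window.shape)

# At test time, high masked-reconstruction error flags an anomaly.
score = mad_loss(estimates, window, mask)
```

During training, the loss on masked positions drives the model to learn the data's regularities; at test time, windows whose masked values cannot be reconstructed well receive high anomaly scores.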
- Research Organization:
- GE Research
- Sponsoring Organization:
- USDOE Office of Fossil Energy (FE)
- DOE Contract Number:
- FE0031763
- OSTI ID:
- 1905117
- Report Number(s):
- DOE-GER-FE0031763-4
- Journal Information:
- 2022 International Joint Conference on Neural Networks (IJCNN), Padua, Italy, 18-23 July 2022
- Country of Publication:
- United States
- Language:
- English