CN114386521A - Method, system, device and storage medium for detecting abnormality of time-series data - Google Patents

Method, system, device and storage medium for detecting abnormality of time-series data

Info

Publication number
CN114386521A
CN114386521A
Authority
CN
China
Prior art keywords
sliding window
time
scalar
data
detection model
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210042038.XA
Other languages
Chinese (zh)
Inventor
刘金平
汪家喜
徐鹏飞
蔡美玲
谢永明
马天雨
郑之伟
王靖超
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hunan Normal University
Original Assignee
Hunan Normal University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hunan Normal University
Priority to CN202210042038.XA
Publication of CN114386521A
Legal status: Pending

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/24Classification techniques
    • G06F18/241Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • G06F18/2413Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches based on distances to training or reference patterns
    • G06F18/24133Distances to prototypes
    • G06F18/24137Distances to cluster centroïds
    • G06F18/2414Smoothing the distance, e.g. radial basis function networks [RBFN]
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/20Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
    • G06F16/24Querying
    • G06F16/245Query processing
    • G06F16/2458Special types of queries, e.g. statistical queries, fuzzy queries or distributed queries
    • G06F16/2474Sequence data queries, e.g. querying versioned data
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/044Recurrent networks, e.g. Hopfield networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/045Combinations of networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/048Activation functions
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/08Learning methods
    • G06N3/088Non-supervised learning, e.g. competitive learning

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Software Systems (AREA)
  • Mathematical Physics (AREA)
  • Computational Linguistics (AREA)
  • Evolutionary Computation (AREA)
  • Computing Systems (AREA)
  • Molecular Biology (AREA)
  • General Health & Medical Sciences (AREA)
  • Biophysics (AREA)
  • Biomedical Technology (AREA)
  • Health & Medical Sciences (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Evolutionary Biology (AREA)
  • Fuzzy Systems (AREA)
  • Probability & Statistics with Applications (AREA)
  • Databases & Information Systems (AREA)
  • Testing And Monitoring For Control Systems (AREA)

Abstract

The application relates to a method, a system, a device, and a storage medium for detecting anomalies in time-series data. The method comprises the following steps: collecting time-series data generated by sensors, normalizing the time-series data, and dividing it into sliding-window subsequences; performing scalar mapping on each sliding-window subsequence and producing a unified input representation of the subsequences; constructing an anomaly detection model, feeding the sliding-window subsequences to the model for adversarial (generative-adversarial) training, and updating the parameters of the model; and generating an anomaly score for each newly arrived window of the data set under test according to the trained anomaly detection model, a window being declared abnormal if its anomaly score is higher than a preset anomaly threshold. The method effectively addresses anomaly detection for time-series data in industrial processes and is fast and accurate with high detection efficiency.

Description

Method, system, device and storage medium for detecting abnormality of time-series data
Technical Field
The present application relates to the field of neural networks, and in particular, to a method, a system, a device, and a storage medium for detecting an anomaly of time-series data.
Background
With the development of the Internet of Things, artificial intelligence, and related technologies, sensors have become ubiquitous and are widely applied in fields such as industry, transportation, national defense, and scientific research. In the field of industrial automation, sensors are the first link in realizing automatic industrial detection and control, and large numbers of sensors are deployed to monitor facilities and their systems so as to improve efficiency and safety. It is therefore important to closely monitor the behavior of such systems through anomaly detection using the data collected by these sensors. Thanks to advances in sensor hardware, ever more reliable time-series data can be collected, and time-series anomaly detection is an important task for finding problems and avoiding risks in time. However, as the complexity and dimensionality of sensor data increase, manual monitoring of such data becomes impractical.
At present, anomaly detection for time series mainly relies on conventional methods, including clustering-based, density-based, distance-based, and isolation-based methods. These methods typically model the dependencies in time-series data in a relatively simple manner. In practice, however, as the dimensionality increases, the curse of dimensionality makes it infeasible to construct effective detection rules with these traditional methods.
Disclosure of Invention
In view of the above, it is necessary to provide a method, a system, a device, and a storage medium for detecting anomalies in time-series data that address the above technical problem.
In a first aspect, an embodiment of the present invention provides an anomaly detection method for time-series data, including:
collecting time-series data generated by sensors, normalizing the time-series data, and dividing it into sliding-window subsequences;
performing scalar mapping on each of the sliding-window subsequences, mapping the input scalar timestamp to a d_model-dimensional embedding through a learnable fully-connected layer, and producing a unified input representation of the sliding-window subsequences;
constructing an anomaly detection model, feeding the sliding-window subsequences to the anomaly detection model for adversarial training, and updating the parameters of the anomaly detection model;
and generating an anomaly score for each newly arrived window of the data set under test according to the trained anomaly detection model; if the anomaly score of a window is higher than a preset anomaly threshold, declaring that window abnormal.
Further, the collecting of the time-series data generated by the sensors and, after normalization, the dividing of the time-series data into sliding-window subsequences comprises:
obtaining sensor data from N sensors at T_train time steps, the time-series data consisting of
S_train = {s^(1), s^(2), ..., s^(T_train)} ∈ R^(N × T_train);
and dividing the time-series data into a set of subsequences X = {x_1, x_2, ..., x_m} by applying a sliding window with window size w and step size h, where
m = ⌊(T_train − w)/h⌋ + 1
is the number of subsequences.
Further, the performing of scalar mapping on each sliding-window subsequence, the mapping of the input scalar timestamp to a d_model-dimensional embedding through a learnable fully-connected layer, and the unified input representation of the sliding-window subsequences comprise the following steps:
mapping, by scalar mapping of each sliding window, the embedded representation of the input scalar timestamp from N dimensions to d_model dimensions using a learnable fully-connected layer;
performing scalar mapping on each sliding window x, mapping the input data from N dimensions to d_model dimensions, the embedded vector after mapping being denoted u_i;
position-encoding each scalar-mapped sliding window;
and adding the scalar-mapped sliding window and the position-encoded window with weighting, the final output being the input representation.
Further, the constructing of the anomaly detection model, the feeding of the sliding-window subsequences to the model for adversarial training, and the updating of the parameters of the model comprise:
constructing the anomaly detection model from a latent space, a generator, and a discriminator, wherein the latent space is a Gaussian random-noise space, and the generator and the discriminator are each an improved Transformer neural network;
forming the improved Transformer neural network from an embedding layer, a positional encoding, an encoder, a decoder, and an output layer;
the encoder consisting of a number of encoder blocks, each encoder block comprising a multi-head self-attention layer, a position-wise feedforward neural network, residual connections, and layer normalization;
and using the trained generator to find the latent representation that best reflects the test data set by mapping the test data set back into the latent space.
In another aspect, an embodiment of the present invention further provides an anomaly detection system based on time-series data, comprising:
an acquisition processing module configured to collect time-series data generated by sensors, normalize the time-series data, and divide it into sliding-window subsequences;
a time-series representation module configured to perform scalar mapping on each sliding-window subsequence, map the input scalar timestamp to a d_model-dimensional embedding through a learnable fully-connected layer, and produce a unified input representation of the sliding-window subsequences;
an offline training module configured to feed the sliding-window subsequences to the anomaly detection model for adversarial training and to update the parameters of the anomaly detection model;
and an anomaly detection module configured to generate an anomaly score for each newly arrived window of the data set under test according to the trained anomaly detection model, and to declare a window abnormal if its anomaly score is higher than a preset anomaly threshold.
Further, the acquisition processing module comprises a window processing unit configured to:
obtain sensor data from N sensors at T_train time steps, the time-series data consisting of
S_train = {s^(1), s^(2), ..., s^(T_train)} ∈ R^(N × T_train);
and divide the time-series data into a set of subsequences X = {x_1, x_2, ..., x_m} by applying a sliding window with window size w and step size h, where
m = ⌊(T_train − w)/h⌋ + 1
is the number of subsequences.
Further, the time-series representation module comprises a time-series calculation unit configured to:
map, by scalar mapping of each sliding window, the embedded representation of the input scalar timestamp from N dimensions to d_model dimensions using a learnable fully-connected layer;
perform scalar mapping on each sliding window x, mapping the input data from N dimensions to d_model dimensions, the embedded vector after mapping being denoted u_i;
position-encode each scalar-mapped sliding window;
and add the scalar-mapped sliding window and the position-encoded window with weighting, the final output being the input representation.
Further, the offline training module comprises a neural network unit configured to:
construct the anomaly detection model from a latent space, a generator, and a discriminator, wherein the latent space is a Gaussian random-noise space, and the generator and the discriminator are each an improved Transformer neural network;
form the improved Transformer neural network from an embedding layer, a positional encoding, an encoder, a decoder, and an output layer;
the encoder consisting of a number of encoder blocks, each encoder block comprising a multi-head self-attention layer, a position-wise feedforward neural network, residual connections, and layer normalization;
and use the trained generator to find the latent representation that best reflects the test data set by mapping the test data set back into the latent space.
An embodiment of the present invention further provides a computer device comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, the processor implementing the following steps when executing the computer program:
collecting time-series data generated by sensors, normalizing the time-series data, and dividing it into sliding-window subsequences;
performing scalar mapping on each of the sliding-window subsequences, mapping the input scalar timestamp to a d_model-dimensional embedding through a learnable fully-connected layer, and producing a unified input representation of the sliding-window subsequences;
constructing an anomaly detection model, feeding the sliding-window subsequences to the anomaly detection model for adversarial training, and updating the parameters of the anomaly detection model;
and generating an anomaly score for each newly arrived window of the data set under test according to the trained anomaly detection model; if the anomaly score of a window is higher than a preset anomaly threshold, declaring that window abnormal.
An embodiment of the present invention further provides a computer-readable storage medium on which a computer program is stored, the computer program, when executed by a processor, implementing the following steps:
collecting time-series data generated by sensors, normalizing the time-series data, and dividing it into sliding-window subsequences;
performing scalar mapping on each of the sliding-window subsequences, mapping the input scalar timestamp to a d_model-dimensional embedding through a learnable fully-connected layer, and producing a unified input representation of the sliding-window subsequences;
constructing an anomaly detection model, feeding the sliding-window subsequences to the anomaly detection model for adversarial training, and updating the parameters of the anomaly detection model;
and generating an anomaly score for each newly arrived window of the data set under test according to the trained anomaly detection model; if the anomaly score of a window is higher than a preset anomaly threshold, declaring that window abnormal.
The method, system, device, and storage medium for detecting anomalies in time-series data comprise the following steps: collecting time-series data generated by sensors, normalizing the time-series data, and dividing it into sliding-window subsequences; performing scalar mapping on each of the sliding-window subsequences, mapping the input scalar timestamp to a d_model-dimensional embedding through a learnable fully-connected layer, and producing a unified input representation of the sliding-window subsequences; constructing an anomaly detection model, feeding the sliding-window subsequences to the anomaly detection model for adversarial training, and updating the parameters of the anomaly detection model; and generating an anomaly score for each newly arrived window of the data set under test according to the trained anomaly detection model, a window being declared abnormal if its anomaly score is higher than a preset anomaly threshold. The method effectively addresses anomaly detection for time-series data in industrial processes, is fast and accurate with high detection efficiency, and offers a new approach for subsequent work in time-series modeling and anomaly detection.
Drawings
FIG. 1 is a schematic flow chart of a method for detecting an abnormality in time-series data according to an embodiment;
FIG. 2 is a flow chart illustrating window partitioning of time series data according to an embodiment;
FIG. 3 is a flow diagram illustrating the computation of an input representation in one embodiment;
FIG. 4 is a schematic flow chart illustrating training and generation of an anomaly detection model by a neural network in one embodiment;
FIG. 5 is a block diagram of an anomaly detection system based on time series data in one embodiment;
FIG. 6 is a diagram illustrating an internal structure of a computer device according to an embodiment.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application.
In one embodiment, as shown in fig. 1, there is provided a method of anomaly detection of time-series data, the method including:
step 101, collecting time-series data generated by sensors, normalizing the time-series data, and dividing it into sliding-window subsequences;
step 102, performing scalar mapping on each sliding-window subsequence, mapping the input scalar timestamp to a d_model-dimensional embedding through a learnable fully-connected layer, and producing a unified input representation of the sliding-window subsequences;
step 103, constructing an anomaly detection model, feeding the sliding-window subsequences to the anomaly detection model for adversarial training, and updating the parameters of the anomaly detection model;
and step 104, generating an anomaly score for each newly arrived window of the data set under test according to the trained anomaly detection model; if the anomaly score of a window is higher than a preset anomaly threshold, declaring that window abnormal.
Specifically, in this multivariate time-series anomaly detection method based on a Transformer generative adversarial network architecture: in the field of anomaly detection, the volume of anomaly logs is huge, operation types cannot be enumerated, and labeled data are scarce. The method effectively addresses anomaly detection for time-series data in industrial processes and offers a new approach for subsequent work in time-series modeling and anomaly detection. The Transformer neural network allows direct connections between data units and can better capture global dependencies; the generative adversarial network aims to extract correlations between time series and to reconstruct complex data distributions. Combining the adversarial network with the Transformer helps, on the one hand, the generator capture the overall pattern of the time series and align the generated series with normal series, preventing the discriminator from quickly converging to a local optimum; on the other hand, the discriminator regularizes the reconstructed time series from a global perspective. The method can therefore learn the data distribution from massive time-series data and capture the complex dependencies between time series to detect anomalies.
In one embodiment, as shown in fig. 2, the process of window-dividing the time-series data comprises the following steps:
step 201, obtaining sensor data from N sensors at T_train time steps, the time-series data consisting of
S_train = {s^(1), s^(2), ..., s^(T_train)} ∈ R^(N × T_train);
step 202, dividing the time-series data into a set of subsequences X = {x_1, x_2, ..., x_m} by applying a sliding window with window size w and step size h, where
m = ⌊(T_train − w)/h⌋ + 1
is the number of subsequences.
Specifically, time-series data generated by a plurality of sensors in an industrial process are collected, and the normalization of the time-series data set is as follows. In our experiments, the training data contain readings from N sensors at T_train time steps; the sensor (i.e., multivariate time-series) data consist of
S_train = {s^(1), s^(2), ..., s^(T_train)} ∈ R^(N × T_train).
We train the model with these training data. At each time step t, the sensor reading s^(t) ∈ R^N is the N-dimensional vector formed by the values of the N sensors at time step t. The usual unsupervised anomaly detection setting is followed: the training data are assumed to contain only normal data. Our goal is anomaly detection on the test set, which is also derived from the N sensors but covers T_test separate time steps; our test data are presented as
S_test = {s^(1), s^(2), ..., s^(T_test)} ∈ R^(N × T_test).
The training data set S_train and the test data set S_test are both max-min normalized.
The division of the normalized data into time-series sliding-window subsequences is as follows. To learn the training data set S_train effectively, we apply a sliding window with window size w and step size h to divide the multivariate time series into a set of multivariate subsequences
X = {x_1, x_2, ..., x_m},
where m = ⌊(T_train − w)/h⌋ + 1 is the number of subsequences.
A sliding window with window size w and step size h is likewise applied to the test data set S_test, dividing the multivariate time series into a set of multivariate subsequences
Y = {y_1, y_2, ..., y_k},
where k = ⌊(T_test − w)/h⌋ + 1 is the number of subsequences of the test data set.
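As a concrete illustration of this preprocessing, the following is a minimal sketch, not the patent's reference implementation; NumPy, the function names, and the (T, N) array layout are assumptions:

```python
import numpy as np

def max_min_normalize(series: np.ndarray) -> np.ndarray:
    # series has shape (T, N): T time steps, N sensors.
    s_min = series.min(axis=0, keepdims=True)
    s_max = series.max(axis=0, keepdims=True)
    return (series - s_min) / (s_max - s_min + 1e-8)  # small epsilon avoids division by zero

def sliding_windows(series: np.ndarray, w: int, h: int) -> np.ndarray:
    # Divide a (T, N) series into m = floor((T - w) / h) + 1 windows of shape (w, N).
    T = series.shape[0]
    m = (T - w) // h + 1
    return np.stack([series[i * h : i * h + w] for i in range(m)])

# Usage: windows = sliding_windows(max_min_normalize(raw), w=64, h=8)
```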
In one embodiment, as shown in FIG. 3, the flow of computing the input representation comprises:
step 301, mapping, by scalar mapping of each sliding window, the embedded representation of the input scalar timestamp from N dimensions to d_model dimensions using a learnable fully-connected layer;
step 302, performing scalar mapping on each sliding window x, mapping the input data from N dimensions to d_model dimensions, the embedded vector after mapping being denoted u_i;
step 303, position-encoding each scalar-mapped sliding window;
step 304, adding the scalar-mapped sliding window and the position-encoded window with weighting, the final output being the input representation.
Specifically, the unified input representation of the sliding-window subsequences of the time series is as follows. Each sliding window is scalar-mapped using a learnable general linear layer, i.e., a fully-connected layer, which maps the embedded representation of the input scalar timestamp from N dimensions to d_model dimensions; the scalar-mapped sliding window is then position-encoded; and the scalar-mapped sliding window and the positional encoding are added with weighting, the final output being the unified input representation. Further, the embedded mapping of a time-window subsequence, i.e., the scalar mapping of each sliding window, is as follows: scalar mapping is performed on each sliding window x, mapping the input data from N dimensions to d_model dimensions; the embedded vector after mapping is denoted u_i.
Further, the scalar mapping of each sliding window x uses a learnable general linear layer, i.e., a fully-connected layer, to map the input scalar timestamp to the d_model-dimensional embedding. Further, each embedded, mapped sliding window is position-encoded. The specific calculation formulas are
PE(pos, 2i) = sin(pos / 10000^(2i / d_model))
PE(pos, 2i+1) = cos(pos / 10000^(2i / d_model))
where pos is the position of the sliding window x in the sequence and i indexes the ith dimension of the vector. The position-encoding vector of each sliding window is denoted PE. Further, the scalar-mapped sliding window and the positional encoding are added with weighting to obtain the final output, the unified input representation, denoted x. The specific formula is
x_t^(i) = u_t^(i) + λ · PE(t, i)
where u_t^(i) denotes the ith dimension of the embedding vector of the t-th time step (the input dimensions of the model having been unified to d_model by the embedding preprocessing described above) and λ is a weighting parameter; here we take λ = 1.
In one embodiment, as shown in fig. 4, the process of training and generating the anomaly detection model through the neural network comprises:
step 401, constructing the anomaly detection model from a latent space, a generator, and a discriminator, wherein the latent space is a Gaussian random-noise space, and the generator and the discriminator are each an improved Transformer neural network;
step 402, forming the improved Transformer neural network from an embedding layer, a positional encoding, an encoder, a decoder, and an output layer;
step 403, the encoder consisting of a plurality of encoder blocks, each encoder block comprising a multi-head self-attention layer, a position-wise feedforward neural network, residual connections, and layer normalization;
step 404, using the trained generator to find the latent representation that best reflects the test data set by mapping the test data set back into the latent space.
Specifically, the anomaly detection model is constructed, and the sliding-window subsequences are fed to the model for adversarial training to update its parameters, as follows. The anomaly detection model comprises a latent space, a generator, and a discriminator, wherein the latent space is a Gaussian random-noise space, and the generator and the discriminator are each an improved Transformer neural network. Further, the improved Transformer neural network comprises an embedding layer, a positional encoding, an encoder, a decoder, and an output layer; the unified input representation of the embedding layer and the positional encoding has been explained above and is not repeated here. The encoder consists of a plurality of encoder blocks, each comprising a multi-head self-attention layer, a position-wise feedforward neural network, residual connections, and layer normalization, wherein the multi-head self-attention layer is composed of several self-attention heads and a general linear layer. Specifically, multi-head attention is computed as
MultiHead(Q, K, V) = Concat(head_1, ..., head_h) W^O
head_i = Attention(Q W_i^Q, K W_i^K, V W_i^V)
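As an illustrative sketch of this multi-head self-attention computation (an aid to the reader, not the patent's reference implementation; PyTorch, scaled dot-product attention as in the standard Transformer, and the class name MultiHeadSelfAttention are all assumptions):

```python
import torch
import torch.nn as nn

class MultiHeadSelfAttention(nn.Module):
    def __init__(self, d_model: int, n_heads: int):
        super().__init__()
        assert d_model % n_heads == 0
        self.n_heads, self.d_head = n_heads, d_model // n_heads
        self.w_q = nn.Linear(d_model, d_model)  # stacks W_i^Q for all heads
        self.w_k = nn.Linear(d_model, d_model)  # stacks W_i^K
        self.w_v = nn.Linear(d_model, d_model)  # stacks W_i^V
        self.w_o = nn.Linear(d_model, d_model)  # the output projection W^O

    def forward(self, q, k, v, mask=None):
        B, L, _ = q.shape
        def split(x):  # (B, L, d_model) -> (B, n_heads, L, d_head)
            return x.view(B, -1, self.n_heads, self.d_head).transpose(1, 2)
        q, k, v = split(self.w_q(q)), split(self.w_k(k)), split(self.w_v(v))
        scores = q @ k.transpose(-2, -1) / self.d_head ** 0.5
        if mask is not None:  # the masking operation used in the decoder's masked self-attention
            scores = scores.masked_fill(mask == 0, float("-inf"))
        out = torch.softmax(scores, dim=-1) @ v
        out = out.transpose(1, 2).reshape(B, L, self.n_heads * self.d_head)
        return self.w_o(out)  # Concat(head_1, ..., head_h) W^O
```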
the decoder is composed of a plurality of decoder blocks, each decoder block comprising: masking a multi-head self-attention layer, an encoder-decoder multi-head self-attention layer, a position-by-position feedforward neural network, residual connection and layer normalization; the residual error connection calculation formula is as follows:
RC(x)=x+f(x)
where RC (-) represents the final output of the residual join;
f (-) represents the output of the input through the middle layer. The masking multi-head self-attention layer is composed of a plurality of self-attention layers and a universal linear layer. Only one masking operation is added inside the self-attention layer; the position-by-position feedforward neural network is composed of two fully-connected layers, the first fully-connected layer contains a Relu activation function, and the second function does not contain the Relu activation function. The calculation formula is as follows:
FFN(x)=max(0,W1x+b1)W2+b2
wherein W1,W2Are all learnable weight matrices, b1,b2Is the bias term. The decoder also includes a multi-headed attention layer and a feed-forward neural network. The encoder-decoderThe encoder multi-head self-attention layer is composed of a plurality of self-attention layers and a universal linear layer. Is completely similar to the previous multi-headed self-attention hierarchy, except that the input mechanism is different. The final output of the encoder block results in an encoded matrix of timing information, denoted here as C. The C matrix serves as the input matrices K and V for the encoder-decoder multi-headed self-attention layer in each decoder block, the input matrix Q coming from the final output of the previous layer decoder block. The output layer is composed of a linear layer and a Sigmoid activated function unit. The method specifically comprises the following steps: the final output of the decoder is denoted F. That is, our final output formula is:
Figure BDA0003470641600000101
where W is a learnable weight matrix, and σ (·) represents a sigmoid function. Wherein the content of the first and second substances,
Figure BDA0003470641600000102
the output of the output layer is represented, i.e. the final output of the transform as modified by the generator or discriminator. Further, said feeding a sliding window subsequence to the model is trained by generating an opponent to update parameters of the model comprises: a subsequence of sliding windows is fed to the model, which comprises a generator and a discriminator, by which training is done by generating countermeasures, by being able to generate real samples, the generator will capture a hidden multivariate distribution of the training sequence and can be considered as an implicit model in the normal state. At the same time, the resulting discriminator is also trained to distinguish spurious (i.e., abnormal) data from true (i.e., normal) data with high sensitivity. The generator will capture the hidden multivariate distribution of the training sequence by the generated fake samples and can be seen as an implicit model in the normal state. At the same time, the resulting discriminator is also trained to distinguish spurious (i.e., abnormal) data from true (i.e., normal) data with high sensitivity.
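Before turning to the training procedure, the following compact sketch shows how such an improved-Transformer generator and discriminator could be assembled. It is illustrative only; using PyTorch's built-in nn.Transformer in place of the hand-written encoder and decoder blocks, all class and parameter names, and the reuse of the InputRepresentation sketch above are assumptions:

```python
import torch
import torch.nn as nn

class ImprovedTransformer(nn.Module):
    """Embedding layer + positional encoding + encoder/decoder + output layer."""
    def __init__(self, n_sensors: int, d_model: int = 64, n_heads: int = 4, n_layers: int = 2):
        super().__init__()
        self.embed = InputRepresentation(n_sensors, d_model)  # scalar map + positional encoding
        self.transformer = nn.Transformer(
            d_model=d_model, nhead=n_heads,
            num_encoder_layers=n_layers, num_decoder_layers=n_layers,
            batch_first=True,
        )
        self.out = nn.Linear(d_model, n_sensors)  # output layer (linear)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        h = self.embed(x)
        f = self.transformer(h, h)         # encoder output C feeds the decoder's K and V
        return torch.sigmoid(self.out(f))  # sigmoid-activated final output

# The generator maps latent windows z to reconstructed windows; a discriminator built
# the same way would instead end in a linear layer producing one score per window.
```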
The training process has two stages: updating the parameters of the discriminator, and updating the parameters of the generator.
In the discriminator-update stage, the generator is used to produce fake samples from the latent space, and real training samples together with the generator's fake samples are fed to the discriminator. The discriminator is trained to judge the generated samples as fake and the real training samples as real as far as possible, using the discrimination-based loss function.
In the generator-update stage, the parameters of the discriminator are frozen, and fake samples produced by the generator from the latent space are fed to the discriminator so that the discriminator judges the generated samples as real as far as possible. The generator is trained using the reconstruction-based loss function.
The two stages are trained iteratively, round by round, to update the parameters of the discriminator and generator models. After a sufficient number of training rounds, the generator has acquired the ability to generate realistic data from the latent space, and the discriminator has acquired the ability to distinguish fake (i.e., abnormal) data from real (i.e., normal) data.
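For concreteness, a minimal adversarial training loop corresponding to these two stages might look as follows. This is a sketch under assumptions: PyTorch, binary cross-entropy terms for the discriminator-update stage, latent samples shaped like the data windows, and the combined loss L = L_R + L_D from the text for the generator-update stage:

```python
import torch
import torch.nn.functional as F

def train_step(G, D, x, opt_g, opt_d):
    # x: a batch of real sliding windows, shape (B, w, N).
    # Simplifying assumption: latent samples share the window shape (B, w, N).
    z = torch.randn_like(x)  # sample from the Gaussian latent space

    # Stage 1: update the discriminator; generated samples -> fake, real samples -> real.
    d_real, d_fake = D(x), D(G(z).detach())
    loss_d = F.binary_cross_entropy(d_real, torch.ones_like(d_real)) + \
             F.binary_cross_entropy(d_fake, torch.zeros_like(d_fake))
    opt_d.zero_grad(); loss_d.backward(); opt_d.step()

    # Stage 2: update the generator with the model loss L = L_R + L_D from the text
    # (the discriminator stays frozen here because only opt_g steps).
    fake = G(z)
    loss_g = ((x - fake) ** 2).sum() + ((D(x) - D(fake)) ** 2).sum()
    opt_g.zero_grad(); loss_g.backward(); opt_g.step()
    return loss_d.item(), loss_g.item()
```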
it is assumed that information is lost in reconstructing the anomalous samples. We use a trained generator to find the potential space that reflects the test dataset's optimality by mapping it back to the potential space. If a corresponding sample can be found in the potential space of the test data, the similarity between the test sample and the reconstructed test sample can explain how well the test data obeys the distribution reflected by the generator.
Further, the mapping of the test data set back into the latent space to find the latent representation that best reflects the test data set comprises the following steps:
first, the parameters of the generator are frozen, a latent sample set Z_1 is drawn from the latent space, and Z_1 is fed to the already-trained generator to obtain reconstructed samples G(Z_1); here Z_1 denotes the candidate latent samples of the first round;
the latent samples are then updated with gradients derived from the error loss function, computed as
L_err = Σ ||y − G(z)||_2
where ||·||_2 denotes the L2 norm and L_err denotes the error loss function for mapping the test data set back into the latent space;
after a sufficient number of rounds, the latent samples optimally reflect the test data set.
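A minimal sketch of this inverse mapping (illustrative; the Adam optimizer, learning rate, and iteration count are assumptions, and the squared-error form of the loss follows the reconstruction term used above):

```python
import torch

def invert_to_latent(G, y, n_iters: int = 100, lr: float = 1e-2):
    # y: test windows, shape (B, w, N). Freeze the generator's parameters.
    for p in G.parameters():
        p.requires_grad_(False)
    z = torch.randn_like(y, requires_grad=True)  # initial latent sample set Z_1
    opt = torch.optim.Adam([z], lr=lr)
    for _ in range(n_iters):
        loss = ((y - G(z)) ** 2).sum()           # error loss L_err
        opt.zero_grad(); loss.backward(); opt.step()
    return z.detach()
```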
Further, the loss function of the Transformer-based generative adversarial network multivariate time-series anomaly detection model is
L = L_R + L_D
where L denotes the loss function of the model; L_R denotes the reconstruction-based loss function, computed as
L_R = Σ ||x − G(z)||_2
and L_D denotes the discrimination-based loss function, computed as
L_D = Σ ||D(x) − D(G(z))||_2
where ||·||_2 denotes the L2 norm.
in the present embodiment, one is generally not concerned with point abnormalities for the time-series abnormality detection task. It is the actual case that any point within a continuous anomaly segment is classified as anomalous, i.e. a window is marked as anomalous as soon as it is detected that a point contained by the window is anomalous. Based on this, we divide the time series into a plurality of sub-series, i.e. use the sliding window way to detect the abnormality.
Further, using the trained model, feeding the new incoming time series to the model to calculate the anomaly score for each time window is embodied as: the trained generator and discriminator obtained in the above steps can reflect the optimal potential space of the test data set. Because information is lost in the process of reconstructing the anomaly, namely the error of the anomaly sample reconstructed by the generator through the potential space is very large, the method is a means for distinguishing the anomaly data from the normal data; in addition, the discriminator has the capability of distinguishing abnormal data from normal data after enough training;
in order to discover potential anomalies, two means of a generator and a discriminator are combined, and anomaly scores are calculated based on reconstruction loss and discrimination loss; the calculation formula is as follows:
Figure BDA0003470641600000121
wherein α + β ═ 1, | · | | | noncash2Represents the L2 norm;
α and β are parameters that are weighted in balance;
Figure BDA0003470641600000122
an anomaly score representing a sliding window at the ith time step of the test data set;
zirepresenting a subsequence y in a reflection test data setiThe corresponding sub-sequence in the optimal potential space.
Generating an anomaly score for the newly arrived time window according to the trained anomaly detection model, and declaring the newly arrived time window as anomalous if the anomaly score of the window is higher than a defined anomaly threshold τ. Otherwise, it is marked as normal.
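A sketch of this online scoring step (illustrative; the values of α, β, and τ are assumptions, with α + β = 1 as stated above, and invert_to_latent is the inverse-mapping sketch given earlier):

```python
import torch

def anomaly_scores(G, D, y, z, alpha: float = 0.7, beta: float = 0.3):
    # y: test windows (B, w, N); z: their optimal latent counterparts from invert_to_latent.
    recon = G(z)
    score = alpha * (y - recon).flatten(1).norm(dim=1) \
          + beta * (D(y) - D(recon)).flatten(1).norm(dim=1)
    return score  # one anomaly score A_i per window

# flagged = anomaly_scores(G, D, y, z) > tau  # declare windows with score > tau anomalous
```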
It should be understood that, although the steps in the above flowcharts are shown in an order indicated by the arrows, they are not necessarily performed in that order. Unless explicitly stated otherwise herein, the execution of these steps is not strictly ordered, and they may be performed in other orders. Moreover, at least a portion of the steps in the above flowcharts may include multiple sub-steps or stages, which are not necessarily performed at the same moment but may be performed at different moments, and which are not necessarily performed sequentially but may be performed in turn or alternately with other steps or with at least a portion of the sub-steps or stages of other steps.
In one embodiment, as shown in fig. 5, an anomaly detection system based on time-series data is provided, comprising:
an acquisition processing module 501 configured to collect time-series data generated by sensors, normalize the time-series data, and divide it into sliding-window subsequences;
a time-series representation module 502 configured to perform scalar mapping on each sliding-window subsequence, map the input scalar timestamp to a d_model-dimensional embedding through a learnable fully-connected layer, and produce a unified input representation of the sliding-window subsequences;
an offline training module 503 configured to feed the sliding-window subsequences to the anomaly detection model for adversarial training and to update the parameters of the anomaly detection model;
and an anomaly detection module 504 configured to generate an anomaly score for each newly arrived window of the data set under test according to the trained anomaly detection model, and to declare a window abnormal if its anomaly score is higher than a preset anomaly threshold.
In one embodiment, as shown in fig. 5, the acquisition processing module 501 comprises a window processing unit 5011 configured to:
obtain sensor data from N sensors at T_train time steps, the time-series data consisting of
S_train = {s^(1), s^(2), ..., s^(T_train)} ∈ R^(N × T_train);
and divide the time-series data into a set of subsequences X = {x_1, x_2, ..., x_m} by applying a sliding window with window size w and step size h, where
m = ⌊(T_train − w)/h⌋ + 1
is the number of subsequences.
In one embodiment, as shown in fig. 5, the time-series representation module 502 comprises a time-series calculation unit 5021 configured to:
map, by scalar mapping of each sliding window, the embedded representation of the input scalar timestamp from N dimensions to d_model dimensions using a learnable fully-connected layer;
perform scalar mapping on each sliding window x, mapping the input data from N dimensions to d_model dimensions, the embedded vector after mapping being denoted u_i;
position-encode each scalar-mapped sliding window;
and add the scalar-mapped sliding window and the position-encoded window with weighting, the final output being the input representation.
In one embodiment, as shown in fig. 5, the offline training module 503 comprises a neural network unit 5031 configured to:
construct the anomaly detection model from a latent space, a generator, and a discriminator, wherein the latent space is a Gaussian random-noise space, and the generator and the discriminator are each an improved Transformer neural network;
form the improved Transformer neural network from an embedding layer, a positional encoding, an encoder, a decoder, and an output layer;
the encoder consisting of a number of encoder blocks, each encoder block comprising a multi-head self-attention layer, a position-wise feedforward neural network, residual connections, and layer normalization;
and use the trained generator to find the latent representation that best reflects the test data set by mapping the test data set back into the latent space.
For specific limitations of the anomaly detection system based on time-series data, reference may be made to the above limitations of the anomaly detection method for time-series data, which are not repeated here. The modules in the anomaly detection system based on time-series data may be implemented wholly or partially by software, hardware, or a combination thereof. The modules may be embedded in, or independent of, a processor in a computer device in hardware form, or stored in a memory of the computer device in software form, so that the processor can invoke and execute the operations corresponding to the modules.
FIG. 6 is a diagram illustrating an internal structure of a computer device in one embodiment. As shown in fig. 6, the computer apparatus includes a processor, a memory, a network interface, an input device, and a display screen connected through a system bus. Wherein the memory includes a non-volatile storage medium and an internal memory. The non-volatile storage medium of the computer device stores an operating system and may also store a computer program that, when executed by the processor, causes the processor to implement an anomaly detection method for time-series data. The internal memory may also store a computer program that, when executed by the processor, causes the processor to perform a method of anomaly detection on time-series data. The display screen of the computer equipment can be a liquid crystal display screen or an electronic ink display screen, and the input device of the computer equipment can be a touch layer covered on the display screen, a key, a track ball or a touch pad arranged on the shell of the computer equipment, an external keyboard, a touch pad or a mouse and the like.
Those skilled in the art will appreciate that the architecture shown in fig. 6 is merely a block diagram of some of the structures associated with the disclosed aspects and is not intended to limit the computing devices to which the disclosed aspects apply, as particular computing devices may include more or less components than those shown, or may combine certain components, or have a different arrangement of components.
In one embodiment, a computer device is provided, comprising a memory, a processor, and a computer program stored on the memory and executable on the processor, the processor implementing the following steps when executing the computer program:
collecting time-series data generated by sensors, normalizing the time-series data, and dividing it into sliding-window subsequences;
performing scalar mapping on each of the sliding-window subsequences, mapping the input scalar timestamp to a d_model-dimensional embedding through a learnable fully-connected layer, and producing a unified input representation of the sliding-window subsequences;
constructing an anomaly detection model, feeding the sliding-window subsequences to the anomaly detection model for adversarial training, and updating the parameters of the anomaly detection model;
and generating an anomaly score for each newly arrived window of the data set under test according to the trained anomaly detection model; if the anomaly score of a window is higher than a preset anomaly threshold, declaring that window abnormal.
In one embodiment, the processor, when executing the computer program, further performs the steps of:
obtaining sensor data from N sensors at T_train time steps, the time-series data consisting of
S_train = {s^(1), s^(2), ..., s^(T_train)} ∈ R^(N × T_train);
and dividing the time-series data into a set of subsequences X = {x_1, x_2, ..., x_m} by applying a sliding window with window size w and step size h, where
m = ⌊(T_train − w)/h⌋ + 1
is the number of subsequences.
In one embodiment, the processor, when executing the computer program, further performs the steps of:
mapping, by scalar mapping of each sliding window, the embedded representation of the input scalar timestamp from N dimensions to d_model dimensions using a learnable fully-connected layer;
performing scalar mapping on each sliding window x, mapping the input data from N dimensions to d_model dimensions, the embedded vector after mapping being denoted u_i;
position-encoding each scalar-mapped sliding window;
and adding the scalar-mapped sliding window and the position-encoded window with weighting, the final output being the input representation.
In one embodiment, the processor, when executing the computer program, further performs the steps of:
constructing the anomaly detection model from a latent space, a generator, and a discriminator, wherein the latent space is a Gaussian random-noise space, and the generator and the discriminator are each an improved Transformer neural network;
forming the improved Transformer neural network from an embedding layer, a positional encoding, an encoder, a decoder, and an output layer;
the encoder consisting of a number of encoder blocks, each encoder block comprising a multi-head self-attention layer, a position-wise feedforward neural network, residual connections, and layer normalization;
and using the trained generator to find the latent representation that best reflects the test data set by mapping the test data set back into the latent space.
In one embodiment, a computer-readable storage medium is provided, having a computer program stored thereon which, when executed by a processor, performs the following steps:
collecting time-series data generated by sensors, normalizing the time-series data, and dividing it into sliding-window subsequences;
performing scalar mapping on each of the sliding-window subsequences, mapping the input scalar timestamp to a d_model-dimensional embedding through a learnable fully-connected layer, and producing a unified input representation of the sliding-window subsequences;
constructing an anomaly detection model, feeding the sliding-window subsequences to the anomaly detection model for adversarial training, and updating the parameters of the anomaly detection model;
and generating an anomaly score for each newly arrived window of the data set under test according to the trained anomaly detection model; if the anomaly score of a window is higher than a preset anomaly threshold, declaring that window abnormal.
In one embodiment, the computer program, when executed by the processor, further performs the steps of:
obtaining sensor data from N sensors at T_train time steps, the time-series data consisting of
S_train = {s^(1), s^(2), ..., s^(T_train)} ∈ R^(N × T_train);
and dividing the time-series data into a set of subsequences X = {x_1, x_2, ..., x_m} by applying a sliding window with window size w and step size h, where
m = ⌊(T_train − w)/h⌋ + 1
is the number of subsequences.
In one embodiment, the computer program, when executed by the processor, further performs the steps of:
mapping, by scalar mapping of each sliding window, the embedded representation of the input scalar timestamp from N dimensions to d_model dimensions using a learnable fully-connected layer;
performing scalar mapping on each sliding window x, mapping the input data from N dimensions to d_model dimensions, the embedded vector after mapping being denoted u_i;
position-encoding each scalar-mapped sliding window;
and adding the scalar-mapped sliding window and the position-encoded window with weighting, the final output being the input representation.
In one embodiment, the computer program, when executed by the processor, further performs the steps of:
constructing the anomaly detection model from a latent space, a generator, and a discriminator, wherein the latent space is a Gaussian random-noise space, and the generator and the discriminator are each an improved Transformer neural network;
forming the improved Transformer neural network from an embedding layer, a positional encoding, an encoder, a decoder, and an output layer;
the encoder consisting of a number of encoder blocks, each encoder block comprising a multi-head self-attention layer, a position-wise feedforward neural network, residual connections, and layer normalization;
and using the trained generator to find the latent representation that best reflects the test data set by mapping the test data set back into the latent space.
It will be understood by those skilled in the art that all or part of the processes of the methods of the above embodiments can be implemented by a computer program instructing related hardware; the computer program can be stored in a non-volatile computer-readable storage medium and, when executed, can include the processes of the embodiments of the methods described above.
The technical features of the above embodiments can be combined arbitrarily. For brevity, not all possible combinations of the technical features in the above embodiments are described; however, as long as a combination of these technical features contains no contradiction, it should be considered within the scope of this specification.
The above embodiments express only several implementations of the present application, and their description is specific and detailed, but they should not be construed as limiting the scope of the invention. It should be noted that, for a person skilled in the art, several variations and improvements can be made without departing from the concept of the present application, and these fall within the protection scope of the present application. Therefore, the protection scope of this patent shall be subject to the appended claims.

Claims (10)

1. A method for detecting an abnormality in time-series data, the method comprising:
collecting time-series data generated by sensors, normalizing the time-series data, and dividing it into sliding-window subsequences;
performing scalar mapping on each of the sliding-window subsequences, mapping the input scalar timestamp to a d_model-dimensional embedding through a learnable fully-connected layer, and producing a unified input representation of the sliding-window subsequences;
constructing an anomaly detection model, feeding the sliding-window subsequences to the anomaly detection model for adversarial training, and updating parameters of the anomaly detection model;
and generating an anomaly score for each newly arrived window of the data set under test according to the trained anomaly detection model, and declaring a window abnormal if its anomaly score is higher than a preset anomaly threshold.
2. The method for detecting anomalies in time-series data according to claim 1, wherein the collecting of the time-series data generated by the sensors and, after normalization, the dividing of the time-series data into sliding-window subsequences comprises:
obtaining sensor data from N sensors at T_train time steps, the time-series data consisting of
S_train = {s^(1), s^(2), ..., s^(T_train)} ∈ R^(N × T_train);
and dividing the time-series data into a set of subsequences X = {x_1, x_2, ..., x_m} by applying a sliding window with window size w and step size h, where
m = ⌊(T_train − w)/h⌋ + 1
is the number of subsequences.
3. The method for detecting anomalies in time-series data according to claim 1, wherein the performing of scalar mapping on each of the sliding-window subsequences, the mapping of the input scalar timestamp to a d_model-dimensional embedding through a learnable fully-connected layer, and the unified input representation of the sliding-window subsequences comprise the following steps:
mapping, by scalar mapping of each sliding window, the embedded representation of the input scalar timestamp from N dimensions to d_model dimensions using a learnable fully-connected layer;
performing scalar mapping on each sliding window x, mapping the input data from N dimensions to d_model dimensions, the embedded vector after mapping being denoted u_i;
position-encoding each scalar-mapped sliding window;
and adding the scalar-mapped sliding window and the position-encoded window with weighting, the final output being the input representation.
4. The method according to claim 1, wherein the constructing of the anomaly detection model, the feeding of the sliding-window subsequences to the anomaly detection model for adversarial training, and the updating of parameters of the anomaly detection model comprise:
constructing the anomaly detection model from a latent space, a generator, and a discriminator, wherein the latent space is a Gaussian random-noise space, and the generator and the discriminator are each an improved Transformer neural network;
forming the improved Transformer neural network from an embedding layer, a positional encoding, an encoder, a decoder, and an output layer;
the encoder consisting of a number of encoder blocks, each encoder block comprising a multi-head self-attention layer, a position-wise feedforward neural network, residual connections, and layer normalization;
and using the trained generator to find the latent representation that best reflects the test data set by mapping the test data set back into the latent space.
5. An anomaly detection system based on time-series data, characterized by comprising:
an acquisition processing module, configured to acquire time-series data generated by a sensor, normalize the time-series data, and divide it into sliding-window subsequences;
a time-series representation module, configured to perform scalar mapping on each sliding-window subsequence, map the input scalar timestamp to a $d_{model}$-dimensional embedding through a learnable fully-connected layer, and produce a uniform input representation of the sliding-window subsequence;
an offline training module, configured to feed the sliding-window subsequences to the anomaly detection model, train it in a generative-adversarial manner, and update the parameters of the anomaly detection model;
and an anomaly detection module, configured to generate an anomaly score for a newly arrived test data set with the trained anomaly detection model; if the anomaly score of a window is higher than a preset anomaly threshold, the newly arrived test data set is declared abnormal.
6. The time-series-data-based anomaly detection system according to claim 5, wherein the acquisition processing module comprises a window processing unit configured to:
obtain sensor data from $N$ sensors over $T_{train}$ time steps, the time-series data comprising the sensor data

$X = \{x_1, x_2, \ldots, x_{T_{train}}\}, \quad x_t \in \mathbb{R}^N;$

and divide the time-series data into the set of subsequences $\mathcal{W} = \{W_1, W_2, \ldots, W_K\}$, with $K = \lfloor (T_{train} - w)/h \rfloor + 1$.
7. The time-series-data-based anomaly detection system according to claim 5, wherein the time-series representation module comprises a time-series calculation unit configured to:
perform scalar mapping on each sliding window, mapping the embedded representation of the input scalar timestamp from $N$ dimensions to $d_{model}$ dimensions using a learnable fully-connected layer;
perform scalar mapping on each sliding window $x$, mapping the input data from $N$ dimensions to $d_{model}$ dimensions, and denoting the embedded vector after mapping as $u_i$;
position-code the scalar-mapped sliding window;
and add the scalar-mapped sliding window and the position-coded window with weights; the final output is the input representation.
8. The time-series-data-based anomaly detection system according to claim 5, wherein the offline training module comprises a neural network unit configured to:
construct the anomaly detection model from a latent space, a generator, and a discriminator, wherein the latent space is a Gaussian random-noise space, and the generator and the discriminator are each an improved Transformer neural network;
compose the improved Transformer neural network of an embedding layer, a positional encoding, an encoder, a decoder, and an output layer;
the encoder being composed of a number of encoder blocks, each encoder block comprising a multi-head self-attention layer, a position-wise feed-forward neural network, residual connections, and layer normalization;
and, using the trained generator, find the latent-space representation that best reflects the test data set by mapping the test data set back to the latent space.
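The encoder block named here maps directly onto standard Transformer components. A PyTorch sketch follows; the post-norm arrangement and the layer sizes chosen here are assumptions:

```python
import torch
import torch.nn as nn

class EncoderBlock(nn.Module):
    """One encoder block: multi-head self-attention and a position-wise
    feed-forward network, each wrapped in a residual connection + layer norm."""
    def __init__(self, d_model: int = 64, n_heads: int = 4, d_ff: int = 256):
        super().__init__()
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.ffn = nn.Sequential(nn.Linear(d_model, d_ff), nn.ReLU(),
                                 nn.Linear(d_ff, d_model))
        self.norm1 = nn.LayerNorm(d_model)
        self.norm2 = nn.LayerNorm(d_model)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        a, _ = self.attn(x, x, x)            # multi-head self-attention
        x = self.norm1(x + a)                # residual connection + layer norm
        return self.norm2(x + self.ffn(x))   # position-wise FFN, residual + norm

block = EncoderBlock()
print(block(torch.randn(8, 10, 64)).shape)   # torch.Size([8, 10, 64])
```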
9. A computer device comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, characterized in that the processor, when executing the computer program, implements the steps of the method of any one of claims 1 to 4.
10. A computer-readable storage medium on which a computer program is stored, characterized in that the computer program, when executed by a processor, implements the steps of the method of any one of claims 1 to 4.
CN202210042038.XA 2022-01-14 2022-01-14 Method, system, device and storage medium for detecting abnormality of time-series data Pending CN114386521A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210042038.XA CN114386521A (en) 2022-01-14 2022-01-14 Method, system, device and storage medium for detecting abnormality of time-series data


Publications (1)

Publication Number Publication Date
CN114386521A true CN114386521A (en) 2022-04-22

Family

ID=81200969

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210042038.XA Pending CN114386521A (en) 2022-01-14 2022-01-14 Method, system, device and storage medium for detecting abnormality of time-series data

Country Status (1)

Country Link
CN (1) CN114386521A (en)


Cited By (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115081555A (en) * 2022-08-16 2022-09-20 南京航空航天大学 Anomaly detection method and device based on generation countermeasure and bidirectional cyclic neural network
CN115081555B (en) * 2022-08-16 2023-12-08 南京航空航天大学 Anomaly detection method and device based on generation countermeasure and bidirectional circulating neural network
CN115078894A (en) * 2022-08-22 2022-09-20 广东电网有限责任公司肇庆供电局 Method, device and equipment for detecting abnormity of electric power machine room and readable storage medium
CN115840875A (en) * 2022-11-10 2023-03-24 北京擎天信安科技有限公司 Millimeter wave radar abnormal signal detection method and system based on analog transducer
CN115565525A (en) * 2022-12-06 2023-01-03 四川大学华西医院 Audio anomaly detection method and device, electronic equipment and storage medium
CN116149896A (en) * 2023-03-27 2023-05-23 阿里巴巴(中国)有限公司 Time sequence data abnormality detection method, storage medium and electronic device
CN116149896B (en) * 2023-03-27 2023-07-21 阿里巴巴(中国)有限公司 Time sequence data abnormality detection method, storage medium and electronic device
CN116956282A (en) * 2023-06-07 2023-10-27 广州天懋信息系统股份有限公司 Abnormality detection system based on network asset memory time sequence multi-feature data
CN116956282B (en) * 2023-06-07 2024-02-06 广州天懋信息系统股份有限公司 Abnormality detection system based on network asset memory time sequence multi-feature data
CN116738170A (en) * 2023-06-13 2023-09-12 无锡物联网创新中心有限公司 Abnormality analysis method and related device for industrial equipment
CN116664000A (en) * 2023-06-13 2023-08-29 无锡物联网创新中心有限公司 Industrial equipment abnormality detection method and related device based on long-short-term memory network
CN116662811A (en) * 2023-06-13 2023-08-29 无锡物联网创新中心有限公司 Time sequence state data reconstruction method and related device of industrial equipment
CN116662811B (en) * 2023-06-13 2024-02-06 无锡物联网创新中心有限公司 Time sequence state data reconstruction method and related device of industrial equipment
CN117793764A (en) * 2023-12-27 2024-03-29 广东宜通衡睿科技有限公司 5G private network soft probe dial testing data integrity checksum completion method and system

Similar Documents

Publication Publication Date Title
CN114386521A (en) Method, system, device and storage medium for detecting abnormality of time-series data
CN111914873A (en) Two-stage cloud server unsupervised anomaly prediction method
CN112784965A (en) Large-scale multi-element time series data abnormity detection method oriented to cloud environment
CN109948117A (en) A kind of satellite method for detecting abnormality fighting network self-encoding encoder
Chen et al. Time series data for equipment reliability analysis with deep learning
Wu et al. A weighted deep domain adaptation method for industrial fault prognostics according to prior distribution of complex working conditions
Wang et al. A Bayesian inference-based approach for performance prognostics towards uncertainty quantification and its applications on the marine diesel engine
CN115903741B (en) Industrial control system data anomaly detection method
Gu et al. An improved sensor fault diagnosis scheme based on TA-LSSVM and ECOC-SVM
CN110956309A (en) Flow activity prediction method based on CRF and LSTM
CN116522265A (en) Industrial Internet time sequence data anomaly detection method and device
CN114297918A (en) Aero-engine residual life prediction method based on full-attention depth network and dynamic ensemble learning
CN115983087A (en) Method for detecting time sequence data abnormity by combining attention mechanism and LSTM and terminal
CN116796272A (en) Method for detecting multivariate time sequence abnormality based on transducer
Belkhouja et al. Analyzing deep learning for time-series data through adversarial lens in mobile and IoT applications
CN112115184A (en) Time series data detection method and device, computer equipment and storage medium
Yang et al. Predictive maintenance for general aviation using convolutional transformers
CN116992380A (en) Satellite multidimensional telemetry sequence anomaly detection model construction method and device, anomaly detection method and device
CN117034099A (en) System log abnormality detection method
CN116628612A (en) Unsupervised anomaly detection method, device, medium and equipment
Morris et al. Self-supervised deep learning framework for anomaly detection in traffic data
US20210177307A1 (en) Repetitive human activities abnormal motion detection
Jan Hagendorfer Knowledge incorporation for machine learning in condition monitoring: A survey
KR20220108678A (en) Arrhythmia classification method using long short-term memory based on recurrent neural network
Correia et al. Online time-series anomaly detection: A survey of modern model-based approaches

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination