CN113837351B - Novelty detector - Google Patents

Novelty detector

Info

Publication number
CN113837351B
CN113837351B (application CN202011622881.2A)
Authority
CN
China
Prior art keywords
data
discriminator
encoder
novelty detector
actual data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202011622881.2A
Other languages
Chinese (zh)
Other versions
CN113837351A (en)
Inventor
李中彦
庆宗旻
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Korea Advanced Institute of Science and Technology KAIST
SK Hynix Inc
Original Assignee
Korea Advanced Institute of Science and Technology KAIST
SK Hynix Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Korea Advanced Institute of Science and Technology KAIST, SK Hynix Inc filed Critical Korea Advanced Institute of Science and Technology KAIST
Priority to US17/149,627 priority Critical patent/US20210383253A1/en
Priority to KR1020210017546A priority patent/KR20210152369A/en
Publication of CN113837351A publication Critical patent/CN113837351A/en
Application granted granted Critical
Publication of CN113837351B publication Critical patent/CN113837351B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/04 Architecture, e.g. interconnection topology
    • G06N3/045 Combinations of networks
    • G06N3/06 Physical realisation, i.e. hardware implementation of neural networks, neurons or parts of neurons
    • G06N3/063 Physical realisation using electronic means
    • G06N3/08 Learning methods

Abstract

The present disclosure relates to a novelty detector comprising: a generator configured to output reconstructed data from actual data; and a discriminator configured to receive the actual data and the reconstructed data and to generate, using both, discrimination data indicating whether the actual data is normal or abnormal.

Description

Novelty detector
Cross Reference to Related Applications
The present application claims priority from Korean patent application No. 10-2020-0068807, filed on June 8, 2020, which is incorporated herein by reference in its entirety.
Technical Field
Various embodiments relate to a novelty detector, and more particularly, to a novelty detector that is trained with various samples without labels.
Background
Novelty detection refers to a technique for detecting data that differs from previously known data.
When novelty detection is performed using a neural network, the network is trained on normal samples, which may be regarded as samples drawn from the distribution of the normal samples' feature space.
When data is input to the neural network after training is complete, the network infers whether the actual data is a normal sample (i.e., a sample from the distribution of the normal samples' feature space) or an abnormal sample (i.e., a sample not from that distribution).
Fig. 1 is a block diagram showing a conventional novelty detector 1.
The conventional novelty detector 1 includes a generator 10 and a discriminator 20.
Each of the generator 10 and the discriminator 20 has a neural network structure such as a convolutional neural network (CNN).
A neural network with the structure shown in fig. 1 is called a generative adversarial network (GAN).
The generator 10 includes an encoder 11 and a decoder 12 coupled in sequence, as shown in fig. 2, to receive the actual data x and generate the reconstructed data G(x).
The encoder 11 and the decoder 12 may also have a neural network structure such as a CNN.
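As a rough, non-authoritative sketch of this encoder-decoder structure (the layer sizes and linear maps below are illustrative assumptions, standing in for the CNN stages the text describes):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "encoder" and "decoder": single linear maps standing in for the
# CNN stages described in the text (the sizes are illustrative assumptions).
W_enc = rng.standard_normal((4, 8)) * 0.1   # 8-dim input  -> 4-dim latent
W_dec = rng.standard_normal((8, 4)) * 0.1   # 4-dim latent -> 8-dim output

def generator(x):
    """Encode the actual data x, then decode to the reconstruction G(x)."""
    z = np.tanh(W_enc @ x)      # encoder 11: actual data -> latent code
    return np.tanh(W_dec @ z)   # decoder 12: latent code -> reconstruction

x = rng.standard_normal(8)      # actual data x
gx = generator(x)               # reconstructed data G(x)
print(gx.shape)                 # same shape as the input
```

The point of the sketch is only the data flow: x enters the encoder, the latent code enters the decoder, and G(x) has the same shape as x.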
The discriminator 20 receives the reconstructed data G(x) and outputs discrimination data D indicating whether the actual data x is a normal sample or an abnormal sample.
For example, the discrimination data D may take the value 0 or 1, where 1 indicates that the actual data is judged to be a normal sample and 0 indicates that it is judged to be an abnormal sample.
The conventional novelty detector 1 further comprises a coupling circuit 30.
For training operations, the coupling circuit 30 provides either the actual data x or the reconstructed data G(x) as input to the discriminator 20.
For inference operations, the coupling circuit 30 provides the reconstructed data G(x) as input to the discriminator 20.
Fig. 3 is a flowchart showing a training operation of the conventional novelty detector 1. The goal in training the generator 10 is to reconstruct samples similar to the actual samples, while the goal in training the discriminator 20 is to distinguish the samples generated by the generator 10 from the actual samples.
In the conventional novelty detector 1, the discriminator 20 and the generator 10 are trained alternately.
When training the discriminator 20 in step S10, the weights of the discriminator's neural network are adjusted while the weights of the generator's neural network are fixed.
During this step, the actual data x and the reconstructed data G(x) are alternately input to the discriminator 20.
The weights of the discriminator 20 are adjusted so that the discrimination data D becomes 1 when the actual data x is input and 0 when the reconstructed data G(x) is input.
When training the generator 10 in step S20, the weights of the generator 10 are adjusted while the weights of the discriminator 20 are fixed.
During this step, the reconstructed data G(x) is input to the discriminator 20.
The weights of the generator 10 are adjusted so that the discrimination data D approaches 1 and the mean square error (MSE) between the actual data x and the reconstructed data G(x) approaches 0.
The above two steps may be repeated during the training operation.
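The two alternating steps can be summarized with standard GAN-style log losses (an assumption; the text states the targets for D but gives no explicit loss functions):

```python
import numpy as np

EPS = 1e-12  # guard against log(0)

def d_loss(d_real, d_fake):
    """Step S10 objective sketch: drive D(x) toward 1 and D(G(x)) toward 0."""
    return -(np.log(d_real + EPS) + np.log(1.0 - d_fake + EPS))

def g_loss(d_fake, x, gx):
    """Step S20 objective sketch: drive D(G(x)) toward 1 and the
    mean square error between x and G(x) toward 0."""
    mse = np.mean((x - gx) ** 2)
    return -np.log(d_fake + EPS) + mse

x = np.array([0.2, 0.8])
gx = np.array([0.25, 0.7])  # a hypothetical reconstruction

loss_sharp = d_loss(d_real=0.9, d_fake=0.1)  # D separates the inputs well
loss_blur = d_loss(d_real=0.5, d_fake=0.5)   # D at chance
print(loss_sharp < loss_blur)  # separating the inputs lowers D's loss
```

In practice each step would backpropagate through the frozen counterpart network; only the shape of the objectives is sketched here.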
Normal samples labeled with categories may be used to train the neural network.
Training with labeled normal samples can improve novelty detection performance.
However, preparing the many category-labeled normal samples required for industrial use takes considerable time and cost.
For this reason, it is common practice to train on unlabeled samples from various categories. In this case, novelty detection performance degrades rapidly as the number of categories increases.
Disclosure of Invention
According to an embodiment of the present disclosure, a novelty detector may include: a generator configured to output reconstructed data from actual data; and a discriminator configured to receive the actual data and the reconstructed data and to use both to generate discrimination data indicating whether the actual data is normal or abnormal.
Drawings
The accompanying figures, in which like reference numerals refer to identical or functionally similar elements throughout the separate views, are incorporated in and form part of the specification together with the detailed description below, and serve to further illustrate embodiments of the claimed novelty detector and to explain various principles and advantages of those embodiments.
Fig. 1 is a block diagram showing a conventional novelty detector.
Fig. 2 is a block diagram showing a conventional generator.
Fig. 3 is a flowchart showing a training operation of a conventional novelty detector.
Fig. 4 is a block diagram illustrating a novelty detector in accordance with an embodiment of the present disclosure.
Fig. 5 is a block diagram illustrating a novelty detector in accordance with another embodiment of the present disclosure.
Fig. 6 is a block diagram illustrating a novelty detector in accordance with another embodiment of the present disclosure.
Fig. 7 is a flowchart illustrating a training process of the novelty detector of fig. 4, according to an embodiment.
Fig. 8 is a flowchart illustrating a training process of the novelty detector of fig. 5, according to an embodiment.
Fig. 9 is a graph illustrating the effect of an embodiment of the present disclosure.
Detailed Description
Various embodiments will be described below with reference to the accompanying drawings. These embodiments are provided for purposes of illustration; other embodiments not explicitly shown or described may also exist, and modifications may be made to the embodiments described in detail below.
Fig. 4 is a block diagram illustrating a novelty detector 1000 in accordance with an embodiment of the present disclosure.
The novelty detector 1000 includes a generator 100 and a discriminator 200.
Each of the generator 100 and the discriminator 200 may include a neural network such as a convolutional neural network (CNN).
The generator 100 receives the actual data x and generates the reconstructed data G(x). The generator 100 may include an encoder and a decoder, such as those shown in fig. 2, each of which may have a neural network structure such as a CNN.
Hereinafter, the actual data x may be referred to as input data x.
Unlike the related art, the discriminator 200 receives the reconstructed data G(x) and the actual data x simultaneously during the inference process and outputs the discrimination data D.
In the present embodiment, the discrimination data D has a value between 0 and 1, and when the actual data x is a normal sample, the discrimination data D has a larger value than it would for an abnormal sample. In an embodiment, a higher value of the discrimination data D may correspond to increased likelihood or confidence that the actual data x is a normal sample.
The novelty detector 1000 may further include a coupling circuit 300.
During a training operation, the coupling circuit 300 provides either a first pair consisting of the actual data x and the actual data x, or a second pair consisting of the actual data x and the reconstructed data G(x), as input to the discriminator 200.
For example, during a training operation, the first pair (x, x) and the second pair (x, G(x)) may be alternately provided as inputs to the discriminator 200.
During the inference process, the coupling circuit 300 provides both the actual data x and the reconstructed data G(x) as inputs to the discriminator 200.
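One simple way to provide both members of a pair at once, sketched below, is to stack them along a new channel axis (stacking is an assumed mechanism; the text only states that the pair is provided simultaneously):

```python
import numpy as np

def couple(x, other):
    """Coupling-circuit sketch: stack a pair along a new leading axis so
    the discriminator sees both members of the pair at once."""
    return np.stack([x, other], axis=0)

x = np.ones((8, 8))     # actual data x (toy 8x8 "image")
gx = np.zeros((8, 8))   # reconstructed data G(x)

first_pair = couple(x, x)    # (x, x): used while training D toward 1
second_pair = couple(x, gx)  # (x, G(x)): used while training D toward 0
print(first_pair.shape)      # (2, 8, 8): one channel per pair member
```

A CNN discriminator would then simply take a 2-channel input where the conventional one took a 1-channel input.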
Fig. 7 is a flowchart illustrating a training operation 700 of the novelty detector 1000.
In this embodiment, the discriminator 200 and the generator 100 are trained alternately.
During the training of the discriminator 200 in step S100, the weights of the discriminator 200 are adjusted while the weights of the generator 100 are fixed.
During this step, the first pair (x, x) and the second pair (x, G(x)) are alternately input to the discriminator 200.
In the embodiment of step S100, the first pair (x, x) is input to the discriminator 200 in step S102 to generate the discrimination data D. In step S104, the weights of the discriminator 200 are adjusted so that the discrimination data D tends toward 1 when the first pair (x, x) is input. In step S106, the actual data x is input to the generator 100 to generate the reconstructed data G(x), and the second pair (x, G(x)) is input to the discriminator 200 to generate the discrimination data D. In step S108, the weights of the discriminator 200 are adjusted so that the discrimination data D tends toward 0 when the second pair (x, G(x)) is input. That is, both the first value of the discrimination data D corresponding to the first pair (x, x) and the second value corresponding to the second pair (x, G(x)) are considered when adjusting the weights of the discriminator 200.
During the training of the generator 100 in step S200, the weights of the generator 100 are adjusted while the weights of the discriminator 200 are fixed.
In step S202, the actual data x is supplied to the generator 100 to generate the reconstructed data G(x), and the actual data x and the reconstructed data G(x) are simultaneously input to the discriminator 200. That is, only the second pair (x, G(x)) is input to the discriminator 200.
In step S204, the weights of the generator 100 are adjusted so that the discrimination data D tends toward 1 based on the input actual data x and reconstructed data G(x). In the present embodiment, the mean square error between the actual data x and the reconstructed data G(x) is not considered when training the generator 100.
In training operation 700, the discriminator 200 and the generator 100 may be trained by alternately repeating steps S100 and S200.
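Assuming GAN-style log losses (the text specifies targets for D but no explicit loss functions), steps S100 and S200 can be sketched as follows; note the absence of any MSE term in the generator objective:

```python
import numpy as np

EPS = 1e-12  # guard against log(0)

def d_step_loss(d_first, d_second):
    """Step S100 sketch: the score for the first pair (x, x) is driven
    toward 1 and the score for the second pair (x, G(x)) toward 0;
    both values enter the one objective used to adjust D's weights."""
    return -(np.log(d_first + EPS) + np.log(1.0 - d_second + EPS))

def g_step_loss(d_second):
    """Step S200 sketch: only the second pair (x, G(x)) is used and,
    unlike the conventional detector, no MSE term appears."""
    return -np.log(d_second + EPS)

d_sharp = d_step_loss(d_first=0.9, d_second=0.1)  # D separating the pairs well
d_blur = d_step_loss(d_first=0.5, d_second=0.5)   # D at chance
g_fooled = g_step_loss(d_second=0.9)              # G fooling D
print(d_sharp < d_blur)  # separating the pairs lowers D's loss
```

The generator thus improves only by making the pair (x, G(x)) indistinguishable from (x, x) in the discriminator's view, not by minimizing pixelwise reconstruction error.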
In the conventional novelty detector 1, the discriminator 20 receives only the reconstructed data G(x), not the actual data x, when distinguishing normal samples from abnormal samples.
In contrast, in the novelty detector 1000 according to the present embodiment, the discriminator 200 receives both the actual data x and the reconstructed data G(x).
The generalization error can be used to characterize novelty detection performance when test samples not used during the training operation are input during the inference process.
When a GAN structure is used, the generator should reconstruct well when normal samples are input but poorly when abnormal samples are input; that is, the generator should not generalize well. On the other hand, the discriminator should distinguish samples well regardless of whether they are normal or abnormal, which means the discriminator's generalization ability should be good.
In a conventional GAN it is difficult to make the generalization errors of the discriminator and the generator differ: good generalization by the discriminator tends to produce good generalization by the generator, and vice versa.
In an embodiment, however, the generalization performance of the generator and of the discriminator can be set differently by using two discriminators. In the present embodiment, the generalization error of the generator may be regarded as its reconstruction error, and its generalization performance as its reconstruction performance. Generalization performance improves as generalization error decreases.
To improve overall novelty detection performance, the generator 100 should have good reconstruction performance, i.e., small reconstruction error, for normal samples and poor reconstruction performance, i.e., large reconstruction error, for abnormal samples. As the generalization performance of the discriminator 200 improves, the generalization performance of the generator 100 for abnormal samples also improves, so overall novelty detection performance may be limited.
In the embodiment of fig. 5, the improvement of the generator's generalization performance can be suppressed while the discriminator's generalization performance is improved. The generalization performance of the discriminator and that of the generator may therefore be set differently.
Fig. 5 is a block diagram illustrating a novelty detector 2000 in accordance with another embodiment of the present disclosure.
The novelty detector 2000 includes a generator 400 and a discriminator 500.
The generator 400 includes a first encoder 411, a second encoder 412, an arithmetic circuit 430, and a decoder 420.
The first encoder 411 and the second encoder 412 each encode the actual data x.
The first encoder 411 corresponds to the encoder included in the generator 100 of fig. 4, and the second encoder 412 corresponds to the encoder included in the generator 10 of fig. 1.
This will be described in detail when the training operation is explained.
The arithmetic circuit 430 combines the outputs of the first encoder 411 and the second encoder 412. For example, the arithmetic circuit 430 may linearly combine the outputs from the first encoder 411 and the second encoder 412 with normalization.
The decoder 420 decodes the output of the arithmetic circuit 430 and outputs the reconstructed data G(x).
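A toy sketch of this dual-encoder generator (the linear maps, the sizes, and the equal-weight mix are assumptions; the text says only that the two encoder outputs are linearly combined with normalization before decoding):

```python
import numpy as np

rng = np.random.default_rng(1)
W1 = rng.standard_normal((4, 8)) * 0.1   # first encoder 411 (toy linear map)
W2 = rng.standard_normal((4, 8)) * 0.1   # second encoder 412
Wd = rng.standard_normal((8, 4)) * 0.1   # decoder 420

def generator(x, alpha=0.5):
    """Dual-encoder generator sketch: the arithmetic circuit 430 is
    modeled as a normalized linear combination of the two latent codes
    (the mixing weight alpha is an assumption)."""
    z1 = np.tanh(W1 @ x)                  # first encoder 411
    z2 = np.tanh(W2 @ x)                  # second encoder 412
    z = alpha * z1 + (1.0 - alpha) * z2   # arithmetic circuit 430: combine
    z = z / (np.linalg.norm(z) + 1e-12)   # ... with normalization
    return np.tanh(Wd @ z)                # decoder 420

x = rng.standard_normal(8)
gx = generator(x)
print(gx.shape)   # same shape as the input
```

Both encoders see the same input x; only their training differs, which is what lets their generalization errors diverge as described below.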
Each of the first encoder 411, the second encoder 412, and the decoder 420 may include a neural network such as CNN.
The discriminator 500 receives the actual data x and the reconstructed data G(x) and outputs the discrimination data D.
For example, the discrimination data D has a value between 0 and 1, and takes a larger value when the actual data x is a normal sample.
The discriminator 500 includes a first discriminator 510 and a second discriminator 520.
The first discriminator 510 is substantially identical to the discriminator 200 of fig. 4, while the second discriminator 520 is substantially identical to the discriminator 20 of fig. 1.
The first discriminator 510 receives the actual data x and the reconstructed data G(x) simultaneously during the inference operation and outputs the discrimination data D.
The discrimination data D output from the first discriminator 510 may be referred to as the first discrimination data.
The discrimination data SD output from the second discriminator 520 is used only during the training operation, not during the inference operation.
The discrimination data SD output from the second discriminator 520 may be referred to as the second discrimination data SD.
Accordingly, the second discriminator 520 may be turned off during the inference operation to reduce power consumption.
The novelty detector 2000 further includes a coupling circuit 600.
The coupling circuit 600 includes a first coupling circuit 610 and a second coupling circuit 620.
The first coupling circuit 610 corresponds to the coupling circuit 300 of fig. 4, and the second coupling circuit 620 corresponds to the coupling circuit 30 of fig. 1.
During the inference operation, the first coupling circuit 610 supplies the actual data x and the reconstructed data G(x) simultaneously to the first discriminator 510.
During the training operation of the first discriminator 510, the first coupling circuit 610 alternately supplies the first pair (x, x) and the second pair (x, G(x)) to the first discriminator 510; during the training operation of the first encoder 411 or the decoder 420, it supplies the second pair (x, G(x)) to the first discriminator 510.
During the training operation of the second discriminator 520, the second coupling circuit 620 alternately supplies the actual data x and the reconstructed data G(x) to the second discriminator 520; during the training operation of the second encoder 412 or the decoder 420, it supplies the reconstructed data G(x) to the second discriminator 520.
Since the second discriminator 520 does not operate during the inference operation, the second coupling circuit 620 may be in any state during the inference operation.
Fig. 8 is a flowchart illustrating a training process 800 of the novelty detector 2000.
First, the first discriminator 510, the first encoder 411, and the decoder 420 are sequentially trained in step S300.
In step S300, the first coupling circuit 610 alternately provides the first pair (x, x) and the second pair (x, G(x)) to the first discriminator 510 during the training operation of the first discriminator 510 in step S302, and provides the second pair (x, G(x)) to the first discriminator 510 during the training operation of the first encoder 411 and the decoder 420 in step S304.
This operation corresponds to the flowchart of fig. 7.
That is, the training operation of the first discriminator 510 in step S302 corresponds to step S100 in fig. 7, and the training operations of the first encoder 411 and the decoder 420 in step S304 correspond to step S200 in fig. 7.
Then, the second discriminator 520, the second encoder 412, and the decoder 420 are sequentially trained in step S400.
In step S400, the second coupling circuit 620 alternately supplies the actual data x and the reconstructed data G(x) to the second discriminator 520 during the training operation of the second discriminator 520 in step S402, and the output SD of the second discriminator is used to train the second discriminator 520; during the training operation of the second encoder 412 and the decoder 420 in step S404, the second coupling circuit 620 supplies the reconstructed data G(x) to the second discriminator 520, and the output SD of the second discriminator is used to train the second encoder 412 and the decoder 420.
This operation corresponds to the flowchart of fig. 3.
That is, the training operation of the second discriminator 520 in step S402 corresponds to step S10 in fig. 3, and the training operation of the second encoder 412 and the decoder 420 in step S404 corresponds to step S20 in fig. 3.
As described above, since the first encoder 411 is trained using the first discriminator 510 according to the present embodiment, the generalization error of the first encoder 411 is relatively smaller than that of the second encoder 412. Since the second encoder 412 is trained using the second discriminator 520 as in the prior art, its generalization error will tend to be larger than that of the first encoder 411.
In the embodiment of fig. 5, the generator 400 uses the outputs of the first encoder 411 and the second encoder 412 together, so that the overall generalization error of the generator 400 is greater than the generalization error of the generator 100 of fig. 4.
Since the novelty detector 2000 of fig. 5 operates using the generator 400 and the first discriminator 510 during the inference operation, the generalization error of the first discriminator 510 may be kept at a level corresponding to that of the discriminator 200 of fig. 4 while the generalization error of the generator 400 is increased beyond that of the generator 100 of fig. 4, thereby improving overall novelty detection performance.
That is, unlike the embodiment of fig. 4, the novelty detector 2000 of fig. 5 allows the generalization error of the generator 400 and that of the discriminator 500 to be set differently, further improving novelty detection performance.
Fig. 9 is a graph showing the effect of the present embodiment.
The graph shows the accuracy of each compared novelty detector when 5 normal samples and 5 abnormal samples are input. The novelty detectors in fig. 9 were trained using the Modified National Institute of Standards and Technology (MNIST) dataset, a dataset of handwritten digits.
The horizontal axis represents the number of training steps performed during the training operation.
As shown in the graph, the present embodiment achieves better overall accuracy once a sufficient number of training steps have been performed (e.g., 600 or more).
Fig. 6 is a block diagram illustrating a novelty detector 3000 in accordance with another embodiment of the present disclosure.
The novelty detector 3000 of fig. 6 differs from the novelty detector 1000 in that it further includes an object detector 30. The embodiment shown in fig. 6 uses the generator 100, the discriminator 200, and the coupling circuit 300 of fig. 4, but embodiments are not limited thereto; for example, another embodiment may instead use the generator 400, the discriminator 500, and the coupling circuit 600 of fig. 5.
The output data of the object detector 30 corresponds to the actual data x.
For example, the object detector 30 may extract feature data from the input image Y.
The object detector 30 may include a pre-trained neural network of a kind well known in the art, so a detailed description is omitted.
Embodiments of the present disclosure may be implemented using electronic circuitry, optical circuitry, one or more processors executing software or firmware stored in non-transitory computer readable media, or a combination thereof. The electronic or optical circuitry may include circuitry configured to perform neural network processing. The one or more processors may include a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), or a combination thereof.
Although various embodiments have been described for purposes of illustration, various changes and modifications may be made.

Claims (17)

1. A novelty detector comprising:
a generator that outputs reconstructed data from actual data; a discriminator that receives the actual data and the reconstructed data, and generates, using both, discrimination data indicating whether the actual data is normal or abnormal; and
a coupling circuit that:
provides the actual data and the reconstructed data to the discriminator simultaneously during an inference operation,
provides a first pair, comprising two copies of the actual data, to the discriminator during a first step of a training operation, and
provides a second pair, comprising the actual data and the reconstructed data, to the discriminator during second and third steps of the training operation.
2. The novelty detector of claim 1, wherein the generator comprises: an encoder that encodes the actual data; and a decoder that outputs the reconstructed data by decoding an output from the encoder.
3. The novelty detector of claim 1, wherein each of the generator and the discriminator comprises a neural network.
4. The novelty detector of claim 3, wherein during the third step of the training operation, weights of the generator are adjusted while weights of the discriminator are fixed.
5. The novelty detector of claim 3, wherein during the first and second steps of the training operation, weights of the discriminator are adjusted while weights of the generator are fixed.
6. The novelty detector of claim 1, further comprising: an object detector that extracts the actual data from received data.
7. The novelty detector of claim 1, wherein the generator comprises:
a first encoder that encodes the actual data;
a second encoder that encodes the actual data; and
a decoder that generates the reconstructed data from an output of the first encoder and an output of the second encoder.
8. The novelty detector of claim 7, further comprising: an arithmetic circuit that combines the outputs of the first encoder and the second encoder and provides the combined output to the decoder.
9. The novelty detector of claim 7, wherein the discriminator comprises:
a first discriminator that receives the actual data and the reconstructed data simultaneously to generate first discrimination data from them, and provides the first discrimination data as the discrimination data; and
a second discriminator that receives the actual data or the reconstructed data and generates second discrimination data from the received data.
10. The novelty detector of claim 9, wherein each of the first encoder, the decoder, and the first discriminator comprises a neural network.
11. The novelty detector of claim 10, wherein the coupling circuit is a first coupling circuit that:
provides the actual data and the reconstructed data simultaneously to the first discriminator during an inference operation;
provides a first pair, comprising two copies of the actual data, to the first discriminator during a first step of a training operation; and
provides a second pair, comprising the actual data and the reconstructed data, to the first discriminator during second and third steps of the training operation.
12. The novelty detector of claim 11, wherein during the third step of the training operation, weights of the first encoder or the decoder are adjusted while weights of the first discriminator are fixed.
13. The novelty detector of claim 12, wherein during the first and second steps of the training operation, weights of the first discriminator are adjusted while weights of the first encoder and the decoder are fixed.
14. The novelty detector of claim 10, wherein each of the second encoder, the decoder, and the second discriminator comprises a neural network.
15. The novelty detector of claim 14, further comprising: a second coupling circuit that:
provides the actual data to the second discriminator during a fourth step of the training operation; and
provides the reconstructed data to the second discriminator during fifth and sixth steps of the training operation.
16. The novelty detector of claim 15, wherein during the sixth step of the training operation, weights of the second encoder or the decoder are adjusted while weights of the second discriminator are fixed.
17. The novelty detector of claim 15, wherein during the fourth and fifth steps of the training operation, weights of the second discriminator are adjusted while weights of the second encoder and the decoder are fixed.
CN202011622881.2A 2020-06-08 2020-12-31 Novelty detector Active CN113837351B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US17/149,627 US20210383253A1 (en) 2020-06-08 2021-01-14 Novelty detector
KR1020210017546A KR20210152369A (en) 2020-06-08 2021-02-08 Novelty detector

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR20200068807 2020-06-08
KR10-2020-0068807 2020-06-08

Publications (2)

Publication Number Publication Date
CN113837351A CN113837351A (en) 2021-12-24
CN113837351B CN113837351B (en) 2024-04-23

Family

ID=78962502

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011622881.2A Active CN113837351B (en) 2020-06-08 2020-12-31 Novelty detector

Country Status (1)

Country Link
CN (1) CN113837351B (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109948117A (en) * 2019-03-13 2019-06-28 Nanjing University of Aeronautics and Astronautics Satellite anomaly detection method based on an adversarial-network autoencoder
CN110147323A (en) * 2019-04-24 2019-08-20 Beijing Baidu Netcom Science and Technology Co., Ltd. Intelligent change inspection method and device based on a generative adversarial network
CN110705376A (en) * 2019-09-11 2020-01-17 Nanjing University of Posts and Telecommunications Abnormal behavior detection method based on a generative adversarial network
CN110781433A (en) * 2019-10-11 2020-02-11 Tencent Technology (Shenzhen) Co., Ltd. Data type determination method and device, storage medium, and electronic device
CN110795585A (en) * 2019-11-12 2020-02-14 Fuzhou University Zero-shot image classification model based on a generative adversarial network, and method thereof

Also Published As

Publication number Publication date
CN113837351A (en) 2021-12-24

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant