CN113837351A - Novelty detector - Google Patents

Novelty detector

Info

Publication number
CN113837351A
Authority
CN
China
Prior art keywords
data
discriminator
detector
encoder
actual data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202011622881.2A
Other languages
Chinese (zh)
Other versions
CN113837351B (en)
Inventor
李中彦
庆宗旻
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Korea Advanced Institute of Science and Technology KAIST
SK Hynix Inc
Original Assignee
Korea Advanced Institute of Science and Technology KAIST
SK Hynix Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Korea Advanced Institute of Science and Technology KAIST, SK Hynix Inc filed Critical Korea Advanced Institute of Science and Technology KAIST
Priority to US17/149,627 (published as US20210383253A1)
Priority to KR1020210017546A (published as KR20210152369A)
Publication of CN113837351A
Application granted
Publication of CN113837351B
Legal status: Active
Anticipated expiration

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/04 Architecture, e.g. interconnection topology
    • G06N3/045 Combinations of networks
    • G06N3/06 Physical realisation, i.e. hardware implementation of neural networks, neurons or parts of neurons
    • G06N3/063 Physical realisation using electronic means
    • G06N3/08 Learning methods

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • General Health & Medical Sciences (AREA)
  • General Physics & Mathematics (AREA)
  • Evolutionary Computation (AREA)
  • Computational Linguistics (AREA)
  • Molecular Biology (AREA)
  • Computing Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Artificial Intelligence (AREA)
  • Neurology (AREA)
  • Maintenance And Management Of Digital Transmission (AREA)
  • Image Analysis (AREA)

Abstract

The present disclosure relates to a novelty detector, comprising: a generator configured to output reconstructed data from actual data; and a discriminator configured to receive the actual data and the reconstructed data and to produce discrimination data indicating whether the actual data is normal or abnormal using the actual data and the reconstructed data.

Description

Novelty detector
Cross Reference to Related Applications
This application claims priority to Korean patent application No. 10-2020-0068807, filed on June 8, 2020, which is hereby incorporated by reference in its entirety.
Technical Field
Various embodiments may relate to a novelty detector, and more particularly, to a novelty detector trained with various samples without labels.
Background
Novelty detection refers to a technique for detecting data that differs from previously known data.
When performing novelty detection using a neural network, the neural network is trained using normal samples. The normal samples may be drawn from a distribution in the feature space of normal samples.
When data is input to the neural network after training is completed, the neural network infers whether the actual data is a normal sample (that is, a sample from the distribution in the feature space of normal samples) or an abnormal sample (that is, a sample not from that distribution).
Fig. 1 is a block diagram showing a conventional novelty detector 1.
The conventional novelty detector 1 includes a generator 10 and a discriminator 20.
Each of the generator 10 and the discriminator 20 has a neural network structure such as a Convolutional Neural Network (CNN).
A neural network having the structure shown in fig. 1 is called a generative adversarial network (GAN).
The generator 10 comprises an encoder 11 and a decoder 12 sequentially coupled as shown in fig. 2; it receives the actual data x and generates the reconstructed data G(x).
The encoder 11 and the decoder 12 may also have a neural network structure such as CNN.
The discriminator 20 receives the reconstructed data G(x) and outputs discrimination data D indicating whether the actual data x is a normal sample or an abnormal sample.
For example, the discrimination data D may be 0 or 1, where 1 indicates that the actual data is determined to be a normal sample and 0 indicates that the actual data is determined to be an abnormal sample.
The conventional novelty detector 1 further includes a coupling circuit 30.
During the training operation, the coupling circuit 30 provides either the actual data x or the reconstructed data G(x) as input to the discriminator 20.
During the inference operation, the coupling circuit 30 provides the reconstructed data G(x) as input to the discriminator 20.
Fig. 3 is a flowchart showing a training operation of the conventional novelty detector 1. The purpose of training the generator 10 is to reconstruct samples that are similar to actual samples, while the purpose of training the discriminator 20 is to distinguish the samples generated by the generator 10 from actual samples.
In the conventional novelty detector 1, the discriminator 20 and the generator 10 are trained alternately.
When the discriminator 20 is trained in step S10, the weights of the neural network of the discriminator 20 are adjusted while the weights of the neural network of the generator 10 are fixed.
At this time, the actual data x and the reconstructed data G(x) are alternately input to the discriminator 20.
The weights of the discriminator 20 are adjusted so that the discrimination data D becomes 1 when the actual data x is input and becomes 0 when the reconstructed data G(x) is input.
When the generator 10 is trained in step S20, the weights of the generator 10 are adjusted while the weights of the discriminator 20 are fixed.
At this time, the reconstructed data G(x) is input to the discriminator 20.
When the reconstructed data G(x) is input, the weights of the generator 10 are adjusted so that the discrimination data D becomes 1 and the mean square error (MSE) between the actual data x and the reconstructed data G(x) approaches 0.
During the training operation, the above two steps may be repeatedly performed.
In training the neural network, normal samples labeled with categories may be used.
When the neural network is trained using labeled normal samples, novelty detection performance can be improved.
However, preparing the many category-labeled normal samples required for industrial use takes considerable time and cost.
For this reason, it is common practice to train with unlabeled samples of various classes. In this case, as the number of categories increases, novelty detection performance decreases rapidly.
Disclosure of Invention
According to an embodiment of the present disclosure, a novelty detector may include: a generator configured to output reconstructed data from actual data; and a discriminator configured to receive the actual data and the reconstructed data and to generate discrimination data indicating whether the actual data is normal or abnormal using the actual data and the reconstructed data.
Drawings
The accompanying figures, in which like reference numerals refer to identical or functionally similar elements throughout the separate views and which together with the detailed description below are incorporated in and form part of the specification, serve to further illustrate embodiments and to explain various principles and advantages of the embodiments, including the claimed novelty detector concept.
Fig. 1 is a block diagram illustrating a conventional novelty detector.
Fig. 2 is a block diagram illustrating a conventional generator.
Fig. 3 is a flowchart illustrating a training operation of a conventional novelty detector.
Fig. 4 is a block diagram illustrating a novelty detector according to an embodiment of the present disclosure.
Fig. 5 is a block diagram illustrating a novelty detector according to another embodiment of the present disclosure.
Fig. 6 is a block diagram illustrating a novelty detector according to another embodiment of the present disclosure.
Fig. 7 is a flow diagram illustrating a training process of the novelty detector of fig. 4 according to an embodiment.
Fig. 8 is a flow diagram illustrating a training process of the novelty detector of fig. 5 according to an embodiment.
Fig. 9 is a graph illustrating the effect of the embodiment of the present disclosure.
Detailed Description
Various embodiments will be described below with reference to the accompanying drawings. These embodiments are provided for the purpose of illustration and other embodiments not explicitly shown or described may also be present. Further, modifications may be made to the embodiments of the present disclosure that will be described in detail below.
Fig. 4 is a block diagram illustrating a novelty detector 1000 according to an embodiment of the present disclosure.
The novelty detector 1000 includes a generator 100 and a discriminator 200.
Each of the generator 100 and the discriminator 200 may include a neural network such as a convolutional neural network (CNN).
The generator 100 receives actual data x and generates reconstructed data G(x). The generator 100 may include an encoder and a decoder such as those shown in fig. 2, where each of the encoder and the decoder may have a neural network structure such as a CNN.
Hereinafter, the actual data x may be referred to as input data x.
Unlike the prior art, the discriminator 200 receives the reconstructed data G(x) and the actual data x simultaneously during the inference process and outputs the discrimination data D.
In the present embodiment, the discrimination data D has a value from 0 to 1, and when the actual data x is a normal sample, the discrimination data D is larger than when the actual data x is an abnormal sample. In an embodiment, a higher value of the discrimination data D may correspond to an increased likelihood or confidence that the actual data x is a normal sample.
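As an illustration, the continuous discrimination data D could be turned into a verdict by thresholding. The 0.5 threshold in the sketch below is an assumption; the description only states that D is larger for normal samples.

```python
def classify(d: float, threshold: float = 0.5) -> str:
    """Map discrimination data D in [0, 1] to a verdict.

    The threshold value is an illustrative assumption; the description
    only states that D is larger for normal samples than for abnormal ones.
    """
    if not 0.0 <= d <= 1.0:
        raise ValueError("discrimination data D must lie in [0, 1]")
    return "normal" if d >= threshold else "abnormal"

print(classify(0.92))  # normal
print(classify(0.13))  # abnormal
```

In practice the threshold would be tuned on validation data rather than fixed at 0.5.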
The novelty detector 1000 may further include a coupling circuit 300.
During the training operation, the coupling circuit 300 provides either a first pair of the actual data x and the actual data x or a second pair of the actual data x and the reconstructed data G(x) as inputs to the discriminator 200.
For example, during the training operation, the first pair (x, x) and the second pair (x, G(x)) may be alternately provided as inputs to the discriminator 200.
During the inference process, the coupling circuit 300 provides both the actual data x and the reconstructed data G(x) as inputs to the discriminator 200.
Fig. 7 is a flowchart illustrating a training operation 700 of the novelty detector 1000.
In the present embodiment, the discriminator 200 and the generator 100 are trained alternately.
During the training of the discriminator 200 in step S100, the weights of the discriminator 200 are adjusted while the weights of the generator 100 are fixed.
During the training of the discriminator 200, the first pair (x, x) and the second pair (x, G(x)) are alternately input to the discriminator 200.
In an embodiment of step S100, the first pair (x, x) is input to the discriminator 200 to generate the discrimination data D in step S102. In step S104, the weights of the discriminator 200 are adjusted so that the discrimination data D tends toward 1 when the first pair (x, x) is input. In step S106, the actual data x is input to the generator 100 to produce the reconstructed data G(x), and the second pair (x, G(x)) is input to the discriminator 200 to produce the discrimination data D. In step S108, the weights of the discriminator 200 are adjusted so that the discrimination data D tends toward 0 when the second pair (x, G(x)) is input. That is, in order to adjust the weights of the discriminator 200, both the first value of the discrimination data D corresponding to the first pair (x, x) and the second value corresponding to the second pair (x, G(x)) are considered.
During the training of the generator 100 in step S200, the weights of the generator 100 are adjusted while the weights of the discriminator 200 are fixed.
In step S202, the actual data x is provided to the generator 100 to generate the reconstructed data G(x), and the actual data x and the reconstructed data G(x) are simultaneously input to the discriminator 200. That is, only the second pair (x, G(x)) is input to the discriminator 200.
In step S204, based on the input actual data x and reconstructed data G(x), the weights of the generator 100 are adjusted so that the discrimination data D tends toward 1. In the present embodiment, the mean square error between the actual data x and the reconstructed data G(x) is not considered when training the generator 100.
In the training operation 700, the discriminator 200 and the generator 100 may be trained by alternately repeating steps S100 and S200.
In the conventional novelty detector 1, the discriminator 20 receives only the reconstructed data G(x) and not the actual data x, and discriminates between normal and abnormal samples.
In contrast, in the novelty detector 1000 according to the present embodiment, the discriminator 200 receives the actual data x and the reconstructed data G(x) at the same time.
The generalization error may be used to represent novelty detection performance when test samples that were not used during the training operation are input during the inference process.
When using the GAN structure, we want the generator to reconstruct normal samples well and abnormal samples poorly. This means that the generator should not generalize well. On the other hand, we want the discriminator to discriminate samples well regardless of whether they are normal or abnormal, which means that the generalization capability of the discriminator should be good.
In a conventional GAN, it is difficult to make the generalization errors of the discriminator and the generator differ: good generalization by the discriminator leads to good generalization by the generator, and vice versa.
However, in an embodiment, the generalization performance of the generator and the discriminator may be set differently by using two discriminators. In the present embodiment, the generalization error of the generator can be regarded as its reconstruction error, and its generalization performance as its reconstruction performance. Generalization performance improves as generalization error becomes smaller.
To improve overall novelty detection performance, the generator 100 should have good reconstruction performance (small reconstruction error) for normal samples and poor reconstruction performance (large reconstruction error) for abnormal samples. As the generalization performance of the discriminator 200 improves, the generalization performance of the generator 100 on abnormal samples also improves, and thus the overall novelty detection performance may be limited.
In the embodiment of fig. 5, the improvement of the generator's generalization performance can be suppressed while the discriminator's generalization performance is improved. Therefore, the generalization performance of the discriminator and that of the generator can be set differently.
Fig. 5 is a block diagram illustrating a novelty detector 2000 according to another embodiment of the present disclosure.
The novelty detector 2000 includes a generator 400 and a discriminator 500.
The generator 400 includes a first encoder 411, a second encoder 412, an arithmetic circuit 430, and a decoder 420.
The first encoder 411 and the second encoder 412 each encode the actual data x.
The first encoder 411 corresponds to an encoder included in the generator 100 of fig. 4, and the second encoder 412 corresponds to an encoder included in the generator 10 of fig. 1.
This will be described in detail when the training operation is explained.
The arithmetic circuit 430 combines the outputs of the first encoder 411 and the second encoder 412. For example, the arithmetic circuit 430 may linearly combine the outputs from the first encoder 411 and the second encoder 412 with normalization.
The decoder 420 decodes the output of the arithmetic circuit 430 and outputs the reconstructed data G(x).
Each of the first encoder 411, the second encoder 412, and the decoder 420 may include a neural network such as CNN.
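A forward pass through generator 400 can be sketched as below. The convex combination with weight `alpha` is an illustrative assumption standing in for the arithmetic circuit 430; the description only says the encoder outputs are combined, for example linearly with normalization.

```python
def generator_400_forward(x, encoder1, encoder2, decoder, alpha=0.5):
    """Forward pass of generator 400 (fig. 5): two encoders, an
    arithmetic circuit combining their outputs, and a decoder.

    The convex combination with weight `alpha` is an illustrative
    assumption; the description only states that the outputs are
    combined, e.g. linearly with normalization.
    """
    z1 = encoder1(x)                    # first encoder 411
    z2 = encoder2(x)                    # second encoder 412
    z = [alpha * a + (1.0 - alpha) * b  # arithmetic circuit 430
         for a, b in zip(z1, z2)]
    return decoder(z)                   # reconstructed data G(x)

# Toy stand-ins: scale-by-2 and scale-by-4 "encoders", identity "decoder".
out = generator_400_forward(
    [1.0, 2.0],
    encoder1=lambda v: [2.0 * e for e in v],
    encoder2=lambda v: [4.0 * e for e in v],
    decoder=lambda z: z,
)
print(out)  # [3.0, 6.0]
```

Setting `alpha` near 1 would weight the first (GAN-trained) encoder 411 more heavily; the description does not specify the mixing weights.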
The discriminator 500 receives the actual data x and the reconstructed data G(x) and outputs the discrimination data D.
For example, the discrimination data D has a value from 0 to 1; when the actual data x is a normal sample, the discrimination data D has a larger value.
The discriminator 500 includes a first discriminator 510 and a second discriminator 520.
The first discriminator 510 is substantially the same as the discriminator 200 of fig. 4, and the second discriminator 520 is substantially the same as the discriminator 20 of fig. 1.
The first discriminator 510 receives the actual data x and the reconstructed data G(x) simultaneously during the inference operation, and outputs the discrimination data D.
The discrimination data D output from the first discriminator 510 may be referred to as first discrimination data.
The discrimination data SD output from the second discriminator 520 is used only in the training operation, and not in the inference operation.
The discrimination data SD output from the second discriminator 520 may be referred to as second discrimination data SD.
Accordingly, the second discriminator 520 may be turned off during the inference operation to reduce power consumption.
The novelty detector 2000 further includes a coupling circuit 600.
The coupling circuit 600 includes a first coupling circuit 610 and a second coupling circuit 620.
The first coupling circuit 610 corresponds to the coupling circuit 300 of fig. 4, and the second coupling circuit 620 corresponds to the coupling circuit 30 of fig. 1.
The first coupling circuit 610 simultaneously provides the actual data x and the reconstructed data G(x) to the first discriminator 510 in the inference operation.
The first coupling circuit 610 alternately provides the first pair (x, x) or the second pair (x, G(x)) to the first discriminator 510 in the training operation of the first discriminator 510, and provides the second pair (x, G(x)) to the first discriminator 510 in the training operation of the first encoder 411 or the decoder 420.
The second coupling circuit 620 alternately provides the actual data x or the reconstructed data G(x) to the second discriminator 520 in the training operation of the second discriminator 520, and provides the reconstructed data G(x) to the second discriminator 520 in the training operation of the second encoder 412 or the decoder 420.
Since the second discriminator 520 does not operate during the inference operation, the second coupling circuit 620 may have an arbitrary state during the inference operation.
Fig. 8 is a flowchart illustrating a training process 800 of the novelty detector 2000.
First, the first discriminator 510, the first encoder 411, and the decoder 420 are trained sequentially in step S300.
In step S300, the first coupling circuit 610 operates to alternately provide the first pair (x, x) and the second pair (x, G(x)) to the first discriminator 510 in the training operation of the first discriminator 510 at step S302, and operates to provide the second pair (x, G(x)) to the first discriminator 510 in the training operation of the first encoder 411 and the decoder 420 at step S304.
This is an operation corresponding to the flowchart of fig. 7.
That is, the training operation of the first discriminator 510 in step S302 corresponds to step S100 in fig. 7, and the training operations of the first encoder 411 and the decoder 420 in step S304 correspond to step S200 in fig. 7.
Then, the second discriminator 520, the second encoder 412, and the decoder 420 are trained sequentially in step S400.
In step S400, the second coupling circuit 620 operates to alternately supply the actual data x and the reconstructed data G(x) to the second discriminator 520, which is trained using its output SD, in the training operation of the second discriminator 520 at step S402; the second coupling circuit 620 then operates to supply the reconstructed data G(x) to the second discriminator 520 in the training operation of the second encoder 412 and the decoder 420, which also uses the output SD, at step S404.
This is an operation corresponding to the flowchart of fig. 3.
That is, the training operation of the second discriminator 520 in step S402 corresponds to step S10 in fig. 3, and the training operations of the second encoder 412 and the decoder 420 in step S404 correspond to step S20 in fig. 3.
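The ordering of training process 800 can be summarized as below. The step bodies are placeholders; only the sequence of steps S302/S304 (the fig.-7-style phase) followed by S402/S404 (the fig.-3-style phase) is taken from the description.

```python
def training_process_800(rounds=1):
    """Ordering of training process 800 (fig. 8): the fig.-7-style phase
    for the first discriminator, first encoder, and decoder (step S300),
    followed by the fig.-3-style phase for the second discriminator,
    second encoder, and decoder (step S400).

    The step bodies are placeholders; only the ordering comes from the
    description above.
    """
    steps = []
    for _ in range(rounds):
        steps.append("S302: train first discriminator on (x, x) and (x, G(x))")
        steps.append("S304: train first encoder and decoder on (x, G(x))")
        steps.append("S402: train second discriminator on x or G(x)")
        steps.append("S404: train second encoder and decoder on G(x)")
    return steps

for step in training_process_800():
    print(step)
```

Because both phases update the shared decoder 420, the two encoders end up with different generalization behavior while feeding one reconstruction path.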
As described above, since the first encoder 411 is trained using the first discriminator 510 according to the present embodiment, the generalization error of the first encoder 411 is relatively smaller than that of the second encoder 412. Since the second encoder 412 is trained using the second discriminator 520 according to the prior art, the generalization error of the second encoder 412 will tend to be larger than the generalization error of the first encoder 411.
In the embodiment of fig. 5, the generator 400 operates by using the outputs of the first and second encoders 411, 412 together such that the overall generalization error of the generator 400 is greater than that of the generator 100 of fig. 4.
Since the novelty detector 2000 of fig. 5 operates using the generator 400 and the first discriminator 510 during the inference operation, the generalization error of the first discriminator 510 can be maintained at a level corresponding to that of the discriminator 200 of fig. 4, while the generalization error of the generator 400 can be increased beyond that of the generator 100 of fig. 4, thereby improving the overall novelty detection performance.
That is, unlike the embodiment of fig. 4, in the novelty detector 2000 of fig. 5 the generalization error of the generator 400 and that of the discriminator 500 may be set differently, further improving novelty detection performance.
Fig. 9 is a graph showing the effect of the present embodiment.
The graph shows the accuracy of each of the compared novelty detectors when 5 normal samples and 5 abnormal samples are input. The novelty detectors in fig. 9 were trained using the Modified National Institute of Standards and Technology (MNIST) dataset, a handwritten digit dataset.
The horizontal axis represents the number of training phases performed during the training operation.
As shown in the graph, when a sufficient number of training phases is performed during the training operation (for example, 600 or more), the present embodiment achieves higher overall accuracy.
Fig. 6 is a block diagram illustrating a novelty detector 3000 according to another embodiment of the present disclosure.
The novelty detector 3000 of fig. 6 differs from the novelty detector 1000 in that the novelty detector 3000 further includes an object detector 30. In the embodiment shown in fig. 6, the generator 100, the discriminator 200, and the coupling circuit 300 of fig. 4 are used, but the embodiment is not limited thereto; for example, in another embodiment, the generator 400, the discriminator 500, and the coupling circuit 600 of fig. 5 may be used instead.
The output data of the object detector 30 corresponds to the actual data x.
For example, the object detector 30 may extract feature data from the input image Y.
The object detector 30 may include a neural network trained in advance, which is well known in the art, and thus a detailed description thereof will be omitted.
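End-to-end inference of detector 3000 can be sketched as below. All callables are hypothetical stand-ins for the networks in the figure; only the data flow (image Y → feature data x → reconstruction G(x) → discrimination data D) follows the description.

```python
def detector_3000_inference(image, object_detector, generator, discriminator):
    """End-to-end inference of novelty detector 3000 (fig. 6).

    A pretrained object detector extracts feature data x from the input
    image Y; the GAN-based stage then scores x. All callables here are
    hypothetical stand-ins for the neural networks in the figure.
    """
    x = object_detector(image)      # feature data serves as actual data x
    gx = generator(x)               # reconstructed data G(x)
    return discriminator(x, gx)     # discrimination data D in [0, 1]

# Toy stand-ins: mean-pool "detector", damping "generator", and a
# discriminator that scores by reconstruction closeness.
d = detector_3000_inference(
    [1.0, 1.0],
    object_detector=lambda img: sum(img) / len(img),
    generator=lambda x: 0.9 * x,
    discriminator=lambda x, gx: 1.0 if abs(x - gx) < 0.2 else 0.0,
)
print(d)  # 1.0
```

The stand-in discriminator only illustrates that D is computed from x and G(x) jointly, as in the embodiments above.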
Embodiments of the disclosure may be implemented using electronic circuitry, optical circuitry, one or more processors running software or firmware stored in a non-transitory computer-readable medium, or a combination thereof. The electronic or optical circuit may include circuitry configured to perform neural network processing. The one or more processors may include a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), or a combination thereof.
Although various embodiments have been described for purposes of illustration, various changes and modifications may be made.

Claims (18)

1. A novelty detector comprising:
a generator for outputting reconstructed data based on the actual data; and
a discriminator receiving the actual data and the reconstructed data and generating discrimination data using the actual data and the reconstructed data, the discrimination data indicating whether the actual data is normal or abnormal.
2. The novelty detector of claim 1, wherein the generator comprises: an encoder for encoding the actual data; and a decoder outputting the reconstructed data by decoding an output from the encoder.
3. The novelty detector of claim 1, wherein each of the generator and the discriminator comprises a neural network.
4. The novelty detector of claim 3, further comprising a coupling circuit that:
during an inference operation, simultaneously provides the actual data and the reconstructed data to the discriminator;
during a first step of a training operation, provides a first pair to the discriminator, the first pair comprising two copies of the actual data; and
during a second step and a third step of the training operation, provides a second pair to the discriminator, the second pair comprising the actual data and the reconstructed data.
5. The novelty detector of claim 4, wherein during the third step of the training operation, weights of the generator are adjusted while weights of the discriminator are fixed.
6. The novelty detector of claim 4, wherein during the first step and the second step of the training operation, weights of the discriminator are adjusted while weights of the generator are fixed.
7. The novelty detector of claim 1, further comprising: an object detector that extracts the actual data from the received data.
8. The novelty detector of claim 1, wherein the generator comprises:
a first encoder for encoding the actual data;
a second encoder for encoding the actual data; and
a decoder to generate the reconstructed data from the output of the first encoder and the output of the second encoder.
9. The novelty detector of claim 8, further comprising: an arithmetic circuit that combines outputs of the first encoder and the second encoder and provides the combined output to the decoder.
10. The novelty detector of claim 8, wherein the discriminator comprises:
a first discriminator that simultaneously receives the actual data and the reconstructed data, generates first discrimination data from the actual data and the reconstructed data, and supplies the first discrimination data as the discrimination data; and
a second discriminator that receives the actual data or the reconstructed data and generates second discrimination data from the received data.
11. The novelty detector of claim 10 wherein each of the first encoder, the decoder, and the first discriminator comprises a neural network.
12. The novelty detector of claim 11, further comprising a first coupling circuit that:
during an inference operation, simultaneously provides the actual data and the reconstructed data to the first discriminator;
during a first step of a training operation, provides a first pair to the first discriminator, the first pair comprising two copies of the actual data; and
during a second step and a third step of the training operation, provides a second pair to the first discriminator, the second pair comprising the actual data and the reconstructed data.
13. The novelty detector of claim 12, wherein during the third step of the training operation, weights of the first encoder or the decoder are adjusted while weights of the first discriminator are fixed.
14. The novelty detector of claim 13, wherein during the first step and the second step of the training operation, weights of the first discriminator are adjusted while weights of the first encoder and the decoder are fixed.
15. The novelty detector of claim 11, wherein each of the second encoder, the decoder, and the second discriminator comprises a neural network.
16. The novelty detector of claim 15, further comprising a second coupling circuit that:
during a fourth step of a training operation, provides the actual data to the second discriminator; and
during a fifth step and a sixth step of the training operation, provides the reconstructed data to the second discriminator.
17. The novelty detector of claim 16, wherein during the sixth step of the training operation, weights of the second encoder or the decoder are adjusted while weights of the second discriminator are fixed.
18. The novelty detector of claim 16, wherein during the fourth step and the fifth step of the training operation, weights of the second discriminator are adjusted while weights of the second encoder and the decoder are fixed.
CN202011622881.2A 2020-06-08 2020-12-31 Novelty detector Active CN113837351B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US17/149,627 US20210383253A1 (en) 2020-06-08 2021-01-14 Novelty detector
KR1020210017546A KR20210152369A (en) 2020-06-08 2021-02-08 Novelty detector

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR20200068807 2020-06-08
KR10-2020-0068807 2020-06-08

Publications (2)

Publication Number Publication Date
CN113837351A true CN113837351A (en) 2021-12-24
CN113837351B CN113837351B (en) 2024-04-23

Family

ID=78962502

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011622881.2A Active CN113837351B (en) 2020-06-08 2020-12-31 Novelty detector

Country Status (1)

Country Link
CN (1) CN113837351B (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109948117A (en) * 2019-03-13 2019-06-28 南京航空航天大学 A satellite anomaly detection method based on an adversarial-network autoencoder
CN110147323A (en) * 2019-04-24 2019-08-20 北京百度网讯科技有限公司 An intelligent change inspection method and device based on a generative adversarial network
CN110705376A (en) * 2019-09-11 2020-01-17 南京邮电大学 Abnormal behavior detection method based on a generative adversarial network
CN110781433A (en) * 2019-10-11 2020-02-11 腾讯科技(深圳)有限公司 Data type determination method and device, storage medium and electronic device
CN110795585A (en) * 2019-11-12 2020-02-14 福州大学 Zero-shot image classification model based on a generative adversarial network, and method thereof

Also Published As

Publication number Publication date
CN113837351B (en) 2024-04-23

Similar Documents

Publication Publication Date Title
Zhang et al. SteganoGAN: High capacity image steganography with GANs
CN114419323B (en) Cross-modal learning and domain self-adaptive RGBD image semantic segmentation method
CN110990595B (en) Cross-domain alignment embedded space zero sample cross-modal retrieval method
CN116563302B (en) Intelligent medical information management system and method thereof
CN112598053A (en) Active significance target detection method based on semi-supervised learning
CN112819848B (en) Matting method, matting device and electronic equipment
CN113837351B (en) Novelty detector
CN112967251A (en) Picture detection method, and training method and device of picture detection model
US20210383253A1 (en) Novelty detector
Jie et al. A fast and efficient network for single image shadow detection
Shreelekshmi et al. Cover image preprocessing for more reliable LSB replacement steganography
US20220004882A1 (en) Learning apparatus, method, program and inference apparatus
CN112820412B (en) User information processing method and device, storage medium and electronic equipment
CN115174178A (en) Semi-supervised network flow abnormity detection method based on generation countermeasure network
Zhao et al. Single image super-resolution reconstruction using multiple dictionaries and improved iterative back-projection
CN114359633A (en) Hyperspectral image clustering method and device, electronic equipment and storage medium
Zhang et al. Outlier deletion based improvement on the StOMP algorithm for sparse solution of large-scale underdetermined problems
CN113205521A (en) Image segmentation method of medical image data
Ahmad et al. Weighted multivariate curve resolution—Alternating least squares based on sample relevance
CN113112464A (en) RGBD (red, green and blue) saliency object detection method and system based on cross-mode alternating current encoder
CN111008276A (en) Complete entity relationship extraction method and device
Feng et al. RankDVQA-mini: Knowledge Distillation-Driven Deep Video Quality Assessment
WO2015125946A1 (en) Magic state generation apparatus, magic state generation method, and quantum gate operation method
CN114978195B (en) Method and system for searching error pattern set related to polar code serial offset list decoding code words
Laszkiewicz et al. Single-Model Attribution via Final-Layer Inversion

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant