CN113627576A - Code scanning information detection method, device, equipment and storage medium - Google Patents

Code scanning information detection method, device, equipment and storage medium

Info

Publication number
CN113627576A
CN113627576A
Authority
CN
China
Prior art keywords
code scanning
sample
discrimination
code
samples
Prior art date
Legal status
Granted
Application number
CN202111168620.2A
Other languages
Chinese (zh)
Other versions
CN113627576B (en)
Inventor
吴志成
张莉
乔延柯
任杰
袁雅云
栾雅理
Current Assignee
Ping An Technology Shenzhen Co Ltd
Original Assignee
Ping An Technology Shenzhen Co Ltd
Priority date
Filing date
Publication date
Application filed by Ping An Technology Shenzhen Co Ltd
Priority to CN202111168620.2A
Publication of CN113627576A
Application granted
Publication of CN113627576B
Legal status: Active

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06K GRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K17/00 Methods or arrangements for effecting co-operative working between equipments covered by two or more of main groups G06K1/00 - G06K15/00, e.g. automatic card files incorporating conveying and reading operations
    • G06K17/0022 Methods or arrangements for effecting co-operative working between equipments covered by two or more of main groups G06K1/00 - G06K15/00, e.g. automatic card files incorporating conveying and reading operations, arrangements or provisions for transferring data to distant stations, e.g. from a sensing device
    • G06K17/0025 Methods or arrangements for effecting co-operative working between equipments covered by two or more of main groups G06K1/00 - G06K15/00, e.g. automatic card files incorporating conveying and reading operations, arrangements or provisions for transferring data to distant stations, e.g. from a sensing device, the arrangement consisting of a wireless interrogation device in combination with a device for optically marking the record carrier
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/21 Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/214 Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N20/00 Machine learning

Abstract

The invention relates to artificial intelligence and provides a code scanning information detection method, device, equipment and storage medium. The method iteratively adjusts a generation learner and a discrimination learner based on code scanning samples to obtain a generation network and a discrimination network. Code scanning samples to be analyzed during iterative adjustment are acquired, and the target code scanning samples of the last round of iterative adjustment are extracted from them. The target code scanning samples are processed by the generation network to obtain generated code scanning samples, and training code scanning samples are extracted from both sets. A discrimination accuracy is generated from the prediction results and labeling results of the training code scanning samples; if it is smaller than a preset threshold, samples to be added are extracted according to the labeling and prediction results, and network parameters are adjusted based on the generation network and the samples to be added to obtain a discrimination model. Information to be detected is processed by the discrimination model to obtain a detection result. The invention can improve the accuracy of the detection result. In addition, the invention relates to blockchain technology: the detection result can be stored in a blockchain.

Description

Code scanning information detection method, device, equipment and storage medium
Technical Field
The invention relates to the technical field of artificial intelligence, and in particular to a code scanning information detection method, device, equipment and storage medium.
Background
With the development of artificial intelligence, code scanning scenarios have gradually increased, and the problem of code scanning fraud has followed. Current code scanning information detection methods generally establish a determination rule for a specific code scanning scenario and then use that rule to detect whether the code scanning information meets the specification; however, the detection accuracy of this approach is low.
Disclosure of Invention
In view of the foregoing, it is desirable to provide a code scanning information detection method, device, equipment and storage medium that can improve the accuracy of detection results.
In one aspect, the present invention provides a method for detecting code scanning information, where the method for detecting code scanning information includes:
acquiring a code scanning sample, wherein the code scanning sample comprises a code scanning image and annotation information corresponding to the code scanning image;
iteratively adjusting a pre-constructed generation learner and a discrimination learner based on the code scanning sample until an iteration stop condition is met to obtain a generation network corresponding to the generation learner and a discrimination network corresponding to the discrimination learner;
acquiring, as a code scanning sample to be analyzed, a sample for which the output result of the discrimination learner during iterative adjustment is a preset result, and extracting a target code scanning sample of the last round of iterative adjustment from the code scanning sample to be analyzed;
processing the target code scanning sample based on the generation network to obtain a generated code scanning sample, and extracting a training code scanning sample from the code scanning sample to be analyzed and the generated code scanning sample, wherein the training code scanning sample comprises code scanning data and a labeling result;
generating a discrimination accuracy according to the labeling result and a prediction result obtained by the discrimination network recognizing the code scanning data;
if the discrimination accuracy is smaller than a preset threshold value, extracting a sample to be added from the training code scanning sample according to the labeling result and the prediction result;
adjusting network parameters in the discrimination network based on the generation network and the sample to be added to obtain a discrimination model;
and when a detection request is received, acquiring information to be detected according to the detection request, and processing the information to be detected based on the discrimination model to obtain a detection result.
According to a preferred embodiment of the present invention, the code scanning samples include real code scanning samples and false code scanning samples, and the iteratively adjusting of a pre-constructed generation learner and discrimination learner based on the code scanning samples until an iteration stop condition is satisfied, to obtain a generation network corresponding to the generation learner and a discrimination network corresponding to the discrimination learner, includes:
in a first round of iterative adjustment, processing the false code scanning samples based on the generation learner to obtain false samples, and determining the false samples and the real code scanning samples as initial training samples, wherein the initial training samples comprise initial training data and data results;
identifying the initial training data based on the discrimination learner to obtain an initial discrimination result;
adjusting the generation parameters in the generation learner according to the data result and the initial discrimination result to obtain an adjusted generation learner, and adjusting the discrimination parameters in the discrimination learner according to the data result and the initial discrimination result to obtain an adjusted discrimination learner;
determining the adjusted generation learner as a generation learner for next iterative adjustment, and determining the adjusted discrimination learner as a discrimination learner for next iterative adjustment;
and determining the initial training samples whose data result is the configuration result as the false code scanning samples for the next round of iterative adjustment, until the iteration stop condition is met, obtaining the generation network and the discrimination network.
According to a preferred embodiment of the present invention, the code scanning samples to be analyzed include the false code scanning samples and the feature code scanning samples, and the extracting of the training code scanning samples from the code scanning samples to be analyzed and the generated code scanning samples includes:
analyzing the generation accuracy of the generation network according to the generated code scanning samples;
determining the generation accuracy as a first extraction proportion of the generated code scanning sample, and determining a difference value between a preset value and the generation accuracy as a second extraction proportion of the code scanning sample to be analyzed;
detecting the number of iterative adjustment rounds the generation learner needed to generate the code scanning samples to be analyzed;
if the number of iterative adjustment rounds is larger than a preset number of rounds, determining the ratio of the second extraction proportion to the preset number of rounds as a third extraction proportion for the false code scanning samples, and determining the difference between the second extraction proportion and the third extraction proportion as a fourth extraction proportion for the feature code scanning samples;
randomly extracting the generated code scanning samples based on the first extraction proportion to obtain first training samples, and randomly extracting the false code scanning samples based on the third extraction proportion to obtain second training samples;
randomly extracting the feature code scanning samples based on the fourth extraction proportion to obtain third training samples;
determining the first training samples, the second training samples, and the third training samples as the training code scanning samples.
According to a preferred embodiment of the present invention, the processing the target code-scanning sample based on the generation network to obtain a generated code-scanning sample includes:
for target code scanning data in each target code scanning sample, calculating the total number of pixels in the target code scanning data;
acquiring synthesis parameters in the generation network;
calculating the product of the synthesis parameters and the total amount of the pixels to obtain the number of real pixels;
acquiring a target pixel from a pixel sample library based on the real pixel quantity;
and randomly replacing the target pixel with the pixel in the target code scanning data to obtain the generated code scanning sample.
According to a preferred embodiment of the present invention, the generating of the discrimination accuracy according to the labeling result and the prediction result obtained by the discrimination network identifying the code scanning data includes:
recognizing the code scanning data based on the discrimination network to obtain the prediction result;
determining a prediction result that is the same as the labeling result as a target result;
counting the identification total of the code scanning data, and counting the target number of the target results;
and calculating the ratio of the target number to the identification total to obtain the discrimination accuracy.
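The accuracy computation described above reduces to the ratio of matching predictions. A minimal Python sketch under stated assumptions (the patent gives no implementation; all function and variable names are illustrative):

```python
def discrimination_accuracy(prediction_results, labeling_results):
    """Ratio of prediction results that match their labeling results."""
    if len(prediction_results) != len(labeling_results):
        raise ValueError("prediction and labeling results must align")
    identification_total = len(prediction_results)  # total identified scan-code data
    # "target results" are the predictions equal to their labeling result
    target_number = sum(p == l for p, l in zip(prediction_results,
                                               labeling_results))
    return target_number / identification_total if identification_total else 0.0
```

This is the quantity compared against the preset threshold in the following step.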
According to a preferred embodiment of the present invention, the extracting of samples to be added from the training code scanning samples according to the labeling result and the prediction result comprises:
comparing the prediction result with the labeling result;
and if the prediction result is not matched with the labeling result, determining the training code scanning sample corresponding to the prediction result as the sample to be added.
According to a preferred embodiment of the present invention, the adjusting of the network parameters in the discrimination network based on the generation network and the sample to be added to obtain a discrimination model includes:
processing the sample to be added based on the generation network to obtain a target sample;
and adjusting the network parameters based on the target sample until the discrimination accuracy of the adjusted discrimination network is greater than or equal to the preset threshold value to obtain the discrimination model.
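The two steps above, augmenting hard samples through the generation network and then adjusting until the threshold is met, can be sketched as follows (a hedged sketch, not the patent's implementation: the update rule, accuracy function, and the `max_rounds` cap are stand-in assumptions):

```python
def fine_tune_discrimination_network(network_params, update_fn, samples_to_add,
                                     generation_network, accuracy_fn,
                                     preset_threshold, max_rounds=50):
    """Process the samples to be added through the generation network to obtain
    target samples, then adjust the discrimination network's parameters until
    its discrimination accuracy reaches the preset threshold.

    max_rounds caps the loop and is an assumption of this sketch.
    """
    target_samples = [generation_network(s) for s in samples_to_add]
    for _ in range(max_rounds):
        if accuracy_fn(network_params) >= preset_threshold:
            break  # accuracy meets the preset threshold: discrimination model ready
        network_params = update_fn(network_params, target_samples)
    return network_params
```

With placeholder callables, e.g. an update that increments an integer parameter and an accuracy function that reads it back, the loop stops exactly when the threshold is reached.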
In another aspect, the present invention further provides a code scanning information detecting apparatus, including:
the system comprises an acquisition unit, a storage unit and a processing unit, wherein the acquisition unit is used for acquiring a code scanning sample, and the code scanning sample comprises a code scanning image and annotation information corresponding to the code scanning image;
the adjusting unit is used for iteratively adjusting a pre-constructed generation learner and a discrimination learner based on the code scanning sample until an iteration stop condition is met, and obtaining a generation network corresponding to the generation learner and a discrimination network corresponding to the discrimination learner;
the extraction unit is used for acquiring, as a code scanning sample to be analyzed, a sample for which the output result of the discrimination learner during iterative adjustment is a preset result, and extracting a target code scanning sample of the last round of iterative adjustment from the code scanning sample to be analyzed;
the extraction unit is used for processing the target code scanning sample based on the generation network to obtain a generated code scanning sample, and extracting the code scanning sample to be analyzed and the generated code scanning sample to obtain a training code scanning sample, wherein the training code scanning sample comprises code scanning data and a labeling result;
the generating unit is used for generating a discrimination accuracy according to the labeling result and a prediction result obtained by the discrimination network identifying the code scanning data;
the extracting unit is further configured to extract a to-be-added sample from the training code-scanning sample according to the labeling result and the prediction result if the discrimination accuracy is smaller than a preset threshold;
the adjusting unit is further configured to adjust network parameters in the discrimination network based on the generation network and the sample to be added to obtain a discrimination model;
and the processing unit is used for acquiring the information to be detected according to the detection request when receiving the detection request, and processing the information to be detected based on the discrimination model to obtain a detection result.
In another aspect, the present invention further provides an electronic device, including:
a memory storing computer readable instructions; and
a processor executing computer readable instructions stored in the memory to implement the code scanning information detection method.
In another aspect, the present invention further provides a computer-readable storage medium, where computer-readable instructions are stored in the computer-readable storage medium, and the computer-readable instructions are executed by a processor in an electronic device to implement the code scanning information detection method.
According to the technical scheme, the training code scanning samples are extracted from both the code scanning samples to be analyzed and the generated code scanning samples, and are analyzed based on the discrimination network. This avoids the problem of the discrimination network, after a given round of iterative adjustment, over-fitting to the specific examples produced by the generation learner, and thus improves the discrimination accuracy of the discrimination network. Further, when the discrimination accuracy is smaller than a preset threshold, the network parameters in the discrimination network are adjusted based on the generation network and the samples to be added, which improves the discrimination model's accuracy on the sample types of the samples to be added and, in turn, the accuracy of the detection result.
Drawings
FIG. 1 is a flowchart illustrating a method for detecting code scanning information according to a preferred embodiment of the present invention.
FIG. 2 is a functional block diagram of a code scanning information detecting device according to a preferred embodiment of the present invention.
Fig. 3 is a schematic structural diagram of an electronic device implementing a code scanning information detection method according to a preferred embodiment of the invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention will be described in detail with reference to the accompanying drawings and specific embodiments.
Fig. 1 is a flow chart of a method for detecting code scanning information according to a preferred embodiment of the present invention. The order of the steps in the flow chart may be changed and some steps may be omitted according to different needs.
The code scanning information detection method can acquire and process related data based on artificial intelligence technology. Artificial Intelligence (AI) is the theory, method, technique and application system that uses a digital computer, or a machine controlled by a digital computer, to simulate, extend and expand human intelligence, perceive the environment, acquire knowledge, and use that knowledge to obtain the best results.
The artificial intelligence infrastructure generally includes technologies such as sensors, dedicated artificial intelligence chips, cloud computing, distributed storage, big data processing technologies, operation/interaction systems, mechatronics, and the like. The artificial intelligence software technology mainly comprises a computer vision technology, a robot technology, a biological recognition technology, a voice processing technology, a natural language processing technology, machine learning/deep learning and the like.
The code scanning information detection method is applied to one or more electronic devices, which are devices capable of automatically performing numerical calculation and/or information processing according to computer readable instructions set or stored in advance. The hardware of the electronic devices includes but is not limited to a microprocessor, an Application-Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA), a Digital Signal Processor (DSP), an embedded device, and the like.
The electronic device may be any electronic product capable of performing human-computer interaction with a user, for example, a Personal computer, a tablet computer, a smart phone, a Personal Digital Assistant (PDA), a game machine, an interactive Internet Protocol Television (IPTV), a smart wearable device, and the like.
The electronic device may include a network device and/or a user device. Wherein the network device includes, but is not limited to, a single network electronic device, an electronic device group consisting of a plurality of network electronic devices, or a Cloud Computing (Cloud Computing) based Cloud consisting of a large number of hosts or network electronic devices.
The network in which the electronic device is located includes, but is not limited to: the internet, a wide area Network, a metropolitan area Network, a local area Network, a Virtual Private Network (VPN), and the like.
S10, acquiring a code scanning sample, wherein the code scanning sample comprises a code scanning image and annotation information corresponding to the code scanning image.
In at least one embodiment of the present invention, the code-scanning sample includes a code-scanning image and annotation information corresponding to the code-scanning image.
The code scanning image can be image information such as a two-dimensional code.
In at least one embodiment of the invention, the electronic device can randomly acquire the code-scanning sample from a sample library.
And S11, iteratively adjusting the pre-constructed generation learner and the discrimination learner based on the code scanning sample until an iteration stop condition is met, and obtaining a generation network corresponding to the generation learner and a discrimination network corresponding to the discrimination learner.
In at least one embodiment of the present invention, the generation learner is generated from a plurality of upsampled convolutional layer stitching constructs, and the discrimination learner is generated from a plurality of downsampled convolutional layers and classification layer constructs.
The iteration stop condition may be that the number of iterations reaches a configured number; according to experimental results, when the configured number is set to 3, the sample generation effect of the generation network and the sample discrimination effect of the discrimination network are optimal.
The generation network is generated after the iterative adjustment processing is carried out on the generation learner, and the judgment network is generated after the iterative adjustment processing is carried out on the judgment learner.
In at least one embodiment of the present invention, the code scanning samples include real code scanning samples and false code scanning samples, the electronic device iteratively adjusts a pre-constructed generation learner and a discrimination learner based on the code scanning samples until an iteration stop condition is satisfied, and obtaining a generation network corresponding to the generation learner and a discrimination network corresponding to the discrimination learner includes:
in a first round of iterative adjustment, processing the false code scanning samples based on the generation learner to obtain false samples, and determining the false samples and the real code scanning samples as initial training samples, wherein the initial training samples comprise initial training data and data results;
identifying the initial training data based on the discrimination learner to obtain an initial discrimination result;
adjusting the generation parameters in the generation learner according to the data result and the initial judgment result to obtain an adjusted generation learner, and adjusting the judgment parameters in the judgment learner according to the data result and the initial judgment result to obtain an adjusted judgment learner;
determining the adjusted generation learner as a generation learner for next iterative adjustment, and determining the adjusted discrimination learner as a discrimination learner for next iterative adjustment;
and determining the initial training sample with the data result as the configuration result as a false code scanning sample for next iteration adjustment until the iteration stop condition is met, and obtaining the generated network and the judgment network.
Wherein, the false samples refer to the samples generated by the generation learner after synthesizing noise onto the false code scanning samples.
The data result comprises a configuration result and a real result. The configuration results typically include a false result. The initial discrimination result is a result obtained by the discrimination learner after recognizing the initial training data.
The generation parameters comprise parameters which are initially set in the generation learner, and the discrimination parameters comprise parameters which are initially set in the discrimination learner.
Iteratively adjusting the generation learner and the discrimination learner improves both the sample generation effect of the generation network and the sample discrimination effect of the discrimination network, which improves the realism of the subsequently generated code scanning samples and avoids repeated re-tuning of the discrimination network, thereby improving the generation efficiency of the discrimination model.
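The round structure described above has the shape of a standard adversarial training loop with a fixed round budget. A minimal Python sketch under stated assumptions (the learners and their adjustment routines are stand-in callables; all names are illustrative, not from the patent):

```python
def iterative_adjustment(generator, discriminator, real_samples, false_samples,
                         adjust_generator, adjust_discriminator, num_rounds=3):
    """Alternately adjust a generation learner and a discrimination learner.

    Each round: the generator turns false code-scanning samples into false
    (pseudo) samples; these plus the real samples form the initial training
    samples; the discriminator identifies them; both learners are adjusted;
    and the samples labeled with the configured (false) result become the
    false samples for the next round. The stop condition here is a configured
    round count (3 in the patent's experiments).
    """
    for _ in range(num_rounds):
        pseudo = [generator(s) for s in false_samples]
        # label 0 = configured (false) result, label 1 = real result
        training = [(p, 0) for p in pseudo] + [(r, 1) for r in real_samples]
        results = [(x, label, discriminator(x)) for x, label in training]
        generator = adjust_generator(generator, results)
        discriminator = adjust_discriminator(discriminator, results)
        false_samples = [x for x, label in training if label == 0]
    return generator, discriminator
```

With identity adjustment routines this simply threads the samples through the configured number of rounds, which makes the data flow of the description easy to trace.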
Specifically, the false code scanning samples include false images and image labels, and the processing, by the electronic device, of the false code scanning samples based on the generation learner to obtain the false samples includes:
acquiring a plurality of convolution layers of the generation learner, and determining the convolution sequence of the plurality of convolution layers in the generation learner;
sequentially carrying out convolution processing on each false image based on the plurality of convolution layers, in ascending order of the convolution sequence, to obtain a pseudo-image;
determining the pseudo-image and the image label as the false sample.
Wherein the plurality of convolutional layers comprise a plurality of noise parameters.
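The layer-ordering step above can be sketched as follows (a hedged sketch: the convolution layers, which would carry the noise parameters, are represented by plain callables tagged with their convolution order; names are illustrative):

```python
def make_false_sample(false_image, image_label, conv_layers):
    """Apply the generator's convolution layers in ascending convolution
    order to a false image, then pair the resulting pseudo-image with the
    original image label to form the false sample.

    conv_layers: list of (order, layer_fn) pairs; each layer_fn stands in
    for a noise-parameterized convolution layer.
    """
    pseudo_image = false_image
    for _, layer in sorted(conv_layers, key=lambda pair: pair[0]):
        pseudo_image = layer(pseudo_image)  # convolve in sequence order
    return pseudo_image, image_label
```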
Specifically, the adjusting, by the electronic device, of the generation parameters in the generation learner according to the data result and the initial discrimination result to obtain an adjusted generation learner includes:
counting the number of data results as a prediction total;
counting the number of data results that are the same as the initial discrimination result to obtain a result quantity;
and adjusting the generation parameters by a preset amplitude until the ratio of the result quantity to the prediction total no longer increases, obtaining the adjusted generation learner.
The preset amplitude is set as required; configuring it avoids adjusting the generation parameters an excessive number of times, which improves the generation efficiency of the generation learner.
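The stopping rule above, step by a preset amplitude until the match ratio stops increasing, is a simple hill climb. A hedged Python sketch (the `match_ratio` callable, which would re-run the discriminator and compute the ratio of result quantity to prediction total, is an assumption of this sketch, as is the safety cap):

```python
def adjust_generation_parameters(param, preset_amplitude, match_ratio,
                                 max_steps=1000):
    """Adjust a generation parameter by a preset amplitude until the ratio of
    matching results (result quantity / prediction total) stops increasing."""
    best = match_ratio(param)
    for _ in range(max_steps):
        candidate = param + preset_amplitude
        ratio = match_ratio(candidate)
        if ratio <= best:  # the ratio no longer increases: stop adjusting
            break
        param, best = candidate, ratio
    return param
```

For a ratio with a single peak, the loop stops one step past the peak and keeps the best parameter found.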
Specifically, the manner in which the electronic device adjusts the discrimination parameters in the discrimination learner according to the data result and the initial discrimination result is similar to the manner in which it adjusts the generation parameters in the generation learner, and is not repeated here.
And S12, acquiring, as a code scanning sample to be analyzed, a sample for which the output result of the discrimination learner during iterative adjustment is a preset result, and extracting a target code scanning sample of the last round of iterative adjustment from the code scanning sample to be analyzed.
In at least one embodiment of the present invention, the preset result may be set as a false result.
The target code scanning sample is a sample generated after the last iteration adjustment is carried out on the discrimination learner.
S13, processing the target code scanning sample based on the generation network to obtain a generated code scanning sample, and extracting the code scanning sample to be analyzed and the generated code scanning sample to obtain a training code scanning sample, wherein the training code scanning sample comprises code scanning data and a labeling result.
In at least one embodiment of the present invention, the generated code-scanning sample refers to a sample obtained by synthesizing the target code-scanning sample through the generation network.
The training code scanning samples comprise partial samples in the code scanning samples to be analyzed and partial samples in the generated code scanning samples.
In at least one embodiment of the present invention, the processing, by the electronic device, the target code-scanning sample based on the generation network to obtain a generated code-scanning sample includes:
for target code scanning data in each target code scanning sample, calculating the total number of pixels in the target code scanning data;
acquiring synthesis parameters in the generation network;
calculating the product of the synthesis parameters and the total amount of the pixels to obtain the number of real pixels;
acquiring a target pixel from a pixel sample library based on the real pixel quantity;
and randomly replacing the target pixel with the pixel in the target code scanning data to obtain the generated code scanning sample.
Wherein, the total number of pixels refers to the total number of all pixels in the target code scanning data.
The synthesis parameter refers to a proportion of real pixels required by the generation network to generate the generated code-scanning sample.
The pixel sample library has a plurality of real pixels stored therein. The plurality of real pixels may be acquired from the real code-scan sample.
The required real pixel quantity can be rapidly determined through the synthesis parameters and the total pixel quantity, and then the generated code scanning sample can be rapidly synthesized.
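The synthesis steps above can be sketched as follows (a minimal sketch, assuming pixels are representable as list elements; the fixed random seed and all names are assumptions for reproducibility, not from the patent):

```python
import random

def synthesize_generated_sample(target_pixels, real_pixel_library,
                                synthesis_parameter, rng=None):
    """Build a generated code-scanning sample by replacing a proportion of
    pixels with real pixels drawn from a pixel sample library.

    real pixel quantity = synthesis parameter * total pixel count, as in the
    description above; the replaced positions are chosen at random.
    """
    rng = rng or random.Random(0)  # fixed seed keeps the sketch reproducible
    total = len(target_pixels)
    real_count = int(synthesis_parameter * total)
    positions = rng.sample(range(total), real_count)  # distinct positions
    generated = list(target_pixels)
    for pos in positions:
        generated[pos] = rng.choice(real_pixel_library)
    return generated
```

With a synthesis parameter of 0.5 and four pixels, exactly two positions are replaced by library pixels; a parameter of 0 leaves the data unchanged.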
In at least one embodiment of the present invention, the code scanning samples to be analyzed include the false code scanning samples and the feature code scanning samples, and the electronic device extracting the training code scanning samples from the code scanning samples to be analyzed and the generated code scanning samples includes:
analyzing the generation accuracy of the generated network according to the generated code scanning sample;
determining the generation accuracy as a first extraction proportion of the generated code scanning sample, and determining a difference value between a preset value and the generation accuracy as a second extraction proportion of the code scanning sample to be analyzed;
detecting the number of iterative adjustment rounds the generation learner required to generate the code scanning samples to be analyzed;
if the number of iterative adjustment rounds is greater than a preset number of rounds, determining the ratio of the second extraction proportion to the preset number of rounds as a third extraction proportion of the false code scanning samples, and determining the difference between the second extraction proportion and the third extraction proportion as a fourth extraction proportion of the feature code scanning samples;
randomly extracting the generated code scanning samples based on the first extraction proportion to obtain first training samples, and randomly extracting the false code scanning samples based on the third extraction proportion to obtain second training samples;
randomly extracting the feature code scanning samples based on the fourth extraction proportion to obtain third training samples;
determining the first training sample, the second training sample, and the third training sample as the training code-scanning sample.
The generation accuracy refers to the proportion of generated code scanning samples that the discrimination learner cannot accurately identify.
The number of iterative adjustment rounds is the total number of rounds the generation learner required to generate the code scanning samples to be analyzed.
The preset number of rounds is set according to actual requirements; according to experimental results, when the preset number of rounds is set to 2, the accuracy of the subsequently generated discrimination model is the highest.
Setting the generation accuracy as the extraction proportion of the generated code scanning samples improves the identification precision of the discrimination model. Meanwhile, taking the ratio of the second extraction proportion to the preset number of rounds as the extraction proportion of the false code scanning samples avoids overfitting of the subsequently generated discrimination model to the feature code scanning samples, because the false code scanning samples are not generated by the generation learner; the accuracy of the subsequently generated discrimination model is thereby improved.
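The four extraction proportions above can be computed as in the following sketch. Here `preset_value` = 1.0 and `preset_rounds` = 2 follow the text; the function and argument names are assumptions made for this illustration.

```python
def extraction_proportions(generation_accuracy, iteration_rounds,
                           preset_value=1.0, preset_rounds=2):
    first = generation_accuracy                    # generated code scanning samples
    second = preset_value - generation_accuracy    # samples to be analyzed
    if iteration_rounds > preset_rounds:
        third = second / preset_rounds             # false code scanning samples
        fourth = second - third                    # feature code scanning samples
        return first, second, third, fourth
    return first, second, None, None

# e.g. generation accuracy 0.6 after 3 rounds of iterative adjustment
first, second, third, fourth = extraction_proportions(0.6, iteration_rounds=3)
print(first, second, third, fourth)
```

Note that the third and fourth proportions sum to the second, so the four extracted subsets together cover the whole training budget.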
S14, generating the discrimination accuracy according to the labeling result and the prediction result obtained by the discrimination network recognizing the code scanning data.
In at least one embodiment of the present invention, the prediction result refers to the result obtained after the discrimination network performs recognition prediction on the code scanning data.
The discrimination accuracy refers to the proportion of samples that the discrimination network identifies correctly.
In at least one embodiment of the present invention, the generating, by the electronic device, the discrimination accuracy according to the labeling result and the prediction result obtained by the discrimination network recognizing the code scanning data includes:
recognizing the code scanning data based on the discrimination network to obtain the prediction result;
determining a prediction result which is the same as the labeling result as a target result;
counting the total identification number of the code scanning data, and counting the target number of the target results;
and calculating the ratio of the target number to the total identification number to obtain the discrimination accuracy.
The target result is the prediction result obtained when the discrimination network identifies the code scanning data correctly.
The total identification number refers to the total amount of code scanning data identified by the discrimination network, and the target number refers to the total amount of code scanning data identified correctly by the discrimination network.
By counting the total identification number and the target number, the discrimination accuracy of the discrimination network can be quickly determined.
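The computation above reduces to the ratio of target results (predictions matching labels) to the total identification number, as in this minimal sketch; the list-based interface is an assumption for illustration.

```python
def discrimination_accuracy(predictions, labeling_results):
    """Ratio of predictions that match their labeling results."""
    target_number = sum(1 for p, y in zip(predictions, labeling_results) if p == y)
    total_identified = len(predictions)            # total identification number
    return target_number / total_identified

preds  = ["real", "false", "real", "real"]
labels = ["real", "false", "false", "real"]
print(discrimination_accuracy(preds, labels))      # 3 of 4 correct -> 0.75
```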
S15, if the discrimination accuracy is smaller than a preset threshold, extracting samples to be added from the training code scanning samples according to the labeling result and the prediction result.
In at least one embodiment of the present invention, the preset threshold is set according to actual requirements, for example, the preset threshold may be 90%.
The samples to be added refer to training code scanning samples that the discrimination network cannot accurately identify.
In at least one embodiment of the present invention, the extracting, by the electronic device, the samples to be added from the training code scanning samples according to the labeling result and the prediction result includes:
comparing the prediction result with the labeling result;
and if the prediction result does not match the labeling result, determining the training code scanning sample corresponding to the prediction result as a sample to be added.
It can be understood that, since the prediction result does not match the labeling result, the discrimination network cannot accurately identify the sample type of the sample to be added.
In this implementation, the prediction result is directly compared with the labeling result, which improves the efficiency of determining the samples to be added.
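The comparison step above can be sketched as a single filter: every training code scanning sample whose prediction result does not match its labeling result is kept as a sample to be added. The parallel-lists interface is an assumption for this illustration.

```python
def extract_samples_to_add(samples, predictions, labeling_results):
    """Keep samples whose prediction disagrees with the labeling result."""
    return [s for s, p, y in zip(samples, predictions, labeling_results) if p != y]

samples = ["qr_a", "qr_b", "qr_c"]
preds   = ["real", "false", "real"]
labels  = ["real", "real", "real"]
print(extract_samples_to_add(samples, preds, labels))  # ['qr_b']
```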
In at least one embodiment of the present invention, if the discrimination accuracy is greater than or equal to the preset threshold, the discrimination network is determined as a discrimination model.
S16, adjusting the network parameters in the discrimination network based on the generation network and the samples to be added to obtain a discrimination model.
In at least one embodiment of the present invention, the network parameters include the parameters initially set in the discrimination network, and the network parameters are the same as the discrimination parameters.
The discrimination model is a discrimination network whose discrimination accuracy is greater than or equal to the preset threshold.
In at least one embodiment of the present invention, the adjusting, by the electronic device, the network parameters in the discrimination network based on the generation network and the samples to be added to obtain the discrimination model includes:
processing the samples to be added based on the generation network to obtain target samples;
and adjusting the network parameters based on the target samples until the discrimination accuracy of the adjusted discrimination network is greater than or equal to the preset threshold, to obtain the discrimination model.
The target samples include the samples to be added and the samples obtained after the generation network processes the samples to be added.
The generation network generates target samples that are similar to the samples to be added, which the discrimination network cannot accurately identify. Adjusting the network parameters based on these generated target samples therefore improves the discrimination accuracy of the discrimination model; meanwhile, updating the sample weights of the training samples in the discrimination model based on the target samples improves the discrimination accuracy further.
Specifically, the manner in which the electronic device processes the samples to be added based on the generation network is the same as the manner in which the electronic device processes the target code scanning samples based on the generation network, and is not described in detail herein.
Specifically, the manner in which the electronic device adjusts the network parameters in the discrimination network based on the target samples is similar to the manner in which the electronic device adjusts the discrimination parameters in the discrimination learner according to the data results and the initial discrimination result, and is not repeated herein.
S17, when a detection request is received, acquiring information to be detected according to the detection request, and processing the information to be detected based on the discrimination model to obtain a detection result.
In at least one embodiment of the present invention, the detection request may be generated based on a trigger by a user who has a detection requirement.
The information to be detected refers to code scanning information that needs to be detected; for example, the information to be detected may be a payment two-dimensional code in an APP.
The detection result refers to the result obtained after the discrimination model identifies the information to be detected. The detection result may be either true or false.
It should be emphasized that, to further ensure the privacy and security of the detection result, the detection result may also be stored in a node of a blockchain.
In at least one embodiment of the present invention, the acquiring, by the electronic device, the to-be-detected information according to the detection request includes:
analyzing the message of the detection request to obtain the data information carried in the message;
acquiring a storage path from the data information;
and acquiring the information to be detected from the storage path.
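The request-parsing steps above can be sketched as follows. This is a hedged illustration: the JSON message layout and the `"storage_path"` key are assumptions, since the text does not fix a concrete message format.

```python
import json

def parse_detection_request(message: str) -> str:
    """Extract the storage path of the information to be detected."""
    data_info = json.loads(message)       # data information carried by the message
    return data_info["storage_path"]      # path from which to read the info

msg = '{"storage_path": "/data/qr/pending.png"}'
print(parse_detection_request(msg))       # /data/qr/pending.png
```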
In at least one embodiment of the present invention, the manner in which the electronic device processes the information to be detected based on the discrimination model is the same as the manner in which the electronic device identifies the code scanning data based on the discrimination network, and is not described in detail herein.
According to the technical scheme above, the training code scanning samples are extracted from both the code scanning samples to be analyzed and the generated code scanning samples, and the training code scanning samples are analyzed based on the discrimination network. This avoids the problem of the discrimination network, after a given round of iterative adjustment, overfitting to the specific examples generated by the generation learner, and improves the discrimination accuracy of the discrimination network. Further, when the discrimination accuracy is smaller than the preset threshold, the network parameters in the discrimination network are further adjusted based on the generation network and the samples to be added, which improves the discrimination accuracy of the discrimination model on the sample types of the samples to be added and thus the accuracy of the detection result.
FIG. 2 is a functional block diagram of a code scanning information detecting device according to a preferred embodiment of the present invention. The code scanning information detection device 11 includes an acquisition unit 110, an adjustment unit 111, an extraction unit 112, an extraction unit 113, a generation unit 114, and a processing unit 115. The module/unit referred to herein is a series of computer readable instruction segments that can be accessed by the processor 13 and perform a fixed function and that are stored in the memory 12. In the present embodiment, the functions of the modules/units will be described in detail in the following embodiments.
The obtaining unit 110 obtains a code scanning sample, where the code scanning sample includes a code scanning image and annotation information corresponding to the code scanning image.
In at least one embodiment of the present invention, the code-scanning sample includes a code-scanning image and annotation information corresponding to the code-scanning image.
The code scanning image can be image information such as a two-dimensional code.
In at least one embodiment of the present invention, the obtaining unit 110 may randomly obtain the code-scanning sample from a sample library.
The adjusting unit 111 iteratively adjusts a pre-constructed generation learner and a discrimination learner based on the code scanning sample until an iteration stop condition is satisfied, and obtains a generation network corresponding to the generation learner and a discrimination network corresponding to the discrimination learner.
In at least one embodiment of the present invention, the generation learner is constructed by stitching together a plurality of upsampling convolutional layers, and the discrimination learner is constructed from a plurality of downsampling convolutional layers and a classification layer.
The iteration stop condition may include the number of iterations reaching a configured number; according to experimental results, when the configured number is set to 3, the sample generation effect of the generation network and the sample discrimination effect of the discrimination network are optimal.
The generation network is generated after the iterative adjustment of the generation learner, and the discrimination network is generated after the iterative adjustment of the discrimination learner.
In at least one embodiment of the present invention, the code scanning samples include real code scanning samples and false code scanning samples, and the iteratively adjusting, by the adjusting unit 111, the pre-constructed generation learner and discrimination learner based on the code scanning samples until the iteration stop condition is satisfied to obtain the generation network corresponding to the generation learner and the discrimination network corresponding to the discrimination learner includes:
in the first round of iterative adjustment, processing the false code scanning samples based on the generation learner to obtain pseudo samples, and determining the pseudo samples and the real code scanning samples as initial training samples, wherein the initial training samples include initial training data and data results;
identifying the initial training data based on the discrimination learner to obtain an initial discrimination result;
adjusting the generation parameters in the generation learner according to the data results and the initial discrimination result to obtain an adjusted generation learner, and adjusting the discrimination parameters in the discrimination learner according to the data results and the initial discrimination result to obtain an adjusted discrimination learner;
determining the adjusted generation learner as the generation learner for the next iterative adjustment, and determining the adjusted discrimination learner as the discrimination learner for the next iterative adjustment;
and determining the initial training samples whose data results are the configuration result as the false code scanning samples for the next iterative adjustment, until the iteration stop condition is satisfied, to obtain the generation network and the discrimination network.
Wherein the pseudo samples refer to the samples generated by the generation learner after synthesizing noise onto the false code scanning samples.
The data result comprises a configuration result and a real result. The configuration results typically include a false result.
The initial discrimination result is a result obtained by the discrimination learner after recognizing the initial training data.
The generation parameters comprise parameters which are initially set in the generation learner, and the discrimination parameters comprise parameters which are initially set in the discrimination learner.
By iteratively adjusting the generation learner and the discrimination learner, the sample generation effect of the generation network and the sample discrimination effect of the discrimination network can both be improved, which improves the counterfeit quality of the subsequently generated code scanning samples and avoids tuning the discrimination network multiple times, thereby improving the generation efficiency of the discrimination model.
Specifically, the false code scanning samples include false images and image labels, and the processing, by the adjusting unit 111, the false code scanning samples based on the generation learner to obtain the pseudo samples includes:
acquiring the plurality of convolution layers of the generation learner, and determining the convolution sequence of the plurality of convolution layers in the generation learner;
sequentially performing convolution processing on the false image based on the plurality of convolution layers, in ascending order of the convolution sequence, to obtain a pseudo image;
determining the pseudo-image and the image label as the pseudo-sample.
Wherein the plurality of convolutional layers comprise a plurality of noise parameters.
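The ordered layer application above can be sketched as follows. This is an illustrative stand-in, not a real network definition: the "layers" are simple noise-injecting functions keyed by their convolution order.

```python
def apply_layers_in_order(false_image, layers):
    """Apply layers in ascending order of their convolution sequence."""
    out = false_image
    for order in sorted(layers):          # smallest convolution order first
        out = layers[order](out)
    return out

layers = {
    2: lambda xs: [x + 1.0 for x in xs],  # applied second
    1: lambda xs: [x * 2.0 for x in xs],  # applied first
}
print(apply_layers_in_order([1.0, 1.0], layers))  # (1 * 2) + 1 = [3.0, 3.0]
```

Sorting by the convolution sequence ensures the result is the same regardless of the order in which the layers were collected.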
Specifically, the adjusting, by the adjusting unit 111, the generation parameters in the generation learner according to the data results and the initial discrimination result to obtain the adjusted generation learner includes:
counting the number of data results to obtain a prediction total;
counting the number of data results that are the same as the initial discrimination result to obtain a result number;
and adjusting the generation parameters by a preset amplitude until the ratio of the result number to the prediction total no longer increases, to obtain the adjusted generation learner.
The preset amplitude is set according to requirements; setting the preset amplitude avoids adjusting the generation parameters too many times and improves the generation efficiency of the generation learner.
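The fixed-amplitude adjustment loop above can be sketched as follows. This is a toy illustration: the quadratic `score` stands in for the ratio of matching results, and the function names are assumptions for the sketch.

```python
def adjust_by_amplitude(param, score, amplitude=0.1, max_steps=100):
    """Step the parameter by a preset amplitude until the score stops rising."""
    best = score(param)
    for _ in range(max_steps):
        candidate = param + amplitude     # one preset-amplitude adjustment
        if score(candidate) <= best:      # ratio no longer increases: stop
            break
        param, best = candidate, score(candidate)
    return param

# the stand-in score peaks at param = 1.0
tuned = adjust_by_amplitude(0.0, lambda p: -(p - 1.0) ** 2)
print(round(tuned, 1))                    # 1.0
```

A larger amplitude means fewer adjustment steps at the cost of a coarser final parameter, which matches the text's point about avoiding repeated adjustment.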
Specifically, the manner in which the adjusting unit 111 adjusts the discrimination parameters in the discrimination learner according to the data results and the initial discrimination result is similar to the manner in which the adjusting unit 111 adjusts the generation parameters in the generation learner according to the data results and the initial discrimination result, and is not repeated herein.
The extracting unit 112 obtains, as the code scanning samples to be analyzed, the samples for which the output result of the discrimination learner during iterative adjustment is a preset result, and extracts the target code scanning samples of the last iterative adjustment from the code scanning samples to be analyzed.
In at least one embodiment of the present invention, the preset result may be set as a false result.
The target code scanning sample is a sample generated after the last iteration adjustment is carried out on the discrimination learner.
The extracting unit 113 processes the target code scanning sample based on the generation network to obtain a generated code scanning sample, and extracts the code scanning sample to be analyzed and the generated code scanning sample to obtain a training code scanning sample, where the training code scanning sample includes code scanning data and a labeling result.
In at least one embodiment of the present invention, the generated code-scanning sample refers to a sample obtained by synthesizing the target code-scanning sample through the generation network.
The training code scanning samples comprise partial samples in the code scanning samples to be analyzed and partial samples in the generated code scanning samples.
In at least one embodiment of the present invention, the extracting unit 113 processes the target code-scanning sample based on the generating network, and obtaining a generated code-scanning sample includes:
for target code scanning data in each target code scanning sample, calculating the total number of pixels in the target code scanning data;
acquiring synthesis parameters in the generation network;
calculating the product of the synthesis parameter and the total number of pixels to obtain the required number of real pixels;
acquiring target pixels from a pixel sample library based on the required number of real pixels;
and randomly replacing pixels in the target code scanning data with the target pixels to obtain the generated code scanning sample.
Wherein the total number of pixels refers to the total number of all pixels in the target code scanning data.
The synthesis parameter refers to the proportion of real pixels the generation network requires to generate the generated code scanning sample.
The pixel sample library has a plurality of real pixels stored therein. The plurality of real pixels may be acquired from the real code-scan sample.
Through the synthesis parameter and the total number of pixels, the required number of real pixels can be rapidly determined, so that the generated code scanning sample can be rapidly synthesized.
In at least one embodiment of the present invention, the code scanning samples to be analyzed include the false code scanning samples and the feature code scanning samples, and the extracting, by the extracting unit 113, the code scanning samples to be analyzed and the generated code scanning samples to obtain the training code scanning samples includes:
analyzing the generation accuracy of the generated network according to the generated code scanning sample;
determining the generation accuracy as a first extraction proportion of the generated code scanning sample, and determining a difference value between a preset value and the generation accuracy as a second extraction proportion of the code scanning sample to be analyzed;
detecting the number of iterative adjustment rounds the generation learner required to generate the code scanning samples to be analyzed;
if the number of iterative adjustment rounds is greater than a preset number of rounds, determining the ratio of the second extraction proportion to the preset number of rounds as a third extraction proportion of the false code scanning samples, and determining the difference between the second extraction proportion and the third extraction proportion as a fourth extraction proportion of the feature code scanning samples;
randomly extracting the generated code scanning samples based on the first extraction proportion to obtain first training samples, and randomly extracting the false code scanning samples based on the third extraction proportion to obtain second training samples;
randomly extracting the feature code scanning samples based on the fourth extraction proportion to obtain third training samples;
determining the first training sample, the second training sample, and the third training sample as the training code-scanning sample.
The generation accuracy refers to the proportion of generated code scanning samples that the discrimination learner cannot accurately identify.
The number of iterative adjustment rounds is the total number of rounds the generation learner required to generate the code scanning samples to be analyzed.
The preset number of rounds is set according to actual requirements; according to experimental results, when the preset number of rounds is set to 2, the accuracy of the subsequently generated discrimination model is the highest.
Setting the generation accuracy as the extraction proportion of the generated code scanning samples improves the identification precision of the discrimination model. Meanwhile, taking the ratio of the second extraction proportion to the preset number of rounds as the extraction proportion of the false code scanning samples avoids overfitting of the subsequently generated discrimination model to the feature code scanning samples, because the false code scanning samples are not generated by the generation learner; the accuracy of the subsequently generated discrimination model is thereby improved.
The generating unit 114 generates the discrimination accuracy according to the labeling result and the prediction result obtained by the discrimination network recognizing the code scanning data.
In at least one embodiment of the present invention, the prediction result refers to the result obtained after the discrimination network performs recognition prediction on the code scanning data.
The discrimination accuracy refers to the proportion of samples that the discrimination network identifies correctly.
In at least one embodiment of the present invention, the generating, by the generating unit 114, the discrimination accuracy according to the labeling result and the prediction result obtained by the discrimination network recognizing the code scanning data includes:
recognizing the code scanning data based on the discrimination network to obtain the prediction result;
determining a prediction result which is the same as the labeling result as a target result;
counting the total identification number of the code scanning data, and counting the target number of the target results;
and calculating the ratio of the target number to the total identification number to obtain the discrimination accuracy.
The target result is the prediction result obtained when the discrimination network identifies the code scanning data correctly.
The total identification number refers to the total amount of code scanning data identified by the discrimination network, and the target number refers to the total amount of code scanning data identified correctly by the discrimination network.
By counting the total identification number and the target number, the discrimination accuracy of the discrimination network can be quickly determined.
If the discrimination accuracy is smaller than a preset threshold, the extracting unit 112 extracts samples to be added from the training code scanning samples according to the labeling result and the prediction result.
In at least one embodiment of the present invention, the preset threshold is set according to actual requirements, for example, the preset threshold may be 90%.
The samples to be added refer to training code scanning samples that the discrimination network cannot accurately identify.
In at least one embodiment of the present invention, the extracting, by the extracting unit 112, the samples to be added from the training code scanning samples according to the labeling result and the prediction result includes:
comparing the prediction result with the labeling result;
and if the prediction result does not match the labeling result, determining the training code scanning sample corresponding to the prediction result as a sample to be added.
It can be understood that, since the prediction result does not match the labeling result, the discrimination network cannot accurately identify the sample type of the sample to be added.
In this implementation, the prediction result is directly compared with the labeling result, which improves the efficiency of determining the samples to be added.
In at least one embodiment of the present invention, if the discrimination accuracy is greater than or equal to the preset threshold, the discrimination network is determined as a discrimination model.
The adjusting unit 111 adjusts the network parameters in the discrimination network based on the generation network and the samples to be added to obtain a discrimination model.
In at least one embodiment of the present invention, the network parameters include the parameters initially set in the discrimination network, and the network parameters are the same as the discrimination parameters.
The discrimination model is a discrimination network whose discrimination accuracy is greater than or equal to the preset threshold.
In at least one embodiment of the present invention, the adjusting, by the adjusting unit 111, the network parameters in the discrimination network based on the generation network and the samples to be added to obtain the discrimination model includes:
processing the samples to be added based on the generation network to obtain target samples;
and adjusting the network parameters based on the target samples until the discrimination accuracy of the adjusted discrimination network is greater than or equal to the preset threshold, to obtain the discrimination model.
The target samples include the samples to be added and the samples obtained after the generation network processes the samples to be added.
The generation network generates target samples that are similar to the samples to be added, which the discrimination network cannot accurately identify. Adjusting the network parameters based on these generated target samples therefore improves the discrimination accuracy of the discrimination model; meanwhile, updating the sample weights of the training samples in the discrimination model based on the target samples improves the discrimination accuracy further.
Specifically, the manner in which the adjusting unit 111 processes the samples to be added based on the generation network is the same as the manner in which the extracting unit 113 processes the target code scanning samples based on the generation network, and is not described in detail herein.
Specifically, the manner in which the adjusting unit 111 adjusts the network parameters in the discrimination network based on the target samples is similar to the manner in which the adjusting unit 111 adjusts the discrimination parameters in the discrimination learner according to the data results and the initial discrimination result, and is not repeated herein.
When a detection request is received, the processing unit 115 acquires the information to be detected according to the detection request, and processes the information to be detected based on the discrimination model to obtain a detection result.
In at least one embodiment of the present invention, the detection request may be generated based on a trigger by a user who has a detection requirement.
The information to be detected refers to code scanning information that needs to be detected; for example, the information to be detected may be a payment two-dimensional code in an APP.
The detection result refers to the result obtained after the discrimination model identifies the information to be detected. The detection result may be either true or false.
It should be emphasized that, to further ensure the privacy and security of the detection result, the detection result may also be stored in a node of a blockchain.
In at least one embodiment of the present invention, the acquiring, by the processing unit 115, the to-be-detected information according to the detection request includes:
analyzing the message of the detection request to obtain the data information carried in the message;
acquiring a storage path from the data information;
and acquiring the information to be detected from the storage path.
In at least one embodiment of the present invention, the manner in which the processing unit 115 processes the information to be detected based on the discrimination model is the same as the manner in which the generating unit 114 identifies the code scanning data based on the discrimination network, and is not described in detail herein.
According to the technical scheme above, the training code scanning samples are extracted from both the code scanning samples to be analyzed and the generated code scanning samples, and the training code scanning samples are analyzed based on the discrimination network. This avoids the problem of the discrimination network, after a given round of iterative adjustment, overfitting to the specific examples generated by the generation learner, and improves the discrimination accuracy of the discrimination network. Further, when the discrimination accuracy is smaller than the preset threshold, the network parameters in the discrimination network are further adjusted based on the generation network and the samples to be added, which improves the discrimination accuracy of the discrimination model on the sample types of the samples to be added and thus the accuracy of the detection result.
Fig. 3 is a schematic structural diagram of an electronic device implementing a method for detecting code scanning information according to a preferred embodiment of the present invention.
In one embodiment of the present invention, the electronic device 1 includes, but is not limited to, a memory 12, a processor 13, and computer readable instructions, such as a code scanning information detection program, stored in the memory 12 and executable on the processor 13.
Those skilled in the art will appreciate that the schematic diagram is only an example of the electronic device 1 and does not constitute a limitation on it: the device may include more or fewer components than shown, certain components may be combined, or different components may be used. For example, the electronic device 1 may further include input/output devices, network access devices, a bus, and the like.
The processor 13 may be a Central Processing Unit (CPU), another general-purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. The processor 13 is the operation core and control center of the electronic device 1; it connects the parts of the whole electronic device 1 through various interfaces and lines, and runs the operating system of the electronic device 1 as well as the installed application programs, program code, and so on.
Illustratively, the computer readable instructions may be partitioned into one or more modules/units that are stored in the memory 12 and executed by the processor 13 to implement the present invention. The one or more modules/units may be a series of computer readable instruction segments capable of performing specific functions, which are used for describing the execution process of the computer readable instructions in the electronic device 1. For example, the computer readable instructions may be divided into an acquisition unit 110, an adjustment unit 111, an extraction unit 112, an extraction unit 113, a generation unit 114, and a processing unit 115.
The memory 12 may be used to store the computer readable instructions and/or modules, and the processor 13 implements the various functions of the electronic device 1 by running or executing the computer readable instructions and/or modules stored in the memory 12 and invoking the data stored in the memory 12. The memory 12 may mainly include a program storage area and a data storage area: the program storage area may store the operating system and the application programs required by at least one function (such as a sound playing function or an image playing function), while the data storage area may store data created according to the use of the electronic device. The memory 12 may include non-volatile and volatile memory, for example: a hard disk, an internal memory, a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) card, a Flash Card, at least one magnetic disk storage device, a flash memory device, or another storage device.
The memory 12 may be an external memory and/or an internal memory of the electronic device 1. Further, the memory 12 may be a memory having a physical form, such as a memory stick, a TF Card (Trans-flash Card), or the like.
The integrated modules/units of the electronic device 1, if implemented in the form of software functional units and sold or used as independent products, may be stored in a computer-readable storage medium. Based on such understanding, all or part of the flow of the methods in the above embodiments may be implemented by computer readable instructions instructing the relevant hardware; the computer readable instructions may be stored in a computer-readable storage medium, and when executed by a processor, may implement the steps of the above method embodiments.
The computer readable instructions include computer readable instruction code, which may be in source code form, object code form, an executable file, some intermediate form, or the like. The computer-readable medium may include: any entity or device capable of carrying the computer readable instruction code, a recording medium, a USB flash drive, a removable hard disk, a magnetic disk, an optical disk, a computer memory, a Read-Only Memory (ROM), or a Random Access Memory (RAM).
The blockchain is a novel application mode of computer technologies such as distributed data storage, point-to-point transmission, consensus mechanisms, and encryption algorithms. A blockchain is essentially a decentralized database: a series of data blocks associated by cryptographic methods, where each data block contains the information of a batch of network transactions and is used to verify the validity (anti-counterfeiting) of that information and to generate the next block. A blockchain may include a blockchain underlying platform, a platform product service layer, an application service layer, and the like.
With reference to fig. 1, the memory 12 in the electronic device 1 stores computer-readable instructions to implement a code scanning information detection method, and the processor 13 can execute the computer-readable instructions to implement:
acquiring a code scanning sample;
iteratively adjusting a pre-constructed generation learner and a discrimination learner based on the code scanning sample until an iteration stop condition is met to obtain a generation network corresponding to the generation learner and a discrimination network corresponding to the discrimination learner;
acquiring, as code scanning samples to be analyzed, samples whose output result from the discrimination learner during iterative adjustment is a preset result, and extracting, from the code scanning samples to be analyzed, a target code scanning sample from the final round of iterative adjustment;
processing the target code scanning sample based on the generation network to obtain a generated code scanning sample, and extracting the code scanning sample to be analyzed and the generated code scanning sample to obtain a training code scanning sample, wherein the training code scanning sample comprises code scanning data and a labeling result;
generating a discrimination accuracy according to the labeling result and a prediction result obtained by the discrimination network recognizing the code scanning data;
if the discrimination accuracy is smaller than a preset threshold value, extracting a sample to be added from the training code scanning sample according to the labeling result and the prediction result;
adjusting network parameters in the discrimination network based on the generation network and the sample to be added to obtain a discrimination model;
and when a detection request is received, acquiring information to be detected according to the detection request, and processing the information to be detected based on the discrimination model to obtain a detection result.
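The accuracy-gated portion of the steps above can be sketched as follows. The `predict` and `adjust` callables are toy stand-ins for the discrimination network and its parameter adjustment, not the networks described here; only the claimed control flow is shown, namely computing the discrimination accuracy, comparing it to the preset threshold, and feeding mismatched samples back in.

```python
def detection_training_step(training_samples, predict, adjust, threshold=0.9):
    """training_samples: (code scanning data, labeling result) pairs.
    Returns the discrimination accuracy and the (possibly adjusted) predictor."""
    predictions = [predict(data) for data, _ in training_samples]
    correct = sum(p == label for p, (_, label) in zip(predictions, training_samples))
    accuracy = correct / len(training_samples)
    if accuracy < threshold:
        # samples to be added: those whose prediction mismatches the label
        to_add = [s for s, p in zip(training_samples, predictions) if s[1] != p]
        predict = adjust(predict, to_add)
    return accuracy, predict
```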
Specifically, the processor 13 may refer to the description of the relevant steps in the embodiment corresponding to fig. 1 for a specific implementation method of the computer readable instructions, which is not described herein again.
In the embodiments provided by the present invention, it should be understood that the disclosed system, apparatus, and method may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative; the division into modules is only one kind of logical functional division, and other division manners may be adopted in actual implementation.
The computer readable storage medium has computer readable instructions stored thereon, wherein the computer readable instructions when executed by the processor 13 are configured to implement the steps of:
acquiring a code scanning sample;
iteratively adjusting a pre-constructed generation learner and a discrimination learner based on the code scanning sample until an iteration stop condition is met to obtain a generation network corresponding to the generation learner and a discrimination network corresponding to the discrimination learner;
acquiring, as code scanning samples to be analyzed, samples whose output result from the discrimination learner during iterative adjustment is a preset result, and extracting, from the code scanning samples to be analyzed, a target code scanning sample from the final round of iterative adjustment;
processing the target code scanning sample based on the generation network to obtain a generated code scanning sample, and extracting the code scanning sample to be analyzed and the generated code scanning sample to obtain a training code scanning sample, wherein the training code scanning sample comprises code scanning data and a labeling result;
generating a discrimination accuracy according to the labeling result and a prediction result obtained by the discrimination network recognizing the code scanning data;
if the discrimination accuracy is smaller than a preset threshold value, extracting a sample to be added from the training code scanning sample according to the labeling result and the prediction result;
adjusting network parameters in the discrimination network based on the generation network and the sample to be added to obtain a discrimination model;
and when a detection request is received, acquiring information to be detected according to the detection request, and processing the information to be detected based on the discrimination model to obtain a detection result.
The modules described as separate parts may or may not be physically separate, and parts displayed as modules may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of the present embodiment.
In addition, functional modules in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, or in a form of hardware plus a software functional module.
The present embodiments are therefore to be considered in all respects as illustrative and not restrictive, the scope of the invention being indicated by the appended claims rather than by the foregoing description, and all changes which come within the meaning and range of equivalency of the claims are therefore intended to be embraced therein. Any reference signs in the claims shall not be construed as limiting the claim concerned.
Furthermore, it is obvious that the word "comprising" does not exclude other elements or steps, and the singular does not exclude the plural. A plurality of units or devices may also be implemented by one unit or device through software or hardware. The terms first, second, and so on are used to denote names and do not indicate any particular order.
Finally, it should be noted that the above embodiments are only intended to illustrate the technical solutions of the present invention, not to limit them. Although the present invention has been described in detail with reference to the preferred embodiments, those skilled in the art should understand that modifications or equivalent substitutions may be made to the technical solutions of the present invention without departing from the spirit and scope of those technical solutions.

Claims (10)

1. A code scanning information detection method is characterized in that the code scanning information detection method comprises the following steps:
acquiring a code scanning sample, wherein the code scanning sample comprises a code scanning image and annotation information corresponding to the code scanning image;
iteratively adjusting a pre-constructed generation learner and a discrimination learner based on the code scanning sample until an iteration stop condition is met to obtain a generation network corresponding to the generation learner and a discrimination network corresponding to the discrimination learner;
acquiring, as code scanning samples to be analyzed, samples whose output result from the discrimination learner during iterative adjustment is a preset result, and extracting, from the code scanning samples to be analyzed, a target code scanning sample from the final round of iterative adjustment;
processing the target code scanning sample based on the generation network to obtain a generated code scanning sample, and extracting the code scanning sample to be analyzed and the generated code scanning sample to obtain a training code scanning sample, wherein the training code scanning sample comprises code scanning data and a labeling result;
generating a discrimination accuracy according to the labeling result and a prediction result obtained by the discrimination network recognizing the code scanning data;
if the discrimination accuracy is smaller than a preset threshold value, extracting a sample to be added from the training code scanning sample according to the labeling result and the prediction result;
adjusting network parameters in the discrimination network based on the generation network and the sample to be added to obtain a discrimination model;
and when a detection request is received, acquiring information to be detected according to the detection request, and processing the information to be detected based on the discrimination model to obtain a detection result.
2. The code scanning information detection method of claim 1, wherein the code scanning samples include real code scanning samples and false code scanning samples, and the iteratively adjusting a pre-constructed generation learner and a discrimination learner based on the code scanning samples until an iteration stop condition is satisfied to obtain a generation network corresponding to the generation learner and a discrimination network corresponding to the discrimination learner includes:
in a first round of iterative adjustment, processing the false code scanning samples based on the generation learner to obtain false samples, and determining the false samples and the real code scanning samples as initial training samples, wherein the initial training samples comprise initial training data and data results;
identifying the initial training data based on the discrimination learner to obtain an initial discrimination result;
adjusting the generation parameters in the generation learner according to the data result and the initial judgment result to obtain an adjusted generation learner, and adjusting the judgment parameters in the judgment learner according to the data result and the initial judgment result to obtain an adjusted judgment learner;
determining the adjusted generation learner as a generation learner for next iterative adjustment, and determining the adjusted discrimination learner as a discrimination learner for next iterative adjustment;
and determining the initial training sample with the data result as the configuration result as a false code scanning sample for next iteration adjustment until the iteration stop condition is met, and obtaining the generated network and the judgment network.
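For illustration, one round of the alternating adjustment described above can be sketched as follows. All four callables are hypothetical stand-ins for the learners and their parameter updates, and the labeling convention (1 for generated data, 0 for real data) is an assumption rather than something the claim fixes.

```python
def adversarial_round(real_samples, false_samples, generate, discriminate,
                      adjust_generator, adjust_discriminator, configured_result=1):
    """One iteration of the alternating adjustment, with toy stand-in callables."""
    # process the false code scanning samples with the generation learner
    fake = [(generate(x), 1) for x in false_samples]   # data result 1 = generated
    real = [(x, 0) for x in real_samples]              # data result 0 = real
    initial_training = fake + real
    # identify the initial training data with the discrimination learner
    initial_results = [(discriminate(d), r) for d, r in initial_training]
    # adjust both learners from (initial discrimination result, data result) pairs
    generate = adjust_generator(generate, initial_results)
    discriminate = adjust_discriminator(discriminate, initial_results)
    # samples whose data result equals the configured result seed the next round
    next_false = [d for d, r in initial_training if r == configured_result]
    return generate, discriminate, next_false
```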
3. The code scanning information detection method as claimed in claim 2, wherein the code scanning samples to be analyzed include the false code scanning samples and feature code scanning samples, and the extracting the code scanning samples to be analyzed and the generated code scanning samples to obtain the training code scanning samples includes:
analyzing the generation accuracy of the generated network according to the generated code scanning sample;
determining the generation accuracy as a first extraction proportion of the generated code scanning sample, and determining a difference value between a preset value and the generation accuracy as a second extraction proportion of the code scanning sample to be analyzed;
detecting the number of iterative adjustment rounds of the code scanning sample to be analyzed generated by the generation learner;
if the number of iterative adjustment rounds is larger than a preset round number, determining the ratio of the second extraction proportion to the preset round number as a third extraction proportion for the false code scanning samples, and determining the difference between the second extraction proportion and the third extraction proportion as a fourth extraction proportion for the feature code scanning samples;
randomly extracting the generated code scanning samples based on the first extraction proportion to obtain first training samples, and randomly extracting the false code scanning samples based on the third extraction proportion to obtain second training samples;
randomly extracting the feature code scanning samples based on the fourth extraction proportion to obtain third training samples;
determining the first training sample, the second training sample, and the third training sample as the training code-scanning sample.
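The arithmetic relating the four extraction proportions can be written out directly. The "preset value" against which the second proportion is computed is assumed here to be 1.0, so that the first two proportions partition the whole; the claim itself leaves that constant unspecified.

```python
def extraction_proportions(generation_accuracy, adjust_rounds, preset_rounds,
                           preset_value=1.0):
    """Returns (first, second, third, fourth) extraction proportions; the last
    two are None when the round count does not exceed the preset round count."""
    first = generation_accuracy                    # generated code scanning samples
    second = preset_value - generation_accuracy    # code scanning samples to analyze
    if adjust_rounds > preset_rounds:
        third = second / preset_rounds             # false code scanning samples
        fourth = second - third                    # feature code scanning samples
        return first, second, third, fourth
    return first, second, None, None
```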
4. The code scanning information detection method of claim 1, wherein the processing the target code scanning sample based on the generation network to obtain a generated code scanning sample comprises:
for target code scanning data in each target code scanning sample, calculating the total number of pixels in the target code scanning data;
acquiring a synthesis parameter in the generation network;
calculating the product of the synthesis parameter and the total number of pixels to obtain the real pixel quantity;
acquiring target pixels from a pixel sample library based on the real pixel quantity;
and randomly replacing pixels in the target code scanning data with the target pixels to obtain the generated code scanning sample.
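A minimal sketch of this pixel-replacement step, treating the target code scanning data as a flat list of pixel values. The seeded random generator and the single-channel pixel representation are illustrative assumptions, not part of the claimed method.

```python
import random

def generate_code_scan_sample(target_data, pixel_library, synthesis_parameter,
                              rng=None):
    """Replace a synthesis-parameter fraction of the pixels in the target
    code scanning data with pixels drawn from the pixel sample library."""
    rng = rng or random.Random(0)                  # seeded for repeatability
    total_pixels = len(target_data)
    # product of the synthesis parameter and the total number of pixels
    real_pixel_count = int(synthesis_parameter * total_pixels)
    positions = rng.sample(range(total_pixels), real_pixel_count)
    generated = list(target_data)
    for pos in positions:
        generated[pos] = rng.choice(pixel_library)  # acquired target pixel
    return generated
```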
5. The code scanning information detection method as claimed in claim 1, wherein the generating a discrimination accuracy according to the labeling result and the prediction result obtained by the discrimination network recognizing the code scanning data comprises:
recognizing the code scanning data based on the discrimination network to obtain the prediction result;
determining each prediction result that is the same as the labeling result as a target result;
counting the total recognition amount of the code scanning data, and counting the target number of target results;
and calculating the ratio of the target number to the total recognition amount to obtain the discrimination accuracy.
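This ratio computation is straightforward to write down; the sketch below assumes the prediction results and labeling results arrive as two aligned lists.

```python
def discrimination_accuracy(prediction_results, labeling_results):
    """The share of prediction results matching the labeling results
    (target results) over the total amount recognized."""
    total_recognized = len(prediction_results)
    target_count = sum(p == l for p, l in zip(prediction_results, labeling_results))
    return target_count / total_recognized
```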
6. The code scanning information detection method according to claim 1, wherein the extracting a sample to be added from the training code scanning samples according to the labeling result and the prediction result comprises:
comparing the prediction result with the labeling result;
and if the prediction result is not matched with the labeling result, determining the training code scanning sample corresponding to the prediction result as the sample to be added.
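A short sketch of this mismatch filter, assuming each training code scanning sample is a (code scanning data, labeling result) pair:

```python
def samples_to_add(training_samples, prediction_results):
    """A training code scanning sample whose prediction result does not
    match its labeling result becomes a sample to be added."""
    return [sample for sample, prediction in zip(training_samples, prediction_results)
            if sample[1] != prediction]  # sample = (data, labeling result)
```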
7. The code scanning information detection method according to claim 1, wherein the adjusting the network parameters in the discrimination network based on the generation network and the sample to be added to obtain a discrimination model comprises:
processing the sample to be added based on the generation network to obtain a target sample;
and adjusting the network parameters based on the target sample until the discrimination accuracy of the adjusted discrimination network is greater than or equal to the preset threshold value to obtain the discrimination model.
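The threshold-driven adjustment loop can be sketched as follows. The `generate`, `accuracy_of`, and `adjust` callables are stand-ins for the generation network, the accuracy computation of claim 5, and the parameter update respectively, and the `max_steps` bound is a safety addition not present in the claim.

```python
def refine_discriminator(predict, to_add, generate, accuracy_of, adjust,
                         threshold=0.9, max_steps=100):
    """Pass the samples to be added through the generation network to get
    target samples, then keep adjusting the discriminator until its
    accuracy on them reaches the preset threshold."""
    target_samples = [(generate(data), label) for data, label in to_add]
    steps = 0
    while accuracy_of(predict, target_samples) < threshold and steps < max_steps:
        predict = adjust(predict, target_samples)
        steps += 1
    return predict
```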
8. A code scanning information detection device, comprising:
the system comprises an acquisition unit, a storage unit and a processing unit, wherein the acquisition unit is used for acquiring a code scanning sample, and the code scanning sample comprises a code scanning image and annotation information corresponding to the code scanning image;
the adjusting unit is used for iteratively adjusting a pre-constructed generation learner and a discrimination learner based on the code scanning sample until an iteration stop condition is met, and obtaining a generation network corresponding to the generation learner and a discrimination network corresponding to the discrimination learner;
the extraction unit is used for acquiring, as code scanning samples to be analyzed, samples whose output result from the discrimination learner during iterative adjustment is a preset result, and for extracting, from the code scanning samples to be analyzed, a target code scanning sample from the last round of iterative adjustment;
the extraction unit is used for processing the target code scanning sample based on the generation network to obtain a generated code scanning sample, and extracting the code scanning sample to be analyzed and the generated code scanning sample to obtain a training code scanning sample, wherein the training code scanning sample comprises code scanning data and a labeling result;
the generating unit is used for generating a discrimination accuracy according to the labeling result and a prediction result obtained by the discrimination network recognizing the code scanning data;
the extracting unit is further configured to extract a to-be-added sample from the training code-scanning sample according to the labeling result and the prediction result if the discrimination accuracy is smaller than a preset threshold;
the adjusting unit is further configured to adjust network parameters in the discrimination network based on the generation network and the sample to be added to obtain a discrimination model;
and the processing unit is used for acquiring the information to be detected according to the detection request when receiving the detection request, and processing the information to be detected based on the discrimination model to obtain a detection result.
9. An electronic device, characterized in that the electronic device comprises:
a memory storing computer readable instructions; and
a processor executing computer readable instructions stored in the memory to implement the code scanning information detection method of any one of claims 1 to 7.
10. A computer-readable storage medium characterized by: the computer-readable storage medium stores computer-readable instructions, which are executed by a processor in an electronic device to implement the code scanning information detection method according to any one of claims 1 to 7.
CN202111168620.2A 2021-10-08 2021-10-08 Code scanning information detection method, device, equipment and storage medium Active CN113627576B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111168620.2A CN113627576B (en) 2021-10-08 2021-10-08 Code scanning information detection method, device, equipment and storage medium


Publications (2)

Publication Number Publication Date
CN113627576A true CN113627576A (en) 2021-11-09
CN113627576B CN113627576B (en) 2022-01-18

Family

ID=78390694

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111168620.2A Active CN113627576B (en) 2021-10-08 2021-10-08 Code scanning information detection method, device, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN113627576B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114330385A (en) * 2021-12-28 2022-04-12 福建新大陆支付技术有限公司 Test method of missing one-dimensional bar code and computer readable medium

Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107016406A (en) * 2017-02-24 2017-08-04 中国科学院合肥物质科学研究院 The pest and disease damage image generating method of network is resisted based on production
CN109903242A (en) * 2019-02-01 2019-06-18 深兰科技(上海)有限公司 A kind of image generating method and device
CN110598765A (en) * 2019-08-28 2019-12-20 腾讯科技(深圳)有限公司 Sample generation method and device, computer equipment and storage medium
CN110751004A (en) * 2019-10-25 2020-02-04 北京达佳互联信息技术有限公司 Two-dimensional code detection method, device, equipment and storage medium
CN110874542A (en) * 2018-08-31 2020-03-10 北京意锐新创科技有限公司 Method and device suitable for reading bar codes displayed by different carriers
CN111444951A (en) * 2020-03-24 2020-07-24 腾讯科技(深圳)有限公司 Method and device for generating sample identification model, computer equipment and storage medium
US10733733B1 (en) * 2019-04-19 2020-08-04 Lunit Inc. Method for detecting anomaly using generative adversarial networks, apparatus and system thereof
CN111709408A (en) * 2020-08-18 2020-09-25 腾讯科技(深圳)有限公司 Image authenticity detection method and device
AU2020102667A4 (en) * 2020-10-11 2021-01-14 George, Tony DR Adversarial training for large scale healthcare data using machine learning system
CN112529109A (en) * 2020-12-29 2021-03-19 四川长虹电器股份有限公司 Unsupervised multi-model-based anomaly detection method and system
US20210209415A1 (en) * 2020-01-03 2021-07-08 Mayachitra, Inc. Detecting digital image manipulations
CN113435522A (en) * 2021-06-30 2021-09-24 平安科技(深圳)有限公司 Image classification method, device, equipment and storage medium


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Samaksh Agarwal et al.: "A Novel Neural Model based Framework for Detection of GAN Generated Fake Images", 2021 11th International Conference on Cloud Computing, Data Science & Engineering (Confluence) *
Cao Yangjie et al.: "A Survey of Generative Adversarial Networks and Their Computer Vision Applications", Journal of Image and Graphics *


Also Published As

Publication number Publication date
CN113627576B (en) 2022-01-18

Similar Documents

Publication Publication Date Title
CN106845440B (en) Augmented reality image processing method and system
CN107958230B (en) Facial expression recognition method and device
CN113435522A (en) Image classification method, device, equipment and storage medium
CN113449725B (en) Object classification method, device, equipment and storage medium
CN112668453B (en) Video identification method and related equipment
CN112232203B (en) Pedestrian recognition method and device, electronic equipment and storage medium
CN113870395A (en) Animation video generation method, device, equipment and storage medium
CN114090794A (en) Event map construction method based on artificial intelligence and related equipment
CN111783593A (en) Human face recognition method and device based on artificial intelligence, electronic equipment and medium
CN113627576B (en) Code scanning information detection method, device, equipment and storage medium
CN113705468A (en) Digital image identification method based on artificial intelligence and related equipment
CN113033305B (en) Living body detection method, living body detection device, terminal equipment and storage medium
CN112712005B (en) Training method of recognition model, target recognition method and terminal equipment
CN113918467A (en) Financial system testing method, device, equipment and storage medium
CN113378852A (en) Key point detection method and device, electronic equipment and storage medium
CN113850632B (en) User category determination method, device, equipment and storage medium
CN112949305B (en) Negative feedback information acquisition method, device, equipment and storage medium
CN113627186B (en) Entity relation detection method based on artificial intelligence and related equipment
CN111476775B (en) DR symptom identification device and method
CN114581177A (en) Product recommendation method, device, equipment and storage medium
CN113420545A (en) Abstract generation method, device, equipment and storage medium
CN113516205A (en) Data classification method, device, equipment and storage medium based on artificial intelligence
CN113486848A (en) Document table identification method, device, equipment and storage medium
CN113762031A (en) Image identification method, device, equipment and storage medium
CN113283421B (en) Information identification method, device, equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant