CN110647965A - Method for converting artistic two-dimensional code into conventional two-dimensional code - Google Patents
- Publication number: CN110647965A
- Application number: CN201910728382.2A
- Authority: CN (China)
- Prior art keywords: dimensional code, image, artistic, network, converting
- Prior art date: 2019-08-08
- Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- G—PHYSICS > G06—COMPUTING; CALCULATING OR COUNTING > G06K—GRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS > G06K19/00—Record carriers for use with machines and with at least a part designed to carry digital markings > G06K19/06—characterised by the kind of the digital marking, e.g. shape, nature, code > G06K19/06009—with optically detectable marking > G06K19/06046—Constructional details
- G—PHYSICS > G06—COMPUTING; CALCULATING OR COUNTING > G06K—GRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS > G06K19/00 > G06K19/06 > G06K19/06009—with optically detectable marking > G06K19/06037—multi-dimensional coding
- G—PHYSICS > G06—COMPUTING; CALCULATING OR COUNTING > G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS > G06N3/00—Computing arrangements based on biological models > G06N3/02—Neural networks > G06N3/04—Architecture, e.g. interconnection topology > G06N3/045—Combinations of networks
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Biophysics (AREA)
- Evolutionary Computation (AREA)
- Artificial Intelligence (AREA)
- Biomedical Technology (AREA)
- Health & Medical Sciences (AREA)
- Computational Linguistics (AREA)
- Data Mining & Analysis (AREA)
- Life Sciences & Earth Sciences (AREA)
- General Health & Medical Sciences (AREA)
- Molecular Biology (AREA)
- Computing Systems (AREA)
- General Engineering & Computer Science (AREA)
- Mathematical Physics (AREA)
- Software Systems (AREA)
- Image Processing (AREA)
Abstract
In view of the fact that the prior art offers no method for converting an artistic two-dimensional code into a conventional two-dimensional code, the invention provides such a conversion method based on the DualGAN convolutional neural network, comprising the following steps: S1, performing a gray-level transformation on the artistic two-dimensional code to be converted to obtain a gray-level image; S2, training a DualGAN network to obtain a weight file; and S3, feeding the gray-level image obtained in step S1 into the DualGAN network loaded with the weight file obtained in step S2 to obtain the resulting conventional two-dimensional code image. Compared with the prior art, the method is convenient and fast, can recover two-dimensional codes of various deformation types, and achieves high accuracy.
Description
Technical Field
The invention relates to the field of two-dimensional code conversion, and in particular to a method, based on a deep-learning convolutional neural network combined with the dual-learning DualGAN network, for converting complex and varied artistic two-dimensional codes into conventional two-dimensional codes so that they are easy to identify.
Background
With the development of automatic identification technology and the popularization of smartphones equipped with high-definition cameras, two-dimensional codes have developed rapidly. In advertisements and flyers, merchants use colored and animated artistic two-dimensional codes to attract attention, and artistic two-dimensional codes are also popular in daily life. However, an artistic two-dimensional code contains varied backgrounds and deformations of the code pattern, which add complexity and make identification difficult; for deformed artistic two-dimensional codes, current technology cannot recover the original code, so the artistic two-dimensional code needs to be converted into a conventional two-dimensional code. The prior art mainly obtains the gray values of the two-dimensional code image and then performs binarization, for example:
In patent document CN106709393A, an integral image is computed from the gray levels of the image; dark regions and bright regions are distinguished according to whether the integral image in each region satisfies an equation, and a binarization threshold is obtained, so that the two-dimensional code image can be binarized.
In patent document CN109558927A, the original two-dimensional code array of a two-dimensional code is generated, and pictures of the constituent element units of an artistic two-dimensional code are set and stored; the element-unit pictures are binarized to obtain the two-dimensional code array pattern of each picture, establishing a mapping between element-unit pictures and array patterns; the array patterns in the original two-dimensional code array are then replaced with the corresponding element-unit pictures according to this mapping to form the artistic two-dimensional code.
In the paper "An Aesthetic QR Code Solution Based on Error Correction Mechanism", an error correction mechanism, combined with the structural characteristics of the two-dimensional code and saliency features, is used to generate an artistic two-dimensional code while ensuring that the information in the code can still be read accurately.
In short, the prior art has no method for converting an artistic two-dimensional code into a conventional two-dimensional code: existing techniques only binarize blurred or unevenly lit two-dimensional codes to recover a conventional code, and research on artistic two-dimensional codes has so far concentrated on their generation rather than on converting them back, which hinders deformation recovery and further study of two-dimensional codes.
Disclosure of Invention
In view of the fact that no method for converting an artistic two-dimensional code into a conventional two-dimensional code exists in the prior art, the invention provides such a conversion method based on the DualGAN convolutional neural network.
To achieve this purpose, the invention adopts the following scheme: a method for converting an artistic two-dimensional code into a conventional two-dimensional code, characterized by comprising the following steps:
S1, performing a gray-level transformation on the artistic two-dimensional code to be converted to obtain a gray-level image;
S2, training a DualGAN network to obtain a weight file;
S3, feeding the gray-level image obtained in step S1 into the DualGAN network loaded with the weight file obtained in step S2 to obtain the resulting conventional two-dimensional code image;
the specific operation steps of the step S2 are as follows:
s201, in a Dual GAN network, a generator adopts a U-net network, and the loss function of the U-net network is as follows:
lg(u,v)=λU||u-GB(GA(u,z),z')||+λV||v-GA(GB(v,z'),z)||-DB(GB(v,z'))-DA(GA(u,z));
wherein U ∈ U, V ∈ V, U, V are two unlabeled, different image sets, GA,GBRespectively two generators, DA,DBAre respectively two discriminators, λU,λVIs a constant coefficient; z, z' are random noise;
S202. In the DualGAN network, the discriminators D_A and D_B use PatchGAN, with the loss functions (the standard DualGAN critic losses):
l_dA(u, v) = D_A(G_A(u, z)) - D_A(v); l_dB(u, v) = D_B(G_B(v, z')) - D_B(u);
where u ∈ U and v ∈ V, G_A and G_B are the two generators, and D_A and D_B are the two discriminators. Define ω_A and ω_B as the parameter sets of discriminators D_A and D_B, and θ_A and θ_B as the parameter sets of the generators.
S203. During training, the network is iterated continuously and ω_A, ω_B, θ_A and θ_B are updated so as to minimize the two loss functions in S201 and S202, reaching a balanced optimum between the generators and discriminators and yielding the final weight file.
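To make the generator loss of S201 concrete, a minimal PyTorch-style sketch is given below. It is an illustration only: the tensor shapes, the way the noise z and z' is injected (via dropout inside the generators, as in the DualGAN paper), and the default coefficient values are assumptions, not details fixed by the patent.

```python
import torch

def generator_loss(u, v, G_A, G_B, D_A, D_B, lambda_U=1.0, lambda_V=1.0):
    """Sketch of the generator loss l_g(u, v) from step S201.

    u: batch of artistic QR-code images (domain U)
    v: batch of conventional QR-code images (domain V)
    G_A: U -> V generator, G_B: V -> U generator (U-Net)
    D_A, D_B: PatchGAN critics scoring the outputs of G_A and G_B
    lambda_U, lambda_V: constant coefficients (placeholder values)
    Noise z, z' is assumed to enter through dropout inside the generators,
    so it is not passed explicitly here.
    """
    fake_v = G_A(u)        # G_A(u, z)
    rec_u = G_B(fake_v)    # G_B(G_A(u, z), z')
    fake_u = G_B(v)        # G_B(v, z')
    rec_v = G_A(fake_u)    # G_A(G_B(v, z'), z)

    # forward and backward reconstruction (content) terms, L1 norm
    recon = (lambda_U * (u - rec_u).abs().mean()
             + lambda_V * (v - rec_v).abs().mean())
    # WGAN-style adversarial terms: generators try to raise the critic scores
    adv = -D_B(fake_u).mean() - D_A(fake_v).mean()
    return recon + adv
```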
The gray-level image in step S1 is obtained as follows: the image is converted into a gray-level image by the formula Gray = (R*30 + G*59 + B*11)/100, where R, G and B are respectively the red, green and blue component values of each pixel of the artistic two-dimensional code picture to be converted.
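As an illustration of this integer-weight formula, a short NumPy sketch follows; the function name and the use of NumPy are choices made here for illustration, not part of the patent.

```python
import numpy as np

def to_gray(rgb_image: np.ndarray) -> np.ndarray:
    """Convert an H x W x 3 RGB artistic QR-code image to grayscale
    using Gray = (R*30 + G*59 + B*11) / 100 from step S1."""
    rgb = rgb_image.astype(np.int32)
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    gray = (r * 30 + g * 59 + b * 11) // 100
    return gray.astype(np.uint8)
```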
In step S3, the weight file obtained in step S203 is used to convert the artistic two-dimensional code into a conventional two-dimensional code, specifically:
S301. Load the weight file obtained in step S203 into the DualGAN network and feed in the gray-level image of the artistic two-dimensional code to be converted (obtained in step S1) to obtain a conventional two-dimensional code image;
S302. Process the conventional two-dimensional code image obtained in S301 as required.
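A minimal sketch of step S301 is shown below, assuming a PyTorch implementation. The generator class name UNetGenerator, the weight-file name, and the 0-1 normalization are illustrative assumptions; the patent itself only specifies that the trained weight file is loaded and the gray-level image is fed through the network.

```python
import numpy as np
import torch
from PIL import Image

def convert_artistic_qr(gray_image: np.ndarray, weight_path: str = "dualgan_GA.pth") -> Image.Image:
    """Run the gray-level image from S1 through the trained artistic->conventional
    generator using the weight file from S203 (step S301)."""
    G_A = UNetGenerator()  # hypothetical U-Net generator class, assumed to exist
    G_A.load_state_dict(torch.load(weight_path, map_location="cpu"))
    G_A.eval()

    x = torch.from_numpy(gray_image).float().div(255.0)  # scale to [0, 1]
    x = x.unsqueeze(0).unsqueeze(0)                      # 1 x 1 x H x W batch
    with torch.no_grad():
        y = G_A(x)
    out = (y.squeeze().clamp(0, 1) * 255).byte().numpy()
    return Image.fromarray(out)                          # conventional QR-code image
```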
Advantageous effects: compared with the prior art, the method for converting an artistic two-dimensional code into a conventional two-dimensional code is convenient and fast, can recover two-dimensional codes of various deformation types, and achieves high accuracy.
Drawings
FIG. 1 is a flow chart of the present invention.
Fig. 2 is an effect diagram of the present invention.
It is to be understood that in fig. 1, ω_A and ω_B are the parameter sets of discriminators D_A and D_B, and θ_A and θ_B are the parameter sets of the generators.
It is also to be understood that the two-dimensional code shown before the gray-level transformation in fig. 1 is a colorful artistic two-dimensional code containing multiple colors such as red, green and blue.
Detailed Description
The technical solution in the embodiments of the present invention is described clearly and completely below with reference to the accompanying drawings.
The method for converting an artistic two-dimensional code into a conventional two-dimensional code comprises the following steps:
1) preprocessing the artistic two-dimensional code picture;
2) training the DualGAN network to obtain a weight file;
3) loading the trained weight file to convert the artistic two-dimensional code into a conventional two-dimensional code, and performing subsequent processing.
Specifically, the gray-level transformation of the artistic two-dimensional code picture in step 1) comprises the following steps:
Step 11) preprocess artistic two-dimensional codes suffering from uneven brightness, blur and similar defects, including adjustment of brightness and contrast (a sketch of this preprocessing is given after step 12);
Step 12) represent the colors of the artistic two-dimensional code image processed in step 11) by their R, G, B component values and apply:
Gray = (R*30 + G*59 + B*11)/100;
this calculation converts the image into a gray-level image.
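A brief Pillow sketch of the brightness/contrast preprocessing of step 11) follows; the enhancement factors are arbitrary illustrative values, not values prescribed by the patent.

```python
from PIL import Image, ImageEnhance

def preprocess_artistic_qr(path: str, brightness: float = 1.2, contrast: float = 1.3) -> Image.Image:
    """Adjust brightness and contrast of a blurry or unevenly lit artistic
    QR-code picture before the gray-level transform of step 12)."""
    img = Image.open(path).convert("RGB")
    img = ImageEnhance.Brightness(img).enhance(brightness)
    img = ImageEnhance.Contrast(img).enhance(contrast)
    return img
```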
For step 2), training the DualGAN network to obtain the weight file specifically comprises the following steps:
Step 21) In the DualGAN network, the generators use a U-Net architecture, and the generator loss function is:
l_g(u, v) = λ_U·||u - G_B(G_A(u, z), z')|| + λ_V·||v - G_A(G_B(v, z'), z)|| - D_B(G_B(v, z')) - D_A(G_A(u, z));
where u ∈ U and v ∈ V, G_A and G_B are the two generators, D_A and D_B are the two discriminators, and λ_U and λ_V are constant coefficients.
Step 22) In the DualGAN network, the discriminators D_A and D_B use PatchGAN, with the loss functions (the standard DualGAN critic losses):
l_dA(u, v) = D_A(G_A(u, z)) - D_A(v); l_dB(u, v) = D_B(G_B(v, z')) - D_B(u);
where u ∈ U and v ∈ V, G_A and G_B are the two generators, and D_A and D_B are the two discriminators.
Step 23) Iterate the network continuously, updating the parameter sets so as to minimize the two loss functions in steps 21) and 22) and reach a balanced optimum between the generators and discriminators, thereby obtaining the final weight file.
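To make step 23) concrete, a schematic PyTorch training loop is sketched below. The optimizer choice (RMSprop), weight clipping, learning rate, number of critic steps and coefficient values follow common WGAN/DualGAN practice and are assumptions here, not parameters fixed by the patent; the data loaders are assumed to yield batches of artistic (U) and conventional (V) QR-code images.

```python
import torch

def train_dualgan(loader_U, loader_V, G_A, G_B, D_A, D_B,
                  epochs=100, n_critic=4, clip=0.01, lr=5e-5,
                  lam_U=1.0, lam_V=1.0):
    """Alternately update the discriminator parameters (omega_A, omega_B) and
    the generator parameters (theta_A, theta_B), then save the weight file."""
    opt_D = torch.optim.RMSprop(list(D_A.parameters()) + list(D_B.parameters()), lr=lr)
    opt_G = torch.optim.RMSprop(list(G_A.parameters()) + list(G_B.parameters()), lr=lr)

    for _ in range(epochs):
        for u, v in zip(loader_U, loader_V):
            # --- critic updates (omega_A, omega_B): minimize D(fake) - D(real) ---
            for _ in range(n_critic):
                d_loss = (D_A(G_A(u).detach()).mean() - D_A(v).mean()
                          + D_B(G_B(v).detach()).mean() - D_B(u).mean())
                opt_D.zero_grad()
                d_loss.backward()
                opt_D.step()
                for p in list(D_A.parameters()) + list(D_B.parameters()):
                    p.data.clamp_(-clip, clip)  # WGAN weight clipping

            # --- generator updates (theta_A, theta_B): loss of step 21) ---
            fake_v, fake_u = G_A(u), G_B(v)
            g_loss = (lam_U * (u - G_B(fake_v)).abs().mean()
                      + lam_V * (v - G_A(fake_u)).abs().mean()
                      - D_B(fake_u).mean() - D_A(fake_v).mean())
            opt_G.zero_grad()
            g_loss.backward()
            opt_G.step()

    torch.save({"G_A": G_A.state_dict(), "G_B": G_B.state_dict()}, "dualgan_weights.pth")
```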
For step 3), the trained weight file is used to convert the artistic two-dimensional code into a conventional two-dimensional code, followed by subsequent processing. Specifically:
Step 31) load the weight file obtained in step 23) into the network and feed in the artistic two-dimensional code image (the gray-level image from step 1)) to obtain the resulting conventional two-dimensional code image;
Step 32) perform processing such as scaling and selection on the conventional two-dimensional code image obtained in step 31) according to the corresponding requirements.
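As one possible illustration of step 32), the generated image can be rescaled and snapped back to pure black and white; the OpenCV calls, output size and threshold below are illustrative assumptions rather than operations mandated by the patent.

```python
import cv2

def postprocess_qr(qr_image, size=(512, 512), threshold=128):
    """Scale the generated conventional QR-code image (a grayscale array) and
    re-binarize it so that standard readers can decode it (step 32)."""
    resized = cv2.resize(qr_image, size, interpolation=cv2.INTER_NEAREST)
    _, binary = cv2.threshold(resized, threshold, 255, cv2.THRESH_BINARY)
    return binary
```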
The invention thus realizes a method, based on the deep-learning DualGAN convolutional network, for converting an artistic two-dimensional code into a conventional two-dimensional code, making reading simple and simplifying deformation recovery; no such method appears in existing patents or papers.
The invention uses deep learning to convert artistic two-dimensional codes into conventional two-dimensional codes, handling many types of artistic two-dimensional codes conveniently and quickly. The DualGAN network used here combines dual learning with a WGAN backbone and adds forward and backward content (reconstruction) losses to the loss function, which improves the conversion from artistic to conventional two-dimensional codes and preserves the integrity and accuracy of the information in the original artistic two-dimensional code.
By introducing the DualGAN convolutional neural network from deep learning, and in contrast to binarization methods aimed only at blurred or unevenly lit two-dimensional codes, the invention trains on a data set to obtain a model that converts artistic two-dimensional codes into conventional two-dimensional codes; the approach is convenient, fast and applicable to many kinds of artistic two-dimensional codes.
Compared with the techniques in patent documents CN109558927A and CN106709393A and the paper "An Aesthetic QR Code Solution Based on Error Correction Mechanism", the method of the present invention focuses on converting an artistic two-dimensional code into a conventional two-dimensional code rather than on generating an artistic two-dimensional code.
The method therefore has wide application in converting artistic two-dimensional codes into ordinary two-dimensional codes and is worth popularizing.
The above description is only an embodiment of the present invention, and the scope of the present invention is not limited thereto; any change or replacement that a person skilled in the art can readily conceive within the technical scope disclosed herein falls within the protection scope of the present invention. Therefore, the protection scope of the present invention is subject to the protection scope of the claims.
Claims (3)
1. A method for converting an artistic two-dimensional code into a conventional two-dimensional code, characterized by comprising the following steps:
S1. performing a gray-level transformation on the artistic two-dimensional code to be converted to obtain a gray-level image;
S2. training a DualGAN network to obtain a weight file;
S3. feeding the gray-level image obtained in step S1 into the DualGAN network loaded with the weight file obtained in step S2 to obtain the resulting conventional two-dimensional code image;
the specific operation steps of the step S2 are as follows:
s201, in a Dual GAN network, a generator adopts a U-net network, and the loss function of the U-net network is as follows:
lg(u,v)=λU||u-GB(GA(u,z),z')||+λV||v-GA(GB(v,z'),z)||-DB(GB(v,z'))-DA(GA(u,z));
wherein U ∈ U, V ∈ V, U, V are two unlabeled, different image sets, GA,GBRespectively two generators, DA,DBAre respectively two discriminators, λU,λVIs a constant coefficient; z, z' are random noise;
S202. In the DualGAN network, the discriminators D_A and D_B use PatchGAN, with the loss functions:
l_dA(u, v) = D_A(G_A(u, z)) - D_A(v); l_dB(u, v) = D_B(G_B(v, z')) - D_B(u);
where u ∈ U and v ∈ V, G_A and G_B are the two generators, and D_A and D_B are the two discriminators;
define ω_A and ω_B as the parameter sets of discriminators D_A and D_B, and θ_A and θ_B as the parameter sets of the generators.
S203. During training, iterate the network continuously and update ω_A, ω_B, θ_A and θ_B so as to minimize the two loss functions in S201 and S202, reaching a balanced optimum between the generators and discriminators and obtaining the final weight file.
2. The method for converting an artistic two-dimensional code into a conventional two-dimensional code according to claim 1, characterized in that the gray-level image in step S1 is obtained as follows: the image is converted into a gray-level image by the formula Gray = (R*30 + G*59 + B*11)/100, where R, G and B are respectively the red, green and blue component values of each pixel of the two-dimensional code picture to be converted.
3. The method for converting an artistic two-dimensional code into a conventional two-dimensional code according to claim 1, characterized in that in step S3 the weight file obtained in step S203 is used for the conversion, specifically:
S301. loading the weight file obtained in step S203 into the DualGAN network and feeding in the gray-level image obtained in step S1 to obtain a conventional two-dimensional code image;
S302. processing the conventional two-dimensional code image obtained in S301 as required.
Priority Applications (1)

| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN201910728382.2A | 2019-08-08 | 2019-08-08 | Method for converting artistic two-dimensional code into conventional two-dimensional code |
Publications (1)

| Publication Number | Publication Date |
|---|---|
| CN110647965A | 2020-01-03 |
Family
ID=68990041

Family Applications (1)

| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| CN201910728382.2A (pending) | CN110647965A (en) Method for converting artistic two-dimensional code into conventional two-dimensional code | 2019-08-08 | 2019-08-08 |

Country Status (1)

| Country | Link |
|---|---|
| CN | CN110647965A (en) |
Patent Citations (4)

| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN108596003A | 2018-04-11 | 2018-09-28 | 中山大学 | Machine-learning-based method and system for repairing stained two-dimensional codes |
| CN108573479A | 2018-04-16 | 2018-09-25 | 西安电子科技大学 | Face image deblurring and restoration method based on dual generative adversarial networks |
| CN110097615A | 2018-12-06 | 2019-08-06 | 北京大学 | Artistic-character editing method and system combining stylization and de-stylization |
| CN109671018A | 2018-12-12 | 2019-04-23 | 华东交通大学 | Image conversion method and system based on generative adversarial networks and ResNets |
Non-Patent Citations (4)

| Title |
|---|
| Zili Yi et al.: "DualGAN: Unsupervised Dual Learning for Image-to-Image Translation", 2017 IEEE International Conference on Computer Vision |
| 吴华明 et al.: "Face image translation based on generative adversarial networks" (基于生成对抗网络的人脸图像翻译), Journal of Tianjin University (Science and Technology) |
| 蔡若君 et al.: "Two-dimensional code localization and detection based on deep learning" (基于深度学习的二维码定位与检测技术), Modern Computer (Professional Edition) |
| 邹建成 et al.: New Advances in Mathematics and Its Applications: Proceedings of the 2010 National Graduate Symposium on Mathematics and Information Science (《数学及其应用新进展 2010年全国数学与信息科学研究生学术研讨会论文集》), 31 December 2011 |
Cited By (2)

| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN117911793A | 2024-03-18 | 2024-04-19 | 南开大学 | Deep learning-based marine organism intelligent detection method |
| CN117911793B | 2024-03-18 | 2024-05-17 | 南开大学 | Deep learning-based marine organism intelligent detection method |
Legal Events

| Date | Code | Title | Description |
|---|---|---|---|
| | PB01 | Publication | |
| | SE01 | Entry into force of request for substantive examination | |
| | RJ01 | Rejection of invention patent application after publication | Application publication date: 20200103 |