CN107977591B - Two-dimensional code image identification method and mobile terminal - Google Patents

Two-dimensional code image identification method and mobile terminal

Info

Publication number
CN107977591B
CN107977591B (application number CN201711306827.5A)
Authority
CN
China
Prior art keywords
dimensional code
code image
identification points
color
gray scale
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201711306827.5A
Other languages
Chinese (zh)
Other versions
CN107977591A (en
Inventor
谢莲花
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd filed Critical Vivo Mobile Communication Co Ltd
Priority to CN201711306827.5A priority Critical patent/CN107977591B/en
Publication of CN107977591A publication Critical patent/CN107977591A/en
Application granted granted Critical
Publication of CN107977591B publication Critical patent/CN107977591B/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06KGRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K7/00Methods or arrangements for sensing record carriers, e.g. for reading patterns
    • G06K7/10Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation
    • G06K7/14Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation using light without selection of wavelength, e.g. sensing reflected white light
    • G06K7/1404Methods for optical code recognition
    • G06K7/1408Methods for optical code recognition the method being specifically adapted for the type of code
    • G06K7/1417 2D bar codes
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06KGRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K7/00Methods or arrangements for sensing record carriers, e.g. for reading patterns
    • G06K7/10Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation
    • G06K7/14Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation using light without selection of wavelength, e.g. sensing reflected white light
    • G06K7/1404Methods for optical code recognition
    • G06K7/146Methods for optical code recognition the method including quality enhancement steps

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Toxicology (AREA)
  • Health & Medical Sciences (AREA)
  • Electromagnetism (AREA)
  • General Health & Medical Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Quality & Reliability (AREA)
  • Image Analysis (AREA)
  • Image Processing (AREA)

Abstract

The invention provides a two-dimensional code image identification method and a mobile terminal, wherein the method comprises the following steps: acquiring a first two-dimensional code image; acquiring feature information of N identification points in the first two-dimensional code image, wherein the N identification points comprise M positioning identification points; determining characteristic offset according to the characteristic information of the M positioning identification points; according to the characteristic offset, performing characteristic calibration on the characteristic information of the N identification points to generate a second two-dimensional code image; and carrying out image recognition on the second two-dimensional code image to obtain a recognition result. Thus, the characteristic offset can be determined according to the characteristic information of the M positioning identification points. And then, the first two-dimensional code image can be subjected to characteristic calibration according to the characteristic offset to obtain a second two-dimensional code image. And finally, performing image recognition on the second two-dimensional code image obtained through feature calibration. The influence caused by the difference of the display equipment or the interference of the external environment can be reduced, and the success rate of identification is improved.

Description

Two-dimensional code image identification method and mobile terminal
Technical Field
The embodiment of the invention relates to the technical field of communication, in particular to a two-dimensional code image identification method and a mobile terminal.
Background
A conventional two-dimensional code image contains only two colors, black and white, i.e. it is a black-and-white two-dimensional code image. A single identification point of a black-and-white two-dimensional code image carries either a 0 or a 1, so the amount of data the whole image can store is limited. Besides the black-and-white two-dimensional code image, two other forms exist: the color two-dimensional code image and the gray-scale two-dimensional code image. Compared with the black-and-white two-dimensional code image, the color two-dimensional code image and the gray-scale two-dimensional code image can store more information. However, problems may occur when a color two-dimensional code image or a gray-scale two-dimensional code image is recognized. For example, different display devices render colors differently, so the displayed color two-dimensional code image may suffer from color cast; external light of various colors irradiating the color two-dimensional code image also causes color cast. For the gray-scale two-dimensional code image, brightness differences between display devices shift the displayed gray levels, which ultimately leads to misrecognition. Therefore, in the prior art, the misrecognition rate is high when a color two-dimensional code image or a gray-scale two-dimensional code image is recognized.
Disclosure of Invention
The embodiment of the invention provides a two-dimensional code image identification method and a mobile terminal, and aims to solve the problem that in the prior art, when a color two-dimensional code image or a gray-scale two-dimensional code image is identified, the error identification rate is high.
In order to solve the technical problem, the invention is realized as follows:
in a first aspect, an embodiment of the present invention provides a two-dimensional code image identification method, including:
acquiring a first two-dimensional code image;
acquiring feature information of N identification points in the first two-dimensional code image, wherein the N identification points comprise M positioning identification points;
determining characteristic offset according to the characteristic information of the M positioning identification points;
according to the characteristic offset, performing characteristic calibration on the characteristic information of the N identification points to generate a second two-dimensional code image;
performing image recognition on the second two-dimensional code image to obtain a recognition result;
wherein N is an integer greater than 1, M is an integer greater than 1 and M is less than N.
In a second aspect, an embodiment of the present invention further provides a mobile terminal, including:
the first acquisition module is used for acquiring a first two-dimensional code image;
the second acquisition module is used for acquiring the feature information of N identification points in the first two-dimensional code image, wherein the N identification points comprise M positioning identification points;
the determining module is used for determining characteristic offset according to the characteristic information of the M positioning identification points;
the calibration module is used for performing characteristic calibration on the characteristic information of the N identification points according to the characteristic offset to generate a second two-dimensional code image;
the identification module is used for carrying out image identification on the second two-dimensional code image to obtain an identification result;
wherein N is an integer greater than 1, M is an integer greater than 1 and M is less than N.
In a third aspect, an embodiment of the present invention further provides a mobile terminal, including a processor, a memory, and a computer program stored on the memory and capable of running on the processor, where the computer program, when executed by the processor, implements the steps of the two-dimensional code image recognition method.
In a fourth aspect, an embodiment of the present invention further provides a computer-readable storage medium, where a computer program is stored on the computer-readable storage medium, and when the computer program is executed by a processor, the steps of the two-dimensional code image recognition method are implemented.
In the embodiment of the invention, a first two-dimensional code image is obtained; acquiring feature information of N identification points in the first two-dimensional code image, wherein the N identification points comprise M positioning identification points; determining characteristic offset according to the characteristic information of the M positioning identification points; according to the characteristic offset, performing characteristic calibration on the characteristic information of the N identification points to generate a second two-dimensional code image; performing image recognition on the second two-dimensional code image to obtain a recognition result; wherein N is an integer greater than 1, M is an integer greater than 1 and M is less than N. Thus, the characteristic offset can be determined according to the characteristic information of the M positioning identification points. And then, the first two-dimensional code image can be subjected to characteristic calibration according to the characteristic offset to obtain a second two-dimensional code image. And finally, performing image recognition on the second two-dimensional code image obtained through feature calibration. The influence caused by the difference of the display equipment or the interference of the external environment can be reduced, and the success rate of identification is improved.
Drawings
Fig. 1 is a flowchart of a two-dimensional code image recognition method according to an embodiment of the present invention;
fig. 2 is a second flowchart of a two-dimensional code image recognition method according to an embodiment of the present invention;
FIG. 3 is a diagram illustrating the correspondence between color types and hexadecimal digits according to an embodiment of the present invention;
fig. 4 is a schematic diagram of three positioning identification points in a color two-dimensional code image according to an embodiment of the present invention;
fig. 5 is a third flowchart of a two-dimensional code image recognition method according to an embodiment of the present invention;
FIG. 6 is a diagram illustrating the correspondence between gray scale levels and hexadecimal digits according to an embodiment of the present invention;
FIG. 7 is a schematic diagram of three positioning identification points in a gray-scale two-dimensional code image according to an embodiment of the present invention;
fig. 8 is one of the structural diagrams of a mobile terminal according to an embodiment of the present invention;
fig. 9 is a second block diagram of a mobile terminal according to an embodiment of the present invention;
fig. 10 is a third block diagram of a mobile terminal according to an embodiment of the present invention;
fig. 11 is a schematic diagram of a hardware structure of a mobile terminal according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Referring to fig. 1, fig. 1 is a flowchart of a two-dimensional code image recognition method according to an embodiment of the present invention, and as shown in fig. 1, the method includes the following steps:
step 101, obtaining a first two-dimensional code image.
In step 101, a first two-dimensional code image may be acquired. The first two-dimensional code image can be a color two-dimensional code image or a gray-scale two-dimensional code image.
Step 102, obtaining feature information of N identification points in the first two-dimensional code image, wherein the N identification points comprise M positioning identification points, N is an integer greater than 1, M is an integer greater than 1, and M is smaller than N.
In step 102, feature information of N identification points in the first two-dimensional code image may be obtained, and the N identification points include M positioning identification points. Wherein N is an integer greater than 1, M is an integer greater than 1 and M is less than N. In the embodiment of the present invention, M may take the value of 3. That is, the first two-dimensional code image may include three positioning identification points, and the three positioning identification points may be located at the upper left corner, the upper right corner, and the lower left corner of the first two-dimensional code image, respectively.
For a color two-dimensional code image, the colors of the three positioning identification points can be red, yellow and blue respectively. The color of the positioning identification point positioned at the upper left corner of the color two-dimensional code image can be red; the color of the positioning identification point positioned at the upper right corner of the color two-dimensional code image can be yellow; the color of the positioning identification point located at the lower left corner of the color two-dimensional code image may be blue. The color coordinates of the N identification points in the first two-dimensional code image in the color coordinate system can be obtained.
For the gray-scale two-dimensional code image, the gray levels of the three positioning identification points may be level 3, level 5 and level 8 respectively. The gray level of the positioning identification point at the upper left corner of the gray-scale two-dimensional code image may be level 5; the gray level of the positioning identification point at the upper right corner may be level 3; and the gray level of the positioning identification point at the lower left corner may be level 8. The gray-scale values of the N identification points in the first two-dimensional code image can be obtained.
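The patent does not prescribe a data structure for this feature information; as a minimal illustrative sketch, it can be modeled as one record per identification point carrying either a color coordinate or a gray-scale value, with the M positioning identification points flagged. The names below are assumptions introduced only for illustration.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class IdentificationPoint:
    """Feature information of one identification point (illustrative representation)."""
    index: int                                        # position of the point in the code grid
    color_xy: Optional[Tuple[float, float]] = None    # color coordinate in a color coordinate system
    gray_level: Optional[float] = None                # gray-scale value
    is_positioning: bool = False                      # True for the M positioning identification points

def positioning_points(points):
    """Select the M positioning identification points from the N identification points."""
    return [p for p in points if p.is_positioning]
```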
Step 103, determining a characteristic offset according to the characteristic information of the M positioning identification points.
In step 103, a feature offset may be determined according to the feature information of the M positioning identification points.
For the color two-dimensional code image, the color coordinate differences between the color coordinates of the M positioning identification points and the M pre-stored standard color coordinates can be calculated respectively. That is, after the color coordinate (X1, Y1) of the positioning identification point at the upper left corner of the color two-dimensional code image is identified in the color coordinate system, the pre-stored color coordinate (Xstd1, Ystd1) of standard red is subtracted from (X1, Y1) to obtain the color coordinate difference (Xdiff1, Ydiff1) corresponding to that positioning identification point; after the color coordinate (X2, Y2) of the positioning identification point at the upper right corner is identified, the pre-stored color coordinate (Xstd2, Ystd2) of standard yellow is subtracted from (X2, Y2) to obtain the color coordinate difference (Xdiff2, Ydiff2); and after the color coordinate (X3, Y3) of the positioning identification point at the lower left corner is identified, the pre-stored color coordinate (Xstd3, Ystd3) of standard blue is subtracted from (X3, Y3) to obtain the color coordinate difference (Xdiff3, Ydiff3).
Further, the average of the M color coordinate differences corresponding to the M positioning identification points can be calculated to obtain the color offset. That is, the color coordinate differences (Xdiff1, Ydiff1), (Xdiff2, Ydiff2) and (Xdiff3, Ydiff3) corresponding to the positioning identification points at the upper left, upper right and lower left corners of the color two-dimensional code image are averaged to obtain the color offset (Xoffset, Yoffset):

Xoffset = (Xdiff1 + Xdiff2 + Xdiff3) / 3,  Yoffset = (Ydiff1 + Ydiff2 + Ydiff3) / 3
For the gray-scale two-dimensional code image, the gray-scale differences between the gray-scale values of the M positioning identification points and the M pre-stored standard gray-scale values can be calculated respectively. That is, after the gray-scale value Z1 of the positioning identification point at the upper left corner of the gray-scale two-dimensional code image is identified, the pre-stored standard gray-scale value Zstd1 (i.e. the pre-stored standard gray level 5) is subtracted from Z1 to obtain the gray-scale difference Zdiff1 corresponding to that positioning identification point; after the gray-scale value Z2 of the positioning identification point at the upper right corner is identified, the pre-stored standard gray-scale value Zstd2 (i.e. the pre-stored standard gray level 3) is subtracted from Z2 to obtain the gray-scale difference Zdiff2; and after the gray-scale value Z3 of the positioning identification point at the lower left corner is identified, the pre-stored standard gray-scale value Zstd3 (i.e. the pre-stored standard gray level 8) is subtracted from Z3 to obtain the gray-scale difference Zdiff3.
Further, the average of the M gray-scale differences corresponding to the M positioning identification points can be calculated to obtain the gray-scale offset. That is, the gray-scale differences Zdiff1, Zdiff2 and Zdiff3 corresponding to the positioning identification points at the upper left, upper right and lower left corners of the gray-scale two-dimensional code image are averaged to obtain the gray-scale offset Zoffset:

Zoffset = (Zdiff1 + Zdiff2 + Zdiff3) / 3
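A minimal sketch of step 103, assuming the pre-stored standards are supplied in the same order as the positioning identification points (upper left, upper right, lower left); the function names are illustrative assumptions.

```python
def color_offset(positioning_xy, standard_xy):
    """Average the per-point color coordinate differences to obtain (X_offset, Y_offset)."""
    m = len(positioning_xy)
    x_off = sum(x - sx for (x, _), (sx, _) in zip(positioning_xy, standard_xy)) / m
    y_off = sum(y - sy for (_, y), (_, sy) in zip(positioning_xy, standard_xy)) / m
    return x_off, y_off

def gray_offset(positioning_gray, standard_gray):
    """Average the per-point gray-scale differences to obtain Z_offset."""
    return sum(z - s for z, s in zip(positioning_gray, standard_gray)) / len(positioning_gray)
```

For example, with the standard gray levels 5, 3 and 8 of the three positioning identification points, measured levels of 6, 4 and 9 give gray_offset([6, 4, 9], [5, 3, 8]) = 1, i.e. the whole image is displayed one gray level brighter than the standard.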
Step 104, performing characteristic calibration on the characteristic information of the N identification points according to the characteristic offset to generate a second two-dimensional code image.
In step 104, feature calibration may be performed on the feature information of the N identification points according to the feature offset, so as to generate a second two-dimensional code image.
For the color two-dimensional code image, color calibration can be performed on the feature information of the N identification points according to the color offset (Xoffset, Yoffset) to generate the second two-dimensional code image.
For the gray-scale two-dimensional code image, gray-scale calibration can be performed on the feature information of the N identification points according to the gray-scale offset Zoffset to generate the second two-dimensional code image.
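A sketch of step 104 under the same assumptions: the offset determined from the positioning identification points is subtracted from the feature information of all N identification points. Clamping the calibrated gray level to the valid range is an added safeguard, not something stated in the text.

```python
def calibrate_colors(points_xy, offset):
    """Subtract the color offset (X_offset, Y_offset) from every identification point."""
    x_off, y_off = offset
    return [(x - x_off, y - y_off) for x, y in points_xy]

def calibrate_grays(points_gray, z_offset, levels=16):
    """Subtract the gray-scale offset Z_offset and clamp to the valid gray-level range."""
    return [min(max(z - z_offset, 0), levels - 1) for z in points_gray]
```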
Step 105, performing image recognition on the second two-dimensional code image to obtain a recognition result.
In step 105, image recognition may be performed on the second two-dimensional code image to obtain a recognition result.
For the color two-dimensional code image, after the true colors of the N identification points in the color two-dimensional code image are restored, the color two-dimensional code image subjected to color restoration is identified. Therefore, the influence caused by the difference of the display equipment or the interference of the external environment can be reduced, and the success rate of identification is improved.
For the gray-scale two-dimensional code image, after the real gray scales of the N identification points in the gray-scale two-dimensional code image are restored, the gray-scale two-dimensional code image restored by the gray scales is identified. Therefore, the influence caused by the difference of the display equipment or the interference of the external environment can be reduced, and the success rate of identification is improved.
In an embodiment of the present invention, the mobile terminal may be a mobile phone, a tablet personal computer, a laptop computer, a personal digital assistant (PDA), a mobile Internet device (MID), a wearable device, or the like.
According to the two-dimensional code image identification method, the first two-dimensional code image is obtained; acquiring feature information of N identification points in the first two-dimensional code image, wherein the N identification points comprise M positioning identification points; determining characteristic offset according to the characteristic information of the M positioning identification points; according to the characteristic offset, performing characteristic calibration on the characteristic information of the N identification points to generate a second two-dimensional code image; performing image recognition on the second two-dimensional code image to obtain a recognition result; wherein N is an integer greater than 1, M is an integer greater than 1 and M is less than N. Thus, the characteristic offset can be determined according to the characteristic information of the M positioning identification points. And then, the first two-dimensional code image can be subjected to characteristic calibration according to the characteristic offset to obtain a second two-dimensional code image. And finally, performing image recognition on the second two-dimensional code image obtained through feature calibration. The influence caused by the difference of the display equipment or the interference of the external environment can be reduced, and the success rate of identification is improved.
Referring to fig. 2, fig. 2 is a second flowchart of a two-dimensional code image recognition method according to an embodiment of the present invention. The main difference between this embodiment and the previous embodiment is that the process of correcting the colors of N identification points in a color two-dimensional code image is explained in detail. As shown in fig. 2, the method comprises the following steps:
step 201, obtaining a first two-dimensional code image.
In step 201, a first two-dimensional code image may be acquired. For example, a color two-dimensional code image may be acquired.
When a black-and-white two-dimensional code image carries a large amount of information, the density of its identification points is relatively high. If the camera of the mobile terminal has a low pixel count, a high-density black-and-white two-dimensional code image may fail to be recognized. In the embodiment of the invention, the black-and-white two-dimensional code image can therefore also be converted into a color two-dimensional code image.
For example, 16 colors may be selected as reference colors, and the 16 hexadecimal digits may be represented by these 16 colors. FIG. 3 shows the correspondence between the color types and the hexadecimal digits. Suppose a black-and-white two-dimensional code image has eight identification points, where "black" represents 1 and "white" represents 0, and the data of the eight identification points is 11001001, which is C9 in hexadecimal. That is, the original eight points "black, black, white, white, black, white, white, black" can be expressed by the two colors corresponding to C and 9, namely "yellow" and "brown". In this way, a high-density black-and-white two-dimensional code image is converted into a low-density color two-dimensional code image, and the amount of information carried by the two-dimensional code image does not change during the conversion, that is, the low-density color two-dimensional code image carries the same amount of information as the high-density black-and-white two-dimensional code image. This can improve the recognition success rate of a camera with a lower pixel count.
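As an illustration of this conversion (the actual digit-to-color correspondence is given by FIG. 3 and is not reproduced here, so the palette below is a placeholder assumption), every four black-and-white identification points become one hexadecimal digit, which in turn selects one of the 16 reference colors:

```python
def bw_to_color_points(bits, palette):
    """Re-encode black-and-white modules as color identification points.

    bits    : string of '1' (black) and '0' (white), length a multiple of 4
    palette : list of 16 reference colors indexed by hexadecimal digit value
    """
    assert len(bits) % 4 == 0 and len(palette) == 16
    colors = []
    for i in range(0, len(bits), 4):
        digit = int(bits[i:i + 4], 2)   # four black/white modules -> one hexadecimal digit
        colors.append(palette[digit])
    return colors
```

With the example above, bw_to_color_points("11001001", palette) returns the two colors assigned to C (12) and 9, so eight black-and-white identification points collapse into two color identification points.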
Step 202, obtaining color coordinates of N identification points in the first two-dimensional code image in a color coordinate system, where the N identification points include M positioning identification points, where N is an integer greater than 1, M is an integer greater than 1, and M is less than N.
In step 202, the color coordinates of N identification points in the first two-dimensional code image in the color coordinate system may be obtained, and the N identification points include M positioning identification points. Wherein N is an integer greater than 1, M is an integer greater than 1 and M is less than N. In the embodiment of the present invention, M may be 3, that is, the color two-dimensional code image may include three positioning identification points, and the three positioning identification points may be located at the upper left corner, the upper right corner, and the lower left corner of the color two-dimensional code image, respectively.
Optionally, the color value of each positioning identification point in the M positioning identification points is different.
In the color two-dimensional code image, the color value of each positioning identification point in the M positioning identification points is different. For example, the three positioning identification points may be respectively red, yellow and blue in color. The color of the positioning identification point positioned at the upper left corner of the color two-dimensional code image can be red; the color of the positioning identification point positioned at the upper right corner of the color two-dimensional code image can be yellow; the color of the positioning identification point located at the lower left corner of the color two-dimensional code image may be blue. Fig. 4 is a schematic diagram of three positioning identification points in a color two-dimensional code image.
Step 203, respectively calculating color coordinate differences between the color coordinates of the M positioning identification points and M pre-stored standard color coordinates.
In step 203, for the color two-dimensional code image, the color coordinate differences between the color coordinates of the M positioning identification points and the M pre-stored standard color coordinates can be calculated respectively. That is, after the color coordinate (X1, Y1) of the positioning identification point at the upper left corner of the color two-dimensional code image is identified in the color coordinate system, the pre-stored color coordinate (Xstd1, Ystd1) of standard red is subtracted from (X1, Y1) to obtain the color coordinate difference (Xdiff1, Ydiff1); after the color coordinate (X2, Y2) of the positioning identification point at the upper right corner is identified, the pre-stored color coordinate (Xstd2, Ystd2) of standard yellow is subtracted from (X2, Y2) to obtain the color coordinate difference (Xdiff2, Ydiff2); and after the color coordinate (X3, Y3) of the positioning identification point at the lower left corner is identified, the pre-stored color coordinate (Xstd3, Ystd3) of standard blue is subtracted from (X3, Y3) to obtain the color coordinate difference (Xdiff3, Ydiff3).
Step 204, calculating the average value of the M color coordinate differences corresponding to the M positioning identification points to obtain the color offset.
In step 204, the average of the M color coordinate differences corresponding to the M positioning identification points can be calculated to obtain the color offset. That is, the color coordinate differences (Xdiff1, Ydiff1), (Xdiff2, Ydiff2) and (Xdiff3, Ydiff3) corresponding to the positioning identification points at the upper left, upper right and lower left corners of the color two-dimensional code image are averaged to obtain the color offset (Xoffset, Yoffset):

Xoffset = (Xdiff1 + Xdiff2 + Xdiff3) / 3,  Yoffset = (Ydiff1 + Ydiff2 + Ydiff3) / 3
Step 205, performing color calibration on the feature information of the N identification points according to the color offset to generate a second two-dimensional code image.
In step 205, color calibration may be performed on the feature information of the N identification points according to the color offset (Xoffset, Yoffset) to generate the second two-dimensional code image.
Step 206, performing image recognition on the second two-dimensional code image to obtain a recognition result.
In step 206, image recognition may be performed on the second two-dimensional code image to obtain a recognition result.
For the color two-dimensional code image, after the true colors of the N identification points in the color two-dimensional code image are restored, the color two-dimensional code image subjected to color restoration is identified. Therefore, the influence caused by the difference of the display equipment or the interference of the external environment can be reduced, and the success rate of identification is improved.
The two-dimensional code image identification method provided by the embodiment of the invention can calculate the color offset. Then, according to the color offset, color calibration can be performed on the first two-dimensional code image to obtain a second two-dimensional code image. And finally, carrying out image recognition on the second two-dimensional code image obtained through color calibration. Therefore, the influence caused by the difference of the display equipment or the interference of the external environment can be reduced, and the success rate of identification is improved.
Referring to fig. 5, fig. 5 is a third flowchart of a two-dimensional code image recognition method according to an embodiment of the present invention. The main difference between this embodiment and the previous embodiment is that the process of correcting the gray-scale values of N identification points in the gray-scale two-dimensional code image is explained in detail. As shown in fig. 5, the method comprises the following steps:
Step 501, obtaining a first two-dimensional code image.
In step 501, a first two-dimensional code image may be acquired. For example, a grayscale two-dimensional code image may be acquired.
When a black-and-white two-dimensional code image carries a large amount of information, the density of its identification points is relatively high. If the camera of the mobile terminal has a low pixel count, a high-density black-and-white two-dimensional code image may fail to be recognized. In the embodiment of the invention, the black-and-white two-dimensional code image can therefore also be converted into a gray-scale two-dimensional code image.
For example, 16 gray levels may be selected as reference gray levels, and the 16 hexadecimal digits may be represented by these 16 gray levels. FIG. 6 shows the correspondence between the gray levels and the hexadecimal digits. Suppose a black-and-white two-dimensional code image has eight identification points, where "black" represents 1 and "white" represents 0, and the data of the eight identification points is 11001001, which is C9 in hexadecimal. That is, the original eight points "black, black, white, white, black, white, white, black" can be expressed by just two gray-scale identification points, "gray level 12" and "gray level 9". In this way, a high-density black-and-white two-dimensional code image is converted into a low-density gray-scale two-dimensional code image, and the amount of information carried by the two-dimensional code image does not change during the conversion, that is, the low-density gray-scale two-dimensional code image carries the same amount of information as the high-density black-and-white two-dimensional code image. This can improve the recognition success rate of a camera with a lower pixel count.
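On the decoding side the mapping runs in reverse; the sketch below assumes, as FIG. 6 suggests, that gray level k directly encodes the hexadecimal digit k, and rounds each calibrated gray level to the nearest reference level:

```python
def gray_points_to_bits(gray_levels):
    """Decode gray-scale identification points back into the original black-and-white bit stream."""
    bits = ""
    for z in gray_levels:
        digit = max(0, min(15, int(round(z))))  # nearest reference gray level -> hexadecimal digit
        bits += format(digit, "04b")            # one hexadecimal digit -> four black/white modules
    return bits
```

For the example above, gray_points_to_bits([12, 9]) returns "11001001", recovering the eight original black-and-white identification points.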
Step 502, obtaining grayscale values of N identification points in the first two-dimensional code image, where the N identification points include M positioning identification points, where N is an integer greater than 1, M is an integer greater than 1, and M is smaller than N.
In step 502, grayscale values of N identification points in the first two-dimensional code image may be obtained, and the N identification points include M positioning identification points. Wherein N is an integer greater than 1, M is an integer greater than 1 and M is less than N. In the embodiment of the present invention, M may take the value of 3. That is, the grayscale two-dimensional code image may include three positioning identification points, and the three positioning identification points may be located at the upper left corner, the upper right corner, and the lower left corner of the grayscale two-dimensional code image, respectively.
Optionally, the gray scale value of each of the M positioning identification points is different.
In the gray-scale two-dimensional code image, the gray-scale value of each of the M positioning identification points is different. For example, the gray levels of the three positioning identification points may be level 3, level 5 and level 8 respectively. The gray level of the positioning identification point at the upper left corner of the gray-scale two-dimensional code image may be level 5; the gray level of the positioning identification point at the upper right corner may be level 3; and the gray level of the positioning identification point at the lower left corner may be level 8. Fig. 7 is a schematic diagram of three positioning identification points in a gray-scale two-dimensional code image.
Step 503, calculating gray scale difference values between the gray scale values of the M positioning identification points and M pre-stored standard gray scale values respectively.
In step 503, for the gray-scale two-dimensional code image, the gray-scale differences between the gray-scale values of the M positioning identification points and the M pre-stored standard gray-scale values can be calculated respectively. That is, after the gray-scale value Z1 of the positioning identification point at the upper left corner of the gray-scale two-dimensional code image is identified, the pre-stored standard gray-scale value Zstd1 (i.e. the pre-stored standard gray level 5) is subtracted from Z1 to obtain the gray-scale difference Zdiff1; after the gray-scale value Z2 of the positioning identification point at the upper right corner is identified, the pre-stored standard gray-scale value Zstd2 (i.e. the pre-stored standard gray level 3) is subtracted from Z2 to obtain the gray-scale difference Zdiff2; and after the gray-scale value Z3 of the positioning identification point at the lower left corner is identified, the pre-stored standard gray-scale value Zstd3 (i.e. the pre-stored standard gray level 8) is subtracted from Z3 to obtain the gray-scale difference Zdiff3.
Step 504, calculating the average value of the M gray-scale differences corresponding to the M positioning identification points to obtain the gray-scale offset.
In step 504, the average of the M gray-scale differences corresponding to the M positioning identification points can be calculated to obtain the gray-scale offset. That is, the gray-scale differences Zdiff1, Zdiff2 and Zdiff3 corresponding to the positioning identification points at the upper left, upper right and lower left corners of the gray-scale two-dimensional code image are averaged to obtain the gray-scale offset Zoffset:

Zoffset = (Zdiff1 + Zdiff2 + Zdiff3) / 3
Step 505, performing gray-scale calibration on the characteristic information of the N identification points according to the gray-scale offset to generate a second two-dimensional code image.
In step 505, gray-scale calibration may be performed on the feature information of the N identification points according to the gray-scale offset Zoffset to generate the second two-dimensional code image.
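Tying steps 503 to 505 together, a compact illustrative pipeline for the gray-scale case might look as follows (the function name, argument layout and default standard levels of FIG. 7 are assumptions; for simplicity every identification point, including the positioning points, is calibrated and decoded):

```python
def recognize_gray_code(gray_values, positioning_idx=(0, 1, 2), standards=(5, 3, 8)):
    """Estimate the gray-scale offset, calibrate all points, and decode them to bits."""
    # Steps 503-504: average difference between measured and pre-stored standard gray levels
    z_offset = sum(gray_values[i] - s for i, s in zip(positioning_idx, standards)) / len(standards)
    # Step 505: subtract the gray-scale offset from the feature information of every point
    calibrated = [z - z_offset for z in gray_values]
    # Step 506: map each calibrated level back to a hexadecimal digit / four-bit group
    return "".join(format(max(0, min(15, int(round(z)))), "04b") for z in calibrated)
```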
Step 506, performing image recognition on the second two-dimensional code image to obtain a recognition result.
In step 506, image recognition may be performed on the second two-dimensional code image to obtain a recognition result.
For the gray-scale two-dimensional code image, after the real gray scales of the N identification points in the gray-scale two-dimensional code image are restored, the gray-scale two-dimensional code image restored by the gray scales is identified. Therefore, the influence caused by the difference of the display equipment or the interference of the external environment can be reduced, and the success rate of identification is improved.
The two-dimensional code image identification method provided by the embodiment of the invention can be used for calculating the gray scale offset. And then, carrying out gray scale calibration on the first two-dimensional code image according to the gray scale offset to obtain a second two-dimensional code image. And finally, carrying out image recognition on the second two-dimensional code image obtained through gray scale calibration. Therefore, the influence caused by the difference of the display equipment or the interference of the external environment can be reduced, and the success rate of identification is improved.
Referring to fig. 8, fig. 8 is a block diagram of a mobile terminal according to an embodiment of the present invention. As shown in fig. 8, a mobile terminal 800 includes a first obtaining module 801, a second obtaining module 802, a determining module 803, a calibrating module 804, and an identifying module 805, where:
a first obtaining module 801, configured to obtain a first two-dimensional code image;
a second obtaining module 802, configured to obtain feature information of N identification points in the first two-dimensional code image, where the N identification points include M positioning identification points;
a determining module 803, configured to determine a feature offset according to the feature information of the M positioning identification points;
the calibration module 804 is configured to perform feature calibration on the feature information of the N identification points according to the feature offset to generate a second two-dimensional code image;
the identification module 805 is configured to perform image identification on the second two-dimensional code image to obtain an identification result;
wherein N is an integer greater than 1, M is an integer greater than 1 and M is less than N.
Optionally, as shown in fig. 9, the second obtaining module 802 is specifically configured to obtain color coordinates of N identification points in the first two-dimensional code image in a color coordinate system;
the determining module 803 includes:
the first calculating submodule 8031 is configured to calculate color coordinate differences between the color coordinates of the M positioning identification points and M pre-stored standard color coordinates, respectively;
the second calculating submodule 8032 is configured to calculate an average value of M color coordinate difference values corresponding to the M positioning identification points, so as to obtain a color offset;
the calibration module 804 is specifically configured to perform color calibration on the feature information of the N identification points according to the color offset, and generate a second two-dimensional code image.
Optionally, the color value of each positioning identification point in the M positioning identification points is different.
Optionally, as shown in fig. 10, the second obtaining module 802 is specifically configured to obtain gray-scale values of N identification points in the first two-dimensional code image;
the determining module 803 includes:
a third computing submodule 8033, configured to respectively compute grayscale differences between the grayscale values of the M positioning identification points and M standard grayscale values stored in advance;
a fourth calculating submodule 8034, configured to calculate an average value of M grayscale differences corresponding to the M positioning identification points, so as to obtain a grayscale offset;
the calibration module 804 is specifically configured to perform gray scale calibration on the feature information of the N identification points according to the gray scale offset, and generate a second two-dimensional code image.
Optionally, the gray scale value of each of the M positioning identification points is different.
The mobile terminal 800 is capable of implementing each process implemented by the mobile terminal in the method embodiments of fig. 1, fig. 2, and fig. 5, and is not described here again to avoid repetition. And the mobile terminal 800 may determine the characteristic offset according to the characteristic information of the M positioning identification points. And then, the first two-dimensional code image can be subjected to characteristic calibration according to the characteristic offset to obtain a second two-dimensional code image. And finally, performing image recognition on the second two-dimensional code image obtained through feature calibration. The influence caused by the difference of the display equipment or the interference of the external environment can be reduced, and the success rate of identification is improved.
Fig. 11 is a schematic diagram of a hardware structure of a mobile terminal implementing various embodiments of the present invention.
The mobile terminal 1100 includes, but is not limited to: radio frequency unit 1101, network module 1102, audio output unit 1103, input unit 1104, sensor 1105, display unit 1106, user input unit 1107, interface unit 1108, memory 1109, processor 1110, and power supply 1111. Those skilled in the art will appreciate that the mobile terminal architecture shown in fig. 11 is not intended to be limiting of mobile terminals, and that a mobile terminal may include more or fewer components than shown, or some components may be combined, or a different arrangement of components. In the embodiment of the present invention, the mobile terminal includes, but is not limited to, a mobile phone, a tablet computer, a notebook computer, a palm computer, a vehicle-mounted terminal, a wearable device, a pedometer, and the like.
A processor 1110 configured to obtain a first two-dimensional code image; acquiring feature information of N identification points in the first two-dimensional code image, wherein the N identification points comprise M positioning identification points; determining characteristic offset according to the characteristic information of the M positioning identification points; according to the characteristic offset, performing characteristic calibration on the characteristic information of the N identification points to generate a second two-dimensional code image; performing image recognition on the second two-dimensional code image to obtain a recognition result; wherein N is an integer greater than 1, M is an integer greater than 1 and M is less than N.
The characteristic offset can be determined according to the characteristic information of the M positioning identification points. And then, the first two-dimensional code image can be subjected to characteristic calibration according to the characteristic offset to obtain a second two-dimensional code image. And finally, performing image recognition on the second two-dimensional code image obtained through feature calibration. The influence caused by the difference of the display equipment or the interference of the external environment can be reduced, and the success rate of identification is improved.
It should be understood that, in the embodiment of the present invention, the radio frequency unit 1101 may be configured to receive and transmit signals during message transmission or a call; specifically, it receives downlink data from a base station and forwards the data to the processor 1110 for processing, and transmits uplink data to the base station. In general, radio frequency unit 1101 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like. In addition, the radio frequency unit 1101 may also communicate with a network and other devices through a wireless communication system.
The mobile terminal provides the user with wireless broadband internet access through the network module 1102, such as helping the user send and receive e-mails, browse web pages, and access streaming media.
The audio output unit 1103 may convert audio data received by the radio frequency unit 1101 or the network module 1102 or stored in the memory 1109 into an audio signal and output as sound. Also, the audio output unit 1103 may also provide audio output related to a specific function performed by the mobile terminal 1100 (e.g., a call signal reception sound, a message reception sound, etc.). The audio output unit 1103 includes a speaker, a buzzer, a receiver, and the like.
The input unit 1104 is used to receive audio or video signals. The input unit 1104 may include a Graphics Processing Unit (GPU) 11041 and a microphone 11042, and the graphics processor 11041 processes image data of still pictures or video obtained by an image capturing device, such as a camera, in a video capturing mode or an image capturing mode. The processed image frames may be displayed on the display unit 1106. The image frames processed by the graphics processor 11041 may be stored in the memory 1109 (or other storage medium) or transmitted via the radio frequency unit 1101 or the network module 1102. The microphone 11042 may receive sound and can process such sound into audio data. In the telephone call mode, the processed audio data may be converted into a format that can be transmitted to a mobile communication base station via the radio frequency unit 1101, and then output.
The mobile terminal 1100 also includes at least one sensor 1105, such as a light sensor, motion sensor, and other sensors. Specifically, the light sensor includes an ambient light sensor that adjusts the brightness of the display panel 11061 according to the brightness of ambient light, and a proximity sensor that turns off the display panel 11061 and/or a backlight when the mobile terminal 1100 moves to the ear. As one of the motion sensors, the accelerometer sensor can detect the magnitude of acceleration in each direction (generally three axes), detect the magnitude and direction of gravity when stationary, and can be used to identify the posture of the mobile terminal (such as horizontal and vertical screen switching, related games, magnetometer posture calibration), and vibration identification related functions (such as pedometer, tapping); the sensors 1105 may also include fingerprint sensors, pressure sensors, iris sensors, molecular sensors, gyroscopes, barometers, hygrometers, thermometers, infrared sensors, etc., and will not be described in detail herein.
The display unit 1106 is used to display information input by a user or information provided to the user. The Display unit 1106 may include a Display panel 11061, and the Display panel 11061 may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED), or the like.
The user input unit 1107 may be used to receive input numeric or character information and generate key signal inputs related to user settings and function control of the mobile terminal. Specifically, the user input unit 1107 includes a touch panel 11071 and other input devices 11072. The touch panel 11071, also referred to as a touch screen, may collect touch operations by a user on or near the touch panel 11071 (e.g., operations by a user on or near the touch panel 11071 using a finger, a stylus, or any other suitable object or attachment). The touch panel 11071 may include two portions of a touch detection device and a touch controller. The touch detection device detects the touch direction of a user, detects a signal brought by touch operation and transmits the signal to the touch controller; the touch controller receives touch information from the touch sensing device, converts the touch information into touch point coordinates, and sends the touch point coordinates to the processor 1110, and receives and executes commands sent from the processor 1110. In addition, the touch panel 11071 may be implemented by various types, such as a resistive type, a capacitive type, an infrared ray, and a surface acoustic wave. The user input unit 1107 may include other input devices 11072 in addition to the touch panel 11071. In particular, the other input devices 11072 may include, but are not limited to, a physical keyboard, function keys (e.g., volume control keys, switch keys, etc.), a trackball, a mouse, and a joystick, which are not described in detail herein.
Further, the touch panel 11071 can be overlaid on the display panel 11061, and when the touch panel 11071 detects a touch operation thereon or nearby, the touch operation is transmitted to the processor 1110 to determine the type of the touch event, and then the processor 1110 provides a corresponding visual output on the display panel 11061 according to the type of the touch event. Although the touch panel 11071 and the display panel 11061 are shown in fig. 11 as two separate components to implement the input and output functions of the mobile terminal, in some embodiments, the touch panel 11071 and the display panel 11061 may be integrated to implement the input and output functions of the mobile terminal, and is not limited herein.
The interface unit 1108 is an interface through which an external device is connected to the mobile terminal 1100. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. Interface unit 1108 may be used to receive input from external devices (e.g., data information, power, etc.) and transmit the received input to one or more elements within mobile terminal 1100 or may be used to transmit data between mobile terminal 1100 and external devices.
The memory 1109 may be used to store software programs as well as various data. The memory 1109 may mainly include a program storage area and a data storage area, where the program storage area may store an operating system, an application program required by at least one function (such as a sound playing function or an image playing function), and the like, and the data storage area may store data created according to the use of the mobile terminal (such as audio data or a phonebook), and the like. In addition, the memory 1109 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or other non-volatile solid-state storage device.
The processor 1110 is the control center of the mobile terminal. It connects various parts of the entire mobile terminal using various interfaces and lines, and performs the various functions of the mobile terminal and processes data by running or executing software programs and/or modules stored in the memory 1109 and calling data stored in the memory 1109, thereby monitoring the mobile terminal as a whole. The processor 1110 may include one or more processing units; preferably, the processor 1110 may integrate an application processor, which mainly handles the operating system, user interface, application programs, and the like, and a modem processor, which mainly handles wireless communications. It will be appreciated that the modem processor may alternatively not be integrated into the processor 1110.
The mobile terminal 1100 may also include a power supply 1111 (such as a battery) for supplying power to the various components. Preferably, the power supply 1111 may be logically connected to the processor 1110 via a power management system, so that functions such as managing charging, discharging, and power consumption are implemented through the power management system.
In addition, the mobile terminal 1100 includes some functional modules that are not shown, and thus will not be described in detail herein.
Optionally, the processor 1110 is further configured to:
acquire color coordinates, in a color coordinate system, of the N identification points in the first two-dimensional code image;
calculate color coordinate difference values between the color coordinates of the M positioning identification points and M pre-stored standard color coordinates, respectively;
calculate an average value of the M color coordinate difference values corresponding to the M positioning identification points to obtain a color offset; and
perform color calibration on the feature information of the N identification points according to the color offset to generate a second two-dimensional code image (a minimal sketch of this computation follows).
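The color-offset computation described above can be illustrated with a short sketch. This is a minimal illustration only, assuming the N identification points have already been sampled as RGB color coordinates, that the first m rows are the M positioning identification points, and that NumPy is available; the function name, array shapes, and the 0-255 value range are illustrative assumptions rather than part of the embodiment.

```python
import numpy as np

def color_calibrate(points_rgb, standard_rgb, m):
    """points_rgb: (N, 3) color coordinates of the N identification points.
    standard_rgb: (M, 3) pre-stored standard color coordinates.
    m: number of positioning identification points (assumed to be the first m rows)."""
    # Color coordinate difference of each positioning identification point
    # against its pre-stored standard color coordinate
    diffs = points_rgb[:m].astype(np.float64) - standard_rgb      # shape (M, 3)
    # Average of the M difference values gives the color offset
    color_offset = diffs.mean(axis=0)                             # shape (3,)
    # Color calibration: apply the same offset to all N identification points
    calibrated = points_rgb.astype(np.float64) - color_offset
    # Clamp to a valid range; these values form the second two-dimensional code image
    return np.clip(calibrated, 0, 255)
```

For example, if every positioning identification point is sampled slightly darker and redder than its standard coordinate, color_offset captures that common shift, and subtracting it moves all N identification points back toward their intended colors before recognition.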
Optionally, the color value of each of the M positioning identification points is different.
Optionally, the processor 1110 is further configured to:
acquire gray-scale values of the N identification points in the first two-dimensional code image;
calculate gray-scale difference values between the gray-scale values of the M positioning identification points and M pre-stored standard gray-scale values, respectively;
calculate an average value of the M gray-scale difference values corresponding to the M positioning identification points to obtain a gray-scale offset; and
perform gray-scale calibration on the feature information of the N identification points according to the gray-scale offset to generate a second two-dimensional code image (see the sketch below).
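The gray-scale variant is the same averaging computation in one dimension. The sketch below makes the same assumptions as the color example (values already sampled, the first m entries being the M positioning identification points, NumPy available).

```python
import numpy as np

def gray_calibrate(gray_values, standard_gray, m):
    """gray_values: (N,) gray-scale values of the N identification points.
    standard_gray: (M,) pre-stored standard gray-scale values.
    m: number of positioning identification points (assumed to be the first m entries)."""
    # Average of the M gray-scale difference values gives the gray-scale offset
    gray_offset = (gray_values[:m].astype(np.float64) - standard_gray).mean()
    # Gray-scale calibration of all N identification points
    return np.clip(gray_values.astype(np.float64) - gray_offset, 0, 255)
```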
Optionally, the gray-scale value of each of the M positioning identification points is different.
The mobile terminal 1100 can implement each process implemented by the mobile terminal in the foregoing embodiments, and details are not repeated here to avoid repetition. The mobile terminal 1100 may determine the feature offset according to the feature information of the M positioning identification points, then perform feature calibration on the first two-dimensional code image according to the feature offset to obtain a second two-dimensional code image, and finally perform image recognition on the second two-dimensional code image obtained through the feature calibration. In this way, the influence caused by differences between display devices or by interference from the external environment can be reduced, and the recognition success rate can be improved.
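As a rough end-to-end sketch of this flow, the snippet below samples gray-scale values at known pixel coordinates of the N identification points, estimates the offset from the M positioning identification points, calibrates the image, and hands the result to OpenCV's QRCodeDetector for the recognition step. The pixel-coordinate input, the choice of decoder, and the simplification of calibrating every pixel rather than only the feature information of the N identification points are illustrative assumptions, not part of the embodiment.

```python
import cv2
import numpy as np

def recognize_two_dimensional_code(first_image, point_coords, standard_gray, m):
    """first_image: BGR image containing the first two-dimensional code image.
    point_coords: (N, 2) integer pixel coordinates (x, y) of the N identification points.
    standard_gray: (M,) pre-stored standard gray-scale values of the positioning points.
    m: number of positioning identification points (first m rows of point_coords)."""
    gray = cv2.cvtColor(first_image, cv2.COLOR_BGR2GRAY).astype(np.float64)
    # Feature information of the N identification points (here: gray-scale values)
    samples = gray[point_coords[:, 1], point_coords[:, 0]]
    # Feature offset determined from the M positioning identification points
    offset = (samples[:m] - standard_gray).mean()
    # Feature calibration yields the second two-dimensional code image
    second_image = np.clip(gray - offset, 0, 255).astype(np.uint8)
    # Image recognition on the calibrated image
    data, _, _ = cv2.QRCodeDetector().detectAndDecode(second_image)
    return data
```

Calibrating before decoding is what the method relies on: a uniform color or gray-scale shift introduced by the display device or the ambient light is removed before the decoder thresholds the code modules, which is why the recognition success rate improves.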
Preferably, an embodiment of the present invention further provides a mobile terminal, including a processor 1110, a memory 1109, and a computer program stored in the memory 1109 and executable on the processor 1110. When executed by the processor 1110, the computer program implements each process of the foregoing two-dimensional code image identification method embodiment and can achieve the same technical effect, which is not described here again to avoid repetition.
An embodiment of the present invention further provides a computer-readable storage medium on which a computer program is stored. When executed by a processor, the computer program implements each process of the foregoing two-dimensional code image identification method embodiment and can achieve the same technical effect, which is not described here again to avoid repetition. The computer-readable storage medium may be a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element preceded by the phrase "comprising a/an ..." does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present invention may be embodied in the form of a software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the method according to the embodiments of the present invention.
While the present invention has been described with reference to the embodiments shown in the drawings, the present invention is not limited to the embodiments, which are illustrative and not restrictive, and it will be apparent to those skilled in the art that various changes and modifications can be made therein without departing from the spirit and scope of the invention as defined in the appended claims.

Claims (12)

1. A two-dimensional code image recognition method is characterized by comprising the following steps:
acquiring a first two-dimensional code image;
acquiring feature information of N identification points in the first two-dimensional code image, wherein the N identification points comprise M positioning identification points;
determining a feature offset according to the feature information of the M positioning identification points;
performing feature calibration on the feature information of the N identification points according to the feature offset to generate a second two-dimensional code image;
performing image recognition on the second two-dimensional code image to obtain a recognition result;
wherein N is an integer greater than 1, M is an integer greater than 1 and M is less than N;
the obtaining of the feature information of the N identification points in the first two-dimensional code image includes:
acquiring color coordinates of N identification points in the first two-dimensional code image in a color coordinate system; or
acquiring gray-scale values of the N identification points in the first two-dimensional code image.
2. The method according to claim 1, wherein
the determining a feature offset according to the feature information of the M positioning identification points comprises:
calculating color coordinate difference values between the color coordinates of the M positioning identification points and M pre-stored standard color coordinates, respectively;
calculating an average value of the M color coordinate difference values corresponding to the M positioning identification points to obtain a color offset;
and the performing feature calibration on the feature information of the N identification points according to the feature offset to generate a second two-dimensional code image comprises:
performing color calibration on the feature information of the N identification points according to the color offset to generate the second two-dimensional code image.
3. The method according to claim 2, wherein the color value of each of the M positioning identification points is different.
4. The method according to claim 1, wherein
the determining a feature offset according to the feature information of the M positioning identification points comprises:
calculating gray-scale difference values between the gray-scale values of the M positioning identification points and M pre-stored standard gray-scale values, respectively;
calculating an average value of the M gray-scale difference values corresponding to the M positioning identification points to obtain a gray-scale offset;
and the performing feature calibration on the feature information of the N identification points according to the feature offset to generate a second two-dimensional code image comprises:
performing gray-scale calibration on the feature information of the N identification points according to the gray-scale offset to generate the second two-dimensional code image.
5. The method according to claim 4, wherein the gray-scale value of each of the M positioning identification points is different.
6. A mobile terminal, comprising:
the first acquisition module is used for acquiring a first two-dimensional code image;
the second acquisition module is used for acquiring the feature information of N identification points in the first two-dimensional code image, wherein the N identification points comprise M positioning identification points;
the determining module is used for determining a feature offset according to the feature information of the M positioning identification points;
the calibration module is used for performing feature calibration on the feature information of the N identification points according to the feature offset to generate a second two-dimensional code image;
the identification module is used for carrying out image identification on the second two-dimensional code image to obtain an identification result;
wherein N is an integer greater than 1, M is an integer greater than 1 and M is less than N;
the second obtaining module is specifically configured to obtain color coordinates of N identification points in the first two-dimensional code image in a color coordinate system;
or
The second obtaining module is specifically configured to obtain gray-scale values of the N identification points in the first two-dimensional code image.
7. The mobile terminal according to claim 6, wherein
the determining module comprises:
the first calculation submodule is used for calculating color coordinate difference values between the color coordinates of the M positioning identification points and M pre-stored standard color coordinates, respectively;
the second calculation submodule is used for calculating an average value of the M color coordinate difference values corresponding to the M positioning identification points to obtain a color offset;
the calibration module is specifically used for carrying out color calibration on the feature information of the N identification points according to the color offset to generate a second two-dimensional code image.
8. The mobile terminal according to claim 7, wherein the color value of each of the M positioning identification points is different.
9. The mobile terminal according to claim 6, wherein
the determining module comprises:
the third calculation submodule is used for calculating gray-scale difference values between the gray-scale values of the M positioning identification points and M pre-stored standard gray-scale values, respectively;
the fourth calculation submodule is used for calculating an average value of the M gray-scale difference values corresponding to the M positioning identification points to obtain a gray-scale offset;
the calibration module is specifically used for carrying out gray-scale calibration on the feature information of the N identification points according to the gray-scale offset to generate a second two-dimensional code image.
10. The mobile terminal according to claim 9, wherein the gray-scale value of each of the M positioning identification points is different.
11. A mobile terminal, characterized by comprising a processor, a memory, and a computer program stored in the memory and executable on the processor, wherein the computer program, when executed by the processor, implements the steps of the two-dimensional code image recognition method according to any one of claims 1 to 5.
12. A computer-readable storage medium, on which a computer program is stored, wherein the computer program, when executed by a processor, implements the steps of the two-dimensional code image recognition method according to any one of claims 1 to 5.
CN201711306827.5A 2017-12-11 2017-12-11 Two-dimensional code image identification method and mobile terminal Active CN107977591B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201711306827.5A CN107977591B (en) 2017-12-11 2017-12-11 Two-dimensional code image identification method and mobile terminal

Publications (2)

Publication Number Publication Date
CN107977591A CN107977591A (en) 2018-05-01
CN107977591B (en) 2020-04-28

Family

ID=62009868

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201711306827.5A Active CN107977591B (en) 2017-12-11 2017-12-11 Two-dimensional code image identification method and mobile terminal

Country Status (1)

Country Link
CN (1) CN107977591B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108846663B (en) * 2018-06-21 2020-10-30 维沃移动通信有限公司 Two-dimensional code adjusting method and device and mobile terminal
CN117560402B (en) * 2024-01-12 2024-04-19 凌锐蓝信科技(北京)有限公司 SD-WAN-based system and method for industrial security isolation and data exchange

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102043940A (en) * 2009-10-14 2011-05-04 北大方正集团有限公司 Method and device for reading two-dimensional code symbol data
CN104517089A (en) * 2013-09-29 2015-04-15 北大方正集团有限公司 Two-dimensional code decoding system and method

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6045752B2 (en) * 2014-05-14 2016-12-14 共同印刷株式会社 Two-dimensional code, two-dimensional code analysis system, and two-dimensional code creation system




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant