CN101908144A - Bar code image correction method and correction point acquisition method - Google Patents

Bar code image correction method and correction point acquisition method

Info

Publication number
CN101908144A
Authority
CN
China
Prior art keywords
module
image
bar code
coordinate
modules
Prior art date
Legal status
Granted
Application number
CN2010101898292A
Other languages
Chinese (zh)
Other versions
CN101908144B (en)
Inventor
陈瑞琳
王文敏
刘荣生
Current Assignee
Newland Digital Technology Co ltd
Original Assignee
Fujian Newland Computer Co Ltd
Priority date
Filing date
Publication date
Application filed by Fujian Newland Computer Co Ltd filed Critical Fujian Newland Computer Co Ltd
Priority to CN201010189829A priority Critical patent/CN101908144B/en
Publication of CN101908144A publication Critical patent/CN101908144A/en
Application granted granted Critical
Publication of CN101908144B publication Critical patent/CN101908144B/en
Expired - Fee Related

Landscapes

  • Image Processing (AREA)

Abstract

The invention discloses a bar code image correction method and a correction point acquisition method. The correction point acquisition method comprises the following steps: determining the module coordinates of two modules located on the same row or column of a bar code image and the image coordinates of their module centers; determining the module coordinates of the middle modules on the line connecting the module centers of the two modules and determining whether distinguishable module boundaries exist; selecting, from the middle modules, an odd number of consecutive middle modules having module boundaries at both ends, and taking the module center of the middle module located at the center of the odd consecutive middle modules as a correction point; and calculating the image coordinates of the intersection points of the connecting line of the module centers with the module boundaries at the two ends of the odd consecutive middle modules, and averaging the image coordinates of the intersection points to obtain the image coordinates of the correction point. With this method, correction points for bar code image correction can be acquired without searching for the symbol patterns of the bar code image, which makes image correction possible even when the symbol patterns are damaged.

Description

Bar code image correction method and correction point acquisition method
[technical field]
The present invention relates to an image correction method, and in particular to a bar code image correction method and a correction point acquisition method.
[background technology]
Bar code technology is an emerging technology, developed on the basis of computer and information technology, that integrates coding, printing, recognition, and data acquisition and processing. Because its recognition is fast, accurate, reliable and inexpensive, bar code technology is widely used in fields such as commerce, library management, warehousing, post and telecommunications, transportation and industrial control, and it is bound to play a major role in the gradually emerging "Internet of Things".
Bar codes in wide use today include one-dimensional bar codes and two-dimensional bar codes. A one-dimensional bar code, also called a linear bar code, is composed of a number of parallel "bar" and "space" units, and its information is expressed by the different widths and positions of the bars and spaces. A one-dimensional bar code carries information only in one direction (usually the horizontal direction) and expresses no information in the vertical direction, so its information capacity and space utilization are low, and it cannot be recognized once damaged.
A two-dimensional bar code is composed of black and white geometric figures distributed in two dimensions according to certain rules. It can express information in two dimensions, so its information capacity and space utilization are higher, and it has a certain verification capability. Two-dimensional bar codes can be divided into stacked two-dimensional bar codes and matrix two-dimensional bar codes. A stacked two-dimensional bar code is formed by stacking several rows of truncated one-dimensional bar codes; representative stacked codes include PDF417, Code 49 and Code 16K. A matrix two-dimensional bar code is composed of black and white modules distributed in a matrix according to predefined rules; representative matrix codes include Code One, Aztec, Data Matrix and the QR code.
In prior-art two-dimensional bar code recognition, the bar code image must be corrected, and this correction process usually depends on fixed symbol patterns in the bar code image. Taking the QR code as an example, detection patterns and correction patterns are provided in the QR code; the bar code image correction coefficients are determined from the detection patterns and correction patterns, and the bar code image is corrected by means of a transform formula. Consequently, if the detection patterns and correction patterns are damaged, correction becomes difficult.
[summary of the invention]
In order to solve the problem that the prior-art bar code image correction process must first search for the symbol patterns of the bar code image, the present invention provides a bar code image correction method and a correction point acquisition method. The method can acquire correction points without searching for symbol patterns, and can correct the bar code image on the basis of the correction points.
The technical solution adopted by the present invention to solve the above technical problem is to provide a bar code image correction method. The bar code image correction method comprises: a. determining the module coordinates of two first modules located on the same row or column of a bar code image and the image coordinates of their module centers; b. determining the module coordinates of the first middle modules on the line connecting the module centers of the two first modules, and determining whether distinguishable module boundaries exist; c. selecting, from the first middle modules, an odd number of consecutive first middle modules having module boundaries at both ends, and taking the module center of the first middle module located at the center of the odd consecutive first middle modules as a correction point; d. calculating the image coordinates of the first intersection points of the line connecting the module centers of the two first modules with the module boundaries at the two ends of the odd consecutive first middle modules, and averaging the image coordinates of the first intersection points to obtain the image coordinates of the correction point; e. repeating steps a-d to obtain a plurality of correction points, and correcting the bar code image by means of the plurality of correction points.
According to a preferred embodiment of the present invention, in the direction perpendicular to the line connecting the module centers of the two first modules, an odd number of consecutive second middle modules having distinguishable module boundaries at both ends are determined with the correction point as the center; the image coordinates of the module centers of two second modules located on either side of the odd consecutive second middle modules are determined; the second intersection points of the line connecting the module centers of the two second modules with the module boundaries at the two ends of the odd consecutive second middle modules are determined; and the image coordinates of the correction point are refined by averaging the image coordinates of the second intersection points.
According to a preferred embodiment of the present invention, in step e, the perspective transform coefficients are calculated by means of the following perspective transform formulas:
U=(aX+bY+c)/(gX+hY+1)
V=(dX+eY+f)/(gX+hY+1)
where U and V are the coordinates of the correction points on the bar code image, X and Y are the corresponding coordinates of the correction points on the target image, and a, b, c, d, e, f, g and h are the perspective transform coefficients.
According to a preferred embodiment of the present invention, U and V are the image coordinates of the correction points on the bar code image, and X and Y are the corresponding module coordinates of the correction points on the target image.
According to a preferred embodiment of the present invention, the corresponding module coordinates of the correction points on the target image are calculated from the module coordinates of the correction points on the bar code image.
According to a preferred embodiment of the present invention, in step e, the position on the bar code image corresponding to each target pixel of the target image is further determined from the perspective transform coefficients by means of the perspective transform formulas, and the pixel information at that corresponding position is taken as the pixel information of the target pixel.
The technical solution adopted by the present invention to solve the above technical problem also provides a correction point acquisition method for a bar code image. The correction point acquisition method comprises: a. determining the module coordinates of two first modules located on the same row or column of a bar code image and the image coordinates of their module centers; b. determining the module coordinates of the first middle modules on the line connecting the module centers of the two first modules, and determining whether distinguishable module boundaries exist; c. selecting, from the first middle modules, an odd number of consecutive first middle modules having module boundaries at both ends, and taking the module center of the first middle module located at the center of the odd consecutive first middle modules as a correction point; d. calculating the image coordinates of the first intersection points of the line connecting the module centers of the two first modules with the module boundaries at the two ends of the odd consecutive first middle modules, and averaging the image coordinates of the first intersection points to obtain the image coordinates of the correction point.
According to a preferred embodiment of the present invention, in the direction perpendicular to the line connecting the module centers of the two first modules, an odd number of consecutive second middle modules having distinguishable module boundaries at both ends are determined with the correction point as the center; the image coordinates of the module centers of two second modules located on either side of the odd consecutive second middle modules are determined; the second intersection points of the line connecting the module centers of the two second modules with the module boundaries at the two ends of the odd consecutive second middle modules are determined; and the image coordinates of the correction point are refined by averaging the image coordinates of the second intersection points.
With the above methods, correction points for bar code image correction can be acquired without searching for the symbol patterns of the bar code image, which makes image correction possible even when the symbol patterns are damaged.
[description of drawings]
Fig. 1 is a flow chart of the bar code image sampling method according to the present invention;
Fig. 2 is a schematic diagram of the mapping in the bar code image sampling process according to the present invention;
Fig. 3 is a flow chart of the first correction point acquisition method according to the present invention;
Fig. 4 is a schematic diagram of the sub-image area in the first correction point acquisition method according to the present invention;
Fig. 5 is a schematic diagram of the dynamic template in the first correction point acquisition method according to the present invention;
Fig. 6 is a flow chart of the second correction point acquisition method according to the present invention;
Fig. 7 is a partial enlarged view of the two-dimensional bar code image in the second correction point acquisition method according to the present invention.
[embodiment]
The present invention provides a bar code image sampling method based on several bar code images. The method improves the clarity of the bar code image by fusing the pixel information of two or more bar code images. In addition, the method can further fuse at least two bar code images into a super-resolution target image or a sub-resolution target image.
As shown in Figs. 1-2, the present invention provides a bar code image sampling method. In the bar code image sampling method of the present invention, two bar code images A and B are first acquired. The bar code images A and B are preferably taken of the same bar code consecutively at a predetermined time interval, so that the difference between them is relatively small, which in turn ensures the accuracy of the image sampling. After the two bar code images A and B are acquired, each of them is scanned to obtain significant positions usable for registration, for example the intersection points of the bar code image, detection patterns, correction patterns, positioning patterns, and other correction points used for bar code correction.
After the above significant positions are obtained, the mapping coefficients a1, b1, c1, ... between bar code image A and the target image C and the mapping coefficients a2, b2, c2, ... between bar code image B and the target image C are calculated. The mapping formulas and mapping coefficients between the bar code images A, B and the target image C can be obtained in several ways, for example by perspective transform, quadratic polynomial, cubic polynomial, triangular mesh or wavelet transform.
The present invention is described in detail taking the perspective transform as an example. After a plurality of significant positions (for example, correction patterns) are determined on the bar code images A and B, the coordinate information of each significant position (for example, its module coordinates and image coordinates) is obtained. The perspective transform coefficients a1, b1, c1, ... and a2, b2, c2, ... between the bar code images A, B and the target image C are calculated by means of the perspective transform formulas:
U=(aX+bY+c)/(gX+hY+1) (1)
V=(dX+eY+f)/(gX+hY+1) (2)
where U and V are the coordinates of each significant position on bar code image A or B, X and Y are the corresponding coordinates of that significant position on the target image C, and a, b, c, d, e, f, g and h are the perspective transform coefficients. In practice, U and V may be the image coordinates of each significant position on bar code image A or B, and X and Y may be the module coordinates of that significant position on the target image C, which can be derived by calculation from its module coordinates on bar code image A or B. For example, when the bar code images A, B shown in Fig. 2 have the same resolution as the target image C, the module coordinates of each significant position on bar code image A or B are directly its corresponding module coordinates on the target image C.
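Rewriting the two formulas as U(gX+hY+1) = aX+bY+c and V(gX+hY+1) = dX+eY+f gives a linear system in the eight coefficients, which can be solved from at least four point correspondences. The following Python sketch illustrates only that step; the function names and the use of numpy's least-squares solver are assumptions for illustration, not part of the patent:

```python
import numpy as np

def perspective_coefficients(target_pts, image_pts):
    """Solve for a..h in U=(aX+bY+c)/(gX+hY+1), V=(dX+eY+f)/(gX+hY+1).

    target_pts: list of (X, Y) coordinates on the target image
    image_pts:  list of (U, V) coordinates on the bar code image
    At least four non-collinear correspondences are required.
    """
    A, rhs = [], []
    for (X, Y), (U, V) in zip(target_pts, image_pts):
        # Two linear equations per correspondence.
        A.append([X, Y, 1, 0, 0, 0, -X * U, -Y * U]); rhs.append(U)
        A.append([0, 0, 0, X, Y, 1, -X * V, -Y * V]); rhs.append(V)
    # Least squares also handles more than four correspondences.
    coeffs, *_ = np.linalg.lstsq(np.asarray(A, float), np.asarray(rhs, float), rcond=None)
    return coeffs  # a, b, c, d, e, f, g, h

def map_to_image(coeffs, X, Y):
    """Map a target-image point (X, Y) to its position (U, V) on the bar code image."""
    a, b, c, d, e, f, g, h = coeffs
    w = g * X + h * Y + 1
    return (a * X + b * Y + c) / w, (d * X + e * Y + f) / w
```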
As shown in Fig. 2, after the perspective transform coefficients a1, b1, c1, ... and a2, b2, c2, ... between the bar code images A, B and the target image C have been obtained, the bar code images A and B are each mapped onto the target image C by means of the above perspective transform formulas.
In the present embodiment, the positions on bar code image A and bar code image B corresponding to a target pixel c of the target image C can be determined from the above perspective transform coefficients and perspective transform formulas. That is, from the coordinates (Xc, Yc) of the target pixel c, its corresponding coordinates (Ua, Va) on bar code image A and (Ub, Vb) on bar code image B can be obtained by the above perspective transform formulas. In the present invention, the pixel information from the bar code images A and B that corresponds to the same target pixel c of the target image C is fused. Specifically, the pixel values Ha and Hb at the positions on bar code images A and B corresponding to the target pixel c are fused into the pixel value Hc of that target pixel, for example Hc = (Ha + Hb)/2. In this way, the target image C fuses the pixel information of the bar code images A and B, which improves the clarity of the target image C. The fusion mentioned in the present invention includes but is not limited to averaging or weighted averaging.
Because the positions on bar code images A and B corresponding to the target pixel c may not coincide exactly with actual pixels of those images, that is, the corresponding coordinates (Ua, Va) and (Ub, Vb) may be non-integral, in the present invention the pixel information at a corresponding position is obtained by interpolating the pixel information of several pixels surrounding that position. For example, in the present embodiment, the pixel values H1a, H2a, H3a, H4a of the pixels 1a, 2a, 3a, 4a surrounding the corresponding coordinates (Ua, Va) are interpolated to obtain the pixel value Ha at (Ua, Va), and the pixel values H1b, H2b, H3b, H4b of the pixels 1b, 2b, 3b, 4b surrounding the corresponding coordinates (Ub, Vb) are interpolated to obtain the pixel value Hb at (Ub, Vb). Then Ha and Hb are fused to obtain the pixel value Hc of the target pixel c. Interpolation over several pixels within an image is a well-known technique and is not described further here.
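The patent leaves the interpolation scheme open; a common choice over the four surrounding pixels 1a-4a is bilinear interpolation. The sketch below assumes that choice, together with the simple averaging fusion mentioned above; the helper names are illustrative:

```python
import numpy as np

def bilinear(img, u, v):
    """Interpolate the gray image `img` at the non-integer position (u, v)
    from the four surrounding pixels (row index = v, column index = u).
    Assumes (u, v) lies at least one pixel inside the image."""
    u0, v0 = int(np.floor(u)), int(np.floor(v))
    du, dv = u - u0, v - v0
    h1, h2 = img[v0, u0],     img[v0, u0 + 1]
    h3, h4 = img[v0 + 1, u0], img[v0 + 1, u0 + 1]
    top = h1 * (1 - du) + h2 * du
    bot = h3 * (1 - du) + h4 * du
    return top * (1 - dv) + bot * dv

def fuse(img_a, img_b, ua, va, ub, vb):
    """Fuse the two interpolated values into the target pixel, e.g. Hc = (Ha + Hb)/2."""
    return 0.5 * (bilinear(img_a, ua, va) + bilinear(img_b, ub, vb))
```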
In addition to the case shown in Fig. 2 in which the bar code images A, B and the target image C have the same resolution, the bar code image sampling method of the present invention can also be applied when the resolutions of the bar code images A, B and the target image differ.
When the resolution of the target image is lower than that of the bar code images, i.e. in the case of sub-resolution sampling, the corresponding module coordinates of each significant position on the target image can first be obtained by scaling down its module coordinates on the bar code image, and the perspective transform coefficients between each bar code image and the target image are calculated by means of the above perspective transform formulas. The coordinates of each target pixel of the target image are then mapped onto each bar code image according to the perspective transform coefficients to determine its corresponding positions on the bar code images. The pixel information (for example, the pixel value) at each corresponding position is obtained by the interpolation described above, and the pixel information at the positions on the bar code images corresponding to the same target pixel is fused into the pixel information of that target pixel of the target image.
When the resolution of the target image is higher than that of the bar code images, i.e. in the case of super-resolution sampling, the corresponding module coordinates of each significant position on the target image can first be obtained by scaling up its module coordinates on the bar code image, and the perspective transform coefficients between each bar code image and the target image are calculated by means of the above perspective transform formulas. The coordinates of each target pixel of the target image are then mapped onto each bar code image according to the perspective transform coefficients to determine its corresponding positions on the bar code images. The pixel information (for example, the pixel value) at each corresponding position is obtained by the interpolation described above, and the pixel information at the positions on the bar code images corresponding to the same target pixel is fused into the pixel information of that target pixel of the target image.
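A minimal sketch of the coordinate scaling implied by the two cases above; the single uniform scale factor and the function name are assumptions for illustration:

```python
def target_module_coords(module_coords, scale):
    """Scale the module coordinates of the significant positions to the target resolution.
    scale < 1 corresponds to sub-resolution sampling, scale > 1 to super-resolution
    sampling, and scale == 1 leaves the coordinates unchanged (the case of Fig. 2)."""
    return [(x * scale, y * scale) for (x, y) in module_coords]
```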
With the above image sampling method, pixel information from at least two images can be fused into the same target image, which improves the clarity of the target image. The above bar code image sampling method can furthermore be used to obtain a super-resolution or sub-resolution target image.
As shown in Figs. 3-5, the present invention further provides a method of acquiring correction points. The method uses a dynamic template to acquire correction points in the bar code image. Besides being usable in the above bar code sampling method, these correction points can also be used to correct a single bar code image.
As shown in Fig. 3, in this correction point acquisition method a two-dimensional bar code image is first acquired by an imaging system. The two-dimensional bar code image is preferably a gray-scale image, or a color image acquired by the imaging system and converted into a gray-scale image by a preprocessing unit. The two-dimensional bar code is composed of a number of black and white modules arranged in a matrix. In an actually captured gray-scale image, each black or white module is made up of several pixels, and the value of each pixel is not pure black or pure white but has some gray level. Moreover, owing to the imaging system itself, the shooting technique and the environment, the captured gray-scale image is distorted with respect to the original two-dimensional bar code, i.e. there is a certain transform relation between them. In the prior art, the image coordinates of the correction patterns are found by searching the captured gray-scale image, the transform coefficients (correction coefficients) of the captured gray-scale image with respect to the target image are calculated from those image coordinates, and the captured image is mapped onto the target image by means of the correction coefficients, thereby correcting the captured image.
However, in the correction point acquisition method of the present invention, after the gray-scale image of the two-dimensional bar code is obtained, the correction patterns are not searched for directly; instead, the module coordinates of each module in a sub-image area of the two-dimensional bar code gray-scale image and the image coordinates of its module center are determined. For example, as shown in Fig. 3 and taking the QR code as an example, the boundary and the module width coefficients are obtained by searching for the detection patterns, and the module coordinates of each module in the sub-image area of the gray-scale image and the image coordinates of its module center are calculated by known methods. In the present invention, module coordinates are coordinates measured in modules, i.e. the number of modules between a given module and the coordinate origin along each axis, while image coordinates are coordinates measured in pixels, i.e. the number of pixels between a given pixel and the coordinate origin along each axis.
As shown in Fig. 4, after the module coordinates of each module in the sub-image area and the image coordinates of its module center have been obtained, the gray value at each module center is binarized with a threshold to judge whether that module center is black or white, i.e. to obtain the black/white value of each module center. In this step, the threshold selection and the binarization may use known global-threshold or local-threshold selection and binarization methods, which are not described further here.
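A minimal sketch of this binarization step, assuming the threshold has already been chosen by one of the known global or local methods; the function name and the 0/255 convention are illustrative assumptions:

```python
def module_center_values(gray, centers, threshold):
    """Assign each module a black/white value by thresholding the gray value
    at its module center (u, v).  `threshold` may be a global or a local value."""
    return [0 if gray[int(round(v)), int(round(u))] < threshold else 255
            for (u, v) in centers]
```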
After the black/white value of each module center has been determined, the module centers of some of the modules in the sub-image area are chosen as feature points to form a dynamic template. Specifically, the dynamic template is generally selected on the basis of modules that are distinctive with respect to their neighbors, i.e. modules that are relatively easy to distinguish from the adjacent modules are chosen to form the dynamic template. For example, as shown in Fig. 4, in the present embodiment the module centers of the modules 1, 2, 3, 4, 5, 6, which are relatively easy to distinguish from their neighbors, are chosen as feature points to form the dynamic template shown in Fig. 5. As shown in Fig. 5, each feature point of the dynamic template has three parameters: the module coordinates, the image coordinates and the black/white value of the corresponding module, namely (X1, Y1), (U1, V1), H1; (X2, Y2), (U2, V2), H2; (X3, Y3), (U3, V3), H3; (X4, Y4), (U4, V4), H4; (X5, Y5), (U5, V5), H5; (X6, Y6), (U6, V6), H6. After the dynamic template has been formed, its validity can further be checked. For example, the module coordinates (X1, Y1), (X2, Y2), (X3, Y3), (X4, Y4), (X5, Y5), (X6, Y6) of the feature points are all translated by a uniform vector, for example by one or two modules, and the black/white values H1, H2, H3, H4, H5, H6 of the feature points are matched against the black/white values of the module centers at the translated module coordinates. If the degree of matching is above a predetermined threshold, the dynamic template is considered hard to distinguish from the surrounding modules and is an invalid template; if the degree of matching is below the predetermined threshold, the dynamic template is considered distinguishable from the surrounding modules and is a valid template.
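A sketch of the validity check described above. The particular shift vectors and the matching-degree threshold are assumptions; the patent only requires a translation of one or two modules and a predetermined threshold:

```python
def template_is_valid(template, bw_of_module,
                      shifts=((1, 0), (0, 1), (2, 0), (0, 2)), max_match=0.5):
    """Translate the whole template by a uniform module vector and compare its
    black/white values with those of the shifted modules.  If any shift matches
    too well, the template is not distinctive enough and is rejected.

    template: list of ((X, Y), (U, V), H) feature points
    bw_of_module: function (X, Y) -> black/white value of that module's center
    """
    for dx, dy in shifts:
        hits = sum(1 for (X, Y), _, H in template if bw_of_module(X + dx, Y + dy) == H)
        if hits / len(template) > max_match:
            return False  # too similar to neighbouring modules
    return True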
After the dynamic template has been determined, it is translated relative to the gray-scale image and gray-scale matching is performed to find the best match position between the dynamic template and the gray-scale image. Specifically, the image coordinates (U1, V1), (U2, V2), (U3, V3), (U4, V4), (U5, V5), (U6, V6) of the feature points are all translated by a uniform vector, for example increased or decreased by one pixel at a time, the black/white values H1, H2, H3, H4, H5, H6 of the feature points are matched against the gray values of the gray-scale image at the translated image coordinates, and the position with the highest degree of matching is taken as the best match position. In the present embodiment, a known matching method may be used to determine the degree of matching between the dynamic template and the gray-scale image. Alternatively, the black/white values H1, H2, H3, H4, H5, H6 of the feature points may first be inverted, the image coordinates (U1, V1), (U2, V2), (U3, V3), (U4, V4), (U5, V5), (U6, V6) translated by a uniform vector, and the inverted black/white values matched against the gray values at the translated image coordinates; in that case the position with the lowest degree of matching is the best match position.
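A sketch of the best-match search. The patent refers to known matching methods, so the search window and the agreement measure used here are assumptions made for illustration:

```python
def best_match_offset(gray, template, search=3):
    """Translate the template's image coordinates by a uniform pixel vector and
    score how well its black/white values agree with the gray image; the offset
    with the highest score is the best match position.  The +/- `search` pixel
    window assumes the template already lies well inside the image."""
    best, best_score = (0, 0), -1.0
    for dv in range(-search, search + 1):
        for du in range(-search, search + 1):
            score = 0.0
            for _, (u, v), H in template:
                g = gray[int(round(v)) + dv, int(round(u)) + du]
                # agreement between the binary template value and the gray value
                score += (255 - abs(float(g) - H)) / 255.0
            score /= len(template)
            if score > best_score:
                best_score, best = score, (du, dv)
    return best, best_score
```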
After the best match position has been determined, one feature point of the dynamic template is selected as the correction point, preferably the feature point nearest to the center of the dynamic template. Besides serving as a significant position in the bar code image sampling method based on several bar code images described above, this correction point can also be used to correct a single image.
Specifically, a plurality of correction points can be obtained by repeating the above steps. The mapping coefficients between the bar code image and the target image are then determined by methods such as perspective transform, quadratic polynomial, cubic polynomial or triangular mesh, and the gray-scale image is mapped onto the target image, thereby correcting the image.
Taking the perspective transform as an example, the detailed process of correcting a single bar code image with the above correction points according to the present invention is explained below. The module coordinates (Xa, Ya), (Xb, Yb), (Xc, Yc), (Xd, Yd) and image coordinates (Ua, Va), (Ub, Vb), (Uc, Vc), (Ud, Vd) corresponding to four correction points are obtained by the above method.
From the module coordinates (Xa, Ya), (Xb, Yb), (Xc, Yc), (Xd, Yd) and the resolutions of the original gray-scale image and the target image, the corresponding module coordinates of these correction points on the target image are determined. For example, when the resolutions of the original gray-scale image and the target image are the same, the corresponding module coordinates of these correction points on the target image are simply (Xa, Ya), (Xb, Yb), (Xc, Yc), (Xd, Yd).
The perspective transform coefficients a, b, c, d, e, f, g, h between the original gray-scale image and the target image are then calculated by means of the perspective transform formulas:
U=(aX+bY+c)/(gX+hY+1) (1)
V=(dX+eY+f)/(gX+hY+1) (2)
Once the perspective transform coefficients a, b, c, d, e, f, g, h have been obtained, the position on the original gray-scale image corresponding to each pixel of the target image can be determined from the above formulas, the pixel information at that corresponding position is obtained by the interpolation described above and taken as the pixel information of that pixel of the target image, and the image is thereby corrected.
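Putting the pieces together, a sketch of this single-image correction loop; it reuses map_to_image and bilinear from the illustrative sketches above, and the rounding and bounds check are assumptions:

```python
import numpy as np

def correct_image(gray, coeffs, target_shape):
    """Build the corrected target image: for every target pixel (X, Y), find its
    position (U, V) on the original gray-scale image via the perspective formulas
    and take the interpolated value there."""
    h, w = target_shape
    out = np.zeros((h, w), dtype=np.uint8)
    for Y in range(h):
        for X in range(w):
            U, V = map_to_image(coeffs, X, Y)
            if 0 <= U < gray.shape[1] - 1 and 0 <= V < gray.shape[0] - 1:
                out[Y, X] = int(round(bilinear(gray, U, V)))
    return out
```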
With the above method, correction points for bar code image sampling or bar code image correction can be acquired without searching for symbol patterns of the bar code image such as the correction patterns, which makes bar code image sampling and bar code image correction possible even when the symbol patterns are damaged.
As shown in Figs. 6-7, the present invention further provides another method of acquiring correction points. Besides serving as significant positions in the above bar code sampling method, these correction points can also be used to correct a single bar code image.
In the present embodiment, after the two-dimensional bar code image has been obtained, the module coordinates (X1, Y1), (X2, Y2) and the module-center image coordinates (U1, V1), (U2, V2) of two spaced-apart modules 1 and 2 located on the same row or column of the two-dimensional bar code image are searched for and determined. For example, taking the QR code as an example, the boundary and the module width coefficients are obtained by searching for the detection patterns, and the module coordinates (X1, Y1), (X2, Y2) and module-center image coordinates (U1, V1), (U2, V2) of two spaced-apart modules 1 and 2 on the same row or column are searched for and determined by known methods. In the present invention, module coordinates are coordinates measured in modules, i.e. the number of modules between a given module and the coordinate origin along each axis, while image coordinates are coordinates measured in pixels, i.e. the number of pixels between a given pixel and the coordinate origin along each axis. In the present embodiment, the two-dimensional bar code image is not limited to a gray-scale image and may be a binary image or a color image.
After the module coordinates (X1, Y1), (X2, Y2) of the modules 1 and 2 and the image coordinates (U1, V1), (U2, V2) of their module centers have been determined, the module coordinates (X3, Y3), (X4, Y4), (X5, Y5), ... of the middle modules on the line connecting the module centers of the modules 1 and 2 are further determined, and it is determined whether distinguishable module boundaries (for example, black/white boundaries) exist on this connecting line. From these middle modules, an odd number of consecutive middle modules having module boundaries at both ends is selected, for example the modules 3, 4, 5 in the present embodiment, and the module center of the middle module 5 located at the center of the odd consecutive middle modules 3, 4, 5 is taken as the correction point. The image coordinates (U3, V3), (U4, V4) of the intersection points of the connecting line of the module centers with the module boundaries at the two ends of the modules 3, 4, 5 are then determined, and the image coordinates (U3, V3), (U4, V4) of these two intersection points are averaged; the result is the image coordinate (U5, V5) of the correction point.
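A sketch of how the image coordinate of such a correction point could be computed once the boundary intersections along the connecting line have been found. Parameterising the boundaries by a fraction t along the segment, and taking the first and last detected boundary as the two ends of the odd run, are assumptions made for illustration:

```python
def correction_point(p1, p2, boundary_ts):
    """p1, p2: image coordinates (u, v) of the centers of modules 1 and 2.
    boundary_ts: increasing parameters t in (0, 1) along the segment p1->p2 at
    which distinguishable module boundaries (black/white transitions) were found.
    The odd run of middle modules is assumed to be bounded by the first and last
    detected boundaries; the correction point's image coordinate is the average
    of the two intersection points."""
    def lerp(t):
        return (p1[0] + t * (p2[0] - p1[0]), p1[1] + t * (p2[1] - p1[1]))
    t_a, t_b = boundary_ts[0], boundary_ts[-1]   # boundaries at the two ends of the run
    pa, pb = lerp(t_a), lerp(t_b)
    return ((pa[0] + pb[0]) / 2.0, (pa[1] + pb[1]) / 2.0)
```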
In addition, the image coordinate (U5, V5) of the module center of the middle module 5 can be further refined in the direction perpendicular to the line connecting the module centers of the modules 1 and 2. Specifically, in the perpendicular direction an odd number of consecutive middle modules having distinguishable module boundaries at both ends is determined with the middle module 5 as the center, and the image coordinates of the module centers of the two modules located on either side of these odd consecutive middle modules are determined. The intersection points of the line connecting the module centers of these two modules with the module boundaries at the two ends of the odd consecutive middle modules are then determined, and their image coordinates are averaged to obtain a second estimate of the image coordinate of the module center of the middle module 5. This estimate is then combined with the image coordinate of the module center of the middle module 5 obtained earlier along the line connecting the module centers of the modules 1 and 2, thereby further refining the image coordinate of the module center of the middle module 5.
A plurality of correction points can then be determined by repeating the above process; in accordance with the correction process described in the preceding embodiment, the mapping coefficients between the two-dimensional bar code image and the target image are calculated from the module coordinates and image coordinates of the correction points, and the two-dimensional bar code image is corrected by means of these mapping coefficients.
The above embodiments describe the present invention only by way of example; after reading the present application, those skilled in the art may make various modifications to the present invention without departing from its spirit and scope.

Claims (8)

1. A bar code image correction method, characterized in that the bar code image correction method comprises:
a. determining the module coordinates of two first modules located on the same row or column of a bar code image and the image coordinates of their module centers;
b. determining the module coordinates of the first middle modules on the line connecting the module centers of said two first modules, and determining whether distinguishable module boundaries exist;
c. selecting, from said first middle modules, an odd number of consecutive first middle modules having module boundaries at both ends, and taking the module center of the first middle module located at the center of said odd consecutive first middle modules as a correction point;
d. calculating the image coordinates of the first intersection points of the line connecting the module centers of said two first modules with the module boundaries at the two ends of said odd consecutive first middle modules, and averaging the image coordinates of said first intersection points to obtain the image coordinates of said correction point;
e. repeating steps a-d to obtain a plurality of correction points, and correcting said bar code image by means of said plurality of correction points.
2. The bar code image correction method according to claim 1, characterized in that, in the direction perpendicular to the line connecting the module centers of said two first modules, an odd number of consecutive second middle modules having distinguishable module boundaries at both ends are determined with said correction point as the center; the image coordinates of the module centers of two second modules located on either side of said odd consecutive second middle modules are determined; the second intersection points of the line connecting the module centers of said two second modules with the module boundaries at the two ends of said odd consecutive second middle modules are determined; and the image coordinates of said correction point are refined by averaging the image coordinates of said second intersection points.
3. The bar code image correction method according to claim 1, characterized in that, in said step e, the perspective transform coefficients are calculated by means of the following perspective transform formulas:
U=(aX+bY+c)/(gX+hY+1)
V=(dX+eY+f)/(gX+hY+1)
wherein U and V are the coordinates of said plurality of correction points on said bar code image, X and Y are the corresponding coordinates of said plurality of correction points on a target image, and a, b, c, d, e, f, g and h are the perspective transform coefficients.
4. The bar code image correction method according to claim 3, characterized in that U and V are the image coordinates of said plurality of correction points on said bar code image, and X and Y are the corresponding module coordinates of said plurality of correction points on said target image.
5. The bar code image correction method according to claim 4, characterized in that the corresponding module coordinates of said plurality of correction points on said target image are calculated from the module coordinates of said plurality of correction points on said bar code image.
6. The bar code image correction method according to claim 3, characterized in that, in said step e, the position on said bar code image corresponding to each target pixel of said target image is further determined from said perspective transform coefficients by means of said perspective transform formulas, and the pixel information at said corresponding position is taken as the pixel information of said target pixel.
7. A correction point acquisition method for a bar code image, characterized in that the correction point acquisition method comprises:
a. determining the module coordinates of two first modules located on the same row or column of a bar code image and the image coordinates of their module centers;
b. determining the module coordinates of the first middle modules on the line connecting the module centers of said two first modules, and determining whether distinguishable module boundaries exist;
c. selecting, from said first middle modules, an odd number of consecutive first middle modules having module boundaries at both ends, and taking the module center of the first middle module located at the center of said odd consecutive first middle modules as a correction point;
d. calculating the image coordinates of the first intersection points of the line connecting the module centers of said two first modules with the module boundaries at the two ends of said odd consecutive first middle modules, and averaging the image coordinates of said first intersection points to obtain the image coordinates of said correction point.
8. The correction point acquisition method according to claim 7, characterized in that, in the direction perpendicular to the line connecting the module centers of said two first modules, an odd number of consecutive second middle modules having distinguishable module boundaries at both ends are determined with said correction point as the center; the image coordinates of the module centers of two second modules located on either side of said odd consecutive second middle modules are determined; the second intersection points of the line connecting the module centers of said two second modules with the module boundaries at the two ends of said odd consecutive second middle modules are determined; and the image coordinates of said correction point are refined by averaging the image coordinates of said second intersection points.
CN201010189829A 2010-06-01 2010-06-01 Bar code image correction method and correction point acquisition method Expired - Fee Related CN101908144B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201010189829A CN101908144B (en) 2010-06-01 2010-06-01 Bar code image correction method and correction point acquisition method

Publications (2)

Publication Number Publication Date
CN101908144A true CN101908144A (en) 2010-12-08
CN101908144B CN101908144B (en) 2012-10-03

Family

ID=43263599

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201010189829A Expired - Fee Related CN101908144B (en) 2010-06-01 2010-06-01 Bar code image correction method and correction point acquisition method

Country Status (1)

Country Link
CN (1) CN101908144B (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9892300B2 (en) 2012-11-13 2018-02-13 Kyodo Printing Co., Ltd. Two-dimensional code
CN105930840A (en) * 2016-04-18 2016-09-07 河海大学常州校区 Medicine bar code defect detection method based on statistical analysis
CN105930840B (en) * 2016-04-18 2019-04-23 河海大学常州校区 Drug bar code defect inspection method based on statistical analysis
CN109508573A (en) * 2018-11-12 2019-03-22 上海商米科技有限公司 The coding/decoding method and device of two dimensional code
CN109508573B (en) * 2018-11-12 2020-10-30 上海商米科技集团股份有限公司 Two-dimensional code decoding method and device

Also Published As

Publication number Publication date
CN101908144B (en) 2012-10-03


Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
CP03 Change of name, title or address

Address after: 350015 No. 1 Rujiangxi Road, Mawei District, Fuzhou City, Fujian Province

Patentee after: NEWLAND DIGITAL TECHNOLOGY Co.,Ltd.

Address before: 350015 New Continental Science Park No. 1 Rujiangxi Road, Mawei District, Fuzhou City, Fujian Province

Patentee before: Fujian Newland Computer Co.,Ltd.

CP03 Change of name, title or address
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20121003

CF01 Termination of patent right due to non-payment of annual fee