CN102682266B - Cylindrical surface bidimensional bar code reading method based on image splicing - Google Patents

Cylindrical surface bidimensional bar code reading method based on image splicing

Info

Publication number
CN102682266B
CN102682266B (application CN201210152638.8A)
Authority
CN
China
Prior art keywords
image
bar code
nmvtemp
sigma
width
Prior art date
Legal status
Expired - Fee Related
Application number
CN201210152638.8A
Other languages
Chinese (zh)
Other versions
CN102682266A (en)
Inventor
何卫平
林清松
雷蕾
王伟
刘涛
Current Assignee
Northwestern Polytechnical University
Original Assignee
Northwestern Polytechnical University
Priority date
Filing date
Publication date
Application filed by Northwestern Polytechnical University filed Critical Northwestern Polytechnical University
Priority to CN201210152638.8A priority Critical patent/CN102682266B/en
Publication of CN102682266A publication Critical patent/CN102682266A/en
Application granted granted Critical
Publication of CN102682266B publication Critical patent/CN102682266B/en
Legal status: Expired - Fee Related

Landscapes

  • Image Analysis (AREA)
  • Image Processing (AREA)

Abstract

The invention provides a cylindrical surface bidimensional bar code reading method based on image splicing. The method comprises the following steps: acquiring a group of bidimensional bar code images by rotation, correcting non-uniform image illumination, enhancing the edge information of the bar code, identifying the bar code images and the bar code position, classifying the bar code modules, coarsely registering the bar code images, finely registering the bar code images, splicing the images, and recognizing the bar code. The method changes the single-image recognition principle of conventional bidimensional bar code recognition systems, so that the information of a cylindrical-surface bar code can be acquired completely and problems such as distortion and non-uniform illumination of cylindrical-surface bar codes are eliminated. Designed around the characteristics of Data Matrix bidimensional bar code images, the splicing and fusion algorithm effectively solves the outstanding problems of low accuracy and low efficiency in splicing bidimensional bar code images, so that cylindrical-surface bidimensional bar code information can be read quickly and accurately.

Description

A cylindrical-surface two-dimensional bar code reading method based on image splicing
Technical field
The present invention relates to the technical field of cylindrical two-dimensional bar code recognition, and specifically to a cylindrical-surface two-dimensional bar code reading method based on image splicing.
Background technology
Quick and accurate recognition of two-dimensional bar codes directly marked on product surfaces is the basis of product lifecycle management and information tracking, and is the key to improving inventory management efficiency and realizing production-process information collection and real-time tracing. At present the two-dimensional Data Matrix bar code is mostly chosen as the permanent identification of products, because the Data Matrix bar code has large encoding capacity, high density and strong error correction capability.
Current two-dimensional bar code reading methods all use a CCD camera to acquire a single image containing the two-dimensional bar code, perform a series of processing steps on the image to remove the background and locate the bar code region, and then extract the bar code data. Existing two-dimensional bar code recognition systems can only handle two-dimensional bar code images that are complete, taken in a flat state, with little deformation and uniform illumination, for example the hand-held Dataman 7500 produced by Cognex in the United States and the MATRIX2000 code reader produced in Germany. In practice, however, cylindrical products are often encountered, and because a two-dimensional bar code marked on a cylinder differs greatly from a planar bar code, recognition becomes difficult, specifically as follows:
1. A two-dimensional bar code marked on a cylinder is distorted by the cylindrical surface that carries it, so the dimensional proportions of the bar code change.
2. A two-dimensional bar code image acquired on a cylinder generally exhibits non-uniform illumination to some degree. If the cylinder is smooth, especially in the case of a metal cylinder, serious highlighted reflection forms and information is completely lost.
3. For a single two-dimensional bar code image acquired on a cylinder, occlusion by the cylinder or an improper acquisition angle easily makes the collected information incomplete.
4. The above typical differences become more obvious when the diameter of the cylindrical product is smaller and the scale of the two-dimensional bar code marked on the cylinder is larger.
These typical differences make recognition of two-dimensional bar codes on cylinders more difficult, or even impossible. Existing mature two-dimensional bar code reading devices generally have a low recognition rate when reading cylindrical two-dimensional bar codes, which seriously affects recognition efficiency and limits the application of two-dimensional bar code technology in the identification and tracing of cylindrical products. In existing research, two kinds of measures are generally taken to increase the recognition efficiency and accuracy of cylindrical two-dimensional bar codes. The first is to enhance processing in the marking process or take protective measures during use; for example, the article entitled "Research on process parameter optimization of laser direct marking of two-dimensional bar codes on part surfaces" published by Xie Zhifeng et al. in China Mechanical Engineering, issue 05, 2011, introduces how to optimize parameters to improve marking quality. These measures aim at improving and safeguarding the contrast and quality of two-dimensional bar codes on metal surfaces; they do not essentially solve the difficulty of reading cylindrical bar codes, so the effect obtained is poor. The second is hardware-assisted image fusion, but existing image fusion methods do not consider the particular characteristics of two-dimensional bar code images; applied directly to the fusion of cylindrical two-dimensional bar code images, they are inefficient and error-prone. For example, the patent with application number 201110100489.6, entitled "Reading device and reading method for two-dimensional bar codes directly marked on metal cylinders", realizes rotating acquisition of cylindrical images at the hardware level to eliminate highlighted reflection, but its software processing is idealized and cannot be put to practical use.
Summary of the invention
Technical problem to be solved
To solve the problems of the prior art, the present invention proposes a cylindrical-surface two-dimensional bar code reading method based on image splicing. By studying the characteristics of Data Matrix bar code images marked on cylinders, a brand-new image splicing method is designed to solve the difficult problems of non-uniform illumination, cylinder distortion and incomplete information acquisition in the recognition of cylindrical two-dimensional bar codes, so as to acquire a high-quality two-dimensional bar code image quickly and accurately, realize the complete acquisition of cylindrical bar code information, and improve the recognition efficiency of cylindrical two-dimensional bar codes.
Technical scheme
The technical scheme of the present invention is as follows:
The cylindrical-surface two-dimensional bar code reading method based on image splicing is characterized by comprising the following steps:
Step 1: continuously acquire N images of the two-dimensional bar code, denoted {Mvtemp^n}_{n=0}^{N-1}; together the N images contain the complete information of the two-dimensional bar code. Each image has width w and height h, and pix^n_{i,j} denotes the pixel value of the pixel in column i, row j of image Mvtemp^n;
Step 2: correct the non-uniform illumination of the images:
Step (2-1): arbitrarily choose one image Mvtemp containing bar code information from the sequence, and traverse upward from the middle row h/2 of image Mvtemp, computing the longitudinal gradient:
grad^Mvtemp_j = \sum_{i=0}^{w} ( pix^Mvtemp_{i,j} - pix^Mvtemp_{i,j+1} ),  j ∈ (h/2, h)
where grad^Mvtemp_j denotes the longitudinal gradient value of row j in image Mvtemp; the maximum longitudinal gradient value is taken at row y_up;
Step (2-2): compute the illuminance array of the background region in image Mvtemp:
I^Mvtemp_i = (1/20) \sum_{j=y_up}^{y_up+β} pix^Mvtemp_{i,j},  i ∈ [0, w)
where the background region of image Mvtemp refers to the region from column 0 to column w-1 and from row y_up to row y_up+β, with β taken between 10 and h-y_up; I^Mvtemp_i denotes the illuminance of column i of the background region of image Mvtemp;
Step (2-3): compute the average illuminance of the background region in image Mvtemp:
\bar I^Mvtemp = 1/((β+1)w) \sum_{i=0}^{w-1} \sum_{j=y_up}^{y_up+β} pix^Mvtemp_{i,j}
Step (2-4): reversely correct the non-uniform illumination of every image in the sequence {Mvtemp^n}_{n=0}^{N-1}:
pix^n_{i,j} = pix^n_{i,j} · \bar I^Mvtemp / I^Mvtemp_i,  n ∈ [0, N-1], i ∈ [0, w-1], j ∈ [0, h-1]
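As an illustration of step 2, the following minimal Python sketch (assuming NumPy, grayscale images of identical size, and an assumed helper name correct_illumination with an assumed strip height beta) corrects each column by the ratio of the average background illuminance to that column's illuminance; the absolute-valued row gradient used to find y_up is a simplification of the signed gradient above.

import numpy as np

def correct_illumination(images, beta=19):
    # Column-wise illumination correction sketch for step 2.
    ref = images[0].astype(np.float64)              # any image containing the bar code
    h, w = ref.shape
    # absolute row-to-row differences, summed over columns (simplified longitudinal gradient)
    grad = np.abs(np.diff(ref, axis=0)).sum(axis=1)
    y_up = h // 2 + int(np.argmax(grad[h // 2:]))   # row of maximum gradient in the upper half
    strip = ref[y_up:y_up + beta + 1, :]            # background strip below y_up
    col_illum = strip.mean(axis=0)                  # per-column illuminance I_i
    mean_illum = strip.mean()                       # average background illuminance
    gain = mean_illum / np.maximum(col_illum, 1e-6) # reverse-correction gain per column
    return [np.clip(img.astype(np.float64) * gain, 0, 255) for img in images]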
Step 3: apply the Roberts operator to the image sequence {Mvtemp^n}_{n=0}^{N-1} to extract edge contour information, then add the edge strength information back into the original images to enhance the bar code edge information;
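A minimal sketch of step 3, assuming NumPy and SciPy; the function name enhance_edges and the weighting factor gain are assumptions of the sketch, since the text does not specify how strongly the edge strength is superimposed.

import numpy as np
from scipy.ndimage import convolve

def enhance_edges(img, gain=1.0):
    # Roberts-cross edge extraction, edge strength added back onto the original image.
    img = img.astype(np.float64)
    k1 = np.array([[1.0, 0.0], [0.0, -1.0]])   # Roberts diagonal kernels
    k2 = np.array([[0.0, 1.0], [-1.0, 0.0]])
    gx = convolve(img, k1, mode="nearest")
    gy = convolve(img, k2, mode="nearest")
    edge = np.abs(gx) + np.abs(gy)             # edge strength
    return np.clip(img + gain * edge, 0, 255)  # superimpose edges on the original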
Step 4: identify the bar code images and the bar code position:
Step (4-1): compute the transverse projection data avg^n_j of every image in the sequence {Mvtemp^n}_{n=0}^{N-1}:
avg^n_j = (1/w) \sum_{i=0}^{w} pix^n_{i,j},  j ∈ [0, h)
For the n-th image this gives one group of transverse projection data {avg^n_j}; compute the minimum value avg^n_min of {avg^n_j};
Step (4-2): translate the transverse projection data {avg^n_j} downward as a whole by avg^n_min; compute the maximum value and the mean value of the translated transverse projection data; apply weighted mean filtering and median filtering to the translated transverse projection data, where the weighted mean filter template is
(1/9) × [1 2 3 2 1]
and the median filter uses a 5 × 1 sliding window; apply threshold segmentation to the filtered transverse projection data according to the segmentation function;
Step (4-3): perform data fitting on the transverse projection data processed in step (4-2); the fitting function has the form
y = a_1 (0 ≤ x < x_1); b_1 (x_1 ≤ x ≤ h/2); a_2 (h/2 < x ≤ x_2); b_2 (x_2 < x < h)
where a_1, b_1, a_2, b_2 are the fitting variables of the piecewise linear fitting function and x_1, x_2 are its segmentation points; using least-squares fitting, the fitting errors are
\hat S^2_min(x_1^n) = \sum_{i=0}^{h/2} (avg^n_i)^2 - ( \sum_{i=0}^{x_1^n-1} avg^n_i )^2 / x_1^n - ( \sum_{i=x_1^n}^{h/2} avg^n_i )^2 / (h/2 - x_1^n + 1)
\hat S^2_min(x_2^n) = \sum_{i=h/2+1}^{h-1} (avg^n_i)^2 - ( \sum_{i=h/2+1}^{x_2^n} avg^n_i )^2 / (x_2^n - h/2) - ( \sum_{i=x_2^n+1}^{h-1} avg^n_i )^2 / (h - x_2^n - 1)
For the n-th image, compute the value x_1^n at which the fitting error \hat S^2_min(x_1^n) takes its minimum and let y_d^n = x_1^n; compute the value x_2^n at which the fitting error \hat S^2_min(x_2^n) takes its minimum and let y_u^n = x_2^n;
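A minimal Python sketch of the breakpoint search in step (4-3), assuming NumPy; the helper name best_breakpoint and the exact index bounds are assumptions of the sketch. The error expression is the residual of fitting each part of the projection with its own constant (its mean), which is what the least-squares formulas above reduce to.

import numpy as np

def best_breakpoint(avg, lo, hi):
    # Two-segment constant least-squares fit: find the breakpoint minimizing the error.
    seg = np.asarray(avg[lo:hi + 1], dtype=np.float64)
    total_sq = np.sum(seg ** 2)
    best_x, best_err = lo + 1, np.inf
    for x in range(1, len(seg)):                      # candidate split point inside the segment
        left, right = seg[:x], seg[x:]
        err = total_sq - left.sum() ** 2 / len(left) - right.sum() ** 2 / len(right)
        if err < best_err:
            best_err, best_x = err, lo + x
    return best_x, best_err

# usage sketch: y_d from the lower half, y_u from the upper half of the projection
# y_d, _ = best_breakpoint(avg_n, 0, h // 2)
# y_u, _ = best_breakpoint(avg_n, h // 2 + 1, h - 1)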
Step (4-4): repeat step (4-1)~step (4-3) to obtain y_d^n and y_u^n for every image of the sequence {Mvtemp^n}_{n=0}^{N-1}, thereby obtaining the arrays {y_d^n} and {y_u^n};
Step (4-5): for every image of the sequence, compute the longitudinal gradient value at its own y_d^n position:
grad^n_{y_d^n} = \sum_{i=0}^{w-1} | pix^n_{i,y_d^n} - pix^n_{i,y_d^n+1} |
thereby obtaining the array {grad^n_{y_d^n}}; compute the mean value of this array; starting from the entry at n = 0, compare the entries of the array one by one with the mean value until the entry at n = η is greater than the mean value; starting from the entry at n = N-1, compare the entries one by one in the reverse direction with the mean value until the entry at n = κ is greater than the mean value; in the image sequence {Mvtemp^n}_{n=0}^{N-1}, the images Mvtemp^η ~ Mvtemp^κ are the bar code images;
Step (4-6): compute the mean value y_s of the (y_u^n - y_d^n) values over all bar code images; take any one of the bar code images and compute the longitudinal gradient values at its y_d^n position and at its y_u^n position; if the longitudinal gradient value at y_d^n is greater than that at y_u^n, crop all bar code images from their respective y_d^n positions and retain the region from y_d^n to y_d^n + y_s as the new bar code image; if the longitudinal gradient value at y_u^n is greater than that at y_d^n, crop all bar code images from their respective y_u^n positions and retain the region from y_u^n - y_s to y_u^n as the new bar code image; save the newly obtained bar code images again in their original order as {NMvtemp^n}_{n=0}^{M-1}; the image width w is unchanged and the height is y_s;
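A minimal sketch of the localization pipeline of step 4, assuming NumPy and SciPy and reusing the best_breakpoint helper from the previous sketch; the function name locate_barcode_rows is an assumption, while the filter template and window follow the text.

import numpy as np
from scipy.signal import medfilt

def locate_barcode_rows(img):
    # Transverse projection, smoothing, and piecewise fit to find the bar code row bounds.
    img = img.astype(np.float64)
    h, w = img.shape
    avg = img.mean(axis=1)                           # transverse projection avg_j
    avg = avg - avg.min()                            # shift down by the minimum
    kernel = np.array([1, 2, 3, 2, 1], dtype=np.float64) / 9.0
    avg = np.convolve(avg, kernel, mode="same")      # weighted mean filter (1/9)[1 2 3 2 1]
    avg = medfilt(avg, kernel_size=5)                # 5x1 median filter
    y_d, _ = best_breakpoint(avg, 0, h // 2)         # lower breakpoint (step 4-3 sketch)
    y_u, _ = best_breakpoint(avg, h // 2 + 1, h - 1) # upper breakpoint
    return y_d, y_u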
Step 5: determine the division mode of the bar code module:
Step (5-1): arrange {NMvtemp^n}_{n=0}^{M-1} transversely in order, align them longitudinally, and merge them into one image NMvtemp of height y_s and width Mw; compute the longitudinal gradient projection of image NMvtemp:
ygrad^NMvtemp_j = \sum_{i=0}^{Mw-1} | pix^NMvtemp_{i,j} - pix^NMvtemp_{i,j+1} |,  j ∈ [0, y_s)
where ygrad^NMvtemp_j denotes the longitudinal gradient of row j in image NMvtemp;
Step (5-2): choose a division mode l × l ∈ C from the two-dimensional bar code module division mode set C = {L × L}, and obtain a group of longitudinal module division points:
H = { h_m | h_m = (y_s / l) × m },  m = 1, ..., l-1
Step (5-3): compute the longitudinal gradient value ygrad^NMvtemp_{h_m} of image NMvtemp at each module division point h_m in the set H, and take the mean value of all these longitudinal gradient values as the gradient value of division mode l × l ∈ C;
Step (5-4): repeat step (5-2)~step (5-3) to compute the gradient values of all division modes in the set C, and take the division mode p × p with the maximum gradient value as the transverse module division mode of the two-dimensional bar code;
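A minimal sketch of step 5, assuming NumPy; the candidate set passed as modes mirrors the division modes listed in the embodiment and is otherwise an assumption, as is the function name module_division.

import numpy as np

def module_division(nm, modes=(8, 10, 12, 14, 16, 18, 20, 22, 24)):
    # Pick the module count whose predicted split rows sit on the strongest row gradients.
    nm = nm.astype(np.float64)
    ys = nm.shape[0]
    ygrad = np.abs(np.diff(nm, axis=0)).sum(axis=1)   # longitudinal gradient per row
    best_p, best_score = modes[0], -np.inf
    for l in modes:
        cuts = [int(round(ys / l * m)) for m in range(1, l)]   # candidate split rows
        score = np.mean([ygrad[min(c, ys - 2)] for c in cuts]) # mean gradient at the cuts
        if score > best_score:
            best_score, best_p = score, l
    return best_p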
Step 6: coarse registration of the bar code images:
Step (6-1): convert every image NMvtemp^n of {NMvtemp^n}_{n=0}^{M-1} into a p × w data matrix, computed as
X^n = [ ( x^n_{k,i} = (1/M_s) \sum_{j=k·M_s}^{(k+1)·M_s} pix^n_{i,j} )_{i=0}^{w-1} ]_{k=0}^{p-1}
where X^n denotes the data matrix corresponding to image NMvtemp^n, x^n_{k,i} denotes the element in row k, column i of the matrix, and M_s denotes the longitudinal size of a bar code module, M_s = y_s / p;
Step (6-2): step the data matrices X^n, X^{n+1} corresponding to two adjacent images of {NMvtemp^n}_{n=0}^{M-1} over each other, the stepping column count δ_g ranging from 1 to 5 columns; compute the mean square deviation S^2(g_n) of the overlapping-region elements at each overlap:
S^2(g_n) = (1/(p·g_n)) ( \sum_{k=0}^{p-1} \sum_{i=0}^{g_n} ( X^{n+1}_{k,i} - X^n_{k,w-i-g_n} )^2 ),  1 ≤ g_n ≤ w
where S^2(g_n) denotes the mean square deviation when g_n columns of the data matrices overlap;
Step (6-3): compute all values S^2(g_n) during the stepping overlap of the data matrices X^n, X^{n+1}, and take the overlapping column counts g_1^n, g_2^n, g_3^n corresponding to the three smallest mean square deviations as the coarse registration positions between the adjacent images NMvtemp^n and NMvtemp^{n+1};
Step (6-4): repeat step (6-2)~step (6-3), performing coarse registration for every two adjacent images of the image sequence {NMvtemp^n}_{n=0}^{M-1}, and obtain the coarse registration position sequence;
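A minimal sketch of the coarse registration of step 6, assuming NumPy; the function name coarse_register and the default stepping count are assumptions (the text allows a stepping column count of 1 to 5), and the overlap is compared by aligning the trailing columns of one data matrix with the leading columns of the next.

import numpy as np

def coarse_register(xa, xb, step=3, top_k=3):
    # Stepped-overlap MSE between the data matrices of two adjacent images.
    p, w = xa.shape
    scores = []
    for g in range(1, w + 1, step):
        left = xa[:, w - g:]                     # trailing g columns of the left matrix
        right = xb[:, :g]                        # leading g columns of the right matrix
        mse = float(np.mean((right - left) ** 2))
        scores.append((mse, g))
    scores.sort()
    return [g for _, g in scores[:top_k]]        # three coarse registration candidates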
Step 7: fine registration of the bar code images:
Step (7-1): using the classical similarity measure, compute the matching degree R_n(g_n) of the adjacent images NMvtemp^n and NMvtemp^{n+1} at the matched position g_n:
R_n(g_n) = \sum_{i=0}^{g_n-1} \sum_{j=0}^{y_s-1} ( pix^n_{w-i-g_n,j} × pix^{n+1}_{i,j} ) / sqrt( \sum_{i=0}^{g_n-1} \sum_{j=0}^{y_s-1} ( pix^n_{w-i-g_n,j} )^2 · \sum_{i=0}^{g_n-1} \sum_{j=0}^{y_s-1} ( pix^{n+1}_{i,j} )^2 ),  g_n ∈ U(g_1^n, δ) ∩ U(g_2^n, δ) ∩ U(g_3^n, δ)
where the fine-tuning column count δ is taken as δ_g + 1;
Step (7-2): take the maximum of the matching degree R_n(g_n) as the optimal matching degree between the adjacent images NMvtemp^n and NMvtemp^{n+1}, denoted R_n = max{R_n(g_n)}, and take the position g_n corresponding to max{R_n(g_n)} as the fine registration position between the adjacent images NMvtemp^n and NMvtemp^{n+1}, denoting the fine registration position C_n = g_n;
Step (7-3): repeat step (7-1)~step (7-2) to compute the optimal matching degree and the fine registration position of every two adjacent images of the sequence {NMvtemp^n}_{n=0}^{M-1}, obtaining the optimal matching degree array {R_n} and the fine registration position array {C_n};
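A minimal sketch of the fine registration of step 7, assuming NumPy; the function name fine_register and the neighbourhood parameter delta are assumptions (the text takes the fine-tuning count as the stepping count plus one), and the similarity is the classical normalized cross-correlation over the overlap strip.

import numpy as np

def fine_register(img_a, img_b, candidates, delta=4):
    # Search around the coarse candidates for the overlap width maximizing the matching degree.
    a = img_a.astype(np.float64)
    b = img_b.astype(np.float64)
    w = a.shape[1]
    best_g, best_r = candidates[0], -1.0
    tried = set()
    for g0 in candidates:
        for g in range(max(1, g0 - delta), min(w, g0 + delta) + 1):
            if g in tried:
                continue
            tried.add(g)
            left = a[:, w - g:]                 # overlap strip from the left image
            right = b[:, :g]                    # overlap strip from the right image
            num = np.sum(left * right)
            den = np.sqrt(np.sum(left ** 2) * np.sum(right ** 2)) + 1e-12
            r = num / den                       # classical similarity measure
            if r > best_r:
                best_r, best_g = r, g
    return best_g, best_r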
Step 8: image splicing fusion and bar code recognition:
Step (8-1): traverse the optimal matching degree array {R_n}; take the two positions n_1 and n_2 with the smallest matching degree values as division points and divide the image sequence {NMvtemp^n}_{n=0}^{M-1} into three parts;
Step (8-2): splice and fuse the images of each part according to the fine registration positions between them, using the gradual fade-in/fade-out weighted averaging method, obtaining the composite images Part_0, Part_1, Part_2 of the three parts; the three image widths are w_0, w_1, w_2 respectively, the height is y_s, and the fine registration positions between the three images are PC_0 = C_{n_1} and PC_1 = C_{n_2};
Step (8-3): compute the transverse gradient of image NMvtemp^0 of {NMvtemp^n}_{n=0}^{M-1}:
xgrad^{NMvtemp^0}_i = \sum_{j=0}^{y_s-1} ( pix^{NMvtemp^0}_{i+1,j} - pix^{NMvtemp^0}_{i,j} ),  i ∈ [0, w)
where xgrad^{NMvtemp^0}_i represents the transverse gradient of column i in image NMvtemp^0; obtain the array {xgrad^{NMvtemp^0}_i} and compute its maximum value; the position x_l of the maximum is the boundary position between the bar code region and the blank region in image NMvtemp^0;
Step (8-4): compute the transverse gradient of image NMvtemp^{M-1} of {NMvtemp^n}_{n=0}^{M-1}:
xgrad^{NMvtemp^{M-1}}_i = \sum_{j=0}^{y_s-1} ( pix^{NMvtemp^{M-1}}_{i,j} - pix^{NMvtemp^{M-1}}_{i+1,j} ),  i ∈ [0, w)
where xgrad^{NMvtemp^{M-1}}_i represents the transverse gradient of column i in image NMvtemp^{M-1}; obtain the array {xgrad^{NMvtemp^{M-1}}_i} and compute its maximum value; the position x_r of the maximum is the boundary position between the bar code region and the blank region in image NMvtemp^{M-1};
Step (8-5): the width after splicing and fusing the three images Part_0, Part_1, Part_2 is w_m = x_l + y_s + w - x_r, and the height is y_s; set up an image memory buffer of size w_m × y_s; put image Part_0 on the left side of the buffer and image Part_2 on the right side; evaluate the condition |(w_m - w_0 - w_2) - (w_1 - PC_0 - PC_1)| ≤ 10; if the condition is met, put image Part_1 into the buffer according to the fine registration positions PC_0, PC_1 with Part_0 and Part_2, complete the image splicing fusion of the overlapping regions with the gradual fade-in/fade-out weighted averaging method, and go to step (8-7); if the condition is not met, go to step (8-6);
Step (8-6): in the image memory buffer, place Part_1 between the two images at the starting position where it coincides with Part_0 over w columns, then step Part_1 to the right until it coincides with Part_2 over only w columns; during the stepping, fuse the regions where Part_1 overlaps Part_0 and Part_2 respectively into a joint overlapping region and compute the matching degree of the joint overlapping region, the matching degree being computed as in step (7-1); take the position of the maximum matching degree of the joint overlapping region during the stepping as the registration position, put image Part_1 into the buffer at that registration position, and complete the image splicing fusion of the overlapping regions with the gradual fade-in/fade-out weighted averaging method;
Step (8-7): crop the two-dimensional bar code image obtained by splicing fusion in the image memory buffer by width, taking the part of the image from x_l to x_l + y_s, which gives a new y_s × y_s Data Matrix two-dimensional bar code image with bar code module size M_s × M_s; use a decoding system to read the bar code information in the new Data Matrix two-dimensional bar code image; the decoding system decodes and error-corrects it according to the decoding principle and the Reed-Solomon error correction algorithm.
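A minimal sketch of the gradual fade-in/fade-out weighted averaging used in step 8, assuming NumPy; the function name blend_overlap is an assumption, and the decoding line shown in comments assumes the optional pylibdmtx package rather than the decoding system named in the text.

import numpy as np

def blend_overlap(left, right, overlap):
    # Splice two images of equal height with a linear fade across the overlap columns.
    left = left.astype(np.float64)
    right = right.astype(np.float64)
    h, wl = left.shape
    alpha = np.linspace(1.0, 0.0, overlap)[None, :]      # weight falls off across the seam
    seam = alpha * left[:, wl - overlap:] + (1.0 - alpha) * right[:, :overlap]
    return np.hstack([left[:, :wl - overlap], seam, right[:, overlap:]])

# decoding sketch (assumes pylibdmtx is installed):
# from pylibdmtx.pylibdmtx import decode
# result = decode(stitched.astype(np.uint8))   # Data Matrix decode with Reed-Solomon correction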
Beneficial effects
The cylindrical-surface two-dimensional bar code reading method based on image splicing proposed by the present invention changes the single-image recognition principle of existing two-dimensional bar code recognition systems, so that the cylindrical bar code information can be acquired completely and the problems of distortion and non-uniform illumination of cylindrical bar codes are eliminated. Based on the characteristics of Data Matrix two-dimensional bar code images, the designed splicing and fusion algorithm well solves the outstanding problems of low accuracy and low efficiency encountered in splicing two-dimensional bar code images, and reads cylindrical two-dimensional bar code information quickly and accurately. According to the inventors' rough estimate, reading a cylindrical two-dimensional bar code takes 2~4 seconds on average without the reading method of the present invention and only 0.4~0.8 seconds with it, an efficiency improvement of 3~5 times, while the recognition rate is also greatly improved.
Brief description of the drawings
Fig. 1: flow chart of the present invention;
Fig. 2: plot of actual convex data;
Fig. 3: plot of convex data after normalization;
Fig. 4: processing procedure in the embodiment;
Embodiment
The present invention is described below in conjunction with a specific embodiment:
This example takes a Φ6 metal cylinder as the object; under the recognition fixture, a group of sequence images is acquired with an MV1300 camera. The camera parameters chosen in this example are as follows: shutter speed 10 us, gain 60, and high acquisition speed. The overall process of the technical scheme of the present invention is shown in Fig. 1.
Step 1: under the recognition fixture, adjust and reduce the baffle gap until the acquired bar code image is free of highlighted reflection, then rotate and continuously acquire N images of the two-dimensional bar code, denoted {Mvtemp^n}_{n=0}^{N-1}; together the N images contain the complete information of the two-dimensional bar code. The width of every image is w = 97 and the height is h = 384; pix^n_{i,j} denotes the pixel value of the pixel in column i, row j of image Mvtemp^n. In the present embodiment 8 images are acquired, as shown in Fig. 4(a).
Step 2: correct the non-uniform illumination of the images:
This step uses the illuminance variation law of the background region in the image to correct the non-uniform illumination of the bar code region. The steps are as follows:
Step (2-1): arbitrarily choose one image Mvtemp containing bar code information from the sequence, and traverse upward from the middle row h/2 of image Mvtemp, computing the longitudinal gradient:
grad^Mvtemp_j = \sum_{i=0}^{w} ( pix^Mvtemp_{i,j} - pix^Mvtemp_{i,j+1} ),  j ∈ (h/2, h)
where grad^Mvtemp_j denotes the longitudinal gradient value of row j in image Mvtemp; the maximum longitudinal gradient value is taken at row y_up;
Step (2-2): compute the illuminance array of the background region in image Mvtemp:
I^Mvtemp_i = (1/20) \sum_{j=y_up}^{y_up+β} pix^Mvtemp_{i,j},  i ∈ [0, w)
where the background region of image Mvtemp refers to the region from column 0 to column w-1 and from row y_up to row y_up+β, with β taken between 10 and h-y_up; I^Mvtemp_i denotes the illuminance of column i of the background region of image Mvtemp;
Step (2-3): compute the average illuminance of the background region in image Mvtemp:
\bar I^Mvtemp = 1/((β+1)w) \sum_{i=0}^{w-1} \sum_{j=y_up}^{y_up+β} pix^Mvtemp_{i,j}
Step (2-4): reversely correct the non-uniform illumination of every image in the sequence {Mvtemp^n}_{n=0}^{N-1}:
pix^n_{i,j} = pix^n_{i,j} · \bar I^Mvtemp / I^Mvtemp_i,  n ∈ [0, N-1], i ∈ [0, w-1], j ∈ [0, h-1]
In the present embodiment, the pixels along a longitudinal line of the image belong to the same arc of the cylinder, so under the parallel strip-shaped white light sources on both sides the illumination is uniform in the longitudinal direction, while the transverse illuminance gradually decreases from the middle towards the two sides. Based on this law, the present embodiment uses the illuminance variation law of the blank region of the image to correct the non-uniform illumination of the bar code region. The rectangular blank region in the 4th image of the sequence is taken as 15~35 longitudinally and 0~233 transversely, and its average illuminance \bar I^Mvtemp is computed; the image after reverse correction is shown in Fig. 4(b).
Step 3: apply the Roberts operator to the image sequence {Mvtemp^n}_{n=0}^{N-1} to extract edge contour information, then add the edge strength information back into the original images to enhance the bar code edge information; the enhanced images are shown in Fig. 4(c).
Step 4: identify the bar code images and the bar code position:
By fitting the gray-level projection of the images, this step identifies which images contain the bar code and which do not, as well as the longitudinal position of the bar code in the images that contain it, realizing the identification of the bar code images and the bar code position. The steps are as follows:
Step (4-1): compute the transverse projection data avg^n_j of every image in the sequence {Mvtemp^n}_{n=0}^{N-1}:
avg^n_j = (1/w) \sum_{i=0}^{w} pix^n_{i,j},  j ∈ [0, h)
For the n-th image this gives one group of transverse projection data {avg^n_j}; compute the minimum value avg^n_min of {avg^n_j}. Because the projection data of the bar code region differ from those of the background, the data {avg^n_j} present a convex shape and are called convex data, as shown in Fig. 2.
Step (4-2): to simplify the computation, translate the transverse projection data {avg^n_j} downward as a whole by avg^n_min; compute the maximum value and the mean value of the translated transverse projection data; apply weighted mean filtering and median filtering to the translated transverse projection data to further eliminate the influence of noise and contamination, where the weighted mean filter template is
(1/9) × [1 2 3 2 1]
and the median filter uses a 5 × 1 sliding window; then apply threshold segmentation to the filtered transverse projection data to eliminate the jaggedness of the convex data. Through the normalization of step (4-2), the irregularity of the convex data is greatly improved, as shown in Fig. 3.
Step (4-3): perform data fitting on the transverse projection data processed in step (4-2); the fitting function has the form
y = a_1 (0 ≤ x < x_1); b_1 (x_1 ≤ x ≤ h/2); a_2 (h/2 < x ≤ x_2); b_2 (x_2 < x < h)
where a_1, b_1, a_2, b_2 are the fitting variables of the piecewise linear fitting function and x_1, x_2 are its segmentation points; using least-squares fitting, the fitting errors are
\hat S^2_min(x_1^n) = \sum_{i=0}^{h/2} (avg^n_i)^2 - ( \sum_{i=0}^{x_1^n-1} avg^n_i )^2 / x_1^n - ( \sum_{i=x_1^n}^{h/2} avg^n_i )^2 / (h/2 - x_1^n + 1)
\hat S^2_min(x_2^n) = \sum_{i=h/2+1}^{h-1} (avg^n_i)^2 - ( \sum_{i=h/2+1}^{x_2^n} avg^n_i )^2 / (x_2^n - h/2) - ( \sum_{i=x_2^n+1}^{h-1} avg^n_i )^2 / (h - x_2^n - 1)
For the n-th image, compute the value x_1^n at which the fitting error \hat S^2_min(x_1^n) takes its minimum and let y_d^n = x_1^n; compute the value x_2^n at which the fitting error \hat S^2_min(x_2^n) takes its minimum and let y_u^n = x_2^n;
Step (4-4): repeat step (4-1)~step (4-3) to obtain y_d^n and y_u^n for every image of the sequence {Mvtemp^n}_{n=0}^{N-1}, thereby obtaining the arrays {y_d^n} and {y_u^n};
Step (4-5): for every image of the sequence, compute the longitudinal gradient value at its own y_d^n position:
grad^n_{y_d^n} = \sum_{i=0}^{w-1} | pix^n_{i,y_d^n} - pix^n_{i,y_d^n+1} |
thereby obtaining the array {grad^n_{y_d^n}}; compute the mean value of this array; starting from the entry at n = 0, compare the entries of the array one by one with the mean value until the entry at n = η is greater than the mean value; starting from the entry at n = N-1, compare the entries one by one in the reverse direction with the mean value until the entry at n = κ is greater than the mean value; in the image sequence {Mvtemp^n}_{n=0}^{N-1}, the images Mvtemp^η ~ Mvtemp^κ are the bar code images;
Step (4-6): compute the mean value y_s of the (y_u^n - y_d^n) values over all bar code images; take any one of the bar code images and compute the longitudinal gradient values at its y_d^n position and at its y_u^n position; if the longitudinal gradient value at y_d^n is greater than that at y_u^n, crop all bar code images from their respective y_d^n positions and retain the region from y_d^n to y_d^n + y_s as the new bar code image; if the longitudinal gradient value at y_u^n is greater than that at y_d^n, crop all bar code images from their respective y_u^n positions and retain the region from y_u^n - y_s to y_u^n as the new bar code image; save the newly obtained bar code images again in their original order as {NMvtemp^n}_{n=0}^{M-1}; the image width w is unchanged and the height is y_s.
In the present embodiment, step 4 identifies, by fitting the gray-level projection of every image, which images contain the bar code and which do not, as well as the longitudinal position of the bar code in the images that contain it, realizing the identification of the bar code images and the bar code position. By computing, normalizing and fitting the projection convex data, the minimum-fitting-error positions in every image are obtained as {y_d^0 = 153, y_u^0 = 196}, {y_d^1 = 34, y_u^1 = 345}, {y_d^2 = 34, y_u^2 = 345}, {y_d^3 = 35, y_u^3 = 346}, {y_d^4 = 31, y_u^4 = 342}, {y_d^5 = 29, y_u^5 = 341}, {y_d^6 = 31, y_u^6 = 343}, {y_d^7 = 150, y_u^7 = 200}; the gradient test shows that the first and the last image do not contain the two-dimensional bar code. The images are cropped according to the fitted positions, with image height y_s = 311 and width w unchanged, which eliminates the longitudinal misalignment between images; the cropped images are shown in Fig. 4(d).
Step 5: determine the division mode of the bar code module:
This step realizes the transverse module division of the bar code according to the longitudinal gradient projection of the image, by judging the pre-division mode with the best gradient. The process is as follows:
Step (5-1): arrange {NMvtemp^n}_{n=0}^{M-1} transversely in order, align them longitudinally, and merge them into one image NMvtemp of height y_s and width Mw; compute the longitudinal gradient projection of image NMvtemp:
ygrad^NMvtemp_j = \sum_{i=0}^{Mw-1} | pix^NMvtemp_{i,j} - pix^NMvtemp_{i,j+1} |,  j ∈ [0, y_s)
where ygrad^NMvtemp_j denotes the longitudinal gradient of row j in image NMvtemp, and pix^NMvtemp_{i,j} denotes the pixel value of the pixel in column i, row j of image NMvtemp;
Step (5-2): choose a division mode l × l ∈ C from the two-dimensional bar code module division mode set C = {L × L}, and obtain a group of longitudinal module division points:
H = { h_m | h_m = (y_s / l) × m },  m = 1, ..., l-1
Step (5-3): compute the longitudinal gradient value ygrad^NMvtemp_{h_m} of image NMvtemp at each module division point h_m in the set H, and take the mean value of all these longitudinal gradient values as the gradient value of division mode l × l ∈ C;
Step (5-4): repeat step (5-2)~step (5-3) to compute the gradient values of all division modes in the set C, and take the division mode p × p with the maximum gradient value as the transverse module division mode of the two-dimensional bar code.
The two-dimensional bar code module division mode set selected in the present embodiment contains the modes listed below; according to the longitudinal gradient projection waveform of the image, the transverse module division of the bar code is realized by judging the pre-division mode with the best gradient. The gradient values of the various pre-division modes are:
Division mode:  8×8     10×10   12×12   14×14   16×16   18×18   20×20   22×22   24×24
Gradient value: 489.42  328.22  1079.0  333.15  367.33  446.29  349.10  343.33  640.69
It follows that 12×12 is the optimal transverse module division mode of the two-dimensional bar code; the division effect is shown in Fig. 4(e).
Step 6: coarse registration of the bar code images:
After the above steps, the offset between images exists only in the transverse direction; the transverse registration between images is divided into two stages, coarse registration and fine registration. This step converts each image into a data matrix according to the divided transverse bar code modules and then realizes coarse registration between adjacent images by matching the data matrices. The process is as follows:
Step (6-1): convert every image NMvtemp^n of {NMvtemp^n}_{n=0}^{M-1} into a p × w data matrix, computed as
X^n = [ ( x^n_{k,i} = (1/M_s) \sum_{j=k·M_s}^{(k+1)·M_s} pix^n_{i,j} )_{i=0}^{w-1} ]_{k=0}^{p-1}
where X^n denotes the data matrix corresponding to image NMvtemp^n, x^n_{k,i} denotes the element in row k, column i of the matrix, and M_s denotes the longitudinal size of a bar code module, M_s = y_s / p; in step 6 and the following steps pix^n_{i,j} denotes the pixel value of the pixel in column i, row j of image NMvtemp^n;
Step (6-2): step the data matrices X^n, X^{n+1} corresponding to two adjacent images of {NMvtemp^n}_{n=0}^{M-1} over each other, the stepping column count δ_g ranging from 1 to 5 columns; compute the mean square deviation S^2(g_n) of the overlapping-region elements at each overlap:
S^2(g_n) = (1/(p·g_n)) ( \sum_{k=0}^{p-1} \sum_{i=0}^{g_n} ( X^{n+1}_{k,i} - X^n_{k,w-i-g_n} )^2 ),  1 ≤ g_n ≤ w
where S^2(g_n) denotes the mean square deviation when g_n columns of the data matrices overlap;
Step (6-3): compute all values S^2(g_n) during the stepping overlap of the data matrices X^n, X^{n+1}, and take the overlapping column counts g_1^n, g_2^n, g_3^n corresponding to the three smallest mean square deviations as the coarse registration positions between the adjacent images NMvtemp^n and NMvtemp^{n+1};
Step (6-4): repeat step (6-2)~step (6-3), performing coarse registration for every two adjacent images of the image sequence {NMvtemp^n}_{n=0}^{M-1}, and obtain the coarse registration position sequence;
Step 7: fine registration of the bar code images:
This step fine-tunes each coarse registration position and computes the best matching degree, finally determining the fine registration between adjacent images. The detailed process is as follows:
Step (7-1): using the classical similarity measure, compute the matching degree R_n(g_n) of the adjacent images NMvtemp^n and NMvtemp^{n+1} at the matched position g_n:
R_n(g_n) = \sum_{i=0}^{g_n-1} \sum_{j=0}^{y_s-1} ( pix^n_{w-i-g_n,j} × pix^{n+1}_{i,j} ) / sqrt( \sum_{i=0}^{g_n-1} \sum_{j=0}^{y_s-1} ( pix^n_{w-i-g_n,j} )^2 · \sum_{i=0}^{g_n-1} \sum_{j=0}^{y_s-1} ( pix^{n+1}_{i,j} )^2 ),  g_n ∈ U(g_1^n, δ) ∩ U(g_2^n, δ) ∩ U(g_3^n, δ)
where the fine-tuning column count δ is taken as δ_g + 1;
Step (7-2): take the maximum of the matching degree R_n(g_n) as the optimal matching degree between the adjacent images NMvtemp^n and NMvtemp^{n+1}, denoted R_n = max{R_n(g_n)}, and take the position g_n corresponding to max{R_n(g_n)} as the fine registration position between the adjacent images NMvtemp^n and NMvtemp^{n+1}, denoting the fine registration position C_n = g_n;
Step (7-3): repeat step (7-1)~step (7-2) to compute the optimal matching degree and the fine registration position of every two adjacent images of the sequence {NMvtemp^n}_{n=0}^{M-1}, obtaining the optimal matching degree array {R_n} and the fine registration position array {C_n}.
In the present embodiment, steps 6 and 7 build the image data matrices, compute the minimum mean square deviations to determine the coarse registration between images, and then determine the optimal matching degree and fine registration position between each pair of adjacent images according to the classical similarity measure.
Step 8: image splicing fusion and bar code recognition:
Based on the fine registration positions and the matching degrees between images, this step first splices and fuses the images into three fragments, then inserts them into the predetermined two-dimensional bar code image memory buffer to complete the image splicing fusion, and then realizes the recognition of the two-dimensional bar code information. The process is as follows:
Step (8-1): traverse the optimal matching degree array {R_n}; take the two positions n_1 and n_2 with the smallest matching degree values as division points and divide the image sequence {NMvtemp^n}_{n=0}^{M-1} into three parts;
Based on the optimal matching degree data, the three parts are {NMvtemp^0, NMvtemp^1}, {NMvtemp^2} and {NMvtemp^3, NMvtemp^4, NMvtemp^5}.
Step (8-2): splice and fuse the images of each part according to the fine registration positions between them, using the gradual fade-in/fade-out weighted averaging method, obtaining the composite images Part_0, Part_1, Part_2 of the three parts, as shown in Fig. 4(f1), (f2), (f3). The three image widths are w_0 = 172, w_1 = 97, w_2 = 166 respectively, the height is y_s = 311, and the fine registration positions between the three images are PC_0 = 41 and PC_1 = 43;
Step (8-3): compute the transverse gradient of image NMvtemp^0 of {NMvtemp^n}_{n=0}^{M-1}:
xgrad^{NMvtemp^0}_i = \sum_{j=0}^{y_s-1} ( pix^{NMvtemp^0}_{i+1,j} - pix^{NMvtemp^0}_{i,j} ),  i ∈ [0, w)
where xgrad^{NMvtemp^0}_i represents the transverse gradient of column i in image NMvtemp^0; obtain the array {xgrad^{NMvtemp^0}_i} and compute its maximum value; the position x_l of the maximum is the boundary position between the bar code region and the blank region in image NMvtemp^0; in the present embodiment x_l = 12.
Step (8-4): compute the transverse gradient of image NMvtemp^{M-1} of {NMvtemp^n}_{n=0}^{M-1}:
xgrad^{NMvtemp^{M-1}}_i = \sum_{j=0}^{y_s-1} ( pix^{NMvtemp^{M-1}}_{i,j} - pix^{NMvtemp^{M-1}}_{i+1,j} ),  i ∈ [0, w)
where xgrad^{NMvtemp^{M-1}}_i represents the transverse gradient of column i in image NMvtemp^{M-1}; obtain the array {xgrad^{NMvtemp^{M-1}}_i} and compute its maximum value; the position x_r of the maximum is the boundary position between the bar code region and the blank region in image NMvtemp^{M-1}; in the present embodiment x_r = 76.
Step (8-5): the width after splicing and fusing the three images Part_0, Part_1, Part_2 is w_m = x_l + y_s + w - x_r = 344, and the height is y_s; set up an image memory buffer of size w_m × y_s; put image Part_0 on the left side of the buffer and image Part_2 on the right side; evaluate the condition |(w_m - w_0 - w_2) - (w_1 - PC_0 - PC_1)| ≤ 10; if the condition is met, put image Part_1 into the buffer according to the fine registration positions PC_0, PC_1 with Part_0 and Part_2, complete the image splicing fusion of the overlapping regions with the gradual fade-in/fade-out weighted averaging method, and go to step (8-7); if the condition is not met, go to step (8-6).
In the present embodiment |(w_m - w_0 - w_2) - (w_1 - PC_0 - PC_1)| = 7 ≤ 10; the condition is met, so image Part_1 is put into the buffer according to the fine registration positions PC_0, PC_1 with Part_0 and Part_2, completing the image splicing fusion; the result is shown in Fig. 4(g).
Step (8-6): in the image memory buffer, place Part_1 between the two images at the starting position where it coincides with Part_0 over w columns, then step Part_1 to the right until it coincides with Part_2 over only w columns; during the stepping, fuse the regions where Part_1 overlaps Part_0 and Part_2 respectively into a joint overlapping region and compute the matching degree of the joint overlapping region, the matching degree being computed as in step (7-1); take the position of the maximum matching degree of the joint overlapping region during the stepping as the registration position, put image Part_1 into the buffer at that registration position, and complete the image splicing fusion of the overlapping regions with the gradual fade-in/fade-out weighted averaging method;
Step (8-7): crop the two-dimensional bar code image obtained by splicing fusion in the image memory buffer by width, taking the part of the image from x_l to x_l + y_s, which gives a new y_s × y_s Data Matrix two-dimensional bar code image with bar code module size M_s × M_s, as shown in Fig. 4(h); use a decoding system to read the bar code information in the new Data Matrix two-dimensional bar code image; the decoding system decodes and error-corrects it according to the decoding principle and the Reed-Solomon error correction algorithm.

Claims (1)

1. A cylindrical-surface two-dimensional bar code reading method based on image splicing, characterized by comprising the following steps:
Step 1: continuously acquire N images of the two-dimensional bar code, denoted {Mvtemp^n}_{n=0}^{N-1}; together the N images contain the complete information of the two-dimensional bar code; each image has width w and height h, and pix^n_{i,j} denotes the pixel value of the pixel in column i, row j of image Mvtemp^n;
Step 2: correct the non-uniform illumination of the images:
Step (2-1): arbitrarily choose one image Mvtemp containing bar code information from {Mvtemp^n}_{n=0}^{N-1}, and traverse upward from the middle row h/2 of image Mvtemp, computing the longitudinal gradient:
grad^Mvtemp_j = \sum_{i=0}^{w} ( pix^Mvtemp_{i,j} - pix^Mvtemp_{i,j+1} ),  j ∈ (h/2, h)
where grad^Mvtemp_j denotes the longitudinal gradient value of row j in image Mvtemp; the maximum longitudinal gradient value is taken at row y_up;
Step (2-2): compute the illuminance array of the background region in image Mvtemp:
I^Mvtemp_i = (1/20) \sum_{j=y_up}^{y_up+β} pix^Mvtemp_{i,j},  i ∈ [0, w)
where the background region of image Mvtemp refers to the region from column 0 to column w-1 and from row y_up to row y_up+β, with β taken between 10 and h-y_up; I^Mvtemp_i denotes the illuminance of column i of the background region of image Mvtemp;
Step (2-3): compute the average illuminance of the background region in image Mvtemp:
\bar I^Mvtemp = 1/((β+1)w) \sum_{i=0}^{w-1} \sum_{j=y_up}^{y_up+β} pix^Mvtemp_{i,j}
Step (2-4): reversely correct the non-uniform illumination of every image in the sequence {Mvtemp^n}_{n=0}^{N-1}:
pix^n_{i,j} = pix^n_{i,j} · \bar I^Mvtemp / I^Mvtemp_i,  n ∈ [0, N-1], i ∈ [0, w-1], j ∈ [0, h-1]
Step 3: apply the Roberts operator to the image sequence {Mvtemp^n}_{n=0}^{N-1} to extract edge strength information, then add the edge strength information back into the original images to realize edge information enhancement;
Step 4: identify the bar code images and the bar code position:
Step (4-1): compute the transverse projection data avg^n_j of every image in the sequence {Mvtemp^n}_{n=0}^{N-1}:
avg^n_j = (1/w) \sum_{i=0}^{w} pix^n_{i,j},  j ∈ [0, h)
for the n-th image this gives one group of transverse projection data {avg^n_j}; compute the minimum value avg^n_min of {avg^n_j};
Step (4-2): translate the transverse projection data {avg^n_j} downward as a whole by avg^n_min; compute the maximum value and the mean value of the translated transverse projection data; apply weighted mean filtering and median filtering to the translated transverse projection data, where the weighted mean filter template is
(1/9) × [1 2 3 2 1]
and the median filter uses a 5 × 1 sliding window; apply threshold segmentation to the filtered transverse projection data according to the segmentation function;
Step (4-3): perform data fitting on the transverse projection data processed in step (4-2); the fitting function has the form
y = a_1 (0 ≤ x < x_1); b_1 (x_1 ≤ x ≤ h/2); a_2 (h/2 < x ≤ x_2); b_2 (x_2 < x < h)
where a_1, b_1, a_2, b_2 are the fitting variables of the piecewise linear fitting function and x_1, x_2 are its segmentation points; using least-squares fitting, the fitting errors are
\hat S^2_min(x_1^n) = \sum_{i=0}^{h/2} (avg^n_i)^2 - ( \sum_{i=0}^{x_1^n-1} avg^n_i )^2 / x_1^n - ( \sum_{i=x_1^n}^{h/2} avg^n_i )^2 / (h/2 - x_1^n + 1)
\hat S^2_min(x_2^n) = \sum_{i=h/2+1}^{h-1} (avg^n_i)^2 - ( \sum_{i=h/2+1}^{x_2^n} avg^n_i )^2 / (x_2^n - h/2) - ( \sum_{i=x_2^n+1}^{h-1} avg^n_i )^2 / (h - x_2^n - 1)
for the n-th image, compute the value x_1^n at which the fitting error \hat S^2_min(x_1^n) takes its minimum and let y_d^n = x_1^n; compute the value x_2^n at which the fitting error \hat S^2_min(x_2^n) takes its minimum and let y_u^n = x_2^n;
Step (4-4): repeat step (4-1)~step (4-3) to obtain y_d^n and y_u^n for every image of the sequence {Mvtemp^n}_{n=0}^{N-1}, thereby obtaining the arrays {y_d^n} and {y_u^n};
Step (4-5): for every image of the sequence, compute the longitudinal gradient value at its own y_d^n position:
grad^n_{y_d^n} = \sum_{i=0}^{w-1} | pix^n_{i,y_d^n} - pix^n_{i,y_d^n+1} |
thereby obtaining the array {grad^n_{y_d^n}}; compute the mean value of this array; starting from the entry at n = 0, compare the entries of the array one by one with the mean value until the entry at n = η is greater than the mean value; starting from the entry at n = N-1, compare the entries one by one in the reverse direction with the mean value until the entry at n = κ is greater than the mean value; in the image sequence {Mvtemp^n}_{n=0}^{N-1}, the images Mvtemp^η ~ Mvtemp^κ are the bar code images;
Step (4-6): compute the mean value y_s of the (y_u^n - y_d^n) values over all bar code images; take any one of the bar code images and compute the longitudinal gradient values at its y_d^n position and at its y_u^n position; if the longitudinal gradient value at y_d^n is greater than that at y_u^n, crop all bar code images from their respective y_d^n positions and retain the region from y_d^n to y_d^n + y_s as the new bar code image; if the longitudinal gradient value at y_u^n is greater than that at y_d^n, crop all bar code images from their respective y_u^n positions and retain the region from y_u^n - y_s to y_u^n as the new bar code image; save the newly obtained bar code images again in their original order as {NMvtemp^n}_{n=0}^{M-1}; the image width w is unchanged and the height is y_s;
Step 5: determine the division mode of the bar code module:
Step (5-1): arrange {NMvtemp^n}_{n=0}^{M-1} transversely in order, align them longitudinally, and merge them into one image NMvtemp of height y_s and width Mw; compute the longitudinal gradient projection of image NMvtemp:
ygrad^NMvtemp_j = \sum_{i=0}^{Mw-1} | pix^NMvtemp_{i,j} - pix^NMvtemp_{i,j+1} |,  j ∈ [0, y_s)
where ygrad^NMvtemp_j denotes the longitudinal gradient of row j in image NMvtemp, and pix^NMvtemp_{i,j} denotes the pixel value of the pixel in column i, row j of image NMvtemp;
Step (5-2): choose a division mode l × l ∈ C from the two-dimensional bar code module division mode set C = {L × L}, and obtain a group of longitudinal module division points:
H = { h_m | h_m = (y_s / l) × m },  m = 1, ..., l-1
Step (5-3): compute the longitudinal gradient value ygrad^NMvtemp_{h_m} of image NMvtemp at each module division point h_m in the set H, and take the mean value of all these longitudinal gradient values as the gradient value of division mode l × l ∈ C;
Step (5-4): repeat step (5-2)~step (5-3) to compute the gradient values of all division modes in the set C, and take the division mode p × p with the maximum gradient value as the transverse module division mode of the two-dimensional bar code;
Step 6: coarse registration of the bar code images:
Step (6-1): convert every image NMvtemp^n of {NMvtemp^n}_{n=0}^{M-1} into a p × w data matrix, computed as
X^n = [ ( x^n_{k,i} = (1/M_s) \sum_{j=k·M_s}^{(k+1)·M_s} pix^n_{i,j} )_{i=0}^{w-1} ]_{k=0}^{p-1}
where X^n denotes the data matrix corresponding to image NMvtemp^n, x^n_{k,i} denotes the element in row k, column i of the matrix, and M_s denotes the longitudinal size of a bar code module, M_s = y_s / p; in step 6 and the following steps pix^n_{i,j} denotes the pixel value of the pixel in column i, row j of image NMvtemp^n;
Step (6-2): step the data matrices X^n, X^{n+1} corresponding to two adjacent images of {NMvtemp^n}_{n=0}^{M-1} over each other, the stepping column count δ_g ranging from 1 to 5 columns; compute the mean square deviation S^2(g_n) of the overlapping-region elements at each overlap:
S^2(g_n) = (1/(p·g_n)) ( \sum_{k=0}^{p-1} \sum_{i=0}^{g_n} ( X^{n+1}_{k,i} - X^n_{k,w-i-g_n} )^2 ),  1 ≤ g_n ≤ w
where S^2(g_n) denotes the mean square deviation when g_n columns of the data matrices overlap;
Step (6-3): compute all values S^2(g_n) during the stepping overlap of the data matrices X^n, X^{n+1}, and take the overlapping column counts g_1^n, g_2^n, g_3^n corresponding to the three smallest mean square deviations as the coarse registration positions between the adjacent images NMvtemp^n and NMvtemp^{n+1};
Step (6-4): repeat step (6-2)~step (6-3), performing coarse registration for every two adjacent images of the image sequence {NMvtemp^n}_{n=0}^{M-1}, and obtain the coarse registration position sequence;
Step 7: bar code image essence registration:
Step (7-1): the adjacent two width image NMvtemp of calculating that adopt classical similarity measure method nand NMvtemp n+1at matched position g nthe matching degree R at place n(g n):
R n ( g n ) = &Sigma; i = 0 g n - 1 &Sigma; j = 0 y s - 1 ( pix w - i - g n , j n &times; pix i , j n + 1 ) &Sigma; i = 0 g n - 1 &Sigma; j = 0 y s - 1 ( pix w - i - g n , j n ) 2 &Sigma; i = 0 g n - 1 &Sigma; j = 1 y s - 1 ( pix i , j n + 1 ) 2 , g n &Element; U ( g 1 n , &delta; ) &cap; U ( g 2 n , &delta; ) &cap; U ( g 3 n , &delta; )
The columns δ of fine setting gets δ g+ 1;
Step (7-2): get matching degree R n(g n) maximal value be adjacent two width image NMvtemp nand NMvtemp n+1between optimum matching degree, be designated as R n=max{R n(g n), and by max{R n(g n) corresponding position g nas adjacent two width image NMvtemp nand NMvtemp n+1between smart registration position, and remember smart registration position C n=g n;
Step (7-3): repeating step (7-1)~step (7-2), sequence of computed images
Figure FDA0000460566750000052
in optimum matching degree and the smart registration position of every two width adjacent images, obtain optimum matching number of degrees group
Figure FDA0000460566750000053
with smart registration position array
Figure FDA0000460566750000054
Step 8: Image Mosaics merges and bar-code identification:
Step (8-1): traversal optimum matching number of degrees group
Figure FDA0000460566750000055
with two location point n of matching degree numerical value minimum 1and n 2as waypoint, by image sequence
Figure FDA0000460566750000056
be divided into three parts;
Step (8-2): the image of each part is fade-in to the method for weighted mean gradually going out according to the smart registration position employing between image and splices fusion, obtain the composograph Part of three parts 0, Par t1, Part 2, three width picture traverses are respectively w 0, w 1, w 2, be highly y s, the smart registration position between three width images is respectively
Figure FDA00004605667500000517
PC 1 = C n 2 ;
Step (8-3): calculate
Figure FDA0000460566750000057
middle image NMvtemp 0transverse gradients:
x grad i NMvtem p 0 = &Sigma; j = 0 y s - 1 ( pix i + 1 , j NMvtem p 0 - pix i , j NMvtem p 0 ) , i &Element; [ 0 , w )
Figure FDA0000460566750000059
representative image NMvtemp 0the transverse gradients of middle i row; Obtain array
Figure FDA00004605667500000510
and calculate
Figure FDA00004605667500000511
in maximal value
Figure FDA00004605667500000512
position x lbe image NMvtemp 0middle bar code region and white space boundary position;
Step (8-4): calculate middle image NMvtemp m-1transverse gradients:
x grad i NMvtem p M - 1 = &Sigma; j = 0 y s - 1 ( pix i + 1 , j NMvtem p M - 1 - pix i , j NMvtem p M - 1 ) , i &Element; [ 0 , w )
Figure FDA00004605667500000515
representative image NMvtemp m-1the transverse gradients of middle i row; Obtain array and calculate
Figure FDA0000460566750000061
in maximal value
Figure FDA0000460566750000062
position x rbe image NMvtemp m-1middle bar code region and white space boundary position;
Step (8-5): the width of the image obtained by splicing and fusing Part 0, Part 1 and Part 2 is w m = (x l + y s + w - x r) and its height is y s; set up an image memory buffer of size w m × y s; place image Part 0 at the left side of the buffer and image Part 2 at the right side; test the condition |(w m - w 0 - w 2) - (w 1 - PC 0 - PC 1)| ≤ 10; if the condition holds, place image Part 1 into the buffer according to its fine registration positions PC 0 and PC 1 relative to Part 0 and Part 2, complete the image splicing fusion in the overlapping regions with the gradual fade-in/fade-out weighted-averaging method, and go to step (8-7); if the condition does not hold, go to step (8-6);
Step (8-6): in the image memory buffer, place Part 1 between the two outer images with its starting position overlapping Part 0 by w columns, then step Part 1 to the right until it overlaps Part 2 by only w columns; at each step, merge the regions where Part 1 overlaps Part 0 and Part 2 respectively into a joint overlapping region and compute its matching degree by the same method as step (7-1); take the position with the maximum joint-overlap matching degree over the whole stepping process as the registration position, place image Part 1 into the buffer at that position, and complete the image splicing fusion in the overlapping regions with the gradual fade-in/fade-out weighted-averaging method;
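Step (8-6) can be read as a one-dimensional search for the best horizontal offset of Part 1 inside the buffer. The sketch below reuses matching_degree from the step-7 sketch and, as a simplification, scores the joint overlapping region as the sum of the two individual overlap scores rather than by fusing the two regions into one as the claim describes; all names and the buffer layout are assumptions:

```python
def place_middle_part(part0, part1, part2, w_m, w):
    """Slide part1 from overlapping part0 by w columns until it overlaps part2
    by w columns, returning the offset with the best joint-overlap score."""
    w0, w1, w2 = part0.shape[1], part1.shape[1], part2.shape[1]
    left_of_part2 = w_m - w2                  # column where part2 begins in the buffer
    start = w0 - w                            # part1 overlaps part0 by w columns
    stop = left_of_part2 + w - w1             # part1 overlaps part2 by w columns
    best_score, best_off = -1.0, start
    for off in range(start, stop + 1):
        g0 = max(0, w0 - off)                               # overlap width with part0
        g2 = max(0, off + w1 - left_of_part2)               # overlap width with part2
        score = ((matching_degree(part0, part1, g0) if g0 > 0 else 0.0)
                 + (matching_degree(part1, part2, g2) if g2 > 0 else 0.0))
        if score > best_score:
            best_score, best_off = score, off
    return best_off
```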
Step (8-7): crop the two-dimensional bar code image obtained by splicing fusion in the image memory buffer along its width, keeping the part from column x l to column x l + y s, which yields a new y s × y s Data Matrix two-dimensional bar code image whose module size is M s × M s; read the bar code information of this new Data Matrix image with the decoding system, which decodes it and corrects errors according to the Data Matrix decoding principle and the Reed-Solomon error-correction algorithm.
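A closing sketch of step (8-7): cut the x l .. x l + y s column band out of the spliced buffer and hand it to a Data Matrix decoder. The patent does not name a decoder; pylibdmtx is used here only as an example of one that performs Reed-Solomon error correction internally:

```python
import numpy as np
from PIL import Image
from pylibdmtx.pylibdmtx import decode   # example Data Matrix decoder, not named in the patent

def read_spliced_code(buffer_img, x_l, y_s):
    """Crop the spliced image to a y_s x y_s square starting at column x_l and decode it."""
    square = buffer_img[:, x_l:x_l + y_s].astype(np.uint8)   # new y_s x y_s Data Matrix image
    return [r.data for r in decode(Image.fromarray(square))]
```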
CN201210152638.8A 2012-05-17 2012-05-17 Cylindrical surface bidimensional bar code reading method based on image splicing Expired - Fee Related CN102682266B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201210152638.8A CN102682266B (en) 2012-05-17 2012-05-17 Cylindrical surface bidimensional bar code reading method based on image splicing

Publications (2)

Publication Number Publication Date
CN102682266A CN102682266A (en) 2012-09-19
CN102682266B true CN102682266B (en) 2014-06-11

Family

ID=46814167

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201210152638.8A Expired - Fee Related CN102682266B (en) 2012-05-17 2012-05-17 Cylindrical surface bidimensional bar code reading method based on image splicing

Country Status (1)

Country Link
CN (1) CN102682266B (en)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3104306B2 (en) 2015-06-11 2023-11-01 Scantrust SA Two dimensional barcode
CN106203564B (en) * 2016-06-23 2019-02-01 北京印刷学院 A kind of generation of the two dimensional code on circle-prism assembly surface and acquisition method
CN106529365B (en) * 2016-12-05 2019-09-06 广东工业大学 Automatic price machine
CN108345817A (en) * 2018-02-06 2018-07-31 徐州智融图像科技有限公司 A kind of recognition methods of cylindrical surface Quick Response Code
CN111553317B (en) * 2020-05-14 2023-08-08 北京惠朗时代科技有限公司 Anti-fake code acquisition method and device, computer equipment and storage medium
CN114936631B (en) * 2021-04-26 2023-06-09 华为技术有限公司 Model processing method and device
CN114882370A (en) * 2022-07-07 2022-08-09 西安超嗨网络科技有限公司 Intelligent commodity identification method and device, terminal and storage medium

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102156849A (en) * 2011-04-21 2011-08-17 西北工业大学 Reading device and reading method of two-dimensional bar code marked on metal cylindrical surface directly
CN102354363A (en) * 2011-09-15 2012-02-15 西北工业大学 Identification method of two-dimensional barcode image on high-reflect light cylindrical metal

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7175090B2 (en) * 2004-08-30 2007-02-13 Cognex Technology And Investment Corporation Methods and apparatus for reading bar code identifications

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20140611

Termination date: 20150517

EXPY Termination of patent right or utility model