CN103593838A - Rapid cross-correlation grey-scale image matching method and device - Google Patents
Rapid cross-correlation grey-scale image matching method and device
- Publication number
- CN103593838A CN103593838A CN201310331842.0A CN201310331842A CN103593838A CN 103593838 A CN103593838 A CN 103593838A CN 201310331842 A CN201310331842 A CN 201310331842A CN 103593838 A CN103593838 A CN 103593838A
- Authority
- CN
- China
- Prior art keywords
- image
- matching area
- matching
- target
- target pattern
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Landscapes
- Image Analysis (AREA)
Abstract
The invention discloses a rapid cross-correlation grey-scale image matching method and device for finding the position and orientation of a target pattern in a grey-scale target image. The method comprises the steps of: performing a polar coordinate transform on the template image; obtaining the bounding box of the target pattern in the grey-scale target image and defining it as the target pattern search area; selecting, within the search area, matching areas of the same size as the template image, performing the polar coordinate transform on each matching area, and computing the correlation between the one-dimensional data of each matching area and the one-dimensional data of the template image; and, once the correlation computation has been completed for all matching areas, selecting the matching area with the largest correlation as the target pattern, thereby obtaining its position and orientation. The method narrows the matching search range and improves matching speed, and the polar-coordinate normalized cross-correlation computation achieves full-angle (+/-180 degree) matching of the target image.
Description
Technical field
The invention belongs to the technical field of image processing and, more specifically, relates to a rapid cross-correlation grey-scale image matching method and device.
Background technology
In the integrated circuit (IC) manufacturing industry, high-speed, high-precision chip pick-up and placement is the key factor affecting production efficiency, and pick-up and placement in turn depend on high-speed, high-precision chip positioning. Machine vision uses a camera to photograph the object under inspection and then applies image processing algorithms to compute and analyse the image, thereby completing target detection and localization. As a non-contact measurement technique, machine vision is very widely used in IC manufacturing, and within machine vision, image matching is the key to high-speed, high-precision localization.
The traditional normalized cross-correlation grey-scale matching method searches the entire image for a match, wasting a great deal of matching time. Moreover, because of the mechanical errors of IC manufacturing equipment, the pattern in the target image is often rotated relative to the template pattern, and this rotation makes the image matching algorithm considerably more complex.
Summary of the invention
In view of the above defects or needs for improvement of the prior art, the invention provides a normalized cross-correlation grey-scale matching method capable of rapid full-angle matching. Its purpose is to increase grey-scale matching speed and thereby solve the technical problem that excessively long image matching times in IC manufacturing equipment reduce production efficiency.
To achieve the above object, according to one aspect of the present invention, a rapid cross-correlation grey-scale image matching method is provided for finding the position and orientation of a target pattern in a grey-scale target image, comprising:
(1) performing a polar coordinate transform on the template image and converting the transformed template data into a one-dimensional array, defined as the template one-dimensional array;
(2) obtaining the bounding box of the target pattern in the grey-scale target image and defining it as the target pattern search area, where the bounding box is the smallest rectangle containing the target pattern;
(3) selecting, within the target pattern search area, a matching area of the same size as the template image, performing the polar coordinate transform on the matching area, converting the transformed matching-area image data into a one-dimensional array, defined as the matching-area one-dimensional array, and computing the correlation between the matching-area one-dimensional data and the template one-dimensional array;
(4) repeating step (3) until the correlation computation has been completed for all matching areas, then selecting the matching area with the largest correlation as the target pattern and obtaining the position and orientation of the target pattern.
With the proposed method, the acquired image is pre-processed to determine the position of the bounding box of the pattern in the target image, which narrows the matching search range and improves matching speed. In addition, the template image and the search-area image are converted into one-dimensional arrays and the normalized cross-correlation is computed in polar form, achieving full-angle (+/-180 degree) matching of the target image.
Preferably, step (1) specifically comprises:
Let the width of the template image be W_M and its height H_M. The grey value of a template pixel is Mode[Y][X], where 0 ≤ Y < H_M and 0 ≤ X < W_M, and the centre of the template is Y_CM = (H_M - 1)/2, X_CM = (W_M - 1)/2. Taking the template centre as the centre of the circle and the minimum of W_M/2 and H_M/2 as the radius R_M, perform the polar coordinate transform; the transformed image is represented as a one-dimensional array of grey values: ModePolar[P] = Mode[Y][X], where Y = FLOOR(Y_CM - ρ·sin θ_M), X = FLOOR(X_CM + ρ·cos θ_M), FLOOR denotes the largest integer not greater than its argument, ρ = 1, 2, ..., R_M, θ_M = 0, Δθ, 2Δθ, ..., 2π, Δθ is the angular matching precision, P = 0, 1, 2, ..., N, N = CEIL((2π/Δθ + 1)·R_M - 1), and CEIL denotes the smallest integer not less than its argument. The array ModePolar[P] is then duplicated and the two copies are joined end to end to form the new one-dimensional array ModePolarD[Q], where Q = 0, 1, 2, ..., 2N; finally the normalized cross-correlation parameters are computed.
By converting the two-dimensional image data into one-dimensional data through the polar coordinate transform, the complex matching problem with a rotation angle is turned into the simple problem of computing cross-correlation values between one-dimensional arrays; at the same time, some of the parameters of the normalized cross-correlation are computed off-line, which improves the speed of on-line matching.
Preferably, the assignment procedure for ModePolar[P] is:
(A11) initialize the parameters: ρ = 1, θ_M = 0, P = 0, Y_CM = (H_M - 1)/2, X_CM = (W_M - 1)/2;
(A12) compute Y = FLOOR(Y_CM - ρ·sin θ_M), X = FLOOR(X_CM + ρ·cos θ_M);
(A13) transform the template data to polar coordinates: ModePolar[P] = Mode[Y][X];
(A14) ρ = ρ + 1, P = P + 1;
(A15) determine whether ρ is greater than R_M; if so, execute step (A16), otherwise return to step (A12);
(A16) ρ = 0, θ_M = θ_M + Δθ;
(A17) determine whether θ_M is greater than 2π; if so, terminate, otherwise return to step (A12).
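As a concrete illustration, the sampling loop of steps (A11) to (A17) can be sketched in Python as follows. This is a minimal sketch, not the patent's implementation: `polar_to_1d` is a hypothetical helper name, a precomputed angle count replaces the running θ_M accumulator so the sample count is deterministic, and the template is assumed odd-sized so all indices stay in bounds.

```python
import numpy as np

def polar_to_1d(img, d_theta):
    """Sample a grey image along concentric circles (cf. steps (A11)-(A17)):
    for each angle theta = 0, d_theta, ..., 2*pi and each radius
    rho = 1 .. R_M, take the pixel at that polar position and append its
    grey value to a one-dimensional array."""
    h, w = img.shape
    yc, xc = (h - 1) / 2.0, (w - 1) / 2.0           # centre (Y_CM, X_CM)
    r = int(min(w, h) // 2)                         # radius R_M = min(W_M/2, H_M/2)
    n_angles = int(round(2 * np.pi / d_theta)) + 1  # theta = 0, d_theta, ..., 2*pi
    samples = []
    for k in range(n_angles):
        theta = k * d_theta
        for rho in range(1, r + 1):
            y = int(np.floor(yc - rho * np.sin(theta)))  # Y = FLOOR(Y_CM - rho*sin)
            x = int(np.floor(xc + rho * np.cos(theta)))  # X = FLOOR(X_CM + rho*cos)
            samples.append(img[y, x])
    return np.array(samples)
```

With an 11 x 11 template and Δθ = π/2, this yields 5 angles x 5 radii = 25 samples, and the first sample (θ = 0, ρ = 1) is the pixel one step to the right of the centre.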
Preferably, step (2) is specifically:
Set an appropriate grey threshold and binarize the grey-scale target image, so that the binarized image becomes black and white. Perform connected-component analysis on the binarized image using pixel labelling or run-length connectivity analysis, and compute the bounding box of the target pattern, where the bounding box is the smallest rectangle containing the target pattern.
Let the pixel coordinates of the upper-left corner of the computed bounding box, relative to the origin of the target image, be (X_B, Y_B), with the bounding box of width W_B and height H_B. The upper-left corner of the target pattern search area in the original target image, relative to the target image origin, is X_S0 = X_B - Offset, Y_S0 = Y_B - Offset; the width of the search area is W_S = W_B + 2 × Offset and its height is H_S = H_B + 2 × Offset, where Offset is an integer with a value in the range of 5 to 10 pixels.
By computing the bounding box, the target pattern is quickly coarse-located, which narrows the search range of the subsequent cross-correlation matching and improves image matching speed.
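A minimal Python sketch of this coarse localization, assuming a single dark pattern on a bright background so that the full connected-component labelling described in the patent can be replaced by the bounding box of all foreground pixels; `search_region` is a hypothetical helper name and the border clamp is an added safeguard not in the patent text.

```python
import numpy as np

def search_region(target, threshold, offset=5):
    """Coarse-locate the pattern (cf. step (2)): binarize, take the bounding
    box of the foreground, and expand it by `offset` pixels on each side."""
    fg = target < threshold            # pattern assumed darker than background
    ys, xs = np.nonzero(fg)
    xb, yb = xs.min(), ys.min()                    # bounding-box corner (X_B, Y_B)
    wb, hb = xs.max() - xb + 1, ys.max() - yb + 1  # W_B, H_B
    xs0 = max(xb - offset, 0)                      # X_S0 = X_B - Offset (clamped)
    ys0 = max(yb - offset, 0)                      # Y_S0 = Y_B - Offset (clamped)
    ws, hs = wb + 2 * offset, hb + 2 * offset      # W_S, H_S
    return xs0, ys0, ws, hs
```

For a 10 x 10 dark blob whose upper-left corner sits at (15, 20) in a 50 x 50 bright image, an Offset of 5 gives a 20 x 20 search area starting at (10, 15).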
Preferably, step (3) specifically comprises:
Crop the image of the target pattern search area from the original target image; this is called the search image, with width W_S and height H_S. The X coordinate in the search image is denoted X_S, the Y coordinate Y_S, and MaxCcorr denotes the maximum cross-correlation value between the search image and the template image. Normalized cross-correlation grey-scale matching is performed in the search image as follows:
B1) initialize the parameters: X_S = 0, Y_S = 0, MaxCcorr = 0;
B2) determine whether Y_S is less than H_S; if not, terminate; if so, execute B3);
B3) determine whether X_S is less than W_S; if not, set X_S = 0, Y_S = Y_S + 1 and return to step B2); if so, execute step B4);
B4) perform the polar coordinate transform on the region of the search image that has the same size as the template; the image of this region is called the matching-area image. Its width and height equal those of the template image, W_M and H_M. The grey value of a point in the matching area is Image[Y][X], where Y_S ≤ Y < Y_S + H_M and X_S ≤ X < X_S + W_M, and the centre of the matching area is Y_CI = Y_S + (H_M - 1)/2, X_CI = X_S + (W_M - 1)/2. Taking the matching-area centre as the centre of the circle and the minimum of W_M/2 and H_M/2 as the radius R_M, perform the polar coordinate transform; the transformed image is represented as the one-dimensional array of grey values ImagePolar[P] = Image[Y][X], where Y = FLOOR(Y_CI - ρ·sin θ_I), X = FLOOR(X_CI + ρ·cos θ_I), FLOOR denotes the largest integer not greater than its argument, ρ = 1, 2, ..., R_M, θ_I = 0, Δθ, 2Δθ, ..., 2π, Δθ is the angular matching precision, P = 0, 1, 2, ..., N, N = CEIL((2π/Δθ + 1)·R_M - 1), and CEIL denotes the smallest integer not less than its argument;
B5) compute the normalized cross-correlation parameters;
B6) initialize the integer i = 0;
B7) determine whether i is greater than N; if so, set X_S = X_S + 1 and return to B3); otherwise execute B8);
B8) compute the normalized cross-correlation value Ccorr_i between the template image and the matching area;
B9) determine whether Ccorr_i is greater than MaxCcorr; if not, execute B10); if so, assign Ccorr_i to MaxCcorr and record the current coordinates of the matching-area centre (X_T, Y_T), where X_T = X_S + (W_M - 1)/2 and Y_T = Y_S + (H_M - 1)/2, and record the angle θ_T = Δθ·FLOOR((P + R_M)/R_M);
B10) i = i + 1; return to step B7).
By converting both the template image and the target image into one-dimensional arrays through the polar coordinate transform, the rotational matching of template and target is turned into the problem of computing cross-correlation values between two one-dimensional arrays under relative translation. This avoids rotating an image by a fixed angle increment and matching once per rotation, thereby shortening the matching time and improving matching speed.
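The rotation-as-translation idea can be illustrated with the following Python sketch of the inner loop of steps B6) to B10). This is a sketch under assumptions: `best_circular_ncc` is a hypothetical helper name, and the standard zero-mean normalized cross-correlation formula is used, since the patent text does not reproduce its formula.

```python
import numpy as np

def best_circular_ncc(template_1d, region_1d):
    """Find the circular shift of the template that maximizes normalized
    cross-correlation with the region array. The template array is doubled
    end to end (cf. ModePolarD), so every rotation becomes a plain
    translation of a length-n window over the doubled array."""
    t = np.asarray(template_1d, dtype=float)
    r = np.asarray(region_1d, dtype=float)
    n = len(t)
    doubled = np.concatenate([t, t])      # two copies joined end to end
    best_ccorr, best_shift = -1.0, 0
    for i in range(n):                    # each i is one candidate rotation
        window = doubled[i:i + n]
        num = np.sum((window - window.mean()) * (r - r.mean()))
        den = np.sqrt(np.sum((window - window.mean()) ** 2) *
                      np.sum((r - r.mean()) ** 2))
        ccorr = num / den if den > 0 else 0.0
        if ccorr > best_ccorr:            # keep the best shift (cf. B9)
            best_ccorr, best_shift = ccorr, i
    return best_shift, best_ccorr
```

In the patent's layout the polar array is angle-major (all radii for one angle, then the next angle), so a shift that is a multiple of R_M corresponds to a rotation by that many Δθ steps, which is what the θ_T bookkeeping of step B9) records.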
Preferably, the assignment procedure for ImagePolar[P] is:
(B41) initialize the parameters: ρ = 1, θ_I = 0, P = 0, Y_CI = Y_S + (H_M - 1)/2, X_CI = X_S + (W_M - 1)/2;
(B42) compute Y = FLOOR(Y_CI - ρ·sin θ_I), X = FLOOR(X_CI + ρ·cos θ_I);
(B43) transform the matching-area data to polar coordinates: ImagePolar[P] = Image[Y][X];
(B44) ρ = ρ + 1, P = P + 1;
(B45) determine whether ρ is greater than R_M; if so, execute step (B46), otherwise return to step (B42);
(B46) ρ = 0, θ_I = θ_I + Δθ;
(B47) determine whether θ_I is greater than 2π; if so, terminate, otherwise return to step (B42).
Preferably, step (4) is specifically:
If in step B2) Y_S is no longer less than H_S, the computation ends. Once the correlation computation has been completed for all matching areas, the position (X_F, Y_F) and angle θ_F of the template image in the target image are obtained, where X_F = X_T + X_S0, Y_F = Y_T + Y_S0 and θ_F = θ_T, and X_T, Y_T and θ_T are the values last saved before the computation in steps B1) to B10) ends.
According to another aspect of the present invention, a rapid cross-correlation image matching device is provided, characterized in that it finds the position and orientation of a target pattern in a grey-scale target image, the device comprising:
a first module for performing a polar coordinate transform on the template image and converting the transformed template data into a one-dimensional array, defined as the template one-dimensional array;
a second module for obtaining the bounding box of the target pattern in the grey-scale target image, defined as the target pattern search area;
a third module for selecting, within the target pattern search area, a matching area of the same size as the template image, performing the polar coordinate transform on the matching area, converting the transformed matching-area image data into a one-dimensional array, defined as the matching-area one-dimensional array, and computing the correlation between the matching-area one-dimensional data and the template one-dimensional array;
a fourth module for, when the third module has completed the correlation computation for all matching areas, selecting the matching area with the largest correlation as the target pattern and obtaining the position and orientation of the target pattern.
The beneficial effects of the method provided by the present invention are:
1. Exploiting the fact that, in the images acquired in IC manufacturing, the chip pattern and the chip background differ considerably in grey level, the acquired image is pre-processed by binarization and connected-component analysis to determine the position of the bounding box of the pattern in the target image, which narrows the matching search range and improves matching speed.
2. The template image and the search-area image are converted into one-dimensional arrays and the normalized cross-correlation is computed in polar form, achieving full-angle (+/-180 degree) matching of the target image.
Accompanying drawing explanation
Fig. 1 is the overall flowchart of the rapid cross-correlation grey-scale image matching method of the present invention;
Fig. 2 is the flowchart of the polar coordinate transform of the template image in an embodiment of the present invention;
Fig. 3 is the flowchart of searching the target image for the template image in an embodiment of the present invention;
Fig. 4 is the flowchart of the polar coordinate transform of the matching-area image in an embodiment of the present invention.
Embodiment
In order to make the objects, technical solutions and advantages of the present invention clearer, the present invention is further elaborated below in conjunction with the drawings and embodiments. It should be understood that the specific embodiments described herein are only intended to explain the present invention, not to limit it. In addition, the technical features involved in the embodiments of the present invention described below may be combined with each other as long as they do not conflict.
As shown in Fig. 1, the invention provides a rapid cross-correlation grey-scale image matching method for finding the position and orientation of a target pattern in a grey-scale target image, the method comprising:
(1) performing a polar coordinate transform on the template image and converting the transformed template data into a one-dimensional array, defined as the template one-dimensional array;
(2) obtaining the bounding box of the target pattern in the grey-scale target image, defined as the target pattern search area;
(3) selecting, within the target pattern search area, a matching area of the same size as the template image, performing the polar coordinate transform on the matching area, converting the transformed matching-area image data into a one-dimensional array, defined as the matching-area one-dimensional array, and computing the correlation between the matching-area one-dimensional data and the template one-dimensional array;
(4) repeating step (3) until the correlation computation has been completed for all matching areas, then selecting the matching area with the largest correlation as the target pattern and obtaining the position and orientation of the target pattern.
With the proposed method, the acquired image is pre-processed to determine the position of the bounding box of the pattern in the target image, which narrows the matching search range and improves matching speed; in addition, the template image and the search-area image are converted into one-dimensional arrays and the normalized cross-correlation is computed in polar form, achieving full-angle (+/-180 degree) matching of the target image.
Specifically, the matching process for the target pattern in a grey-scale image is elaborated below. A grey-scale image can be represented as a two-dimensional array indexed by row and column, where the value of each array element is the grey value of that pixel. The upper-left corner of the image is taken as the origin of the image coordinates; the vertically downward direction is the positive Y direction, in which the row index increases, representing the height of the image; the horizontal rightward direction is the positive X direction, in which the column index increases, representing the width of the image.
The method is divided into an off-line stage and an on-line stage.
In the off-line stage, the template image is first pre-processed, with the following steps:
A1) perform the polar coordinate transform on the template image and save the transformed data as a one-dimensional array, as follows:
Suppose the width of the template image is W_M and its height H_M. The grey value of a template pixel is Mode[Y][X], where 0 ≤ Y < H_M and 0 ≤ X < W_M, and the centre of the template is Y_CM = (H_M - 1)/2, X_CM = (W_M - 1)/2. Taking the template centre as the centre of the circle and the minimum of W_M/2 and H_M/2 as the radius R_M, perform the polar coordinate transform; the transformed image is represented as a one-dimensional array of grey values:
ModePolar[P] = Mode[Y][X]
Y = FLOOR(Y_CM - ρ·sin θ_M)
X = FLOOR(X_CM + ρ·cos θ_M)
where FLOOR denotes the largest integer not greater than its argument, ρ = 1, 2, ..., R_M, θ_M = 0, Δθ, 2Δθ, ..., 2π, and Δθ is the angular matching precision; for example, if the required angular matching precision is 1 degree, then Δθ = π/180. Further, P = 0, 1, 2, ..., N, N = CEIL((2π/Δθ + 1)·R_M - 1), where CEIL denotes the smallest integer not less than its argument. As shown in Fig. 2, the assignment procedure for ModePolar[P] is:
(A11) initialize the parameters: ρ = 1, θ_M = 0, P = 0, Y_CM = (H_M - 1)/2, X_CM = (W_M - 1)/2;
(A12) compute Y = FLOOR(Y_CM - ρ·sin θ_M), X = FLOOR(X_CM + ρ·cos θ_M);
(A13) transform the template data to polar coordinates: ModePolar[P] = Mode[Y][X];
(A14) ρ = ρ + 1, P = P + 1;
(A15) determine whether ρ is greater than R_M; if so, execute step (A16), otherwise return to step (A12);
(A16) ρ = 0, θ_M = θ_M + Δθ;
(A17) determine whether θ_M is greater than 2π; if so, end A1) and execute A2); otherwise return to step (A12).
A2) compute the partial parameters in the normalized cross-correlation formula.
A3) duplicate the array ModePolar[P] and join the two copies end to end to form the new one-dimensional array ModePolarD[Q], where Q = 0, 1, 2, ..., 2N.
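What the off-line stage buys can be sketched as follows. The cached quantities (cumulative sums over the doubled array, giving per-window sums in constant time) are one plausible choice of the "partial parameters", since the patent does not list them explicitly; the helper names are assumptions.

```python
import numpy as np

def precompute_template(template_1d):
    """Off-line steps A2)-A3): double the polar array (ModePolarD) so that
    rotations become translations, and cache cumulative sums so per-window
    statistics of the cross-correlation formula need no rescanning."""
    t = np.asarray(template_1d, dtype=float)
    doubled = np.concatenate([t, t])                       # ModePolarD
    csum = np.concatenate([[0.0], np.cumsum(doubled)])     # prefix sums
    csum2 = np.concatenate([[0.0], np.cumsum(doubled**2)]) # prefix sums of squares
    return {"doubled": doubled, "n": len(t), "cumsum": csum, "cumsum2": csum2}

def window_stats(pre, i):
    """Mean and sum of squares of window i of length n, in O(1) time."""
    n = pre["n"]
    s = pre["cumsum"][i + n] - pre["cumsum"][i]
    s2 = pre["cumsum2"][i + n] - pre["cumsum2"][i]
    return s / n, s2
```

Every circular window of the template has the same multiset of values, so its mean and sum of squares are shift-invariant, which is exactly why they can be computed once off-line.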
In the on-line stage, the target image is searched for the template image pattern, with the following steps:
B1) Copy the original target image and binarize the copy by setting an appropriate grey threshold, so that the binarized image becomes black and white. Because the chip pattern and the chip background in images acquired in IC manufacturing differ considerably in grey level, after binarization most of the chip becomes one colour and the background the opposite colour; for example, most of the chip is black and the background white. Perform connected-component analysis on the binarized target image using methods such as pixel labelling or run-length connectivity analysis, and compute the bounding box of the chip image region, where the bounding box is the smallest rectangle containing the target region (here, the chip pattern) and is generally axis-aligned. Suppose the pixel coordinates of the upper-left corner of the computed bounding box, relative to the origin of the target image, are (X_B, Y_B), and the bounding box has width W_B and height H_B. The upper-left corner of the search area in the original target image, relative to the target image origin, is X_S0 = X_B - Offset, Y_S0 = Y_B - Offset; the width of the search area is W_S = W_B + 2 × Offset and its height is H_S = H_B + 2 × Offset, where Offset is an integer with a value in the range of 5 to 10 pixels.
Crop the image of the search area from the original target image; this is called the search image, with width W_S and height H_S. The X coordinate in the search image is denoted X_S, the Y coordinate Y_S, and MaxCcorr denotes the maximum cross-correlation value between the search image and the template image. Normalized cross-correlation grey-scale matching is performed in the search image; as shown in Fig. 3, the matching procedure is as follows:
B2) initialize the parameters: X_S = 0, Y_S = 0, MaxCcorr = 0;
B3) determine whether Y_S is less than H_S; if not, terminate; if so, execute B4);
B4) determine whether X_S is less than W_S; if not, set X_S = 0, Y_S = Y_S + 1 and return to step B3); if so, execute step B5);
B5) perform the polar coordinate transform on the region of the search image that has the same size as the template; the image of this region is called the matching-area image. Its width and height equal those of the template image, W_M and H_M. The grey value of a point in the matching area is Image[Y][X], where Y_S ≤ Y < Y_S + H_M and X_S ≤ X < X_S + W_M, and the centre of the matching area is Y_CI = Y_S + (H_M - 1)/2, X_CI = X_S + (W_M - 1)/2. Taking the matching-area centre as the centre of the circle and the minimum of W_M/2 and H_M/2 as the radius R_M, perform the polar coordinate transform; the transformed image is represented as the one-dimensional array of grey values ImagePolar[P] = Image[Y][X], where Y = FLOOR(Y_CI - ρ·sin θ_I), X = FLOOR(X_CI + ρ·cos θ_I), FLOOR denotes the largest integer not greater than its argument, ρ = 1, 2, ..., R_M, θ_I = 0, Δθ, 2Δθ, ..., 2π, Δθ is the angular matching precision, identical to that of the template image (for example, for an angular matching precision of 1 degree, Δθ = π/180), P = 0, 1, 2, ..., N, N = CEIL((2π/Δθ + 1)·R_M - 1), and CEIL denotes the smallest integer not less than its argument. As shown in Fig. 4, the assignment procedure for ImagePolar[P] is:
(B51) initialize the parameters: ρ = 1, θ_I = 0, P = 0, Y_CI = Y_S + (H_M - 1)/2, X_CI = X_S + (W_M - 1)/2;
(B52) compute Y = FLOOR(Y_CI - ρ·sin θ_I), X = FLOOR(X_CI + ρ·cos θ_I);
(B53) transform the matching-area data to polar coordinates: ImagePolar[P] = Image[Y][X];
(B54) ρ = ρ + 1, P = P + 1;
(B55) determine whether ρ is greater than R_M; if so, execute step (B56), otherwise return to step (B52);
(B56) ρ = 0, θ_I = θ_I + Δθ;
(B57) determine whether θ_I is greater than 2π; if so, end B5) and execute B6); otherwise return to step (B52).
B6) compute the normalized cross-correlation parameters;
B7) initialize the integer i = 0;
B8) determine whether i is greater than N; if so, set X_S = X_S + 1 and return to B4); otherwise execute B9);
B9) compute the normalized cross-correlation value Ccorr_i between the template image and the matching area;
B10) determine whether Ccorr_i is greater than MaxCcorr; if not, execute B11); if so, assign Ccorr_i to MaxCcorr and record the current coordinates of the matching-area centre (X_T, Y_T), where X_T = X_S + (W_M - 1)/2 and Y_T = Y_S + (H_M - 1)/2, and record the angle θ_T = Δθ·FLOOR((P + R_M)/R_M);
B11) i = i + 1; return to step B8).
After steps B1) to B11), if in step B3) Y_S is no longer less than H_S, the computation ends. The position (X_F, Y_F) and angle θ_F of the template image in the target image are then obtained, where X_F = X_T + X_S0, Y_F = Y_T + Y_S0 and θ_F = θ_T, and X_T, Y_T and θ_T are the values last saved before the computation in steps B1) to B11) ends.
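Mapping the recorded match back into original image coordinates is then a one-line translation; a sketch with assumed names:

```python
def final_pose(x_t, y_t, theta_t, x_s0, y_s0):
    """Step (4): translate the best-match centre from search-image
    coordinates back into the original target image:
    X_F = X_T + X_S0, Y_F = Y_T + Y_S0, theta_F = theta_T."""
    return x_t + x_s0, y_t + y_s0, theta_t
```

For example, a match centre recorded at (40, 25) inside a search area whose upper-left corner sits at (100, 200) in the original image lies at (140, 225); the angle is unchanged by the translation.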
Those skilled in the art will readily understand that the foregoing is only a preferred embodiment of the present invention and is not intended to limit it; any modifications, equivalent substitutions and improvements made within the spirit and principles of the present invention shall all be included within the scope of protection of the present invention.
Claims (8)
1. A rapid cross-correlation grey-scale image matching method, characterized in that it finds the position and orientation of a target pattern in a grey-scale target image, comprising:
(1) performing a polar coordinate transform on the template image and converting the transformed template data into a one-dimensional array, defined as the template one-dimensional array;
(2) obtaining the bounding box of the target pattern in the grey-scale target image and defining it as the target pattern search area, where the bounding box is the smallest rectangle containing the target pattern;
(3) selecting, within the target pattern search area, a matching area of the same size as the template image, performing the polar coordinate transform on the matching area, converting the transformed matching-area image data into a one-dimensional array, defined as the matching-area one-dimensional array, and computing the correlation between the matching-area one-dimensional data and the template one-dimensional array;
(4) repeating step (3) until the correlation computation has been completed for all matching areas, then selecting the matching area with the largest correlation as the target pattern and obtaining the position and orientation of the target pattern.
2. The method of claim 1, characterized in that step (1) specifically comprises: letting the width of the template image be W_M and its height H_M, where the grey value of a template pixel is Mode[Y][X], 0 ≤ Y < H_M and 0 ≤ X < W_M, and the centre of the template is Y_CM = (H_M - 1)/2, X_CM = (W_M - 1)/2; taking the template centre as the centre of the circle and the minimum of W_M/2 and H_M/2 as the radius R_M, performing the polar coordinate transform, the transformed image being represented as a one-dimensional array of grey values: ModePolar[P] = Mode[Y][X], where Y = FLOOR(Y_CM - ρ·sin θ_M), X = FLOOR(X_CM + ρ·cos θ_M), FLOOR denotes the largest integer not greater than its argument, ρ = 1, 2, ..., R_M, θ_M = 0, Δθ, 2Δθ, ..., 2π, Δθ is the angular matching precision, P = 0, 1, 2, ..., N, N = CEIL((2π/Δθ + 1)·R_M - 1), and CEIL denotes the smallest integer not less than its argument; duplicating the array ModePolar[P] and joining the two copies end to end to form the new one-dimensional array ModePolarD[Q], where Q = 0, 1, 2, ..., 2N; and computing the normalized cross-correlation parameters.
3. The method of claim 2, characterized in that the assignment procedure for ModePolar[P] is:
(A11) initialize the parameters: ρ = 1, θ_M = 0, P = 0, Y_CM = (H_M - 1)/2, X_CM = (W_M - 1)/2;
(A12) compute Y = FLOOR(Y_CM - ρ·sin θ_M), X = FLOOR(X_CM + ρ·cos θ_M);
(A13) transform the template data to polar coordinates: ModePolar[P] = Mode[Y][X];
(A14) ρ = ρ + 1, P = P + 1;
(A15) determine whether ρ is greater than R_M; if so, execute step (A16), otherwise return to step (A12);
(A16) ρ = 0, θ_M = θ_M + Δθ;
(A17) determine whether θ_M is greater than 2π; if so, terminate, otherwise return to step (A12).
4. The method of claim 1, characterized in that step (2) is specifically:
setting an appropriate grey threshold and binarizing the grey-scale target image so that the binarized image becomes black and white; performing connected-component analysis on the binarized image using pixel labelling or run-length connectivity analysis, and computing the bounding box of the target pattern, where the bounding box is the smallest rectangle containing the target pattern;
letting the pixel coordinates of the upper-left corner of the computed bounding box, relative to the origin of the target image, be (X_B, Y_B), with the bounding box of width W_B and height H_B; the upper-left corner of the target pattern search area in the original target image, relative to the target image origin, is X_S0 = X_B - Offset, Y_S0 = Y_B - Offset, and the search area has width W_S = W_B + 2 × Offset and height H_S = H_B + 2 × Offset, where Offset is an integer with a value in the range of 5 to 10 pixels.
5. method as claimed in claim 4, is characterized in that, described step (3) specifically comprises:
From former target image, intercept the image in the target pattern field of search, be called searching image, the width of searching image is W
s, be highly H
s, searching image X coordinates table is shown X
s, Y coordinates table is shown Y
s, MaxCcorr represents the maximum cross-correlation value of searching image and template image, is normalized simple crosscorrelation Gray-scale Matching in searching image, coupling flow process is as follows:
B1) initiation parameter: X
s=0, Y
s=0, MaxCcorr=0;
B2) judgement Y
swhether be less than H
sif, be not less than, finish, if be less than, carry out B3);
B3) judgement X
swhether be less than W
sif be not less than X
s=0, Y
s=Y
s+ 1, return and carry out B2) step, if be less than, carry out B4) step;
B4) Apply the polar coordinate conversion to the image region of the search image that has the same size as the template; the image in this region is called the matching area image. The width and height of the matching area image are identical to those of the template image, namely W_m and H_m. The grayscale value of a pixel in the matching area is Image[Y][X], where Y_s ≤ Y < Y_s + H_m and X_s ≤ X < X_s + W_m. The center point coordinates of the matching area are Y_cI = Y_s + (H_m - 1)/2 and X_cI = X_s + (W_m - 1)/2. Taking the center of the matching area as the circle center and the minimum of W_m/2 and H_m/2 as the radius R_m, perform the polar coordinate conversion. The converted image is represented as a one-dimensional array of pixel grayscale values ImagePolar[P] = Image[Y][X], where Y = FLOOR(Y_cI - ρ sin θ_i), X = FLOOR(X_cI + ρ cos θ_i), FLOOR denotes the largest integer not greater than the number in brackets, ρ = 1, 2, ..., R_m, θ_i = 0, Δθ, 2Δθ, ..., 2π, Δθ is the precision of the angle matching, P = 0, 1, 2, ..., N, and N = CEIL((2π/Δθ + 1)R_m - 1), where CEIL denotes the smallest integer not less than the number in brackets;
B5) Calculate the normalized cross-correlation parameters:
B6) Initialize the integer i = 0;
B7) Judge whether i is greater than N; if so, set X_s = X_s + 1 and return to B3); otherwise execute B8);
B8) Calculate the normalized cross-correlation value Ccorr_i between the template image and the matching area:
B9) Judge whether Ccorr_i is greater than MaxCcorr; if not, execute B10); if so, assign Ccorr_i to MaxCcorr and record the coordinates of the matching area center point at this moment, (X_t, Y_t), where X_t = X_s + (W_m - 1)/2 and Y_t = Y_s + (H_m - 1)/2, and record the angle θ_t = Δθ · FLOOR((P + R_m)/R_m);
B10) Set i = i + 1 and return to step B7).
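The claim does not reproduce the normalized cross-correlation formula of B8) in text (it appears as an equation image in the original document), so the sketch below uses the standard zero-mean NCC between two equal-length one-dimensional arrays; how index i maps to a shift of the polar array is specific to the patent and is not assumed here:

```python
import numpy as np

def ncc(a, b):
    """Zero-mean normalized cross-correlation of two equal-length
    1-D arrays; returns a value in [-1, 1], 1 for a perfect match."""
    a = np.asarray(a, dtype=float) - np.mean(a)
    b = np.asarray(b, dtype=float) - np.mean(b)
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return float((a * b).sum() / denom) if denom else 0.0

template = [1.0, 2.0, 3.0, 4.0]
print(ncc(template, template))        # -> 1.0
print(ncc(template, template[::-1]))  # -> -1.0
```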
6. The method as claimed in claim 5, characterized in that the assignment procedure of ImagePolar[P] is:
(B41) Initialize the parameters: ρ = 1, θ_i = 0, P = 0, Y_cI = Y_s + (H_m - 1)/2, X_cI = X_s + (W_m - 1)/2;
(B42) Calculate the parameters: Y = FLOOR(Y_cI - ρ sin θ_i), X = FLOOR(X_cI + ρ cos θ_i);
(B43) Apply the polar coordinate conversion to the matching area data: ImagePolar[P] = Image[Y][X];
(B44) ρ = ρ + 1, P = P + 1;
(B45) Judge whether ρ is greater than R_m; if so, execute step (B46); if not, return to step (B42);
(B46) ρ = 0, θ_i = θ_i + Δθ;
(B47) Judge whether θ_i is greater than 2π; if so, finish; if not, return to step (B42).
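Steps (B41)–(B47) translate almost line for line into Python; `polar_unwrap` is a hypothetical name, ρ is iterated as 1…R_m per the claim's initialization (the ρ = 0 reset of (B46) is simplified away), and no bounds checking is added because the radius choice of step B4 is assumed to keep samples inside the matching area:

```python
import math

def polar_unwrap(image, xc, yc, r_max, dtheta):
    """Sample `image` (a 2-D list) along rays from center (xc, yc)
    into a 1-D list, radius-first per angle, mirroring (B41)-(B47)."""
    polar = []
    theta = 0.0
    while theta <= 2 * math.pi:              # (B47) angle loop
        for rho in range(1, r_max + 1):      # (B42)-(B45) radial walk
            y = math.floor(yc - rho * math.sin(theta))
            x = math.floor(xc + rho * math.cos(theta))
            polar.append(image[y][x])        # (B43) ImagePolar[P] = Image[Y][X]
        theta += dtheta                      # (B46) advance the angle
    return polar

# Hypothetical 5x5 image with pixel value 10*row + column
img = [[10 * y + x for x in range(5)] for y in range(5)]
print(polar_unwrap(img, 2, 2, 2, math.pi / 2)[:4])  # -> [23, 24, 12, 2]
```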
7. The method as claimed in claim 5, characterized in that step (4) is specifically:
If in step B2) Y_s is no longer less than H_s, the calculation finishes. When the correlation calculation for all matching areas is complete, the position (X_f, Y_f) and the angle θ_f of the template image in the target image are obtained, where X_f = X_t + X_s0, Y_f = Y_t + Y_s0, and θ_f = θ_t, with X_t, Y_t, and θ_t being the values last saved before the calculation of B1)–B10) finishes.
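Mapping the recorded match center from search-region coordinates back to original-image coordinates is plain addition; a minimal sketch, with all input values hypothetical:

```python
def to_original_coords(xt, yt, theta_t, xs0, ys0):
    """Translate the matched center (X_t, Y_t), found in search-region
    coordinates, back to original-image coordinates (X_f, Y_f)."""
    return xt + xs0, yt + ys0, theta_t

print(to_original_coords(12, 7, 0.35, 40, 25))  # -> (52, 32, 0.35)
```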
8. A fast cross-correlation image matching device for finding the position and direction of a target pattern in a grayscale target image, characterized in that the device comprises:
a first module for applying the polar coordinate conversion to the template image and converting the converted template image data into a one-dimensional array, defined as the template image one-dimensional array;
a second module for obtaining the bounding box of the target pattern in the grayscale target image, defined as the target pattern search region;
a third module for choosing a matching area of the same size as the template image within the target pattern search region, applying the polar coordinate conversion to the matching area, converting the converted matching area image data into a one-dimensional array, defined as the matching area one-dimensional array, and calculating the correlation between the matching area one-dimensional array and the template image one-dimensional array;
a fourth module for choosing, when the third module has completed the correlation calculation for all matching areas, the matching area with the largest correlation as the target pattern, and obtaining the position and direction of the target pattern.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201310331842.0A CN103593838B (en) | 2013-08-01 | 2013-08-01 | A kind of cross-correlation gray level image matching method and device fast |
Publications (2)
Publication Number | Publication Date |
---|---|
CN103593838A true CN103593838A (en) | 2014-02-19 |
CN103593838B CN103593838B (en) | 2016-04-13 |
Family
ID=50083963
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201310331842.0A Active CN103593838B (en) | 2013-08-01 | 2013-08-01 | A kind of cross-correlation gray level image matching method and device fast |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN103593838B (en) |
Cited By (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104081435A (en) * | 2014-04-29 | 2014-10-01 | 中国科学院自动化研究所 | Image matching method based on cascading binary encoding |
CN105740899A (en) * | 2016-01-29 | 2016-07-06 | 长安大学 | Machine vision image characteristic point detection and matching combination optimization method |
CN105843972A (en) * | 2016-06-13 | 2016-08-10 | 北京京东尚科信息技术有限公司 | Method and device for comparing product attribute information |
CN106899864A (en) * | 2015-12-18 | 2017-06-27 | 北京国双科技有限公司 | Commercial detection method and device |
CN106898017A (en) * | 2017-02-27 | 2017-06-27 | 网易(杭州)网络有限公司 | Method, device and terminal device for recognizing image local area |
CN109348731A (en) * | 2016-10-14 | 2019-02-15 | 深圳配天智能技术研究院有限公司 | A kind of method and device of images match |
CN110603535A (en) * | 2019-07-29 | 2019-12-20 | 香港应用科技研究院有限公司 | Iterative multi-directional image search supporting large template matching |
CN112215304A (en) * | 2020-11-05 | 2021-01-12 | 珠海大横琴科技发展有限公司 | Gray level image matching method and device for geographic image splicing |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2006132046A1 (en) * | 2005-06-07 | 2006-12-14 | National Institute Of Advanced Industrial Science And Technology | Three-dimensional shape aligning method and program |
CN101859384A (en) * | 2010-06-12 | 2010-10-13 | 北京航空航天大学 | Target image sequence measurement method |
CN101950419A (en) * | 2010-08-26 | 2011-01-19 | 西安理工大学 | Quick image rectification method in presence of translation and rotation at same time |
CN103020945A (en) * | 2011-09-21 | 2013-04-03 | 中国科学院电子学研究所 | Remote sensing image registration method of multi-source sensor |
Non-Patent Citations (2)
Title |
---|
卢蓉 (Lu Rong) et al.: "A Fast Algorithm for Extracting the Minimum Enclosing Rectangle of a Target Image", Computer Engineering (《计算机工程》) *
张伟 (Zhang Wei): "Research on Automatic Recognition Algorithms and Systems for Ground Target Images", China Master's Theses Full-text Database, Information Science and Technology (《中国优秀硕士学位论文全文数据库 信息科技辑》) *
Also Published As
Publication number | Publication date |
---|---|
CN103593838B (en) | 2016-04-13 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN103593838B (en) | A kind of cross-correlation gray level image matching method and device fast | |
Novatnack et al. | Scale-dependent/invariant local 3D shape descriptors for fully automatic registration of multiple sets of range images | |
CN105139416A (en) | Object identification method based on image information and depth information | |
Lysenkov et al. | Pose estimation of rigid transparent objects in transparent clutter | |
Zhou et al. | Extrinsic calibration of a camera and a lidar based on decoupling the rotation from the translation | |
CN103824080B (en) | Robot SLAM object state detection method in dynamic sparse environment | |
CN102661708B (en) | High-density packaged element positioning method based on speeded up robust features (SURFs) | |
Wu et al. | A novel high precise laser 3D profile scanning method with flexible calibration | |
CN104766309A (en) | Plane feature point navigation and positioning method and device | |
Xie et al. | A4lidartag: Depth-based fiducial marker for extrinsic calibration of solid-state lidar and camera | |
Dao et al. | A robust recognition technique for dense checkerboard patterns | |
Sihombing et al. | Perspective rectification in vehicle number plate recognition using 2D-2D transformation of Planar Homography | |
Gajdošech et al. | Towards Deep Learning-based 6D Bin Pose Estimation in 3D Scans | |
CN102663756A (en) | Registration method of special shaped elements and high-density packing components in printed circuit board | |
Wang et al. | Visual positioning of rectangular lead components based on Harris corners and Zernike moments | |
CN104700400A (en) | High-precision image matching method based on feature point training | |
Fujita et al. | Floor fingerprint verification using a gravity-aware smartphone | |
Xie et al. | Real-time Reconstruction of unstructured scenes based on binocular vision depth | |
Du et al. | Optimization of stereo vision depth estimation using edge-based disparity map | |
Xu | A unified approach to autofocus and alignment for pattern localization using hybrid weighted Hausdorff distance | |
Jia et al. | Improved normal iterative closest point algorithm with multi-information | |
Wang et al. | An image registration algorithm based on features point and cluster | |
Duan et al. | The icosahedron marker for Robots 6-dof pose estimation | |
CN104637055B (en) | A kind of high precision image matching process based on small scale features point | |
Chen et al. | Monocular obstacle detection using reciprocal-polar rectification |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
C14 | Grant of patent or utility model | ||
GR01 | Patent grant | ||
C41 | Transfer of patent application or patent right or utility model | ||
TR01 | Transfer of patent right |
Effective date of registration: 20161125 Address after: 430074 Hubei Province, Wuhan city Hongshan District Luoyu Road No. 1037 Patentee after: Huazhong University of Science and Technology Patentee after: GUANGDONG HUST INDUSTRIAL TECHNOLOGY RESEARCH INSTITUTE Address before: 430074 Hubei Province, Wuhan city Hongshan District Luoyu Road No. 1037 Patentee before: Huazhong University of Science and Technology |