Fast normalized cross-correlation gray-scale image matching method and device
Technical field
The invention belongs to the technical field of image processing and, more specifically, relates to a fast normalized cross-correlation gray-scale image matching method and device.
Background art
In the integrated circuit (IC) manufacturing industry, high-speed, high-precision chip pick-up and placement is the key factor affecting production efficiency, and pick-up and placement in turn depend on high-speed, high-precision chip positioning technology. Machine vision technology uses a camera to photograph the object under inspection and then performs calculation and analysis with corresponding image processing algorithms to complete target detection and positioning. As a non-contact measurement means, machine vision has found very wide application in IC manufacturing. Within machine vision, image matching technology is the key to achieving high-speed, high-precision positioning.
The traditional normalized cross-correlation gray-scale matching method searches the entire image for a match, wasting a large amount of matching time. Moreover, because of mechanical errors in the IC manufacturing equipment itself, the pattern in the target image is often rotated relative to the template pattern, and this rotation makes the image matching algorithm considerably more complicated.
Summary of the invention
In view of the above defects or improvement requirements of the prior art, the invention provides a fast, full-angle normalized cross-correlation gray-scale matching method. Its purpose is to increase gray-scale matching speed and thereby solve the technical problem that excessively long image matching times in IC manufacturing equipment reduce production efficiency.
To achieve the above object, according to one aspect of the present invention, a fast normalized cross-correlation gray-scale image matching method is provided for searching a gray-scale target image for the position and direction of a target pattern, comprising:
(1) performing polar-coordinate conversion on the template image and converting the converted template image data into a one-dimensional array, defined as the template-image one-dimensional array;
(2) obtaining the bounding box of the target pattern in the gray-scale target image, defined as the target-pattern search region, wherein the bounding box is the minimum rectangle enclosing the target pattern;
(3) choosing, in the target-pattern search region, a matching region of the same size as the template image, performing polar-coordinate conversion on the matching region, converting the converted matching-region image data into a one-dimensional array, defined as the matching-region one-dimensional array, and calculating the correlation between the matching-region one-dimensional array and the template-image one-dimensional array;
(4) repeating step (3) until the correlation calculation has been completed for all matching regions, and choosing the matching region with the maximum correlation as the target pattern, thereby obtaining the position and direction of the target pattern.
With the method proposed by the invention, the collected image is preprocessed to determine the position of the pattern's bounding box in the target image, which narrows the matching search range and increases matching speed. In addition, the template image and the search-region image are converted into one-dimensional arrays and the normalized cross-correlation is then calculated in polar form, achieving full-angle (±180 degree) matching of the target image.
Preferably, step (1) specifically comprises:
Let the width of the template image be W_M and its height H_M, and let the gray value of a template pixel be Mode[Y][X], where 0 ≤ Y < H_M and 0 ≤ X < W_M. The center coordinates of the template image are Y_CM = (H_M − 1)/2 and X_CM = (W_M − 1)/2. Taking the template center as the circle center and the minimum of W_M/2 and H_M/2 as the radius R_M, polar-coordinate conversion is performed, and the converted image is represented as a one-dimensional array of pixel gray values: ModePolar[P] = Mode[Y][X], where Y = FLOOR(Y_CM − ρ·sinθ_M), X = FLOOR(X_CM + ρ·cosθ_M), the symbol FLOOR denotes the largest integer not greater than the bracketed value, ρ = 1, 2, …, R_M, θ_M = 0, Δθ, 2Δθ, …, 2π, Δθ is the angular matching precision, P = 0, 1, 2, …, N, N = CEIL((2π/Δθ + 1)·R_M − 1), and CEIL denotes the smallest integer not less than the bracketed value. The array ModePolar[P] is then copied once, and the two arrays are joined end to end to form a new one-dimensional array ModePolarD[Q], where Q = 0, 1, 2, …, 2N, and the normalized cross-correlation parameters are calculated.
By converting the two-dimensional image data into one-dimensional data through the polar-coordinate transform, the complex matching problem with a rotation angle is converted into the simple problem of calculating cross-correlation values between one-dimensional arrays; at the same time, part of the parameters of the normalized cross-correlation are calculated off-line, which increases the speed of on-line matching.
Preferably, the assignment procedure for ModePolar[P] is:
(A11) initialize parameters: ρ = 1, θ_M = 0, P = 0, Y_CM = (H_M − 1)/2, X_CM = (W_M − 1)/2;
(A12) calculate: Y = FLOOR(Y_CM − ρ·sinθ_M), X = FLOOR(X_CM + ρ·cosθ_M);
(A13) convert the template image data to polar coordinates: ModePolar[P] = Mode[Y][X];
(A14) ρ = ρ + 1, P = P + 1;
(A15) judge whether ρ is greater than R_M; if so, go to step (A16); if not, return to step (A12);
(A16) ρ = 0, θ_M = θ_M + Δθ;
(A17) judge whether θ_M is greater than 2π; if so, end; if not, return to step (A12).
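The assignment procedure above can be written directly as code. The following is a minimal sketch, not taken from the patent: the function name is illustrative, the stated range ρ = 1, …, R_M is used, and the FLOOR sampling described above is kept (no interpolation).

```python
import math

def template_to_polar(mode, delta_theta):
    """Convert a 2D template image (list of rows of gray values) into a
    1D polar-coordinate array, following the sampling of steps (A11)-(A17)."""
    h_m = len(mode)
    w_m = len(mode[0])
    # Template center (Y_CM, X_CM) and sampling radius R_M.
    y_cm = (h_m - 1) / 2.0
    x_cm = (w_m - 1) / 2.0
    r_m = int(min(w_m / 2.0, h_m / 2.0))
    n_angles = int(2 * math.pi / delta_theta) + 1   # theta = 0, dtheta, ..., 2*pi
    mode_polar = []
    for k in range(n_angles):
        theta = k * delta_theta
        for rho in range(1, r_m + 1):
            # Polar-to-Cartesian sampling with FLOOR, as in step (A12).
            y = math.floor(y_cm - rho * math.sin(theta))
            x = math.floor(x_cm + rho * math.cos(theta))
            mode_polar.append(mode[y][x])
    return mode_polar
```

For a W_M × H_M template the resulting array holds one gray value per (angle, radius) pair, which is what makes the later rotation-as-shift trick possible.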
Preferably, step (2) is specifically:
A suitable gray threshold is set to binarize the gray-scale target image, so that the binarized image becomes black and white. Connected-domain analysis is applied to the binarized image using pixel labeling or run-length connectivity analysis, and the bounding box of the target pattern is calculated, where the bounding box is the minimum rectangle enclosing the target pattern.
Let the pixel coordinates of the upper-left corner of the calculated bounding box, relative to the origin of the target image, be (X_B, Y_B), and let the bounding box have width W_B and height H_B. Then the upper-left corner of the target-pattern search region in the original target image, relative to the target-image origin, is X_S0 = X_B − Offset, Y_S0 = Y_B − Offset; the width of the search region is W_S = W_B + 2·Offset and its height is H_S = H_B + 2·Offset, where Offset is an integer with a value range of 5 to 10 pixels.
By calculating the bounding box, the target pattern is quickly coarse-located, which narrows the search range of the subsequent cross-correlation matching and increases image matching speed.
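The preprocessing of step (2) can be sketched as follows. This is a simplified illustration rather than the patent's procedure verbatim: instead of full connected-component labeling it assumes a single dark pattern on a bright background, so every below-threshold pixel is treated as pattern foreground; the function name and `threshold` parameter are illustrative.

```python
def pattern_bounding_box(image, threshold, offset=5):
    """Binarize the gray-scale target image and return the expanded
    search region (X_S0, Y_S0, W_S, H_S) around the target pattern.
    Simplification: every below-threshold (dark) pixel counts as
    pattern foreground, standing in for connected-domain analysis."""
    h = len(image)
    w = len(image[0])
    xs, ys = [], []
    for y in range(h):
        for x in range(w):
            if image[y][x] < threshold:    # dark pixel -> pattern
                xs.append(x)
                ys.append(y)
    if not xs:
        return None                        # no pattern found
    x_b, y_b = min(xs), min(ys)            # upper-left corner of the box
    w_b = max(xs) - x_b + 1                # box width W_B
    h_b = max(ys) - y_b + 1                # box height H_B
    # Expand by Offset on every side, as in step (2).
    return (x_b - offset, y_b - offset, w_b + 2 * offset, h_b + 2 * offset)
```

A production version would use real connected-component labeling so that isolated noise pixels cannot inflate the box.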
Preferably, step (3) specifically comprises:
The image of the target-pattern search region is cut out of the original target image and called the search image; its width is W_S and its height H_S; its X coordinate is denoted X_S and its Y coordinate Y_S; MaxCcorr denotes the maximum cross-correlation value between the search image and the template image. Normalized cross-correlation gray-scale matching is performed in the search image; the matching flow is as follows:
B1) initialize parameters: X_S = 0, Y_S = 0, MaxCcorr = 0;
B2) judge whether Y_S is less than H_S; if not, end; if so, go to B3);
B3) judge whether X_S is less than W_S; if not, set X_S = 0 and Y_S = Y_S + 1 and return to step B2); if so, go to step B4);
B4) perform polar-coordinate conversion on the image region of the search image that is the same size as the template; the image of this region is called the matching-region image. The matching-region image has the same width W_M and height H_M as the template image; the gray value of a point in the matching region is Image[Y][X], where Y_S ≤ Y < Y_S + H_M and X_S ≤ X < X_S + W_M; the center coordinates of the matching region are Y_CI = Y_S + (H_M − 1)/2 and X_CI = X_S + (W_M − 1)/2. Taking the matching-region center as the circle center and the minimum of W_M/2 and H_M/2 as the radius R_M, polar-coordinate conversion is performed, and the converted image is represented as the one-dimensional array of gray values ImagePolar[P] = Image[Y][X], where Y = FLOOR(Y_CI − ρ·sinθ_I), X = FLOOR(X_CI + ρ·cosθ_I), FLOOR denotes the largest integer not greater than the bracketed value, ρ = 1, 2, …, R_M, θ_I = 0, Δθ, 2Δθ, …, 2π, Δθ is the angular matching precision, P = 0, 1, 2, …, N, N = CEIL((2π/Δθ + 1)·R_M − 1), and CEIL denotes the smallest integer not less than the bracketed value;
B5) calculate the normalized cross-correlation parameters;
B6) initialize the integer i = 0;
B7) judge whether i is greater than N; if so, set X_S = X_S + 1 and return to B3); otherwise go to B8);
B8) calculate the normalized cross-correlation value Ccorr_i of the template image and the matching region;
B9) judge whether Ccorr_i is greater than MaxCcorr; if not, go to B10); if so, assign Ccorr_i to MaxCcorr, record the coordinates (X_T, Y_T) of the matching-region center point at this time, where X_T = X_S + (W_M − 1)/2 and Y_T = Y_S + (H_M − 1)/2, and record the angle θ_T = Δθ·FLOOR((P + R_M)/R_M);
B10) i = i + 1; return to step B7).
By converting the template image and the target image into one-dimensional arrays through polar coordinates, the rotational matching of template and target is converted into the problem of calculating cross-correlation values between two one-dimensional arrays under relative translation. This avoids rotating one of the images by a fixed angle step and matching once per rotation, thereby shortening the matching time and increasing matching speed.
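The rotation-to-shift idea can be illustrated with a short sketch. Since the patent's exact cross-correlation formula is not reproduced in the text, the standard zero-mean normalized cross-correlation is assumed here, and the function name is illustrative; the doubled template array plays the role of ModePolarD.

```python
import math

def best_rotation_ncc(mode_polar, image_polar):
    """Find the cyclic shift of the 1D template array that maximizes the
    normalized cross-correlation with the 1D matching-region array.
    Assumes the standard zero-mean NCC; the patent's own formula is not
    shown in the text and may differ in detail."""
    n = len(mode_polar)
    doubled = mode_polar + mode_polar          # ModePolarD: two copies end to end
    mean_i = sum(image_polar) / n
    di = [v - mean_i for v in image_polar]
    norm_i = math.sqrt(sum(v * v for v in di))
    best_shift, best_ccorr = 0, -1.0
    for shift in range(n):
        seg = doubled[shift:shift + n]         # cyclically shifted template
        mean_m = sum(seg) / n
        dm = [v - mean_m for v in seg]
        norm_m = math.sqrt(sum(v * v for v in dm))
        if norm_i == 0 or norm_m == 0:
            continue                           # flat signal: NCC undefined
        ccorr = sum(a * b for a, b in zip(dm, di)) / (norm_m * norm_i)
        if ccorr > best_ccorr:
            best_shift, best_ccorr = shift, ccorr
    return best_shift, best_ccorr
```

In the array layout described above, a rotation of the pattern by k·Δθ corresponds to a shift of k·R_M elements in the polar array, which is why a one-dimensional shift search can recover the rotation angle.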
Preferably, the assignment procedure for ImagePolar[P] is:
(B41) initialize parameters: ρ = 1, θ_I = 0, P = 0, Y_CI = Y_S + (H_M − 1)/2, X_CI = X_S + (W_M − 1)/2;
(B42) calculate: Y = FLOOR(Y_CI − ρ·sinθ_I), X = FLOOR(X_CI + ρ·cosθ_I);
(B43) convert the matching-region data to polar coordinates: ImagePolar[P] = Image[Y][X];
(B44) ρ = ρ + 1, P = P + 1;
(B45) judge whether ρ is greater than R_M; if so, go to step (B46); if not, return to step (B42);
(B46) ρ = 0, θ_I = θ_I + Δθ;
(B47) judge whether θ_I is greater than 2π; if so, end; if not, return to step (B42).
Preferably, step (4) is specifically:
If in step B2) Y_S is no longer less than H_S, the calculation ends. Once the correlation calculation has been completed for all matching regions, the position (X_F, Y_F) and angle θ_F of the template image in the target image are obtained, where X_F = X_T + X_S0, Y_F = Y_T + Y_S0, θ_F = θ_T, and X_T, Y_T and θ_T are the values last saved before steps B1)–B10) finished.
According to another aspect of the present invention, a fast normalized cross-correlation image matching device is provided for searching a gray-scale target image for the position and direction of a target pattern, the device comprising:
a first module for performing polar-coordinate conversion on the template image and converting the converted template image data into a one-dimensional array, defined as the template-image one-dimensional array;
a second module for obtaining the bounding box of the target pattern in the gray-scale target image, defined as the target-pattern search region;
a third module for choosing, in the target-pattern search region, a matching region of the same size as the template image, performing polar-coordinate conversion on the matching region, converting the converted matching-region image data into a one-dimensional array, defined as the matching-region one-dimensional array, and calculating the correlation between the matching-region one-dimensional array and the template-image one-dimensional array;
a fourth module for choosing, when the third module has completed the correlation calculation for all matching regions, the matching region with the maximum correlation as the target pattern, thereby obtaining the position and direction of the target pattern.
The beneficial effects of the method provided by the present invention are:
1. Exploiting the fact that in IC manufacturing the chip pattern and the chip background in the collected image differ considerably in gray level, the collected image is preprocessed by binarization and connected-domain analysis to determine the position of the pattern's bounding box in the target image, which narrows the matching search range and increases matching speed.
2. The template image and the search-region image are converted into one-dimensional arrays and the normalized cross-correlation is then calculated in polar form, achieving full-angle (±180 degree) matching of the target image.
Brief description of the drawings
Fig. 1 is the overall flow chart of the fast normalized cross-correlation gray-scale image matching method of the present invention;
Fig. 2 is the flow chart of the polar-coordinate conversion of the template image in an embodiment of the present invention;
Fig. 3 is the flow chart of searching the target image for the template image in an embodiment of the present invention;
Fig. 4 is the flow chart of the polar-coordinate conversion of the matching-region image in an embodiment of the present invention.
Embodiments
In order to make the object, technical scheme and advantages of the present invention clearer, the present invention is further elaborated below in conjunction with the drawings and embodiments. It should be appreciated that the specific embodiments described here serve only to explain the present invention and are not intended to limit it. In addition, the technical features involved in the embodiments described below can be combined with each other as long as they do not conflict.
As shown in Figure 1, the present invention provides a fast normalized cross-correlation gray-scale image matching method for searching a gray-scale target image for the position and direction of a target pattern, the method comprising:
(1) performing polar-coordinate conversion on the template image and converting the converted template image data into a one-dimensional array, defined as the template-image one-dimensional array;
(2) obtaining the bounding box of the target pattern in the gray-scale target image, defined as the target-pattern search region;
(3) choosing, in the target-pattern search region, a matching region of the same size as the template image, performing polar-coordinate conversion on the matching region, converting the converted matching-region image data into a one-dimensional array, defined as the matching-region one-dimensional array, and calculating the correlation between the matching-region one-dimensional array and the template-image one-dimensional array;
(4) repeating step (3) until the correlation calculation has been completed for all matching regions, and choosing the matching region with the maximum correlation as the target pattern, thereby obtaining the position and direction of the target pattern.
With the method proposed by the invention, the collected image is preprocessed to determine the position of the pattern's bounding box in the target image, which narrows the matching search range and increases matching speed. In addition, the template image and the search-region image are converted into one-dimensional arrays and the normalized cross-correlation is then calculated in polar form, achieving full-angle (±180 degree) matching of the target image.
Concretely, the matching process for the target pattern in a gray-scale image is elaborated below. A gray-scale image can be expressed as a two-dimensional array in row and column directions, in which the value of each array element represents the pixel gray value at that point. The upper-left corner of the image is taken as the origin of the image coordinates; the vertically downward direction is the positive Y direction, which is also the direction in which the row index increases and represents the height direction of the image; the horizontal rightward direction is the positive X direction, which is also the direction in which the column index increases and represents the width direction of the image.
The method is divided into an off-line stage and an on-line stage.
In the off-line stage, the template image is first preprocessed; the concrete steps are as follows:
A1) Perform polar-coordinate conversion on the template image and save the converted data as a one-dimensional array, with the following concrete steps:
Suppose the width of the template image is W_M and its height H_M, and the gray value of a template pixel is Mode[Y][X], where 0 ≤ Y < H_M and 0 ≤ X < W_M. The center coordinates of the template image are Y_CM = (H_M − 1)/2 and X_CM = (W_M − 1)/2. Taking the template center as the circle center and the minimum of W_M/2 and H_M/2 as the radius R_M, polar-coordinate conversion is performed, and the converted image is represented as a one-dimensional array of pixel gray values:
ModePolar[P] = Mode[Y][X]
Y = FLOOR(Y_CM − ρ·sinθ_M)
X = FLOOR(X_CM + ρ·cosθ_M)
where the symbol FLOOR denotes the largest integer not greater than the bracketed value, ρ = 1, 2, …, R_M, θ_M = 0, Δθ, 2Δθ, …, 2π, and Δθ is the angular matching precision; for example, if the required angular matching precision is 1 degree, then Δθ = π/180. P = 0, 1, 2, …, N, N = CEIL((2π/Δθ + 1)·R_M − 1), and CEIL denotes the smallest integer not less than the bracketed value. As shown in Figure 2, the assignment procedure for ModePolar[P] is:
(A11) initialize parameters: ρ = 1, θ_M = 0, P = 0, Y_CM = (H_M − 1)/2, X_CM = (W_M − 1)/2;
(A12) calculate: Y = FLOOR(Y_CM − ρ·sinθ_M), X = FLOOR(X_CM + ρ·cosθ_M);
(A13) convert the template image data to polar coordinates: ModePolar[P] = Mode[Y][X];
(A14) ρ = ρ + 1, P = P + 1;
(A15) judge whether ρ is greater than R_M; if so, go to step (A16); if not, return to step (A12);
(A16) ρ = 0, θ_M = θ_M + Δθ;
(A17) judge whether θ_M is greater than 2π; if so, end A1) and go to A2); if not, return to step (A12).
A2) Calculate the partial parameters in the normalized cross-correlation formula.
A3) Copy the ModePolar[P] array once and join the two arrays end to end to form a new one-dimensional array ModePolarD[Q], where Q = 0, 1, 2, …, 2N.
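Steps A2) and A3) pay off because every cyclic shift of ModePolar contains the same values, so the template mean and centered norm appearing in the cross-correlation denominator are shift-invariant and can be computed once off-line. A sketch under that assumption (function name illustrative; the standard zero-mean normalized cross-correlation is assumed):

```python
import math

def precompute_template(mode_polar):
    """Off-line steps A2)-A3): duplicate the polar template array and
    precompute the NCC terms that do not depend on the matching region.
    Every cyclic shift of ModePolar is a permutation of the same values,
    so its mean and centered norm are shift-invariant."""
    n = len(mode_polar)
    mode_polar_d = mode_polar + mode_polar     # ModePolarD: two copies end to end
    mean_m = sum(mode_polar) / n               # template mean, reused for all shifts
    norm_m = math.sqrt(sum((v - mean_m) ** 2 for v in mode_polar))
    return mode_polar_d, mean_m, norm_m
```

During on-line matching, only the matching-region statistics and the numerator then need to be computed per shift.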
In the on-line stage, the template pattern is searched for in the target image; the concrete steps are as follows:
B1) Copy the original target image and binarize the copy by setting a suitable gray threshold, so that the binarized image becomes black and white. Because the chip pattern and the chip background in images collected in IC manufacturing differ considerably in gray level, after binarization most of the chip becomes one color and the background the opposite color; for example, most of the chip becomes black while the background is white. Connected-domain analysis is applied to the binarized target image using methods such as pixel labeling or run-length connectivity analysis, and the bounding box of the chip image region is calculated, where the bounding box is the minimum rectangle enclosing the target region (here, the chip pattern), generally oriented along the coordinate axes. Suppose the pixel coordinates of the upper-left corner of the calculated bounding box, relative to the origin of the target image, are (X_B, Y_B), and the bounding box has width W_B and height H_B. Then the upper-left corner of the search region in the original target image, relative to the target-image origin, is X_S0 = X_B − Offset, Y_S0 = Y_B − Offset; the width of the search region is W_S = W_B + 2·Offset and its height is H_S = H_B + 2·Offset, where Offset is an integer with a value range of 5 to 10 pixels.
The image of the search region is cut out of the original target image and called the search image; its width is W_S and its height H_S; its X coordinate is denoted X_S and its Y coordinate Y_S; MaxCcorr denotes the maximum cross-correlation value between the search image and the template image. Normalized cross-correlation gray-scale matching is performed in the search image; as shown in Figure 3, the matching flow is as follows:
B2) initialize parameters: X_S = 0, Y_S = 0, MaxCcorr = 0;
B3) judge whether Y_S is less than H_S; if not, end; if so, go to B4);
B4) judge whether X_S is less than W_S; if not, set X_S = 0 and Y_S = Y_S + 1 and return to step B3); if so, go to step B5);
B5) perform polar-coordinate conversion on the image region of the search image that is the same size as the template; the image of this region is called the matching-region image. The matching-region image has the same width W_M and height H_M as the template image; the gray value of a point in the matching region is Image[Y][X], where Y_S ≤ Y < Y_S + H_M and X_S ≤ X < X_S + W_M; the center coordinates of the matching region are Y_CI = Y_S + (H_M − 1)/2 and X_CI = X_S + (W_M − 1)/2. Taking the matching-region center as the circle center and the minimum of W_M/2 and H_M/2 as the radius R_M, polar-coordinate conversion is performed, and the converted image is represented as the one-dimensional array of gray values ImagePolar[P] = Image[Y][X], where Y = FLOOR(Y_CI − ρ·sinθ_I), X = FLOOR(X_CI + ρ·cosθ_I), FLOOR denotes the largest integer not greater than the bracketed value, ρ = 1, 2, …, R_M, θ_I = 0, Δθ, 2Δθ, …, 2π, and Δθ is the angular matching precision, identical to that of the template image; for example, if the required angular matching precision is 1 degree, then Δθ = π/180. P = 0, 1, 2, …, N, N = CEIL((2π/Δθ + 1)·R_M − 1), and CEIL denotes the smallest integer not less than the bracketed value. As shown in Figure 4, the assignment procedure for ImagePolar[P] is:
(B51) initialize parameters: ρ = 1, θ_I = 0, P = 0, Y_CI = Y_S + (H_M − 1)/2, X_CI = X_S + (W_M − 1)/2;
(B52) calculate: Y = FLOOR(Y_CI − ρ·sinθ_I), X = FLOOR(X_CI + ρ·cosθ_I);
(B53) convert the matching-region data to polar coordinates: ImagePolar[P] = Image[Y][X];
(B54) ρ = ρ + 1, P = P + 1;
(B55) judge whether ρ is greater than R_M; if so, go to step (B56); if not, return to step (B52);
(B56) ρ = 0, θ_I = θ_I + Δθ;
(B57) judge whether θ_I is greater than 2π; if so, end B5) and go to B6); if not, return to step (B52).
B6) calculate the normalized cross-correlation parameters;
B7) initialize the integer i = 0;
B8) judge whether i is greater than N; if so, set X_S = X_S + 1 and return to B4); otherwise go to B9);
B9) calculate the normalized cross-correlation value Ccorr_i of the template image and the matching region;
B10) judge whether Ccorr_i is greater than MaxCcorr; if not, go to B11); if so, assign Ccorr_i to MaxCcorr, record the coordinates (X_T, Y_T) of the matching-region center point at this time, where X_T = X_S + (W_M − 1)/2 and Y_T = Y_S + (H_M − 1)/2, and record the angle θ_T = Δθ·FLOOR((P + R_M)/R_M);
B11) i = i + 1; return to step B8).
After steps B1)–B11), if in step B3) Y_S is no longer less than H_S, the calculation ends. The position (X_F, Y_F) and angle θ_F of the template image in the target image are then obtained, where X_F = X_T + X_S0, Y_F = Y_T + Y_S0, θ_F = θ_T, and X_T, Y_T and θ_T are the values last saved before steps B1)–B11) finished.
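The on-line stage B2)–B11) can be sketched end to end as follows. This is a brute-force illustration under several stated assumptions, not the patent's implementation: the standard zero-mean normalized cross-correlation is assumed, template dimensions are assumed odd so the center falls on a pixel, only shifts that are multiples of R_M (pure angle steps) are evaluated, and a sampling-offset table is precomputed once and shared by template and window as an implementation convenience; all names are illustrative.

```python
import math

def match_pattern(template, search, x_s0, y_s0, delta_theta):
    """Slide a template-sized window over the search image, convert each
    window to a 1D polar array, and keep the window and cyclic shift with
    the highest normalized cross-correlation.  Returns (X_F, Y_F, theta_F)
    in the full target image, given the search-region origin (x_s0, y_s0)."""
    h_m, w_m = len(template), len(template[0])
    h_s, w_s = len(search), len(search[0])
    r_m = int(min(w_m / 2.0, h_m / 2.0))
    n_angles = int(2 * math.pi / delta_theta) + 1   # theta = 0, ..., 2*pi

    # Precompute (dy, dx) sampling offsets once, so template and every
    # window are sampled at identical relative positions.
    offsets = []
    for k in range(n_angles):
        theta = k * delta_theta
        for rho in range(1, r_m + 1):
            offsets.append((math.floor(-rho * math.sin(theta)),
                            math.floor(rho * math.cos(theta))))

    def to_polar(img, yc, xc):
        return [img[yc + dy][xc + dx] for dy, dx in offsets]

    tpl = to_polar(template, (h_m - 1) // 2, (w_m - 1) // 2)
    n = len(tpl)
    tpl_d = tpl + tpl                        # doubled array: rotation = shift
    mean_t = sum(tpl) / n                    # shift-invariant template stats
    norm_t = math.sqrt(sum((v - mean_t) ** 2 for v in tpl))

    best = (0, 0, 0.0, -1.0)                 # (X_T, Y_T, theta_T, MaxCcorr)
    for y_s in range(h_s - h_m + 1):
        for x_s in range(w_s - w_m + 1):
            win = to_polar(search, y_s + (h_m - 1) // 2, x_s + (w_m - 1) // 2)
            mean_w = sum(win) / n
            dw = [v - mean_w for v in win]
            norm_w = math.sqrt(sum(v * v for v in dw))
            if norm_w == 0 or norm_t == 0:
                continue                     # flat region: NCC undefined
            for shift in range(0, n, r_m):   # one shift per angle step
                seg = tpl_d[shift:shift + n]
                num = sum((a - mean_t) * b for a, b in zip(seg, dw))
                ccorr = num / (norm_t * norm_w)
                if ccorr > best[3]:
                    best = (x_s + (w_m - 1) // 2, y_s + (h_m - 1) // 2,
                            delta_theta * (shift // r_m), ccorr)
    x_t, y_t, theta_t, _ = best
    return x_t + x_s0, y_t + y_s0, theta_t
```

A rotation of the pattern by k·Δθ appears as a shift of k·R_M elements in the polar array, so one shift per angle step is enough to recover the angle; the final addition of (x_s0, y_s0) mirrors X_F = X_T + X_S0, Y_F = Y_T + Y_S0.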
Those skilled in the art will readily understand that the foregoing is only a preferred embodiment of the present invention and is not intended to limit it; any modifications, equivalent replacements and improvements made within the spirit and principles of the present invention shall all be included within the protection scope of the present invention.