CN103593838B - Fast cross-correlation grayscale image matching method and device - Google Patents

Fast cross-correlation grayscale image matching method and device

Publication number: CN103593838B (granted patent; published earlier as application CN103593838A)
Application number: CN201310331842.0A
Authority: CN (China)
Original language: Chinese (zh)
Legal status: Active
Prior art keywords: image, target pattern, matching area, matching
Inventors: 杨华, 尹周平, 王瑜辉, 张步阳, 魏飞龙
Original assignee/applicant: Huazhong University of Science and Technology
Current assignees: Huazhong University of Science and Technology; Guangdong Hust Industrial Technology Research Institute
Abstract

The invention discloses a fast cross-correlation grayscale image matching method and device for finding the position and orientation of a target pattern in a grayscale target image. The method comprises: performing a polar-coordinate transform on the template image; obtaining the bounding box of the target pattern in the grayscale target image and defining it as the target-pattern search region; selecting, within the search region, a matching area of the same size as the template image, performing the polar-coordinate transform on that matching area, and computing the correlation between the matching-area one-dimensional array and the template-image one-dimensional array; and repeating until correlation has been computed for all matching areas, then selecting the matching area with the largest correlation as the target pattern, yielding its position and orientation. The method narrows the matching search range and thus improves matching speed, and, by computing normalized cross-correlation in polar form, achieves full-angle (±180 degrees) matching of the target image.

Description

Fast cross-correlation grayscale image matching method and device
Technical field
The invention belongs to the technical field of image processing and, more specifically, relates to a fast cross-correlation grayscale image matching method and device.
Background art
In integrated-circuit (IC) manufacturing, picking up and placing chips at high speed and high precision is key to production efficiency, and both depend on high-speed, high-precision chip positioning. Machine vision photographs the inspected object with a camera and then applies image-processing algorithms to detect and locate the target; as a non-contact measurement technique, it is widely applied in IC manufacturing. Within machine vision, image matching is the key to fast, accurate positioning.
Traditional normalized cross-correlation grayscale matching searches the entire image and therefore wastes a large amount of matching time. Moreover, because of mechanical error in the IC manufacturing equipment itself, the pattern in the target image is often rotated relative to the template pattern, and this rotation makes the image matching algorithm considerably more complex.
Summary of the invention
In view of the above defects or improvement requirements of the prior art, the invention provides a grayscale matching method capable of fast, full-angle normalized cross-correlation. Its purpose is to increase grayscale matching speed and thereby solve the technical problem that overly long image matching time in IC manufacturing equipment hurts production efficiency.
To achieve the above object, according to one aspect of the invention, a fast cross-correlation grayscale image matching method is provided for finding the position and orientation of a target pattern in a grayscale target image, comprising:
(1) performing a polar-coordinate transform on the template image and converting the transformed template image data into a one-dimensional array, defined as the template-image one-dimensional array;
(2) obtaining the bounding box of the target pattern in the grayscale target image and defining it as the target-pattern search region, where the bounding box is the smallest rectangle containing the target pattern; specifically:
setting a suitable gray threshold and binarizing the grayscale target image so that it becomes black-and-white; applying connected-component analysis (pixel labeling or run-length connectivity) to the binarized image; and computing the bounding box of the target pattern, i.e. the smallest rectangle containing it;
letting the pixel coordinate of the upper-left corner of the computed bounding box, relative to the target-image coordinate origin, be (X_B, Y_B), with bounding-box width W_B and height H_B, so that the upper-left corner of the target-pattern search region in the original target image is X_S0 = X_B - Offset, Y_S0 = Y_B - Offset, and the search region has width W_S = W_B + 2×Offset and height H_S = H_B + 2×Offset, where Offset is an integer in the range of 5 to 10 pixels;
(3) selecting, within the target-pattern search region, a matching area of the same size as the template image, performing the polar-coordinate transform on it, converting the transformed matching-area image data into a one-dimensional array (the matching-area one-dimensional array), and computing the correlation between the matching-area one-dimensional array and the template-image one-dimensional array;
(4) repeating step (3) until correlation has been computed for all matching areas, then selecting the matching area with the largest correlation as the target pattern and obtaining its position and orientation.
With the proposed method, the captured image is preprocessed to determine the bounding-box position of the pattern in the target image, which narrows the matching search range and improves matching speed; in addition, the template image and the search-region image are converted into one-dimensional arrays and normalized cross-correlation is computed in polar form, achieving full-angle (±180 degrees) matching of the target image.
Preferably, step (1) specifically comprises:
Let the template image have width W_M and height H_M, with pixel gray value Mode[Y][X], where 0 ≤ Y < H_M and 0 ≤ X < W_M. The center of the template image is then Y_CM = (H_M-1)/2, X_CM = (W_M-1)/2. Taking the template center as the pole and the minimum of W_M/2 and H_M/2 as the radius R_M, perform the polar-coordinate transform; the transformed image is expressed as a one-dimensional array of pixel gray values: ModePolar[P] = Mode[Y][X], where Y = FLOOR(Y_CM - ρ·sinθ_M), X = FLOOR(X_CM + ρ·cosθ_M), FLOOR denotes the largest integer not exceeding its argument, ρ = 1, 2, ..., R_M, θ_M = 0, Δθ, 2Δθ, ..., 2π, Δθ is the angular matching precision, P = 0, 1, 2, ..., N, N = CEIL((2π/Δθ + 1)·R_M - 1), and CEIL denotes the smallest integer not less than its argument. The ModePolar[P] array is then duplicated, the two copies joined end to end to form a new one-dimensional array ModePolarD[Q], where Q = 0, 1, 2, ..., 2N, and the normalized cross-correlation parameters are computed:
S_M = Σ_{P=0}^{N} ModePolar[P]
S(M²) = Σ_{P=0}^{N} (ModePolar[P])²
SM² = (Σ_{P=0}^{N} ModePolar[P])²
Converting the two-dimensional image data into one-dimensional data via the polar-coordinate transform turns the complex rotation-aware matching problem into the simple problem of computing cross-correlation values between one-dimensional arrays; meanwhile, part of the normalized cross-correlation parameters are computed offline, which improves the speed of online matching.
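Since S_M, S(M²) and SM² depend only on the template, they can be computed once in the offline stage; a minimal sketch (the helper name `offline_params` is mine, not the patent's):

```python
import numpy as np

def offline_params(mode_polar):
    """Precompute the template-side sums reused in every online NCC test."""
    m = np.asarray(mode_polar, dtype=float)
    S_M = m.sum()            # S_M: sum of the polar samples
    S_M2 = (m ** 2).sum()    # S(M^2): sum of the squared samples
    SM_2 = S_M ** 2          # SM^2: square of the sum
    return float(S_M), float(S_M2), float(SM_2)

print(offline_params([1.0, 2.0, 3.0]))   # -> (6.0, 14.0, 36.0)
```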
Preferably, the assignment procedure of ModePolar[P] is:
(A11) initialize parameters: ρ = 1, θ_M = 0, P = 0, Y_CM = (H_M-1)/2, X_CM = (W_M-1)/2;
(A12) compute: Y = FLOOR(Y_CM - ρ·sinθ_M), X = FLOOR(X_CM + ρ·cosθ_M);
(A13) polar-transform the template data: ModePolar[P] = Mode[Y][X];
(A14) ρ = ρ + 1, P = P + 1;
(A15) if ρ > R_M, go to step (A16); otherwise return to step (A12);
(A16) ρ = 0, θ_M = θ_M + Δθ;
(A17) if θ_M > 2π, end; otherwise return to step (A12).
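A literal transcription of the (A11)-(A17) flow might look as follows. One reading note: step (A16) sets ρ = 0 before returning to (A12), yet N = CEIL((2π/Δθ + 1)·R_M - 1) allots exactly R_M samples per angle, so I read (A16) as a reset such that each angle again samples ρ = 1 ... R_M; that interpretation, like the function name `mode_polar`, is my own assumption.

```python
import math

def mode_polar(mode, W_M, H_M, dtheta):
    """Flatten a template into its 1-D polar array, following (A11)-(A17):
    for each angle theta_M = 0, dtheta, ..., up to 2*pi, sample radii
    rho = 1..R_M by nearest-lower-pixel (FLOOR) lookup."""
    Y_CM, X_CM = (H_M - 1) / 2, (W_M - 1) / 2           # (A11)
    R_M = int(min(W_M / 2, H_M / 2))
    ModePolar, theta_M, rho = [], 0.0, 1
    while True:
        Y = math.floor(Y_CM - rho * math.sin(theta_M))  # (A12)
        X = math.floor(X_CM + rho * math.cos(theta_M))
        ModePolar.append(mode[Y][X])                    # (A13)
        rho += 1                                        # (A14)
        if rho > R_M:                                   # (A15)
            rho = 1                                     # (A16), read as a reset
            theta_M += dtheta
            if theta_M > 2 * math.pi:                   # (A17)
                break
    return ModePolar

# 5x5 template with value 10*row + col; coarse pi-sized steps keep it tiny.
# (Floating-point residue of sin(pi) makes the theta = pi ray sample row 1.)
mode = [[10 * r + c for c in range(5)] for r in range(5)]
print(mode_polar(mode, 5, 5, math.pi))   # -> [23, 24, 11, 10, 23, 24]
```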
Preferably, step (2) specifically comprises:
setting a suitable gray threshold and binarizing the grayscale target image so that it becomes black-and-white; applying connected-component analysis (pixel labeling or run-length connectivity) to the binarized image; and computing the bounding box of the target pattern, i.e. the smallest rectangle containing it.
Let the pixel coordinate of the upper-left corner of the computed bounding box, relative to the target-image coordinate origin, be (X_B, Y_B), with bounding-box width W_B and height H_B. Then the upper-left corner of the target-pattern search region in the original target image is X_S0 = X_B - Offset, Y_S0 = Y_B - Offset, and the search region has width W_S = W_B + 2×Offset and height H_S = H_B + 2×Offset, where Offset is an integer in the range of 5 to 10 pixels.
Computing the bounding box coarsely localizes the target pattern quickly, which narrows the search range of the subsequent cross-correlation matching and improves image matching speed.
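A compact sketch of this coarse localization, assuming a dark pattern on a light background and taking the bounding box of all foreground pixels (which coincides with the connected-component result when a single pattern is present). The helper name `search_region` and the Offset value of 8 pixels are my own choices within the patent's 5-10 pixel range.

```python
import numpy as np

def search_region(gray, thresh, offset=8):
    """Binarize, take the bounding box (X_B, Y_B, W_B, H_B) of the dark
    foreground, and pad it by `offset` pixels per side (clipped to the
    image) to obtain the search region (X_S0, Y_S0, W_S, H_S)."""
    ys, xs = np.nonzero(gray < thresh)          # foreground: dark pixels
    X_B, Y_B = xs.min(), ys.min()
    W_B, H_B = xs.max() - X_B + 1, ys.max() - Y_B + 1
    X_S0, Y_S0 = max(X_B - offset, 0), max(Y_B - offset, 0)
    W_S = min(W_B + 2 * offset, gray.shape[1] - X_S0)
    H_S = min(H_B + 2 * offset, gray.shape[0] - Y_S0)
    return int(X_S0), int(Y_S0), int(W_S), int(H_S)

# A dark 10x12 chip pattern on a light 64x64 background
img = np.full((64, 64), 200, dtype=np.uint8)
img[20:30, 15:27] = 30
print(search_region(img, thresh=128))           # -> (7, 12, 28, 26)
```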
Preferably, step (3) specifically comprises:
Crop the image of the target-pattern search region from the original target image; this is called the search image, with width W_S and height H_S. The search-image X coordinate is denoted X_S and the Y coordinate Y_S, and MaxCcorr denotes the maximum cross-correlation value between search image and template image. Normalized cross-correlation grayscale matching is performed within the search image as follows:
B1) Initialize: X_S = 0, Y_S = 0, MaxCcorr = 0;
B2) If Y_S is not less than H_S, end; otherwise go to B3);
B3) If X_S is not less than W_S, set X_S = 0, Y_S = Y_S + 1 and return to B2); otherwise go to B4);
B4) Perform the polar-coordinate transform on the image region of the search image that has the same size as the template; the image of this region is called the matching-area image. Its width and height equal those of the template image, W_M and H_M. The gray value of a point in the matching area is Image[Y][X], where Y_S ≤ Y < Y_S + H_M and X_S ≤ X < X_S + W_M, so the center of the matching area is Y_CI = Y_S + (H_M-1)/2, X_CI = X_S + (W_M-1)/2. Taking the matching-area center as the pole and the minimum of W_M/2 and H_M/2 as the radius R_M, perform the polar-coordinate transform; the transformed image is expressed as the one-dimensional array of gray values ImagePolar[P] = Image[Y][X], with Y = FLOOR(Y_CI - ρ·sinθ_I), X = FLOOR(X_CI + ρ·cosθ_I), where FLOOR denotes the largest integer not exceeding its argument, ρ = 1, 2, ..., R_M, θ_I = 0, Δθ, 2Δθ, ..., 2π, Δθ is the angular matching precision, P = 0, 1, 2, ..., N, N = CEIL((2π/Δθ + 1)·R_M - 1), and CEIL denotes the smallest integer not less than its argument;
B5) Compute the normalized cross-correlation parameters:
S_I = Σ_{P=0}^{N} ImagePolar[P]
S(I²) = Σ_{P=0}^{N} (ImagePolar[P])²
SI² = (Σ_{P=0}^{N} ImagePolar[P])²
B6) Set the integer i = 0;
B7) If i is greater than N, set X_S = X_S + 1 and return to B3); otherwise go to B8);
B8) Compute the normalized cross-correlation value Ccorr_i between the template image and the matching area;
B9) If Ccorr_i is not greater than MaxCcorr, go to B10); otherwise assign Ccorr_i to MaxCcorr, record the coordinates of the current matching-area center (X_T, Y_T), where X_T = X_S + (W_M-1)/2, Y_T = Y_S + (H_M-1)/2, and record the angle θ_T = Δθ·FLOOR((P+R_M)/R_M);
B10) i = i + 1; return to B7).
By converting both the template image and the target image into one-dimensional arrays via the polar-coordinate transform, rotational matching between template and target becomes the problem of computing cross-correlation values under relative translation between two one-dimensional arrays. This avoids rotating an image by a fixed angular step and matching once per rotation, shortening matching time and improving matching speed.
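The rotation-as-shift idea can be demonstrated on the one-dimensional arrays alone: rotating the matching area by k angle steps circularly shifts its polar array by k·R_M samples, so scanning windows of the doubled template array ModePolarD recovers the angle. The sketch below fabricates a synthetic polar array rather than sampling a real image; the names and sizes are illustrative, not from the patent.

```python
import numpy as np

def ncc(a, b):
    """Normalized cross-correlation of two equal-length 1-D arrays."""
    n = a.size
    num = n * float(a @ b) - a.sum() * b.sum()
    den = np.sqrt((n * float(a @ a) - a.sum() ** 2) *
                  (n * float(b @ b) - b.sum() ** 2))
    return num / den if den > 0 else 0.0

rng = np.random.default_rng(1)
R_M, n_angles = 8, 36                    # 10-degree angular steps
dtheta = 2 * np.pi / n_angles
tpl = rng.normal(size=R_M * n_angles)    # stand-in for ModePolar
true_k = 5                               # simulated rotation: 5 angle steps
img = np.roll(tpl, -true_k * R_M)        # the matching area's polar array
tpl_d = np.concatenate([tpl, tpl])       # ModePolarD: two copies end to end
# Slide the window over the doubled array; the best shift gives the angle.
best_k = max(range(n_angles),
             key=lambda k: ncc(img, tpl_d[k * R_M:k * R_M + tpl.size]))
print(best_k)                            # -> 5, i.e. angle = 5 * dtheta
```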
Preferably, the assignment procedure of ImagePolar[P] is:
(B41) initialize parameters: ρ = 1, θ_I = 0, P = 0, Y_CI = Y_S + (H_M-1)/2, X_CI = X_S + (W_M-1)/2;
(B42) compute: Y = FLOOR(Y_CI - ρ·sinθ_I), X = FLOOR(X_CI + ρ·cosθ_I);
(B43) polar-transform the matching-area data: ImagePolar[P] = Image[Y][X];
(B44) ρ = ρ + 1, P = P + 1;
(B45) if ρ > R_M, go to step (B46); otherwise return to step (B42);
(B46) ρ = 0, θ_I = θ_I + Δθ;
(B47) if θ_I > 2π, end; otherwise return to step (B42).
Preferably, step (4) is specifically:
If in step B2) Y_S is no longer less than H_S, the computation ends. Once correlation has been computed for all matching areas, the position (X_F, Y_F) and angle θ_F of the template image within the target image are obtained, where X_F = X_T + X_S0, Y_F = Y_T + Y_S0 and θ_F = θ_T, with X_T, Y_T and θ_T being the values last saved before steps B1)-B10) finished.
According to another aspect of the invention, a fast cross-correlation image matching device is provided, characterized in that it is used for finding the position and orientation of a target pattern in a grayscale target image, the device comprising:
a first module for performing the polar-coordinate transform on the template image and converting the transformed template image data into a one-dimensional array, defined as the template-image one-dimensional array;
a second module for obtaining the bounding box of the target pattern in the grayscale target image and defining it as the target-pattern search region; specifically:
setting a suitable gray threshold and binarizing the grayscale target image so that it becomes black-and-white; applying connected-component analysis (pixel labeling or run-length connectivity) to the binarized image; and computing the bounding box of the target pattern, i.e. the smallest rectangle containing it;
letting the pixel coordinate of the upper-left corner of the computed bounding box, relative to the target-image coordinate origin, be (X_B, Y_B), with bounding-box width W_B and height H_B, so that the upper-left corner of the target-pattern search region in the original target image is X_S0 = X_B - Offset, Y_S0 = Y_B - Offset, with search-region width W_S = W_B + 2×Offset and height H_S = H_B + 2×Offset, where Offset is an integer in the range of 5 to 10 pixels;
a third module for selecting, within the target-pattern search region, a matching area of the same size as the template image, performing the polar-coordinate transform on it, converting the transformed matching-area image data into a one-dimensional array (the matching-area one-dimensional array), and computing the correlation between the matching-area one-dimensional array and the template-image one-dimensional array;
a fourth module for, once the third module has computed correlation for all matching areas, selecting the matching area with the largest correlation as the target pattern and obtaining its position and orientation.
The beneficial effects of the method provided by the invention are:
1. Exploiting the fact that in IC manufacturing the captured chip pattern differs considerably in gray level from the chip background, a preprocessing stage of binarization and connected-component analysis determines the bounding-box position of the pattern in the target image, narrowing the matching search range and improving matching speed.
2. The template image and the search-region image are converted into one-dimensional arrays and normalized cross-correlation is then computed in polar form, achieving full-angle (±180 degrees) matching of the target image.
Brief description of the drawings
Fig. 1 is the overall flowchart of the fast cross-correlation grayscale image matching method of the invention;
Fig. 2 is the flowchart of the polar-coordinate transform of the template image in an embodiment of the invention;
Fig. 3 is the flowchart of searching the target image for the template image in an embodiment of the invention;
Fig. 4 is the flowchart of the polar-coordinate transform of the matching-area image in an embodiment of the invention.
Detailed description of the embodiments
To make the object, technical scheme and advantages of the invention clearer, the invention is further elaborated below in conjunction with the drawings and embodiments. It should be understood that the specific embodiments described here only explain the invention and are not intended to limit it. Moreover, the technical features involved in the embodiments described below may be combined with one another provided they do not conflict.
As shown in Fig. 1, the invention provides a fast cross-correlation grayscale image matching method for finding the position and orientation of a target pattern in a grayscale target image, the method comprising:
(1) performing a polar-coordinate transform on the template image and converting the transformed template image data into a one-dimensional array, defined as the template-image one-dimensional array;
(2) obtaining the bounding box of the target pattern in the grayscale target image and defining it as the target-pattern search region;
(3) selecting, within the target-pattern search region, a matching area of the same size as the template image, performing the polar-coordinate transform on it, converting the transformed matching-area image data into a one-dimensional array (the matching-area one-dimensional array), and computing the correlation between the matching-area one-dimensional array and the template-image one-dimensional array;
(4) repeating step (3) until correlation has been computed for all matching areas, then selecting the matching area with the largest correlation as the target pattern and obtaining its position and orientation.
With the proposed method, the captured image is preprocessed to determine the bounding-box position of the pattern in the target image, which narrows the matching search range and improves matching speed; in addition, the template image and the search-region image are converted into one-dimensional arrays and normalized cross-correlation is computed in polar form, achieving full-angle (±180 degrees) matching of the target image.
Specifically, the matching process for a target pattern in a grayscale image is elaborated below. A grayscale image can be expressed as a two-dimensional array indexed by row and column, the value of each array element being the pixel gray value of that point. The upper-left corner of the image is the origin of image coordinates; the vertical downward direction is the positive Y direction (also the direction of increasing row index, representing the image height direction), and the horizontal rightward direction is the positive X direction (also the direction of increasing column index, representing the image width direction).
The method divides into an offline stage and an online stage.
In the offline stage, the template image is first preprocessed, with the following concrete steps:
A1) Perform the polar-coordinate transform on the template image and save the transformed data as a one-dimensional array, as follows:
Suppose the template image has width W_M and height H_M, with pixel gray value Mode[Y][X], where 0 ≤ Y < H_M and 0 ≤ X < W_M. The center of the template image is then Y_CM = (H_M-1)/2, X_CM = (W_M-1)/2. Taking the template center as the pole and the minimum of W_M/2 and H_M/2 as the radius R_M, perform the polar-coordinate transform; the transformed image is expressed as a one-dimensional array of pixel gray values:
ModePolar[P] = Mode[Y][X]
Y = FLOOR(Y_CM - ρ·sinθ_M)
X = FLOOR(X_CM + ρ·cosθ_M)
where FLOOR denotes the largest integer not exceeding its argument, ρ = 1, 2, ..., R_M, θ_M = 0, Δθ, 2Δθ, ..., 2π, and Δθ is the angular matching precision (for example, for a required matching precision of 1 degree, Δθ = π/180); P = 0, 1, 2, ..., N, N = CEIL((2π/Δθ + 1)·R_M - 1), with CEIL denoting the smallest integer not less than its argument. As shown in Fig. 2, the assignment procedure of ModePolar[P] is:
(A11) initialize parameters: ρ = 1, θ_M = 0, P = 0, Y_CM = (H_M-1)/2, X_CM = (W_M-1)/2;
(A12) compute: Y = FLOOR(Y_CM - ρ·sinθ_M), X = FLOOR(X_CM + ρ·cosθ_M);
(A13) polar-transform the template data: ModePolar[P] = Mode[Y][X];
(A14) ρ = ρ + 1, P = P + 1;
(A15) if ρ > R_M, go to step (A16); otherwise return to step (A12);
(A16) ρ = 0, θ_M = θ_M + Δθ;
(A17) if θ_M > 2π, end A1) and go to A2); otherwise return to step (A12).
A2) Compute part of the parameters in the normalized cross-correlation formula:
S_M = Σ_{P=0}^{N} ModePolar[P]
S(M²) = Σ_{P=0}^{N} (ModePolar[P])²
SM² = (Σ_{P=0}^{N} ModePolar[P])²
A3) Duplicate the ModePolar[P] array; the two copies joined end to end form a new one-dimensional array ModePolarD[Q], where Q = 0, 1, 2, ..., 2N.
The online stage searches for the template image pattern in the target image, with the following concrete steps:
B1) Copy the original target image and binarize the copy by setting a suitable gray threshold; the binarized image becomes black-and-white. Because the chip pattern captured in IC manufacturing differs considerably in gray level from the chip background, after binarization the main body of the chip becomes one color and the background the opposite color; for example, the chip body black and the background white. Apply connected-component analysis (e.g. pixel labeling or run-length connectivity) to the binarized target image and compute the bounding box of the chip image region, where the bounding box is the smallest rectangle containing the target region (here, the chip pattern), generally axis-aligned. Suppose the pixel coordinate of the upper-left corner of the computed bounding box, relative to the target-image coordinate origin, is (X_B, Y_B), with bounding-box width W_B and height H_B. Then the upper-left corner of the search region in the original target image is X_S0 = X_B - Offset, Y_S0 = Y_B - Offset, with search-region width W_S = W_B + 2×Offset and height H_S = H_B + 2×Offset, where Offset is an integer in the range of 5 to 10 pixels.
Crop the search-region image from the original target image; this is called the search image, with width W_S and height H_S. The search-image X coordinate is denoted X_S and the Y coordinate Y_S, and MaxCcorr denotes the maximum cross-correlation value between search image and template image. Normalized cross-correlation grayscale matching is performed within the search image; as shown in Fig. 3, the matching flow is as follows:
B2) Initialize: X_S = 0, Y_S = 0, MaxCcorr = 0;
B3) If Y_S is not less than H_S, end; otherwise go to B4);
B4) If X_S is not less than W_S, set X_S = 0, Y_S = Y_S + 1 and return to B3); otherwise go to B5);
B5) Perform the polar-coordinate transform on the image region of the search image that has the same size as the template; the image of this region is called the matching-area image. Its width and height equal those of the template image, W_M and H_M. The gray value of a point in the matching area is Image[Y][X], where Y_S ≤ Y < Y_S + H_M and X_S ≤ X < X_S + W_M, so the center of the matching area is Y_CI = Y_S + (H_M-1)/2, X_CI = X_S + (W_M-1)/2. Taking the matching-area center as the pole and the minimum of W_M/2 and H_M/2 as the radius R_M, perform the polar-coordinate transform; the transformed image is expressed as the one-dimensional array of gray values ImagePolar[P] = Image[Y][X], with Y = FLOOR(Y_CI - ρ·sinθ_I), X = FLOOR(X_CI + ρ·cosθ_I), where FLOOR denotes the largest integer not exceeding its argument, ρ = 1, 2, ..., R_M, θ_I = 0, Δθ, 2Δθ, ..., 2π, and Δθ is the angular matching precision, identical to that of the template image (for example, for a required matching precision of 1 degree, Δθ = π/180); P = 0, 1, 2, ..., N, N = CEIL((2π/Δθ + 1)·R_M - 1), with CEIL denoting the smallest integer not less than its argument. As shown in Fig. 4, the assignment procedure of ImagePolar[P] is:
(B51) initialize parameters: ρ = 1, θ_I = 0, P = 0, Y_CI = Y_S + (H_M-1)/2, X_CI = X_S + (W_M-1)/2;
(B52) compute: Y = FLOOR(Y_CI - ρ·sinθ_I), X = FLOOR(X_CI + ρ·cosθ_I);
(B53) polar-transform the matching-area data: ImagePolar[P] = Image[Y][X];
(B54) ρ = ρ + 1, P = P + 1;
(B55) if ρ > R_M, go to step (B56); otherwise return to step (B52);
(B56) ρ = 0, θ_I = θ_I + Δθ;
(B57) if θ_I > 2π, end B5) and go to B6); otherwise return to step (B52).
B6) Compute the normalized cross-correlation parameters:
S_I = Σ_{P=0}^{N} ImagePolar[P],
S(I²) = Σ_{P=0}^{N} (ImagePolar[P])²,
SI² = (Σ_{P=0}^{N} ImagePolar[P])².
B7) Set the integer i = 0;
B8) If i is greater than N, set X_S = X_S + 1 and return to B4); otherwise go to B9);
B9) Compute the normalized cross-correlation value Ccorr_i between the template image and the matching area;
B10) If Ccorr_i is not greater than MaxCcorr, go to B11); otherwise assign Ccorr_i to MaxCcorr, record the coordinates of the current matching-area center (X_T, Y_T), where X_T = X_S + (W_M-1)/2, Y_T = Y_S + (H_M-1)/2, and record the angle θ_T = Δθ·FLOOR((P+R_M)/R_M);
B11) i = i + 1; return to B8).
After steps B1)-B11), if in step B3) Y_S is no longer less than H_S, the computation ends. The position (X_F, Y_F) and angle θ_F of the template image within the target image are obtained, where X_F = X_T + X_S0, Y_F = Y_T + Y_S0 and θ_F = θ_T, with X_T, Y_T and θ_T being the values last saved before steps B1)-B11) finished.
Those skilled in the art will readily understand that the foregoing is only a preferred embodiment of the invention and is not intended to limit it; any modification, equivalent replacement, improvement, etc. made within the spirit and principles of the invention shall be included within its scope of protection.

Claims (7)

1. a quick cross-correlation gray level image matching method, is characterized in that, for searching position and the direction of target pattern in gray scale target image, comprising:
(1) polar coordinates conversion is carried out to template image, and the template image data after conversion is converted to one-dimension array, be defined as template image one-dimension array;
(2) obtain the external connection box of target pattern in gray scale target image, be defined as the target pattern field of search, wherein said external connection box refers to the minimum rectangle comprising target pattern; Be specially:
Corresponding gray threshold is set by gray scale target image binaryzation, gray scale target image after binaryzation becomes black-and-white two color, element marking or distance of swimming connectivity analysis methods is adopted to carry out connected domain analysis to the gray scale target image after binaryzation, calculate the external connection box of target pattern, wherein external connection box refers to the minimum rectangle comprising target pattern;
The pixel coordinate designing the upper left corner relative target image coordinate initial point of the external connection box calculated is (X b, Y b), the width of external connection box is W b, be highly H b, then the upper left corner, the target pattern field of search in former target image is X relative to the coordinate of target image true origin s0=X b-Offset, Y s0=Y b-Offset, the width W of the target pattern field of search s=W b+ 2 × Offset is highly H s=H b+ 2 × Offset, wherein Offset is integer, and span is 5 ~ 10 pixels;
(3) in the target pattern field of search, the matching area onesize with template image is chosen, polar coordinates conversion is carried out to described matching area, and the matching area view data after conversion is also converted to one-dimension array, be defined as matching area one-dimension array, calculated the correlativity of described matching area one-dimensional data and described template image one-dimension array;
(4) repeating step (3) until the correlation computation has been completed for all matching regions, then selecting the matching region with the maximum correlation as the target pattern and obtaining the position and orientation of the target pattern.
2. the method for claim 1, is characterized in that, described step (1) specifically comprises: set the width of template image as W m, the height of template image is H m, the grey scale pixel value of template image point is Mode [Y] [X], wherein 0≤Y<H m, 0≤X<W m, then the center point coordinate of template image is Y cM=(H m-1)/2, X cM=(W m-1)/2, with template image center for the center of circle, with template W m/ 2 and H mthe minimum value of/2 is radius R m, carry out polar coordinates conversion, the image after conversion is expressed as the one-dimension array of grey scale pixel value: ModePolar [P]=Mode [Y] [X], wherein Y=FLOOR (Y cM-ρ sin θ m), X=FLOOR (X cM+ ρ cos θ m), FLOOR symbology gets the maximum integer less than the number in bracket, ρ=1,2 ... R m, θ m=0, Δ θ, 2 Δ θ ... 2 π, Δ θ are the precision of angle automatching, N=CEIL ((2 π/Δ θ+1) R m-1), the smallest positive integral larger than the number in bracket is got in CEIL representative, and ModePolar [P] array is copied 1 part, 2 arrays join end to end and form new one-dimension array ModePolarD [Q], wherein Q=0,1,2 ... 2N, and calculate normalized crosscorrelation parameter:
SM = Σ_{P=0}^{N} ModePolar[P]

S(M²) = Σ_{P=0}^{N} (ModePolar[P])²

SM² = ( Σ_{P=0}^{N} ModePolar[P] )²

corrDen = N · S(M²) - SM²
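Assuming the template's polar array ModePolar has already been built, the four parameters above can be precomputed once per template. Below is a minimal NumPy sketch with our own function name; note that the claim indexes samples as P = 0..N, so the array holds N + 1 values, and we multiply by the sample count so that a constant (zero-variance) template yields corrDen = 0.

```python
import numpy as np

def template_ncc_params(mode_polar):
    """Precompute SM, S(M^2) and corrDen for a template's polar
    1-D array, following the parameter definitions of claim 2."""
    mp = np.asarray(mode_polar, dtype=np.float64)
    count = mp.size                  # N + 1 samples (P = 0..N)
    sm = mp.sum()                    # SM
    s_m2 = np.square(mp).sum()       # S(M^2)
    sm2 = sm ** 2                    # SM^2
    corr_den = count * s_m2 - sm2    # template variance term of the NCC denominator
    return sm, s_m2, corr_den
```

Precomputing these sums is what makes the method fast: they depend only on the template, so they are reused unchanged for every candidate matching region.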
3. The method of claim 2, characterized in that the assignment procedure for ModePolar[P] is:
(A11) initialize parameters: ρ = 1, θ_M = 0, P = 0, Y_CM = (H_M - 1)/2, X_CM = (W_M - 1)/2;
(A12) compute: Y = FLOOR(Y_CM - ρ·sin θ_M), X = FLOOR(X_CM + ρ·cos θ_M);
(A13) apply the polar-coordinate transform to the template data: ModePolar[P] = Mode[Y][X];
(A14) ρ = ρ + 1, P = P + 1;
(A15) judge whether ρ is greater than R_M; if it is, go to step (A16); if it is not, return to step (A12);
(A16) ρ = 1, θ_M = θ_M + Δθ;
(A17) judge whether θ_M is greater than 2π; if it is, terminate; if it is not, return to step (A12).
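Steps (A11) to (A17) translate almost line for line into Python. The sketch below uses our own names, and the index clamp is an added guard: near the border, FLOOR can land one pixel outside the template, a case the claim does not address.

```python
import math

def unroll_polar(mode, r_m, dtheta):
    """Unroll a 2-D gray image `mode` (list of rows) into the 1-D
    ModePolar array per steps (A11)-(A17): for each angle
    theta_M = 0, dtheta, ..., 2*pi, sample radii rho = 1..r_m."""
    h_m, w_m = len(mode), len(mode[0])
    y_cm, x_cm = (h_m - 1) / 2.0, (w_m - 1) / 2.0
    mode_polar = []
    n_angles = int(round(2 * math.pi / dtheta)) + 1  # theta_M = 0 .. 2*pi
    for k in range(n_angles):
        theta = k * dtheta
        for rho in range(1, r_m + 1):
            y = math.floor(y_cm - rho * math.sin(theta))
            x = math.floor(x_cm + rho * math.cos(theta))
            # Clamp: FLOOR may step one pixel outside the image.
            y = min(max(y, 0), h_m - 1)
            x = min(max(x, 0), w_m - 1)
            mode_polar.append(mode[y][x])
    return mode_polar
```

The resulting array has (2π/Δθ + 1)·R_M entries, i.e. N + 1 samples with N = CEIL((2π/Δθ + 1)·R_M - 1) as defined in claim 2.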
4. the method for claim 1, is characterized in that, described step (3) specifically comprises:
From former target image, intercept the image in the target pattern field of search, be called searching image, then the width of searching image is W s, be highly H s, searching image X-coordinate is expressed as X s, Y-coordinate is expressed as Y s, MaxCcorr represents the maximum cross-correlation value of searching image and template image, is normalized cross-correlation Gray-scale Matching in searching image, and coupling flow process is as follows:
B1) initialize parameters: X_S = 0, Y_S = 0, MaxCcorr = 0;
B2) judge whether Y_S is less than H_S; if it is not, terminate; if it is, go to B3);
B3) judge whether X_S is less than W_S; if it is not, set X_S = 0 and Y_S = Y_S + 1 and return to step B2); if it is, go to step B4);
B4) apply the polar-coordinate transform to the region of the search image that has the same size as the template, called the matching-region image; the matching-region image has the same width W_M and height H_M as the template image, and the gray value of a pixel in the matching region is Image[Y][X], where Y_S ≤ Y < Y_S + H_M and X_S ≤ X < X_S + W_M; the center of the matching region is then Y_CI = Y_S + (H_M - 1)/2, X_CI = X_S + (W_M - 1)/2; with the matching-region center as the pole and the minimum of W_M/2 and H_M/2 as the radius R_M, the polar-coordinate transform expresses the image as a one-dimensional array of gray values ImagePolar[P] = Image[Y][X], where Y = FLOOR(Y_CI - ρ·sin θ_I), X = FLOOR(X_CI + ρ·cos θ_I), FLOOR denotes the largest integer not exceeding its argument, ρ = 1, 2, …, R_M, θ_I = 0, Δθ, 2Δθ, …, 2π, Δθ is the angular matching precision, P = 0, 1, 2, …, N, N = CEIL((2π/Δθ + 1)·R_M - 1), and CEIL denotes the smallest integer not less than its argument;
B5) compute the normalized cross-correlation parameters:
SI = Σ_{P=0}^{N} ImagePolar[P]

S(I²) = Σ_{P=0}^{N} (ImagePolar[P])²

SI² = ( Σ_{P=0}^{N} ImagePolar[P] )²
B6) initialize the integer i = 0;
B7) judge whether i is greater than N; if it is, set X_S = X_S + 1 and return to B3); otherwise go to B8);
B8) compute the normalized cross-correlation value Ccorr_i between the template image and the matching region:
S_i(IM) = Σ_{P=i}^{N+i} ( ImagePolar[P - i] · ModePolarD[P] )

Ccorr_i = ( N · S_i(IM) - SI · SM ) / ( √|N · S(I²) - SI²| · √|N · S(M²) - SM²| )
B9) judge whether Ccorr_i is greater than MaxCcorr; if it is not, go to B10); if it is, assign Ccorr_i to MaxCcorr, record the coordinates (X_T, Y_T) of the current matching-region center, where X_T = X_S + (W_M - 1)/2 and Y_T = Y_S + (H_M - 1)/2, and record the angle θ_T = Δθ · FLOOR((i + R_M)/R_M);
B10) set i = i + 1 and return to step B7).
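Steps B5) to B10) amount to evaluating the normalized cross-correlation of the matching-region array against every circular shift i of the doubled template array and keeping the maximum; each shift corresponds to one trial rotation, which is how the full ±180-degree angle search is achieved. A NumPy sketch under our own names follows; recovering the angle θ_T = Δθ · FLOOR((i + R_M)/R_M) from the returned shift is left to the caller.

```python
import numpy as np

def best_circular_shift(image_polar, mode_polar):
    """Return (max Ccorr, best shift i) over all circular shifts of
    the template polar array against the matching-region polar array,
    following steps B5)-B10). Uses the sample count where the claim
    writes N, so a perfect match scores exactly 1."""
    ip = np.asarray(image_polar, dtype=np.float64)
    mp = np.asarray(mode_polar, dtype=np.float64)
    n = ip.size
    mode_d = np.concatenate([mp, mp])          # ModePolarD: two copies end to end
    si, sm = ip.sum(), mp.sum()
    s_i2, s_m2 = np.square(ip).sum(), np.square(mp).sum()
    den = (np.sqrt(abs(n * s_i2 - si ** 2))
           * np.sqrt(abs(n * s_m2 - sm ** 2)))
    max_ccorr, best_i = -np.inf, 0
    for i in range(n):
        s_im = float(np.dot(ip, mode_d[i:i + n]))   # S_i(IM)
        ccorr = (n * s_im - si * sm) / den if den else 0.0
        if ccorr > max_ccorr:
            max_ccorr, best_i = ccorr, i
    return max_ccorr, best_i
```

Only S_i(IM) changes between shifts; SI, SM, S(I²) and S(M²) are computed once per matching region, which is the efficiency argument behind claim 4.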
5. The method of claim 4, characterized in that the assignment procedure for ImagePolar[P] is:
(B41) initialize parameters: ρ = 1, θ_I = 0, P = 0, Y_CI = Y_S + (H_M - 1)/2, X_CI = X_S + (W_M - 1)/2;
(B42) compute: Y = FLOOR(Y_CI - ρ·sin θ_I), X = FLOOR(X_CI + ρ·cos θ_I);
(B43) apply the polar-coordinate transform to the matching-region data: ImagePolar[P] = Image[Y][X];
(B44) ρ = ρ + 1, P = P + 1;
(B45) judge whether ρ is greater than R_M; if it is, go to step (B46); if it is not, return to step (B42);
(B46) ρ = 1, θ_I = θ_I + Δθ;
(B47) judge whether θ_I is greater than 2π; if it is, terminate; if it is not, return to step (B42).
6. The method of claim 4, characterized in that step (4) is specifically:
when, in step B2), Y_S is no longer less than H_S, the computation ends; after the correlation computation has been completed for all matching regions, the position (X_F, Y_F) and angle θ_F of the template image in the target image are obtained as X_F = X_T + X_S0, Y_F = Y_T + Y_S0, and θ_F = θ_T, where X_T, Y_T, and θ_T are the values last saved before steps B1) through B10) terminate.
7. A fast cross-correlation image matching device for locating the position and orientation of a target pattern in a gray-scale target image, characterized in that the device comprises:
a first module for performing a polar-coordinate transform on the template image and converting the transformed template image data into a one-dimensional array, defined as the template one-dimensional array;
a second module for obtaining the bounding box of the target pattern in the gray-scale target image, defined as the target pattern search region; specifically:
setting a suitable gray threshold to binarize the gray-scale target image so that it becomes purely black and white, performing connected-component analysis on the binarized image by component labeling or run-length connectivity analysis, and computing the bounding box of the target pattern, where the bounding box is the smallest rectangle enclosing the target pattern;
letting the pixel coordinate of the upper-left corner of the computed bounding box relative to the target image origin be (X_B, Y_B), and the bounding box have width W_B and height H_B, the upper-left corner of the target pattern search region in the original target image has coordinates X_S0 = X_B - Offset, Y_S0 = Y_B - Offset relative to the target image origin, and the search region has width W_S = W_B + 2 × Offset and height H_S = H_B + 2 × Offset, where Offset is an integer with a value in the range of 5 to 10 pixels;
a third module for selecting, within the target pattern search region, a matching region of the same size as the template image, performing a polar-coordinate transform on the matching region, converting the transformed matching-region image data into a one-dimensional array defined as the matching-region one-dimensional array, and computing the correlation between the matching-region one-dimensional array and the template one-dimensional array;
a fourth module for selecting, once the third module has completed the correlation computation for all matching regions, the matching region with the maximum correlation as the target pattern, and obtaining the position and orientation of the target pattern.
CN201310331842.0A 2013-08-01 2013-08-01 A kind of cross-correlation gray level image matching method and device fast Active CN103593838B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201310331842.0A CN103593838B (en) 2013-08-01 2013-08-01 A kind of cross-correlation gray level image matching method and device fast


Publications (2)

Publication Number Publication Date
CN103593838A CN103593838A (en) 2014-02-19
CN103593838B true CN103593838B (en) 2016-04-13

Family

ID=50083963


Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9996764B2 (en) * 2014-04-29 2018-06-12 Institute Of Automation Chinese Academy Of Sciences Image matching method based on cascaded binary encoding
CN106899864A (en) * 2015-12-18 2017-06-27 北京国双科技有限公司 Commercial detection method and device
CN105740899B (en) * 2016-01-29 2019-08-23 长安大学 A kind of detection of machine vision image characteristic point and match compound optimization method
CN105843972B (en) * 2016-06-13 2020-05-01 北京京东尚科信息技术有限公司 Product attribute information comparison method and device
CN109348731B (en) * 2016-10-14 2022-05-17 深圳配天智能技术研究院有限公司 Image matching method and device
CN106898017B (en) * 2017-02-27 2019-05-31 网易(杭州)网络有限公司 The method, apparatus and terminal device of image local area for identification
CN110603535B (en) * 2019-07-29 2023-02-28 香港应用科技研究院有限公司 Iterative multi-directional image search supporting large template matching
CN112215304A (en) * 2020-11-05 2021-01-12 珠海大横琴科技发展有限公司 Gray level image matching method and device for geographic image splicing

Citations (3)

Publication number Priority date Publication date Assignee Title
CN101859384A (en) * 2010-06-12 2010-10-13 北京航空航天大学 Target image sequence measurement method
CN101950419A (en) * 2010-08-26 2011-01-19 西安理工大学 Quick image rectification method in presence of translation and rotation at same time
CN103020945A (en) * 2011-09-21 2013-04-03 中国科学院电子学研究所 Remote sensing image registration method of multi-source sensor

Family Cites Families (1)

Publication number Priority date Publication date Assignee Title
JP4686762B2 (en) * 2005-06-07 2011-05-25 独立行政法人産業技術総合研究所 Three-dimensional shape alignment method and program


Non-Patent Citations (2)

Title
A fast algorithm for extracting the minimum bounding rectangle of a target image; Lu Rong et al.; Computer Engineering; 2010-11-30; Vol. 36, No. 21; abstract, Sections 1-3 *
Research on automatic recognition algorithms and systems for ground target images; Zhang Wei; China Masters' Theses Full-text Database, Information Science and Technology; 2008-03-15 (No. 3); Chapter 2, Sections 2.1-2.4, Fig. 2-8 *

Also Published As

Publication number Publication date
CN103593838A (en) 2014-02-19

Similar Documents

Publication Publication Date Title
CN103593838B (en) A kind of cross-correlation gray level image matching method and device fast
Romero-Ramirez et al. Speeded up detection of squared fiducial markers
JP5699788B2 (en) Screen area detection method and system
CN105139416A (en) Object identification method based on image information and depth information
Lysenkov et al. Pose estimation of rigid transparent objects in transparent clutter
CN105551039A (en) Calibration method and calibration device for structured light 3D scanning system
TWI500925B (en) Check the device, check the method and check the program
CN102661708B (en) High-density packaged element positioning method based on speeded up robust features (SURFs)
US20130058526A1 (en) Device for automated detection of feature for calibration and method thereof
Zhang et al. A region-based normalized cross correlation algorithm for the vision-based positioning of elongated IC chips
Bai et al. Corner point-based coarse–fine method for surface-mount component positioning
Yuan et al. Combining maps and street level images for building height and facade estimation
Xie et al. A4lidartag: Depth-based fiducial marker for extrinsic calibration of solid-state lidar and camera
Qiu et al. Image mosaics algorithm based on SIFT feature point matching and transformation parameters automatically recognizing
Dao et al. A robust recognition technique for dense checkerboard patterns
Wang et al. Visual positioning of rectangular lead components based on Harris corners and Zernike moments
Sihombing et al. Perspective rectification in vehicle number plate recognition using 2D-2D transformation of Planar Homography
CN103177416B (en) A kind of QR code image position method based on least square method
Lu et al. Automatic Detection of Chip Pin Defect in Semiconductor Assembly Using Vision Measurement
Fujita et al. Floor fingerprint verification using a gravity-aware smartphone
CN105469085A (en) Board card image acquisition method and system
Liang et al. An integrated camera parameters calibration approach for robotic monocular vision guidance
CN104236518B (en) A kind of antenna main beam sensing method based on optical imagery and pattern-recognition
JP2011186916A (en) Image recognition device, image recognition method and image recognition program
Huang et al. A checkerboard corner detection method using circular samplers

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
C41 Transfer of patent application or patent right or utility model
TR01 Transfer of patent right

Effective date of registration: 20161125

Address after: 430074 Hubei Province, Wuhan city Hongshan District Luoyu Road No. 1037

Patentee after: Huazhong University of Science and Technology

Patentee after: GUANGDONG HUST INDUSTRIAL TECHNOLOGY RESEARCH INSTITUTE

Address before: 430074 Hubei Province, Wuhan city Hongshan District Luoyu Road No. 1037

Patentee before: Huazhong University of Science and Technology