CN103593838A - Fast cross-correlation grayscale image matching method and device - Google Patents

Fast cross-correlation grayscale image matching method and device

Info

Publication number
CN103593838A
CN103593838A (application CN201310331842.0A; granted publication CN103593838B)
Authority
CN
China
Prior art keywords
image
matching
target
target pattern
correlation
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201310331842.0A
Other languages
Chinese (zh)
Other versions
CN103593838B (en)
Inventor
杨华
尹周平
王瑜辉
张步阳
魏飞龙
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Huazhong University of Science and Technology
Guangdong Hust Industrial Technology Research Institute
Original Assignee
Huazhong University of Science and Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huazhong University of Science and Technology
Priority to CN201310331842.0A
Publication of CN103593838A
Application granted
Publication of CN103593838B
Legal status: Active
Anticipated expiration

Landscapes

  • Image Analysis (AREA)

Abstract


Figure 201310331842

The invention discloses a fast cross-correlation grayscale image matching method and device for finding the position and direction of a target pattern in a grayscale target image. The method includes: performing a polar coordinate transform on a template image; acquiring the bounding box of the target pattern in the grayscale target image and defining it as the target-pattern search area; selecting within the search area a matching area the same size as the template image, converting the matching area to polar coordinates, and computing the correlation between the one-dimensional matching-area data and the one-dimensional template array; and, once the correlation has been computed for every matching area, selecting the matching area with the highest correlation as the target pattern and obtaining its position and direction. The method narrows the matching search range and raises matching speed, and the polar-form normalized cross-correlation achieves full-angle (±180 degree) matching of the target image.


Description

Fast cross-correlation grayscale image matching method and device
Technical field
The invention belongs to the technical field of image processing, and more specifically relates to a fast cross-correlation grayscale image matching method and device.
Background technology
In integrated circuit (IC) manufacturing, high-speed, high-precision chip pick-up and placement is key to production efficiency, and both depend on high-speed, high-precision chip positioning. Machine vision photographs the inspected object with a camera and then computes and analyzes the image with appropriate image-processing algorithms to detect and locate the target; as a non-contact measurement technique it is very widely used in IC manufacturing. Within machine vision, image matching is the key to high-speed, high-precision positioning.
The traditional normalized cross-correlation grayscale matching method searches the entire image, wasting a great deal of matching time. Moreover, because of mechanical error in the IC manufacturing equipment itself, the pattern in the target image is often rotated relative to the template pattern, and this rotation makes the image matching algorithm considerably more complicated.
Summary of the invention
In view of the above defects or improvement needs of the prior art, the invention provides a fast full-angle normalized cross-correlation grayscale matching method. Its purpose is to raise grayscale matching speed and thereby solve the technical problem that overly long image-matching times in IC manufacturing equipment hurt production efficiency.
To achieve the above object, according to one aspect of the present invention, a fast cross-correlation grayscale image matching method is provided for finding the position and direction of a target pattern in a grayscale target image, comprising:
(1) performing a polar coordinate transform on the template image and converting the transformed template image data into a one-dimensional array, defined as the template one-dimensional array;
(2) obtaining the bounding box of the target pattern in the grayscale target image and defining it as the target-pattern search area, where the bounding box is the smallest rectangle containing the target pattern;
(3) selecting in the target-pattern search area a matching area the same size as the template image, performing a polar coordinate transform on the matching area, converting the transformed matching-area image data into a one-dimensional array, defined as the matching-area one-dimensional array, and computing the correlation between the matching-area one-dimensional data and the template one-dimensional array;
(4) repeating step (3) until the correlation has been computed for every matching area, then selecting the matching area with the highest correlation as the target pattern and obtaining its position and direction.
With the proposed method, the acquired image is preprocessed to determine the position of the pattern's bounding box in the target image, narrowing the matching search range and raising matching speed; in addition, the template image and the search-area image are converted into one-dimensional arrays and a polar-form normalized cross-correlation is computed, achieving full-angle (±180 degree) matching of the target image.
Preferably, step (1) specifically comprises:
Let the width of the template image be W_M, its height H_M, and the grayscale value of a template pixel Mode[Y][X], where 0 ≤ Y < H_M and 0 ≤ X < W_M. The center of the template image is Y_CM = (H_M−1)/2, X_CM = (W_M−1)/2. Taking the template center as the origin and the minimum of W_M/2 and H_M/2 as the radius R_M, perform a polar coordinate transform; the transformed image is represented as a one-dimensional array of pixel grayscale values: ModePolar[P] = Mode[Y][X], where Y = FLOOR(Y_CM − ρ·sinθ_M), X = FLOOR(X_CM + ρ·cosθ_M), FLOOR denotes the largest integer not greater than its argument, ρ = 1, 2, …, R_M, θ_M = 0, Δθ, 2Δθ, …, 2π, Δθ is the angular matching precision, P = 0, 1, 2, …, N, and N = CEIL((2π/Δθ + 1)·R_M − 1), where CEIL denotes the smallest integer not less than its argument. The ModePolar[P] array is then copied once, and the two arrays are joined end to end to form a new one-dimensional array ModePolarD[Q], where Q = 0, 1, 2, …, 2N. Finally the normalized cross-correlation parameters are computed:

SM = Σ_{P=0}^{N} ModePolar[P]

S(M²) = Σ_{P=0}^{N} (ModePolar[P])²

SM² = (Σ_{P=0}^{N} ModePolar[P])²

corrDen = N·S(M²) − SM².
By converting the two-dimensional image data into one-dimensional data through the polar coordinate transform, the complex matching problem with a rotation angle is reduced to computing cross-correlation values over simple one-dimensional arrays; meanwhile, part of the normalized cross-correlation parameters are computed offline, raising the speed of online matching.
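The offline template stage of step (1) can be sketched as follows in Python, assuming a grayscale template held in a NumPy array; the function and variable names are illustrative, not from the patent text, and the loop follows the ρ-inner/θ-outer sampling order of the assignment procedure.

```python
# A minimal sketch of the offline template stage (step (1)); names are
# illustrative assumptions, not identifiers from the patent.
import numpy as np

def template_to_polar(mode, dtheta):
    """Unwrap a template image into a 1-D polar array and precompute
    the template-side terms of the normalized cross-correlation."""
    h, w = mode.shape
    yc, xc = (h - 1) / 2.0, (w - 1) / 2.0          # template center (Y_CM, X_CM)
    r = int(min(w, h) // 2)                         # radius R_M
    samples = []
    theta = 0.0
    while theta <= 2 * np.pi:                       # theta_M = 0, dtheta, ..., 2*pi
        for rho in range(1, r + 1):                 # rho = 1 .. R_M
            y = int(np.floor(yc - rho * np.sin(theta)))
            x = int(np.floor(xc + rho * np.cos(theta)))
            samples.append(float(mode[y, x]))       # ModePolar[P] = Mode[Y][X]
        theta += dtheta
    mode_polar = np.asarray(samples)
    n = len(mode_polar)                             # sample count (the patent writes N)
    sm = mode_polar.sum()                           # SM
    sm2 = mode_polar @ mode_polar                   # S(M^2)
    corr_den = n * sm2 - sm * sm                    # corrDen
    # doubled array for circular-shift matching (ModePolarD)
    mode_polar_d = np.concatenate([mode_polar, mode_polar])
    return mode_polar, mode_polar_d, sm, corr_den
```

For a constant template every polar sample is equal, so corrDen is zero, which is why flat regions cannot be matched by normalized cross-correlation.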
Preferably, the assignment procedure for ModePolar[P] is:
(A11) initialize parameters: ρ = 1, θ_M = 0, P = 0, Y_CM = (H_M−1)/2, X_CM = (W_M−1)/2;
(A12) compute: Y = FLOOR(Y_CM − ρ·sinθ_M), X = FLOOR(X_CM + ρ·cosθ_M);
(A13) polar-transform the template data: ModePolar[P] = Mode[Y][X];
(A14) ρ = ρ+1, P = P+1;
(A15) test whether ρ is greater than R_M; if so, go to step (A16), otherwise return to step (A12);
(A16) ρ = 0, θ_M = θ_M + Δθ;
(A17) test whether θ_M is greater than 2π; if so, finish, otherwise return to step (A12).
Preferably, step (2) is specifically:
Binarize the grayscale target image with an appropriate gray threshold so that it becomes black and white; apply connected-domain analysis to the binarized image using pixel-marking or run-length connectivity analysis, and compute the bounding box of the target pattern, where the bounding box is the smallest rectangle containing the target pattern;
Let the pixel coordinates of the upper-left corner of the computed bounding box relative to the target-image origin be (X_B, Y_B), the width of the bounding box W_B and its height H_B. Then the upper-left corner of the target-pattern search area in the original target image, relative to the target-image origin, is at X_S0 = X_B − Offset, Y_S0 = Y_B − Offset, and the width and height of the search area are W_S = W_B + 2×Offset and H_S = H_B + 2×Offset, where Offset is an integer in the range of 5 to 10 pixels.
By computing the bounding box, the target pattern is quickly coarse-localized, which narrows the search range of the subsequent cross-correlation matching and raises image-matching speed.
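The coarse localization of step (2) can be sketched as follows. The patent uses pixel-marking or run-length connected-component analysis; as a simplifying assumption this sketch takes the bounding box of all foreground pixels, which coincides with the component bounding box when a single chip blob survives thresholding. The function name and the clipping to the image border are illustrative additions.

```python
# A minimal sketch of coarse localization (step (2)), assuming the chip
# pattern is the only blob left after thresholding.
import numpy as np

def search_region(target, threshold, offset=5, foreground_dark=True):
    """Binarize the target image and return the expanded search region
    (x_s0, y_s0, w_s, h_s) around the foreground bounding box."""
    fg = target < threshold if foreground_dark else target >= threshold
    ys, xs = np.nonzero(fg)
    xb, yb = xs.min(), ys.min()                  # bounding-box top-left (X_B, Y_B)
    wb, hb = xs.max() - xb + 1, ys.max() - yb + 1
    # expand by `offset` pixels (5-10 in the patent); clipping to the image
    # border is an extra safety measure not spelled out in the text
    x_s0 = max(int(xb) - offset, 0)
    y_s0 = max(int(yb) - offset, 0)
    x_s1 = min(int(xb + wb) + offset, target.shape[1])
    y_s1 = min(int(yb + hb) + offset, target.shape[0])
    return x_s0, y_s0, x_s1 - x_s0, y_s1 - y_s0
```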
Preferably, step (3) specifically comprises:
Crop from the original target image the image inside the target-pattern search area, called the search image; its width is W_S and height H_S, its X coordinate is denoted X_S and its Y coordinate Y_S, and MaxCcorr denotes the maximum cross-correlation value between the search image and the template image. Normalized cross-correlation grayscale matching is performed in the search image; the matching procedure is as follows:
B1) initialize parameters: X_S = 0, Y_S = 0, MaxCcorr = 0;
B2) test whether Y_S is less than H_S; if not, finish, otherwise go to B3);
B3) test whether X_S is less than W_S; if not, set X_S = 0, Y_S = Y_S + 1 and return to step B2); otherwise go to step B4);
B4) perform a polar coordinate transform on the image region of the search image with the same size as the template; the image in this region is called the matching-area image. Its width and height equal those of the template image, W_M and H_M. The grayscale value of a point in the matching area is Image[Y][X], where Y_S ≤ Y < Y_S + H_M and X_S ≤ X < X_S + W_M; the center of the matching area is Y_CI = Y_S + (H_M−1)/2, X_CI = X_S + (W_M−1)/2. Taking the matching-area center as the origin and the minimum of W_M/2 and H_M/2 as the radius R_M, perform a polar coordinate transform; the transformed image is represented as a one-dimensional array of pixel grayscale values ImagePolar[P] = Image[Y][X], with Y = FLOOR(Y_CI − ρ·sinθ_I), X = FLOOR(X_CI + ρ·cosθ_I), where FLOOR denotes the largest integer not greater than its argument, ρ = 1, 2, …, R_M, θ_I = 0, Δθ, 2Δθ, …, 2π, Δθ is the angular matching precision, P = 0, 1, 2, …, N, N = CEIL((2π/Δθ + 1)·R_M − 1), and CEIL denotes the smallest integer not less than its argument;
B5) compute the normalized cross-correlation parameters:

SI = Σ_{P=0}^{N} ImagePolar[P]

S(I²) = Σ_{P=0}^{N} (ImagePolar[P])²

SI² = (Σ_{P=0}^{N} ImagePolar[P])²

B6) set the integer i = 0;
B7) test whether i is greater than N; if so, set X_S = X_S + 1 and return to B3), otherwise go to B8);
B8) compute the normalized cross-correlation value Ccorr_i of the template image and the matching area:

S_i(IM) = Σ_{P=i}^{N+i} ImagePolar[P−i]·ModePolarD[P]

Ccorr_i = (N·S_i(IM) − SI·SM) / (√|N·S(I²) − SI²| · √|N·S(M²) − SM²|)

B9) test whether Ccorr_i is greater than MaxCcorr; if not, go to B10); if so, assign Ccorr_i to MaxCcorr, record the current coordinates (X_T, Y_T) of the matching-area center, where X_T = X_S + (W_M−1)/2 and Y_T = Y_S + (H_M−1)/2, and record the angle θ_T = Δθ·FLOOR((P+R_M)/R_M);
B10) i = i + 1; return to step B7).
By polar-transforming both the template image and the target image into one-dimensional arrays, the rotational matching of template and target is converted into computing cross-correlation values between two one-dimensional arrays under relative translation. This avoids rotating an image by a fixed angle step and matching once per rotation, shortening matching time and raising matching speed.
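The rotation-as-translation matching described above can be sketched as a circular-shift normalized cross-correlation over the one-dimensional polar arrays. Here `image_polar` and `mode_polar` are assumed to be equal-length unwrappings produced as in the preceding steps; the names are illustrative, and the square-rooted denominator follows the standard NCC form.

```python
# A minimal sketch of the circular-shift correlation loop, assuming
# equal-length 1-D polar unwrappings of the matching area and template.
import numpy as np

def best_rotation_ncc(image_polar, mode_polar):
    """Return (best_shift, best_ncc): the circular shift of the template
    that maximizes the normalized cross-correlation with the image data."""
    n = len(image_polar)
    mode_d = np.concatenate([mode_polar, mode_polar])   # ModePolarD
    si, sm = image_polar.sum(), mode_polar.sum()        # SI, SM
    si2 = image_polar @ image_polar                     # S(I^2)
    sm2 = mode_polar @ mode_polar                       # S(M^2)
    den = np.sqrt(abs(n * si2 - si * si)) * np.sqrt(abs(n * sm2 - sm * sm))
    best_shift, best_ncc = 0, -np.inf
    for i in range(n):                                  # each candidate rotation
        s_im = image_polar @ mode_d[i:i + n]            # S_i(IM)
        ncc = (n * s_im - si * sm) / den if den > 0 else 0.0
        if ncc > best_ncc:
            best_shift, best_ncc = i, ncc
    return best_shift, best_ncc
```

A circularly rotated copy of the template scores an NCC of 1 at exactly one shift, which is how the rotation angle is recovered from the shift index.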
Preferably, the assignment procedure for ImagePolar[P] is:
(B41) initialize parameters: ρ = 1, θ_I = 0, P = 0, Y_CI = Y_S + (H_M−1)/2, X_CI = X_S + (W_M−1)/2;
(B42) compute: Y = FLOOR(Y_CI − ρ·sinθ_I), X = FLOOR(X_CI + ρ·cosθ_I);
(B43) polar-transform the matching-area data: ImagePolar[P] = Image[Y][X];
(B44) ρ = ρ+1, P = P+1;
(B45) test whether ρ is greater than R_M; if so, go to step (B46), otherwise return to step (B42);
(B46) ρ = 0, θ_I = θ_I + Δθ;
(B47) test whether θ_I is greater than 2π; if so, finish, otherwise return to step (B42).
Preferably, step (4) is specifically:
If in step B2) Y_S is no longer less than H_S, the computation ends. Once the correlation has been computed for every matching area, the position (X_F, Y_F) and angle θ_F of the template image within the target image are obtained, where X_F = X_T + X_S0, Y_F = Y_T + Y_S0 and θ_F = θ_T, and X_T, Y_T and θ_T are the values last saved before steps B1)–B10) finished.
According to another aspect of the present invention, a fast cross-correlation image matching device is provided for finding the position and direction of a target pattern in a grayscale target image, the device comprising:
a first module for performing a polar coordinate transform on the template image and converting the transformed template image data into a one-dimensional array, defined as the template one-dimensional array;
a second module for obtaining the bounding box of the target pattern in the grayscale target image, defined as the target-pattern search area;
a third module for selecting in the target-pattern search area a matching area the same size as the template image, performing a polar coordinate transform on the matching area, converting the transformed matching-area image data into a one-dimensional array, defined as the matching-area one-dimensional array, and computing the correlation between the matching-area one-dimensional data and the template one-dimensional array;
a fourth module for, when the third module has completed the correlation computation for all matching areas, selecting the matching area with the highest correlation as the target pattern and obtaining its position and direction.
The beneficial effects of the method provided by the invention are:
1. For the characteristic of IC manufacturing that the chip pattern and the chip background in the acquired image differ considerably in gray level, the acquired image is preprocessed by binarization and connected-domain analysis to determine the position of the pattern's bounding box in the target image, narrowing the matching search range and raising matching speed.
2. The template image and the search-area image are converted into one-dimensional arrays and a polar-form normalized cross-correlation is computed, achieving full-angle (±180 degree) matching of the target image.
Brief description of the drawings
Fig. 1 is the overall flow chart of the fast cross-correlation grayscale image matching method of the invention;
Fig. 2 is the flow chart of the polar coordinate transform of the template image in an embodiment of the invention;
Fig. 3 is the flow chart of searching the target image for the template pattern in an embodiment of the invention;
Fig. 4 is the flow chart of the polar coordinate transform of the matching-area image in an embodiment of the invention.
Detailed description of the embodiments
To make the objects, technical solutions and advantages of the present invention clearer, the invention is further elaborated below with reference to the drawings and embodiments. It should be understood that the specific embodiments described here only explain the invention and are not intended to limit it. Furthermore, the technical features involved in the embodiments of the invention described below may be combined with each other as long as they do not conflict.
As shown in Fig. 1, the invention provides a fast cross-correlation grayscale image matching method for finding the position and direction of a target pattern in a grayscale target image, the method comprising:
(1) performing a polar coordinate transform on the template image and converting the transformed template image data into a one-dimensional array, defined as the template one-dimensional array;
(2) obtaining the bounding box of the target pattern in the grayscale target image and defining it as the target-pattern search area;
(3) selecting in the target-pattern search area a matching area the same size as the template image, performing a polar coordinate transform on the matching area, converting the transformed matching-area image data into a one-dimensional array, defined as the matching-area one-dimensional array, and computing the correlation between the matching-area one-dimensional data and the template one-dimensional array;
(4) repeating step (3) until the correlation has been computed for every matching area, then selecting the matching area with the highest correlation as the target pattern and obtaining its position and direction.
With the proposed method, the acquired image is preprocessed to determine the position of the pattern's bounding box in the target image, narrowing the matching search range and raising matching speed; in addition, the template image and the search-area image are converted into one-dimensional arrays and a polar-form normalized cross-correlation is computed, achieving full-angle (±180 degree) matching of the target image.
Specifically, the matching process for the target pattern in a grayscale image is elaborated below. A grayscale image can be expressed as a two-dimensional array in row-column order, each element representing the grayscale value of a pixel. The upper-left corner of the image is taken as the origin of image coordinates; the vertically downward direction is the positive Y direction, in which the row index increases, representing the height of the image; the horizontally rightward direction is the positive X direction, in which the column index increases, representing the width of the image.
The method is divided into an offline stage and an online stage.
In the offline stage, the template image is first preprocessed; the specific steps are as follows:
A1) perform a polar coordinate transform on the template image and save the transformed data as a one-dimensional array; the specific steps are as follows:
Suppose the width of the template image is W_M, its height H_M, and the grayscale value of a template pixel Mode[Y][X], where 0 ≤ Y < H_M and 0 ≤ X < W_M. The center of the template image is Y_CM = (H_M−1)/2, X_CM = (W_M−1)/2. Taking the template center as the origin and the minimum of W_M/2 and H_M/2 as the radius R_M, perform a polar coordinate transform; the transformed image is represented as a one-dimensional array of pixel grayscale values:

ModePolar[P] = Mode[Y][X]
Y = FLOOR(Y_CM − ρ·sinθ_M)
X = FLOOR(X_CM + ρ·cosθ_M)

where FLOOR denotes the largest integer not greater than its argument, ρ = 1, 2, …, R_M, θ_M = 0, Δθ, 2Δθ, …, 2π, and Δθ is the angular matching precision; for example, if the required matching precision is 1 degree, Δθ = π/180. P = 0, 1, 2, …, N, with N = CEIL((2π/Δθ + 1)·R_M − 1), where CEIL denotes the smallest integer not less than its argument. As shown in Fig. 2, the assignment procedure for ModePolar[P] is:
(A11) initialize parameters: ρ = 1, θ_M = 0, P = 0, Y_CM = (H_M−1)/2, X_CM = (W_M−1)/2;
(A12) compute: Y = FLOOR(Y_CM − ρ·sinθ_M), X = FLOOR(X_CM + ρ·cosθ_M);
(A13) polar-transform the template data: ModePolar[P] = Mode[Y][X];
(A14) ρ = ρ+1, P = P+1;
(A15) test whether ρ is greater than R_M; if so, go to step (A16), otherwise return to step (A12);
(A16) ρ = 0, θ_M = θ_M + Δθ;
(A17) test whether θ_M is greater than 2π; if so, end A1) and go to A2); otherwise return to step (A12).
A2) compute the partial parameters of the normalized cross-correlation formula:

SM = Σ_{P=0}^{N} ModePolar[P]

S(M²) = Σ_{P=0}^{N} (ModePolar[P])²

SM² = (Σ_{P=0}^{N} ModePolar[P])²

corrDen = N·S(M²) − SM².
A3) copy the ModePolar[P] array once; the two arrays joined end to end form the new one-dimensional array ModePolarD[Q], where Q = 0, 1, 2, …, 2N.
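Why step A3) doubles the array: a length-(N+1) window starting at offset i inside ModePolarD is exactly the template circularly shifted by i, so every candidate rotation can later be scanned with plain sliding dot products. A small illustrative check (the array values are arbitrary):

```python
# Windows into the doubled array are circular rotations of the original.
import numpy as np

mode_polar = np.array([10., 20., 30., 40.])
mode_polar_d = np.concatenate([mode_polar, mode_polar])  # ModePolarD
n = len(mode_polar)
for i in range(n):
    window = mode_polar_d[i:i + n]          # length-n window at offset i
    assert np.array_equal(window, np.roll(mode_polar, -i))
```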
In the online stage, the target image is searched for the template pattern; the specific steps are as follows:
B1) Copy the original target image and binarize the copy with an appropriate gray threshold so that it becomes black and white. Because the chip pattern and the chip background in images acquired in IC manufacturing differ considerably in gray level, after binarization most of the chip becomes one color and the background the opposite color, for example the chip mostly black and the background white. Apply connected-domain analysis to the binarized target image using methods such as pixel marking or run-length connectivity analysis, and compute the bounding box of the chip image region, where the bounding box is the smallest rectangle containing the target region (here, the chip pattern), generally aligned with the coordinate axes. Suppose the pixel coordinates of the upper-left corner of the computed bounding box relative to the target-image origin are (X_B, Y_B), the width of the bounding box is W_B and its height H_B; then the upper-left corner of the search area in the original target image, relative to the target-image origin, is at X_S0 = X_B − Offset, Y_S0 = Y_B − Offset, and the width and height of the search area are W_S = W_B + 2×Offset and H_S = H_B + 2×Offset, where Offset is an integer in the range of 5 to 10 pixels.
Crop from the original target image the image inside the search area, called the search image; its width is W_S and height H_S, the search-image X coordinate is denoted X_S and the Y coordinate Y_S, and MaxCcorr denotes the maximum cross-correlation value between the search image and the template image. Normalized cross-correlation grayscale matching is performed in the search image; as shown in Fig. 3, the matching procedure is as follows:
B2) initialize parameters: X_S = 0, Y_S = 0, MaxCcorr = 0;
B3) test whether Y_S is less than H_S; if not, finish, otherwise go to B4);
B4) test whether X_S is less than W_S; if not, set X_S = 0, Y_S = Y_S + 1 and return to step B3); otherwise go to step B5);
B5) perform a polar coordinate transform on the image region of the search image with the same size as the template; the image in this region is called the matching-area image. Its width and height equal those of the template image, W_M and H_M. The grayscale value of a point in the matching area is Image[Y][X], where Y_S ≤ Y < Y_S + H_M and X_S ≤ X < X_S + W_M; the center of the matching area is Y_CI = Y_S + (H_M−1)/2, X_CI = X_S + (W_M−1)/2. Taking the matching-area center as the origin and the minimum of W_M/2 and H_M/2 as the radius R_M, perform a polar coordinate transform; the transformed image is represented as a one-dimensional array of pixel grayscale values ImagePolar[P] = Image[Y][X], with Y = FLOOR(Y_CI − ρ·sinθ_I), X = FLOOR(X_CI + ρ·cosθ_I), where FLOOR denotes the largest integer not greater than its argument, ρ = 1, 2, …, R_M, θ_I = 0, Δθ, 2Δθ, …, 2π, and Δθ is the angular matching precision, identical to that of the template image; for example, if the required matching precision is 1 degree, Δθ = π/180. P = 0, 1, 2, …, N, with N = CEIL((2π/Δθ + 1)·R_M − 1), where CEIL denotes the smallest integer not less than its argument. As shown in Fig. 4, the assignment procedure for ImagePolar[P] is:
(B51) initialize parameters: ρ = 1, θ_I = 0, P = 0, Y_CI = Y_S + (H_M−1)/2, X_CI = X_S + (W_M−1)/2;
(B52) compute: Y = FLOOR(Y_CI − ρ·sinθ_I), X = FLOOR(X_CI + ρ·cosθ_I);
(B53) polar-transform the matching-area data: ImagePolar[P] = Image[Y][X];
(B54) ρ = ρ+1, P = P+1;
(B55) test whether ρ is greater than R_M; if so, go to step (B56), otherwise return to step (B52);
(B56) ρ = 0, θ_I = θ_I + Δθ;
(B57) test whether θ_I is greater than 2π; if so, end B5) and go to B6); otherwise return to step (B52).
B6) compute the normalized cross-correlation parameters:

SI = Σ_{P=0}^{N} ImagePolar[P]

S(I²) = Σ_{P=0}^{N} (ImagePolar[P])²

SI² = (Σ_{P=0}^{N} ImagePolar[P])²

B7) set the integer i = 0;
B8) test whether i is greater than N; if so, set X_S = X_S + 1 and return to B4), otherwise go to B9);
B9) compute the normalized cross-correlation value Ccorr_i of the template image and the matching area:

S_i(IM) = Σ_{P=i}^{N+i} ImagePolar[P−i]·ModePolarD[P]

Ccorr_i = (N·S_i(IM) − SI·SM) / (√|N·S(I²) − SI²| · √|N·S(M²) − SM²|)

B10) test whether Ccorr_i is greater than MaxCcorr; if not, go to B11); if so, assign Ccorr_i to MaxCcorr, record the current coordinates (X_T, Y_T) of the matching-area center, where X_T = X_S + (W_M−1)/2 and Y_T = Y_S + (H_M−1)/2, and record the angle θ_T = Δθ·FLOOR((P+R_M)/R_M);
B11) i = i + 1; return to step B8).
After steps B1)–B11), once Y_S in step B3) is no longer less than H_S, the computation ends. The position (X_F, Y_F) and angle θ_F of the template image within the target image are then obtained, where X_F = X_T + X_S0, Y_F = Y_T + Y_S0 and θ_F = θ_T, with X_T, Y_T and θ_T being the values last saved before steps B1)–B11) finished.
Those skilled in the art will readily understand that the foregoing is merely a preferred embodiment of the present invention and is not intended to limit it; any modifications, equivalent replacements and improvements made within the spirit and principles of the present invention shall be included within its scope of protection.

Claims (8)

1.一种快速互相关灰度图像匹配方法,其特征在于,用于在灰度目标图像中查找目标图案的位置与方向,包括:1. A fast cross-correlation grayscale image matching method is characterized in that, it is used to find the position and direction of the target pattern in the grayscale target image, including: (1)对模板图像进行极坐标转换,并将转换后的模板图像数据转换为一维数组,将其定义为模板图像一维数组;(1) Perform polar coordinate conversion on the template image, and convert the converted template image data into a one-dimensional array, which is defined as a template image one-dimensional array; (2)获取灰度目标图像中目标图案的外接盒,将其定义为目标图案搜索区,其中所述外接盒是指包含目标图案的最小长方形;(2) Obtain the bounding box of the target pattern in the grayscale target image, and define it as the target pattern search area, wherein the bounding box refers to the smallest rectangle containing the target pattern; (3)在目标图案搜索区中选取与模板图像同样大小的匹配区域,对所述匹配区域进行极坐标转换,并将转换后的匹配区域图像数据也转换为一维数组,将其定义为匹配区域一维数组,计算所述匹配区域一维数据与所述模板图像一维数组的相关性;(3) Select a matching area of the same size as the template image in the target pattern search area, perform polar coordinate conversion on the matching area, and convert the converted matching area image data into a one-dimensional array, which is defined as matching A one-dimensional array of regions, calculating the correlation between the one-dimensional data of the matching region and the one-dimensional array of the template image; (4)重复步骤(3),直至对所有的匹配区域均完成相关性计算,选取相关性最大的匹配区域作为目标图案,获取所述目标图案的位置与方向。(4) Repeat step (3) until the correlation calculation is completed for all matching regions, select the matching region with the highest correlation as the target pattern, and obtain the position and direction of the target pattern. 
2.如权利要求1所述的方法,其特征在于,所述步骤(1)具体包括:设模板图像的宽度为WM,模板图像的高度为HM,模板图像某点的像素灰度值为Mode[Y][X],其中0≤Y<HM,0≤X<WM,则模板图像的中心点坐标为YCM=(HM-1)/2,XCM=(WM-1)/2,以模板图像中心为圆心,以模板WM/2和HM/2的最小值为半径RM,进行极坐标转换,转换后的图像表示为像素灰度值的一维数组:ModePolar[P]=Mode[Y][X],其中Y=FLOOR(YCM-ρsinθM),X=FLOOR(XCM+ρcosθM),FLOOR符号代表取比括号中的数小的最大整数,ρ=1,2…RMM=0,Δθ,2Δθ…2π,Δθ为角度匹配的精度,N=CEIL((2π/Δθ+1)·RM-1),CEIL代表取比括号中的数大的最小整数,将ModePolar[P]数组复制1份,2个数组首尾相连组成新的一维数组ModePolar D[Q],其中Q=0,1,2…2N,并计算归一化互相关参数:2. The method according to claim 1, characterized in that, the step (1) specifically comprises: setting the width of the template image as W M , the height of the template image as H M , and the pixel gray value of a certain point in the template image is Mode[Y][X], where 0≤Y<H M , 0≤X<W M , then the coordinates of the center point of the template image are Y CM =(H M -1)/2, X CM =(W M -1)/2, take the center of the template image as the center, and take the minimum value of the template W M /2 and H M /2 as the radius R M to perform polar coordinate conversion, and the converted image is expressed as a one-dimensional pixel gray value Array: ModePolar[P]=Mode[Y][X], where Y=FLOOR(Y CM -ρsinθ M ), X=FLOOR(X CM +ρcosθ M ), the FLOOR symbol represents the largest Integer, ρ=1,2… RM , θ M =0,Δθ,2Δθ…2π, Δθ is the accuracy of angle matching, N=CEIL((2π/Δθ+1) RM -1), CEIL stands for The smallest integer larger than the number in the brackets, copy the ModePolar[P] array 1, connect the two arrays end to end to form a new one-dimensional array ModePolar D[Q], where Q=0,1,2...2N, and calculate Normalized cross-correlation parameters: SMSM == &Sigma;&Sigma; PP == 00 PP == NN ModePolarModePolar [[ PP ]] SS (( Mm 22 )) == &Sigma;&Sigma; PP == 00 PP == NN (( ModePolarModePolar [[ PP ]] )) 22 SMSM 22 == (( &Sigma;&Sigma; PP == 00 PP == NN ModePolarModePolar [[ PP ]] )) 22 corrDencorrDen == NN &CenterDot;&Center Dot; SS (( Mm 22 )) -- SMSM 22 .. 3.如权利要求2所述的方法,其特征在于,所述ModePolar[P]的赋值过程为:3. 
The method of claim 2, wherein the assignment of ModePolar[P] proceeds as follows:
(A11) initialize the parameters: ρ = 1, θ_M = 0, P = 0, Y_C = (H_M - 1)/2, X_C = (W_M - 1)/2;
(A12) compute the parameters: Y = FLOOR(Y_CM - ρ·sinθ_M), X = FLOOR(X_CM + ρ·cosθ_M);
(A13) transform the template-image data to polar coordinates: ModePolar[P] = Mode[Y][X];
(A14) ρ = ρ + 1, P = P + 1;
(A15) determine whether ρ is greater than R_M; if so, go to step (A16), otherwise return to step (A12);
(A16) ρ = 0, θ_M = θ_M + Δθ;
(A17) determine whether θ_M is greater than 2π; if so, end, otherwise return to step (A12).
4. The method of claim 1, wherein step (2) specifically comprises:
setting an appropriate gray threshold to binarize the grayscale target image so that it becomes black and white, performing connected-component analysis on the binarized image by pixel labeling or run-length connectivity analysis, and computing the bounding box of the target pattern, the bounding box being the smallest rectangle containing the target pattern;
letting the pixel coordinates of the upper-left corner of the computed bounding box relative to the coordinate origin of the target image be (X_B, Y_B), the width of the bounding box W_B and its height H_B; then
the upper-left corner of the target-pattern search area in the original target image has coordinates X_S0 = X_B - Offset, Y_S0 = Y_B - Offset relative to the coordinate origin of the target image, the width of the target-pattern search area is W_S = W_B + 2×Offset and its height H_S = H_B + 2×Offset, where Offset is an integer in the range of 5 to 10 pixels.
5. The method of claim 4, wherein step (3) specifically comprises:
cropping from the original target image the image inside the target-pattern search area, called the search image; the width of the search image is then W_S and its height H_S, its X coordinate is denoted X_S and its Y coordinate Y_S, and MaxCcorr denotes the maximum cross-correlation value between the search image and the template image; normalized cross-correlation grayscale matching is performed in the search image.
The matching process is as follows:
B1) initialize the parameters: X_S = 0, Y_S = 0, MaxCcorr = 0;
B2) determine whether Y_S is smaller than H_S; if not, end, otherwise go to B3);
B3) determine whether X_S is smaller than W_S; if not, set X_S = 0, Y_S = Y_S + 1 and return to B2), otherwise go to B4);
B4) perform a polar-coordinate transform on the image region of the search image that has the same size as the template; the image of this region is called the matching-region image; its width and height equal those of the template image, W_M and H_M respectively; the pixel gray value at a point of the matching region is Image[Y][X], where Y_S ≤ Y < Y_S + H_M and X_S ≤ X < X_S + W_M, so the center point of the matching region is Y_CI = Y_S + (H_M - 1)/2, X_CI = X_S + (W_M - 1)/2; with the matching-region center as the pole and the minimum of W_M/2 and H_M/2 as the radius R_M, perform the polar-coordinate transform; the transformed image is expressed as a one-dimensional array of pixel gray values ImagePolar[P] = Image[Y][X], Y = FLOOR(Y_CI - ρ·sinθ_I), X = FLOOR(X_CI + ρ·cosθ_I), where FLOOR denotes rounding down to the nearest integer, ρ = 1, 2, …, R_M, θ_I = 0, Δθ, 2Δθ, …, 2π, Δθ is the angular matching precision, P = 0, 1, 2, …, N, N = CEIL((2π/Δθ + 1)·R_M - 1), and CEIL denotes rounding up to the nearest integer;
B5) compute the normalized cross-correlation parameters:

SI = Σ_{P=0}^{N} ImagePolar[P]
S(I²) = Σ_{P=0}^{N} (ImagePolar[P])²
SI² = (Σ_{P=0}^{N} ImagePolar[P])²

B6) initialize the integer i = 0;
B7) determine whether i is greater than N; if so, set X_S = X_S + 1 and return to B3), otherwise go to B8);
B8) compute the normalized cross-correlation value Ccorr_i between the template image and the matching region:

S_i(IM) = Σ_{P=i}^{N+i} (ImagePolar[P - i]·ModePolarD[P])
Ccorr_i = (N·S_i(IM) - SI·SM) / (√|N·S(I²) - SI²| · √|N·S(M²) - SM²|)

B9) determine whether Ccorr_i is greater than MaxCcorr; if not, go to B10); if so, assign Ccorr_i to MaxCcorr and record the coordinates (X_T, Y_T) of the center point of the matching region at that moment, where X_T = X_S + (W_M - 1)/2, Y_T = Y_S + (H_M - 1)/2, and record the angle θ_T = Δθ·FLOOR((P + R_M)/R_M);
B10) i = i + 1, return to B6).
6.
The method of claim 5, wherein the assignment of ImagePolar[P] proceeds as follows:
(B41) initialize the parameters: ρ = 1, θ_I = 0, P = 0, Y_CI = Y_S + (H_M - 1)/2, X_CI = X_S + (W_M - 1)/2;
(B42) compute the parameters: Y = FLOOR(Y_CI - ρ·sinθ_I), X = FLOOR(X_CI + ρ·cosθ_I);
(B43) transform the matching-region data to polar coordinates: ImagePolar[P] = Image[Y][X];
(B44) ρ = ρ + 1, P = P + 1;
(B45) determine whether ρ is greater than R_M; if so, go to step (B46), otherwise return to step (B42);
(B46) ρ = 0, θ_I = θ_I + Δθ;
(B47) determine whether θ_I is greater than 2π; if so, end, otherwise return to step (B42).
7. The method of claim 5, wherein step (4) specifically comprises:
if, in step B2), Y_S is no longer smaller than H_S, the computation ends; once the correlation has been computed for all matching regions, the position (X_F, Y_F) and angle θ_F of the template image in the target image are obtained, where X_F = X_T + X_S0, Y_F = Y_T + Y_S0, θ_F = θ_T, and X_T, Y_T and θ_T are the values last saved before the computation in B1) to B10) ends.
8.
A fast cross-correlation grayscale image matching device for finding the position and orientation of a target pattern in a grayscale target image, the device comprising:
a first module for performing a polar-coordinate transform on the template image and converting the transformed template image data into a one-dimensional array, defined as the template-image one-dimensional array;
a second module for obtaining the bounding box of the target pattern in the grayscale target image and defining it as the target-pattern search area;
a third module for selecting, within the target-pattern search area, a matching region of the same size as the template image, performing a polar-coordinate transform on the matching region, converting the transformed matching-region image data into a one-dimensional array, defined as the matching-region one-dimensional array, and computing the correlation between the matching-region one-dimensional array and the template-image one-dimensional array; and
a fourth module for, once the third module has computed the correlation for all matching regions, selecting the matching region with the largest correlation as the target pattern and obtaining the position and orientation of the target pattern.
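Under the claims above, the rotation search reduces to scoring circular shifts of the two polar 1-D arrays, with the doubled array playing the role of ModePolarD. A rough NumPy sketch of the shift-and-correlate loop of steps B5) to B10) follows; variable names are illustrative, the sums run over the full array length rather than the claims' index N, and non-constant arrays (nonzero denominator) are assumed.

```python
import numpy as np

def circular_ncc(image_polar, mode_polar):
    """Score every circular shift of the template's polar array against the
    matching region's polar array, as in steps B5)-B10); the doubled array
    plays the role of ModePolarD, so shift i is the slice [i : i + n]."""
    n = len(mode_polar)
    mode_d = np.concatenate([mode_polar, mode_polar])   # ModePolarD
    si = image_polar.sum()                              # SI
    s_i2 = (image_polar ** 2).sum()                     # S(I^2)
    sm = mode_polar.sum()                               # SM
    s_m2 = (mode_polar ** 2).sum()                      # S(M^2)
    # denominator of the normalized cross-correlation, fixed over all shifts
    den = np.sqrt(abs(n * s_i2 - si ** 2)) * np.sqrt(abs(n * s_m2 - sm ** 2))
    best_i, best_c = 0, -np.inf
    for i in range(n):
        s_im = float(np.dot(image_polar, mode_d[i:i + n]))  # S_i(IM)
        c = (n * s_im - si * sm) / den                      # Ccorr_i
        if c > best_c:
            best_i, best_c = i, c
    return best_i, best_c
```

The winning shift index maps back to a rotation angle through Δθ, as in step B9); sliding (X_S, Y_S) over the search area and keeping the global maximum MaxCcorr completes the method. Because the denominator is shift-invariant, it is computed once outside the loop.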
CN201310331842.0A 2013-08-01 2013-08-01 Fast cross-correlation grayscale image matching method and device Active CN103593838B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201310331842.0A CN103593838B (en) Fast cross-correlation grayscale image matching method and device

Publications (2)

Publication Number Publication Date
CN103593838A true CN103593838A (en) 2014-02-19
CN103593838B CN103593838B (en) 2016-04-13

Family

ID=50083963

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201310331842.0A Active CN103593838B (en) 2013-08-01 2013-08-01 Fast cross-correlation grayscale image matching method and device

Country Status (1)

Country Link
CN (1) CN103593838B (en)


Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2006132046A1 (en) * 2005-06-07 2006-12-14 National Institute Of Advanced Industrial Science And Technology Three-dimensional shape aligning method and program
CN101859384A (en) * 2010-06-12 2010-10-13 北京航空航天大学 Target Image Sequence Metrics
CN101950419A (en) * 2010-08-26 2011-01-19 西安理工大学 Quick image rectification method in presence of translation and rotation at same time
CN103020945A (en) * 2011-09-21 2013-04-03 中国科学院电子学研究所 Remote sensing image registration method of multi-source sensor


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
LU Rong et al.: "A fast algorithm for extracting the minimum bounding rectangle of a target image", Computer Engineering *
ZHANG Wei: "Research on automatic recognition algorithms and systems for ground target images", China Masters' Theses Full-text Database, Information Science and Technology *

Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104081435A (en) * 2014-04-29 2014-10-01 中国科学院自动化研究所 Image matching method based on cascading binary encoding
CN106899864A (en) * 2015-12-18 2017-06-27 北京国双科技有限公司 Commercial detection method and device
CN105740899A (en) * 2016-01-29 2016-07-06 长安大学 Machine vision image characteristic point detection and matching combination optimization method
CN105740899B (en) * 2016-01-29 2019-08-23 长安大学 Machine vision image feature point detection and matching combined optimization method
CN105843972A (en) * 2016-06-13 2016-08-10 北京京东尚科信息技术有限公司 Method and device for comparing product attribute information
CN105843972B (en) * 2016-06-13 2020-05-01 北京京东尚科信息技术有限公司 Product attribute information comparison method and device
CN109348731A (en) * 2016-10-14 2019-02-15 深圳配天智能技术研究院有限公司 A kind of method and device of images match
CN109348731B (en) * 2016-10-14 2022-05-17 深圳配天智能技术研究院有限公司 Image matching method and device
CN106898017A (en) * 2017-02-27 2017-06-27 网易(杭州)网络有限公司 Method, device and terminal device for recognizing image local area
CN106898017B (en) * 2017-02-27 2019-05-31 网易(杭州)网络有限公司 Method, apparatus and terminal device for recognizing an image local area
CN110603535A (en) * 2019-07-29 2019-12-20 香港应用科技研究院有限公司 Iterative multi-directional image search supporting large template matching
CN112215304A (en) * 2020-11-05 2021-01-12 珠海大横琴科技发展有限公司 Gray level image matching method and device for geographic image splicing

Also Published As

Publication number Publication date
CN103593838B (en) 2016-04-13

Similar Documents

Publication Publication Date Title
CN103593838A (en) Rapid cross-correlation grey-scale image coupling method and rapid cross-correlation grey-scale image coupling device
CN105427298B (en) Remote sensing image registration method based on anisotropic gradient metric space
CN103295239B (en) A kind of autoegistration method of the laser point cloud data based on datum plane image
CN104007444B (en) Ground laser radar reflection intensity image generation method based on central projection
Lysenkov et al. Pose estimation of rigid transparent objects in transparent clutter
CN105139416A (en) Object identification method based on image information and depth information
CN103679741B (en) Method for automatically registering cloud data of laser dots based on three-dimensional line characters
CN105095822B (en) A kind of Chinese letter co pattern image detection method and system
CN105046271A (en) MELF (Metal Electrode Leadless Face) component positioning and detecting method based on match template
US20130058526A1 (en) Device for automated detection of feature for calibration and method thereof
Zhang et al. A region-based normalized cross correlation algorithm for the vision-based positioning of elongated IC chips
CN104766309A (en) Plane feature point navigation and positioning method and device
CN103697815A (en) Method for acquiring three-dimensional information of frequency mixing structured light based on phase encoding
KR102721010B1 (en) Method and device for generating point cloud histogram
CN101246595A (en) Multi-view point cloud data combination method in optical 3D scanning system
TW201415010A (en) Inspection device, inspection method, and inspection program
CN109711457A (en) A Fast Image Matching Method Based on Improved HU Invariant Moment and Its Application
CN107748855A (en) A kind of detection method of Quick Response Code view finding figure
CN103196514A (en) Image-based micro-chemical process liquid level detecting method
Xie et al. A4lidartag: Depth-based fiducial marker for extrinsic calibration of solid-state lidar and camera
CN106815830A (en) The defect inspection method of image
Gao et al. Text spotting for curved metal surface: Clustering, fitting, and rectifying
Huang et al. Multimodal image matching using self similarity
CN102663756A (en) Registration method of special shaped elements and high-density packing components in printed circuit board
Wang et al. Visual positioning of rectangular lead components based on Harris corners and Zernike moments

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
C41 Transfer of patent application or patent right or utility model
TR01 Transfer of patent right

Effective date of registration: 20161125

Address after: 430074 Hubei Province, Wuhan city Hongshan District Luoyu Road No. 1037

Patentee after: Huazhong University of Science and Technology

Patentee after: GUANGDONG HUST INDUSTRIAL TECHNOLOGY RESEARCH INSTITUTE

Address before: 430074 Hubei Province, Wuhan city Hongshan District Luoyu Road No. 1037

Patentee before: Huazhong University of Science and Technology