CN1896682A - X-shaped angular-point sub-pixel extraction - Google Patents
X-shaped angular-point sub-pixel extraction
- Publication number
- CN1896682A (application CNA200510082766XA)
- Authority
- CN
- China
- Prior art keywords
- point
- angle point
- sub
- location
- pixels
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Landscapes
- Image Analysis (AREA)
Abstract
The invention relates to an improved method for extracting the sub-pixel location of X-shaped corner points. The procedure is: first, determine the pixel location of the X-shaped corner point from the Hessian matrix; then describe the grayscale values near the corner with a second-order Taylor expansion, whose saddle point gives the sub-pixel location. The method is simple and fast, and it retains high extraction accuracy even when the image is distorted or heavily noisy.
Description
Technical field
The invention belongs to the field of calibration techniques for 3D vision inspection systems and relates to an improved method for extracting the sub-pixel location of X-shaped corner points.
Background art
The extraction of calibration points is an important step in calibrating a 3D vision inspection system. To guarantee the precision and stability of the calibration result, the calibration points are usually required to be located with sub-pixel accuracy. X-shaped corner points (see Fig. 1) are used very widely in the calibration of vision inspection systems because they are easy to recognize, simple to extract, and can be located with high precision. The traditional sub-pixel extraction methods mainly include:
(1) Harris corner extraction.
The Harris operator is a commonly used feature-point extraction operator; it judges whether a pixel is a feature point mainly from the correlation of the grayscale variation in the neighborhood of that point. Harris-based sub-pixel extraction of X-shaped corners usually first determines the pixel location of the corner with the Harris operator, then interpolates the grayscale values at sub-pixel positions in this neighborhood, fits a grayscale distribution surface to the local image, and finally takes the saddle point of the fitted surface as the sub-pixel location of the X-shaped corner. The advantage of this method is its high extraction accuracy; the drawback is that the computation is complex and the extraction procedure is cumbersome.
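A minimal Python sketch of the Harris response computation described above, assuming the common formulation R = det(M) - k·trace(M)² with Gaussian smoothing; the function name and the σ and k values are illustrative, and the subsequent interpolation and surface-fitting stages are omitted:

```python
# Sketch of the conventional Harris response (background method 1).
# Assumes the common formulation R = det(M) - k * trace(M)^2.
import numpy as np
from scipy.ndimage import gaussian_filter

def harris_response(image, sigma=1.0, k=0.04):
    """Harris corner response of a float grayscale image (illustrative parameters)."""
    ix = gaussian_filter(image, sigma, order=(0, 1))  # d/dx (x = columns)
    iy = gaussian_filter(image, sigma, order=(1, 0))  # d/dy (y = rows)
    # Entries of the local autocorrelation matrix M, smoothed by a Gaussian window.
    ixx = gaussian_filter(ix * ix, sigma)
    iyy = gaussian_filter(iy * iy, sigma)
    ixy = gaussian_filter(ix * iy, sigma)
    return ixx * iyy - ixy * ixy - k * (ixx + iyy) ** 2
```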
(2) Edge fitting and intersection.
This method uses an edge operator such as Sobel to extract the edge points of the black and white blocks in the neighborhood of the corner, fits a straight line to each of the two edges, and takes the intersection of the two fitted lines as the sub-pixel location of the X-shaped corner. The advantage of this method is that the extraction procedure is simple; the drawback is that the extraction accuracy is low when the image is distorted or noisy.
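A minimal Python sketch of the edge-fitting idea, assuming the Sobel edge points near the corner have already been separated into the two straight edges; np.polyfit performs the straight-line fits and the intersection is solved in closed form (near-vertical edges would need a different line parameterization):

```python
# Sketch of the edge-fitting / intersection method (background method 2).
# edge1_xy, edge2_xy: N x 2 arrays of (x, y) edge points, one array per edge.
import numpy as np

def intersect_fitted_lines(edge1_xy, edge2_xy):
    """Fit y = a*x + b to each edge and return the intersection point (x, y)."""
    a1, b1 = np.polyfit(edge1_xy[:, 0], edge1_xy[:, 1], 1)
    a2, b2 = np.polyfit(edge2_xy[:, 0], edge2_xy[:, 1], 1)
    x = (b2 - b1) / (a1 - a2)  # solve a1*x + b1 = a2*x + b2
    y = a1 * x + b1
    return x, y
```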
Summary of the invention
The objective of the invention is to overcome the deficiencies of the existing methods by providing a sub-pixel extraction method for X-shaped corner points whose extraction procedure is simple and whose accuracy is high.
The technical scheme of the invention is an X-shaped corner sub-pixel extraction method, characterized in that the extraction steps are as follows:
1. Determine the pixel location of the X-shaped corner from the Hessian matrix. The Hessian matrix is expressed by formula (1):

    H = | f_xx  f_xy |
        | f_xy  f_yy |                                  (1)

where f_xx, f_xy, f_yy are the second-order partial derivatives of the image grayscale with respect to x and y, obtained by convolving the image grayscale with the corresponding differential forms of the Gaussian operator. For an X-shaped corner, one of the two eigenvalues of the Hessian matrix is positive and the other is negative, so the operator S used to determine its pixel location is expressed by formula (2):

    S = λ1 · λ2                                         (2)

where λ1, λ2 are the eigenvalues of the Hessian matrix;
2. Determine the sub-pixel location of the X-shaped corner. Once the pixel location of the corner has been determined, the grayscale value at any point in its neighborhood is described by the second-order Taylor expansion of formula (3):

    r(s, t) = r_0 + r_x·s + r_y·t + (1/2)·(r_xx·s² + 2·r_xy·s·t + r_yy·t²)        (3)

where (x_0, y_0) is the pixel location of the corner, r_0 is the grayscale value at that point, r_x, r_y, r_xx, r_xy, r_yy are the first- and second-order partial derivatives of the grayscale with respect to x and y at that point, and (s, t) is the sub-pixel offset of the corner relative to (x_0, y_0). After low-pass filtering, the grayscale distribution surface in the neighborhood of the corner is a smooth saddle surface whose saddle point is the sub-pixel location of the X-shaped corner. From the properties of the saddle point, a system of linear equations for the saddle-point position is obtained on the basis of formula (3):

    r_x + r_xx·s + r_xy·t = 0
    r_y + r_xy·s + r_yy·t = 0                           (4)

The sub-pixel location of the X-shaped corner is then (x_0 + s, y_0 + t), where

    s = (r_xy·r_y - r_yy·r_x) / (r_xx·r_yy - r_xy²)
    t = (r_xy·r_x - r_xx·r_y) / (r_xx·r_yy - r_xy²)      (5)
The advantages of the invention are:
(1) Compared with the traditional Harris-based corner extraction algorithm, the invention requires neither interpolation nor surface fitting, so the extraction procedure is simple and fast.
(2) Compared with the edge-fitting and intersection method, the invention uses only the image grayscale in the immediate neighborhood of the corner, so it retains high extraction accuracy even when the image is distorted or heavily noisy.
Description of drawings
Fig. 1 is a schematic diagram of a target with X-shaped corner points.
Fig. 2 is a schematic diagram of the distribution of the S value in the neighborhood of an X-shaped corner.
Fig. 3 is a schematic diagram of the grayscale distribution in the neighborhood of an X-shaped corner.
Fig. 4 is a virtual planar target generated by computer.
Fig. 5 is the image of the target of Fig. 4 after imaging by a virtual camera.
Detailed description of the embodiments
The invention is described below in further detail. The invention first determines the pixel location of the X-shaped corner from the eigenvalues of the Hessian matrix, and then describes the distribution surface of the image grayscale in the neighborhood of that corner with a second-order Taylor expansion; the saddle point of this surface is the sub-pixel location of the corner being sought. The concrete extraction steps are as follows:
(1) Determine the pixel location of the X-shaped corner from the Hessian matrix
The Hessian matrix is expressed by formula (1):

    H = | f_xx  f_xy |
        | f_xy  f_yy |                                  (1)

where f_xx, f_xy, f_yy are the second-order partial derivatives of the image grayscale with respect to x and y, obtained by convolving the image grayscale with the corresponding differential forms of the Gaussian operator. For an X-shaped corner, one of the two eigenvalues of the Hessian matrix is positive and the other is negative, so the operator used to determine its pixel location is expressed by formula (2):

    S = λ1 · λ2                                         (2)

where λ1, λ2 are the eigenvalues of the Hessian matrix. Fig. 2 shows the distribution of the S value in the neighborhood of an ideal corner; its negative extreme point is the corner position.
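A minimal Python sketch of step (1), under the assumption that the operator of formula (2) is the product of the two Hessian eigenvalues (i.e. the determinant f_xx·f_yy - f_xy²); the Hessian entries are obtained by derivative-of-Gaussian convolution, and the candidate corners are taken as negative local minima of S. The σ, window size, and threshold are illustrative:

```python
# Sketch of step (1): pixel-level X-corner detection from the Hessian.
# Assumes S = lambda1 * lambda2 = f_xx * f_yy - f_xy^2 (reconstruction of formula (2)).
import numpy as np
from scipy.ndimage import gaussian_filter, minimum_filter

def x_corner_pixels(image, sigma=2.0, s_threshold=-1e-4):
    """Return (rows, cols) of candidate X-corner pixels in a float grayscale image."""
    f_xx = gaussian_filter(image, sigma, order=(0, 2))  # d2/dx2 (x = columns)
    f_yy = gaussian_filter(image, sigma, order=(2, 0))  # d2/dy2 (y = rows)
    f_xy = gaussian_filter(image, sigma, order=(1, 1))  # d2/(dx dy)
    s = f_xx * f_yy - f_xy * f_xy  # product of the two Hessian eigenvalues
    # An X-corner has one positive and one negative eigenvalue, so S < 0 there;
    # keep strongly negative local minima of S (threshold is illustrative).
    local_min = (s == minimum_filter(s, size=5))
    return np.nonzero(local_min & (s < s_threshold))
```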
(2) Determine the sub-pixel location of the X-shaped corner
Once the pixel location of the corner has been determined, the grayscale value at any point in its neighborhood can be described by the second-order Taylor expansion of formula (3):

    r(s, t) = r_0 + r_x·s + r_y·t + (1/2)·(r_xx·s² + 2·r_xy·s·t + r_yy·t²)        (3)

where (x_0, y_0) is the pixel location of the corner, r_0 is the grayscale value at that point, r_x, r_y, r_xx, r_xy, r_yy are the first- and second-order partial derivatives of the grayscale with respect to x and y at that point, and (s, t) is the sub-pixel offset of the corner relative to (x_0, y_0).
Fig. 3 shows the distribution surface of the image grayscale in the neighborhood of the corner after low-pass filtering; this surface is a smooth saddle surface whose saddle point is the sub-pixel location of the X-shaped corner. From the properties of the saddle point, the system of linear equations for the saddle-point position is obtained on the basis of formula (3):

    r_x + r_xx·s + r_xy·t = 0
    r_y + r_xy·s + r_yy·t = 0                           (4)

The sub-pixel location of the X-shaped corner is then (x_0 + s, y_0 + t), where

    s = (r_xy·r_y - r_yy·r_x) / (r_xx·r_yy - r_xy²)
    t = (r_xy·r_x - r_xx·r_y) / (r_xx·r_yy - r_xy²)      (5)
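A minimal Python sketch of step (2), assuming the derivatives r_x, r_y, r_xx, r_xy, r_yy at the candidate pixel are already available (for example from the derivative-of-Gaussian images used in step (1)); the saddle point of the Taylor model of formula (3) is found by solving the linear system of formula (4), which yields exactly the closed form of formula (5):

```python
# Sketch of step (2): sub-pixel offset (s, t) from the saddle point of the Taylor model.
import numpy as np

def subpixel_offset(r_x, r_y, r_xx, r_xy, r_yy):
    """Return (s, t), the saddle-point offset relative to the pixel location (x0, y0)."""
    # Zero gradient of the Taylor model (formula (4)):
    #   r_x + r_xx*s + r_xy*t = 0
    #   r_y + r_xy*s + r_yy*t = 0
    hessian = np.array([[r_xx, r_xy], [r_xy, r_yy]], dtype=float)
    gradient = np.array([r_x, r_y], dtype=float)
    s, t = np.linalg.solve(hessian, -gradient)
    return s, t
```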
Example
The following is a group of simulation examples. Fig. 4 is a virtual planar target generated by computer; the target carries 144 X-shaped corner points with a spacing of 16 mm between adjacent points. The image resolution of the virtual camera is set to 512 × 512; its internal parameters are set to α = 1000, β = 1000, γ = 0, u_0 = 256, v_0 = 256; its external parameters are set to r_1 = [0.951 -0.174 0.255]^T, r_2 = [0.168 0.985 0.045]^T, t = [0 0 500]^T. Fig. 5 is the target image after imaging by the virtual camera. The concrete extraction process is as follows (a combined sketch is given after step (3)):
(1) At each image point, compute the partial derivatives r_x, r_y, r_xx, r_xy, r_yy of the grayscale with respect to x and y by convolving the image with the differential forms of the Gaussian operator;
(2) Compute the Hessian matrix and its eigenvalues at each image point, and judge from formula (2) whether the point is the pixel location of an X-shaped corner;
(3) For each pixel satisfying formula (2), compute the sub-pixel location of the X-shaped corner from formula (5).
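Putting the three steps together, a hedged end-to-end Python sketch that reuses the two illustrative helpers above (x_corner_pixels and subpixel_offset, names not taken from the patent); it returns the sub-pixel coordinates (x_0 + s, y_0 + t) of all detected X-corners in a grayscale image:

```python
# Combined sketch of steps (1)-(3); helper names are illustrative, not from the patent.
import numpy as np
from scipy.ndimage import gaussian_filter

def extract_x_corners(image, sigma=2.0):
    """Return an array of (x0 + s, y0 + t) sub-pixel X-corner locations."""
    image = image.astype(float)
    # Step (1): derivative-of-Gaussian images of the grayscale.
    r_x = gaussian_filter(image, sigma, order=(0, 1))
    r_y = gaussian_filter(image, sigma, order=(1, 0))
    r_xx = gaussian_filter(image, sigma, order=(0, 2))
    r_yy = gaussian_filter(image, sigma, order=(2, 0))
    r_xy = gaussian_filter(image, sigma, order=(1, 1))
    # Step (2): candidate corner pixels from the sign/extremum test on S.
    rows, cols = x_corner_pixels(image, sigma)
    # Step (3): sub-pixel refinement by the saddle point of the Taylor model.
    corners = []
    for y0, x0 in zip(rows, cols):
        s, t = subpixel_offset(r_x[y0, x0], r_y[y0, x0],
                               r_xx[y0, x0], r_xy[y0, x0], r_yy[y0, x0])
        corners.append((x0 + s, y0 + t))
    return np.asarray(corners)
```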
Table 1 compares the extraction results of the traditional Harris corner extraction method with those of the new method; the extraction accuracy of the invention is slightly better than that of the traditional method.
Table 1  Extraction accuracy comparison of the two methods

Noise level | Extraction accuracy (traditional method) | Extraction accuracy (new method)
---|---|---
0 | 0.0088 | 0.0086
0.04 | 0.0331 | 0.0327
0.08 | 0.0644 | 0.0638
0.12 | 0.0958 | 0.0949
0.16 | 0.1279 | 0.1268
0.20 | 0.1598 | 0.1585
Table 2 compares the time the two methods need to extract the 144 corner points; the extraction speed of the new method is clearly higher than that of the traditional method.
Table 2  Extraction speed comparison of the two methods

Method | Extraction time (seconds / 144 corner points)
---|---
Traditional method | 3.17
New method | 1.15
Claims (1)
1. An X-shaped corner sub-pixel extraction method, characterized in that the extraction steps are as follows:
1.1. Determine the pixel location of the X-shaped corner from the Hessian matrix. The Hessian matrix is expressed by formula (1):

    H = | f_xx  f_xy |
        | f_xy  f_yy |                                  (1)

where f_xx, f_xy, f_yy are the second-order partial derivatives of the image grayscale with respect to x and y, obtained by convolving the image grayscale with the corresponding differential forms of the Gaussian operator. For an X-shaped corner, one of the two eigenvalues of the Hessian matrix is positive and the other is negative, so the operator S used to determine its pixel location is expressed by formula (2):

    S = λ1 · λ2                                         (2)

where λ1, λ2 are the eigenvalues of the Hessian matrix;
1.2. Determine the sub-pixel location of the X-shaped corner. Once the pixel location of the corner has been determined, the grayscale value at any point in its neighborhood is described by the second-order Taylor expansion of formula (3):

    r(s, t) = r_0 + r_x·s + r_y·t + (1/2)·(r_xx·s² + 2·r_xy·s·t + r_yy·t²)        (3)

where (x_0, y_0) is the pixel location of the corner, r_0 is the grayscale value at that point, r_x, r_y, r_xx, r_xy, r_yy are the first- and second-order partial derivatives of the grayscale with respect to x and y at that point, and (s, t) is the sub-pixel offset of the corner relative to (x_0, y_0). After low-pass filtering, the grayscale distribution surface in the neighborhood of the corner is a smooth saddle surface whose saddle point is the sub-pixel location of the X-shaped corner. From the properties of the saddle point, a system of linear equations for the saddle-point position is obtained on the basis of formula (3):

    r_x + r_xx·s + r_xy·t = 0
    r_y + r_xy·s + r_yy·t = 0                           (4)

The sub-pixel location of the X-shaped corner is then (x_0 + s, y_0 + t), where

    s = (r_xy·r_y - r_yy·r_x) / (r_xx·r_yy - r_xy²)
    t = (r_xy·r_x - r_xx·r_y) / (r_xx·r_yy - r_xy²)      (5)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CNA200510082766XA CN1896682A (en) | 2005-07-12 | 2005-07-12 | X-shaped angular-point sub-pixel extraction |
Publications (1)
Publication Number | Publication Date |
---|---|
CN1896682A true CN1896682A (en) | 2007-01-17 |
Family
ID=37609260
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102095370A (en) * | 2010-11-22 | 2011-06-15 | 北京航空航天大学 | Detection identification method for three-X combined mark |
CN103824275B (en) * | 2012-10-31 | 2018-02-06 | 康耐视公司 | Saddle dots structure and the system and method for determining its information are searched in the picture |
CN103345755A (en) * | 2013-07-11 | 2013-10-09 | 北京理工大学 | Chessboard angular point sub-pixel extraction method based on Harris operator |
CN108428250A (en) * | 2018-01-26 | 2018-08-21 | 山东大学 | A kind of X angular-point detection methods applied to vision positioning and calibration |
CN108428250B (en) * | 2018-01-26 | 2021-09-21 | 山东大学 | X-corner detection method applied to visual positioning and calibration |
CN111833405A (en) * | 2020-07-27 | 2020-10-27 | 北京大华旺达科技有限公司 | Calibration identification method and device based on machine vision |
CN111833405B (en) * | 2020-07-27 | 2023-12-08 | 北京大华旺达科技有限公司 | Calibration and identification method and device based on machine vision |
Legal Events
Code | Title
---|---
C06 | Publication
PB01 | Publication
C10 | Entry into substantive examination
SE01 | Entry into force of request for substantive examination
C02 | Deemed withdrawal of patent application after publication (patent law 2001)
WD01 | Invention patent application deemed withdrawn after publication