CN108428250A - X-corner detection method applied to visual positioning and calibration - Google Patents

X-corner detection method applied to visual positioning and calibration

Info

Publication number
CN108428250A
Authority
CN
China
Prior art keywords
corner points
pixel
value
point
sample data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201810077053.1A
Other languages
Chinese (zh)
Other versions
CN108428250B (en)
Inventor
赵子健
王芳
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shandong University
Original Assignee
Shandong University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shandong University filed Critical Shandong University
Priority to CN201810077053.1A priority Critical patent/CN108428250B/en
Publication of CN108428250A publication Critical patent/CN108428250A/en
Application granted granted Critical
Publication of CN108428250B publication Critical patent/CN108428250B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/80 - Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/60 - Analysis of geometric attributes
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T2207/10 - Image acquisition modality
    • G06T2207/10016 - Video; Image sequence
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T2207/30 - Subject of image; Context of image processing
    • G06T2207/30204 - Marker
    • G06T2207/30208 - Marker matrix

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Geometry (AREA)
  • Image Analysis (AREA)

Abstract

The present invention relates to an X-corner detection method applied to visual positioning and calibration, comprising: S1: acquire an image and sample it with a ring-shaped (square-annulus) window; S2: preliminarily judge, from the image characteristics of X-corners, whether the sample data contain an X-corner; S3: further judge whether the sample data contain an X-corner, and exclude X-corners that have already been detected; S4: re-acquire the sample data with the X-corner as the window center, judge whether the data satisfy the X-corner symmetry condition and, if so, compute the sub-pixel position of the X-corner by curve fitting and set the X-corner repeat-detection flag; S5: repeat steps S2 to S4 until all X-corners are detected. When sampling the image, the window advances by half its side length at each step, which increases the detection speed without missing any X-corner. Because the decision of whether a sampling window contains an X-corner is based on the image characteristics of X-corners, the robustness of the algorithm is enhanced.

Description

X-corner detection method applied to visual positioning and calibration
Technical field
The present invention relates to an X-corner detection method applied to visual positioning and calibration, and belongs to the technical field of computer vision applications.
Background art
Visual positioning and calibration are important components of 3D computer vision. One of the basic tasks of computer vision is to compute the geometric information of objects in three-dimensional space from the image information obtained by a camera, and thereby reconstruct and recognize those objects. The relationship between the three-dimensional position of a point on the surface of a space object and its corresponding point in the image is determined by the geometric model of camera imaging; the parameters of this geometric model are the camera parameters. In most cases these parameters can only be obtained through experiment and computation, a process referred to as vision calibration. The calibration process determines the geometric and optical parameters of the camera and its orientation relative to the world coordinate system; the visual positioning process then computes the three-dimensional information of an object from two-dimensional image information according to the calibration parameters. The calibration accuracy directly affects the positioning accuracy of computer vision.
The planar calibration method is a commonly used camera calibration method. A checkerboard calibration board of known size and shape is used: a mathematical model is established from the correspondence between the X-corners on the calibration board and the corresponding points in the image captured of it, and the intrinsic and extrinsic camera parameters are calibrated with this model. Because checkerboard calibration boards are economical and simple to make, they are widely used in camera calibration; in addition, optical positioning systems that use X-corner visual markers are also widely applied.
Several methods have already been proposed for X-corner detection; the most common are methods based on Harris corner detection, detection methods based on the Hessian matrix, and methods based on improved SUSAN corner detection. These methods mainly judge the strength of an X-corner through feature computations of various kinds; their computational load is large and they are not well suited to parallel batch processing.
Summary of the invention
In view of the deficiencies of the prior art, the present invention provides an X-corner detection method applied to visual positioning and calibration.
The present invention improves the running speed, interference resistance and accuracy of the corner detection algorithm.
Explanation of terms:
X-corner: the checkerboard used for vision calibration is composed of abruptly alternating black and white regions; the critical points where adjacent black and white squares meet are the X-corners.
The technical scheme of the present invention is as follows:
An X-corner detection method applied to visual positioning and calibration, comprising:
S1: Acquire an image and sample it with a ring-shaped window. The side length of the ring window is set to 2r pixels; the window is square, so one window sample contains 8r-4 pixels in total, where r is less than half of the smallest X-corner side length in the image. All pixels of the ring window are placed in a circular data queue; these pixels form the sample data P. The i-th pixel is denoted P_i and its gray value f_i, i = 1, 2, ..., (8r-4). A minimal sketch of this sampling scheme is given below, after the preferred choice of n.
S2: Based on the image characteristics of X-corners, preliminarily judge whether the sample data contain an X-corner; if the judgment condition is satisfied, compute the sub-pixel position of the X-corner, otherwise go to step S5.
S3: Based on the sub-pixel X-corner position obtained in step S2, further judge whether the sample data contain an X-corner, and exclude X-corners that have already been detected.
S4: Re-acquire the sample data with the X-corner as the ring-window center and judge whether the data satisfy the X-corner symmetry condition; if they do, compute the sub-pixel position of the X-corner by curve fitting and set the X-corner repeat-detection flag.
S5: Move the ring window across the image to obtain new sample data, advancing n pixels each time, n ∈ (1, 2r); repeat steps S2 to S4 until all X-corners have been detected.
Preferably, n = r.
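To make the sampling concrete, the following Python sketch (not part of the patent; NumPy, the function names and the clockwise ordering are assumptions) collects the 8r-4 border pixels of a (2r)x(2r) window into one circular sequence and slides the window over the image with stride n = r.

import numpy as np

def ring_samples(gray, cx, cy, r):
    """Collect the 8r-4 border pixels of the (2r)x(2r) window centered near
    (cx, cy) into one circular sample sequence (clockwise order assumed)."""
    x0, y0 = cx - r, cy - r                            # top-left corner of the window
    top    = [gray[y0, x0 + i] for i in range(2 * r)]
    right  = [gray[y0 + i, x0 + 2 * r - 1] for i in range(1, 2 * r)]
    bottom = [gray[y0 + 2 * r - 1, x0 + 2 * r - 1 - i] for i in range(1, 2 * r)]
    left   = [gray[y0 + 2 * r - 1 - i, x0] for i in range(1, 2 * r - 1)]
    return np.asarray(top + right + bottom + left, dtype=float)   # 8r-4 values

def scan_image(gray, r, process_window):
    """Slide the ring window over the image with stride n = r (half the window
    side length), calling process_window(samples, cx, cy) at every stop."""
    h, w = gray.shape
    for cy in range(r, h - r, r):
        for cx in range(r, w - r, r):
            process_window(ring_samples(gray, cx, cy, r), cx, cy)

With r = 10 this visits roughly one window per 10x10 block of pixels, which is the source of the speed-up discussed under the beneficial effects.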
Preferably, step S2 comprises:
S21: Convert the sample data to grayscale; the threshold can be chosen adaptively.
S22: Binarize the gray values of the sample data twice and count the number of gray-value steps N_s of the sample data processed in step S21; if N_s = 4, execute step S23, otherwise execute step S5.
S23: Binarize the gray values of the sample data using their mean as threshold. Denote the pixels at which the binarized sample data of step S22 produce a step as step A, step B, step C and step D, and compute the distances L_AB, L_BC, L_CD, L_DA between the index values of these four pixels. If L_AB, L_BC, L_CD, L_DA are all smaller than max_T and all larger than min_T, with max_T ∈ (10, 15) and min_T ∈ (5, 10), the sample data are preliminarily judged to contain an X-corner and step S24 is executed; otherwise step S5 is executed. A sketch of this test is given below, after the note on the threshold Δ.
S24: Using projective geometry and symmetry, compute the sub-pixel position L of the X-corner, i.e. the intersection of lines AC and BD; the calculation formula is L = AC × BD.
The X-corner position computed in step S2 is already at sub-pixel accuracy (accurate to one decimal place), so the positional precision is relatively high.
Preferably, in step S22 the binarization threshold is mean ± Δ, where mean is the mean of the sample gray values and Δ is a threshold adjustment whose value ranges from 20 to 160 (in gray-value units). The value of Δ is related to the brightness of the whole image; using Δ as a threshold adjustment avoids misjudgments caused by image noise and thus enhances the robustness of the algorithm.
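As one concrete reading of steps S22 and S23 (the text leaves the exact use of the two thresholds open), the sketch below binarizes the circular sample sequence with mean-Δ, mean+Δ and the plain mean, counts the black/white transitions, and checks that the four transition positions A, B, C, D are spaced between min_T and max_T apart; all names and default values are illustrative, not the patent's reference implementation.

import numpy as np

def count_steps(samples, threshold):
    """Binarize the circular sequence against `threshold` and return the indices
    at which the binary value jumps (the 'step' pixels); their number is N_s."""
    binary = (samples >= threshold).astype(np.int8)
    return np.flatnonzero(binary != np.roll(binary, 1))   # circular comparison

def preliminary_x_corner_test(samples, delta=40.0, min_t=6, max_t=14):
    """Steps S22/S23 under this reading: require N_s = 4 for both thresholds
    mean +/- delta, then check the circular spacing of the four steps found
    with the plain mean threshold."""
    mean = samples.mean()
    if len(count_steps(samples, mean - delta)) != 4:
        return None
    if len(count_steps(samples, mean + delta)) != 4:
        return None
    steps = count_steps(samples, mean)                    # step pixels A, B, C, D
    if len(steps) != 4:
        return None
    gaps = (np.roll(steps, -1) - steps) % len(samples)    # L_AB, L_BC, L_CD, L_DA
    return steps if np.all((gaps > min_t) & (gaps < max_t)) else None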
Preferably, step S24 comprises:
Take the pixel coordinates of step A, step B, step C and step D as the coordinates of points A, B, C and D and form their three-dimensional homogeneous coordinates. The cross product of the homogeneous coordinates of points A and C gives the vector form of the homogeneous equation of line AC; the cross product of the homogeneous coordinates of points B and D gives the vector form of the homogeneous equation of line BD. The cross product of the vector of the homogeneous equation of line AC with that of line BD gives the homogeneous coordinates L1 of the intersection of AC and BD. If L1 = (x1, x2, x3), then the point (x1/x3, x2/x3) is the two-dimensional coordinate of the intersection; rounding it gives the pixel position L of the X-corner (the sub-pixel position L).
Preferably, step S3 comprises:
S31: Check the X-corner repeat-detection flag. If the pixel position L of the X-corner obtained in step S23 lies in an inactive region, the X-corner has already been detected; jump out of this cycle and execute step S5. Otherwise execute step S32.
S32: Obtain the gray values of the pixels in the neighborhood of the X-corner position L, the neighborhood being the region centered on L with a radius of r pixels. Binarize this neighborhood using its mean gray value as threshold and count the number of gray-value steps ΔV_C. If ΔV_C > min_V, continue with step S4; otherwise execute step S5. Here min_V = 4.
Preferably, step S4 specifically comprises:
S41: Re-acquire the sample sequence P' with the X-corner pixel position L as the center of the ring window.
S42: Binarize the gray values of the sample sequence P' using their mean as threshold. Denote the pixels at which the binarized gray values produce a step as step A1, step B1, step C1 and step D1, and compute the distances L'_A1B1, L'_B1C1, L'_C1D1, L'_D1A1 between the index values of these four pixels. If L'_A1B1 = L'_C1D1 and L'_B1C1 = L'_D1A1, continue with step S43; otherwise execute step S5.
S43: Find the one-dimensional sub-pixel locations A', B', C', D' of step A1, step B1, step C1 and step D1 by curve fitting.
S44: From the one-dimensional sub-pixel locations A', B', C', D' found in step S43 and the X-corner pixel position L, compute the two-dimensional sub-pixel positions of step A1, step B1, step C1 and step D1 (see the sketch after this list of steps).
Assuming the one-dimensional sub-pixel location of a step is m and the pixel at the center of the X-corner is (x, y), the two-dimensional sub-pixel positions are: step A1 at (x+A'-r+1, y-r+0.5), step B1 at (x+r+0.5, y+B'-3r+1), step C1 at (x-C'+5r-1, y+r+0.5) and step D1 at (x-r+0.5, y-D'+7r-1).
S45: Using the intersection method of step S24, compute the intersection of lines A'C' and B'D', i.e. the sub-pixel position of the X-corner pixel position L.
S46: Compute the direction information of the X-corner. Going counterclockwise, obtain the two boundary lines according to the black/white change order: the BW (black-to-white) line and the WB (white-to-black) line, where the BW line is the boundary along which the jump is from black to white and the WB line is the boundary along which the jump is from white to black. The angles θ1, θ2 of the BW line and the WB line with the horizontal direction are the direction information of the X-corner.
S47: Set the neighborhood of the X-corner pixel position L as an inactive region, indicating that this X-corner has already been detected and preventing it from being detected again.
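Steps S44 and S46 can be sketched directly from the formulas quoted above. The mapping below copies those formulas literally (so it inherits the patent's indexing convention for the ring sequence), and the angles of the two boundary lines are obtained with atan2. Which of the two lines is the BW line and which the WB line depends on the local black/white order and is not resolved here; names are illustrative.

import math

def refine_and_orient(x, y, r, A1, B1, C1, D1):
    """S44: map the one-dimensional sub-pixel step locations A1..D1 to
    two-dimensional image coordinates with the formulas given in the text.
    S46: return the angles of the two boundary lines A'C' and B'D' with the
    horizontal axis."""
    A2 = (x + A1 - r + 1, y - r + 0.5)         # step on the top side of the ring
    B2 = (x + r + 0.5, y + B1 - 3 * r + 1)     # step on the right side
    C2 = (x - C1 + 5 * r - 1, y + r + 0.5)     # step on the bottom side
    D2 = (x - r + 0.5, y - D1 + 7 * r - 1)     # step on the left side
    theta1 = math.atan2(C2[1] - A2[1], C2[0] - A2[0])   # angle of line A'C'
    theta2 = math.atan2(D2[1] - B2[1], D2[0] - B2[0])   # angle of line B'D'
    return (A2, B2, C2, D2), (theta1, theta2)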
Preferably, in step S43 the one-dimensional sub-pixel locations A', B', C', D' of step A1, step B1, step C1 and step D1 are found by curve fitting as follows: take the five pixels of the sample sequence P' nearest step A1 (three before step A1 and two after it); with the index values of these five pixels in P' as x-coordinates and the gradient of the gray value as y-coordinate, fit a quadratic curve. The fitted curve approximates a parabola whose extreme point is where the gray value changes fastest along the gradient direction, i.e. the one-dimensional sub-pixel location A' of step A1. The locations B', C', D' of step B1, step C1 and step D1 are found in the same way.
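A compact sketch of this quadratic fit, assuming the five samples are taken at offsets -3..+1 around the step index (one reading of "three before, two after"); the function name and the use of np.polyfit are illustrative.

import numpy as np

def subpixel_step(samples, k):
    """Fit a parabola to the gray-value gradient around step index k of the
    circular sequence and return the one-dimensional sub-pixel step location
    (the abscissa of the parabola's extremum)."""
    n = len(samples)
    offsets = (-3, -2, -1, 0, 1)                          # three before, two after
    values = np.array([samples[(k + d) % n] for d in offsets], dtype=float)
    xs = np.array([k + d for d in offsets], dtype=float)  # positions along the ring
    grad = np.gradient(values)                            # gray-value gradient
    a, b, _ = np.polyfit(xs, grad, 2)                     # fit y = a*x^2 + b*x + c
    return -b / (2.0 * a) if a != 0 else float(k)         # vertex of the parabola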
The beneficial effects of the present invention are:
1. When sampling the image, the window advances by half its side length at each step, which increases the detection speed without ever missing an X-corner. Assuming the ring-window radius is r = 10, the detection speed of the method of the present invention is ten times that of the prior-art pixel-by-pixel scanning method; taking a 640×480 image as an example, if the pixel-by-pixel scanning method runs at 3 frames per second, the present method runs at 30 frames per second, which is sufficient for real-time detection.
2. The decision of whether a sampling window contains an X-corner is based on the image characteristics of X-corners, which enhances the robustness of the algorithm.
Description of the drawings
Fig. 1 is a schematic diagram of an X-corner and of the ring window used for detection.
Fig. 2 is a flow diagram of the X-corner detection method applied to visual positioning and calibration according to the present invention.
Detailed description of the embodiments
The present invention is further described below with reference to the accompanying drawings and embodiments, but is not limited thereto.
Embodiment 1
An X-corner detection method applied to visual positioning and calibration, as shown in Fig. 2, comprising:
S1: Acquire an image and sample it with a ring-shaped window (as shown in Fig. 1). The side length of the ring window is set to 2r pixels; the window is square, so one window sample contains 8r-4 pixels in total, where r is less than half of the smallest X-corner side length in the image. All pixels of the ring window are placed in a circular data queue; these pixels form the sample data P. The i-th pixel is denoted P_i and its gray value f_i, i = 1, 2, ..., (8r-4).
S2: Based on the image characteristics of X-corners, preliminarily judge whether the sample data P contain an X-corner; if the judgment condition is satisfied, compute the sub-pixel position of the X-corner, otherwise go to step S5.
S3: Based on the sub-pixel X-corner position obtained in step S2, further judge whether the sample data contain an X-corner, and exclude X-corners that have already been detected.
S4: Re-acquire the sample data with the X-corner as the ring-window center and judge whether the data satisfy the X-corner symmetry condition; if they do, compute the sub-pixel position of the X-corner by curve fitting and set the X-corner repeat-detection flag.
S5: Move the ring window across the image to obtain new sample data, advancing n pixels each time, n ∈ (1, 2r); repeat steps S2 to S4 until all X-corners have been detected. Here n = r.
Embodiment 2
An X-corner detection method applied to visual positioning and calibration as described in Embodiment 1, the difference being that step S2 comprises:
S21: Convert the sample data to grayscale; the threshold can be chosen adaptively.
S22: Binarize the gray values of the sample data twice; the binarization threshold is mean ± Δ, where mean is the mean of the sample gray values and Δ is a threshold adjustment whose value ranges from 20 to 160 (in gray-value units). The value of Δ is related to the brightness of the whole image; using Δ as a threshold adjustment avoids misjudgments caused by image noise and thus enhances the robustness of the algorithm. Count the number of gray-value steps N_s of the sample data processed in step S21; if N_s = 4, execute step S23, otherwise execute step S5.
S23: Binarize the gray values of the sample data using their mean as threshold. Denote the pixels at which the binarized sample data of step S22 produce a step as step A, step B, step C and step D, and compute the distances L_AB, L_BC, L_CD, L_DA between the index values of these four pixels. If L_AB, L_BC, L_CD, L_DA are all smaller than max_T and all larger than min_T, with max_T ∈ (10, 15) and min_T ∈ (5, 10), the sample data are preliminarily judged to contain an X-corner and step S24 is executed; otherwise step S5 is executed.
S24: Using projective geometry and symmetry, compute the sub-pixel position L of the X-corner, i.e. the intersection of lines AC and BD; the calculation formula is L = AC × BD. Specifically: take the pixel coordinates of step A, step B, step C and step D as the coordinates of points A, B, C and D and form their three-dimensional homogeneous coordinates. The cross product of the homogeneous coordinates of points A and C gives the vector form of the homogeneous equation of line AC; the cross product of the homogeneous coordinates of points B and D gives the vector form of the homogeneous equation of line BD. The cross product of the vector of the homogeneous equation of line AC with that of line BD gives the homogeneous coordinates L1 of the intersection of AC and BD. If L1 = (x1, x2, x3), then the point (x1/x3, x2/x3) is the two-dimensional coordinate of the intersection; rounding it gives the pixel position L of the X-corner. The X-corner position computed in step S2 is already at sub-pixel accuracy (accurate to one decimal place), so the positional precision is relatively high.
Embodiment 3
An X-corner detection method applied to visual positioning and calibration as described in Embodiment 1, the difference being that step S3 comprises:
S31: Check the X-corner repeat-detection flag. If the pixel position L of the X-corner obtained in step S23 lies in an inactive region, the X-corner has already been detected; jump out of this cycle and execute step S5. Otherwise execute step S32.
S32: Obtain the gray values of the pixels in the neighborhood of the X-corner position L, the neighborhood being the region centered on L with a radius of r pixels. Binarize this neighborhood using its mean gray value as threshold and count the number of gray-value steps ΔV_C. If ΔV_C > min_V, continue with step S4; otherwise execute step S5. Here min_V = 4.
Embodiment 4
An X-corner detection method applied to visual positioning and calibration as described in Embodiment 1, the difference being that step S4 comprises:
S41: Re-acquire the sample sequence P' with the X-corner pixel position L as the center of the ring window.
S42: Binarize the gray values of the sample sequence P' using their mean as threshold. Denote the pixels at which the binarized gray values produce a step as step A1, step B1, step C1 and step D1, and compute the distances L'_A1B1, L'_B1C1, L'_C1D1, L'_D1A1 between the index values of these four pixels. If L'_A1B1 = L'_C1D1 and L'_B1C1 = L'_D1A1, continue with step S43; otherwise execute step S5.
S43: Find the one-dimensional sub-pixel locations A', B', C', D' of step A1, step B1, step C1 and step D1 by curve fitting, as follows: take the five pixels of the sample sequence P' nearest step A1 (three before step A1 and two after it); with the index values of these five pixels in P' as x-coordinates and the gradient of the gray value as y-coordinate, fit a quadratic curve. The fitted curve approximates a parabola whose extreme point is where the gray value changes fastest along the gradient direction, i.e. the one-dimensional sub-pixel location A' of step A1. The locations B', C', D' of step B1, step C1 and step D1 are found in the same way.
S44: From the one-dimensional sub-pixel locations A', B', C', D' found in step S43 and the X-corner pixel position L, compute the two-dimensional sub-pixel positions of step A1, step B1, step C1 and step D1.
Assuming the one-dimensional sub-pixel location of a step is m and the pixel at the center of the X-corner is (x, y), the two-dimensional sub-pixel positions are: step A1 at (x+A'-r+1, y-r+0.5), step B1 at (x+r+0.5, y+B'-3r+1), step C1 at (x-C'+5r-1, y+r+0.5) and step D1 at (x-r+0.5, y-D'+7r-1).
S45: Using the intersection method of step S24, compute the intersection of lines A'C' and B'D', i.e. the sub-pixel position of the X-corner pixel position L.
S46: Compute the direction information of the X-corner. Going counterclockwise, obtain the two boundary lines according to the black/white change order: the BW (black-to-white) line and the WB (white-to-black) line, where the BW line is the boundary along which the jump is from black to white and the WB line is the boundary along which the jump is from white to black. The angles θ1, θ2 of the BW line and the WB line with the horizontal direction are the direction information of the X-corner.
S47: Set the neighborhood of the X-corner pixel position L as an inactive region, indicating that this X-corner has already been detected and preventing it from being detected again.

Claims (8)

1. An X-corner detection method applied to visual positioning and calibration, characterized by comprising:
S1: acquiring an image and sampling it with a ring-shaped window, the side length of the ring window being set to 2r pixels, the window being square so that one window sample contains 8r-4 pixels in total, r being less than half of the smallest X-corner side length in the image; placing all pixels of the ring window in a circular data queue, these pixels forming the sample data, the i-th pixel being denoted P_i with gray value f_i, i = 1, 2, ..., (8r-4);
S2: preliminarily judging, from the image characteristics of X-corners, whether the sample data contain an X-corner; if the judgment condition is satisfied, computing the sub-pixel position of the X-corner, otherwise going to step S5;
S3: further judging, from the sub-pixel X-corner position obtained in step S2, whether the sample data contain an X-corner, and excluding X-corners that have already been detected;
S4: re-acquiring the sample data with the X-corner as the ring-window center and judging whether the data satisfy the X-corner symmetry condition; if they do, computing the sub-pixel position of the X-corner by curve fitting and setting the X-corner repeat-detection flag;
S5: moving the ring window across the image to obtain new sample data, advancing n pixels each time, n ∈ (1, 2r), and repeating steps S2 to S4 until all X-corners have been detected.
2. The X-corner detection method applied to visual positioning and calibration according to claim 1, characterized in that step S2 comprises:
S21: converting the sample data to grayscale;
S22: binarizing the gray values of the sample data twice and counting the number of gray-value steps N_s of the sample data processed in step S21; if N_s = 4, executing step S23, otherwise executing step S5;
S23: binarizing the gray values of the sample data using their mean as threshold; denoting the pixels at which the binarized sample data of step S22 produce a step as step A, step B, step C and step D, and computing the distances L_AB, L_BC, L_CD, L_DA between the index values of these four pixels; if L_AB, L_BC, L_CD, L_DA are all smaller than max_T and all larger than min_T, with max_T ∈ (10, 15) and min_T ∈ (5, 10), preliminarily judging that the sample data contain an X-corner and executing step S24, otherwise executing step S5;
S24: computing, by projective geometry and symmetry, the sub-pixel position L of the X-corner, i.e. the intersection of lines AC and BD.
3. The X-corner detection method applied to visual positioning and calibration according to claim 2, characterized in that in step S22 the binarization threshold is mean ± Δ, where mean is the mean of the sample gray values and Δ is a threshold adjustment whose value ranges from 20 to 160 (in gray-value units).
4. The X-corner detection method applied to visual positioning and calibration according to claim 2, characterized in that step S24 comprises: taking the pixel coordinates of step A, step B, step C and step D as the coordinates of points A, B, C and D and forming their three-dimensional homogeneous coordinates; the cross product of the homogeneous coordinates of points A and C gives the vector form of the homogeneous equation of line AC, and the cross product of the homogeneous coordinates of points B and D gives the vector form of the homogeneous equation of line BD; the cross product of the vector of the homogeneous equation of line AC with that of line BD gives the homogeneous coordinates L1 of the intersection of AC and BD; if L1 = (x1, x2, x3), the point (x1/x3, x2/x3) is the two-dimensional coordinate of the intersection, and rounding it gives the pixel position L of the X-corner, the pixel position L of the X-corner being its sub-pixel position L.
5. The X-corner detection method applied to visual positioning and calibration according to claim 4, characterized in that step S3 comprises:
S31: checking the X-corner repeat-detection flag; if the pixel position L of the X-corner obtained in step S23 lies in an inactive region, judging that the X-corner has already been detected, jumping out of this cycle and executing step S5; otherwise executing step S32;
S32: obtaining the gray values of the pixels in the neighborhood of the X-corner position L, the neighborhood being the region centered on L with a radius of r pixels; binarizing this neighborhood using its mean gray value as threshold and counting the number of gray-value steps ΔV_C; if ΔV_C > min_V, continuing with step S4, otherwise executing step S5; min_V = 4.
6. The X-corner detection method applied to visual positioning and calibration according to claim 4, characterized in that step S4 specifically comprises:
S41: re-acquiring the sample sequence P' with the X-corner pixel position L as the center of the ring window;
S42: binarizing the gray values of the sample sequence P' using their mean as threshold; denoting the pixels at which the binarized gray values produce a step as step A1, step B1, step C1 and step D1, and computing the distances L'_A1B1, L'_B1C1, L'_C1D1, L'_D1A1 between the index values of these four pixels; if L'_A1B1 = L'_C1D1 and L'_B1C1 = L'_D1A1, continuing with step S43, otherwise executing step S5;
S43: finding the one-dimensional sub-pixel locations A', B', C', D' of step A1, step B1, step C1 and step D1 by curve fitting;
S44: computing, from the one-dimensional sub-pixel locations A', B', C', D' found in step S43 and the X-corner pixel position L, the two-dimensional sub-pixel positions of step A1, step B1, step C1 and step D1, namely: assuming the one-dimensional sub-pixel location of a step is m and the pixel at the center of the X-corner is (x, y), the two-dimensional sub-pixel position of step A1 is (x+A'-r+1, y-r+0.5), that of step B1 is (x+r+0.5, y+B'-3r+1), that of step C1 is (x-C'+5r-1, y+r+0.5) and that of step D1 is (x-r+0.5, y-D'+7r-1);
S45: computing, by the intersection method of step S24, the intersection of lines A'C' and B'D', i.e. the sub-pixel position of the X-corner pixel position L;
S46: computing the direction information of the X-corner: going counterclockwise, obtaining the two boundary lines according to the black/white change order, namely the BW line and the WB line, the BW line being the boundary along which the jump is from black to white and the WB line being the boundary along which the jump is from white to black; the angles θ1, θ2 of the BW line and the WB line with the horizontal direction are the direction information of the X-corner;
S47: setting the neighborhood of the X-corner pixel position L as an inactive region, indicating that the X-corner has already been detected.
7. The X-corner detection method applied to visual positioning and calibration according to claim 6, characterized in that in step S43 the one-dimensional sub-pixel locations A', B', C', D' of step A1, step B1, step C1 and step D1 are found by curve fitting as follows: taking the five pixels of the sample sequence P' nearest step A1, with the index values of these five pixels in P' as x-coordinates and the gradient of the gray value as y-coordinate, fitting a quadratic curve; the fitted curve approximates a parabola whose extreme point is where the gray value changes fastest along the gradient direction, i.e. the one-dimensional sub-pixel location A' of step A1; the locations B', C', D' of step B1, step C1 and step D1 are found in the same way.
8. The X-corner detection method applied to visual positioning and calibration according to any one of claims 1 to 7, characterized in that n = r.
CN201810077053.1A 2018-01-26 2018-01-26 X-corner detection method applied to visual positioning and calibration Active CN108428250B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810077053.1A CN108428250B (en) 2018-01-26 2018-01-26 X-corner detection method applied to visual positioning and calibration

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810077053.1A CN108428250B (en) 2018-01-26 2018-01-26 X-corner detection method applied to visual positioning and calibration

Publications (2)

Publication Number Publication Date
CN108428250A (en) 2018-08-21
CN108428250B CN108428250B (en) 2021-09-21

Family

ID=63156290

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810077053.1A Active CN108428250B (en) 2018-01-26 2018-01-26 X-corner detection method applied to visual positioning and calibration

Country Status (1)

Country Link
CN (1) CN108428250B (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111047614A (en) * 2019-10-10 2020-04-21 南昌市微轲联信息技术有限公司 Feature extraction-based method for extracting target corner of complex scene image
CN111428720A (en) * 2020-04-14 2020-07-17 北京神工科技有限公司 Sub-pixel level visual feature point positioning method and device based on step response matching

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1896682A (en) * 2005-07-12 2007-01-17 北京航空航天大学 X-shaped angular-point sub-pixel extraction
CN102095370A (en) * 2010-11-22 2011-06-15 北京航空航天大学 Detection identification method for three-X combined mark
CN103093451A (en) * 2011-11-03 2013-05-08 北京理工大学 Checkerboard intersection recognition algorithm
CN103345755A (en) * 2013-07-11 2013-10-09 北京理工大学 Chessboard angular point sub-pixel extraction method based on Harris operator
CN103927750A (en) * 2014-04-18 2014-07-16 上海理工大学 Detection method of checkboard grid image angular point sub pixel
CN104036516A (en) * 2014-06-30 2014-09-10 山东科技大学 Camera calibration checkerboard image corner detection method based on symmetry analysis
CN104331900A (en) * 2014-11-25 2015-02-04 湖南科技大学 Corner sub-pixel positioning method in CCD (charge coupled device) camera calibration
CN105787912A (en) * 2014-12-18 2016-07-20 南京大目信息科技有限公司 Classification-based step type edge sub pixel localization method
CN105740818A (en) * 2016-01-29 2016-07-06 山东大学 Artificial mark detection method applied to augmented reality
CN106846412A (en) * 2017-01-23 2017-06-13 上海兴芯微电子科技有限公司 A kind of checkerboard angle point detection process and device

Non-Patent Citations (5)

* Cited by examiner, † Cited by third party
Title
ABDULRAHMAN S. ALTURKI et al.: "A new X-Corner Detection for Camera Calibration Using Saddle Points", ResearchGate *
FUQING ZHAO et al.: "An Automated X-corner Detection Algorithm (AXDA)", Journal of Software *
WANG YAN et al.: "An X-corner Detection Algorithm Based on Checkerboard Features", International Conference on Logistics Engineering, Management and Computer Science (LEMCS 2014) *
孟偲 et al.: "Sub-pixel detection and localization of X-corners with directional characteristics", Journal of Beijing University of Aeronautics and Astronautics *
马帅依凡 et al.: "Surgical navigation device based on artificial markers", Journal of Shandong University *

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111047614A (en) * 2019-10-10 2020-04-21 南昌市微轲联信息技术有限公司 Feature extraction-based method for extracting target corner of complex scene image
CN111047614B (en) * 2019-10-10 2023-09-29 南昌市微轲联信息技术有限公司 Feature extraction-based method for extracting target corner of complex scene image
CN111428720A (en) * 2020-04-14 2020-07-17 北京神工科技有限公司 Sub-pixel level visual feature point positioning method and device based on step response matching
CN111428720B (en) * 2020-04-14 2023-09-26 北京神工科技有限公司 Sub-pixel level visual feature point positioning method and device based on step response matching

Also Published As

Publication number Publication date
CN108428250B (en) 2021-09-21

Similar Documents

Publication Publication Date Title
US10846844B1 (en) Collaborative disparity decomposition
CN107014294B (en) Contact net geometric parameter detection method and system based on infrared image
US9235928B2 (en) 3D body modeling, from a single or multiple 3D cameras, in the presence of motion
US8842906B2 (en) Body measurement
JP5132832B1 (en) Measuring apparatus and information processing apparatus
KR101554241B1 (en) A method for depth map quality enhancement of defective pixel depth data values in a three-dimensional image
CN103345736B (en) A kind of virtual viewpoint rendering method
CN104537705B (en) Mobile platform three dimensional biological molecular display system and method based on augmented reality
JP2017103602A (en) Position detection device, and position detection method and program
Song et al. DOE-based structured-light method for accurate 3D sensing
CN105894521A (en) Sub-pixel edge detection method based on Gaussian fitting
KR101589167B1 (en) System and Method for Correcting Perspective Distortion Image Using Depth Information
CN111256628A (en) Wall surface flatness detection method and device, computer equipment and storage medium
CN108335325A (en) A kind of cube method for fast measuring based on depth camera data
Chalom et al. Measuring image similarity: an overview of some useful applications
CN113822942A (en) Method for measuring object size by monocular camera based on two-dimensional code
CN108428250A (en) X-corner detection method applied to visual positioning and calibration
CN113723432B (en) Intelligent identification and positioning tracking method and system based on deep learning
CN113743265B (en) Depth camera-based automatic driving drivable area detection method and system
CN113409334B (en) Centroid-based structured light angle point detection method
Slossberg et al. Freehand Laser Scanning Using Mobile Phone.
Brunken et al. Incorporating Plane-Sweep in Convolutional Neural Network Stereo Imaging for Road Surface Reconstruction.
JP6061631B2 (en) Measuring device, information processing device, measuring method, information processing method, and program
Xue et al. A novel stripe extraction scheme for the multi-line structured light systems
Takaoka et al. Depth map super-resolution for cost-effective rgb-d camera

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant