CN101363713A - Method for calibrating the structural parameters of a structured light sensor based on two-dimensional cross-ratio invariance - Google Patents


Info

Publication number
CN101363713A
CN101363713A (application CN200810081873)
Authority
CN
China
Prior art keywords
target
light stripe
point
plane
coordinate system
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CNA2008100818734A
Other languages
Chinese (zh)
Other versions
CN100565097C (en)
Inventor
张广军
孙军华
刘谦哲
魏振忠
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beihang University
Beijing University of Aeronautics and Astronautics
Original Assignee
Beihang University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beihang University filed Critical Beihang University
Priority to CNB2008100818734A priority Critical patent/CN100565097C/en
Priority to US12/258,398 priority patent/US8078025B2/en
Publication of CN101363713A publication Critical patent/CN101363713A/en
Application granted granted Critical
Publication of CN100565097C publication Critical patent/CN100565097C/en
Legal status: Expired - Fee Related

Abstract

The invention discloses a method for calibrating the structural parameters of a structured light sensor based on the invariance of the two-dimensional cross-ratio, comprising the following steps: a camera coordinate system and an image plane coordinate system are established; a target coordinate system is established, an image of a planar target used for calibrating the structural parameters of the sensor is acquired, and the camera-frame coordinates of a number of stripe points on the light stripe that the structured light projects onto the target plane are obtained; the acquisition is repeated with the target at several positions, and the structured light equation is fitted to all of the obtained stripe point coordinates. Because the method relies on the invariance of the two-dimensional cross-ratio, any number of calibration points can be obtained at each position of the planar target, and the structured light to be calibrated may be of any pattern.

Description

A method for calibrating the structural parameters of a structured light sensor based on two-dimensional cross-ratio invariance
Technical field
The present invention relates to vision measurement technology, and in particular to a method for calibrating the structural parameters of a structured light sensor based on the invariance of the two-dimensional cross-ratio.
Background technology
Structured light vision sensing is a non-contact measurement technique with a comparatively large measurement range, high measurement speed, good system flexibility and moderate accuracy, and it is widely applied in fields such as three-dimensional reconstruction, on-line product inspection and reverse engineering.
A structured light sensor is a vision measurement system based on the optical triangulation principle. A structured light projector projects light of a certain pattern onto the measured object; the object surface modulates the light into a modulated stripe; a camera captures an image of the object surface containing the modulated stripe; and the three-dimensional coordinates of the object surface can then be obtained from the camera imaging model and the spatial relationship between the camera and the structured light.
The structured light projected by a sensor may take different forms, such as point structured light, single-line structured light, multi-line structured light, circular structured light or conic structured light. Determining the spatial relationship between structured light of a given pattern and the camera, that is, calibrating the structural parameters of the structured light sensor, is the first problem that must be solved.
Commonly used calibration methods for the structural parameters of structured light sensors include the drawn-wire (filament) method, the sawtooth target method, the cross-ratio invariance method based on a three-dimensional target, and calibration methods based on a planar target. Specifically:
In the drawn-wire method, the structured light is projected onto several non-coplanar filaments distributed in space; scattering at the filaments forms bright spots, whose spatial coordinates are measured with an electronic theodolite. From the image coordinates of the bright spots and their measured spatial coordinates, the pose parameters between the structured light and the camera are solved. This method requires manual measurement of the bright-spot coordinates, the operation is complicated, the measurement error is relatively large, and auxiliary equipment is needed.
The sawtooth target method was proposed by Duan Fajie et al. in "A new calibration method for the structural parameters of line structured light sensors", Chinese Journal of Scientific Instrument, 2000, 21(1): 108-110. It realizes high-precision calibration of a line structured light sensor with a simple calibration target and a one-dimensional worktable, without requiring other instruments to measure points on the light plane. However, the method must adjust the attitude of the worktable or of the sensor so that the light plane is perpendicular to the tooth edges, which complicates the operation. In addition, machining the sawtooth target is expensive, the tooth surface is limited in size, and few calibration points are obtained.
The cross-ratio invariance method based on a three-dimensional target was proposed by Xu Guangyou et al. in "A new calibration method for three-dimensional vision systems based on structured light", Chinese Journal of Computers, 1995, Vol. 18, No. 6. It uses a high-precision stereo target with two mutually perpendicular planes; at least three collinear feature points are obtained on each plane, and the cross-ratio invariance principle yields the intersections of the light stripe with the lines through these known points. The stereo target required by this method is expensive to machine, occlusion easily occurs between the two perpendicular planes, and few calibration points are obtained.
At present, the simple and effective methods in common use are those based on a two-dimensional planar target. Among them, the cross-ratio invariance method based on a planar target was proposed by Zhou Fuqiang in "Complete calibration of a structured light stripe vision sensor through planar target of unknown orientations [J]", Image and Vision Computing, Volume 23, Issue 1, January 2005, Pages 59-67. This method uses the cross-ratio invariance principle to obtain calibration points, but the target carries only a few rows of feature points, typically 3 to 10, and each row of feature points (at least three) yields only one calibration point. The method is therefore only suitable for calibrating line structured light; calibrating non-linear structured light would produce a large fitting error.
Chinese invention patent CN200510013231.7 proposes a fast calibration method for line structured light sensors based on a coplanar reference object, in which the intersection of the coplanar reference object with the plane determined by the camera's optical centre and the straight stripe image is used to calibrate the line structured light. The method needs no auxiliary equipment, has no occlusion problem and is simple to operate, but it can only be used to calibrate line structured light or multi-line structured light.
In summary, existing structured light calibration methods extract few calibration points and can only calibrate line structured light of specific patterns.
Summary of the invention
In view of this, the main purpose of the present invention is to provide a method for calibrating the structural parameters of a structured light sensor based on the invariance of the two-dimensional cross-ratio, with which more calibration points can be obtained and structured light of any pattern can be calibrated.
To achieve the above purpose, the technical scheme of the present invention is realized as follows:
A method for calibrating the structural parameters of a structured light sensor based on the invariance of the two-dimensional cross-ratio, the method comprising:
a. establishing a camera coordinate system and an image plane coordinate system;
b. establishing a target coordinate system, acquiring a planar target image used for calibrating the structural parameters of the sensor, and obtaining the camera-frame coordinates of a number of stripe points on the light stripe projected by the structured light onto the target plane;
c. obtaining the camera-frame coordinates of stripe points of the light stripes projected onto the target plane with the target at several positions, and fitting the structured light equation to all of the obtained stripe point coordinates.
Before step b, the method comprises: placing a planar target carrying feature points, without constraint, in the measurement region of the structured light vision sensor, the projected structured light forming a light stripe on the planar target.
Acquiring the planar target image used for calibrating the sensor structural parameters in step b comprises: photographing the planar target with the camera, and correcting the captured planar target image for distortion.
The captured planar target image contains the light stripe and at least nine non-collinear feature points arranged in a rectangular grid of at least three rows and three columns.
Obtaining the camera-frame coordinates of the stripe points in step b comprises:
b1. constructing three collinear virtual target feature points in the image plane, and obtaining the stripe point coordinates as the intersections of the line through the three collinear virtual target feature points with the light stripe in the image;
b2. computing the cross-ratio of each stripe point obtained in step b1 with the three collinear virtual target feature points;
b3. obtaining the target-frame coordinates of the actual stripe points corresponding to the stripe points in the target image, and transforming them into the camera coordinate system;
b4. repeating steps b1 to b3 to obtain the camera-frame coordinates of a number of stripe points of the corresponding stripe with the target at the same position.
Before step b1, the method further comprises extracting the stripe coordinates; the stripe point coordinates obtained in step b1 are computed from the stripe coordinates and the equation of the line determined by the three collinear virtual target feature points.
Before step b1, the method further comprises obtaining the transformation from the target coordinate system to the camera coordinate system, namely the translation vector and the rotation matrix; the transformation of the stripe points from target-frame coordinates to camera-frame coordinates in step b3 is performed with the obtained translation vector and rotation matrix.
Constructing the three collinear virtual target feature points in step b1 comprises:
b11. choosing three collinear feature points, taking an arbitrary point on the line through them as a virtual target feature point, and computing the cross-ratio of this point with the three collinear feature points;
b12. choosing two further groups of feature points in the same rows/columns, the three feature points of each group lying in the same columns/rows as the three collinear feature points of step b11, and finding on the line through each group of three collinear feature points the point whose cross-ratio with that group's three feature points equals the cross-ratio computed in step b11; these points are the virtual target feature points.
Obtaining the coordinates of the stripes projected several times onto the target plane in step c comprises: moving the planar target several times without constraint, and repeating step b after each move.
The structured light in step c may be of any pattern.
Obtaining the target-frame coordinates of the actual stripe points corresponding to the stripe points in the target image in step b3 comprises: first, using the principle that the cross-ratio between a virtual target feature point and the three feature points collinear with it remains unchanged under perspective transformation, obtaining the target-frame coordinates of the points on the target corresponding to the three virtual target feature points in the image; then, using the principle that the cross-ratio between a stripe point and the three virtual target feature points collinear with it remains unchanged under perspective transformation, obtaining the target-frame coordinates of the actual stripe points corresponding to the stripe points in the target image.
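The cross-ratio invariance that steps b1 to b3 rely on can be checked numerically. The following minimal sketch (the function names and the synthetic homography are illustrative, not from the patent) verifies that the cross-ratio of four collinear points is unchanged by an arbitrary perspective transformation:

```python
import numpy as np

def cross_ratio(pts):
    """Cross-ratio (P1,P2;P3,P4) of four collinear 2D points,
    from their signed parameters along the supporting line."""
    pts = np.asarray(pts, dtype=float)
    d = (pts[1] - pts[0]) / np.linalg.norm(pts[1] - pts[0])
    t1, t2, t3, t4 = (pts - pts[0]) @ d   # signed positions on the line
    return ((t3 - t1) * (t4 - t2)) / ((t3 - t2) * (t4 - t1))

def apply_h(H, pts):
    """Apply a 3x3 homography to Nx2 points."""
    ph = np.hstack([pts, np.ones((len(pts), 1))]) @ H.T
    return ph[:, :2] / ph[:, 2:3]

# four collinear points and an arbitrary non-degenerate homography
P = np.array([[0.0, 0.0], [1.0, 1.0], [2.0, 2.0], [4.0, 4.0]])
H = np.array([[1.2,   0.1,   3.0],
              [-0.2,  0.9,   1.0],
              [0.001, 0.002, 1.0]])

cr_before = cross_ratio(P)
cr_after = cross_ratio(apply_h(H, P))   # equal: cross-ratio is projective-invariant
```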
With the present method, based on the invariance of the two-dimensional cross-ratio, several stripe points can be obtained with the planar target at one position by choosing different cross-ratio values, and further stripe points can be obtained by moving the target to other positions; fitting all of the obtained stripe points yields the structured light equation. Because different cross-ratios can be chosen, any number of calibration points can be obtained at each position of the planar target, and the structured light to be calibrated may be of any pattern.
Description of drawings
Fig. 1 is a structural diagram of the calibration system of the present invention;
Fig. 2 is a flow diagram of the method for calibrating the structural parameters of a structured light sensor based on two-dimensional cross-ratio invariance according to the present invention;
Fig. 3 is a schematic diagram of the calibration principle based on two-dimensional cross-ratio invariance according to the present invention;
Fig. 4 is a schematic diagram of the planar target of an embodiment;
Fig. 5, Fig. 6 and Fig. 7 are the planar target images used for calibrating the sensor structural parameters with the planar target at three different positions.
Embodiment
With the present method, based on the invariance of the two-dimensional cross-ratio, several stripe points can be obtained with the planar target at one position by choosing different cross-ratio values, and further stripe points can be obtained at other positions by moving the target; fitting all of the obtained stripe points yields the structured light equation.
The method is described in further detail below with reference to specific embodiments and the accompanying drawings.
Fig. 1 is a structural diagram of the calibration system of the present invention. As shown in Fig. 1, the calibration system consists of a structured light projector, a planar target and a camera: the projector projects structured light of a certain pattern onto the planar target, the target modulates the structured light into a modulated stripe, and the camera captures the planar target image containing the modulated stripe.
Taking conic structured light as an example, Fig. 2 is a flow diagram of the method for calibrating the structural parameters of a structured light sensor based on two-dimensional cross-ratio invariance. As shown in Fig. 2, the method comprises the following steps:
Step 201: establish the image plane coordinate system and the camera coordinate system.
Fig. 3 is a schematic diagram of the calibration principle based on two-dimensional cross-ratio invariance. As shown in Fig. 3, the camera coordinate system Oc-xcyczc and the image plane coordinate system O-UV are established according to the pose of the camera.
Step 202: establish the target coordinate system, photograph the planar target with the camera, and correct the captured planar target image for distortion.
Specifically: switch on the laser projector and place the planar target carrying feature points, without constraint, in the measurement region of the structured light vision sensor; the projected structured light forms a light stripe on the planar target. The planar target is prepared in advance as a two-dimensional plane carrying circular feature points; the number of feature points is 9 to 100, their diameter is 3 mm to 100 mm with an accuracy of 0.001 mm to 0.01 mm, and their spacing is 3 mm to 50 mm with an accuracy of 0.001 mm to 0.01 mm. The feature points may also have other shapes, such as squares.
Fig. 4 is a schematic diagram of the planar target of the present embodiment. As shown in Fig. 4, the target carries nine circular feature points of 5 mm diameter, measures 150 mm x 150 mm, and the horizontal and vertical spacings between feature points are both 60 mm.
As shown in Fig. 3, the target coordinate system Ot-xtytzt is established according to the placement of the planar target.
The planar target image captured by the camera should contain the light stripe and at least nine non-collinear feature points arranged in at least three rows and three columns; the distortion correction is performed on the captured image according to the intrinsic parameters of the camera. How the distortion correction is carried out belongs to the prior art and is not repeated here.
Step 203: extract the coordinates of at least four non-collinear feature points from the distortion-corrected planar target image of step 202, and determine from the extracted feature point coordinates the transformation from the target coordinate system to the camera coordinate system.
The present invention locates the pixel position of each feature point with a shape operator based on the Hessian matrix, then describes the grey-level surface in the neighbourhood of the corner with a second-order Taylor expansion, and finally obtains the feature point coordinates by computing the saddle point of that surface.
The extraction method is described in detail in Chen Dazhi, "A New Sub-Pixel Detector for X-Corners in Camera Calibration Targets [C]", WSCG'2005 Short Papers Proceedings, 13th International Conference in Central Europe on Computer Graphics, Visualization and Computer Vision, 2005, Plzen, Czech Republic.
The transformation from the target coordinate system to the camera coordinate system consists of the rotation matrix R and the translation vector T. Usually, from the image coordinates of at least four non-collinear feature points and their corresponding coordinates in Ot-xtytzt, a linear least-squares solution of the 3x3 homography matrix H between the image plane and the target plane is obtained first; the optimal homography matrix H is then obtained with the Levenberg-Marquardt optimization method; finally, the rotation matrix R and translation vector T from Ot-xtytzt to Oc-xcyczc are decomposed from H.
The algorithms for computing the homography matrix H, the rotation matrix R and the translation vector T are described in detail in Z. Y. Zhang, "A flexible new technique for camera calibration [R]", Microsoft Corporation, MSR-TR-98-71, 1998, and are not repeated here.
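This pipeline can be illustrated with a hedged sketch covering the linear part only (the Levenberg-Marquardt refinement is omitted, and all function names are ours, not the patent's): the homography is estimated by the direct linear transform and decomposed with known intrinsics A following Zhang's relation H ~ A [r1 r2 T]:

```python
import numpy as np

def homography_dlt(src, dst):
    """Direct linear transform: 3x3 homography mapping src -> dst
    (Nx2 arrays, N >= 4 points, not all collinear)."""
    rows = []
    for (x, y), (u, v) in zip(src, dst):
        rows.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        rows.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    _, _, Vt = np.linalg.svd(np.asarray(rows, float))
    H = Vt[-1].reshape(3, 3)        # null vector = stacked homography
    return H / H[2, 2]

def decompose_homography(H, A):
    """Recover R, T from the target-plane-to-image homography and the
    intrinsic matrix A, using H ~ A [r1 r2 T] (Zhang's technique)."""
    B = np.linalg.inv(A) @ H
    lam = 1.0 / np.linalg.norm(B[:, 0])
    r1, r2, T = lam * B[:, 0], lam * B[:, 1], lam * B[:, 2]
    R = np.column_stack([r1, r2, np.cross(r1, r2)])
    U, _, Vt = np.linalg.svd(R)     # snap to the nearest true rotation
    return U @ Vt, T

# synthetic check: known pose, planar target points with zt = 0
A = np.array([[800.0, 0, 320], [0, 800, 240], [0, 0, 1]])
cz, sz, cx, sx = np.cos(0.3), np.sin(0.3), np.cos(0.2), np.sin(0.2)
R_true = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]]) @ \
         np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
T_true = np.array([1.0, 2.0, 10.0])
H_true = A @ np.column_stack([R_true[:, 0], R_true[:, 1], T_true])
src = np.array([[0.0, 0], [1, 0], [0, 1], [1, 1], [2, 3]])
dst_h = np.hstack([src, np.ones((5, 1))]) @ H_true.T
dst = dst_h[:, :2] / dst_h[:, 2:3]
R_est, T_est = decompose_homography(homography_dlt(src, dst), A)
```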
Step 204: extract the stripe coordinates from the distortion-corrected planar target image of step 202.
This step computes the Hessian matrix of each image point; the eigenvector corresponding to the eigenvalue of largest absolute value of the Hessian gives the stripe normal direction and the second derivative along it, from which the sub-pixel stripe centre is finally obtained. The extraction method is described in Carsten Steger, "Unbiased Extraction of Curvilinear Structures from 2D and 3D Images [D]", Technical University Munich, Germany, 1998.
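The core of this Steger-style extraction, taking the stripe normal as the Hessian eigenvector of the eigenvalue with the largest absolute value, can be sketched on a synthetic stripe. Finite differences stand in for the Gaussian derivative kernels a real implementation would use, and all names are illustrative:

```python
import numpy as np

# synthetic image of a vertical light stripe with a Gaussian cross-section
h, w = 64, 64
xs = np.arange(w)
img = np.tile(np.exp(-(xs - 30.5) ** 2 / (2 * 3.0 ** 2)), (h, 1))

# Hessian entries from finite differences (Steger convolves with
# Gaussian derivative kernels instead)
gy, gx = np.gradient(img)       # np.gradient returns axis-0 then axis-1
gyy, _ = np.gradient(gy)
gxy, gxx = np.gradient(gx)

def stripe_normal(r, c):
    """Unit eigenvector of the Hessian eigenvalue with the largest
    absolute value: the direction across the stripe (its normal)."""
    Hm = np.array([[gxx[r, c], gxy[r, c]],
                   [gxy[r, c], gyy[r, c]]])
    vals, vecs = np.linalg.eigh(Hm)
    return vecs[:, np.argmax(np.abs(vals))]   # (x, y) components

n = stripe_normal(32, 30)   # near the stripe centre; expect n ~ (+/-1, 0)
```

Along this normal the grey-level profile is one-dimensional, so the sub-pixel centre follows from a second-order Taylor expansion, as in the cited thesis.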
Step 205: construct three collinear virtual target feature points in the image plane.
Specifically: first, three collinear feature points are chosen, an arbitrary point on the line through them is taken as a virtual target feature point, and the cross-ratio of this point with the three feature points is computed. As shown in Fig. 3, in the image plane coordinate system O-UV an arbitrary point si1 on the line through the three vertically collinear feature points a1, a2, a3 is taken as a virtual target feature point, and the cross-ratio αi1 of si1 with a1, a2, a3 is computed. Then two further groups of feature points in the same rows are chosen, the three points of each group lying in the same columns as a1, a2, a3, and on the line through each group the point whose cross-ratio with that group's three feature points equals αi1 is found: points si2 and si3 are found on the lines through the other two groups of vertically collinear feature points such that the cross-ratio of si2 with b1, b2, b3 and the cross-ratio of si3 with c1, c2, c3 both equal αi1; si2 and si3 are the other two virtual target feature points.
The planar target of the present embodiment carries only nine feature points, so the chosen feature points lie in adjacent rows and columns; with a target carrying more than nine feature points, the feature points may be chosen from every other row or column.
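The construction in step 205, finding the point on a line whose cross-ratio with three given collinear points equals a prescribed value, reduces to an equation that is linear in the point's parameter along the line. A minimal sketch follows; the cross-ratio ordering convention used here is our assumption, since the patent does not spell one out:

```python
import numpy as np

def cross_ratio(p1, p2, p3, p4):
    """(p1,p2;p3,p4) = ((t3-t1)(t4-t2)) / ((t3-t2)(t4-t1)), with t the
    signed parameter of each point along the common line."""
    pts = np.asarray([p1, p2, p3, p4], float)
    d = (pts[1] - pts[0]) / np.linalg.norm(pts[1] - pts[0])
    t1, t2, t3, t4 = (pts - pts[0]) @ d
    return ((t3 - t1) * (t4 - t2)) / ((t3 - t2) * (t4 - t1))

def point_from_cross_ratio(p1, p2, p3, alpha):
    """Fourth collinear point p4 with (p1,p2;p3,p4) = alpha: the
    cross-ratio equation is solved for the parameter t4."""
    p1, p2, p3 = (np.asarray(p, float) for p in (p1, p2, p3))
    d = (p2 - p1) / np.linalg.norm(p2 - p1)
    t1, t2, t3 = 0.0, (p2 - p1) @ d, (p3 - p1) @ d
    t4 = (alpha * (t3 - t2) * t1 - (t3 - t1) * t2) / \
         (alpha * (t3 - t2) - (t3 - t1))
    return p1 + t4 * d

# as in step 205: reproduce on b1, b2, b3 the cross-ratio chosen on a1, a2, a3
a1, a2, a3, s1 = (0, 0), (0, 60), (0, 120), (0, 150)
alpha = cross_ratio(a1, a2, a3, s1)
b1, b2, b3 = (60, 0), (60, 60), (60, 120)
s2 = point_from_cross_ratio(b1, b2, b3, alpha)   # the second virtual point
```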
Step 206: obtain the stripe point coordinates as the intersections of the light stripe in the image with the line through the three collinear virtual target feature points.
In the image plane coordinate system, si1, si2 and si3 lie on a line li by construction. The equation of li can be obtained from the coordinates of any two of them, and li intersects the stripe at two points pi1 and pi2; from the stripe coordinates extracted in step 204 and the equation of li, the coordinates of pi1 and pi2 can be computed. In the actual calibration, if more than one stripe point is obtained for a chosen cross-ratio, it is also acceptable to calibrate with only one of them.
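Since the stripe is available only as a chain of sub-pixel centre points, the intersections with the line li can be found by locating sign changes of the signed distance to the line and interpolating. The patent does not prescribe a particular intersection routine, so the following is one plausible sketch:

```python
import numpy as np

def line_stripe_intersections(s1, s2, stripe_pts):
    """Intersections of the line through s1, s2 with a stripe given as an
    ordered chain of sub-pixel centre points: locate the sign changes of
    the signed distance to the line and interpolate linearly."""
    s1, s2 = np.asarray(s1, float), np.asarray(s2, float)
    d = s2 - s1
    n = np.array([-d[1], d[0]]) / np.linalg.norm(d)   # unit line normal
    pts = np.asarray(stripe_pts, float)
    sd = (pts - s1) @ n                               # signed distances
    hits = []
    for i in range(len(sd) - 1):
        if sd[i] == 0.0:
            hits.append(pts[i])
        elif sd[i] * sd[i + 1] < 0:
            w = sd[i] / (sd[i] - sd[i + 1])
            hits.append(pts[i] + w * (pts[i + 1] - pts[i]))
    return np.array(hits)

# demo: a curved stripe (parabola) crossed by the horizontal line y = 4;
# the two intersections lie near x = -2 and x = +2
xs = np.arange(-3.0, 3.1, 0.75)
stripe = np.column_stack([xs, xs ** 2])
hits = line_stripe_intersections((0.0, 4.0), (1.0, 4.0), stripe)
```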
Step 207: compute the cross-ratios of the stripe points obtained in step 206 with the three collinear virtual target feature points, namely the cross-ratio βi1 of pi1 with si1, si2, si3 and the cross-ratio βi2 of pi2 with si1, si2, si3.
Step 208: obtain the target-frame coordinates of the actual stripe points corresponding to the stripe points in the image plane, and transform them into the camera coordinate system. First, using the principle that the cross-ratio between a virtual target feature point and the three feature points collinear with it remains unchanged under perspective transformation, together with the target-frame coordinates of the points a1', a2', a3', b1', b2', b3', c1', c2', c3' corresponding to the feature points a1, a2, a3, b1, b2, b3, c1, c2, c3 and the cross-ratio αi1, the target-frame coordinates of the actual points si1', si2', si3' corresponding to the virtual target feature points si1, si2, si3 in the image plane coordinate system are determined. Then, using the principle that the cross-ratio between a stripe point and the three virtual target feature points collinear with it remains unchanged under perspective transformation, the coordinates of si1', si2', si3' in Ot-xtytzt and the cross-ratio βi1 determine the target-frame coordinates of the stripe point pi1' corresponding to pi1, and likewise the coordinates of si1', si2', si3' in Ot-xtytzt and the cross-ratio βi2 determine the target-frame coordinates of pi2' corresponding to pi2. Finally, the target-frame coordinates of the stripe points pi1' and pi2' are transformed into the camera coordinate system with the rotation matrix R and translation vector T obtained in step 203.
In the present invention, the property that both the cross-ratio of a virtual target feature point with the feature points in the image plane, and the cross-ratio of a stripe point with the virtual target feature points, remain unchanged under perspective transformation is referred to as the two-dimensional cross-ratio invariance principle.
Step 209: repeat steps 205 to 208 to obtain the camera-frame coordinates of a number of stripe points of the corresponding stripe with the planar target at the same position.
Step 210: move the planar target several times without constraint, and after each move repeat steps 202 to 209, obtaining the camera-frame coordinates of stripe points of the stripes projected by the structured light onto the target plane at the several positions.
Step 211: fit the equation of the structured light in the camera coordinate system to all of the stripe point camera-frame coordinates obtained through steps 208 to 210, and store the fitted equation for later use in measurement.
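The model fitted in step 211 depends on the light pattern; for the common line-structured-light case the structured light equation is a plane, which can be fitted to the accumulated camera-frame stripe points by total least squares. The sketch below covers that special case only (the conic light of this embodiment would need a quadric model instead), and all names are ours:

```python
import numpy as np

def fit_light_plane(points):
    """Total-least-squares plane n . x + d = 0 through 3D stripe points:
    the normal is the right singular vector of the smallest singular
    value of the centred point matrix."""
    P = np.asarray(points, float)
    c = P.mean(axis=0)
    _, _, Vt = np.linalg.svd(P - c)
    n = Vt[-1]                      # unit normal of the best-fit plane
    return n, -float(n @ c)

# synthetic stripe points lying exactly on the plane 2x - y - z + 5 = 0
xy = np.array([[x, y] for x in range(-2, 3) for y in range(-2, 3)], float)
pts = np.column_stack([xy, 2 * xy[:, 0] - xy[:, 1] + 5])
n, d = fit_light_plane(pts)
residual = np.abs(pts @ n + d).max()   # ~0 for noise-free points
```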
In the present embodiment, the planar target is moved three times without constraint. Following the methods of steps 201 and 202, the planar target images used for calibrating the sensor structural parameters are obtained as shown in Fig. 5, Fig. 6 and Fig. 7, and the intrinsic parameters of the camera are computed according to the prior art; the intrinsic matrix and distortion coefficients are respectively:
A = | 2868.434082  0            826.6427   |
    | 0            2868.270264  572.651062 |
    | 0            0            1          |

(k1, k2, p1, p2) = (-0.145619, 0.218327, -0.000452, -0.000743).
According to the method of step 203, the rotation matrix R1 and the translation vector T1 from the target coordinate system Ot-xtytzt at the position shown in Fig. 5 to the camera coordinate system Oc-xcyczc are computed as:

R1 = |  0.959072  -0.077754  -0.272278 |
     | -0.052515   0.896039  -0.440859 |
     |  0.278251   0.437114   0.855283 |

T1 = ( -59.926098, -51.798515, 356.107635 )^T
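As a quick numerical sanity check on the published extrinsics (the helper name below is ours): R1 should be orthonormal with determinant +1, and a stripe point given in the target frame (zt = 0 on the planar target) maps into the camera frame as xc = R1 xt + T1:

```python
import numpy as np

# R1 and T1 as published above (Fig. 5 position)
R1 = np.array([[ 0.959072, -0.077754, -0.272278],
               [-0.052515,  0.896039, -0.440859],
               [ 0.278251,  0.437114,  0.855283]])
T1 = np.array([-59.926098, -51.798515, 356.107635])

# a valid rotation matrix satisfies R^T R = I and det R = +1
orth_err = np.abs(R1.T @ R1 - np.eye(3)).max()

def target_to_camera(xt, yt):
    """Map a point on the target plane (zt = 0) into the camera frame."""
    return R1 @ np.array([xt, yt, 0.0]) + T1
```

The target origin itself maps to T1, i.e. roughly 356 mm in front of the camera, which is consistent with the camera-frame z values in the tables below.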
According to the methods of steps 205 to 208, with the target at the position shown in Fig. 5, the image coordinates and camera-frame coordinates of a number of stripe points on the formed stripe are obtained as shown in Table 1:
Cross-ratio chosen to construct the virtual points | Cross-ratio of the stripe point with the virtual points | Image coordinates of the stripe point | Camera-frame coordinates of the actual stripe point
8.50 -13.9655 (209.49,553.00) (-77.1,-0.08,377.3)
8.70 -3.9891 (211.92,586.00) (-76.89,4.05,379.5)
8.60 4.1108 (217.81,479.96) (-76.11,-9.24,372.89)
10.00 -1.4406 (227.42,649.85) (-75.32,12.09,384.14)
9.60 2.1934 (240.41,415.21) (-73.6,-17.44,369.46)
13.60 -0.8053 (254.28,705.83) (-72.51,19.23,388.72)
12.00 1.6930 (271.40,365.29) (-70.17,-23.88,367.24)
15.00 1.5317 (292.32,339.29) (-67.84,-27.29,366.22)
20.00 1.4368 (311.08,320.09) (-65.74,-29.85,365.57)
30.00 1.3633 (330.65,302.53) (-63.53,-32.22,365.05)
-22.00 -0.3401 (353.86,805.20) (-61.65,32.47,398.99)
-11.50 -0.3015 (379.17,819.65) (-58.81,34.52,400.96)
-7.00 -0.2710 (407.17,832.56) (-55.62,36.41,402.95)
-9.00 1.1726 (440.14,242.40) (-50.9,-40.85,364.62)
-6.00 1.1468 (473.93,232.71) (-46.9,-42.42,365.09)
-4.00 1.1290 (508.03,226.36) (-42.81,-43.57,365.8)
-1.70 -0.2397 (542.97,855.55) (-39.54,40.61,410.23)
-2.00 1.1163 (578.98,225.37) (-34.11,-44.36,368.16)
-1.40 1.1176 (613.00,228.95) (-29.85,-44.19,369.6)
-0.55 -0.3241 (648.12,834.36) (-26.37,38.68,413.43)
-0.70 1.1443 (680.17,248.60) (-21.27,-42.11,373.41)
-0.18 -0.4409 (706.91,806.82) (-18.7,35.38,414.17)
-0.05 -0.5263 (732.97,789.95) (-15.23,33.24,414.17)
0.05 -0.6322 (756.29,772.30) (-12.09,30.94,413.98)
-0.03 1.2725 (776.17,310.83) (-8.61,-34.2,381.52)
0.20 -0.9464 (794.36,734.31) (-6.86,25.83,413.01)
0.25 -1.2208 (811.12,712.08) (-4.52,22.77,412.18)
Table 1
According to the method of step 203, the rotation matrix R2 and the translation vector T2 from the target coordinate system Ot-xtytzt at the position shown in Fig. 6 to the camera coordinate system Oc-xcyczc are computed as:

R2 = |  0.973928  -0.038511  -0.223563 |
     | -0.082464   0.857968  -0.507040 |
     |  0.211336   0.512257   0.832424 |

T2 = ( -82.035706, -27.551346, 367.943481 )^T
According to the methods of steps 205 to 208, with the target at the position shown in Fig. 6, the image coordinates and camera-frame coordinates of a number of stripe points on the formed stripe are obtained as shown in Table 2:
Cross-ratio chosen to construct the virtual points | Cross-ratio of the stripe point with the virtual points | Image coordinates of the stripe point | Camera-frame coordinates of the actual stripe point
-62.90 1.2891 (207.47,468.05) (-97.75,12.25,387.96)
-50.30 1.5220 (208.67,526.01) (-97.38,19.17,392.28)
-41.60 1.1747 (215.55,424.87) (-97.06,7.03,384.97)
-21.90 2.1310 (225.92,599.03) (-95.19,27.86,398.16)
-15.00 2.7039 (239.37,630.80) (-93.58,31.65,400.9)
-13.50 1.0293 (254.70,342.78) (-93.14,-3.14,379.83)
-10.20 1.0026 (270.52,322.55) (-91.48,-5.73,378.7)
-7.80 0.9792 (288.94,302.93) (-89.51,-8.28,377.67)
-6.10 0.9613 (307.71,286.68) (-87.48,-10.45,376.9)
-4.00 -6.8682 (329.75,748.28) (-82.98,45.91,412.43)
-3.75 0.9324 (352.84,257.70) (-82.48,-14.49,375.78)
-2.45 -2.6138 (379.28,785.54) (-77.12,50.59,416.86)
-2.20 0.9123 (408.94,235.75) (-76.07,-17.86,375.45)
-1.35 -1.5969 (440.82,817.42) (-69.73,54.74,421.37)
-1.20 0.9020 (473.97,224.60) (-68.38,-20.06,376.17)
-0.85 0.9012 (507.03,224.67) (-64.36,-20.47,377)
-0.40 -1.2108 (542.03,841.67) (-57.28,58.19,426.82)
-0.30 0.9067 (578.97,234.13) (-55.38,-20.18,379.59)
-0.15 0.9131 (611.85,243.42) (-51.16,-19.4,381.2)
0.02 0.9226 (642.98,256.04) (-47.08,-18.15,383.06)
0.20 -1.5687 (674.98,827.96) (-40.27,56.96,430.64)
0.35 -1.8920 (704.93,816.84) (-36.34,55.6,430.87)
0.40 -2.4506 (730.95,802.92) (-32.9,53.84,430.72)
0.45 0.9907 (753.68,329.31) (-32,-9.73,392.24)
0.50 1.0158 (774.14,350.85) (-29.09,-7.04,394.65)
0.55 1.0444 (792.16,372.86) (-26.49,-4.25,397.05)
0.60 1.0817 (809.99,398.00) (-23.89,-1,399.73)
Table 2
According to the method described in step 203, the rotation matrix R_3 and the translation vector T_3 from the target coordinate system O_t-x_ty_tz_t, with the target at the position shown in Fig. 7, to the camera coordinate system O_c-x_cy_cz_c are calculated as:
R_3 = [  0.920771  -0.087683  -0.380122
        -0.125856   0.855539  -0.502209
         0.369244   0.510260   0.77672  ]
T_3 = [ -99.789154  -52.651661  362.485687 ]^T
According to the method described in steps 205 to 208, with the target at the position shown in Fig. 7, the coordinates in the image coordinate system and in the camera coordinate system of a plurality of light stripe points formed on the light stripe are obtained as shown in Table 3:
Selected cross-ratio for constructing the virtual points | Cross-ratio between the stripe point and the virtual points | Image coordinates of the stripe point | Camera-coordinate-system coordinates of the actual stripe point
-2.00 10.4213 (256.47,516.01) (-112.06,-7.09,385.94)
-1.95 -70.2624 (258.05,538.99) (-111.97,-4.35,387.75)
-1.88 2.4289 (266.24,432.07) (-110.78,-17.28,379.98)
-1.60 -2.1875 (276.68,610.11) (-110.25,4.05,394.03)
-1.52 1.7543 (290.85,375.90) (-108.01,-24.35,376.76)
-1.18 -1.0330 (305.74,665.16) (-107.37,10.52,399.62)
-1.13 1.4837 (322.81,331.83) (-104.47,-30.12,374.76)
-0.93 1.3969 (342.19,312.18) (-102.32,-32.8,374.08)
-0.62 -0.5117 (360.76,730.24) (-101.68,18.17,407.35)
-0.57 1.2805 (384.10,279.18) (-97.65,-37.49,373.33)
-0.26 -0.3448 (409.73,767.42) (-96.45,22.56,412.75)
-0.09 -0.2836 (440.09,784.83) (-93.13,24.63,415.71)
0.05 -0.2386 (470.86,799.31) (-89.7,26.35,418.5)
0.19 -0.2037 (504.09,811.67) (-85.95,27.83,421.3)
0.24 1.1394 (538.98,225.85) (-79.87,-46.84,375.99)
0.44 -0.1658 (575.97,826.15) (-77.6,29.52,426.48)
0.54 -0.1572 (611.95,829.44) (-73.3,29.88,428.81)
0.58 1.1394 (649.93,229.47) (-66.54,-48.47,381.47)
0.66 1.1542 (685.05,237.79) (-62.2,-48.05,383.85)
0.77 -0.1834 (717.86,817.53) (-60.18,28.11,434.09)
0.79 1.2062 (752.22,262.56) (-53.75,-46.06,389.28)
0.88 -0.2449 (780.80,794.61) (-52.02,24.89,436)
0.89 1.2934 (808.74,295.35) (-46.48,-42.72,395)
0.92 1.3542 (832.12,313.83) (-43.44,-40.64,397.83)
0.98 -0.4173 (853.44,747.44) (-42.22,18.22,436.48)
1.00 -0.5069 (872.40,729.38) (-39.59,15.66,436.11)
1.01 1.6451 (890.19,373.85) (-35.79,-33.44,406.23)
Table 3
Finally, the structured light equation fitted from the coordinates, in the camera coordinate system, of all of the obtained light stripe points is:
0.085x² + 0.2086y² + 0.5688z² + 0.2016xy - 0.6220yz - 0.3569xz + 151.6199x + 261.2671y - 477.3929z - 100000 = 0
After the fitted structured light equation has been obtained, the structured light vision sensor can invoke the resulting equation during the measurement process.
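The fitted surface is a general quadric A x² + B y² + C z² + D xy + E yz + F xz + G x + H y + I z + J = 0 in ten coefficients. One standard way to fit it to the stripe points is homogeneous least squares: build a design matrix of monomials and take the right singular vector belonging to the smallest singular value. The sketch below (Python with NumPy; not from the patent, and demonstrated on a synthetic unit sphere rather than the calibration data above) illustrates the idea:

```python
import numpy as np

def fit_quadric(pts):
    """Fit A x^2 + B y^2 + C z^2 + D xy + E yz + F xz + G x + H y + I z + J = 0
    to 3-D points by homogeneous least squares (smallest-singular-value vector)."""
    x, y, z = pts[:, 0], pts[:, 1], pts[:, 2]
    M = np.column_stack([x*x, y*y, z*z, x*y, y*z, x*z, x, y, z, np.ones_like(x)])
    _, _, Vt = np.linalg.svd(M)
    return Vt[-1]  # coefficient vector, defined up to scale

# Synthetic check: points on the unit sphere x^2 + y^2 + z^2 - 1 = 0.
rng = np.random.default_rng(0)
p = rng.normal(size=(200, 3))
p /= np.linalg.norm(p, axis=1, keepdims=True)

coef = fit_quadric(p)
coef /= coef[0]  # fix the arbitrary scale so the x^2 coefficient is 1
expected = np.array([1, 1, 1, 0, 0, 0, 0, 0, 0, -1], dtype=float)
assert np.allclose(coef, expected, atol=1e-6)
```

In practice one would stack the camera-frame stripe points from Tables 1 to 3 into `pts`; because the coefficients are only defined up to scale, the constant term can afterwards be normalized (e.g. to -100000 as in the equation above).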
The above is merely a preferred embodiment of the present invention and is not intended to limit the protection scope of the present invention.

Claims (11)

1. A method for calibrating the structural parameters of a structured light sensor based on two-dimensional cross-ratio invariance, characterized in that the method comprises:
a. establishing a camera coordinate system and an image plane coordinate system;
b. establishing a target coordinate system, obtaining a planar target image used for calibrating the structural parameters of the sensor, and obtaining the coordinates, in the camera coordinate system, of a plurality of light stripe points of the light stripe projected by the structured light onto the target plane;
c. obtaining the coordinates, in the camera coordinate system, of a plurality of light stripe points of the light stripes projected by the structured light onto the target plane a plurality of times, and fitting the structured light equation from all of the obtained light stripe point coordinates.
2. The method according to claim 1, characterized in that, before step b, the method comprises: placing a planar target bearing feature points, without restriction, in the measurement region of the structured light vision sensor, the light stripe being formed on the planar target by the projection of the structured light.
3. The method according to claim 1, characterized in that obtaining the planar target image used for calibrating the structural parameters of the sensor in step b comprises: capturing the planar target image with a camera, and performing distortion correction on the captured planar target image.
4. The method according to claim 3, characterized in that the captured planar target image comprises the light stripe and at least nine non-collinear feature points, the feature points being arranged in a rectangular grid of at least three rows and three columns.
5. The method according to claim 4, characterized in that obtaining, in step b, the coordinates in the camera coordinate system of a plurality of light stripe points of the light stripe projected by the structured light onto the target plane comprises:
b1. constructing three collinear virtual target feature points in the image plane, the intersection of the straight line through the three collinear virtual target feature points with the structured light stripe in the image giving the coordinates of a light stripe point;
b2. calculating the cross-ratio between each light stripe point obtained in step b1 and the three collinear virtual target feature points;
b3. obtaining the coordinates, in the target coordinate system, of the actual light stripe point corresponding to the light stripe point in the target image, and transforming them into the camera coordinate system;
b4. repeating steps b1 to b3 to obtain the coordinates, in the camera coordinate system, of a plurality of light stripe points on the stripe with the target at the same position.
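The cross-ratio used in steps b1 to b4 is, for four collinear points A, B, C, D, the quantity (AC·BD)/(BC·AD), and it is invariant under the perspective (projective) transformation between the target plane and the image plane. A brief illustration of this invariance (Python with NumPy; not part of the patent text, and the homography below is an arbitrary example, not sensor data):

```python
import numpy as np

def cross_ratio(a, b, c, d):
    """Cross-ratio (A,B;C,D) = (|AC|*|BD|) / (|BC|*|AD|) of four collinear 2-D points."""
    dist = lambda p, q: np.linalg.norm(np.asarray(p) - np.asarray(q))
    return (dist(a, c) * dist(b, d)) / (dist(b, c) * dist(a, d))

# Four collinear points at parameters 0, 1, 2, 3 along the line y = x.
pts = [np.array([t, t], dtype=float) for t in (0.0, 1.0, 2.0, 3.0)]
cr_before = cross_ratio(*pts)  # (2*2)/(1*3) = 4/3

# Apply a projective transformation (homography) to all four points.
H = np.array([[1.0,  0.2,  5.0],
              [0.1,  1.0, -3.0],
              [1e-3, 2e-3, 1.0]])

def apply_h(p):
    q = H @ np.array([p[0], p[1], 1.0])
    return q[:2] / q[2]

cr_after = cross_ratio(*[apply_h(p) for p in pts])
assert abs(cr_before - 4.0 / 3.0) < 1e-12
assert abs(cr_after - cr_before) < 1e-9  # the cross-ratio is preserved
```

This invariance is what allows the cross-ratio measured in the image (step b2) to be carried over to the target plane in step b3.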
6. The method according to claim 5, characterized in that, before step b1, the method further comprises obtaining the light stripe in the image, the coordinates of the light stripe point in step b1 being calculated from the obtained light stripe and the straight-line equation determined by the three collinear virtual target feature points.
7. The method according to claim 5, characterized in that, before step b1, the method further comprises obtaining the transformation relation from the target coordinate system to the camera coordinate system, namely the translation vector and the rotation matrix, the transformation in step b3 of the coordinates of the light stripe point from the target coordinate system into the camera coordinate system being performed according to the obtained translation vector and rotation matrix.
8. The method according to claim 5, characterized in that constructing the three collinear virtual target feature points in step b1 comprises:
b11. selecting three collinear feature points, taking an arbitrary point on the straight line through the three collinear feature points as a virtual target feature point, and calculating the cross-ratio between this point and the three collinear feature points;
b12. selecting two further groups of feature points in the same rows/columns, the three feature points of each group lying in the same columns/rows as the three collinear feature points of step b11 respectively, and finding on the straight line through each group of three collinear feature points the point whose cross-ratio with the three collinear feature points of that group equals the cross-ratio calculated in step b11, each such point being a virtual target feature point.
9. The method according to claim 1, characterized in that obtaining, in step c, the coordinates of the light stripes projected onto the target plane a plurality of times comprises: moving the planar target a plurality of times without restriction, and repeating step b each time the planar target is moved.
10. The method according to any one of claims 1 to 4, characterized in that the structured light in step c is structured light of an arbitrary pattern.
11. The method according to claim 5 or 8, characterized in that obtaining, in step b3, the coordinates in the target coordinate system of the actual light stripe point corresponding to the light stripe point in the target image comprises: first, according to the principle that the cross-ratio between a virtual target feature point and the three feature points collinear with it remains invariant under perspective transformation, obtaining the coordinates, in the target coordinate system, of the points on the target corresponding to the three virtual target feature points in the target image; and then, according to the principle that the cross-ratio between the light stripe point in the target image and the three virtual target feature points collinear with it remains invariant under perspective transformation, obtaining the coordinates, in the target coordinate system, of the actual light stripe point corresponding to the light stripe point in the target image.
CNB2008100818734A 2007-10-26 2008-05-13 Method for calibrating structural parameters of a structured light sensor based on two-dimensional cross-ratio invariance Expired - Fee Related CN100565097C (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CNB2008100818734A CN100565097C (en) 2007-12-29 2008-05-13 Method for calibrating structural parameters of a structured light sensor based on two-dimensional cross-ratio invariance
US12/258,398 US8078025B2 (en) 2007-10-26 2008-10-25 Vehicle dynamic measurement device and method for comprehensive parameters of rail wear

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
CN200710308568 2007-12-29
CN200710308568.X 2007-12-29
CNB2008100818734A CN100565097C (en) 2007-12-29 2008-05-13 Method for calibrating structural parameters of a structured light sensor based on two-dimensional cross-ratio invariance

Publications (2)

Publication Number Publication Date
CN101363713A true CN101363713A (en) 2009-02-11
CN100565097C CN100565097C (en) 2009-12-02

Family

ID=40390217

Family Applications (1)

Application Number Title Priority Date Filing Date
CNB2008100818734A Expired - Fee Related CN100565097C (en) 2007-10-26 2008-05-13 A kind of based on the constant structured light sensor structural parameters calibration method of two-dimentional double ratio

Country Status (1)

Country Link
CN (1) CN100565097C (en)

Cited By (27)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101526336B (en) * 2009-04-20 2011-08-24 陈炳生 Calibration method of linear structured light three-dimensional visual sensor based on measuring blocks
CN101943563B (en) * 2010-03-26 2012-04-25 天津大学 Rapid calibration method of line-structured light vision sensor based on space plane restriction
CN101943563A (en) * 2010-03-26 2011-01-12 天津大学 Rapid calibration method of line-structured light vision sensor based on space plane restriction
CN102927908A (en) * 2012-11-06 2013-02-13 中国科学院自动化研究所 Robot eye-on-hand system structured light plane parameter calibration device and method
CN102927908B (en) * 2012-11-06 2015-04-22 中国科学院自动化研究所 Robot eye-on-hand system structured light plane parameter calibration device and method
CN103411553B (en) * 2013-08-13 2016-03-02 天津大学 The quick calibrating method of multi-linear structured light vision sensors
CN103411553A (en) * 2013-08-13 2013-11-27 天津大学 Fast calibration method of multiple line structured light visual sensor
CN104848801A (en) * 2015-06-05 2015-08-19 北京航空航天大学 Line structure light vision sensor calibration method based on parallel bicylindrical target
CN104848801B (en) * 2015-06-05 2017-06-13 北京航空航天大学 A kind of line structured light vision sensor calibration method based on parallel bicylindrical target
CN106524943A (en) * 2016-11-10 2017-03-22 华南理工大学 Three-dimensional reconstruction device and method of dual-rotation laser
CN106989669A (en) * 2017-02-16 2017-07-28 上海大学 Big visual field high-precision vision system calibrating method based on virtual three-dimensional target
CN107255443A (en) * 2017-07-14 2017-10-17 北京航空航天大学 Binocular vision sensor field calibration method and device under a kind of complex environment
CN107255443B (en) * 2017-07-14 2020-09-01 北京航空航天大学 Method and device for calibrating binocular vision sensor in site in complex environment
CN107560549A (en) * 2017-08-29 2018-01-09 哈尔滨理工大学 A kind of laser vision two-dimension displacement measuring system practicality calibration technique scheme
CN107560549B (en) * 2017-08-29 2020-05-08 哈尔滨理工大学 Calibration method of laser vision two-dimensional displacement measurement system
CN108827157A (en) * 2018-08-31 2018-11-16 广州视源电子科技股份有限公司 Method of calibration, device, system, equipment and the storage medium of laser measurement
CN111156900B (en) * 2018-11-08 2021-07-13 中国科学院沈阳自动化研究所 Line-of-depth linear light measurement method for bullet primer assembly
CN111156900A (en) * 2018-11-08 2020-05-15 中国科学院沈阳自动化研究所 Line-of-depth linear light measurement method for bullet primer assembly
CN111179351A (en) * 2018-11-13 2020-05-19 北京图森智途科技有限公司 Parameter calibration method and device and processing equipment thereof
CN111179351B (en) * 2018-11-13 2023-07-14 北京图森智途科技有限公司 Parameter calibration method and device and processing equipment thereof
CN110470320B (en) * 2019-09-11 2021-03-05 河北科技大学 Calibration method of swinging scanning type line structured light measurement system and terminal equipment
CN110470320A (en) * 2019-09-11 2019-11-19 河北科技大学 The scaling method and terminal device of oscillatory scanning formula line-structured light measuring system
CN111256591A (en) * 2020-03-13 2020-06-09 易思维(杭州)科技有限公司 External parameter calibration device and method for structured light sensor
CN113686262A (en) * 2021-08-13 2021-11-23 桂林电子科技大学 Line structure optical scanner calibration method and device and storage medium
CN113686262B (en) * 2021-08-13 2022-10-11 桂林电子科技大学 Line structure optical scanner calibration method and device and storage medium
CN113834488A (en) * 2021-11-25 2021-12-24 之江实验室 Robot space attitude calculation method based on remote identification of structured light array
CN113834488B (en) * 2021-11-25 2022-03-25 之江实验室 Robot space attitude calculation method based on remote identification of structured light array

Also Published As

Publication number Publication date
CN100565097C (en) 2009-12-02

Similar Documents

Publication Publication Date Title
CN100565097C (en) Method for calibrating structural parameters of a structured light sensor based on two-dimensional cross-ratio invariance
Huang et al. Research on multi-camera calibration and point cloud correction method based on three-dimensional calibration object
CN101526336B (en) Calibration method of linear structured light three-dimensional visual sensor based on measuring blocks
CN102927908B (en) Robot eye-on-hand system structured light plane parameter calibration device and method
CN102155923B (en) Splicing measuring method and system based on three-dimensional target
Zhou et al. Complete calibration of a structured light stripe vision sensor through planar target of unknown orientations
Zhao et al. Calibration for stereo vision system based on phase matching and bundle adjustment algorithm
CN101109620A (en) Method for standardizing structural parameter of structure optical vision sensor
Wei et al. A novel 1D target-based calibration method with unknown orientation for structured light vision sensor
CN106989695A (en) A kind of projector calibrating method
CN102564348A (en) Systematic geometric demarcation method for reflection three-dimensional measurement of stripe
CN203231736U (en) Specular object measurement device based on binocular vision
CN102032878A (en) Accurate on-line measurement method based on binocular stereo vision measurement system
CN104567727A (en) Three-dimensional target and global unified calibration method for linear structured light profile sensor
Huang et al. A new reconstruction method based on fringe projection of three-dimensional measuring system
CN104111039A (en) Calibrating method for randomly placing fringe projection three-dimensional measuring system
Liao et al. Flexible calibration method for line-scan cameras using a stereo target with hollow stripes
Wei et al. Calibration method for line structured light vision sensor based on vanish points and lines
Xu et al. 3D multi-directional sensor with pyramid mirror and structured light
CN104380036A (en) Synthesis-parameter generation device for three-dimensional measurement apparatus
Pinto et al. Regular mesh measurement of large free form surfaces using stereo vision and fringe projection
Zhou et al. Constructing feature points for calibrating a structured light vision sensor by viewing a plane from unknown orientations
CN109443214A (en) A kind of scaling method of structured light three-dimensional vision, device and measurement method, device
Li et al. 3D shape measurement based on structured light projection applying polynomial interpolation technique
CN107957251A (en) Reflecting sphere generalization detection method based on computer-assisted correction

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20091202

Termination date: 20190513

CF01 Termination of patent right due to non-payment of annual fee