CN100565097C - Method for calibrating the structural parameters of a structured-light sensor based on two-dimensional cross-ratio invariance - Google Patents


Info

Publication number
CN100565097C
CN100565097C CNB2008100818734A CN200810081873A
Authority
CN
China
Prior art keywords
target
light stripe
point
coordinate system
coordinate
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CNB2008100818734A
Other languages
Chinese (zh)
Other versions
CN101363713A (en)
Inventor
Zhang Guangjun (张广军)
Sun Junhua (孙军华)
Liu Qianzhe (刘谦哲)
Wei Zhenzhong (魏振忠)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beihang University
Original Assignee
Beihang University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beihang University filed Critical Beihang University
Priority to CNB2008100818734A priority Critical patent/CN100565097C/en
Priority to US12/258,398 priority patent/US8078025B2/en
Publication of CN101363713A publication Critical patent/CN101363713A/en
Application granted granted Critical
Publication of CN100565097C publication Critical patent/CN100565097C/en
Expired - Fee Related legal-status Critical Current
Anticipated expiration legal-status Critical


Abstract

The present invention discloses a method for calibrating the structural parameters of a structured-light sensor based on two-dimensional cross-ratio invariance. The method comprises: establishing a camera coordinate system and an image-plane coordinate system; establishing a target coordinate system, obtaining planar-target images for calibrating the sensor's structural parameters, and obtaining the camera-frame coordinates of multiple points on the light stripe that the structured light projects onto the target plane; obtaining such stripe points for repeated projections onto the target plane; and fitting the structured-light equation to all of the obtained stripe-point coordinates. By exploiting two-dimensional cross-ratio invariance, the method can obtain an arbitrary number of calibration points at each position of the planar target, and the structured light to be calibrated can be of any pattern.

Description

Method for calibrating the structural parameters of a structured-light sensor based on two-dimensional cross-ratio invariance
Technical field
The present invention relates to vision measurement technology, and in particular to a method for calibrating the structural parameters of a structured-light sensor based on two-dimensional cross-ratio invariance.
Background art
Structured-light vision sensing is a non-contact measurement technique with a relatively large measurement range, fast measurement speed, good system flexibility and moderate accuracy, and it is widely used in fields such as three-dimensional reconstruction, on-line product inspection and reverse engineering.
A structured-light sensor is a vision measurement system based on the optical triangulation principle. A structured-light projector casts light of a certain pattern onto the object under measurement, whose surface modulates it into a light stripe; a camera captures an image of the object surface containing the modulated stripe, and the three-dimensional coordinates of the object surface are then obtained from the camera imaging model and the spatial relationship between the camera and the structured light.
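As an illustration of the triangulation principle (not part of the original patent text), the following Python sketch intersects the camera ray through an image point with a calibrated light plane, i.e. the simplest, single-line structured-light case; the function name and the plane parameterization are assumptions of this sketch, and K is the camera intrinsic matrix.

```python
import numpy as np

def triangulate_on_light_plane(u, v, K, plane):
    """Intersect the camera ray through pixel (u, v) with a calibrated light
    plane a*x + b*y + c*z + d = 0 given in the camera frame; returns the 3D
    point on the measured surface, also in the camera frame."""
    ray = np.linalg.solve(K, np.array([u, v, 1.0]))   # ray direction; camera centre at origin
    a, b, c, d = plane
    s = -d / (a * ray[0] + b * ray[1] + c * ray[2])   # scale at which the ray meets the plane
    return s * ray
```

For second-order structured light, such as the conical pattern of the embodiment below, the same idea applies with the fitted quadric in place of the plane, which makes the intersection a quadratic in s.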
The structured light projected by a sensor can be of different modes, such as point structured light, line structured light (single-line or multi-line), circular structured light and conical structured light. Determining the spatial relationship between structured light of a given mode and the camera, i.e. calibrating the structural parameters of the structured-light sensor, is the first problem to be solved.
Commonly used methods for calibrating the structural parameters of a structured-light sensor include the stretched-filament method, the sawtooth target method, the cross-ratio-invariance method based on a 3D target, and calibration methods based on a planar target. Specifically:
In the stretched-filament method, the structured light is projected onto several non-coplanar filaments stretched in space, and the filaments scatter the light to form bright spots. The spatial coordinates of the bright spots are measured with an electronic theodolite, and the position parameters between the structured light and the camera are solved from the image coordinates of the bright spots and their measured spatial coordinates. This method requires manual measurement of the bright-spot coordinates, is cumbersome to operate, has relatively large measurement error, and needs auxiliary equipment.
The sawtooth target method was proposed by Duan Fajie et al. in "A new method for calibrating the structural parameters of a line-structured-light sensor", Chinese Journal of Scientific Instrument, 2000, 21(1): 108-110. It achieves simple, high-accuracy calibration of a line-structured-light sensor with a sawtooth calibration target and a one-dimensional translation stage, and needs no other instrument to measure points on the light plane. However, the attitude of the stage or of the sensor must be adjusted so that the light plane is perpendicular to the tooth ridges, which complicates the operation; moreover, machining a sawtooth target is costly, the tooth faces are limited, and few calibration points are obtained.
The cross-ratio-invariance method based on a 3D target was proposed by Xu Guangyou et al. in "A new calibration method for a three-dimensional vision system based on structured light", Chinese Journal of Computers, 1995, Vol. 18, No. 6. It uses a high-accuracy stereo target with two mutually perpendicular planes, obtains at least three collinear feature points on each plane, and uses the cross-ratio invariance principle to obtain the intersections of the structured-light stripe with the lines through these known points. The required 3D target is expensive to machine, occlusion easily occurs between the two perpendicular planes, and few calibration points are obtained.
At present, the simple and effective calibration methods in common use are based on a two-dimensional planar target. Among them, the cross-ratio-invariance method based on a planar target was proposed by Zhou Fuqiang in "Complete calibration of a structured light stripe vision sensor through planar target of unknown orientations", Image and Vision Computing, Volume 23, Issue 1, January 2005, Pages 59-67. This method uses the cross-ratio invariance principle to obtain calibration points, but the target carries only a few rows of feature points, typically 3 to 10, and one row of feature points (at least three) yields only one calibration point. The method is therefore suitable only for calibrating line-structured light; calibrating non-linear structured light would produce large fitting errors.
Chinese invention patent CN200510013231.7 proposes a fast calibration method for a line-structured-light sensor based on a coplanar reference object, in which the line structured light is calibrated from the intersection of the coplanar reference object with the plane determined by the camera's optical centre and the straight stripe image. The method needs no auxiliary equipment, has no occlusion problem and is simple to operate, but it can only calibrate line-structured light or multi-line structured light.
In summary, existing structured-light calibration methods extract few calibration points and can only calibrate line-structured light of specific patterns.
Summary of the invention
In view of this, the main purpose of the present invention is to provide a method for calibrating the structural parameters of a structured-light sensor based on two-dimensional cross-ratio invariance, with which more calibration points can be obtained and structured light of any pattern can be calibrated.
To achieve the above purpose, the technical solution of the present invention is realized as follows:
A method for calibrating the structural parameters of a structured-light sensor based on two-dimensional cross-ratio invariance, the method comprising:
a. establishing a camera coordinate system and an image-plane coordinate system;
b. establishing a target coordinate system; capturing an image of the planar target with the camera and correcting the captured image for distortion; then extracting the coordinates of at least four non-collinear feature points from the distortion-corrected target image and, from the extracted feature-point coordinates, determining the transformation from the target coordinate system to the camera coordinate system; and extracting the light-stripe coordinates from the distortion-corrected target image;
c. constructing three collinear virtual target feature points in the image plane and obtaining the coordinates of the points where the line through the three collinear virtual points intersects the structured-light stripe in the image; computing the cross-ratio of each obtained stripe point with the three collinear virtual points; obtaining the target-frame coordinates of the actual stripe points corresponding to the stripe points in the image plane, and transforming them into the camera coordinate system;
d. repeating step c to obtain the camera-frame coordinates of multiple points on the stripe for the planar target at the same position;
e. moving the planar target freely several times and, after each move, repeating steps b to d, so as to obtain the camera-frame coordinates of multiple points on the stripes projected onto the target plane; and fitting the equation of the structured light in the camera coordinate system to the camera-frame coordinates of said stripe points.
The captured planar-target image comprises the light stripe and at least nine non-collinear feature points, the feature points being arranged in a rectangular grid of at least three rows and three columns.
Step c further comprises obtaining the stripe coordinates; the stripe-point coordinates are computed from the stripe coordinates and the equation of the straight line determined by the three collinear virtual target feature points.
The transformation from the target coordinate system to the camera coordinate system consists of a translation vector and a rotation matrix; the stripe points are converted from target-frame coordinates to camera-frame coordinates according to the obtained translation vector and rotation matrix.
Constructing the three collinear virtual target feature points in step c comprises:
c1. choosing three collinear feature points, taking an arbitrary point on the line through them as a virtual target feature point, and computing the cross-ratio of this point with the three collinear feature points;
c2. choosing two further groups of feature points in the same rows/columns, each group of three feature points lying in the same columns/rows as the three chosen collinear feature points, and finding on the line through each group the point whose cross-ratio with that group's three collinear feature points equals the computed cross-ratio; these points are the virtual target feature points.
The structured light may be structured light of any pattern.
Obtaining the target-frame coordinates of the actual stripe point corresponding to a stripe point in the target image means: first, according to the principle that the cross-ratio between a virtual target feature point and the three feature points collinear with it remains invariant under perspective transformation, obtaining the target-frame coordinates of the points on the target corresponding to the three virtual target feature points in the image; and then, according to the principle that the cross-ratio between a stripe point in the target image and the three virtual target feature points collinear with it remains invariant under perspective transformation, obtaining the target-frame coordinates of the actual stripe point corresponding to the stripe point in the target image.
The present method, based on two-dimensional cross-ratio invariance, obtains multiple stripe points with the planar target at one position by choosing different cross-ratios, and further stripe points at other positions by changing the target position; fitting all of the obtained stripe points yields the structured-light equation. Because different cross-ratios can be chosen, an arbitrary number of calibration points can be obtained at each position of the planar target, and the structured light to be calibrated can be of any pattern.
Brief description of the drawings
Fig. 1 is a structural diagram of the calibration system of the present invention;
Fig. 2 is a flowchart of the method of the present invention for calibrating the structural parameters of a structured-light sensor based on two-dimensional cross-ratio invariance;
Fig. 3 is a schematic diagram of the calibration principle of the present invention based on two-dimensional cross-ratio invariance;
Fig. 4 is a schematic diagram of the planar target of the embodiment;
Fig. 5, Fig. 6 and Fig. 7 are the planar-target images used for calibrating the sensor's structural parameters, with the planar target at three different positions.
Detailed description
The method of the present invention is based on two-dimensional cross-ratio invariance: with different cross-ratios, multiple stripe points can be obtained with the planar target at one position; by changing the target position, further stripe points are obtained at other positions; fitting all of the obtained stripe points then yields the structured-light equation.
The method is described in further detail below with reference to specific embodiments and the accompanying drawings.
Fig. 1 shows the structure of the calibration system of the present invention. As shown in Fig. 1, the system consists of a structured-light projector, a planar target and a camera: the projector casts structured light of a certain pattern onto the planar target, the target modulates the light into a stripe, and the camera captures images of the target containing the modulated stripe.
Taking conical structured light as an example, Fig. 2 is a flowchart of the method, which comprises the following steps:
Step 201: establish the image-plane coordinate system and the camera coordinate system.
Fig. 3 is a schematic diagram of the calibration principle. As shown in Fig. 3, the camera coordinate system O_c-x_cy_cz_c and the image-plane coordinate system O-UV are established according to the placement of the camera.
Step 202: establish the target coordinate system, capture an image of the planar target with the camera, and correct the captured image for distortion.
Specifically: switch on the laser projector and place the feature-point planar target, without constraint, in the measurement region of the structured-light vision sensor, so that the projected structured light forms a light stripe on the target. The planar target is prepared in advance: it is a two-dimensional plane carrying circular feature points, the number of feature points being 9 to 100, their diameter 3 mm to 100 mm with an accuracy of 0.001 mm to 0.01 mm, and their spacing 3 mm to 50 mm with an accuracy of 0.001 mm to 0.01 mm; the feature points may also have other shapes, such as squares.
Fig. 4 is a schematic diagram of the planar target of this embodiment, which carries nine circular feature points of diameter 5 mm; the target size is 150 mm x 150 mm, and the horizontal and vertical spacing between feature points is 60 mm.
As shown in Fig. 3, the target coordinate system O_t-x_ty_tz_t is established according to the placement of the planar target.
The planar-target image captured with the camera should contain the light stripe and at least nine non-collinear feature points arranged in a rectangular grid of at least three rows and three columns. The distortion correction is performed on the captured image according to the camera's intrinsic parameters; the specific correction procedure is prior art and is not repeated here.
Step 203: extract the coordinates of at least four non-collinear feature points from the distortion-corrected planar-target image of step 202 and, from the extracted feature-point coordinates, determine the transformation from the target coordinate system to the camera coordinate system.
The present invention determines the pixel positions of the feature points with a shape operator based on the Hessian matrix, then describes the grey-level surface in the neighbourhood of each corner with a second-order Taylor expansion, and finally obtains the feature-point coordinates by computing the saddle point of that surface.
The extraction method is described in detail in Chen Dazhi et al., "A New Sub-Pixel Detector for X-Corners in Camera Calibration Targets", WSCG'2005 Short Papers Proceedings, 13th International Conference in Central Europe on Computer Graphics, Visualization and Computer Vision, Plzen, Czech Republic, 2005.
The transformation from the target frame to the camera frame consists of the rotation matrix R and the translation vector T. Usually, from the image coordinates of at least four non-collinear feature points and their corresponding coordinates in O_t-x_ty_tz_t, a linear solution of the 3x3 homography matrix H between the image plane and the target plane is first obtained by least squares; the optimal H is then obtained with the Levenberg-Marquardt optimization method; finally, the rotation matrix R and the translation vector T from O_t-x_ty_tz_t to O_c-x_cy_cz_c are decomposed from H.
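A minimal sketch of this step, assuming a Zhang-style linear solution and decomposition (the Levenberg-Marquardt refinement of H is omitted); the function names are illustrative, not the patent's implementation:

```python
import numpy as np

def homography_dlt(target_pts, image_pts):
    """Linear (DLT) estimate of the 3x3 homography H mapping target-plane
    points (x_t, y_t) to image points (u, v); needs >= 4 non-collinear pairs."""
    rows = []
    for (x, y), (u, v) in zip(target_pts, image_pts):
        rows.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        rows.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    _, _, Vt = np.linalg.svd(np.asarray(rows, float))
    return Vt[-1].reshape(3, 3)        # smallest right singular vector

def pose_from_homography(H, K):
    """Decompose H = s * K [r1 r2 T] into the rotation R and translation T
    from the target frame to the camera frame."""
    B = np.linalg.solve(K, H)
    if B[2, 2] < 0:                    # choose the sign that puts the target in front
        B = -B
    s = 1.0 / np.linalg.norm(B[:, 0])
    r1, r2, T = s * B[:, 0], s * B[:, 1], s * B[:, 2]
    R = np.column_stack([r1, r2, np.cross(r1, r2)])
    U, _, Vt = np.linalg.svd(R)        # re-orthonormalize the rotation
    return U @ Vt, T
```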
The algorithms for computing the homography matrix H, the rotation matrix R and the translation vector T are described in detail in Z. Y. Zhang, "A flexible new technique for camera calibration", Microsoft Research Technical Report MSR-TR-98-71, 1998, and are not repeated here.
Step 204: extract the light-stripe coordinates from the distortion-corrected planar-target image of step 202.
This step computes the Hessian matrix at each image point; the eigenvector corresponding to the eigenvalue of largest absolute value gives the stripe normal direction, and together with the second derivative along that direction the sub-pixel stripe centre is finally obtained. For the extraction method see Carsten Steger, "Unbiased Extraction of Curvilinear Structures from 2D and 3D Images", PhD thesis, Technische Universität München, Germany, 1998.
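A compact sketch of the Hessian-based extraction this step describes (the core of Steger's detector, without its hysteresis thresholding and line-linking stages); the smoothing scale sigma and the bright-stripe test (negative second derivative) are choices of this sketch, not of the patent:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def stripe_centers(img, sigma=2.0):
    """Sub-pixel stripe centres: at each pixel take the eigenvector of the
    Hessian with the largest-magnitude eigenvalue as the stripe normal, then
    locate the grey-level extremum along it via a 2nd-order Taylor expansion."""
    img = np.asarray(img, float)
    rx  = gaussian_filter(img, sigma, order=(0, 1))   # first derivatives
    ry  = gaussian_filter(img, sigma, order=(1, 0))
    rxx = gaussian_filter(img, sigma, order=(0, 2))   # Hessian entries
    ryy = gaussian_filter(img, sigma, order=(2, 0))
    rxy = gaussian_filter(img, sigma, order=(1, 1))
    H = np.stack([np.stack([rxx, rxy], -1), np.stack([rxy, ryy], -1)], -2)
    w, v = np.linalg.eigh(H)                          # per-pixel eigen-decomposition
    idx = np.abs(w).argmax(axis=-1)                   # eigenvalue of largest magnitude
    n = np.take_along_axis(v, idx[..., None, None], -1)[..., 0]
    nx, ny = n[..., 0], n[..., 1]                     # stripe normal (nx, ny)
    d2 = rxx * nx**2 + 2 * rxy * nx * ny + ryy * ny**2
    t = -(rx * nx + ry * ny) / np.where(d2 == 0, 1, d2)
    ok = (d2 < 0) & (np.abs(t * nx) <= 0.5) & (np.abs(t * ny) <= 0.5)
    ys, xs = np.nonzero(ok)                           # extremum lies inside the pixel
    return np.column_stack([xs + (t * nx)[ok], ys + (t * ny)[ok]])
```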
Step 205: construct three collinear virtual target feature points in the image plane.
Specifically: first choose three collinear feature points, take an arbitrary point on the line through them as a virtual target feature point, and compute the cross-ratio of this point with the three feature points. As shown in Fig. 3, in the image-plane coordinate system O-UV an arbitrary point s_i1 on the line through the three vertically collinear feature points a1, a2, a3 is taken as a virtual target feature point, and the cross-ratio α_i1 of s_i1 with a1, a2, a3 is computed. Then choose two further groups of feature points in the same rows, each group of three feature points lying in its own column; on the line through each group, find the point whose cross-ratio with that group's three collinear feature points equals α_i1. That is, on the lines through the other two groups of vertically collinear feature points, points s_i2 and s_i3 are found such that the cross-ratio of s_i2 with b1, b2, b3 and the cross-ratio of s_i3 with c1, c2, c3 both equal α_i1; s_i2 and s_i3 are the other two virtual target feature points.
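The constructions in steps 205 and 207 reduce to one-dimensional cross-ratio arithmetic along a line. A minimal sketch, assuming the convention (p1, p2; p3, p4) = ((t3 - t1)(t4 - t2)) / ((t3 - t2)(t4 - t1)) for points with scalar parameters t along the line; the patent text does not fix an ordering, so the convention and the helper names are assumptions:

```python
import numpy as np

def cross_ratio(t1, t2, t3, t4):
    """Cross-ratio of four collinear points given by their scalar parameters."""
    return ((t3 - t1) * (t4 - t2)) / ((t3 - t2) * (t4 - t1))

def fourth_point(t1, t2, t3, r):
    """Parameter t4 such that cross_ratio(t1, t2, t3, t4) == r."""
    a, b = t3 - t1, t3 - t2
    return (a * t2 - r * b * t1) / (a - r * b)

def param_along(p, p0, d):
    """Scalar parameter of 2D point p on the line through p0 with unit direction d."""
    return float(np.dot(np.asarray(p, float) - np.asarray(p0, float), d))
```

With these helpers, α_i1 would be cross_ratio evaluated for s_i1 on the line a1a2a3, and s_i2, s_i3 follow from fourth_point applied along the lines b1b2b3 and c1c2c3.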
The planar target of this embodiment has only nine feature points, so the chosen feature points lie in adjacent rows and columns; if the target has more than nine feature points, feature points may also be chosen from every other row or column.
Step 206: obtain the coordinates of the stripe points as the intersections of the structured-light stripe with the line through the three collinear virtual target feature points in the image.
In the image-plane coordinate system, it follows from the geometry of the construction that s_i1, s_i2 and s_i3 lie on a straight line l_i; the equation of l_i can be obtained from the coordinates of any two of them. The line l_i intersects the stripe at two points p_i1 and p_i2, whose coordinates are computed from the stripe coordinates extracted in step 204 and the equation of l_i. In the actual calibration, if more than one stripe point is obtained for a chosen cross-ratio, it is also possible to use only one of them.
Step 207: compute the cross-ratios of the stripe points obtained in step 206 with the three collinear virtual target feature points, namely the cross-ratio β_i1 of p_i1 with s_i1, s_i2, s_i3, and the cross-ratio β_i2 of p_i2 with s_i1, s_i2, s_i3.
Step 208: obtain the target-frame coordinates of the actual stripe points corresponding to the stripe points in the image plane, and transform them into the camera coordinate system. According to the principle that the cross-ratio between a virtual target feature point and the three feature points collinear with it remains invariant under perspective transformation, the target-frame coordinates of the actual points s_i1', s_i2', s_i3' corresponding to the virtual feature points s_i1, s_i2, s_i3 are determined from the cross-ratio α_i1 and the target-frame coordinates a1', a2', a3', b1', b2', b3', c1', c2', c3' of the feature points a1, a2, a3, b1, b2, b3, c1, c2, c3. Then, according to the principle that the cross-ratio between a stripe point in the image and the three virtual target feature points collinear with it likewise remains invariant under perspective transformation: from the coordinates of s_i1', s_i2', s_i3' in the target coordinate system O_t-x_ty_tz_t and the cross-ratio β_i1, the target-frame coordinates of the stripe point p_i1' corresponding to p_i1 are determined; and from the same coordinates and the cross-ratio β_i2, the target-frame coordinates of the stripe point p_i2' corresponding to p_i2. Finally, the stripe points p_i1' and p_i2' are converted from target-frame to camera-frame coordinates with the rotation matrix R and the translation vector T obtained in step 203.
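Continuing the sketch given after step 205, step 208 might be expressed as follows; point_from_cross_ratio reuses the hypothetical fourth_point helper from that sketch, and the argument ordering inside the cross-ratio is again an assumption:

```python
import numpy as np

def point_from_cross_ratio(q1, q2, q3, beta):
    """Point on the target-plane line through q1, q2, q3 (2D target-frame
    coordinates) whose cross-ratio with them equals beta."""
    q1, q2, q3 = (np.asarray(q, float) for q in (q1, q2, q3))
    d = (q3 - q1) / np.linalg.norm(q3 - q1)           # unit direction of the line
    t = lambda q: float(np.dot(q - q1, d))            # scalar parameter along it
    return q1 + fourth_point(t(q1), t(q2), t(q3), beta) * d

def target_to_camera(p, R, T):
    """Lift a target-plane point (x_t, y_t), with z_t = 0, into the camera frame."""
    return R @ np.array([p[0], p[1], 0.0]) + T
```

Applied with (s_i1', s_i2', s_i3', β_i1) and (s_i1', s_i2', s_i3', β_i2), this would yield p_i1' and p_i2', which target_to_camera then converts using the R and T of step 203.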
In the present invention, the property that both the cross-ratio of the feature points with a virtual target feature point and the cross-ratio of the virtual target feature points with a stripe point remain invariant under perspective transformation is referred to as the two-dimensional cross-ratio invariance principle.
Step 209: repeat steps 205 to 208 to obtain the camera-frame coordinates of multiple points on the stripe for the planar target at the same position.
Step 210: move the planar target freely several times and, after each move, repeat steps 202 to 209 to obtain the camera-frame coordinates of multiple points on the stripes projected onto the target plane at each position.
Step 211: fit the equation of the structured light in the camera coordinate system to the camera-frame coordinates of all the stripe points obtained in steps 208 to 210, and store the fitted equation for use in measurement.
In this embodiment, the planar target is placed without constraint at three different positions. Following the methods of steps 201 and 202, the planar-target images used for calibrating the sensor's structural parameters are obtained as shown in Fig. 5, Fig. 6 and Fig. 7, and the camera's intrinsic parameters are computed according to the prior art; the intrinsic matrix and the distortion coefficients are respectively:
A = [ 2868.434082, 0, 826.6427; 0, 2868.270264, 572.651062; 0, 0, 1 ]
(k1, k2, p1, p2) = (-0.145619, 0.218327, -0.000452, -0.000743).
Following the method of step 203, the rotation matrix R1 and the translation vector T1 from the target coordinate system O_t-x_ty_tz_t at the position of Fig. 5 to the camera coordinate system O_c-x_cy_cz_c are computed as:
R1 = [ 0.959072, -0.077754, -0.272278; -0.052515, 0.896039, -0.440859; 0.278251, 0.437114, 0.855283 ]
T1 = [ -59.926098; -51.798515; 356.107635 ]
Following the method of steps 205 to 208, the image coordinates and camera-frame coordinates of a number of points on the stripe formed on the target at the position of Fig. 5 are obtained as listed in Table 1:
Cross-ratio chosen to construct the virtual points | Cross-ratio of the stripe point with the virtual points | Image coordinates (u, v) of the stripe point | Camera-frame coordinates (x, y, z) of the actual stripe point (mm)
8.50 -13.9655 (209.49,553.00) (-77.1,-0.08,377.3)
8.70 -3.9891 (211.92,586.00) (-76.89,4.05,379.5)
8.60 4.1108 (217.81,479.96) (-76.11,-9.24,372.89)
10.00 -1.4406 (227.42,649.85) (-75.32,12.09,384.14)
9.60 2.1934 (240.41,415.21) (-73.6,-17.44,369.46)
13.60 -0.8053 (254.28,705.83) (-72.51,19.23,388.72)
12.00 1.6930 (271.40,365.29) (-70.17,-23.88,367.24)
15.00 1.5317 (292.32,339.29) (-67.84,-27.29,366.22)
20.00 1.4368 (311.08,320.09) (-65.74,-29.85,365.57)
30.00 1.3633 (330.65,302.53) (-63.53,-32.22,365.05)
-22.00 -0.3401 (353.86,805.20) (-61.65,32.47,398.99)
-11.50 -0.3015 (379.17,819.65) (-58.81,34.52,400.96)
-7.00 -0.2710 (407.17,832.56) (-55.62,36.41,402.95)
-9.00 1.1726 (440.14,242.40) (-50.9,-40.85,364.62)
-6.00 1.1468 (473.93,232.71) (-46.9,-42.42,365.09)
-4.00 1.1290 (508.03,226.36) (-42.81,-43.57,365.8)
-1.70 -0.2397 (542.97,855.55) (-39.54,40.61,410.23)
-2.00 1.1163 (578.98,225.37) (-34.11,-44.36,368.16)
-1.40 1.1176 (613.00,228.95) (-29.85,-44.19,369.6)
-0.55 -0.3241 (648.12,834.36) (-26.37,38.68,413.43)
-0.70 1.1443 (680.17,248.60) (-21.27,-42.11,373.41)
-0.18 -0.4409 (706.91,806.82) (-18.7,35.38,414.17)
-0.05 -0.5263 (732.97,789.95) (-15.23,33.24,414.17)
0.05 -0.6322 (756.29,772.30) (-12.09,30.94,413.98)
-0.03 1.2725 (776.17,310.83) (-8.61,-34.2,381.52)
0.20 -0.9464 (794.36,734.31) (-6.86,25.83,413.01)
0.25 -1.2208 (811.12,712.08) (-4.52,22.77,412.18)
Table 1
Following the method of step 203, the rotation matrix R2 and the translation vector T2 from the target coordinate system O_t-x_ty_tz_t at the position of Fig. 6 to the camera coordinate system O_c-x_cy_cz_c are computed as:
R2 = [ 0.973928, -0.038511, -0.223563; -0.082464, 0.857968, -0.507040; 0.211336, 0.512257, 0.832424 ]
T2 = [ -82.035706; -27.551346; 367.943481 ]
Following the method of steps 205 to 208, the image coordinates and camera-frame coordinates of a number of points on the stripe formed on the target at the position of Fig. 6 are obtained as listed in Table 2:
Cross-ratio chosen to construct the virtual points | Cross-ratio of the stripe point with the virtual points | Image coordinates (u, v) of the stripe point | Camera-frame coordinates (x, y, z) of the actual stripe point (mm)
-62.90 1.2891 (207.47,468.05) (-97.75,12.25,387.96)
-50.30 1.5220 (208.67,526.01) (-97.38,19.17,392.28)
-41.60 1.1747 (215.55,424.87) (-97.06,7.03,384.97)
-21.90 2.1310 (225.92,599.03) (-95.19,27.86,398.16)
-15.00 2.7039 (239.37,630.80) (-93.58,31.65,400.9)
-13.50 1.0293 (254.70,342.78) (-93.14,-3.14,379.83)
-10.20 1.0026 (270.52,322.55) (-91.48,-5.73,378.7)
-7.80 0.9792 (288.94,302.93) (-89.51,-8.28,377.67)
-6.10 0.9613 (307.71,286.68) (-87.48,-10.45,376.9)
-4.00 -6.8682 (329.75,748.28) (-82.98,45.91,412.43)
-3.75 0.9324 (352.84,257.70) (-82.48,-14.49,375.78)
-2.45 -2.6138 (379.28,785.54) (-77.12,50.59,416.86)
-2.20 0.9123 (408.94,235.75) (-76.07,-17.86,375.45)
-1.35 -1.5969 (440.82,817.42) (-69.73,54.74,421.37)
-1.20 0.9020 (473.97,224.60) (-68.38,-20.06,376.17)
-0.85 0.9012 (507.03,224.67) (-64.36,-20.47,377)
-0.40 -1.2108 (542.03,841.67) (-57.28,58.19,426.82)
-0.30 0.9067 (578.97,234.13) (-55.38,-20.18,379.59)
-0.15 0.9131 (611.85,243.42) (-51.16,-19.4,381.2)
0.02 0.9226 (642.98,256.04) (-47.08,-18.15,383.06)
0.20 -1.5687 (674.98,827.96) (-40.27,56.96,430.64)
0.35 -1.8920 (704.93,816.84) (-36.34,55.6,430.87)
0.40 -2.4506 (730.95,802.92) (-32.9,53.84,430.72)
0.45 0.9907 (753.68,329.31) (-32,-9.73,392.24)
0.50 1.0158 (774.14,350.85) (-29.09,-7.04,394.65)
0.55 1.0444 (792.16,372.86) (-26.49,-4.25,397.05)
0.60 1.0817 (809.99,398.00) (-23.89,-1,399.73)
Table 2
Following the method of step 203, the rotation matrix R3 and the translation vector T3 from the target coordinate system O_t-x_ty_tz_t at the position of Fig. 7 to the camera coordinate system O_c-x_cy_cz_c are computed as:
R3 = [ 0.920771, -0.087683, -0.380122; -0.125856, 0.855539, -0.502209; 0.369244, 0.510260, 0.77672 ]
T3 = [ -99.789154; -52.651661; 362.485687 ]
Following the method of steps 205 to 208, the image coordinates and camera-frame coordinates of a number of points on the stripe formed on the target at the position of Fig. 7 are obtained as listed in Table 3:
Cross-ratio chosen to construct the virtual points | Cross-ratio of the stripe point with the virtual points | Image coordinates (u, v) of the stripe point | Camera-frame coordinates (x, y, z) of the actual stripe point (mm)
-2.00 10.4213 (256.47,516.01) (-112.06,-7.09,385.94)
-1.95 -70.2624 (258.05,538.99) (-111.97,-4.35,387.75)
-1.88 2.4289 (266.24,432.07) (-110.78,-17.28,379.98)
-1.60 -2.1875 (276.68,610.11) (-110.25,4.05,394.03)
-1.52 1.7543 (290.85,375.90) (-108.01,-24.35,376.76)
-1.18 -1.0330 (305.74,665.16) (-107.37,10.52,399.62)
-1.13 1.4837 (322.81,331.83) (-104.47,-30.12,374.76)
-0.93 1.3969 (342.19,312.18) (-102.32,-32.8,374.08)
-0.62 -0.5117 (360.76,730.24) (-101.68,18.17,407.35)
-0.57 1.2805 (384.10,279.18) (-97.65,-37.49,373.33)
-0.26 -0.3448 (409.73,767.42) (-96.45,22.56,412.75)
-0.09 -0.2836 (440.09,784.83) (-93.13,24.63,415.71)
0.05 -0.2386 (470.86,799.31) (-89.7,26.35,418.5)
0.19 -0.2037 (504.09,811.67) (-85.95,27.83,421.3)
0.24 1.1394 (538.98,225.85) (-79.87,-46.84,375.99)
0.44 -0.1658 (575.97,826.15) (-77.6,29.52,426.48)
0.54 -0.1572 (611.95,829.44) (-73.3,29.88,428.81)
0.58 1.1394 (649.93,229.47) (-66.54,-48.47,381.47)
0.66 1.1542 (685.05,237.79) (-62.2,-48.05,383.85)
0.77 -0.1834 (717.86,817.53) (-60.18,28.11,434.09)
0.79 1.2062 (752.22,262.56) (-53.75,-46.06,389.28)
0.88 -0.2449 (780.80,794.61) (-52.02,24.89,436)
0.89 1.2934 (808.74,295.35) (-46.48,-42.72,395)
0.92 1.3542 (832.12,313.83) (-43.44,-40.64,397.83)
0.98 -0.4173 (853.44,747.44) (-42.22,18.22,436.48)
1.00 -0.5069 (872.40,729.38) (-39.59,15.66,436.11)
1.01 1.6451 (890.19,373.85) (-35.79,-33.44,406.23)
Table 3
Finally, the structured-light equation fitted to the camera-frame coordinates of all the obtained stripe points is:
0.085x^2 + 0.2086y^2 + 0.5688z^2 + 0.2016xy - 0.6220yz - 0.3569xz + 151.6199x + 261.2671y - 477.3929z - 100000 = 0
Once the fitted structured-light equation has been obtained, the structured-light vision sensor can call it during measurement.
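A least-squares sketch of this final fitting step for a general quadric (homogeneous formulation; the embodiment's printed equation, with its constant term of -100000, is the same solution up to scale, and the function name is illustrative):

```python
import numpy as np

def fit_quadric(points):
    """Fit a1*x^2 + a2*y^2 + a3*z^2 + a4*xy + a5*yz + a6*xz
       + a7*x + a8*y + a9*z + a10 = 0 to Nx3 camera-frame stripe points;
       the coefficient vector (up to scale) is the right singular vector
       of the design matrix with the smallest singular value."""
    x, y, z = np.asarray(points, float).T
    D = np.column_stack([x*x, y*y, z*z, x*y, y*z, x*z, x, y, z, np.ones_like(x)])
    _, _, Vt = np.linalg.svd(D, full_matrices=False)
    return Vt[-1]
```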
The above are only preferred embodiments of the present invention and are not intended to limit the scope of protection of the present invention.

Claims (7)

1. A method for calibrating the structural parameters of a structured-light sensor based on two-dimensional cross-ratio invariance, characterized in that the method comprises:
a. establishing a camera coordinate system and an image-plane coordinate system;
b. establishing a target coordinate system; capturing an image of the planar target with the camera and correcting the captured image for distortion; then extracting the coordinates of at least four non-collinear feature points from the distortion-corrected target image and, from the extracted feature-point coordinates, determining the transformation from the target coordinate system to the camera coordinate system; and extracting the light-stripe coordinates from the distortion-corrected target image;
c. constructing three collinear virtual target feature points in the image plane and obtaining the coordinates of the points where the line through the three collinear virtual points intersects the structured-light stripe in the image; computing the cross-ratio of each obtained stripe point with the three collinear virtual points; obtaining the target-frame coordinates of the actual stripe points corresponding to the stripe points in the image plane, and transforming them into the camera coordinate system;
d. repeating step c to obtain the camera-frame coordinates of multiple points on the stripe for the planar target at the same position;
e. moving the planar target freely several times and, after each move, repeating steps b to d, so as to obtain the camera-frame coordinates of multiple points on the stripes projected onto the target plane; and fitting the equation of the structured light in the camera coordinate system to the camera-frame coordinates of said stripe points.
2. The method according to claim 1, characterized in that the captured planar-target image comprises the light stripe and at least nine non-collinear feature points, said feature points being arranged in a rectangular grid of at least three rows and three columns.
3. The method according to claim 2, characterized in that step c further comprises obtaining the stripe coordinates, and that the stripe-point coordinates are computed from the stripe coordinates and the equation of the straight line determined by the three collinear virtual target feature points.
4. The method according to claim 3, characterized in that the transformation from the target coordinate system to the camera coordinate system consists of a translation vector and a rotation matrix, and that the stripe points are converted from target-frame coordinates to camera-frame coordinates according to the obtained translation vector and rotation matrix.
5. The method according to claim 1, characterized in that constructing the three collinear virtual target feature points in step c comprises:
c1. choosing three collinear feature points, taking an arbitrary point on the line through them as a virtual target feature point, and computing the cross-ratio of this point with the three collinear feature points;
c2. choosing two further groups of feature points in the same rows/columns, each group of three feature points lying in the same columns/rows as the three chosen collinear feature points, and finding on the line through each group the point whose cross-ratio with that group's three collinear feature points equals the computed cross-ratio, these points being the virtual target feature points.
6. The method according to claim 1, characterized in that the structured light is structured light of any pattern.
7. The method according to claim 3 or 5, characterized in that obtaining the target-frame coordinates of the actual stripe point corresponding to a stripe point in the target image comprises: first, according to the principle that the cross-ratio between a virtual target feature point and the three feature points collinear with it remains invariant under perspective transformation, obtaining the target-frame coordinates of the points on the target corresponding to the three virtual target feature points in the image; and then, according to the principle that the cross-ratio between a stripe point in the target image and the three virtual target feature points collinear with it remains invariant under perspective transformation, obtaining the target-frame coordinates of the actual stripe point corresponding to the stripe point in the target image.
CNB2008100818734A 2007-10-26 2008-05-13 Method for calibrating the structural parameters of a structured-light sensor based on two-dimensional cross-ratio invariance Expired - Fee Related CN100565097C (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CNB2008100818734A CN100565097C (en) 2007-12-29 2008-05-13 Method for calibrating the structural parameters of a structured-light sensor based on two-dimensional cross-ratio invariance
US12/258,398 US8078025B2 (en) 2007-10-26 2008-10-25 Vehicle dynamic measurement device and method for comprehensive parameters of rail wear

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
CN200710308568.X 2007-12-29
CN200710308568 2007-12-29
CNB2008100818734A CN100565097C (en) 2007-12-29 2008-05-13 Method for calibrating the structural parameters of a structured-light sensor based on two-dimensional cross-ratio invariance

Publications (2)

Publication Number Publication Date
CN101363713A CN101363713A (en) 2009-02-11
CN100565097C true CN100565097C (en) 2009-12-02

Family

ID=40390217

Family Applications (1)

Application Number Title Priority Date Filing Date
CNB2008100818734A Expired - Fee Related CN100565097C (en) 2007-10-26 2008-05-13 Method for calibrating the structural parameters of a structured-light sensor based on two-dimensional cross-ratio invariance

Country Status (1)

Country Link
CN (1) CN100565097C (en)

Families Citing this family (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101526336B (en) * 2009-04-20 2011-08-24 陈炳生 Calibration method of linear structured light three-dimensional visual sensor based on measuring blocks
CN101943563B (en) * 2010-03-26 2012-04-25 天津大学 Rapid calibration method of line-structured light vision sensor based on space plane restriction
CN102927908B (en) * 2012-11-06 2015-04-22 中国科学院自动化研究所 Robot eye-on-hand system structured light plane parameter calibration device and method
CN103411553B (en) * 2013-08-13 2016-03-02 天津大学 The quick calibrating method of multi-linear structured light vision sensors
CN104848801B (en) * 2015-06-05 2017-06-13 北京航空航天大学 A kind of line structured light vision sensor calibration method based on parallel bicylindrical target
CN106524943B (en) * 2016-11-10 2019-12-10 华南理工大学 Three-dimensional reconstruction device and method of double-rotation laser
CN106989669B (en) * 2017-02-16 2018-12-07 上海大学 Big visual field high-precision vision system calibrating method based on virtual three-dimensional target
CN107255443B (en) * 2017-07-14 2020-09-01 北京航空航天大学 Method and device for calibrating binocular vision sensor in site in complex environment
CN107560549B (en) * 2017-08-29 2020-05-08 哈尔滨理工大学 Calibration method of laser vision two-dimensional displacement measurement system
CN108827157B (en) * 2018-08-31 2020-11-20 广州视源电子科技股份有限公司 Laser measurement verification method, device, system, equipment and storage medium
CN111156900B (en) * 2018-11-08 2021-07-13 中国科学院沈阳自动化研究所 Line-of-depth linear light measurement method for bullet primer assembly
CN111179351B (en) * 2018-11-13 2023-07-14 北京图森智途科技有限公司 Parameter calibration method and device and processing equipment thereof
CN110470320B (en) * 2019-09-11 2021-03-05 河北科技大学 Calibration method of swinging scanning type line structured light measurement system and terminal equipment
CN111256591B (en) * 2020-03-13 2021-11-02 易思维(杭州)科技有限公司 External parameter calibration device and method for structured light sensor
CN113686262B (en) * 2021-08-13 2022-10-11 桂林电子科技大学 Line structure optical scanner calibration method and device and storage medium
CN113834488B (en) * 2021-11-25 2022-03-25 之江实验室 Robot space attitude calculation method based on remote identification of structured light array

Also Published As

Publication number Publication date
CN101363713A (en) 2009-02-11

Similar Documents

Publication Publication Date Title
CN100565097C (en) Method for calibrating the structural parameters of a structured-light sensor based on two-dimensional cross-ratio invariance
KR101601331B1 (en) System and method for three-dimensional measurment of the shape of material object
CN102927908B (en) Robot eye-on-hand system structured light plane parameter calibration device and method
CN102155923B (en) Splicing measuring method and system based on three-dimensional target
CN101526336B (en) Calibration method of linear structured light three-dimensional visual sensor based on measuring blocks
CN104111039B (en) For arbitrarily putting the scaling method of fringe projection three-dimension measuring system
CN101109620A (en) Method for standardizing structural parameter of structure optical vision sensor
CN102564348A (en) Systematic geometric demarcation method for reflection three-dimensional measurement of stripe
CN106989695A (en) A kind of projector calibrating method
CN109059806B (en) A kind of mirror article three dimension profile measurement device and method based on infrared stripes
CN203231736U (en) Specular object measurement device based on binocular vision
CN102032878A (en) Accurate on-line measurement method based on binocular stereo vision measurement system
CN104567727A (en) Three-dimensional target and global unified calibration method for linear structured light profile sensor
Xu et al. 3D multi-directional sensor with pyramid mirror and structured light
Pinto et al. Regular mesh measurement of large free form surfaces using stereo vision and fringe projection
CN104380036A (en) Synthesis-parameter generation device for three-dimensional measurement apparatus
Liao et al. Flexible calibration method for line-scan cameras using a stereo target with hollow stripes
CN107907055A (en) Pattern projection module, three-dimensional information obtain system, processing unit and measuring method
Zhou et al. Constructing feature points for calibrating a structured light vision sensor by viewing a plane from unknown orientations
Li et al. 3D shape measurement based on structured light projection applying polynomial interpolation technique
CN107957251A (en) Reflecting sphere generalization detection method based on computer-assisted correction
JP2013178174A (en) Three-dimensional shape measuring apparatus using a plurality of gratings
US20230083039A1 (en) Method and system for optically measuring an object having a specular and/or partially specular surface and corresponding measuring arrangement
CN113280755A (en) Large-curvature mirror surface three-dimensional shape measuring method based on curved surface screen phase deflection
Yu et al. An improved projector calibration method for structured-light 3D measurement systems

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20091202

Termination date: 20190513

CF01 Termination of patent right due to non-payment of annual fee