CN106705849B - Calibrating Technique For The Light-strip Sensors - Google Patents


Info

Publication number
CN106705849B
CN106705849B (application CN201710056155.0A)
Authority
CN
China
Prior art keywords
calibration
structured light
preset number
feature points
camera
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201710056155.0A
Other languages
Chinese (zh)
Other versions
CN106705849A (en)
Inventor
汤宜军
张慧
陆娜
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Step Electric Corp
Original Assignee
Shanghai Step Electric Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Step Electric Corp filed Critical Shanghai Step Electric Corp
Priority to CN201710056155.0A priority Critical patent/CN106705849B/en
Publication of CN106705849A publication Critical patent/CN106705849A/en
Application granted granted Critical
Publication of CN106705849B publication Critical patent/CN106705849B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00Measuring arrangements characterised by the use of optical techniques
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00Measuring arrangements characterised by the use of optical techniques
    • G01B11/002Measuring arrangements characterised by the use of optical techniques for measuring two or more coordinates

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Studio Devices (AREA)

Abstract

The present invention relates to the technical field of line structured light sensors and discloses a calibration method for a line structured light sensor. The method comprises: acquiring N 3D target images with the line structured light sensor to be calibrated, wherein each 3D target image contains at least a first preset number of non-coplanar feature points and 2 structured light stripes, and the structured light stripes contain at least a second preset number of non-coplanar feature points; selecting the first preset number of non-coplanar feature points from the N 3D target images; calibrating the camera according to the selected first preset number of non-coplanar feature points; selecting the second preset number of non-coplanar feature points from the structured light stripes of the N 3D target images; and completing the calibration of the line structured light plane according to the selected second preset number of non-coplanar feature points and the calibration result of the camera. The camera and the structured light plane in the line structured light sensor are thus calibrated jointly, which helps to improve the calibration accuracy and simplify the calibration process.

Description

Line structured light sensor calibration method
Technical Field
The invention relates to the technical field of line structured light sensors, in particular to a calibration method of a line structured light sensor.
Background
The line structured light sensor is a tool for surface digitization and shape inspection, and accurate calibration of the line structured light parameters is the prerequisite for accurate measurement. The structured light method is an active optical measurement technique: a structured light projector projects controllable light spots, light stripes, or smooth surface structures onto the surface of the measured object, an image sensor (such as a camera) captures images, and the three-dimensional coordinates of the object are calculated from the geometric relationships of the system using the triangulation principle.
The line structured light generating device includes a laser projector and a camera. The mathematical model of line structured light consists of: 1. the perspective projection model of the camera; 2. the structured light plane equation. The contents to be calibrated are the intrinsic parameters of the camera (focal length, principal point and distortion coefficients) and the structural parameters of the line structured light sensor (the light plane equation). The calibration mainly refers to the structural parameters, i.e. the position of the light plane relative to the camera.
As is well known, the robot must have two calibration processes before using the line structured light sensor: calibration of a line structured light sensor and calibration of hand-eye relationship. The hand-eye relationship calibration is a process for calculating the mutual relationship between the sensor system and the robot system in the space; the line structured light sensor calibration is a process for calculating the mutual relation between a camera in the sensor and a structured light plane in a space, wherein the calibration method comprises 2 important components, one is a calibration algorithm, and the other is an auxiliary calibration target.
In recent years, more and more calibration methods for line laser sensors have appeared, and both the calibration algorithms and the auxiliary calibration devices have improved greatly. However, every calibration method has its own design drawbacks, such as separate calibration of the camera system and the structured light plane, poor calibration accuracy, the need for an external tool platform, the need for partial manual participation, and a complicated calibration process.
The inventor of the present invention has found that in the line structured light sensor calibration methods provided by the prior art, the calibration of the camera and the calibration of the structured light plane are usually performed separately, which leads to low calibration accuracy and a complicated calibration process.
Disclosure of Invention
The invention aims to provide a calibration method of a line-structured light sensor, which realizes the joint calibration of a camera and a structured light plane in the line-structured light sensor, and is beneficial to improving the calibration precision and simplifying the calibration process.
In order to solve the above technical problem, an embodiment of the present invention provides a method for calibrating a line structured light sensor, including: acquiring N 3D target images with the line structured light sensor to be calibrated, wherein each 3D target image contains at least a first preset number of non-coplanar feature points and 2 structured light stripes, the structured light stripes contain at least a second preset number of non-coplanar feature points, and N is a positive integer; selecting a first preset number of non-coplanar feature points from the N 3D target images; calibrating the camera according to the selected first preset number of non-coplanar feature points; selecting a second preset number of non-coplanar feature points from the structured light stripes of the N 3D target images; and completing the calibration of the line structured light plane according to the selected second preset number of non-coplanar feature points and the calibration result of the camera.
Compared with the prior art, the embodiment of the present invention calibrates the camera from the first preset number of non-coplanar feature points selected from the N 3D target images, and calibrates the line structured light plane from the second preset number of non-coplanar feature points selected from the structured light stripes of the N 3D target images together with the camera calibration result. The camera and the structured light plane in the line structured light sensor are therefore calibrated jointly, which helps to improve the calibration accuracy and simplify the calibration process.
In addition, calibrating the camera according to the selected first preset number of non-coplanar characteristic points, specifically comprising: and calibrating the camera by solving a homography matrix between a three-dimensional target coordinate system and an image pixel coordinate system for a first preset number of non-coplanar feature points. The embodiment of the invention provides feasibility for the calibration of the camera.
In addition, the calibration of the line structured light plane is completed according to the selected second preset number of non-coplanar feature points and the calibration result of the camera, and the method specifically comprises the following steps: and calculating the coordinates of the characteristic points on the linear structure light plane under the camera coordinate system by using a second preset number of non-coplanar characteristic points and the calibration result of the camera, establishing a linear equation of the linear structure light plane, and completing the calibration of the linear structure light plane equation by using a least square method. The embodiment of the invention ensures that the calibration of the line structure light plane is more convenient and feasible.
In addition, among the N 3D target images acquired by the line structured light sensor to be calibrated, the acquisition angles of the 3D target images are different. This improves the reliability of the calibration of the line structured light sensor.
Drawings
Fig. 1 is a flowchart of a line structured light sensor calibration method according to a first embodiment of the present invention;
FIG. 2 is a schematic geometric relationship diagram of a line-structured light sensor model according to a first embodiment of the present invention;
fig. 3 is a flowchart of a line structured light sensor calibration method according to a second embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention clearer, embodiments of the present invention will be described in detail below with reference to the accompanying drawings. Those of ordinary skill in the art will appreciate that numerous technical details are set forth in the various embodiments in order to provide a better understanding of the present application; however, the technical solution claimed in the present application can also be implemented without these technical details, and with various changes and modifications based on the following embodiments.
A first embodiment of the present invention relates to a method for calibrating a line structured light sensor, as shown in fig. 1, including:
Step 101: acquiring N 3D target images through the line structured light sensor to be calibrated. Each 3D target image includes at least a first preset number of non-coplanar feature points and 2 structured light stripes, and the 2 structured light stripes include at least a second preset number of non-coplanar feature points. The number N of captured 3D target images is a positive integer; collecting an appropriate number of 3D target images helps to improve calibration efficiency and accuracy. In one example, N may take a value in [6, 9], for example N equal to 6.
In the embodiment of the invention, the N 3D target images acquired by the line structured light sensor to be calibrated may be captured from different angles, for example at different poses. Because the 3D target images collected at different poses are taken from different angles, the collection is not limited by the depth of field or by a specific position of a single pose, so the reliability of the line structured light sensor calibration can be improved.
Step 102: selecting a first preset number of non-coplanar feature points from the N 3D target images. The first preset number is determined by the mathematical model of the camera calibration algorithm; in combination with the camera calibration algorithm described below, the first preset number is 6.
Step 103: calibrating the camera according to the selected first preset number of non-coplanar feature points. Specifically, the camera is calibrated by solving the homography matrix between the three-dimensional target coordinate system and the image pixel coordinate system for the selected first preset number of non-coplanar feature points.
Step 104: selecting a second preset number of non-coplanar feature points from the structured light stripes of the N 3D target images. The second preset number may be 5.
Step 105: completing the calibration of the line structured light plane according to the selected second preset number of non-coplanar feature points and the calibration result of the camera. Specifically, the coordinates of the feature points on the line structured light plane in the camera coordinate system are calculated using the second preset number of non-coplanar feature points and the calibration result of the camera, the equation of the line structured light plane is established, and the calibration of the line structured light plane equation is completed using the least squares method.
The following describes the calibration algorithm of the present embodiment in detail with reference to fig. 2:
FIG. 2 is a schematic diagram of the geometric relationships of the line structured light sensor model. In FIG. 2, $O_w x_w y_w z_w$ is the global coordinate system, $O_u x_u y_u$ is the image coordinate system, and $O_c x_c y_c z_c$ is the camera coordinate system. As shown in FIG. 2, the line structured light sensor model includes a camera 201, an image plane 202, a line structured light sensor 203, a structured light emitter 204, a structured light plane 205, and a workpiece 206 under test.
Let P be a point on the structured light plane 205 and p its projection in the image coordinate system, with homogeneous coordinates $\tilde{p} = [u, v, 1]^T$ in the image coordinate system and $\tilde{P} = [x_w, y_w, z_w, 1]^T$ in the global coordinate system. According to the pinhole imaging model of the camera:

$$\kappa\,\tilde{p} = A\,[R\ \ t]\,\tilde{P} \qquad (1)$$

where $\kappa$ is a proportionality constant; $[R\ \ t]$ is the extrinsic parameter matrix of the camera, describing the rotation matrix and translation vector from the global coordinate system to the camera coordinate system;

$$A = \begin{bmatrix} f_x & \gamma & u_0 \\ 0 & f_y & v_0 \\ 0 & 0 & 1 \end{bmatrix}$$

is the intrinsic parameter matrix of the camera, where $(u_0, v_0)$ are the pixel coordinates of the principal point of the image, $f_x$ and $f_y$ are the equivalent focal lengths along the X and Y axes, and $\gamma$ is the imaging plane tilt factor; $M = A\,[R\ \ t]$ is the projection matrix of the camera.
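As a concrete illustration of model (1), the short NumPy sketch below (not part of the patent) builds the intrinsic matrix A and projects a world point through A[R t]; all numeric values are arbitrary examples, not calibration results.

```python
import numpy as np

def intrinsic_matrix(fx, fy, gamma, u0, v0):
    """Intrinsic parameter matrix A of model (1)."""
    return np.array([[fx, gamma, u0],
                     [0.0,  fy,  v0],
                     [0.0, 0.0, 1.0]])

def project(A, R, t, Pw):
    """kappa * [u, v, 1]^T = A [R t] [Pw, 1]^T (pinhole model, no distortion)."""
    P_h = np.append(Pw, 1.0)                  # homogeneous world point
    M = A @ np.hstack([R, t.reshape(3, 1)])   # 3x4 projection matrix M
    p = M @ P_h
    return p[:2] / p[2]                       # divide out the scale kappa

# Arbitrary example values (illustration only).
A = intrinsic_matrix(fx=1200.0, fy=1180.0, gamma=0.0, u0=640.0, v0=480.0)
R = np.eye(3)
t = np.array([0.05, 0.0, 0.4])
print(project(A, R, t, np.array([0.02, -0.01, 0.6])))
```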
Meanwhile, the point P lies on the structured light plane and therefore satisfies the structured light plane equation. Let the pixel coordinates of point P be $(u, v)$, and let the equation of the structured light plane in the camera coordinate system be:

$$A_l x_c + B_l y_c + C_l z_c = D_l \qquad (2)$$

where $A_l$, $B_l$, $C_l$, $D_l$ are the four coefficients of the structured light plane.

The coordinates $(x_c, y_c, z_c)$ of the point P in the camera coordinate system $O_c x_c y_c z_c$ are obtained as follows. The camera intrinsic parameters $A$ and distortion coefficients $k_1$, $k_2$ are obtained through calibration; the pixel $(u, v)$ is undistorted and converted to normalized image coordinates $(x, y)$, which define the viewing ray $x_c = x\,z_c$, $y_c = y\,z_c$. Combining this ray with the structured light plane equation $A_l x_c + B_l y_c + C_l z_c = D_l$ gives:

$$z_c = \frac{D_l}{A_l x + B_l y + C_l}, \qquad x_c = x\,z_c, \qquad y_c = y\,z_c \qquad (3)$$
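The transform of equation (3) can be sketched as follows (illustrative NumPy code, not part of the patent; it assumes the pixel has already been undistorted so normalized coordinates follow by inverting A, and that $A_l x + B_l y + C_l \neq 0$; the plane coefficients are made-up examples).

```python
import numpy as np

def pixel_to_normalized(u, v, A):
    """Convert (undistorted) pixel coordinates to normalized image coordinates."""
    x, y, _ = np.linalg.solve(A, np.array([u, v, 1.0]))
    return x, y

def stripe_point_camera_coords(x, y, plane):
    """Intersect the viewing ray (x*zc, y*zc, zc) with Al*xc + Bl*yc + Cl*zc = Dl."""
    Al, Bl, Cl, Dl = plane
    zc = Dl / (Al * x + Bl * y + Cl)    # equation (3)
    return np.array([x * zc, y * zc, zc])

# Example with arbitrary intrinsics and plane coefficients (illustration only).
A = np.array([[1200.0, 0.0, 640.0],
              [0.0, 1180.0, 480.0],
              [0.0,    0.0,   1.0]])
x, y = pixel_to_normalized(700.0, 500.0, A)
print(stripe_point_camera_coords(x, y, plane=(0.1, -0.2, 1.0, 0.5)))
```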
the calibration and solution process of the camera according to the selected non-coplanar characteristic points is as follows:
let homography matrix M ═ a [ R t ], we can get from (1):
it is clear that the homography matrix M of the 3 x 4 matrix is associated with a scale factor.
Remember that M ═ M1 m2 m3 m4],mi=[mi1 mi2 mi3]TIs M element of ith column, R ═ R1 r2 r3],riThe i-th column element of R. From (4)
[m1 m2 m3 m4]=λA[r1 r2 r3 t] (5)
Where λ is an arbitrary scalar.
The camera system may consist of an imaging chip and an optical lens, and there are generally radial distortion and tangential distortion, the radial distortion being dominant.
Let $(x, y)$ be the normalized image coordinates of the ideal projection point without distortion, and $(\breve{x}, \breve{y})$ the normalized image coordinates of the actual projection point with distortion. Then, with $r^2 = x^2 + y^2$:

$$\breve{x} = x\,(1 + k_1 r^2 + k_2 r^4), \qquad \breve{y} = y\,(1 + k_1 r^2 + k_2 r^4)$$

From (1), the corresponding actual pixel coordinates of the image, $(\breve{u}, \breve{v})$, are obtained as $\breve{u} = u_0 + f_x\breve{x} + \gamma\breve{y}$ and $\breve{v} = v_0 + f_y\breve{y}$.
In the embodiment of the present invention, $P_i = (X_i, Y_i, Z_i)^T$ and $p_i = (u_i, v_i)^T$ denote, respectively, the i-th target point in the global coordinate system and its projection in the image coordinate system, with homogeneous coordinates $\tilde{P}_i$ and $\tilde{p}_i$. In practice, because of noise interference during the extraction of the image points, $P_i$ and $p_i$ cannot satisfy equation (1) exactly. Assuming that each image point is corrupted by Gaussian noise with zero mean and covariance matrix $\Lambda_i$, the maximum likelihood estimate of the homography matrix M is obtained by minimizing the function

$$\sum_i (p_i - \hat{p}_i)^T\,\Lambda_i^{-1}\,(p_i - \hat{p}_i), \qquad \hat{p}_i = \frac{1}{m^{(3)T}\tilde{P}_i}\begin{bmatrix} m^{(1)T}\tilde{P}_i \\ m^{(2)T}\tilde{P}_i \end{bmatrix}$$

where $m^{(i)T}$ is the i-th row of the matrix M.

In practical applications, if the extraction of each point is independent of the others, one can simply assume $\Lambda_i = \sigma^2 I$ for all i. In this case the solution of the homography matrix M becomes a nonlinear least-squares problem, $\min_M \sum_i \|p_i - \hat{p}_i\|^2$. This nonlinear minimization can be solved with the Levenberg-Marquardt algorithm, which requires an initial value; the initial value can be obtained by the following method.

Fixing the scale of M by setting $m_{43} = 1$ (cf. (5)), equation (4) becomes a linear system in the remaining 11 entries of M. With n feature points, the system is written as $L_{2n \times 11} X = K_{2n \times 1}$, where X collects the 11 unknowns. X can be solved as long as $n \geq 6$, namely $X = (L^T L)^{-1} L^T K$, which yields the initial value of the homography matrix M.
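A minimal sketch of this linear initialization (illustrative NumPy code, not the patent's MATLAB program): it fixes $m_{43} = 1$, stacks the $2n \times 11$ system for $n \geq 6$ non-coplanar points, solves it by least squares, and checks the result on noise-free synthetic data.

```python
import numpy as np

def estimate_projection_matrix(world_pts, image_pts):
    """Linear initial estimate of the 3x4 matrix M with its (3,4) entry fixed to 1.

    world_pts: (n, 3) non-coplanar 3D target points, n >= 6
    image_pts: (n, 2) corresponding pixel coordinates
    """
    n = len(world_pts)
    L = np.zeros((2 * n, 11))
    K = np.zeros(2 * n)
    for i, ((X, Y, Z), (u, v)) in enumerate(zip(world_pts, image_pts)):
        L[2 * i]     = [X, Y, Z, 1, 0, 0, 0, 0, -u * X, -u * Y, -u * Z]
        L[2 * i + 1] = [0, 0, 0, 0, X, Y, Z, 1, -v * X, -v * Y, -v * Z]
        K[2 * i], K[2 * i + 1] = u, v
    X11, *_ = np.linalg.lstsq(L, K, rcond=None)   # X = (L^T L)^-1 L^T K
    return np.append(X11, 1.0).reshape(3, 4)

# Synthetic check: project random non-coplanar points with a known M = A [R t].
rng = np.random.default_rng(0)
A = np.array([[1000.0, 0.0, 320.0], [0.0, 1000.0, 240.0], [0.0, 0.0, 1.0]])
R, t = np.eye(3), np.array([[0.1], [0.05], [1.0]])
M_true = A @ np.hstack([R, t])
Pw = rng.uniform(-0.5, 0.5, size=(8, 3))            # 8 non-coplanar points
ph = (M_true @ np.c_[Pw, np.ones(8)].T).T
uv = ph[:, :2] / ph[:, 2:3]
M_est = estimate_projection_matrix(Pw, uv)
# Agreement with the true matrix (normalized so that M[2,3] = 1), up to round-off.
print(np.allclose(M_est, M_true / M_true[2, 3], atol=1e-6))
```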
The solving process of the calibration parameters A and [R t] is as follows:
from (5), it can be seen that:
m1=λAr1m2=λAr2m3=λAr3m4considering that R is an orthogonal matrix, λ t (6), the following constraint relationship exists:
and
note the book
Wherein B is a symmetric matrix, and a definition vector B ═ B11B12B22B13B23B33]TThen there isWherein,
Assuming that n calibration target images are acquired, the n sets of constraints (7) are stacked into the following equation:

$$V b = 0 \qquad (8)$$

where V is a $5n \times 6$ matrix. If $n \geq 2$, a unique solution b (up to scale) can be obtained. If n = 1, the imaging plane tilt factor $\gamma$ may be set to 0, i.e. equation (8) is augmented with the constraint equation $[0\ 1\ 0\ 0\ 0\ 0]\, b = 0$. The solution of equation (8) is the eigenvector of the matrix $V^T V$ corresponding to its minimum eigenvalue.

From b, the matrix B is recovered; it is defined up to a scale factor, i.e. $B = \lambda A^{-T}A^{-1}$ with $\lambda$ an arbitrary scalar. The intrinsic parameters can then be calculated according to the following formulas:

$$v_0 = \frac{B_{12}B_{13} - B_{11}B_{23}}{B_{11}B_{22} - B_{12}^2}, \qquad \lambda = B_{33} - \frac{B_{13}^2 + v_0\,(B_{12}B_{13} - B_{11}B_{23})}{B_{11}},$$

$$f_x = \sqrt{\lambda / B_{11}}, \qquad f_y = \sqrt{\frac{\lambda B_{11}}{B_{11}B_{22} - B_{12}^2}}, \qquad \gamma = -\frac{B_{12}\,f_x^2\,f_y}{\lambda}, \qquad u_0 = \frac{\gamma\,v_0}{f_y} - \frac{B_{13}\,f_x^2}{\lambda}.$$
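The sketch below illustrates this linear intrinsic estimation (NumPy, added for illustration; noise-free synthetic projection matrices expressed in normalized units so the system stays well conditioned; real pixel data would typically be normalized first). b is taken as the eigenvector of $V^T V$ for the smallest eigenvalue and the intrinsics are extracted with the closed-form expressions above.

```python
import numpy as np

def _v(M, i, j):
    """Row vector v_ij such that m_i^T B m_j = v_ij^T b, with m_i the i-th column of M."""
    mi, mj = M[:, i], M[:, j]
    return np.array([mi[0]*mj[0],
                     mi[0]*mj[1] + mi[1]*mj[0],
                     mi[1]*mj[1],
                     mi[0]*mj[2] + mi[2]*mj[0],
                     mi[1]*mj[2] + mi[2]*mj[1],
                     mi[2]*mj[2]])

def intrinsics_from_projections(Ms):
    """Recover A from n >= 2 projection matrices M_k proportional to A [R_k t_k]."""
    V = []
    for M in Ms:                 # 5 constraints per image -> V is (5n x 6)
        V += [_v(M, 0, 1), _v(M, 0, 2), _v(M, 1, 2),
              _v(M, 0, 0) - _v(M, 1, 1), _v(M, 0, 0) - _v(M, 2, 2)]
    V = np.array(V)
    w, vecs = np.linalg.eigh(V.T @ V)
    b = vecs[:, 0]               # eigenvector of the smallest eigenvalue
    if b[0] < 0:                 # fix the arbitrary sign so that B11 > 0
        b = -b
    B11, B12, B22, B13, B23, B33 = b
    v0 = (B12*B13 - B11*B23) / (B11*B22 - B12**2)
    lam = B33 - (B13**2 + v0*(B12*B13 - B11*B23)) / B11
    fx = np.sqrt(lam / B11)
    fy = np.sqrt(lam * B11 / (B11*B22 - B12**2))
    gamma = -B12 * fx**2 * fy / lam
    u0 = gamma * v0 / fy - B13 * fx**2 / lam
    return np.array([[fx, gamma, u0], [0.0, fy, v0], [0.0, 0.0, 1.0]])

def rot_x(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])

def rot_y(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])

# Synthetic check with three poses; intrinsics in normalized (image-width) units.
A_true = np.array([[1.2, 0.01, 0.3], [0.0, 1.15, 0.25], [0.0, 0.0, 1.0]])
poses = [(rot_x(0.3), [0.1, 0.0, 1.0]),
         (rot_y(0.4), [0.0, 0.1, 1.2]),
         (rot_x(0.2) @ rot_y(-0.3), [0.05, -0.05, 0.9])]
Ms = [A_true @ np.hstack([R, np.array(t).reshape(3, 1)]) for R, t in poses]
print(np.allclose(intrinsics_from_projections(Ms), A_true))   # True
```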
after the internal parameter matrix A is calculated, respectively calculating the external parameter matrix of each image according to the step (6):
wherein, k is 1/| | A-1h1||=1/||A-1h2||=1/||A-1h3L. The matrix r calculated above is due to interference from noise1 r2 r3]And not an identity orthogonal matrix.
Let Q ═ r1 r2 r3]Its singular value decomposition into Q ═ USVTWherein S ═ diag (σ)123) Then R ═ UVTIs an orthogonal matrix most similar to Q, i.e.And minimum.
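A one-function NumPy illustration of this orthogonalization step (added here as an example, with a randomly perturbed matrix standing in for a noisy estimate):

```python
import numpy as np

def nearest_rotation(Q):
    """Orthogonal matrix R = U V^T closest to Q in the Frobenius-norm sense."""
    U, S, Vt = np.linalg.svd(Q)
    return U @ Vt

# Example: re-orthogonalize a slightly perturbed rotation matrix.
Q = np.eye(3) + 1e-3 * np.random.default_rng(1).standard_normal((3, 3))
R = nearest_rotation(Q)
print(np.allclose(R.T @ R, np.eye(3)))   # True: R is orthogonal
```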
The embodiment of the invention can also optimize the obtained camera parameters, and the specific optimization process is as follows:
assuming that n reference images are acquired, m index points are taken on each image, and the image points are extracted independently and are interfered by noise with the same distribution. We can do this by making a functionAnd obtaining optimized parameters at minimum.
In function f mijIs the pixel value, k, of the j point of the ith image1,k2Respectively, introduced first-order and second-order distortion coefficients, MijA reference coordinate corresponding to the jth pixel point of the ith image, A is the camera intrinsic parameter, and Ri,ti]The parameter is the camera external parameter of the ith image. I (m)ij,A,k1,k2) For the normalized image coordinate system coordinates calculated from the image pixel coordinates,the coordinate of the normalized image coordinate system is calculated by the object reference coordinate and the external parameter matrix. In optimization, the rotation matrix R is usediConverted into pitch Roll (RPY) angles phi, theta,and the sum of phi, theta,optimized as a parameter.
The non-linear minimization can be solved using the Levenberg-Marquardt criterion. The criterion requires an initial value. The initial value of the distortion coefficient can be obtained byAnd solving, wherein,
assuming that a total of n images are acquired, 2mn equations can be obtained by taking m calibration points from each image. The distortion coefficient initial value k ═ k can be obtained by taking the above equation system as Mk ═ d1k2]'=(MTM)-1MTd, the initial values of the rest parameters are obtained in the previous step.
The coordinates $(x_c, y_c, z_c)$ of points on the structured light plane in the camera coordinate system $O_c x_c y_c z_c$ all satisfy the structured light plane equation $A_l x_c + B_l y_c + C_l z_c = D_l$. The image points selected on the structured light stripes can be converted into the corresponding point coordinates in the camera coordinate system using the calculated camera calibration parameters; substituting these coordinates into the structured light plane equation, the unknown parameters of the equation can be solved by least squares. The calibration of the line structured light plane can therefore be achieved conveniently.
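A minimal least-squares plane fit consistent with this step might look as follows (illustrative NumPy code; it assumes the light plane does not pass through the camera origin, so $D_l$ can be normalized to 1; the example plane and noise level are made up):

```python
import numpy as np

def fit_light_plane(points_c):
    """Least-squares fit of Al*xc + Bl*yc + Cl*zc = Dl to camera-frame stripe points.

    The plane is assumed not to pass through the camera origin, so Dl is
    normalized to 1 and [Al, Bl, Cl] is solved linearly.
    """
    P = np.asarray(points_c, dtype=float)           # (n, 3) stripe points, n >= 3
    abc, *_ = np.linalg.lstsq(P, np.ones(len(P)), rcond=None)
    return np.append(abc, 1.0)                      # [Al, Bl, Cl, Dl = 1]

# Synthetic check: noisy points sampled on the known plane 0.1x - 0.2y + z = 0.5.
rng = np.random.default_rng(3)
xy = rng.uniform(-0.2, 0.2, size=(40, 2))
z = 0.5 - 0.1 * xy[:, 0] + 0.2 * xy[:, 1]
pts = np.column_stack([xy, z]) + 1e-4 * rng.standard_normal((40, 3))
# Same plane up to scale: multiplying by 0.5 gives approx. [0.1, -0.2, 1.0, 0.5].
print(0.5 * fit_light_plane(pts))
```

Alternatively, a total least-squares fit via the SVD of the centered points avoids the assumption that $D_l \neq 0$.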
Compared with the prior art, the embodiment of the present invention calibrates the camera according to the first preset number of non-coplanar feature points selected from the N 3D target images, and calibrates the line structured light plane according to the second preset number of non-coplanar feature points selected from the structured light stripes of the N 3D target images together with the calibration result of the camera. The embodiment therefore realizes the joint calibration of the camera and the structured light plane in the line structured light sensor, using a calibration algorithm based on the mathematical model, which helps to improve the calibration accuracy and simplify the calibration process.
The second embodiment of the invention relates to a line structured light sensor calibration method. The second embodiment is a further improvement on the first embodiment, the main improvement being that, after the calibration of the line structured light plane is completed according to the selected second preset number of non-coplanar feature points and the calibration result of the camera, the method further includes: calculating and displaying the calibration error. As shown in fig. 3, the method for calibrating a line structured light sensor according to the present embodiment includes:
Step 301: acquiring N 3D target images through the line structured light sensor to be calibrated. Each 3D target image includes at least a first preset number of non-coplanar feature points and 2 structured light stripes.
Step 302: selecting a first preset number of non-coplanar feature points from the N 3D target images. The first preset number is determined by the calibration algorithm.
Step 303: calibrating the camera according to the selected first preset number of non-coplanar feature points. Specifically, the camera is calibrated by solving the homography matrix between the three-dimensional target coordinate system and the image pixel coordinate system for the selected first preset number of non-coplanar feature points.
Step 304: selecting a second preset number of non-coplanar feature points from the structured light stripes of the N 3D target images. The second preset number may be 5.
Step 305: completing the calibration of the line structured light plane according to the selected second preset number of non-coplanar feature points and the calibration result of the camera. Specifically, the coordinates of the feature points on the line structured light plane in the camera coordinate system are calculated using the second preset number of non-coplanar feature points and the calibration result of the camera, the equation of the line structured light plane is established, and the calibration of the line structured light plane equation is completed using the least squares method.
Step 306: calculating and displaying the calibration error.
In step 301, the operation of acquiring the 3D target image includes placing the 3D target on a calibration tool table, connecting a sensor, turning on a power supply, acquiring 22 images at different poses (the number of the acquired images is not specifically limited), and naming the acquired images from 1 to 22 and storing the images in a bmp format.
Secondly, after the 3D target images have been collected by the line structured light sensor to be calibrated, the MATLAB calibration program is opened, getpixel(n) is entered in the command window, where n is the serial number of the corresponding picture, and the preset instructions are completed in sequence according to the prompts. In this way, the target coordinate system and its coordinates in the image coordinate system, as well as the coordinates of the points on the structured light plane in the image coordinate system, can be obtained.
After the target coordinate system, its coordinates in the image coordinate system, and the coordinates of the points on the structured light plane in the image coordinate system have been obtained, STEPLASERVisionCall(m) can be entered in the MATLAB command window, where m is the total number of processed pictures; the calibration of the line structured light sensor is then completed and the final calibration result STEPSensor is produced. The work of solving the homography matrix between the images and the object, optimizing the camera distortion, calibrating the structured light plane equation and the like is all completed in MATLAB.
Finally, error analysis can be performed according to the obtained calibration result. The calibration error and the calculation method thereof in the embodiment comprise the following steps:
1. camera calibration error and calculation method thereof
The 3D target feature points (i.e. the selected first preset number of non-coplanar feature points) are converted into the image coordinate system, and a group of actual target feature points and of ideal target feature points after distortion correction is obtained in the calibration algorithm; the position residual between the actual and the ideal target feature points is then calculated with the 2-norm and taken as the camera calibration error.
2. Structured light plane fitting error
After the structured light plane ($A_l x_c + B_l y_c + C_l z_c = D_l$) has been fitted with all the obtained laser stripe feature points (i.e. the second preset number of non-coplanar feature points), the distance from each non-coplanar feature point in each structured light stripe to the structured light plane is calculated with the point-to-plane distance formula:

$$d = \frac{|A\,x_{c0} + B\,y_{c0} + C\,z_{c0} - D|}{\sqrt{A^2 + B^2 + C^2}}$$

The distances of all the non-coplanar feature points in the structured light stripes are taken as the fitting error of the structured light plane. Here $x_{c0}$, $y_{c0}$, $z_{c0}$ are the coordinates of a structured light stripe point on the structured light plane, and A, B, C, D are the coefficients of the structured light plane equation.
In the embodiment of the present invention, the first preset number and the second preset number of non-coplanar feature points are selected in MATLAB, and the calibration of the camera according to the selected first preset number of non-coplanar feature points, as well as the calibration of the line structured light plane according to the selected second preset number of non-coplanar feature points and the calibration result of the camera, are performed in MATLAB software. Since the target feature points (i.e. the first preset number of non-coplanar feature points to be selected) and the laser plane feature points (i.e. the second preset number of non-coplanar feature points to be selected) are all intersection points of the target coordinate-system identification lines or of the laser stripe lines, in one example the steps for obtaining these feature points are:
according to a specified sequence, clicking a mouse at two ends of a straight line where the characteristic points to be calibrated are located in sequence, and determining a straight line through 2 points to finally obtain a target identification line or a laser stripe line where the characteristic points are located.
By analogy, all target identification lines and laser stripe lines satisfying the feature point attributes are obtained; each intersection point is then determined from 2 non-parallel straight lines (i.e. by calling the solve() function in MATLAB), and the coordinate positions of all feature points are finally calculated. Improving the coordinate accuracy of the features in this way improves the calibration accuracy as well as the calibration efficiency.
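An illustrative sketch of this two-line intersection (NumPy; MATLAB's solve() is replaced here by an explicit 2x2 linear solve, and the pixel values are made up):

```python
import numpy as np

def line_through(p1, p2):
    """Coefficients (a, b, c) of the line a*x + b*y = c through two clicked points."""
    (x1, y1), (x2, y2) = p1, p2
    a, b = y2 - y1, x1 - x2
    return a, b, a * x1 + b * y1

def intersect(line1, line2):
    """Intersection of two non-parallel lines, solved as a 2x2 linear system."""
    (a1, b1, c1), (a2, b2, c2) = line1, line2
    return np.linalg.solve(np.array([[a1, b1], [a2, b2]]), np.array([c1, c2]))

# Example: a target identification line crossing a laser stripe line (made-up pixels).
l1 = line_through((100.0, 100.0), (400.0, 120.0))
l2 = line_through((250.0, 50.0), (260.0, 300.0))
print(intersect(l1, l2))   # pixel coordinates of the feature point
```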
In the embodiment of the invention, after the calibration of the line structured light plane is finished according to the selected second preset number of non-coplanar feature points and the calibration result of the camera, the calibration error can be calculated and displayed. Therefore, the user can know the calibration result conveniently.
The steps of the above methods are divided for clarity of description; in implementation, several steps may be combined into one step, or one step may be split into multiple steps, and as long as the same logical relationship is contained, such variants are within the protection scope of this patent. Adding insignificant modifications to the algorithm or process, or introducing insignificant design changes without changing the core design of the algorithm or process, is also within the protection scope of this patent.
Those skilled in the art can understand that all or part of the steps in the methods of the foregoing embodiments may be implemented by a program instructing related hardware, the program being stored in a storage medium and including several instructions for enabling a device (which may be a single-chip microcomputer, a chip, or the like) or a processor to execute all or part of the steps of the methods described in the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk or an optical disk.
It will be understood by those of ordinary skill in the art that the foregoing embodiments are specific examples for carrying out the invention, and that various changes in form and details may be made therein without departing from the spirit and scope of the invention in practice.

Claims (9)

1. A method for calibrating a line-structured light sensor is characterized by comprising the following steps:
acquiring N 3D target images through a line structured light sensor to be calibrated; wherein the acquisition angles of the 3D target images are different; each 3D target image comprises non-coplanar feature points and 2 structured light stripes, wherein the number of the non-coplanar feature points is greater than or equal to a first preset number, the structured light stripes comprise non-coplanar feature points whose number is greater than or equal to a second preset number, and N is a positive integer;
selecting a first preset number of non-coplanar feature points from the N 3D target images;
calibrating the camera according to the selected first preset number of non-coplanar characteristic points;
selecting a second preset number of non-coplanar feature points from the structured light stripes of the N 3D target images;
and finishing the calibration of the line structured light plane according to the selected second preset number of non-coplanar feature points and the calibration result of the camera.
2. The line structured light sensor calibration method according to claim 1, wherein the calibrating the camera according to the selected first preset number of non-coplanar feature points specifically comprises:
and calibrating the camera by solving a homography matrix between the three-dimensional target coordinate system and the image pixel coordinate system for the first preset number of non-coplanar feature points.
3. The line structured light sensor calibration method according to claim 2, wherein the first predetermined number is 6.
4. The line structured light sensor calibration method according to claim 1,
the calibration of the line structured light plane is completed according to the selected second preset number of non-coplanar feature points and the calibration result of the camera, and specifically includes:
and calculating the coordinates of the characteristic points on the linear structure light plane under the camera coordinate system by using a second preset number of non-coplanar characteristic points and the calibration result of the camera, establishing a linear equation of the linear structure light plane, and completing the calibration of the linear structure light plane equation by using a least square method.
5. The line structured light sensor calibration method according to claim 4, wherein the second predetermined number is 5.
6. The line structured light sensor calibration method according to claim 1, wherein N takes a value in [6, 9].
7. The line structured light sensor calibration method according to claim 1,
after the calibration of the line structured light plane is completed according to the selected second preset number of non-coplanar feature points and the calibration result of the camera, the method further includes:
and calculating and displaying the calibration error.
8. The line structured light sensor calibration method according to claim 1,
the first preset number and the second preset number of non-coplanar feature points are selected in MATLAB.
9. The line structured light sensor calibration method according to claim 1,
and calibrating the camera according to the selected first preset number of non-coplanar feature points, and calibrating the linear structure light plane according to the selected second preset number of non-coplanar feature points and the calibration result of the camera, wherein the calibration is performed in MATLAB software.
CN201710056155.0A 2017-01-25 2017-01-25 Calibrating Technique For The Light-strip Sensors Active CN106705849B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710056155.0A CN106705849B (en) 2017-01-25 2017-01-25 Calibrating Technique For The Light-strip Sensors

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710056155.0A CN106705849B (en) 2017-01-25 2017-01-25 Calibrating Technique For The Light-strip Sensors

Publications (2)

Publication Number Publication Date
CN106705849A CN106705849A (en) 2017-05-24
CN106705849B true CN106705849B (en) 2019-06-21

Family

ID=58908954

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710056155.0A Active CN106705849B (en) 2017-01-25 2017-01-25 Calibrating Technique For The Light-strip Sensors

Country Status (1)

Country Link
CN (1) CN106705849B (en)

Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109697736B (en) * 2017-10-20 2021-10-15 杭州海康机器人技术有限公司 Calibration method and device of measurement system, electronic equipment and readable storage medium
CN108981608B (en) * 2018-05-29 2020-09-22 华南理工大学 Novel line structured light vision system and calibration method
CN108717715B (en) * 2018-06-11 2022-05-31 华南理工大学 Automatic calibration method for linear structured light vision system of arc welding robot
CN109443209B (en) * 2018-12-04 2019-12-31 四川大学 Line structured light system calibration method based on homography matrix
CN109827502B (en) * 2018-12-28 2020-03-17 北京航空航天大学 High-precision calibration method for line-structured light vision sensor for calibration point image compensation
CN110470320B (en) * 2019-09-11 2021-03-05 河北科技大学 Calibration method of swinging scanning type line structured light measurement system and terminal equipment
CN111256591B (en) * 2020-03-13 2021-11-02 易思维(杭州)科技有限公司 External parameter calibration device and method for structured light sensor
CN111750804B (en) * 2020-06-19 2022-10-11 浙江华睿科技股份有限公司 Object measuring method and device
CN112361958B (en) * 2020-11-04 2022-06-21 同济大学 Line laser and mechanical arm calibration method
CN112894209A (en) * 2021-01-19 2021-06-04 常州英迈乐智能系统有限公司 Automatic plane correction method for intelligent tube plate welding robot based on cross laser

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE4013742C2 (en) * 1990-04-28 1994-06-30 Focus Mestechnik Gmbh & Co Kg Scanning head for a machine for measuring the micro surface contour of workpieces
JP3273026B2 (en) * 1998-09-02 2002-04-08 株式会社ミツトヨ Surface tracking type measuring machine
CN1216273C (en) * 2002-12-17 2005-08-24 北京航空航天大学 Method for calibrating structure optical vision sensor
CN1250942C (en) * 2003-06-11 2006-04-12 北京航空航天大学 Construction optical visual sense transducer calibration method based on plane targets
CN101526338B (en) * 2009-04-15 2010-09-08 北京信息科技大学 Field calibration method of structural parameter of line structure light vision sensing system
CN101526336B (en) * 2009-04-20 2011-08-24 陈炳生 Calibration method of linear structured light three-dimensional visual sensor based on measuring blocks

Also Published As

Publication number Publication date
CN106705849A (en) 2017-05-24

Similar Documents

Publication Publication Date Title
CN106705849B (en) Calibrating Technique For The Light-strip Sensors
Salvi et al. A comparative review of camera calibrating methods with accuracy evaluation
JP6426968B2 (en) INFORMATION PROCESSING APPARATUS AND METHOD THEREOF
CN105205858B (en) A kind of indoor scene three-dimensional rebuilding method based on single deep vision sensor
CN100430690C (en) Method for making three-dimensional measurement of objects utilizing single digital camera to freely shoot
Hartley et al. Triangulation
CN109816724A (en) Three-dimensional feature extracting method and device based on machine vision
JP2007502473A (en) Computer vision system for classification and spatial localization of bounded 3D objects
WO2018201677A1 (en) Bundle adjustment-based calibration method and device for telecentric lens-containing three-dimensional imaging system
CN101887585A (en) Method for calibrating camera based on non-coplanar characteristic point
JP6876094B2 (en) Methods, devices, and systems for estimating subpixel positions of extremums in an image
Zhou et al. A novel way of understanding for calibrating stereo vision sensor constructed by a single camera and mirrors
JP2004506215A (en) Strapdown system for 3D reconstruction
JP2016217941A (en) Three-dimensional evaluation device, three-dimensional data measurement system and three-dimensional measurement method
Wan et al. A study in 3d-reconstruction using kinect sensor
Fofi et al. Uncalibrated reconstruction: an adaptation to structured light vision
Yamauchi et al. Calibration of a structured light system by observing planar object from unknown viewpoints
Cauchois et al. Calibration of the omnidirectional vision sensor: SYCLOP
US20020114509A1 (en) Three-dimensional data processing device, three-dimensional data processing method,and program providing medium
CN109741389A (en) One kind being based on the matched sectional perspective matching process of region base
Skabek et al. Comparison of photgrammetric techniques for surface reconstruction from images to reconstruction from laser scanning
Chane et al. Registration of arbitrary multi-view 3D acquisitions
Dhome Visual Perception Through Video Imagery
Uyanik et al. A method for determining 3D surface points of objects by a single camera and rotary stage
CN110634162A (en) Calibration method of structured light vision sensor based on concentric circles

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant