CN116182703B - Line structure light sensor calibration method and system - Google Patents


Info

Publication number
CN116182703B
Authority
CN
China
Prior art keywords
calibration plate
plane
coordinate system
calibration
coordinates
Prior art date
Legal status
Active
Application number
CN202310048348.7A
Other languages
Chinese (zh)
Other versions
CN116182703A (en)
Inventor
李伟明
高兴宇
胡增
肖宏晓
Current Assignee
Guilin University of Electronic Technology
Original Assignee
Guilin University of Electronic Technology
Priority date
Filing date
Publication date
Application filed by Guilin University of Electronic Technology filed Critical Guilin University of Electronic Technology
Priority to CN202310048348.7A priority Critical patent/CN116182703B/en
Publication of CN116182703A publication Critical patent/CN116182703A/en
Application granted granted Critical
Publication of CN116182703B publication Critical patent/CN116182703B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00 Measuring arrangements characterised by the use of optical techniques
    • G01B11/002 Measuring arrangements characterised by the use of optical techniques for measuring two or more coordinates
    • G01B11/005 Measuring arrangements characterised by the use of optical techniques for measuring two or more coordinates: coordinate measuring machines
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02T CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
    • Y02T10/00 Road transport of goods or passengers
    • Y02T10/10 Internal combustion engine [ICE] based vehicles
    • Y02T10/40 Engine management systems

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention provides a line structure light sensor calibration method, which comprises the following steps: S1, acquiring the camera internal parameters and distortion coefficients of the sensor; S2, acquiring a calibration plate image illuminated by an external light source and a calibration plate image with projected laser stripes at the same position, and correcting distortion; S3, obtaining the motion vector of each translation of the calibration plate; S4, solving the plane coordinates of the feature points in a local coordinate system using a homography matrix; S5, acquiring the three-dimensional coordinates of the feature points in the world coordinate system; S6, re-projecting the three-dimensional coordinates of the feature points in the world coordinate system into plane coordinates by principal component analysis; S7, calculating the homography matrix from the light plane to the image plane. The three-dimensional coordinates of the feature points are solved using the homography matrix, the coordinate error is reduced through a plane constraint, and the calibration precision is higher than that of traditional calibration methods based on spatial transformation.

Description

Line structure light sensor calibration method and system
Technical Field
The invention relates to the technical field of line structure light sensors, in particular to a line structure light sensor calibration method and system.
Background
The line structure light sensor has the advantages of non-contact operation, high speed and high precision, and is widely applied in fields such as welding guidance, industrial measurement and quality inspection. High-precision calibration of the light plane is the key to precise measurement with a line structure light sensor. In principle, light plane calibration methods can be divided into direct calibration methods and model calibration methods.
The direct calibration method needs to precisely control the calibration target to perform translational motion by means of a translation mechanism, and directly establishes the relationship between the three-dimensional coordinates of the feature points on the target and the corresponding pixel coordinates by recording the former and extracting the latter. For example, Yuan Yuan et al. used combined gauge blocks to build a two-dimensional lookup table between pixel coordinates and the three-dimensional coordinates of feature points, obtaining three-dimensional coordinates by least-squares fitting of four adjacent points. Another approach establishes a lookup table with a line scale and computes three-dimensional coordinates by cubic interpolation. The direct calibration method can accurately describe the mapping between three-dimensional coordinates and pixel coordinates only when many feature point pairs are extracted, and the manual adjustment workload is large.
The model calibration method mainly describes the light plane of the line structure light sensor in two ways. The first is to establish the plane equation of the light plane in the camera coordinate system. For example, Zhou Fujiang et al. calculated the coordinates of the feature points in a local coordinate system using the cross-ratio invariance principle, and unified the feature points into the world coordinate system via homography matrix decomposition to fit the light plane equation. Various feature point solving methods followed: Han Jiandong et al. solved the three-dimensional coordinates of the feature points using the collinear three-point perspective principle; Wei Yi et al. calculated the feature point coordinates by intersecting rays with a plane; and Long et al. unified the light stripe line equation into the camera coordinate system by a spatial transformation method. These methods unify the feature points into the world coordinate system through spatial transformation, which introduces homography matrix decomposition errors and limits the calibration accuracy of the sensor.
The other model calibration approach establishes a plane coordinate system on the light plane and describes the correspondence between points in that coordinate system and the image coordinate system with a homography matrix. Xu Guangyou et al. proposed such a method, and a sawtooth target method based on cross-ratio invariance was also developed. However, the method of Xu Guangyou et al. requires the light plane to be parallel to the calibration block, and the sawtooth target method requires the light plane to be perpendicular to the serrated surface, so manual adjustment is difficult and unsuitable for engineering applications. Pan et al. extracted a series of feature point pairs with a feature matching method by means of a translation mechanism and a toothed target, but also required the target movement direction to be parallel to the light plane. Chen Xiaohui and Peng Qian achieved light plane calibration by calibrating the homography matrix between the light plane and the camera coordinate system with the aid of a calibration instrument, but the inclined calibration plate is difficult to align completely with the light plane, the requirement on camera depth of field is high, and the approach is unsuitable for small-field-of-view, short-range structured light models. Ping Yisha et al. obtained the three-dimensional coordinates of several groups of feature points with a moving platform to solve the homography matrix; similar to Xu Guangyou et al., this method requires the laser plane to be perpendicular to the planar calibration plate and the light plane to be parallel to the axis of the moving platform, which limits its applicability: an inclined laser plane cannot be calibrated, and absolute parallelism and perpendicularity are difficult to guarantee manually.
Methods that calibrate a homography matrix with a planar target can effectively reduce the number of mapping conversion steps and solve the three-dimensional coordinates of the feature points directly with the homography matrix, avoiding the errors introduced by homography matrix decomposition in traditional methods; in general, homography-based methods reduce both the introduced error and the computational load, with simple steps and high precision. Nevertheless, as the above overview shows, existing methods suffer from low accuracy and complex calibration procedures; in particular, existing homography-based calibration methods often impose special requirements on the positional relationships among the translation mechanism, the calibration target and the light plane, which increases the difficulty of manual adjustment.
Disclosure of Invention
The invention provides a line structure light sensor calibration method and system, belonging to homography matrix light plane calibration based on principal component analysis projection. The calibration plate is controlled to perform a set of translational motions on a lifting mechanism; no manual precise adjustment of the placement position, lifting displacement or laser projection direction is needed, and the method offers higher calibration precision and convenience than traditional methods.
One aspect of the embodiment of the invention discloses a line structure light sensor calibration method, which comprises the following steps:
S1, acquiring camera internal parameters and distortion coefficients of a sensor;
S2, controlling the calibration plate to do a group of translational motions, collecting the calibration plate image irradiated by an external light source at the same position and the calibration plate image projected with laser stripes, and carrying out distortion correction on all the calibration plate images based on internal parameters and distortion coefficients of the camera;
S3, extracting corner coordinates, calculating a homography matrix from the plane of the calibration plate to the plane of the image, and solving a motion vector of each translation of the calibration plate;
S4, fitting a light bar center straight line equation, extracting feature points at intervals on the light bar straight line, extracting pixel coordinates of the feature points, and solving plane coordinates of the feature points in a local coordinate system by utilizing a homography matrix;
S5, establishing a world coordinate system at the initial position of the calibration plate, solving the translation vector of each translational movement of the calibration plate using the projective transformation matrix, and further optimizing the displacement of the calibration plate by a coplanarity constraint on the feature points, so as to obtain the optimized three-dimensional coordinates of the feature points in the world coordinate system;
S6, re-projecting the three-dimensional coordinates of the feature points under the world coordinate system into plane coordinates by adopting a principal component analysis method, wherein the projection plane is the light plane;
S7, calculating a homography matrix from the light plane to the image plane by using the plane coordinates and the corresponding pixel coordinates after the feature point re-projection, and completing the light plane calibration.
In some embodiments, in step S1, the Zhang Zhengyou calibration method is used to calibrate the camera internal parameters and distortion coefficients of the sensor. The camera internal parameter matrix is denoted K, as shown in formula 1:

K = [ f_x   γ    u_0
      0     f_y  v_0
      0     0    1  ]

where f_x, f_y are the scale factors in the X and Y axis directions, γ is the skew factor, and (u_0, v_0) is the principal point coordinate.
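The role of K can be illustrated with a minimal numpy sketch (the numeric values below are hypothetical, for illustration only): a point in camera coordinates is mapped to pixel coordinates by multiplying with K and dividing by depth.

```python
import numpy as np

# Hypothetical intrinsics (fx, fy in pixels; gamma is the skew factor).
fx, fy, gamma, u0, v0 = 2400.0, 2400.0, 0.0, 640.0, 512.0
K = np.array([[fx, gamma, u0],
              [0.0,  fy,  v0],
              [0.0, 0.0, 1.0]])

def project(K, p_cam):
    """Project a 3-D point in camera coordinates to pixel coordinates (formula 1's K)."""
    uvw = K @ p_cam
    return uvw[:2] / uvw[2]   # perspective division by depth

uv = project(K, np.array([0.01, -0.02, 0.5]))  # a point 0.5 m in front of the camera
```

With these example intrinsics the point lands at pixel (688, 416); in practice K comes from Zhang's calibration (e.g. OpenCV's `calibrateCamera`).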
In some embodiments, in step S2, the calibration plate is placed on the lifting platform so that the line laser can be projected onto the calibration plate and the camera can capture the complete calibration plate image. The height of the lifting platform is adjusted multiple times; after each movement stops, the external light source is turned on and the camera is controlled to collect a calibration plate image; then the external light source is turned off, the line laser is turned on, and the calibration plate image with projected laser stripes is captured at the same pose. Calibration plate images are collected at no fewer than 2 positions;
Denote all collected calibration plate images with projected laser stripes as M_L and all calibration plate images illuminated by the external light source as M_B; the number of images in each of M_L and M_B is n, with n ≥ 2. Denote the i-th light stripe image in M_L as M_Li and the target image at the same pose as M_Bi. Distortion correction is then performed on all calibration plate images based on the camera internal parameters and distortion coefficients.
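In practice the distortion correction above is done with OpenCV's `undistort`; as a self-contained sketch of the underlying idea, the radial model x_d = x(1 + k1·r² + k2·r⁴) can be inverted by fixed-point iteration on normalized coordinates. The coefficients here are hypothetical; this is not the patent's implementation.

```python
import numpy as np

def undistort_normalized(xd, yd, k1, k2, iters=10):
    """Invert the radial distortion model xd = x*(1 + k1*r^2 + k2*r^4)
    by fixed-point iteration, on normalized image coordinates."""
    x, y = xd, yd
    for _ in range(iters):
        r2 = x * x + y * y
        scale = 1.0 + k1 * r2 + k2 * r2 * r2
        x, y = xd / scale, yd / scale   # refine the undistorted estimate
    return x, y
```

For typical small distortion coefficients the iteration converges in a few steps; a forward-distorted point is recovered to machine-level accuracy.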
In some embodiments, in step S3, a local coordinate system Ox_iy_iz_i is established with the first corner point at the lower left of the calibration plate as the origin; the X axis runs along the length of the calibration plate, the Y axis along its width, and the Z axis perpendicular to it. The world coordinate system Oxyz is established at the origin of Ox_0y_0z_0, with coordinate axes in the same directions as the local coordinate system;
The light stripe generated by the intersection of the line structure light with the calibration plate at the i-th position is denoted L_i. Feature points are taken on L_i along the Y axis direction of the local coordinate system at interval Δy; the j-th feature point on L_i is denoted P_{m·i+j}, where j = 1, 2, 3, …, m, m is the number of feature points taken on each light stripe, and Δy equals the side length of the calibration plate squares;
The pixel coordinates of the corner points of the calibration plate images in M_B are extracted with the OpenCV checkerboard corner extraction function, the plane coordinates of the corner points in the local coordinate system are calculated from the calibration plate dimensions, and formula 2 follows from the principle of planar perspective projection:

s·[u_n^i, v_n^i, 1]^T = H_i·[x_n^i, y_n^i, 1]^T

where (u_n^i, v_n^i) are the pixel coordinates of the n-th feature point in image M_Bi, (x_n^i, y_n^i) are the plane coordinates of the corresponding feature point in the local coordinate system, s is a scale factor, and H_i is the homography matrix;
The homography matrix from the calibration plate plane to the image plane at each position is then solved with the L-M (Levenberg-Marquardt) algorithm, given the camera internal parameter matrix K;
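Before L-M refinement, a homography is conventionally initialized by the direct linear transform (DLT) from ≥ 4 point correspondences; the sketch below (numpy only, not the patent's implementation) shows that step.

```python
import numpy as np

def find_homography(src, dst):
    """DLT estimate of H mapping src plane points to dst image points.
    src, dst: (N, 2) arrays, N >= 4, non-degenerate configuration."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        # Each correspondence contributes two rows of the constraint A·h = 0.
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    H = Vt[-1].reshape(3, 3)       # null-space vector = homography entries
    return H / H[2, 2]             # fix the projective scale
```

With exact correspondences this recovers the true homography up to scale; with noisy corners it serves as the starting point for the L-M optimization the text describes.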
According to the Zhang Zhengyou calibration method, the translation vector t_i from the camera coordinate system to the local coordinate system of the calibration plate can be recovered from the homography matrix H_i. The translation vector is solved as formula 3:

t_i = λ_i·K^{-1}·h_3^i,  with λ_i = 1/||K^{-1}·h_1^i||

where K is the camera internal parameter matrix, h_3^i is the 3rd column of H_i, h_1^i is the 1st column of H_i, and t_i is the translation vector from the camera coordinate system to the local coordinate system at the i-th position of the calibration plate. When the calibration plate moves from the i-th position to the (i+1)-th position, its translation vector is given by formula 4:

Δt_{i+1} = t_{i+1} − t_i
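Formulas 3 and 4 can be sketched directly in numpy. This assumes Zhang's model H = K·[r1 r2 t] for a plane at Z = 0; the numeric values in the test are illustrative.

```python
import numpy as np

def translation_from_homography(K, H):
    """Recover t from H = K*[r1 r2 t] (Zhang's model, plane at Z=0).
    Scale lambda = 1/||K^{-1} h1|| since r1 is a unit vector (formula 3)."""
    Kinv = np.linalg.inv(K)
    lam = 1.0 / np.linalg.norm(Kinv @ H[:, 0])
    return lam * (Kinv @ H[:, 2])

def lift_vector(K, H_i, H_next):
    """Translation of the plate between two positions (formula 4)."""
    return translation_from_homography(K, H_next) - translation_from_homography(K, H_i)
```

For a synthetic H built from known K and t, the function returns t exactly; with real, noisy homographies the recovered Δt is what the coplanarity constraint of step S5 later refines.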
In some embodiments, in step S4, the light stripe center of L_i in M_Li is extracted with a squared-gray-weighted centroid method, and the light stripe center line is then fitted with the random sample consensus (RANSAC) algorithm. The line equation of L_i in the image is written as formula 5:

[a_i, b_i, c_i]·[u, v, 1]^T = 0;
The feature point P_{m·i+j} is the intersection of the light stripe L_i and the line y = (j−1)·Δy in the local coordinate system Ox_iy_iz_i, where j = 1, 2, …, m;

By the principle of planar perspective projection, the equation of the line y = (j−1)·Δy in the image after projective transformation is formula 6:

(H_i^{-T}·[0, 1, −(j−1)·Δy]^T)^T·[u, v, 1]^T = 0

The pixel coordinates of the feature point P_{m·i+j} in the image are the intersection of the two line equations, formula 5 and formula 6. By the principle of planar perspective, the plane coordinates of the feature point in the local coordinate system are given by formula 7:

[x, y, 1]^T ∝ H_i^{-1}·[u, v, 1]^T
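The intersection of the two image lines in formulas 5 and 6 has a compact homogeneous-coordinates form: the intersection point of lines l1 and l2 is their cross product. A minimal numpy sketch:

```python
import numpy as np

def intersect_lines(l1, l2):
    """Intersection of two image lines given as [a, b, c] with a*u + b*v + c = 0.
    In homogeneous coordinates the intersection point is the cross product l1 x l2."""
    p = np.cross(l1, l2)
    return p[:2] / p[2]   # back to inhomogeneous pixel coordinates
```

For example, the lines u = 2 ([1, 0, −2]) and v = 3 ([0, 1, −3]) intersect at pixel (2, 3); with the fitted stripe line (formula 5) and the transformed grid line (formula 6) this yields the feature point pixel coordinates.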
In some embodiments, in step S5, let O_0, O_1 be the origins of the local coordinate system of the calibration plate at successive positions, and t_0, t_1 the translation vectors pointing from the origin of the camera coordinate system O_cX_cY_cZ_c to those origins. Since t_i can be recovered from the projective transformation matrix H_i, the translation vector of each translational movement of the calibration plate can be solved;
Let h_m^i denote the m-th column of H_i. The translation vector is solved by formulas 8 and 9:

t_i = K^{-1}·h_3^i / ||K^{-1}·h_1^i||    (formula 8)

Δt_{i+1} = t_{i+1} − t_i    (formula 9)

where t_i is the translation vector of the local coordinate system relative to the camera coordinate system at the i-th position of the calibration plate, Δt_{i+1} is the (i+1)-th lifting translation vector of the calibration plate, and K is the camera internal parameter matrix.
When the calibration plate translates along the Z axis, the three-dimensional coordinates of the feature point P_{m×i+j} in the world coordinate system are given by formula 10:

[x_{m×i+j}, y_{m×i+j}, z_{m×i+j}]^T = [x, y, 0]^T + (t_i − t_0)

where (x, y) are the local plane coordinates recovered from the pixel coordinates via formula 7, u_{m×i+j}, v_{m×i+j} are the pixel coordinates of the feature point, and x_{m×i+j}, y_{m×i+j}, z_{m×i+j} are its three-dimensional coordinates in the world coordinate system;
The displacement of the calibration plate is further optimized according to the coplanarity of the feature points, which further improves the precision of their three-dimensional coordinates. Denote the plane equation of the light plane in the world coordinate system as formula 11:

a·x + b·y + c·z + 1 = 0;
From the three-dimensional coordinates of the feature points, formula 12 can be established:

A·x = B

Denote A as the matrix whose rows are the feature point coordinates (x_n, y_n, z_n), x = [a, b, c]^T, and B = [−1, −1, …, −1]^T. The sum of squared residuals is then formula 13:

f = ||A·x − B||^2
The least-squares optimal solution for x is (A^T·A)^{-1}·A^T·B. Substituting this optimal solution into formula 13 gives formula 14:

f = ||A·(A^T·A)^{-1}·A^T·B − B||^2
Taking the Z-axis coordinates of the feature points as the independent variables and the sum of squared residuals as the objective function, i.e. minimizing the expression above, the Z-axis coordinates of the feature points are optimized to obtain the optimized three-dimensional coordinates of the feature points.
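The inner least-squares step of formulas 12 to 14 can be sketched with numpy's `lstsq`; the helper name and test points are illustrative, and the outer optimization over the Z coordinates (e.g. by a generic minimizer) is omitted here.

```python
import numpy as np

def fit_plane(points):
    """Least-squares fit of a*x + b*y + c*z + 1 = 0 (formula 11) to (N, 3) points.
    Solves A*x = B with A = point coordinates, B = -1 (formula 12),
    and returns the coefficients and residual f = ||A*x - B||^2 (formula 13)."""
    A = np.asarray(points, dtype=float)
    B = -np.ones(len(A))
    coeffs, *_ = np.linalg.lstsq(A, B, rcond=None)
    residual = float(np.linalg.norm(A @ coeffs - B) ** 2)
    return coeffs, residual
```

For exactly coplanar points the residual is zero; in the patent's step S5 this residual, as a function of the feature points' Z coordinates, is the objective that the displacement optimization minimizes.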
In some embodiments, in step S6, the noisy three-dimensional coordinate points are reprojected into plane coordinates by principal component analysis. The three-dimensional coordinates of the feature points are arranged in rows, denoted as formula 15:

X = [x_n − x̄, y_n − ȳ, z_n − z̄]_{n×3}

where x_n, y_n, z_n are the three-dimensional coordinates of feature point P_n in the world coordinate system, x̄, ȳ, z̄ are the mean values of the three coordinate components over all feature points, and n is the number of feature points. Orthogonal decomposition of X^T·X gives formula 16:

X^T·X = [η_1, η_2, η_3]·diag(λ_1, λ_2, λ_3)·[η_1, η_2, η_3]^T

where λ_1 > λ_2 > λ_3 and η_1, η_2, η_3 are the corresponding eigenvectors;

A coordinate system Ox_Ly_Lz_L is established with (x̄, ȳ, z̄) as the origin, the X axis along η_1 and the Z axis along η_3; the Ox_Ly_L plane is the light plane. The three-dimensional coordinates of the feature points are reprojected into Ox_Ly_Lz_L and the Z coordinate is set to 0, so the coordinates of a feature point in the plane coordinate system Ox_Ly_L are given by formula 17:

[x_L, y_L]^T = [η_1, η_2]^T·([x_n, y_n, z_n]^T − [x̄, ȳ, z̄]^T)
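The PCA reprojection of formulas 15 to 17 can be sketched as follows (numpy only; `eigh` is used because X^T·X is symmetric, and eigenvector signs are arbitrary, which does not affect the plane):

```python
import numpy as np

def pca_project_to_plane(points):
    """Project nearly coplanar 3-D points onto their best-fit plane.
    Centers the points (formula 15), eigendecomposes X^T X (formula 16),
    and keeps the coordinates along the first two principal directions (formula 17)."""
    P = np.asarray(points, dtype=float)
    mean = P.mean(axis=0)
    X = P - mean
    w, V = np.linalg.eigh(X.T @ X)      # eigenvalues ascending
    eta1, eta2 = V[:, -1], V[:, -2]     # directions spanning the light plane
    coords2d = X @ np.column_stack([eta1, eta2])
    return coords2d, mean
```

Because the projection drops only the smallest-variance direction (η_3, the plane normal), distances within the plane are preserved for exactly coplanar points, which is what makes the subsequent light-plane-to-image homography well defined.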
in some embodiments, in step S7, the homography matrix from the light plane to the image plane is solved by using an L-M algorithm, and the light plane calibration is completed.
Another aspect of the embodiments of the present invention discloses a line structured light sensor calibration system, comprising:
The acquisition module is used for acquiring camera internal parameters and distortion coefficients of the sensor;
The distortion correction module is used for collecting the calibration plate image irradiated by the external light source at the same position and the calibration plate image projected with the laser stripes by controlling the calibration plate to do a group of translational movements, and correcting the distortion of all the calibration plate images based on the internal parameters and the distortion coefficients of the camera;
the first coordinate acquisition module is used for calculating a homography matrix from the plane of the calibration plate to the image plane by extracting angular point coordinates and solving a motion vector of each translation of the calibration plate;
the second coordinate acquisition module is used for extracting characteristic points at intervals on the light bar straight line by fitting a light bar center straight line equation, extracting pixel coordinates of the characteristic points, and solving plane coordinates of the characteristic points in a local coordinate system by utilizing a homography matrix;
The three-dimensional coordinate acquisition module is used for establishing a world coordinate system at the initial position of the calibration plate, solving the translation vector of each translational movement of the calibration plate using the projective transformation matrix, and further optimizing the displacement of the calibration plate by a coplanarity constraint on the feature points, to obtain the optimized three-dimensional coordinates of the feature points in the world coordinate system;
The principal component analysis module is used for re-projecting the three-dimensional coordinates of the feature points under the world coordinate system into plane coordinates by adopting a principal component analysis method, and the projection plane is a light plane;
and the calibration module is used for calculating a homography matrix from the light plane to the image plane by utilizing the plane coordinates and the corresponding pixel coordinates after the feature point re-projection to finish the light plane calibration.
In some embodiments, the line structured light sensor calibration system further comprises:
The processor is respectively connected with the acquisition module, the distortion correction module, the first coordinate acquisition module, the second coordinate acquisition module, the three-dimensional coordinate acquisition module, the principal component analysis module and the calibration module;
A memory coupled to the processor and storing a computer program executable on the processor;
When the processor executes the computer program, it controls the acquisition module, the distortion correction module, the first coordinate acquisition module, the second coordinate acquisition module, the three-dimensional coordinate acquisition module, the principal component analysis module and the calibration module to realize the line structure light sensor calibration method.
In summary, the invention has at least the following advantages:
The method first calibrates the camera to obtain its internal parameter matrix and distortion coefficients, then controls the calibration plate to perform a set of translational motions, collects target images and laser stripe images, and calculates the homography matrix from each calibration plate plane to the image plane. A local coordinate system is established on the calibration plate, and the local coordinate system at the initial position of the calibration plate is taken as the world coordinate system. Based on the planar projective transformation principle, the coordinates of the feature points in the local coordinate system are solved from their pixel coordinates. The translation vector of the calibration plate in the world coordinate system is solved to obtain the three-dimensional coordinates of the feature points in the world coordinate system, and the three-dimensional coordinate errors are reduced through a plane constraint. The three-dimensional coordinates of the feature points are then projected by principal component analysis onto the plane formed by the first and second principal components, which is the light plane, and the coordinates of the feature points on the projection plane are calculated. Finally, the homography matrix from the projection plane to the image plane is solved from the coordinates of the feature points on the projection plane and the corresponding pixel coordinates, completing the light plane calibration. The invention only needs to control the calibration plate to perform a set of translational motions, places no special requirements on the placement position of the calibration plate or the projection direction of the light beam, and removes the manual precise adjustment required by traditional homography-based calibration methods.
The method uses principal component analysis to reduce the dimension of the three-dimensional coordinates of the feature points by projecting them onto the light plane, retaining the information of the original data to the greatest extent through constrained denoising. Solving the three-dimensional coordinates of the feature points with the homography matrix reduces the error introduced by homography matrix decomposition, so the calibration precision is higher than that of traditional calibration methods based on spatial transformation.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings required for the description of the embodiments will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
Fig. 1 is a schematic diagram illustrating steps of a line structured light sensor calibration method according to the present invention.
FIG. 2 is a schematic block diagram of a line structured light sensor calibration system according to the present invention.
Fig. 3 is a schematic view of a line laser, camera, lift platform and calibration plate according to the present invention.
Fig. 4 is a schematic diagram of an example set of images acquired in accordance with the present invention.
Fig. 5 is a schematic diagram of a translational motion model of a calibration plate according to the present invention.
Fig. 6 is a schematic diagram of the calibration plate translation vector solving principle involved in the present invention.
Fig. 7 is a schematic diagram of the feature point extraction effect according to the present invention.
Fig. 8 is a schematic diagram showing the world coordinates of feature points and the effect of principal component analysis according to the present invention.
Reference numerals:
1. the device comprises a line laser, 2, a camera, 3, a lifting platform, 4 and a calibration plate.
Detailed Description
Hereinafter, only certain exemplary embodiments are briefly described. As will be recognized by those of skill in the pertinent art, the described embodiments may be modified in numerous different ways without departing from the spirit or scope of the embodiments of the present invention. Accordingly, the drawings and description are to be regarded as illustrative in nature and not as restrictive.
The following disclosure provides many different implementations, or examples, for implementing different configurations of embodiments of the invention. In order to simplify the disclosure of embodiments of the present invention, components and arrangements of specific examples are described below. Of course, they are merely examples and are not intended to limit embodiments of the present invention. Furthermore, embodiments of the present invention may repeat reference numerals and/or letters in the various examples, which are for the purpose of brevity and clarity, and which do not themselves indicate the relationship between the various embodiments and/or arrangements discussed.
Embodiments of the present invention will be described in detail below with reference to the accompanying drawings.
As shown in fig. 1, an aspect of the embodiment of the present invention discloses a line structured light sensor calibration method, which includes the following steps:
S1, acquiring camera internal parameters and distortion coefficients of a sensor.
In some embodiments, in step S1, the Zhang Zhengyou calibration method is used to calibrate the camera internal parameters and distortion coefficients of the sensor. The camera internal parameter matrix is denoted K, as shown in formula 1.

K = [ f_x   γ    u_0
      0     f_y  v_0
      0     0    1  ]

where f_x, f_y are the scale factors in the X and Y axis directions, γ is the skew factor, and (u_0, v_0) is the principal point coordinate.
S2, controlling the calibration plate 4 to do a group of translational motions, collecting the calibration plate images irradiated by the external light source at the same position and the calibration plate images projected with the laser stripes, and carrying out distortion correction on all the calibration plate images based on the internal parameters and the distortion coefficients of the camera.
In some embodiments, in step S2, as shown in fig. 3, the checkerboard calibration plate 4 is placed on the lifting platform 3 so that the line laser can be projected onto the calibration plate 4 and the camera 2 can capture a complete calibration plate image. The height of the lifting platform is adjusted multiple times; after each movement stops, the external light source is turned on and the camera 2 is controlled to collect a calibration plate image; then the external light source is turned off, the line laser 1 is turned on, and the calibration plate image with projected laser stripes is captured at the same pose. Calibration plate images are collected at no fewer than 2 positions (i.e. a group of calibration plate images with light stripes and a group of target images without light stripes are collected);
Recording all collected calibration plate images projected with laser stripes as M L, recording the number of the images in M B,ML、MB under the irradiation of all external light sources as n which is more than or equal to 2, recording the ith Zhang Guangtiao image in M L as M Li, recording the target image in the same pose as M Bi, and then carrying out distortion correction on all the calibration plate images based on camera internal parameters and distortion coefficients (namely carrying out de-distortion treatment on two groups of images according to the camera internal parameters and the distortion coefficients to obtain an image group); fig. 4 is an example of a group of images taken, in fig. 4, (a) is a calibration plate image (target image) irradiated with an external light source, and (b) is a calibration plate image projected with laser stripes.
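In practice the undistortion is done with OpenCV (cv2.undistort) using the calibrated coefficients. The sketch below only illustrates the underlying radial model and its fixed-point inversion; the intrinsics and distortion coefficients k1, k2 are made up:

```python
import numpy as np

# Illustrative intrinsics and radial distortion coefficients (not from the patent).
fx, fy, u0, v0 = 1200.0, 1200.0, 640.0, 480.0
k1, k2 = -0.1, 0.01

def distort(u, v):
    # Normalize pixel coordinates, apply the radial model, re-project.
    x, y = (u - u0) / fx, (v - v0) / fy
    r2 = x * x + y * y
    d = 1.0 + k1 * r2 + k2 * r2 * r2
    return u0 + fx * x * d, v0 + fy * y * d

def undistort(ud, vd, iters=20):
    # Invert the radial model by fixed-point iteration, as OpenCV does
    # internally when remapping distorted pixels.
    xd, yd = (ud - u0) / fx, (vd - v0) / fy
    x, y = xd, yd
    for _ in range(iters):
        r2 = x * x + y * y
        d = 1.0 + k1 * r2 + k2 * r2 * r2
        x, y = xd / d, yd / d
    return u0 + fx * x, v0 + fy * y

ud, vd = distort(800.0, 600.0)
ur, vr = undistort(ud, vd)
```

Round-tripping a pixel through distort and undistort recovers it to well below a millipixel for moderate distortion, which is the behaviour the correction step relies on.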
S3, extracting angular point coordinates, calculating a homography matrix from the plane of the calibration plate to the plane of the image, and solving a motion vector of each translation of the calibration plate 4.
In some embodiments, in step S3, fig. 5 shows the model of the translational movement of the calibration plate 4. A local coordinate system O_i x_i y_i z_i is established with the first corner point at the lower left of the calibration plate 4 as the origin, the X axis along the length direction of the calibration plate 4, the Y axis along its width direction, and the Z axis perpendicular to the calibration plate 4. A world coordinate system Oxyz is established at the origin of O_0 x_0 y_0 z_0, with the coordinate axes in the same directions as the local coordinate system;
The center line of the light bar generated by the intersection of the line-structured light with the calibration plate 4 at the i-th position is denoted L_i. Feature points are taken on L_i along the Y-axis direction of the local coordinate system at intervals Δy; the j-th feature point on L_i is denoted P_{m×i+j}, where j = 1, 2, 3, …, m and m is the number of feature points taken on each light bar. In fig. 5, Δy is equal to the side length of a square of the calibration plate 4;
The pixel coordinates of the corner points of the calibration plate images in M_B are extracted with the OpenCV checkerboard corner extraction function, and the plane coordinates of the corner points in the local coordinate system are calculated from the dimensions of the calibration plate 4. By the principle of planar perspective projection, formula 2 is obtained:
s·[u_n, v_n, 1]^T = H_i·[x_n, y_n, 1]^T (2)
where (u_n, v_n) are the pixel coordinates of the n-th feature point in image M_Bi and (x_n, y_n) are the plane coordinates of the corresponding feature point in the local coordinate system;
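The planar projection of formula 2 determines H_i from point correspondences. The patent solves it with the L-M algorithm; the SVD-based direct linear transform below is only an illustrative stand-in, with synthetic points and a synthetic ground-truth homography:

```python
import numpy as np

def estimate_homography(plane_pts, pixel_pts):
    # Direct linear transform (DLT): each correspondence contributes two
    # rows of a homogeneous system A·h = 0; h is the null vector via SVD.
    A = []
    for (x, y), (u, v) in zip(plane_pts, pixel_pts):
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    _, _, Vt = np.linalg.svd(np.array(A))
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]

def apply_h(H, x, y):
    w = H @ np.array([x, y, 1.0])
    return w[:2] / w[2]

# Synthetic check: points mapped by a known homography are recovered.
H_true = np.array([[1.1, 0.02, 5.0], [0.01, 0.95, -3.0], [1e-4, 2e-4, 1.0]])
plane = [(0, 0), (10, 0), (10, 10), (0, 10), (5, 3)]
pixels = [apply_h(H_true, x, y) for x, y in plane]
H = estimate_homography(plane, pixels)
```

With noise-free correspondences the DLT recovers H_true exactly (up to scale); with real corner detections, a DLT estimate of this kind is typically used as the initial value for the L-M refinement the patent names.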
Using the camera internal parameter matrix K, the homography matrix from the calibration plate plane to the image plane at each position is solved with the L-M algorithm (Levenberg-Marquardt algorithm);
As known from the Zhang Zhengyou calibration method, the translation vector t_i from the camera coordinate system to the local coordinate system of the calibration plate can be decomposed from the homography matrix H_i; the translation vector is solved by formula 3:
t_i = λ·K^{-1}·h_{i3}, λ = 1/||K^{-1}·h_{i1}|| (3)
where K is the camera internal parameter matrix, h_{i3} is the 3rd column of H_i, h_{i1} is the 1st column of H_i, and t_i is the translation vector from the camera coordinate system to the local coordinate system at the i-th position of the calibration plate 4. As shown in fig. 6, when the calibration plate 4 moves from the i-th position to the (i+1)-th position, the translation vector of the calibration plate 4 is given by formula 4:
Δt_{i+1} = t_{i+1} − t_i (4)
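The decomposition t = λ·K^{-1}·h_3 with λ = 1/||K^{-1}·h_1|| can be checked numerically; the intrinsics and translation below are synthetic, and the plate is taken parallel to the image plane (R = I) for simplicity:

```python
import numpy as np

def translation_from_homography(K, H):
    # t = λ·K^{-1}·h3, with the scale λ fixed so that K^{-1}·h1 (= r1)
    # has unit norm, as in Zhang's decomposition.
    Kinv = np.linalg.inv(K)
    lam = 1.0 / np.linalg.norm(Kinv @ H[:, 0])
    return lam * (Kinv @ H[:, 2])

# Build a synthetic H = K·[r1 r2 t] and check that t is recovered.
K = np.array([[1000.0, 0.0, 320.0],
              [0.0, 1000.0, 240.0],
              [0.0, 0.0, 1.0]])
R = np.eye(3)                      # plate parallel to the image plane
t_true = np.array([0.1, -0.2, 1.5])
H = K @ np.column_stack((R[:, 0], R[:, 1], t_true))
t = translation_from_homography(K, H)
```

Differencing two such vectors, t_{i+1} − t_i, gives the per-move displacement Δt_{i+1} of formula 4.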
S4, fitting a light bar center straight line equation, extracting feature points at intervals on the light bar straight line, extracting pixel coordinates of the feature points, and solving plane coordinates of the feature points in a local coordinate system by utilizing a homography matrix.
In some embodiments, in step S4, the center of the light bar in M_Li is extracted by the squared gray-weighted centroid method, and the straight line through the light bar center is then fitted with the random sample consensus (RANSAC) algorithm; the line equation of L_i in the image is recorded as formula 5:
[a_i, b_i, c_i]·[u, v, 1]^T = 0 (5)
The feature point P_{m×i+j} is the intersection of the light bar center line L_i and the line y = (j−1)·Δy in the local coordinate system O_i x_i y_i z_i, where j = 1, 2, …, m;
By the principle of planar perspective projection, the equation of the line y = (j−1)·Δy in the image after projective transformation is formula 6:
[0, 1, −(j−1)·Δy]·H_i^{-1}·[u, v, 1]^T = 0 (6)
The pixel coordinates of the feature point P_{m×i+j} in the image are the intersection of the two line equations, formula 5 and formula 6. By the principle of planar perspective, the plane coordinates of the feature point in the local coordinate system are given by formula 7:
s·[x_{i,j}, y_{i,j}, 1]^T = H_i^{-1}·[u_{i,j}, v_{i,j}, 1]^T (7)
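Both formula 5 and formula 6 are lines written in homogeneous coordinates, so their intersection is simply the cross product of the two coefficient vectors. The two example lines below are illustrative, not from the embodiment:

```python
import numpy as np

def line_intersection(l1, l2):
    # In homogeneous coordinates, the intersection point of two lines
    # l1·[u, v, 1]^T = 0 and l2·[u, v, 1]^T = 0 is their cross product.
    p = np.cross(l1, l2)
    return p[:2] / p[2]

# Illustrative lines: u + v - 100 = 0 (stripe) and v - 40 = 0 (grid line).
stripe = np.array([1.0, 1.0, -100.0])
grid = np.array([0.0, 1.0, -40.0])
u, v = line_intersection(stripe, grid)
```

Here the intersection is (u, v) = (60, 40). For parallel lines the third component of the cross product is zero, so a real implementation should guard that division.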
S5, establishing a world coordinate system at the initial position of the calibration plate, solving a translation vector of each translation movement of the calibration plate by utilizing a projective transformation matrix, and further optimizing the displacement of the calibration plate according to the constraint of the coplanarity of the characteristic points to obtain the three-dimensional coordinates of the optimized characteristic points under the world coordinate system.
In some embodiments, in step S5, let O_0 and O_1 be the origins of the local coordinate system of the calibration plate 4 at two positions, and t_0 and t_1 the translation vectors pointing from the origin of the camera coordinate system O_c X_c Y_c Z_c to those origins. Since t_i can be decomposed from the projective transformation matrix H_i, the translation vector of each translational movement of the calibration plate 4 can be solved;
Let h_{i3} be the 3rd column of H_i and h_{i1} its 1st column; the translation vector is solved by formula 8 and formula 9:
t_i = λ·K^{-1}·h_{i3}, λ = 1/||K^{-1}·h_{i1}|| (8)
Δt_{i+1} = t_{i+1} − t_i (9)
where t_i is the translation vector of the local coordinate system relative to the camera coordinate system at the i-th position of the calibration plate 4, Δt_{i+1} is the (i+1)-th lifting translation vector of the calibration plate 4, and K is the camera internal parameter matrix.
When the calibration plate 4 translates along the Z axis, the three-dimensional coordinates of the feature point P_{m×i+j} in the world coordinate system are given by formula 10:
x_{m×i+j} = x_{i,j}, y_{m×i+j} = y_{i,j}, z_{m×i+j} = Σ_{k=1}^{i} Δz_k (10)
where u_{m×i+j}, v_{m×i+j} are the pixel coordinates of the feature point, x_{m×i+j}, y_{m×i+j}, z_{m×i+j} are the three-dimensional coordinates of the feature point in the world coordinate system, (x_{i,j}, y_{i,j}) are the plane coordinates obtained from formula 7, and Δz_k is the Z component of the k-th translation vector Δt_k;
The displacement of the calibration plate 4 is further optimized according to the constraint that the feature points are coplanar, which further improves the precision of the three-dimensional coordinates of the feature points. The plane equation of the light plane in the world coordinate system is recorded as formula 11:
ax + by + cz + 1 = 0 (11)
From the three-dimensional coordinates of the feature points, formula 12 can be established:
A·x = B, where A = [[x_1, y_1, z_1], …, [x_n, y_n, z_n]], x = [a, b, c]^T, B = [−1, −1, …, −1]^T (12)
Then the residual sum of squares is formula 13:
f = ||A·x − B||^2 (13)
The least-squares optimal solution for x is (A^T·A)^{-1}·A^T·B; substituting the optimal solution into formula 13 yields formula 14:
f = ||A·(A^T·A)^{-1}·A^T·B − B||^2 (14)
Taking the Z-axis coordinates of the feature points as the independent variables and the residual sum of squares as the objective function to be minimized, the Z-axis coordinates of the feature points are optimized, with initial values calculated according to formula 9; the optimized three-dimensional coordinates of the feature points are taken as the final result for the subsequent dimension-reduction processing. In addition, when the motion precision of the lifting platform is high enough that the displacement can be read directly from the equipment, the calibration method can be further simplified by omitting the translation vector decomposition step.
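The least-squares plane fit behind formulas 11 to 14 can be sketched directly: ax + by + cz = −1 stacks into A·x = B with every entry of B equal to −1. The plane coefficients and sample points below are synthetic:

```python
import numpy as np

rng = np.random.default_rng(0)
a, b, c = 0.02, -0.01, -0.5          # made-up plane coefficients
xy = rng.uniform(-10, 10, size=(50, 2))
# Place the points exactly on the plane ax + by + cz + 1 = 0.
z = (-1.0 - a * xy[:, 0] - b * xy[:, 1]) / c

A = np.column_stack((xy[:, 0], xy[:, 1], z))   # rows [x_k, y_k, z_k]
B = -np.ones(50)                               # right-hand side of formula 12

# x* = (A^T A)^{-1} A^T B, computed stably via lstsq.
coeffs, *_ = np.linalg.lstsq(A, B, rcond=None)
residual = np.linalg.norm(A @ coeffs - B) ** 2  # formula 13 at the optimum
```

For exactly coplanar points the recovered coefficients match (a, b, c) and the residual of formula 13 vanishes; in step S5 this residual is instead kept as the objective and minimized over the Z coordinates.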
And S6, re-projecting the three-dimensional coordinates of the feature points under the world coordinate system into plane coordinates by adopting a principal component analysis method, wherein the projection plane is the light plane.
In theory, all feature points are located on the light plane, but the feature points obtained by actual calculation are not located on one plane due to the influence of noise and feature extraction errors, and cannot be directly used for estimating the homography matrix from the light plane to the image plane.
In some embodiments, in step S6, the three-dimensional coordinate points containing noise are re-projected into plane coordinates by principal component analysis. The centered three-dimensional coordinates of the feature points are arranged in rows and recorded as formula 15:
X = [[x_1 − x̄, y_1 − ȳ, z_1 − z̄], …, [x_n − x̄, y_n − ȳ, z_n − z̄]] (15)
where x_k, y_k, z_k are the three-dimensional coordinates of feature point P_k in the world coordinate system, x̄, ȳ, z̄ are the means of the three coordinate components of all feature points in the world coordinate system, and n is the number of feature points. Orthogonal decomposition of X^T·X gives formula 16:
X^T·X = [η_1, η_2, η_3]·diag(λ_1, λ_2, λ_3)·[η_1, η_2, η_3]^T (16)
where λ_1 > λ_2 > λ_3. From the theory of principal component analysis (PCA), the variance of the data points is largest along the eigenvector corresponding to the largest eigenvalue and smallest along the eigenvector corresponding to the smallest eigenvalue.
With (x̄, ȳ, z̄) as the origin, a coordinate system O x_L y_L z_L is established, with the X axis along η_1 and the Z axis along η_3; the O x_L y_L plane is the light plane. The three-dimensional coordinates of the feature points are re-projected into O x_L y_L z_L and the Z coordinate is set to 0, so that the coordinates of the feature points in the plane coordinate system O x_L y_L are given by formula 17:
[x_L, y_L]^T = [η_1, η_2]^T·[x − x̄, y − ȳ, z − z̄]^T (17)
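The PCA re-projection of step S6 can be sketched as follows; the noisy near-planar points are synthetic (a plane close to z = 0 with tiny out-of-plane noise):

```python
import numpy as np

rng = np.random.default_rng(1)
# Synthetic feature points: spread in x, y; tiny noise out of the plane.
pts = np.column_stack((rng.uniform(-5, 5, 200),
                       rng.uniform(-5, 5, 200),
                       rng.normal(0.0, 1e-4, 200)))

mean = pts.mean(axis=0)
X = pts - mean                              # rows of formula 15
vals, vecs = np.linalg.eigh(X.T @ X)        # eigh returns ascending eigenvalues
order = np.argsort(vals)[::-1]              # reorder so that λ1 > λ2 > λ3
eta = vecs[:, order]                        # columns η1, η2, η3

plane_xy = X @ eta[:, :2]                   # coordinates in the O x_L y_L plane
out_of_plane = X @ eta[:, 2]                # component along η3, dropped to 0
```

Dropping the η3 component is exactly the "set the Z coordinate to 0" step; for near-planar data the discarded component is of the order of the noise, matching the tiny third eigenvalue reported in the experiment.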
S7, calculating a homography matrix from the light plane to the image plane by using the plane coordinates and the corresponding pixel coordinates after the feature point re-projection, and completing the light plane calibration.
In some embodiments, in step S7, the homography matrix from the light plane to the image plane is solved by using an L-M algorithm, and the light plane calibration is completed.
The correctness of the method is verified by experiment. Fig. 7 shows the feature point extraction effect; the interval is half the size of a checkerboard square. The three-dimensional coordinates of the feature points calculated by the method are shown in fig. 8, and the light plane obtained by principal component analysis is O x_L y_L. The calculated eigenvalues of the covariance matrix of the feature point coordinates are 140.215, 74.266 and 9.224×10^-6 respectively, the smallest eigenvalue accounting for an extremely small proportion. The re-projected plane coordinates are inverse-transformed back into three-dimensional coordinates and compared with the data before dimension reduction; the average Euclidean distance per point is only 2.2×10^-3 mm.
Solving with the re-projected plane coordinates of the feature points and the corresponding pixel coordinates yields the homography matrix H_p from the light plane to the image plane, recorded as formula 18;
The ideal coordinates of the feature points projected onto the image plane are calculated using H_p; their average Euclidean distance from the actual pixel coordinates of the feature points is only 0.2560 pixels, smaller than the calculation error of the homography matrix from the calibration plate plane to the image plane, which shows that the calibration method has high precision.
As shown in fig. 2, another aspect of an embodiment of the present invention discloses a line structured light sensor calibration system, comprising:
The acquisition module is used for acquiring camera internal parameters and distortion coefficients of the sensor; step S1 may be performed;
The distortion correction module is used for collecting the calibration plate image irradiated by the external light source at the same position and the calibration plate image projected with the laser stripes by controlling the calibration plate 4 to do a group of translational movements, and correcting the distortion of all the calibration plate images based on the internal parameters and the distortion coefficients of the camera; step S2 may be performed;
the first coordinate acquisition module is used for calculating a homography matrix from the plane of the calibration plate to the plane of the image by extracting angular point coordinates and solving a motion vector of each translation of the calibration plate 4; step S3 may be performed;
the second coordinate acquisition module is used for extracting characteristic points at intervals on the light bar straight line by fitting a light bar center straight line equation, extracting pixel coordinates of the characteristic points, and solving plane coordinates of the characteristic points in a local coordinate system by utilizing a homography matrix; step S4 may be performed;
The three-dimensional coordinate acquisition module is used for establishing a world coordinate system at the initial position of the calibration plate, solving a translation vector of each translation movement of the calibration plate by utilizing a projective transformation matrix, and further optimizing the displacement of the calibration plate according to the coplanarity of the characteristic points by restraining to obtain the three-dimensional coordinate of the optimized characteristic points under the world coordinate system; step S5 may be performed;
The principal component analysis module is used for re-projecting the three-dimensional coordinates of the feature points under the world coordinate system into plane coordinates by adopting a principal component analysis method, and the projection plane is a light plane; step S6 may be performed;
And the calibration module is used for calculating a homography matrix from the light plane to the image plane by utilizing the plane coordinates and the corresponding pixel coordinates after the feature point re-projection to finish the light plane calibration. Step S7 may be performed.
In some embodiments, the line structured light sensor calibration system further comprises:
The processor is respectively connected with the acquisition module, the distortion correction module, the first coordinate acquisition module, the second coordinate acquisition module, the three-dimensional coordinate acquisition module, the principal component analysis method module and the calibration module;
A memory coupled to the processor and storing a computer program executable on the processor;
when the processor executes the computer program, the processor controls the acquisition module, the distortion correction module, the first coordinate acquisition module, the second coordinate acquisition module, the three-dimensional coordinate acquisition module, the principal component analysis method module and the calibration module to work so as to realize the line structure light sensor calibration method.
The above embodiments are provided to illustrate the present invention and not to limit the present invention, so that the modification of the exemplary values or the replacement of equivalent elements should still fall within the scope of the present invention.
From the foregoing detailed description, it will be apparent to those skilled in the art that the present invention can be practiced without these specific details, and that the present invention meets the requirements of the patent statutes.
While preferred embodiments of the present invention have been described, additional variations and modifications in those embodiments may occur to those skilled in the art once they learn of the basic inventive concepts. It is therefore intended that the following claims be interpreted as including the preferred embodiments and all such alterations and modifications as fall within the scope of the invention. The foregoing description of the preferred embodiment of the invention is not intended to be limiting, but rather to cover all modifications, equivalents, and alternatives falling within the spirit and principles of the invention.
It should be noted that the above description of the flow is only for the purpose of illustration and description, and does not limit the application scope of the present specification. Various modifications and changes to the flow may be made by those skilled in the art under the guidance of this specification. However, such modifications and variations are still within the scope of the present description.
While the basic concepts have been described above, it will be apparent to those of ordinary skill in the art after reading this application that the above disclosure is by way of example only and is not intended to be limiting. Although not explicitly described herein, various modifications, improvements, and adaptations of the application may occur to one of ordinary skill in the art. Such modifications, improvements, and modifications are intended to be suggested within the present disclosure, and therefore, such modifications, improvements, and adaptations are intended to be within the spirit and scope of the exemplary embodiments of the present disclosure.
Meanwhile, the present application uses specific words to describe embodiments of the present application. For example, "one embodiment," "an embodiment," and/or "some embodiments" means a particular feature, structure, or characteristic in connection with at least one embodiment of the application. Thus, it should be emphasized and should be appreciated that two or more references to "an embodiment" or "one embodiment" or "an alternative embodiment" in various positions in this specification are not necessarily referring to the same embodiment. Furthermore, certain features, structures, or characteristics of one or more embodiments of the application may be combined as suitable.
Furthermore, those of ordinary skill in the art will appreciate that aspects of the application are illustrated and described in the context of a number of patentable categories or conditions, including any novel and useful processes, machines, products, or materials, or any novel and useful improvements thereof. Accordingly, aspects of the present application may be implemented entirely in hardware, entirely in software (including firmware, resident software, micro-code, etc.) or a combination of hardware and software. The above hardware or software may be referred to as a "unit," module, "or" system. Furthermore, aspects of the present application may take the form of a computer program product embodied in one or more computer-readable media, wherein the computer-readable program code is embodied therein.
Computer program code required for the operation of portions of the present application may be written in any one or more programming languages, including an object-oriented programming language such as Java, Scala, Smalltalk, Eiffel, JADE, Emerald, C++, C#, VB.NET, Python, etc., a conventional procedural programming language such as the C programming language, Visual Basic, Fortran 2003, Perl, COBOL 2002, PHP, ABAP, a dynamic programming language such as Python, Ruby, and Groovy, or other programming languages, etc. The program code may execute entirely on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the user's computer through any form of network, such as a local area network (LAN) or a wide area network (WAN), the connection may be made to an external computer (for example, through the Internet), or a cloud-computing service such as software as a service (SaaS) may be used.
Furthermore, the order in which the elements and sequences are presented, the use of numerical letters, or other designations are used in the application is not intended to limit the sequence of the processes and methods unless specifically recited in the claims. While certain presently useful inventive embodiments have been discussed in the foregoing disclosure, by way of example, it is to be understood that such details are merely illustrative and that the appended claims are not limited to the disclosed embodiments, but, on the contrary, are intended to cover all modifications and equivalent arrangements included within the spirit and scope of the embodiments of the application. For example, while the implementation of the various components described above may be embodied in a hardware device, it may also be implemented as a purely software solution, e.g., an installation on an existing server or mobile device.
Likewise, it should be noted that in order to simplify the presentation of the disclosure and thereby aid in understanding one or more inventive embodiments, various features are sometimes grouped together in a single embodiment, figure, or description thereof. This method of disclosure, however, is not to be interpreted as reflecting an intention that the claimed subject matter requires more features than are expressly recited in each claim. Rather, the inventive subject matter should be provided with fewer features than the single embodiments described above.

Claims (8)

1. The line structure light sensor calibration method is characterized by comprising the following steps of:
S1, acquiring camera internal parameters and distortion coefficients of a sensor;
S2, controlling the calibration plate to do a group of translational motions, collecting the calibration plate image irradiated by an external light source at the same position and the calibration plate image projected with laser stripes, and carrying out distortion correction on all the calibration plate images based on internal parameters and distortion coefficients of the camera;
s3, extracting corner coordinates, calculating a homography matrix from the plane of the calibration plate to the plane of the image, and solving a motion vector of each translation of the calibration plate;
S4, fitting a light bar center straight line equation, extracting feature points at intervals on the light bar straight line, extracting pixel coordinates of the feature points, and solving plane coordinates of the feature points in a local coordinate system by utilizing a homography matrix;
s5, establishing a world coordinate system at the initial position of the calibration plate, solving a translation vector of each translation movement of the calibration plate by utilizing a projective transformation matrix, and further optimizing the displacement of the calibration plate according to the coplanarity of the characteristic points by restraining, so as to obtain the three-dimensional coordinates of the optimized characteristic points under the world coordinate system;
S6, re-projecting the three-dimensional coordinates of the feature points under the world coordinate system into plane coordinates by adopting a principal component analysis method, wherein the projection plane is the light plane;
S7, calculating a homography matrix from the light plane to the image plane by using the plane coordinates and the corresponding pixel coordinates after the feature point re-projection to finish the light plane calibration;
In step S5, let O_0 and O_1 be the origins of the local coordinate system of the calibration plate, and t_0 and t_1 the translation vectors pointing from the origin of the camera coordinate system O_c X_c Y_c Z_c to those origins; since t_i can be decomposed from the projective transformation matrix H_i, the translation vector of each translational movement of the calibration plate can be solved;
let h_{i3} be the 3rd column of H_i and h_{i1} its 1st column; the translation vector is solved by formula 8 and formula 9:
t_i = λ·K^{-1}·h_{i3}, λ = 1/||K^{-1}·h_{i1}|| (8)
Δt_{i+1} = t_{i+1} − t_i (9)
wherein t_i is the translation vector of the local coordinate system relative to the camera coordinate system at the i-th position of the calibration plate, Δt_{i+1} is the (i+1)-th lifting translation vector of the calibration plate, and K is the camera internal parameter matrix;
when the calibration plate translates along the Z axis, the three-dimensional coordinates of the feature point P_{m×i+j} in the world coordinate system are given by formula 10:
x_{m×i+j} = x_{i,j}, y_{m×i+j} = y_{i,j}, z_{m×i+j} = Σ_{k=1}^{i} Δz_k (10)
wherein u_{m×i+j}, v_{m×i+j} are the pixel coordinates of the feature point, x_{m×i+j}, y_{m×i+j}, z_{m×i+j} are the three-dimensional coordinates of the feature point in the world coordinate system, and Δz_k is the Z component of the k-th translation vector Δt_k;
the displacement of the calibration plate is further optimized according to the constraint that the feature points are coplanar; the plane equation of the light plane in the world coordinate system is recorded as:
ax + by + cz + 1 = 0;
from the three-dimensional coordinates of the feature points, formula 12 can be established:
A·x = B, wherein A = [[x_1, y_1, z_1], …, [x_n, y_n, z_n]], x = [a, b, c]^T, B = [−1, −1, …, −1]^T;
then the residual sum of squares is formula 13:
f = ||A·x − B||^2;
the least-squares optimal solution for x is (A^T·A)^{-1}·A^T·B, and substituting the optimal solution into formula 13 gives formula 14:
f = ||A·(A^T·A)^{-1}·A^T·B − B||^2;
and taking the Z-axis coordinates of the feature points as independent variables and the residual sum of squares as the objective function to be minimized, the Z-axis coordinates of the feature points are optimized to obtain the optimized three-dimensional coordinates of the feature points.
2. The line structured light sensor calibration method of claim 1, wherein:
in step S1, the Zhang Zhengyou calibration method is used to calibrate the camera internal parameters and distortion coefficients of the sensor, and the camera internal parameter matrix is denoted K, as shown in formula 1:
K = [[f_x, γ, u_0], [0, f_y, v_0], [0, 0, 1]];
wherein f_x and f_y are the scale factors along the X and Y axes, γ is the skew factor, and (u_0, v_0) are the coordinates of the principal point.
3. The line structured light sensor calibration method of claim 2, wherein:
In step S2, the calibration plate is placed on the lifting platform, so that line laser can be projected to the calibration plate, a camera can shoot a complete calibration plate image, the height of the lifting platform is adjusted for multiple times, and after each movement is stopped, an external light source is turned on to control the camera to collect the calibration plate image; then, the external light source is turned off, the line laser is turned on, the calibration plate image projected with the laser stripes under the same pose is shot, and the calibration plate image is acquired at least at 2 positions;
all collected calibration plate images with projected laser stripes are denoted M_L, and all calibration plate images under external illumination are denoted M_B; the number of images in M_L and M_B is n, with n ≥ 2; the i-th light-stripe image in M_L is denoted M_Li, and the target image in the same pose is denoted M_Bi; distortion correction is then performed on all calibration plate images based on the camera internal parameters and distortion coefficients.
4. A line structured light sensor calibration method according to claim 3, characterized in that:
in step S3, a local coordinate system O_i x_i y_i z_i is established with the first corner point at the lower left of the calibration plate as the origin, the X axis along the length direction of the calibration plate, the Y axis along its width direction, and the Z axis perpendicular to the calibration plate;
a world coordinate system Oxyz is established at the origin of O_0 x_0 y_0 z_0, with the coordinate axes in the same directions as the local coordinate system;
the light bar generated by the intersection of the line-structured light with the calibration plate at the i-th position is denoted L_i; feature points are taken on L_i along the Y-axis direction of the local coordinate system at intervals Δy, and the j-th feature point on L_i is denoted P_{m×i+j}, wherein j = 1, 2, 3, …, m, m is the number of feature points taken on each light bar, and Δy is equal to the side length of a square of the calibration plate; the pixel coordinates of the corner points of the calibration plate images in M_B are extracted with the OpenCV checkerboard corner extraction function, the plane coordinates of the corner points in the local coordinate system are calculated from the dimensions of the calibration plate, and formula 2 is obtained according to the principle of planar perspective projection:
s·[u_n, v_n, 1]^T = H_i·[x_n, y_n, 1]^T;
wherein (u_n, v_n) are the pixel coordinates of the n-th feature point in image M_Bi, and (x_n, y_n) are the plane coordinates of the corresponding feature point in the local coordinate system;
using the camera internal parameter matrix K, the homography matrix from the calibration plate plane to the image plane at each position is solved with the L-M algorithm;
as can be seen from the Zhang Zhengyou calibration method, the translation vector t_i from the camera coordinate system to the local coordinate system of the calibration plate can be decomposed from the homography matrix H_i, and the translation vector is solved by formula 3:
t_i = λ·K^{-1}·h_{i3}, λ = 1/||K^{-1}·h_{i1}||;
wherein K is the camera internal parameter matrix, h_{i3} is the 3rd column of H_i, h_{i1} is the 1st column of H_i, and t_i is the translation vector from the camera coordinate system to the local coordinate system at the i-th position of the calibration plate; when the calibration plate moves from the i-th position to the (i+1)-th position, the translation vector of the calibration plate is given by formula 4:
Δt_{i+1} = t_{i+1} − t_i.
5. The line structured light sensor calibration method of claim 4, wherein:
In step S4, the center of the light bar of L_i in M_Li is extracted by the squared gray-weighted centroid method, and the straight line through the light bar center is then fitted with the random sample consensus algorithm; the line equation of L_i in the image is recorded as formula 5:
[a_i, b_i, c_i]·[u, v, 1]^T = 0;
the feature point P_{m×i+j} is the intersection of the light bar L_i and the line y = (j−1)·Δy in the local coordinate system O_i x_i y_i z_i, wherein j = 1, 2, …, m;
the equation of the line y = (j−1)·Δy in the image after projective transformation, obtained by the principle of planar perspective projection, is formula 6:
[0, 1, −(j−1)·Δy]·H_i^{-1}·[u, v, 1]^T = 0;
the pixel coordinates of the feature point P_{m×i+j} in the image are the intersection of the two line equations, formula 5 and formula 6, and by the principle of planar perspective the plane coordinates of the feature point in the local coordinate system are given by formula 7:
s·[x_{i,j}, y_{i,j}, 1]^T = H_i^{-1}·[u_{i,j}, v_{i,j}, 1]^T.
6. the line structured light sensor calibration method of claim 1, wherein:
In step S6, the three-dimensional coordinate points containing noise are re-projected into plane coordinates by principal component analysis, and the centered three-dimensional coordinates of the feature points are arranged in rows and expressed as formula 15:
X = [[x_1 − x̄, y_1 − ȳ, z_1 − z̄], …, [x_n − x̄, y_n − ȳ, z_n − z̄]];
wherein x_k, y_k, z_k are the three-dimensional coordinates of the feature point P_k in the world coordinate system, x̄, ȳ, z̄ are the means of the three coordinate components of all feature points in the world coordinate system, and n is the number of feature points; orthogonal decomposition of X^T·X gives formula 16:
X^T·X = [η_1, η_2, η_3]·diag(λ_1, λ_2, λ_3)·[η_1, η_2, η_3]^T;
wherein λ_1 > λ_2 > λ_3;
with (x̄, ȳ, z̄) as the origin, a coordinate system O x_L y_L z_L is established, with the X axis along η_1 and the Z axis along η_3; the O x_L y_L plane is the light plane; the three-dimensional coordinates of the feature points are re-projected into O x_L y_L z_L and the Z coordinate is set to 0, so that the coordinates of the feature points in the plane coordinate system O x_L y_L are given by formula 17:
[x_L, y_L]^T = [η_1, η_2]^T·[x − x̄, y − ȳ, z − z̄]^T.
7. the line structured light sensor calibration method of claim 6, wherein:
In step S7, the homography matrix from the light plane to the image plane is solved using the Levenberg-Marquardt (L-M) algorithm, completing the light plane calibration.
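A homography fit of the kind used in step S7 might look as follows: a DLT initialisation followed by a small hand-rolled Levenberg-Marquardt refinement of the reprojection error. This is a sketch with a numeric Jacobian, not the patented implementation; a production system would normalise the points and use a library optimiser.

```python
import numpy as np

def fit_homography_lm(plane_pts, pixel_pts, iters=50):
    """Estimate the 3x3 homography mapping plane points to pixels:
    DLT (SVD null space) for an initial estimate, then Levenberg-
    Marquardt on the reprojection residuals of all 9 entries."""
    src = np.asarray(plane_pts, float)
    dst = np.asarray(pixel_pts, float)
    # DLT initialisation: each correspondence gives two linear rows
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    _, _, Vt = np.linalg.svd(np.asarray(A))
    h = Vt[-1]                              # null-space vector = initial H

    def residuals(h):
        H = h.reshape(3, 3)
        p = H @ np.c_[src, np.ones(len(src))].T
        return ((p[:2] / p[2]).T - dst).ravel()

    lam = 1e-3                              # L-M damping factor
    for _ in range(iters):
        r = residuals(h)
        J = np.empty((r.size, 9))
        eps = 1e-7
        for k in range(9):                  # forward-difference Jacobian
            d = np.zeros(9); d[k] = eps
            J[:, k] = (residuals(h + d) - r) / eps
        step = np.linalg.solve(J.T @ J + lam * np.eye(9), -J.T @ r)
        if np.linalg.norm(residuals(h + step)) < np.linalg.norm(r):
            h, lam = h + step, lam * 0.5    # accept step, relax damping
        else:
            lam *= 10.0                     # reject step, damp harder
    H = h.reshape(3, 3)
    return H / H[2, 2]
```

For noise-free correspondences the DLT step alone is already exact up to numerical precision; the L-M loop matters when the pixel coordinates carry extraction noise.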
8. A line structured light sensor calibration system, comprising:
The acquisition module is used for acquiring camera internal parameters and distortion coefficients of the sensor;
The distortion correction module is used for controlling the calibration plate to perform a group of translational movements, acquiring at each position a calibration plate image illuminated by the external light source and a calibration plate image onto which the laser stripe is projected, and correcting the distortion of all calibration plate images based on the camera internal parameters and distortion coefficients;
the first coordinate acquisition module is used for calculating a homography matrix from the plane of the calibration plate to the image plane by extracting angular point coordinates and solving a motion vector of each translation of the calibration plate;
the second coordinate acquisition module is used for extracting characteristic points at intervals on the light bar straight line by fitting a light bar center straight line equation, extracting pixel coordinates of the characteristic points, and solving plane coordinates of the characteristic points in a local coordinate system by utilizing a homography matrix;
The three-dimensional coordinate acquisition module is used for establishing a world coordinate system at the initial position of the calibration plate, solving the translation vector of each translational movement of the calibration plate by using the projective transformation matrices, and further optimizing the calibration plate displacements under the constraint that the feature points are coplanar, to obtain the optimized three-dimensional coordinates of the feature points in the world coordinate system;
The principal component analysis module is used for re-projecting the three-dimensional coordinates of the feature points under the world coordinate system into plane coordinates by adopting a principal component analysis method, and the projection plane is a light plane;
The calibration module is used for calculating a homography matrix from the light plane to the image plane by utilizing the plane coordinates and the corresponding pixel coordinates after the feature point re-projection to finish the light plane calibration;
The processor is respectively connected with the acquisition module, the distortion correction module, the first coordinate acquisition module, the second coordinate acquisition module, the three-dimensional coordinate acquisition module, the principal component analysis module and the calibration module;
A memory coupled to the processor and storing a computer program executable on the processor; wherein when the processor executes the computer program, the processor controls the operation of the acquisition module, the distortion correction module, the first coordinate acquisition module, the second coordinate acquisition module, the three-dimensional coordinate acquisition module, the principal component analysis module, and the calibration module to implement the line structured light sensor calibration method according to any one of claims 1 to 7.
CN202310048348.7A 2023-01-31 2023-01-31 Line structure light sensor calibration method and system Active CN116182703B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310048348.7A CN116182703B (en) 2023-01-31 2023-01-31 Line structure light sensor calibration method and system


Publications (2)

Publication Number Publication Date
CN116182703A CN116182703A (en) 2023-05-30
CN116182703B true CN116182703B (en) 2024-05-03

Family

ID=86437763

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310048348.7A Active CN116182703B (en) 2023-01-31 2023-01-31 Line structure light sensor calibration method and system

Country Status (1)

Country Link
CN (1) CN116182703B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116485918B (en) * 2023-06-25 2023-09-08 天府兴隆湖实验室 Calibration method, calibration system and computer readable storage medium
CN116805335B (en) * 2023-07-04 2024-03-15 广东建石科技有限公司 Double-dimensional displacement sensing method for tile paving

Citations (10)

Publication number Priority date Publication date Assignee Title
US7177740B1 (en) * 2005-11-10 2007-02-13 Beijing University Of Aeronautics And Astronautics Method and apparatus for dynamic measuring three-dimensional parameters of tire with laser vision
CN104677305A (en) * 2015-02-11 2015-06-03 浙江理工大学 Method and system for three-dimensionally reconstructing object surface based on cross-structured light
CN108288294A (en) * 2018-01-17 2018-07-17 视缘(上海)智能科技有限公司 A kind of outer ginseng scaling method of a 3D phases group of planes
CN109443209A (en) * 2018-12-04 2019-03-08 四川大学 A kind of line-structured light system calibrating method based on homography matrix
CN110118528A (en) * 2019-04-29 2019-08-13 天津大学 A kind of line-structured light scaling method based on chessboard target
CN113674360A (en) * 2021-08-17 2021-11-19 南京航空航天大学 Covariant-based line structured light plane calibration method
CN113686262A (en) * 2021-08-13 2021-11-23 桂林电子科技大学 Line structure optical scanner calibration method and device and storage medium
WO2021259151A1 (en) * 2020-06-24 2021-12-30 深圳市道通科技股份有限公司 Calibration method and apparatus for laser calibration system, and laser calibration system
CN114612573A (en) * 2022-03-17 2022-06-10 太原科技大学 Public-view-free multi-image sensor global calibration system and method
WO2022143796A1 (en) * 2020-12-29 2022-07-07 杭州海康机器人技术有限公司 Calibration method and calibration device for line structured light measurement system, and system

Family Cites Families (2)

Publication number Priority date Publication date Assignee Title
JP4230525B2 (en) * 2005-05-12 2009-02-25 有限会社テクノドリーム二十一 Three-dimensional shape measuring method and apparatus
CN111243032B (en) * 2020-01-10 2023-05-12 大连理工大学 Full-automatic detection method for checkerboard corner points


Non-Patent Citations (2)

Title
Structured-Light Binocular Vision System for Dynamic Measurement of Rail Wear; Wentao Li et al.; 2019 2nd International Conference on Electronics Technology; Dec. 31, 2019; full text *
Integrated calibration of structural parameters of a line structured light scanning sensor; Wang Jinqiao et al.; Chinese Journal of Sensors and Actuators (传感技术学报); Sep. 30, 2014; Vol. 27, No. 9; full text *

Also Published As

Publication number Publication date
CN116182703A (en) 2023-05-30

Similar Documents

Publication Publication Date Title
CN116182703B (en) Line structure light sensor calibration method and system
WO2019090487A1 (en) Highly dynamic wide-range any-contour-error monocular six-dimensional measurement method for numerical control machine tool
Zhou et al. Complete calibration of a structured light stripe vision sensor through planar target of unknown orientations
JP6092530B2 (en) Image processing apparatus and image processing method
CN109443209B (en) Line structured light system calibration method based on homography matrix
US9715730B2 (en) Three-dimensional measurement apparatus and robot system
JP2016001181A (en) System and method for runtime determination of camera mis-calibration
CN116182702B (en) Line structure light sensor calibration method and system based on principal component analysis
An et al. Building an omnidirectional 3-D color laser ranging system through a novel calibration method
Wei et al. Flexible calibration of a portable structured light system through surface plane
Ricolfe-Viala et al. Optimal conditions for camera calibration using a planar template
CN109773589A (en) Method and device, the equipment of on-line measurement and processing guiding are carried out to workpiece surface
Yang et al. A fast calibration of laser vision robotic welding systems using automatic path planning
Pless et al. Extrinsic calibration of a camera and laser range finder
Liang et al. An integrated camera parameters calibration approach for robotic monocular vision guidance
Qin et al. A novel hierarchical iterative hypothesis strategy for intrinsic parameters calibration of laser structured-light weld vision sensor
JP2004309318A (en) Position detection method, its device and its program, and calibration information creation method
KR102016988B1 (en) Camera pose vector calibration method for generating 3d information
Mosnier et al. A New Method for Projector Calibration Based on Visual Servoing.
CN113733078A (en) Method for interpreting fine control quantity of mechanical arm and computer-readable storage medium
Dong et al. Constructing a Virtual Large Reference Plate with High-precision for Calibrating Cameras with Large FOV
CN113251951B (en) Calibration method of line structured light vision measurement system based on single calibration surface mapping
Chen et al. A New Method for Error Analysis of Binocular Stereo Vision System
Liao et al. Computer vision system for an autonomous mobile robot
Yang et al. Laser and Grating Cross-Integrated Calibration Method for Wide Field-of-View 3D Cameras

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant