CN116592766A - Precise three-dimensional measurement method and device based on fusion of laser and monocular vision - Google Patents

Precise three-dimensional measurement method and device based on fusion of laser and monocular vision

Info

Publication number
CN116592766A
CN116592766A
Authority
CN
China
Prior art keywords
laser
camera
information
dimensional
reflection mirror
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310815987.1A
Other languages
Chinese (zh)
Inventor
刘博
易皓
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Institute of Optics and Electronics of CAS
Original Assignee
Institute of Optics and Electronics of CAS
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Institute of Optics and Electronics of CAS filed Critical Institute of Optics and Electronics of CAS
Priority to CN202310815987.1A priority Critical patent/CN116592766A/en
Publication of CN116592766A publication Critical patent/CN116592766A/en
Pending legal-status Critical Current

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01B: MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00: Measuring arrangements characterised by the use of optical techniques
    • G01B11/002: Measuring arrangements characterised by the use of optical techniques for measuring two or more coordinates

Abstract

The application discloses a precise three-dimensional measurement method and device based on the fusion of laser and monocular vision. The measurement method comprises the following steps: first, the relative positions of the monocular camera and the deflection center of the fast steering mirror are calibrated; the laser ranging module then acquires depth information while the monocular camera determines the pixel coordinates of the laser ranging point, and the depth information and pixel coordinates are combined to obtain the three-dimensional information of the laser ranging point in the camera coordinate system; finally, it is judged whether the scanning process of the fast steering mirror is finished, and if not, the fast steering mirror continues to deflect and the measurement process is repeated, while if so, the three-dimensional point cloud is output. The method uses the camera to obtain the coordinate information of the laser ranging point and combines the advantages of camera and laser measurement, achieving precise three-dimensional measurement with pixel-level accuracy in the two-dimensional image plane and accuracy comparable to laser ranging in the depth direction, at a low implementation cost that favors popularization and application of the technology.

Description

Precise three-dimensional measurement method and device based on fusion of laser and monocular vision
Technical Field
The application relates to the field of three-dimensional measurement, and in particular to a precise three-dimensional measurement method and device based on the fusion of laser and monocular vision.
Background
With the rapid development of sensor technology and computer vision, tasks such as autonomous driving, robotics and remote sensing place high demands on precise three-dimensional measurement methods. Existing mainstream three-dimensional measurement methods fall into passive, camera-based measurement and active, laser-radar-based measurement. A monocular camera can measure the azimuth and pitch angles of a target accurately, but it cannot recover the scale of real 3D motion from image data and therefore cannot provide reliable high-precision three-dimensional measurements. A laser radar can obtain high-precision distance information about a target, but its in-plane two-dimensional resolution is low, and for mainstream scanning laser radars the three-dimensional measurement accuracy depends heavily on the angle measurement device.
A single sensor therefore struggles to meet the requirements of high-precision three-dimensional measurement. Existing methods improve accuracy by combining the results of multiple sensors: the laser radar and the camera measure separately, and a suitable algorithm then fuses the individual results into a final measurement. Such methods have shortcomings in both computational cost and measurement accuracy.
Disclosure of Invention
The application aims to overcome the shortcomings of existing laser and monocular vision fusion measurement techniques and provides a precise three-dimensional measurement method and device based on the fusion of laser and monocular vision. The camera is used to obtain the angle information of the laser measurement point, so the scanning laser radar no longer depends on a high-precision angle measurement device; pixel-level accuracy is achieved in the two-dimensional image plane, and precise three-dimensional measurement with accuracy comparable to laser ranging is achieved in the depth direction.
The application is realized in the following way:
First, the application provides a precise three-dimensional measurement method based on the fusion of laser and monocular vision, which comprises the following steps:
S1, jointly calibrating the monocular camera and the deflection center of the fast steering mirror to obtain the calibration parameters, including the camera intrinsic matrix K and the position (x_f, y_f, z_f) of the deflection center of the fast steering mirror in the camera coordinate system;
S2, the laser ranging module emits visible laser light, which is deflected by the fast steering mirror;
S3, the camera acquires the pixel information of the laser ranging point produced by the laser ranging module, and at the same time the corresponding depth information measured by the laser ranging module is recorded;
S4, calculating the three-dimensional coordinates of the measurement point of the laser ranging module in the camera coordinate system from the camera intrinsic matrix and the deflection-center position obtained in S1 together with the pixel information and depth information obtained in S3;
S5, judging whether the scanning process of the fast steering mirror is finished; if not, the fast steering mirror is deflected and S2, S3 and S4 are repeated; if so, the three-dimensional coordinates of all ranging points, i.e. the three-dimensional point cloud, are output.
Further, the step S1 specifically includes:
S11, calibrating the monocular camera with a checkerboard calibration target using Zhang Zhengyou's calibration method to obtain the camera intrinsic matrix K;
S12, deflecting the laser beam with the fast steering mirror so that the laser ranging point falls on corner points of different checkerboard squares, and recording the depth information (d_a, d_b and d_c) and the pixel information ((u_a, v_a), (u_b, v_b) and (u_c, v_c)) of three laser spots (spot a, spot b and spot c). Because the checkerboard pitch is fixed, the spacing L_ab between spots a and b, the spacing L_bc between spots b and c, and the spacing L_ac between spots a and c are known. The position (x_f, y_f, z_f) of the deflection center of the fast steering mirror in the camera coordinate system is then calculated as follows:
where t_1, t_2 and t_3 are unknown real numbers greater than 0 to be solved for.
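The equations themselves appear only as images in the original publication and are not reproduced in this text. A plausible reconstruction, assuming the standard pinhole model in which spot i lies on the back-projected ray t_i K^{-1}(u_i, v_i, 1)^T and the laser-measured depth d_i is the distance from the deflection center to the spot, is:

\| t_i K^{-1}(u_i, v_i, 1)^T - t_j K^{-1}(u_j, v_j, 1)^T \| = L_{ij},  (i, j) \in {(a, b), (b, c), (a, c)}
\| t_i K^{-1}(u_i, v_i, 1)^T - (x_f, y_f, z_f)^T \| = d_i,  i \in {a, b, c}

The first three equations determine t_a, t_b and t_c from the known spot spacings; the remaining three then constrain the deflection center (x_f, y_f, z_f), which can be obtained, for example, by least squares.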
Further, the step S4 specifically includes:
S41, calculating the three-dimensional coordinates (x, y, z) of the measurement point of the laser ranging module as follows:
where t is the unknown real number greater than 0 to be solved for, K is the camera intrinsic matrix, (u, v) is the pixel information of the ranging point obtained by the camera, d is the depth information measured by the laser ranging module, and (x_f, y_f, z_f) is the position of the deflection center of the fast steering mirror in the camera coordinate system.
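As above, the formula itself appears only as an image in the original text. A minimal sketch of the relation it plausibly expresses, again assuming d is the distance from the deflection center to the ranging point, is:

(x, y, z)^T = t K^{-1}(u, v, 1)^T,  with t > 0 chosen so that  \| t K^{-1}(u, v, 1)^T - (x_f, y_f, z_f)^T \| = d.

Expanding the constraint gives a quadratic a t^2 + b t + c = 0 with a = \|K^{-1}(u, v, 1)^T\|^2, b = -2 (K^{-1}(u, v, 1)^T) \cdot (x_f, y_f, z_f)^T and c = \|(x_f, y_f, z_f)\|^2 - d^2; the positive root yields the measurement point.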
The application also provides a precise three-dimensional measurement device based on the fusion of laser and monocular vision, which comprises:
a laser ranging module for measuring depth information and producing a visible light spot;
a fast steering mirror for beam deflection;
a camera for acquiring the spot pixel information of the laser measurement point;
and a data processing unit for image processing, three-dimensional information calculation and three-dimensional point cloud output.
Further, the camera, the fast steering mirror and the laser ranging module are connected to the data processing unit through data lines.
Compared with the prior art, the application has the following beneficial effects:
(1) The camera is used to obtain the angle information of the laser measurement point, which effectively removes the dependence of the three-dimensional measurement accuracy of a scanning laser radar on a high-precision angle measurement device.
(2) The application provides a method for fusing laser and monocular vision that converts the measurements obtained by the camera and the laser in different coordinate systems into the same coordinate system by calibrating the position of the deflection center of the fast steering mirror in the camera coordinate system.
(3) The device is simple in structure and convenient to operate, effectively improves three-dimensional measurement accuracy, and achieves pixel-level accuracy in the two-dimensional image plane together with accuracy comparable to laser ranging in the depth direction.
Drawings
In order to more clearly illustrate the embodiments of the application or the technical solutions in the prior art, the drawings required for describing the embodiments or the prior art are briefly introduced below. It is obvious that the drawings in the following description show only some embodiments of the application, and a person skilled in the art can obtain other drawings from them without inventive effort.
Fig. 1 is a flowchart of the steps of a precise three-dimensional measurement method based on fusion of laser and monocular vision according to an embodiment of the present application.
Fig. 2 is a schematic diagram of calculating three-dimensional coordinate information of a measurement point of a laser ranging module in a camera coordinate system according to an embodiment of the present application.
Fig. 3 is a schematic structural diagram of a precise three-dimensional measurement device based on fusion of laser and monocular vision according to an embodiment of the present application.
Reference numerals: 1, laser ranging module; 2, fast steering mirror; 3, camera; 4, data processing unit.
Detailed Description
The following description of the embodiments of the present application is made clearly and completely with reference to the accompanying drawings. It is apparent that the described embodiments are only some, rather than all, of the embodiments of the present application. All other embodiments obtained by a person skilled in the art on the basis of these embodiments without inventive effort fall within the scope of protection of the application.
As shown in Fig. 1 and Fig. 2, the precise three-dimensional measurement method based on the fusion of laser and monocular vision of the application comprises the following steps:
S1, installing the camera, the laser ranging module and the fast steering mirror so that the deflection center of the fast steering mirror is as close as possible to the optical center of the camera, where O-XYZ is the camera coordinate system, which also serves as the world coordinate system of the system, O_l-X_lY_lZ_l is the laser measurement coordinate system, and O_o-uv is the camera pixel coordinate system; jointly calibrating the camera and the deflection center of the fast steering mirror to obtain the calibration parameters, including the camera intrinsic matrix K and the position (x_f, y_f, z_f) of the deflection center of the fast steering mirror in the camera coordinate system; the camera is a monocular camera;
S2, the laser ranging module emits visible laser light, which is deflected by the fast steering mirror;
S3, the camera acquires the pixel information of the laser ranging point A produced by the laser ranging module, i.e. the pixel coordinates (u, v) of point A in the pixel coordinate system, and at the same time the corresponding depth information d measured by the laser ranging module is recorded;
S4, calculating the three-dimensional coordinates (x, y, z) of the ranging point A of the laser ranging module in the camera coordinate system from the camera intrinsic matrix K and the deflection-center position (x_f, y_f, z_f) obtained in S1 together with the pixel information (u, v) and depth information d obtained in S3;
S5, judging whether the scanning process of the fast steering mirror is finished; if not, the fast steering mirror is deflected and S2, S3 and S4 are repeated; if so, the three-dimensional coordinates of all ranging points, i.e. the three-dimensional point cloud, are output.
Further, the step S1 specifically includes:
S11, calibrating the monocular camera with a checkerboard calibration target using Zhang Zhengyou's calibration method to obtain the camera intrinsic matrix K;
S12, deflecting the laser beam with the fast steering mirror so that the laser spot falls on corner points of different checkerboard squares, and recording the depth information (d_a, d_b and d_c) and the pixel information ((u_a, v_a), (u_b, v_b) and (u_c, v_c)) of three laser spots (spot a, spot b and spot c). Because the checkerboard pitch is fixed, the spacing L_ab between spots a and b, the spacing L_bc between spots b and c, and the spacing L_ac between spots a and c are known. The position (x_f, y_f, z_f) of the deflection center of the fast steering mirror in the camera coordinate system is then calculated as follows:
where t_1, t_2 and t_3 are unknown real numbers greater than 0 to be solved for.
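As a concrete illustration of S11 and S12, the sketch below estimates the deflection-center position numerically. It is a minimal sketch under the same assumption as the reconstruction given earlier (the laser-measured depth is taken as the distance from the deflection center to each spot); the function and parameter names (calibrate_deflection_center, spots, spacings) are illustrative and do not come from the original disclosure. The intrinsic matrix K would typically be obtained beforehand with Zhang's method, for example via cv2.calibrateCamera in OpenCV.

    import numpy as np
    from scipy.optimize import least_squares

    def back_project(K, uv):
        # Back-projected ray direction K^-1 [u, v, 1]^T in camera coordinates.
        u, v = uv
        return np.linalg.solve(K, np.array([u, v, 1.0]))

    def calibrate_deflection_center(K, spots, spacings):
        # Estimate the deflection center (x_f, y_f, z_f) of the fast steering mirror.
        #   spots    : {'a': ((u, v), d), 'b': ..., 'c': ...} pixel coordinates and laser depth per spot
        #   spacings : {('a', 'b'): L_ab, ('b', 'c'): L_bc, ('a', 'c'): L_ac} from the checkerboard pitch
        names = sorted(spots)                                   # ['a', 'b', 'c']
        rays = {n: back_project(K, spots[n][0]) for n in names}
        depths = {n: spots[n][1] for n in names}

        def residuals(params):
            t = dict(zip(names, params[:3]))                    # ray scales t_a, t_b, t_c
            center = params[3:]                                 # candidate (x_f, y_f, z_f)
            res = []
            for (i, j), L in spacings.items():                  # spot-to-spot spacings are known
                res.append(np.linalg.norm(t[i] * rays[i] - t[j] * rays[j]) - L)
            for n in names:                                     # assumed: depth = distance from deflection center
                res.append(np.linalg.norm(t[n] * rays[n] - center) - depths[n])
            return res

        # The deflection center is mounted close to the camera optical center, so start it at the origin.
        x0 = np.concatenate([[depths[n] for n in names], np.zeros(3)])
        return least_squares(residuals, x0).x[3:]

The three spacing residuals fix the ray scales and the three depth residuals then pin down the deflection center, matching the six unknowns t_1, t_2, t_3, x_f, y_f, z_f of the reconstruction given earlier.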
Further, the step S4 specifically includes:
S41, calculating the three-dimensional coordinates (x, y, z) of the ranging point A of the laser ranging module as follows:
where t is the unknown real number greater than 0 to be solved for, K is the camera intrinsic matrix, (u, v) is the pixel information of the ranging point obtained by the camera, d is the depth information measured by the laser ranging module, and (x_f, y_f, z_f) is the position of the deflection center of the fast steering mirror in the camera coordinate system.
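A minimal numeric sketch of this step is given below; it solves the quadratic in t described after S41 in the disclosure, rests on the same assumption about the meaning of d, and uses an illustrative name (ranging_point_camera) that does not come from the original text.

    import numpy as np

    def ranging_point_camera(K, uv, d, center):
        # 3D coordinates (x, y, z) of a laser ranging point in the camera coordinate system.
        #   K      : 3x3 camera intrinsic matrix
        #   uv     : (u, v) pixel coordinates of the laser spot
        #   d      : depth measured by the laser ranging module (assumed: from the deflection center)
        #   center : (x_f, y_f, z_f), deflection center of the fast steering mirror
        r = np.linalg.solve(K, np.array([uv[0], uv[1], 1.0]))   # back-projected ray K^-1 [u, v, 1]^T
        c = np.asarray(center, dtype=float)
        a = r @ r                                               # || t r - c || = d  ->  a t^2 + b t + k = 0
        b = -2.0 * (r @ c)
        k = c @ c - d * d
        disc = b * b - 4.0 * a * k
        if disc < 0.0:
            raise ValueError("inconsistent measurement: no real solution for t")
        t = (-b + np.sqrt(disc)) / (2.0 * a)                    # positive root
        if t <= 0.0:
            raise ValueError("no positive solution for t")
        return t * r                                            # (x, y, z) in the camera frame

With the deflection center close to the camera optical center, k is negative whenever d exceeds the norm of center, so exactly one positive root exists and the choice of root is unambiguous.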
As shown in Fig. 3, the present application further provides a precise three-dimensional measurement device based on the fusion of laser and monocular vision, comprising:
a laser ranging module 1 for measuring depth information and producing a visible light spot;
a fast steering mirror 2 for beam deflection;
a camera 3 for acquiring the spot pixel information of the laser measurement point;
and a data processing unit 4 for image processing, three-dimensional information calculation and three-dimensional point cloud output.
Further, the camera, the fast steering mirror and the laser ranging module are connected to the data processing unit through data lines.
When the device operates, the monocular camera 3 and the deflection center of the fast steering mirror 2 are first jointly calibrated. The laser ranging module then emits visible laser light and measures the depth information, while the camera acquires the pixel information of the laser ranging point. Next, the data processing unit combines the calibration information, the depth information and the pixel information to calculate and store the three-dimensional information of the ranging point. Finally, it is judged whether the scanning process of the fast steering mirror is finished; if not, the fast steering mirror is deflected and the three-dimensional information of the deflected laser ranging point is measured and stored; if so, the three-dimensional point cloud is output.
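The operating sequence above can be summarized in the sketch below. The device handles (mirror, rangefinder, camera) and their methods, as well as the spot detector passed in as detect_spot, are hypothetical placeholders rather than interfaces from the original disclosure; ranging_point_camera is the helper sketched after S41.

    import numpy as np

    def scan_point_cloud(K, deflection_center, mirror, rangefinder, camera, detect_spot):
        # Acquire a 3D point cloud by stepping the fast steering mirror through its scan pattern.
        #   detect_spot(image) -> (u, v) locates the visible laser spot in a camera image.
        points = []
        for angle in mirror.scan_positions():                   # S5: iterate until the scan is finished
            mirror.set_angle(angle)                             # S2: deflect the visible laser beam
            d = rangefinder.measure_depth()                     # S3: depth from the laser ranging module
            uv = detect_spot(camera.grab())                     # S3: pixel coordinates of the laser spot
            points.append(ranging_point_camera(K, uv, d, deflection_center))  # S4: fuse into camera frame
        return np.vstack(points)                                # output the N x 3 three-dimensional point cloud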
As can be seen from the above embodiments, the application discloses a precise three-dimensional measurement method and device based on the fusion of laser and monocular vision that, compared with the prior art, removes the dependence of the three-dimensional measurement accuracy of a scanning laser radar on a high-precision angle measurement device. The device is simple in structure and convenient to operate, effectively fuses the advantages of camera and laser measurement, and achieves precise three-dimensional measurement with pixel-level accuracy in the two-dimensional image plane and accuracy comparable to laser ranging in the depth direction.
The foregoing detailed description is provided to illustrate and explain the claims of the present application and is not to be construed as limiting the claims. It should be clear to a person skilled in the art that any simple modification, variation or substitution made on the basis of the technical solution of the present application results in a new technical solution that falls within the scope of protection of the application.

Claims (5)

1. A precise three-dimensional measurement method based on the fusion of laser and monocular vision, characterized by comprising the following steps:
S1, jointly calibrating the monocular camera and the deflection center of the fast steering mirror to obtain the calibration parameters, including the camera intrinsic matrix K and the position (x_f, y_f, z_f) of the deflection center of the fast steering mirror in the camera coordinate system;
S2, the laser ranging module emits visible laser light, which is deflected by the fast steering mirror;
S3, the monocular camera acquires the pixel information (u, v) of the laser ranging point produced by the laser ranging module, and at the same time the corresponding depth information d measured by the laser ranging module is recorded;
S4, calculating the three-dimensional coordinates (x, y, z) of the ranging point of the laser ranging module in the camera coordinate system from the camera intrinsic matrix K and the deflection-center position (x_f, y_f, z_f) obtained in S1 together with the pixel information (u, v) and depth information d obtained in S3;
S5, judging whether the scanning process of the fast steering mirror is finished; if not, the fast steering mirror is deflected and S2, S3 and S4 are repeated; if so, the three-dimensional coordinates of all ranging points, i.e. the three-dimensional point cloud, are output.
2. The precise three-dimensional measurement method based on the fusion of laser and monocular vision according to claim 1, wherein step S1 specifically comprises:
S11, calibrating the monocular camera with a checkerboard calibration target using Zhang Zhengyou's calibration method to obtain the camera intrinsic matrix K;
S12, deflecting the laser beam with the fast steering mirror so that the laser ranging point falls on corner points of different checkerboard squares, and recording the depth information d_a, d_b and d_c and the pixel information (u_a, v_a), (u_b, v_b) and (u_c, v_c) of three laser spots, namely spot a, spot b and spot c; because the checkerboard pitch is fixed, the spacing L_ab between spots a and b, the spacing L_bc between spots b and c, and the spacing L_ac between spots a and c are known, so that the position (x_f, y_f, z_f) of the deflection center of the fast steering mirror in the camera coordinate system is calculated as follows:
where t_1, t_2 and t_3 are unknown real numbers greater than 0 to be solved for.
3. The precise three-dimensional measurement method based on the fusion of laser and monocular vision according to claim 1, wherein step S4 specifically comprises:
S41, calculating the three-dimensional coordinates (x, y, z) of the ranging point of the laser ranging module as follows:
where t is the unknown real number greater than 0 to be solved for, K is the camera intrinsic matrix, (u, v) is the pixel information of the ranging point obtained by the camera, d is the depth information measured by the laser ranging module, and (x_f, y_f, z_f) is the position of the deflection center of the fast steering mirror in the camera coordinate system.
4. A measurement device used in the precise three-dimensional measurement method according to any one of claims 1 to 3, comprising:
a laser ranging module for measuring depth information and producing a visible light spot;
a fast steering mirror for beam deflection;
a monocular camera for acquiring the spot pixel information of the laser measurement point;
and a data processing unit for image processing, three-dimensional information acquisition and three-dimensional point cloud output.
5. The measurement device according to claim 4, wherein the monocular camera, the fast steering mirror and the laser ranging module are connected to the data processing unit through data lines.
CN202310815987.1A 2023-07-05 2023-07-05 Precise three-dimensional measurement method and device based on fusion of laser and monocular vision Pending CN116592766A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310815987.1A CN116592766A (en) 2023-07-05 2023-07-05 Precise three-dimensional measurement method and device based on fusion of laser and monocular vision

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310815987.1A CN116592766A (en) 2023-07-05 2023-07-05 Precise three-dimensional measurement method and device based on fusion of laser and monocular vision

Publications (1)

Publication Number Publication Date
CN116592766A (en) 2023-08-15

Family

ID=87599363

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310815987.1A Pending CN116592766A (en) 2023-07-05 2023-07-05 Precise three-dimensional measurement method and device based on fusion of laser and monocular vision

Country Status (1)

Country Link
CN (1) CN116592766A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116793257A (en) * 2023-08-28 2023-09-22 成都量芯集成科技有限公司 Three-dimensional measurement system and method
CN116793257B (en) * 2023-08-28 2023-10-27 成都量芯集成科技有限公司 Three-dimensional measurement system and method

Similar Documents

Publication Publication Date Title
CN110021046B (en) External parameter calibration method and system for camera and laser radar combined sensor
CN111435162B (en) Laser radar and camera synchronization method, device, equipment and storage medium
CN107976669B (en) Device for determining external parameters between camera and laser radar
US7643135B1 (en) Telescope based calibration of a three dimensional optical scanner
US7797120B2 (en) Telescope based calibration of a three dimensional optical scanner
WO2022227844A1 (en) Laser radar correction apparatus and method
CN111815716A (en) Parameter calibration method and related device
CN107860337B (en) Structured light three-dimensional reconstruction method and device based on array camera
CN111161358B (en) Camera calibration method and device for structured light depth measurement
JP2003130621A (en) Method and system for measuring three-dimensional shape
US11692812B2 (en) System and method for measuring three-dimensional coordinates
CN116592766A (en) Precise three-dimensional measurement method and device based on fusion of laser and monocular vision
CN112595236A (en) Measuring device for underwater laser three-dimensional scanning and real-time distance measurement
CN109032329B (en) Space consistency keeping method for multi-person augmented reality interaction
CN107564051B (en) Depth information acquisition method and system
CN114384496B (en) Method and system for calibrating angle of laser radar
CN116755104A (en) Method and equipment for positioning object based on three points and two lines
CN114923665B (en) Image reconstruction method and image reconstruction test system for wave three-dimensional height field
CN115824170A (en) Method for measuring ocean waves by combining photogrammetry and laser radar
Wang et al. Distance measurement using single non-metric CCD camera
CN113034615B (en) Equipment calibration method and related device for multi-source data fusion
CN112648936A (en) Stereoscopic vision detection method and detection device based on differential projection
US20040263862A1 (en) Detecting peripheral points of reflected radiation beam spots for topographically mapping a surface
CN113847872A (en) Discrete single-point displacement static monitoring device and method based on laser ranging
CN117146710B (en) Dynamic projection three-dimensional reconstruction system and method based on active vision

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination