CN112833791B - Space-time calibration method for self-rotating line structured light scanning system - Google Patents


Info

Publication number
CN112833791B
CN112833791B (application CN202110145592.6A)
Authority
CN
China
Prior art keywords
calibration
scanning system
camera
time
self
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110145592.6A
Other languages
Chinese (zh)
Other versions
CN112833791A (en)
Inventor
Wang Yue (王越)
Han Fuchang (韩福长)
Zhang Qunkang (张群康)
Xiong Rong (熊蓉)
Tan Qimeng (谭启蒙)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhejiang University ZJU
Original Assignee
Zhejiang University ZJU
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhejiang University ZJU filed Critical Zhejiang University ZJU
Priority to CN202110145592.6A priority Critical patent/CN112833791B/en
Publication of CN112833791A publication Critical patent/CN112833791A/en
Application granted granted Critical
Publication of CN112833791B publication Critical patent/CN112833791B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00 Measuring arrangements characterised by the use of optical techniques
    • G01B11/002 Measuring arrangements characterised by the use of optical techniques for measuring two or more coordinates
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00 Measuring arrangements characterised by the use of optical techniques
    • G01B11/24 Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G06T7/85 Stereo camera calibration

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention discloses a space-time calibration method for a self-rotating line structured light scanning system. Several calibration boards, no two of which are coplanar, are placed within the measurement range of the scanning system; the rotating platform is held stationary at different positions while camera images are collected and the corresponding motor positions are recorded; and the calibration is then completed by a multi-constraint method. The method does not depend on the mounting accuracy of the mechanical structure, which greatly relaxes the installation requirements of the equipment. Because several types of constraints are used within a single calibration procedure, calibration errors caused by relying on a single constraint are avoided. The relative pose and the time offset between the structured light system and the rotating platform are calibrated simultaneously, so the method is widely applicable to low-cost hardware that cannot perform hardware synchronization.

Description

Space-time calibration method for self-rotating line structured light scanning system
Technical Field
The application relates to the technical field of three-dimensional measurement, in particular to a space-time calibration method for a self-rotating line structured light scanning system.
Background
In recent years, line structured light three-dimensional vision measurement based on the optical triangulation principle has been applied ever more widely in scientific research and industrial production, owing to its low power consumption and low cost. A self-rotating structured light scanning system, formed by combining this technique with a rotating device, can be mounted on a robot or other automated equipment and used in scenarios that require three-dimensional perception of a scene under tight sensor power budgets, such as the internal inspection of large pipelines, underground mine exploration, and extraterrestrial exploration.
For a self-rotating line structured light scanning system, the coordinate system of the global three-dimensional measurement result is established at the center of the rotation axis of a high-precision rotating platform, whereas the measurement result of the line structured light device is expressed in the camera coordinate system. To obtain the global result, the spatial relationship between these two coordinate systems must be known, so that each single measurement of the line structured light device can be stitched into the global three-dimensional measurement using the current position fed back by the rotating platform. This spatial relationship, however, is difficult to determine accurately from the mechanical mounting alone.
Meanwhile, for a self-rotating structured light scanning system without hardware time synchronization, there is a time offset between the angle values fed back by the rotating platform and the moments at which the camera captures images, which degrades the accuracy of the global three-dimensional measurement result.
Therefore, the space-time calibration of the self-rotating line structured light scanning system, comprising the spatial calibration between the camera coordinate system and the center of the rotation axis and the calibration of the time difference between the camera image acquisition time and the rotating platform's angle feedback time, is a key problem in obtaining high-precision measurement results.
Disclosure of Invention
In order to solve the above problems in the prior art, the present invention provides a two-stage, multi-constraint space-time calibration method for a self-rotating line structured light scanning system.
To achieve this purpose, the technical scheme of the invention is as follows. The disclosed two-stage, multi-constraint space-time calibration method for a self-rotating line structured light scanning system comprises the following steps:
step one, placing a plurality of calibration boards within the measurement range of the self-rotating line structured light scanning system, where no two calibration boards may be coplanar;
step two, holding the rotating platform of the scanning system stationary at different positions while collecting camera images and recording the corresponding motor positions;
step three, extracting the visual feature points and laser stripes from the acquired images and detecting the corner points of the AprilTag calibration boards;
step four, obtaining the pose of each camera position in the global coordinate system from the rough external parameters and the motor position; computing the reprojection error of the feature points, the corner reprojection error carrying real-world scale information, and the error from the laser points to the calibration board plane; and minimizing the sum of squares of these three errors by nonlinear optimization to obtain the extrinsic calibration result;
step five, rotating the motor continuously to obtain the functional relation between motor pose and time; substituting the image acquisition times and the previously optimized external parameters into this relation, with the time offset taken into account, to obtain the corresponding camera poses; extracting the feature points in the images; and, combining these with the camera poses obtained above, minimizing the feature point reprojection errors by nonlinear optimization to obtain the time offset calibration result.
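As an illustration only, and not part of the patent, the sketch below shows in Python one way the data gathered in steps one to three and the outputs of steps four and five might be organized; all class and field names are hypothetical.

```python
from dataclasses import dataclass

import numpy as np


@dataclass
class StationaryFrame:
    """Data recorded at one stationary position of the rotating platform (steps two and three)."""
    motor_angle: float          # encoder angle when the image was taken (rad)
    feature_px: np.ndarray      # N x 2 pixel coordinates of visual feature points
    tag_corner_px: np.ndarray   # M x 2 pixel coordinates of detected AprilTag corners
    laser_pts_cam: np.ndarray   # L x 3 laser-stripe points triangulated in the camera frame


@dataclass
class SpaceTimeCalibration:
    """Outputs of steps four and five."""
    T_motor_cam: np.ndarray     # extrinsics: camera pose in the motor frame (4 x 4)
    time_offset: float          # offset between the camera and encoder clocks (s)


# Example of one record being filled with made-up measurements.
frame = StationaryFrame(
    motor_angle=np.deg2rad(15.0),
    feature_px=np.array([[321.4, 240.9]]),
    tag_corner_px=np.array([[100.2, 210.7], [160.8, 211.0],
                            [161.1, 270.3], [100.5, 270.0]]),
    laser_pts_cam=np.array([[0.05, -0.01, 0.90]]),
)
print(frame.motor_angle)
```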
As a further improvement of the present invention, during image acquisition for the first-stage extrinsic calibration, the camera must be able to observe at least one common calibration board from its different positions, and images must be acquired while stationary at no fewer than three different angle values.
As a further improvement, the calibration board is an AprilTag board or a checkerboard, i.e., a calibration board from which real-world scale information can be recovered.
As a further improvement, in step four of the present invention, the rough external parameters, namely the rotation and translation from the camera of the structured light unit to the center of the rotation axis, are obtained from the three-dimensional parameters of the mounting plate together with manual measurement.
Compared with the prior art, the invention has the beneficial effects that:
1. the invention completes calibration by means of a multi-constraint method, does not depend on the installation precision of a mechanical structure, and greatly reduces the installation requirement of equipment.
2. The invention utilizes various types of constraint conditions in one calibration process, thereby avoiding calibration errors caused by single constraint conditions.
3. The invention calibrates the relative pose and the time offset between the structured light system and the rotating platform simultaneously, which makes it widely applicable to low-cost hardware that cannot perform hardware synchronization.
Drawings
Other features, objects and advantages of the present application will become more apparent upon reading of the following detailed description of non-limiting implementations with reference to the accompanying drawings in which:
FIG. 1 is a diagram of the apparatus structure and coordinate system definition of the present invention;
FIG. 2 is a schematic diagram of an example scenario for implementing the present invention;
FIG. 3 is a schematic block diagram of the present invention;
fig. 4 is an algorithm flow diagram of the present invention.
Detailed Description
The present application will now be described in further detail with reference to the accompanying drawings and examples. It is to be understood that the specific examples described herein are for purposes of illustration only and are not to be construed as limitations of the invention. It should be noted that, for convenience of description, only the portions related to the related invention are shown in the drawings.
Fig. 1 is a schematic diagram of the apparatus of the present invention and the coordinate system definitions. A structured light scanning unit consisting of a camera 106 and a line laser emitter 102 is fixed on a rotating platform 111, which is rotated about a rotation axis 101 by a rotating motor 108. The z-axes of the global coordinate system 105 and of the motor coordinate system 107 at the initial time are defined along the rotation axis 101, and the x-axis of 107 points in the same direction as the x-axis of 105. The origin of the camera coordinate system 103 is located at the optical center of 106, with the y-axis of 103 pointing down and the z-axis pointing forward. The X-Y plane of the AprilTag calibration board coordinate system 104 lies on the board surface, so the 3D coordinates of the detected corner points have a z-value of 0 in 104.
FIG. 2 is a schematic diagram of an example scenario for implementing the present invention. A plurality of AprilTag calibration boards 202 are placed around the self-rotating structured light scanning system, so that when pictures are acquired, the camera 201 at adjacent positions can observe at least one common AprilTag calibration board and detect the visual feature points 203.
Fig. 3 is a schematic block diagram of the present invention. As shown in Fig. 3, the method is divided into two blocks: external reference (extrinsic) calibration 301 and time offset calibration 308.
The external reference calibration 301 comprises three modules: still image acquisition 302, image preprocessing 303, and nonlinear optimization 304. Still image acquisition 302 collects image information while the motor is stationary at different positions. Image preprocessing 303 performs data preprocessing on the output of still image acquisition 302, including extraction of visual feature points, extraction of laser stripes, and detection of AprilTag corner points. Nonlinear optimization 304 iteratively minimizes the sum of squared errors to obtain the desired extrinsic calibration, taking several error sources into account: point-to-plane error calculation 305, feature point reprojection error calculation 306, and AprilTag error calculation 307.
The time offset calibration 308 is divided into four modules: continuous-rotation image acquisition 309, image preprocessing 310, pose-time function fitting 311, and nonlinear optimization 312. Continuous-rotation image acquisition 309 drives the motor continuously from the initial angle to a preset target angle while images are acquired at a fixed frequency; the acquired images then pass through image preprocessing 310, where the visual feature points are extracted. During continuous-rotation image acquisition 309, the system time of each image capture is recorded, and the motor angle is read at a fixed frequency together with the system time of each reading. From the recorded series of motor angles and system times, combined with the extrinsic result obtained in the external reference calibration 301, the pose-time function fitting module 311 fits a functional relation between camera pose and time. The image acquisition time plus the time offset is substituted into this relation to obtain the corresponding camera pose; the feature points extracted in image preprocessing 310 are used in the nonlinear optimization module 312 to compute feature point reprojection errors; and the sum of squared errors is minimized iteratively to obtain the time offset calibration result.
Fig. 4 is the algorithm flow diagram of the present invention. The global coordinate system $O_G$, the camera coordinate system $O_C$, the calibration board coordinate system $O_A$, and the rotating platform (motor) coordinate system $O_M$ are predefined. The intrinsic parameters of the structured light unit are calibrated to obtain the camera intrinsic matrix $K$ and the equation $F_p$ of the structured light plane in camera coordinates. A coordinate transformation matrix $^{A}T_{B}$ (equation (1)) is defined, representing the pose of coordinate system $O_B$ in $O_A$, with corresponding rotation matrix $^{A}R_{B}$ and translation vector $^{A}t_{B}$.
Before the calibration process begins, a plurality of calibration boards are placed in space around the self-rotating line structured light scanning system, with no two boards coplanar; AprilTag calibration boards are used.
After calibration starts, step 401 holds the rotating platform stationary at different positions $^{G}T_{M_i}$ while the camera acquires image information. The image preprocessing step 402 is then performed to obtain the pose $^{C_i}T_{A_j}$ of each detected AprilTag calibration board in the corresponding camera coordinate system, the pixel coordinates $u_m$ of the four tag corner points, the 3D coordinates $^{C_i}P_l$ of the laser points in the camera coordinate system, and the 3D coordinates $^{C_i}P_k$ of the extracted visual feature points in the corresponding camera coordinate system together with their pixel coordinates $u_k$.
Step 403 is performed next to compute three types of errors. The global pose of each camera position is obtained from the rough external parameters (the rotation and translation from the camera of the structured light unit to the center of the rotation axis, derived from the three-dimensional parameters of the mounting plate and manual measurement) and the rotating platform position:

$^{G}T_{C_i} = {}^{G}T_{M_i} \cdot {}^{M}T_{C}$,

where $^{M}T_{C}$ denotes the pose of the camera coordinate system in the motor coordinate system. From this, the coordinates $^{G}P_k$ of the visual feature points in the global coordinate system, the 3D coordinates $^{G}P_l$ of the laser points in the global coordinate system, and the pose $^{G}T_{A_j}$ of the corresponding AprilTag calibration board in global coordinates are obtained.
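As a minimal illustration of this pose composition (not the patent's implementation; the numeric values of the rough extrinsics and the example point are made up), the homogeneous-transform product can be written directly with numpy:

```python
import numpy as np

def rot_z(theta):
    """Homogeneous transform for a rotation of theta radians about the z (motor) axis."""
    c, s = np.cos(theta), np.sin(theta)
    T = np.eye(4)
    T[:2, :2] = [[c, -s], [s, c]]
    return T

# Motor pose at position i, read back from the encoder (example angle).
T_G_M = rot_z(np.deg2rad(30.0))

# Rough extrinsics ^M T_C: camera pose in the motor frame (hypothetical values
# taken from the mounting-plate drawings and manual measurement).
T_M_C = np.eye(4)
T_M_C[:3, 3] = [0.05, 0.00, 0.12]          # lever arm in metres
T_G_C = T_G_M @ T_M_C                      # ^G T_{C_i} = ^G T_{M_i} . ^M T_C

# A 3D point measured in the camera frame is mapped into the global frame.
P_C = np.array([0.1, -0.2, 1.5, 1.0])      # homogeneous camera-frame point
P_G = T_G_C @ P_C
print(P_G[:3])
```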
The reprojection error of the feature points into the corresponding camera image is computed as

$e_k = u_k - \pi\left({}^{C_i}T_{G} \cdot {}^{G}P_k\right)$,  (2)

where $\pi(\cdot)$ denotes the projection that maps a 3D point to two-dimensional pixel coordinates using the calibrated camera intrinsics $K$.
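A minimal sketch of the reprojection error (2), assuming an ideal pinhole camera with intrinsic matrix K and no lens distortion (the projection pi in the patent may additionally include a distortion model); all numbers are illustrative:

```python
import numpy as np

def project(K, T_C_G, P_G):
    """pi(.): map a global-frame 3D point into pixel coordinates of camera i."""
    P_C = T_C_G @ np.append(P_G, 1.0)       # point in the camera frame
    x, y, z = P_C[:3]
    u = K @ np.array([x / z, y / z, 1.0])   # pinhole projection
    return u[:2]

K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])             # example intrinsics

T_C_G = np.eye(4)                           # ^{C_i} T_G (inverse of ^G T_{C_i})
P_G_k = np.array([0.1, -0.05, 2.0])         # feature point in the global frame
u_k = np.array([361.0, 219.5])              # its detected pixel coordinates

e_k = u_k - project(K, T_C_G, P_G_k)        # reprojection error of eq. (2)
print(e_k)
```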
Since the dimensions of the AprilTag are known, the positions of the corresponding corner points in the AprilTag coordinate system are known. Combining these with the global pose of the AprilTag board gives the 3D coordinates of each AprilTag corner in global coordinates, $^{G}P_m = {}^{G}T_{A_j} \cdot {}^{A_j}P_m$, and the corresponding AprilTag corner reprojection error is computed as

$e_m = u_m - \pi\left({}^{C_i}T_{G} \cdot {}^{G}P_m\right)$.  (3)
Meanwhile, since the laser points fall on the plane of the AprilTag calibration board, the error from a laser point to the AprilTag board plane is computed as

$e_l = n_1^{T}\left({}^{A_j}T_{G} \cdot {}^{G}P_l\right)$,  (4)

where $n_1 = [0\ 0\ 1]^{T}$.
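A sketch of the point-to-plane error (4): each laser point expressed in the global frame is transformed into the calibration-board frame and its z component (the dot product with n1 = [0 0 1]^T) is the signed distance to the board plane. The transform and point values below are illustrative only.

```python
import numpy as np

n1 = np.array([0.0, 0.0, 1.0])              # board-plane normal in the board frame

def point_to_plane_error(T_A_G, P_G_l):
    """Signed distance of a global-frame laser point to the AprilTag board plane."""
    P_A = T_A_G @ np.append(P_G_l, 1.0)     # laser point in the board frame
    return n1 @ P_A[:3]                     # z component = distance to the X-Y plane

T_A_G = np.eye(4)                           # ^{A_j} T_G (example: board frame = global frame)
laser_pts_G = np.array([[0.10, 0.20, 0.002],
                        [0.12, 0.21, -0.001]])   # global-frame laser points (metres)

errors = [point_to_plane_error(T_A_G, p) for p in laser_pts_G]
print(errors)                               # ideally all close to zero
```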
Combining the error terms (2), (3), and (4), the problem is treated as a nonlinear optimization and the sum of the squared errors is minimized with the Gauss-Newton iteration method. After the collected data are substituted in, the joint optimization of step 404 is performed to obtain the external parameters $^{M}T_{C}$, i.e., the corresponding rotation $^{M}R_{C}$ and translation $^{M}t_{C}$.
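The Gauss-Newton step itself is generic: stack all residuals into one vector, linearize, and solve the normal equations. The toy example below is not the patent's code; it recovers a 2D rigid transform from point correspondences with a numerically differentiated Jacobian. In the patent's setting the parameter vector would encode $^{M}T_{C}$ and the residual vector would concatenate the errors (2), (3), and (4).

```python
import numpy as np

def gauss_newton(residual_fn, x0, n_iter=20, eps=1e-6):
    """Minimize the sum of squared residuals with Gauss-Newton and a forward-difference Jacobian."""
    x = np.asarray(x0, dtype=float)
    for _ in range(n_iter):
        r = residual_fn(x)
        J = np.zeros((r.size, x.size))
        for j in range(x.size):
            dx = np.zeros_like(x)
            dx[j] = eps
            J[:, j] = (residual_fn(x + dx) - r) / eps
        x = x - np.linalg.solve(J.T @ J, J.T @ r)   # normal-equation step
    return x

# Toy stand-in for the joint optimization of step 404: recover a 2D rigid
# transform (angle, tx, ty) from noisy point correspondences.
rng = np.random.default_rng(0)
src = rng.uniform(-1.0, 1.0, size=(30, 2))
theta_true, t_true = 0.3, np.array([0.2, -0.1])
R_true = np.array([[np.cos(theta_true), -np.sin(theta_true)],
                   [np.sin(theta_true),  np.cos(theta_true)]])
dst = src @ R_true.T + t_true + 0.001 * rng.standard_normal(src.shape)

def residuals(x):
    theta, t = x[0], x[1:]
    R = np.array([[np.cos(theta), -np.sin(theta)],
                  [np.sin(theta),  np.cos(theta)]])
    return (src @ R.T + t - dst).ravel()    # stacked residual vector

x_hat = gauss_newton(residuals, np.zeros(3))
print(x_hat)                                # approximately [0.3, 0.2, -0.1]
```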
when step 402 is executed, step 405 is executed in parallel to enable the motor to continuously rotate to record motor angles corresponding to different time, pictures of the camera and corresponding acquisition time are acquired in the process at a certain frequency, and then step 406 is executed to obtain the pose of the motor
Figure BDA0002930114590000085
Functional relation f (t) with time t
Combining the external reference result calibrated at 404 and the functional relation obtained in 406, taking the time offset into account, executing step 407 to obtain a camera pose containing the time offset
Figure BDA0002930114590000086
Where tk is the time when the corresponding camera acquires the picture, delat _ t is the time offset between the camera and the motor, and then the reprojection error of the corresponding feature point obtained in the same manner as (2) is
Figure BDA0002930114590000087
And carrying specific data, executing step 408, and obtaining a time calibration result delat _ t by using a nonlinear optimization method.
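At constant rotation speed the motor angle is essentially linear in time, so $f(t)$ can be fitted with a least-squares line, and the offset $\Delta t$ then enters the residual through the pose evaluated at $t_k + \Delta t$. The sketch below is a simplified, one-dimensional illustration of this idea (a bearing used as a proxy for a feature pixel), not the patent's implementation; all quantities and names are synthetic.

```python
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(1)

# Steps 405/406: log encoder angles while the motor turns at constant speed,
# then fit the pose-time function f(t) as a straight line.
omega = np.deg2rad(20.0)                           # true angular speed (rad/s)
t_enc = np.linspace(0.0, 5.0, 200)                 # encoder read-out times
angle_enc = omega * t_enc + 1e-4 * rng.standard_normal(t_enc.size)
slope, intercept = np.polyfit(t_enc, angle_enc, 1)

def f(t):
    """Fitted pose-time function: motor angle as a function of time."""
    return slope * t + intercept

# Synthetic image observations with an unknown clock offset delta_t between the
# camera timestamps and the encoder clock.
delta_t_true = 0.035                               # seconds, to be recovered
t_img = np.linspace(0.2, 4.8, 40)                  # image timestamps (camera clock)
landmark_azimuth = np.deg2rad(50.0)                # known from the extrinsic stage
obs = (landmark_azimuth - omega * (t_img + delta_t_true)
       + 1e-3 * rng.standard_normal(t_img.size))   # observed landmark bearings

def residuals(x):
    delta_t = x[0]
    pred = landmark_azimuth - f(t_img + delta_t)   # bearing predicted via f(t_k + delta_t)
    return pred - obs

sol = least_squares(residuals, x0=[0.0])           # step 408: nonlinear optimization
print(sol.x)                                       # approximately 0.035 s
```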
The above description is only a preferred embodiment of the application and is illustrative of the principles of the technology employed. It will be appreciated by a person skilled in the art that the scope of the invention as referred to in the present application is not limited to the embodiments with a specific combination of the above-mentioned features, but also covers other embodiments with any combination of the above-mentioned features or their equivalents without departing from the inventive concept. For example, the above features may be replaced with (but not limited to) features having similar functions disclosed in the present application.

Claims (4)

1. A two-stage, multi-constraint space-time calibration method for a self-rotating line structured light scanning system, characterized by comprising the following steps:
step one, placing a plurality of calibration boards within the measurement range of the self-rotating line structured light scanning system, where no two calibration boards may be coplanar;
step two, holding the rotating platform of the scanning system stationary at different positions while collecting camera images and recording the corresponding motor positions;
step three, extracting the visual feature points and laser stripes from the collected images and detecting the corner points of the calibration boards;
step four, obtaining the pose of each camera position in the global coordinate system from the rough external parameters and the motor position; computing the reprojection error of the feature points, the corner reprojection error carrying real-world scale information, and the error from the laser points to the calibration board plane; and minimizing the sum of squares of these three errors by nonlinear optimization to obtain the extrinsic calibration result;
step five, rotating the motor continuously to obtain the functional relation between motor pose and time; substituting the image acquisition times and the previously optimized external parameters into this relation, with the time offset taken into account, to obtain the corresponding camera poses; extracting the feature points in the images; and, combining these with the camera poses obtained above, minimizing the feature point reprojection errors by nonlinear optimization to obtain the time offset calibration result.
2. The two-stage, multi-constraint space-time calibration method for a self-rotating line structured light scanning system of claim 1, characterized in that: in step two, during image acquisition the camera must be able to observe at least one common calibration board from its different positions, and images must be acquired while stationary at no fewer than three different angle values.
3. The two-stage, multi-constraint space-time calibration method for a self-rotating line structured light scanning system of claim 1, characterized in that: the calibration board is an AprilTag board or a checkerboard.
4. The two-stage, multi-constraint space-time calibration method for a self-rotating line structured light scanning system of claim 1, characterized in that: in step four, the rough external parameters, namely the rotation and translation from the camera of the structured light unit to the center of the rotation axis, are obtained from the three-dimensional parameters of the mounting plate and manual measurement.
CN202110145592.6A 2021-02-02 2021-02-02 Space-time calibration method for self-rotating line structured light scanning system Active CN112833791B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110145592.6A CN112833791B (en) 2021-02-02 2021-02-02 Space-time calibration method for self-rotating line structured light scanning system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110145592.6A CN112833791B (en) 2021-02-02 2021-02-02 Space-time calibration method for self-rotating line structured light scanning system

Publications (2)

Publication Number Publication Date
CN112833791A CN112833791A (en) 2021-05-25
CN112833791B true CN112833791B (en) 2021-11-19

Family

ID=75931645

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110145592.6A Active CN112833791B (en) 2021-02-02 2021-02-02 Space-time calibration method for self-rotating line structured light scanning system

Country Status (1)

Country Link
CN (1) CN112833791B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113884278B (en) * 2021-09-16 2023-10-27 杭州海康机器人股份有限公司 System calibration method and device for line laser equipment

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104613899A (en) * 2015-02-09 2015-05-13 淮阴工学院 Full-automatic calibration method for structured light hand-eye three-dimensional measuring system
CN110163797A (en) * 2019-05-31 2019-08-23 四川大学 A kind of calibration turntable position orientation relation realizes the method and device of any angle point cloud
CN111308415A (en) * 2019-11-01 2020-06-19 华为技术有限公司 Online pose estimation method and device based on time delay
CN112179291A (en) * 2020-09-23 2021-01-05 中国科学院光电技术研究所 Calibration method of self-rotating scanning type line structured light three-dimensional measurement device

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
BR102017002219B1 (en) * 2017-02-02 2020-01-07 Vale S/A SYSTEM AND METHOD FOR MONITORING RAILWAY WHEELS

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104613899A (en) * 2015-02-09 2015-05-13 淮阴工学院 Full-automatic calibration method for structured light hand-eye three-dimensional measuring system
CN110163797A (en) * 2019-05-31 2019-08-23 四川大学 A kind of calibration turntable position orientation relation realizes the method and device of any angle point cloud
CN111308415A (en) * 2019-11-01 2020-06-19 华为技术有限公司 Online pose estimation method and device based on time delay
CN112179291A (en) * 2020-09-23 2021-01-05 中国科学院光电技术研究所 Calibration method of self-rotating scanning type line structured light three-dimensional measurement device

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Overview of calibration methods for line structured light (线结构光标定方法综述); Zhang Xi et al.; Laser & Optoelectronics Progress (激光与光电子学进展); 2017-08-17; pp. 020001-1 to 020001-11 *

Also Published As

Publication number Publication date
CN112833791A (en) 2021-05-25

Similar Documents

Publication Publication Date Title
CN110296691B (en) IMU calibration-fused binocular stereo vision measurement method and system
Singh et al. Bigbird: A large-scale 3d database of object instances
US9547802B2 (en) System and method for image composition thereof
CN108613628B (en) Overhead transmission line sag measurement method based on binocular vision
AU2013379669B2 (en) Apparatus and method for three dimensional surface measurement
CN109919911B (en) Mobile three-dimensional reconstruction method based on multi-view photometric stereo
CN111473739A (en) Video monitoring-based surrounding rock deformation real-time monitoring method for tunnel collapse area
CN105741379A (en) Method for panoramic inspection on substation
Liu et al. A global calibration method for multiple vision sensors based on multiple targets
CN110874854B (en) Camera binocular photogrammetry method based on small baseline condition
CN111220126A (en) Space object pose measurement method based on point features and monocular camera
WO2018146280A1 (en) Method and system for calibrating imaging system
Xia et al. Global calibration of non-overlapping cameras: State of the art
CN113281723B (en) AR tag-based calibration method for structural parameters between 3D laser radar and camera
CN112833791B (en) Space-time calibration method for self-rotating line structured light scanning system
CN110044266B (en) Photogrammetry system based on speckle projection
CN113962853A (en) Automatic precise resolving method for rotary linear array scanning image pose
CN112330740A (en) Pseudo-binocular dynamic distance measurement method based on monocular video
Hrabar et al. PTZ camera pose estimation by tracking a 3D target
CN117934636B (en) Dynamic external parameter calibration method and device for multi-depth camera
Beraldin et al. Applications of photo-grammetric and computer vision techniques in shake table testing
CN112435220B (en) Self-positioning porous characteristic moving window splicing method for part detection
CN116664698B (en) Automatic calibration method for vehicle-mounted binocular camera and GNSS/IMU
Liu et al. Vision measurement method for impact point in large planar region
Negahdaripour et al. Integrated system for robust 6-dof positioning utilizing new closed-form visual motion estimation methods in planar terrains

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
CB03 Change of inventor or designer information

Inventor after: Wang Yue

Inventor after: Han Fuchang

Inventor after: Zhang Qunkang

Inventor after: Xiong Rong

Inventor after: Tan Qimeng

Inventor before: Wang Yue

Inventor before: Han Fuchang

Inventor before: Zhang Qunkang

Inventor before: Xiong Rong

CB03 Change of inventor or designer information
GR01 Patent grant
GR01 Patent grant