CN111735479A - Multi-sensor combined calibration device and method - Google Patents

Multi-sensor combined calibration device and method

Info

Publication number
CN111735479A
Authority
CN
China
Prior art keywords
calibration
camera
laser radar
calibration plate
plate
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202010881818.4A
Other languages
Chinese (zh)
Other versions
CN111735479B (en)
Inventor
罗哉
江文松
朱志远
赵洪楠
陈艺文
黄杰伊
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
China Jiliang University
Original Assignee
China Jiliang University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by China Jiliang University
Priority to CN202010881818.4A
Publication of CN111735479A
Priority to JP2021003139A (JP7072759B2)
Application granted
Publication of CN111735479B
Legal status: Active (current)

Classifications

    • G — PHYSICS
    • G01 — MEASURING; TESTING
    • G01C25/00 — Manufacturing, calibrating, cleaning, or repairing instruments or devices referred to in the other groups of this subclass
    • G01C25/005 — Initial alignment, calibration or starting-up of inertial devices
    • G01S7/497 — Means for monitoring or calibrating (details of systems according to group G01S17/00)
    • G06T7/80 — Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • Y02A90/10 — Information and communication technologies [ICT] supporting adaptation to climate change, e.g. for weather forecasting or climate simulation

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Manufacturing & Machinery (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Manipulator (AREA)
  • Image Analysis (AREA)
  • Optical Radar Systems And Details Thereof (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention provides a multi-sensor joint calibration device and method, relating to multi-sensor calibration technology, and solves the problem of multi-sensor joint calibration in the prior art. The device comprises a mechanical arm carrying a sensor fusion frame on which a laser radar (lidar), a monocular camera, and a computer for data processing are mounted; it further comprises a lidar-camera four-plate joint calibration target consisting of calibration plates No. 1, No. 2, No. 3, and No. 4, where the centers of plates No. 1, No. 2, and No. 3 are marked with dots that provide feature points for extrinsic calibration. The embodiment builds a portable multi-sensor fusion frame, convenient for calibration and secondary development, and uses a robot-arm-assisted calibration method that enables intelligent calibration and batch calibration.

Description

Multi-sensor combined calibration device and method
Technical Field
The invention belongs to the technical field of sensors, and particularly relates to a multi-sensor calibration technology.
Background
Simultaneous Localization and Mapping (SLAM) technology provides environment-perception information for autonomous driving. Traditional SLAM is divided into laser SLAM and visual SLAM. The laser radar (lidar) offers high ranging accuracy and insensitivity to illumination, while the camera offers low cost and rich image information. However, single-sensor SLAM has serious limitations: the lidar has a low update rate, suffers motion distortion, and cannot provide accurate measurements in severe environments such as rain and snow, while the camera cannot obtain accurate three-dimensional information and is strongly constrained by ambient light.
The inertial navigation system can provide accurate acceleration and angular velocity measurements as a pose-estimation aid. Fusing data from the lidar, vision sensors, and an inertial navigation system therefore improves the environment-perception capability of SLAM.
Calibration of the sensors in a SLAM system is divided into intrinsic calibration and extrinsic calibration. Intrinsic calibration mainly refers to computing the intrinsic matrix of a camera and the error coefficients of an inertial navigation system, so that the sensor's measurements are accurate. Extrinsic calibration between sensors, i.e., determining the pose transformation between the sensors' coordinate systems, is a prerequisite for accurate multi-sensor information fusion.
The traditional extrinsic calibration method takes the vehicle-body coordinate system as the reference and seeks the pose transformations from the lidar, camera, and inertial-navigation coordinate systems to the vehicle-body coordinate system. However, with all sensors fixed to the vehicle body, the motion is limited in its dimensions, the calibration process is cumbersome, and accurate calibration of the yaw and roll angles is difficult.
The lidar has high ranging accuracy and places no special requirement on the calibration reference, but the camera relies on two-dimensional image features and can only be calibrated against a specific calibration reference. Existing lidar-camera extrinsic calibration methods include those based on a single chessboard calibration plate, on an L-shaped calibration plate, and on a 3D chessboard target; they differ only in detail, all solving for the extrinsic matrix by matching lidar 3D feature points with camera 2D feature points.
Calibration between the lidar and the inertial navigation system must be carried out under motion, during which the inertial navigation system provides accurate acceleration and angular velocity measurements. The traditional approach is hand-eye calibration, but its accuracy is hard to guarantee; the Baidu Apollo calibration tool drives the vehicle along a figure-eight path to collect sensor data and calibrate the extrinsic parameters.
Multi-sensor joint calibration is one of the most active topics in autonomous driving today, and existing calibration techniques suffer from low automation, complex operation, and limited accuracy.
Disclosure of Invention
The invention aims to provide a multi-sensor combined calibration device and a multi-sensor combined calibration method aiming at the problems in the prior art.
The purpose of the invention can be realized by the following technical scheme: a multi-sensor joint calibration device comprises a mechanical arm carrying a sensor fusion frame on which a laser radar (lidar), a monocular camera, and a computer for data processing are mounted; the device further comprises a lidar-camera four-plate joint calibration target consisting of calibration plates No. 1, No. 2, No. 3, and No. 4. The centers of plates No. 1, No. 2, and No. 3 are marked with dots that provide feature points for extrinsic calibration; plate No. 4 carries a black-and-white checkerboard pattern for the camera intrinsic calibration. The four plates are distributed in a 田-shaped (two-by-two) grid: plates No. 1 and No. 2 are arranged side by side, plates No. 3 and No. 4 are arranged side by side in front of plates No. 1 and No. 2, and plates No. 3 and No. 4 are lower than plates No. 1 and No. 2. The normal vector $n_3$ of plate No. 3 and the normal vector $n_4$ of plate No. 4 differ by more than 30°; the normal vector $n_1$ of plate No. 1 and the normal vector $n_2$ of plate No. 2 differ by more than 30°.
A multi-sensor joint calibration method comprises intrinsic calibration of the camera and extrinsic calibration of the lidar-camera pair. For the intrinsic calibration, the computer unit controls the mechanical arm and the camera so that the camera photographs the checkerboard calibration plate in different poses, and the camera intrinsic matrix K is computed with the Zhang Zhengyou calibration method. The lidar-camera extrinsic calibration method comprises the following steps:
Step one: the computer unit adjusts the pose of the mechanical arm so that the calibration target appears within the field of view of both the lidar and the camera; the mechanical arm is held stationary while the lidar and the camera acquire data;
Step two: analyze and process the lidar data and extract point-cloud feature points. The coordinates of each point are recorded, outliers are filtered out, and a point-cloud segmentation method divides the points of the four calibration plates into four groups $A_1$, $A_2$, $A_3$, $A_4$. The K-means method extracts each group's cluster center $P_i^L$, the three-dimensional coordinate of the center of the $i$-th calibration plate in the lidar coordinate system $O_L$; the centers $P_i^L$ are selected as the laser feature points for matching;
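A minimal sketch of the clustering in this step, assuming the plate points have already been segmented from the rest of the scene (scikit-learn is an assumed tooling choice, not part of the invention):

```python
import numpy as np
from sklearn.cluster import KMeans

def plate_centers(points_xyz: np.ndarray, n_plates: int = 4) -> np.ndarray:
    """points_xyz: (N, 3) lidar points belonging to the calibration plates.

    Returns the (4, 3) K-means cluster centers, one laser feature point
    P_i^L per plate, expressed in the lidar coordinate system.
    """
    km = KMeans(n_clusters=n_plates, n_init=10, random_state=0).fit(points_xyz)
    return km.cluster_centers_
```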
Step three: analyze and process the camera data and extract visual feature points. The gray value of each pixel is recorded, and the FAST keypoint extraction algorithm detects locations where the local gray level of the calibration plate changes sharply, from which the center position of each plate is extracted. The dot centers of plates No. 1, No. 2, and No. 3 are extracted and their coordinates $p_1^C$, $p_2^C$, $p_3^C$ recorded; the center coordinate $p_4^C$ of plate No. 4 is obtained by analyzing the relations between the checkerboard squares. Here $p_i^C$ is the two-dimensional coordinate of the center of the $i$-th calibration plate in the camera coordinate system $O_C$; the centers $p_i^C$ are selected as the visual feature points for matching;
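As an illustration of this step, the sketch below localises the dot centres and the checkerboard centre with OpenCV. A blob detector stands in for the dot-centre localisation because it returns centre coordinates directly; the text itself names FAST, so this substitution is an assumption.

```python
import cv2
import numpy as np

def dot_center(gray: np.ndarray) -> np.ndarray:
    """Centre (u, v) of the single dark dot on plates No. 1-3 (assumed layout)."""
    keypoints = cv2.SimpleBlobDetector_create().detect(gray)
    return np.array(keypoints[0].pt)          # one dot per plate expected

def checkerboard_center(gray: np.ndarray, pattern=(9, 6)) -> np.ndarray:
    """Centre of plate No. 4 as the mean of its inner checkerboard corners."""
    found, corners = cv2.findChessboardCorners(gray, pattern)
    if not found:
        raise RuntimeError("checkerboard not visible")
    return corners.reshape(-1, 2).mean(axis=0)
```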
Step four: establish the matching relation between the lidar feature points $P_i^L$ and the camera feature points $p_i^C$:

$$P_i^L = (X_i, Y_i, Z_i) \;\leftrightarrow\; p_i^C = (u_i, v_i), \qquad i = 1, 2, 3, 4.$$

From the feature-point matches, the minimum reprojection error is established as an error equation, and least-squares solution of that equation yields the optimal extrinsic matrix $T_{CL}$.
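A minimal sketch of this least-squares solution, minimising the reprojection error of the matched centres over a 6-DOF pose (Rodrigues parametrisation; SciPy's optimiser is an assumed tooling choice, and x0 stands for a rough initial guess):

```python
import cv2
import numpy as np
from scipy.optimize import least_squares

def reproj_residuals(x, P_L, p_C, K):
    """x = (rx, ry, rz, tx, ty, tz); P_L: (N, 3); p_C: (N, 2); K: 3x3."""
    R, _ = cv2.Rodrigues(x[:3])               # rotation vector -> matrix
    Pc = R @ P_L.T + x[3:].reshape(3, 1)      # lidar points in the camera frame
    uv = K @ Pc                               # pinhole projection ...
    uv = (uv[:2] / uv[2]).T                   # ... divided by the depth z_i
    return (uv - p_C).ravel()

def solve_T_CL(P_L, p_C, K, x0=np.array([0, 0, 0, 0, 0, 1.0])):
    sol = least_squares(reproj_residuals, x0, args=(P_L, p_C, K))
    R, _ = cv2.Rodrigues(sol.x[:3])
    T_CL = np.eye(4)
    T_CL[:3, :3], T_CL[:3, 3] = R, sol.x[3:]
    return T_CL                               # homogeneous extrinsic matrix
```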
In some embodiments, the corner positions of the calibration plates are additionally collected as feature points, on top of the feature points at the plate centers.
In some embodiments, the camera intrinsic calibration uses a checkerboard method: the computer unit controls the mechanical arm and the camera so that the camera photographs the checkerboard calibration plate in different poses; corner information is extracted from 20 to 30 images, and the camera intrinsic matrix K is computed with the Zhang Zhengyou calibration method.
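As an illustration of that procedure, the following sketch runs the Zhang Zhengyou method via OpenCV; the board geometry, square size, and image paths are illustrative assumptions rather than values fixed by the invention.

```python
import glob

import cv2
import numpy as np

PATTERN = (9, 6)   # inner-corner grid of the checkerboard (assumed)
SQUARE = 0.025     # square edge length in metres (assumed)

# 3D corner positions in the board's own plane (Z = 0)
objp = np.zeros((PATTERN[0] * PATTERN[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:PATTERN[0], 0:PATTERN[1]].T.reshape(-1, 2) * SQUARE

obj_pts, img_pts = [], []
for path in glob.glob("calib/*.png"):   # the 20-30 poses mentioned in the text
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    found, corners = cv2.findChessboardCorners(gray, PATTERN)
    if found:
        corners = cv2.cornerSubPix(
            gray, corners, (11, 11), (-1, -1),
            (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 1e-3))
        obj_pts.append(objp)
        img_pts.append(corners)

# K is the 3x3 intrinsic matrix; dist holds the lens distortion coefficients
rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
    obj_pts, img_pts, gray.shape[::-1], None, None)
print("reprojection RMS:", rms)
print("K =\n", K)
```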
In some embodiments, the device further comprises an inertial navigation system arranged on the sensor fusion frame, and the joint calibration method comprises a lidar-inertial-navigation joint calibration method. Define the moment the mechanical arm starts moving as $t_0$ and the moment it stops as $t_1$; the laser point cloud scanned at $t_1$ is $X_{t_1}$. The lidar coordinate system at $t_0$ is $L_{t_0}$ and the inertial navigation coordinate system at $t_0$ is $I_{t_0}$; the lidar coordinate system at $t_1$ is $L_{t_1}$ and the inertial navigation coordinate system at $t_1$ is $I_{t_1}$. The pose transformation matrix of the lidar from $t_0$ to $t_1$ is $T_L$, the pose transformation matrix of the inertial navigation system from $t_0$ to $t_1$ is $T_I$, and the extrinsic matrix between the lidar and the inertial navigation system is $T_{IL}$. The method comprises the following steps:
Step one: the mechanical arm moves along a specified trajectory; during the motion, the lidar and the inertial navigation system acquire data;
Step two: stop the mechanical arm and process the acquired sensor data. For the lidar, each frame of laser point cloud is de-distorted according to a uniform-motion model; outliers are removed with a point-cloud filtering algorithm; and, to reduce computational complexity, each frame is down-sampled with a voxel-grid method; a sketch of this preprocessing follows.
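A minimal sketch of the per-frame clean-up, with Open3D as an assumed tooling choice (motion de-distortion from the uniform-motion model is assumed to happen upstream):

```python
import open3d as o3d

def preprocess(cloud: o3d.geometry.PointCloud,
               voxel_size: float = 0.05) -> o3d.geometry.PointCloud:
    # statistical filter drops points far from their local neighbourhood
    cloud, _ = cloud.remove_statistical_outlier(nb_neighbors=20, std_ratio=2.0)
    # voxel-grid downsampling reduces the computational load of later ICP
    return cloud.voxel_down_sample(voxel_size=voxel_size)
```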
Step three: compute the pose transformation matrix $T_L$ between the lidar coordinate system $L_{t_0}$ at the initial moment $t_0$ and the lidar coordinate system $L_{t_1}$ at the end of the motion. $T_L$ is computed as follows: the iterative closest point method matches the $k$-th frame point cloud $X_k$ with the $(k+1)$-th frame point cloud $X_{k+1}$ to obtain their correspondence. The pose transformation of the lidar from moment $k$ to moment $k+1$ consists of a rotation matrix $R_k$ and a translation vector $t_k$ that express the pose relation between the two frames; an error equation is constructed, converted into a least-squares problem, and solved for $R_k$ and $t_k$ with the SVD method. From $R_k$ and $t_k$, the pose transformation matrix of the lidar from moment $k$ to moment $k+1$ is

$$T_k = \begin{bmatrix} R_k & t_k \\ 0 & 1 \end{bmatrix};$$

cumulatively multiplying the $n$ per-frame matrices $T_1, T_2, \dots, T_n$ yields the lidar pose transformation matrix from $t_0$ to $t_1$:

$$T_L = T_1\, T_2 \cdots T_n;$$
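A sketch of this step under the same assumed tooling: Open3D's point-to-point ICP supplies each frame-to-frame transform $T_k$, and their cumulative product gives $T_L$.

```python
import numpy as np
import open3d as o3d

def lidar_motion(frames, max_corr_dist=0.2):
    """frames: time-ordered list of preprocessed o3d.geometry.PointCloud."""
    T_L = np.eye(4)
    # T_k maps points of frame k+1 into frame k; accumulate T_1 T_2 ... T_n
    for src, dst in zip(frames[1:], frames[:-1]):
        reg = o3d.pipelines.registration.registration_icp(
            src, dst, max_corr_dist, np.eye(4),
            o3d.pipelines.registration.TransformationEstimationPointToPoint())
        T_L = T_L @ reg.transformation
    return T_L
```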
Step four: calculating inertial navigation system coordinate system
Figure 538620DEST_PATH_IMAGE037
At an initial moment
Figure 247950DEST_PATH_IMAGE018
Of a coordinate system
Figure 667430DEST_PATH_IMAGE022
And the end time of exercise
Figure 18777DEST_PATH_IMAGE019
Of a coordinate system
Figure 840103DEST_PATH_IMAGE024
Position and posture transformation matrix between
Figure 251493DEST_PATH_IMAGE026
Pose transformation matrix
Figure 158269DEST_PATH_IMAGE026
The calculation method comprises the following steps: acceleration measurement for inertial navigation systemIntegrating the quantity data and the angular velocity measurement data to obtain displacement data
Figure 844465DEST_PATH_IMAGE038
And rotation data
Figure 753253DEST_PATH_IMAGE039
Then, then
Figure 69965DEST_PATH_IMAGE018
Is timed to
Figure 464037DEST_PATH_IMAGE019
The pose transformation matrix of the moment inertial navigation system is expressed as
Figure 891607DEST_PATH_IMAGE040
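A minimal dead-reckoning sketch of this integration; gravity compensation of the accelerometer readings is assumed to have been done beforehand, which real INS processing would have to handle.

```python
import numpy as np
from scipy.spatial.transform import Rotation

def integrate_imu(acc, gyro, dt):
    """acc, gyro: (N, 3) body-frame samples; dt: sample period in seconds."""
    R = np.eye(3)
    v = np.zeros(3)
    p = np.zeros(3)
    for a, w in zip(acc, gyro):
        R = R @ Rotation.from_rotvec(w * dt).as_matrix()  # attitude update
        v += (R @ a) * dt                                 # velocity update
        p += v * dt                                       # position update
    T_I = np.eye(4)
    T_I[:3, :3], T_I[:3, 3] = R, p     # rotation data R_I and displacement p
    return T_I
```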
Step five: by passing
Figure 953104DEST_PATH_IMAGE025
Will be provided with
Figure 440717DEST_PATH_IMAGE019
Laser point cloud of time
Figure 322086DEST_PATH_IMAGE020
Is projected to
Figure 51882DEST_PATH_IMAGE018
Of time of day
Figure 499044DEST_PATH_IMAGE029
Under the coordinate system, obtaining point cloud
Figure 423138DEST_PATH_IMAGE041
Step six: by passing
Figure 260644DEST_PATH_IMAGE026
And the parameter to be calibrated
Figure 295596DEST_PATH_IMAGE027
Will be
Figure 534947DEST_PATH_IMAGE019
Laser point cloud of time
Figure 629942DEST_PATH_IMAGE020
Is projected to
Figure 485903DEST_PATH_IMAGE018
Of time of day
Figure 557502DEST_PATH_IMAGE029
Under a coordinate system, obtaining
Figure 979256DEST_PATH_IMAGE042
Step seven, matching
Figure 713993DEST_PATH_IMAGE041
And
Figure 463775DEST_PATH_IMAGE042
two groups of point clouds, by aligning the two groups of point clouds, to the external reference matrix
Figure 106109DEST_PATH_IMAGE027
Optimizing by adopting an iterative closest point method
Figure 585632DEST_PATH_IMAGE041
And
Figure 520964DEST_PATH_IMAGE042
the described point cloud region registration of the same piece, the construction and the optimization of the nearest neighbor error
Figure 85938DEST_PATH_IMAGE043
According to
Figure 204067DEST_PATH_IMAGE043
Figure 803675DEST_PATH_IMAGE025
Figure 411374DEST_PATH_IMAGE026
Solving an extrinsic parameter matrix
Figure 198064DEST_PATH_IMAGE027
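A sketch of steps five to seven under the projection relations reconstructed below in the embodiment, $X' = T_L X_{t_1}$ and $X'' = T_{IL}^{-1} T_I T_{IL} X_{t_1}$, with the nearest-neighbour error minimised over a 6-DOF parametrisation of $T_{IL}$. SciPy and a KD-tree are assumed tooling choices; a production solver would re-estimate correspondences each iteration as in ICP.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.spatial import cKDTree
from scipy.spatial.transform import Rotation

def to_matrix(x):
    T = np.eye(4)
    T[:3, :3] = Rotation.from_rotvec(x[:3]).as_matrix()
    T[:3, 3] = x[3:]
    return T

def solve_T_IL(X_t1, T_L, T_I, x0=np.zeros(6)):
    """X_t1: (N, 3) cloud at t1 in the lidar frame; returns optimised T_IL."""
    Xh = np.c_[X_t1, np.ones(len(X_t1))].T          # homogeneous, 4 x N
    X1 = (T_L @ Xh)[:3].T                           # X', via the lidar motion
    tree = cKDTree(X1)

    def nn_error(x):                 # E = sum of squared NN distances
        T_IL = to_matrix(x)
        X2 = (np.linalg.inv(T_IL) @ T_I @ T_IL @ Xh)[:3].T   # X''
        d, _ = tree.query(X2)
        return np.sum(d ** 2)

    res = minimize(nn_error, x0, method="Nelder-Mead")
    return to_matrix(res.x)
```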
In some embodiments, under sufficient visual conditions, the observation data of the monocular camera are recorded and a visual-IMU calibration tool computes the extrinsic matrix $T_{IC}$ between the monocular camera and the inertial navigation system. The three determined extrinsic matrices $T_{CL}$, $T_{IL}$, $T_{IC}$ are jointly checked for pose consistency, and the transformation parameters in the extrinsic matrices are adjusted to improve the multi-sensor joint calibration accuracy. The parameter optimization uses an online adjustment method that fuses lidar, monocular camera, and inertial navigation data and adjusts $T_{IL}$ and $T_{IC}$.
Compared with the prior art, the multi-sensor combined calibration device and method have the following advantages:
the invention is suitable for vision sensors such as 16-line, 32-line, 64-line and other multi-line laser radars, monocular cameras, binocular cameras, RGBD cameras and the like; the embodiment of the invention builds a portable multi-sensor fusion framework, thereby being convenient for calibration and secondary development; the embodiment of the invention uses a mechanical arm auxiliary calibration method, and can realize intelligent calibration and batch calibration.
Description of the drawings:
in the drawings, which are not necessarily drawn to scale, like reference numerals may describe similar components in different views. Like reference numerals having different letter suffixes may represent different examples of similar components. The drawings illustrate generally, by way of example, but not by way of limitation, various embodiments discussed herein.
FIG. 1 is a schematic view of a robotic arm provided with a sensor fusion frame;
FIG. 2 is a schematic diagram of a multi-sensor fusion framework;
FIG. 3 is a schematic diagram of a position relationship of a combined calibration target of four calibration plates of a laser radar-camera;
FIG. 4 is a schematic view of the spatial arrangement of calibration targets;
FIG. 5 is a schematic diagram of a lidar-camera joint calibration;
FIG. 6 is a schematic diagram of a feature point distribution;
FIG. 7 is a schematic illustration of laser-visual feature matching;
FIG. 8 is a schematic diagram of laser radar pose transformation;
FIG. 9 is a schematic diagram of inertial navigation system pose transformation;
FIG. 10 is a schematic diagram of a calibration flow.
In the figures: 101. sensor fusion frame; 102. mechanical arm; 103. connecting mechanism; 104. computer unit; 105. operation table; 201. laser radar; 202. monocular camera; 203. inertial navigation system; 204. metal frame; 205. fixing device; 206. clamp.
Detailed Description
The following are specific examples of the present invention, and the technical solutions of the present invention are further described with reference to the drawings, but the present invention is not limited to these examples, and the following embodiments do not limit the invention according to the claims. Moreover, all combinations of features described in the embodiments are not necessarily essential to the solution of the invention.
Examples
A multi-sensor joint calibration device comprises a multi-sensor fusion frame 101, a mechanical arm 102, a connecting mechanism 103, a computer unit 104, and an operation table 105. Using a multi-sensor combined construction method, the fusion frame 101 fixes a laser radar 201, a monocular camera 202, and an inertial navigation system 203 under a freely movable metal frame 204; via a clamp 206, the frame can be carried on environment-sensing platforms such as unmanned vehicles and unmanned aerial vehicles, which suits secondary development and eases extrinsic calibration. The metal frame is 18 cm long, 6 cm wide, and 8 cm high, and the sensors are installed pointing along a common coordinate system. The lidar 201 is mounted at the center of the top of the metal frame 204, with a fixing device 205 inside the frame; the monocular camera 202 is installed 5 cm to the left of the fixing device 205, and the inertial navigation system 203 is installed 5 cm to its right. The mechanical arm 102 is arranged above the operation table 105 and provides translation and rotation along three axes; its end is connected to the fixing device 205 through the connecting mechanism 103, and in the initial state the multi-sensor fusion frame 101 is 140 cm above the ground. The computer unit 104 controls the motion of the mechanical arm 102, commands the sensors to acquire data, processes the sensor data, and computes the extrinsic matrices.
The device further comprises a lidar-camera four-plate joint calibration target consisting of four calibration plates of identical size, 30 cm × 30 cm, as shown in FIG. 3. The centers of plates No. 1 to No. 3 are marked with dots that provide feature points for extrinsic calibration; plate No. 4 carries a black-and-white checkerboard pattern for the camera intrinsic calibration. The four plates are distributed in space in a 田-shaped grid. To segment the four plates more accurately, they are placed in the environment at different distances and angles, as shown in FIG. 4: plates No. 3 and No. 4 are fixed on 120 cm bases placed on horizontal line one, at a distance $d_1$ = 130 cm from the multi-sensor fusion frame; the normal vector $n_3$ of plate No. 3 and the normal vector $n_4$ of plate No. 4 differ by more than 30°. Plates No. 1 and No. 2 are fixed on 160 cm bases placed on horizontal line two, at a distance $d_2$ = 180 cm from the sensor fusion frame, which ensures that plates No. 1 and No. 2 are not occluded; the normal vector $n_1$ of plate No. 1 and the normal vector $n_2$ of plate No. 2 differ by more than 30°.
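A small check of the >30° layout constraint, with example normals that are assumptions for illustration:

```python
import numpy as np

def angle_deg(n1, n2):
    """Angle between two plate normals, in degrees."""
    c = np.dot(n1, n2) / (np.linalg.norm(n1) * np.linalg.norm(n2))
    return np.degrees(np.arccos(np.clip(c, -1.0, 1.0)))

n1 = np.array([0.0, 0.0, 1.0])        # example normal of plate No. 1
n2 = np.array([0.707, 0.0, 0.707])    # example normal of plate No. 2, 45 deg away
assert angle_deg(n1, n2) > 30.0       # satisfies the layout constraint
```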
The computer unit controls the mechanical arm and the camera so that the camera photographs the checkerboard calibration plate in different poses; corner information is extracted from 20 to 30 images, and the camera intrinsic matrix K is computed with the Zhang Zhengyou calibration method.
The lidar-camera four-plate joint calibration target serves both the intrinsic calibration of the monocular camera and the extrinsic calibration of the lidar-camera pair; the joint calibration method for the lidar and the monocular camera comprises the following steps:
Step one: the computer unit adjusts the pose of the mechanical arm, as shown in FIG. 5, so that the joint calibration target appears within the field of view of both the lidar and the camera; the mechanical arm is held stationary while the lidar and the camera acquire data; the computer unit then processes the measurement data; finally, a least-squares problem is constructed to solve the extrinsic matrix $T_{CL}$;
Step two: according to the data acquisition, laser radar data and camera data are respectively processed, for the laser radar data, coordinate information of each point cloud is recorded, abnormal points are filtered out, the point cloud data are divided by adopting a point cloud dividing method, and the point cloud data of four calibration plates are divided into four different groups
Figure 317295DEST_PATH_IMAGE002
Figure 608337DEST_PATH_IMAGE003
Figure 325757DEST_PATH_IMAGE004
Figure 94868DEST_PATH_IMAGE005
(ii) a Extracting a point cloud clustering center point by adopting a K-means method;
Figure 784606DEST_PATH_IMAGE006
is the first
Figure 341490DEST_PATH_IMAGE007
The central point of the calibration plate is positioned in the laser radar coordinate system
Figure 874102DEST_PATH_IMAGE028
Three-dimensional coordinate values of
Figure 276265DEST_PATH_IMAGE006
Selecting laser characteristic points for matching;
Step three: for the camera data, the gray value of each pixel is recorded, and the FAST keypoint extraction algorithm detects locations where the local gray level of the calibration plate changes sharply, from which the center position of each plate is extracted. For plates No. 1 to No. 3 the dot centers are extracted and their coordinates $p_1^C$, $p_2^C$, $p_3^C$ recorded; for plate No. 4 the center coordinate $p_4^C$ is solved by analyzing the relations between the checkerboard squares. Here $p_i^C$ is the two-dimensional coordinate of the center of the $i$-th calibration plate in the camera coordinate system $O_C$; the centers $p_i^C$ are selected as the visual feature points for matching;
Step four: establish the matching relation between the lidar 3D feature points $P_i^L$ and the monocular camera 2D feature points $p_i^C$:

$$P_i^L = (X_i, Y_i, Z_i) \;\leftrightarrow\; p_i^C = (u_i, v_i), \qquad i = 1, 2, 3, 4.$$

As shown in FIG. 7, from this matching relation between the lidar coordinates $(X_i, Y_i, Z_i)$ and the monocular camera coordinates $(u_i, v_i)$ of each feature point $P_i$, the minimum reprojection error is established and a least-squares problem is constructed. The extrinsic matrix of the lidar and the monocular camera is defined as

$$T_{CL} = \begin{bmatrix} R & t \\ 0 & 1 \end{bmatrix},$$

where $R$ is a rotation matrix and $t$ is a translation vector. Combining the matching relation and the extrinsic matrix, the error equation can be expressed as

$$e_i = p_i^C - \frac{1}{z_i} K \left( R\, P_i^L + t \right),$$

where $z_i$ is the depth value of the visual feature point. From the error equation, a least-squares optimization function is constructed to minimize the error and obtain the optimal extrinsic matrix:

$$T_{CL}^{*} = \arg\min_{T_{CL}} \frac{1}{2} \sum_{i} \left\| e_i \right\|^{2}.$$
Further, optionally, more feature points are collected for matching: on the basis of the feature points at the plate centers, the corner positions of the plates are also collected as feature points. As shown in FIG. 6, five positions on each calibration plate are selected as feature points, the least-squares optimization function is constructed as above, and the extrinsic matrix is solved; experiments show that the more feature points are used, the more accurate the computed extrinsic matrix.
The joint calibration method of the laser radar and the inertial navigation system comprises the following steps:
With the assistance of the mechanical arm, a lidar-inertial-navigation extrinsic calibration scheme based on point-cloud matching is adopted; the specific steps are as follows:
Firstly, the mechanical arm is controlled to move, and data are acquired while the sensors move through space;
Secondly, the mechanical arm is stopped, and the data acquired by the sensors are processed;
Then, compute the pose transformation matrix $T_L$ between the lidar coordinate system $L_{t_0}$ at the initial moment $t_0$ and the lidar coordinate system $L_{t_1}$ at the end of the motion $t_1$, and compute the pose transformation matrix $T_I$ between the inertial navigation coordinate system $I_{t_0}$ at the initial moment $t_0$ and the inertial navigation coordinate system $I_{t_1}$ at the end of the motion $t_1$.
Thirdly, by
Figure 114416DEST_PATH_IMAGE025
Will be provided with
Figure 233682DEST_PATH_IMAGE019
Laser point cloud of time
Figure 157776DEST_PATH_IMAGE020
Is projected to
Figure 260861DEST_PATH_IMAGE018
Of time of day
Figure 295813DEST_PATH_IMAGE029
Under the coordinate system, obtaining point cloud
Figure 800744DEST_PATH_IMAGE041
(ii) a By passing
Figure 692476DEST_PATH_IMAGE026
And the parameter to be calibrated
Figure 781393DEST_PATH_IMAGE027
Will be
Figure 557719DEST_PATH_IMAGE019
Laser point cloud of time
Figure 182735DEST_PATH_IMAGE020
Is projected to
Figure 183052DEST_PATH_IMAGE018
Of time of day
Figure 260730DEST_PATH_IMAGE029
Under a coordinate system, obtaining
Figure 637485DEST_PATH_IMAGE042
(ii) a Finally, matching
Figure 913745DEST_PATH_IMAGE041
And
Figure 849078DEST_PATH_IMAGE042
two groups of point clouds, and calculating an external parameter matrix by aligning the two groups of point clouds
Figure 414051DEST_PATH_IMAGE027
The specified trajectory of the mechanical arm is as follows. Taking the mechanical arm's coordinate axes as reference: move 100 cm along the positive X axis, 100 cm along the negative X axis, 100 cm along the positive Y axis, 100 cm along the negative Y axis, 100 cm along the positive Z axis, and 100 cm along the negative Z axis; then rotate 180° clockwise about the X axis, 180° counterclockwise about the X axis, 180° clockwise about the Y axis, 180° counterclockwise about the Y axis, 180° clockwise about the Z axis, and 180° counterclockwise about the Z axis. Throughout the motion, the lidar and the inertial navigation system acquire data;
the sensor data processing is to perform distortion removal processing on each frame of laser point cloud according to a uniform motion model for the laser radar; removing outliers in the point cloud data by adopting a point cloud filtering algorithm; in order to reduce the computational complexity, each frame of laser point cloud data is subjected to down-sampling processing by using a voxelization grid method;
The pose transformation matrix $T_L$ is computed as follows. The iterative closest point method matches the $k$-th frame point cloud $X_k$ with the $(k+1)$-th frame point cloud $X_{k+1}$ to obtain their correspondence. Define the pose transformation of the lidar from moment $k$ to moment $k+1$ as composed of the rotation matrix $R_k$ and the translation vector $t_k$; the correspondence between matched points $x_i \in X_k$ and $x_i' \in X_{k+1}$ of the two frames is expressed as

$$x_i = R_k\, x_i' + t_k.$$

The error equation is expressed as

$$E(R_k, t_k) = \sum_{i} \left\| x_i - \left( R_k\, x_i' + t_k \right) \right\|^{2}.$$

By constructing this least-squares problem, $R_k$ and $t_k$ are computed with the SVD method (see the sketch below). The pose transformation matrix of the lidar from moment $k$ to moment $k+1$ is

$$T_k = \begin{bmatrix} R_k & t_k \\ 0 & 1 \end{bmatrix}.$$

$T_L$ is the pose transformation matrix of the lidar from $t_0$ to $t_1$, expressed as

$$T_L = T_1\, T_2 \cdots T_n.$$
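A minimal sketch of that closed-form SVD step (the Kabsch/Umeyama alignment) for matched point pairs from two consecutive frames:

```python
import numpy as np

def svd_align(P, Q):
    """R, t minimising sum ||p_i - (R q_i + t)||^2 over matched rows of
    P (frame k) and Q (frame k+1), both (N, 3)."""
    cp, cq = P.mean(axis=0), Q.mean(axis=0)
    H = (Q - cq).T @ (P - cp)                 # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # no reflection
    R = Vt.T @ D @ U.T
    t = cp - R @ cq
    return R, t
```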
The pose transformation matrix $T_I$ is computed as follows: integrating the acceleration measurements and angular velocity measurements of the inertial navigation system yields displacement data $p$ and rotation data $R_I$; the pose transformation matrix of the inertial navigation system from $t_0$ to $t_1$ is expressed as

$$T_I = \begin{bmatrix} R_I & p \\ 0 & 1 \end{bmatrix}.$$
The projection of the laser point cloud $X_{t_1}$ at moment $t_1$ into the coordinate system $L_{t_0}$ at moment $t_0$ proceeds as follows. For the lidar, as shown in FIG. 8, the coordinate representation $X'$ of $X_{t_1}$ in the $L_{t_0}$ coordinate system is obtained directly through the pose transformation matrix $T_L$:

$$X' = T_L\, X_{t_1}.$$

For the inertial navigation system, as shown in FIG. 9, the coordinate representation $X''$ of $X_{t_1}$ in the $L_{t_0}$ coordinate system is obtained through the pose transformation matrix $T_I$ and the extrinsic matrix $T_{IL}$:

$$X'' = T_{IL}^{-1}\, T_I\, T_{IL}\, X_{t_1}.$$
The extrinsic matrix $T_{IL}$ is computed by aligning the point cloud $X'$ with the point cloud $X''$. In theory, $X'$ and $X''$ describe the same piece of point-cloud data and coincide in space, i.e.

$$X' = X''.$$

Because of the extrinsic error, $X'$ and $X''$ do not coincide exactly. Using the iterative closest point method, the regions of $X'$ and $X''$ that describe the same point cloud are registered, and the nearest-neighbor error is constructed and optimized:

$$E = \sum_{i} \left\| x_i' - x_i'' \right\|^{2}, \qquad x_i' \in X',\; x_i'' \in X''.$$
From $E$, $T_L$, and $T_I$, the extrinsic matrix $T_{IL}$ is solved:

$$T_{IL}^{*} = \arg\min_{T_{IL}} \sum_{i} \left\| x_i' - x_i'' \right\|^{2}.$$
So far, the external parameter matrix is obtained by the calibration method
Figure 795120DEST_PATH_IMAGE017
And
Figure 529858DEST_PATH_IMAGE027
Further, optionally, under sufficient visual conditions, the observation data of the monocular camera are recorded and a visual-IMU calibration tool computes the extrinsic matrix $T_{IC}$ between the monocular camera and the inertial navigation system. The three determined extrinsic matrices $T_{CL}$, $T_{IL}$, $T_{IC}$ are jointly checked for pose consistency; if the check fails, the transformation parameters in the extrinsic matrices are adjusted until it passes. In the case of a failed check, the embodiment thus performs parameter optimization on $T_{CL}$, $T_{IL}$, $T_{IC}$, improving the multi-sensor joint calibration accuracy; the parameter optimization uses an online adjustment method that fuses lidar, monocular camera, and inertial navigation data and adjusts $T_{IL}$ and $T_{IC}$.
Although certain terms are used extensively herein, the possibility of using other terms is not excluded; these terms are used merely to describe and explain the nature of the invention more conveniently, and they are not to be construed as imposing any additional limitation contrary to the spirit of the invention. Unless otherwise specified, the operations and steps in the apparatuses and methods shown in the specification and drawings may be executed in any order, provided the output of a preceding process is not used by a subsequent process; descriptions using "first", "next", and the like are for convenience of description only and do not imply that the steps must be performed in that order.
The specific embodiments described herein are merely illustrative of the spirit of the invention. Various modifications or additions may be made to the described embodiments or alternatives may be employed by those skilled in the art without departing from the spirit or ambit of the invention as defined in the appended claims.

Claims (6)

1. A multi-sensor joint calibration device comprising a mechanical arm carrying a sensor fusion frame on which a laser radar (lidar), a monocular camera, and a computer for data processing are mounted, characterized in that the device further comprises a lidar-camera four-plate joint calibration target consisting of calibration plates No. 1, No. 2, No. 3, and No. 4; the centers of plates No. 1, No. 2, and No. 3 are marked with dots that provide feature points for extrinsic calibration; plate No. 4 carries a black-and-white checkerboard pattern for the camera intrinsic calibration; the four plates are distributed in a 田-shaped (two-by-two) grid: plates No. 1 and No. 2 are arranged side by side, plates No. 3 and No. 4 are arranged side by side in front of plates No. 1 and No. 2, and plates No. 3 and No. 4 are lower than plates No. 1 and No. 2; the normal vector $n_3$ of plate No. 3 and the normal vector $n_4$ of plate No. 4 differ by more than 30°; the normal vector $n_1$ of plate No. 1 and the normal vector $n_2$ of plate No. 2 differ by more than 30°.
2. A multi-sensor joint calibration method for the multi-sensor joint calibration device according to claim 1, comprising intrinsic calibration of the camera and extrinsic calibration of the lidar-camera pair; for the intrinsic calibration, the computer unit controls the mechanical arm and the camera so that the camera photographs the checkerboard calibration plate in different poses, and the camera intrinsic matrix K is computed with the Zhang Zhengyou calibration method; the lidar-camera extrinsic calibration method comprises the following steps:
Step one: the computer unit adjusts the pose of the mechanical arm so that the calibration target appears within the field of view of both the lidar and the camera; the mechanical arm is held stationary while the lidar and the camera acquire data;
Step two: analyze and process the lidar data and extract point-cloud feature points. The coordinates of each point are recorded, outliers are filtered out, and a point-cloud segmentation method divides the points of the four calibration plates into four groups $A_1$, $A_2$, $A_3$, $A_4$; the K-means method extracts each group's cluster center $P_i^L$, the three-dimensional coordinate of the center of the $i$-th calibration plate in the lidar coordinate system $O_L$; the centers $P_i^L$ are selected as the laser feature points for matching;
Step three: analyze and process the camera data and extract visual feature points. The gray value of each pixel is recorded, and the FAST keypoint extraction algorithm detects locations where the local gray level of the calibration plate changes sharply, from which the center position of each plate is extracted; the dot centers of plates No. 1, No. 2, and No. 3 are extracted and their coordinates $p_1^C$, $p_2^C$, $p_3^C$ recorded; the center coordinate $p_4^C$ of plate No. 4 is obtained by analyzing the relations between the checkerboard squares; here $p_i^C$ is the two-dimensional coordinate of the center of the $i$-th calibration plate in the camera coordinate system $O_C$, and the centers $p_i^C$ are selected as the visual feature points for matching;
Step four: establish the matching relation between the lidar feature points $P_i^L$ and the camera feature points $p_i^C$:

$$P_i^L = (X_i, Y_i, Z_i) \;\leftrightarrow\; p_i^C = (u_i, v_i), \qquad i = 1, 2, 3, 4;$$

from the feature-point matches, the minimum reprojection error is established as an error equation, and least-squares solution of that equation yields the optimal extrinsic matrix $T_{CL}$.
3. The multi-sensor joint calibration method according to claim 2, wherein corner positions of the calibration plates are collected as additional feature points on the basis of the feature points at the plate centers.
4. The multi-sensor joint calibration method according to claim 2, wherein the camera intrinsic calibration uses a checkerboard method: the computer unit controls the mechanical arm and the camera so that the camera photographs the checkerboard calibration plate in different poses; corner information is extracted from 20 to 30 images, and the camera intrinsic matrix K is computed with the Zhang Zhengyou calibration method.
5. The multi-sensor joint calibration method according to claim 2, further comprising an inertial navigation system arranged on the sensor fusion frame, the joint calibration method comprising a lidar-inertial-navigation joint calibration method; define the moment the mechanical arm starts moving as $t_0$ and the moment it stops as $t_1$; the laser point cloud scanned at $t_1$ is $X_{t_1}$; the lidar coordinate system at $t_0$ is $L_{t_0}$ and the inertial navigation coordinate system at $t_0$ is $I_{t_0}$; the lidar coordinate system at $t_1$ is $L_{t_1}$ and the inertial navigation coordinate system at $t_1$ is $I_{t_1}$; the pose transformation matrix of the lidar from $t_0$ to $t_1$ is $T_L$; the pose transformation matrix of the inertial navigation system from $t_0$ to $t_1$ is $T_I$; the extrinsic matrix between the lidar and the inertial navigation system is $T_{IL}$; the method comprises the following steps:
Step one: the mechanical arm moves along a specified trajectory; during the motion, the lidar and the inertial navigation system acquire data;
Step two: stop the mechanical arm and process the acquired sensor data; for the lidar, each frame of laser point cloud is de-distorted according to a uniform-motion model, and outliers in the point-cloud data are removed with a point-cloud filtering algorithm;
Step three: compute the pose transformation matrix $T_L$ between the lidar coordinate system $L_{t_0}$ at the initial moment $t_0$ and the lidar coordinate system $L_{t_1}$ at the end of the motion. $T_L$ is computed as follows: the iterative closest point method matches the $k$-th frame point cloud $X_k$ with the $(k+1)$-th frame point cloud $X_{k+1}$ to obtain their correspondence; the pose transformation of the lidar from moment $k$ to moment $k+1$ consists of the rotation matrix $R_k$ and the translation vector $t_k$; an error equation is constructed, converted into a least-squares problem, and solved for $R_k$ and $t_k$ with the SVD method; from $R_k$ and $t_k$, the pose transformation matrix of the lidar from moment $k$ to moment $k+1$ is

$$T_k = \begin{bmatrix} R_k & t_k \\ 0 & 1 \end{bmatrix};$$

cumulatively multiplying the $n$ per-frame matrices $T_1, T_2, \dots, T_n$ yields the lidar pose transformation matrix from $t_0$ to $t_1$:

$$T_L = T_1\, T_2 \cdots T_n;$$
Step four: calculating inertial navigation system coordinate system
Figure 707618DEST_PATH_IMAGE039
At an initial moment
Figure 292183DEST_PATH_IMAGE017
Of a coordinate system
Figure 538487DEST_PATH_IMAGE021
And the end time of exercise
Figure 550306DEST_PATH_IMAGE018
Of a coordinate system
Figure 498539DEST_PATH_IMAGE024
Position and posture transformation matrix between
Figure 621216DEST_PATH_IMAGE026
Pose transformation matrix
Figure 722027DEST_PATH_IMAGE026
The calculation method comprises the following steps: integrating acceleration measurement data and angular velocity measurement data of the inertial navigation system to obtain displacement data
Figure 904747DEST_PATH_IMAGE040
And rotation data
Figure 215642DEST_PATH_IMAGE041
Then, then
Figure 266644DEST_PATH_IMAGE017
Is timed to
Figure 612174DEST_PATH_IMAGE018
The pose transformation matrix of the moment inertial navigation system is expressed as
Figure 841162DEST_PATH_IMAGE042
Step five: by passing
Figure 373774DEST_PATH_IMAGE025
Will be provided with
Figure 228467DEST_PATH_IMAGE018
Laser point cloud of time
Figure 428504DEST_PATH_IMAGE019
Is projected to
Figure 828392DEST_PATH_IMAGE017
Of time of day
Figure 113880DEST_PATH_IMAGE029
Under the coordinate system, obtaining point cloud
Figure 116471DEST_PATH_IMAGE043
Step six: by passing
Figure 295649DEST_PATH_IMAGE026
And the parameter to be calibrated
Figure 256651DEST_PATH_IMAGE027
Will be
Figure 639222DEST_PATH_IMAGE018
Laser point cloud of time
Figure 445504DEST_PATH_IMAGE019
Is projected to
Figure 10347DEST_PATH_IMAGE017
Of time of day
Figure 611092DEST_PATH_IMAGE029
Under a coordinate system, obtaining
Figure 746538DEST_PATH_IMAGE044
Step seven, matching
Figure 356511DEST_PATH_IMAGE043
And
Figure 510281DEST_PATH_IMAGE044
two groups of point clouds, by aligning the two groups of point clouds, to the external reference matrix
Figure 813086DEST_PATH_IMAGE027
Optimizing by adopting an iterative closest point method
Figure 29304DEST_PATH_IMAGE043
And
Figure 318334DEST_PATH_IMAGE044
the described point cloud region registration of the same piece, the construction and the optimization of the nearest neighbor error
Figure 936397DEST_PATH_IMAGE045
According to
Figure 534738DEST_PATH_IMAGE045
Figure 503831DEST_PATH_IMAGE025
Figure 330972DEST_PATH_IMAGE026
Solving an extrinsic parameter matrix
Figure 69121DEST_PATH_IMAGE027
6. The multi-sensor combined calibration method according to claim 5, characterized in that: under sufficient visual conditions, the observation data of the monocular camera are recorded, and a visual-IMU calibration tool is adopted to calculate the external parameter matrix T_CI between the monocular camera and the inertial navigation system; according to the three determined external parameter matrices, namely the laser radar–camera matrix T_LC, the laser radar–inertial navigation matrix T_LI and the camera–inertial navigation matrix T_CI, the pose consistency between the sensors is jointly verified, the conversion parameters in the external parameter matrices are adjusted, and the multi-sensor combined calibration precision is improved; during parameter optimization, an online adjustment method is adopted, the data of the laser radar, the monocular camera and the inertial navigation system are fused, and T_LI and T_CI are adjusted.
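The pose-consistency verification of claim 6 can be read as a loop-closure test: composing the laser radar–camera and camera–inertial navigation extrinsics should reproduce the laser radar–inertial navigation extrinsic. A sketch under that assumed frame convention (the function name and error decomposition are illustrative):

```python
import numpy as np

def consistency_error(T_LC, T_CI, T_LI):
    """Loop-closure residual of the three extrinsics: if the frame conventions
    chain as laser radar -> camera -> IMU, then T_CI @ T_LC should equal T_LI;
    the residual measures the joint pose consistency."""
    residual = np.linalg.inv(T_LI) @ (T_CI @ T_LC)
    rot_err = np.arccos(np.clip((np.trace(residual[:3, :3]) - 1.0) / 2.0, -1.0, 1.0))
    trans_err = np.linalg.norm(residual[:3, 3])
    return rot_err, trans_err   # radians, meters
```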
CN202010881818.4A 2020-08-28 2020-08-28 Multi-sensor combined calibration device and method Active CN111735479B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202010881818.4A CN111735479B (en) 2020-08-28 2020-08-28 Multi-sensor combined calibration device and method
JP2021003139A JP7072759B2 (en) 2020-08-28 2021-01-12 Composite calibration device and method using multiple sensors

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010881818.4A CN111735479B (en) 2020-08-28 2020-08-28 Multi-sensor combined calibration device and method

Publications (2)

Publication Number Publication Date
CN111735479A true CN111735479A (en) 2020-10-02
CN111735479B CN111735479B (en) 2021-03-23

Family

ID=72658909

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010881818.4A Active CN111735479B (en) 2020-08-28 2020-08-28 Multi-sensor combined calibration device and method

Country Status (2)

Country Link
JP (1) JP7072759B2 (en)
CN (1) CN111735479B (en)


Families Citing this family (32)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114770517B (en) * 2022-05-19 2023-08-15 梅卡曼德(北京)机器人科技有限公司 Method for calibrating robot through point cloud acquisition device and calibration system
CN114993245B (en) * 2022-05-31 2024-04-05 山西支点科技有限公司 High-precision target calibrating method of target calibrating equipment in movable base platform and external field vibration environment
CN115026814B (en) * 2022-06-01 2024-04-12 中科苏州智能计算技术研究院 Camera automatic calibration method for mechanical arm movement space reconstruction
CN115092671B (en) * 2022-06-08 2023-09-26 深圳市南科佳安机器人科技有限公司 Feeding and discharging control method
CN115122331A (en) * 2022-07-04 2022-09-30 中冶赛迪工程技术股份有限公司 Workpiece grabbing method and device
CN115153925B (en) * 2022-07-18 2024-04-23 杭州键嘉医疗科技股份有限公司 Automatic drill bit positioning device and method for dental implant operation
CN115159149B (en) * 2022-07-28 2024-05-24 深圳市罗宾汉智能装备有限公司 Visual positioning-based material taking and unloading method and device
CN115241110B (en) * 2022-08-15 2023-12-08 魅杰光电科技(上海)有限公司 Wafer motion control method and wafer motion control system
CN115442584B (en) * 2022-08-30 2023-08-18 中国传媒大学 Multi-sensor fusion type special-shaped surface dynamic projection method
JP2024066886A (en) * 2022-11-02 2024-05-16 京セラ株式会社 Electronic device, electronic device control method, and program
CN115712111A (en) * 2022-11-07 2023-02-24 北京斯年智驾科技有限公司 Camera and radar combined calibration method and system, electronic device, computer equipment and storage medium
CN116000927A (en) * 2022-12-29 2023-04-25 中国工程物理研究院机械制造工艺研究所 Measuring device and method for spatial position guiding precision of robot vision system
CN115793261B (en) * 2023-01-31 2023-05-02 北京东方瑞丰航空技术有限公司 Visual compensation method, system and equipment for VR glasses
CN115908121B (en) * 2023-02-23 2023-05-26 深圳市精锋医疗科技股份有限公司 Endoscope registration method, device and calibration system
CN116358517B (en) * 2023-02-24 2024-02-23 杭州宇树科技有限公司 Height map construction method, system and storage medium for robot
CN115965760B (en) * 2023-03-15 2023-06-09 成都理工大学 Debris flow alluvial simulation experiment accumulation body surface reconstruction system
CN116423505B (en) * 2023-03-30 2024-04-23 杭州邦杰星医疗科技有限公司 Error calibration method for mechanical arm registration module in mechanical arm navigation operation
CN116512286B (en) * 2023-04-23 2023-11-14 九众九机器人有限公司 Six-degree-of-freedom stamping robot and stamping method thereof
CN116449387B (en) * 2023-06-15 2023-09-12 南京师范大学 Multi-dimensional environment information acquisition platform and calibration method thereof
CN116563297B (en) * 2023-07-12 2023-10-31 中国科学院自动化研究所 Craniocerebral target positioning method, device and storage medium
CN117008122A (en) * 2023-08-04 2023-11-07 江苏苏港智能装备产业创新中心有限公司 Method and system for positioning surrounding objects of engineering mechanical equipment based on multi-radar fusion
CN116687386B (en) * 2023-08-07 2023-10-31 青岛市畜牧工作站(青岛市畜牧兽医研究所) Radar detection system and method for comprehensive calibration of cattle body shape data
CN116862999B (en) * 2023-09-04 2023-12-08 华东交通大学 Calibration method, system, equipment and medium for three-dimensional measurement of double cameras
CN116883516B (en) * 2023-09-07 2023-11-24 西南科技大学 Camera parameter calibration method and device
CN117092625B (en) * 2023-10-10 2024-01-02 北京斯年智驾科技有限公司 External parameter calibration method and system of radar and combined inertial navigation system
CN117109505B (en) * 2023-10-24 2024-01-30 中国飞机强度研究所 Method for measuring blocking hook posture and determining space deformation data of carrier-based aircraft
CN117140536B (en) * 2023-10-30 2024-01-09 北京航空航天大学 Robot control method and device and robot
CN117257459B (en) * 2023-11-22 2024-03-12 杭州先奥科技有限公司 Map expansion method and system in electromagnetic navigation bronchoscopy with respiratory disturbance resistance
CN117284499B (en) * 2023-11-24 2024-01-19 北京航空航天大学 Monocular vision-laser-based pose measurement method for spatial unfolding mechanism
CN117433511B (en) * 2023-12-20 2024-03-12 绘见科技(深圳)有限公司 Multi-sensor fusion positioning method
CN117646828B (en) * 2024-01-29 2024-04-05 中国市政工程西南设计研究总院有限公司 Device and method for detecting relative displacement and water leakage of pipe jacking interface
CN117968680A (en) * 2024-03-29 2024-05-03 西安现代控制技术研究所 Inertial-radar integrated navigation limited frame measurement variable weight updating method


Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103983961A (en) 2014-05-20 2014-08-13 南京理工大学 Three-dimensional calibration target for joint calibration of 3D laser radar and camera
US10282591B2 (en) 2015-08-24 2019-05-07 Qualcomm Incorporated Systems and methods for depth map sampling
KR20190126458A (en) 2017-04-17 2019-11-11 코그넥스코오포레이션 High precision calibration system and method
CN109828262A (en) 2019-03-15 2019-05-31 苏州天准科技股份有限公司 Laser radar and the automatic combined calibrating method of camera based on plane and space characteristics
CN110322519B (en) 2019-07-18 2023-03-31 天津大学 Calibration device and calibration method for combined calibration of laser radar and camera
US10726579B1 (en) 2019-11-13 2020-07-28 Honda Motor Co., Ltd. LiDAR-camera calibration
CN112907676B (en) 2019-11-19 2022-05-10 浙江商汤科技开发有限公司 Calibration method, device and system of sensor, vehicle, equipment and storage medium
CN111627072B (en) 2020-04-30 2023-10-24 贝壳技术有限公司 Method, device and storage medium for calibrating multiple sensors

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107194983A (en) * 2017-05-16 2017-09-22 华中科技大学 A kind of three-dimensional visualization method and system based on a cloud and image data
CN109118547A (en) * 2018-11-01 2019-01-01 百度在线网络技术(北京)有限公司 Multi-cam combined calibrating system and method
CN109712189A (en) * 2019-03-26 2019-05-03 深兰人工智能芯片研究院(江苏)有限公司 A kind of method and apparatus of sensor combined calibrating
CN110599546A (en) * 2019-08-28 2019-12-20 贝壳技术有限公司 Method, system, device and storage medium for acquiring three-dimensional space data
CN110599541A (en) * 2019-08-28 2019-12-20 贝壳技术有限公司 Method and device for calibrating multiple sensors and storage medium
CN110686704A (en) * 2019-10-18 2020-01-14 深圳市镭神智能系统有限公司 Pose calibration method, system and medium for laser radar and combined inertial navigation
CN111127563A (en) * 2019-12-18 2020-05-08 北京万集科技股份有限公司 Combined calibration method and device, electronic equipment and storage medium

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
WU, Yuhan et al.: "A LiDAR/IMU joint calibration method based on point cloud matching", Measurement & Control Technology and Instruments *
HAN, Dongbin et al.: "Extrinsic parameter calibration of three-dimensional LiDAR based on multi-pair point cloud matching", Laser & Optoelectronics Progress *

Cited By (38)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112577517A (en) * 2020-11-13 2021-03-30 上汽大众汽车有限公司 Multi-element positioning sensor combined calibration method and system
CN112444798B (en) * 2020-11-27 2024-04-09 杭州易现先进科技有限公司 Method and device for calibrating space-time external parameters of multi-sensor equipment and computer equipment
CN112444798A (en) * 2020-11-27 2021-03-05 杭州易现先进科技有限公司 Multi-sensor equipment space-time external parameter calibration method and device and computer equipment
CN114643599A (en) * 2020-12-18 2022-06-21 沈阳新松机器人自动化股份有限公司 Three-dimensional machine vision system and method based on point laser and area-array camera
CN112790786A (en) * 2020-12-30 2021-05-14 无锡祥生医疗科技股份有限公司 Point cloud data registration method and device, ultrasonic equipment and storage medium
CN112881999B (en) * 2021-01-25 2024-02-02 上海西虹桥导航技术有限公司 Semi-automatic calibration method for multi-line laser radar and vision sensor
CN112881999A (en) * 2021-01-25 2021-06-01 上海西虹桥导航技术有限公司 Semi-automatic calibration method for multi-line laser radar and vision sensor
CN112509067B (en) * 2021-02-02 2021-04-27 中智行科技有限公司 Multi-sensor combined calibration method and device, electronic equipment and storage medium
CN112509067A (en) * 2021-02-02 2021-03-16 中智行科技有限公司 Multi-sensor combined calibration method and device, electronic equipment and storage medium
CN112882000A (en) * 2021-02-05 2021-06-01 北京科技大学 Automatic calibration method of laser radar
CN112882000B (en) * 2021-02-05 2023-02-03 北京科技大学 Automatic calibration method for laser radar
CN112927302A (en) * 2021-02-22 2021-06-08 山东大学 Calibration plate and calibration method for multi-line laser radar and camera combined calibration
CN112927302B (en) * 2021-02-22 2023-08-15 山东大学 Calibration plate and calibration method for combined calibration of multi-line laser radar and camera
CN113192174A (en) * 2021-04-06 2021-07-30 中国计量大学 Mapping method and device and computer storage medium
CN113192174B (en) * 2021-04-06 2024-03-26 中国计量大学 Picture construction method and device and computer storage medium
CN113218435A (en) * 2021-05-07 2021-08-06 复旦大学 Multi-sensor time synchronization method
CN113298881B (en) * 2021-05-27 2023-09-12 中国科学院沈阳自动化研究所 Spatial joint calibration method for monocular camera-IMU-mechanical arm
CN113298881A (en) * 2021-05-27 2021-08-24 中国科学院沈阳自动化研究所 Monocular camera-IMU-mechanical arm space combined calibration method
CN113269107A (en) * 2021-06-01 2021-08-17 航天智造(上海)科技有限责任公司 Interactive intelligent disassembling and assembling system based on deep learning
CN113376618A (en) * 2021-06-22 2021-09-10 昆明理工大学 Multi-path side laser radar point cloud registration device and using method
CN113376618B (en) * 2021-06-22 2024-03-01 昆明理工大学 Multi-path side laser radar point cloud registration device and use method
CN114252099B (en) * 2021-12-03 2024-02-23 武汉科技大学 Multi-sensor fusion self-calibration method and system for intelligent vehicle
CN114252099A (en) * 2021-12-03 2022-03-29 武汉科技大学 Intelligent vehicle multi-sensor fusion self-calibration method and system
CN114894116B (en) * 2022-04-08 2024-02-23 苏州瀚华智造智能技术有限公司 Measurement data fusion method and non-contact measurement equipment
CN114894116A (en) * 2022-04-08 2022-08-12 苏州瀚华智造智能技术有限公司 Measurement data fusion method and non-contact measurement equipment
CN114879153A (en) * 2022-06-08 2022-08-09 中国第一汽车股份有限公司 Radar parameter calibration method and device and vehicle
CN114882115A (en) * 2022-06-10 2022-08-09 国汽智控(北京)科技有限公司 Vehicle pose prediction method and device, electronic equipment and storage medium
CN114882115B (en) * 2022-06-10 2023-08-25 国汽智控(北京)科技有限公司 Vehicle pose prediction method and device, electronic equipment and storage medium
CN115097427A (en) * 2022-08-24 2022-09-23 北原科技(深圳)有限公司 Automatic calibration system and method based on time-of-flight method
CN115097427B (en) * 2022-08-24 2023-02-10 北原科技(深圳)有限公司 Automatic calibration method based on time-of-flight method
CN115908589A (en) * 2023-02-23 2023-04-04 深圳佑驾创新科技有限公司 Multi-sensor calibration system and method
US11919177B1 (en) 2023-04-03 2024-03-05 Guangdong University Of Technology Tracking measurement method, apparatus and device for pose of tail end of manipulator
CN116038719A (en) * 2023-04-03 2023-05-02 广东工业大学 Method, device and equipment for tracking and measuring pose of tail end of mechanical arm
CN116038719B (en) * 2023-04-03 2023-07-18 广东工业大学 Method, device and equipment for tracking and measuring pose of tail end of mechanical arm
CN116630444B (en) * 2023-07-24 2023-09-29 中国矿业大学 Optimization method for fusion calibration of camera and laser radar
CN116630444A (en) * 2023-07-24 2023-08-22 中国矿业大学 Optimization method for fusion calibration of camera and laser radar
CN117554937A (en) * 2024-01-08 2024-02-13 安徽中科星驰自动驾驶技术有限公司 Error-controllable laser radar and combined inertial navigation external parameter calibration method and system
CN117554937B (en) * 2024-01-08 2024-04-26 安徽中科星驰自动驾驶技术有限公司 Error-controllable laser radar and combined inertial navigation external parameter calibration method and system

Also Published As

Publication number Publication date
CN111735479B (en) 2021-03-23
JP7072759B2 (en) 2022-05-23
JP2022039906A (en) 2022-03-10

Similar Documents

Publication Publication Date Title
CN111735479B (en) Multi-sensor combined calibration device and method
CN109270534B (en) Intelligent vehicle laser sensor and camera online calibration method
CN112396664B (en) Monocular camera and three-dimensional laser radar combined calibration and online optimization method
CN111325801B (en) Combined calibration method for laser radar and camera
EP3479353A1 (en) Systems and methods for identifying pose of cameras in a scene
CN108594245A (en) A kind of object movement monitoring system and method
CN112837383B (en) Camera and laser radar recalibration method and device and computer readable storage medium
CN110987021B (en) Inertial vision relative attitude calibration method based on rotary table reference
CN106625673A (en) Narrow space assembly system and assembly method
CN109724586B (en) Spacecraft relative pose measurement method integrating depth map and point cloud
US20220230348A1 (en) Method and apparatus for determining a three-dimensional position and pose of a fiducial marker
CN114608554B (en) Handheld SLAM equipment and robot instant positioning and mapping method
CN112819711B (en) Monocular vision-based vehicle reverse positioning method utilizing road lane line
CN111915685B (en) Zoom camera calibration method
Luo et al. Docking navigation method for UAV autonomous aerial refueling
CN110030979B (en) Spatial non-cooperative target relative pose measurement method based on sequence images
CN108257184B (en) Camera attitude measurement method based on square lattice cooperative target
CN114777768A (en) High-precision positioning method and system for satellite rejection environment and electronic equipment
CN114001651A (en) Large-scale long and thin cylinder type component pose in-situ measurement method based on binocular vision measurement and prior detection data
CN117115271A (en) Binocular camera external parameter self-calibration method and system in unmanned aerial vehicle flight process
Gao et al. Altitude information acquisition of uav based on monocular vision and mems
CN110490934A (en) Mixing machine vertical blade attitude detecting method based on monocular camera and robot
CN113405532B (en) Forward intersection measuring method and system based on structural parameters of vision system
CN112489118B (en) Method for quickly calibrating external parameters of airborne sensor group of unmanned aerial vehicle
CN115100287A (en) External reference calibration method and robot

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant