CN111735479B - Multi-sensor combined calibration device and method - Google Patents
- Publication number
- CN111735479B (application CN202010881818.4A)
- Authority
- CN
- China
- Prior art keywords
- calibration
- camera
- laser radar
- calibration plate
- point cloud
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C25/00—Manufacturing, calibrating, cleaning, or repairing instruments or devices referred to in the other groups of this subclass
- G01C25/005—Manufacturing, calibrating, cleaning, or repairing instruments or devices referred to in the other groups of this subclass initial alignment, calibration or starting-up of inertial devices
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/48—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
- G01S7/497—Means for monitoring or calibrating
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/80—Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02A—TECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
- Y02A90/00—Technologies having an indirect contribution to adaptation to climate change
- Y02A90/10—Information and communication technologies [ICT] supporting adaptation to climate change, e.g. for weather forecasting or climate simulation
Abstract
The invention provides a multi-sensor joint calibration device and method, relating to multi-sensor calibration technology, and solves the problem of jointly calibrating multiple sensors in the prior art. The device comprises a mechanical arm on which a sensor fusion frame is mounted; a laser radar (lidar), a monocular camera, and a computer for data processing are installed on the sensor fusion frame. The device further comprises a lidar-camera joint calibration target consisting of four calibration plates, No. 1 through No. 4. The centers of plates No. 1, No. 2, and No. 3 are marked with dots, which provide feature points for extrinsic calibration. Embodiments of the invention build a portable multi-sensor fusion frame, which facilitates calibration and secondary development, and use a mechanical-arm-assisted calibration method that enables intelligent and batch calibration.
Description
Technical Field
The invention belongs to the technical field of sensors, and particularly relates to a multi-sensor calibration technology.
Background
Simultaneous Localization and Mapping (SLAM) technology provides environment-perception information for unmanned driving. Traditional SLAM is divided into laser SLAM and visual SLAM. Lidar offers high ranging precision and is unaffected by lighting; cameras are low-cost and capture rich image information. However, single-sensor SLAM has serious limitations: lidar has a slow update rate, suffers from motion distortion, and cannot provide accurate measurements in severe conditions such as rain and snow, while a camera cannot directly recover accurate three-dimensional information and is strongly constrained by ambient light.
An inertial navigation system can provide accurate acceleration and angular-velocity measurements as a pose-estimation aid. Therefore, fusing data from the lidar, visual sensors, and the inertial navigation system can improve the environment-perception capability of SLAM.
Sensor calibration in a SLAM system is divided into intrinsic calibration and extrinsic calibration. Intrinsic calibration mainly refers to computing the camera's intrinsic matrix and the error coefficients of the inertial navigation system, so that the sensor's measurements are accurate. Extrinsic calibration between sensors, that is, determining the pose transformation between the sensors' coordinate systems, is a prerequisite for accurate multi-sensor information fusion.
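For illustration, the pose transformation that extrinsic calibration determines can be represented as a 4x4 homogeneous matrix mapping points from one sensor's coordinate system into another's. The following is a minimal numpy sketch; the rotation and offset values are illustrative, not taken from this patent:

```python
import numpy as np

def make_extrinsic(R, t):
    """Assemble a 4x4 homogeneous transform T = [R | t; 0 0 0 1]."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def transform_points(T, pts):
    """Apply extrinsic T to an (N, 3) array of points."""
    homo = np.hstack([pts, np.ones((pts.shape[0], 1))])
    return (T @ homo.T).T[:, :3]

# Example: a 90-degree yaw plus a 10 cm x-offset between two sensor frames.
Rz = np.array([[0.0, -1.0, 0.0],
               [1.0,  0.0, 0.0],
               [0.0,  0.0, 1.0]])
T_LC = make_extrinsic(Rz, np.array([0.10, 0.0, 0.0]))
pts_lidar = np.array([[1.0, 0.0, 0.0]])
pts_cam = transform_points(T_LC, pts_lidar)
```

Once such a matrix is calibrated, any lidar measurement can be expressed in the camera frame by a single matrix product.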
The traditional extrinsic calibration method takes the vehicle-body coordinate system as the reference and seeks the pose transformations from the lidar, camera, and inertial-navigation coordinate systems to the vehicle body. However, with all sensors fixed to the vehicle, the body's motion is limited in its degrees of freedom, the calibration procedure is cumbersome, and accurate calibration of the yaw and roll angles is difficult.
Lidar has high ranging precision and places no special requirement on a calibration reference object, whereas a camera works from two-dimensional image features and can only be calibrated against a specific calibration target. Existing lidar-camera extrinsic calibration methods include calibration based on a single chessboard plate, on an L-shaped plate, and on a 3D chessboard target; these differ only in detail, all solving for the extrinsic matrix by matching 3D laser feature points with 2D camera feature points.
Calibration of the lidar and the inertial navigation system must be carried out under motion, during which the inertial navigation system provides accurate acceleration and angular-velocity measurements. The traditional approach is hand-eye calibration, but its accuracy is difficult to guarantee; the Baidu Apollo calibration tool drives the vehicle along a figure-eight path to collect sensor data and calibrate the extrinsic parameters.
Multi-sensor joint calibration is currently one of the hottest topics in the unmanned-driving field, and existing calibration techniques suffer from low automation, complex operation, and low accuracy.
Disclosure of Invention
The invention aims to provide a multi-sensor joint calibration device and method that address the above problems in the prior art.
The purpose of the invention is realized by the following technical scheme. A multi-sensor joint calibration device comprises a mechanical arm carrying a sensor fusion frame on which a lidar, a monocular camera, and a computer for data processing are mounted, together with a lidar-camera four-plate joint calibration target consisting of calibration plates No. 1, No. 2, No. 3, and No. 4. The centers of plates No. 1, No. 2, and No. 3 are marked with dots, which provide feature points for extrinsic calibration; plate No. 4 carries a black-and-white checkerboard pattern for camera intrinsic calibration. The four plates are arranged in the shape of the Chinese character 'tian' (田): plates No. 1 and No. 2 are mounted side by side, plates No. 3 and No. 4 are mounted side by side in front of plates No. 1 and No. 2 and lower than them. The angle between the surface normal n_3 of plate No. 3 and the normal n_4 of plate No. 4 is greater than 30°, and the angle between the normal n_1 of plate No. 1 and the normal n_2 of plate No. 2 is greater than 30°.
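The placement constraint above (board normals separated by more than 30°) is easy to check numerically. A minimal sketch, with hypothetical normal vectors for plates No. 3 and No. 4:

```python
import numpy as np

def angle_between(n_a, n_b):
    """Angle in degrees between two calibration-plate normal vectors."""
    n_a = n_a / np.linalg.norm(n_a)
    n_b = n_b / np.linalg.norm(n_b)
    cosang = np.clip(np.dot(n_a, n_b), -1.0, 1.0)
    return np.degrees(np.arccos(cosang))

# Hypothetical normals: plate 4 tilted 45 degrees relative to plate 3.
n3 = np.array([1.0, 0.0, 0.0])
n4 = np.array([1.0, 1.0, 0.0])
ok = angle_between(n3, n4) > 30.0
```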
A multi-sensor joint calibration method comprises intrinsic calibration of the camera and extrinsic lidar-camera calibration. For the intrinsic calibration, the computer unit controls the mechanical arm and the camera so that the camera photographs the checkerboard calibration plate in different poses, and the camera intrinsic matrix K is computed with Zhang Zhengyou's calibration method. The extrinsic lidar-camera calibration method comprises the following steps:
Step one: the computer unit adjusts the pose of the mechanical arm so that the calibration-plate module appears within the field of view of both the lidar and the camera; the arm is then held static while the lidar and the camera acquire data;
Step two: analyze the lidar data and extract point-cloud feature points. Record the coordinates of each point, filter out outliers, and segment the point cloud so that the data of the four calibration plates fall into four groups {L1}, {L2}, {L3}, {L4}. Extract each group's cluster center with the K-means method; the center L_i is the three-dimensional coordinate of the center of the i-th calibration plate in the lidar coordinate system, and the centers {L1, L2, L3, L4} are selected as the laser feature points for matching;
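The cluster-center extraction in step two can be sketched with a plain K-means pass over an already segmented cloud. The sketch below uses numpy with synthetic data; the plate-center positions and noise level are assumptions for illustration:

```python
import numpy as np

def kmeans_centers(points, k, iters=100):
    """Plain k-means on an (N, 3) cloud; returns the (k, 3) cluster centers.
    Initial centers are picked deterministically, spread across the array."""
    idx = np.linspace(0, len(points) - 1, k).astype(int)
    centers = points[idx].copy()
    for _ in range(iters):
        # Assign every point to its nearest current center.
        d = np.linalg.norm(points[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        new = np.array([points[labels == j].mean(axis=0)
                        if np.any(labels == j) else centers[j]
                        for j in range(k)])
        if np.allclose(new, centers):
            break
        centers = new
    return centers

# Two synthetic plate "clouds" around hypothetical centers (0,0,2) and (1,0,2).
rng = np.random.default_rng(1)
cloud = np.vstack([rng.normal([0.0, 0.0, 2.0], 0.01, (100, 3)),
                   rng.normal([1.0, 0.0, 2.0], 0.01, (100, 3))])
centers = kmeans_centers(cloud, k=2)
```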
Step three: analyze the camera data and extract visual feature points. Record the gray value of each pixel and use the FAST keypoint detector to find locations where the local gray level of the calibration plate changes sharply, thereby extracting the center of each plate: the dot centers of plates No. 1, No. 2, and No. 3 give coordinates C1, C2, C3, and the center coordinate C4 of plate No. 4 is obtained by analyzing the checkerboard pattern. C_i is the two-dimensional pixel coordinate of the center of the i-th plate in the camera coordinate system C, and {C1, C2, C3, C4} are selected as the visual feature points for matching;
Step four: establish the matching relation between the lidar feature points {L1, L2, L3, L4} and the camera feature points {C1, C2, C3, C4}: L_i ↔ C_i, i = 1, ..., 4. From this matching relation, build the minimum reprojection error, form the error equation, and solve the resulting least-squares problem to obtain the optimal extrinsic matrix T_LC.
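As an illustration of the reprojection error built in step four, the sketch below projects lidar-frame 3D points through an assumed intrinsic matrix K and an extrinsic guess (R, t), and computes the per-point pixel residual e_i = C_i - (1/s_i) K (R L_i + t). All numeric values are illustrative, not taken from the patent:

```python
import numpy as np

def reproject(K, R, t, L):
    """Project lidar-frame 3D points L (N,3) into the image with extrinsics (R, t)."""
    cam = (R @ L.T).T + t      # points in the camera frame
    uv = (K @ cam.T).T         # homogeneous pixel coordinates; scale s_i is the depth
    return uv[:, :2] / uv[:, 2:3]

def reprojection_error(K, R, t, L, C):
    """Per-point pixel residuals e_i = C_i - project(L_i)."""
    return C - reproject(K, R, t, L)

# Hypothetical intrinsics and an identity extrinsic guess.
K = np.array([[500.0, 0.0, 320.0],
              [0.0, 500.0, 240.0],
              [0.0,   0.0,   1.0]])
R, t = np.eye(3), np.zeros(3)
L = np.array([[0.0, 0.0, 2.0], [0.4, 0.0, 2.0]])
C = reproject(K, R, t, L)   # perfect detections, so residuals vanish
err = reprojection_error(K, R, t, L, C)
```

In practice the residuals of all matched pairs feed a least-squares solver that refines R and t.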
In some embodiments, in addition to the feature points at the plate centers, the corner positions of the calibration plates are also collected as feature points.
In some embodiments, the camera intrinsic calibration uses a checkerboard method: the computer unit controls the mechanical arm and the camera so that the camera photographs the checkerboard plate in different poses; corner information is extracted from 20 to 30 pictures, and the camera intrinsic matrix K is computed with Zhang Zhengyou's calibration method.
In some embodiments, an inertial navigation system is also mounted on the sensor fusion frame, and the joint calibration method includes a lidar / inertial-navigation joint calibration method. Define the start of the mechanical arm's motion as time t_0 and its end as t_n; P_n is the laser point cloud scanned at t_n; L_0 and I_0 are the lidar and inertial-navigation coordinate systems at t_0, and L_n and I_n those at t_n; T_L is the lidar pose transformation between t_0 and t_n, T_I the inertial-navigation pose transformation between t_0 and t_n, and T_LI the extrinsic matrix between the lidar and the inertial navigation system.
The method comprises the following steps:
Step one: the mechanical arm moves along a specified trajectory, and the lidar and the inertial navigation system acquire data during the motion;
Step two: stop the mechanical arm and process the collected sensor data. For the lidar, remove motion distortion from each laser frame according to a uniform-motion model; remove outliers with a point-cloud filtering algorithm; and, to reduce computational cost, down-sample each laser frame with a voxel-grid method;
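A minimal voxel-grid down-sampling pass like the one in step two can be sketched as follows (numpy only; voxel size and points are illustrative). Each occupied voxel is replaced by the centroid of the points inside it:

```python
import numpy as np

def voxel_downsample(points, voxel_size):
    """Keep one centroid per occupied voxel (a minimal voxel-grid filter)."""
    keys = np.floor(points / voxel_size).astype(np.int64)
    # Group points by voxel key and average each group.
    _, inverse = np.unique(keys, axis=0, return_inverse=True)
    n_voxels = inverse.max() + 1
    sums = np.zeros((n_voxels, 3))
    counts = np.zeros(n_voxels)
    np.add.at(sums, inverse, points)
    np.add.at(counts, inverse, 1.0)
    return sums / counts[:, None]

# Two nearby points collapse into one centroid; the far point survives alone.
pts = np.array([[0.01, 0.0, 0.0], [0.03, 0.0, 0.0], [1.5, 0.0, 0.0]])
down = voxel_downsample(pts, voxel_size=0.1)
```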
Step three: compute the pose transformation T_L between the lidar coordinate system L_0 at the start time t_0 and the coordinate system L_n at the end of the motion. T_L is computed as follows: match the m-th and (m+1)-th point-cloud frames with the iterative closest point method to obtain the correspondence between the m-th frame P = {p_1, ..., p_n} and the (m+1)-th frame P′ = {p′_1, ..., p′_n}; describe the lidar's pose change from frame m to frame m+1 by a rotation matrix R_m and a translation vector t_m; construct the error equation, convert it into a least-squares problem, and compute R_m, t_m with the SVD (singular value decomposition) method. From R_m, t_m, the lidar's pose transformation from time m to time m+1 is T_m = (R_m | t_m); multiplying the n frame-to-frame transformations T_m together gives the lidar pose transformation T_L from t_0 to t_n;
Step four: compute the pose transformation T_I between the inertial-navigation coordinate system I_0 at the start time t_0 and the coordinate system I_n at the end time t_n. Integrate the acceleration and angular-velocity measurements of the inertial navigation system to obtain the displacement t_I and rotation R_I; the pose transformation of the inertial navigation system from t_0 to t_n is then T_I = (R_I | t_I);
Step five: through T_L, project the laser point cloud P_n at time t_n into the coordinate system L_0 at time t_0, obtaining the point cloud P_nL;
Step six: through T_I and the parameter T_LI to be calibrated, project the laser point cloud P_n at time t_n into the coordinate system L_0 at time t_0, obtaining P_nI;
Step seven: match the two point clouds P_nL and P_nI and optimize the extrinsic matrix T_LI by aligning them; register the overlapping regions of P_nL and P_nI with the iterative closest point method, construct and optimize the nearest-neighbor error T_error, and solve the extrinsic matrix T_LI from T_error, T_L, and T_I.
In some embodiments, under sufficient lighting conditions, the monocular camera's observations are recorded and the extrinsic matrix T_CI between the monocular camera and the inertial navigation system is computed with a visual-IMU calibration tool. The three determined extrinsic matrices T_LC, T_LI, and T_CI are then jointly checked for pose consistency, and the transformation parameters in the extrinsic matrices are adjusted to improve the accuracy of the joint multi-sensor calibration. The parameter optimization uses an online adjustment method that fuses lidar, monocular-camera, and inertial-navigation data to adjust T_LI and T_CI.
Compared with the prior art, the multi-sensor combined calibration device and method have the following advantages:
the invention is suitable for vision sensors such as 16-line, 32-line, 64-line and other multi-line laser radars, monocular cameras, binocular cameras, RGBD cameras and the like; the embodiment of the invention builds a portable multi-sensor fusion framework, thereby being convenient for calibration and secondary development; the embodiment of the invention uses a mechanical arm auxiliary calibration method, and can realize intelligent calibration and batch calibration.
Description of the drawings:
in the drawings, which are not necessarily drawn to scale, like reference numerals may describe similar components in different views. Like reference numerals having different letter suffixes may represent different examples of similar components. The drawings illustrate generally, by way of example, but not by way of limitation, various embodiments discussed herein.
FIG. 1 is a schematic view of a robotic arm provided with a sensor fusion frame;
FIG. 2 is a schematic diagram of a multi-sensor fusion framework;
FIG. 3 is a schematic diagram of a position relationship of a combined calibration target of four calibration plates of a laser radar-camera;
FIG. 4 is a schematic view of the spatial arrangement of calibration targets;
FIG. 5 is a schematic diagram of a lidar-camera joint calibration;
FIG. 6 is a schematic diagram of a feature point distribution;
FIG. 7 is a schematic illustration of laser-visual feature matching;
FIG. 8 is a schematic diagram of laser radar pose transformation;
FIG. 9 is a schematic diagram of inertial navigation system pose transformation;
FIG. 10 is a schematic diagram of a calibration flow.
In the figure: 101. a sensor fusion frame; 102. a mechanical arm; 103. a connecting mechanism; 104. a computer unit; 105. an operation table; 201. a laser radar; 202. a monocular camera; 203. an inertial navigation system; 204. a metal frame; 205. a fixing device; 206. a clamp.
Detailed Description
The following are specific examples of the present invention, and the technical solutions of the present invention are further described with reference to the drawings, but the present invention is not limited to these examples, and the following embodiments do not limit the invention according to the claims. Moreover, all combinations of features described in the embodiments are not necessarily essential to the solution of the invention.
Examples
A multi-sensor joint calibration device comprises a multi-sensor fusion frame 101, a mechanical arm 102, a connecting mechanism 103, a computer unit 104, and an operation table 105. The multi-sensor fusion frame 101 fixes a lidar 201, a monocular camera 202, and an inertial navigation system 203 to a freely movable metal frame 204 using a multi-sensor combined construction method; via a clamp 206 the frame can be mounted on environment-sensing platforms such as unmanned vehicles and unmanned aerial vehicles, making it suitable for secondary development and convenient for extrinsic calibration. The metal frame is 18 cm long, 6 cm wide, and 8 cm high; following the combined construction method, the sensors are installed with their coordinate axes pointing the same way. The lidar 201 is mounted at the center of the top of the metal frame 204, with a fixing device 205 inside the frame; the monocular camera 202 is mounted 5 cm to the left of the fixing device 205 and the inertial navigation system 203 is mounted 5 cm to its right. The mechanical arm 102 is mounted above the operation table 105 and provides translation and rotation along three axes; its end is connected to the fixing device 205 through the connecting mechanism 103, and in the initial state the multi-sensor fusion frame 101 is 140 cm above the ground. The computer unit 104 controls the motion of the mechanical arm 102, triggers sensor data acquisition, processes the sensor data, and computes the extrinsic matrices.
The lidar-camera four-plate joint calibration target consists of four calibration plates of identical size, each 30 cm × 30 cm. As shown in Fig. 3, the centers of plates No. 1 to No. 3 are marked with dots that provide feature points for extrinsic calibration, while plate No. 4 carries a black-and-white checkerboard pattern for camera intrinsic calibration. The four plates are arranged in space in the shape of the Chinese character 'tian' (田); to make the four plates easier to segment, they are placed at different distances and angles. As shown in Fig. 4, plates No. 3 and No. 4 are fixed on bases 120 cm high placed on horizontal line one, at a distance D_1 = 130 cm from the multi-sensor fusion frame, with the angle between the surface normals n_3 and n_4 of plates No. 3 and No. 4 greater than 30°. Plates No. 1 and No. 2 are fixed on bases 160 cm high placed on horizontal line two, at a distance D_2 = 180 cm from the sensor fusion frame, which ensures that plates No. 1 and No. 2 are not occluded; the angle between the normals n_1 and n_2 of plates No. 1 and No. 2 is greater than 30°.
The computer unit controls the mechanical arm and the camera so that the camera photographs the checkerboard calibration plate in different poses; corner information is extracted from 20 to 30 pictures, and the camera intrinsic matrix K is computed with Zhang Zhengyou's calibration method.
The lidar-camera four-plate joint calibration target is used both for the intrinsic calibration of the monocular camera and for the extrinsic lidar-camera calibration. The joint calibration method of the lidar and the monocular camera comprises the following steps:
Step one: the computer unit adjusts the pose of the mechanical arm, as shown in Fig. 5, so that the joint calibration target appears within the field of view of the lidar and the camera; the arm is held static while the lidar and the camera acquire data; the computer unit then processes the measurements; finally, a least-squares problem is constructed and the extrinsic matrix T_LC is solved;
Step two: according to the data acquisition, laser radar data and camera data are respectively processed, for the laser radar data, coordinate information of each point cloud is recorded, abnormal points are filtered out, the point cloud data are segmented by adopting a point cloud segmentation method, and the point cloud data of the four calibration plates are divided into four different groups { L }1}、{L2}、{L3}、{L4Extracting a point cloud clustering center point by adopting a K-means method;that is, the three-dimensional coordinate value of the center point position of the ith calibration plate in the laser radar coordinate systemSelecting laser characteristic points for matching;
Step three: analyze the camera data and extract visual feature points. Record the gray value of each pixel and use the FAST keypoint detector to find locations where the local gray level of the calibration plate changes sharply, thereby extracting the center of each plate. Extract the dot centers of plates No. 1, No. 2, and No. 3 and record their coordinates C1, C2, C3; by analyzing the checkerboard pattern, obtain the center coordinate C4 of plate No. 4. C_i is the two-dimensional pixel coordinate of the center of the i-th plate in the camera coordinate system C, and the coordinates C_i are selected as the visual feature points for matching;
Step four: establish the matching relation between the lidar 3D feature points {L1, L2, L3, L4} and the monocular-camera 2D feature points {C1, C2, C3, C4}: L_i ↔ C_i. As shown in Fig. 7, write the lidar coordinates of a matched point as L_i = (X_i, Y_i, Z_i) and its monocular-camera pixel coordinates as C_i = (U_i, V_i); from the matched feature pairs, the minimum reprojection error is established and a least-squares problem is constructed. The extrinsic matrix of the lidar and the monocular camera is defined as

T_LC = (R | t),

where R is a rotation matrix and t is a translation vector. Combining the matching relation with the extrinsic matrix, the error equation can be expressed as

e_i = C_i - (1/s_i) · K · (R · L_i + t),

where s_i is the depth value of the i-th visual feature point and K is the camera intrinsic matrix. From this error equation, a least-squares optimization function is constructed to minimize the error and obtain the optimal extrinsic matrix:

T_LC* = argmin over (R, t) of Σ_i ‖C_i - (1/s_i) · K · (R · L_i + t)‖².
Further, optionally, more feature points can be collected for matching: on top of the feature points at the plate centers, the corner positions of the plates are also collected as feature points. As shown in Fig. 6, 5 positions on each plate are selected as feature points, the least-squares optimization function is constructed as above, and the extrinsic matrix is solved. Experiments show that the more feature points are used, the more accurate the computed extrinsic matrix.
The joint calibration method of the laser radar and the inertial navigation system comprises the following steps:
With the assistance of the mechanical arm, a lidar / inertial-navigation extrinsic calibration scheme based on point-cloud matching is adopted. The specific steps are as follows:
First, the mechanical arm is controlled to move, and the sensors acquire data while moving through space. Second, the arm is stopped and the collected sensor data are processed.
Then, compute the pose transformation T_L between the lidar coordinate system L_0 at the start time t_0 and the coordinate system L_n at the end time t_n, and the pose transformation T_I between the inertial-navigation coordinate system I_0 at t_0 and the coordinate system I_n at t_n. Next, project the laser point cloud P_n at time t_n into the coordinate system L_0 at t_0 through T_L, obtaining the point cloud P_nL; and project P_n into L_0 through T_I and the parameter T_LI to be calibrated, obtaining P_nI. Finally, match the two point clouds P_nL and P_nI, and compute the extrinsic matrix T_LI by aligning them.
The mechanical arm moves to control the mechanical arm to move in a specified track; taking the coordinate axis of the mechanical arm as a reference, moving 100cm along the positive direction of an X axis, moving 100cm along the negative direction of the X axis, moving 100cm along the positive direction of a Y axis, moving 100cm along the negative direction of the Y axis, moving 100cm along the positive direction of a Z axis, and moving 100cm along the negative direction of the Z axis; rotating 180 degrees clockwise around the X axis, rotating 180 degrees counterclockwise around the X axis, rotating 180 degrees clockwise around the Y axis, rotating 180 degrees counterclockwise around the Y axis, rotating 180 degrees clockwise around the Z axis, and rotating 180 degrees counterclockwise around the Z axis; in the motion process of the mechanical arm, data acquisition is carried out by a laser radar and an inertial navigation system;
Sensor data processing: for the lidar, remove motion distortion from each laser frame according to a uniform-motion model; remove outliers with a point-cloud filtering algorithm; and, to reduce computational cost, down-sample each laser frame with a voxel-grid method.
the pose transformation matrix TLThe calculation method comprises the following steps: matching the ith frame point cloud and the (i + 1) th frame point cloud by adopting an iterative nearest neighbor method to obtain the ith frame point cloud P ═ P1,...,pnAnd the i +1 th frame point cloud P '═ P'1,...p′nMatching relation of the points; defining the pose transformation of the laser radar from the ith moment to the (i + 1) th moment by a rotation matrix RiAnd a translation vector tiAnd if the point clouds are formed, the corresponding relation of the two frames of point clouds is expressed as follows:
the error equation is expressed as:

e_k = p_k - (R_i p'_k + t_i)
R_i and t_i are calculated by constructing a least squares problem and solving it with the SVD method;
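The SVD solution of this least squares problem is the standard Arun/Kabsch closed form; a minimal sketch under the correspondence model p_k = R p'_k + t (function name illustrative):

```python
import numpy as np

def align_svd(P, P_prime):
    """Closed-form least-squares R, t with p_k ~ R p'_k + t (Arun/Kabsch)."""
    c = P.mean(axis=0)
    c_p = P_prime.mean(axis=0)
    # 3x3 cross-covariance of the centered correspondences.
    H = (P_prime - c_p).T @ (P - c)
    U, _, Vt = np.linalg.svd(H)
    # Reflection guard keeps R a proper rotation (det = +1).
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = c - R @ c_p
    return R, t
```

Given exact correspondences this recovers the frame-to-frame motion in one shot; with noisy correspondences it is the minimizer of the summed squared error e_k above.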
The pose transformation matrix of the laser radar from the i-th moment to the (i+1)-th moment is then T_i = (R_i | t_i);
T_L, the pose transformation matrix of the laser radar from time t_0 to time t_n, is obtained by multiplying the per-frame transformation matrices T_i in sequence;
the pose transformation matrix T_I is calculated as follows: the acceleration and angular velocity measurements of the inertial navigation system are integrated to obtain the displacement data t_I and the rotation data R_I, and the pose transformation matrix of the inertial navigation system from time t_0 to time t_n is expressed as T_I = (R_I | t_I);
the laser point cloud P_n at time t_n is projected into the coordinate system L_0 at time t_0 as follows: for the laser radar, as shown in fig. 8, the representation P_nL of P_n in the L_0 coordinate system is obtained directly through the pose transformation matrix T_L, i.e. P_nL = T_L P_n;
For the inertial navigation system, as shown in FIG. 9, the representation P_nI of P_n in the L_0 coordinate system is obtained through the pose transformation matrix T_I and the extrinsic matrix T_LI, i.e. P_nI = T_LI^(-1) T_I T_LI P_n;
The extrinsic matrix T_LI is computed by aligning the point cloud P_nL and the point cloud P_nI; theoretically, P_nL and P_nI describe the same piece of point cloud data and coincide spatially, i.e. P_nL = P_nI;
Due to extrinsic parameter error, P_nL and P_nI do not coincide exactly; using the iterative closest point method, P_nL and P_nI are registered over the point cloud regions of the same piece, and a nearest-neighbor error T_error is constructed and optimized;
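The iterative closest point registration of the two clouds can be sketched as alternating nearest-neighbor matching with the closed-form SVD pose update (a brute-force illustrative sketch; practical implementations use k-d trees and outlier rejection):

```python
import numpy as np

def icp(src, dst, iters=20):
    """Rigidly align src to dst: alternate nearest-neighbour matching and
    the closed-form SVD pose (brute-force NN; fine for small clouds)."""
    R_tot, t_tot = np.eye(3), np.zeros(3)
    cur = src.copy()
    for _ in range(iters):
        # nearest dst point for every current src point
        d2 = ((cur[:, None, :] - dst[None, :, :]) ** 2).sum(-1)
        match = dst[d2.argmin(axis=1)]
        # closed-form update (Arun/Kabsch) mapping cur -> match
        c_s, c_m = cur.mean(0), match.mean(0)
        H = (cur - c_s).T @ (match - c_m)
        U, _, Vt = np.linalg.svd(H)
        D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
        R = Vt.T @ D @ U.T
        t = c_m - R @ c_s
        cur = cur @ R.T + t
        # accumulate the total transform
        R_tot, t_tot = R @ R_tot, R @ t_tot + t
    return R_tot, t_tot
```

When the initial misalignment is small relative to the point spacing (as here, after projecting both clouds into L_0), the nearest-neighbor pairing is correct and the loop converges in a few iterations.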
the extrinsic matrix T_LI is solved from T_error, T_L and T_I;
Thus far, the calibration method has obtained the extrinsic matrices T_LC and T_LI.
Further, optionally, when visual conditions are sufficient, the observation data of the monocular camera are recorded, and the extrinsic matrix T_CI between the monocular camera and the inertial navigation system is calculated with a visual-IMU calibration tool; the pose consistency among the three determined extrinsic matrices T_LC, T_LI and T_CI is jointly verified, and if the verification fails, the transformation parameters in the extrinsic matrices are adjusted until the verification passes; thus, when the verification fails, the embodiment of the invention optimizes the parameters of T_LC, T_LI and T_CI, improving the precision of the multi-sensor joint calibration; the parameter optimization adopts an online adjustment method that fuses the laser radar, monocular camera and inertial navigation system data and adjusts T_LI and T_CI.
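The joint pose-consistency verification of the three extrinsic matrices can be illustrated as a loop-closure residual; the frame conventions assumed here (T_LC: lidar to camera, T_CI: camera to IMU, T_LI: lidar to IMU, all 4x4 homogeneous) are an illustrative choice, not stated in the patent:

```python
import numpy as np

def pose_residual(T_LC, T_CI, T_LI):
    """Loop-closure residual: lidar->IMU via the camera should match T_LI.
    Returns (rotation error in radians, translation error)."""
    T_loop = T_CI @ T_LC                      # lidar -> IMU through the camera
    E = np.linalg.inv(T_LI) @ T_loop          # identity if perfectly consistent
    cos_th = np.clip((np.trace(E[:3, :3]) - 1.0) / 2.0, -1.0, 1.0)
    return np.arccos(cos_th), np.linalg.norm(E[:3, 3])
```

The verification passes when both residual components fall below chosen thresholds; otherwise the transformation parameters are adjusted, as described above, until they do.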
Although some terms are used frequently herein, the possibility of using other terms is not excluded. These terms are used merely to describe and explain the essence of the invention more conveniently; construing them as imposing any additional limitation would be contrary to the spirit of the present invention. Unless otherwise specified, the operations and steps of the apparatuses and methods shown in the specification and drawings may be executed in any order, as long as the output of a preceding process is not used by a subsequent process. Descriptions using "first", "next", etc. are for convenience only and do not imply that the steps must be performed in this order.
The specific embodiments described herein are merely illustrative of the spirit of the invention. Those skilled in the art may make various modifications, additions or substitutions to the described embodiments without departing from the spirit of the invention or the scope defined by the appended claims.
Claims (2)
1. A multi-sensor combined calibration method, characterized by further comprising a laser radar-camera four-calibration-plate combined calibration target, wherein the laser radar-camera four-calibration-plate combined calibration target comprises a No. 1 calibration plate, a No. 2 calibration plate, a No. 3 calibration plate and a No. 4 calibration plate; the center positions of the No. 1, No. 2 and No. 3 calibration plates are marked with dots, which provide feature points for extrinsic calibration; a black-and-white checkerboard pattern is arranged on the No. 4 calibration plate for calibrating the intrinsic parameters of the camera; the four calibration plates are arranged in a "tian" (田) character shape, i.e. a two-by-two grid: the No. 1 and No. 2 calibration plates are arranged in parallel, the No. 3 and No. 4 calibration plates are arranged in parallel in front of the No. 1 and No. 2 calibration plates, and the No. 3 and No. 4 calibration plates are lower than the No. 1 and No. 2 calibration plates; the angle between the normal vector n_3 of the No. 3 calibration plate surface and the normal vector n_4 of the No. 4 calibration plate is greater than 30 degrees; the angle between the normal vector n_1 of the No. 1 calibration plate and the normal vector n_2 of the No. 2 calibration plate is greater than 30 degrees;
the multi-sensor combined calibration method comprises camera intrinsic calibration and laser radar-camera extrinsic calibration; for the camera intrinsic calibration, the computer unit controls the mechanical arm and the camera so that the camera photographs the checkerboard calibration plate in different postures, and the camera intrinsic matrix K is calculated by the Zhang Zhengyou calibration method; the laser radar-camera extrinsic calibration method comprises the following steps:
Step one: the computer unit adjusts the posture of the mechanical arm so that the calibration plate module appears within the field of view of the laser radar and the camera; the mechanical arm is controlled to remain stationary, and the laser radar and the camera acquire data;
Step two: the laser radar data are analyzed and processed, and point cloud feature points are extracted; the coordinate information of each point is recorded, abnormal points are filtered out, the point cloud data are partitioned by a point cloud segmentation method, and the point cloud data of the four calibration plates are divided into four groups {L_1}, {L_2}, {L_3}, {L_4}; the cluster center of each group is extracted by the K-means method, giving the three-dimensional coordinate value (X_i, Y_i, Z_i) of the center point of the i-th calibration plate in the laser radar coordinate system; these center points are selected as the laser feature points for matching;
Step three: the camera data are analyzed and processed, and visual feature points are extracted; the gray value of each pixel is recorded, and the FAST keypoint extraction algorithm is used to detect locations where the local pixel gray level of the calibration plate changes markedly, so as to extract the center point position of each calibration plate; the centers of the dots of the No. 1, No. 2 and No. 3 calibration plates are extracted, and their coordinate values C_1, C_2, C_3 are recorded; by analyzing the relation between the checkerboard squares, the coordinate value C_4 of the center point of the No. 4 calibration plate is obtained; C_i is the two-dimensional coordinate value of the center point of the i-th calibration plate in the camera coordinate system C, and the coordinate values C_i are selected as the visual feature points for matching;
Step four: a matching relation is established between the laser radar feature points {(X_1, Y_1, Z_1), ..., (X_4, Y_4, Z_4)} and the camera feature points {C_1, C_2, C_3, C_4}; a minimized reprojection error is established according to the feature point matching relation, an error equation is constructed, and a least squares solution of the error equation yields the optimal extrinsic matrix T_LC; on the basis of the feature points at the centers of the calibration plates, the corner positions of the calibration plates are also collected as feature points; the specific steps are as follows:
according to the defined laser radar coordinates (X_i, Y_i, Z_i) and monocular camera coordinates (U_i, V_i), the minimized reprojection error is established and a least squares problem is constructed based on the matching relation of the feature points P_i; the extrinsic matrix of the laser radar and the monocular camera is defined as T_LC = (R | t);
wherein R is the rotation matrix and t is the translation vector; combining the matching relation with the extrinsic matrix, the error equation can be expressed as e_i = [U_i, V_i, 1]^T - (1/s_i) K (R P_i + t), where P_i = (X_i, Y_i, Z_i)^T;
wherein s_i is the depth value of the visual feature point; a least squares optimization function is constructed from the error equation to minimize the error and obtain the optimal extrinsic matrix T_LC* = argmin Σ_i ||e_i||²;
the multi-sensor combined calibration device also comprises an inertial navigation system on the sensor fusion frame, and the combined calibration method comprises a laser radar-inertial navigation system combined calibration method; the initial motion moment of the mechanical arm is defined as t_0 and the end moment of the mechanical arm motion as t_n; the laser point cloud scanned at time t_n is P_n; the laser radar coordinate system at time t_0 is L_0, the inertial navigation system coordinate system at time t_0 is I_0, the laser radar coordinate system at time t_n is L_n, and the inertial navigation system coordinate system at time t_n is I_n; the pose transformation matrix of the laser radar between times t_0 and t_n is T_L, the pose transformation matrix of the inertial navigation system between times t_0 and t_n is T_I, and the extrinsic matrix between the laser radar and the inertial navigation system is T_LI; the method comprises the following steps:
Step one: the mechanical arm moves along a specified trajectory, and during the motion of the mechanical arm the laser radar and the inertial navigation system acquire data;
Step two: the mechanical arm is controlled to stop moving, and the data acquired by the sensors are processed; for the laser radar, each frame of laser point cloud is de-distorted according to a uniform motion model, and outliers in the point cloud data are removed with a point cloud filtering algorithm;
Step three: the pose transformation matrix T_L between the laser radar coordinate system L_0 at the initial time t_0 and the coordinate system L_n at the end of the motion is calculated as follows: the m-th frame point cloud and the (m+1)-th frame point cloud are matched by the iterative closest point method, obtaining the matching relation between the points of the m-th frame point cloud P = {p_1, ..., p_n} and the (m+1)-th frame point cloud P' = {p'_1, ..., p'_n}; the pose transformation of the laser radar from the m-th frame to the (m+1)-th frame is defined by a rotation matrix R_m and a translation vector t_m; an error equation is then constructed and converted into a least squares problem, R_m and t_m are calculated by the SVD (singular value decomposition) method, and from R_m and t_m the pose transformation matrix of the laser radar from the m-th moment to the (m+1)-th moment is obtained as T_m = (R_m | t_m); the pose transformation matrices T_m of the n frames are multiplied to obtain the pose transformation matrix T_L of the laser radar from time t_0 to time t_n;
Step four: the pose transformation matrix T_I between the inertial navigation system coordinate system I_0 at the initial time t_0 and the coordinate system I_n at the end time t_n of the motion is calculated as follows: the acceleration and angular velocity measurements of the inertial navigation system are integrated to obtain the displacement data t_I and the rotation data R_I, and the pose transformation matrix of the inertial navigation system from time t_0 to time t_n is expressed as T_I = (R_I | t_I);
Step five: the laser point cloud P_n at time t_n is projected through T_L into the coordinate system L_0 at time t_0, obtaining the point cloud P_nL;
Step six: the laser point cloud P_n at time t_n is projected through T_I and the parameter to be calibrated T_LI into the coordinate system L_0 at time t_0, obtaining P_nI;
Step seven: the two point clouds P_nL and P_nI are matched, and the extrinsic matrix T_LI is optimized by aligning the two point clouds; using the iterative closest point method, P_nL and P_nI are registered over the point cloud regions of the same piece, a nearest-neighbor error T_error is constructed and optimized, and the extrinsic matrix T_LI is solved from T_error, T_L and T_I;
The extrinsic matrix T_LI is computed by aligning the point cloud P_nL and the point cloud P_nI; theoretically, P_nL and P_nI describe the same piece of point cloud data and coincide spatially, i.e. P_nL = P_nI;
due to extrinsic parameter error, P_nL and P_nI do not coincide exactly; using the iterative closest point method, P_nL and P_nI are registered over the point cloud regions of the same piece, and a nearest-neighbor error is constructed and optimized;
the extrinsic matrix T_LI is solved from T_error, T_L and T_I;
thus far, the calibration method has obtained the extrinsic matrices T_LC and T_LI;
when visual conditions are sufficient, the observation data of the monocular camera are recorded, and the extrinsic matrix T_CI between the monocular camera and the inertial navigation system is calculated with a visual-IMU calibration tool; the pose consistency among the three determined extrinsic matrices T_LC, T_LI and T_CI is jointly verified, the transformation parameters in the extrinsic matrices are adjusted, and the precision of the multi-sensor joint calibration is improved;
the parameter optimization adopts an online adjustment method that fuses the laser radar, monocular camera and inertial navigation system data and adjusts T_LI and T_CI.
2. The multi-sensor joint calibration method according to claim 1, wherein the camera is calibrated by a checkerboard calibration method: the computer unit controls the mechanical arm and the camera so that the camera photographs the checkerboard calibration plate in different postures; corner information is extracted from 20 to 30 pictures, and the camera intrinsic matrix K is calculated by the Zhang Zhengyou calibration method.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010881818.4A CN111735479B (en) | 2020-08-28 | 2020-08-28 | Multi-sensor combined calibration device and method |
JP2021003139A JP7072759B2 (en) | 2020-08-28 | 2021-01-12 | Composite calibration device and method using multiple sensors |
Publications (2)
Publication Number | Publication Date |
---|---|
CN111735479A CN111735479A (en) | 2020-10-02 |
CN111735479B true CN111735479B (en) | 2021-03-23 |
Family
ID=72658909
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202010881818.4A Active CN111735479B (en) | 2020-08-28 | 2020-08-28 | Multi-sensor combined calibration device and method |
Country Status (2)
Country | Link |
---|---|
JP (1) | JP7072759B2 (en) |
CN (1) | CN111735479B (en) |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107194983A (en) * | 2017-05-16 | 2017-09-22 | 华中科技大学 | A kind of three-dimensional visualization method and system based on a cloud and image data |
CN109118547A (en) * | 2018-11-01 | 2019-01-01 | 百度在线网络技术(北京)有限公司 | Multi-cam combined calibrating system and method |
CN109712189A (en) * | 2019-03-26 | 2019-05-03 | 深兰人工智能芯片研究院(江苏)有限公司 | A kind of method and apparatus of sensor combined calibrating |
CN110599546A (en) * | 2019-08-28 | 2019-12-20 | 贝壳技术有限公司 | Method, system, device and storage medium for acquiring three-dimensional space data |
CN110599541A (en) * | 2019-08-28 | 2019-12-20 | 贝壳技术有限公司 | Method and device for calibrating multiple sensors and storage medium |
CN110686704A (en) * | 2019-10-18 | 2020-01-14 | 深圳市镭神智能系统有限公司 | Pose calibration method, system and medium for laser radar and combined inertial navigation |
CN111127563A (en) * | 2019-12-18 | 2020-05-08 | 北京万集科技股份有限公司 | Combined calibration method and device, electronic equipment and storage medium |
Family Cites Families (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103983961A (en) * | 2014-05-20 | 2014-08-13 | 南京理工大学 | Three-dimensional calibration target for joint calibration of 3D laser radar and camera |
US10282591B2 (en) * | 2015-08-24 | 2019-05-07 | Qualcomm Incorporated | Systems and methods for depth map sampling |
WO2018195096A1 (en) * | 2017-04-17 | 2018-10-25 | Cognex Corporation | High-accuracy calibration system and method |
CN109828262A (en) * | 2019-03-15 | 2019-05-31 | 苏州天准科技股份有限公司 | Laser radar and the automatic combined calibrating method of camera based on plane and space characteristics |
CN110322519B (en) * | 2019-07-18 | 2023-03-31 | 天津大学 | Calibration device and calibration method for combined calibration of laser radar and camera |
US10726579B1 (en) * | 2019-11-13 | 2020-07-28 | Honda Motor Co., Ltd. | LiDAR-camera calibration |
CN112907676B (en) * | 2019-11-19 | 2022-05-10 | 浙江商汤科技开发有限公司 | Calibration method, device and system of sensor, vehicle, equipment and storage medium |
CN111627072B (en) * | 2020-04-30 | 2023-10-24 | 贝壳技术有限公司 | Method, device and storage medium for calibrating multiple sensors |
Non-Patent Citations (2)

Title |
---|
A LiDAR/IMU joint calibration method based on point cloud matching; Wu Yuhan et al.; Measurement & Control Technology and Instruments; Dec. 2019; pp. 78-82 *
Extrinsic calibration of three-dimensional LiDAR based on multi-pair point cloud matching; Han Dongbin et al.; Laser & Optoelectronics Progress; Dec. 2018; pp. 022803-1 to 022803-8 *
Also Published As
Publication number | Publication date |
---|---|
CN111735479A (en) | 2020-10-02 |
JP2022039906A (en) | 2022-03-10 |
JP7072759B2 (en) | 2022-05-23 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||